New Regulations Transform Children’s Digital Safety and Privacy Standards

October 2, 2024

Over the past five years, significant strides have been made globally in protecting children’s safety and privacy online. This transformation is driven by public concern over the digital risks children face, as well as an acknowledgment that the internet’s original architecture lacked provisions for their unique needs. In response, international bodies and legislators in regions such as the European Union, the United Kingdom, and the United States have taken center stage.

Heightened Global Awareness and Regulatory Push

In 2021, the United Nations Committee on the Rights of the Child underscored the urgency of this issue with General Comment No. 25 on children’s rights in relation to the digital environment. This foundational document affirms that children’s rights online are just as critical as their rights offline, spanning dimensions such as data privacy, freedom of expression, and protection from commercial exploitation. The detailed framework highlights the pressing need for nations to align their policies with these guidelines to safeguard children’s rights in the digital age.

Around the same time, the Organisation for Economic Co-operation and Development (OECD) produced its Recommendation on Children in the Digital Environment. This recommendation underscored the need for age-appropriate safety-by-design principles to secure a safer digital landscape for children. The OECD’s proactive stance mirrored global concerns, pushing nations to foster digital environments where children are protected from potential harms. These actions by global bodies set a precedent aiming to ensure that children’s rights are prioritized and effectively implemented across digital platforms.

European Union’s Digital Services Act (DSA)

The European Union has been a frontrunner in regulatory measures with the adoption of the Digital Services Act (DSA) in 2022. The DSA imposes rigorous obligations on very large online platforms and search engines, with particular attention to children’s rights online. These include mandatory assessments of risks to minors, a ban on targeted advertising directed at minors, and transparency requirements for the risk evaluations these platforms conduct. Together, these measures usher in a new era of digital accountability in which platforms must anticipate and mitigate risks to children proactively.

The DSA’s extensive provisions rest on the conviction that more stringent regulation will mitigate the hazards children face in digital environments. By embedding children’s rights into their operational frameworks, platforms are now required to align with protective design standards. This requirement compels online services to rethink their data practices, fostering a safer and more child-friendly digital sphere.

The United Kingdom’s Comprehensive Approach

The United Kingdom has mirrored, and arguably expanded on, the EU’s efforts with its Online Safety Act (OSA), passed in 2023. The OSA goes beyond the DSA, imposing comprehensive duties of care on online services with respect to content that could harm children. The act outlines specific categories of content that must be kept from children and sets compliance standards that platforms are required to meet. The OSA also mandates regular audits and reports to ensure platforms adhere to these standards consistently.

Moreover, the UK’s Information Commissioner’s Office (ICO) rolled out the Age-Appropriate Design Code (AADC), which sets out 15 standards aimed at minimizing data and privacy risks for children. The AADC requires platforms to apply age-appropriate design principles, fostering a safer online environment for young users. This initiative highlights the importance of a tailored approach to children’s digital risks and further solidifies the UK’s commitment to protecting young users online.

Impact and Adaptation by Major Online Platforms

The influence of these regulations is evident in the operational changes made by major online platforms between 2017 and 2024. A report titled “Impact of Regulation on Children’s Digital Lives” documents 128 modifications by giants such as Meta, Google, TikTok, and Snap. These companies have transitioned toward safety-by-design frameworks, driven by regulations such as the AADC, DSA, and OSA. This shift marks a transformative approach in which platforms no longer treat children’s safety as an afterthought but as a central component of their operational mandate.

Such regulatory pressure has pushed these platforms toward proactive risk management, building child-specific safety features into their design processes. The adaptations reflect an industry-wide acknowledgment of the need for greater transparency and accountability. As these companies embed compliance measures into their operations, the digital landscape becomes safer for children, setting a new standard for online services globally.

United States: Federal and State Regulatory Efforts

In the United States, regulatory efforts at both federal and state levels spotlight ongoing developments in child online protection. The Children’s Online Privacy Protection Act (COPPA) has been in place since 1998, regulating online services directed at children under 13. More recent legislative debate centers on the Kids Online Safety Act (KOSA), which would introduce a broader duty of care, personal data protection tools for children, and requirements that minors be able to disable addictive design features on social media. KOSA’s proposed measures reflect the evolving understanding of the digital risks children face and aim to impose stricter responsibilities on online platforms.

Meanwhile, individual states have enacted their own laws to bolster online safety for children. For instance, California’s Age-Appropriate Design Code Act, modeled after the UK’s framework, represents proactive state-level intervention. Though currently blocked by First Amendment challenges, such initiatives signify localized yet robust efforts to ensure children’s online safety. These state-level measures highlight the diverse and multifaceted approaches within the United States, all aiming to create safer digital environments for young users.

Transitioning from Self-Regulation to Legislative Mandates

Taken together, these developments mark a shift away from reliance on platforms’ self-regulation and toward binding legislative mandates. Where the internet was once governed largely by voluntary industry commitments, the past five years have produced enforceable rules that explicitly account for children’s needs, closing a gap left by an architecture that was never designed with them in mind.

The European Union, the United Kingdom, and the United States have led this change, backed by international bodies and by growing public awareness of the digital risks children face. The result is a set of stricter regulations and frameworks aimed at creating a safer online environment for young users.

Fortifying children’s digital safety now involves implementing new laws, enhancing existing regulations, and fostering collaboration among governments, technology companies, and advocacy groups. These actions remain crucial as the digital landscape continues to evolve, exposing children to new and complex risks. The collective aim is to build a more secure and privacy-respecting internet that caters to the unique needs of the younger generation.
