The Federal Trade Commission (FTC) has proposed new amendments to the Children’s Online Privacy Protection Act (COPPA) Rule, seeking stricter limits on how websites and online services use and monetize children’s data. Platforms such as social media networks and learning apps would be subject to the proposed changes.
The FTC has opened a 60-day public comment period for feedback on the proposed amendments before the commission finalizes them.
Critics of COPPA argue that the rules in place are not stringent enough to safeguard younger users. Common Sense Media founder Jim Steyer referred to the COPPA rules as hopelessly outdated, and journalist Kara Swisher labeled them toothless due to their limited scope, which only extends to children under the age of 13. In contrast, California and the European Union have enacted laws that protect children up to the age of 16.
The move comes amid growing concern over teen mental health and an ongoing dispute between tech companies and state governments over protecting children online. Meta, the parent company of Facebook and Instagram, faced a lawsuit from 33 states alleging that its platforms misled the public about the presence of harmful content and addictive features geared toward younger users. Meta responded by expressing its commitment to providing teenagers with safe and positive experiences online. Similarly, video-sharing platform TikTok has faced legal action from U.S. states, including Arkansas and Utah, for allegedly offering addictive features that can harm children’s mental well-being.
YouTube also adjusted its policies last month, restricting recommended videos on potentially sensitive topics such as body weight to protect younger users. YouTube, along with other social media platforms, has faced multiple lawsuits accusing it of being addictive and dangerous, and of altering the thoughts, feelings, and behaviors of younger individuals. Both YouTube and TikTok have emphasized that safeguarding younger users remains a top priority.
The FTC conducted its latest review of the COPPA Rule in 2019, receiving over 175,000 comments from various stakeholders: members of the public, tech and advertising industry trade groups, academics, and members of Congress. Suggestions included expanding the definition of a website or online service directed to children to cover platforms that may not explicitly target children but still attract a significant percentage of child users or generate content appealing to children. The COPPA Rule was last amended in 2013 to account for the rise in smartphone and social media use among children. Apps like Instagram and TikTok comply with the law by prohibiting children under 13 from creating accounts. Even so, the FTC and other regulators have repeatedly taken action against major tech companies that fail to comply with laws protecting child users. Google, for instance, was fined $170 million for privacy violations and for using children’s data for targeted advertising.
The proposed changes to the COPPA Rule aim to address concerns about children’s online safety and privacy. As public discourse continues to evolve, regulators, industry stakeholders, and the public must navigate the complex landscape of children’s data privacy and their online experiences.