Reed Smith Client Alerts

On February 16, 2022, Senator Richard Blumenthal (D-CT) and Senator Marsha Blackburn (R-TN) introduced the Kids Online Safety Act (KOSA). KOSA’s purpose is to protect minors online by requiring covered platforms to provide new safeguards for children, easy-to-use controls for parents, and greater transparency and accountability around the content served to minors.

According to its principal drafters, KOSA is partly a response to the need to update the Children’s Online Privacy Protection Act (COPPA). COPPA applies only to children younger than 13 and was enacted in 1998, before the existence of the major social media platforms on which teens now spend a very significant amount of time online. KOSA aims to provide transparency and empowerment to parents and minors.

More specifically, KOSA would require all online platforms, including social media platforms, to provide parents with “easy-to-use” tools to protect children younger than 16 from harmful images, sexual exploitation, bullying, and harmful product recommendations. As currently drafted, KOSA broadly covers any online service of any size that is “reasonably likely to be used” by a child younger than 16. Such a broad scope may sweep in video game platforms, streaming services, and many other commercial websites.

Many questions have already arisen regarding how this broad scope might affect any number of interactive entertainment platforms, and even the burgeoning development of “metaverse” platforms. Additionally, the compliance costs associated with KOSA are yet to be determined.

KOSA would require covered companies to provide parents and minors with new controls and would create new obligations for platforms to address self-harm, suicide, eating disorders, sexual abuse, substance abuse, and other content that might harm children and teens. A company that determines that its services are likely to be used by children would have to:

- Create tools that allow parents to track how much time their children spend on a service and to opt out of features, such as autoplay, that might extend time online;
- Offer parents and minors the ability to opt out of algorithmic recommendation systems, which KOSA defines as “a fully or partially automated system used to suggest, promote, or rank information”; and
- Prevent third parties from viewing a minor’s data.

All of these protections would apply by default rather than as opt-in options. Overall, these new standards would allow parents to limit or block certain types of content their children can view and access on social media sites, and they would likely impose new development and monitoring costs on interactive platform providers.

In addition to these monitoring requirements, KOSA would require covered companies to undergo an annual independent audit of, and to report on, the risks to minors on their platforms. The National Telecommunications and Information Administration would then create a program giving researchers access to the audits, reports, and other company data so that they can assess the platforms’ potential harm to children and teens.