From Utah to the UK, and soon France, new laws aim to restrict minors’ access to social media. Study after study shows that TikTok, Facebook, Instagram, and other sites increase the risks of grooming, cyberbullying, and child sexual abuse.
But disagreement reigns over how to achieve this common goal of keeping children safe. Some favor soft encouragement. Others want to impose hard prohibitions. On one side, the UK takes a targeted approach, building an accountability regime that encourages platforms to protect minors. On the other, Utah lawmakers want to require parents to give permission for their children to create a social media account. France is adopting an in-between approach.
This debate extends beyond child safety to the broad and central issues of privacy, free speech, and the responsibility of platforms for moderating content. Until now, social media companies have not been held liable for most material uploaded to their sites. Even Europe’s new Digital Services Act, designed to increase platforms’ responsibility for the content they host after posting, upholds the prohibition against proactive monitoring. Some fear child safety laws might undermine this founding element of the Internet.
“The most important and underexamined thing happening in US Internet law right now is the emergence of various ‘child safety’ laws that effectively regulate content on platforms, but don’t say so,” says Daphne Keller, a former Google lawyer now at the Stanford Cyber Policy Center.
Child safety depends on companies’ ability to verify the ages of their users. That is difficult, and it often collides with privacy and free speech protections. Should parents have access to everything about their children? How should distressed children obtain help from experts without their parents’ knowledge?
As Georgetown University professor and Data & Society founder Danah Boyd reminds us, teenagers most often turn to social media when feeling stressed out and in need of socialization. “Does social media cause mental health problems? Or is it where mental health problems become visible? I can guarantee you that there are examples of both,” she says.
The UK: Building an Accountability Regime
Children are flocking to social media. A third of British children have opened an account claiming to be over 18 years old, according to a study by the UK regulator Ofcom. The French data regulator CNIL reports that French children create their first social media account on average by the age of eight and a half.
In 2017, the UK Parliament passed the Digital Economy Act requiring websites to implement ‘robust’ age verification. But the Act only focused on adult content such as pornography and gambling and was never enforced.
The UK’s 2020 Children’s Code went further, obliging platforms to design age-appropriate products. In response, for example, YouTube made all accounts for 13- to 17-year-olds private by default, blocking potential predators from viewing them, and turned off autoplay for teenagers.
Companies are rolling out parental controls. Snapchat has introduced a parent guide detailing how their teens use the app, including who they’ve been talking to within the last week. Facebook’s Safety Center provides parents with articles and advice from leading experts.
The upcoming Online Safety Bill goes further, demanding that social media companies mitigate risks or face sanctions. If media regulator Ofcom deems a site inappropriate for children, platforms must block minors from access. The question of how to verify age remains unanswered.
Utah: Giving Parents Access to Their Child’s Social Media
These measures fail to satisfy Utah legislators. Under the state’s new SB 287, parents must give permission before any child under 18 can open a social media account. Even after granting permission, parents retain access to the child’s posts and messages.
The law also requires social media companies to impose a youth curfew between 10:30 p.m. and 6:30 a.m. Unlike in the UK, it is up to the companies themselves to verify age, and children will find it difficult to lie: companies will be required to collect a driver’s license or other evidence of a subscriber’s age. Utah’s Consumer Protection Division will be responsible for developing the rules.
Social media companies are furious. They are expected to sue before the law takes effect in March 2024.
France: Creating a Digital Majority at 15
Under a law now under consideration, France is following the Utah path. The law would set a digital majority at 15: below that age, social media platforms must put an age verification method in place and obtain parental consent. Media regulator ARCOM will be in charge of certifying the age verification methods.
But unlike in Utah, the French have built privacy protections into their approach. Data regulator CNIL is weighing solutions that verify age while preventing the party certifying age from knowing which service requires the certification. In addition, websites asking for age certification should not learn the identity of the potential subscriber.
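This “double anonymity” principle can be sketched in miniature: an authority attests only that the holder is over 15, the token carries no identity, and the website checks it offline so the authority never learns where it was used. Everything below is an illustrative assumption, not CNIL’s actual design; the token format and function names are invented, and a shared-key HMAC stands in for the asymmetric or blind signatures a real deployment would need so that sites hold only a public verification key.

```python
# Hypothetical sketch of a double-anonymity age token (NOT CNIL's real scheme).
# Assumptions: the authority verifies age out of band; HMAC stands in for a
# real asymmetric/blind signature, so the "verify key" here equals the signing
# key -- acceptable for a sketch, not for production.
import hashlib
import hmac
import secrets

AUTHORITY_KEY = secrets.token_bytes(32)  # held by the age-verification authority


def issue_age_token() -> dict:
    """Authority issues a token saying only 'over 15'.

    It never learns which website will consume the token, and the token
    contains no name, birthdate, or ID document.
    """
    nonce = secrets.token_hex(16)  # random value to prevent replay/linking
    claim = f"over15:{nonce}".encode()
    sig = hmac.new(AUTHORITY_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": "over15", "nonce": nonce, "sig": sig}


def site_accepts(token: dict, verify_key: bytes) -> bool:
    """Website checks the token offline: it learns the visitor is over 15
    and nothing else, and the authority never sees the request."""
    claim = f"{token['claim']}:{token['nonce']}".encode()
    expected = hmac.new(verify_key, claim, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])


token = issue_age_token()
assert site_accepts(token, AUTHORITY_KEY)  # valid token is accepted
```

The point of the design is the separation of knowledge: the certifier sees the person but not the destination, and the destination sees the claim but not the person.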
Other ideas have emerged to verify a child’s age. A French law passed in 2022 asks device manufacturers to include a parental control tool in their hardware. The European Union has documented various methods in its Code of Practice on age-appropriate design.
Thorny Questions Remain Unanswered
Age verification methods raise two main concerns, one philosophical and one practical. The Electronic Frontier Foundation fears that Utah’s law not only violates children’s privacy but will also curb free speech. And despite the new identity checks, minors could use VPNs to appear to connect from jurisdictions without child safety legislation, circumventing the age restrictions.
Social media access for children will be regulated. How the new rules will work, and whether they will prove effective, remain unanswered and controversial questions, pitting two fundamental values against each other: child safety and Internet freedom.
Théophile Lenoir is a researcher working on disinformation and new media. He ran a research program at Institut Montaigne for four years, where he is now an associate researcher. He is a Ph.D. student at the University of Milan and an affiliate at the Sciences Po Medialab.
Bandwidth is CEPA’s online journal dedicated to advancing transatlantic cooperation on tech policy. All opinions are those of the author and do not necessarily represent the position or views of the institutions they represent or the Center for European Policy Analysis.