Kids are addicted to social media. No one can agree on a solution.
While the EU takes a stab at regulating big platforms, countries and companies bicker over who's responsible for protective measures.
Children and teens can’t stop scrolling — and it’s hurting their health.
The time young people spend on social media networks has more than doubled since 2010 to around three hours a day.
More than 1 in 10 teenagers showed signs of problematic and addictive social media use in 2022 — including struggling to control their use and experiencing withdrawal, according to the World Health Organization.
“Everybody knows it’s addictive,” said Hanna Kuźmitowicz, a Polish high school student who worked with the Polish presidency of the EU on this topic. “I know the dangers, the benefits,” she told POLITICO. “I still use it.”
At the urging of public health experts, European governments are considering new ways to keep youth off their phones through age verification policies, public awareness campaigns and even social media bans.
Countries have the freedom to set their own restrictions, and they’re running with it. President Emmanuel Macron is calling for an outright ban on under-15s in France, while Denmark, Greece, Spain, Italy, the Netherlands and others have rallied around new restrictions.
Meanwhile, tech companies are rolling out their own measures, such as age-specific content restrictions, disabling certain features for younger users and adding privacy protections, though some argue these fall short and the right way forward remains undecided.
Passing the buck
Some experts argue that social media isn’t all bad and can offer benefits to young people.
“Certain sorts of technology [were] actually quite good for their friendship formation and friendship closeness,” said Jessica Piotrowski, chair of the University of Amsterdam’s School of Communication Research and an adviser to YouTube on the protection of minors, echoing various studies.
However, growing evidence links social media use to decreased well-being, including depression and sleep disorders, along with higher levels of substance use — harms that, experts say, can no longer be ignored.
“There needs to be regulation” and “some kind of reckoning for the tech companies that you are harming teens and children, and something has to be done,” said Kadri Soova, director of Mental Health Europe.
She also believes it’s important to maintain dialogue with tech companies rather than treat them as adversaries. “But if there is no self-regulation, or the terms of regulation are not deep enough, then there needs to be rules.”
Numerous scandals over the past few years have shown that tech companies have not always taken a safety-first approach for their underage users. In 2021, former Meta (then Facebook) employee Frances Haugen leaked internal documents that revealed the company was aware of the harm being done to teens’ mental health and did little to stop it.
Health experts argue that existing regulatory tools are not enough. They want more action from tech companies, which they say design their platforms to be addictive.
Theo Compernolle, a neuropsychiatrist and former professor at the Free University of Amsterdam who advocates a social media ban for children, said regulation must focus on the companies; otherwise, “It’s like fighting a drug while not doing anything about the producers.”
Social media, like gambling, tobacco and alcohol, “depends upon the denial of harms,” said Mark Petticrew, a professor of public health at the London School of Hygiene & Tropical Medicine. It’s no different from any other type of addiction, he added.
In June, health ministers adopted conclusions at the Council of the EU calling on countries to consider preventive policies to regulate young people’s access to digital technologies. The conclusions suggest screen-free zones and digital limits in schools, and urge digital platform designers to “take more responsibility.”
Landmark law?
One of the most significant pieces of legislation on online platforms is the European Union’s Digital Services Act. It calls on social media platforms to put in place “appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors.”
Meta’s Facebook and Instagram, as well as TikTok, are under investigation for breaches of the DSA’s rules for minors.
Because the landmark law left platforms’ responsibilities vague, the EU executive drew up a set of highly contested guidelines to spell out what they should do.
They include not using minors’ browsing habits to suggest content, turning off the streaks and read receipts features in messaging applications, setting privacy and security by default in settings, and considering shutting off some features like camera access.
The guidelines are not binding, and they have no effect if minors lie about their age or their parents circumvent the controls. That has shifted the debate toward how platforms can verify the age of their users.
Scrap over age checks
Under Europe’s General Data Protection Regulation, children under 13 cannot consent to having their data processed for online services. Platforms like TikTok and Instagram state in their terms of service that users must be at least 13 to join, but regulators have woken up to the fact that merely ticking a box doesn’t work.
As many as 94 percent of Danish kids have social media accounts before they turn 13, according to a report by local nonprofit Børns Vilkår cited in a government-commissioned study.
The policy debate has shifted toward mandating age checks to ensure that other protections are effective. Some argue this is the platforms’ responsibility, but operators like Meta and TikTok contend that Google and Apple, which develop device operating systems, should be responsible for age verification.
Meta’s director of public policy, products and monetization, Helen Charles, said that new legislation should target age verification and parental approval at the operating system and app store level. That “will be easier on parents” in a “privacy protective way,” she said.
But Google and Apple don’t think it should be just up to them.
“We think it’s a shared responsibility … There’s no single bullet or silver bullet to say: This one company, solve it for everyone,” said Vinay Goel, director of age assurance at Google. “Developers are best situated to know what is potentially risky.”
The ban debate
There’s doubt, even among the staunchest supporters of strict action and teenagers themselves, that a ban would be effective.
“Well-enforced age verification, parental tools and digital literacy programs,” for example, might achieve better outcomes than bans, said WHO Health Security Director Natasha Azzopardi-Muscat.
Others, including Kuźmitowicz, are concerned that there are always ways to circumvent bans and restrictions, rendering them ineffective.
Meanwhile, health ministers believe there is currently insufficient evidence to support a total ban.
“How do you enforce that?” said Cyprus’ Health Minister Michael Damianos. The “bigger issue” is making sure policies work in practice.
A social media ban “is really and truly walking into the unknown. Such a policy is not backed by evidence,” said Malta’s Health Minister Jo Etienne Abela. “But on the other hand … we know there is a problem, should lack of evidence cripple us and freeze us, and we do nothing about it?”
Giedre Peseckyte contributed to this report.