EU pushes stricter children's social media rules across bloc
Amid rising concerns over youth mental health, EU ministers are backing new rules to protect children online, focusing on algorithmic risks, harmful content, and stricter access controls for social media platforms.
Students work on a laptop computer at Stonewall Elementary in Lexington, Kentucky, Feb. 6, 2023. (AP)
Amid growing concern over harmful online content and the impact of social media on young users, the European Union is considering sweeping new regulations to strengthen child safety on digital platforms. The initiative aims to introduce a unified digital age of adulthood across the bloc, requiring minors to obtain parental consent before accessing social media.
The push comes as part of a broader effort to tighten EU children's social media regulations, with Greece leading the charge, supported by France and Spain. Greek Digital Minister Dimitris Papastergiou announced that the proposal would be formally presented to EU ministers in Luxembourg on Friday, urging the bloc to move “so that Europe can take the appropriate action as soon as possible.”
At the heart of the plan is a proposal to standardize the digital age of adulthood throughout the EU. While current regulations vary between member states, the new initiative would ensure consistent safeguards for minors across all platforms. Since its introduction last month, the plan has gained traction with backing from Cyprus and Denmark, both of which plan to prioritize the issue during their upcoming EU presidencies.
Parental consent and age checks at the center of reform
The proposed changes would make parental consent a legal requirement for underage users, echoing national measures already adopted by some member states.
France, for instance, passed legislation in 2023 mandating parental consent for users under 15, though the law still awaits EU-level approval. Paris has also introduced mandatory age verification for adult websites, prompting several platforms to withdraw access in protest.
To support these efforts, the European Commission plans to launch an EU age-verification app next month, designed to confirm users’ ages while safeguarding personal data. Additionally, draft guidelines suggest defaulting children's accounts to private settings and enabling stronger parental controls.
Nations push back against harmful online content
Authorities cite a growing body of evidence linking harmful online content, including diet fads, disinformation, and cyberbullying, to rising rates of anxiety, depression, and low self-esteem among children. This week, under pressure from French regulators, TikTok banned the "#SkinnyTok" hashtag, which critics say promoted dangerous body image standards.
Greece emphasized that the proposal also addresses excessive screen time, noting its potential to hinder cognitive and social development in young users. The plan criticizes social media algorithms for amplifying addictive and damaging content, particularly for minors.
The EU is ramping up enforcement under its Digital Services Act, with ongoing investigations into how platforms like Facebook, Instagram, and TikTok verify users’ ages and protect children. A separate probe launched last week targets four pornographic websites suspected of allowing minors to access explicit content.
Meanwhile, negotiations continue over a controversial law aimed at combating child sexual abuse material online. While the proposal seeks to enhance digital safety, it remains stalled due to privacy concerns, particularly over encrypted messaging services.
Read more: Half of UK youth would give up the internet: Survey