German Parties Back Social Media Ban, Legal Hurdles Loom

Child & adolescent health | 25/02/2026 • Felix Sassmannshausen

Germany's governing coalition parties CDU and SPD under Chancellor Friedrich Merz (bottom right) back a social media ban for minors.

Following the lead of Australia, France, and Spain, Germany's governing parties have found common ground on a proposal to ban social media for children. But as politicians prepare strict age restrictions, experts warn that significant legal roadblocks and data protection concerns could stall the legislation.

At a recent party conference in Stuttgart, the centre-right Christian Democratic Union (CDU) voted to push for a ban on social media for children under 14. To enforce the restriction, the proposal suggests heavy fines for platforms that fail to maintain robust age-verification systems.

Days earlier, the Social Democratic Party (SPD) tabled a similar proposal for a complete ban up to age 14. For adolescents up to 16, the party is demanding a mandatory "youth version" of platforms, stripped of addictive algorithms, infinite scrolling, and intrusive push notifications.

These domestic efforts mirror a global legal crackdown on "addictive by design" features, headlined by Australia becoming the first country to enforce a national ban for children. The pressure is mounting on multiple fronts: while Meta defends itself against charges in Los Angeles, the EU is simultaneously investigating TikTok under its Digital Services Act (DSA) over similar allegations of addictive design.

Experts warn of addictive online platforms

Professor Christian Montag warns that social media features are designed to manipulate the reward systems of the developing brain.
Proponents of a ban point to skyrocketing screen time and the deterioration of youth mental health as the primary problems to be solved. Professor Christian Montag, a researcher in cognitive and brain sciences at the University of Macau (China), confirms that specific platform features are designed to manipulate the brain. "If one gets many likes for their own post compared to few likes, this relatively clearly triggers the reward system of the brain, including the ventral striatum," which can explain habit formation. The ventral striatum, including the nucleus accumbens, is a key brain region for reward processing, motivation, and emotional regulation.

Montag identifies a bouquet of interconnected issues affecting minors online. He points to the debates around the "displacement hypothesis," which suggests that the sheer volume of time extracted by the platforms' data-driven business models leaves children with too little time for crucial developmental experiences, such as physical play outdoors or offline social interaction. Furthermore, minors are regularly exposed to age-inappropriate content, cyberbullying that scales far beyond the school yard, and algorithmic feeds that promote unrealistic beauty ideals. The latter correlate with body dissatisfaction and eating disorders, Montag said.

However, researchers face a major difficulty in proving definitive causality: many studies in the field are cross-sectional, and tech giants have largely closed off data access since the Cambridge Analytica scandal, preventing independent scientists from testing how specific algorithmic designs affect user behaviour. The company Cambridge Analytica had harvested the private data of 87 million Facebook users without consent.

Pushing ahead: Australia, Spain, and France

Data from Australia justifying the ban shows 96% of minors are active online, with 71% exposed to harmful content and one in seven experiencing grooming.
Germany's announcement comes after Australia enacted the world's first ban in December 2025, prohibiting children under 16 from holding accounts on 10 major platforms, including TikTok, Instagram, X, YouTube, and Snapchat. The Australian government justified the ban with alarming survey data: 96% of children aged 10-15 used social media, and 71% had been exposed to harmful content, including violent material and content promoting suicide. Alarmingly, one in seven children reported experiencing grooming-type behaviour from adults or older teens.

Under the Australian model, tech giants face fines of up to A$49.5 million for serious or repeated failures to keep children off their platforms, as the BBC reported. The early days of the ban, however, revealed significant workarounds, with teenagers rapidly turning to virtual private networks (VPNs) and fake birthdays to bypass local restrictions.

While experts say it is too early to determine the effectiveness of the Australian regulations, other countries are pursuing similar measures. The Spanish government has drafted a law to ban access for minors under 16, demanding effective age verification systems. Going further, Spanish Prime Minister Pedro Sánchez recently announced plans to hold tech CEOs criminally liable for failing to remove illegal content, and to make algorithmic manipulation a new criminal offence.

In France, a parliamentary inquiry has recommended banning children under 15 from social media and introducing a social media "curfew" for 15- to 18-year-olds, restricting them from accessing platforms during specific hours of the day, such as late at night.

European Union regulation slows national ambition

Media law expert Stephan Dreyer warns that the EU's Digital Services Act creates significant legal roadblocks for national laws.
Despite the political fervour in Berlin, Paris and Madrid, national legislatures in Europe face a massive legal roadblock: under the EU's Digital Services Act (DSA), platform regulation is harmonised across the continent. Stephan Dreyer, a media law expert from the Leibniz Institute for Media Research in Hamburg (Germany), points to the complexities of the legal framework: "European regulations are fully harmonising and contain no opening clauses for national laws. The individual member states cannot enact national regulations addressed to the same recipients with the same protective purpose."

Simply put, Germany cannot pass a law ordering TikTok or Meta to ban under-14-year-olds from their platforms. Any national attempt to do so would be legally inapplicable because it contradicts higher-ranking European law. To bypass the DSA, a state would have to pass laws targeting parents or criminalising the children themselves, alternatives Dreyer considers highly undesirable.

The European Commission has cautioned member states against overstepping their legal boundaries. Placing additional regulatory obligations directly on platforms is a "clear no-go," a Commission spokesperson said.

As the regulatory power lies with the EU, the Commission is planning a new initiative by summer. The upcoming European "Digital Fairness Act" is expected to introduce minimum age limits and age verification requirements for specific services at the EU level, potentially enforcing age checks at the app store rather than on the platforms themselves. Additionally, the current EU-level investigation into TikTok shows the bloc already has the legal tools to force platforms to change their addictive designs, rendering legally precarious national bans largely unnecessary.
Assuming that leading politicians know this, the national announcements from Germany, France and Spain are more likely strategic posturing: "These are attempts to rush ahead to create facts to increase the pressure on Brussels," Dreyer explained.

Data protection concerns form major hurdle to strict bans

Implementation hurdles range from biometric scanning and high-risk ID uploads to the proposed EU Digital Identity Wallet.

Even if a national ban were legally viable, enforcing it is currently a privacy nightmare, warns Professor Anja Lehmann, IT security expert at the Hasso Plattner Institute in Potsdam (Germany). Existing age verification methods, such as submitting ID photos or using AI to estimate age from face and voice scans, force users to surrender excessive personal data. And while secure, privacy-preserving solutions exist, such as the EUDI Wallet, which allows anonymous age verification, teenagers can still bypass these systems. "If age verification is only mandatory in certain countries, you can always use a VPN to trick the service into thinking you are in a country where there is currently no mandatory age verification," Lehmann explains.

Regulating platform design instead of bans

IT security expert Professor Anja Lehmann warns that current age verification methods are privacy-intrusive.

Instead of strict bans, experts advocate a different approach: regulating the platforms' design rather than banning the users. As the DSA already mandates a high level of privacy and safety for minors, the existing legislation could be used to force platforms to offer age-appropriate environments by disabling endless scrolling, autoplaying videos, and hyper-personalised feeds. According to research, the effects of social media depend heavily on whether a child is passively scrolling or actively engaging.
Additionally, experts point out that social media is a vital communication tool for young people to manage relationships and express opinions, making strict bans a draconian limitation on their societal participation. IT security expert Lehmann echoes this sentiment, arguing that banning children from social media treats the symptom while rewarding tech companies with a fresh trove of identity data. "I would strongly advocate tackling the fundamental problem, namely that we have algorithms and systems that are currently very harmful not only to children but to the whole of society," she concludes.

Image Credits: Deutscher Bundestag/Thomas Imo, Christian Montag, Felix Sassmannshausen, Stephan Dreyer, Anja Lehmann.