BRUSSELS — European governments are increasingly restricting minors’ access to social media, a shift that could force platforms such as TikTok and Snapchat to change features designed to keep users online for longer.
Poland has become the latest European country to signal tighter restrictions on youth access to social media, with officials proposing measures that could ban users under 15 from holding accounts. Similar debates are underway in Italy, Germany, Spain, and France, where parental consent is already required for users under 15.
The proposals reflect a wider international trend. Australia last year introduced strict age-based limits on social media access for users under 16.
Concerns over children’s social media use have intensified following lawsuits in the United States accusing platforms of contributing to teenage self-harm and suicide. The European Commission has issued preliminary findings that TikTok may breach the Digital Services Act (DSA) through design features such as infinite scroll, autoplay, and personalized recommender systems that could encourage compulsive use among minors.
Catholic experts say the debate raises questions that go deeper than regulatory compliance.
Alessandro Calcagno, policy adviser for education and youth at the Commission of the Bishops’ Conferences of the European Union (COMECE), told EWTN News that excessive social media use risks trapping children in virtual environments and weakening the skills needed for a healthy emotional and relational life.
Annemie Dillen, professor of family and pastoral theology at KU Leuven, said the stakes extend beyond harm prevention.
“From a Christian ethical perspective, the question is not only how to prevent harm but how digital environments shape the development of children and their capacity for meaningful relationships. The goal should not only be safety but helping young people grow in freedom, responsibility, and authentic human connection,” she said.
Similar concerns have been raised by the COMECE Youth Net, a network of young Catholics that serves as a consultative body to the EU bishops’ commission and has warned that heavy use of digital media risks turning young people into “social hermits,” eroding real human relationships.
Design features under scrutiny
Experts say platform design lies at the heart of the problem: platforms compete in digital advertising markets where revenue depends on maximizing user engagement and time spent online.
Leanda Barrington-Leach, executive director of the child rights group 5Rights Foundation, told EWTN News that “children have been ignored or even exploited for too long on online platforms, leaving them exposed to harmful features, excessive data collection and design choices that were made to maximize their engagement and time spent.”
Dillen added that efforts to protect children online must also focus on how platforms are designed. “Priority should be given to platform design and the structural responsibility of companies, since many of the risks children face are linked to the way these platforms work,” she said.
Age assurance debate
Concerns about platform design have prompted debate over whether age verification alone can adequately limit children’s exposure to online harms.
Council of Europe Commissioner for Human Rights Michael O’Flaherty has urged governments to focus on regulating platforms rather than restricting children’s access to social media, warning that blanket access restrictions risk shifting responsibility from technology companies to young users.
Civil society groups also argue that age assurance should not be treated as a standalone solution. “Age assurance is not an end in itself,” Barrington-Leach said, adding that protecting children online requires addressing systemic risks in platform design rather than relying only on access restrictions such as age verification.
Fragmentation risk in the EU single market
Efforts to make platforms change design features could be slowed by EU rules under which online platforms are supervised by regulators in the country where they are headquartered, a principle known as the country-of-origin rule.
Cani Fernández, president of Spain’s National Commission on Markets and Competition, said this arrangement can delay action when harm affects users in other EU countries. Under the system, she said, other national regulators cannot intervene directly but must raise concerns with the European Commission if the authority overseeing a platform is slow to act.
Fernández said the commission is examining ways to speed up responses when risks to minors are involved.