If We Ban Social Media for Children, What Do We Offer Instead?

This essay was originally published on my Substack, Digital Serendipities, where I regularly share reflections on technology, society, and human connection. You are warmly welcome to subscribe for future essays.

A new policy practice is spreading rapidly around the world: protecting children by banning them from social media. It began with Australia, whose landmark law restricting social media access for users under sixteen took effect in December 2025, marking one of the world’s most ambitious attempts to regulate young people’s digital lives. What initially appeared to be a domestic policy experiment has quickly evolved into something larger: a regulatory template now being watched, adapted, and debated across regions.

Across Asia, momentum is building. In March 2026, Indonesia announced restrictions for children under sixteen on what it classifies as “high-risk platforms,” including TikTok, Facebook, Instagram, and Roblox, becoming the first Southeast Asian country to move at this scale. Malaysia has signaled similar intentions.

Across Europe, the momentum is equally visible: France has advanced restrictions for under-15s, Germany’s governing coalition is openly debating age thresholds of 14 to 16, Greece has announced plans for a nationwide under-15 ban beginning in 2027, and the European Union is piloting bloc-wide age verification infrastructure.

In the United States, states such as Utah and Arkansas have already moved toward age-based restrictions and parental consent regimes, signaling that similar debates are increasingly entering the American mainstream.

Yet this global turn is unfolding across profoundly unequal digital contexts, where access, parental literacy, institutional capacity, and online opportunity vary dramatically between and within societies.

The impulse itself is understandable. Parents are worried, and often with good reason. Digital platforms increasingly function not merely as tools children use, but as developmental environments that shape attention, identity, social belonging, aspiration, and self-worth. When these systems are designed primarily around engagement metrics, algorithmic amplification, and data extraction, children are not simply participating in digital life; they are growing up within architectures optimized to capture and hold attention. In such environments, visibility becomes currency, belonging becomes performative, and the boundary between connection and dependency grows increasingly fragile.

But that very impulse is also why the current debate risks becoming too narrow.

Policy discussions have become increasingly focused on restriction: how to limit access, enforce age thresholds, and reduce exposure to harmful content. Far less attention is being given to a more fundamental question, one that goes to the heart of youth development in digital societies:

If we remove one infrastructure of youth connection, what do we offer in its place?

Photo: Oodi Central Library, Helsinki (author’s photo)

Displacement Is Not Protection

Young people do not disappear from digital life when mainstream platforms are restricted. They adapt, quickly and creatively, often in ways adults do not anticipate.

We have already seen this pattern emerge in everyday practice. Teenagers repurpose collaborative tools such as Google Docs into informal social spaces during class, using shared documents as backchannel chatrooms invisible to traditional forms of monitoring. Others migrate toward gaming ecosystems like Roblox, private messaging groups, encrypted networks, or smaller hybrid platforms where social interaction remains possible, but visibility to parents, educators, and regulators becomes significantly lower.

Connection does not disappear. It relocates. Yet relocation is not neutral.

For some young people, particularly those with strong offline social networks, supportive families, and access to multiple forms of social participation, alternative spaces are relatively easy to find. For others, especially vulnerable children, socially isolated youth, and those whose communities exist primarily online, displacement may deepen exclusion or push them toward less safe digital environments where oversight is weaker, moderation is inconsistent, and risks become harder to identify.

For many young people, digital spaces are not simply entertainment, but places of belonging. For marginalized youth, including those living in rural isolation, young people in migrant communities, disabled youth, and LGBTQ+ young people who may lack safe peer communities offline, digital platforms often function as important spaces of recognition, support, and identity formation. Restriction without replacement risks removing not only exposure to harm, but access to meaningful forms of connection.

From a policy perspective, this matters because harm does not necessarily diminish when visibility declines. It may simply become harder to see, harder to measure, and harder to govern.

Restriction alone should not be mistaken for comprehensive protection.

What Should Replace It?

This is where the debate becomes more fundamental.

Children and adolescents are not seeking platforms in the abstract. They are seeking friendship, belonging, recognition, experimentation, and shared social life, all central dimensions of healthy development. These needs do not disappear when regulation changes. They remain, and young people will continue to seek ways of meeting them, online or offline, formal or informal, safe or unsafe.

Meeting those needs well cannot stop at restriction. It means investing in digital literacy not as a one-off curriculum module or isolated awareness campaign, but as a lifelong developmental capacity cultivated across the entire ecosystem of care: children learning how digital systems shape attention, identity, and behavior; parents building the confidence and literacy to guide rather than simply monitor; educators developing the tools to support healthy digital habits in and beyond the classroom; and trusted adults becoming active participants in children’s digital lives, rather than passive observers of them. It also means stronger platform accountability, greater transparency around recommender systems, and child-centered design standards that prioritize developmental well-being over engagement optimization.

It also requires asking a question that has largely disappeared from public life: where are the third places for young people?

Where are the spaces, digital and physical, where young people can gather, socialize, and form identity in environments designed around developmental needs rather than engagement optimization? In many societies, such spaces are shrinking, inaccessible, or were never meaningfully built in the first place. As offline social infrastructures weaken, digital platforms increasingly become default spaces of belonging, however imperfect or exploitative they may be.

If society removes one infrastructure of connection without rebuilding another, young people will create their own. The question is whether those spaces will be healthy, inclusive, and safe, or simply less visible to adults.

Perhaps most importantly, governments should treat social media restrictions as major social interventions that require rigorous evidence before, during, and after implementation. We need longitudinal research on behavioral adaptation, circumvention practices, parental mediation, well-being effects, and the uneven impacts these policies may have across different groups of young people. Policy should not only ask whether restrictions reduce exposure to harm, but where risk moves, who becomes more vulnerable, and what forms of connection are lost or rebuilt in the process. As this policy landscape evolves, robust evidence will be essential, not only to measure what restrictions reduce, but to understand what they displace, whom they protect, and whom they may leave behind.

Children do not need less connection. They need better places to find it: spaces designed not around extraction and addiction, but around dignity, agency, creativity, belonging, and human flourishing.


Danica Radovanović is a writer and digital society scholar working at the intersection of technology, digital inequality, and child online protection across research, policy, and practice.
