Scroll, like, repeat: How the EU plans to boost children's protection online

Saturday, April 25, 2026 | 6:59 AM WIB

Ninety-three percent of European Union citizens say they are worried about children's mental well-being, and 92 percent identify cyberbullying as the greatest online threat, according to the State of the Digital Decade Eurobarometer 2025. In response to these concerns, Brussels has signaled a stricter approach to protecting children online.

European Commission President Ursula von der Leyen recently unveiled plans for a new "age verification app," which she said is "technically ready and soon available for citizens to use." The system is designed to let users confirm their age when accessing online platforms without disclosing sensitive personal data. Meanwhile, several EU member states are already acting: France has legislated a ban for users under the age of 15, and Spain, Austria, Greece, Ireland, Denmark, and the Netherlands are preparing similar rules.

Not all stakeholders are satisfied with the pace of the Commission's actions, however. Christel Schaldemose, the Member of the European Parliament drafting a report on a unified EU age limit for social media, is frustrated by what she sees as the Commission's hesitation. "I don't know if they're delaying [actions] on purpose, but I think that they are too slow," she remarked. "Like this we end up with a fragmented internal market because so many countries have already suggested an age limit."

The Pervasive Influence of Social Media on Young Lives

Social media has become omnipresent in young people's lives, yet it presents real risks for children: addictive design elements, constant connectivity, hyper-personalization, and sophisticated AI tools. A 2025 Joint Research Centre (JRC) study found that in 2022, 96 percent of 15-year-olds were active on social media, and 37 percent spent more than three hours a day on these platforms. The study also found a gender gap, with 42 percent of female teens in that heavy-use group compared with 32 percent of males.

Excessive use is widespread: among children aged 9 to 15, daily social media engagement frequently exceeds three hours, and 78 percent of teenagers aged 13 to 17 admit to checking their devices at least once an hour. A quarter of these young people report struggling with dysfunctional internet habits, according to a November 2025 report from the European Parliament. The Eurobarometer survey likewise found that in 2025, nearly 99 percent of teens aged 16-17 were active on social media, creating profiles, posting content, and using services such as Facebook and X.

The Growing Mental Health Crisis Linked to Digital Overload

Unchecked immersion in social media is taking a toll on children's mental health. The JRC warns that heavy use can worsen depression and anxiety, and that constant exposure to harmful content, including violent imagery, sexualized material, and pro-eating-disorder posts, directly threatens children's developing brains and social behavior. In the JRC study, 60 percent of young females exhibited symptoms of depression, compared to 35 percent of males, and 65 percent experienced anxiety, versus 41 percent of males.

The advertising-driven business models of many internet platforms, often designed with adult users in mind, have profound and detrimental repercussions for younger users, fostering a sense of dependence. A 2024 World Health Organisation (WHO) report indicated that 36 percent of adolescents across Europe, Central Asia, and Canada maintain constant contact with others via social media. Furthermore, 11 percent of these adolescents demonstrate problematic social media use, with girls (13 percent) reporting higher rates than boys (9 percent).

National Bans: A Growing Trend in Europe

Recognizing the escalating crisis, several EU member states are taking a firm stance by implementing national bans on social media access for minors. Greece, for example, announced a ban for children under 15, set to take effect in January 2027, pending parliamentary approval. This legislation aims to prevent minors from creating social media accounts and obligates platforms to enforce strict age verification, with financial penalties for non-compliance. This decisive action was reportedly spurred by data showing that 75 percent of Greek primary school children were active on social media, and approximately 48 percent of teenagers reported negative mental health effects. Public support for such measures surged following a US verdict in March 2026 that held major tech platforms accountable for addictive app design.

Greece's move follows similar initiatives across the continent. France approved a bill in January 2026 to ban social media for those under 15, citing a "health emergency" and the imperative to protect minors from cyberbullying and psychological harm. In February 2026, Spain announced its intention to implement an under-16s ban, aiming to "tame the digital Wild West." Austria, Denmark, and Slovenia are also in the process of drafting bans for individuals under the ages of 14, 15, and 15, respectively. Italy and Ireland are exploring similar restrictions for under-15s and under-16s. Germany and other nations are actively debating age limits or the development of "youth versions" of existing platforms. These national efforts are motivated by a surge in mental health issues among young people and a growing desire to hold tech giants accountable for the addictive nature of their platform designs, drawing inspiration from Australia's pioneering under-16 ban enacted in 2025.

Self-reported birthdates have proven insufficient for age verification. To address this, the European Commission's new age verification app is intended to serve as a technical framework that can be integrated into national digital wallets or standalone applications. As President von der Leyen explained, "the app will allow users to prove their age when accessing online platforms, just like shops ask for proof of age for people buying [alcohol]." While national regulators will be responsible for enforcing compliance through oversight and penalties, EU-wide regulations like the Digital Services Act (DSA) and the General Data Protection Regulation (GDPR) provide a foundational layer of protection for minors. National bans, however, aim to go further by establishing strict age limits and increasing accountability for technology companies.

Diverse Perspectives on Regulation

Despite the growing momentum for stricter regulations, there are dissenting voices. Some political figures, such as members of Spain’s Vox party and Italian lawmakers, argue that these bans represent an excessive governmental overreach. They contend that education, parental guidance, and enhanced digital literacy would be more effective than outright prohibitions. This perspective is echoed by consumer rights advocates. Olivia Brown, Policy Officer at the global consumer group Euroconsumers, views blanket bans as a "political shortcut that let platforms off the hook." She elaborates, "Banning social media doesn't make the internet safer. It just moves the problem out of sight. What minors need is safety built into platforms by design, real user controls, and algorithms they can shape themselves, not doors that are simply slammed shut, only to swing wide open the moment they turn 18."

Towards Harmonized EU-Wide Regulation

The issue of online child safety is politically charged, and a unified EU-wide ban could potentially exacerbate existing divisions. Consequently, the Commission's initial strategy involves releasing the age-verification app as a tool to empower member states in implementing their own national measures.

The age verification app, first conceived in 2025, is designed as a flexible technical framework. Its integration into national digital wallets or separate applications would enable users to verify their ages. The verification process typically involves downloading an app, granting consent for data usage, scanning an identity document (including its chip), and undergoing facial recognition. This process might need to be repeated regularly, and platforms could mandate verification for each access to age-restricted services.

However, concerns have been raised regarding the app's complexity, potential privacy implications, and the ease with which it could be circumvented, for instance, through VPNs. There are also fears that it might inadvertently shift responsibility away from the platforms themselves, a concern that contrasts with the broader aims of other EU-wide regulatory tools.

Existing EU legislation already provides a robust framework for protecting children online. The GDPR, in force since 2018, established stringent rules on children's data, setting the default age of digital consent at 16 (which member states may lower to 13) and requiring parental approval for younger users. The revised Audiovisual Media Services Directive, effective since 2020, mandates age-rating systems and parental controls on streaming platforms, alongside strict prohibitions on harmful content such as child exploitation material. In 2021, the EU launched a comprehensive child online safety strategy, incorporating funding, research, and voluntary codes of conduct to address risks like grooming and disinformation.

More recent proposals from the Commission include practical measures such as private-by-default accounts for minors and limitations on addictive features like autoplay and infinite scrolling. Parts of the AI Act, in effect since February 2025, specifically prohibit AI systems that employ subliminal techniques or exploit children's vulnerabilities to manipulate their behavior. The Digital Fairness Act, anticipated for formal proposal in late 2026, aims to further tighten platform design rules by banning "dark patterns" and addictive features like infinite scrolling.

At the core of this regulatory landscape is the Digital Services Act (DSA), a landmark piece of legislation designed to modernize the oversight of online platforms. Proposed in 2020, agreed upon by the European Parliament and Council in 2022, and fully implemented in February 2024, the DSA places a strong emphasis on protecting users, with minors as a priority. This includes mandating safer default settings, enhanced content moderation, and restrictions on targeted advertising. The DSA also establishes a new enforcement system involving national Digital Services Coordinators and EU-level oversight.

Since the implementation of the DSA, EU citizens have witnessed increased transparency, strengthened user rights, and significant limitations on harmful or exploitative practices. Users now have clearer channels for reporting and appealing content moderation decisions, while minors benefit from more stringent privacy protections and reduced exposure to targeted advertisements.

The Economic and Operational Impact on Digital Platforms

Age restrictions can significantly shrink platforms' reach among teenagers, a demographic that drives a large share of online activity, cutting ad impressions and traffic revenue. Because these platforms rely heavily on young users for advertising income, age limits directly reduce the pool of users available for targeted advertising.

Furthermore, compliance with these new regulations can lead to increased costs for companies. They are compelled to invest in more sophisticated age-assurance systems and parental-consent processes, which are often complex and expensive due to the necessity of advanced identity verification and robust data protection technologies. The prohibition of addictive design features and engagement algorithms will likely necessitate significant product redesign, leading to higher engineering costs and potentially delaying market launches within the EU. A shift towards prioritizing safer content may also strain operational budgets.

According to Schaldemose, large companies have a responsibility to develop "new platforms with a completely different business model that protects children." Beyond operational costs, companies also face heightened legal risks in the event of non-compliance. The European Parliament has proposed holding platform owners personally accountable for serious and repeated violations of provisions designed to protect minors. "They're the ones who make the platforms available," Schaldemose emphasized. "If we agree on an age limit, the responsibility is definitely on the companies in case of violations."

The Urgency for European Action

Christel Schaldemose has expressed growing impatience with the pace of the European Commission's response. She noted that while the panel on child safety online was announced in September, it only commenced its work in March. "The longer it takes for the Commission to come up with a proposal, the more likely it is we have a fragmented market and loopholes," she warned, adding, "I have become impatient with the Commission. It looks like member states are also a bit impatient because they are also pushing."

Schaldemose firmly believes that privacy and data-sharing concerns can no longer serve as valid excuses for inaction. "In the last two years, we have developed tools that do not compromise personal data and safety," she stated. The European Parliament remains committed to advocating for a swift and effective solution. "We need to act at the European level, and the Parliament is clear on this," Schaldemose concluded, underscoring the collective resolve to prioritize the safety and well-being of children in the digital age.
