Regulators in the European Union have launched a formal enquiry into Meta for possible violations of child safety regulations pertaining to online content on its Facebook and Instagram platforms.
The European Commission expressed concern on Thursday that the popular social media platforms’ algorithms for recommending videos and posts can “exploit the weaknesses and inexperience” of minors and encourage “addictive behaviour.”
Its investigators will also examine whether these systems reinforce the “rabbit hole” effect, drawing users towards progressively more disturbing content.
The bloc’s executive arm said in a statement: “In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta.”
The probe’s legal basis is the Digital Services Act (DSA), a regulation that requires the world’s biggest internet companies to step up their efforts to protect European users online.
The DSA imposes strict rules to safeguard children and guarantee their security and privacy online.
The EU’s internal market commissioner, Thierry Breton, stated on X that the authorities were “not convinced that Meta has done enough to comply with the DSA duties – to minimise the risks of detrimental consequences to the physical and mental health of young Europeans on its platforms Facebook and Instagram”.
Meta released a statement saying, “We have spent a decade developing more than 50 tools and policies designed to protect young people online. Our goal is to ensure that they have safe and age-appropriate experiences.”
“This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission,” the US-headquartered tech giant added.
There is no deadline for the investigation to conclude. Infringements can lead to fines of up to 6% of a platform’s worldwide revenue or, in the case of severe and persistent violations, an outright ban.
Facebook and Instagram are among the 23 “very large” online platforms subject to the DSA; others include YouTube, TikTok, and Snapchat.
The bloc has launched a spate of investigations, including one into Meta last month over concerns that Facebook and Instagram were not doing enough to combat misinformation ahead of the June EU elections.
In February, the Commission opened an investigation into TikTok over concerns that the popular video-sharing app might not be doing enough to address harmful effects on young people.
In April, the EU also ordered TikTok to halt the rewards programme on its offshoot Lite app, citing concerns that the app’s “addictive” features could endanger users’ mental health.