European Union regulators have opened a formal investigation into Meta over potential violations of online content rules concerning child safety on its Facebook and Instagram platforms.
The European Commission has raised concerns that the algorithmic systems used by the two social media platforms may exploit the vulnerabilities of children and encourage addictive behaviour.
The investigation will also examine whether these systems contribute to a "rabbit hole" effect, steering users towards increasingly disturbing content.
Furthermore, the Commission is examining the age-verification methods implemented by Meta to ensure compliance with regulations.
The inquiry falls under the Digital Services Act (DSA), which requires major tech companies to step up their efforts to safeguard European users online, especially children.
The DSA includes stringent guidelines to protect children and uphold their privacy and security on digital platforms.
Thierry Breton, the EU’s internal market commissioner, expressed doubts about Meta’s compliance with DSA obligations in safeguarding the physical and mental well-being of young Europeans on Facebook and Instagram.
🚨 Today we open formal #DSA investigation against #Meta.
We are not convinced that Meta has done enough to comply with the DSA obligations — to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram.
— Thierry Breton (@ThierryBreton) May 16, 2024
Meta responded in a statement emphasising its commitment to providing safe online experiences for young people, and pointing to the many tools and policies it has developed over the years to protect them.
The United States-based tech giant acknowledged the challenges of ensuring a safe online environment for all users.
Article from www.aljazeera.com