The European Commission has initiated formal proceedings against Meta, the provider of Facebook and Instagram, under the Digital Services Act (DSA). The investigation focuses on potential violations related to the protection of minors on these platforms.
The Commission's concerns include:
- The potential for Facebook and Instagram's systems, including their algorithms, to stimulate behavioural addictions in children and create 'rabbit-hole effects'.
- The adequacy of Meta's age-assurance and verification methods.
The proceedings will address:
- Meta's compliance with DSA obligations on the assessment and mitigation of risks arising from the design of Facebook's and Instagram's online interfaces, which may exploit the weaknesses and inexperience of minors, cause addictive behaviour, and reinforce the 'rabbit-hole' effect.
- Meta's compliance with DSA requirements on mitigation measures to prevent minors' access to inappropriate content, particularly the effectiveness of Meta's age-verification tools.
- Meta's compliance with DSA obligations to ensure a high level of privacy, safety, and security for minors, particularly with regard to the default privacy settings for minors as part of the design and functioning of its recommender systems.
If proven, these failures would constitute infringements of Articles 28, 34, and 35 of the DSA. The Commission will now conduct an in-depth investigation and continue to gather evidence. The opening of formal proceedings empowers the Commission to take further enforcement steps.
Facebook and Instagram were designated as Very Large Online Platforms (VLOPs) under the EU's DSA, as each has more than 45 million monthly active users in the EU. As VLOPs, they must comply with a series of obligations set out in the DSA. The Commission had already opened formal proceedings against Meta on 30 April 2024 concerning other issues.