The European Union has formally questioned major tech companies Apple, Google, Snapchat, and YouTube about their efforts to protect children online under the Digital Services Act (DSA), the bloc’s comprehensive online safety legislation. Regulators are seeking detailed explanations of how these platforms enforce age restrictions, control harmful content, and manage recommendation algorithms that may expose minors to inappropriate material.
Platforms Under the Microscope
Snapchat faces inquiries about its methods for blocking users under the age of 13 and preventing illegal transactions involving drugs and vapes. The company stated it is “deeply committed” to safety and pledged full cooperation with the regulators.
Apple and Google have been asked to clarify how their app stores prevent children from downloading dangerous or illegal apps, including those related to gambling and nudity.
YouTube, a Google subsidiary, is being pressed over its recommendation system, with regulators questioning how harmful content still reaches minors despite parental control features. Google maintains that it “already uses strong protections” and plans to further enhance these safeguards.
EU Moves Toward Stricter Child Safety Regulations
The inquiries align with increasing political momentum within the EU for introducing a “digital majority age.” Twenty-five member countries, plus Norway and Iceland, support the concept. Denmark has announced plans to ban social media usage for children under 15, while France is considering similar restrictions.
European Commission President Ursula von der Leyen has publicly backed age-based online limits, with an expert panel set to review additional measures soon.
For now, these actions are formal requests for information, but they could escalate into investigations and fines if the companies fail to meet EU standards. Some U.S.-based tech firms have criticized the DSA as excessive regulation.
Nonetheless, the EU remains firm: safeguarding children’s online experiences is a top priority and “non-negotiable.”