Meta told it's violating EU law by not doing enough to keep children off Facebook and Instagram

The European Commission said Wednesday that it has preliminarily found Meta in violation of the EU's Digital Services Act.

In its preliminary investigation, the commission said the company is failing to keep under-13s off its platforms.

Meta has faced heightened scrutiny over its handling of child safety on its platforms this year.

The European Commission has found that Meta breached EU law by failing to prevent under-13s from accessing its platforms, as scrutiny of the tech giant’s handling of child safety intensifies.

The commission said Wednesday that its preliminary investigation concluded Meta violated the EU's Digital Services Act because the minimum age requirement of 13 for Instagram and Facebook is not adequately enforced.

When creating an account, minors can input a false birth date, with no controls in place to verify it, the commission said.

Additionally, the tool for reporting a minor’s account is “difficult to use” and requires up to seven clicks to access the form, the commission said. Even when a minor’s account is reported, the commission found that there are often no adequate follow-ups or measures to get them off the platform.

“The Commission considers that Instagram and Facebook must change their risk assessment methodology, to evaluate which risks arise on Instagram and Facebook in the European Union, and how they manifest,” the commission said in its announcement.

A Meta spokesperson told CNBC: “We disagree with these preliminary findings. We’re clear that Instagram and Facebook are intended for the public aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age.

“We continue to invest in technologies to find and remove underage users and will have more to share next week about additional measures rolling out soon. Understanding age is an industry-wide challenge, which requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this crucial issue.”

Meta can now review the Commission’s preliminary investigation findings and respond in writing. If the commission’s findings are confirmed by its final investigation, it can fine Meta up to 6% of its total worldwide annual turnover.

This comes after two high-profile U.S. court rulings in March: one found that aspects of Meta’s platform design contributed to addiction and mental health harms among teenagers, while the other concluded that the company misled users about children’s safety on its platforms.

Meanwhile, a blanket social media ban for under-16s is gaining traction with governments worldwide, after Australia became the first country to implement such a ban.

The U.K., Spain, and France are among the countries considering legislation to keep teens under 16 off social media.

In March, U.K. regulators also urged social media giants, including YouTube, TikTok, Snapchat, Instagram, and Facebook, to enforce stricter protections for children on their platforms.

The Information Commissioner’s Office said the platforms need to implement better age verification technologies rather than relying on “self declaration,” which is “easily circumvented.” This could include facial age estimation, digital ID, or one-time photo matching.

“With ever-growing public concern, the status quo is not working, and industry must do more to protect children. You should act now to identify and implement current viable technologies to prevent children under your minimum age from accessing your service,” ICO CEO Paul Arnold said in a letter at the time.
