Meta in trouble for content exposed to children 2023

Is Meta exposing children to harmful content?

Meta under scrutiny

Meta, the parent company of popular social media platforms like Facebook and Instagram, is under increasing scrutiny not just for alleged content moderation failures but also for questions surrounding the accuracy of its reporting and protection of users, especially younger ones.

A recent complaint filed by 33 states accuses Meta of consistently misrepresenting the effectiveness of its moderation teams through its Community Standards Enforcement Reports. These reports, according to the complaint, boast low rates of community standards violations while excluding crucial data from user experience surveys that reveal higher instances of user encounters with harmful content.

For instance, Meta claims that only 10 or 11 out of every 10,000 content views on its platforms contain hate speech. However, internal surveys, like the Tracking Reach of Integrity Problems Survey, paint a different picture, reporting an average of 19.3% of Instagram users and 17.6% of Facebook users witnessing hate speech or discrimination on the platforms.

Child safety neglected?

The complaint suggests that Meta may be leaning on averages to downplay such incidents, dividing a relatively small number of reports by its vast user base to produce a reassuringly low rate. User feedback, however, indicates far higher exposure to harmful content, revealing a stark contrast between the published figures and the actual user experience.

Moreover, the complaint alleges that the company, despite being aware of this discrepancy, has publicly presented alternative statistics to deflect scrutiny and create a false sense of safety within its apps. Perhaps more alarming, the complaint states that Meta has received over 1.1 million reports of users under 13 accessing Instagram since early 2019, yet has disabled only a fraction of those accounts.

Consequences to be faced

These allegations are part of a federal lawsuit filed in the U.S. District Court for the Northern District of California. If the platform is found in violation of privacy laws, it could face significant fines and increased scrutiny regarding its protection and moderation measures, especially concerning younger users. The outcome may not only impact Meta’s business but could also provide a clearer understanding of the actual rates of exposure and potential harm within its apps.

In response to the complaint, Meta asserts that it is being misrepresented, with selective quotes and cherry-picked documents used to characterize its work unfairly.
