U.S. lawsuit against Instagram

Meta faces new scrutiny over claims that teens are being exposed to harmful content

While X has been in the spotlight lately for alleged failures in content moderation, Meta is also facing questions about how its systems protect users, especially young people, and about the accuracy of its external reporting on those problems.

According to a recently unsealed complaint filed against the company by 33 states, Meta has repeatedly misrepresented the performance of its moderation teams in its community standards compliance reports, which, the complaint alleges, do not reflect Meta's own internal data on violations.

As Business Insider reports:

"Meta's community standards compliance reports indicate low levels of community standards violations on its platforms, but exclude key data from user experience surveys that show much higher rates of user interaction with harmful content. For example, Meta claims that out of every 10,000 views of content on its platforms, only 10 or 11 contain hate speech. But the complaint states that Meta's internal user survey, known as the Integrity Issues Tracking Survey, found that an average of 19.3% of Instagram users and 17.6% of Facebook users reported witnessing hate speech or discrimination on the platforms."

In effect, Meta appears to be leaning on the law of averages: by measuring violations as a share of total views across its enormous user base, the headline figures come out very low, even though direct user feedback shows that actual exposure is much higher. The aggregate data suggests a minimal problem, but the user experience is clearly different.
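
To see how both sets of figures can be true at the same time, here's a rough back-of-the-envelope sketch (the views-per-month number is an assumption for illustration, not a figure from the complaint or from Meta): even at a prevalence of 10 violating views per 10,000, a user who scrolls through about a thousand posts a month has a better-than-even chance of encountering at least one of them.

```python
# Hypothetical illustration only: the views-per-month figure is an assumption,
# not a number from the complaint or from Meta.

# Meta's reported prevalence: roughly 10 violating views per 10,000 views.
prevalence = 10 / 10_000

# Assumed number of posts a fairly active user scrolls through in a month.
views_per_month = 1_000

# Probability the user sees at least one violating post in that month,
# treating each view as an independent draw (a simplifying assumption).
p_at_least_one = 1 - (1 - prevalence) ** views_per_month

print(f"Per-view prevalence: {prevalence:.2%}")                       # 0.10%
print(f"Users likely to see it at least once: {p_at_least_one:.0%}")  # ~63%
```

In other words, a per-view prevalence of around 0.1% and a large share of users reporting exposure are not contradictory; they simply measure different things.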

The complaint alleges that Meta knows this, but has publicly presented these alternative statistics to create a false sense of security in its apps and approach.

A potentially even more troubling element of the same complaint is that Meta also reportedly received more than 1.1 million reports of users under the age of 13 logging into Instagram since the beginning of 2019, yet it disabled "only a portion of these accounts."

The allegations were outlined as part of a federal lawsuit filed last month in the U.S. District Court for the Northern District of California. If Meta is found to have violated privacy laws, it could face significant fines, as well as further scrutiny of its safety and moderation measures, especially with regard to young users' access.

Depending on the outcome, this could have a serious impact on Meta's business, and could also produce a clearer picture of actual exposure to harmful content, and the potential harms, within Meta's apps.

In response, Meta says the complaint mischaracterizes its work by "using selective quotes and carefully chosen documents."

This is another headache for Meta that could put Zuck and Co. back in the spotlight over effective moderation and disclosure, and could lead to the introduction of even stricter rules around young users and data access.

It could also end up pushing the U.S. toward the stricter rules already in force in the EU.

In Europe, the new Digital Services Act (DSA) includes a number of provisions designed to protect young users, including a ban on targeting minors with ads based on their personal data. Similar restrictions could emerge from this new pressure in the U.S., although it remains to be seen whether the complaint will move forward and how Meta will counter it.

Last year, a Common Sense Media report found that 38% of children aged 8 to 12 use social media on a daily basis, a number that has been steadily rising over time. And while Meta says it's committed to implementing better age-verification and safety measures, many kids are still accessing Instagram, in many cases simply by entering a false birth year.

Of course, parents also have a responsibility to monitor their children's screen time and make sure they're not accessing apps they shouldn't. But if the case does show that Meta knowingly allowed this to happen, it could create a range of new complications, both for Meta and for the social media sector more broadly.

It will be interesting to see where the complaint leads and how moderation on Instagram and Facebook will change.
