Court Papers Highlight Meta’s Historic Hesitation in Safeguarding Children on Instagram

Recently released, unredacted court filings from New Mexico's case against Meta highlight the company's "historical reluctance" to protect children on its platforms.

Attorney General Raúl Torrez of New Mexico sued Meta, the parent company of Facebook and Instagram, in December, alleging that it failed to shield young users from content that encourages child sexual abuse and allowed adults to solicit explicit images from minors.

Internal staff messages and presentations from 2020 and 2021, which were recently unredacted in the lawsuit, show that Meta was aware of problems such as adult strangers being able to contact children on Instagram, the sexualization of minors on the platform, and the risks of its "people you may know" feature, which suggested connections between adults and children.

However, the passages show that Meta was slow to address these problems.

For example, Instagram did not begin limiting the messages adults could send to minors until 2021.

The lawsuit depicts Meta scrambling in 2020 to address an Apple executive whose 12-year-old was solicited on the platform. "This is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store," reads an internal memo quoted in the complaint.

The complaint claims that Meta "knew that adults soliciting minors was a problem on the platform, and was willing to treat it as an urgent problem when it had to."

The difficulty of reporting disappearing videos was among the "immediate product vulnerabilities" Meta identified in a July 2020 document titled "Child Safety — State of Play (7/20)."

Meta also acknowledged that Instagram did not always have the same safeguards as Facebook. According to the lawsuit, Meta's justification at the time was that it did not want to prevent parents and older relatives from reaching out to their younger relatives on Facebook.

The author of the report called that explanation "less than compelling," writing that Meta had compromised children's safety in order to make a "big growth bet." Instagram did eventually announce, in March 2021, that it would no longer allow users over 19 to message underage users.

Meanwhile, in a July 2020 internal chat, one employee asked, "What specifically are we doing for child grooming (something I just heard about that is happening a lot on TikTok)?"

"Somewhere between zero and negligible," another employee replied, adding, according to the lawsuit, that "child safety is an explicit non-goal this half" (presumably referring to a half-year planning period).

In a statement, Meta said it has spent "a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online," and that it wants young people to have safe, age-appropriate experiences online.

Meta, based in Menlo Park, California, has been upgrading its tools and safeguards for younger users in response to legislative pressure over child safety, but critics argue it has not gone far enough.

Last week, the company announced that it would begin removing harmful content, including posts about eating disorders, suicide, and self-harm, from teenage users' Facebook and Instagram feeds.

New Mexico's action follows a lawsuit filed in October by 33 states alleging that Meta knowingly and deliberately designs features on Facebook and Instagram that addict young people to its platforms, endangering teenagers and worsening the youth mental health crisis.

"For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation," Torrez said in a statement. "While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta's internal data and presentations show the problem is severe and pervasive."
