An internal 2021 Meta Platforms presentation estimated that every day, 100,000 children received photos of adult genitalia or other sexually abusive content on Instagram and Facebook, where minors were regularly subjected to sexual harassment.
That estimate is among newly unredacted material about the company’s child-safety practices in a lawsuit filed last month by New Mexico, which alleges that Meta’s platforms recommend sexual content to minors and promote underage accounts to predatory adult users.
In one 2021 document described in the newly unredacted material, Meta staffers reported that one of the company’s recommendation algorithms, known as “People You May Know,” was connecting underage users with potential sexual predators.
According to the lawsuit, company executives had been alerted to the finding years earlier and rejected a staff recommendation that the algorithm, known internally as PYMK, be redesigned to stop recommending minors to adults.
In the report’s comments section, one Facebook employee wrote that the algorithm had “contributed up to 75% of all inappropriate adult-minor contact.” “How on earth have we not just turned off PYMK between adults and children?” another employee asked, according to the lawsuit. “It’s really, really upsetting.”
Meta declined to comment on the newly disclosed references to internal materials, directing the Journal to a prior statement in which it said New Mexico “mischaracterizes our work using selective quotes and cherry-picked documents.”
The company has described child predators as “determined criminals” and says it has long invested both in enforcement and in child-safety tools for young users and parents.
New Mexico alleges that Meta failed to address pervasive predation on its platforms or to disable design features that recommended minors to adults with ill intent. The suit claims that rather than disclosing internal findings such as the 100,000-children-a-day scale of the harassment, Meta wrongly assured the public that its platforms were safe.
Much of the internal discussion described in the newly unredacted material centered on Instagram. According to documents cited in the case, employees noted in a 2020 internal email that the prevalence of “sex talk” directed at minors on Instagram was 38 times that on Facebook Messenger in the United States, and they urged the company to add stronger safeguards to the platform.
That same year, one employee reported that an Apple executive had complained about his 12-year-old being solicited on Instagram. The Meta employee responsible for addressing the issue wrote that “this is the kind of thing that irritates Apple to the point of threatening [sic] to remove us from the App Store” and asked when the company would stop adults from messaging minors on the platform.
A November 2020 internal presentation titled “Child Safety: State of Play” described Instagram as having “minimal child safety protections” and its policies on “minor sexualization” as “immature.” It also cited the platform’s “minimal focus” on trafficking.
New Mexico claims that despite recognizing the scope of the problem, Meta executives did not take steps to prevent adults from sexually soliciting minors until late 2022, and that those steps fell short of the broad messaging restrictions recommended by the company’s own safety staff.
Rather than simply halting the recommendation of children’s profiles to adults, Facebook and Instagram sought to block such recommendations only to adults who had already displayed suspicious behavior toward children.
That approach of restricting only known suspicious accounts was bound to be less effective than turning off the recommendations entirely, New Mexico argues, because both predatory adults and minors routinely lied about their ages.
According to the complaint, Meta acknowledged internally in 2021 that a majority of minors on its platforms falsely claimed to be adults, and a review of accounts disabled for grooming children found that 99% of those adults had not stated their age.
Meta formed a task force in June 2023 to examine child-safety issues on its platforms, following a Wall Street Journal article revealing that Instagram’s algorithms connected and promoted a vast network of accounts openly devoted to the commissioning and purchase of underage-sex content.
Further Journal articles published last year showed that Meta was struggling to resolve the problems on both Instagram and Facebook, where it recently adopted encryption for direct messages.
According to the Journal, the company’s safety staff had long warned that encrypting those exchanges would shield communications that could be used to prosecute child exploitation. Meta has said it spent years building safety measures to prevent and combat such abuse.
In addition to New Mexico’s lawsuit, Meta was sued in October by more than 40 other states, which alleged that it deceived the public about the dangers its platforms pose to young people.
Meta announced this month that it would begin automatically restricting harmful content for teen Instagram and Facebook accounts, including videos and posts about self-harm, graphic violence, and eating disorders.