New Mexico’s attorney general set up phony accounts posing as teens and preteens on two Meta platforms, with deeply troubling results. The findings are the centerpiece of a new lawsuit alleging that the mega tech company’s algorithms offered youngsters a vast array of sexually explicit material as well as propositions from sex predators.
The AG’s office established several accounts on Facebook and Instagram that contained AI-generated pictures of fictional children. One made it look like a mother was sex trafficking her daughter – and the response was startling. On Facebook Messenger, the account’s chats soon became jam-packed with porn, including videos of exposed private body parts, according to the lawsuit as outlined in The Wall Street Journal.
Sex Predators — Cheaper by the Dozen
In October, a coalition of 41 state attorneys general filed suit in federal and state courts against Meta, alleging multiple features of the popular social media platforms harm youngsters. New Mexico Attorney General Raúl Torrez told The Journal that Meta has “gone beyond merely hosting child sex-abuse content to enabling it.”
Right out of the box, the New Mexico AG fired away at Mark Zuckerberg & Company, saying that the enterprise openly advertises that it is safe for kids — but “the reality is far different.” Instead, he claims Meta “knowingly exposes children to the twin dangers of sexual exploitation and mental health harm.” Then down came the hammer:
“Meta’s platforms, Facebook and Instagram, are a breeding ground for predators who target children for human trafficking, the distribution of sexual images, grooming, and solicitation. Teens and preteens can easily register for unrestricted accounts because of a lack of age verification. When they do, Meta directs harmful and inappropriate material at them. It allows unconnected adults to have unfettered access to them, which those adults use for grooming and solicitation. And Meta’s platforms do this even though Meta has the capability of both determining that these users are minors and providing warnings or other protections against material that is not only harmful to minors but poses substantial dangers of solicitation and trafficking.”
For its part, Meta put out a statement claiming to “use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators.”
However, if the fake AI-generated accounts are any indication, these measures aren’t very effective. A case in point is the fictitious account of Taya Neils. It was set up with only her name and birthdate. Based on that information alone, “Meta recommended accounts with sexually suggestive images,” and one Facebook post “recommended” to the 13-year-old a site that includes a pornographic video called “Lovely Girl,” which sells sex videos and has 119,000 followers. Other posts recommended were “Sadi*st” with “Teens” and “Se*xual fun.”
Sex predators actively sought out these users, asking them to “send pictures” and to engage in, and then post, a variety of sexually explicit acts. That this inappropriate content was so quickly and efficiently recommended to 12- and 13-year-olds is unconscionable.
Predatory sexual behavior harms youngsters as well as those with cognitive deficits or special needs. Meta is likely to have its hands full with this 224-page New Mexico lawsuit, as the evidence that its platforms connect young children with sex predators and make them easy prey is explicit, detailed, and plentiful.