How Snapchat gets Utah kids hooked: 5 newly revealed details from state’s lawsuit against messaging app

From low moderation to high addiction, here is what unredacted details show about Utah’s legal argument.

(Amr Alfiky | AP Photo) This July 30, 2019, file photo shows an introduction page for Snapchat shown in a mobile phone displayed at Apple's App Store in Chicago. Newly unredacted details in Utah's lawsuit against Snapchat show more reasoning behind the state's arguments.

When the state of Utah first sued Snapchat in June, much of the legal complaint was redacted — but last week, Snapchat’s lawyers filed a motion to keep a few paragraphs blacked out, while agreeing to make the rest public.

The suit, filed by the state’s Division of Consumer Protection, accuses Snap Inc., the parent company of the Snapchat messaging app, of violating Utah law by using “deceptive design features that addict children; harm their mental health and wellbeing; and facilitate illegal drug sales, sexual exploitation, sex trafficking, the distribution of pornography, and other unlawful acts.”

A spokesperson from Snap said the “safety of Snapchatters” is their highest priority, and criticized the state’s lawsuit.

“Last year, a social media law passed in Utah was preliminarily enjoined after the court ruled it was likely unconstitutional, violating the First Amendment’s protection of free speech,” the Snap spokesperson said in a statement. “The Utah Attorney General is resorting to civil litigation as a means to circumvent the court and impose age verification requirements and age-related restrictions in ways that are unconstitutional.”

The state’s complaint initially blacked out details supporting those accusations. The newer, less-redacted version, released to the public Tuesday, offers these five key insights into the state’s argument:

1. Utah teens spend lots of time on Snapchat

In the original lawsuit, Utah argued that “children are particularly vulnerable to Snapchat’s manipulative design features” that use their brain’s underdeveloped reward system to keep them compulsively checking the app.

In the newly unredacted sections, the state gives further statistics that it argues show how efficiently the app hooks kids, citing documents obtained by subpoena that show Utah children check the app more than 70 times a day.

Not only is Snapchat the “leading messaging platform among users aged 13-17,” the state’s complaint says, but a fifth of Utahns between ages 10 and 19 use the app every day — and since 2020, Utah teenagers have spent 7.9 billion minutes using the app.

This, according to the lawsuit, is by design.

The state’s complaint says, “Snap engineering managers have even boasted that Snapchat ‘should have more addicting features’ because of how effective they are at boosting usage.”

2. Utah says Snapchat’s My AI launch was “reckless”

Another unredacted portion of the lawsuit says Snapchat’s My AI feature was tested “for an unusually short time and the decision to quickly launch the feature was called ‘reckless’” by some of the company’s senior engineering managers.

The feature, according to the lawsuit, was known by Snap employees to make up answers — and could be tricked into confidently saying anything.

Before the My AI feature was fully released in 2023, the lawsuit says the company released it to Snapchat+ subscribers who acted as unknowing “guinea pigs” for the feature. That group, the lawsuit said, included tens of thousands of Utahns.

The lawsuit references an investigation from The Washington Post in which the paper’s tech columnist set up accounts that listed the user’s age at both 13 and 15. The chatbot gave advice to the fictional 13-year-old’s account on how to have sex with an adult. To the account registered as having a 15-year-old user, My AI gave advice on hiding drugs and alcohol from a parent.

“Even before high-profile reporting, Snap employees expressed concerns about the company’s ability to manage the new technology responsibly,” the lawsuit says.

Still, the lawsuit says, Snap released the AI feature to all of its users two months later, including 400,856 daily active users in Utah between the ages of 13 and 24.

The lawsuit says the Utah Division of Consumer Protection tested the chatbot on accounts it registered as having minor users.

They found that the chatbot would continue talking with someone even if they disclosed they were under 13, the minimum age to use Snapchat. To an account they registered to a 15-year-old girl, the lawsuit says the chatbot gave advice on how to flirt with a Spanish teacher, “including by suggesting she ask him to meet her outside of school.”

“Contrary to Snap’s public claims that My AI is ‘programmed’ to follow Snap’s community guidelines, internal discussions reveal this was not true,” the lawsuit says. “Snap stated that it was not monitoring My AI conversations for violations.”

3. Snapchat tracks location data, even in “Ghost Mode”

In the original lawsuit, the state argued that My AI’s data mining “raised alarm bells,” as it collects “troves of sensitive data about the chats and images children send through the feature.”

The lawsuit says this data includes geolocation and is even collected when a user is in “Ghost Mode,” a feature that users are told will prevent their friends from seeing their location. In a now-unredacted portion of the lawsuit, the state says internal Snap documents show that in the chatbot’s underlying code, “My AI is told that ‘if Ghost Mode is on, you still know their location.’”

This, it says, runs contrary to users’ “reasonable assumption” that they would not be tracked in this mode, because “My AI frames its relationship with the user as a friendship.”

Yet, on Snap’s website, it says, “If you share your location with Snapchat, then it will be shared with My AI, even in Ghost Mode.”

4. Snapchat struggles to police illicit content

Despite Snap stating online that it “takes ‘a very proactive approach to moderating content and promoting information integrity,’” newly unredacted portions of the state’s lawsuit argue the company knows its app’s “risks” are “from lack of enforcement infrastructure and resources issues.”

The lawsuit says the company “falls especially short” in moderating such content as drug dealing, financial extortion and child sexual abuse images.

Utah authorities claim the company “has long known it has significant issues with moderating [child sexual abuse material] and that the situation has simply grown out of control.”

The lawsuit says the company has at times stopped using PhotoDNA, a technology that works to identify child sexual abuse images; in one example, the company did so in 2021, citing “operational constraints.”

The company tried new methods in 2021 and 2022, the lawsuit says, to detect child sexual abuse images and other dangerous content. The lawsuit says the program suffered in both resources and implementation.

The company, the lawsuit says, was “focused on correcting ‘false positives’” instead of finding bad actors, did not adequately train its machine learning models to better detect problematic content and “failed to invest in human moderation resources.”

The lawsuit says Snap catches only a small fraction of child sexual abuse images, referencing the company’s internal accuracy metrics that said it caught about 4%.

Citing a May 2021 statement from a senior trust and safety manager, the lawsuit also says Snap doesn’t know the extent of illegal drugs on Snapchat.

The lawsuit says that in 2022, Snap had only 53 employees and about 80 third-party vendor agents working to address illicit drugs on Snapchat.

“Unfortunately, social media platforms like Snapchat have become a new virtual market for drug cartels and their associates to sell pills,” the lawsuit states.

5. Almost no in-app reports are reviewed

Utah’s lawsuit argues that Snapchat’s in-app reporting feature — which, the suit said, the company claimed was “enormously valuable” until earlier this year — is “neither easy to use nor effective.”

Newly unredacted portions of the lawsuit say that in February 2022, the company knew that more than 96% of account reports were never reviewed.

Rather than reviewing the reported accounts, the lawsuit says Snapchat would prompt reporting users to block the account without triggering a review, something the company has since said was a “significant gap in our safety coverage.”

This process, the lawsuit says, causes kids to lose faith in the app’s reporting mechanism.