The images are horrific. Children, some just 3 or 4 years old, being sexually abused and in some cases tortured.
Pictures of child sexual abuse have long been produced and shared, but it has never been like this: Technology companies reported a record 45 million online photos and videos of the abuse last year.
More than a decade ago, when the reported number was less than 1 million, the proliferation of the explicit imagery had already reached a crisis point. Tech companies, law enforcement agencies and legislators in Washington responded, committing to new measures meant to rein in the scourge. Landmark legislation passed in 2008.
Yet the explosion in detected content kept growing — exponentially.
An investigation by The New York Times found an insatiable criminal underworld that had exploited the flawed and insufficient efforts to contain it.
A paper recently published in conjunction with the National Center for Missing and Exploited Children described a system at “a breaking point,” with reports of abusive images “exceeding the capabilities of independent clearinghouses and law enforcement to take action.” It suggested that future advancements in machine learning might be the only way to catch up with criminals.
In interviews, victims across the United States described in heart-wrenching detail how their lives had been upended by the abuse. Many of the survivors and their families said their view of humanity had been inextricably changed by the crimes themselves and the online demand for images of them.
“I don’t really know how to deal with it,” said one woman who, at age 11, had been filmed being sexually assaulted by her father. “You’re just trying to feel OK and not let something like this define your whole life. But the thing with the pictures is — that’s the thing that keeps this alive.”
The Times’ reporting revealed a problem global in scope but one firmly rooted in the United States because of the central role Silicon Valley has played in facilitating the imagery’s spread and in reporting it to authorities. While the material, commonly known as child pornography, predates the digital era, smartphone cameras, social media and cloud storage have allowed images to multiply at an alarming rate.
In a particularly disturbing trend, online groups are devoting themselves to sharing images of younger children and more extreme forms of abuse. The groups use encrypted technologies and the dark web, the vast underbelly of the internet, to teach pedophiles how to carry out the crimes and how to record and share images of the abuse worldwide.
After years of uneven monitoring of the material, several major tech companies stepped up surveillance of their platforms. Executives with some companies pointed to the voluntary monitoring and the spike in reports as indications of their commitment to addressing the problem.
But police records and emails, as well as interviews with local, state and federal law enforcement officials, show that some tech companies still fall short. It can take weeks or months for them to respond to questions from authorities, if they respond at all. Sometimes they respond only to say they have no records, even for reports they initiated.
And when tech companies cooperate fully, encryption and anonymization can create digital hiding places for perpetrators. Facebook announced in March plans to encrypt Messenger, which last year was responsible for nearly 12 million of the 18.4 million worldwide reports of child sexual abuse material, according to people familiar with the reports. Reports to authorities typically contain more than one image; last year's reports encompassed the record 45 million photos and videos.
The law Congress passed in 2008 foresaw many of today’s problems, but the Times found that the federal government had not fulfilled major aspects of the legislation.
The Justice Department has produced just two of six required reports to compile data about internet crimes against children and set goals to eliminate them, and there has been a constant churn of short-term appointees leading the department’s efforts.
The federal government has also not lived up to the law’s funding goals. Congress has regularly allocated about half of the $60 million in yearly funding for state and local law enforcement efforts. Separately, the Department of Homeland Security this year diverted nearly $6 million from its cybercrimes units to immigration enforcement.
Further impairing the federal response are shortcomings at the National Center for Missing and Exploited Children, which reviews reports it receives and then distributes them to federal, state and local law enforcement agencies, as well as international partners.
The nonprofit center has relied in large measure on 20-year-old technology, has difficulty keeping experienced engineers on staff and, by its own reckoning, regards stopping the online distribution of photos and videos as secondary to rescuing children.
Covering Their Tracks
The videos found on the computer of Jason Gmoser, an Ohio man and site administrator, were described by investigators as among “the most gruesome and violent images of child pornography.” The videos were stored in a hidden computer file and had also been encrypted.
Increasingly, criminals are using advanced technologies like encryption to stay ahead of police. In this case, the Ohio man, who helped run a website on the dark web known as the Love Zone, had more than 3 million photos and videos on his computers.
The site, now shuttered, had nearly 30,000 members and required them to share images of abuse to maintain good standing, according to court documents. A private section of the forum was available only to members who shared imagery of children they abused themselves.
Multiple police investigations have broken up enormous dark web forums, including one known as Child’s Play that was reported to have had over 1 million user accounts.
Offenders can cover their tracks by connecting to virtual private networks, which mask their locations; deploying encryption techniques, which can hide their messages and make their hard drives impenetrable; and posting on the dark web, which is inaccessible to conventional browsers.
The anonymity offered by the sites emboldens members to post images of very young children being sexually abused, and in increasingly extreme and violent forms.
Exhibits in the Love Zone case include screenshots showing the forum had dedicated areas where users discussed ways to remain “safe” while posting and downloading imagery.
Testimony in Gmoser’s criminal case revealed that it would have taken authorities “trillions of years” to crack the 41-character password he had used to encrypt the site. He eventually turned it over and was sentenced to life in prison in 2016.
‘Truly Terrible Things’
The surge in criminal activity on the dark web accounted for only a fraction of the 18.4 million reports of abuse last year. That number originates almost entirely with tech companies based in the United States.
Companies have known for years that their platforms were being co-opted by predators, but many of them essentially looked the other way. And while many companies have made recent progress in identifying the material, they were slow to respond.
The recent surge by tech companies in filing reports of online abuse “wouldn’t exist if they did their job then,” said Hemanshu Nigam, a former federal prosecutor in cybercrime and child exploitation cases who now runs a cybersecurity consulting firm.
“The companies knew the house was full of roaches, and they were scared to turn the lights on,” said Hany Farid, who worked with Microsoft to develop technology in 2009 for detecting child sexual abuse material. “And then when they did turn the lights on, it was worse than they thought.”
Federal law requires companies to preserve material about their reports of abuse imagery for 90 days. But given the overwhelming number of reports, it is not uncommon for requests from authorities to reach companies too late.
Most tech companies have been quick to respond to urgent inquiries, but responses in other cases vary significantly. Police officers in Missouri, New Jersey, Texas and Wisconsin pointed to Tumblr, a blogging and social networking site with 470 million users, as one of the most problematic companies, lamenting Tumblr’s poor response to requests.
Law enforcement officials also pointed to problems with Microsoft’s Bing search engine, and Snap, the parent company of the social network Snapchat. Bing was said to regularly submit reports that lacked essential information, making investigations difficult. Snapchat is engineered to delete most of its content within a short period of time.
Facebook has long known about abusive images on its platforms, including a video of a man sexually assaulting a 6-year-old that went viral last year on Messenger. When Mark Zuckerberg, Facebook’s chief executive, announced in March that Messenger would move to encryption, he acknowledged the risk it presented for “truly terrible things like child exploitation.”
Annual funding for state and regional investigations was authorized at $60 million, but only about half of that is regularly approved. It has increased only slightly from 10 years ago when accounting for inflation. Even $60 million a year would now be vastly inadequate.
And even the most recent biennial strategy reports published by the Justice Department, in 2010 and 2016, did not include data about some of the most pressing concerns, such as trade in illicit imagery.
When the law Congress passed in 2008 was reauthorized in 2012, the coordinator role was supposed to be elevated to a senior executive position. That has not happened.
The National Center for Missing and Exploited Children has also struggled with demands to contain the spread of the imagery.
As child exploitation has grown on the internet, the center has not kept up. The technology it uses for receiving and reviewing reports of the material was created in 1998. To perform key upgrades and help modernize the system, the group has relied on donations from tech companies like Palantir and Google.
The center has said it intends to make significant improvements to its technology starting in 2020, but the problems don’t stop there. Police complain that the most urgent reports are not prioritized, or are sent to the wrong department entirely. And despite its congressional mandate, the center is not subject to public records laws and operates with little transparency.
The Times found that there was a close relationship between the center and Silicon Valley that raised questions about governance practices. The center receives both money and in-kind donations from tech companies, while employees of the same companies are sometimes members of its board. That practice, those working in the area of child protection said, could elevate the interests of tech companies above the children’s.
The close relationship with tech companies may ultimately be in jeopardy. In 2016, a federal court held that the national center, though private, qualified legally as a government entity because it performed a number of essential government functions.
If that view gains traction, Fourth Amendment challenges about searches and seizures by the government could change how the center operates. And under those circumstances, if they were to collaborate too closely with the center, companies could also be viewed as government actors subject to new legal requirements and court challenges when they police their own sites.
An Ugly Mirror
It was a sunny afternoon in July, and an unmarked police van in Salt Lake City was parked outside a pink stucco house.
At the back of the van, a man who lived in the house was seated, while officers cataloged hard drives and sifted through web histories from his computers. The man had shared sexually explicit videos online, police said.
The year was barely half over, and the team of Chief Jessica Farnsworth, an official with the Utah attorney general’s office who led the raid of the house, had already conducted about 150 such raids across Utah. The specially trained group, one of 61 nationwide, coordinates responses to internet crimes against children.
The Utah group expects to arrest nearly twice as many people this year as last year, but federal funding has not kept pace. Funding for the 61 task forces from 2010 to 2018 remained relatively flat, federal data shows, while the number of leads referred to them increased by more than 400%.
Much of the federal money goes toward training new staff members because the cases take a heavy emotional and psychological toll on investigators, resulting in constant turnover.
The volume of work has also forced the task forces to make difficult choices. Some have focused on the youngest and most vulnerable victims, while others have cut back on undercover operations.
The internet is well known as a haven for hate speech, terrorism-related content and criminal activity, all of which have raised alarms and spurred public debate and action. But the problem of child sexual abuse gets scant attention because few people want to confront the enormity and horror of the content.
Some state lawmakers, judges and members of Congress have refused to discuss the problem in detail, or have avoided attending meetings and hearings, according to interviews with law enforcement officials and victims.
Steven Grocki, who leads the child exploitation and obscenity section at the Justice Department, said the reluctance was a societal problem. “They turn away from it because it’s too ugly of a mirror,” he said.
While the imagery is often defined as “child pornography,” experts prefer terms like “child sexual abuse imagery” or “child exploitation material” to underscore the seriousness of the crimes and to avoid conflating the material with adult pornography.
“Each and every image is a depiction of a crime in progress,” said Sgt. Jeff Swanson, a task force commander in Kansas. “The violence inflicted on these kids is unimaginable.”