An Explosion in Online Child Sex Abuse: What You Need to Know

Tech companies are reporting a boom in online photos and videos of children being sexually abused — a record 45 million illegal images were flagged last year alone — exposing a system at a breaking point and unable to keep up with the perpetrators, an investigation by The New York Times found.

The spiraling activity can be attributed in part to a neglectful federal government, overwhelmed law enforcement agencies and struggling tech companies. And while global in scope, the problem is firmly rooted in the United States because of the role Silicon Valley plays in both the spread and detection of the material. Here are six key takeaways.

While the problem predates the internet, smartphone cameras, social media and cloud storage have made it much worse.

Before the digital age, offenders had to rely on having photographs developed and sending them through the postal system, but new technologies have lowered the barriers to creating, sharing and amassing the material, pushing it to unprecedented levels.

After years of uneven monitoring, major tech companies have stepped up surveillance of their platforms and have found them to be riddled with the content.

Criminals are increasingly “going dark” to hide their tracks. They are using virtual private networks to mask their locations; deploying encryption techniques to obscure their messages and make their hard drives impenetrable; and posting on the so-called dark web, the vast underbelly of the internet, which is inaccessible to conventional browsers.

As the technologies lower people’s inhibitions, online groups are sharing images of younger children and more extreme forms of abuse.

“Historically, you would never have gone to a black market shop and asked, ‘I want real hard-core with 3-year-olds,’” said Yolanda Lippert, a prosecutor in Illinois who leads a team investigating online child abuse. “But now you can sit seemingly secure on your device searching for this stuff, trading for it.”

Congress passed a landmark law in 2008 that foresaw many of today’s problems, but The Times found that the federal government had not fulfilled major aspects of the legislation. Annual funding for state and regional investigations was authorized at $60 million, but only about half of that is regularly approved.

Senator Richard Blumenthal, a sponsor of the law’s reauthorization, said there was “no adequate or logical explanation and no excuse” for why more money was not allocated. Even $60 million a year, he said, would now be “vastly inadequate.”

Another cornerstone of the law, biennial strategy reports by the Justice Department, was mostly ignored. And although a senior executive-level official was to oversee the federal response at the Justice Department, that has not happened.

The Justice Department’s coordinator for child exploitation prevention, Stacie B. Harris, said she could not explain the poor record. A spokeswoman for the department, citing limited resources, said the reports would now be written every four years beginning in 2020.

With so many reports of the images coming their way, police departments across the country are besieged. Some have managed their workload by focusing efforts on imagery depicting the youngest, most vulnerable victims.

“We go home and think, ‘Good grief, the fact that we have to prioritize by age is just really disturbing,’” said Detective Paula Meares, who has investigated child sex crimes for more than 10 years at the Los Angeles Police Department.

About one of every 10 agents in Homeland Security’s investigative section is assigned to child sexual exploitation cases, officials said, a clear indication of how big the problem is.

“We could double our numbers and still be getting crushed,” said Jonathan Hendrix, a Homeland Security agent who investigates cases in Nashville.

Police records and emails, as well as interviews with nearly three dozen law enforcement officials, show that some tech companies can take weeks or months to respond to questions from the authorities, if they respond at all. Some do not retain essential information about what they find.

Law enforcement records shared with The Times show that Tumblr was one of the least cooperative companies. In one case, Tumblr alerted a person who had uploaded explicit images that the account had been referred to the authorities, a practice that a former employee told The Times was common for years. The tip allowed the man to destroy evidence, the police said.

A spokeswoman for Verizon said that Tumblr prioritized time-sensitive cases, which delayed other responses. Since Verizon acquired the company in 2017, the spokeswoman said, it has not been Tumblr’s practice to alert users to police requests for data. Verizon recently sold Tumblr to the web development company Automattic.

Facebook Messenger was responsible for nearly two-thirds of all reports of online child sexual abuse imagery in 2018. And earlier this year, the company announced that it planned to begin encrypting messages, which would make them even harder to police.

Facebook has long known about abuse images on its platforms, including a video of a man sexually assaulting a 6-year-old that went viral last year on Messenger. When Mark Zuckerberg, Facebook’s chief executive, announced the move to encryption, he acknowledged the risk it presented for “truly terrible things like child exploitation.”
