The people policing the internet’s most horrific content

Image caption: Shawn is still trying to process what he had to watch as a content moderator (image: Shawn Speagle)

In this era of digital self-publishing, anyone can record and post their own content – and a lot of it is horrific material that clearly breaches websites’ taste and decency guidelines. A growing army of moderators has the unenviable task of sifting through it all, sometimes at considerable cost to their mental health.

WARNING: article contains upsetting content.

Shawn Speagle worked as an online content moderator for six months in 2018. He’s still scarred by the experience.

“One of my first videos that I remember looking at was two teenagers grabbing an iguana by the tail and they smashed it onto the pavement while a third person was recording it.

“And the iguana was screaming and the kids just would not stop until the iguana was just pasted on the ground.”

Shawn was employed by a company called Cognizant in Florida which had a contract with Facebook. He speaks in a slow, considered way, still trying to process what he had to go through.

Image caption: Facebook uses around 30,000 sub-contracted content moderators around the world (image: PA Media)

“I’ve seen people put fireworks in a dog’s mouth and duct tape it shut. I’ve seen cannibalism videos, I’ve seen terrorism propaganda videos,” he continues.

Hearing Shawn speak, it becomes clear why moderation has often been described as the worst job in tech.

Most internet users probably never give these moderators a second thought, yet there are hundreds of thousands of them around the world helping companies weed out disturbing content – ranging from suicide and murder videos to conspiracy theories and hate speech.

And now some are coming out of the shadows to tell their stories.

Video: Film explores social media moderation

Shawn decided to speak out, despite having signed a non-disclosure agreement (NDA) – a standard practice in the industry.

These NDAs are also meant to prevent contractors from sharing Facebook users’ personal information with the outside world, at a time of intense scrutiny over data privacy.

But Shawn believes Facebook moderation policies should be talked about openly, because staff end up watching upsetting content that is often left untouched on the platform.

As an animal lover, he was distraught that animal content “was, for the most part, never escalated in any way, shape or form”, meaning that it was never referred for removal.

For content involving humans, the rules were a little different, but also more convoluted.


The most common outcome was for a video to be marked as “disturbing” and left on the platform. Shawn tells the BBC that, under Facebook policy, showing bodily innards outside a medical setting would result in a video being deleted.

He struggles to recollect any other examples that would result in content removal.

The stress of the job led to overeating and weight gain, Shawn says.

“I felt like I was a zombie in my seat. It really gets to you because I don’t have that bystander syndrome where I’m OK just watching this suffering and not contributing any way to deter it.”

He also could not get adequate psychological support.

The only time he tried to speak to the duty psychologist, Shawn says: “He flat out told me, ‘I don’t know how to help you guys.’”

Despite all of this, Shawn says he persisted for six months without complaining, because he thought that although he worked for a subcontractor, “Facebook would get their act together”.

Facebook vice-president Arun Chandra was brought in specifically to focus on the working conditions of the social media platform’s 30,000 moderators, who are largely employed by subcontractors in the US, India and the Philippines.

Image caption: Where does the duty not to show upsetting content intersect with the duty to allow free speech? (image: Getty Images)

Mr Chandra says he has already visited more than 15 sites around the world, and that he always speaks with moderators directly.

He denies there is a “broad scale problem” and stresses that its subcontractors, such as Cognizant and Accenture, are “reputable global companies”. Facebook will be introducing formal audits later this year, he says.

He also confirms that Cognizant’s contract remains in place following an investigation.

But Shawn Speagle believes lots more could be done to improve working conditions.

“The place was absolutely disgusting,” he alleges. “There was only one restroom in the entire building and there were 800 employees.

“People were smoking in the building; people were drinking in the parking lot and having sex in their cars.”

Workers were often young, inexperienced and poorly paid, he says.

The BBC approached Cognizant for comment but it has not yet responded.

Image caption: Prof Sarah Roberts thinks governments may have to force online publishers to clean up their act (image: Stella Kalinina)

Facebook’s Mr Chandra says that a psychologist is now available at all of the subcontractors’ sites during all shifts, and that pay has been increased – but only for US-based moderators.

Sarah Roberts, a professor at the University of California, Los Angeles (UCLA), has spent years investigating the world of internet moderation for her newly published book, Behind the Screen.

She believes websites and social media giants have assumed that automation, AI and machine learning would eventually make human content moderation redundant.

“I think that Silicon Valley is putting a primacy on computation over all other things,” she says.

“So, if a workforce can be maintained cheaply and be treated as expendable, until such time that computation can fully be brought in, then so much the better.”

Prof Roberts thinks the difficulty for social media companies is that they have “built up a global user base predicated on the notion that we should all be able to do and say and share and show whatever our heart desires pretty much at all times”.

Changing that culture from within would be a “tall order”, she believes, which is why tougher legislation might be needed. But this is more likely to come from European Union countries than from the US.

While this debate rages on, real people, like Shawn Speagle, live with the consequences of sifting through the internet’s filth.

Diagnosed with night terrors, he is on several medications, is scared of driving after watching so many car crash videos and is startled by loud noises.

“Looking at this stuff eight hours a day, five days a week – it is something even veterans and ex-military can’t handle,” he concludes.



