Policing Web content takes toll on screeners
Published 5:00 am Monday, July 19, 2010
Photo: Stacey Springer, vice president for support operations at Caleris, an outsourcing company that provides Web screening services, talks with Marie Wittry as she reviews images at the company's offices in Jefferson, Iowa.
Ricky Bess spends eight hours a day in front of a computer outside Orlando, Fla., viewing some of the worst depravities harbored on the Internet. He has seen photographs of graphic gang killings, animal abuse and twisted forms of pornography. One recent sighting was a photo of two teenage boys gleefully pointing guns at another boy, who is crying.
An Internet content reviewer, Bess sifts through photographs that people upload to a big social networking site and keeps the illicit material — and there is plenty of it — from being posted. His is an obscure job that is repeated thousands of times over, from office parks in suburban Florida to outsourcing hubs like the Philippines.
With the rise of websites built around material submitted by users, screeners have never been in greater demand. Some Internet firms have tried to get by with software that scans photos for, say, a large area of flesh tones, but nothing substitutes for a discerning human eye.
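The flesh-tone approach is simple enough to sketch in a few lines. The following is a minimal illustration of that kind of filter, not any firm's actual software; it assumes the Pillow and NumPy libraries are available, and the color thresholds and 30 percent cutoff are illustrative guesses rather than production values.

```python
# A rough sketch of the flesh-tone heuristic described above; not any
# company's real filter. All thresholds are illustrative assumptions.
import numpy as np
from PIL import Image


def skin_fraction(path: str) -> float:
    """Return the fraction of pixels falling in a crude skin-tone range."""
    # Cast to int16 so channel subtraction below cannot wrap around.
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # A classic RGB skin rule: warm, reddish pixels with enough spread
    # between channels. Crude, and easily fooled by sand, wood, or faces.
    skin = (
        (r > 95) & (g > 40) & (b > 20)
        & (r > g) & (r > b)
        & (np.abs(r - g) > 15)
        & ((rgb.max(axis=-1) - rgb.min(axis=-1)) > 15)
    )
    return float(skin.mean())


def looks_fleshy(path: str, threshold: float = 0.3) -> bool:
    """Flag an image for human review when skin tones dominate the frame."""
    return skin_fraction(path) > threshold
```

A filter like this can only route suspicious images toward a person; it cannot tell a beach photo from pornography, which is why the human reviewers remain indispensable.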
The response has been a surge in Internet screening services — but also a growing awareness that the jobs can have mental health consequences for the reviewers, some of whom are drawn to the low-paying work by the simple prospect of making money while looking at pornography.
“You have 20-year-old kids who get hired to do content review, and who get excited because they think they are going to see adult porn,” said Hemanshu Nigam, the former chief security officer at MySpace. “They have no idea that some of the despicable and illegal images they will see can haunt them for the rest of their lives.”
Like ‘combat veterans’
David Graham, president of Telecommunications On Demand, the company near Orlando where Bess works, compared the reviewers to “combat veterans, completely desensitized to all kinds of imagery.” The company’s roughly 50 workers view a combined average of 20 million photos a week.
Bess insists he is still bothered by the offensive material, and acknowledges that he turns to colleagues in the surrounding cubicles for support.
“We help each other through any rough spots we have,” said Bess, 52, who previously worked in the stockrooms at Walmart and Target.
Last month, an industry group established by Congress recommended that the federal government provide financial incentives for companies to “address the psychological impact on employees of exposure to these disturbing images.”
Nigam, co-chairman of the group, the Online Safety and Technology Working Group, said global outsourcing firms that moderate content for many large Internet companies do not offer therapeutic care to their workers. The group’s recommendations have been submitted to the National Telecommunications and Information Administration, which advises the White House on digital policy.
Workers at Telecommunications On Demand, who make $8 to $12 an hour, view photos that have been stripped of information about the users who posted them. Rapidly cycling through pages of 300 images each, they are asked to flag material that is obviously pornographic or violent, illegal in a certain country or deemed inappropriate by a specific website.
4.5 million images daily
Caleris, an outsourcing company based in West Des Moines, Iowa, says it reviews about 4.5 million images a day. Stacey Springer, its vice president for support operations, says the job is not for everybody and that “people find they can do it, but it is usually a lot harder than they thought.” The company offers counseling as part of its standard benefits package for workers.
Springer says she believes that content moderators tend to become desensitized to the graphic imagery, which makes it easier to cope. But she is called on to review the worst material, and says that she finds some of it “hard to walk away from.”
“I do sometimes take it really personally,” she said of the pictures she reviews. “I remind myself, somebody has to do it.”
Content moderation
A common strategy at websites is to have users flag questionable content, then hand off material that needs further human review to outsourcing companies that can examine it at low cost.
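In code, that flag-then-escalate workflow reduces to little more than a counter and a queue. The sketch below is a hypothetical illustration of the general pattern the article describes, not any particular site's system; the class name and the three-flag threshold are assumptions made for the example.

```python
# A hypothetical sketch of the flag-then-escalate workflow described
# above; the threshold and names are illustrative assumptions.
from collections import defaultdict, deque


class ModerationQueue:
    def __init__(self, flag_threshold: int = 3):
        self.flag_threshold = flag_threshold   # flags before escalation
        self.flag_counts = defaultdict(int)    # content_id -> user flags
        self.review_queue = deque()            # items awaiting a human

    def flag(self, content_id: str) -> None:
        """Record one user flag; escalate once enough users complain."""
        self.flag_counts[content_id] += 1
        if self.flag_counts[content_id] == self.flag_threshold:
            self.review_queue.append(content_id)

    def next_for_review(self):
        """Hand the oldest escalated item to a reviewer, if any waits."""
        return self.review_queue.popleft() if self.review_queue else None
```

Only items that cross the threshold ever reach the paid reviewers, which is what keeps the per-image cost of this arrangement low.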
Internet companies are reluctant to discuss the particulars of content moderation, since they would rather not draw attention to the unpleasantness that their sites can attract. But people in the outsourcing industry say tech giants like Microsoft, Yahoo and MySpace all outsource some amount of content review.
YouTube is an exception. When a user flags a video as inappropriate, software scans the footage for warning signs that the clip breaks the site’s rules or the law. Flagged videos are then sent for manual review by YouTube-employed content moderators who, because of the sensitive nature of the work, are given only yearlong contracts and access to counseling services, according to Victoria Grand, a YouTube spokeswoman.
For its part, Facebook has relied on its users to flag things like pornography or harassing messages. That material is reviewed by Facebook employees in Palo Alto, Calif., and in Dublin.
Simon Axten, a Facebook spokesman, said the company had tried outsourcing the manual review of photos but had not done so widely.