Facebook Moderators Reveal Effects of Battling Violence Online

Moderators are exposed to child pornography, murder, rape and suicide on a daily basis.

Facebook has recruited a staggering number of moderators, growing from 4,000 reviewers to more than 15,000 in the past two years.

CCC, whose centre sits in Barcelona’s Sant Marti district, is one of dozens of moderation outsourcing firms working for Facebook; its employees attempt to reduce the volume of disturbing content on the social media site.

Reviewers see everything from beheadings to neo-Nazi slogans and child pornography. One reviewer in Barcelona said: “It is no secret, we see content that can be violent, violent in a way that you react to.”

“I would not want my daughter seeing this kind of thing,” emphasised another.

A moderator’s day shift typically runs from 7am to 3pm, with the evening shift running from 3pm to 11pm.

Employees must adhere to Facebook’s Community Standards, a set of rules designed to remove offensive content from the site. Moderators can decide whether to delete content completely, hide it or blur an image, let it pass, or alert a senior manager.

Facebook’s use of this army of content moderators has come under intense scrutiny, both over their heavy workload and over the toll the job takes on their mental wellbeing.

Repercussions of harrowing content

According to Sarah Katz, a former US-based Facebook moderator who worked through the outsourcer ProUnlimited, each member of her team was expected to review around 4,000 posts per day, spending no more than a minute on each. “The most difficult content I viewed was the child pornography,” she stated.

Katz, who left the company three years ago, recalls a harrowing post she had to remove: a video of a girl of 12 and a boy of nine who were not wearing pants or underwear.

In September 2018, a group of content moderators in the US sued Facebook over failing to provide enough mental health support for its moderators. The claimants said they were suffering from Post Traumatic Stress Disorder having been “bombarded” with videos and images of “child sexual abuse”, “rape”, “suicide” and “murder”.

One reviewer said: “I don’t think it is possible to do the job and not come out of it with some acute stress disorder or PTSD.”

Amid mounting pressure, Facebook says outsourcing firms such as CCC have counselling services in place, including individual and group sessions and other forms of therapy on site.

Cleaning up Facebook is fast becoming vital to the site’s future as pressure from shareholders and regulators increases around the world. Facebook also plans to continue to release reports on what content is taken down each quarter.

Recently, chief executive Mark Zuckerberg said: “I think the health of the discourse is just as important as any financial reporting we do so we should do it just as frequently.”

A monumental task

According to its latest Community Standards Enforcement Report, Facebook took action against around 73 million pictures, videos, posts and memes that violated its policies.

Of that, Facebook said it removed 5.4m pieces of content depicting child nudity or abuse (representing 0.03% of views on Facebook) and 6.4m pieces of terrorist propaganda.

“We have billions and billions of things being uploaded,” says Paula Garuz Naval, a safety product policy manager at Facebook. “It is international in scale. Facebook is a reflection of the real world and those are the types of things that we see.”

Simon Cross, group product manager for community integrity at Facebook, says the company’s ultimate aim is to replace as much of the work of content moderators as possible with its own machine learning technology.

“Ideally we would enforce them completely accurately, but that is not a state we are ever likely to reach. Reporting from users is a key part of how we solve these problems.”

Facebook recently announced the hiring of 500 artificial intelligence and security experts, to be based in London.

Cross says the moderators are also helping Facebook train its artificial intelligence systems, “tuning them over time” to remove content more accurately. Each piece of content that is removed, and occasionally restored on appeal, becomes a piece of training data.

However, for the foreseeable future, the thousands of employees at firms like CCC in Europe, Cognizant in the US and CPL in Dublin will continue to clean up behind the scenes, giving social media users the censored Facebook experience they have become accustomed to.


