Man hired to moderate Facebook content says more needs to be done to block heartbreaking uploads


To be honest, I don’t remember the first time I watched an execution. When you have seen so many that they begin to merge into one, it is impossible to remember the first.

Then there is the cruelty to animals, the torture and the disgusting pornography that I have witnessed.

You must think that I am some kind of twisted individual with disturbing online tastes. Or that I am a police investigator with years of training in how to handle online content that is the stuff of nightmares.


Former Facebook moderator Chris Gray and his wife Mildka Gray are pictured above in Dublin. Mr. Gray writes: “I am now suing Facebook and CPL Resources, to which Facebook has outsourced moderation services, for the psychological injuries that I believe I suffered as a result of my work”

The truth is neither. Instead, these memories are the deeply disturbing legacy of the nine months I spent working for Facebook as a moderator, or, as I was officially called, a “community operations analyst.” It’s a role that left me with post-traumatic stress disorder and a deep cynicism about whether social media can ever really be controlled.

This week, Facebook announced that it would tighten its rules on political advertising in the run-up to the US presidential election in November, including allowing users to turn off political ads entirely. It is the latest move in Facebook’s strategy to clean up its image. Former Lib Dem leader Sir Nick Clegg is now its vice president of global affairs and communications.

But with the focus on controlling political content, it is easy to forget the dark side of Facebook – the immeasurable amount of deeply disturbing content uploaded every second.

As a moderator, I was employed to filter the content that British and Irish users see every day, scanning it on my screen for eight hours a day.

My job was to help keep this vast universe of uploaded content under control and make sure it didn’t violate the company’s community standards.

I was based at Facebook headquarters in Dublin and later in another building.


During the only interview I had before taking the job, I was asked if I understood that I would sometimes see content I might find a little disturbing. As I discovered, “a little disturbing” was not even close to describing what I would have to watch [File photo]

Although I was employed by a contractor, CPL Resources, when I did the work in 2017-2018, the link between what I was doing and the tech giant was clear. I am now suing Facebook and CPL Resources, to which Facebook has outsourced moderation services, for the psychological injuries I believe I suffered as a result of my work.

I’m not alone. To date, more than 20 former moderators are suing Facebook in Dublin.

Meanwhile, Facebook recently agreed to compensate moderators in the United States following a class action.

Becoming one of the tens of thousands of analysts around the world tasked with deciding whether a Facebook post is hateful, harmful, satirical, innocent or just plain stupid was not the most obvious role for a man who had spent the 1980s running his own construction business in Bath, then left the UK to teach English in China.

Those were interesting times but, when my wife Mildka and I got married, I found myself wanting to return to a country where English was the mother tongue, so we moved to Ireland.

I was 50 and needed a new job. Facebook, via CPL, was looking to hire 800 people in three months. It was not well paid, with a base rate of less than £12 an hour, but it was a foot in the door.

As more moderators were employed, we were given different roles. At first, I mainly watched pornography which, on the second day, included a scene of bestiality [File photo]


It was shortly after teenage girl Molly Russell, from London, committed suicide after viewing graphic self-harm and suicide material on Facebook, and boss Mark Zuckerberg was under pressure to take more aggressive action to moderate the content.

During the only interview I had before taking the job, I was asked if I understood that I would sometimes see content that I might find a little disturbing.

As I discovered, “a little disturbing” was not even close to describing what I would have to watch. Nothing can prepare you for the depravity that reigns on social media.

My first day in the company’s large offices was exciting. There were 50 of us at the induction, from all over the world. It was hectic and noisy, but I think we all shared the same feeling that we were on an important mission. That feeling quickly evaporated.

There was a week and a half of training, quite basic stuff with an instructor armed with a script and PowerPoint. By the second week, we were supposed to be ready to make judgments that could affect people’s lives forever.

I worked evening shifts, coming in at 6 p.m. and working until 2 a.m. There were about 20 others, filling a corner of a large, open-plan floor. Anyone visiting would have thought it was a model workplace – as long as they were not looking at a computer screen.

As more moderators were employed, we were given different roles. At first, I mainly watched pornography which, on the second day, included a scene of bestiality. But after the first month, I mostly worked on the high priority queue – hate speech, violence, really disturbing content.

My days always followed a similar pattern. I started by checking updates to the Facebook rules, which were numerous and constantly evolving; it was my responsibility to pass this information on to the other moderators on my team.

Then it was on with the day, looking at my “game plan”, as it was called, which told me what I should focus on first and how many “tickets” – the tech industry’s term for tasks – I had to get through.

In my case, this meant evaluating hundreds of different pieces of content that had been flagged as questionable by Facebook users.

Obviously, not all of the content was disturbing – some people use the report function in anger and, as moderators, we ended up reviewing family births, marital rows and abuse between teenagers.

There were even fluffy kittens and puppies to look at, because someone might have reported a Facebook user for selling pets.

I would say 10-20% of what I had to watch was disturbing, but when you watch hundreds of clips or images every day, it adds up quickly.

When you think someone you are watching is going to hurt themselves or someone else, as has happened on Facebook, there is a process to follow. There are over 100 options to choose from when taking action [File photo]


Working to an average handling time, or AHT (because everything has an acronym), of 30 seconds per ticket felt relentless. Imagine seeing so much depravity that you can’t remember the details.

Much of it is unsuitable to describe in a family newspaper.

But I remember the day when I had to watch a montage of images relating to ethnic cleansing in Myanmar. There were burnt villages, refugees carrying bags – then a picture of a baby lying on the ground, eyes closed and a human foot pushing on his chest.

It seemed an easy decision – a photo of a baby who had suffered a violent death – so I deleted it.

Because of the foot, I thought someone had crushed the baby and it had stopped breathing.

But, soon after, I was audited, a process that happens regularly to ensure that moderators are making decisions that follow Facebook’s rules.

It even had a name, your “quality score”, which at the time had to stay at 98%.

The auditor questioned my decision; there was no confirmation of the death, he said, with no visible blood or fracture. How did I know this baby was dead?

At the time, my overwhelming fear was: “I can’t afford to let my quality level go down, I need this job.”

I pointed out that the baby was not resisting – if he were alive there would surely be a flicker of movement, some resistance to the force pressing down on him, but his arms were flat on the ground, palms facing down. To my relief, my decision was upheld.

And I am ashamed to admit that my thoughts were not about the violent death of an innocent child, but about my job security.

This is what the job did to me. I became numb not only to the atrocities, but to the insidious drip of the horror of human behavior.

It wasn’t until two years later, when I was speaking on stage at an event, that I suddenly remembered this incident and burst into tears.

It was, I realized, the first time I had really felt anything for that baby. It was overwhelming.

Having watched footage of terrorist executions, I can tell you what happens when a person is shot in the head at point-blank range – and it is nothing like what happens in a video game.

Sometimes there were tragic accidents – people talking on the phone as they stepped off a bus and were killed.

Did you know that there are people who collect CCTV footage, edit it all together, add party music over the top and post it online? That, too, ended up on my screen.

Through the weight of experience, I also learned horrible details about the damage people do to themselves.

If someone is hurting themselves, you need to look at how they did it to work out whether it should be classified as self-harm or attempted suicide.

It is not that one is deemed OK and the other not, but rather that each is classified differently.

If you think there is imminent danger, such as when watching a live video stream, there is an escalation chain to follow.


It wasn’t long after teenage girl Molly Russell, above, from London, committed suicide after viewing graphic self-harm and suicide material on Facebook, and boss Mark Zuckerberg was under pressure to take more aggressive action to moderate the content

That was the great fear for all of us. When you think someone you are watching is going to hurt themselves or someone else, as has happened on Facebook, there is a process to follow.

There are over 100 options to choose from when taking action on the content. The priority is speed but, as a moderator, you have no idea what happens after you press the buttons to escalate. There is a horrible feeling of helplessness.

What adds to the stress of exposure to disturbing images and videos is the pressure to make the right decision about what to do with them.

You are in constant danger of making a wrong decision and being penalized for it – like categorizing something as violence when in fact there is a flash of nipple, since nudity is a different category.

And the pace was relentless. Press a button, make a decision, press a button, make a decision, for hours on end. Because I had signed a non-disclosure agreement, I was not allowed to discuss anything I saw with my wife.

But I was on a hair trigger all the time, ready to argue with anyone about anything. Frankly, I was not a nice person to be around.

Yes, the company made efforts to take care of us, albeit in a very American way. The same week I joined, a wellness coach arrived to set up a “wellness team.”

We would receive emails saying “we’re doing yoga at 11 am” or “we’re doing finger painting in the canteen”, and I had meetings with a wellness coach. But try taking care of yourself when you have a 98% accuracy target and your boss is agitated because there is a backlog in the queue of child sexual abuse images.

In the end, I was there a little less than a year. The firm let me go, it said, because my quality score was not high enough.

When I got home and told my wife, she was so relieved. She said: “Oh my God, you’ve been such a miserable and awful person for the past six months. I’m so glad you’re out of that job.”

It took me a year to realize how much the work had damaged me. I was a mess: someone who had become aggressive, had trouble sleeping and whose relationship had suffered.

I became someone I didn’t like. But it wasn’t until I decided to speak about it publicly that the emotional floodgates opened and I realized how affected I was.

I have urged moderators from all over Europe to get in touch, and encouraged them to speak to the nonprofit organization Foxglove, which is campaigning to end the abuse of digital technology and which has supported me.

Talking about it brings back flashbacks, but it is essential that I speak out, because I know I am not alone.

So what is the answer? If content on Facebook needs to be controlled – and most people think it should be – how can it be done?

I think there should be moderators, but they shouldn’t be low-paid, overworked workers without specialized training.

They must be well-trained, well-paid professionals who understand what is going on and are fully supported.

With a net profit of £14 billion last year, Facebook can afford it. The role of a moderator may not be as visible as that of a police officer on the beat, but it is just as important.

Facebook says it introduced a new set of standards last year.

A spokesperson said: “We are committed to providing support to those who review Facebook content because we recognize that reviewing certain types of content can sometimes be difficult.

“Anyone who reviews Facebook content goes through an in-depth, multi-week training program on our community standards and has access to extensive psychological support for their well-being.

“This includes 24/7 on-site assistance with qualified practitioners, an on-call service and access to private healthcare from the first day of employment.

“We also use technical solutions to limit their exposure to graphic material as much as possible. Providing this support is really important and we are committed to doing it right.”

These days, I work as a guide, taking tourists around Ireland by coach, although the pandemic has put a stop to that for the moment. Even so, I would still rather be where I am now than where I was then.

Meanwhile, I don’t have a Facebook account and I’m not on Instagram or Twitter. Not surprisingly, I don’t have a great appetite for social media these days.
