
“The web giants have chosen to make the moderation process invisible”

Moderator work is poorly regarded and poorly paid, even though it carries psychological risks. QUENTIN HUGON FOR LE MONDE

It is a profession whose contours remain little known, yet it is central to how the major web platforms function: moderators are responsible for ridding those platforms of user-posted content (photos, videos, comments, etc.) that the platforms prohibit. Spread across open-plan offices around the world and managed by subcontractors, these thousands of workers spend their days examining the worst content circulating on the Internet and deciding its fate.

Read our investigation: Murders, pornography, racism… In the shoes of a Facebook moderator

Sarah T. Roberts, an American information scientist who teaches at the University of California, Los Angeles (UCLA), has spent eight years working on this topic and has met with dozens of moderators. The result is a book, Behind the Screen (not yet translated into French). We interviewed her during a visit to Paris, where she was taking part in a series of talks on "the dark side of work" organized at the Gaîté Lyrique.

The web giants (Facebook, YouTube, Twitter, etc.) are very reluctant to talk about how they moderate content. Why?

When these companies launched about fifteen years ago, the last thing they wanted was to have to make decisions about content. They concentrated on technology and presented themselves to the American authorities as "Internet service providers." This allowed them to claim that content was not their core business, and to move forward without being held accountable.

This positioning allowed them to grow, so they had to keep their content decisions secret. They sold users the idea that there was no intervention, that you could express yourself on YouTube or Facebook without any barriers. They didn't want an asterisk; they didn't want to go into the details. Yet from their very beginnings, content was being deleted.

Read also: Internet: GAFA want to remain hosts of content with limited liability

In your book, you go so far as to say that these companies "erase the human traces" of moderation. That is a very strong expression…

These companies deeply believe that technical solutions are superior to all others, and they hold onto the idea that a technology strong enough to replace these workers will soon exist. They seem to be saying: "Don't get too used to the idea that there are humans; it's only temporary."

Moreover, if users knew there were humans involved, they would demand explanations. And these companies did not want to be responsible for those decisions. So they chose to make the process invisible and crossed their fingers that people would not ask too many questions.

Moderation is hard work. What are the risks faced by these workers?

There are several, starting with psychological risks. The moderators I spoke to often told me: "I am able to do this job; other people left after two weeks, but I am strong." And a few minutes later, they would say "I drank a lot" or "I don't want to go out with friends anymore, because we always end up talking about work, and I don't want to talk about it, and besides, I'm not allowed to."

When people tell me they can't stop thinking about an image or video they saw at work, it's worrying. One woman, who moderated for MySpace about fifteen years ago, told me she didn't like meeting new people and shaking their hands: "I know what people do, and they are vile. I don't trust anyone anymore."

There are also other difficulties, less obvious ones. You have to be very cultured to be a good moderator; many have studied literature, economics, or history, sometimes at prestigious universities. Yet it is regarded as low-level work and is fairly poorly paid.

How have the working conditions of the moderators changed?

It's hard to say. These are often call-center conditions: from a purely material standpoint, it's relatively comfortable. In the Philippines, people work in air-conditioned skyscrapers, which matters there.

But how does that change what they see every day? Some companies have made psychologists available. The employees are wary, though: when the psychologist is paid by your employer, it creates a complicated relationship, and it can be difficult for a worker to open up sincerely.

But if these companies did not provide psychologists, we would blame them for that too…

That is true. But is it really effective, or is it just window dressing? One moderator told me: "The problem is that if I get up to see the psychologist, my manager and my colleagues all see me get up and understand that I have a problem. I don't want them to know." And besides, these are short-term solutions.

What should these companies do to ensure the well-being of their employees?

For starters, they could pay them more. Second, the lack of recognition for their work weighs heavily on them, as do the confidentiality agreements, which make them afraid to talk about their work and leave them disconnected from others. They don't even know who the other moderators are. These are things that could easily change. As for the psychological side, we should study the long-term effects of this work. To my knowledge, no such study exists.

Could we do without moderators? What would the Internet be like without them?

Like the 4chan forum [known for its very light moderation and for the virulence of some of its anonymous members]? It would be the worst! Most people don't want that, so they have to accept moderators. Internet users spend a lot of time on the major mainstream, advertising-supported platforms, and the fact that they don't understand how those spaces are moderated is a problem, because moderation reflects values.

An Internet entirely without moderation has never really existed. True, I gave the example of 4chan. But at some point 4chan made a decision that its approach would be to let everything through. That, too, is an ideological stance. And it's important to note that this is what some people want.

You write that this research subject represents “the work of a lifetime”. What aspects of moderation interest you most for the future?

It's the work of a lifetime because it changes all the time: Facebook changes its rules every two weeks! And if 100,000 people work as moderators today, where will we be in two, five, or ten years?

Read also: Visit one of Facebook's very secret moderation centers

I am also very interested in the question of regulating these companies, as well as in how Americans have adopted these technologies over the past fifteen years. We are approaching the 2020 presidential election, these platforms are deeply involved, and we have had interesting statements from Facebook on the veracity of the information it distributes… All of this is in some way tied to the question of moderation. These companies are now being held responsible, when they never wanted to be! Dealing with this is their worst nightmare, and they have to do it all the time.
