“You can’t see them in the dark.” Cengiz Haksöz opens his statement with this line from The Threepenny Opera. Cengiz is a content moderator for Meta. I am sitting next to him as he describes his working conditions. We are in no less a place than the German Bundestag. It is the first time a content moderator has addressed the members of the Digital Committee.
There is a lot to tell, because the conditions are precarious. Cengiz describes thousands of hours of violent posts that he and his colleagues have to sift through. Although he moderates content for the tech company Meta, he is not employed by Meta: content moderation is outsourced, and Cengiz works for a company most people have never heard of. The material is so gruesome that it leaves lasting psychological damage. Psychological support from employers is inadequate, pay is poor, and time pressure and surveillance are the order of the day.
The situation of content moderators
Until now, little has been known about the working conditions of content moderators, because the companies they work for cultivate a climate of fear and secrecy. Anyone who speaks publicly about the situation risks their job. Cengiz Haksöz feels this too: a few days after the hearing, he is suspended from work and barred from entering the company building.
The problem Cengiz describes is all too familiar. It is called exploitation. Not just here in Germany, but on a global scale. Content moderation takes place mainly in countries of the Global South. Time Magazine has described the content moderators’ offices in Kenya as Facebook’s African sweatshops. The moderators do their work without adequate safeguards, for poor pay, and with their human rights disregarded. There is even talk of forced labor.
We know sweatshops and exploitation from sectors such as the textile or meat industry. Digital services, however, are not on our radar in this context. Yet AI, social networks, and self-driving cars simply would not exist without the work of tens of thousands of clickworkers and content moderators. For ImageNet alone, one of the largest image databases for training AI systems, more than 49,000 people categorized, tagged, and annotated images.
(Image: Oliver Ajkovic)
As a co-founder of the feminist organization Superrr Lab, Julia Kloiber works on just and inclusive digital futures. She writes a regular column in the print edition of MIT Technology Review.
Digital products must be included in the Supply Chain Act
A new law offers hope for a global improvement in conditions: the Supply Chain Act. It aims to put an end to exploitative working conditions by holding companies accountable for respecting human rights in their supply chains. They must ensure that protection against child labor, the right to fair wages, and environmental standards are upheld throughout the supply chain. Simply outsourcing precarious work and evading responsibility, as large tech companies do with content moderation, becomes much more difficult under the law.
So much for the theory. Unfortunately, while the law covers the supply chains of products such as clothing, food, and hardware, it says nothing about digital services. To change the precarious situation of tens of thousands of content moderators worldwide, digital products such as social networks and AI systems must also be covered by the Supply Chain Act. That would be a big step for all those who until now have worked in this field unseen.
Those in the dark are increasingly being seen. What matters now is that visibility is followed by action.
(jl)