Kenyan workers who clean the Internet of toxic and harmful content have formed a union to demand better working conditions. The precarious, underpaid conditions faced by workers in this sector, hired by high-tech companies, are well known.
The content moderator's job is to sift through online content and flag material that is violent, disturbing, or controversial: sometimes extremely violent videos, other times political propaganda, images that degrade people, child pornography, and so on.
The need to filter the content of social networks has created a labor sector characterized by poor pay and inadequate working conditions. A starting point in the new union's long struggle will be raising the current wage in Kenya, which is barely three euros an hour.
Other demands of the Kenyan union will focus on the design of more humane moderation protocols, the creation of mental health services, and the establishment of occupational health and safety standards.
As in other sectors of production and services, the first world outsources the socio-economic and political cost of these "decontamination" jobs to underdeveloped labor markets. In this way, companies evade the stricter protocols of their countries of origin and obtain cheaper labor.
In 2022, Facebook's Spanish-language content moderation staff, subcontracted within the United States itself, denounced the terrible working conditions they endured compared to their English-language counterparts.
The basic demands raised by workers in the sector around the world, especially in countries with weaker labor regulations for online work, have been: first, rotation cycles of at least three months, in which moderators cycle through different types of content and take full breaks from moderation work; second, the right to mental health assistance; and third, realistic performance targets. Moderators should be given more time to make moderation decisions: on average, they are currently required to assess a piece of content every 60 to 66 seconds with 85 percent accuracy.
Months earlier, content moderators from several African countries demanded compensation of 1.46 billion euros for violation of labor rights and outsourcing. So far, most of the demands made by the sector to the technology transnationals have been ignored.