I was a content moderator for Facebook. I saw the real cost of outsourced digital labour | Sonia Kgomo


Mark Zuckerberg may be done with factchecking, but he cannot escape the truth. One of the world's richest men has announced that Meta will replace its independent factcheckers with community notes. I went to the AI Action Summit in Paris this week to tell executives and tech policymakers why this is wrong.

Rather than scaling back the programmes that make social media and artificial intelligence more trustworthy, companies should be investing in the people who moderate social media and label the data that AI depends on. I know, because I used to be one of them.

A mother of two young children, I was recruited from South Africa with the promise of joining Kenya's growing tech sector, working as a content moderator for Sama, a Facebook subcontractor. For two years, I spent up to 10 hours a day staring at child abuse, human mutilation, racist attacks and the darkest parts of the internet, so that you did not have to.

It was not just the type of content I had to watch, which gave me insomnia, anxiety and migraines, but also the quantity. At Sama we had something called AHT, or average handling time: the amount of time we were given to analyse and rate a piece of content. We were timed, and the company measured our success in seconds. We were under constant pressure to get it right.

You could not pause if you saw something distressing. You could not pause for your mental health. You could not pause to go to the bathroom. You just could not pause. We were told that the client, in our case Facebook, required us to keep going.

This was not the life I had imagined when I moved to Nairobi. Isolated from my family, my only real community was my colleagues at Sama and other outsourcing companies. When we gathered, our conversations were always about the same thing: our work, and how it was breaking us.

The more we talked, the more we realised that something bigger than our individual stories was going on. Every content moderator, data annotator and AI worker we met had the same stories: impossible quotas, deep trauma and disregard for our wellbeing.

This was not just a Sama problem. It was not just a Facebook problem. It was how the entire tech industry operates: outsourcing its most brutal digital labour and profiting from our pain.

These issues are currently the subject of a class action lawsuit in Kenya, brought by 185 former content moderators against Sama and Facebook's owner, Meta, as reported by the Guardian. Approached for comment, Sama said that as of March 2023 it was no longer involved in content moderation and no longer employs content moderators. It added that the Kenyan court has asked both parties not to speak to the media about the ongoing litigation.

I left two years ago. Since then, the problems have got worse. I know this through my work helping African tech workers organise. Workers are still traumatised, and the work has intensified. Content moderators now have to watch videos at 2x or 3x speed, across several screens at a time. Pay and conditions are no better: some data-labelling workers are paid as little as $0.89 (70p), while content moderators get $2.

Things cannot continue as they are, but Zuckerberg's approach of weakening protections is the wrong path. This work needs to be professionalised. We need standards for workers such as content moderators that recognise the difficulty of the job and respect our rights. That means training and real health and safety protocols, like any other profession. It means guaranteed living wages and reasonable quotas. It means creating a framework that respects our humanity and dignity. It means a union.

Meta declined to comment on the specific claims because of the ongoing litigation, but said that it requires the companies it outsources to to provide counselling and healthcare, and to pay local industry-standard rates. It also said that it provides technical safeguards, such as the ability to opt out of autoplay, where videos or images play in a feed without stopping.

We cannot wait for the tech companies to fix this problem. African tech workers are organising for better pay, mental health protections and professional standards, in Kenya and beyond. We are doing this because artificial intelligence is not magic. Behind every algorithm are thousands of hidden workers labelling data, training models and moderating content under precarious conditions. The labour that powers AI remains invisible because so many prefer to focus on technological innovation rather than on the supply chains that sustain it.

If you believe in a safer, more ethical internet, stand with us: support our organising efforts, push politicians to regulate big tech, and demand that AI and social media companies respect all of their workers. Change will not come from above; it will come from us. That is the truth.

  • Sonia Kgomo is an organiser with African Tech Workers Rising, a project supported by UNI Global Union and the Communication Workers Union of Kenya

  • Do you have an opinion on the issues raised in this article? If you would like to submit a response of up to 300 words by email, to be considered for publication in our letters section, please click here.
