Meet Mercy and Anita – the African workers driving the AI revolution, for just over a dollar an hour
By James Muldoon, Mark Graham and Callum Cant.
The Observer – Artificial Intelligence. July 6, 2024.
Social media content and AI training data are processed in outsource centres in the global south, where long hours, low pay and exposure to disturbing material are the norm
From the book: Feeding the Machine – The Hidden Human Labour Powering AI, by James Muldoon, Mark Graham and Callum Cant.
AI feeds off the work of human beings.
Mercy craned forward, took a deep breath and loaded another task on her computer. One after another, disturbing images and videos appeared on her screen. As a Meta content moderator working at an outsourced office in Nairobi, Mercy was expected to action one “ticket” every 55 seconds during her 10-hour shift. This particular video was of a fatal car crash. Someone had filmed the scene and uploaded it to Facebook, where it had been flagged by a user. Mercy’s job was to determine whether it had breached any of the company’s guidelines that prohibit particularly violent or graphic content. She looked closer at the video as the person filming zoomed in on the crash. She began to recognise one of the faces on the screen just before it snapped into focus: the victim was her grandfather.
Mercy pushed her chair back and ran towards the exit, past rows of colleagues who looked on in concern. She was crying. Outside, she started calling relatives. There was disbelief – nobody else had heard the news yet. Her supervisor came out to comfort her, but also to remind her that she would need to return to her desk if she wanted to make her targets for the day. She could have a day off tomorrow in light of the incident – but given that she was already at work, he pointed out, she may as well finish her shift.
New tickets appeared on the screen: her grandfather again, the same crash over and over. Not only the same video shared by others, but new videos from different angles. Pictures of the car; pictures of the dead; descriptions of the scene. She began to recognise everything now. Her neighbourhood, around sunset, only a couple of hours ago – a familiar street she had walked along many times. Four people had died. Her shift seemed endless.
We spoke with dozens of workers just like Mercy at three data annotation and content moderation centres run by one company across Kenya and Uganda. Content moderators are the workers who trawl, manually, through social media posts to remove toxic content and flag violations of the company’s policies. Data annotators label data with relevant tags to make it legible for use by computer algorithms. Behind the scenes, these two types of “data work” make our digital lives possible. Mercy’s story was a particularly upsetting case, but by no means extraordinary. The demands of the job are intense.
Moderators witness suicides, torture and rape ‘almost every day … you normalise things that are just not normal’
“Physically you are tired, mentally you are tired, you are like a walking zombie,” said one data worker who had migrated from Nigeria for the job. Shifts are long and workers are expected to meet stringent performance targets based on their speed and accuracy. Mercy’s job also requires close attention – content moderators can’t just zone out, because they have to correctly tag videos according to strict criteria. Videos need to be examined to find the highest violation as defined by Meta’s policies. Violence and incitement, for instance, are a higher violation than simple bullying and harassment – so it isn’t enough to identify a single violation and then stop. You have to watch the whole thing, in case it gets worse.
“The most disturbing thing was not just the violence,” another moderator told us, “it was the sexually explicit and disturbing content.” Moderators witness suicides, torture and rape “almost every day”, commented the same moderator; “you normalise things that are just not normal.”