May 26, 2022

Meta and Sama, its leading content moderation subcontractor in Africa, are facing legal action in Kenya over allegedly unsafe and unfair working conditions unless they meet 12 demands on working conditions submitted to them.

In a notice seen by gaming-updates, Nizili and Sumbi Advocates, the law firm representing Daniel Motaung, a former Sama employee fired in 2019 for organizing a strike, accused the companies of violating various rights, including the health and privacy of Kenyan and international employees.

Motaung was fired, allegedly for organizing the strike and trying to unionize Sama's workers. The law firm gave Meta and Sama 21 days (from Tuesday, March 29) to respond to the demands or face a lawsuit.

In the formal demand letter, the law firm asked Meta and Sama to comply with the country's labor, privacy and health laws, to hire qualified and experienced medical professionals, and to provide adequate mental health insurance and better compensation to moderators.

“Facebook outsources much of this work to companies like Sama, a practice that keeps Facebook's profits high but comes at the cost of the health of thousands of moderators and of Facebook's safety around the world. Sama moderators report ongoing violations, including unsafe and degrading work and the risk of post-traumatic stress disorder (PTSD),” Motaung's lawyers said.

The pending lawsuit follows a Time story detailing how Sama recruited moderators under the false pretense that they were taking jobs at a call center. According to the story, the content moderators, recruited from across the continent, only learned the true nature of their jobs after signing employment contracts and relocating to the Nairobi hub.

Moderators scrutinize social media posts across all of Meta's platforms, including Facebook, to remove content that spreads hate, misinformation and violence.

One of the many requirements imposed on employees is that they not disclose the nature of their work to outsiders. Content moderators in Africa earn some of the lowest salaries in the industry worldwide, according to the article. Sama positions itself as an ethical AI company; following the disclosure, it recently increased its employees' wages.

The law firm argued that Sama failed to provide Motaung and his colleagues with adequate psychosocial support and mental health measures, including “unscheduled breaks when needed, especially after viewing graphic material.” Sama also tracked employee productivity using Meta software that measured screen time and movement during working hours, while allowing moderators only “thirty minutes a day with a wellness counselor.”

“Sama and Meta failed to prepare our clients for this kind of work and its consequences. The first video Motaung remembers moderating was of a beheading; he had not been offered any psychological support beforehand,” the law firm said.

Mercy Mutemi, who is leading the case, said: “Like many Kenyans, I use Facebook, and it is an important place to discuss the news. That is why this issue is so important.”

“The safety and integrity of our democratic process in Kenya depends on Facebook being properly staffed, with the content moderators who work on the front lines of the fight against hate and disinformation getting the support they need to protect us all. This is not an ordinary employment issue: the working conditions of Facebook's moderators affect all Kenyans.”
