Outsourcing Firm Expresses Regret Over Facebook Moderation Work in East Africa

Outsourcing firm Sama has expressed regret for taking on Facebook content moderation work in East Africa, admitting the contract was a mistake because of the traumatic impact on former Kenyan employees exposed to graphic posts. Some of those ex-employees are pursuing legal cases against the company. CEO Wendy Gonzalez acknowledged lessons learned and announced a new policy against moderating harmful content. The controversy highlights the challenges of outsourcing moderation work and the need for ethical safeguards in AI projects.

A firm contracted for moderating Facebook content in East Africa has admitted to regretting its decision, acknowledging that, in hindsight, it should not have undertaken the task. Sama, an outsourcing company, has faced criticism from former Kenyan-based employees who claim exposure to distressing graphic posts traumatized them. Several of these ex-employees are now pursuing legal action against the company through the Kenyan courts.

Wendy Gonzalez, CEO of Sama, stated that the company would no longer take on moderation work involving harmful content. The moderation hub, which had operated since 2019, reportedly exposed workers to graphic material, including videos of beheadings and suicides.

Former moderator Daniel Motaung revealed that the first graphic video he encountered was a "live video of someone being beheaded." Motaung is currently suing both Sama and Facebook's owner, Meta. Meta says it requires all its partner companies to provide around-the-clock support, while Sama maintained that certified wellness counselors were available to its employees.

Gonzalez told the BBC she regretted the work, noting that the moderation contract represented just 4% of the company's business and that she would not accept it again if given the chance. Sama officially announced its discontinuation of this work in January.

Gonzalez further admitted to "lessons learned" and revealed a new company policy against taking on content moderation work involving harmful material. Additionally, Sama would abstain from engaging in artificial intelligence (AI) projects related to weapons of mass destruction or police surveillance.

When asked about the claims of harmed employees, Gonzalez declined to give a direct answer, citing ongoing litigation. She said the potential harm of moderation work is a "new area that absolutely needs study and resources."

Sama is an unconventional outsourcing firm that initially aimed to alleviate poverty by providing digital skills and income through outsourced computing tasks. The company nonetheless found itself embroiled in controversy over its moderators' exposure to harmful content.

Gonzalez reiterated the importance of involving African people in the digital economy and in the development of AI systems. She underlined the significance of African teams moderating African content, arguing that external moderators could not effectively understand and moderate local languages.

Gonzalez revealed that she herself had engaged in moderation work in the past. Moderators at Sama received around 90,000 Kenyan shillings ($630) per month, which was considered a respectable wage in Kenya.

The company also had a partnership with OpenAI, working on tasks related to AI development. However, Sama discontinued such work after staff in Kenya raised concerns about tasks not outlined in their contract.

Gonzalez concluded by reaffirming Sama's commitment to non-harmful AI projects that contribute positively to sectors such as driver safety and crop disease detection. She stressed the need for global collaboration in AI development to avoid reinforcing biases and to ensure a diversity of perspectives.