Meta's horrific content broke him. Now he wants the company to pay


The case is the first brought by a content moderator outside the company's home country. In May 2020, Meta (then Facebook) reached a $52 million settlement with US moderators who developed PTSD while working for the company. But earlier reporting found that many of the company's international moderators, doing much the same job, received lower pay and less support while working in countries with fewer mental health and labor protections. While moderators in the US earn about $15 an hour, moderators in countries such as India, the Philippines, and Kenya earn significantly less, according to reporting from 2019.

“The whole point of sending content moderation work overseas and far away is to hold it at arm’s length and keep the cost of this business function down,” says Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, who authored a 2020 report on outsourced content moderation. But content moderation is critical to keeping platforms running, keeping the kind of content that would alienate users and advertisers off the platform. “Content moderation is a core vital business function, not something peripheral or secondary. But there is a great irony in the fact that the whole arrangement is designed to offload responsibility,” he says. (An abridged version of Barrett’s report was included as evidence in the current case in Kenya on behalf of Motaung.)

Barrett says that in other industries that outsource, such as apparel, companies would find it unthinkable today to say they bear no responsibility for the conditions in which their clothes are made.

“I think tech companies, being younger and in some ways more arrogant, think they can pull this trick,” he says.

A Sama moderator, speaking to WIRED on condition of anonymity out of concern for retaliation, described having to review thousands of pieces of content every day, often deciding what can and cannot stay on the platform in 55 seconds or less. Sometimes that content can be “something graphic, hate speech, bullying, incitement, something sexual,” they say. “You should expect anything.”

Foxglove Legal’s Crider says the systems and processes that Sama moderators work under, which have been shown to be mentally and emotionally damaging, are all designed by Meta. (The case also alleges that Sama engaged in labor abuses through union-busting activities, but does not allege that Meta was part of those efforts.)

“These are broader complaints that the system of work is inherently harmful, inherently toxic, and exposes people to an unacceptable level of risk,” Crider says. “That system is functionally identical whether the person is in Mountain View, Austin, Warsaw, Barcelona, Dublin, or Nairobi. So, from our point of view, the point is that it is Facebook that designs a system that causes trauma and the risk of post-traumatic stress disorder for people.”

Crider says that in many countries, especially those that rely on British common law, courts often look to decisions in other, similar jurisdictions to help shape their own, and that the Motaung case could be a model for outsourced moderators in other countries. “While this does not set any formal precedent, I hope this case can serve as a guide for other jurisdictions considering how to deal with these large multinational corporations.”
