Sin eating was an age-old British practice carried out by those on the fringes of their communities. When someone died, the sin eater would consume a ritualistic meal over the corpse and in doing so take on the deceased's sins. Whether they were outcasts because of this, or outcasts to begin with, folklorists can't say. What is known for certain, though, is that they were among the poorest – who else would do it?
It’s a dirty job and someone’s got to do it. This episode takes a look at modern-day sin eaters in the form of content moderators. Early in the episode we hear this:
“One woman I spoke to told me, ‘After I stopped working on content moderation I wouldn’t shake people’s hands.’ If you’ve had the job that I’ve had, you know that people are nasty.”
This is a harrowing episode in many ways, touching on content that ranges from the merely offensive to the much, much worse. It’s a side of content moderation that we don’t hear much about, in part because companies simply don’t want to talk about it, and that in turn means it’s a side of content moderation that is greatly under-appreciated.
Aleks Krotoski points out:
Their job is vital, but we treat them like second-class citizens.
The episode features input from Sarah T. Roberts, Assistant Professor of Information Studies at UCLA, who has been studying content moderation – or Commercial Content Moderation (CCM), as she terms it – since 2010. Sarah runs a website on the subject, The Illusion of Volition.
Sarah T. Roberts and Aleks Krotoski both highlight that this is a human role: technology alone can’t deal with this content, and that in turn comes at a human cost to the people who perform the work of content moderation.
Earlier this year Herald.Net reported: “Ex-Microsoft worker says policing toxic images led to PTSD”. The article informs us that back in 2008, Henry Soto got a job at Microsoft as part of an online safety team. Henry brought the horrific content he encountered to the attention of authorities worldwide, but it appears to have come at a heavy price, as the article explains:
Soto, 37, now is living with a diagnosis of post-traumatic stress disorder. His health care providers have linked his anxiety, memory troubles and other crippling symptoms to repeated exposure to the materials he helped remove from digital space.
This is not the only story of a content moderator reporting that they have PTSD.
Content moderators also face issues with colleagues in other departments: they can be isolated from the rest of the company, they are in some cases identified by a different coloured badge, and they can’t really talk widely about what they are exposed to.
There’s another issue for content moderators, too: a very real danger that they become desensitised to the content.
This episode definitely makes the point that content moderators are under-appreciated. I’ve certainly moaned about content moderation, as many others do, but few of us appreciate the work moderators do in shielding us from this content, and that work comes at a cost. Maybe it’s time for a rethink of the role content moderators play – they deserve far more respect than they get.
You can read more at Digital Human’s Tumblr page: http://thedigitalhuman.tumblr.com/