Given how 2020 has been so far, if I were to ask you your favourite month of the year, you would rightfully hang up on me and then call me names, not in that order. However, since football returned last month, I would say September was when I was happiest. It’s great to have football matches to look forward to after a drought. I love the feeling when I get a notification on my phone saying line-ups are available. It is when my tinfoil hat is at its largest.
For me, the only way to make sense of who is playing is to look at who is not. Let me explain. If I look only at the starting eleven, all of it makes sense to me. For example, if Messi, Coutinho, Fati and Griezmann are starting a game, I would not question it. But when I look at the bench and see Dembele, I begin to understand the trade-offs the coach had to make to arrive at this preferred line-up.
Looking for what’s absent can be an excellent way to understand what’s in front of you, because when you begin to look for what’s missing, you begin to see the bigger picture. In fact, the best example of this goes unnoticed every day.
Next time you pick up your phone to browse your Instagram or Facebook feed, take a moment to make sense of what you are looking at. Those cat videos or photos of your friends made the cut, but there is a whole world of content that you have never come across. A world that thousands of scarred, underpaid employees work to scrub away, often at the cost of their own mental wellbeing.
See, when platforms like Facebook and Reddit came online, they brought a massive amount of humanity’s communication along with them. Not all of it was worth putting on the internet for everyone to see. I am talking about videos of people beating animals to death, clips of people committing suicide or hurting themselves, or child pornography. And people posting such content on platforms is not a problem that will go away on its own.
Outsourced moderation
In the early days of most platforms, the founding teams had to make sure their sites were free of obscene content, and that responsibility was often passed on to community moderators. More often than not, the content was so scarring that the people moderating it suffered from Post-Traumatic Stress Disorder (PTSD). However, as platforms grew bigger, they began to outsource content moderation to places where labour was cheaper, like India.
Most people assume content moderation is something that happens in dingy rooms lit only by computer screens. However, the reality is that it happens in big corporate buildings in Gurugram. People work day and night to scrub the proverbial floor, making sure all we see are cat photos.
The cost of moderation is brutal and, by design, unseen. Facebook recently settled a court case by paying $52 million in total to moderators who developed PTSD and other mental health issues on the job. While none of that money will reach Indian content moderators, the best practices from the settlement at least should.
The money is not the point here. It is these best practices that can prevent future damage to the mental health of moderators in India. They include muting videos by default, changing the colour scheme to black and white, and providing regular access to therapy and counselling. This is the content moderation equivalent of “prevention is better than cure”, and the mental health impact of doing so can be priceless.
The writer is a policy analyst working on emerging technologies.
Tech-Tonic is a monthly look-in at all the happenings around the digital world, both big and small.