Hyderabad-based content moderator reveals the demands and perils of the job

28 January, 2024 06:59 AM IST | Mumbai | Neerja Deodhar

Content moderators are the traffic cops of social media, stopping dangerous content in its tracks. A young woman in the profession lets us in on the perils and demands of the job, and why she persists at it

Illustration/Uday Mohite


India is home to a number of firms engaged in content moderation - the next chapter in its IT story after the boom of call centres and BPOs. It was in the 2000s that companies like Meta (then known as Facebook) began outsourcing moderation to our country and the Philippines. A moderator's key responsibility is assessing posts and videos flagged by users who cite the violation of guidelines, the tone and nature of the content, and the poster's intention. They do this by absorbing detailed policies related to discrimination, abuse and terror.

Their judgement relies on this knowledge as well as their own instincts - it is their ability to assess confusing and nuanced social context that sets them apart from the AI tech employed for moderation. Traversing social media can feel like stepping on a landmine sometimes, even if you carefully curate your feeds and tabs. If it weren't for moderators, known to be underpaid, overworked and desensitised, the Internet would be a far less welcoming place. A Hyderabad-based tech professional, aged 25, reveals what her work as a moderator has taught her about Indian netizens and anti-social behaviour online.

• • •

‘‘It may come as a surprise - considering my profession and age - but I wasn't an avid social media user before I became a content moderator. Four years on, my perception of what it means to be safe on online platforms has much to do with our work. It feels like an honest attempt to bring about positive change, whatever the size of the impact. Some days, it is weeding out thousands of vitriolic comments on a queer person's profile. On others, it is coming face-to-face with young, radicalised children armed with AK-47s - as they roam about a city, demanding its residents surrender or face bullets.

This is largely the work of young people; I joined when I was 21, right after college. Most of my colleagues are in their early 20s, and this is their first job. There's certainly a preference for freshers, who find firms engaged in content moderation through consultancies, or, like me, through job aggregation websites. Once we joined, we were all trained for nearly two months by those with experience, to identify violations of rules commonly referred to as Community Guidelines. This is when I observed that most users who create accounts on popular platforms aren't aware of the safety mechanisms available to them. And every platform has a different understanding of what constitutes free speech and acceptability; Google, for example, has a wider definition of what is permissible.

This is also work that commands our 100 per cent focus, where we have to be completely present. After all, we have 40 seconds to determine whether a post reported or flagged by a user violates guidelines, though we can take up to a minute if we need to deliberate. Videos, of course, need to be played in their entirety. On average, we go through 200-250 posts over eight-hour work days - posts which range from sarcastic memes about politicians, to horrifying visuals that circulate during times of war. The visuals may not even relate to the war or the region where the war is taking place, but they are volatile nonetheless; like a video where a woman was helplessly dragged through a street in torn clothes.

The pressure and stakes, however, stem not from the deadlines, but rather the consequences of a mistaken judgement - one that can affect countless lives, or enable something sinister to go viral. Recently, I assessed a profile reported for spreading disinformation about Israel-Palestine; when I went over the Reels posted from it, I found that the user was not only denying the reality but also provoking their followers to harm people from a certain community. I find relief in knowing that we can always lean on our teammates for their insight, especially when we have different perspectives.

The sad reality is that every year has brought increasingly harmful content our way. Upsurges in violent posts tend to coincide with elections and international crises, such as the Israel-Palestine issue. In India, two violations that raise their ugly heads throughout the year are nudity and hate speech. Some of the things my fellow countrymen post beggar belief; they bring shame to our society.

To an outsider, this can seem like an impossible profession. But I've found that it is possible to deal with the worrying nature of the content and the stress it brings by leaving it behind when you go home. It also helps to have a supportive work environment; our own company mandates 30- to 40-minute wellness sessions where we can seek out counsellors and take part in activities that help us de-stress.

Yet on some occasions, it's tough to keep our emotions separate from what's playing out on screen. I can still recall a video that left me shaken for days, one where a man sadistically tortured and skinned a baby monkey. Sometimes, what gets to us is not the material itself but the limitations of the job. A user may report content for legitimate reasons, but those may not meet the parameters set by a platform.

During such moments, I feel both disturbed and concerned at my inability to act. The only consolation is that some platforms show flexibility and responsiveness by inviting recommendations about widening the scope of their parameters. These institutions have to be open to change, to mould themselves, so they can truly serve different types of users.''

40: No. of seconds content moderators get to assess a flagged post

"Exciting news! Mid-day is now on WhatsApp Channels Subscribe today by clicking the link and stay updated with the latest news!" Click here!