In a recent interview, an online content moderator told Nearshore Americas that she spends between eight and ten hours each day sifting through social media comments and classified ads in search of offensive material. The Mexico-based freelancer, who asked not to be identified, hides profane or hateful speech that appears on the social media pages of a U.S. company. She also removes ads from a classified platform operating out of the United States. The inappropriate content ranges from people seeking romance to those trying to hire a hitman.
The freelancer is one of the more than 100,000 people responsible for moderating digital content globally. The social media giants and other enterprises have outsourced much of this work – to India and the Philippines but also to Nearshore destinations such as Mexico and Colombia. Most companies undervalue these professionals, offering few opportunities for career advancement. But content moderators perform an essential function, often at the expense of their own mental wellbeing.
“We stop [disturbing] pictures or text from appearing in other people’s lives,” the freelancer said, adding that businesses did not value content moderators as much as they should. “It has gained value over the years, but not as fast as we would want.”
Across industries, companies are increasingly aware that online brand management has an impact on the bottom line. A recent study by the market research firm Fact.MR projected that demand for content moderation solutions would thrive, with a compound annual growth rate of 10.5% through 2029.
“Content moderation is a kind of insurance. You can live without it until you get a problem, and then it could have a real impact on your brand,” said Jeremie Mani, the CEO of Webhelp Canada, a business process outsourcing firm.
According to Mani, when businesses start a Facebook or Instagram account, they are implicitly associating themselves with the comments that appear on that page. If brands create a community and allow offensive content to be shared, “the rest of the community won’t understand why the brand is not intervening and… deleting those comments,” he said.
Discussions around content moderation typically center on social media. Twitter and Facebook have come under fire for failing or refusing to take down controversial material.
Classified platforms also employ professionals to remove illegal items, detect scams and ensure items are correctly categorized. Other sectors, including the dating, gaming and media industries, use content moderators to safeguard their digital platforms.
A Human Touch
Headquartered in Paris, Webhelp recently expanded into Montreal to gain access to the North American market. The BPO giant handles content moderation for TikTok, among around 200 other clients. The company collaborates with its partners to create customized content guidelines that reflect the values and culture of the brand. Webhelp currently employs more than 2,400 moderation experts working in nine locations. To supplement the efforts of staff, the company has automated parts of the process. These solutions involve using artificial intelligence to remove material or augment human efforts. The company has also developed an in-house tool which obtained a Facebook Marketing Partner certification following an extensive audit.
But while AI has a growing role in the industry, human agents will remain essential for the foreseeable future. When it comes to detecting nuance, or making judgements on context, humans still outperform robots.
“We do offer content moderation as part of our digital and social media CX solutions portfolio,” said Rajiv Ahuja, the president of Startek, a BPO giant with offices in Argentina, Honduras, Jamaica, and Peru. Having also worked for a company that provided content moderation services more than a decade ago, Ahuja is aware of some of the business’s changing dynamics. “By embracing proactive, advanced digital approaches to content moderation, brands are now reaping tangible benefits, such as time savings, employee retention, and improved brand reputation,” Ahuja said. “It is crucial to have a balance of human empathy and AI, as content moderation requires nuanced and culturally sensitive interventions.”
An Emotional Toll
As the former CEO of Netino, a social media and content management business that was sold to his current company in 2016, Webhelp’s Mani has specialist knowledge of the subject. He believes executives are increasingly aware of the responsibilities and risks of online content.
“When I started in this industry… I had to create a blog to talk about [content moderation] because people were not really interested in the subject and clients were not really aware of what it was,” he said.
While debates around what constitutes appropriate material have raged in recent years, the position of moderators within the tech and BPO sectors has also come under scrutiny.
Content moderators are typically paid by the hour and face limited opportunities for career growth. Turnover is high. But some providers have raised their base pay in response to growing appreciation for the role.
“Before, it was seen as something of low value that you had to do,” Mani said. “Now, there are some changes in the views of most clients. They are aware that it could be dramatic to have a problem because of something that is not moderated.”
Some BPO providers, including Webhelp, have opted to provide mental health support for their workforce. The company currently provides a psychologist to employees who would like to talk.
But Mani knows the measure does not automatically resolve or mitigate psychological risks.
“That’s a mainstream answer,” Mani said. “It doesn’t mean that’s enough.”
The executive said it was also important to focus on the “small solutions” that help provide a safer, more supportive environment for employees.
“One of them is to make sure that you gather the content moderators every week, at least, for an informal conversation in which you ask: ‘what do you remember seeing this week?’ and ‘What have you been shocked by?’ That’s one way you can make them feel like a team facing the same difficulties.”
Mani said practical measures, such as switching between different types of content every two hours, could also prevent burnout. Above all, Mani said, it was crucial to encourage workers to reflect on the meaning of their work.
“When you do this job, you need to understand that if you see harsh content and you delete it, it’s hard for you,” Mani said. “But you might be proud because that means that thousands of people won’t see it after you.”
What does it take to achieve great outcomes in Nearshore services? If you would like to share an exciting case study or news story, drop me a note — Steve Woodman, Managing Editor