Content moderation has become such a hot-button issue over the last several years that some BPOs have decided to abandon ship before the flames catch up with them. Given the size of the investments and the levels of employment that come with this segment of the market, the question arises naturally: should Nearshore territories be concerned?
The growth of online activity has turned content moderation into a segment of BPO offerings that is not only profitable but also important. Social media companies in particular find themselves under tremendous pressure to monitor content posted on their platforms, especially when it is egregious or potentially harmful. Much of that pressure is shared with BPO vendors, who leverage international teams to keep up with the volumes of content that require human review.
That pressure has proven too much for some service providers. Over the past five years, a handful of vendors have exited the content moderation game in response to hard-hitting media reports that scrutinized their labor practices in the segment.
Cognizant announced in November 2019 that it would exit its moderation business following two exposés on the working conditions of content moderation staff in the company’s Phoenix and Tampa offices, which serviced Facebook. The announcement led to the gradual shutdown of the segment over the following years, cutting about 6,000 jobs.
Facebook’s content moderation site in Tampa, FL, operated by Cognizant, is its lowest-performing site in North America. Three former Facebook moderators are breaking their NDAs and going on the record to discuss extreme working conditions https://t.co/vD4zJB6Gno pic.twitter.com/1oO7MjZUYv
— The Verge (@verge) June 19, 2019
In late 2022, Teleperformance also announced it would shut down its moderation of “egregious” content. The announcement came after two reports that accused the French multinational of malpractice in its offices in El Paso and in Colombia. In its press release, Teleperformance stated that “as a public company that has always been considerate of its shareholders perspectives, Teleperformance has decided that exiting the highly egregious segment of trust & safety is the right thing to do, at this time”.
The company didn’t say at the time how many of its employees would be impacted by the segment’s shutdown. Its latest annual report notes that nearly 12,000 people took courses on trust and safety activities, which include content moderation. In 2022, the company provided around 47,000 individual counseling sessions for content moderators.
San Francisco-headquartered Sama also distanced itself from the moderation game following a PR firestorm. In early 2022, TIME Magazine reported on the working conditions endured by Sama employees in Nairobi, Kenya, where content moderation services were provided to Facebook, among other clients. The shutdown of the segment resulted in Sama laying off 200 employees, or around 3% of its staff.
🚨BREAKING: An exposé by @Time reveals the truly traumatic toll on @Teleperformance workers moderating content for #TikTok in 🇨🇴 Colombia, who report exposure to extreme violence, suicide, child abuse and animal cruelty.
— UNI Global Union (@uniglobalunion) October 20, 2022
Contrary to what their responses might suggest, BPOs are very aware of the risks that content moderation represents for their business. Some of the biggest players in the industry warn investors that media firestorms and government scrutiny are among the pitfalls that come with the moderation territory.
Is content moderation about to become a no-go zone for service providers?
Playing With Fire
In March 2023, less than six months after announcing its exit from content moderation, Teleperformance announced it was back in business.
“Teleperformance is now convinced that it is in the best interest of the billions of people that are online every day that Teleperformance continues to serve the content moderation needs of its clients in full and not exit any part of the business,” the French firm stated in a press release.
Though such a sudden reversal might seem odd, Teleperformance’s decision makes sense once the numbers are taken into consideration. According to the company’s own account, its trust and safety services (which it provides to 33 clients, including top social media companies) represent about 7% of its yearly revenues.
The segment also generates significant revenue for other BPOs. Cognizant reported a US$178 million hit to its 2020 revenue due to its exit from content moderation. Its two-year contract with Facebook was worth US$200 million, according to media reports. Sama’s African contract with Facebook was reportedly worth US$3.9 million in 2022. That business is expected to be taken over by Majorel.
Everest Group expects the trust and safety services market to be one of the faster-growing segments in the BPO market. The group’s projections see it reaching US$11 billion in 2025, a number cited in Teleperformance’s latest annual report in reference to appealing “new business sectors”.
Top companies in the BPO space recognize the risks that come with providing content moderation services, but they’re willing to play the game. Vendors such as TaskUs, Accenture, Genpact, Concentrix and Majorel still offer trust and safety services because the financials justify the risk. Majorel reported that the segment represented 23% of its net revenue in 2022, two percentage points above what it reported in 2021.
In other words, while some BPOs are abandoning ship, afraid of catching fire, more adventurous players are pushing ahead at full speed, betting on the eventual profitability of such a risky venture.
What’s To Be Done?
If BPO providers expect content moderation to be smooth sailing, they’ll have to ramp up their efforts (and their investments) to make moderation as effective and as mentally safe for their staff as possible.
AI tools have emerged as a promising answer to the glut of content in need of monitoring. Teleperformance claims that, on average, 97% of the egregious content it handles is screened by AI. The remaining 3% is reviewed by human eyes “because of contextualization issues”. Even so, in announcing its return to content moderation, the company promised to increase its use of AI tools to improve automated screening of harmful content.
Content moderation isn't just about social media. With technology, it's possible to moderate at scale, reducing the massive volume of content that requires human attention. Discover the powerful synergy between humans and technology: https://t.co/kWSkTYsaM0#ContentModeration pic.twitter.com/Ei6zNcQ8vv
— Teleperformance (@Teleperformance) April 7, 2023
Though AI helps with the scalability and precision of content screening, discarding the human element remains an unlikely proposition. AI experts and industry observers agree that, even in the age of sophisticated automation tools, a human eye will still be required.
“Companies need workers to annotate data sets of images or other kinds of media that will be used to train the tools, and human beings still have to check whether the algorithms got the decisions right,” Sarah T. Roberts, Associate Professor at UCLA’s Department of Information Studies, pointed out in an interview with Harvard Business Review.
As long as human moderators are needed for content monitoring, there will be a risk of malpractice. Companies that want to keep playing the game will have to pay closer attention to the wellbeing of their staff and tread carefully, offering better tech tools, more counseling and the chance to take a break, Roberts pointed out.
BPOs are fully aware. Teleperformance promised in its return to screening egregious content that it would enhance “physical and emotional wellness programs for content moderators”. In its latest annual report, Majorel recognized that “socially responsible, professional content services which put the health and wellbeing of team members at the forefront, will be key in supporting growing demand in this area.”
Such levels of caution will be particularly necessary in the Nearshore. The rise of left-leaning governments in Latin America has brought with it “worker-friendly” rhetoric. Right after TIME’s exposé on Teleperformance, the company found itself in hot water with the Colombian government. In Mexico, labor inspections are expected to ramp up dramatically this year.
Though Nearshore markets should not fear the loss of potential jobs and investment from BPOs in the content moderation game, local authorities will have to keep their eyes open for any form of malpractice that might result in another media firestorm. If the exposés become too frequent and intense, more vendors might find it advisable to steer clear of the heat.