
Beyond the Bottom Line: How Wellbeing Programs Protect Against Legal and Reputational Damage

October 1, 2024

Content Moderators play a vital role in maintaining the integrity and safety of online platforms. However, the demanding nature of moderation work, which often involves exposure to distressing and graphic material, has raised serious concerns about the mental health and wellbeing of those in these roles.

In recent years, there have been several high-profile cases in which Content Moderators have taken legal action against companies for failing to provide adequate mental health support. These cases have not only highlighted the mental health challenges faced by Content Moderators but have also significantly impacted the brand value and reputation of the companies involved. This article explores how robust wellbeing programs can help organizations protect their brand value and mitigate these risks.

The Legal Landscape: Cases of Content Moderator Mental Health Issues

PTSD

One of the most notable cases involved a group of Content Moderators who filed a lawsuit claiming that they had developed post-traumatic stress disorder (PTSD) due to their work. The plaintiffs alleged that their organization did not provide adequate mental health support or safeguards against the psychological trauma they experienced while moderating harmful content. In 2020, the organization agreed to a $52 million settlement, which included compensation for moderators and the implementation of mental health initiatives such as on-site counseling and regular psychological support.

Psychological Distress

Similarly, a lawsuit was filed by a former Content Moderator, who claimed that the company failed to provide a safe work environment. The moderator reported suffering from severe psychological distress after being exposed to graphic videos daily. This case brought to light the psychological toll Content Moderation can take and the necessity for companies to implement comprehensive wellbeing programs.

Inadequate Mental Health Support

Another lawsuit was filed by two former Content Moderators who alleged that they had developed PTSD and other psychological issues as a result of their work. The suit claimed that the organization had failed to provide adequate mental health support and had not warned employees about the potential risks associated with the work. The case was settled confidentially, but the organization in question used the opportunity to review and enhance its wellbeing policies for Content Moderators.

Class Action Lawsuit 

A former Content Moderator filed a class-action lawsuit in 2022, alleging that the company failed to implement industry-standard protections against the trauma of viewing disturbing content. The case underscores the continuing concerns over Content Moderator mental health and the legal and reputational risks companies face.

In the wake of these lawsuits, many of the largest social media companies opted to outsource their content moderation, but this created problems of its own. The end client, the social media platform, was often drawn into cases as a co-defendant, generating negative publicity and eroding brand value.

Legal Landscape – Court Cases Involving BPOs & Social Media Platforms

We have avoided naming the BPOs and social media platforms involved, as many of these organizations have since completely revamped their wellbeing strategies, often in conjunction with companies like Zevo Health.

Case 1 – In 2021, a former BPO employee filed a lawsuit against the company, alleging that she had developed anxiety, depression, and PTSD due to the disturbing content she was required to review for a social media company. The lawsuit claimed that the BPO failed to provide adequate psychological support and did not implement sufficient safeguards to protect the mental health of its employees. Although the lawsuit was filed directly against the BPO, the social media organization’s involvement as the client company led to significant media scrutiny. The case highlighted concerns over how much responsibility social media platforms should bear for the wellbeing of moderators employed by third-party firms.

Case 2 – In 2018, a lawsuit was filed by a Content Moderator against another BPO, claiming that exposure to graphic and violent content had caused her to develop severe psychological issues. The legal complaint pointed out that the BPO, under the social media company’s contract, had not provided adequate training or mental health support to deal with the emotional toll of the job. While the lawsuit was primarily against the BPO, the social media company was also implicated, raising questions about the ethical obligations of social media companies towards the moderators employed by their partners.

Case 3 – In 2021, a class-action lawsuit was filed by Content Moderators who worked for a BPO, citing severe emotional distress and inadequate mental health provisions. The lawsuit alleged that the distress was directly related to the nature of their work, which involved reviewing harmful and explicit content. Although the case was brought against the BPO, the social media platform’s connection as the client amplified its reputational risk.

Impact on Brand Value & Reputation

These lawsuits have had significant implications for the companies involved. Not only have they led to substantial financial settlements, but they have also affected the companies’ public image and brand value. Negative publicity surrounding these cases has raised awareness about the harsh realities faced by Content Moderators and the lack of support provided by some organizations. This negative perception can damage a brand’s reputation, leading to decreased consumer trust and loyalty.

Loss of Trust

Trust is a critical component of brand value, especially for social media platforms that rely on user engagement. When a company is seen as neglecting the wellbeing of its employees, particularly those on the front lines of maintaining platform safety, it can lead to a loss of trust among users. Users may question the ethical standards of the company and its commitment to creating a safe and healthy environment, not only for its employees but also for its users.

Media Scrutiny and Public Backlash

Media coverage of these lawsuits often highlights the emotional and psychological distress experienced by Content Moderators, portraying the companies as negligent. This kind of negative publicity can lead to public backlash, with users and advocacy groups calling for boycotts or regulatory action. In the age of social media, such backlash can quickly go viral, further damaging the company’s brand and image.

Impact on Employee Morale and Recruitment

A company’s reputation for how it treats its employees can significantly impact its ability to attract and retain talent. Negative press regarding the mental health challenges faced by Content Moderators can deter potential employees from joining the organization, particularly in roles related to trust and safety. Current employees may also feel demoralized if they perceive the company as not valuing their wellbeing, leading to increased turnover and reduced engagement.

In cases where the platform itself becomes legally entangled in a BPO lawsuit, additional damage can occur:

Reputational Damage to Social Media Platforms 

While lawsuits are often filed against BPOs, the social media platforms that contract these services are inevitably drawn into the spotlight. Negative publicity surrounding these cases can damage the public image of the platforms, casting them as negligent and profit-driven entities that disregard the welfare of the people who keep their platforms safe. Reputational damage can lead to a decrease in user trust and engagement, affecting the platform’s overall brand value.

Accountability and Corporate Responsibility 

These lawsuits have also led to increased calls for social media platforms to be more accountable for the wellbeing of all moderators, whether they are direct employees or contracted through BPOs. Public opinion increasingly demands that platforms ensure that their content moderation processes are ethically managed and that they provide adequate mental health support. Failure to do so not only risks legal repercussions but also negatively impacts their brand as ethical employers and platforms committed to safety.

Financial and Legal Risks 

Lawsuits expose both the BPOs and the social media platforms to significant financial risks, including costly settlements, legal fees, and potential fines. These financial burdens, combined with reputational damage, can significantly impact stock prices and investor confidence. For instance, one settlement related to Content Moderator PTSD not only drew attention to the legal issues in the case but also raised broader concerns among investors about the platform’s long-term risk management strategy.

Consumer and Advertiser Perception 

The brand value of social media platforms is closely tied to user trust and advertiser confidence. Negative headlines about how a platform’s content is moderated, and about the mental health impacts on moderators, can lead to consumer backlash, boycotts, and a reluctance from advertisers to associate with the platform. Brands are increasingly sensitive to ethical issues, and association with platforms seen as neglecting worker wellbeing can harm their image, prompting them to reconsider advertising partnerships. For example, advertising spend on X has fallen significantly over the last few years due to advertisers’ concerns about the lack of content moderation on the platform.

How Wellbeing Helps Mitigate Risk and Protect Organizational Reputation

Given the potential impact on brand value and the risks of neglecting Content Moderator wellbeing, organizations are now more motivated than ever to implement robust wellbeing programs. Effective wellbeing programs help companies comply with legal standards and demonstrate a commitment to ethical practices, thereby protecting their reputation and brand value.

The challenge of protecting brand value and reputation becomes even more complex when companies outsource moderation to third-party BPOs. An industry-leading wellbeing solution can help protect both the BPO and the end client in several ways:

Consistent Ethical Standards Across Partners

Social media companies need to enforce stringent ethical standards and wellbeing requirements in their contracts with BPOs. By doing so, they can ensure that all moderators, regardless of their employment status, have access to the necessary mental health support and safe working conditions. This can include mandatory mental health screenings, access to in-person counseling services, and regular breaks from reviewing egregious content.

Monitoring and Compliance 

Regular audits and compliance checks should be conducted to ensure that BPOs adhere to agreed-upon wellbeing standards. By maintaining oversight, social media platforms can prevent situations where outsourced moderators are exposed to poor working conditions and subsequently take legal action.

Public Commitment to Wellbeing 

Platforms should make public commitments to the wellbeing of all individuals involved in content moderation, including those employed by third parties. Transparent policies and regular reporting on wellbeing initiatives can help build trust with users, employees, and partners, demonstrating that the platform takes its ethical responsibilities seriously.

Proactive Mental Health Support 

Providing access to proactive mental health support, such as resilience training and regular mental health check-ins, can help Content Moderators manage the psychological impact of their work. Social media platforms should invest in creating a culture where seeking help is encouraged and supported, reducing the stigma around mental health issues.

Training and Resilience Building 

Training programs that help Content Moderators build resilience and coping skills are crucial. These programs can include mindfulness training, stress management techniques, and workshops on recognizing the signs of burnout. By equipping moderators with the tools to manage their emotional responses, companies can help mitigate the mental health risks associated with content moderation.

Conclusion

The cases against BPOs and social media platforms serve as important reminders of the risks associated with inadequate mental health support for Content Moderators. These cases highlight how neglecting the wellbeing of those on the front lines of content safety can lead to significant reputational damage, legal action, and financial loss.

By investing in comprehensive wellbeing programs specifically designed for Trust & Safety teams, and by ensuring ethical standards are maintained across all content moderation partnerships, organizations can protect their brand value, demonstrate corporate responsibility, and create safer and healthier work environments. In doing so, they not only mitigate risks but also uphold the integrity and trustworthiness of their platforms.
