
Protecting the Protectors: Addressing Mental Health Needs of Content Moderators

August 28, 2024

I recently watched a report from KTN News in Kenya called The Web Gatekeepers: The dark side of content moderation. The report featured interviews with various people involved in the content moderation industry, including Content Moderators and service providers.

Sama, a data annotation services company headquartered in California that employs Content Moderators around the world, is facing allegations of poor working conditions and inadequate mental health support. The moderators claim they were exposed to harmful content without proper psychological care, and argue that the company violated labor laws by failing to provide sufficient mental health services and adequate compensation for the traumatic nature of their work.

Content moderation is a job that often goes unnoticed, but it is critical for maintaining safe and healthy online environments. In the KTN interview, Content Moderators describe the severe psychological impacts of their work. These individuals were tasked with reviewing vast amounts of disturbing written content, including violence, abuse, and other harmful material, to help train OpenAI's models.

The Invisible Burden

In the report, Content Moderators described how their exposure to graphic and distressing content is relentless, leaving little room for emotional recovery. The interview revealed that many moderators feel isolated and unsupported in their roles. Despite the critical importance of their work, they often operate under high pressure, required to complete a set number of cases each day with insufficient mental health resources. This situation affects not only their personal wellbeing but also their professional performance and job satisfaction.

There is Hope

The good news is that the risks of developing vicarious trauma (VT), burnout, and other mental health issues in content moderation roles can be significantly reduced by prioritizing the wellbeing of these employees. Content Moderators should have ample support, including clear information during onboarding about recognizing mental health symptoms, especially VT. Implementing robust wellbeing programs that offer peer support, psychoeducation, and one-on-one counseling during work hours is essential. Early intervention, such as providing “well-being time” to help regulate stress responses, can prevent chronic hypervigilance and reduce the risk of VT.

A 2018 study found that engaging in brief, visually demanding distracting activities, such as playing Tetris, after encountering traumatic content can reduce the frequency of intrusive memories. This suggests that allowing Content Moderators short periods for similar activities could help mitigate the development of VT.

The Need for Specialist Mental Health Care

For me, the interview highlighted the need for companies to invest in specialist mental health care for content moderation teams. Standard employee assistance programs (EAPs) are inadequate for addressing the specific challenges that Content Moderators experience every day.

Our customers ensure their Trust & Safety teams have specialized mental health support, developed by Zevo, including:

Risk Assessment

Zevo’s engagement with a new team of moderators always starts with an analysis of the psychosocial hazards in the workplace and their level of risk (or injury potential), followed by the implementation of risk mitigation measures. Additionally, we engage customers in a baseline audit of their teams’ psychological health and resilience using a mixed-methods approach, identifying correlations with work-related stressors. Using all this data, our therapists then develop a dedicated wellbeing program, in conjunction with the organization’s wellbeing lead, to support the moderators.

Trauma-Informed Therapy

Regular access to therapists (either onsite or digitally) who specialize in trauma can help moderators process and cope with the content they are exposed to. We ensure that all our therapists are Masters-educated, with three years of practical experience working with clients in both individual and group settings.

Peer Support Groups

Creating spaces where moderators can share their experiences with colleagues who understand the unique challenges of the job enhances team cohesion while maintaining compliance with NDAs and other confidentiality agreements. Our onsite therapists facilitate regular group sessions that aim to reduce the overall cognitive load of the work and give moderators a safe, engaging intervention where they can process any difficulties and take a break from their work.

Resilience Training

Programs that teach resilience skills can empower moderators to handle their responsibilities more effectively. We provide our customers with resilience training programs developed specifically for Content Moderators and their managers, using evidence-based therapeutic modalities that address unwanted or unhelpful thoughts, feelings, and behaviors. These programs help ensure that the mental health of everyone in the Trust and Safety team is prioritized and supported.

Off-Boarding Support

Providing access to a therapist after a Content Moderator resigns or is made redundant helps ensure their mental health is supported and any issues that arise can be worked through quickly. Many of our customers provide a series of off-boarding sessions for Content Moderators when they leave, to ensure they are supported even post-exit. The duration differs depending on the customer and the type of content the moderator has been working with – some of our larger customers offer a full year of off-boarding sessions, which can be accessed at any time during that year.

Implementing Supportive Policies

As well as providing interventions for individuals, we also work to improve the wider ecosystem that supports Content Moderators’ mental health and wellbeing. We work with our customers to develop or improve workplace policies that further create a supportive environment. This includes:

Reasonable Work Hours – Avoiding excessive exposure by limiting the number of hours moderators spend viewing harmful content.

Regular Breaks – Ensuring moderators have sufficient breaks to decompress and reduce cumulative stress.

Decompression Time – Allowing time for moderators to transition out of their work mindset before going home.

Recruitment Consultancy – Ensuring that the right candidates, with the strongest moderator profiles, are brought into the business.

T&S Teams Support – Working with QAs, policy, L&D, management, HR, and other functions that support Content Moderators directly or indirectly.

A Call to Action

The interview with Content Moderators serves as a stark reminder of the hidden costs of maintaining safe online spaces. As digital platforms continue to grow, the need for comprehensive mental health support for Content Moderators becomes increasingly urgent.

Organizations must recognize their responsibility to protect the mental health of those who protect their platforms. Providing specialized care and fostering a supportive work environment is not just an ethical imperative but also essential for sustaining an effective and resilient workforce.

By addressing these needs, we can ensure that Content Moderators receive the recognition and support they deserve, ultimately leading to healthier, more sustainable online communities for everyone.
