“It’s like a death by a thousand cuts, every day wearing you down a little more, bit by a little bit,” says Ella Dawson.
With just one click, everything can change. Content Moderators know this better than most. Every day, they are exposed to content that ranges from spam and relatively innocuous material, such as misleading advertisements, to hate speech, graphic violence, extreme pornography, and child sexual abuse material (CSAM).
To give you an idea of just how much of this kind of content there is online, here are some startling numbers:
- In the fourth quarter of 2023, Facebook took action on 7 million pieces of bullying and harassment-related content, down from 8.3 million in the previous quarter.
- During the first quarter of 2024, approximately 99 million TikTok videos were removed from the platform, down by around 5% compared to the previous quarter.
In other words, were it not for the work of these incredible people (Content Moderators) and some help from AI, most of us would be reluctant to open our favorite platforms for fear of seeing an image that could permanently scar us.
However, that is precisely the job of Content Moderators. It is therefore incumbent on employers to take every possible step and measure to help alleviate the damage that exposure to such images can cause.
Signs of Moderation Burnout
In an ideal world, no one would ever reach the point of burnout. In reality, many Content Moderators do.
As highlighted in this article, here are some warning signs that can help you intervene before any of your employees reach that point:
Increased Stress and Anxiety
- What does this look like? Telltale signs include the inability to fall into a deep or prolonged sleep, irritability, and mood swings.
Emotional Detachment
- We’ve all heard of desensitization, where the frequency of hearing or watching horrific situations makes us numb, almost as if we have become accustomed to them. Look for employees who appear emotionally detached from their work and show reduced empathy and compassion.
Cynicism and Negativity
- “It is what it is” can be a coping mechanism, until it isn’t and has deepened into apathy. Granted, it’s hard not to give in to feelings of hopelessness about what gets posted online, which makes it tricky to distinguish a healthy level of cynicism from something more concerning.
Decreased Productivity
- Naturally, when we are burnt out, we are less productive, so when someone highly efficient begins to show signs of slowing down, take note.
Physical Symptoms
- Our bodies can be a clear indicator that something is wrong. Prolonged stress can manifest as headaches, fatigue, and digestive issues.
How Organizations Can Reduce the Stress Levels of Content Moderators
Fortunately, companies can take plenty of actions to prevent burnout.
1. Schedule regular breaks
Constant exposure to content, especially harmful or sensitive material, can take a mental toll. Taking regular breaks allows our brains to rest and recharge. These breaks should be used to engage in activities that relax or distract from work—whether it’s taking a short walk, meditating, or simply stepping outside for fresh air. Breaks can improve focus and reduce the emotional impact of stressful content.
2. Provide opportunities for peer support
Knowing that others within the team “have your back” is crucial. Train all of your managers so that they understand Content Moderators’ unique challenges and can improve operational conditions for better work-life quality. This includes equipping supervisors with the skills to optimize check-ins, discuss sensitive topics, enhance communication, and build relationships.
3. Offer therapy as part of your employee wellbeing program
Sometimes, professional help is not just beneficial; it’s necessary. Given the unpredictable nature of news events, I recommend personalized support from trained psychologists for anyone having an intense reaction. Regular check-ins with mental health professionals ensure that your Content Moderators have the support they need, which can be reassuring and reduce anxiety.
4. Encourage mindfulness and relaxation techniques
Deep breaths.
While this may seem obvious and basic, we often overlook the value of mindful breathing. Meditation, deep breathing exercises, and yoga can and do help reduce stress and improve overall wellbeing.
5. Identify opportunities for risk reduction
Are there any changes you can make to limit the inherent risks? Start by assessing workload strain, targets, and devices. For instance, affective interface design alters how content is presented, such as rendering images blurred or in greyscale, to lessen its emotional impact (see the sketch below). Utilize AI and machine learning tools to pre-filter harmful content and reduce the volume of disturbing material moderators need to see. Additionally, review targets to ensure they are realistic and manageable for Content Moderators. Even small actions that limit exposure can have a powerful cumulative effect.
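As a minimal sketch of what that kind of affective presentation could look like, the Python snippet below uses the Pillow imaging library to produce a greyscale, blurred preview of a flagged image. The function name, file paths, and blur radius are hypothetical illustrations, not a reference implementation; any imaging library would serve equally well.

```python
from PIL import Image, ImageFilter

def prepare_for_review(path: str, blur_radius: int = 12) -> Image.Image:
    """Return a softened (greyscale + blurred) preview of a flagged image."""
    img = Image.open(path)
    img = img.convert("L")  # greyscale strips colour, which carries much of the shock value
    return img.filter(ImageFilter.GaussianBlur(blur_radius))  # blur obscures graphic detail

# Hypothetical usage: moderators see the softened preview first and
# request the original only when a closer look is genuinely required.
preview = prepare_for_review("flagged_upload.jpg")
preview.save("flagged_upload_preview.jpg")
```

The key design choice here is that the softening happens at presentation time rather than in storage, so moderators retain the option to view the full image when a decision genuinely requires it.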
6. Offer training and development
As mentioned in From Strength to Strength: Building a Resilient and Adaptable Organizational Culture, we should never assume people are naturally resilient. Resilience is a skill that can be taught and needs to be continuously nurtured to help people adapt to whatever situation is thrown at them. Develop resilience in your Content Moderators through education and regular training programs.
7. Promote work/life balance
Given the rise of remote working and the ability to access work 24/7, it can take real effort to switch off and draw the line between work and home. Actively build a work culture in which the day ends at a set time and no one is expected to engage with stressful content outside of work hours.
8. Stay informed on changing legislation
As outlined in Moderating Harm, Maintaining Health – Protecting the Wellbeing of Content Moderators, legislative initiatives like the Online Safety Act and the Digital Services Act are proof that, as a society, we are trying to regulate online content and promote user safety. We still have a long way to go, so organizations must take responsibility, stay informed of new legislation and changes to existing laws, and actively advocate for their Content Moderators and, ultimately, for everyone.
Conclusion
Given the nature of some content posted online, we should all be grateful that there are people responsible for protecting everyone else by removing harmful content. The onus for looking after and protecting these people falls on their employers, who must make it a priority to implement every measure possible to reduce the stress caused by relentless exposure to disturbing and traumatic content.