About this White Paper
Moderating news media is as challenging as it has ever been in a VUCA (volatile, uncertain, complex, and ambiguous) world. As users and consumers, we are confronted daily with coverage of harrowing events unfolding globally, from war to natural disasters and assaults against marginalized communities. While we have the option to mute accounts or avoid news channels, Content Moderators (CMs) do not have that luxury.
The work of moderation is emotionally laborious. Moderators are also expected to accurately identify false or misleading information, remain neutral despite their personal beliefs or values, and stay compliant with ever-evolving platform policies and regulations. This white paper explores these areas of concern, drawing on research conducted among news production staff and other adjacent occupations.
Understanding the potential harms to moderators of news media content is the first step toward addressing and mitigating them. With greater awareness, the Trust and Safety community can identify the most appropriate interventions to safeguard moderators. In the final section of this white paper, we share general strategies for protecting moderators from harm, drawn from our years of experience working with them and grounded in a holistic, strategic approach that involves stakeholders across the entire organization.