Join Dr. Michelle Teo and Dr. Pamela Lennon as they discuss the evolving digital landscape and the imperative need to protect online users and Content Moderators. They highlight Content Moderators’ role in filtering harmful material and stress the need for psychological support.
The conversation addresses the root causes of online threats and upcoming legislation to regulate platforms and safeguard users. They emphasize collective responsibility in fostering a safe and respectful online environment, and their discussion serves as a call to action for concerted efforts to uphold digital safety and wellbeing.
Takeaways:
- Explore the root causes of the problem: The impact that distressing content has on both users and Content Moderators.
- Examine new legislation that is coming down the line to regulate these platforms along with guidance on how to protect employees from psychological harm.
Welcome back to Zevo Talks. I'm Dr. Michelle Teo, the Clinical Director at Zevo Health and your host. Today I am joined by Dr. Pamela Lennon, who is a Senior Psychological Wellbeing Consultant with Zevo.
She holds a doctorate in health psychology and her work includes running wellbeing assessments of risk and resilience, and also conducting research around content moderation and intervention design.
So welcome, Pam. Thanks, Michelle. Thanks for having me today. Great. So today I suppose we're exploring the constantly changing and evolving digital landscape and why we need to protect users online, but also why we need to protect the people who protect those users: the content moderators.
So I guess we’re talking about things like the root causes of problems online, some of the legislation and regulation around online platforms, and then the impacts of the changing digital landscape on users and content moderators alike.
So maybe, first of all, if you want to share with the audience: why do we have this problem at all? Why do we need to protect people, to protect users, as well as those who moderate content?
Yeah. Well, first of all, any business that has a website where people can comment, or where they can purchase things and leave reviews, right through to dating sites, you know, so many businesses are at risk of receiving disturbing or hateful or dangerous content that can expose the public.
And also there's an appeal there, you know: there are certain bad actors out there that can use social media as a platform to voice their hate speech or to put up, you know, graphic content. And, you know, that's where the role of the content moderator comes in: they want to try to screen that out as quickly as possible. So when we think about the root cause of this, it's really the tool itself, the internet, that attracts these bad actors, who might have, you know, psychopathy or narcissism or sadism, or even spitefulness, for whom, you know, social media can be a powerful tool of mass impact and traumatization.
So really there, we’re looking to protect, you know, users of these platforms, as well as, you know, figuring out how to protect content moderators who are constantly screening this material as well.
So, you know, as time goes on there's more and more content being uploaded year on year, and in fact it's been found that it's getting worse, you know, the types of illegal or harmful content that are going up.
So now more than ever, we kind of need to think about, you know, protecting the public. And I think, yeah, sorry, just even in that vein, you know, we talk a lot about bad actors, but then I think, conversely, there are also people who are trying to do the right thing, you know, by sharing certain propaganda. Do you want to maybe speak to that a little bit?
Yeah, I was just going to move on to that side of things, because it's not necessarily people who are intending to do harm. Like in the case of the Israel Hamas war, individuals may be sharing content, you know, to voice their concerns or to increase awareness of what's really going on. However, there's also a lot of propaganda, you know, in terms of content, and also the impact on the public of witnessing such concerning traumatic material, you know, how much power do individuals have?
Individuals have different responses to that, like extreme anxiety or shame or guilt that they’re not able to do something about that as well. However, the individuals posting this might see it more as a call to action and really have, you know, a good intended purpose of that.
You also see a lot online on threads where there's almost a bravado, you know, going on, where people are boasting like, I've seen this and this graphic content online, and it doesn't affect me anymore.
And it's almost kind of boasting about the desensitization, but that's so concerning, you know, we shouldn't be so desensitized to such, you know, harmful graphic content. And on the other hand, it's almost debilitating other people and paralyzing them with anxiety and trauma, where they're deeply affected by this and can't really feel anything in any way related to it. And also, I think we know too much about what's going on everywhere in the world at the moment. You know, it's a recent, I suppose, huge concern, because before we just knew what was in our community or in our town and what was going on here. Now we know too much about, you know, all that's going wrong in the world, which really exacerbates anxiety in individuals as well.
And even, I've noticed recently, the influence of advertising on children, such as, you know, children seeing videos and now they're all concerned about skincare that, you know, someone 40 plus should be more concerned with, and the impact of this advertising on the culture of young people.
Now, I know every generation, you know, has its challenges in terms of influences and negative influences, but it's an interesting factor as well, you know: how do we protect children from what they're being exposed to on these platforms?
And also looking at, recently, X's deprioritization of trust and safety and the impact that's had on business for them as well. So there's a lot going on, there's a lot of challenges around how to protect users and then those who are protecting the users, the content moderators.
Yeah, so like even in that, when we think about the kind of root causes of why even trust and safety as an industry exists, there’s the side of the bad actors and then there’s the side of people who are just wanting to bring awareness and disseminate information and share knowledge and maybe like have a call to action.
And I think you bring up a really important point around that sense of desensitization because that’s I suppose one of like the concerns that we have around content moderators whose jobs 40 hours a week are to view graphic materials.
When they're two, three years into the job, do they become completely desensitized to it so it doesn't impact them anymore? Or do they feel that it doesn't impact them anymore, but perhaps there's something going on in their body that they're not aware of? And how do we kind of tackle those kinds of challenges?
So there’s so much to be considered when we’re thinking about the root cause. So then I suppose if we’re thinking about the content moderators, like how are they being affected if they’re kind of at the forefront of the problem, they’re the ones kind of constantly viewing the objectionable content being uploaded and trying to protect the public, the users.
Yeah and really, content moderators are the unsung heroes that are there to protect users even though they might be very moderately paid and lacking support within the companies that they’re working in.
However, day-to-day they're exposed to graphic violence, hate, terrorism, misinformation, animal cruelty or suicide and self-injury. There's so much that they may have to deal with in a queue, and those queues, you know, concerning those topics, might vary from day to day as well.
So, you know, their role has become increasingly more important as more and more types of content are being created and uploaded, and there's pressure as well for them to comply with platform policy, which decides, you know, how to keep the bad guys out in this sense, you know, what is acceptable content or not.
And they're essentially confronted with the darker side of humanity every day, which affects, you know, their world beliefs. And also, you know, this continuous exposure to difficult content can contribute to symptoms of PTSD such as, you know, flashbacks or extreme anxiety, having nightmares, sleep disturbances as well.
And it also does increase the risk of, you know, anxiety and depression. You know, thinking about having surprise content that they weren't expecting in a queue, that can have kind of a startle response and anxiety and increased stress as well.
And there's also the risk of vicarious traumatization, where they're exposed to the trauma of other people in the content they're reviewing. And that has its own impact and a unique set of, you know, psychological impacts such as, you know, shame and guilt at maybe not being affected by what you're viewing as well, or, you know, taking on that secondary trauma as well.
And that’s just from a perspective of viewing the content every day, you know, surrounding that they’re also working in very, very stressful environments. So, you know, they may have, you know, high work demands, they have a lot of tickets to get through during the day, maybe a quota of say 500 in the day, and they’re quite strict on reaching those targets.
And there’s quite intensive quality assurance standards as well that are assessed externally. So rigorous performance expectations. And in some places, they’re not allowed to take significant breaks to recover from the content to, you know, calm the nervous system from being stressed as well.
So it's kind of a storm of different conditions that really makes that job difficult to work in for a long period of time, which contributes to high turnover as well. So, you know, from my experience of working with these teams, what I do see is a lot of psychological hard work, and a lot of persistence and, you know, commitment to really perform well, and anxiety when they don't. However, you see the impact of high anxiety and cognitive overload from trying to use different tooling while having tight deadlines.
And then there's that feeling of, you know, guilt for not being able to thrive in this challenging work environment as well. So they kind of tend to blame themselves for that, whereas, you know, there are a lot of external factors that impact their wellbeing.
And there are a lot of ways organizations or these companies can help reduce the damage or, you know, the risk of vicarious traumatization by controlling stressors and psychosocial stressors in the workplace as well. But you often see good coping skills, I think, in these teams, in that they have strong team bonds, you know. They can't really discuss their experiences in work with their family or friends, so they can with their peers, who can relate to their concerns as well. And they often have good senses of humor about it as well.
It's a way of kind of getting through it. And even dissociation as a way of surviving, particularly with quite difficult content like CSAM, you know, of not really viewing the content as people or humans.
It's more compartmentalizing it into tasks that they need to get done. And then using tooling such as greyscaling and blurring to kind of help reduce the psychological impact. But still, AI and tooling are not there at a level that can really protect the human frontline as well.
I mean, I think you bring up a lot of really good points in terms of the various factors that can impact a content moderator. It's not just the exposure to content, which could potentially lead to PTSD-like symptoms or vicarious traumatization down the line, but it's everything external to that.
It is your productivity targets. It is how many like different workflows are you working within a shift? It’s, you know, how long have you spent reviewing content that’s related to CSAM and then suddenly moving into like a queue that is terrorist and violent extremism content.
So, you know, those are two very highly distressing queues, you know, back to back. It’s things like if I have a conflict with one of my peers in work and I need help to figure out what to do with the ticket, who am I gonna ask if I have a conflict with my peers?
You know, so there's a lot, yeah, so many factors. And I think there's like an interesting kind of trade-off in that sense of meaning and purpose for content moderators, because they are seen as unsung heroes or, like, guardians of the internet, keeping the users safe from harm.
But then there’s this sort of converse or like contradictory almost feeling in that because you want them to have a sense of meaning and purpose in what they do. But then that can also drive up the pressure to make sure that they’re working quickly and that they’re getting everything off the internet that is really harmful to users.
Yeah, especially in the sense of post-traumatic growth and post-traumatic resilience. How content moderators can really navigate through these daily pressures and stressors is really to look at the great work that they're doing and to understand the impact of this work on them and to take care of themselves. But how can they be exposed to this content yet still work at a worthy cause and somehow understand this impact in relation to what they're doing, to navigate and boost their resilience in the face of constant demands as well?
But there are other factors involved in that, especially if there are a lot of workplace factors that are really putting too much pressure on them. It’s not going to be conducive to processing this traumatic content as well as supporting their resilience as well over time.
Yeah, absolutely. And there’s a lot of recent discussions around things like regulation and online platforms, and they’re very focused on the users, which is fair enough. There’s tons of users every day on these online platforms.
Our approach is holistic, providing services to the organization, the wider trust and safety team, and individual content moderators. From one-to-one digital and in-person therapy to crisis management and group interventions, we ensure every step of a moderator's career journey is supported.
But Zevo’s impact stretches beyond individual care. We provide comprehensive organizational solutions aligning with regulatory demands for compliance assurance, enhancing operational efficiency for performance optimization, and proactively supporting brand integrity.
We want to ensure that content moderators across diverse industries, from social media platforms to streaming services to gaming, are flourishing. Discover our solutions today. Dive into our world of proactive wellbeing solutions.
Visit ZevoHealth.com, or click the link in this podcast description, and join us in leading a new era of content moderator wellbeing.
Do you feel like there's anything being done on a global level to help support content moderators?
Yeah, there's actually a lot going on at the moment. In the UK you have the Online Safety Act, which is trying to address illegal and harmful content online and placing a duty of care on the online platforms to ensure that they're removing illegal content as quickly as possible and protecting children, that everything is age appropriate, and also trying to target false information as well.
And then in the EU there's the Digital Services Act, which is also calling for greater transparency. So the UK and the EU are focusing on having that influence over platforms and how they moderate content.
However, in the US, there’s currently a lot going on in the US government about how much control they want to have on the actual content that’s moderated and really take control over what platforms are moderating.
And then there’s debates about freedom of speech. And is that the correct way to tackle it? Because otherwise, you have the government screening what goes up onto these platforms as well. So it’s an interesting time at the moment.
And it's a really positive thing to say that there's work being done to really go the extra mile to protect children as well and to protect vulnerable populations. However, from reviewing this and the new regulatory bodies, their focus is quite strongly on the users and on advocacy for platforms to really become more strict in the algorithms they use and the systems and processes that they use.
However, I do question how that's going to impact content moderators, who are there already under so much pressure to work in stressful environments and to quickly take down content, if a terror attack occurs and it's been shared all over the internet, to take that down so quickly.
So will that pressure be placed upon that individual, when really it should be at a leadership and executive level, where they need to really think about how to change their approach, their policy, how to carry out risk assessments for who might be exposed to the content that they share as well?
So it's really calling for an overhaul, but it really highlights the importance of really focusing on and protecting content moderators. And at least, again, on the global scale, there was the first global standard for managing psychosocial hazards, ISO 45003, which was published in 2021 and which provides guidelines for companies on how they can assess psychological risk to employees and how to employ preventative and proactive strategies. As we've seen since COVID, you know, stress and psychological stressors in the workplace really need high priority, and psychological health and safety too, because we look so much at, you know, physical risks of harm in a work environment when there can be so much going on on a psychological level.
So particularly for content moderators, who are at high risk of psychological impact, you know, this can really be used to help protect them and to, you know, maximize efficiencies of working models, and, you know, looking at, okay, if we're changing their shift rotation all the time, how is that affecting them?
Is there a better way I can schedule which queues they're on? Can we find out if they may have preferences for queues? How can we try to accommodate that? And, you know, highlighting the importance of taking regular breaks, or if I see content that's, you know, shocking, that I can leave my desk without feeling that I have to stay present.
And often you see the companies are like, sure, take a break if you see anything, but then they're restricted in taking breaks. So there's a conflict there within what they're allowed to do. Yeah, I think the concern that you raised there around a lot of the regulation when it comes to online platforms now, like with the Online Safety Act and with the DSA, that's probably, you know, the thing that's been missing in these conversations.
It’s great for users, you know, we definitely want to put additional guardrails in place to make sure that users aren’t exposed to illegal and harmful content. But at the end of the day, it’s the content moderators then that have to take all of that down and have to gatekeep all of that content.
So it's that sort of missing piece of the puzzle where we go, OK, but there are still people sitting in front of a screen who have to view this content. What can we do to protect them? And like you're saying, is it, you know, the framework of ISO 45003 that can help us work with companies to develop even regulation around how well-being services are provided to content moderators to safeguard them? Or does it even go into things like, when we look at some of the hazards, it could be really poor tooling systems within a company.
Is it that you need to have better tooling systems? Is it that your engineers need to build a product that is safer from the outset? I was reading a recent article about the algorithms and the recommender systems.
They’re built in a certain way to drive engagement and then they recommend the same content again and again once you’ve engaged with it. And if you’ve engaged with something harmful, the likelihood that you’re going to get recommended something harmful again is much higher.
So does the recommender system need to be changed? You know, that’s it. And in that way, it can protect the users as well as protecting the content moderators from having to take down the same content or the same type of content time and time again.
Yeah, yeah. I think it’s just there’s a lot that needs to be considered, which is still maybe missing from that conversation. Yeah. And I think it’s all new, but it’s good to see regulators engaging with these stakeholders as well.
And, yeah, you know, asking for a lot of input from academics and researchers on whether this is the right approach as well, which is good to see. And I think it'll evolve over the next couple of years as well.
Yeah. Yeah. So is there anything that you would recommend to companies now that they could do to better support content moderators that are protecting users online? Yeah, as I previously mentioned, really looking at the psychosocial risks, particularly because they fall under the umbrella of traumatic experience from the work they do, but also they're exposed to other psychosocial risks from high job demands and low control.
And maybe it's quite a hierarchical way to work as well. Yeah. And a lot of micromanagement. Yet their role is so important, as we said. And, you know, the focus really is how do you maintain that wellbeing and productivity, you know, without one impacting the other? The structure of the work really should facilitate that, so that productivity is more sustainable, is what I mean, you know, over time, rather than trying to, you know, put so many demands on them.
Like, is it better to have a minimum daily quota or a maximum daily quota? And look at what might help content moderators work better, such as better tooling, and really find out what might be, you know, not so conducive to their work day as well, and engage with them and try and support them as much as possible.
And also to have those, you know, psychological well-being supports in place, which should really be mandatory: you know, having, you know, one-to-one support should you experience or be affected by content; having your peer support groups, you know, to discuss challenges with peers and, you know, share coping mechanisms and ways in which you can switch off at the end of the day.
But ultimately I think, you know, there's so much more research needed in this area. You know, we see the legislation coming in on kind of a global scale. There's also, you know, the focus on how to really develop post-traumatic growth and resilience in content moderators so that they can, you know, stay working at that for a longer period of time and not be as affected and be able to manage the psychological impacts.
So, you know, you're looking at exploring what kind of modalities might, for example, help alleviate anxiety from seeing surprise content, or how you can, you know, support a content moderator who might have, you know, a past traumatic history and is being triggered by certain types of content, or is looking at content from their own home country, and how do you support them in that case.
So to get really granular on how to really support them as well. And of course, again, on the organizational side to try and progress that structure, that workplace that is supportive for both management and, you know, team leads and content moderators and those who write policy as well.
Yeah. And I think sometimes the conversation around, like, wellbeing and productivity is that they're mutually exclusive, which isn't really the case. You know, there's so much research that says, you know, when people get to take their breaks during the day, when people, you know, are given time off, when people have more control over the work that they do and there are, you know, job demands that meet the resourcing capabilities, they actually become much more productive. And so, you know, if your main concern is ensuring that productivity is sustainable, wellbeing is necessary to make that happen. And wellbeing needs to be sustainable as well, you know, they both go hand in hand.
Yeah. Yeah. So I think, you know, there's a lot, even in our own customers that we see, there's a lot that they are doing, you know. We're a wellbeing service provider for them, we're making recommendations to them, and they're taking those on board, which is great. But I think at a wider level, you know, across trust and safety as a whole industry, there's probably more that could be done.
And I think a lot of it is, like you’re saying, getting into the granular detail by doing more research about how content moderators are impacted, you know, the new kind of interventions or like therapies that could be helpful and supportive to them in facilitating post-traumatic growth, you know, helping people to really understand at that systemic level what the hazards are and how to mitigate those risks.
Yeah. And you’re really looking at that individual level, the content moderator, the environment they work in, the global stage as well, and what’s going on there. So there’s a lot to consider. Yeah.
Well, I think we’ve covered a lot in this conversation today, so I really appreciate you taking the time, Pam. Is there any kind of last takeaway that you would give to people? I think what we’re seeing now is a call for more transparency between different platforms and researchers and different bodies and regulatory bodies to all talk together and work together.
I think until now, even well-being companies ourselves work on an individual level, but really we need to kind of put our heads together and try to tackle these challenges. Yeah. Yeah. Collaboration, I think, in the industry is very much needed.
Yeah. Well, thank you very much to everyone who is listening. Thank you for giving us your time and we look forward to having you back to listen to the next episode of Zevo Talks.