Inclusive Moderation: The Critical Role of Diversity and Inclusion

Thank you for listening to this month’s episode of Zevo Talks. For this episode, we were joined by Thomaz Lopes, Corporate Wellness Lead at Zevo Health, and our host Dr. Michelle Teo, Clinical Director at Zevo Health. In this episode, we dive into what diversity and inclusion mean in content moderation. Diversity and inclusion in content moderation are essential for unbiased decision-making, incorporating a range of perspectives to prevent unintentional biases. Prioritizing inclusivity fosters a respectful online environment that addresses the diverse needs of users, promoting equity and understanding.

 

Takeaways:

  • Representation in moderation teams 
  • Unconscious bias and cultural sensitivity 
  • Understanding of cultural norms 

You can also listen here:

Spotify

SoundCloud

YouTube

 

Speaker 1

Welcome everyone to Zevo Talks, where we delve into the crucial topic of inclusive content moderation and its impact on trust and safety teams. I am your host, Dr. Michelle Teo, Clinical Director of Zevo Health and a Chartered Counseling Psychologist.

 

Speaker 1

And today, I am joined by Thomaz Lopes, who will shed light on key aspects such as representation in moderation teams, unconscious bias, cultural sensitivity, user engagement, and understanding of cultural norms.

 

Speaker 1

Thomaz is the Corporate Wellness Lead with Zevo, as well as a behavioral psychologist. So you are very welcome, Thomaz.

 

Speaker 2

Hi, Michelle. Thank you. Thank you very much for the invite as well. I’m glad to be here.

 

Speaker 1

Lovely. So let’s start with the foundation. Why, in your opinion, is representation in moderation teams so crucial when it comes to content moderation?

 

Speaker 2

So, representation is key because a diverse team brings a variety of perspectives to the table. When the moderation team reflects the diversity of the user base, it helps in understanding different cultural contexts and ensures a more nuanced approach to content evaluation.

 

Speaker 1

Okay, so, you know, when we’re thinking about users who maybe use social media platforms, or games that are live and online, or even discussion forums, essentially what you’re saying is that if the user base is representative of certain groups, certain cultures, certain backgrounds, then moderation teams need to be the same.

 

Speaker 2

Exactly. So it’s important to take that into consideration. The cultural background, where the person is located, and the context they are situated in are very important when we are talking about inclusive content moderation.

 

Speaker 2

So it’s important for the moderation teams to be aware of that as well.

 

Speaker 1

Hmm, yeah. And I suppose in the context of this kind of discussion, unconscious bias is a term that’s often discussed. So could you explain how unconscious bias might manifest in content moderation and what kind of potential impact that has on the moderation processes?

 

Speaker 2

Yes, so unconscious bias can seep into decision-making processes, leading to disproportionate actions against certain content. For example, a lack of awareness about cultural nuances may result in the unintentional suppression of content that is actually acceptable, but misunderstood.

 

Speaker 2

I can give you an example to make that clearer. Think of countries that speak the same language, like Portuguese. We speak Portuguese in Portugal and in some countries in Africa, and there is also Brazilian Portuguese.

 

Speaker 2

And then there are cultural aspects of that that need to be taken into consideration. So a word that in Portugal might seem harmless might, in Brazilian Portuguese, mean something completely different, something that can be offensive.

 

Speaker 2

So it is important for the content moderation teams to be aware of that, to know whether that word is seen as offensive or not in specific areas of the world. That is something that we need to take into consideration.
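To make that concrete, here is a minimal sketch of a locale-aware blocklist lookup, illustrating how the same word can be treated as harmless in one locale and offensive in another. The term, locale codes, and function are hypothetical placeholders, not real policy data from any platform.

```python
# Hypothetical policy table: term -> locales where it is considered offensive.
# "example_term" and the locale codes are placeholders, not real policy data.
OFFENSIVE_BY_LOCALE: dict[str, set[str]] = {
    "example_term": {"pt-BR"},  # flagged for Brazilian Portuguese only
}

def is_offensive(term: str, locale: str) -> bool:
    """Return True if `term` is flagged as offensive in the given locale."""
    return locale in OFFENSIVE_BY_LOCALE.get(term.lower(), set())

# The same word yields different moderation outcomes depending on locale:
assert is_offensive("example_term", "pt-BR")      # offensive in Brazil
assert not is_offensive("example_term", "pt-PT")  # harmless in Portugal
```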

 

Speaker 1

Yeah, and I think that’s a really good example of, you know, just language nuance: despite the fact that it is, at its core, the same language, there can be differences by country when people speak that language, whether it’s a different dialect, or slang terms that are used in one country but not the other.

 

Speaker 1

And so, you know, understanding unconscious bias means we’re able to understand that nuance a little bit better, and moderation teams need to also be able to understand those nuances in the same way.

 

Speaker 2

Exactly. And then some platforms are using shadow banning, which is pretty much when the person is not being blocked from that specific platform, but their comment might not be seen by everyone, based on where they are, you know. So sometimes, when shadow banning happens, the user won’t even be notified that it is happening, right? This is mostly for the platform to protect specific cultural or social backgrounds.
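As a rough illustration of the shadow-banning behaviour described here, the sketch below shows a visibility check in which the author always sees their own comment, so no notification is ever sent, while viewers in certain regions do not see it. The data model and field names are hypothetical, not any platform’s actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    author_id: str
    text: str
    hidden_in_regions: set = field(default_factory=set)  # set by moderation

def visible_to(comment: Comment, viewer_id: str, viewer_region: str) -> bool:
    """Decide whether a viewer in a given region sees this comment."""
    if viewer_id == comment.author_id:
        return True  # the author always sees it, so they are never notified
    return viewer_region not in comment.hidden_in_regions

c = Comment("author_1", "some comment", hidden_in_regions={"region_x"})
assert visible_to(c, "author_1", "region_x")      # the author still sees it
assert not visible_to(c, "viewer_2", "region_x")  # hidden from others there
assert visible_to(c, "viewer_3", "region_y")      # visible everywhere else
```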

 

Speaker 1

And yeah, we see that, you know, with a lot of influencers who talk about shadow banning: they feel they’re not being represented by the platform, or they’re finding it hard to be represented because of that shadow banning. And whether or not that’s actually happening, there’s, you know, potential unconscious bias that’s being played out by some of those processes.

 

Speaker 1

And, you know, as moderation is fundamentally a human-centric piece of work, it’s hard to get away from those unconscious biases. And so, you know, I think that leads nicely into our next question, which is how cultural sensitivity is really paramount when it comes to moderation.

 

Speaker 1

So how do you feel that content moderation teams can develop and maintain cultural sensitivity to avoid unintentional biases?

 

Speaker 2

So, training plays a crucial role. Content moderation teams should undergo regular training sessions to enhance cultural awareness. In addition, fostering an inclusive environment where team members feel comfortable discussing and learning about diverse cultures is essential, both for them and for the content moderation itself.

 

Speaker 2

So, for example, what is important to take into consideration, and what companies can do, is workshops from experts who will be able to support those discussions, to make those discussions efficient and valuable, and then some guided peer group support as well, for the content moderators to be able to talk about the content and to talk about their own cultural awareness, or how to raise their cultural awareness.

 

Speaker 2

So, it’s important that organizations have those experts to be able to deliver those workshops and guide those discussions.

 

Speaker 1

Yeah, and it reminds me of when I was supporting moderators. You know, sometimes there are certain platforms that maybe have language-agnostic review processes, where the individual who’s moderating the content doesn’t actually speak the language of the content that they’re moderating.

 

Speaker 1

So things like cultural sensitivity and making sure that there is awareness of cultural differences is really important, particularly in those kind of areas where you’re not just relying on people’s market knowledge or their language knowledge, where they maybe are using translators or other tools to help them do the work that they’re doing.

 

Speaker 2

Exactly, it’s exactly about that. I do recall this myself as well from when I was supporting the teams; this is a very important and crucial part. Yeah, so you might come from a specific background and speak a specific language.

 

Speaker 2

But it’s still important to have those discussions and raise awareness of something that you might not know, and then you will be able to collaborate in a way that is effective.

 

Speaker 1

Our approach is holistic, providing services to the organization, the wider trust and safety team, and individual content moderators. From one-to-one digital and in-person therapy to crisis management and group interventions, we ensure every step of the moderator’s career journey is supported.

 

Speaker 1

But Zevo’s impact stretches beyond individual care. We provide comprehensive organizational solutions aligning with regulatory demands for compliance assurance, enhancing operational efficiency for performance optimization, and proactively supporting brand integrity.

 

Speaker 1

We want to ensure that content moderators across diverse industries from social media platforms to streaming services to gaming are flourishing. Discover our solutions today. Dive into our world of proactive wellbeing solutions.

 

Speaker 1

Visit zevohealth.com or click the link in this podcast description and join us in leading a new era of content moderator wellbeing. And so I suppose engagement is often cited as a challenge in content moderation.

 

Speaker 1

So how can an inclusive approach enhance user engagement while ensuring a safe online space?

 

Speaker 2

So that would be by involving the community in the moderation process. That way, platforms can gain valuable insights and foster a sense of ownership among users. So an inclusive approach invites users to be part of the solution as well.

 

Speaker 2

So it creates a collaborative environment that encourages positive engagement. An example I can give you is that some discussion websites rely on community moderators, besides their own content moderation staff.

 

Speaker 2

And also users who engage in reporting anything that is harmful. So basically it’s a cross-functional approach on the platforms, where more than one department is supporting with tailoring policies, reporting tools that need enhancement, or maybe AI data settings.

 

Speaker 2

So there is a variety of areas; it’s not just the content moderators themselves who will be in those discussions, but also users and other departments that will be able to support in creating specific tools or policies, as I said, to promote that.

 

Speaker 2

So it’s not focusing only on the staff, but it’s pretty much opening the discussion to a wider group across the population.

 

Speaker 1

Yeah. And it’s sort of the crux of what we’ve been talking about for the last few minutes. You know, it’s really about having that diversity of voices. So it’s not even just the diversity in terms of culture and background.

 

Speaker 1

And, you know, it could be religious affiliation, political affiliation, or whatever else. It’s also just having a diverse set of voices in terms of roles. So am I a user? Am I a content moderator?

 

Speaker 1

Am I someone who builds AI tools? Am I a policy developer? Whoever it is, the more voices we can get at the table, the better. So understanding cultural norms, I suppose, is also critical when we’re talking about inclusive moderation.

 

Speaker 1

So how can content moderation teams keep themselves updated on evolving cultural norms in order to adapt their strategies accordingly?

 

Speaker 2

So, regularly engaging with the user community, leveraging feedback mechanisms, and staying connected with cultural experts are effective strategies. Continuous education within the moderation teams is also key to ensuring a dynamic and responsive approach to evolving cultural norms.

 

Speaker 2

So, for example, content moderators should be aware of changes in the world, like words that before were considered acceptable. I wouldn’t even go too far back; let’s say five years ago, some words would have been accepted as not offensive, not harmful, when you’re addressing a person.

 

Speaker 2

But nowadays those same words that were okay before can be considered and seen as offensive by specific communities. So we need to be constantly aware of the changes that are happening in the communities and in the world.

 

Speaker 1

And yeah, I think you make a really salient point there: it can be language, sometimes it’s behaviours. You know, culture shifts all of the time. What’s acceptable versus what’s not acceptable shifts all the time; what’s offensive versus what’s not offensive shifts all the time. And so we have to be able to keep up with the evolving landscape.

 

Speaker 2

Exactly. Which is extremely important, like the world changes and the content moderation needs to change as well.

 

Speaker 1

Yeah, exactly. So in the context of content moderation, how do you think that technology can be leveraged to enhance diversity and inclusion efforts?

 

Speaker 2

So technology can aid in automating certain aspects of content moderation. But it’s essential to ensure that algorithms are trained on diverse data sets to avoid perpetuating biases. Additionally, ongoing human oversight is crucial to handle the situations that algorithms might struggle with.

 

Speaker 2

So, we are talking a lot about AI at the moment, which is efficient up to a certain point in content moderation. So yeah, it is important to have the human side of it, because, as you asked me in previous questions, humans are trained to see the differences, what’s happening in the world, and the differences between variants of the same language.

 

Speaker 2

So sometimes the algorithms and AI can struggle with that specifically, and that is where the content moderation team is so important, to be that layer of protection for the users.
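As a rough sketch of that layered approach, the example below routes clear-cut cases automatically and escalates anything the model is unsure about, such as locale-specific nuance, to human moderators. The classifier and thresholds are hypothetical placeholders, not a real model or any platform’s settings.

```python
AUTO_REMOVE_THRESHOLD = 0.95  # hypothetical confidence cut-offs
AUTO_ALLOW_THRESHOLD = 0.05

def classify(text: str, locale: str) -> float:
    """Placeholder model: probability that the content is harmful."""
    return 0.5  # a real model would score the text in its locale context

def route(text: str, locale: str) -> str:
    """Act automatically only on clear-cut cases; escalate the rest."""
    score = classify(text, locale)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "remove"        # clearly violating: automated action
    if score <= AUTO_ALLOW_THRESHOLD:
        return "allow"         # clearly fine: no action needed
    return "human_review"      # ambiguous, e.g. locale-specific nuance

assert route("some borderline post", "pt-BR") == "human_review"
```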

 

Speaker 1

So Thomaz, what steps can platforms take today to make their content moderation more inclusive?

 

Speaker 2

Good question. So platforms should prioritize diversity in hiring; from the hiring process onwards, they should take that into consideration and make it an important aspect. Then invest in ongoing training programs, involve the community in moderation decisions, and embrace a culture of openness and learning.

 

Speaker 2

By doing so, they can build trust, create safer online spaces, and contribute to a more inclusive digital environment.

 

Speaker 1

Right, so my last question for you then is how do cultural dynamics impact well-being and trust and safety services in the workplace, especially if we’re considering interventions, moderation practices, even team leadership across diverse cultural backgrounds?

 

Speaker 2

So cultural dynamics play a crucial role in shaping the effectiveness of well-being and trust and safety services in the workplace. So inclusive moderation requires cultural sensitivity that goes beyond avoiding judgment or bias.

 

Speaker 2

It involves tailoring interventions to suit specific cultural groups. For instance, certain cultural and social groups from specific countries may perceive seeking support as taboo, or as indicative of mental health issues.

 

Speaker 2

And then similarly, team leadership from different cultural backgrounds can lead to miscommunications and tensions if not managed effectively. So I would say training in understanding unconscious biases becomes essential to navigate a diverse environment, encompassing various factors like sexual orientation, race, and cultural beliefs.

 

Speaker 2

So by fostering cultural awareness and sensitivity in all aspects, from leadership to the teams, organizations can create a safer and more supportive workplace for their employees, ensuring that well-being and trust and safety services are truly inclusive and effective.

 

Speaker 1

Hmm, I think that’s a great point for us to finish up on. So really, thank you, Thomaz, for joining me today and sharing your insights on the really critical role of diversity and inclusion in content moderation.

 

Speaker 1

So to all our listeners, thank you very much, and we welcome you to join us next time as we continue exploring pressing topics in the ever-evolving landscape of online platforms.

 

Speaker 2

And thank you very much yourself.