The Story Behind the Screen: Life as a Content Moderator

We were thrilled to welcome the incredible Alexandra Koptyaeva, Trust and Safety Lead at Heyday. Starting her career as a Content Moderator, Alexandra has ascended to a leadership role, drawing on her rich experience to offer unique insights into mental health support and content moderation.

Key Takeaways:

  • Navigating Cultural Norms in Mental Health Support: Explore how varying cultural backgrounds influence approaches to mental health.
  • The Human Side of Content Moderation: Understand the personal impacts on those who keep our digital spaces safe.
  • Understanding Our Own Capabilities and Adapting to Others: Learn how self-awareness can enhance our interactions and support mechanisms.
  • The Link Between Positive Mental Health and Productivity: Discover how mental wellbeing contributes to organizational success.
  • Balancing Internet Safety and Moderator Protection: Discuss strategies to protect both the integrity of the Internet and the well-being of Content Moderators.

Spotify

SoundCloud

YouTube

 

Speaker 1

So welcome back to Zevo Talks. I am Dr. Michelle Teo, clinical director at Zevo Health and your host. And today I’m joined by Alexandra, Trust and Safety Lead at Heyday. Our episode today delves into the story behind the screen, as Alexandra previously worked as a content moderator, and I’m so excited to speak to her about her experiences.

 

Speaker 1

So you’re very welcome, Alexandra.

 

Speaker 2

Thank you and thanks for having me here. Excited to have this conversation today.

 

Speaker 1

Great. So maybe just to start off, could you tell our listeners how long you’ve been working in the trust and safety industry?

 

Speaker 2

Yeah, of course. So technically, I started my career back in 2018, when I was relocated to Athens, Greece, to work as a content moderator in Russian and English. And then a year later, I transitioned into the startup environment.

 

Speaker 2

So since then, I’ve been focusing on trust and safety roles for social media and e-learning startups.

 

Speaker 1

Okay. And when you were working in this moderator job, I know that you’ve worked with various different client companies, was the job what you expected, based on the job spec that you saw, some of the interviews, or even the onboarding training that you went through?

 

Speaker 2

Um, so it was both, like, yes and no. Um, when I started back in 2018, first of all, the title of content moderator did not exist. I think we were called social media reviewers or something like that.

 

Speaker 2

And during the first interview that I had, it was mentioned that I might come across some kind of harmful content, but it was never really explained. That’s why it was, I guess, very new and also very surprising for me to even hear those questions. When it came down to the work itself, I would say it was a very long learning journey, at least with the outsourcing company.

 

Speaker 2

Simply because I was hired to do one thing, and then, for sure, during the year it changed several times. So from e-commerce I went to politics, then to something else, and then something else again.

 

Speaker 2

So there were many things that I wasn’t mentally prepared to see. Um, like, for example, when moderating e-commerce content, you definitely don’t expect to come across prostitution

 

Speaker 2

when you are trained on spotting fraud, like fake brands.

 

Speaker 1

Okay, so it’s that sort of unexpected nature of the work. You know, it’s not just one type of abuse that you might be viewing; there’s always the possibility that there are going to be other kinds of online harms that you face, even if you’re supposed to be focusing on fraud or on another kind of workflow or abuse area.

 

Speaker 2

Yeah, exactly. And I feel like, in general, the nature of trust and safety, and even the role of content moderator, is so diverse. So every single day, I never know what I might see if I’m moderating content.

 

Speaker 2

So mentally, I know, I mean, now I know, that I should be prepared to see anything, whatever my specific tasks might be.

 

Speaker 1

Okay. And were you given any kind of training on, like, you know, preparing for potentially graphic imagery? Or was it ever even stated to you that it could possibly happen?

 

Speaker 2

Um, so during the interview, I guess back in 2018, it was briefly mentioned, but then it was never specified, even throughout the year of my work, what the company meant by graphic content.

 

Speaker 2

Um, and for me, because it was all new, I didn’t know. Um, and then what happened was that the outsourcing company I started my career with, they did have psychologists on the floor that we could go to. But I was working in the Russian department, and what happened there, six months later, was that I got promoted.

 

Speaker 2

So I helped to supervise content moderators. And I think it’s probably part of the Russian mentality that it’s really not common to talk to a psychologist. The environment itself was set up in a way that you’re rather expected to talk to your friends or family and share everything with them, instead of talking to a third person that, technically, you don’t really know.

 

Speaker 2

And that’s why, in my department, what happened at the end of the day was that literally no one would go to see a psychologist. And if someone did try to, unfortunately, I remember them being stigmatized or bullied, and constantly asked by the entire team, although it was supposed to be a private issue, why they needed to see a psychologist.

 

Speaker 2

So it was very uncomfortable to approach anyone in that type of environment.

 

Speaker 1

Yeah, and I think it brings up a really important point when we’re talking about content moderation and content moderators in general. You know, it’s a very diverse group with all kinds of cultural backgrounds, and everyone has their own idea of what wellbeing is or isn’t, and of whether or not there is stigma in their culture around mental health. Like, even myself, as an Asian,

 

Speaker 1

you know, there’s a big stigma around mental health. We don’t talk about mental health, so it would be much more challenging for us to approach a psychologist or any kind of mental health professional if we were experiencing issues, which sounds exactly like what you’re describing with your Russian colleagues.

 

Speaker 2

Yeah, exactly. And I think, back then, I was 22, 23. So coming from Russia originally, I honestly had no idea about mental health itself, because it had never been a topic of discussion. In my family, in my circle of friends, in my school, at the university, it had never been brought up.

 

Speaker 2

So there was the knowledge that I did have the possibility to talk to a psychologist, versus knowing that if I did, I might be bullied by my team. For me, definitely as a team lead, back then it was a no-go.

 

Speaker 2

And I think only gradually, over the years, did I get to not only learn on my own what mental health is, how to protect it, how to be prepared, but also understand that there should be nothing bad about talking to a professional trained to help me.

 

Speaker 2

Yeah, so yeah, I think like that definitely should be taken into consideration.

 

Speaker 1

Mm, absolutely. And I do understand that you have experienced some mental health difficulties in relation to the job. Do you want to maybe just share a little bit about that?

 

Speaker 2

Yeah, of course. So I think it was not only while working at the outsourcing company, but also later in a startup environment, where not only is the amount of work much higher, but the pace is much faster.

 

Speaker 1

Yeah.

 

Speaker 2

I got to see definitely quite a lot of disturbing content, especially working for social media startups. And what happened was that I definitely had insomnia for quite some time in both environments.

 

Speaker 2

I remember breaking down at least three times while working in a startup environment, because of the combination of what I was seeing on the screen and how hard it was to take it in. And definitely, I guess, not necessarily being depressed, but questioning my environment and questioning my relationships with other people.

 

Speaker 2

To the point that, I think, especially when dealing with, let’s say, prostitution or scammers, it’s very hard to trust people after that. Knowing that this is what happens online, this is how people get scammed, and then trying not to bring it into your real life and into your relationships with people.

 

Speaker 2

So even now, I can see how, in a way, it keeps on affecting me one way or another.

 

Speaker 1

Yeah, I mean, it’s not uncommon; I’ve heard content moderators talk about that. It’s that sort of shift or change in your worldview, that not everything is sunshine and rainbows. There’s a lot of stuff that content moderators view in the course of their work that indicates there are bad actors out there, and that there are people looking to manipulate you or harm you in some way. And it can really take a toll, exactly like you’re saying, on how much you trust other people in the real world,

 

Speaker 1

people that you’re interacting with on a day-to-day basis who aren’t in the online world, and just how that might have a longer-term impact rather than just in the moment. Because I think a lot of the conversation around wellbeing and supporting content moderators is: okay, I view a piece of content.

 

Speaker 1

I have an immediate reaction to it, how do I respond in the moment? But, you know, there are potential long-term impacts, like this kind of change in worldview, that haven’t necessarily been all that commonly addressed or haven’t really been talked about too much.

 

Speaker 1

At least over the course of maybe the last six years, I have definitely seen that change, but when you were initially starting out as a content moderator in 2018, it probably wouldn’t have been talked about very much.

 

Speaker 2

Um, yeah, a hundred percent. I mean, even now I can definitely see how, for example, working with child sexual exploitation has carried over into my relationship with my sister and her daughter, right?

 

Speaker 2

So I understand and I acknowledge that I began to be, I guess, overprotective when it comes to what type of content my sister is posting or sharing on her social media. Um, and I understand as well that it is not necessarily the healthiest way to build my relationship with her.

 

Speaker 2

But apart from that, I think the main difference between now and back then is that now we see all the new features being introduced by the content moderation vendors, or by the companies, when it comes to protecting the mental health of content moderators. Let’s say, seeing the content blurred, or in black and white, instead of seeing it all as it is.

 

Speaker 2

Um, back then, that definitely wasn’t the case. So now, I guess, it makes it slightly easier to work, although it’s still definitely very challenging.

 

Speaker 1

Yeah, absolutely. And I think, regardless of what tools are in place, there’s still an element of having to review content. And even if it’s not very, very egregious imagery, it could be hate speech that you’re reviewing, or mis- and disinformation, which could also change how you perceive the world around you, right?

 

Speaker 2

Yeah, a hundred percent. And that’s why, I think, throughout all these years, I’ve got to learn how to put boundaries between my work life and my personal life, so that I won’t be affected the way I was back then.

 

Speaker 2

Um, and I also got to train the content moderators that I’m now managing and explain to them how to do that. Because otherwise, I see that no matter how many years of experience someone might have in content moderation, it still affects them.

 

Speaker 2

Yeah. Because, definitely, mental health is not black and white. You can’t be, I guess, living in a bubble all the time. One way or another, it might affect you.

 

Speaker 1

Yeah, absolutely. And, you know, even just from my perspective as a psychologist, there are tons of times in our lives when things are going really great, and we’re financially stable, and we’ve got great relationships, but then one event or one experience, like a bereavement or a breakup, can impact how we’re feeling.

 

Speaker 1

And then all of a sudden, maybe it’s in that moment where content begins to trigger you, even if it hasn’t previously.

 

Speaker 2

Yeah, definitely. I think all the external and internal factors play a very big role. But also, I guess, when it comes down to the work itself, what is very important for me is to try not to put a lot of pressure on the content moderators

 

Speaker 2

when it comes to the amount of content that they have to moderate per day. Because there are also daily KPIs involved. And working under this pressure, knowing that, let’s say, I have, or I used to have back then, between two to five seconds to review an image and confirm whether there are any violations on the screen or not.

 

Speaker 2

Yeah, it’s definitely a lot of pressure, sometimes coming from the company. So for me, if companies could also pay attention to how they protect the mental health of their content moderators on a daily basis, that would also be important, because it’s not just the KPIs; the pressure comes from management itself.

 

Speaker 2

And it’s good for some content moderators to know that, yes, they can reach out to the psychologists. But at the same time, if they still have to deal with all the pressure coming from the company management, yeah, it doesn’t really make sense.

 

Speaker 2

Like it’s not really helpful.

 

Speaker 1

Yeah, look, you hit on a really key point there. We even see it in our own data: it’s not necessarily the content itself that is harmful to people. A lot of the time, it’s those other work-related factors.

 

Speaker 1

So, how do I engage with my team members? Are they helping me when I don’t know what kind of decision I need to make on a ticket? Do I have a good relationship with my manager? Do I have KPIs, like you’re talking about, that feel unachievable and that aren’t really feasible based on the resources that are available to us?

 

Speaker 1

There are so many things outside of the actual content itself that could impact moderators’ mental health. And I think a lot of the time it’s that trade-off between companies needing to make sure that users are kept safe, by reviewing content and taking it down as quickly and as accurately as possible,

 

Speaker 1

but then, at the same time, trying to take care of their content moderators’ wellbeing. And there’s a sort of disconnect, almost, between the two.

 

Speaker 2

Yeah, so honestly, what I’ve come to understand, and what I’ve experienced as well, is definitely these kinds of sometimes unrealistic expectations when it comes to numbers, versus the experience of the company management.

 

Speaker 2

And for me, what I realized is important as a manager is to know my numbers as well, right? So from time to time, I would dedicate a few hours per week to moderating content, to measure my own speed and try to adjust the numbers for my team based on that.

 

Speaker 2

At the same time, I understand that maybe compared to someone else in my team, I might have more experience. So my speed might be much faster than someone else’s. So I’m also trying to keep a balance.

 

Speaker 2

But for sure, if someone moderates, let’s say, one image per, I don’t know, one hour, that’s another type of conversation.

 

Speaker 1

Yeah, absolutely. I think it’s a great approach you have as a team lead, making sure that you’re still taking action and reviewing content at times, and that you’re setting realistic expectations for your direct reports. You’re considering all kinds of different factors, not just whether they know the policies, whether they know something is a violation, and how quickly they can do it,

 

Speaker 1

but also the years of experience they have, and ensuring that you’re taking it all in as a holistic view rather than just looking at one thing in particular.

 

Speaker 2

Yeah. And I feel like, for some companies, or for me, what was important as a content moderator, and what I later noticed as a project manager, is that when content moderators see the backlog and the amount of content that is pending, some of them feel a lot of pressure just seeing the numbers.

 

Speaker 2

Because sometimes, when we’re talking about thousands of profiles pending review, keeping in mind that, okay, there might be child sexual exploitation involved, there might be some scamming, someone might be suffering at the moment, it definitely puts a lot of pressure on some people as well.

 

Speaker 2

Yeah. So for me, it was always important to hear the explanation as to how fast the backlog should be tackled. Because if we’re expected to deal with 10,000 pending profiles per day, that’s one conversation.

 

Speaker 2

If we know that there is nothing super urgent waiting in the queue, and it’s okay to moderate it throughout the week, that’s another type of conversation. It removes the extra pressure as well.

 

Speaker 1

Our approach is holistic, providing services to the organization, the wider trust and safety team, and individual content moderators. From one-to-one digital and in-person therapy to crisis management and group interventions, we ensure every step of the moderator’s career journey is supported.

 

Speaker 1

But Zevo’s impact stretches beyond individual care. We provide comprehensive organizational solutions, aligning with regulatory demands for compliance assurance, enhancing operational efficiency for performance optimization, and proactively supporting brand integrity.

 

Speaker 1

We want to ensure that content moderators across diverse industries, from social media platforms to streaming services to gaming, are flourishing. Discover our solutions today. Dive into our world of proactive wellbeing solutions.

 

Speaker 1

Visit zevohealth.com or click the link in this podcast description and join us in leading a new era of content moderator wellbeing.

Yeah, and I think you bring up a lot of great points there. We hear it a lot from content moderators: all these different factors that impact the way they work, how productive they can be, and how well they can switch off from work after finishing their shift for the day.

 

Speaker 1

If I know I’m coming back into work with a backlog of a thousand tickets, I’m going to be stressing about it all night before I come back in the following morning. So how do we find a way to make sure we know what’s the biggest priority versus what can be deprioritized, based on the potential harm to the user?

 

Speaker 2

Yeah, 100%. And I think doing some kind of rotation internally, within the content moderation team, is also important, so that no one feels like they’re working on one specific type of content all the time.

 

Speaker 2

Because, for example, it comes down to very basic things, like taking a vacation, right? I got feedback from some team members that, while on vacation, they still felt very stressed and under pressure, knowing that while they were away for, let’s say, three or four days, there was no one there moderating the content for the users they are trying to help.

 

Speaker 2

So that’s very important as well, I think, to keep in mind.

 

Speaker 1

Yeah, absolutely. Absolutely. And I know you mentioned that when you were working as a content moderator, you did have access to psychologists who were on site. Were there any other kinds of wellbeing supports offered to you, other than maybe your typical one-to-one counseling sessions?

 

Speaker 2

Yeah, so here I think it’s very important to mention the difference between the outsourcing companies that did offer this support, versus startups. From my experience, which I think is also very sad: since basically 2020, I’ve been working as a contractor, or as a freelancer, meaning that by default I do not have the same rights or the same, I guess, benefits as the employees. I’m not treated as a company employee.

 

Speaker 2

Unfortunately, most of the contracts that I’ve signed until now with my clients mention that the company is not responsible for my mental health and my wellbeing; they remove all the responsibility from themselves, meaning that, as a contractor, I literally have zero access to the wellbeing support that is provided to the company employees.

 

Speaker 2

So I think, of all the clients that I’ve had until now, which is somewhere up to ten, it was only one company that did try to do something for the contractors, a team of, I guess, more than 100 outsourced moderators. But it was still, for me, very questionable, because we got access to Headspace, like the employees.

 

Speaker 2

And then, I think, two months later it was removed because of financial issues the company had, or, in a way, they figured out that it was extra money they weren’t willing to spend. And again, for me, I guess it opens the door to another type of conversation: how do we actually protect the mental health of contractors, right?

 

Speaker 2

And why don’t the companies want to do anything for the outsourced staff, for the freelancers? Because, yeah, until now, I would say my mental health and wellbeing have been my own responsibility in 99% of cases.

 

Speaker 1

It’s quite challenging, I think, to help companies understand the return on investment: that if you actually put the money and the time into appropriately resourcing wellbeing for your content moderators, you are going to see better retention, improved productivity, improved accuracy, less sickness absence. But it’s hard to demonstrate that, I think, partly because wellbeing as a concept is quite difficult to measure.

 

Speaker 1

Even from our own perspective. So I think the conversation you’re talking about is trying to figure out who actually has the responsibility here. Does it need to be pushed up to lawmakers and policymakers, who can work with regulatory bodies to put certain standards in place, regardless of whether you’re hiring outsourcers and freelancers or hiring people as full-time employees?

 

Speaker 2

No, very true. I mean, I feel like, by default, unfortunately, right now the contractors or the freelancers have, in a way, become hidden labor. Because by default there are no benefits for contractors, at least not to the same extent as for company employees, so they’re treated in a different way.

 

Speaker 2

Meaning that, in most cases, there’s also literally zero protection in place for them against being laid off. Meaning that on any given day I can open my laptop and see a message that, from today on, my contract has been terminated.

 

Speaker 2

And that’s it. I’ll just be removed from all the company channels, and I won’t get any benefits despite suddenly being unemployed. So that’s another issue.

 

Speaker 1

I mean, it’s a tough industry to work in, isn’t it? Well, listen, I know that now you’re a Trust and Safety Lead at Heyday. Have you seen how the landscape of wellbeing has changed, maybe over the last couple of years, when it comes to content moderators?

 

Speaker 2

Um, yes. I think it has been improving, for sure. Starting from, I guess, content moderation vendors introducing different features to protect the mental wellbeing of content moderators, plus the companies themselves finally acknowledging that, yes, there might be a long-term impact on their moderators, and trying to do something about that, versus trying to silence the content moderators who speak up about what has been happening.

 

Speaker 1

Yeah, and have you seen any change in terms of like the actual well-being supports that are available to people?

 

Speaker 2

Um, yes. So I think, after COVID, due to the shift in the nature of work, from the office to remote work, we definitely started seeing more online solutions being introduced in the mental health space, and more, I guess, applications available for those working remotely.

 

Speaker 2

Um, so for me, quite a lot of positive changes have been happening.

 

Speaker 1

Yeah. Do you think there’s anything still missing?

 

Speaker 2

Um, definitely a lot. Yeah, I mean, by default, everything that we discussed when it comes to contractors and them not having any access: for me, that’s something for the companies who hire freelancers to focus on.

 

Speaker 2

Um, and then, long term, there is definitely so much more. Like, for example, having team meetings on maybe a yearly basis, so that team members working remotely can meet each other, connect, talk to each other, socialize, right?

 

Speaker 2

So I think that would also help to remove the extra pressure. I know from my own experience what it’s like, working remotely and literally having neither the time nor the opportunity to get to know your team members outside of the one-hour or 30-minute team meetings that are happening.

 

Speaker 1

Yeah, I think that’s a really great point, because we do have a couple of customers who are predominantly remote or fully remote, and it’s something that we hear a lot. You know, moderators are meant to be working in a sort of team environment, right? You rely on your colleagues to help you when you don’t know what decision to make, or if you can’t find a policy, or if you need a little bit of extra support. But when you’re remote, it’s much harder to access that support from your colleagues or from your manager, versus when you’re sitting next to them in the office and you can just turn to the person next to you and say, hey, could you help me with this?

 

Speaker 1

Yeah. So I wanted to talk a little bit about the company that you founded, Clearmind Keyboard. Could you tell us what the aim of that company is?

 

Speaker 2

Yeah, sure. So I recently became a co-founder and CEO. And what we do, or what we’ve built at Clearmind Keyboard, is, for the end user, an iOS app and a virtual keyboard that is powered by AI.

 

Speaker 2

And as you type using the keyboard, which looks exactly like any other keyboard that you currently have on your phone, the AI behind it is able to analyze the context of your communication, of your words.

 

Speaker 2

And based on that, towards the end of the day, you will see your personal top stress factors. Meaning that it works as you communicate with your friends and family members, use your social media or email, whatever you do with your keyboard at the moment.

 

Speaker 2

We can tell you if you might be anxious, or depressed, or stressed out, or burned out. All those kinds of things a user will be able to see through our app. Based on that, we will also measure the happiness levels and the awareness levels.

 

Speaker 2

Meaning that, for sure, the more depressed someone is, the less happy they are. And then, long term, down the road, what we want to do is also integrate personalized improvement plans. Meaning that as you learn what is stressing you out on a daily or weekly basis,

 

Speaker 2

we could also help someone build an action plan. Just knowing that you might be depressed or anxious is great, but then what can someone do about that? Yeah, so we’re working on that. And we have a very similar app and feature for B2B, so for companies.

 

Speaker 2

And I think, in a way, it also comes from my experience in trust and safety and content moderation. So our AI can also be integrated into companies’ internal communication channels, so that they can, I guess, focus on their retention rate and make sure that employees who are burned out get help in time.

 

Speaker 2

So it doesn’t lead to someone quitting the job because they can’t take it anymore. And of course, we also have the corporate version of the app for the employees.

 

Speaker 1

I mean, it’s an amazing app. It’s innovative, it’s using AI, it’s tracking people in real time, it’s giving you awareness at that user level, and it’s helping people have that knowledge of where they’re at on any given day, which means they can then take actions to rectify any challenges they’re having at the moment.

 

Speaker 1

So it’s an incredible initiative, and I’m excited to see where it goes going forward.

 

Speaker 2

Thank you. Yeah. And I feel like, in a way, it comes from my experience of coming from a culture where mental health is stigmatized, right? Because when you think about it, how often do you actually discuss with your friends what model or what type of keyboard you use on your phone?

 

Speaker 2

Like, hopefully it’s not your daily topic of discussion, right? So for someone coming from the same type of environment as me, it might be a great way to get support through a keyboard without their friends or family necessarily knowing what they’re doing.

 

Speaker 1

Yeah, absolutely. It’s breaking down some barriers there. Yeah, amazing. Well, listen, before we finish up, just one question for you. Is there a nugget of information, or something you might say to your younger self, back when you were working as a moderator?

 

Speaker 1

What would that be?

 

Speaker 2

Yeah, I think, definitely, I wish I had known how to protect my mental health much better, compared to what I do now. And in a way, I wish I had had a better working environment and a connection with my supervisor and my team members, so that I would have been comfortable sharing what I saw on the screen, to, in a way, decompress and help myself process what I was seeing.

 

Speaker 2

Yeah, I think that’s it.

 

Speaker 1

It’s that sense of openness, and just not hiding when we’re feeling a little bit vulnerable or when we need support. That we can just say to someone, it’s not a good day for me, I need a little bit of extra help, or a little bit of extra time, or a break, and that someone can turn to us and say, okay, I can give you that.

 

Speaker 1

That’s not a problem. 100%. Well, listen, thank you so much, Alexandra. I really appreciate you taking the time, sharing your experiences with us, and giving us a little bit of insight into Clearmind.

 

Speaker 1

I think it’s really amazing what you’re doing. So I just wanted to say thank you and to all of our listeners. I hope that you got something out of this conversation and keep an eye out on our socials for the next episode.

 

Speaker 1

So thank you, Alexandra. Thank you. Have a good day.