New MDX study reveals impact of moderating harmful online content
22 April 2025
More than a quarter of the content moderators surveyed demonstrated moderate to severe psychological distress
A new Middlesex University study has revealed that a dedicated wellbeing service for online content moderators, who deal with extreme material on a regular basis, made a real difference in improving their mental health.
Dr Ruth Spence and Dr Jeffrey DeMarco, from MDX’s Centre for Abuse and Trauma Studies (CATS), surveyed 160 workers at an international company that provides moderation for the entertainment industry; their job is to flag and remove harmful material such as child sexual abuse images, hate speech and extreme violence.
The research found that just over a quarter of the content moderators demonstrated moderate to severe psychological distress from viewing toxic online material, while a third reported experiencing low wellbeing.
More than three-quarters (76.9%) of the content moderators reported being exposed to hate speech, and over a third reported exposure to humiliation (35.6%) and child sexual abuse material (34.4%).
On a daily basis, more than a quarter (28.7%) of the content moderators said they were exposed to content they found distressing, while more than a fifth (21.3%) reported seeing such content every week.
The study highlighted the importance of wellbeing support for content moderators and the need for further research, because many such employees are young and the long-term psychological effects of the work are unknown.
Problem-focused coping styles, such as a structured meeting held immediately after a distressing incident in which staff can discuss their feelings at length, were shown to reduce psychological distress and improve wellbeing.
Academics found that the majority of content moderators felt the existence of a dedicated wellbeing service made them feel ‘heard and valued’, which can increase job satisfaction and motivation, while almost half felt it improved their mental health.
Dr DeMarco, a Senior Lecturer in Psychology who co-authored the study, said: “This study demonstrated the importance of wellbeing programmes which are dynamic, flexible and tailored to meet the individual needs of the moderators, instead of the normal wellbeing services which amount to essentially a tick-box exercise and take a one-size-fits-all approach.
“This paper demonstrates that actually investing in wellbeing services and treating them as an essential pillar of the job, right from the point of joining all the way through to promotion or moving on, actually mitigates some of the potential psychological damage these content moderators could face, and that’s a really powerful finding.”
Dr DeMarco said that content moderators were the internet’s ‘firefighters’ and play a crucial role in protecting the public from harmful content online.
“These moderators are important not just for identifying child sexual abuse material, hate speech and extreme violence or pornography, but also because they assist in developing and informing the technological tools that prevent toxic materials online. All the work done by moderators feeds into algorithms and large language models, which helps big companies get better at automating this process and hopefully means that fewer people will be exposed to this vitriolic content in future.”
Moderators are often the starting point of criminal investigations, as they flag content to the US-based National Center for Missing & Exploited Children (NCMEC), which shares the information with local, national or international law enforcement agencies that identify and bring perpetrators to justice.
The paper, ‘Content Moderator Mental Health and Associations with Coping Styles: Replication and Extension of Previous Studies’, has been peer reviewed and published in the journal Behavioural Sciences.
Find out more about studying Psychology at Middlesex University.
Photo by Teja Chari on Unsplash