Content moderators are organizing against Big Tech

Content Creation | By Gavin Wallace | 30/05/2025 | 7 Mins Read

Content moderators who comb through harmful material uploaded to online platforms have formed a global trade union alliance in a bid to improve working conditions. The Global Trade Union Alliance of Content Moderators (GTUACM), announced today in Nairobi, Kenya, aims to “hold Big Tech responsible” for failing to address workers’ issues such as low wages, trauma, and a lack of union representation across the industry.

Companies like Meta, ByteDance, and Alphabet often outsource content moderation on their platforms to contract workers, who must flag and analyze videos containing violent content, hate speech, and images of abuse. GTUACM says that many moderators in the industry experience “depression, post-traumatic stress disorder, suicidal ideation, and severe mental health consequences” from being exposed to such content without adequate support. They also face unrealistic performance expectations, precarious employment, and the threat of punishment if they speak out.

“The pressure to review thousands of horrific videos each day – beheadings, child abuse, torture – takes a devastating toll on our mental health, but it’s not the only source of strain. Precarious contracts and constant surveillance at work add more stress,” said Michał Szmagaj, a former Meta content moderator who is now helping workers to unionize in Poland. “We need stable employment, fair treatment, and real access to mental health support during work hours.”

GTUACM aims to offer a global forum for negotiating with tech companies, coordinating collective campaigns, and researching occupational health. The alliance will include content moderators through trade unions in Ghana, Kenya, and other countries; unions from Ireland, Germany, and elsewhere are expected to join in the future.

There are no US unions on that list, but that does not mean they won’t participate. Benjamin Parton, head of UNI Global Union’s ICTS Sector, told The Verge that “not all unions who are supporting content moderator organizing were able to attend the event, but we work closely with our member unions in the United States, such as the CWA, to demand justice in the Big Tech supply chain.”


“Kenya has become a global hub for [content] moderation, and we welcome investors to Kenya to invest in this sector, but it must not be against the health of workers in this country,” said Benson Okwaro, the general secretary of the Communication Workers Union of Kenya (COWU). “That is why we are organizing on the ground and alongside unions worldwide. Together we are sending a clear message to investors in this sector, including Meta, TikTok, Alphabet, and Amazon that moderators everywhere will no longer stay silent while platforms make profit from their pain.”

Meta is notably being sued by former content moderators in Ghana and Kenya over psychological distress inflicted by the contracted role. A group of former content moderators who flagged graphic and violent videos on TikTok has also filed a lawsuit against their former contractor, Telus Digital, over claims that they were fired for trying to unionize and improve their working conditions.

“The content we see doesn’t just disappear at the end of a shift. It haunts our sleep and leaves permanent emotional scars,” Özlem, a former Telus worker, said in a statement to UNI Global Union. “When we raise it with our managers, they say these are the conditions TikTok, the client, requires. When we stand up for better conditions at our jobs, our coworkers get fired.”

Meta, TikTok, Google, and other social media platforms have been contacted for comment on the GTUACM.

“Companies like Facebook and TikTok can’t keep hiding behind outsourcing to duck responsibility for the harm they help create,” said Christy Hoffman, general secretary of UNI Global Union. “This work can — and must — be safer and sustainable. That means living wages, long-term employment contracts, humane production standards, and a real voice for workers.”

