A pro-Russian disinformation campaign is using consumer artificial intelligence tools to fuel a “content explosion” focused on exacerbating tensions around global elections, Ukraine, and immigration, among other controversial issues, according to new research published last week.
The campaign, known as Operation Overload or Matryoshka (other researchers have also tied it to Storm-1679), has been linked to the Russian government by multiple groups, including Microsoft and the Institute for Strategic Dialogue. It spreads false narratives by impersonating news outlets, with the apparent aim of sowing division in democratic nations. While the campaign targets audiences around the world, including in the US, its main focus has been Ukraine, where it has used AI-manipulated videos in an attempt to push pro-Russian propaganda.
The report details how the amount of content produced by the people running the campaign, and the attention it received, increased significantly between September 2024 and May 2025.
Between July 2023 and June 2024, the researchers identified 230 unique pieces of content promoted by the campaign, including pictures, videos, QR codes, and fake websites. In the past eight months, however, Operation Overload has produced a total of 587 unique pieces of content, the majority of them generated with the help of AI tools.
The researchers said the spike in content was driven by consumer-grade AI tools that are available for free online. Their easy accessibility helped fuel the campaign’s tactic of “content amalgamation”: using AI to produce multiple pieces of content pushing the same story.
“This marks a shift toward more scalable, multilingual, and increasingly sophisticated propaganda tactics,” wrote researchers from Reset Tech, a nonprofit based in England that tracks disinformation campaigns, and Check First, a Finnish software company, in the report. “The campaign has substantially amped up the production of new content in the past eight months, signalling a shift toward faster, more scalable content creation methods.”
The researchers were struck by the variety of tools and content types the campaign employed. “What came as a surprise to me was the diversity of the content, the different types of content that they started using,” Aleksandra Atanasova, lead open-source intelligence researcher at Reset Tech, tells WIRED. “It’s like they have diversified their palette to catch as many like different angles of those stories. They’re layering up different types of content, one after another.”
Atanasova added that the campaign did not appear to be using any custom AI tools to achieve its goals, instead relying on AI-powered image and voice generators that are accessible to anyone.
While it was difficult for the researchers to identify all of the tools the campaign operators were using, they were able to pinpoint one in particular: Flux AI.
Flux AI is a text-to-image generator developed by Black Forest Labs, a German company founded by former employees of Stability AI. Using the SightEngine image-analysis tool, the researchers found a 99 percent likelihood that a number of the fake images shared by the Overload campaign, some of which claimed to show Muslim migrants rioting and setting fires in Berlin and Paris, were created with Flux AI’s image generation.