A Pro-Russia Disinformation Campaign Is Using Free AI Tools to Fuel a ‘Content Explosion’


A pro-Russia disinformation campaign is using consumer artificial intelligence tools to fuel a “content explosion” focused on exacerbating existing tensions around global elections, Ukraine, and immigration, among other contentious issues, according to new research published last week.

The campaign, known by many names including Operation Overload and Matryoshka (other researchers have also tied it to Storm-1679), has been operating since 2023 and has been aligned with the Russian government by multiple groups, including Microsoft and the Institute for Strategic Dialogue. The campaign spreads false narratives by impersonating media outlets with the apparent aim of sowing division in democratic countries. While the campaign targets audiences around the world, including in the United States, its main target has been Ukraine. Hundreds of AI-manipulated videos from the campaign have tried to fuel pro-Russian narratives.

The report outlines how, between September 2024 and May 2025, the amount of content produced by those running the campaign has increased dramatically and is receiving millions of views around the world.

In their report, the researchers identified 230 unique pieces of content promoted by the campaign between July 2023 and June 2024, including pictures, videos, QR codes, and fake websites. Over the last eight months, however, Operation Overload has churned out a total of 587 unique pieces of content, most of them created with the help of AI tools, the researchers said.

The researchers said the spike in content was driven by consumer-grade AI tools that are available for free online. This easy access helped fuel the campaign's tactic of “content amalgamation,” in which those running the operation were able to produce multiple pieces of content pushing the same story thanks to AI tools.

“This marks a shift toward more scalable, multilingual, and increasingly sophisticated propaganda tactics,” researchers from Reset Tech, a London-based nonprofit that tracks disinformation campaigns, and Check First, a Finnish software company, wrote in the report. “The campaign has substantially amped up the production of new content in the past eight months, signaling a shift toward faster, more scalable content creation methods.”

The researchers were also surprised by the variety of tools and types of content the campaign deployed. “What surprised me was the diversity of the content, the different types of content that they started using,” Aleksandra Atanasova, lead open-source intelligence researcher at Reset Tech, tells Wired. “It is as if they have diversified their palette to catch as many different angles of those stories. They are layering up different types of content, one after another.”

Atanasova added that the campaign did not appear to be using any custom AI tools to achieve its goals, but was relying on AI-powered voice and image generators that are accessible to everyone.

While it was difficult to identify all the tools the campaign's operatives were using, the researchers were able to narrow down to one tool in particular: Flux AI.

Flux AI is a text-to-image generator developed by Black Forest Labs, a Germany-based company founded by former employees of Stability AI. Using the SightEngine image-analysis tool, the researchers found a 99 percent likelihood that some of the fake images shared by the Overload campaign, some of which claimed to show Muslim migrants rioting and setting fires in Berlin and Paris, were created using Flux AI's image generation.


