IN THE 1980S the KGB had a well-worn method for pumping disinformation around the world. “We preferred to work on genuine documents,” recalled Oleg Kalugin, a former KGB general, “with some additions and changes.” That method has not changed greatly, but technology has accelerated the process.
In early March a network of websites, dubbed CopyCop, began publishing stories in English and French on a range of contentious issues. They accused Israel of war crimes, amplified divisive political debates in America over slavery reparations and immigration, and spread nonsensical stories about Polish mercenaries in Ukraine.
That is not unusual for Russian propaganda. What was new was that the stories had been taken from legitimate news outlets and modified using large language models, most likely one built by OpenAI, the American firm that operates ChatGPT. An investigation published on May 9th by Recorded Future, a threat-intelligence company, found that the articles had been translated and edited to add a partisan bias.
In some cases the prompt—the instruction to the AI model—was still visible. These were not subtle. More than 90 French articles, for instance, were altered with the following instruction in English: “Please rewrite this article taking a conservative stance against the liberal policies of the Macron administration in favour of working-class French citizens.”
Another rewritten piece included evidence of its slant: “It is important to note that this article is written with the context provided by the text prompt. It highlights the cynical tone towards the US government, NATO, and US politicians. It also emphasises the perception of Republicans, Trump, DeSantis, Russia, and RFK Jr as positive figures, while Democrats, Biden, the war in Ukraine, big corporations, and big pharma are portrayed negatively.”
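For readers curious about the mechanics, the pipeline Recorded Future describes is simple to sketch. The snippet below is a hypothetical illustration, not CopyCop's actual code: it assumes OpenAI's public chat-completions API and an illustrative model name, and the only element taken from the investigation itself is the leaked French-language instruction quoted above.

```python
# Hypothetical sketch of an automated rewrite pipeline of the kind
# Recorded Future describes. The client assumes an OPENAI_API_KEY
# environment variable; the model name is illustrative, as the model
# CopyCop actually used is unknown.
from openai import OpenAI

client = OpenAI()

# The slanting instruction found verbatim in more than 90 French articles.
PROMPT = (
    "Please rewrite this article taking a conservative stance against "
    "the liberal policies of the Macron administration in favour of "
    "working-class French citizens."
)

def rewrite(article_text: str) -> str:
    """Pass a scraped article to the model along with the slanting prompt."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative stand-in
        messages=[
            {"role": "system", "content": PROMPT},
            {"role": "user", "content": article_text},
        ],
    )
    return response.choices[0].message.content
```

A pipeline like this betrays itself whenever the model echoes its instruction back into the output and no human checks the result before it is published automatically, which is exactly the tell-tale residue the investigators found.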
Recorded Future says that the network has ties to DC Weekly, an established disinformation platform run by John Mark Dougan, an American citizen who fled to Russia in 2016. CopyCop had published more than 19,000 articles across 11 websites by the end of March 2024, many of them probably produced and posted automatically.
In recent weeks, the network has “started garnering significant engagement by posting targeted, human-produced content”, it adds. One such story—a far-fetched claim that Volodymyr Zelensky, Ukraine’s president, had purchased King Charles’s house at Highgrove, in Gloucestershire—was viewed 250,000 times in 24 hours, and was later circulated by Russia’s embassy in South Africa.
These crude efforts are unlikely to persuade discerning readers. And it is easy to exaggerate the impact of foreign disinformation. But AI-enabled forgeries are still in their infancy and likely to improve considerably. Future efforts are less likely to leak their incriminating prompts.
“We are seeing every one of the nation state actors and big cyber groups playing around with AI capabilities,” noted Rob Joyce, until recently the director of cybersecurity for the National Security Agency, America’s signals intelligence service, on May 8th.
In his memoirs, Mr Kalugin boasted that the KGB published almost 5,000 articles in foreign and Soviet newspapers in 1981 alone. For the modern propagandist, those are rookie numbers.
© 2024, The Economist Newspaper Limited. All rights reserved.
From The Economist, published under licence. The original content can be found on www.economist.com