Why are Russian disinformation campaigns citing ICIJ?
ICIJ spoke to disinformation experts about how these campaigns fuse fact with fiction, and engage in narrative laundering to trick audiences.
Over the past two years of war in Ukraine, Russian disinformation campaigns have been many and varied. Among the false claims that have gone viral are that Ukrainian President Volodymyr Zelenskyy’s wife bought a $5 million car and spent over $1 million at Cartier in New York, and that he owned a casino in Cyprus.
More than once, the deliberately inaccurate stories have lifted details from past ICIJ investigations. One claimed a company that Zelenskyy’s wife once owned, which surfaced in ICIJ’s Pandora Papers investigation, created the Cypriot casino’s website. Another tried to falsely link him to the supposed purchase of two yachts through Boris and Serhiy Shefir — a pair of brothers whose ties to Zelenskyy also featured in Pandora Papers.
To understand Russian disinformation campaigns and how they work, ICIJ spoke with disinformation experts at Columbia University’s Tow Center for Digital Journalism and Clemson University’s Media Forensics Hub.
Narrative laundering and why it works
Darren Linvill, a professor at Clemson University who studies Russian disinformation campaigns and co-directs the Media Forensics Hub, said the campaigns often show trademark signs of “narrative laundering.” Like money laundering, narrative laundering tries to pass off inaccurate information as legitimate.
By making the bad information look like it’s from an “unbiased source,” Linvill said, “it gives the message a higher probability of being believed by a more general public.”
Narrative laundering, Linvill said, has three steps: placement, layering and integration.
Placement refers to where the story first appears once it’s created — for example, videos uploaded to YouTube or social media.
Layering, the second step, is the process of obscuring the source of the fake story: paying to place it in non-Western news outlets, sharing via bot social media accounts or Russian-state-affiliated influencers, and publishing on fake news websites made to look like legitimate Western outlets — like the innocuous-sounding “DC Weekly.”
Emily Bell, founding director of Columbia University’s Tow Center for Digital Journalism, refers to the latter method as “pink slime journalism.” Fake French news sites, for example, were used to popularize the false story of Zelenskyy’s wife and her $5 million car. NewsGuard Technologies, which creates software to track misinformation and rates the credibility of news and information sites, reports that it’s tracking at least 618 fake websites disseminating Russian disinformation.
Integration is the final step, when the misinformation gets picked up by genuine voices, and integrated into mainstream discourse. Disinformation campaigns rarely get to this stage, Linvill said, but the few that do can wreak havoc, and technology like generative AI has dramatically lowered the cost of getting such campaigns off the ground.
“It does not take long because of modern technology,” Linvill said. “Social media is an incredibly efficient machine.”
In many ways, Linvill said, it’s an old Russian playbook. In the 1980s, Russia planted a letter to the editor in a KGB-created Indian newspaper in an attempt to convince the world that the U.S. created the AIDS virus. By 2006, a study found that as many as a quarter of Black Americans — a group with reason to distrust the U.S. government and the medical establishment — believed AIDS originated from a U.S. government laboratory.
Why disinformation catches fire
The most effective disinformation, Bell said, feeds into something that people already believe.
“If you get things which are not necessarily true, but they spread like wildfire, it means that there’s already a pre-existing condition for people to want to believe whatever the material is,” Bell said.
The stories often have threads of truth woven throughout: Zelenskyy was indeed in New York the same weekend as his wife’s supposed Cartier shopping spree, speaking at the United Nations. And legitimizing elements, like doctored videos or documents and allusions to well-established news events such as the Pandora Papers revelations, can extend disinformation’s reach. Sometimes, Bell said, audiences stop caring whether a story is real or fake.
“When people want to believe something, and a piece of disinformation or misinformation is dropped in front of them, they will leap on it,” Bell said. “Even when you then debunk it.”
Combatting disinformation
Skepticism and critical thinking are the easy recommendations, Linvill said. But conspiracy thinkers and groups like QAnon, he warns, think they’re being skeptical and critical thinkers too.
“Being skeptical is also what the Russians want you to do,” Linvill said. “Being skeptical is kind of how we’ve gotten to where we are, where no one trusts any media.”
Instead, he advises people to cultivate information sources they trust, and to approach the virtual world with the same caution they would the real world. Just because you meet someone wearing a T-shirt with a political slogan you agree with, he said, doesn’t mean you would invite them into your house and introduce them to your friends and family. Yet that, Linvill said, is exactly what you are doing when you reshare a dubious post on social media.
Combining academic research with trusted news outlets and individual media savvy offers the best outcomes, Linvill said. There’s no foolproof solution. Disinformation networks change their tactics in response to increased awareness and detection. For example, as the public has grown better at recognizing AI-generated images, accounts have started to use real photos for their profiles — increasing the resource burden on those networks.
The platforms and disinformation networks are locked in a kind of “arms race,” Bell said. And government agencies like the Federal Election Commission, which regulates campaign finance and political speech, can also play a larger role in cracking down on the origins of political messaging.
“People deserve to know where their messages come from,” Bell said.