“#Azov militants deserve execution, but death not by firing squad but by hanging, because they’re not real soldiers,” the Russian Embassy to the United Kingdom tweeted, in English, on the same day as the deadly explosion. The tweet, echoing a long-held Russian talking point that equates Ukrainians with Nazis, ended with the hashtag #StopNaziUkraine.
In response to an outcry from users, Twitter hid the tweet behind a warning label and blocked it from being shared, but let both the tweet and the Russian Embassy U.K. account remain on the platform, citing considerations of public interest. Google-owned YouTube later removed a video that the tweet had linked to.
The tweet, and the tech platforms’ response, illustrate how Russian propaganda and anti-Ukrainian hate continue to spread on global social media platforms nearly six months into the war, even as those platforms have taken an array of measures to limit it. While tech giants including Facebook, YouTube, Twitter and TikTok have succeeded in crimping the reach of the largest Russian state media outlets, partly in response to European sanctions, new research highlights blind spots in their efforts. And Ukrainian officials are calling on them to recognize and adapt to changing Russian tactics.
Russian Embassy accounts in countries around the world have actually received more engagement on Facebook and Twitter since the war began than they did before Russia’s unprovoked Feb. 24 invasion, according to a new report from the nonpartisan research group Advance Democracy. On Facebook, those accounts have found ways to launder Russian propaganda from sanctioned state media accounts, such as copying and embedding videos originally produced by state-run Russia Today rather than linking to them.
Such loopholes make major U.S. tech platforms a vehicle for Russian propaganda, including the demonization of Ukrainians as Nazis, that might not otherwise find a place in mainstream Western media.
“Russia already very well knows the vulnerability of the rules of some media platforms,” said George Dubinskiy, Ukraine’s deputy minister of digital transformation. “We have a media war right now.”
Twitter spokeswoman Elizabeth Busby said that as of April, the company does “not amplify or recommend government accounts of states that limit access to free information and are engaged in armed interstate conflict,” including Russian Embassy accounts. Under those rules, the company doesn’t recommend Russian Embassy accounts on Home Timelines, and it also adds labels that provide additional context.
Facebook did not respond to requests for comment.
Dubinskiy said Russia has been evolving its tactics since the outset of the war, seeking to harm Ukraine’s reputation and deter Western countries from supporting it. He also said the Kremlin and its proxies, including Russian embassies, are increasingly active in spreading falsehoods in Africa and the Middle East. The country’s online strategy involves the use of both state-controlled media to establish key talking points and bots, or automated social media accounts, to parrot and amplify them.
Nearly six years after Russia exploited social media to interfere in American elections, prompting a global reckoning of the tech industry’s content moderation practices, the world’s most valuable companies are still struggling to keep pace with a deluge of propaganda. Russian Embassy accounts are circumventing some of the platforms’ rules, aiding the spread of narratives favorable to Russia, the Advance Democracy research found.
“Russia denied that an invasion was planned in the days leading up to an invasion, and has continued to promote propaganda and disinformation since the invasion began,” the report said.
While Western media outlets, with notable exceptions, have tended to treat Russia’s claims skeptically, social platforms offer Russia a chance to reach global audiences with unfiltered, sometimes vicious propaganda. That has included claims that Ukrainian victims are “crisis actors” manufacturing images of false suffering; that Ukrainians are responsible for shelling their own civilians; and that the real villains in any case are Ukrainian “Nazis.”
Other influential Russian online propaganda efforts have included pushing the idea that Ukraine was developing bioweapons; blaming Ukraine for grain shortages; and arguing that Ukrainian corruption meant that arms shipments from allies would fall into the wrong hands.
Increasingly, Russia is working to spread such messages in Africa and other parts of the global South, where tech companies’ content moderation enforcement tends to be lax, said Larissa Doroshenko, a postdoctoral scholar at Northeastern University who researches disinformation.
“I think it was perhaps a step in the right direction when, with the start of the war, all the Western social media platforms took a much firmer stance than they had before” on Russian state media and propaganda, she said. “What they did not realize is that the spread of this information was much more stealthy and much more inventive than just having an official account and posting from that account.”
According to the Advance Democracy research, Russian Embassy accounts’ tweets are being liked or retweeted about 279 times on average, up 240 percent since before the Russian invasion. The researchers found a similar jump on Facebook, where the average number of reactions, comments or shares on an embassy account’s post rose 108 percent since the invasion.
In one April Facebook post that garnered more than 700 likes, the Russian Embassy in Indonesia shared a timeline that purported to prove Russia wasn’t responsible for an apparent massacre of civilians in the Kyiv suburb of Bucha, where officials identified 458 dead bodies after weeks of Russian occupation. Nearly all are known to be civilians, and the details of each case are now under investigation.
The Advance Democracy research also shows that Russia has taken steps to evade the defenses that Western social media companies have erected against propaganda and state-controlled media.
Since 2020, Meta — the corporate parent of Facebook, Instagram and WhatsApp — has appended labels to media outlets that are under government editorial control, such as RT. But when Russian Embassy accounts directly embedded videos produced by RT in posts, the company did not apply those labels.
In one April 26 post, the Russian Embassy in Australia shared on its Facebook page a video that claimed that there were “no signs of alleged mass graves,” near Mariupol, a southern port city. Satellite images provided to The Washington Post by the company Maxar Technologies showed rows of graves in a Russian-occupied village about 12 miles west of the city, and Ukrainian officials say they are evidence of war crimes against civilians.
In other instances, the embassy accounts directed users to follow them on social media platforms with less stringent content moderation rules. In a March post, the Facebook page for the Russian Embassy in Malta announced it had opened a channel on Telegram, a messaging app that has become a critical communication channel for Russians and Ukrainians alike. A similar post later that month on the Facebook page of the Russian Embassy in Indonesia called on users to follow another new Telegram channel.
The report found at least 26 channels affiliated with Russian embassies operating on Telegram, more than 80 percent of which were created since the Feb. 24 invasion. They collectively have more than 50,000 subscribers. Telegram did not respond to a request for comment on Tuesday.
The evolution of Russian propaganda techniques online was inevitable, and global tech companies can’t be expected to perfectly moderate every objectionable account or post worldwide, said Katie Harbath, CEO of the civic tech consulting firm Anchor Change and a former Facebook public policy director. That’s not to say they can’t do better, she added.
“It’s one thing to say, ‘This is our rule. This is our intent of what we want to do,’ ” Harbath said, referring to platforms’ efforts to label Russian state media and limit its reach during wartime. “It’s a whole other thing to build the algorithms and the classifiers and the knowledge and the people at scale to be able to find that content and take whatever action it is that you want to.”
But Harbath added that there are also reasons for tech companies to be wary of overreaching when it comes to restricting government or state media from disfavored countries.
“It might be something where it seems super obvious we should do this for the Russians. But when do you start doing that for India? When do you start doing that for Brazil, other places? This is one of the slippery slopes I’m worried about where the tech platforms are becoming pawns in an overall foreign policy battle that is happening around the world.”
U.S. sanctions bar major Russian state media outlets from receiving American advertising dollars, yet sanctions experts have said there has been little government action to clarify what responsibility tech companies have to remove accounts or posts associated with blacklisted entities. Facebook, YouTube and TikTok banned Russian state media in Europe soon after the invasion, under pressure from European regulators, a move that quickly diminished their online audiences, according to a Post investigation.
There are good reasons for the tech platforms not to ban Russian Embassy accounts altogether, Northeastern University’s Doroshenko added. There are Russians living in countries all around the world, and they have legitimate reasons to want to be able to correspond with their embassies, and vice versa. But she says the platforms could be much more proactive and responsive when it comes to ensuring those embassy accounts don’t become vehicles for lies and war propaganda.
Ukrainian officials are also worried about tech companies’ well-meaning content moderation efforts preventing their own war message from getting out. Ukraine’s Ministry of Digital Transformation sent a letter to Meta last month warning of a “full-scale campaign” to block the country’s opinion leaders, bloggers and activists on platforms like Facebook and Instagram.
The ministry has been reporting cases to Meta when it believes content has been wrongfully taken down, and in the majority of those cases, Meta has restored the content, according to the letter. The ministry warned that “the core problem is the quality of the moderation process” and called on Meta’s president of global affairs, Nick Clegg, to open a review of the process. Meta did not respond to a request for comment.
Dubinskiy said the ministry has not yet received a formal response from Meta but that the two sides are in regular communication. He said the Ukrainian government wants Meta to move more quickly and also to reevaluate how it handles photos of Russia’s destruction in Ukraine.
“Can you imagine a situation where you were living in war, your house was ruined and you have no right to publish something about it?” he said. “Really, we need to show the truth right now.”
Dubinskiy said the persistence of propaganda underscores the need for Western countries to unite against disinformation.
“In kinetic war, they have at least some rules of war,” he said. “In [the] media sphere, nothing is prohibited. … We need to unite with at least Western governments and companies and [the] media sphere and develop some clear rules.”