Russian online disinformation has increased in 2023, especially after SpaceX and Tesla billionaire Elon Musk took over Twitter late last year, a new report from the EU has said.

The authors warned the “reach and influence of Kremlin-backed accounts has grown further in the first half of 2023, driven in particular by the dismantling of Twitter's safety standards.”

Musk unleashed a wave of sackings when he took over, firing many moderators who vetted Twitter content for disinformation and harmful messages. 

He has said, however, that Twitter – since rebranded as X – is "working hard" to meet the DSA rules.

X wasn’t the only platform to draw criticism.

The independent study for the EU comes after tougher rules under the Digital Services Act (DSA) kicked in this month for the world's biggest online platforms. 


The report focused on risks from pro-Kremlin disinformation on six platforms – Facebook, Instagram, Twitter (rebranded as X), YouTube, TikTok and Telegram – and whether the companies' actions complied with elements of the DSA.

Except for Telegram, all must currently comply with the DSA's stricter rules, which demand a more aggressive approach to policing content – including disinformation and hate speech – from "very large" platforms with at least 45 million monthly active users.

Before the DSA took effect, tech companies signed a voluntary code of practice on disinformation that would have "mitigated some of the Kremlin's malign activities", the report said.

"The evidence suggests that online platforms failed to implement these measures at a systemic level," the study wrote. 

Most major platforms signed the code last year, but Twitter withdrew in June. The report also criticised Telegram, which has not signed up to the code.

The report concluded that under Article 35 of the DSA, "standards of effective risk mitigation were not met in the case of Kremlin disinformation".

The European Commission noted, however, that "limited data access" imposed "some caveats" on the assessment that the tech companies' efforts were "insufficient".


The EU is particularly worried about the impact of disinformation during next year's European Parliament elections and has urged tech giants to comply fully with the DSA. There is a "high risk", the study said, that Russia would interfere in the elections.

"The rules provided by the DSA hold great potential to reign in Kremlin disinformation campaigns and other state-sponsored attacks on the democratic integrity and fundamental rights," the authors urged. 

"But they must be applied quickly and effectively in order to help mitigate these coordinated attacks on European democracy," they added.
