Complementary Efforts of Serbian and Russian Bot Networks

© EPA/ANDREJ CUKIC   |   Men walk past a mural depicting the logo of the Russian mercenary 'Group Wagner' and a slogan in Russian by the informal pro-Russia organization 'Narodna Partola' (lit.: People Patrol), in Belgrade, Serbia, 20 January 2023.

In February 2022, as the Russian invasion of Ukraine began, a different kind of offensive was underway on social media. Automated accounts flooded Twitter, Facebook, and Telegram with the goal of amplifying Kremlin narratives and manufacturing consensus as the war unfolded. This effort represented the culmination of years of institutional investment in what disinformation experts focused on Russia call “reflexive control”. Serbia, geographically encircled by the European Union and NATO yet politically and economically tied to the Kremlin, has developed a strikingly similar infrastructure, albeit adapted to the more modest purpose of entrenching the rule of a single political party. These two models, one imperial in scale, the other clientelistic, nevertheless often amplify each other in political discourse, although there is no clear evidence they are coordinated from a single center of power.

The Russian playbook: bot farms, fake internet personas, Doppelganger sites, and fake news floods

The structure of Russia's online influence operations has been widely researched by Western intelligence agencies, academia, and the media since the start of the full-scale invasion of Ukraine, while Serbian regime bot farms have received far less attention because of their limited reach. On the Russian side, the earliest identified node was the Internet Research Agency, established in Saint Petersburg in the summer of 2013 and linked to oligarch Yevgeny Prigozhin, who publicly admitted in 2023 to having founded it. Its operations, detailed in the Mueller indictment of February 2018, involved hundreds of employees working in shifts to maintain fake personas across every major Western platform. The content was calibrated to exacerbate existing divisions: the agency ran competing Facebook groups simultaneously, staged pro-Trump and pro-Clinton rallies in the same American cities on the same day, and embedded its messaging in racial controversies with no connection to Russia. The goal was atomization – to saturate the information environment with conflict until citizens ceased to trust mainstream sources.

Since 2022, the Russian bot toolkit has expanded considerably in parallel with Russian expansionist policies. The operation known as Doppelganger, attributed by the US Department of Justice and France's agency for countering foreign digital interference, VIGINUM, to three Russian companies – Social Design Agency, Structura National Technology, and ANO Dialog – represents a significant evolution in method. Rather than fake personas expressing opinions, Doppelganger deploys fake versions of entire media institutions. It cloned the websites of Der Spiegel, Le Parisien, Fox News, and The Washington Post and spread fake news under the guise of trusted media. The US Department of Justice, which seized 32 domains in September 2024, publicized an internal Social Design Agency document describing the project's stated objective as the escalation of internal tensions in countries allied with the United States. The FBI affidavit accompanying the seizure named the project “International Conflict Incitement” and identified it as directed by Sergei Kiriyenko, First Deputy Chief of Staff of Putin's Presidential Executive Office.

A parallel network known as Pravda, dissected in detail by VIGINUM under the earlier designation Portal Kombat, operates on the opposite principle. It floods the internet with sheer volume of disinformation: in 2024 alone, it published over 3.6 million articles across approximately 150 pro-Kremlin websites targeting 49 countries, according to the American Sunlight Project and NewsGuard. Its primary method is gaming the algorithms themselves. The content is designed to be scraped by the crawlers that train large language models, and a NewsGuard audit published in March 2025 found that ten leading Western AI chatbots repeated Pravda-sourced disinformation in 33% of the cases tested. The American Sunlight Project has described this technique as “LLM grooming”. The Pravda network operates Serbian-language sites as part of this structure. An RFE/RL investigation documented how one of them published over 1,000 articles in a single day during the April 2024 Israeli-Iranian military exchange, sourcing content almost entirely from Russian Telegram channels with a turnaround time of three to twelve minutes.

Serbia deploys bots to spread pro-government and anti-opposition narratives

Serbia's domestic bot ecosystem differs from Russia's in origin and purpose but resembles it in method and effect. The country's ruling Serbian Progressive Party, known in Serbia by its acronym SNS, assembled its digital infrastructure around 2017, centered on a system at the domain www.castle.rs, whose IP address BIRN, the Balkan Investigative Reporting Network, traced to the same quarter of Belgrade as the party's headquarters, leased through the state-owned telecommunications company Telekom Srbija. Gaining undercover access during 2019, BIRN journalists documented a managed network of real people – party members, state enterprise employees, teachers, and municipal workers – assigned to run fake accounts and given daily tasks via Viber and a dedicated application. The hacker known as Robin Xud, who assisted BIRN's investigation, estimated that more than 1,500 people were operating fake accounts daily during normal working hours, billing the Serbian taxpayer for their activities. Instructions specified precisely which opponent to attack, on which platform, and with which accusation.

The scale of the operation has been confirmed through successive platform enforcement actions. In April 2020, Twitter removed 8,558 accounts identified as engaged in coordinated inauthentic activity promoting the SNS. The Stanford Internet Observatory found these accounts had generated millions of tweets, mostly simple retweets, consistent with artificial amplification. In the fourth quarter of 2022, Meta dismantled thousands of Facebook accounts and hundreds of Instagram accounts connected to SNS party members. In July 2023, a spreadsheet listing over 14,500 individuals behind SNS-linked accounts circulated publicly. The ruling party's response was a campaign with the slogan “Yes, I am an SNS bot”, a calculated gamble that normalizing the practice would defuse its reputational damage.

The mechanism through which the SNS network amplifies its message follows a consistent pattern: a narrative is placed in one of several pro-government tabloid newspapers; bot accounts retweet and comment, creating the appearance of popular resonance; that synthetic resonance is then cited back by tabloids and TV stations as evidence of authentic public opinion. BIRN documented how tweets from accounts later removed by Twitter had been embedded as quotations in pro-government media, presented as the voice of the people. Some of those accounts were also picked up by Russian media: following Putin's visit to Belgrade in January 2019, the Russian outlet fontanka.ru cited a tweet from a subsequently deleted SNS bot account as evidence of a warm popular reception – an illustration of how Serbian bot output and Russian media amplification can reinforce each other without formal coordination.

The systematic degradation of the information environment as a political strategy

The structural differences between the two models are nonetheless noticeable. Russia's operations are designed for geopolitical projection: they target dozens of countries simultaneously and serve the ambitions of a state that imagines itself a resurgent great power. Serbia's model is domestic in its primary function, existing to produce an aura of infallibility around a single political actor and to shrink the space available to his opponents. Where Russian operations are run by companies with Kremlin contracts, Serbia's network is embedded in the party machinery itself, mirroring its municipal hierarchy and operated by ordinary citizens compensated for compliance. Russia employs professional operators; Serbia co-opts state employees.

What unites the two models in a wider sense is a shared epistemological framework – the systematic degradation of the information environment as a political strategy. Russia does it to weaken Western democracies and project influence globally; Serbia does it to keep a single party in power indefinitely. The methods overlap because the intermediate goal is the same: to render citizens uncertain of what is true, distrustful of independent journalism, and exhausted by the effort of distinguishing authentic from manufactured opinion. In the end, the two systems amplify each other because their narratives are complementary: Russian bots support the nationalist claims of their SNS counterparts, and SNS bots often glorify Russia as Serbia's ally.
