After checking the hashtag distribution per country, we noted that this campaign mostly targeted audiences in the United States, the U.K. and India. This does not come as a surprise: English-speaking countries, especially the United States, are usually the main targets for this type of propaganda, in which the actor attempts to inflate its moral high ground on specific topics through digital influence.
Lastly, we monitored the popularity of the #HERO hashtag from the end of December 2020 up to 3 January 2021 and observed the expected increase in its distribution.
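The kind of monitoring described above can be sketched as a simple daily frequency count. This is a minimal illustration, not the authors' actual tooling: the sample tweets below are hypothetical placeholders, and real data would come from the Twitter API or a scraped dataset.

```python
from collections import Counter
from datetime import date

# Hypothetical sample of collected tweets: (posting date, hashtags used).
# In practice this list would be populated from the Twitter API or a scrape.
tweets = [
    (date(2020, 12, 28), ["#HERO"]),
    (date(2020, 12, 29), ["#HERO", "#Soleimani"]),
    (date(2021, 1, 2), ["#HERO"]),
    (date(2021, 1, 2), ["#HERO"]),
    (date(2021, 1, 3), ["#HERO"]),
]

def daily_hashtag_counts(tweets, hashtag):
    """Count how many tweets per day contain the given hashtag."""
    counts = Counter()
    for day, tags in tweets:
        if hashtag in tags:
            counts[day] += 1
    # Return the counts in chronological order for easy plotting.
    return dict(sorted(counts.items()))

counts = daily_hashtag_counts(tweets, "#HERO")
for day, n in counts.items():
    print(day, n)
```

Plotting such a daily series side by side for late December and early January is what makes a sudden burst in hashtag usage, like the one shown below, visible at a glance.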
End of December 2020:
Beginning of January 2021:
As mentioned before, this is a radically different approach from the one employed by Russia: rather than aiming to create havoc through entirely false information, Iran uses propaganda-type actions not only to influence international actors but also to keep its domestic affairs in check. The dual aim of its online campaigns stems from the belief that the state can only survive through informational dominance, both on the domestic level, by ferociously censoring opposing views and controlling the Internet, and on the international stage, by using information-war tactics as a continuation of its public diplomacy. Although these tactics have been used more intensively since US pressure and international sanctions escalated, information warfare as a covert alternative to military action has been a priority of the Iranian state since the Islamic Revolution of 1978-1979. This priority is rooted in the state's permanent fight to be recognised simultaneously as a religious centre, an anti-imperialist and anti-colonialist warrior, and a victim of the US. Taking all of this into account, it is clear that the motive behind most of Iran's online actions is to promote pro-Iranian talking points by hijacking the political conversation. Even though these campaigns do not employ disinformation as their main tactic, one cannot ignore the danger that this type of propaganda poses to factual discourse through its use of distorted truth and exaggerated rhetoric.
On the other hand, it has to be pointed out that Iran does not entirely disregard Russian-style campaigns. For example, creating division within the US in particular and further polarising an already divided society can also be observed as outcomes of an action mentioned in the Atlantic Council report, outcomes more typical of Russian than of Iranian strategy. In that case, Iranian-affiliated entities (as discovered by Facebook in 2019) established a page named "BLMNews.com" (connected with the Black Lives Matter movement) with the objective of 'feeding' manipulated information to progressive activists and promoting Iranian interests. The secondary effect, however, was most probably a deepening of the right-left divide, adding fuel to the fire.
Another important aspect to take into account is that Iran seems to learn information-warfare tactics from 'elders' like Russia and China. The creation and development of troll and bot armies is a key step in conducting efficient propaganda campaigns. In the case of Iran, however, there seems to be a lack of hierarchy. Hierarchy is a crucial element of an effective cyber-propaganda army, as vertical leadership allows a more efficient division of the tasks and objectives that have to be achieved. For instance, Russia's Internet Research Agency – also known as 'Glavset' – is financed (and perhaps run) by the Russian oligarch Yevgeny Viktorovich Prigozhin, also known as 'Putin's chef'. Its workers are divided into one category of trolls focused on the production of memes (content designed to have an impact on the user) and another which concentrates on writing comments on other users' posts. Notably, according to Dawson and Innes (2019), these two branches of the Russian IRA have to maintain six Facebook accounts and ten Twitter accounts, post at least three times a day about the most recent news, and discuss the evolution of the social media groups where the malicious accounts spread the Kremlin's narratives. This kind of structure does not seem to exist in Iran's cyber armies.
Nevertheless, just like Russia, Iran seems to have developed its cyber armies based on swarm intelligence. According to Gerardo Beni and Jing Wang, swarm intelligence is the collective behaviour of decentralized, self-organized systems, natural or artificial. In state-sponsored disinformation campaigns, trolls are the human actors whose job is to give a personal touch to Tehran's narratives. Around them revolve hundreds or even thousands of bot accounts that disseminate those narratives in order to ensure they reach even the most informationally isolated corners of public opinion. In essence, they behave as technological actants (Bennett & Segerberg, 2013), namely networks of non-human agents.
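The hub-and-swarm dynamic described above can be illustrated with a toy simulation. This is purely a conceptual sketch under assumed parameters (the number of bots, the per-post amplification probability and the narrative labels are all invented for illustration), not a model of any observed campaign.

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

# A few human-run troll accounts seed the narratives (the "hub"),
# while a swarm of bot accounts amplifies them.
troll_posts = ["narrative_a", "narrative_b", "narrative_c"]
n_bots = 1000  # assumed swarm size, for illustration only

amplifications = {post: 0 for post in troll_posts}
for _ in range(n_bots):
    # Each bot decides independently which posts to amplify --
    # self-organized behaviour rather than top-down task assignment.
    for post in troll_posts:
        if random.random() < 0.6:  # assumed amplification chance per post
            amplifications[post] += 1

total = sum(amplifications.values())
print(amplifications, total)
```

The point of the sketch is the structural one made in the text: no bot receives an individual quota, yet the decentralized swarm still produces a large, roughly predictable volume of amplification around each seeded narrative.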
In addition, while Russia also aims to create a dedicated Russian Internet – RUNET – where the 'malicious digital information' spread by the West can no longer infect the Russian population, Iran is far from pursuing such an objective. Rather, Tehran strictly wants to project its geopolitical interests through mechanisms of personalized communication, in this case the hashtag. As Bennett & Segerberg (2013) argue, mechanisms of personalized communication such as hashtags and mentions give the user the capability to precisely target an audience.
Iran is not to be ignored when it comes to digital influence. Although it is still at the beginning of the road in digital propaganda, and its effects are not yet felt at a level similar to Russia's, Iran has a capacity for rapid adaptation and, under pressure, it will increasingly focus on this tool of influence, both domestic and global. From our analysis, we noticed that Iran has its own distinctive approach to disinformation campaigns, but it also borrows strategies from Russia, operating in a hybrid propaganda model.
Surrounded by enemies, with China and Russia as its only strategic allies, Tehran will increasingly use digital weapons against the West to advance its interests of social destabilization, division and mistrust. What is certain is that we are moving towards a digital age in which the threats keep growing and the information we interact with becomes increasingly manipulated. A state-level approach against misinformation is needed: a programme to educate civil society, digital platforms that help limit and detect this phenomenon early, and the introduction of critical thinking into the school curriculum. The truth is becoming increasingly difficult to identify on the Internet, and improving the ability to distinguish false news is a matter of national security.
Authors: Intel4Patriam (Felix Staicu, Alexandra Ivan, Razvan Ceuca) Factide (Daniel Leu)