The Social Bot Phenomenon – The Road to Bot-mocracy?


#WarAgainstDemocrats. This hashtag spread widely across social networks during the 2016 US Presidential election. Posted about 1,700 times, it looked as if a determined and outraged group of Republican voters was trying to denigrate Hillary Clinton so that Donald Trump would win the election. Later, however, the storm of messages using this hashtag was identified as the work of Russian social bots. This raises major concerns about the stability and safety of democracies and political systems. How many social bots are out there spreading politically motivated messages? Would Trump have won without bot support? How far does the influence of foreign as well as national interest groups on democratic will formation go? Frankly, most of these questions cannot be answered with certainty. But what we do know for sure is that social bots are among us – and that they are here to stay. Yet while most of us have heard the term “social bot”, not everyone is familiar with what they are and what they are used for. Given the growing impact of bot activity on the foundations of our society, this blog post aims to give a better understanding of the social bot phenomenon while also addressing its hazardous implications for democracy.


What’s a bot?

The word bot is an abbreviation of “robot”, which itself originated from a Czech term meaning “slave” or “forced labor”.[1] The addition “social” points to its bi-disciplinary background: technology and social science. A social bot may be defined as “a computer algorithm that automatically produces content and interacts with humans on social media, trying to emulate and possibly alter their behavior.”[2] Thus, the three main elements that constitute a social bot are: a) an algorithm, implemented in software, b) the imitation of a real human, and c) the aim to influence opinion.[3]

However, social bots must be distinguished from so-called chat or assistant bots. Those are often used by companies in customer service as help-desk tools or automatic telephone answering systems, or as aids in education, business and e-commerce.[4]


How social bots work

Social bots operate on a “stimulus-reaction” mechanism: they detect certain phrases, hashtags or keywords and react to them with pre-defined actions.[5] Most widespread are social bots with a limited functional spectrum whose main task is to repeat certain messages. More sophisticated bots are programmed to automatically scan social media sites and search engines like Google for certain hashtags and keywords, which they then retweet or reply to, linking websites that contain those keywords in tweets or automatic replies to other users. They can also follow back and find accounts that are somehow connected to the specific keywords. These actions can be executed by social bots individually or collectively as a “botnet”. Moreover, the more complex a bot's programming, the more deceptive features it has, such as personal information in its profile, a profile picture, and long friend or follower lists.[6] Hence, identifying social media accounts as social bots poses a crucial problem. Even though bot-detection programs such as “Botometer” exist, it is difficult to detect bots and their operators in an error-free manner. In addition, the expected rapid technological development – especially in the field of Artificial Intelligence – will pave the way to new, more sophisticated functions and appearances of social bots, which further complicates their identification.
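The stimulus-reaction mechanism described above can be sketched in a few lines of Python. This is a deliberately minimal illustration, not working bot software: the trigger rules and the `react` function are invented for this example, and a real bot would additionally need access to a platform's API to post its reactions.

```python
import re

# Hypothetical stimulus-reaction rules: trigger keyword -> pre-defined action.
# Any post containing the keyword (the "stimulus") fires the action.
RULES = {
    "#waragainstdemocrats": "retweet",
    "second referendum": "reply:canned-support-message",
}

def react(post: str) -> list[str]:
    """Return the pre-defined actions triggered by a post."""
    actions = []
    for keyword, action in RULES.items():
        # Case-insensitive matching of the keyword or hashtag.
        if re.search(re.escape(keyword), post, re.IGNORECASE):
            actions.append(action)
    return actions

# A simple bot loop would poll a timeline and apply react() to each new post.
```

Even a rule table this crude is enough for the "limited functional spectrum" bots mentioned above, whose only job is to amplify matching messages.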


It’s a bot’s world

The social bots we know today operate as simple social media accounts, predominantly on Twitter, where approximately 9 to 15 % of accounts are estimated to be social bots, but also on other social media sites like Facebook.[7] Social bots are indeed very present on social media: an estimated 40 % of overall online traffic in 2018 was generated by bots rather than human users. One contributing factor is that it is relatively easy to build a social bot without profound IT knowledge or coding skills. Several tech blogs post tutorials on how to create them, and the necessary software tools are free to download. The bot phenomenon has thus created a new market: so-called BaaS (Bot-as-a-Service) companies like RoboLike or Legacy Pandorabots API offer their customers ready-made bots for Twitter, Facebook and Instagram, customizable according to personal wishes, at prices starting from 7.77 US dollars per month.

Against this background, it is crucial to look into the implications of social bots for politics and democracy. Social bots are used in a strategy called “digital astroturfing”, defined as “a form of manufactured, deceptive and strategic top-down activity on the internet initiated by political actors that mimics bottom-up activity by autonomous individuals”.[8] In other words, social bots create masses of content in the form of posts, retweets, likes and follows that express support for or disapproval of a certain person or topic while pretending to come from regular human voters. In this way they generate trending topics, which ultimately distorts public discourse and may even sow doubts about a political viewpoint and alter opinions.


The road to Bot-mocracy?

The use of social bots as a propaganda instrument came to prominence in the course of the 2016 US Presidential election but is certainly not limited to it. Research reveals a worldwide deployment of social bots as political propaganda instruments. For instance, during the 2016 US election, 19 % of all tweets in the overall political conversation were posted by bots.[9] In Great Britain, bots ‘hijacked’ the Brexit debate by adding 77,000 signatures to an online petition calling for a second referendum.[10] In France, bots were involved in the #MacronLeaks campaign,[11] and the debate on the UN Migration Pact was also supposedly infiltrated by bots.[12]

From a social science perspective, the use of social bots may pose an actual threat to our democratic structures. According to the “Spiral of Silence” theory (Noelle-Neumann), the human fear of isolation plays an important role: individuals observe the majority opinion of their environment, and if their own opinion differs from the one they perceive as being shared by the majority, they are less likely to openly voice it and even tend to join the majority opinion.[13] A German study examined a communication network and confirmed that this phenomenon also applies in online scenarios. The researchers found that a bot participation of just 2 – 4 % within a communication network can be sufficient to tip the opinion climate, and that the opinion disseminated by the social bots was also perceived as the prevailing one.[14] The findings thus provide evidence that social bots are a possible mechanism for manipulating opinion formation online.
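The tipping dynamic can be made tangible with a toy simulation. This is a deliberately simplified, deterministic sketch and not a reproduction of the study's agent-based model: the population sizes, the strict-majority silence rule, and the fixed number of rounds are all illustrative assumptions. Each round, humans voice their opinion only if it looked like the majority among everything expressed in the previous round, while bots always post.

```python
from collections import Counter

def perceived_majority(humans, n_bots, bot_opinion="B", rounds=5):
    """Crude spiral-of-silence simulation.

    humans: list of true opinions ("A" or "B"), one per human agent.
    Bots always express bot_opinion; humans fall silent when their
    opinion was not among the most-expressed in the previous round.
    Returns the opinion that dominates the final round of messages.
    """
    speaking = list(humans)  # initially, every human speaks up
    for _ in range(rounds):
        counts = Counter(speaking)
        counts[bot_opinion] += n_bots        # bots flood the channel
        top = max(counts.values())
        # Humans whose opinion looks like the minority fall silent.
        speaking = [op for op in humans if counts[op] == top]
    counts = Counter(speaking)
    counts[bot_opinion] += n_bots
    return counts.most_common(1)[0][0]
```

With 990 human agents split 500 to 490 in favor of opinion A, the model without bots settles on A as the perceived majority; adding just 20 bots (about 2 % of all accounts) is enough to silence the A camp and flip the perceived climate to B, echoing the order of magnitude of the study's 2 – 4 % finding.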

Like other countries, Germany is not immune to bot invasion. Even though all political parties in Germany had pledged to refrain from using bots as propaganda instruments, studies revealed bot activity across all political parties during the 2017 federal election.[15] Furthermore, just as in the US election,[16] activity of foreign – mostly Russian – bots was also tracked in Germany.[17]



This makes us reflect on our understanding of a functioning democracy in the age of AI and on the importance of social networks in elections and propaganda.

Social bots may have a dangerous effect on the formation of political will, and the scenario in which political parties or even foreign state actors use these bots on a large scale to win voters and gain political power undermines our democracy and the concept of “the will of the people”. Even though a labelling obligation for automated accounts has recently been introduced, its implementation seems rather complicated.[18] In the face of the continuing existence of social bots, it is all the more important to be aware of their use and their capabilities.


[1] Woolley, S., Howard, P. (2016). Political Communication, Computational Propaganda, and Autonomous Agents. International Journal of Communication, Vol. 10, pp. 4882-4890.

[2] Ferrara, E., Varol, O., Menczer, F., Flammini, A. (2016). The Rise of Social Bots. Communications of the ACM, Vol. 59 (7), pp. 96-104.

[3] Kind, S., Jetzke, T., Weide, S., Ehrenberg-Silies, S., Bovenschulte, M. (2017). Social Bots. TA-Vorstudie. Horizon-Scanning Nr. 3.

[4] Abu Shawar, B., Atwell, E. (2007). Chatbots: Are they Really Useful? LDV-Forum, Vol. 22, pp. 29-49.

[5] Kind, S., Jetzke, T., Weide, S., Ehrenberg-Silies, S., Bovenschulte, M. (2017). Social Bots. TA-Vorstudie. Horizon-Scanning Nr. 3.

[6] Ibid.

[7] Howard, P., Woolley, S., Calo, R. (2018). Algorithms, bots, and political communication in the US 2016 election: The challenge of automated political communication for election law and administration. Journal of Information Technology & Politics, Vol. 15 (2), pp. 81-93.

[8] Kovic, M., Rauchfleisch, A., Sele, M., Caspar, C. (2018). Digital astroturfing in politics: Definition, typology, and countermeasures. Studies in Communication Science, Vol. 18 (1), pp. 69-85.

[9] Bessi, A., Ferrara, E. (2016). Social bots distort the 2016 U.S. Presidential election online discussion. First Monday, Vol. 21 (11).

[10] BBC News (2016). EU referendum hijacked by bots. 27.06.2016.

[11] Ferrara, E. (2017). Disinformation and Social Bot Operations in the Run up to the 2017 French Presidential Election. First Monday, Vol. 22 (8).

[12] Tagesschau (2018). Mit Social Bots gegen den Migrationspakt, 11.12.2018.

[13] Ross, B., Pilz, L., Cabrera, B., Brachten, F., Neubaum, G., Stieglitz, S. (2019). Are social bots a real threat? An agent-based model of the spiral of silence to analyse the impact of manipulative actors in social networks. European Journal of Information Systems, Vol. 28 (4), pp. 394-412.

[14] Ibid.

[15] Neudert, L.-M., Kollanyi, B., Howard, P. (2017). Junk News and Bots during the German Parliamentary Election: What are German Voters Sharing over Twitter? Project in Computational Propaganda Data Memo 2017.7.; Keller, T., Klinger, U. (2019). Social Bots in Election Campaigns: Theoretical, Empirical, and Methodical Implications. Political Communication, Vol. 36 (1), pp. 171-189.

[16] Shane, S. (2017). The Fake Americans Russia Created to Influence the Election. The New York Times. 07.09.2017.

[17] Digital Forensic Research Lab (2017). #ElectionWatch: Final Hours Fake News Hype in Germany. 24.09.2017.

[18] Schröder, M. (2018). Rahmenbedingungen der staatlichen Regulierung von Social Bots. Deutsches Verwaltungsblatt, pp. 465-536.
