Political Communication in Russia, Eastern Europe and Eurasia (PolComm in REEE) is a seminar series organized by the Chair of Political Communication. Talks are held approximately once a month. If you are interested in joining the events either as a speaker or a member of the audience, please get in touch with Anna Ryzhova.
Nowadays, disinformation and propaganda campaigns are frequently carried out on social media with the aim of misleading and manipulating the public. To be successful, each campaign must spread to and "infect" a large number of users. This often requires large coordinated efforts, as well as the use of bot armies, for the campaign to achieve significant outreach, exert influence, and have an impact.
In this talk, we will provide a gentle overview of the state of the art in the computational detection of propaganda campaigns, social bots and – more generally – coordinated inauthentic behavior. We will show the advantages of focusing on coordination and group behavior, rather than on the behavior and characteristics of individual social network accounts, as was common in earlier work. We will also briefly survey the most promising techniques for detecting propaganda and social bots and for studying online coordination, highlighting and discussing the tell-tale signs of deception and manipulation.
What is the US government doing to conspire against its citizens and other well-meaning people around the world? What happened on 9/11? Why is the US interested in spreading LGBT propaganda in Russia? What does the world look like according to the famous conspiracy theorist Jesse Ventura?
This paper is dedicated to RT’s most overtly conspiratorial output: the shows ‘The Truthseeker’ and ‘The World According to Jesse Ventura’. These shows were explicitly designed to seek out facts that established institutions and power structures have allegedly sought to cover up. The two programmes under investigation date from the network’s inception and its present-day programming, respectively. My analysis reveals an evolution over time in the representational strategies used to convey conspiracy theories on RT. I provide a framework for understanding how conspiracy theories have operated since 2010, when RT launched its broadcasting in the US, and I explore how these theories are being applied to seek the support of various subnational communities inside the US.
Totalitarian regimes are commonly described as exceptional states ruled by extreme heroes/villains and sustained by outrageous actions. While this may be partially true, it is hard to believe that all relevant social systems, actors and practices in the Soviet Union, Nazi Germany or Fascist Italy were extraordinary. Drawing upon conversation analysis and ethnomethodology, the talk argues that such seemingly unusual “totalitarian” activities as followers’ incessant applauding of their leaders can be rooted in ordinary communicative routines performed by the general public on a daily basis.
Search engines increasingly reshape how people consume news around the world and, in the context of elections, how people inform themselves about their choice of candidate. Despite search engines’ political salience, their role as mediators of foreign influence has so far been neglected in the literature. In this article, we examine how much influence Russian ruling elites had on Belarusians searching for information on the 2020 Belarusian presidential election and its candidates on Google. Using data from over 50,000 search results collected from google.by over four months (two months before and two months after the election), we find that Google featured a substantial number of Russian state-aligned websites in its search results and that these results differed significantly in their content from non-Russia-aligned websites. We argue that Russian ruling elites, via Kremlin-aligned media, successfully used Google’s algorithms to deliver their messages to target audiences in Belarus, potentially intervening in the election.
Technological corporations and the digital affordances they provide play an increasingly important role in the production and dissemination of news. Their influence ranges from legacy media’s growing reliance on algorithmic recommender systems that perform gatekeeping functions to large-scale investments in the development of innovative digital journalistic tools (e.g., by Google). But what does this relationship look like in Russia? What is the relevance of the political context in which such technological shifts take place, and what consequences may the influence of technological companies have for the online news landscape in partially free media systems?
The paper investigates the relationship between Russian media outlets and ‘Russian Big Tech’ (Yandex) and analyses how the AI-driven personalised content distribution platform Yandex.Zen shapes news production and work routines in newsrooms. The paper argues that all major news aggregators and news recommendation services operate (largely) on the basis of commercial incentives, but because of their functionality they increasingly perform political functions by influencing the public sphere. This raises the question of how Yandex uses its political power, and how this compares to the ways in which, for example, Google and Facebook (seek to) shape media practices elsewhere.
Our analysis builds upon approximately 35 interviews with journalists, editors and IT professionals employed by a broad variety of Russian media outlets. The interviews are complemented by a review of available procedural documents on Yandex.Zen’s media prioritisation programme called ‘Nirvana’ (publications by its members are recommended to users more often but have to abide by the platform’s rules) and by computational testing of interviewees’ claims about how they believe the platform’s algorithm works.
In today's high-choice media environments, search engines play an integral role in informing individuals about societal developments. Using complex algorithms, they filter and rank textual and visual content to counter information overload and supply their users with the most relevant information. However, search engine outputs can also be subject to biases that amplify discrimination and facilitate the distribution of misinformation. Such biases are particularly worrisome in times of epistemic uncertainty, when authoritative sources of truth are challenged and alternative interpretations thrive.
In the lecture, we look at how different search engines perform their information gatekeeping functions in the context of the COVID-19 pandemic. Using a virtual agent-based approach, we conduct a large-scale algorithmic audit of the largest Western (e.g., Google and Bing) and non-Western (e.g., Yandex and Baidu) search engines to analyse how they present information about COVID-19 via text and image search results. We then introduce different techniques for investigating the audit results, ranging from mathematical modelling to framing analysis, and discuss how search outputs can influence public perceptions of the pandemic.
In times of war and violent conflict, when media coverage abounds with strategically crafted governmental narratives and contradictory news, media audiences face complex epistemological challenges. Aware of propagandistic efforts and media bias (Szostek, 2018), active audiences must navigate informational sources and carefully invest their trust. This is especially true in the diverse information environment of a hybrid political conflict, where competing narratives often justify their cause by creating rigid and exclusive collective identities (van Dijk, 2006).
This study uses the context of the Russian-Ukrainian conflict, marked by pervasive governmental efforts to advance an ideological agenda (Khaldarova, 2019), to investigate the different sense-making strategies that audiences employ to deal with propaganda narratives. The study uses an innovative combination of serial focus groups and individual in-depth interviews with media audiences in Eastern Ukraine to elicit their interpretations of the ongoing conflict. It performs a discourse analysis of the focus group and interview transcripts, augmented by micro-level conversation analysis, to track the changes that meaning undergoes from individual interpretation to group discussion and back to individual opinion. In particular, it aims to reconstruct how citizens cope with contradictory media accounts and arrive at confident knowledge in a contaminated information environment.
Dr Holger Mölder is an Associate Professor in International Relations at the Tallinn University of Technology whose academic work focuses on various international security issues. He holds a PhD in Political Science from the University of Tartu and an MA in International Security and Civil-Military Relations from the US Naval Postgraduate School. Previously, he worked for nearly 20 years at the Estonian Ministry of Defense and the Estonian Military Academy. His main research interests and recent academic publications cover cooperative security issues, political cultures, influence and information operations, conspiracy theories, and psychological warfare.
In her presentation, Elizaveta will talk about her new research project, which analyses and explains Russia’s efforts to sway public opinion in the US and Germany on social media. Using a case-study-based research design, her work will study RT’s (formerly Russia Today) audio-visual content disseminated on Facebook and YouTube in the context of the upcoming 2020 US Presidential Election and the 2021 German Federal Election.