Visuals of horrific violence and atrocity are among the most powerful drivers of mass-based mobilisation. In today's political environment, entrenched cognitive biases, political polarisation and filter bubbles, aided by the virality of social media, have made spreading lies easier than ever. Synthetic media such as deepfakes will blur the boundary between fake and real and lower public trust even in visual media. This will negatively affect mobilisation, since the veracity of any media could be contested. Hyper-realistic synthetic media like deepfakes will harm mobilisation in three distinct ways: lowering trust in visual media, damaging the credibility of news organisations and enabling autocrats to delegitimise their opponents.
Visual media and mass-based mobilisation
Visual media has the power of persuasion. People believe what they see more than what they read [1]. Images and videos increase the feeling of salience and importance [2]. Social protests, too, are driven by images and videos of brutality and violence [3]. Civic movements in the last decade have essentially been born out of the horror and rage that people shared [4] when they were exposed to visuals of state brutality on social media [5].
The role of visual media is not limited to arousing emotion alone. Imagery has motivating potential in political activism beyond the emotions it evokes [6]. It helps potential participants determine the efficacy of mobilisation [7]. The chances of participation in a protest increase when viewers see large numbers of people like themselves taking part in it.
The role of social media in the dissemination and reach of visuals cannot be overstated. The ubiquity and connectivity of social media ensure that visual media is shared quickly by a large number of people. The more emotionally arousing the imagery, the faster it spreads. In times of instability and political action, potential participants use this imagery to gauge their inclination to participate.
In the context of mass-based mobilisation, the reliance on visual media is based on trust. The viewer does not just draw their emotional response from visual media but also trusts the information from the visuals about the size of the protest. In short, visuals help mobilisation by invoking a feeling of trust and confidence about the individual's participation in collective action.
What is a deepfake?
Deepfakes are a kind of synthetic media. A deepfake is a photo, video or audio clip that has been manipulated using artificial intelligence (AI) to show a person saying or doing something that they have never said or done. What sets deepfakes apart from other synthetic media is that they are created by AI programs and produce images and videos that are virtually indistinguishable from authentic ones.
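A common technique behind face-swap deepfakes trains two autoencoders that share a single encoder: the encoder learns pose and expression common to both identities, while each identity gets its own decoder. Below is a minimal structural sketch in NumPy, showing only the data flow with untrained random weights and hypothetical dimensions, not a working model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions: 64x64 grayscale faces flattened to 4096,
# compressed to a 128-dimensional latent code. All sizes are illustrative.
FACE_DIM, LATENT_DIM = 64 * 64, 128

# One encoder shared between both identities captures pose/expression;
# each identity gets its own decoder that renders that person's face.
encoder = rng.normal(size=(LATENT_DIM, FACE_DIM)) * 0.01
decoder_a = rng.normal(size=(FACE_DIM, LATENT_DIM)) * 0.01  # reconstructs person A
decoder_b = rng.normal(size=(FACE_DIM, LATENT_DIM)) * 0.01  # reconstructs person B

def swap_face(face_of_a: np.ndarray) -> np.ndarray:
    """Encode a frame of person A, then decode it with B's decoder.

    After training, this step renders person B performing A's pose and
    expression -- the core of the face-swap trick.
    """
    latent = encoder @ face_of_a   # shared representation of pose/expression
    return decoder_b @ latent      # rendered with person B's appearance

frame = rng.normal(size=FACE_DIM)  # stand-in for a real video frame
fake = swap_face(frame)
print(fake.shape)                  # (4096,) -- same shape as the input frame
```

In a real system the two weight sets are trained on many frames of each person; the swap itself is just the decoder substitution shown above.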
Deepfakes gained attention in late 2017 when reports surfaced [8] of machine learning technology being used to create fake pornographic videos. Since then, popular media attention has made deepfakes better known. Creative artists have been using the technology to create videos [9] that are funny and impressive. At the same time, researchers [10] and media houses [11] have also created deepfakes to educate the public about the potential risks that a video, which looks real but is not, could pose.
So far, deepfakes have not been used for political purposes. However, this does not mean they cannot be in the future.
A large component of the threat posed by deepfakes comes from their commoditisation and scale. The technology used to create high-fidelity, near-real videos and images is available on open platforms like GitHub [12] and in mobile applications like FaceSwap and Zao [13], which anyone with a smartphone can download. Producing such videos costs little to nothing, while detecting them carries a high cost in resources and time.

A still from a deepfake video of President Trump (Source: BBC)
Mobilisation and deepfakes
Social movements face some key challenges. Having a critical mass of people who are informed, involved and actually participating is key for any social movement. One of the biggest challenges is collective action: the classic game-theory problem of individuals rationally choosing not to contribute while still reaping the benefits of social change.
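The free-rider logic can be made concrete with a toy payoff calculation. All numbers below (benefit, cost, turnout threshold) are illustrative assumptions, not figures from the text:

```python
# Toy free-rider model: a movement succeeds only if at least THRESHOLD
# people join; participation carries a personal cost, while the benefit
# of success goes to everyone, participant or not.
BENEFIT, COST, THRESHOLD = 10, 4, 50

def payoff(i_participate: bool, others_participating: int) -> int:
    """Payoff for one individual given how many others participate."""
    total = others_participating + (1 if i_participate else 0)
    benefit = BENEFIT if total >= THRESHOLD else 0
    return benefit - (COST if i_participate else 0)

# If enough others already protest, staying home strictly dominates:
print(payoff(False, 50), payoff(True, 50))   # 10 vs 6
# If too few others protest, joining alone is also irrational:
print(payoff(False, 10), payoff(True, 10))   # 0 vs -4
# Yet if everyone reasons this way, nobody joins and everyone gets 0 --
# which is why visuals signalling a large turnout matter so much.
```

The only case where joining pays is when an individual believes their participation is pivotal, which is precisely the belief that imagery of large crowds helps create.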
Visuals and imagery help address these challenges by mobilising potential participants. In repressive regimes where traditional mass media like television and newspapers might not be allowed to disseminate such visuals, this role is played by social media.
Social media also speeds up social mobilisation. Even among digital modes of communication, the mobilisation speed of social media is at least twice that of e-mail [14]. This means that even if one or more core nodes of communication are taken away, visual information can still reach the fringe and outer nodes of the network in reasonable time, in turn increasing mobilisation.
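The claim that information still reaches fringe nodes when core nodes are removed can be illustrated with a breadth-first reachability check. The graph below is an invented toy network, not data from the cited study:

```python
from collections import deque

# Hypothetical toy network: two hubs (H1, H2) bridge a source to fringe
# nodes f1-f3, with a few peer-to-peer ties among the fringe.
network = {
    "source": {"H1", "H2"},
    "H1": {"source", "f1", "f2"},
    "H2": {"source", "f2", "f3"},
    "f1": {"H1", "f2"},
    "f2": {"H1", "H2", "f1", "f3"},
    "f3": {"H2", "f2"},
}

def reachable(graph: dict, start: str, removed: set) -> set:
    """Breadth-first search that skips nodes censored out of the network."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node] - removed - seen:
            seen.add(nxt)
            queue.append(nxt)
    return seen - {start}

print(sorted(reachable(network, "source", set())))   # every node reachable
print(sorted(reachable(network, "source", {"H1"})))  # fringe still reached via H2
```

Even with hub H1 censored, the peer-to-peer ties route the message to f1, f2 and f3; only removing both hubs would cut the fringe off entirely.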
Social movements face another challenge: authoritarian leaders. In the last three decades, the most common cause of dictator departure in authoritarian regimes has been popular revolt [15] [16]. Authoritarian leaders therefore perceive citizen mobilisation and popular revolt as the biggest threat to their survival, and they redirect resources to curb civic movements.
Against this backdrop of threats and the impact of visual media, deepfakes can have a far-reaching negative impact on mass-based mobilisation: on trust in visual media, on the legitimacy of social movements and on the credibility of news.
1) Trust
Within the existing environment of disinformation and fake news, deepfakes have the unique power of lowering trust even in visual media like images and videos. Combine the persuasive power of hyper-realistic media with the distribution power of social media [17], and we have an explosive recipe for spreading lies and distrust.
This lack of trust will have a negative effect on mass-based mobilisation. When people can’t even believe what they see, they are less likely to believe the causes and realities of a social movement.
Imagine a video of a protestor being violently attacked and killed by the police or the army in a certain state. Now imagine a deepfake video with the face of the protestor pasted onto someone else, claiming that the attack never happened and they are safe. With both these videos available online, which one would people believe? If a march were organised to protest the killing, would people even participate? Such a scenario is not unlikely, and it would make it harder for moderate activists and apathetic moderates to mobilise [18].
2) Credibility
Deepfakes will have a chilling effect on the speed with which news organisations report on real-time disturbing events.
Much of the reporting on domestic and international events relies on visuals available to news organisations via local correspondents or via social media. Imagine a scenario in which a video of police firing at unarmed protestors is actually a deepfake. If a media house ran the video and it turned out to be fake, it would deeply hurt the credibility of that news organisation.
On the other hand, if the organisation fact-checked the visuals, as most news organisations today do, it could be late in reporting time-critical news. Deepfakes are sophisticated and of high fidelity; even a well-funded news organisation would need significant resources and time to fact-check such visuals.
3) Legitimacy
Autocratic leaders could easily use deepfake technology to delegitimise and discredit possible opponents and challengers [19]. Authoritarian regimes have access to a wide array of technology for censorship and surveillance. Such regimes, along with hybrid regimes, have co-opted social media to spread mis- and disinformation. These tools are used to spread misinformation about detractors, challengers and possible opponents.
Since autocrats see popular civil unrest as a major threat, they may well use deepfakes to delegitimise such movements. It is not hard to imagine an autocratic leader deploying a deepfake video or image that shows protestors as having a history of terrorism, or the leader of a popular civic movement as being paid by a foreign government.
Even in democratic countries with genuine grassroots movements that mobilise large numbers of people, deepfakes could be used by detractors and vested interests to delegitimise either the movement or its leaders.
Conclusion
Deepfakes should not be seen as an entirely new threat. In the broader context of disinformation today, they are another addition to an array of tools that lower trust and sow deep distrust in organisations and institutions. However, their sophistication, credibility and scale are what make them most worrying. When people cannot believe even what they see with their own eyes, they are unlikely to believe anything or anyone. This problem will be worse in countries and societies living under state repression, because even events that would otherwise have the potential to enable collective action could easily be called into question. How would people then ever know whether something is worth fighting for?
[1] D. Kim, M. G. Frank and S. T. Kim, "Emotional display behavior in different forms of Computer Mediated Communication," Computers in Human Behavior, vol. 30, pp. 222-229, 2014.
[2] S. J. O’Neill, M. Boykoff, S. Niemeyer and S. A. Day, "On the use of imagery for climate change engagement," Global Environmental Change, vol. 23, no. 2, pp. 413-421, 2013.
[3] A. Hermida and V. Hernández-Santaolalla, "Twitter and video activism as tools for counter-surveillance: the case of social protests in Spain," Information, Communication & Society, vol. 21, no. 3, pp. 416-433, 2018.
[4] H. N. Philip and M. M. Hussain, "The Upheavals in Egypt and Tunisia: The Role of Digital Media," Journal of Democracy, vol. 22, no. 3, pp. 35-48, 2011.
[5] J. D. Sutter, "The faces of Egypt’s “Revolution 2.0.”," 2011. [Online]. Available: http://www.cnn.com/2011/TECH/innovation/02/21/egypt.internet.revolution/index.html. [Accessed 1 December 2019].
[6] T. Kharroub and O. Bas, "Social media and protests: An examination of twitter images of the 2011 egyptian revolution," New Media & Society, vol. 18, no. 9, pp. 1973-1992, 2016.
[7] M. van Zomeren, R. Spears, A. H. Fischer and C. W. Leach, "Put your money where your mouth is! explaining collective action tendencies through group-based anger and group efficacy," Journal of Personality and Social Psychology, vol. 87, no. 5, pp. 649-664, 2004.
[8] S. Cole, "AI-Assisted Fake Porn Is Here and We’re All Fucked," December 2017. [Online]. Available: https://www.vice.com/en_us/article/gydydm/gal-gadot-fake-ai-porn. [Accessed 16 November 2019].
[9] Ctrl Shift Face, "Bill Hader channels Tom Cruise [DeepFake]," 2019. [Online]. Available: https://www.youtube.com/watch?v=VWrhRBb-1Ig.
[10] B. Warner, "Deepfake Video of Mark Zuckerberg Goes Viral on Eve of House A.I. Hearing," 12 June 2019. [Online]. Available: https://fortune.com/2019/06/12/deepfake-mark-zuckerberg/.
[11] BuzzFeedVideo, "You Won’t Believe What Obama Says In This Video!," 17 April 2018. [Online]. Available: https://www.youtube.com/watch?v=cQ54GDm1eL0.
[12] Deepfakes, "Deepfakes software for all," 2019. [Online]. Available: https://github.com/deepfakes/faceswap.
[13] G. Shao and E. Cheng, "The Chinese face-swapping app that went viral is taking the danger of ‘deepfake’ to the masses," 4 September 2019. [Online]. Available: https://www.cnbc.com/2019/09/04/chinese-face-swapping-app-zao-takes-dangers-of-deepfake-to-the-masses.html.
[14] J. Wang, S. Madnick, X. Li, J. Alstott and C. Velu, "Effect of media usage selection on social mobilization speed: Facebook vs e-mail," PLoS ONE, vol. 10, no. 9, 2015.
[15] B. Geddes, J. Wright and E. Frantz, "Autocratic Breakdown and Regime Transitions: A New Data Set," 2014.
[16] S. Feldstein, "How artificial intelligence is reshaping repression," Journal of Democracy, vol. 30, no. 1, pp. 40-52, 1 1 2019.
[17] R. Chesney and D. Citron, "Deep Fakes: A Looming Crisis for National Security, Democracy and Privacy," 2018. [Online]. Available: https://www.lawfareblog.com/deep-fakes-looming-crisis-national-security-democracy-and-privacy.
[18] S. Lohmann, "The Dynamics of Informational Cascades: The Monday Demonstrations in Leipzig, East Germany, 1989–91," World Politics, vol. 47, no. 1, pp. 42-101, 1994.
[19] A. Breuer, T. Landman and D. Farquhar, "Social media and protest mobilization: evidence from the Tunisian revolution," Democratization, vol. 22, no. 4, pp. 764-792, 2015.