Disinformation operations in cyberspace

In the communications age, in which social networks have taken on a particularly prominent role in Western societies, a security vulnerability even more intangible than cyberattacks has emerged: disinformation operations conducted in cyberspace in order to shape public opinion and the currents of thought existing in society.

However, despite its new propagation vector in the cyber domain (social networks), disinformation is not a new tool in conflicts. As early as the 5th century BC, the Chinese general Sun Tzu observed that “the art of war is deception” (Sun Tzu). Closer to our time, between 1933 and 1945, the full power of disinformation as a weapon of war could be seen in the figure of Joseph Goebbels, Minister for Public Enlightenment and Propaganda of the German Third Reich:

It is essential to demoralize the enemy nation, prepare it to capitulate, morally constrain it to passivity, even before planning any military action… We will not hesitate to foment revolutions on enemy land

(Rauschning, H., 1940).

The technological revolution has merely globalized and magnified (in scale, frequency and effectiveness) an already existing phenomenon.

However, what was once a tool used only by states has, thanks to its low cost and high returns, become the preferred instrument for eroding and weakening the internal cohesion of a state, used not only by foreign state actors but also by subnational groups.

According to data from a study carried out by the CCN-CERT, around “90% of the Spanish population between 16 and 65 years old can potentially be the victim of a disinformation attack” (CCN-CERT, 2019). If we add to this figure the philosopher Jürgen Habermas’s idea of the importance of rational deliberation for a stable democracy (Macnamara, 2016), we have the breeding ground needed to underline why the consequences that disinformation operations may have on a state’s public opinion must not go unnoticed.

In Spain, where 92% of the population between 16 and 65 years of age gets its news daily through the Internet and 85% does so through social networks, according to 2017 data from the National Observatory of Telecommunications and the Information Society, we are not exempt from the risks that emanate from a disinformation operation (ONTSI, 2018).

In this scenario it is obvious that disinformation is a phenomenon in vogue where security is concerned. However, it is necessary to specify what exactly we mean by disinformation and, more specifically, by disinformation operations.

We can define disinformation as the deliberate dissemination of false, manipulated or biased information for hostile purposes (De Pedro, 2019), and disinformation operations as the profit-seeking interference –whether economic or in terms of political influence– of a state or non-state actor with sufficient technical capabilities to spread misleading information with the intention of damaging the credibility or image of a target.

In this regard, special consideration must be given to the emergence of fake news: stories with the appearance of news, written with the aim of provoking an emotional reaction rather than transmitting information (Gómez de Ágreda, 2019).

Fake news does not necessarily have to be a lie: tendentious stories capable of getting content spread also qualify. A real news story with a deliberately exaggerated headline, designed to be shared in a misleading way, would likewise be classified as fake news.

Technically speaking, the phenomenon of disinformation unfolds in two phases. First comes the creation of the content, the more important of the two, since it requires a seemingly reliable and credible web portal capable of producing the content –such as the Russian portals RT (Russia Today) or Sputnik. Second comes its dissemination and amplification: publication on social networks (Twitter, Instagram, Facebook, etc.), spreading through private messaging services (WhatsApp, Telegram, Kik, etc.) and, ultimately, becoming visible through the main Internet search engines (Polyakova and Fried, 2019)[1].

According to a study conducted by the Massachusetts Institute of Technology, human psychology is especially predisposed to favor the spread of fake news. Through the study of more than 126,000 rumors published on Twitter, the authors verified that false news spread up to six times faster than true news (especially news related to political issues) and concluded that false news is shared on social networks 70% more often than true news (Vosoughi et al., 2018).
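The dynamics behind such findings can be illustrated with a minimal branching-process sketch. All parameters below (audience size, re-share probabilities) are hypothetical illustrations chosen for the example, not figures from the MIT study: each sharer exposes a fixed number of followers, and each exposed user re-shares with a probability that is modelled as slightly higher for false content.

```python
def expected_reach(share_prob, followers=5, steps=10):
    """Expected audience of a rumor in a toy branching model:
    each sharer exposes `followers` users, and each exposed user
    re-shares with probability `share_prob`."""
    reproduction = followers * share_prob  # expected new sharers per sharer
    reached, active = 1.0, 1.0             # start with the original poster
    for _ in range(steps):
        active *= reproduction             # sharers in the next round
        reached += active                  # cumulative users reached
    return reached

# Hypothetical probabilities: false content is modelled as "stickier",
# echoing the finding that it spreads faster and wider.
true_reach = expected_reach(share_prob=0.15)   # subcritical: cascade dies out
false_reach = expected_reach(share_prob=0.25)  # supercritical: keeps growing
```

Even a small difference in re-share probability pushes the cascade across the critical threshold (a reproduction rate of 1), which is one simple way to see how a modest psychological advantage can translate into disproportionate reach.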

This phenomenon is mainly due to three factors: first, the novelty and surprise of this type of content, which attracts the reader’s attention through striking headlines; second, the emotional involvement that this type of news seeks to elicit from the receiver[2]; and third, the fact that the average user attributes greater credibility to the messages he receives if the sender enjoys a certain prestige or trust.

Especially since, in the national media’s eagerness to stay at the forefront of real-time information, this disinformation is rarely fact-checked, and to a certain extent the media accept the “reality” presented by the disinformers.

Put another way: in addition to the fact that the average user engages more through intuition, and through the coherence he senses between the message and his own beliefs, than through having researched the subject, the media also apply no great fact-checking processes to the information they broadcast.

Once the message is validated by the receiver’s feelings and considered true, he will spread it among his contacts who, in turn, trusting the sender, will accept the validity of the news and thereby feed the fraudulent cycle of syndication of received content (Gómez de Ágreda, 2019)[3].

In addition, given that individuals behave differently when they take part in the story and actively participate in its elaboration and dissemination, disinformation operations that seek to influence the population rarely try to change what people think about a topic; rather, they aim for the individual to confirm his own beliefs by becoming part of the story.
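The mechanism described above – acceptance driven by trusted senders and confirmation of prior beliefs – can be sketched as a simple propagation model over a contact network. The network, the users and the message “slant” below are invented purely for illustration:

```python
from collections import deque

def spread_with_confirmation_bias(contacts, beliefs, slant, seed_user):
    """Toy model of the syndication cycle: a message arriving from a
    contact is accepted and re-shared only when its slant confirms the
    receiver's prior belief."""
    accepted, queue = set(), deque()
    if beliefs[seed_user] == slant:
        accepted.add(seed_user)
        queue.append(seed_user)
    while queue:
        sender = queue.popleft()
        for receiver in contacts.get(sender, []):
            if receiver not in accepted and beliefs[receiver] == slant:
                accepted.add(receiver)  # message confirms the prior belief
                queue.append(receiver)  # and is re-shared to their contacts
    return accepted

# Invented five-user network: "ana" seeds a message with slant "X".
contacts = {"ana": ["bea", "carl"], "bea": ["dan"], "carl": ["eva"]}
beliefs = {"ana": "X", "bea": "X", "carl": "Y", "dan": "X", "eva": "X"}
reached = spread_with_confirmation_bias(contacts, beliefs, "X", "ana")
```

Here “carl” rejects the message because it contradicts his belief, so “eva” never receives it despite sharing the belief of the sharers: a miniature echo chamber in which content flows only along chains of like-minded, mutually trusting contacts.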

A close example of the potential of disinformation interference can be found in the dissemination of numerous pieces of fake news and outright false information among the Spanish population by various groups and states, with the aim of eroding Spanish –and therefore European– social stability through the promotion of the Catalan independence movement.

The Catalan question came into play for the Russian disinformation media as a result of Russia’s annexation of Crimea in 2014: the Kremlin sought to legitimize the annexation of the Ukrainian peninsula, and Catalonia became the perfect excuse to remind the European Union that European independence movements could be its Achilles heel[4].

However, this is not an isolated event in recent European history. The war in Ukraine has allowed the European Union to fully confirm that, indeed, RT and Sputnik have been –and are– used as “instruments of disinformation” by Russia to gather social support for military aggression and as a means to undermine the social peace of the member states of the Union.

In the words of the High Representative for Foreign Policy, Josep Borrell: «The systematic manipulation of information by the Kremlin is applied as one more instrument in the assault on Ukraine. It poses a direct threat to the public order and security of the Union», (since RT and Sputnik are) «under permanent direct or indirect control of the Russian authorities»[5].

Public acceptance of this reality has been manifested through the adoption, for the first time in the history of the European Union, of specific measures to censor the propaganda content broadcast by RT and Sputnik (ABC, 2021).

As we noted at the beginning of this Focus, information manipulation is nothing really new. After the rise of the Internet and social networks, however, it has become one of the great challenges for Western democracies, since among their main pillars are free access to information and the open, plural nature of their societies. The absence of borders on the Internet and in social networks makes hostile external interference by state and non-state actors easily accessible to public opinion, turning it into a potential strategic vulnerability (De Pedro, 2019).

The Internet and social networks have revolutionized the way public opinion is informed: free, convenient and easy access to information explains why these new channels have begun to displace the traditional media. However, all that glitters is not gold: these platforms have also brought new risks and threats to society, among them a considerable boost to disinformation operations.

In general, public opinion in democratic states is especially permeable and vulnerable to this type of interference: on the one hand, disinformation exploits the biases and contradictions within each individual and stresses society’s general malaise around a series of key cleavages; on the other, public opinion is capable of exerting great relevance and force on the power structures of a democratic state.

The response that we, as free societies, can give to these disinformation phenomena begins –beyond state censorship measures– with our own awareness as free citizens, since only the individual can impose on himself good informational habits: comparing and contrasting information, not disseminating content whose veracity we cannot confirm, taking time for emotional reflection, or studying a topic in depth before forming an opinion on it.

Our present is evolving at a dizzying pace; soon, when practices such as deepfakes[6] become widespread, we will have to distrust even what our senses perceive. We must therefore begin our informational education as soon as possible. This Focus has been conceived with the aim of making us aware, as a society, of our shortcomings in this regard, of the risks they entail and of the need to educate ourselves as individuals in order to build a society less vulnerable to fake news and disinformation campaigns. The war of information and disinformation is waged within each individual.


  • Alandete, D., (2019), “World War on the Internet: how disinformation aggravated the crisis of Catalan independence”, collected in “#Disinformation Power and manipulation in the digital age”, coord. Manuel R. Torres, Institute of Security and Culture, Editorial Comares.
  • National Cryptologic Center, (February, 2019), “Disinformation in cyberspace”, CCN-CERT BP/13, CCN-CERT.
  • De Pedro, N., (2019), “Russian disinformation against the European Union”, collected in “#Disinformation Power and manipulation in the digital age”, coord. Manuel R. Torres, Institute of Security and Culture, Editorial Comares.
  • Gómez de Ágreda, A., (2019), “Orwell’s World”, Editorial Ariel.
  • Macnamara, J., (2016), “Organizational Listening”, New York, Peter Lang. P.10.
  • Rauschning, H., (1940), “Hitler Told Me”, Editorial Atlas.
  • Sun Tzu, (5th century BC), “The Art of War”, Editorial Fontana.
  • Torres Soriano, M., (2019), “#Disinformation Power and manipulation in the digital age”, Institute of Security and Culture, Comares editorial.
  • National Telecommunications and IS Observatory, (2018), “Sociodemographic profile of Internet users”. INE 2017 data analysis.
  • Vosoughi, S.; Roy, D. and Aral, S., (2018), “The spread of true and false news online”, Science, vol. 359, no. 6380, pp. 1146-1151.


[1] In addition, it is worth highlighting the use of bots –accounts managed remotely by an algorithm rather than by a natural person– as a means of large-scale content distribution.

[2] News items that express surprise or anger are those that achieve the greatest virality on social networks (Alandete, 2019).

[3] This phenomenon is what is known as an “echo chamber”: the user is offered information of a similar nature to that which he himself produces. The main virtual platforms are designed following this pattern, filtering information according to user preferences and motivations (Torres Soriano, 2019).

[4] The coverage by RT and Sputnik, aided by WikiLeaks founder Julian Assange, is the perfect example of a disinformation campaign.

[5] In this way, the member states of the Union have finally internalized the statements of RT’s own founder, who in the past argued that the outlets he presided over were designed to wage media and digital warfare (Alandete, 2019).

[6] This term refers to technology capable of producing computer-manipulated videos in which people’s faces do not match their bodies or their statements. An example can be seen at the following link: https://www.youtube.com/watch?v=8-Jcg4I4vB0
