As members of the information society, we use many digital tools in our everyday lives. Digital tools process data by using algorithms to solve problems. At the same time, these algorithms collect information about their users. Algorithms can be defined as invisible technological processes designed to direct the user’s ways of experiencing and consuming content (Ptaszek, 2019). Algorithms are used, for example, in search engines, mobile applications, online news sites and social media. The information collected by algorithms is used by technology companies to examine user behaviour. This data is then used to make recommendations, target advertising, personalise content and develop new services (Gran et al., 2021). If a service is free to use, it is said that the user is the product, sold to advertisers for profit (Ptaszek, 2019). Algorithms are the reason we see online advertisements for products and services we have just searched for, and we may get the feeling that algorithms know too much about us because our online behaviour is being tracked.
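The mechanism described above, tracked behaviour shaping personalised content, can be illustrated with a toy sketch. This is an illustrative simplification, not how any real service works: a hypothetical platform counts the topics a user has clicked and ranks its catalogue by those counts.

```python
from collections import Counter

def recommend(click_history, catalogue, n=3):
    """Toy content-based recommender: rank catalogue items by how
    often their topic appears in the user's tracked click history."""
    topic_weights = Counter(item["topic"] for item in click_history)
    # Items on frequently clicked topics rank highest; unseen topics weigh 0.
    ranked = sorted(catalogue,
                    key=lambda item: topic_weights[item["topic"]],
                    reverse=True)
    return ranked[:n]

# A user who has mostly clicked sports content...
history = [{"topic": "sports"}, {"topic": "sports"}, {"topic": "news"}]
catalogue = [
    {"title": "Match highlights", "topic": "sports"},
    {"title": "Election analysis", "topic": "news"},
    {"title": "Baking basics", "topic": "cooking"},
]
print(recommend(history, catalogue, n=2))
# ...is shown mostly sports items: tracked behaviour becomes personalised content.
```

Real recommender systems combine far richer signals (dwell time, social graph, collaborative filtering), but the feedback principle is the same: what is shown depends on what was tracked.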

Mansell et al. (2019) note that algorithms build information bubbles around groups of people. These information bubbles make it difficult to see important, objective, fact-based information and complicate conscious decision-making. Algorithms affect search results and can make it harder to find accurate information. Algorithms may also be used to serve unethical goals and to spread disinformation. Not all users have the skills required to verify the accuracy of information and its origins. Educating users about algorithms is therefore essential.
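The bubble effect arises from a feedback loop: the service recommends what the user has clicked, and the user tends to click what is recommended. A toy simulation (purely illustrative, with assumed parameters such as a 90 % click-through rate) shows how initially broad interests narrow into a single-topic bubble:

```python
from collections import Counter
import random

def simulate_bubble(rounds=20, topics=("sports", "news", "cooking"), seed=0):
    """Toy feedback loop: the service always recommends the user's
    most-clicked topic, and the user usually clicks the recommendation,
    so the click history narrows over time into a 'bubble'."""
    rng = random.Random(seed)
    history = Counter({t: 1 for t in topics})  # start with equal, broad interests
    for _ in range(rounds):
        recommended = history.most_common(1)[0][0]
        # Assumption: the user clicks the recommended topic 90% of the time.
        clicked = recommended if rng.random() < 0.9 else rng.choice(topics)
        history[clicked] += 1
    return history

print(simulate_bubble())
# One topic ends up dominating the history, so the user sees
# less and less of everything else.
```

The parameters here are invented for illustration, but the qualitative outcome, self-reinforcing narrowing, is the dynamic Mansell et al. (2019) describe.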

Algorithmic awareness can be defined as a mental state that makes users take note of something being wrong when they use an algorithm-based service. This feeling gives people the drive to seek explanations about the use of algorithms. (Ptaszek, 2019) Becoming aware of an algorithm can lead the user to manipulate it through specific actions. Interacting with the algorithm by anticipating its results gives users the opportunity to improve their information channels and makes it easier to control what the algorithm sees. (Ptaszek, 2019) It is important to educate users about the advantages and disadvantages of algorithms, as well as about users’ own possibilities to teach algorithms and use them to their own advantage. Algorithm usage needs to be made more visible to the user, and algorithms need to be more transparent about what information is gathered and how data collection can be disabled.

Ptaszek (2019) states that the goals of algorithmic awareness are to strengthen user autonomy, to teach the critical use of digital media, and to foster understanding of how different social and technological systems affect and shape the information society. Becoming aware of the effects of algorithms is essential, because algorithms can be used to create tension and polarise public debate, and to promote violence and hatred towards specific groups.

Education about algorithms for all age groups is important, because it helps users become aware of the ways algorithms can manipulate and control them. Users need more information about how algorithms work. Algorithmic awareness concerns a wide range of people, and it is important that it is taught to everyone in digital societies. Algorithmic awareness can help people find significant information and strengthen their trust in democracy and their rights in the information society.


Gran, A-B., Booth, P. & Bucher, T. (2021). To be or not to be algorithm aware: a question of a new digital divide? Information, Communication & Society, 24(12), 1779–1796.

Mansell, R., Tambini, D. & Livingstone, S. (2019). Tackling the Information Crisis: A Policy Framework for Media System Resilience. LSE Truth, Trust & Technology Commission, The London School of Economics and Political Science. 

Ptaszek, G. (2019). From Algorithmic Surveillance to Algorithmic Awareness: Media Education as a Challenge. Academy of Fine Arts in Warsaw & Polish National Commission for UNESCO, Warsaw.