How Algorithms Fuel Polarization and Conflict

22/05/2025, 17:34

In a world where billions of people use social media, the algorithms built and operated by big tech companies have become a powerful tool, one that matters both from a business standpoint and in terms of its impact on society.
Algorithms decide which products, services, and content you see, drawing on your preferences, likes, comments, and views. Each user is both an object of algorithmic influence and a subject of study.
Social media giants want to maximize our engagement: to get us to comment, like, share, and consume more content. The more time we spend on these platforms and the more actions we take, the more engagement grows, and engagement is what the platforms monetize.
But there is another side to this. Because algorithms choose what to show us based on our past behavior, a bubble gradually forms around us over time. We find ourselves inside an invisible algorithmic bubble.
In practice, this means that if you are interested in cars, the algorithms will show you a great deal of content about cars, while food lovers will get plenty of gastronomic content. The same approach applies to political content: conservatives see mostly conservative content, liberals see mostly liberal content, and followers of a particular religion see content tailored solely to their interests. In the long term, this contributes to the polarization and radicalization of both individual people and society as a whole.
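To make this feedback loop concrete, here is a minimal, purely illustrative Python sketch of a preference-weighted recommender. The topics, weights, and reinforcement rule are assumptions chosen for illustration only; real platform ranking systems are far more complex and are not public.

    import random

    # Illustrative topics; real platforms rank millions of items, not a handful of labels.
    TOPICS = ["cars", "food", "conservative politics", "liberal politics", "religion"]

    def recommend(weights, n_items=10):
        # Pick items in proportion to the user's current interest weights.
        return random.choices(TOPICS, weights=[weights[t] for t in TOPICS], k=n_items)

    def update_weights(weights, shown, boost=1.2):
        # Reinforce whatever the user was shown and engaged with (assumed update rule).
        for topic in shown:
            weights[topic] *= boost
        total = sum(weights.values())
        return {t: w / total for t, w in weights.items()}

    # Start with only a mild preference for one topic.
    weights = {t: 1.0 for t in TOPICS}
    weights["cars"] = 1.5

    for day in range(30):
        shown = recommend(weights)
        weights = update_weights(weights, shown)

    # After a few simulated weeks, the feed is usually dominated by the initial preference.
    print({t: round(w, 3) for t, w in weights.items()})

Even with this toy update rule, the initially stronger interest tends to crowd out everything else within a few dozen iterations, which is the bubble effect described above.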
A person lives in an ideological bubble created by algorithms and has almost no contact with alternative opinions or assessments. Over time, they begin to believe that their views are the only correct ones and assume that most other people share them, because the algorithms feed them matching content around the clock. When such a person encounters the real world, where people disagree with or even hold views opposed to their own, the result in most cases is misunderstanding, misperception, and ultimately radicalization and aggression.

Because billions of people use social media and consume content there, they are all exposed to the influence of algorithms. If algorithms continue to operate as they do now, they will, in the long run, contribute even more to the polarization of societies. This is especially noticeable in the US and a number of other developed countries, where high-speed Internet penetration and smartphone use are very high.

Since 2008, in parallel with Public Relations, I have also worked on Information Warfare issues, and in my observation malicious actors have exploited this feature of the algorithms from day one to distribute their content. They understand that controversial, hostile content provokes many reactions, both positive and negative, and that against this background the algorithms will show it to many more people. This is another factor driving polarization.
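The same dynamic can be sketched for engagement-based ranking. The scoring function below is a hypothetical simplification, not any platform's actual formula; it simply treats every reaction, approving or hostile, as a signal to rank a post higher, which is enough to push the most divisive item to the top.

    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        likes: int
        angry_reactions: int
        comments: int
        shares: int

    def engagement_score(post: Post) -> int:
        # Hypothetical ranking signal: every interaction counts, regardless of sentiment.
        return post.likes + post.angry_reactions + post.comments + post.shares

    feed = [
        Post("Cute cat photo", likes=300, angry_reactions=2, comments=40, shares=25),
        Post("Neutral news summary", likes=120, angry_reactions=5, comments=30, shares=15),
        Post("Hostile, divisive claim", likes=150, angry_reactions=400, comments=500, shares=200),
    ]

    # The divisive post ranks first precisely because it provokes the most reactions.
    for post in sorted(feed, key=engagement_score, reverse=True):
        print(f"{engagement_score(post):5d}  {post.title}")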
There is a risk that if the major social media platforms do not reconsider how their algorithms work, it will at some point be too late to change or correct anything.
