Episode 2: Algorithmic Trenches – Echo Chambers and Isolation

Cirjakovic Milos, 24/02/2026 (updated 09/03/2026)

The internet was theoretically supposed to broaden our horizons, but in practice it has pushed us to dig deeper into our own beliefs. In the previous article, we covered the technical side of the story: how algorithms select information and create "bubbles" where we hear only our own echo. If you want to understand the mechanism behind it, I recommend reading it [HERE] before continuing.

Today we won't be looking at algorithms as lines of code, but as tools that directly shape our behavior and our society. We will explore what happens when this digital isolation stops being just an information filter and becomes a psychological trench: why social media keeps us in a constant state of high alert, how we are losing the capacity for empathy toward those who think differently, and how the fight for "digital justice" is actually dividing us into closed, intolerant tribes.

We are entering the second episode: Algorithmic Trenches – the place where logic ends and polarization begins.

Architecture of Rage: Why the Algorithm Loves Your 140/90 Blood Pressure

Have you ever noticed that content which relaxes or informs you rarely holds your attention for more than a few seconds, while a post that sets you off can keep you typing a response for tens of minutes? That is no coincidence; it is by design. Platforms don't sell the truth, they sell your attention, and rage is the most profitable fuel in the world.

Biology in the Service of Profit

Over time, algorithms have learned a simple lesson from human biology: our brains are hardwired to prioritize threats. When you see something that makes you happy, you "like" it and move on.
But when you see something that deeply irritates you, your limbic system, that ancient, prehistoric part of the brain, kicks into fight-or-flight mode. In that moment your blood pressure spikes, adrenaline surges, and you feel a biological urge to react.

To the algorithm, that spike in your blood pressure is pure profit. The more "fired up" you are, the longer you will stay on the app, typing replies and refreshing the page to see who responded. The algorithm doesn't want you to be informed; it wants you to be engaged, and nothing engages a human being as effectively as a sense of injustice or of being under threat.

"Engagement" as a Code Word for Conflict

In digital marketing you often hear the word "engagement." It sounds positive, but in practice the highest levels of interaction don't come from pet photos; they come from content that triggers moral outrage. The algorithm does not distinguish between your "happy" interaction and your "angry" one. To it, your five-paragraph comment explaining to someone why they are wrong is exactly the same as a compliment to a friend. In fact, it prefers the former, because it will likely trigger ten more reactions just like it from other people. This creates a spiral in which the system intentionally pushes the most extreme views into your sightline, because they guarantee you will "get hooked."

The Machine for Radicalizing the Everyday

The problem is that this architecture forces you to live in a state of constant digital irritation. Your feed is not a realistic picture of the world, but a collection of the most extreme moments, designed to keep you alert and ready for a fight. When every trip online comes with a micro-dose of stress, your tolerance for differing opinions drops. You no longer seek understanding, but confirmation that you are right, and a target on which to vent that pent-up pressure.
In this way, social networks are not just a mirror of your society; they are an amplifier for your worst impulses, turning every misunderstanding into an existential conflict.

Dehumanization at a Distance: How "They" Become Faceless Avatars

When was the last time you got into a serious, heated argument with a stranger while waiting in line at the grocery store or the post office? Probably never. Yet on the internet you do it every single day. The reason is simple: digital spaces strip away your ability to see the human being behind the screen.

The Loss of the Human Face

In the real world, while talking to someone, you see their eyes, hear the tone of their voice, and feel their emotion. Your biology doesn't let you simply lash out at someone standing right in front of you. On social media, that braking mechanism does not exist. There you aren't clashing with a person who has a family, worries, and a name, but with an avatar. To your brain this is not a living being, but a pixelated representation of an idea you dislike. The moment you reduce a person to an icon and a username, the threshold for aggression drops drastically.

A Caricature Instead of a Neighbor

The echo chamber does one very dangerous thing: it never shows you the best arguments of the "other side." Instead, the algorithm serves you its stupidest, most extreme, and most irritating representatives. Your feed fills up with examples of "those others" saying terrible things or acting like fools. In this way you build an image of everyone who thinks differently as evil, insane, or a paid bot. You no longer see them as neighbors you could share a coffee with despite your differences, but as a statistical error that needs to be corrected or, even worse, removed from the digital space.

Digital Isolation as a License to Hate

When you are isolated in your own bubble of like-minded people, every voice of reason coming from the "outside" feels like an attack.
Since you never meet these people outside the context of an ideological war, it becomes easy to hate them. Online hate is "clean" because it lacks the social consequences that exist in real life. You can wish the worst upon someone, turn off your phone, and go back to your lunch. The problem is that this feeling doesn't stay inside the phone; it slowly eats away at your capacity for empathy in the real world, training you to view everything through the lens of "us" versus "them."

Intellectual Tunnel Vision: The Death of a Shared Reality

In the past, the biggest problem was that we held different opinions about the same "thing." Today the problem runs much deeper: we no longer agree on what the "thing" even is. Welcome to the era in which a shared reality no longer exists and every one of us lives in our own informational tunnel.

The Loss of the "Public Square"

Imagine standing in a city square. You can argue over whether a monument is beautiful or ugly, but you all see the same monument. You all see that it's raining; you all see that it's noon. This was the foundation of human society: a shared base of facts. The digital world has demolished that square. Algorithms have given each of us a personalized window onto the world. While you see evidence for one theory on your screen, your acquaintances see a completely opposite "truth" on theirs. The result? You are no longer looking at the same sky. Each of you has your own set of "irrefutable evidence" and feels absolutely superior to the others.

Facts to Order

Inside your informational tunnel, the algorithm will never serve you a piece of data that ruins your mood or shatters your worldview. It serves you "facts" that confirm what you already believe. This is called confirmation bias, and the platforms have turned it into an industry. When you encounter someone who thinks differently, you are no longer arguing about logic. You are colliding with two completely different universes of information.
To you, their arguments look like pure lies; to them, yours look like brainwashing. This is intellectual tunnel vision: you see only what is directly in front of you, within the narrow circle the algorithm has chosen to illuminate.

Why Is This Dangerous?

When there is no shared base of facts, dialogue becomes impossible. You cannot solve a problem if you cannot even agree that the problem exists. This division makes us vulnerable, because society no longer relies on verifiable data but on whose echo chamber is louder and more aggressive. The death of a shared reality means we have become strangers to one another, even if we live in the same building. We are no longer divided by national borders, but by the boundaries of our feeds.

Emotional Tribal Blackmail: The Fear of Exile

Do you think algorithms are your only jailers? Unfortunately, the strictest guards in your echo chamber are the people who think exactly like you. This is the paradox of the digital age: the more we isolate ourselves from the "enemy," the more we become hostages of our own tribe.

The Cost of a "Wrong" Opinion

Have you ever felt the impulse to write "Wait, maybe the other side has a point here" or "It's not all so black and white," only to pause and delete the comment? That moment of hesitation is the fear of exile. Inside your digital "bubble," loyalty to the group is measured by radicalism. If you dare to show understanding for the opposing side, your tribe does not see you as an objective person but as a traitor. The punishment is swift and severe, ranging from public ridicule and cancellation to total ostracism. For a human being, evolutionarily hardwired to survive within a group, this digital banishment hurts almost as much as physical pain.

Algorithmic Loyalty Lock-in

Algorithms adore this dynamic. They constantly serve you evidence of how "the others" are evil, thereby raising the stakes.
The worse the image of the enemy becomes, the greater the pressure on you to be 100% aligned with your tribe. This creates an effect of emotional blackmail: you no longer support certain views because you've thought deeply about them, but because they are the "entry ticket" to your social group. You fear that if you concede even a shred of truth to the other side, you will lose the support, the likes, and the sense of belonging that your "bubble" provides.

A Cage of the Like-Minded

This mechanism turns us into ideological soldiers who dare not lower their shields. Digital isolation doesn't just separate us from those we disagree with; it locks us in a cage with those we agree with, where every shade of gray is forbidden. When the fear of judgment from one's own group becomes stronger than the desire for truth, we cease to be free thinkers. We become merely echoes within a chamber, terrified of the silence that would follow if we ever said something "wrong."

Sabotaging the Algorithm: How to Become "Digitally Untameable"

It's easy to say "delete social media," but in today's world that is impossible for many. Fortunately, there is another way. You don't have to leave the digital world; you can simply become poor material for processing. The goal is to "pollute" your data to the point where the algorithm can no longer predict with certainty what will keep you on the app.

Recognize the "Rage Bait" in Real Time

The first step in sabotage is awareness. The next time you feel that sudden spike in blood pressure while scrolling, pause for a second and ask yourself: "Am I truly interested in this, or did the algorithm just serve me rage because it knows I'll click?" The moment you recognize that the platform is intentionally "winding you up" to keep you longer, the magic vanishes. The ultimate victory over the algorithm is seeing something that infuriates you, recognizing the system's intent, and simply scrolling past. No comment, no like, no reaction.
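Why does scrolling past work when an angry rebuttal doesn't? Ranking systems built on engagement reward any interaction, regardless of its emotional tone. The sketch below is a deliberately simplified illustration of that idea, not any real platform's code; the weights and field names are invented for the example.

```python
# Toy engagement-based ranking score (hypothetical weights).
# The key point: sentiment never enters the formula, so an angry
# comment boosts a post exactly like a friendly one. The only
# response that contributes nothing is no interaction at all.

WEIGHTS = {
    "like": 1.0,
    "share": 3.0,
    "comment": 5.0,        # typing a reply is the strongest signal
    "dwell_minutes": 2.0,  # time spent staring at the post
}

def engagement_score(interactions: dict) -> float:
    """Sum weighted interactions; note there is no 'anger penalty'."""
    return sum(WEIGHTS.get(kind, 0.0) * count
               for kind, count in interactions.items())

# A furious reply plus two minutes of seething...
angry = {"comment": 1, "dwell_minutes": 2}
# ...outranks a quick approving like...
happy = {"like": 1}
# ...and scrolling past adds nothing.
ignored = {}

print(engagement_score(angry))    # 9.0
print(engagement_score(happy))    # 1.0
print(engagement_score(ignored))  # 0.0
```

Under a model like this, the rage-filled comment is the most valuable thing you can give the system, which is why withholding every reaction is the one move it cannot monetize.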
Deliberately "Poisoning" Your Digital Footprint

The algorithm knows you because you feed it consistent data. To confuse it, you must become inconsistent. Seek out content that contradicts your beliefs, not to argue, but to send the system mixed signals. Follow people whose views you deeply disagree with. Click on articles about topics that usually don't interest you. Once your profile is no longer "clean" (one ideology, one hobby, one type of news), the algorithm no longer knows which pigeonhole to put you in, and your echo chamber begins to crack.

Intellectual Hygiene: Follow People, Not Portals

Portals and pages thrive on sensationalism, but individuals, even those you disagree with, often hold more nuanced views. Instead of following aggressive media outlets that serve you the "outrage of the day," find moderate voices from the "other side." Listen to their arguments without the need to reply immediately. By doing this you aren't necessarily changing your mind; you are training your brain to see the complexity of the world again, instead of the black-and-white filter the networks impose on you.

Conclusion: Regaining Control

Your attention is your most valuable asset. Algorithms will always try to steal it by exploiting your most primal instincts. But the moment you begin to consciously navigate your digital path, you stop being the product and become the user once again. Sabotaging the algorithm is not an act of rage; it is an act of freedom.

In the Next Episode

We will explore why our brains biologically prefer an exciting lie over a boring truth, and how misinformation is "packaged" to bypass your critical filters without you even noticing. You will discover why you feel an irresistible urge to share content that shocks or frightens you, and how to develop healthy skepticism toward headlines designed as emotional landmines.