Header image by Rasheed Kemy on Unsplash
Every time you open your social media apps, an invisible force curates and manipulates your experience. Artificial intelligence-driven algorithms subtly decide what content pops up on your screen. They decide which posts you see, which trending topics appear in your feed, and what information you’re exposed to.
While social media can be entertaining, tech companies’ goal isn’t to inform or enlighten their users; it’s to keep them scrolling. This unseen influence can do far more than keep you connected to the outside world. It can also rewire how you think, feel, and relate to others, sometimes at your own peril.
Mental health professionals have witnessed growing psychological fallout from too much time spent inside the algorithm bubble. Citing increased risks of depression, anger, radicalized thinking, and isolation, therapists are warning people about the dangers algorithms pose to mental and emotional well-being.
The Emotional Algorithm
Algorithms run on engagement and encourage users to stay online. The longer you linger, the more profit platforms stand to earn. To achieve this goal, platforms are incentivized to serve users content that provokes strong emotional reactions, such as outrage, fear, and validation.
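To make that incentive concrete, here is a toy sketch, not any platform’s actual system, of why an engagement-ranked feed tends to favor emotionally charged posts. The post titles and scores are invented for illustration; the only assumption is that content provoking strong reactions earns a higher predicted engagement score.

```python
# A toy illustration, not real platform code: if emotionally charged posts
# earn higher predicted engagement, a feed that simply sorts by predicted
# engagement will surface that content first. All titles and scores are made up.

posts = [
    {"title": "Local park reopens after renovation", "predicted_engagement": 0.20},
    {"title": "You won't believe this outrageous scandal", "predicted_engagement": 0.85},
    {"title": "Why everything you fear is coming true", "predicted_engagement": 0.90},
]

# The feed shows the highest-scoring posts first.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in feed:
    print(post["title"])
```

Under that one assumption, the calm local-news item always lands at the bottom of the feed.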
“These algorithms do not necessarily have our best interests in mind,” said Cory Reid-Vanas, MA LMFT, a therapist and founder of Rocky Mountain Counseling Collective in Denver, Colorado. “They can create echo chambers that narrow our thinking and make alternative views feel threatening. They amplify the emotional component and normalize these extremes.”
Over time, continuous exposure to emotionally charged content tends to warp a person’s sense of balance and their realistic view of the world. “Social media stokes people’s fears and anxieties,” said Jason Fierstein, MA, LPC, therapist and founder of Phoenix Men’s Counseling in Phoenix, Arizona. “It manipulates people into believing the content they’re given represents the wider world around them, when it does not.”
When the world we see online feels chaotic and threatening, our nervous system reacts accordingly. It activates the body’s stress response, reinforcing the false perception that danger is everywhere. This keeps people in a negative emotional feedback loop. While profitable for tech companies, it’s detrimental for the people who use their platforms every day.
Echo Chambers and Emotional Stunting
By design, algorithms keep track of what we like and deliver similar content to keep us engaged. This creates a feedback loop known as an echo chamber: a digital space where our beliefs are mirrored and amplified. With constant exposure to such tailored content, users can, over time, become more polarized and less tolerant of nuance.
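The paragraph above describes the loop in words; the sketch below is a minimal, hypothetical model of it, not any real recommender. The topic names and starting weights are made up, and the only assumed behavior is that each click boosts the weight of whatever the user just engaged with.

```python
# A minimal, hypothetical sketch of the feedback loop described above:
# every interaction boosts the topics a user already engages with,
# so recommendations narrow toward those topics over time.
from collections import Counter

# Start with equal interest in three made-up topics.
topic_weights = Counter({"politics": 1.0, "sports": 1.0, "cooking": 1.0})

def recommend(weights, n=3):
    # Surface the n topics the user has engaged with most so far.
    return [topic for topic, _ in weights.most_common(n)]

def record_click(weights, topic):
    # Each click teaches the system to show more of the same.
    weights[topic] += 1.0

# Simulate a user who keeps tapping the top item in their feed.
for _ in range(5):
    feed = recommend(topic_weights)
    record_click(topic_weights, feed[0])

print(recommend(topic_weights, n=1))  # the feed has converged on a single topic
```

Even this crude model collapses onto a single topic after a handful of clicks, which is the narrowing effect the therapists describe.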
“Social media can make alternative views feel threatening,” said Reid-Vanas. “They normalize extreme thinking by amplifying the most outrageous and scariest parts of any topic.”
There are emotional consequences to the limited viewpoint algorithms provide.
“It can crystallize or cement a person’s worldview,” said Fierstein. “If you’re marinating in fear and terror all the time, you’re not going to have a chance to grow.”
Such emotional confinement is particularly concerning for teenagers and young adults, whose brains are still developing empathy and impulse control. Algorithms can keep people stuck in a narrow emotional lane, one dominated by fear, anger, or an “us versus them” narrative.

Vulnerability and the Search for Belonging
Not everyone who uses social media is equally susceptible to the influences of algorithms. People who experience isolation, identity confusion, or a sense of injustice are most at risk.
“People who seek certainty, belonging, and a greater sense of purpose are at risk,” said Reid-Vanas. “Children and adolescents in the middle of identity formation are particularly vulnerable to the content presented to them online.”
For many disillusioned young men especially, this vulnerability can pull them toward darker corners of the internet, including misogynistic incel communities. These groups offer the illusion of belonging to a cause and validate the grievances of those drawn in.
“A lot of lonely young men feel hopeless,” said Fierstein. “When they find extremist spaces online, it permits them to explore their rage. These groups provide a sense of community for people who feel disenfranchised or alienated. It’s easy to scapegoat women or minorities when those emotions are reinforced by online peers.”
Online radicalization, at its core, often stems not from ideology but from unmet emotional needs such as loneliness, rejection, and the need to matter.
How Algorithms Reinforce Cognitive Distortions
Often, therapists work with clients to help them identify cognitive distortions, which are habitual patterns of inaccurate thinking. Examples include black-and-white thinking, catastrophizing, and confirmation bias. Unfortunately, social media algorithms can intensify those harmful thought patterns.
“Therapists work with clients to manage and course-correct cognitive distortions,” said Reid-Vanas. “But social media reinforces them daily. It amplifies confirmation bias and black-and-white thinking, which can have a serious and very real impact on people.”
Not only can algorithms negatively influence how you think, they can also create a warped sense of reality.
“People start believing the world is exactly how it looks on social media,” said Fierstein. “That’s a huge distortion because that’s not how the world works at all.”
When algorithms constantly confirm our biases and exaggerate our fears, our brains adapt. People can begin to trust the feed more than their own critical reasoning, a pattern psychologists call cognitive entrenchment.
Recognizing the Warning Signs
So, how can we tell when someone’s online world has begun to distort their offline reality in a harmful way? One red flag is increasingly rigid thinking.
“You might see fear of outsiders or a growing contempt for others,” said Reid-Vanas. “They often begin to use niche community jargon, become dismissive of all mainstream sources, and may become emotionally volatile when questioned about their beliefs.”
These shifts in behavior and ideals can happen gradually, which can make it difficult for family and friends to notice until the person’s worldview has dramatically narrowed.
“They may get angrier, more demonstrative, or isolate themselves,” said Fierstein. “You might notice they spend excessive amounts of time online, talk obsessively about one or two topics, or neglect other areas of their life. Their relationships often start to deteriorate.”
Helping Clients Reconnect with Reality
Mental health professionals emphasize that knowledge is the first step toward recovery. Understanding how algorithms shape social media feeds, and who is most at risk, is essential to learning how to combat the dangers.
“When we have awareness, we’re better prepared to manage and understand our online experiences,” said Reid-Vanas. “Remember to keep an open mind, build offline connections in your community, and compare your beliefs with your deeper personal values.”
It’s a good idea to take a step back from environments influenced and controlled by algorithms. Examine whether the online communities you engage with align with your real-world values and relationships.
Unfortunately, not everyone who is affected by radical or polarized content will seek help. Often, such people will need encouragement from their social support network to address the problem.
“Many people won’t go to therapy because they don’t think they need it,” said Fierstein. “Fathers will come in worried about their sons who are in their early twenties and stuck in these echo chambers. When therapy is possible, it often involves unpacking long-held beliefs that predate the algorithm itself. We might need to go way back into childhood to the beliefs and modeling they learned from their parents that were later reinforced by the echo chambers.”
Therapists and Tech: Who Bears Responsibility?
While mental health professionals play an integral role in helping people unlearn the algorithm’s influence, the terrain can be treacherous. People are often wary of examining faulty mindsets and harmful beliefs, so therapists must tread lightly when coaxing clients to question radicalized or extreme thinking.
“I take care with clients who come in with polarized beliefs,” said Fierstein. “They can quickly sense where I might stand politically. So I approach the topic through questions, such as asking them to reflect on how social media might be influencing their emotions and relationships.”
Digital environments have an impact on our mental health, and there is plenty of evidence about the harmful ways they shape our psychology. But the burden of responsibility doesn’t rest solely on the people who use social media. Tech companies have spent exorbitant sums to ensure their users stay online as much as possible.
“Tech companies are motivated to increase and sustain engagement,” said Reid-Vanas. “They need to be transparent about how algorithms work. It’s not just about individual willpower; we need structural changes.”
While how much time someone spends in extremist or radicalized forums is ultimately up to the individual, tech companies have an obligation to keep their users safe.
“Tech companies have created a potentially dangerous atmosphere, especially for children,” said Fierstein. “They have a moral responsibility to protect their users, and they’re not stepping up the way they should.”
Toward Digital Awareness and Collective Responsibility
For all the risks, there’s still ample room for change. Therapists, educators, parents, and policymakers need to work together to cultivate digital literacy, a form of emotional hygiene for the era of algorithms.
“We live in such a hyper-capitalistic society that I don’t know if another counter-message will have the same weight,” said Fierstein. “But we need rules, regulations, and education to balance out the addictive nature of these platforms.”
The influence of social media is apparent and can have very real consequences. It’s important to check in with yourself, and with your children, not only to make sure the content you each see is good for your mental health, but also to make sure no one is spending too much time online.
“Ask yourself if your online ecosystem aligns with your values,” said Reid-Vanas. “Is it grounded and healthy? Is it adding value to your life or is it harming your wellbeing?”
Awareness is the first act of resistance. Every click, like, or share helps teach the algorithm what to show you next. But it also offers a chance to pause and reflect, to notice what’s being fed to us and why. As more people recognize that digital engagement is not neutral or passive, we can begin to reclaim our feeds and our minds.
Kayla Wassell is a writer and editor with 6 years of experience in journalism, copywriting, content creation, and entertainment media.