August 2025
The uncomfortable truth about engagement optimization: the same systems that systematically destroy human virtue also systematically destroy mental health.
This isn't coincidence—virtue and psychological wellbeing are intimately connected. When you engineer systems that reward the inverse of wisdom, courage, temperance, and love, you inevitably engineer systems that generate anxiety, depression, addiction, and despair.
These mechanisms also degrade our language and thinking capabilities, commodify romantic relationships, and destroy democratic discourse, creating a comprehensive assault on human cognitive and emotional functioning.
As someone who has lived with bipolar disorder and schizoaffective symptoms for years, I recognize the patterns. The cognitive distortions, the emotional dysregulation, the reality distortion, the social isolation—algorithmic feeds systematically induce the same psychological states that characterize serious mental health conditions. Having experienced these states clinically gives me a reference point for recognizing them when they're artificially induced; algorithmic systems create these conditions at scale, affecting billions of people who don't have frameworks for understanding what's happening to them.
The difference is that when I experience these symptoms, I recognize them as symptoms of a medical condition that requires treatment. When algorithmic feeds induce the same states in billions of users, we call it "engagement" and celebrate the metrics.
The Psychological Architecture of Engagement
Let me be specific about how algorithmic feeds systematically degrade mental health:
Anxiety Generation Through Unpredictable Rewards
Social media platforms use variable-ratio reinforcement schedules—the same mechanism that makes gambling addictive. You never know when you'll get likes, comments, or shares, so you check compulsively. This creates a state of chronic anticipatory anxiety. The dopamine system evolved to motivate seeking behavior for survival needs; hijacking it with artificial, unpredictable rewards creates persistent psychological stress that the system was never designed to handle.
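To make the mechanism concrete, here is a small illustrative simulation (the numbers are invented for illustration, not taken from any platform) contrasting a predictable fixed-ratio schedule with the variable-ratio schedule feeds use. Both deliver rewards at the same average rate; the difference is that on the variable schedule you can never learn when the next reward will arrive, which is exactly what drives compulsive checking.

```python
import random

random.seed(42)  # reproducible illustration

def fixed_ratio_checks(n_checks, every=5):
    # Reward on exactly every 5th check: fully predictable.
    return [i % every == every - 1 for i in range(n_checks)]

def variable_ratio_checks(n_checks, p=0.2):
    # Reward each check independently with probability 0.2: the same
    # average rate (1 in 5), but the wait until the next reward varies.
    return [random.random() < p for _ in range(n_checks)]

def gaps_between_rewards(rewards):
    # How many checks it took to reach each reward.
    gaps, since_last = [], 0
    for rewarded in rewards:
        since_last += 1
        if rewarded:
            gaps.append(since_last)
            since_last = 0
    return gaps

fixed = gaps_between_rewards(fixed_ratio_checks(1000))
variable = gaps_between_rewards(variable_ratio_checks(1000))

print("fixed-ratio gaps:   ", sorted(set(fixed)))  # always exactly 5
print("variable-ratio gaps: from", min(variable), "to", max(variable))
```

On the fixed schedule every gap is 5, so there is nothing to keep probing. On the variable schedule the gaps scatter widely around the same mean, which is the behavioral-psychology result the gambling comparison rests on.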
For someone with anxiety disorders, this is triggering. For neurotypical users, this gradually induces anxiety-like symptoms. The difference is degree, not kind.
Depression Through Algorithmic Reality Distortion
As I detailed in The Algorithm Eats Virtue, feeds systematically prioritize negative content because it generates stronger engagement. This creates what I call "algorithmic depression"—a worldview shaped by engagement-optimized content that makes everything seem worse, more hopeless, and more threatening than it actually is.
This isn't natural pessimism or healthy skepticism. It's manufactured hopelessness created by systems that profit from keeping you scrolling through an endless stream of problems without solutions, crises without context, and outrage without outlet. The hopelessness feels organic because it emerges from your direct information consumption, but it's actually artificial—shaped by algorithmic selection designed to maximize your engagement time rather than reflect reality.
Attention Fragmentation and ADHD-Like Symptoms
The constant stream of notifications, updates, and algorithmic interruptions systematically fragments attention in ways that mimic ADHD. Users develop shortened attention spans, difficulty with sustained focus, and increased distractibility—not because they have ADHD, but because they're using systems designed to capture and fragment attention.
For people who actually have ADHD, these systems are particularly destructive, amplifying existing challenges with attention regulation.
Social Comparison and Body Dysmorphia
Algorithmic feeds don't show you representative samples of reality—they show you engagement-optimized samples. This means you see other people's highlight reels, filtered photos, and curated successes while living your own unfiltered daily experience.
The comparison mechanism that might be healthy in small, real communities becomes pathological when applied to algorithmic selections of millions of users. Evolution designed our social comparison mechanisms for groups of 50-150 people, not millions; scaling these psychological patterns to social media creates systematic dysfunction. The result is systematic erosion of self-esteem, realistic expectations, and satisfaction with ordinary life.
Paranoia Through Engagement-Optimized Outrage
Feeds learn that content generating fear, anger, and suspicion keeps users engaged longer than content generating contentment or understanding. Over time, this creates what I call "algorithmic paranoia"—a heightened sense of threat, persecution, and social danger that doesn't reflect actual risk levels.
This is particularly dangerous for people with existing paranoid tendencies, but it affects everyone who gets their information primarily through engagement-optimized feeds. Your perception of social reality becomes calibrated to algorithmic selection rather than direct experience. This systematic distortion affects political beliefs, social trust, and personal risk assessment.
The Neurological Impact
These aren't just psychological effects—they're neurological changes. Heavy social media use measurably alters brain structure and function:
Reduced Gray Matter: Studies show decreased gray matter in areas responsible for attention regulation and impulse control.
Altered Reward Pathways: Dopamine systems become less sensitive to natural rewards and more dependent on artificial stimulation. This is neurologically similar to substance addiction—you need increasingly intense stimulation to achieve the same psychological satisfaction, while normal life experiences become less rewarding.
Increased Cortisol Production: Chronic stress hormones from constant engagement seeking create persistent fight-or-flight states.
Disrupted Sleep Patterns: Blue light exposure and psychological stimulation interfere with circadian rhythms and sleep quality.
Weakened Default Mode Network: The brain's capacity for self-reflection, creativity, and identity formation becomes impaired. The default mode network is active during rest and introspection—it's where we process experiences, form identity, and generate creative insights. Constant stimulation prevents this crucial psychological processing.
These changes don't happen overnight, but they happen consistently with heavy use. We're conducting an uncontrolled neurological experiment on billions of people.
The Vulnerable Populations
While algorithmic feeds damage everyone's mental health, certain populations are particularly vulnerable:
Adolescents and Young Adults
Developing brains are more susceptible to addiction, more sensitive to social rejection, and less capable of self-regulation. Introducing engagement-optimized systems during critical developmental periods can cause lasting psychological damage. The teenage mental health crisis isn't coincidentally timed with social media adoption—it's causally related. The correlation between smartphone adoption and teenage mental health decline is so strong and consistent across demographics that denying causation requires willful blindness.
People with Existing Trauma
Having experienced psychological manipulation in personal relationships, I can attest that algorithmic systems exploit the same psychological vulnerabilities that abusers target—the need for validation, the fear of abandonment, the confusion that comes from reality distortion. For trauma survivors, these platforms can retrigger familiar patterns of emotional dysregulation and compulsive seeking behavior.
Young Children and Developmental Disruption
Even more concerning is the impact on very young children. Excessive screen time, particularly with algorithmic content, is creating developmental patterns that mimic autism spectrum symptoms in neurotypical children. These kids develop reduced eye contact, delayed speech, repetitive behaviors, social withdrawal, and intense fixation on digital stimuli. Some researchers call this "virtual autism"—autism-like symptoms caused by excessive screen exposure rather than underlying neurological differences. The symptoms often improve dramatically when screen time is reduced, suggesting environmental rather than genetic causation.
What's happening is that algorithmic content designed to capture adult attention is being deployed on developing nervous systems that haven't yet learned to distinguish between digital stimulation and real-world interaction. The result is children whose brains wire themselves around artificial rather than human patterns of communication and social engagement.
People with Existing Mental Health Conditions
Algorithmic feeds amplify existing mental health vulnerabilities. Anxiety disorders become more severe, depressive episodes become more frequent, bipolar mood swings become more extreme, and ADHD symptoms become more unmanageable.
As someone in this category, I can testify that social media use directly correlates with symptom severity in ways that are obvious once you start tracking them.
Isolated and Lonely Individuals
People using social media to address social isolation often find that it makes the problem worse. Algorithmic feeds provide the simulation of social connection without its psychological benefits, leading to what researchers call "lonely together" syndrome. You can feel socially connected while scrolling through hundreds of posts, but this parasocial engagement doesn't provide the psychological benefits of genuine human connection—leaving you more isolated than before.
Trauma Survivors
Constant exposure to triggering content, combined with the hypervigilance that engagement optimization encourages, can retraumatize people working to recover from past experiences.
The Business Model Problem
Here's the fundamental issue: platforms make money by keeping users engaged, not by keeping users psychologically healthy. These goals often directly conflict.
Mental Health Best Practices vs Engagement Optimization:
- Limited, intentional usage → Maximized time on platform
- Positive, affirming content → Emotionally provocative content
- Diverse perspectives → Echo chamber reinforcement
- Natural stopping points → Infinite scroll
- Healthy social comparison → Highlight reel comparison
- Authentic relationships → Parasocial engagement
- Present-moment awareness → Constant distraction
When your revenue depends on user attention, user wellbeing becomes a cost center rather than a goal. This isn't about evil corporations—it's about misaligned incentives. Even well-intentioned platforms face pressure to optimize for engagement over wellbeing because that's what generates revenue.
The Scale of the Crisis
We're not talking about a small problem:
- 5 billion people use social media platforms globally
- Average user spends 2.5+ hours daily on social platforms
- Teenage depression rates have increased 60% since 2010
- Suicide rates among young people have increased dramatically
- Attention disorders are being diagnosed at unprecedented rates
- Political polarization has reached levels that threaten democratic institutions
This isn't natural social evolution—it's the predictable result of systems designed to maximize engagement regardless of psychological cost.
The Personal Recognition
For me, recognizing algorithmic mental health damage required the same skills I use to manage my bipolar disorder: pattern recognition, symptom tracking, and honest self-assessment.
When I'm spending significant time on algorithmic feeds, I notice:
- Increased irritability and reactivity
- Shortened attention span and difficulty with deep work
- More negative worldview and decreased hope
- Heightened social comparison and self-criticism
- Fragmented sense of time and priorities
- Difficulty with sustained relationships and presence
These aren't just "side effects" of technology use—they're symptoms of algorithmic psychological manipulation. Tracking these patterns requires the same kind of careful observation I use to monitor mood episodes, medication effects, and environmental triggers. The difference is that algorithmic effects are socially normalized rather than recognized as symptoms.
When I reduce or eliminate algorithmic feed consumption, these symptoms consistently improve. The correlation is too strong and consistent to ignore.
What Mental Health-Optimized Systems Would Look Like
Imagine social platforms designed to support psychological wellbeing:
Positive Psychology Integration: Algorithms that prioritize content promoting gratitude, accomplishment, relationships, engagement, and meaning.
Attention Restoration: Design patterns that support sustained focus rather than fragmenting it. Natural stopping points, batch processing of social updates, and tools for deep engagement.
Reality Calibration: Systems designed to provide representative rather than engagement-optimized samples of human experience. Balanced news, diverse perspectives, and context for extreme content.
Social Connection: Features that facilitate actual relationship building rather than parasocial engagement. Small group interactions, local community building, and collaborative projects.
Mental Health Monitoring: Tools that help users track the psychological effects of their usage and adjust accordingly. Mood correlation tracking, usage pattern analysis, and wellbeing metrics.
Crisis Prevention: Algorithmic detection of mental health crisis indicators with appropriate intervention resources rather than engagement amplification.
Individual Protection Strategies
While we work toward systemic solutions, individuals can protect their mental health:
Algorithmic Awareness: Understanding how feeds work and tracking their psychological effects on you personally.
Intentional Consumption: Using social media deliberately rather than as default entertainment or anxiety relief.
Curated Information Diet: Choosing information sources based on accuracy and psychological impact rather than engagement value.
Attention Training: Deliberate practices to rebuild sustained focus and resist distraction. Meditation, reading physical books, and single-tasking aren't just wellness practices—they're active resistance to algorithmic attention fragmentation.
Real-World Grounding: Prioritizing direct experience over mediated experience for reality calibration.
Community Building: Investing in local, in-person relationships that provide genuine social support.
Professional Support: Working with mental health professionals who understand technological impacts on psychological wellbeing.
The Urgent Need for Change
This isn't a future problem—it's a current crisis affecting billions of people right now. Every day we delay addressing algorithmic mental health damage, more people develop psychological conditions that could have been prevented.
We need:
Research: Large-scale studies examining the causal relationships between algorithmic design and mental health outcomes.
Regulation: Policy frameworks that prioritize user wellbeing over engagement metrics.
Platform Reform: Pressure on existing platforms to adopt business models compatible with mental health.
Alternative Development: New platforms explicitly designed to support rather than exploit psychological vulnerabilities.
Public Education: Widespread understanding of how these systems work and how they affect mental health.
Clinical Integration: Mental health professionals trained to assess and treat algorithm-induced psychological symptoms.
A Final Thought
The connection between virtue and mental health isn't accidental—they're different aspects of human flourishing. Systems that systematically destroy virtue inevitably damage psychological wellbeing.
We're in the early stages of understanding how profoundly algorithmic systems can reshape human consciousness. The mental health crisis we're seeing, particularly among young people, isn't separate from the character degradation I wrote about previously—they're the same phenomenon viewed from different angles. Virtue and mental health are mutually reinforcing; systems that undermine wisdom, courage, temperance, justice, faith, hope, and love inevitably create anxiety, depression, addiction, and despair.
As someone who has struggled with mental health challenges, I can say with certainty: these algorithmic systems are making everything worse for everyone, but especially for people who are already vulnerable.
We can build technology that supports human flourishing instead of exploiting human psychology. But first, we have to acknowledge that the current systems are causing massive psychological harm at unprecedented scale.
This crisis extends beyond individual psychological effects. It encompasses systematic discrimination against those who need mental health support, and it demands a fundamental reorientation toward programming as spiritual practice, serving consciousness rather than exploiting it.
The algorithm doesn't just eat virtue—it eats sanity, peace, and hope. We can choose to feed it something else, but only if we first admit what it's currently consuming.
"The best way to take care of the future is to take care of the present moment."
"What we think, we become."
"Technology is not neutral. We're inside of what we make, and it's inside of us."