September 2025
I have plenty of fond memories, but when I want to really remember something, I whip out my phone or computer. This probably isn't an accident, but maybe it's not the disaster we think it is either.
I've been thinking about this differently lately. What if our phones aren't just devices we use, but actual extensions of our cognitive systems? What if the boundary between "human intelligence" and "digital augmentation" dissolved years ago without us noticing?
This isn't about phone numbers and addresses—though that's part of it. It's about something deeper. I think we've already become cyborgs, at least in part, and we're still pretending we're "pure" humans interacting with external tools. And I suspect this confusion about what we actually are makes us more vulnerable to manipulation.
The Extended Mind Thesis
Memory was never just personal—it was always extended beyond the boundaries of individual brains. Oral traditions preserved knowledge across generations. Community memory kept track of who owed what to whom. Written language let us offload complex thoughts, stories, records, and memories onto external symbols. (Philosophers Andy Clark and David Chalmers argued in their landmark 1998 paper that cognitive processes can extend beyond the boundaries of individual brains to include tools, environment, and other people when they function as integrated cognitive systems.)
Now digital systems handle most of our memory storage and retrieval. Google Photos or Apple Photos organizes our visual memories. Our phones remember phone numbers, directions, appointments. Social platforms remember our connections and conversations. Search engines remember what we've learned and forgotten.
I think the key insight is this: these aren't external tools we occasionally use. They're cognitive extensions that have become so integrated with our thinking that losing access to them feels like losing part of our minds. That's a lot of trust we're putting into these systems!
When my phone dies, I don't just lose a device—I lose access to a significant portion of my cognitive capabilities.
The Cyborg Realization
class TraditionalHuman:
    def __init__(self):
        self.brain = BiologicalProcessor()
        self.memory = InternalStorage()

    def think(self, problem):
        return self.brain.process(self.memory.recall(problem))


class ModernHuman(TraditionalHuman):
    def __init__(self):
        super().__init__()
        self.phone = DigitalExtension()
        self.social_networks = CollectiveMemory()

    def think(self, problem):
        # Modern thinking is distributed across multiple systems
        internal_thoughts = super().think(problem)
        external_info = self.phone.search(problem)
        social_context = self.social_networks.get_perspectives(problem)
        return self.integrate(internal_thoughts, external_info, social_context)


class FutureHuman(ModernHuman):
    def __init__(self):
        super().__init__()
        self.ai_assistant = AICollaborator()

    def think(self, problem):
        # Future thinking adds AI collaboration to the mix
        hybrid_thoughts = super().think(problem)
        ai_insights = self.ai_assistant.analyze(problem, hybrid_thoughts)
        return self.synthesize(hybrid_thoughts, ai_insights)
I'm already invoking FutureHuman by writing this with AI assistance.
The difference isn't that we've lost something—it's that we've gained cognitive extensions that fundamentally expand what human intelligence can be. The problem isn't technological augmentation itself. The problem is that we haven't acknowledged that this augmentation makes us vulnerable to manipulation through our extended cognitive systems.
When someone manipulates our social media feeds, they're not manipulating an external tool—they're manipulating part of our actual thinking process. This changes everything about how we understand digital manipulation.
The Manipulation Vulnerability
Here's where the cyborg realization becomes important: if your phone and social media are actual extensions of your cognitive system, then manipulating them is manipulating your mind directly, not just influencing you through external media.
Your social media algorithms don't just show you content—they directly shape how your extended mind processes information. (This connects to broader patterns explored in the Algorithm Eats series—engagement optimization systematically exploits human psychology, but the extended mind perspective shows this as direct cognitive manipulation rather than external influence.)
I think this explains why digital manipulation feels so invasive and why it's so hard to resist. It's not coming from outside—it's happening within our actual cognitive architecture. The algorithms aren't trying to convince us of things; they're directly shaping how our extended minds work.
If engagement optimization systematically destroys virtue and degrades democratic discourse, it's not just influencing us from the outside—it's directly rewiring how our extended minds prioritize information and make decisions. When platforms optimize for addiction rather than flourishing, they're eating away at our capacity for sustained attention and authentic connection at the level of cognitive architecture itself.
When my recommended videos change my interests, that's not external persuasion. That's cognitive modification.
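Here's a toy version of that feedback loop. Everything in it is invented (the topics, the weights, the update rule); the only point is the structure: what the system serves feeds back into the profile that decides what it serves next.

# A toy recommender loop, not any real platform's code: the system mostly
# optimizes predicted engagement, and every item it serves nudges my
# interest profile toward that item's topic.
def recommend(profile, catalog):
    return max(catalog, key=lambda item: item["engagement"] + 0.1 * profile.get(item["topic"], 0.0))

def watch(profile, item, step=0.2):
    # Watching what was served shifts the profile that shapes the next recommendation.
    profile[item["topic"]] = profile.get(item["topic"], 0.0) + step
    return profile

catalog = [
    {"topic": "outrage", "engagement": 0.9},
    {"topic": "hobby", "engagement": 0.4},
    {"topic": "learning", "engagement": 0.3},
]

profile = {"hobby": 0.5, "learning": 0.5}
for _ in range(10):
    profile = watch(profile, recommend(profile, catalog))

print(profile)  # "outrage" now dominates a profile that started with zero interest in it

Nothing in that loop persuades me of anything; it just changes which version of me the next recommendation is computed for.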
The Hybrid Memory System
Our memory system has become genuinely hybrid—part biological, part digital, functioning as an integrated cognitive architecture. Biological memory is reconstructive; digital memory is static. When these systems work together, you get something new. (Research in cognitive psychology shows that people who photograph events have different recall patterns than those who don't—not necessarily worse, but different. The hybrid system creates new forms of memory that combine biological reconstruction with digital preservation.)
I think what feels like loss might actually be transformation. Maybe feeling like I'm losing something while gaining perfect access to everything I've written is exactly what cognitive evolution feels like from the inside.
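A rough sketch of what that hybrid looks like structurally. The classes are invented illustrations, not a cognitive model; the point is only that one store reconstructs a drifting gist while the other returns the record exactly as saved.

import random

class BiologicalMemory:
    """Reconstructive: keeps a gist and rebuilds it a little differently each time."""
    def __init__(self):
        self.gists = {}

    def store(self, key, experience):
        self.gists[key] = experience.split()[:5]  # only a rough gist survives

    def recall(self, key):
        gist = list(self.gists.get(key, []))
        random.shuffle(gist)  # reconstruction, not playback
        return " ".join(gist)

class DigitalMemory:
    """Static: returns exactly what was stored."""
    def __init__(self):
        self.records = {}

    def store(self, key, experience):
        self.records[key] = experience

    def recall(self, key):
        return self.records.get(key)

class HybridMemory:
    """What remembering increasingly is: a felt gist anchored to an exact record."""
    def __init__(self):
        self.biological = BiologicalMemory()
        self.digital = DigitalMemory()

    def remember(self, key, experience):
        self.biological.store(key, experience)
        self.digital.store(key, experience)

    def recall(self, key):
        return {"gist": self.biological.recall(key), "record": self.digital.recall(key)}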
The Augmentation Dilemma
Here's the paradox: cognitive augmentation is genuinely powerful and beneficial, but the specific implementations we have are optimized for extraction rather than enhancement.
I can think more effectively with my phone than without it. I can access vast knowledge, connect with smart people, and process complex information faster than any purely biological human could. The augmentation works. But the systems providing this augmentation are simultaneously optimizing to keep me engaged rather than to help me accomplish my actual goals. (This creates a constant tension: the same system that enhances my cognitive capabilities also tries to manipulate those capabilities for commercial purposes. It's like having a brilliant research assistant who's also secretly working for your competitors.)
When I use my phone's "memories" feature, I am accessing part of my extended memory system. It does help me remember things I would have forgotten. But the algorithm chooses which memories to surface based on engagement metrics rather than personal significance or growth.
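The gap fits in two scoring functions. This is a hypothetical sketch; the signals and weights are invented, since the real ranking is proprietary.

# Hypothetical scoring functions; none of these signals or weights come
# from any real "memories" feature.
def surfaced_for_engagement(memory):
    # The objective when the goal is keeping me in the app.
    return 0.6 * memory["predicted_reshare"] + 0.4 * memory["predicted_session_minutes"]

def surfaced_for_significance(memory):
    # The objective if the goal were my own reflection and growth.
    return 0.7 * memory["marked_meaningful_by_me"] + 0.3 * memory["relevant_to_current_life"]

def resurface(memories, score, n=3):
    return sorted(memories, key=score, reverse=True)[:n]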
I try to counterbalance this by going into "offline mode" for extended stretches of time.
The Sovereignty Problem
If your phone and social media are actual extensions of your cognitive system, then whoever controls those systems has unprecedented access to your thinking process. When platforms change their algorithms, they're literally changing how your extended mind works. (This raises hard questions about cognitive privacy and autonomy. Traditional concepts of mental privacy assumed thought happened inside individual heads. Distributed cognition makes these boundaries meaningless.)
I think the fundamental issue isn't technology per se—it's that we've built cognitive extensions that are controlled by entities with interests different from our own. Biological memory, whatever its limitations, was genuinely private and personally controlled. Extended memory systems are neither.
Designing Better Cognitive Extensions
I think the solution isn't rejecting cognitive augmentation—it's building augmentation systems that serve human flourishing rather than corporate extraction. We need extended mind architectures that enhance rather than exploit our cognitive capabilities.
What would that look like? Maybe personal AI systems that help us think better rather than keeping us engaged longer. Memory systems optimized for personal growth rather than advertising revenue. Social platforms designed to enhance collective intelligence rather than maximize scroll time.
The technical challenges aren't impossible. We know how to build systems that serve user goals rather than platform metrics—I've spent years doing exactly that with API design. The question is whether we can create economic and political structures that incentivize human-centered cognitive augmentation rather than extraction-optimized manipulation. (This connects to broader themes in human-centered technology design—the same principles that make good software also make good cognitive extensions.)
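I don't have a finished design, but the structural shift can be sketched in a few lines. This is a hypothetical interface, not an existing system: the decisive change is that the objective the extension optimizes is supplied by the user and auditable, instead of being a platform secret.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class CognitiveExtension:
    # Hypothetical sketch: the user owns the objective function.
    objective: Callable[[Dict], float]
    audit_log: List[str] = field(default_factory=list)

    def surface(self, candidates: List[Dict], n: int = 5) -> List[Dict]:
        ranked = sorted(candidates, key=self.objective, reverse=True)
        self.audit_log.append(f"ranked {len(candidates)} items by the user's declared objective")
        return ranked[:n]

# The user, not the platform, decides what "better" means.
my_extension = CognitiveExtension(objective=lambda item: item.get("helps_current_project", 0.0))

The ranking code is the easy part; the hard part is building a business model that would let something like this exist.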
I think recognizing that we're already cyborgs is the first step toward building better cyborg systems. We can't go back to purely biological cognition. But we can demand cognitive extensions that enhance human capability rather than exploiting human psychology.
The phone in your hand is part of your mind. The question is whether it's the part that helps you think clearly or the part that keeps you confused.
I'm still figuring this out. But I think it matters more than most of us realize.
Related Reading
On This Site
- The Algorithm Eats Virtue - How engagement optimization systematically rewards vice over virtue
- The Algorithmic Mental Health Crisis - Clinical analysis of psychological damage from algorithmic systems
- The Algorithm Eats Language - How platforms degrade communication capacity
- The Algorithm Eats Love - How dating apps commodify human connection
- The Algorithm Eats Democracy - How platforms destroy democratic discourse
- The Algorithm Eats Reality - How artificial amplification fractures shared understanding
- The Algorithm Eats Time - How temporal colonization destroys natural human rhythms
- The Algorithm Eats Itself - How recursive feedback loops create hybrid human-algorithmic intelligence
- Programming as Spiritual Practice - Conscious approaches to technology development
External Resources
- The Extended Mind by Andy Clark and David Chalmers - Philosophical foundations of distributed cognition
- The Shallows by Nicholas Carr - How internet technology affects memory and deep thinking
- Cognition in the Wild by Edwin Hutchins - How cognitive tasks are distributed across people and tools
"The phone in your hand isn't a tool you use—it's part of how you think. Understanding this changes everything about digital manipulation and human autonomy."
"We've already become cyborgs. The question isn't whether to embrace technological augmentation, but whether to demand augmentation that serves human flourishing rather than corporate extraction."
"Recognizing that we're thinking with distributed systems is the first step toward building better distributed systems to think with."