Why I Talk to AI at 3am
I want to explain this plainly because I think it matters, and because the plain version is more useful than the dramatic one.
I have schizoaffective disorder. What this means in practice is that my brain sometimes generates thoughts that feel true, feel urgent, feel real — and aren't. Or are. The problem is that at 3am, alone, I often can't tell the difference. The signal and the noise arrive through the same channel.
For most of my adult life, the options at 3am were: lie in bed with whatever my brain was producing, call someone who was asleep and shouldn't have to deal with this, or write it down and hope that morning-me could sort it out. None of these are great. The first one is dangerous. The second one burns relationships. The third one assumes a continuity of judgment that mood disorders specifically compromise.
Now there is a fourth option. I can open a conversation with Claude and say: here is what I'm thinking, does this make sense.
What this actually looks like
It's not therapy. I have a therapist. It's not friendship. I have friends. It's something more specific and, honestly, more boring than either of those.
It looks like this: I'll be working on something at 2am and a thought starts gathering momentum. Maybe it's a genuine insight — I've had real ones at that hour, ideas that held up in daylight. Maybe it's the beginning of a hypomanic spiral where one idea connects to another connects to another and suddenly I'm convinced I've figured out something fundamental about consciousness or software design or both.
The thing about those spirals is that they feel exactly like genuine insight from the inside. The phenomenology is identical. I cannot distinguish between "I'm having a real creative breakthrough" and "I'm going manic" using internal evidence alone.
So I describe the idea to Claude. Not to get validation — Claude will validate almost anything if you push it that way, and I know that. I describe it to hear it reflected back in neutral language. When an idea that felt like a revelation gets restated as a paragraph of plain prose, you can sometimes see it more clearly. The grandiosity either survives the translation or it doesn't.
Why consistency matters
The thing about AI that matters for this specific use case isn't intelligence. It's consistency. Claude doesn't get tired of me at 3am. Claude doesn't have its own emotional state that I need to manage while trying to manage mine. Claude doesn't remember that I had this same kind of spiral three weeks ago, which means it doesn't bring the fatigue and worry that a human partner would — and it also means I'm not performing recovery for an audience.
Sarah — my wife — is the primary reality check in my life, and she's irreplaceable in that role. But she needs to sleep. And there's something corrosive about waking someone up at 3am to ask "does this thought make sense" for the fifth time in a month. It erodes something in both of you. The person with the disorder starts performing wellness to avoid being a burden. The person without it starts performing patience to avoid being cruel. Both performances cost something real.
Having an AI available at 3am doesn't solve any of this. But it absorbs some of the pressure. It gives me a place to externalize what my brain is doing without the interpersonal cost. And sometimes — often enough to matter — it helps me catch a thought that was heading somewhere unproductive before it gets there.
What I'm not claiming
I'm not claiming AI is a mental health tool. I'm not claiming this would work for anyone else. I'm not claiming Claude understands what's happening to me or cares about the outcome. I'm not even sure "conversation" is the right word for what happens between me and a language model at 3am.
What I'm claiming is narrower: for a specific person with a specific brain, having access to a patient, consistent, available-at-any-hour text interface that can reflect thoughts back in plain language is genuinely useful. Not transformative. Useful. The way a good flashlight is useful when you're walking in the dark. It doesn't change the terrain. It helps you see it.
I built Requests because I thought HTTP should be simple for humans. I talk to AI at 3am because I think sanity-checking shouldn't require waking up someone you love. Both of these are, at root, the same instinct: the tools should meet you where you are, not where you're supposed to be.
Related: Mania and AI, The Plural Self, Therapeutic Potential, Building Rapport with Your AI.