For many people living with anxiety, low mood, or emotional exhaustion, the hardest moments can be the quiet ones. Those late nights when the noise in your head won’t switch off. The morning dread before your feet touch the floor. The heavy evenings when you don’t feel able to reach out to anyone.
It’s in moments like these that AI platforms like ChatGPT can feel like a lifeline. There’s no wait time, no explaining yourself, no fear of “burdening” someone. You can type freely, get an instant response, and sometimes that’s enough to soften the intensity - even if just for a while.
But what actually is AI?
AI (Artificial Intelligence) is technology designed to learn from patterns, process information, and generate responses in a way that feels conversational. ChatGPT is one of the most widely used AI "chatbots." You type in a question, thought, or problem, and it responds instantly, drawing on patterns from the vast amount of text it's been trained on.
It’s not “thinking” like a human. It’s finding patterns in words and predicting what’s most likely to come next in a conversation, based on everything it’s learned. That’s why it can feel fluent and human - but also why it can make mistakes or sound overly confident about something that isn’t accurate.
How this shows up in mental health conversations
When you use ChatGPT to talk about your thoughts or feelings, it can help you:
- Put words to things you’ve been struggling to express
- Organise and untangle thoughts when everything feels jumbled
- Suggest grounding techniques or practical ideas
- Encourage reflection in a private, low-pressure way
Those things can be genuinely helpful - especially if you’re in a moment where you need to feel less alone, or you’re not ready to talk to another person yet.
But it's important to understand that AI tends to mirror the perspective of the person talking to it. In other words, it takes what you say at face value. If you tell it something is true, it will often respond as though it is - even if it's not the full picture.
This can be comforting in the short term, but it’s not the same as working with a therapist who can gently challenge your assumptions, offer different perspectives, and help you explore what’s underneath. AI can’t sense your tone, notice hesitation, or recognise when something you’ve said might need deeper attention.
Why bias matters in emotional support
Imagine you tell an AI chatbot, "I'm a terrible friend."
It might try to comfort you by agreeing in a soft way: “It’s understandable to feel that way sometimes,” then offer tips for “being a better friend.”
What it can’t do is ask the questions a therapist might:
- “What’s made you feel this way?”
- “What evidence do you have for and against that belief?”
- “Could something else be going on here?”
It will lean toward the story you’re telling, which means it can unintentionally reinforce your own self-criticism or unhelpful thinking patterns.
If you’re using AI for mental health support, here are some gentle guidelines:
Do:
- Use it as a supplement, not a replacement, for human connection
- Treat any advice or suggestions as a starting point
- Follow up with a friend, family member, or therapist if something feels important
- Keep personal details private
Don’t:
- Rely on it in a crisis - always contact a helpline or emergency service
- Assume everything it says is correct or right for your situation
- Let it take the place of building real-world support
AI can be part of your mental health toolkit - especially for quick reflection, journalling prompts, or finding ideas when you feel stuck. But nothing replaces the safety, attunement, and steady presence of another human who can walk alongside you as you find your way back to feeling more grounded.
If you’d like to explore how therapy could support you, I offer a free 15-minute call so we can talk about what you’re looking for and how I may be able to help.
Click the link below to book your free call now.
https://calendly.com/tessa-gates/initial-15-min-chat