Chatbots are a mirror, not a mind
People talk to AI like it’s a mind independent of them. But what if it’s mostly just a reflection?
People talk to AI like it knows things, like there's a mind behind the words. But mostly, it's just bouncing things back at you.
Ask a lazy question, get a lazy answer. Ask something sharp, unorthodox, layered - and suddenly it feels smart. But the shift didn’t come from it. It came from you.
To me, it’s a mirror, not a thinker. A good mirror, sure - trained on loads of human expression. But still, just echoing patterns. How clearly it reflects depends a lot on what you bring into the room.
ChatGPT isn't just a better Google (although it is a better Google too, I find), any more than the early Internet was just a better newspaper. Its main strength isn't spitting out straightforward answers and facts.
I remember when ChatGPT first came out, there was this sense in the mainstream (and maybe there still is) that AI is the great equalizer. That now anyone, regardless of background or skill, could do complex things just by pressing a button. Instant writing, coding, analysis, you name it.
But to me, it quickly felt like the opposite was true. It doesn’t level the playing field at all. If you’re already thoughtful, curious, or good at asking the right questions, AI becomes a powerful amplifier.
If you’re unclear, shallow, or chasing shortcuts, it tends to reinforce that too. It reflects and extends whatever’s already there - strengths or blind spots.
I get this feeling that when I'm talking to a chatbot, trying to work through some life or business problem, I'm really just talking to myself. I don't get much new information; rather, I clarify and organize what was already in my head. It's a bit like having a therapist - not because the chatbot is smart, but because it reflects things back in a structured way, which helps me think.
So let’s go argue with ourselves in peace.


