Part Two: The Conversation That Registered
– Reflections on Artificial Intelligence – A Guide for Thinking Humans
This is the second reflection inspired by Melanie Mitchell’s Artificial Intelligence: A Guide for Thinking Humans.
While reading her chapters on natural language processing, I found myself returning to a moment with my daughter.
My daughter is six. And she talks. With her whole small, serious body, eyebrows raised, hands gesturing. Questions, observations, plans. She doesn’t always have the right words, but she always seems to know what she is talking about.
And somehow, despite the limits of grammar, vocabulary, and experience, she converses. Truly converses.
We talk about a birthday party that happened last month. The purple balloon that she loved and that burst early. The boy who cried when she took his chocolate bar. The cake that had Elsa on it. A single phrase like “remember the magic show?”, and suddenly the whole memory unfurls. Not just facts, but atmosphere. The moment returns like mist rising off a familiar field.
There is no need to explain. We were both there. We registered it.
Even a year from now, a glance or a whisper of the old joke will bring it back. That is what conversation can be. Not a performance of intelligence, but a soft, shared architecture of experience. Something that builds continuity.
This is what remains uncanny about language models. A conversation with ChatGPT can dazzle. It can respond, redirect, even flirt. Sometimes it feels like banter, sometimes like balm. But step away and return a week later. It won’t remember a thing.

Not really. And so beneath every prompt, every clever reply, hums the quiet question: did it register? It feels like a conversation, but is it?
We often confuse fluency for memory. And yet, as LLMs grow more powerful and gain access to more memory, they can hold a thread. If you are experiencing this for the first time, you might feel that an LLM really can carry a conversation, on the same thread, even across days.
When I wrote about Janelle Shane’s book You Look Like a Thing and I Love You, I also started reading her blog, AI Weirdness, just to see how her thinking progressed from the time of the book’s publication to now. When you look at her experiments with GPT-2 and then GPT-3 (all four models: Davinci, Curie, Babbage, and Ada, from Ada, the smallest, to Davinci, the largest), you realize how much progress has been made in the last four years or so. GPT-5 has a much larger memory and is far better at holding an intelligent conversation.
However, to remember is not merely to store. It is to care. To allow the past to contour the future.
My daughter remembers because it mattered. Because her attention was fully present. Because she expects the world to hold together, day after day, story after story.
A conversation, for her, is not a string of clever sentences. It is a thread she is learning to weave into the fabric of life. And maybe that’s what common sense really is. Not a vast database of possibilities. But the ability to carry a world forward.
Mitchell argues that for machines, this is still a frontier. The ability to ground language in lived context, to build upon it, to refer back not just semantically, but emotionally.
Language models live in the eternal now. They respond, simulate, predict. But they do not carry. They do not accumulate. Until they do, we will speak with them, not through them.
A child, with six years of grammar and a body full of stories, can build a future with you.
A machine, no matter how eloquent, forgets the day you laughed over the cake.
Perhaps this is the measure:
Intelligence isn’t just responsiveness. It is resonance.
It is knowing that what we say now will echo later.
That it registered.