As semi-autonomous driving becomes the norm, one thing has clearly changed: the role of navigation systems.
They’ve become a kind of language—an interface through which humans talk to cars.
In the past, we used navigation simply to avoid getting lost. It was a tool for finding the shortest route—purely for efficiency.
But now, it’s different. Navigation is how we communicate a destination to the car.
Even when I’m going somewhere familiar, I always input the destination. I know the way, but I still feel the need to tell the car. If I don’t, I can’t predict how it will act.
In many cases, the destination is already synced from my calendar.
That’s why I’ve started to think about how I enter appointments in the first place.
How far is it?
Is the departure time realistic?
What information does the car need to understand my intent?
Even scheduling has become part of a broader conversation with the car.
Turn signals are the same.
They’re not just for the car behind you.
They’re also how you tell the vehicle, “I want to change lanes now,” or “I’m about to turn.”
Bit by bit, people are developing an intuitive sense of what it means to signal to the machine.
These actions—destination input, calendar syncing, signaling—will eventually become training data.
They’ll enable more natural, more efficient communication between humans and vehicles.
As the car becomes more autonomous, the human role is shifting—from driver to conversational partner.
