Categories
Asides

Why Tesla Won’t Support CarPlay

Many dismiss Tesla’s long refusal to support Apple CarPlay as mere stubbornness on the part of Elon Musk. But behind this decision lies an awareness that the revenue structure of the auto industry itself is shifting. Tesla’s answer to the question of what comes after building and delivering a good product was, in part, to reject CarPlay.

Sony’s image sensors are inside every iPhone shipped worldwide. That fact alone is proof of Sony’s engineering excellence and quality. Yet what users ultimately touch is the iPhone as a product, iOS as software, iCloud as a cloud service, and the experience Apple designed around all of it. No matter how exceptional a component supplier may be, only the company that controls the final layer of experience can build a lasting relationship with the customer. This pattern has repeated across every industry.

What is CarPlay, exactly? From the user’s perspective, it is simply a convenient way to bring the familiar iPhone experience into the car. For anyone who has suffered through an outdated car navigation interface, it feels like a rescue. But from an automaker’s standpoint, adopting CarPlay means handing control of the in-car experience to a smartphone maker. In the short term, it helps with customer satisfaction. A familiar interface is a selling point, and CarPlay compatibility alone can tip a purchase decision. Over the long term, however, every point of contact with the customer becomes Apple’s. Music plays through Apple Music, navigation runs on Apple Maps, notifications and calls come through the iPhone. The car becomes a rolling shell, and the experience is absorbed into Apple’s layer.

Tesla rejected this structure from the start. It built its own infotainment system and kept navigation, music, and every interface under its own roof. Even listening to Apple Music requires a Tesla Premium Connectivity subscription. The company chose to sacrifice user convenience rather than surrender any part of the experience layer.

What Tesla was looking at, beyond this decision, was a model of ongoing software revenue. Its FSD (Full Self-Driving) subscription shifted to a monthly-only plan in February 2026, with over a million users now enrolled. Rather than selling a car and moving on, Tesla collects driving data, trains its AI, improves its software, and sells those improvements back as a subscription. More drivers mean more data. More data means better autonomous driving. Better driving means a more valuable subscription. To sustain this cycle, Tesla needed to own both the in-car experience and the data it generates.

This dynamic extends well beyond a single feature called CarPlay. In an era when cars are becoming rolling computers, the question of whether the company that builds the hardware or the one that designs the software gets to own the customer relationship is fundamental to the structure of the industry. It mirrors what has happened to Japanese manufacturers, who built excellent products and shipped them around the world, only to find the experience layer captured by someone else.

In the future that lies beyond CarPlay’s continued evolution, automakers risk becoming Apple’s equivalent of sensor manufacturers. Their engineering and hardware quality may be recognized, but the interface customers touch every day will be one Apple designed. Tesla’s refusal was, at its core, a refusal to become a parts maker.

Whether Tesla’s bet was the right one remains unclear. Holding onto the experience layer comes at the cost of user frustration. Calls for CarPlay support never went away. Whether sheer will can hold out against market demand is a separate question entirely.

In the end, there is no clean answer. You can perfect the components and let someone else own the experience. Or you can design the entire experience yourself and absorb the friction with your customers. Both paths have costs. The one thing that seems certain is that whoever controls the experience layer gets to define the revenue structure and the shape of the customer relationship that follows. Whether or not to support CarPlay is not a technology question. It is an answer to what you believe you are selling.


Branding for Non-Human Audiences in the AIoT Era

Around 2024, Tesla began phasing out its T logo. Part of this may have been to emphasize the text logo for brand recognition, but recently it seems even that text is disappearing. It feels like the company is moving toward the next stage of its brand design.

Ultimately, the text will vanish, and the shape alone will be enough for people to recognize it. In consumer products, this is the highest level of branding, an ultimate form that only a few winners ever achieve.

I’m reminded of a story from the Macintosh era, when Steve Jobs reportedly instructed Apple to cut back on plastering the Apple logo everywhere. As a result, today anyone can recognize a MacBook or iPhone from its silhouette alone. The form itself has become the brand, to the point where imitators copy it.

A brand, at its core, is a mark meant to differentiate; the word originally referred to a literal burn mark on livestock. It’s about being efficiently recognized by people, conveying “this is it” without conscious thought. One effective way is to tap into instincts humans have developed through coexistence with nature, subtly hacking the brain’s recognition process. Even Apple and Tesla, which have built inorganic brand images, have incorporated such subconscious triggers into product design and interface development, shaping the value they hold today.

But will this still be effective going forward?

The number of humans is tiny compared to the number of AI and IoT devices. For now, because humans are the ones paying, the market focuses on maximizing value for them. That will remain true to some extent. But perhaps there is a kind of branding that will become more important than human recognition.

Seen in this light, Apple, Tesla, and other Big Tech companies already seem to hold tickets to the next stage. By adopting new communication standards like UWB chips, or shaping products to optimize for optical recognition, they are working to be more efficiently recognized by non-human entities. Even something like Google’s SEO meta tags or Amazon’s shipping boxes fits into this picture.
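This kind of machine-facing branding is not hypothetical; it already exists on the web. As a minimal sketch, here is the schema.org "Organization" markup that sites embed as JSON-LD so crawlers and other non-human readers can recognize a brand without ever parsing a logo. The company name and URLs below are made-up placeholders.

```python
import json

# A hypothetical brand described in schema.org's Organization vocabulary.
# This is identity data aimed at machines, not at human eyes.
brand = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Motors",                 # placeholder, not a real company
    "url": "https://example.com",
    "logo": "https://example.com/logo.png",   # the logo becomes just one field
}

# On a real page this string would sit inside a
# <script type="application/ld+json"> tag for crawlers to read.
snippet = json.dumps(brand, indent=2)
print(snippet)
```

Note how the visual logo is reduced to one field among several: for a machine audience, the durable identifier is the structured record itself, not the mark humans recognize.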

In the past, unique identification and authentication through internet protocols were either impossible, expensive, or bound to centralized authority. But advances in semiconductors, sensor technology, and cryptography—along with better energy efficiency—are changing that. The physical infrastructure for mesh networks is also in place, and branding is on the verge of entering its next phase.

The essence of branding is differentiation and the creation of added value. The aim is to efficiently imprint recognition in the human brain, often by leveraging universal contexts and metaphors, or by overwriting existing ones through repeated exposure. I’m not a marketing expert, but that’s how I currently understand it.

And if that’s correct, the question becomes: must the target still be humans?
Will humans continue to be the primary decision-makers?
Does it even make sense to compete for differentiation in such a small market?

At this moment, branding to humans still has meaning. But moving beyond that, as Apple products adopt a uniform design and Tesla moves toward minimalistic, abstract forms, branding may evolve toward maximizing value by being efficiently recognized within limited computational resources. Uniformity could make device recognition more efficient and reduce cognitive load for humans as well.

We should design future branding without being bound by the assumption that humans will always be the ones making the decisions.


Navigation Systems Are for Talking to Cars

As semi-autonomous driving becomes the norm, one thing has clearly changed: the role of navigation systems.
They’ve become a kind of language—an interface through which humans talk to cars.

In the past, we used navigation simply to avoid getting lost. It was a tool for finding the shortest route—purely for efficiency.
But now, it’s different. Navigation is how we communicate a destination to the car.

Even when I’m going somewhere familiar, I always input the destination. I know the way.
But I still feel the need to tell the car. If I don’t, I don’t know how it will act.

In many cases, the destination is already synced from my calendar.
That’s why I’ve started to think about how I enter appointments in the first place.
How far is it?
Is the departure time realistic?
What information does the car need to understand my intent?
Even scheduling has become part of a broader conversation with the car.

Turn signals are the same.
They’re not just for the car behind you.
They’re also how you tell the vehicle, “I want to change lanes now,” or “I’m about to turn.”
Bit by bit, people are developing an intuitive sense of what it means to signal to the machine.

These actions—destination input, calendar syncing, signaling—will eventually become training data.
They’ll enable more natural, more efficient communication between humans and vehicles.
As the car becomes more autonomous, the human role is shifting—from driver to conversational partner.


Tesla Optimus Will Become Infrastructure

The age of AI has already begun.

With ChatGPT, we can now generate text, images, voice, even video. It’s not “coming soon” — it’s already here.

But changing the physical world takes one more step: integration with IoT.
AI can process data, but it can’t touch the real world. That’s where robots come in — they allow AI to physically interact with reality. Optimus is a symbol of that.

Tesla Optimus is a device meant to carry us into the age of automation, without rewriting our entire society.
From AI’s point of view, it’s the interface to the real world.
No need to reinvent roads, elevators, or doors. Optimus and the other robots Big Tech is building are designed to move through the world as it is. They’re general-purpose labor bodies, built to help AI function inside existing human infrastructure.

What we’re seeing now is, I think, a robot-driven plan to bring AIoT to the world.
Everything will be connected, automated, decision-capable, and able to act.
And the reason robots need to be humanoid is finally becoming clear: they’re designed to fit into our world, not the other way around.

Automation will move faster than we expect.
Car companies might end up as manufacturers of “just empty boxes” — simple transport units. These boxes don’t need intelligence. In fact, automation works better when things follow spec, stay predictable, and don’t think too much.

In Japan’s case, I wouldn’t be surprised if the government eventually distributes robots like Tesla Optimus.
You give up your driver’s license, and in return, get a subsidy for a household robot. That kind of world might not be a joke — it might be real, and sooner than we think.

But the tech and quality needed to make those robots — that’s where Japan comes in.

Humanoid robots are hard to build. They can’t afford to break down. Batteries, motors, sensors, thermal systems, materials — all of it needs to be precise and reliable.
That’s exactly what Japan has spent decades getting good at.

Manufacturing and quality control — those might be Japan’s last strongholds.
And they’re exactly what the world is looking for right now.