
Branding for Non-Human Audiences in the AIoT Era

Around 2024, Tesla began phasing out its T logo. Part of this may have been to emphasize the text logo for brand recognition, but recently it seems even that text is disappearing. It feels like the company is moving toward the next stage of its brand design.

Ultimately, the text will vanish, and the shape alone will be enough for people to recognize it. In consumer products, this is the highest-level approach—an ultimate form of branding that only a few winners can achieve.

I’m reminded of a story from the Macintosh era, when Steve Jobs reportedly instructed Apple to reduce the use of the apple logo everywhere. As a result, today anyone can recognize a MacBook or iPhone from its silhouette alone. The form itself has become the brand, to the point where imitators copy it.

A brand, at its core, is a mark meant to differentiate; the word originally referred to a mark literally burned into goods or livestock. It’s about being efficiently recognized by people, conveying “this is it” without conscious thought. One effective way is to tap into instincts humans have developed through coexistence with nature, subtly hacking the brain’s recognition process. Even Apple and Tesla, which have built inorganic brand images, have incorporated such subconscious triggers into product design and interface development, shaping the value they hold today.

But will this still be effective going forward?

The number of humans is tiny compared to the number of AI and IoT devices. For now, because humans are the ones paying, the market focuses on maximizing value for them. That will remain true to some extent. But perhaps there is a kind of branding that will become more important than human recognition.

Seen in this light, Apple, Tesla, and other Big Tech companies already seem to hold tickets to the next stage. By adopting new communication standards like UWB chips, or shaping products to optimize for optical recognition, they are working to be more efficiently recognized by non-human entities. Even something like Google’s SEO meta tags or Amazon’s shipping boxes fits into this picture.

In the past, unique identification and authentication through internet protocols were either impossible, expensive, or bound to centralized authority. But advances in semiconductors, sensor technology, and cryptography—along with better energy efficiency—are changing that. The physical infrastructure for mesh networks is also in place, and branding is on the verge of entering its next phase.
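To make that concrete: with cheap on-board cryptography, a device can prove “this is me” to any other machine using nothing more than a key pair, no central registry required. Here is a minimal sketch in Swift using CryptoKit; the type and method names are my own invention, and a real deployment would add key attestation and revocation:

```swift
import Foundation
import CryptoKit

// Hypothetical "machine-readable brand": the device identity is a key pair.
// The public key acts as the mark; signatures prove the mark is genuine.
struct DeviceIdentity {
    private let signingKey = Curve25519.Signing.PrivateKey()

    // The stable, shareable identifier other machines recognize.
    var brandMark: Data { signingKey.publicKey.rawRepresentation }

    // Sign a peer's challenge to prove ownership of the mark.
    func respond(toChallenge challenge: Data) throws -> Data {
        try signingKey.signature(for: challenge)
    }
}

// Any peer can verify the mark without a central authority.
func verify(mark: Data, challenge: Data, signature: Data) -> Bool {
    guard let key = try? Curve25519.Signing.PublicKey(rawRepresentation: mark) else {
        return false
    }
    return key.isValidSignature(signature, for: challenge)
}

let device = DeviceIdentity()
let challenge = Data("nonce-1234".utf8)
let signature = try! device.respond(toChallenge: challenge)
print(verify(mark: device.brandMark, challenge: challenge, signature: signature)) // true
```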

The essence of branding is differentiation and the creation of added value. The aim is to efficiently imprint recognition in the human brain, often by leveraging universal contexts and metaphors, or by overwriting existing ones through repeated exposure. I’m not a marketing expert, but that’s how I currently understand it.

And if that’s correct, the question becomes: must the target still be humans?
Will humans continue to be the primary decision-makers?
Does it even make sense to compete for differentiation in such a small market?

At this moment, branding to humans still has meaning. But moving beyond that, as Apple products adopt a uniform design and Tesla moves toward minimalistic, abstract forms, branding may evolve toward maximizing value by being efficiently recognized within limited computational resources. Uniformity could make device recognition more efficient and reduce cognitive load for humans as well.

We should design future branding without being bound by the assumption that humans will always be the ones making the decisions.


The AI That Refused the Cloud

Why didn’t Apple build a cloud-based AI?

Why didn’t they jump on the generative AI boom?
Why haven’t they released their own large language model?
Why did they bring us not “AI,” but “Apple Intelligence”?

The answer, I think, isn’t so much about strategy as it is about limitation.
It’s not that Apple chose not to use the cloud. They couldn’t.

Of course, there’s iCloud—and Apple owns infrastructure on a scale most companies could only dream of.
But unlike Google or Meta, Apple never built a business around collecting behavioral logs and text data through search, ads, or social media.
They never spent decades assembling a massive cloud platform and the dataset to match.

And with a user base of Apple’s scale, building and maintaining a unified cloud—compliant with each country’s laws and privacy standards—isn’t just difficult. It’s structurally impossible.

So Apple arrived at a different conclusion: if the cloud was out of reach, they would design an AI that completes everything locally.

An AI that lives inside your iPhone

Apple engineered the iPhone to run machine learning natively.
Its Apple Silicon chips use a custom architecture, with Neural Engines that process image recognition, speech interpretation, and even emotion detection—all on the device.
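To give a feel for what “all on the device” means from the developer’s side, here is a minimal sketch using Apple’s Vision framework: its built-in classifier runs locally, and the system decides whether the work lands on the CPU, GPU, or Neural Engine. The image path is a placeholder.

```swift
import Foundation
import Vision

// Classify an image entirely on-device with Vision's built-in model.
// "photo.jpg" is a placeholder path.
let handler = VNImageRequestHandler(url: URL(fileURLWithPath: "photo.jpg"))
let request = VNClassifyImageRequest()

do {
    try handler.perform([request])
    // Print the top labels; nothing here ever touches the network.
    for observation in (request.results ?? []).prefix(3) {
        print(observation.identifier, observation.confidence)
    }
} catch {
    print("Classification failed:", error)
}
```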

This started as a privacy measure.
Photos, voice data, steps, biometrics, location—all processed without ever leaving your phone.

At the same time, it addressed battery constraints.
Apple had long invested in the hardware to support this: larger bodies with room for bigger batteries, OLED displays, and UMA (Unified Memory Architecture) on MacBooks.
All of this was about sustaining AI performance without draining power or relying on constant connectivity.

It was an enormous challenge.
Apple designed its own chips, its own OS, its middleware, its frameworks, and fused it all with on-device machine learning.
They bet on ARM and fine-tuned the balance of power and performance to a degree most companies wouldn’t even attempt.

Vision Pro’s sensors are learning emotion

Vision Pro carries cameras, LiDAR, infrared sensors, eye tracking, facial-muscle sensing, and spatial microphones, all designed to read what’s inside us, not just outside.

These sensors don’t just “see” or “hear.”
They track where you’re looking, measure your pupils, detect shifts in breathing, and register subtle changes in muscle tension.
From that, the system may infer interest, attraction, anxiety, or hesitation.
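Vision Pro keeps most of these raw signals away from third-party apps, but ARKit’s face tracking on iPhone exposes similar raw material: dozens of blend-shape coefficients, each a value from 0 to 1. The heuristic below is purely my own toy illustration of that inference step, not anything Apple ships:

```swift
import ARKit

// Toy heuristic (entirely hypothetical): collapse a few ARKit blend-shape
// coefficients into a crude "tension" score. Real affect inference would
// need a trained model, not a hand-picked average.
func tensionScore(from anchor: ARFaceAnchor) -> Float {
    let shapes = anchor.blendShapes
    let cues: [ARFaceAnchor.BlendShapeLocation] = [
        .browDownLeft, .browDownRight,    // furrowed brow
        .eyeSquintLeft, .eyeSquintRight,  // narrowed eyes
        .mouthPressLeft, .mouthPressRight // pressed lips
    ]
    let values = cues.compactMap { shapes[$0]?.floatValue }
    guard !values.isEmpty else { return 0 }
    return values.reduce(0, +) / Float(values.count)
}
```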

And that data? It stays local.
It’s not uploaded. It’s for your personal AI alone.

Vitals + Journal = Memory-based AI

Vision Pro records eye movement and facial expressions.
Apple Watch logs heart rate, body temperature, and sleep.
iPhone tracks text input and captured images.
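As a small sketch of what reading one of those streams looks like, here is heart rate pulled from the on-device Health store. It assumes the HealthKit entitlement and user authorization; nothing in it touches the network.

```swift
import HealthKit

// Read the most recent heart-rate samples from the on-device Health store.
// Assumes the app has the HealthKit entitlement and the user granted access.
let store = HKHealthStore()
let heartRate = HKQuantityType(.heartRate)

store.requestAuthorization(toShare: [], read: [heartRate]) { granted, _ in
    guard granted else { return }
    let sort = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
    let query = HKSampleQuery(sampleType: heartRate, predicate: nil,
                              limit: 10, sortDescriptors: [sort]) { _, samples, _ in
        let bpm = HKUnit.count().unitDivided(by: .minute())
        for case let sample as HKQuantitySample in samples ?? [] {
            // All of this stays local unless the app ships it somewhere.
            print(sample.startDate, sample.quantity.doubleValue(for: bpm), "bpm")
        }
    }
    store.execute(query)
}
```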

And now, Apple is integrating all of this into the Journal app—day by day.
It’s a counter to platforms like X or Meta, and a response to the toxicity and addiction cycles of open social networks.

What you did, where you went, how you felt.
All of this is turned into language.
A “memory-based AI” begins to take shape.
And all of it stays on-device.

Not gathered into a centralized cloud, but grown inside you.
Your own AI.

Refusing the cloud gave AI a personality

Google’s AI is the same for everyone—for now.
ChatGPT, Claude, Gemini—all designed as public intelligences.

Apple’s AI is different.
It wants to grow into a mind that exists only inside you.

Apple’s approach may have started not with cloud rejection, but cloud resignation.
But from that constraint, something entirely new emerged.

An AI with memory.
An AI with personality.
An AI that has only ever known you.

That’s not something the cloud can produce.
An AI that refuses the cloud becomes something with a self.


When AI Exists at the Edge

It’s becoming standard for people to carry around their own personalized AI.

This trend has been building for a while, but since the arrival of Apple Intelligence, it’s something we now feel on a daily basis. Compared to cloud-based AI, there are still challenges. But that’s not the point.

For example, if the vast amount of personal data stored on iOS devices can be leveraged properly, it could unlock entirely new use cases—and boost accuracy through completely different mechanisms.

Eventually, AI will run fully offline, even in places without an internet connection. That in itself marks a major social shift.

Smartphones will no longer be just personal computers. They’ll become AI servers, memory devices, and decision-making systems. Once local AI can carry out self-contained dialogue and processing without relying on connectivity, privacy will be redefined—and sovereign computing will become real.
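A deliberately hypothetical sketch of that shape, with every name below invented; a real version would sit on top of an on-device model runtime such as Core ML. The point is structural: memory, inference, and decisions all stay on the device.

```swift
import Foundation

// All types here are invented to show the shape of a fully local assistant.
protocol LocalModel {
    func complete(prompt: String) -> String // runs entirely on-device
}

struct EdgeAssistant {
    let model: LocalModel
    private(set) var memory: [String] = [] // your history, stored locally

    mutating func remember(_ event: String) {
        memory.append(event)
    }

    // Answers come from local memory plus the local model;
    // there is no network call anywhere in this flow.
    func answer(_ question: String) -> String {
        let context = memory.suffix(20).joined(separator: "\n")
        return model.complete(prompt: context + "\n\nQ: " + question)
    }
}
```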

It’s like an extended brain. One that learns about you, runs inside you, and optimizes itself for your needs. You’ll decide what to retain, what to allow it to learn from, and what to keep private. All those decisions will happen entirely within you.
It’s a world fundamentally different from today’s cloud-first AI paradigm.

There was a time when servers lived inside companies.
Then came the age when servers moved to the cloud.
And now, we’re entering an era where servers are distributed inside individuals.

This shift isn’t going to stop.


Notes.app Markdown flavored

I made it possible to write in Markdown in the macOS Notes.app—well, just the bare minimum, but it’s working the way I wanted.

It’s based on my existing Hammerspoon setup, which I’ve been using mainly for Emacs-style keybindings across the system. I modified it so that when I write Markdown-style text, it instantly turns into rich text inside Notes. You type > quote or # Heading, and it becomes actual formatting on the fly.
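My setup does this in Hammerspoon (Lua), but the core trick is easy to sketch in Swift; the helper below is an invented illustration, not the published code. Map a Markdown prefix to text attributes, then hand the result to Notes as RTF through the pasteboard:

```swift
import AppKit

// Invented helper: turn one Markdown-style line into styled rich text.
func richText(fromMarkdownLine line: String) -> NSAttributedString {
    if line.hasPrefix("# ") {
        return NSAttributedString(string: String(line.dropFirst(2)),
                                  attributes: [.font: NSFont.boldSystemFont(ofSize: 24)])
    }
    if line.hasPrefix("> ") {
        return NSAttributedString(string: String(line.dropFirst(2)),
                                  attributes: [.font: NSFont.systemFont(ofSize: 13),
                                               .foregroundColor: NSColor.secondaryLabelColor])
    }
    return NSAttributedString(string: line,
                              attributes: [.font: NSFont.systemFont(ofSize: 13)])
}

// Putting RTF on the pasteboard makes Notes paste it as formatting, not plain text.
let styled = richText(fromMarkdownLine: "# Heading")
let range = NSRange(location: 0, length: styled.length)
if let rtf = try? styled.data(from: range,
                              documentAttributes: [.documentType: NSAttributedString.DocumentType.rtf]) {
    NSPasteboard.general.clearContents()
    NSPasteboard.general.setData(rtf, forType: .rtf)
}
```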

This is useful if you’re the kind of person who prefers writing in Markdown (like in Slack or Asana) but still wants the end result to be nicely formatted and readable.

I’ve published the code on GitHub.


What's in my bag

How I live in the decentralized world

As a permanent Airbnb resident and lifelong traveler, I have to carry a lot of stuff whenever I’m away from my office in Tokyo. But I don’t want to work with ordinary tools. The only things I need are the best tools, the ones that blow my mind.


iStick – a USB stick with a Lightning connector

I don’t agree with their concerns about cloud storage, though.


What can't beacons do?


Apple To Acquire Beats For $3.2 Billion

Finally. Over $3 BILLION!


How Will NFC Devices Change the Ads Around Us?

Here’s an obvious one: Foursquare and its clones, including the Japanese ones, must be happy about Facebook’s decision on its location service.

Even so, they need solid strategies to beat their rivals. There are a lot of clones here in Japan now. A big difference between Foursquare and the Japanese clones is the mobile client: most of the Japanese clones target conventional Japanese feature phones rather than iOS/Android devices, which means they’re rooted in Japanese mobile phone culture. That difference is driving some real innovation right now, as they borrow ideas from those feature phones.


iPhone Dock For A Cool Office

iRetrofone

This is what I really want. The modern model looks nice, but I prefer the retro model.

Desk Phone Dock
