Beyond the screen: A Third Device, and a New State of Experience?
Reading the Vision of Jony Ive & Sam Altman from a Product and Experience Perspective

Since the news broke that Apple’s renowned former design chief Jony Ive and OpenAI co-founder and CEO Sam Altman are collaborating on a “third fundamental device,” a wave of excitement, tempered with skepticism, has swept the tech world. There is no physical product yet, but there are serious investments, quietly assembled multidisciplinary teams, and an ambitious vision: a screenless, silent, and context-aware interaction layer.
This vision deserves deeper attention — not only as a potential turning point in technology but also as a signal of a new paradigm in product and experience design.
Watching this video, the question that inspired this article comes into focus:
If this is the first major shift since the iPhone, what exactly is it built on — and what kind of interaction model does it propose?
A Brief History of “Fundamental” Devices: What Did They Change?
So far, we’ve welcomed two core devices that didn’t just bring new technologies into our lives — they rewired our behaviors:
Computer: Making, digitized. Writing, calculating, storing.
Smartphone: Staying connected while on the move. Capturing with a camera. A world in our pocket.
Now, Ive and Altman aim to add another layer to these two monumental shifts. But this time, there’s no interface. No screen. No app drawer. The proposed device might not even wait for our commands — it could proactively engage with us and our environment in a way that differs from anything we’re used to.
As Altman suggests, this new device is not built primarily to do things, but to make sense of its environment. That shifts how we think about both “product” and “user.”
A New Layer or a New Behavior?
If this device is to be revolutionary, it won’t be because of its physical form — but because of the interaction model it introduces. The device itself may be secondary; what matters is the behavioral shift it enables.
So we must ask:
“Was experience always tied to screens, taps, and swipes? Or can silent, seamless interactions also be the essence of experience?”
The third device reframes experience — not as something triggered by the user — but as something that simply happens around them. A UX layer that responds before it’s summoned. That interacts with context rather than demanding attention.
Call it “ambient computing,” “predictive UX,” or “context-aware systems” — the key change is this: the product isn’t visible, but felt. It blends into the background and responds instinctively.
Jony & Sam: Redefining Form
Jony Ive’s past design legacy redefined how we relate to physical objects. The minimalist glass-and-steel elegance of the iPhone was a deliberate shift to direct our focus from the tech to the experience.
Now, that minimalism is pushed even further: the form itself disappears.
This new device isn’t made to be seen — but to be sensed. A system that understands needs without being prompted, responds according to context, and behaves with near-empathy. That’s where OpenAI’s vision comes in: technology that is no longer passive, but intuitive.
The product won’t demand action from users — it will respond to presence. And design won’t be about visuals or buttons — but about time, attention, and emotion.
Redefining Experience
Perhaps the most radical idea here: the experience may be invisible, yet deeply affective and partly autonomous.
When UX was screen-bound, we could measure it, test it, optimize it. But in this new ambient, contextual world, experience is:
not an app flow,
not a user task,
not a visual UI,
but perhaps just a state.
A system where the product adapts to the user — not the other way around. Quiet, yet present. Without a clear form, but shaping behavior and attention subtly.
So being “invisible” doesn’t mean vanishing. It means becoming part of the background while remaining influential. Like light: we don’t see it, we see because of it.
A New PX-UX Era?
We don’t yet know what this third device will look like. It may arrive in a few years — or slip quietly into our lives.
But what we already know is this:
This idea makes us rethink user behavior — before even meeting the product.
If this “third fundamental device” becomes real, we must view it not just as technological innovation, but as an evolution of the product concept itself.
And perhaps this shift brings us full circle — back to the 1990s when experience design was defined by human-system interactions, not Figma outputs.
Screens have always been a part of experience — but never the whole of it. Yet we’ve treated them as such for a while now.
Maybe this new device reminds us:
UX is not a file.
It’s the art of placing design inside life — not just in the interface.
Originally published in product.blog on Medium.