Standing the Noise (We are not calm)
Why AI will not fade into the background, and does not want to be calm.
Framing Calm Tech
Design has always meant sharing: an infrastructure of gatherings, projects, and stages where ideas collide. Today that space is INNOVIT in San Francisco. For more than a decade it was the Frontiers of Interaction Conference, born in Italy, exported abroad, and always ahead of its time.
In the 2010s, at Frontiers, Amber Case made calm technology visible: the best tech isn’t what shines, but what disappears. Not gadgets demanding attention, but infrastructures that simply work. Back then, her talk sounded alien.
My first Substack post was about Altman and Ive and their bro-mantic AI device, unveiled in a video filmed a few steps from my office in San Francisco. Soon after, Amber Case demolished it as a poverty of imagination:
A digital amulet that asks to be seen and touched, divinatory before it is justified. Her critique is sharp: devices that promise magic deliver chores.
I see her point.
Yet it’s not the full diagnosis. Here is what is missing.
The Hard Infrastructure: Social & Cultural Acceptability
The technologies that win are not the best, but the most adopted.
And the gatekeeper to adoption is social acceptability: the invisible layer that decides whether a technology actually enters daily life. It’s what turns a behavior from “strange” into “normal”: the man talking to himself on the street, once dismissed as unhinged, later recognized as a Bluetooth headset user. It’s social acceptability that decides whether we tolerate the person on speakerphone in public, or the friend who, mid-conversation over a beer, suddenly flashes their smart glasses and snaps a photo. And even when no physical device is recording, a silent AI will be: running in the background on every video call, whether you consent or not, ready to hand you a task list at the end, whether you wanted one or not.
Left image: glasses and wearable devices that must win social and cultural approval before adoption.
Right image: Stewart Brand, The Clock of the Long Now.
If the next generation of AI stays in the shadows, we will have wasted the opportunity.
Technology adoption always passes through the same stages: first cultural friction, then normalization, finally habit—and sometimes even invisibility.
Social acceptability is not etiquette; it’s the hardest infrastructure of innovation.
And it doesn’t just shape culture—it shapes markets, funding, and who scales or disappears.
If Amber Case gave us the calm lens, David Orban gave us the paradox: the fracture between what AI promises the individual and what it produces at the societal level. The asymmetry is impossible to ignore: each of us embraces AI as a tool of efficiency and personal power, while collectively the perception of risk, of loss of control, keeps growing. This is the gray zone where adoption slows down or even stalls:
we are happy to chat with an assistant that makes our day easier, yet uneasy about the same technology reshaping work, institutions, and culture. That’s the real threshold to cross: social acceptability isn’t measured in individual use, but in a technology’s ability to hold as a collective pact.
Picasso, in a 1964 interview with The Paris Review, said: “Computers are useless. They can only give you answers.” Sixty years later, the line resonates louder: creativity is not in the output, it’s in the question. If an AI learns to ask, it becomes truly useful. But asking exposes it. It can’t be calm anymore.
Demerzel isn’t just a servant (Foundation, Asimov)
In the latest season of Foundation on Apple TV+, Demerzel isn’t only the eternal servant of the Genetic Dynasty; she’s worshipped as a god. Robots are never neutral.
Cinematic robots like R2-D2 or TARS make this crystal clear. We celebrate them as companions, guides, even heroes, yet at the same time they mirror our deepest anxieties about autonomy and control. They’re never invisible.
Which is why the real choice isn’t between calm and noise. It’s between stupid proactivity—endless pushes, anxious gadgets, “AI” disguised as knick-knacks—and designed proactivity, capable of stepping in at the right time, in the right tone.
This is not a technology issue: it’s philosophy applied to interaction design.
So how do we tame this inevitable invasiveness?
We don’t know how it will end. But we know how it begins: with narrative.
Not the one that comes later, polished in a glossy video, but the story that precedes the product. Storytelling as the first test of meaning: if it doesn’t hold as a story, it won’t hold as technology. Without narrative, proactivity is just noise. With narrative, it becomes a cultural experiment in acceptability. Sometimes intolerable. Sometimes even desirable.
Of course, there’s the risk that the story becomes more real than the product itself. We’ve seen it: entire markets driven by pitch decks, products existing only as teaser videos at the Francis Ford Coppola Café. Not industrial roadmaps anymore—just bro-mantic series, released in episodes.
We’ve seen it before: narrative doesn’t just follow technology, it drives it.
Conclusions
With AI, we don’t have a real manual to follow.
The only call to action: keep your mind open.
Whatever comes next is not “linear evolution.” It is alien—outside any system we know. It is contact with the unknown.
And let’s drop the illusion that it will ever settle into calm.
As Kevin Kelly wrote in What Technology Wants, technology wants what life wants: to grow, to mutate, to expand. Not to calm down.
We can’t domesticate an alien intelligence without transforming our own.
The real test won’t be whether AI calms down. It’s whether we can stand the noise: a presence, constant and inescapable.
Not just actionable, but proactive.
Already here. Soon… More human than human.
PS. Last night I drove back to Palo Alto from a designers’ dinner in San Francisco. Friends from Meta, Google, Samsung, LoFi, Zoox. My Model Y was on Full Self-Driving. The moment I left downtown, I saw almost nothing but Waymo cars. For a few miles, the streets belonged entirely to autonomy. In San Francisco, the world capital of AI, the noise is already deafening.