I see people everywhere anxious about whether AI will disrupt their jobs, their industries, their lives. I’ve always approached this with calm. Not indifference—calm.

The future rarely sends advance notice, but it is always arriving. This isn’t news. It’s the human condition.

A few years ago, I attended a keynote by Michio Kaku where he framed—perfectly, for me—the relationship between humanity and technological change. What follows is my version. I can’t claim novelty, and I’m not a domain expert in sociology or economics. I’m an infrastructure builder observing the same pattern from the inside.

And there’s one key thing I want to add to the usual “adoption curve” conversation:

Technology doesn’t win because it’s true. It wins when constraints fall. Cost, friction, distribution, and capacity decide what becomes real.

With that in mind, here are the three phases.

Phase 1 — “It won’t work”

Humanity evolved by doubting miraculous promises.

How many caravels departed Europe seeking the New World and never returned? We don’t descend from the adventurers who chased horizons into oblivion. We descend from those who stayed on land. Who watched the ships disappear. Who procreated. Who survived.

When I hear that someone was “visionary” because he mortgaged his house and went all-in on Bitcoin a decade ago, I don’t see a strategist. I see someone who took asymmetric risks—and got rewarded by a tail event. Yes, courage mattered. But luck was the dominant variable. Survivorship bias makes gamblers look like prophets.

Evolution is slow. Our skepticism of the new isn’t ignorance. It’s inherited wisdom.

“It won’t work” isn’t resistance. It’s a survival heuristic.

Phase 2 — “It might be cool, but not for me”

This is the polite phase. The phase where people stop arguing and start rationalizing distance.

I caught this one firsthand: the artifact was a BlackBerry.

I remember colleagues asking, “Are you walking around with a portable TV?” and “You get work emails 24/7? That’s insane.” In those days, you still had to convince people that being reachable could be a form of reliability, not a form of slavery.

Availability monitoring back then meant email alerts and hope. No Slack. No Telegram. No WhatsApp. No push-notification ecosystems. Just email—and praying you’d see it in time.

In 2010, I walked into the Apple Store on Stockton Street in San Francisco. A TV crew was making a huge fuss about what they claimed was the oldest person in the world using an iPad. It was charming. But what struck me was this: an elderly person adopting cutting-edge consumer technology was newsworthy. Remarkable enough to film.

Today, the first thing you do when setting up a phone is attach an email account. The thing that once felt radical—24/7 connectivity—became so normalized it’s now a setup requirement.

Phase 2 is the transition from rejection to selective adoption. “Not for me” becomes “maybe for some people,” then quietly becomes “for everyone.”

Phase 3 — “This is so cool. I knew it. In fact, I helped make it happen.”

Phase 3 is where ego gets rewritten by participation.

Remember coffee? Children despise it. Then one day you wake up and realize you can’t function without it. There wasn’t a conversion moment. It just happened.

This is the phase where FOMO erodes identity. The skepticism of Phase 1 and the detachment of Phase 2 dissolve into a sudden need to participate. Not because the person has new evidence—but because the tribe has moved.

We are wired for tribal survival. On the savanna, if someone spotted a predator and bolted, the ones most likely to survive were the ones who started running without asking questions. No time for due diligence when the lion is already moving.

When everyone is into something, you need a stubborn personality to resist the flow.

And once the flow wins, history gets rewritten. Nobody admits they were a skeptic. Everyone “always knew this would be big.”

The hidden variable: constraints collapsing

Now for the URE part.

Adoption curves look like psychology, but they’re often physics.

A technology can exist for decades and still not matter, not because people are blind, but because the enabling constraints haven’t fallen:

  • cost is too high
  • compute is too scarce
  • tooling is too immature
  • distribution is too narrow
  • integration friction is too painful
  • reliability is too low

When those constraints collapse, Phase 3 arrives like a wave and people mistake it for “inevitability.”

Penicillin wasn’t magic. Scale was.

Penicillin is a clean story about this.

The molecule existed. The idea existed. The promise existed.

But getting a discovery into society isn’t about the eureka moment. It’s about industrialization: supply chains, production, distribution, quality control, logistics, and—often—some forcing function that turns “nice” into “necessary.”

War did that. It created urgency. It made scale non-optional.

The point isn’t “war is good.” The point is that necessity creates industry, and industry creates adoption.

Discovery without scale is a footnote. Scale turns a footnote into a civilization upgrade.

AI is the penicillin story wearing a GPU

After I built my lab last year, I ran computational tests comparing CPU versus GPU processing. I wanted to see what was real under the hood: governors, KV cache behavior, tensor operations, throughput, latency tradeoffs. It was thrilling. I’m infrastructure and security by origin—I don’t pretend to be a data scientist. But I’m fluent in the difference between “a prototype exists” and “a system works at scale.”
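
For anyone who wants to reproduce the flavor of those tests, here is a minimal sketch. It assumes PyTorch and, optionally, a CUDA-capable GPU; it times dense matrix multiplies as a rough stand-in for the tensor operations behind inference, and says nothing about governors or KV cache behavior, which need deeper tooling.

    # Rough CPU-vs-GPU throughput probe (illustrative sketch, assumes PyTorch).
    # It times dense matmuls as a proxy for inference tensor work, nothing more.
    import time
    import torch

    def time_matmul(device: str, size: int = 2048, iters: int = 20) -> float:
        """Return average seconds per (size x size) matmul on the given device."""
        a = torch.randn(size, size, device=device)
        b = torch.randn(size, size, device=device)

        # Warm up so lazy initialization doesn't skew the measurement.
        for _ in range(3):
            _ = a @ b
        if device == "cuda":
            torch.cuda.synchronize()

        start = time.perf_counter()
        for _ in range(iters):
            _ = a @ b
        if device == "cuda":
            torch.cuda.synchronize()  # GPU work is asynchronous; wait before stopping the clock
        return (time.perf_counter() - start) / iters

    if __name__ == "__main__":
        cpu_t = time_matmul("cpu")
        print(f"CPU: {cpu_t * 1000:.1f} ms per matmul")
        if torch.cuda.is_available():
            gpu_t = time_matmul("cuda")
            print(f"GPU: {gpu_t * 1000:.1f} ms per matmul ({cpu_t / gpu_t:.0f}x faster)")
        else:
            print("No CUDA device available; skipping the GPU run.")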

I have a close friend who worked on neural network adoption for audio codecs decades ago. We were discussing heuristic computing: neural networks, graphs, the weird behavior you get when statistical systems collide with real-world constraints. We hit the same déjà vu.

The math existed. The theory existed. The approaches existed.

What didn’t exist was affordable, industrial-scale compute and the surrounding ecosystem—the tooling, distribution, and integration path that makes something usable by normal organizations.

So yes, in one sense: AI is not new. The conditions are.

Is AI the “GPU version of penicillin”?

Fleming had the molecule. It took industrial fermentation to make it matter. Neural networks had the theory. It took industrial parallel compute to make them matter.

The breakthrough isn’t always the idea. Sometimes the breakthrough is the capacity to execute the idea.

“Will AI take my job?”

I’ve seen this movie before.

If you watch Hidden Figures (2016), you’ll see that when NASA called for a “Computer,” it wasn’t hardware—it was a badge. A person. A math outlier calculating trajectories by hand. That profession faded.

Did those people become obsolete? No. The talent adapted. The work moved. The constraints changed. New tooling arrived, and the role transformed.

So what forced AI into Phase 3?

In my view: COVID accelerated the forcing function. Overnight, digital business became mandatory. That pressure distorted labor markets, budgets, and expectations. It created an “efficiency war” inside organizations: survival through output.

Now we’re in the normalization cycle: executives say the quiet part out loud. Some work will require fewer humans once machines become acceptable collaborators.

That’s Phase 3 behavior: not “AI is interesting,” but “AI is now part of baseline operations.”

And once it becomes baseline, the cycle repeats: skeptic → selective adopter → evangelist.

URE takeaways: how to operate in each phase

If you’re a builder, leader, or operator, the question isn’t “is AI hype?” The question is: what constraints are collapsing, and where does that change the economics of execution?

Phase 1: Measure constraints, don’t debate vibes

  • Track unit cost curves (per token, per workflow, per outcome)
  • Track friction (integration, approvals, security, data access)
  • Track reliability (failure modes, hallucination impact, recovery paths)
  • Decide what is not allowed to fail (SLO thinking); a minimal tracking sketch follows this list
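
To make that concrete, here is a minimal sketch of per-workflow constraint accounting. Everything in it is an assumption for illustration: the class name, the token prices, and the 99% success target are placeholders, not real figures.

    # Hypothetical constraint-tracking sketch; class name, prices, and the SLO
    # target are placeholders, not figures from any real pricing sheet.
    from dataclasses import dataclass

    PRICE_PER_1K_INPUT_TOKENS = 0.0005   # assumed unit price, USD
    PRICE_PER_1K_OUTPUT_TOKENS = 0.0015  # assumed unit price, USD

    @dataclass
    class WorkflowStats:
        """Accumulates the numbers Phase 1 asks you to measure for one workflow."""
        input_tokens: int = 0
        output_tokens: int = 0
        successes: int = 0
        failures: int = 0

        def record(self, in_tok: int, out_tok: int, success: bool) -> None:
            self.input_tokens += in_tok
            self.output_tokens += out_tok
            if success:
                self.successes += 1
            else:
                self.failures += 1

        @property
        def total_cost(self) -> float:
            return ((self.input_tokens / 1000) * PRICE_PER_1K_INPUT_TOKENS
                    + (self.output_tokens / 1000) * PRICE_PER_1K_OUTPUT_TOKENS)

        @property
        def cost_per_outcome(self) -> float:
            return self.total_cost / self.successes if self.successes else float("inf")

        def meets_slo(self, min_success_rate: float = 0.99) -> bool:
            total = self.successes + self.failures
            return total > 0 and self.successes / total >= min_success_rate

    # Usage: record each run, then watch how the curve moves as constraints fall.
    stats = WorkflowStats()
    stats.record(in_tok=1200, out_tok=300, success=True)
    stats.record(in_tok=900, out_tok=250, success=False)
    print(f"cost per outcome: ${stats.cost_per_outcome:.4f}, SLO met: {stats.meets_slo()}")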

Phase 2: Adopt narrowly where drag is measurable

  • Use AI where it reduces operational toil (triage, search, summarization, diff review)
  • Instrument it (latency, accuracy, operator time saved, error rate; see the sketch after this list)
  • Treat it like a system: guardrails, rollback, auditability
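
As one way to do that instrumentation, here is a sketch under the assumption that every AI-assisted task goes through a thin wrapper that logs the same few numbers. `call_model`, the logger name, and the baseline-minutes estimate are all placeholders, not a real API.

    # Hypothetical instrumentation wrapper; call_model stands in for whatever model
    # call you actually make, and the metric names are placeholders.
    import logging
    import time
    from typing import Callable, Optional

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("ai-triage")

    def instrumented(task: str, call_model: Callable[[str], str],
                     baseline_minutes: float) -> Optional[str]:
        """Run one AI-assisted task and log the Phase 2 metrics: latency,
        success or error, and estimated operator time saved versus the manual baseline."""
        start = time.perf_counter()
        try:
            result = call_model(task)
            latency = time.perf_counter() - start
            log.info("task=%s status=ok latency_s=%.2f minutes_saved=%.1f",
                     task, latency, baseline_minutes)
            return result
        except Exception as exc:
            # Count failures explicitly instead of letting them disappear silently.
            latency = time.perf_counter() - start
            log.warning("task=%s status=error latency_s=%.2f error=%s", task, latency, exc)
            return None

    # Usage with a stubbed model call:
    if __name__ == "__main__":
        instrumented("summarize incident ticket", lambda t: f"summary of {t}",
                     baseline_minutes=15)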

Phase 3: Assume commoditization—and compete on execution

  • The model becomes a component, not a moat
  • Moats shift to: distribution, proprietary data flows, integration depth, and operating model
  • Your advantage becomes: how fast you can safely ship change

This analysis is part of URE’s ongoing research into infrastructure economics—and the human dynamics that determine which technologies become civilization-level defaults.


Stefano Schotten