In 1969, Amana (the company that brought the first affordable microwave into American kitchens) called it "the greatest cooking discovery since fire." In 2018, Sundar Pichai (CEO of Google) called AI "more profound than fire or electricity." He has been saying it ever since.
Two technologies, fifty years apart, both compared to fire by the people bringing them to market.
A friend of mine who runs a financial advisory business suggested recently that AI might be the new microwave. He was not making a prediction, but describing a pattern that has played out before — a technology that promises everything, delivers something, and eventually finds its level. It is a reasonable take.
The pattern holds remarkably well.
In McKinsey's latest State of AI survey, 88% of organisations worldwide report using AI in at least one business function, but only 6% qualify as high performers capturing real value. In PwC's most recent global CEO survey, 56% of chief executives report no increased revenue or decreased costs from AI, despite near-universal commitment to it. Near-universal adoption, modest utility for most. A small number getting serious results while the majority reheats leftovers.
The microwave did not replace the oven. But it ended up in over 95% of kitchens anyway.
Not only did the microwave achieve near-universal adoption within a generation, it did something nobody anticipated. It did not just slot into existing cooking habits, it created entirely new ones. Before the microwave, reheating leftovers was not something most people did routinely. It could be done on a stovetop or in an oven, but neither was quick or convenient enough to make it a daily habit. The microwave made that possible, and, in doing so, changed meal planning, reduced food waste, and transformed the frozen meal industry from a niche convenience into a global market (now worth close to USD 90 billion). Food companies restructured their product lines, meal patterns shifted, and the way families ate changed forever.
It was not transformative for everyone equally, however. A 1976 Whirlpool cookbook marketed itself to "a busy mother, swinging single, or harried husband." For people whose lives were shaped by time pressure, the microwave was a genuine shift in what was possible on a Tuesday night. For a small minority of others, it barely registered. I, for one, do not own a microwave and never have.
If the microwave is the right analogy for AI, the implication is reassuring. The hype will pass, AI will find its niche, and most businesses will adopt it in ways that make some tasks simpler, giving time back for more valuable or enjoyable things. The 6% getting real value are the early enthusiasts, the equivalent of the people who learned to make a passable risotto in a microwave. Good for them. The rest of us will reheat leftovers, and that will be that.
No matter how good an analogy is, it breaks somewhere.
Unlike the microwave, AI is already creating economic distance between those who use it well and those who do not. PwC's 2025 Global AI Jobs Barometer analysed close to a billion job ads across six continents. Industries most exposed to AI are seeing revenue-per-employee growth three times higher than the least exposed. Workers with AI skills command a 56% wage premium over peers in the same roles, and that premium doubled in a single year.
Nobody who mastered microwave risotto pulled away from the people reheating leftovers — there was no microwave wage premium and no compounding productivity gap between households that used their microwave skilfully and those that did not. The microwave was only ever a convenience technology. The gap between a sophisticated user and a basic user was a lifestyle difference, not an economic one.
The AI data says something different. The 6% of organisations capturing real value from AI are not just early enthusiasts. They are compounding an advantage that the rest cannot see from where they are standing.
There is a second problem with the analogy, and it is one that every AI user will recognise. When AI is applied to the wrong task, it does not just fail to help, it makes things worse. A 2023 Harvard Business School study confirmed this: 758 professionals were given realistic tasks, some within AI's capability and some outside it. On suited tasks, performance improved significantly. On unsuited tasks, professionals using AI performed 19 percentage points worse than those working without it. The AI produced plausible-looking output that the professionals believed, and their work suffered for it. It is easy to conclude that AI does not work. More often than not, though, the AI was not the problem. The problem was not knowing where the boundary lies.
A microwave, on the other hand, does not make your cooking worse. If you microwaved something that should have been roasted, you got a bad meal and learned not to do it again. The feedback was immediate and obvious. AI fails differently. It fails in ways that look like success, and recognising the difference is itself a skill that needs to be learned the hard way.
So is AI the new microwave? The adoption curve fits, the overpromise fits, and many of us are just using it to reheat leftovers. But the microwave was only ever a convenience technology, where basic use was fine and mastery was optional. AI is a technology where skill determines whether it helps or hurts, and the gap between those two outcomes is widening. The question for any leader is not whether AI matters. It is whether you have enough direct experience with it to get and stay on the right side of that gap.
Sources
PwC, 2025 Global AI Jobs Barometer (close to 1 billion job ads, six continents)
McKinsey, The State of AI (November 2025)
PwC, 29th Annual Global CEO Survey (2026, 4,454 CEOs across 95 countries)
Dell'Acqua, Mollick et al., Navigating the Jagged Technological Frontier, Harvard Business School (2023, 758 professionals)
US Bureau of Labor Statistics, microwave oven household penetration (1997)
Whirlpool, Micro Menus Cookbook (1976)
Amana Corporation, Radarange advertisements (1969)