What Can Kill AI?
With the recent outage of AWS that took down large portions of the internet, it’s an interesting time to ask a question few are thinking deeply about: what could actually kill AI?
We’re living in an age where artificial intelligence is transforming industries at a rapid pace. It’s being used by millions of information workers daily. But the true test of AI’s global scale comes when everyday consumers start using it constantly — when it becomes as common as Google Search or Netflix streaming. That’s when AI transitions from being a productivity booster to a societal utility.
However, there’s a fundamental challenge standing in the way: cost.
The Cost of Compute
AI, as powerful as it is, is extraordinarily expensive to run. Every prompt, every image generated, every voice synthesized — all of it requires enormous compute capacity. And compute means data centers, GPUs, and energy. This isn’t a “software” cost; it’s a hardware tax.
Let me give a personal example.
Back in March 2006, I started my first company — a search engine that used machine learning, natural language processing, and entity resolution (what we’d call AI today). Coincidentally, that was also the exact month AWS (Amazon Web Services) launched.
Even though our startup focused on a narrow vertical of the web, we faced an enormous infrastructure problem. Crawling and indexing millions of pages, downloading profile images from LinkedIn or Wikipedia, and processing all that data took massive computing power.
We had a choice:
Build our own co-location server farm (expensive, slow, and operationally painful), or
Use the brand-new AWS cloud to scale flexibly.
We chose AWS — and by May 2007, we became one of their largest startup customers.
Meeting Jeff Bezos
Around that time, I had the opportunity to meet Jeff Bezos in a private meeting behind the Moscone Center in San Francisco, before a product demo. For about 15 minutes, we discussed how we were using AWS — S3, EC2, and the challenges of scaling an AI-driven product.
Bezos was deeply curious about one specific thing: “At what point does AWS become too expensive for you to keep scaling?”
That moment stuck with me. Bezos wasn’t just thinking about startups or hobbyists — he was already thinking about global scale. He wanted AWS to be the backbone of the world’s computing infrastructure, affordable and powerful enough for the biggest enterprises.
And to his credit, AWS became exactly that. It’s now used by millions of organizations worldwide. But that question he asked back then — “At what point does it become too expensive?” — has come full circle with AI.
So What Can Kill AI?
The answer, I think, is clear: the cost of compute.
Even when we were doing our little slice of AI from 2006 to 2010, our compute costs were enormous, sometimes approaching a million dollars a month. We needed that much processing power from AWS to crawl the web, download data and images, analyze them, and then run machine learning and natural language processing on top of it.
Our algorithm — an early form of AI — was working hard to make sense of complex data. The results were impressive; we could surface insights that other search engines couldn’t. But it came at a staggering financial cost.
Fast forward to today, and yes, compute costs have come down significantly — which is why we’re seeing AI flourish in ways that weren’t possible before. But here’s an insight from someone who’s been in this game a long time: it’s not the software that got better.
The algorithms, the math, and the concepts have existed since the 1950s and 1960s. What changed was hardware. Engineers could always write the code, but they lacked the compute engine powerful enough to execute it at scale.
That’s why, as soon as ChatGPT exploded in popularity, every other major player was able to release something similar within months. The software foundation was already there — it was the availability of affordable compute that unlocked it.
And that’s also why NVIDIA, once seen as a gaming chip company, is now a multi-trillion-dollar enterprise. Their GPUs made it possible to process vast quantities of data at the speeds AI models demand.
Even today, AI runs on an infrastructure bill that would shock most people. It’s cheaper than it was two decades ago, but it’s still a massive, ongoing expense.
The Economics of AI’s Future
One of the most sobering data points floating around is this:
For every $1 of revenue generated by an LLM, there may be $5 of downstream compute and energy costs tied to it.
Even OpenAI, in its own disclosures, has suggested it generates roughly $6 billion in annual revenue, but spends close to $20 billion to sustain that growth. That kind of ratio would be catastrophic for any normal business.
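To make that arithmetic concrete, here is a minimal sketch of the unit economics. The $6 billion revenue and $20 billion cost figures are the ones cited above; the break-even calculation is purely illustrative:

```python
def burn_ratio(revenue: float, cost: float) -> float:
    """Dollars spent for every dollar of revenue earned."""
    return cost / revenue

def required_cost_reduction(revenue: float, cost: float) -> float:
    """Fraction by which costs must fall to break even, holding revenue flat."""
    return 1 - revenue / cost

# Figures cited above (annual, USD billions).
revenue, cost = 6.0, 20.0
print(f"Spend per $1 of revenue: ${burn_ratio(revenue, cost):.2f}")
print(f"Cost reduction needed to break even: {required_cost_reduction(revenue, cost):.0%}")
```

In other words, at that ratio, compute and related costs would need to fall by roughly 70% before the business even reaches break-even, with no revenue growth at all.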
So when people ask me what the biggest danger to AI’s future is — the thing that could prevent it from becoming a daily utility for everyone — it’s this: can we get the compute cost down to a point where it’s not burning cash just to exist?
If not, AI risks becoming the exclusive domain of large corporations with cash cows to subsidize the losses — companies like Microsoft, Google, Meta, and X. In that world, innovation becomes centralized. Smaller startups can’t afford to train or deploy models at meaningful scale.
The alternative vision — and the one I hope for — is that costs keep dropping, new efficiencies emerge, and startups once again become the innovation engine of the industry. That’s how you get a thriving, open ecosystem of AI tools and applications that improve life and business for everyone.
The Power Problem
There’s another layer to this issue: energy.
Running large-scale AI models consumes staggering amounts of electricity. The race for compute has quietly become a race for power.
That’s why you’re now hearing major tech companies talk about investing in nuclear energy, private power plants, and new-generation grids. AI is becoming so compute-hungry that even the world’s most sophisticated data centers are starting to hit capacity limits. The conversation isn’t just about GPUs anymore — it’s about gigawatts.
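A back-of-the-envelope estimate shows why the conversation has shifted to gigawatts. Every number below is hypothetical, chosen only to illustrate the order of magnitude:

```python
# Back-of-the-envelope data-center power draw (all numbers hypothetical).
GPUS = 100_000        # accelerators in one large training cluster
WATTS_PER_GPU = 700   # rough board power of a modern data-center GPU
PUE = 1.3             # power usage effectiveness: cooling/overhead multiplier

facility_watts = GPUS * WATTS_PER_GPU * PUE
print(f"Facility draw: {facility_watts / 1e6:.0f} MW")
# 100,000 GPUs at 700 W each, with 30% overhead, is roughly 91 MW,
# a meaningful fraction of a gigawatt-scale power plant for one cluster.
```

Scale that to a handful of clusters per company across a handful of companies, and dedicated power plants stop sounding exotic.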
That’s both the challenge and the opportunity.
The Path Forward: Three Things We Must Do to Drive Costs Down
So, what needs to happen to make AI sustainable — both economically and environmentally? I believe there are three core levers that can drive down compute costs and determine AI’s long-term future.
1. Cheaper, Cleaner Power
The single biggest variable in AI’s future is power.
If we can develop more power capacity, especially through nuclear energy, we could create an almost infinite source of low-cost, clean energy. That would immediately lower the cost basis for every AI system in the world.
But this comes with a moral dimension: if the power isn’t clean — if it’s coal or carbon-intensive — then the net impact could be negative. We’ll face the same criticism that crypto mining did: consuming massive energy without clear societal benefit. The future of AI depends not just on power, but on how cleanly we generate it.
2. Faster, More Efficient Chips
The second driver is hardware acceleration. As chips get faster, smaller, and more energy-efficient, the cost per computation drops dramatically. Companies like NVIDIA, AMD, and now dozens of custom silicon startups are racing to make chips purpose-built for specific types of AI tasks. The faster these evolve, the lower the marginal cost of intelligence becomes.
3. Cheaper Servers, Hardware, and Bandwidth
The third piece of the equation is infrastructure efficiency — the cost of servers, memory, and bandwidth. Historically, hardware costs decline over time with scale, and that trend should continue. Every generation of servers and networking hardware becomes cheaper, faster, and more optimized for AI workloads. This cumulative effect compounds across the ecosystem.
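As a rough illustration of how these three levers compound, the sketch below assumes independent annual cost declines for power, chips, and infrastructure and multiplies them together. The rates are invented for illustration, not data from this article:

```python
# Hypothetical annual cost-decline rates for each lever (illustrative only).
levers = {
    "power": 0.10,           # 10% cheaper energy per year
    "chips": 0.25,           # 25% better price/performance per year
    "infrastructure": 0.15,  # 15% cheaper servers and bandwidth per year
}

def combined_cost_factor(rates: dict, years: int) -> float:
    """Fraction of today's compute cost remaining after `years`,
    assuming the levers are independent and multiply together."""
    factor = 1.0
    for rate in rates.values():
        factor *= (1 - rate) ** years
    return factor

for years in (1, 3, 5):
    print(f"After {years} year(s): {combined_cost_factor(levers, years):.1%} of today's cost")
```

Even at these modest per-lever rates, the combined effect cuts costs by more than 40% in a single year, which is why improvements across all three fronts matter more than any one breakthrough.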
At the end of the day, power is the true X-factor.
Even if chips get faster and servers cheaper, without affordable, abundant, clean energy, AI will always face an economic ceiling.
The real test of AI’s future — and perhaps the evolution of society itself — will be whether we can achieve near-infinite, sustainable compute. Because in the end, the story of AI isn’t just about algorithms and models. It’s about the power that makes them possible.