Jevons Paradox: Why building AI is so expensive – and why it will eventually get cheaper
When OpenAI’s CFO Sarah Friar used the word “backstop” while talking about AI funding, the internet panicked. People thought she meant a government bailout like the one in the 2008 financial crisis. In reality, she was referring to long-term support for big infrastructure projects. But the reaction showed one thing: most people have no idea what it actually costs to build artificial intelligence at today’s scale, or why those costs sound so huge.
What makes AI so expensive?
Building a large AI model isn’t like creating an app or a website. It’s more like building a national power grid or railway system — massive, complicated, and expensive at the start. Three things make up most of the bill: compute (chips), energy (power), and data (training material).
1. Compute

To train a large AI model, companies need tens of thousands of high-end chips called GPUs. The most popular is Nvidia’s H100, which costs around $25,000–$35,000 each. A model like GPT-5 may use 20,000–50,000 of them, so the chips alone cost between $1 billion and $2 billion. Add networking, cooling, and servers, and the total crosses $2 billion before training even begins.

2. Energy

Each GPU uses about 700 watts of power. Multiply that by tens of thousands, and you get 40–50 megawatts, roughly the draw of a mid-sized Indian town. One month of training can cost $5–10 million in electricity, and that doesn’t include the cooling needed to stop the chips from overheating.

3. Data

Training data sounds free, but it isn’t. You can’t just use random internet content: AI companies need massive, clean, and legally approved datasets of text, images, audio, and video. Creating and storing such data can cost hundreds of millions of dollars. GPT-4’s dataset alone reportedly cost over $100 million, and for multimodal AI, which handles video and sound, the cost will be several times more.

Add in hundreds of highly paid engineers and scientists, and it’s easy to see how a single training run can cost $2.3–$4 billion. That sounds shocking, but history shows it’s completely normal.

Every new technology starts off insanely expensive

The high cost of AI today isn’t unusual. Every major technology began as a financial nightmare, and then became cheap and essential.

Railways: Britain’s first intercity rail line in 1830 cost £17 million (in today’s money). Investors called it madness. Within decades, costs dropped by 70%, and rail became the backbone of modern transport.

Electricity: Edison’s first power station in 1882 served only 59 customers at a cost of $9–10 million. Electricity was 40 times more expensive per unit than it is today.

Telecom: The first transatlantic cable cost £1 million and failed after three weeks. But as technology improved, message costs fell from hundreds of dollars to just cents.

Internet: In the late 1990s, telecoms spent $100 billion laying fibre. Many went bankrupt. Yet within a decade, costs dropped 99%, and fast internet became cheap and universal.

Semiconductors: Early chip factories cost $100 million. Today’s advanced ones cost $25 billion, but each produces chips worth hundreds of billions over time.

Space: The Apollo moon mission cost $25 billion in the 1960s ($257 billion today). That research helped bring rocket launch costs down by more than 90%.

The pattern is always the same: version one is absurdly expensive. Then efficiency, competition, and scale make it cheap.
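The hardware and energy figures cited earlier can be sanity-checked with simple arithmetic. The sketch below uses only the article’s own estimates (GPU counts, unit prices, watts per chip), not measured data; note the low end of the chip bill comes out somewhat under $1 billion, and the 40–50 MW figure only appears once cooling overhead is added on top of the GPUs themselves.

```python
# Back-of-envelope check of the article's estimates (not measured data).
gpu_count_low, gpu_count_high = 20_000, 50_000    # GPUs per frontier training run
gpu_price_low, gpu_price_high = 25_000, 35_000    # USD per Nvidia H100 (approx.)
watts_per_gpu = 700                               # H100 power draw

chip_cost_low = gpu_count_low * gpu_price_low     # smallest fleet, cheapest chips
chip_cost_high = gpu_count_high * gpu_price_high  # largest fleet, priciest chips

# Power draw of the GPUs alone, in megawatts (cooling adds more on top).
power_mw = gpu_count_high * watts_per_gpu / 1_000_000

print(f"Chips: ${chip_cost_low / 1e9:.2f}B to ${chip_cost_high / 1e9:.2f}B")
print(f"Power: {power_mw:.0f} MW before cooling overhead")
```

Running it gives a chip bill of roughly $0.50B–$1.75B and about 35 MW of raw GPU draw, which is the same ballpark as the article’s $1–2 billion and 40–50 MW figures.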
Why AI costs will soon fall
So, how will AI follow the same path? Every expensive part of today’s system (hardware, power, data) already has a roadmap for cost reduction.

1. More competition in chips

Right now, Nvidia dominates the GPU market, so prices are high. But competitors like AMD, Google (TPU), Amazon (Trainium), and Intel (Gaudi) are changing that. As more players enter, margins will shrink and costs will drop 20–30%. Specialised chips (ASICs) and “sparse” models that activate only a fraction of their components at a time can further cut compute costs by up to 90%.

2. Cheaper energy and smarter cooling

Energy makes up 15–20% of AI costs. But data centres can move to cheaper power locations: hydro in Quebec, geothermal in Iceland, or wind in Texas. That alone cuts costs by half or more. New cooling systems, such as liquid or immersion cooling, also make servers more efficient, cutting power use by another 20–30%.

3. Better memory and bandwidth

High-bandwidth memory (HBM) is expensive because only a few suppliers dominate the market. New designs and competitors will lower those costs by 30–50% in the next few years.

4. Smarter use of resources

AI labs currently waste up to 40% of their computing power due to poor scheduling. Better job management and “model distillation”, teaching smaller models to behave like large ones, can halve training costs. Future models will also use “continual learning”, updating existing systems instead of retraining from scratch.

5. Cheaper inference

Running AI models (known as inference) is expensive because it scales with use. But using smaller, optimised models, caching common answers, and switching to specialised chips can reduce inference costs tenfold.

6. Smarter data use

AI companies are learning that smaller, cleaner datasets perform better than massive, messy ones. Synthetic data, generated by AI itself, can cut costs by up to 90%.

Together, these changes could bring down AI’s total cost threefold within two years, tenfold within five, and up to fiftyfold over a decade.
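Those headline multiples are easier to judge when converted into an implied annual rate of cost decline. The sketch below does only that conversion; the threefold/tenfold/fiftyfold projections are the article’s, and the annual rates follow mechanically from compounding.

```python
# Convert the article's cost projections (factor cheaper over N years)
# into the annual rate of decline each one implies.
projections = {
    "3x in 2 years": (3, 2),
    "10x in 5 years": (10, 5),
    "50x in 10 years": (50, 10),
}

for label, (factor, years) in projections.items():
    # If cost ends at 1/factor of today's after `years`, the yearly
    # multiplier is (1/factor)**(1/years); the decline is one minus that.
    annual_decline = 1 - (1 / factor) ** (1 / years)
    print(f"{label}: costs fall about {annual_decline:.0%} per year")
```

The three projections imply yearly declines of roughly 42%, 37%, and 32% respectively, so the article is effectively assuming costs drop by about a third every year for a decade.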
Why it won’t happen overnight

So why not make AI cheaper immediately? Because some parts of the system move at physical speed, not digital speed. Building chip factories takes years. Constructing data centres or new power plants can take a decade. Even when alternatives to Nvidia exist, switching software ecosystems takes time. You can’t speed up steel, silicon, or regulation. But the direction is clear: every major constraint, from chip supply to energy to data, is being solved. Slowly, but steadily.
The paradox of cheaper AI
Even when AI gets cheaper, companies may still spend just as much — or more. This is called the Jevons paradox: when something becomes cheaper, people use more of it. As costs fall, AI will spread everywhere — from hospitals to classrooms to entertainment. The total spending will stay high, but the cost per use will drop dramatically. That’s why OpenAI’s CFO talked about “backstopping” — not to ask for bailouts, but to plan long-term funding for what’s effectively a new industrial revolution.
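The paradox is easiest to see with numbers. The figures below are invented purely for illustration (the article gives no specific per-query prices or volumes): if the cost per query falls 10x but cheaper AI triggers 20x more usage, total spending doubles even as each use gets dramatically cheaper.

```python
# Illustrative, made-up numbers for the Jevons paradox.
cost_per_query_before = 0.10          # dollars per query (hypothetical)
queries_before = 1_000_000_000        # annual queries (hypothetical)

cost_per_query_after = cost_per_query_before / 10   # 10x cheaper per use
queries_after = queries_before * 20                 # usage grows 20x

spend_before = cost_per_query_before * queries_before
spend_after = cost_per_query_after * queries_after

print(f"Before: ${spend_before / 1e6:.0f}M total spend")
print(f"After:  ${spend_after / 1e6:.0f}M total spend")
```

With these assumed inputs, spending goes from $100M to $200M: the cost per use collapses, but the total bill grows, which is exactly the dynamic the article describes.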
The big picture
AI today is expensive, loud, and energy-hungry. But that’s how every new technology begins. Over time, competition and efficiency will bring costs down just like they did for railways, electricity, and the internet. The truth is simple: the most expensive AI we’ll ever build is the one we’re building right now. Each new generation will be faster, cheaper, and more normal — until one day, we’ll wonder why it ever seemed expensive at all.