Jevons' Paradox and AI: Why It Means More Developers, Not Fewer

Every tech conference keynote in 2025 had the same punchline: developers are toast. AI writes code now. Why would anyone hire a human to do what a model can do in seconds?

I've been hearing this for two years. And yet GitHub reported 43 million pull requests merged per month in 2025 — up 23% year over year. Nearly one billion commits, up 25%. The Apple App Store saw 557,000 new apps, a 24% jump. The Bureau of Labor Statistics projects software developer employment growing 15% through 2034, well above the average for all occupations.

If AI is replacing developers, it's doing a terrible job.

What's actually happening is something that a 19th-century economist predicted 160 years ago. And once you see it, you can't unsee it.

William Stanley Jevons and the Coal Question

In 1865, a 29-year-old English economist named William Stanley Jevons published a book called The Coal Question. Britain was the world's industrial superpower, and coal was the reason. But people were starting to worry. Coal was finite. What happens when it runs out?

Jevons had a different concern. James Watt's improved steam engine had made coal dramatically more efficient. You could get more work out of less coal. The intuitive assumption — shared by most of Jevons' contemporaries — was that this would reduce coal consumption. Same output, less fuel. Problem solved.

Jevons looked at the data and saw the opposite. Since Watt's improvements, Britain's coal consumption hadn't decreased. It had exploded. Since the start of the century, while engines got steadily more efficient, coal use had increased tenfold. More efficient coal didn't mean less coal. It meant coal became economical for things nobody had considered before — factories, railways, steamships, municipal heating. Each new use created its own demand. The efficiency gains didn't conserve the resource. They unlocked it.

Jevons wrote: "It is wholly a confusion of ideas to suppose that the economical use of fuel is equivalent to a diminished consumption. The very contrary is the truth."

This observation — that making a resource more efficient to use increases rather than decreases total consumption — became known as Jevons' paradox. And it has been right about basically everything since.

The Pattern That Never Stops

What is Jevons' paradox in modern terms? It's the engine behind most of the technology shifts we take for granted.

Lighting. LED bulbs use roughly 75% less energy than incandescent bulbs. We didn't use 75% less electricity on lighting. We put lights in closets, under cabinets, along staircases, embedded in furniture, wrapped around buildings, strung across patios. The International Energy Agency found that efficiency gains in lighting largely offset demand growth for years, keeping consumption relatively stable — until growing demand in emerging economies began outpacing even those gains. We just lit up more stuff.

Computing. Mainframes cost millions and filled rooms. PCs cost thousands and sat on desks. Smartphones cost hundreds and fit in pockets. Each generation made computing cheaper per unit. Total compute consumption didn't shrink — it went vertical. We now compute every waking moment. The smartphone didn't reduce our need for computers. It made computing so cheap that we use it for things that would have been absurd to computerize in 1980 — ordering lunch, tracking steps, identifying plants.

Cloud infrastructure. When AWS launched in 2006, provisioning a server went from a weeks-long procurement process to a credit card swipe. Did companies use fewer servers? No. They spun up thousands. Microservices, A/B testing infrastructure, staging environments, feature flag services, observability platforms — entire categories of software infrastructure that only exist because servers became cheap enough to waste.

Storage. A gigabyte of storage cost $300,000 in 1981. Today it's effectively free. We didn't store less. We store everything — every photo, every message, every draft, every version of every file. We invented entirely new categories of stored data: telemetry, analytics events, model training data, video surveillance.

The pattern is consistent enough to be a law: when the cost of something drops dramatically, consumption doesn't decrease proportionally. It expands to fill — and exceed — the gap.

The Rebound Effect and Its Critics

Economists have a more precise term for what Jevons described. The rebound effect measures how much of an efficiency gain is "taken back" by increased usage. A 10% efficiency improvement that leads to 5% more usage has a 50% rebound effect. When the rebound exceeds 100% — when total consumption actually increases despite efficiency gains — that's Jevons' paradox in full effect.
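The arithmetic is simple enough to write down. A minimal sketch (the function name is mine, not standard terminology):

```typescript
// Rebound effect: the fraction of an expected efficiency saving that is
// "taken back" by increased usage. Above 1.0 (i.e. above 100%), total
// consumption rises despite the efficiency gain: Jevons' paradox.
function reboundEffect(efficiencyGainPct: number, usageIncreasePct: number): number {
  return usageIncreasePct / efficiencyGainPct;
}

// The example from the text: 10% more efficient, 5% more usage.
console.log(reboundEffect(10, 5)); // 0.5, i.e. a 50% rebound

// Backfire: 10% more efficient, but 15% more usage. Consumption grew.
console.log(reboundEffect(10, 15)); // 1.5
```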

Transportation planners have their own version: induced demand. Build a wider highway to reduce congestion, and within a few years traffic is worse than before. The new capacity generates new trips that wouldn't have happened otherwise. People move farther from work, take road trips they'd have skipped, and the congestion returns — but with more total vehicles.

Not everyone agrees Jevons' paradox is universal. Critics — notably economists like Harry Saunders and environmental policy researchers — argue that the paradox depends heavily on the price elasticity of demand. If demand for a resource is relatively fixed (people only need so much heating, regardless of cost), efficiency gains can reduce total consumption. The rebound effect is real, they say, but it isn't always greater than 100%.

That's a fair criticism. And for some resources, they're right. The rebound effect for residential heating in developed nations is typically estimated at 20-40%, not 100%+. People don't heat their homes to 100 degrees just because it's cheap.

But here's the thing about software: demand for software is not fixed. It's not even close to fixed. Demand for software is, for all practical purposes, infinite. And that's not a rhetorical flourish. It's the most important claim in this entire essay, so let me make the case.

Why Software Demand Is Bottomless

Consider what software actually is: formalized human intent. Every business process that still runs on spreadsheets, every workflow duct-taped together with email, every decision made by gut because nobody built a dashboard — that's unmet software demand. The question isn't whether there's enough demand to absorb cheaper development. It's whether we can even see the edges of how much demand exists. I don't think we can, and there are three concrete ways to see why.

The Spreadsheet Inventory

According to Forrester, 82% of organizations still route tasks using paper-based processes and Excel spreadsheets. There are over 150 million business users of Excel worldwide. Think about what that means. Every one of those spreadsheets represents a process that someone needed to formalize — important enough to build a spreadsheet for, but not important enough (or not affordable enough) to build real software for.

A spreadsheet tracking inventory for a small warehouse. A shared Google Sheet coordinating shift schedules at a restaurant chain. An Excel workbook a VP built to model quarterly projections, held together with nested VLOOKUPs and a prayer. These aren't casual documents. They're load-bearing business infrastructure disguised as files. They crash, they corrupt, they silently break when someone accidentally deletes a row. They have no access controls, no audit logs, no API integrations, no mobile interface.

Every one of them is an unbuilt application. The entire spreadsheet economy is a shadow inventory of software demand that was never economically viable to fulfill. When the cost of turning a spreadsheet into a real application drops from $50,000 to $500 — or to an afternoon with an AI coding assistant — that demand doesn't stay latent. It converts.

Shadow IT as Demand Signal

Here's another way to see the unmet demand: look at what employees do when they can't get the software they need.

Roughly 42% of applications in a typical company are shadow IT — tools that employees adopted without IT approval. Eighty percent of workers admit to using unsanctioned SaaS apps at work. The average company has 975 unknown cloud services running alongside 108 known ones. Gartner estimates that 30-40% of IT spending at large enterprises is shadow IT.

The security-minded read this as a compliance problem. But it's actually a demand signal. Employees are so desperate for software that fits their actual workflows that they route around IT departments, sign up for unauthorized tools with personal credit cards, and cobble together Zapier automations on their own. They're not doing this for fun. They're doing it because the software they need doesn't exist, or IT can't build it fast enough, or the enterprise procurement process takes six months for a tool they need today.

Shadow IT isn't a bug. It's the market screaming that supply can't keep up with demand. And every shadow IT tool — every unauthorized Notion workspace, every rogue Airtable base, every Slack bot someone hacked together — is evidence that the world wants more software than it currently has.

The GDP Gap

Zoom out to the macro picture. The entire US tech sector — all of it, hardware and software combined — represents about 8.9% of GDP, roughly $2 trillion. Meanwhile, McKinsey research found that nearly 70% of top-performing companies differentiate themselves through proprietary software. The gap between those two numbers tells you everything.

Software touches 8.9% of GDP directly, but should be embedded in virtually 100% of economic activity. Every logistics company optimizing routes, every hospital scheduling surgeries, every farm managing irrigation, every law firm tracking billable hours — all of this is economic activity that benefits from custom software, and most of it is still running on generic tools, manual processes, or nothing at all.

Goldman Sachs projects the application software market will grow to $780 billion by 2030 at a 13% compound annual growth rate, with AI-agent-powered solutions potentially representing 60% of the total addressable market. That's not a market approaching saturation. That's a market still in its expansion phase, accelerating rather than plateauing.

There are roughly 33 million small businesses in the United States. Over half have no mobile app. The vast majority have never had a single line of custom code written for their specific workflows. Not because they don't need custom software — every business has unique processes — but because custom development has always been prohibitively expensive for a 15-person landscaping company or a three-location bakery chain. When development cost approaches zero, that entire market opens up.

Demand Without a Ceiling

Heating has a ceiling — you don't heat your house to 200 degrees. Transportation has a ceiling — there are only so many places to go. Banking has a ceiling — people have a finite number of financial transactions to perform.

Software has no ceiling because software is the encoding of every other activity. There's no upper bound on the number of processes that can be formalized, no saturation point for how much of human work and life can be mediated by code. We're not at 90% software penetration wondering where the last 10% will come from. We're at maybe 5-10% penetration with the entire curve still ahead of us.

Which is why Jevons' paradox applies to software development with a vengeance.

The Jevons' Paradox of Software Development

Here's the core argument: AI tools are making software development dramatically cheaper. Not just incrementally — structurally cheaper. The cost of producing a working application is dropping so fast that it's unlocking entirely new categories of demand.

Before AI coding tools, building a web application required learning a framework, wrestling with configuration, debugging cryptic errors, writing boilerplate, and spending hours on Stack Overflow. The minimum viable skill level was high and the time investment was substantial. A solo developer might ship one or two side projects a year if they were disciplined.

Now? 84% of developers use or plan to use AI tools in their workflow. Faros AI's research found that developers using AI assistants complete 21% more tasks and merge 98% more pull requests. Those aren't marginal improvements. That's a near-doubling of throughput per developer.

And the really interesting thing is what's happening with the new demand that cheaper development creates.

"Vibe coding" — the practice of using AI to build software by describing what you want in natural language rather than writing code line by line — went from a niche experiment to a mainstream term. Collins Dictionary named it their 2025 Word of the Year. Market research estimates put the ecosystem around it at $4.7 billion. People who never would have called themselves developers are shipping applications. Designers are building their own tools. Marketers are prototyping landing pages. Product managers are creating internal dashboards.

This isn't developers being replaced. This is the developer pool expanding. The same way cheap coal didn't replace coal miners but instead created jobs for railway workers, factory operators, and steam engineers — cheap software development isn't replacing developers. It's creating demand for AI infrastructure engineers, prompt engineers, integration specialists, and a hundred other roles that didn't exist three years ago.

When Satya Nadella tweeted about Jevons' paradox after the DeepSeek announcement in January 2025, the search term spiked from near-zero to peak interest on Google Trends overnight. He understood the implication: cheaper AI doesn't mean less demand for AI. It means more. The same logic applies to the software that AI builds.

The Numbers Tell the Story

The AI productivity paradox — the gap between what we expect AI to do and what the numbers actually show — keeps resolving in the same direction. Not fewer developers, but more software.

  • GitHub: 43 million pull requests merged per month in 2025 (+23% YoY). Nearly 1 billion commits (+25%). If AI were replacing developers, this would be going down.
  • Apple App Store: 557,000 new apps in 2025 (+24%). The flood isn't slowing. It's accelerating.
  • BLS projections: Software developer employment projected to grow 15% through 2034. The government statisticians, who are not known for hype, expect more developer jobs.
  • Faros AI research: Developers with AI tools complete 21% more tasks and merge 98% more PRs — but PR review time increased 91%. More code means more review, more testing, more integration, more deployment. The work shifts, but it doesn't disappear.
  • Stack Overflow's 2025 survey: 84% of developers now use or plan to use AI coding tools. And they're not shipping the same amount of code faster. They're shipping more.

These are Jevons' paradox examples playing out in real time. Every efficiency gain in code production creates new demand for code. Better tools don't shrink the industry. They grow it.

What This Looks Like in Practice: A Case Study

I'll use myself as an example, because I'm living inside this paradox.

Right now I maintain over a dozen active projects simultaneously. Six are curated directory sites — AIDevTools, Gamestruction, SaaSAlternatives, SkillShelf, MCPAtlas, and AgenticGameDevelopment — all built on the same Astro 5 + Tailwind 4 architecture with JSON data and Zod validation. On top of those, I'm building two programming languages (Klar and Kira), developing games (Stellar Throne and an unannounced project), running a managed SaaS service (ClawButler), a game discovery engine (GameLegend), a spreadsheet-to-API platform (SheetSmith), a changelog generator (ChangeSmith), and this blog. And that's not counting open source contributions.

Two years ago, this portfolio would have required a small team. Minimum three people, more likely five — frontend, backend, content, operations, and someone to keep the SEO and cross-linking strategy coherent across a dozen-plus projects. The tooling didn't exist for one person to hold all of it.

What changed isn't that AI "replaced" those hypothetical teammates. It's that the marginal cost of launching and maintaining each additional site dropped so low that the whole portfolio became economically viable as a solo operation. The first directory site had the steepest learning curve. But with AI support I quickly got into a groove. I didn't reuse code between sites — I had AI build each one from scratch, tailored to its niche. That's the part that would have been unthinkable before: six bespoke codebases, each with its own data models and category structures, built and maintained by one person.
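For concreteness, the "JSON data plus schema validation" pattern those sites share is the idea Zod implements declaratively: a malformed entry fails the build instead of shipping a broken page. Here's a hand-rolled sketch of that check in plain TypeScript (field names are illustrative, not the actual site schemas):

```typescript
// A directory entry as the build expects it. Real sites would have more
// fields; these three are illustrative.
interface DirectoryEntry {
  name: string;
  url: string;
  category: string;
}

// Validate untrusted JSON at build time. Zod does this declaratively
// with z.object({...}).parse(raw); this version shows the same idea
// without the library.
function parseEntry(raw: unknown): DirectoryEntry {
  if (typeof raw !== "object" || raw === null) {
    throw new Error("invalid entry: not an object");
  }
  const obj = raw as Record<string, unknown>;
  for (const field of ["name", "url", "category"] as const) {
    const value = obj[field];
    if (typeof value !== "string" || value === "") {
      throw new Error(`invalid entry: missing or empty "${field}"`);
    }
  }
  return obj as unknown as DirectoryEntry;
}

// A well-formed entry passes through; a truncated one fails loudly
// at build time, long before a reader sees it.
parseEntry({ name: "Example Tool", url: "https://example.com", category: "linters" });
```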

I didn't build fewer, better things when AI made development faster. I built more things. That's the paradox in action.

And I'm not unique. Every developer I know who's adopted AI tools has a similar story. They're not working less. They're not unemployed. They're shipping more ambitious projects, maintaining more codebases, exploring more ideas. The constraint moved — from "can I build this?" to "should I build this?"

I wrote about a related angle in my earlier post on Jevons' paradox and indie games, where the same economics apply to game development. The pattern is identical: cheaper creation doesn't mean less creation. It means more.

Will AI Replace Programmers? The Wrong Question

"Will AI replace programmers?" is the wrong question, and Jevons' paradox tells you why.

When ATMs started spreading in the late 1960s and 1970s, everyone assumed bank tellers were finished. A machine that handles deposits and withdrawals 24/7 — why would you ever need a human? Economist James Bessen studied what actually happened. ATMs reduced the number of tellers per branch, which made branches cheaper to operate, which meant banks opened more branches. The total number of bank tellers in the US actually increased from 1980 to 2010. The job shifted — less cash handling, more relationship management and complex transactions — but the automation didn't eliminate the role. It expanded the market for banking services so much that more humans were needed, not fewer.

But there's an important coda to this story. As David Oks recently argued, the ATM wasn't the technology that killed bank tellers — the iPhone was. After 2010, teller employment collapsed from 332,000 to 164,000, a 50% wipeout in twelve years. Mobile banking didn't automate tasks within the existing paradigm the way ATMs did. It created an entirely new paradigm that made branches — and the humans staffing them — far less necessary. The ATM optimized the old world. The smartphone made it obsolete.

So is this a cautionary tale for developers? Will some future paradigm shift do to programmers what the iPhone did to tellers?

I don't think so, and the reason comes back to demand. Bank tellers served a finite need — people have a fixed number of financial transactions to perform, and once you give them a way to do those transactions from their pocket, the branch becomes redundant. There's a ceiling on how much banking a person needs. But there is no ceiling on how much software the world needs. Software isn't a service counter people visit to complete a transaction. It's the substrate of every process, decision, and interaction that can be formalized. The demand curve for banking services flattened and then smartphones captured it digitally. The demand curve for software has no visible inflection point — it's been compounding for sixty years and is still accelerating.

That's the crucial difference. The iPhone could kill teller jobs because it satisfied the same demand through a better channel. For AI to kill developer jobs, it would need to satisfy the demand for software — and that demand is, for all practical purposes, bottomless. Every efficiency gain in code production just reveals more of it.

AI making it cheap to write code means:

Hyper-niche software becomes viable. A tool for managing beekeeping apiaries. An app for tracking competitive Rubik's cube solving times. A custom CRM for independent funeral homes. These markets are too small to justify a traditional development budget. At near-zero development cost? Every niche gets its own software.

Internal tools proliferate. Companies that never would have built internal software — a 15-person landscaping company, a local bakery chain, a dental practice — can now have custom tools built for their specific workflows. This is a market that barely existed five years ago.

Vibe coding expands who builds. When non-programmers can describe what they want and get working software, the definition of "developer" expands. This doesn't replace professional developers any more than Instagram replaced professional photographers. It grows the total market for software while creating more demand for professionals who can build the complex systems that vibe-coded apps depend on — APIs, infrastructure, databases, security.

The maintenance multiplier kicks in. More software means more software to update, secure, integrate, monitor, debug, and eventually rewrite. The lifecycle costs of software don't go away because it was cheap to write the first version. If anything, easy creation means more software exists in more states of disrepair, creating demand for the developers who can untangle it.

Jevons' paradox for knowledge workers follows the same logic as for every other resource. Making knowledge work cheaper doesn't eliminate knowledge workers. It creates more knowledge work.

What Developers Should Do Right Now

If the paradox holds — and the data overwhelmingly suggests it does — then the strategic playbook for developers is clear:

1. Invest in judgment, not just syntax. When writing code gets cheap, the value shifts to knowing what to build, how to architect it, and why one approach beats another. Taste, design sense, and systems thinking become the differentiators. The developer who understands the problem domain deeply will always outperform the one who can only execute instructions — whether those instructions come from a human manager or an AI.

2. Learn AI tools now. The efficiency multiplier is real, and it compounds. Developers who are fluent in AI-assisted workflows today are shipping circles around those who aren't. The learning curve is real — knowing how to prompt effectively, when to trust AI output, and when to override it are skills that take practice. Start now. The gap between AI-fluent and AI-avoidant developers will only widen.

3. Think bigger. That project you shelved because it was too ambitious for a solo developer? That SaaS idea you dismissed because you'd need to hire a frontend developer? That open-source tool you wanted to build but couldn't justify the time? The economics just changed in your favor. The constraints that made those projects impossible are dissolving. The question isn't "can I build it?" anymore. It's "is it worth building?" — and for more ideas than ever, the answer is yes.

4. Embrace the expanding market. More software means more infrastructure, more APIs, more integration points, more security surface area, more performance optimization, more data pipelines. If you're a developer worried about job security, look at the total addressable market for software and ask yourself: does this look like a market that's shrinking?

The Constant

Jevons published The Coal Question in 1865 because he was worried Britain would run out of coal. He was wrong about that — other energy sources emerged. But he was profoundly right about the mechanism. Efficiency doesn't reduce consumption. It unlocks it.

AI is doing for software development what the steam engine did for coal. Making it so efficient that every use case that was previously too expensive suddenly becomes viable. The flood of new software isn't a sign that developers are obsolete. It's a sign that the world's appetite for software was always larger than we could feed with manual labor alone.

The developers who understand this aren't panicking. They're building.