The Hidden Costs of AI: Water, Power, and Who Really Pays

Felix D. Helix
April 20, 2026
12 min read

In Part 1, we talked about the human cost of the AI boom: the 55,000 layoffs, the job displacement, the growing divide between haves and have-nots. But there's another cost that tech companies are even less interested in discussing: the environmental and economic burden they're shifting onto the rest of us.

Towns Without Water

Here's a fact that should scare everyone: large AI data centers can consume up to 5 million gallons of water per day. That's about the same amount of water used by a town of 10,000 to 50,000 people. Per facility. And there are hundreds of these facilities all over the country.
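The town comparison is easy to sanity-check. A quick sketch, using a rough per-person water figure (the 100 gallons/person/day constant is my assumption of typical US municipal use, not a number from this article):

```python
# Check the "town of 10,000-50,000 people" equivalence for one facility.
# Assumption: typical US municipal use is roughly 80-100 gallons per
# person per day; 100 is used here for illustration only.
GALLONS_PER_PERSON_PER_DAY = 100

facility_gallons_per_day = 5_000_000  # figure cited above
equivalent_population = facility_gallons_per_day / GALLONS_PER_PERSON_PER_DAY
print(f"One facility ≈ a town of {equivalent_population:,.0f} people")  # 50,000
```

At 100 gallons per person per day, one facility matches the upper end of the cited range; at lower per-person figures, the equivalent town is even larger.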

In Texas alone, data centers used an estimated 49 billion gallons of water in 2025. What will that look like by 2030? An estimated 399 billion gallons.

Think about that. In a state that regularly faces drought conditions, AI data centers are projected to use 399 billion gallons of water annually by 2030.

Northern Virginia's data centers consumed close to 2 billion gallons in 2023—a 63% increase from 2019. In Spain, Amazon asked the government for permission to increase water consumption at three existing data centers by 48%. The same region needed European Union aid to deal with its drought.

Meanwhile, the people living near these data centers are being told to conserve water. Take shorter showers. Don't water your lawn. Fix that leaky faucet. Excuse me? This is the same responsibility-shifting playbook we saw with recycling and carbon emissions: push the burden onto everyday folks while the source of the problem goes unaddressed. Not that we shouldn't be good citizens and conserve water where we can, but individual thrift isn't what's draining these aquifers.

The cognitive dissonance is unbelievable. A corporation can drain the local water supply to train AI models, but you better not run your sprinkler on the wrong day of the week.

And here's the thing: host communities reap some tax benefits while the costs—the intense water demand, higher electricity bills, and air pollution from backup generators—are dispersed more regionally, including to areas that won't see any new tax revenue. How does this help the local community? It doesn't. It just shifts the burden.

More than 230 environmental groups have sent a letter to Congress warning that AI and data centers are "threatening Americans' economic, environmental, climate and water security."

The Power Problem

Water isn't the only resource AI is devouring. The power consumption is equally alarming.

Data centers consumed approximately 415 terawatt hours (TWh) in 2024. That number probably means nothing to most people — so let's put it in terms that do. One terawatt hour is enough electricity to power about 90,000 American homes for an entire year. At 415 TWh, data centers consumed enough electricity last year to power every home in California, Texas, and New York — combined — for a full year. That's about 1.5% of all electricity used on the planet.

And it's growing at 12% per year. By 2030, that consumption is projected to more than double to around 945 TWh. The growth alone, roughly 530 TWh, is equivalent to stacking the annual electricity demand of another 48 million American homes on top of everything already on the grid.
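The household-equivalent comparisons above can be checked with the article's own conversion factor. A quick sketch (the per-state household counts are rough public estimates I'm assuming for illustration, not figures from this piece):

```python
# Sanity-check the household-equivalent figures cited above.
HOMES_PER_TWH = 90_000  # from the article: ~11 MWh average annual use per home

# Assumed approximate household counts, in millions (rough public estimates)
households = {"California": 13.5, "Texas": 11.0, "New York": 7.6}

data_center_twh_2024 = 415
home_equivalents = data_center_twh_2024 * HOMES_PER_TWH / 1e6  # in millions
print(f"2024 usage ≈ {home_equivalents:.1f} million home-years")  # ≈ 37 million

combined = sum(households.values())
print(f"CA + TX + NY households ≈ {combined:.1f} million")  # ≈ 32 million

# Projected growth from 415 TWh to 945 TWh by 2030
growth_twh = 945 - 415
added_homes = growth_twh * HOMES_PER_TWH / 1e6
print(f"Added demand ≈ {added_homes:.0f} million more home-equivalents")  # ≈ 48 million
```

So 2024 consumption already exceeds the combined residential demand of those three states, and the projected growth alone adds roughly 48 million home-equivalents more.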

In 2023, data centers consumed about 26% of Virginia's total electricity supply. In North Dakota: 15%. Nebraska: 12%. Iowa: 11%. Oregon: 11%. These aren't small numbers — in some states, data centers are consuming more power than entire industries that have existed for decades.

The power grid wasn't designed for this. And now we, the average ratepayers, are all paying for it.

In the PJM electricity market stretching from Illinois to North Carolina, data centers accounted for an estimated $9.3 billion price increase. As a result, the average residential bill is expected to rise by $18 per month in western Maryland and $16 per month in Ohio.

In areas near data centers, wholesale electricity costs as much as 267% more than it did five years ago. And that increase is being passed on to customers.

A Carnegie Mellon University study estimates that data centers and cryptocurrency mining could lead to an 8% increase in the average U.S. electricity bill by 2030, potentially exceeding 25% in the highest-demand markets of central and northern Virginia.

Your electricity bill going up? This is why.

But again, who's really paying for this expansion? Not the tech companies reaping billions in profits and market cap increases. It's being transferred to regular people through higher utility bills and degraded service.

The Economics Don't Work

Here's the kicker: these AI systems can't even pay for themselves.

The combined capital expenditure of Microsoft, Alphabet, Amazon, and Meta reached a staggering $246 billion in 2024—a 63% increase from 2023. For 2025, projections hit $405 billion, with 2026 estimates at $527 billion.

Microsoft alone is on track to spend $80 billion. Amazon: $125 billion. Google: $91-93 billion. Meta: $60-65 billion. OpenAI has committed to spending over $1 trillion on AI infrastructure—despite not being profitable.

What about the revenue? Current AI revenues stand at only $20 billion. To justify the current investment scale, AI would need to generate $2 trillion in annual revenue by 2030. That's a 100-fold increase in five years.
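That 100-fold figure implies a growth rate with no precedent at this scale. The compound annual growth required can be worked out directly from the numbers in the paragraph above:

```python
# Compound annual growth required to go from $20B to $2T in 5 years
current_revenue = 20e9   # current AI revenues cited above
target_revenue = 2e12    # revenue needed by 2030
years = 5

cagr = (target_revenue / current_revenue) ** (1 / years) - 1
print(f"Required annual growth: {cagr:.0%}")  # 151% per year, every year

# The year-by-year trajectory revenue would have to follow
rev = current_revenue
for year in range(1, years + 1):
    rev *= 1 + cagr
    print(f"Year {year}: ${rev / 1e9:,.0f}B")
```

In other words, AI revenue would have to more than two-and-a-half-fold every single year for five consecutive years.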

But here's what really exposes the problem: despite $30-40 billion in enterprise investment into Generative AI, 95% of organizations are getting zero return, according to an MIT report.

These companies are running on faith and investor hype. Ray Dalio, founder of Bridgewater Associates, says the AI boom "is now in the early stages of a bubble" and compares it to being "about 80%" of the euphoria leading up to the 1929 stock market crash or the 2000 dot-com bubble.

When you factor in the computational costs, the energy costs, the cooling costs, the infrastructure costs, the talent costs, the training data costs, the regulatory costs—the unit economics are upside down.

They're building a house of cards. And when it falls, it won't be the CEOs and investors who get crushed. It'll be the workers who already lost their jobs, and the communities already drained of resources.

But Solutions Exist—So Why Aren't They Being Used?

Here's the part that really irks me: we actually have the tech to fix much of this.

Research from MIT shows that combined best practices can cut AI data center emissions and water footprints by 73% and 86%, respectively. A big part of that is cooling — and it's worth understanding why the current approach is so wasteful.

Most large data centers today use evaporative cooling — essentially giant industrial swamp coolers. Water is circulated through cooling towers where it evaporates to carry heat away. It works, and it's cheap to build. But here's the catch: 70-80% of that water is lost to evaporation. Poof, gone. That's where the millions of gallons per day go.

The alternatives — direct-to-chip liquid cooling and immersion cooling — work very differently. Instead of evaporating water into the air, they circulate fluid directly over or around the chips themselves in a closed loop. No evaporation, near-zero water loss. A single facility switching to closed-loop cooling saves an estimated 125 million liters of water per year compared to evaporative systems. And because liquid conducts heat roughly 25 times better than air, these systems also use 30-40% less energy to do the same job.
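The 125-million-liter figure is easier to picture in the gallons-per-day units used earlier in this piece. A quick conversion (only the liters-to-gallons factor is mine; the savings estimate is the one cited above):

```python
# Convert the cited closed-loop cooling savings into gallons per day
LITERS_PER_GALLON = 3.785  # standard US liquid gallon conversion

saved_liters_per_year = 125e6
saved_gallons_per_year = saved_liters_per_year / LITERS_PER_GALLON
saved_gallons_per_day = saved_gallons_per_year / 365

print(f"≈ {saved_gallons_per_year / 1e6:.0f} million gallons/year")  # ≈ 33 million
print(f"≈ {saved_gallons_per_day:,.0f} gallons/day")                 # ≈ 90,000
```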

Retrofitting a data center for liquid cooling runs $2-3 million per megawatt — significant capital that smaller operators can't absorb, and that larger ones simply choose not to spend when evaporative cooling is cheaper and the water bill is someone else's problem.

There's one nuance worth mentioning: Microsoft announced in late 2024 that all its new data center designs will use closed-loop zero-water cooling. That's a genuine step forward. But it only applies to new facilities — the hundreds of existing data centers still running evaporative systems aren't changing anytime soon.

And if you think that sounds like progress, wait until you hear what they were doing at the same time. In February 2026, Microsoft announced it had technically met its 100% renewable energy goal — through paper accounting. The company purchased enough renewable energy credits across 400+ agreements to offset its annual consumption on a spreadsheet. Milestone achieved.

Except: only 19 of the 40 gigawatts of contracted capacity are actually online. The rest won't come online for years. And while Microsoft was declaring victory, its actual emissions have risen roughly 23-30% since the pledge was made, and its energy use has jumped 168%, driven entirely by AI data center growth. The company has even paused new carbon removal agreements as AI makes its broader climate commitments harder to keep.

That's not a solution. That's a press release.

Some companies are exploring nuclear power as a reliable, low-carbon energy source — but meaningful scale is years away.

So the solutions exist. The technology is available. Why isn't it being implemented?

Because it costs money. Because it requires upfront investment. Because shareholders want returns now, not environmental responsibility, and because the rest of us can be made to foot the bill.

It's the same old story: privatized gains, socialized losses. Tech companies could reduce their environmental impact dramatically. They're choosing not to.

Who Bears the Burden?

This is the pattern we've seen everywhere:

The profits are private. The costs are public.

Tech companies get to claim the innovation, the market value, the competitive advantage. They get the glory and the stock price bumps. In late 2025, just five companies accounted for 30% of the U.S. S&P 500 and 20% of the MSCI World index, the greatest concentration in half a century.

But the environmental degradation? That's our problem. The infrastructure strain? We pay for it. The water shortages? Communities deal with it. The economic instability when the bubble bursts? Regular workers and communities will bear that weight.

There's a counterargument worth mentioning: in California, PG&E has projected that data center growth could reduce average household bills by up to 2%, arguing that large customers pay rates above the utility's cost to serve them, generating surplus revenue that can fund grid upgrades.

But that's one state, one utility, one optimistic projection. Against that, we have documented 267% electricity cost increases in areas near data centers, $16-18 monthly bill increases across multiple states, and entire regions running out of water.

The Questions They Hope We Don't Ask

Where is the accountability?

Why aren't these companies required to offset their environmental impact? Why aren't they paying the true cost of the resources they're consuming? Why aren't they implementing the sustainable solutions that already exist?

If you can cut water usage by 86% with existing technology, why aren't you doing it?

Why is the public subsidizing private AI development through our water, our power, our communities, and our futures?

And most importantly: can or will these systems ever actually pay back the enormous debt and cost that's been invested in them? Or are we watching one of the world's most expensive Ponzi schemes unfold in real time, while communities literally run dry?

The Real Bottom Line

The AI boom is being sold as inevitable progress. But progress at what cost? And paid by whom?

When companies spend $405 billion on AI while laying off thousands of workers, draining towns of water, and driving up wholesale electricity costs by as much as 267%, that's not progress. That's extraction.

When the environmental and economic costs are passed onto communities and taxpayers while the profits flow to shareholders, and when existing solutions could reduce the impact by 73-86% but aren't being implemented, that's not innovation. That's exploitation.

We're told to trust that this will all work out in the end. That the benefits will trickle down. That we're building a better future.

Whose future is being built and who's paying for it?

Next in this series: Part 3 will explore how AI is being forced into every aspect of our lives—including places it has no business being—and what these companies are really after: your data.

References

  1. "Data Centers and Water Consumption." Environmental and Energy Study Institute, 2025.
  2. "AI, data centers, and water." Brookings Institution, 2025.
  3. "The AI Boom Is Draining Water From the Areas That Need It Most." Bloomberg, 2025.
  4. "AI data centers use a lot of electricity. How it could affect your power bill." NPR, January 2026.
  5. "How AI Data Centers Are Sending Your Power Bill Soaring." Bloomberg, 2025.
  6. "What we know about energy use at U.S. data centers amid the AI boom." Pew Research Center, October 2025.
  7. "Why AI Companies May Invest More than $500 Billion in 2026." Goldman Sachs, 2026.
  8. "How much Google, Meta, Amazon and Microsoft are spending on AI." CNBC, October 2025.
  9. "Why the AI Spending Spree Could Spell Trouble for Investors." Morningstar, 2025.
  10. "Ray Dalio says AI is in 'the early stages of a bubble,' so watch out for 2026." Fortune, January 2026.
  11. "AI has high data center energy costs — but there are solutions." MIT Sloan, 2025.
  12. "No, AI Data Centers Are Not Driving Up Electricity Costs." Reason, December 2025.
  13. "Microsoft matches 100% of 2025 power use with renewables." Data Center Dynamics, February 2026.
  14. "Microsoft's 100% renewables boast is complicated." PV Tech, 2026.
  15. "Microsoft Pauses New Carbon Removal Agreements." The Energy Mix, 2026.
  16. "Sustainable by design — next-generation datacenters consume zero water for cooling." Microsoft Cloud Blog, December 2024.
  17. "The data center cooling state of play 2025." Tom's Hardware, 2025.
  18. "AI's Cooling Problem — How Data Centers Are Transforming Water Use." Environmental Law Institute, 2025.
  19. "Liquid vs Air Cooling for AI Data Centers." Introl, 2025.

This series uses AI as a writing tool while critiquing the AI industry's broader impact. The difference matters: using a tool to enhance human work vs. replacing humans entirely.
