A Timber & Ink Essay
AI, water, energy, and what it means to be a good neighbor in a warming world.
If you've ever typed a prompt into ChatGPT or watched an image bloom from a handful of words, it can feel like magic: clean, quiet, weightless.
But "digital" doesn't mean immaterial.
Under every AI interaction is a physical reality — data centers, servers, cooling systems, power plants, transmission lines, and supply chains. And as the world races to build more AI capability, the environmental conversation has sharpened into two very earthy questions: How much electricity does this take? How much water does this drink?
What follows is a walk through the most common claims, an honest look at what needs correcting, and then a survey of what OpenAI, Google, Microsoft, and the broader industry are doing to reduce the impact — and how soon improvements may realistically arrive.
Where the conversation has to start.
The Claim
"AI is harmful to the environment because of data centers."
Verdict
Broadly valid, but details matter.
MIT summarizes the core issue well: training and running large generative AI models requires substantial electricity, and significant water can be used to cool the hardware. Manufacturing and transporting specialized hardware adds indirect impact, too (MIT News).
The important nuance: not all AI is equally expensive. A tiny model running on-device is a different creature than a frontier model served at global scale from hyperscale data centers. The footprint varies dramatically by where the data center is — grid cleanliness, water availability, climate, cooling design.
Four common claims, four honest verdicts.
Claim A
"Data centers used 4% of total U.S. electricity in 2023."
Verdict
Close, but the best-cited figure is for 2024, not 2023.
Pew, drawing on IEA estimates, reports that U.S. data centers consumed 183 TWh in 2024, described as "more than 4%" of total U.S. electricity use (Pew Research Center).
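Pew's figure is easy to sanity-check with back-of-envelope arithmetic. In the sketch below, the roughly 4,200 TWh U.S. total is an assumed round number for 2024, mine rather than one drawn from the sources cited in this essay:

```python
# Back-of-envelope check of the "more than 4%" figure.
# ASSUMPTION: total U.S. electricity use in 2024 was roughly 4,200 TWh;
# that round number is an illustration, not from the essay's sources.
data_center_twh = 183        # U.S. data center consumption, 2024 (Pew/IEA)
us_total_twh = 4_200         # assumed rough U.S. total for 2024
share_pct = data_center_twh / us_total_twh * 100
print(f"{share_pct:.1f}% of U.S. electricity")
```

With that assumed denominator, the share lands a little above 4%, consistent with Pew's phrasing.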
Claim B
"It will jump to 7–12% in the next three years."
Verdict
Plausible range, but scenario-dependent.
Some projections place data centers into high single digits or low double digits later this decade, especially if AI buildout continues at current speed (Pew Research Center). The Timber & Ink rule: present these numbers as scenarios, not promises.
Claim C
"A generative AI query uses 4–5x the energy of a typical search."
Verdict
The "multiple times higher" idea is supported; exact multipliers vary.
Some commonly cited estimates compare a web search at roughly 0.3 Wh to an LLM request at roughly 2.9 Wh, which implies closer to a tenfold gap than a 4–5x one. These are averages; real usage depends on model size, response length, caching, and serving efficiency (Epoch AI).
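Dividing the two per-request averages quoted above shows why the exact multiplier depends so much on which estimates you start from:

```python
# How the per-request averages in this section translate into a multiplier.
# The 0.3 Wh and 2.9 Wh figures are the rough estimates quoted above;
# real workloads vary with model size, response length, and caching.
search_wh = 0.3   # rough energy per web search
llm_wh = 2.9      # rough energy per LLM request
ratio = llm_wh / search_wh
print(f"About {ratio:.1f}x a web search, with these particular averages")
```

Swap in different published estimates for either side and the multiplier moves accordingly, which is why "multiple times higher" is the defensible claim and any single multiplier is not.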
Claim D
"Training one large AI model emits as much carbon as five American cars over their lifetimes."
Verdict
Often repeated, frequently misunderstood or overgeneralized.
This line traces back to a widely cited 2019 UMass Amherst analysis, popularized in multiple summaries and then spread as a shorthand. But even Google has noted that "five cars" has been misapplied in later retellings, and results vary dramatically depending on grid mix, hardware, and method (Google Research).
Two claims, and the careful framing they deserve.
Claim A
"A single large data center can use up to 5 million gallons of water per day."
Verdict
Possible for large facilities, but not typical.
Water use varies greatly by cooling system, climate, and load. EESI summarizes that water use at scale is substantial and growing, and the footprint can include both direct cooling and indirect water tied to electricity generation (EESI).
Claim B
"20–50 questions with ChatGPT can use about 500 ml of water."
Verdict
Based on an estimate, but highly context-dependent.
Researchers have explored the "water footprint" of AI workloads — cooling plus electricity — and some headlines translate this into "a bottle of water" per short session. But the exact number depends heavily on location, season, cooling design, and what's counted as "water used" (arXiv).
Responsible framing. It is fair to say AI can be water-intensive and that per-session water impacts can be meaningful in certain contexts. It is not fair to treat a single half-liter number as universal law.
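Translating the headline figure into per-question terms makes the scale easier to reason about. The numbers below are just the headline estimate restated, not universal constants:

```python
# Restating "about 500 ml per 20-50 questions" as a per-question range.
# These are the headline numbers from above, highly context-dependent.
bottle_ml = 500
questions_low, questions_high = 20, 50
per_question_high = bottle_ml / questions_low    # upper end of the range
per_question_low = bottle_ml / questions_high    # lower end of the range
print(f"{per_question_low:.0f}-{per_question_high:.0f} ml per question")
```

That is, roughly a tablespoon or two per question under this particular estimate, which is meaningful at the scale of billions of queries but very sensitive to location, season, and cooling design.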
Directionally valid, harder to quantify cleanly.
As AI expands, demand rises for high-performance hardware and for the electrical and construction systems that support it. These have upstream impacts — mining, manufacturing, shipping — and downstream impacts in the form of e-waste.
At the community level, concerns include grid strain and household cost impacts tied to the buildout (Pew Research Center). These are not abstractions. They show up on real bills, in real towns, with real people on either side of the meter.
Where the major players say they're moving — and what actually moves the needle.
Cooling redesigns and water intensity targets
Water replenishment and reporting
Early public commitments and ongoing calls for transparency
OpenAI has begun addressing public concerns around data center strain and the need to avoid burdening communities, emphasizing what they describe as a "good neighbor" posture (The Verge).
Beneath the corporate commitments, four practical levers determine whether AI's footprint shrinks meaningfully or just shrinks rhetorically.
When this is likely to get better — and where the real bend in the curve sits.
A word, especially, to my fellow Christians.
If you're a Christian using AI, the goal isn't fear or denial — it's wisdom. The same instinct that asks where does my food come from or who made this shirt applies here. A tool isn't morally neutral just because it's invisible. The question is whether we use it the way good neighbors use power.
Will we be faithful with what we've been given?
For Further Reading
Sources cited above, in order of appearance.