The Hidden Footprint of “Invisible” Intelligence
AI, water, energy, and what it means to be a good neighbor in a warming world.
If you’ve ever typed a prompt into ChatGPT or watched an image bloom from a handful of words, it can feel like magic: clean, quiet, weightless.
But “digital” doesn’t mean immaterial.
Under every AI interaction is a physical reality—data centers, servers, cooling systems, power plants, transmission lines, and supply chains. And as the world races to build more AI capability, the environmental conversation has sharpened into two very earthy questions:
- How much electricity does this take?
- How much water does this drink?
Let’s walk through the most common claims, correct what needs correcting, and then look at what OpenAI, Google, Microsoft, and the industry are doing to reduce the impact—and how soon improvements may realistically arrive.
1) “AI is harmful to the environment because of data centers.”
Verdict: Broadly valid, but details matter.
MIT summarizes the core issue well: training and running large generative AI models requires substantial electricity, and significant water can be used to cool the hardware; manufacturing and transporting specialized hardware adds indirect impact, too. (MIT News)
The “cloud” has a body. It’s steel and circuitry, coolant lines and backup generators, and the rivers and power plants that keep it all from overheating.

The important nuance: not all AI is equally expensive. A tiny model running on-device is a different creature than a frontier model served at global scale from hyperscale data centers. The footprint varies dramatically by where the data center is (grid cleanliness, water availability, climate, cooling design).
2) Energy use & carbon: checking the biggest claims
Claim A: “Data centers used 4% of total U.S. electricity in 2023.”
Verdict: Close, but the most widely cited figure is for 2024, not 2023.
Pew (drawing on IEA estimates) reports U.S. data centers consumed 183 TWh in 2024, described as “more than 4%” of total U.S. electricity use. (Pew Research Center)
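The “more than 4%” framing checks out against a rough total. Note the ~4,100 TWh figure for total U.S. electricity consumption below is an assumed ballpark for illustration, not a number from Pew or the IEA:

```python
# Sanity-check the "more than 4%" claim for 2024.
# US_TOTAL_TWH is an assumed ballpark, not a sourced figure.
DATA_CENTER_TWH = 183    # U.S. data center consumption, 2024 (Pew/IEA)
US_TOTAL_TWH = 4_100     # assumed total U.S. electricity use, 2024

share = DATA_CENTER_TWH / US_TOTAL_TWH * 100
print(f"Data centers ~ {share:.1f}% of U.S. electricity")
```

Small shifts in the assumed total move the percentage, which is one reason published shares are usually phrased as “more than 4%” rather than a precise decimal.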
Claim B: “It will jump to 7–12% in the next three years.”
Verdict: Plausible range, but scenario-dependent.
Some projections place data centers into high single digits or low double digits later this decade, especially if AI buildout continues at current speed. (Pew Research Center)
Timber & Ink rule: present these numbers as scenarios, not promises.
Claim C: “A generative AI query uses 4–5x the energy of a typical search.”
Verdict: The “multiple times higher” idea is supported; exact multipliers vary.
Some commonly cited estimates compare a web search (~0.3 Wh) to an LLM request (~2.9 Wh), implying a several-fold difference. These are averages; real usage depends on model size, response length, caching, and serving efficiency. (Epoch AI)
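Taken at face value, those two per-request averages imply a larger gap than the commonly quoted 4–5x, which is exactly why multipliers should be hedged. A quick sketch, treating both figures as rough assumptions rather than measurements:

```python
# Back-of-envelope comparison using the rough averages cited above.
# Both per-request figures are estimates and vary widely in practice.
SEARCH_WH = 0.3   # ~energy per traditional web search, in Wh
LLM_WH = 2.9      # ~energy per average LLM request, in Wh (Epoch AI estimate)

multiplier = LLM_WH / SEARCH_WH
print(f"LLM request ~ {multiplier:.1f}x a web search")

# Scale to a year of heavy personal use: 20 prompts/day, 365 days.
yearly_kwh = LLM_WH * 20 * 365 / 1000
print(f"20 prompts/day for a year ~ {yearly_kwh:.1f} kWh")
```

Changing either input (a shorter response, a smaller model, aggressive caching) changes the multiplier, so any single “Nx” headline is a snapshot, not a law.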
Claim D: “Training one large AI model emits as much carbon as five American cars over their lifetimes.”
Verdict: Often repeated, frequently misunderstood or overgeneralized.
This line traces back to a widely cited 2019 UMass Amherst analysis, popularized in multiple summaries and then spread as shorthand. But even Google has noted that the “five cars” figure has been misapplied in later retellings, and results vary dramatically depending on grid mix, hardware, and methodology. (Google Research Blog)
The right way to say it is slower: impact varies—sometimes wildly—depending on the power source, the cooling design, and the efficiency of the serving stack.

3) Water use: checking the “AI is thirsty” claims
Claim A: “A single large data center can use up to 5 million gallons of water per day.”
Verdict: Possible for large facilities, but not typical.
Water use varies greatly by cooling system, climate, and load. EESI summarizes that water use at scale is substantial and growing, and the footprint can include both direct cooling and indirect water tied to electricity generation. (EESI)
Claim B: “20–50 questions with ChatGPT can use ~500 ml of water.”
Verdict: Based on an estimate, but highly context-dependent.
Researchers have explored the “water footprint” of AI workloads (cooling + electricity). Some headlines translate this into “a bottle of water” per short session. But the exact number depends heavily on location, season, cooling, and what’s counted as “water used.” For a technical discussion, see the associated “less thirsty AI” research. (arXiv)
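To see why the headline number is slippery, translate it into a per-question range. This is illustrative arithmetic on the cited estimate, not a measurement:

```python
# Translate "~500 ml per 20-50 questions" into a per-question range.
# Illustrative only: real water use depends on location, season,
# cooling design, and what counts as "water used."
TOTAL_ML = 500
LOW_Q, HIGH_Q = 20, 50   # the cited session-length range

per_q_high = TOTAL_ML / LOW_Q    # fewer questions -> more water each
per_q_low = TOTAL_ML / HIGH_Q    # more questions -> less water each
print(f"~ {per_q_low:.0f}-{per_q_high:.0f} ml of water per question")
```

Even within the estimate’s own range, the per-question figure varies by 2.5x before accounting for any real-world differences in cooling or climate.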
Responsible framing: It is fair to say AI can be water-intensive and that per-session water impacts can be meaningful in certain contexts. It is not fair to treat a single “half-liter” number as universal law.
4) Other impacts: e-waste, materials, and local communities
Verdict: Directionally valid, harder to quantify cleanly.
As AI expands, demand rises for high-performance hardware and for the electrical and construction systems that support it. These have upstream impacts (mining, manufacturing, shipping) and downstream impacts (e-waste).
At the community level, concerns include grid strain and household cost impacts tied to the buildout. (Pew Research Center)
5) What’s being done about it
Microsoft: cooling redesigns + water intensity targets
- Microsoft has described a next-generation data center design that consumes zero water for cooling (chip-level liquid cooling), announced in 2024, with the first facilities built on the design expected to come online in the following years. (Microsoft)
- Microsoft also describes a commitment to improve data center water-use intensity by 2030, and emphasizes closed-loop liquid cooling systems that recirculate. (Microsoft On the Issues)
Google: water replenishment + reporting
- Google’s published stewardship portfolio supports its goal to replenish 120% of freshwater consumed (on average) by 2030. (Google Sustainability)
- Google also publishes environmental reporting that tracks energy and sustainability themes. (Google Environmental Report)
OpenAI: early public commitments, ongoing calls for transparency
OpenAI has begun addressing public concerns around data center strain and the need to avoid burdening communities, emphasizing a “good neighbor” posture. (The Verge)
The question isn’t whether we should build. It’s whether we will build like neighbors.

Industry-wide: what actually moves the needle
- Cleaner power: emissions fall as the grid and procurement get cleaner.
- Better cooling: closed-loop and waterless designs reduce freshwater pressure.
- More efficient inference: model + serving optimizations cut cost per request.
- Smarter siting: climate and water conditions matter—deeply.
6) When will this get better? (a realistic timeline)
- Now–2027: demand growth may outpace efficiency gains in many regions.
- 2027–2030: major public targets land here (water stewardship + new cooling designs at scale).
- Post-2030: if clean power + advanced cooling + efficient inference scale, per-unit impact can fall meaningfully.
7) A word to my fellow Christians: stewardship without panic
If you’re a Christian using AI, the goal isn’t fear or denial—it’s wisdom.
Five rails for walking upright
- Tell the truth. Don’t share inflated numbers because they’re persuasive. Cite sources. Admit uncertainty.
- Love your neighbor, not just convenience. Water stress and grid strain are moral categories because people live downstream.
- Use AI like a tool, not an excuse. No plagiarism. No counterfeit trust.
- Practice digital temperance. Fewer prompts. Fewer regenerations. More intention.
- Advocate for transparency. Innovation isn’t the enemy; opacity is.
AI isn’t pure evil, and it isn’t pure miracle. It’s power—and power always casts a shadow.
Will we be faithful with what we’ve been given?
Sources & Further Reading
- MIT News — Generative AI’s environmental impact
- Pew Research Center — Energy use at U.S. data centers
- EESI — Data centers and water consumption
- Google Research — Carbon footprint of ML training
- Epoch AI — How much energy does ChatGPT use?
- arXiv — “Making AI Less Thirsty” (water footprint research)
- Microsoft — Zero water for cooling data centers
- Microsoft — Community-first AI infrastructure
- Google — Water stewardship portfolio
- Google — 2025 Environmental Report
- The Verge — OpenAI data center opposition & energy bills