For 11 weeks, I tracked all of my AI use. One hundred sessions. I counted the tokens processed and applied publicly available numbers on per-token energy and water intensity from Epoch AI and operator-reported data from Microsoft and Google. Anyone can run this math.
In those 11 weeks, I built an iOS app from scratch and wrote policy briefs on extreme heat for nonprofits I work with. I produced documentary pitch decks and drafted a 15,000-word climate fiction piece about the Colorado River collapse. I used AI every single day, often for hours at a time.
Total lifecycle water footprint of all that work: about five gallons. That accounts for everything: the water used to cool the data centers, the water consumed at power plants to generate the electricity, and the water embedded in manufacturing the hardware.
When an Outside editor reached out to ask me to write this story, I was on a trip to Marble Canyon, Arizona, to train raft guide companies on what is happening with the river. I drove my diesel Sprinter van from Tucson to the site, a 383-mile trip at 20 miles per gallon of diesel. When I ran the numbers later, the lifecycle water footprint of my fuel was around 110 gallons. One drive to the work I do on the Colorado River used more than 20 times the water of everything I did with AI in 11 weeks. That comparison stopped me cold, and I study this for a living.
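The arithmetic behind that comparison is easy to check. A minimal sketch, using only figures stated in the article; the per-gallon water intensity is implied by dividing the article's 110-gallon estimate by the fuel burned, not a separately cited value:

```python
# Back-of-envelope check of the article's numbers.
miles = 383                 # Tucson to Marble Canyon, as stated
mpg = 20                    # stated fuel economy of the Sprinter van
fuel_gal = miles / mpg      # ~19.2 gallons of diesel burned

drive_water_gal = 110       # article's lifecycle water estimate for that fuel
ai_water_gal = 5            # article's 11-week AI water footprint

# Implied lifecycle water intensity of the fuel (derived, not cited).
implied_intensity = drive_water_gal / fuel_gal   # ~5.7 gal water per gal fuel

# How the one drive compares to 11 weeks of AI use.
ratio = drive_water_gal / ai_water_gal           # 22x

print(f"fuel burned: {fuel_gal:.1f} gal")
print(f"implied water intensity: {implied_intensity:.1f} gal water/gal fuel")
print(f"drive vs. 11 weeks of AI: {ratio:.0f}x")
```

The ~5.7 gallons of water per gallon of fuel this implies is in the range lifecycle analyses commonly report for refined petroleum fuels, which is a rough sanity check that the article's 110-gallon figure is internally consistent.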



It’s not such a clear split
https://adeshmehta.substack.com/p/simplergy17-ais-thirst-for-power
So it isn’t, and hasn’t been, such a clear split. The point is that inference power use is not negligible at all; it’s actually pretty close to training usage, once usage rates explode to many times the current level.