It's hard to keep up with studies of the energy impact of generative AI, so here are nine takeaways from the sources I have personally found most illuminating.
This list aims for a "just the facts" approach that sidesteps the dueling interpretations of AI champions and critics. I'm also using one set of measures rather than comparing apples to oranges, specifically:
- 💡watt-hour (running an incandescent light bulb for 1 minute)
- 🥤liter (about a quart)
- ☔️cubic centimeter (a raindrop)
This list assesses only energy and water usage, not overall environmental impact. I'm neither a climate scientist nor an electrical engineer; these are only my rough estimates based on academic studies, industry reports, and back-of-the-envelope calculations. It's unclear how some future tradeoffs will play out, e.g. whether improved efficiencies will cancel out increased demand. This list also excludes numerous potential AI downsides apart from environmental risks, which you can find explained in the IMPACT RISK framework.
I welcome suggestions of research that updates or contradicts these findings. You can find a log of recent updates here.
9 takeaways from recent research
- 🔍1. Lack of transparency by AI companies means usage calculations at this point are only estimates.
- 🌍2. Water and energy impacts are extremely localized; e.g. data centers stress Ireland's water and grid far more than Norway's, thanks to Norway's hydropower and cool climate.
- 📈3. Large models consume disproportionately more energy and water than smaller ones.
- 🏋️4. Training consumes more energy and water than inference (prompting).
- 📜5. Policy changes by new administrations can result in more or less climate impact for the same energy consumption.
- 🌐6. Data centers currently account for 2% of global energy demand (Ritchie 2024).
  - 🪙Crypto is responsible for roughly 25% of the energy used by data centers.
  - 📱Social media and data usage currently consume most of the rest.
  - 🤖AI is responsible for only 2% of data center energy demand, i.e. 2% x 2% = 0.04% of global demand (Ritchie 2024).
- 💻7. Prompting a local model on a laptop requires no water and uses 1-10% of the energy of prompting a model in a data center.
- 🚰8. Cooling a data center requires about 4 cubic centimeters of water per watt-hour regardless of task (Lawrence 2024).
- ⚖️9. Comparisons can be surprising (approximate watt-hours and liters or cc):
  - 🎦1000 Wh / 4 L: hour-long Zoom call with 10 people (devices 600 + transmission 200 + server processing 200).
  - 📺200 Wh / 0.8 L: hour-long video streamed on a big TV (Kamiya 2020, Carbon Trust 2021).
  - 🪫20 Wh / 80 cc: charging a smartphone (EPA 2024).
  - 🖼️6 Wh / 24 cc: generating an image online (Luccioni 2024, Ippolito 2025).
  - 📄6 Wh / 24 cc: generating a page with an online chatbot (Brown 2020).
  - ✏️2 Wh / 8 cc: generating a sentence with an online chatbot (Luccioni 2024, Ippolito 2025).
  - 🔍0.3 Wh / 1 cc: one non-AI Google search (Google 2009).
  - 💻0.01 Wh / 0.04 cc: generating text with a local chatbot (30 W x 1 s).
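The water figures in this list mostly follow from the energy figures via takeaway 8's ratio of roughly 4 cc of cooling water per watt-hour (Lawrence 2024). Here's a minimal sketch of that arithmetic; the task names and watt-hour values are copied from the list above, and the 4 cc/Wh constant is the approximation, not a measured per-task number:

```python
# Rough sketch: derive cooling-water use from energy use,
# assuming ~4 cc of water per watt-hour (Lawrence 2024).
CC_PER_WH = 4  # approximate cubic centimeters of cooling water per Wh

# Approximate energy per task, in watt-hours (from the list above)
tasks_wh = {
    "hour-long 10-person Zoom call": 1000,
    "hour-long video on a big TV": 200,
    "charging a smartphone": 20,
    "generating an image online": 6,
    "one non-AI Google search": 0.3,
    "generating text with a local chatbot": 0.01,
}

for task, wh in tasks_wh.items():
    cc = wh * CC_PER_WH
    # report large volumes in liters (1 L = 1000 cc)
    volume = f"{cc / 1000:g} L" if cc >= 1000 else f"{cc:g} cc"
    print(f"{task}: {wh:g} Wh ≈ {volume}")
```

Note the caveat from takeaway 7: a local model on a laptop uses essentially no data-center cooling water, so the last line overstates its water cost.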
⚠️ Please do not quote any of these figures without this caveat: "These are guesses based on incomplete and often contradictory sources."
Sources
- Brown, Language Models are Few-Shot Learners, 2020
- Carbon Trust, Carbon impact of video streaming, 2021
- EPA, Greenhouse Gas Equivalencies Calculator - Calculations and References, 2024
- Google, Powering a Google search, 2009
- Ippolito, Transcript of o1 calculation estimating GPT-4 v. GPT-2 and Stable Diffusion 2.1 v. 3.5, 2025
- Kamiya, The carbon footprint of streaming video, 2020
- Lawrence, United States Data Center Energy Usage Report, 2024
- Luccioni, Power Hungry Processing, 2024
- Ritchie, What's the impact of artificial intelligence on energy demand?, 2024