Editor’s note: The irony isn’t lost on us. We are very aware of how much we use AI to help with research, create imagery, and improve our articles. But this raises another question: who is responsible for a product’s effects, the user or the product owner? The answer, of course, is that responsibility is shared. We do our part by raising awareness of the issue, limiting heavy usage, and optimizing our prompts. We also use pre-trained and lightweight models for less intensive tasks. Please read our AI transparency statement for more information.
#ArtificialNegligence

Artificial intelligence (AI) is everywhere. It powers chatbots that help us shop online, suggest music playlists, and even assist doctors during exams. But every AI interaction comes at a price beyond your data plan or subscription fee. That price is paid in electricity, water, and environmental strain.
Everyday AI: Small Actions, Big Footprint
Imagine you ask a chatbot a question. You hit “enter,” it thinks for a second, and then spits out an answer. That convenience hides energy use. On average, one AI prompt uses 0.3 to 3 watt-hours (Wh) of electricity. That range depends on how complex the model is and which servers handle the request.

To put that in familiar terms:
- 0.3 Wh is roughly the energy needed to stream an HD video on your tablet for one minute.
- 1 Wh runs a typical electric kettle for about three seconds.
- 3 Wh is about running a 60-watt refrigerator for three minutes.
Most of us fire off dozens of AI prompts a day, chatting with bots, translating text, or generating images. If you send 100 prompts at 3 Wh each, you’ve used 300 Wh, the same as running that 60-watt fridge for about five hours. Send 1,000 prompts, and you’re at 3 kWh, roughly 50 hours on the fridge. Add up millions of prompts across millions of users, and AI becomes a heavyweight energy consumer, even though each individual prompt seems tiny.
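The arithmetic above is easy to reproduce. Here is a small Python sketch using the article’s own figures (3 Wh per prompt at the upper end, a 60-watt fridge as the yardstick):

```python
# Back-of-envelope energy math using the article's figures:
# up to 3 Wh per prompt, and a 60-watt refrigerator as the yardstick.

WH_PER_PROMPT = 3.0   # upper end of the 0.3-3 Wh range
FRIDGE_WATTS = 60.0

def prompt_energy_wh(prompts: int) -> float:
    """Total electricity, in watt-hours, for a batch of prompts."""
    return prompts * WH_PER_PROMPT

def fridge_hours(energy_wh: float) -> float:
    """How long the reference fridge could run on the same energy."""
    return energy_wh / FRIDGE_WATTS

print(prompt_energy_wh(100))                  # 300.0 Wh
print(fridge_hours(prompt_energy_wh(100)))    # 5.0 hours of fridge time
print(fridge_hours(prompt_energy_wh(1000)))   # 50.0 hours
```

Swap in 0.3 Wh per prompt to see the optimistic end of the range; the conclusion (tiny per prompt, large in aggregate) holds either way.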
The Hidden Water Bill of AI
An AI user might be responsible for up to 50 liters of water a day
We often think of water use in showers, dishwashing, or watering gardens. But data centers, those massive server farms in industrial parks, need lots of water too. They use water to cool down their hardware, either through cooling towers that evaporate water or liquid loops that draw heat away.

Rough estimates show:
- Every 10 to 50 AI prompts can require 0.5 liters of water when you count both direct evaporation and the water used to generate the electricity.
- That’s one standard half-liter water bottle for about 20 prompts.
In everyday terms:
- Low-flow toilet flush: ≈ 4 liters
- Eco dishwasher cycle: ≈ 10 liters
- Kitchen tap (3 min): ≈ 15 liters
- Standard 5-min shower: ≈ 40 liters
- Front-load washing machine: ≈ 50 liters
If a heavy AI user sends 1,000 prompts daily, they might be responsible for 10 to 50 liters of water use just for their AI habit, comparable to several eco dishwasher cycles. Multiply that by millions of users, and the water footprint balloons into a major part of AI’s environmental impact.
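The same back-of-envelope math works for water, using the article’s estimate of 0.5 liters per 10 to 50 prompts:

```python
# Water footprint per the article's estimate: 0.5 L per 10-50 prompts,
# i.e. 0.01-0.05 liters per prompt depending on model and data center.

LITERS_PER_PROMPT_LOW = 0.5 / 50    # optimistic end
LITERS_PER_PROMPT_HIGH = 0.5 / 10   # pessimistic end

def daily_water_liters(prompts_per_day: int) -> tuple[float, float]:
    """Low/high daily water estimate for a given prompt volume."""
    return (prompts_per_day * LITERS_PER_PROMPT_LOW,
            prompts_per_day * LITERS_PER_PROMPT_HIGH)

low, high = daily_water_liters(1000)
print(f"{low:.0f}-{high:.0f} liters per day")   # 10-50 liters per day
```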
The Training Phase: A One-Time Burst, but a Massive One
Before an AI model like GPT-3 or a large image-generation system can answer questions or create pictures, it must undergo training. Training means feeding the model vast amounts of text, code, or image data and running complex math operations on powerful processors for days or weeks.

One landmark study found:
- Training GPT-3 consumed 1,287 megawatt-hours (MWh) of electricity, enough to power 120 average U.S. homes for a year.
- It also generated around 552 tons of CO₂, roughly the annual emissions of 120 passenger cars.
Those numbers cover just one training run. As models improve, they often get retrained with new data to fix errors or add features. Each retraining multiplies that resource cost. Factoring in data-center cooling adds another 10–40% on top of the electricity bill, depending on how efficient the facility is.
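The cooling overhead is a simple multiplier on the compute bill. A sketch using the article’s GPT-3 figure and its 10–40% overhead range:

```python
# Cooling is an overhead multiplier on the compute electricity bill.
# The article gives 10-40% on top, depending on facility efficiency.

GPT3_TRAINING_MWH = 1287  # one GPT-3 training run, compute only

def total_with_cooling(compute_mwh: float, overhead: float) -> float:
    """Compute energy plus a cooling overhead fraction (0.10-0.40)."""
    return compute_mwh * (1 + overhead)

print(round(total_with_cooling(GPT3_TRAINING_MWH, 0.10), 1))  # 1415.7 MWh
print(round(total_with_cooling(GPT3_TRAINING_MWH, 0.40), 1))  # 1801.8 MWh
```

Every retraining run pays this full bill again, which is why frequent updates multiply the footprint.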
Training is a one-time burst of energy and emissions, but with models updated often, that burst repeats, compounding AI’s environmental impact. As more companies and research labs train their own AI systems, the total training load becomes a monster.
The Inference Phase: Small Sips, Endless Flow
After training comes inference, the endless cycle of answering user requests. Each time you prompt an AI, that request “infers” an answer from the trained model. Although each inference uses far less energy than training, the sheer volume of queries makes inference a huge power hog.
Consider ChatGPT at its peak:
- It handled 100 million prompts per day.
- At 3 Wh per prompt, that amounts to 300 MWh every day, enough to power roughly 10,000 average U.S. homes for a day.

In just over four days, inference energy can match the cost of one full GPT-3-scale training run. Because inference runs nonstop across millions of devices and users, it may soon outpace training in total energy consumption.
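The crossover point is simple division: daily inference energy versus the cost of one training run:

```python
# How many days of peak-load inference match one training run?
TRAINING_MWH = 1287                                  # GPT-3 training, compute only
DAILY_INFERENCE_MWH = 100_000_000 * 3 / 1_000_000    # 100M prompts x 3 Wh = 300 MWh

days_to_match = TRAINING_MWH / DAILY_INFERENCE_MWH
print(round(days_to_match, 2))                       # 4.29 days
```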
Data Centers Today: A Dual Thirst for Power and Water
All AI activity takes place inside data centers. In 2024, global data centers consumed about 415 terawatt-hours (TWh) of electricity. That was 1.5% of the entire world’s electricity. Meanwhile, cooling needs drove water withdrawals (fresh, potable water pulled from municipal or natural sources) into the hundreds of billions of liters.
Key figures for 2024:
- Electricity use: 415 TWh.
- Water withdrawal: 560 billion liters.
To visualize 415 TWh, imagine running roughly 630 million modern refrigerators (75 W each) continuously for a year, or more than the United Kingdom’s entire annual electricity consumption. The water? Enough to fill 224,000 Olympic swimming pools.
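These scale comparisons can be sanity-checked in a few lines (taking an Olympic pool as 2.5 million liters):

```python
# Sanity checks on the 2024 data-center scale comparisons.
ELECTRICITY_WH = 415e12        # 415 TWh expressed in watt-hours
WATER_LITERS = 560e9           # 560 billion liters withdrawn
FRIDGE_WATTS = 75              # one modern refrigerator
HOURS_PER_YEAR = 8760
OLYMPIC_POOL_LITERS = 2.5e6    # roughly a 50 m x 25 m x 2 m pool

fridges = ELECTRICITY_WH / (FRIDGE_WATTS * HOURS_PER_YEAR)
pools = WATER_LITERS / OLYMPIC_POOL_LITERS
print(f"{fridges / 1e6:.0f} million fridges running all year")  # 632 million
print(f"{pools:,.0f} Olympic pools")                            # 224,000
```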
Projections for 2030 and 2035
If current trends hold, AI will gobble up even more resources over the next decade, driving up the total AI Environmental Impact.
Data centers could consume 4.4% of global electricity by 2035, rivalling entire national grids.

By 2030:
- Data-center electricity could hit 945 TWh, or 3% of global electricity.
- Water withdrawals might top 1.2 trillion liters per year.
By 2035, under high-growth scenarios:
- Electricity demand could rise to 1,700 TWh, 4.4% of global power, more than three times Germany’s current annual electricity demand.
- Water withdrawals could exceed 10 billion cubic meters (10 trillion liters), enough to fill 4 million Olympic pools annually.
This leap stems from two main drivers:
- Accelerated servers (GPUs/TPUs), which power AI, are growing at around 30% per year in electricity use.
- Conventional servers and cooling systems, adding another 9–15% growth yearly.
Left unchecked, AI’s share of power grids and water systems will steadily climb, straining existing infrastructure.
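The two growth drivers compound quickly. A rough projection sketch (the 2024 split between accelerated and conventional load below is a hypothetical illustration, not a sourced figure; the growth rates come from the article):

```python
# Compound growth from the two driver categories. The 2024 split between
# AI-accelerated and conventional load is a hypothetical illustration,
# not a sourced figure; growth rates come from the article.

def project(base_twh: float, annual_rate: float, years: int) -> float:
    """Electricity use after compounding a fixed annual growth rate."""
    return base_twh * (1 + annual_rate) ** years

accelerated_2024 = 45.0     # hypothetical AI-accelerated share of 415 TWh
conventional_2024 = 370.0   # remainder: conventional servers and cooling
total_2030 = (project(accelerated_2024, 0.30, 6)      # ~30%/yr growth
              + project(conventional_2024, 0.12, 6))  # within the 9-15% band
print(round(total_2030))    # 948, close to the 945 TWh projection for 2030
```

The point of the sketch: even a small fast-growing slice dominates the total within a decade.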
AI Environmental Impact: The Climate and Ecosystem Toll
All that extra electricity and water come with real-world costs:
- Greenhouse Gases
Most of the world’s electricity still comes from fossil fuels. Every extra terawatt-hour means more CO₂ in the atmosphere. Without rapid decarbonization, data centers alone could become one of the largest CO₂ sources by 2035, doubling their current emissions.
- Water Stress
Many data centers sit near urban or semi-arid regions. Pulling millions of liters of water can worsen local shortages, affecting farming, drinking supplies, and natural habitats. Evaporating water in cooling towers dries out resources that communities need.
- Heat Pollution
Waste heat from data centers can raise nearby temperatures. Without proper heat reuse (like feeding warmed water to district heating), that heat simply dissipates, increasing local air-conditioning loads and pushing up energy use.
- Land Use and E-Waste
Building hyperscale data centers requires land. And server hardware becomes e-waste every few years, often containing rare metals that are hard to recycle.

This environmental footprint endangers biodiversity, worsens droughts, and accelerates climate change.
Social Consequences: Jobs, Inequality, and Community Strain
AI’s expansion doesn’t just reshape ecosystems; it shifts societies too:
- Job Displacement
Automation can replace roles in customer service, transportation, and even basic coding. Without new job creation and reskilling programs, displaced workers face unemployment or lower wages.
- Wage Gaps and Inequality
AI-savvy professionals often command salary premiums, and prompt engineers or data-science specialists can see 50–60% higher pay. Meanwhile, workers in traditional roles see stagnant wages, widening income gaps.
- Community Resistance
Some towns oppose new data centers, fearing noise, light pollution, and resource stress. In areas where water is scarce, residents may protest plans for new cooling facilities, pitting local needs against global tech demands.
Combined, these trends risk social unrest and political backlash against AI deployment.
Corporate Drivers: Speed, Cost, and Scale
Major tech companies race to build the biggest, fastest data centers. Their imperatives:
- Reduce Latency
Faster response times mean placing servers closer to users, often in regions without ample renewable power.
- Cut Operating Costs
Data-center operators negotiate cheap power deals, often from coal, natural gas, or large hydropower, at the expense of renewables.
- Stay Ahead in AI
Continuous model upgrades and retraining become a self-fulfilling prophecy: better AI demands more compute, which drives more AI development.

Some giants even build microgrids with on-site generators to guarantee uptime, bypassing local grids but locking in fossil-fuel use. This speed-and-cost logic sidelines sustainability, turning AI into a runaway resource train.
A Worst-Case Scenario: AI as an Ecological and Social Threat
What if no one steps in? Picture 2040:
- Data centers consume 4.4% of global electricity, rivalling entire national grids.
- Water withdrawals for cooling surpass municipal use in several drought-prone regions.
- Carbon emissions from AI push the world further from climate targets, adding hundreds of millions of tons of CO₂ annually.
- Communities protest water grabs, blocking new data-center builds.
- Workers displaced by automation face joblessness; social safety nets buckle under the strain.
- Inequality spikes as high-paid AI experts contrast with underpaid service workers.

In that world, AI is no longer a helpful tool but a catalyst for environmental crisis, social fracture, and political conflict.
Charting a Responsible Path Forward
Mitigating AI’s dark trajectory requires immediate, multi-layered action:

- Energy Efficiency & Model Design
- Adopt Mixture-of-Experts models: Activate only the subnetwork needed for each task, slashing energy by up to 90%.
- Embrace quantization and pruning: Reduce bit-width (e.g., FP8) and remove redundant parameters, cutting compute by 40–70% with minimal accuracy loss.
- Develop domain-specific small models rather than one monolithic generalist AI.
- Data-Center Decarbonization
- Power centers with 100% renewables: Solar, wind, hydro, even modular nuclear (SMRs) for consistent baseload.
- Deploy dynamic scaling & smart load balancing: Match compute demand to renewable availability, buffering AI workloads to off-peak green hours.
- Innovate cooling:
- Immersion cooling submerges servers in dielectric fluids, cutting water by 80%.
- Evaporative-free direct-to-chip liquid or heat-reuse systems reclaim waste heat for district heating, slashing water use by 50–70%.
- Governance & Standards
- Mandate AI environmental disclosures akin to carbon reporting: energy use, PUE, WUE (liters of water per kWh), and water withdrawal versus consumption metrics.
- Incentivize green AI R&D: Public funding priorities for efficient algorithms, new silicon (PIM/CIM), neuromorphic and optical processors that cut energy 5–10×.
- Enshrine responsible AI in public policy:
- EU AI Act’s environmental safety provisions and US Infrastructure law’s clean energy credits.
- Create AI-Energy Scorecards for transparency, like nutritional labels for compute.

- Human-Centered AI Deployment
- Right-size AI: Use smaller models for routine tasks; escalate to large models only when needed.
- Agent OS ecosystems: Orchestrate multiple specialized models to minimize redundant compute.
- Drive AI literacy & fair labor practices: Upskill workers for AI-augmented roles; legislate worker protections to share productivity gains and prevent jobless growth.
- Public Awareness & Advocacy
- Empower consumers with energy footprints per query (e.g., pop-up: “This ChatGPT query uses 3 Wh, you can boil water for 8 seconds”).
- Mobilize data-center location transparency: Local drilling permits linked to resource assessments.
- Foster grassroots climate-AI movements: Similar to e-waste and plastic bans, push for low-water AI zones, renewable data-centers.
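The quantization point above can be made concrete. Below is a toy sketch of symmetric int8 weight quantization (illustrative only, not any production library’s API): storing weights as 8-bit integers instead of 32-bit floats cuts their memory footprint by 4x, at the cost of a small, bounded rounding error.

```python
# Toy sketch of symmetric int8 weight quantization (illustrative only,
# not any production library's API). Storing weights as 8-bit integers
# instead of 32-bit floats cuts their memory footprint 4x.

def quantize(weights: list[float]) -> tuple[list[int], float]:
    """Map floats onto the int8 range [-127, 127] with one scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized: list[int], scale: float) -> list[float]:
    """Recover approximate float weights from the int8 representation."""
    return [q * scale for q in quantized]

weights = [0.52, -1.27, 0.003, 0.98]
q, scale = quantize(weights)    # q == [52, -127, 0, 98]
restored = dequantize(q, scale)
# restored is close to the original; the gap (at most scale/2 per weight)
# is the "minimal accuracy loss" the article refers to.
```

Real deployments pair schemes like this with pruning and hardware integer kernels; this sketch only shows why the representation itself is cheaper.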
This is a Shared Responsibility
AI’s power to transform lives and society is real, and so are its costs in energy, water, and social impact. We stand at a crossroads. One path leads to runaway resource demand, carbon spikes, water wars, and social upheaval. The other path leads to “green AI”: efficient models, renewable-powered data centers, clear policies, and an informed public.

We have the tools and know-how. What we need now is the collective will from tech leaders, governments, communities, and each of us as users. By choosing responsible design, deployment, and consumption, we can ensure AI remains a force for good, one that serves humanity without draining the planet.
Further reading:
- UNESCO Report Warns of AI’s Soaring Energy Use, Offers 90% Reduction Blueprint
A comprehensive study detailing how smaller, task-specific models and prompt compression can dramatically reduce AI’s energy footprint.
- How can we create a sustainable AI future? – TechRadar
Explores innovations in datacenter cooling, energy efficiency, and resource-aware AI deployment strategies.
- AI: Our Ally in the Energy Transition – Energy Live News
Highlights how AI can optimize energy distribution, support electrification, and accelerate decarbonization efforts.
- AI needs to be smaller, reduce energy footprint – UNESCO study
A concise summary of the environmental risks posed by large language models and the benefits of downsizing.
- Small changes to AI LLMs could cut energy use by 90 percent – UNESCO/UCL
Breaks down three key strategies (model compression, prompt trimming, and modular design) for sustainable AI development.
- Artificial intelligence: How much energy does AI use? – UNRIC
Offers a lifecycle view of AI’s environmental footprint, from hardware manufacturing to inference and e-waste.
- How much energy will AI really consume? – Nature
Investigates the local and global energy demands of AI data centers, with a focus on transparency and infrastructure strain.
- AI’s energy dilemma: Challenges, opportunities, and a path forward – World Economic Forum
Proposes a four-part framework for responsible AI deployment, including decarbonization and ecosystem collaboration.
- The increasing energy demand of artificial intelligence and its impact – ECB
An economic analysis of how AI-driven energy demand could affect commodity prices and national electricity markets.
- AI Is Accelerating the Loss of Our Scarcest Natural Resource: Water – Forbes
Examines AI’s water footprint across cooling, electricity generation, and chip manufacturing.
- Explained: Generative AI’s environmental impact – MIT News
A deep dive into the resource intensity of training and deploying generative models like GPT-4.
- AI’s excessive water consumption threatens to drown out its environmental contributions – The Conversation
Argues that AI’s water usage may undermine its sustainability benefits, especially in drought-prone regions.
- How much water does AI consume? The public deserves to know – OECD
Introduces scope-based water accounting and calls for transparency in AI’s water footprint.
- AI’s carbon footprint is bigger than you think – MIT Technology Review
Quantifies the emissions from inference and training, and urges smarter deployment strategies.
- Revisit the environmental impact of artificial intelligence – Springer
Estimates that top AI systems could emit over 100 Mt of CO₂ annually, with implications for carbon pricing.
- Wiley partners with Anthropic on responsible AI integration – The Bookseller
Discusses efforts to embed peer-reviewed content into AI platforms while maintaining citation integrity.
- GENAI Summit 2025: Forging a Network of AI Innovators – PR Newswire
Highlights global collaboration on ethical AI development and responsible innovation.