The Energy Hunger of AI: Can the World Power the Next Generation of Intelligence?

Since generative AI systems were integrated into mainstream platforms in 2023–2024—such as Bing, Gemini, Microsoft 365 Copilot, and assistants like ChatGPT and Claude—usage has surged to trillions of queries per month. This is no longer just a technological leap; AI is now running on massive high-performance data centers, creating an entirely new category of energy consumption.
Unlike a traditional web search, generating an AI response can require orders of magnitude more raw computation; in energy terms, Google estimates that an AI-assisted search consumes about 3 to 5 times more electricity than a standard query. While this may seem negligible per user, at the scale of billions of daily interactions the pressure on global electricity systems is significant.
Training a model on the scale of GPT-4 requires roughly 1,000 megawatt-hours of electricity (based on Epoch AI estimates)—equivalent to the annual consumption of around 120 average U.S. households. But training is a one-time event; the real challenge is the ongoing energy demand from usage—billions of daily requests that are processed in real time, continuously consuming server power.
As of 2024, global AI server operations are estimated to consume around 10 terawatt-hours (TWh) of electricity annually. This accounts for roughly 0.04% of total global electricity consumption, which the IEA estimates at 26,500 TWh per year. However, this figure is climbing fast.
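The training-versus-usage comparison can be made concrete with the article's two rough estimates (a ~1,000 MWh training run and ~10 TWh of annual server usage). A minimal back-of-the-envelope sketch:

```python
# Compare one-time training energy with ongoing inference energy,
# using the article's rough estimates (both are approximations).

TRAINING_MWH = 1_000.0     # ~GPT-4-scale training run (Epoch AI estimate)
USAGE_TWH_PER_YEAR = 10.0  # estimated global AI server usage, 2024
USAGE_MWH_PER_YEAR = USAGE_TWH_PER_YEAR * 1_000_000  # 1 TWh = 1,000,000 MWh

# How many "training runs" worth of energy does usage consume per year?
runs_per_year = USAGE_MWH_PER_YEAR / TRAINING_MWH
minutes_per_run = 365 * 24 * 60 / runs_per_year

print(f"Annual usage equals ~{runs_per_year:,.0f} training runs")
print(f"i.e. one training run's worth of energy every ~{minutes_per_run:.0f} minutes")
```

Under these estimates, global usage burns through the energy of an entire GPT-4-scale training run roughly every hour, which is why ongoing inference, not training, dominates the long-term picture.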
Instead of projecting a single number, it is more helpful to consider a range of scenarios to capture the uncertainty of such a rapidly evolving technology. Below are three outlooks, each based on specific assumptions.
Key Assumptions:
- The market doubles every 18 months (based on observed user growth)
- AI’s share of electricity demand increases proportionally with usage
- Global electricity consumption grows by 2% annually through 2027 (IEA estimate)
Scenarios for 2027:
- Low-growth scenario (slowing expansion):
AI consumes around 15 TWh, equal to approximately 0.05% of projected global electricity demand (~28,000 TWh). This assumes increased efficiency and a plateau in adoption.
- Moderate-growth scenario (doubling every 18 months):
AI reaches 45–50 TWh annually, or about 0.16–0.18% of global demand. That is comparable to the total electricity use of a mid-sized country.
- High-growth scenario (accelerated integration):
If AI becomes embedded across search engines, mobile apps, and productivity tools, electricity use could exceed 90–100 TWh, or 0.32–0.36% of global consumption, roughly the annual power use of a country like the Netherlands.
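The three outlooks follow mechanically from the stated assumptions. A minimal sketch of the arithmetic (the 12-month doubling period for the high-growth case and the fixed 15 TWh plateau for the low case are illustrative assumptions standing in for the qualitative scenario descriptions):

```python
# Back-of-the-envelope projection of AI electricity demand for 2027.
# All inputs are the article's stated assumptions, not measured data.

BASE_TWH_2024 = 10.0        # estimated AI server consumption, 2024
GLOBAL_TWH_2024 = 26_500.0  # IEA estimate of global electricity use
GLOBAL_GROWTH = 0.02        # assumed annual growth in global demand
YEARS = 3                   # 2024 -> 2027

# 2027 demand per scenario, in TWh. The moderate case applies the
# "doubling every 18 months" assumption; the high case assumes a
# faster, illustrative 12-month doubling period.
scenarios = {
    "low (plateau)": 15.0,
    "moderate (double / 18 mo)": BASE_TWH_2024 * 2 ** (YEARS * 12 / 18),
    "high (double / 12 mo)": BASE_TWH_2024 * 2 ** (YEARS * 12 / 12),
}

# Global demand in 2027 after compounding 2% annual growth.
global_2027 = GLOBAL_TWH_2024 * (1 + GLOBAL_GROWTH) ** YEARS

for name, twh in scenarios.items():
    share = 100 * twh / global_2027
    print(f"{name:28s} {twh:6.1f} TWh  ({share:.2f}% of ~{global_2027:,.0f} TWh)")
```

Two doublings over three years takes the 10 TWh baseline to 40 TWh, in line with the moderate scenario's 45–50 TWh range; three doublings lands near the high scenario's 90–100 TWh range.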
In this context, the tech sector faces a critical choice: drastically improve model efficiency and architecture, or accelerate investment in clean energy. Otherwise, AI expansion may soon become one of the main drivers of global energy stress. AI is no longer just a digital issue; it is becoming a question of physical infrastructure and energy policy.