AI’s Energy Consumption
Ethan Kassai
Tech Writer
The rapid development of artificial intelligence technologies is profoundly reshaping energy consumption patterns in the United States. As AI becomes ever more prominent in everyday life, demand for electricity, particularly from AI data centers, is expanding at an unprecedented rate.

Data centers, the computational backbone of AI, are driving a rapid increase in energy usage. US data centers used about 147 terawatt-hours (TWh) of electricity in 2023, roughly 4% of the nation's electricity demand. Analysts estimate that by 2030 this figure could reach about 606 TWh, or around 12% of US electricity consumption. The increase is driven mostly by the training and deployment of ever-larger AI models, an extremely energy-intensive process.

This growth places tremendous strain on the US power grid, with some locations affected more than others. Regions with high concentrations of data centers, such as Northern Virginia, are struggling to meet rising electricity demand, raising concerns about higher costs for consumers and about grid reliability. Brownouts and blackouts already occur in states such as California, and similar problems could spread to other regions. Because AI data centers run continuously, they create sustained, around-the-clock power demand, which complicates grid management for energy authorities in both the public and private sectors.
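The scale of the projected jump can be put in perspective with some back-of-the-envelope arithmetic. This is a rough sketch using only the figures cited above; the growth rate and implied grid totals are derived here for illustration, not taken from the original estimates:

```python
# Back-of-the-envelope arithmetic on the data-center figures cited above.
# 2023: ~147 TWh, ~4% of US electricity; 2030 projection: ~606 TWh, ~12%.

usage_2023, share_2023 = 147.0, 0.04   # TWh, fraction of US electricity
usage_2030, share_2030 = 606.0, 0.12   # projected values

# Overall growth factor and the implied compound annual growth rate (CAGR)
growth = usage_2030 / usage_2023       # ~4.1x over seven years
cagr = growth ** (1 / 7) - 1           # ~22% per year

# Implied total US electricity consumption in each year (TWh),
# derived by dividing data-center usage by its share of the grid
total_2023 = usage_2023 / share_2023   # ~3,675 TWh
total_2030 = usage_2030 / share_2030   # ~5,050 TWh

print(f"Growth factor 2023-2030: {growth:.1f}x (CAGR ~{cagr:.0%})")
print(f"Implied US grid totals: {total_2023:,.0f} TWh -> {total_2030:,.0f} TWh")
```

Notably, the implied totals suggest overall US electricity consumption would also have to grow substantially for data centers' share to land at 12%, which is consistent with the grid-strain concerns described above.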

In response to growing energy demands, the US government has launched several policy measures to bolster energy resources for AI data centers. In early 2025, a law took effect that aims to accelerate the construction of large-scale AI infrastructure and clean energy facilities, with the goal of expanding the supply of cleaner energy and softening the impact on electricity prices. Meanwhile, debate continues over which energy sources should meet AI's demand: some advocate reviving coal to power AI operations, while others support transitioning to clean, renewable sources to mitigate environmental harm.

Innovation is not confined to energy producers and suppliers, though. Companies building AI technologies are also working to make their systems more energy efficient. Approaches such as on-device AI processing reduce energy consumption by minimizing data transmission and using power-saving hardware. These efforts aim to limit AI's environmental impact without sacrificing performance.

The rise of AI technologies presents both opportunities and challenges for the US energy landscape. Meeting the ever-increasing energy requirements will take a multipronged approach, combining infrastructure development, policy initiatives, and technological improvements, to ensure a sustainable, reliable energy supply that can support continued AI advancement.
Contact Ethan at ethan.kassai@student.shu.edu