MIT: The Math on AI’s Energy Footprint

Zeke Rosenberg | Data, News

MIT Technology Review Breaks Down AI’s Energy Footprint

MIT Technology Review reporters James O’Donnell and Casey Crownhart have published a detailed analysis of AI’s energy demands, based on interviews with two dozen experts, independent testing of open-source models, and a review of industry and government data. Their central finding: AI’s electricity use is rising sharply, and key data about that usage remains largely undisclosed.

While attention often focuses on the energy required to train large AI models, O’Donnell and Crownhart emphasize that inference—the real-time use of AI tools—is now the dominant source of demand, accounting for an estimated 80–90% of AI compute usage.

As AI features are integrated into more everyday applications—from search to customer service to personal devices—energy use is expected to scale dramatically. According to projections they cite from Lawrence Berkeley National Laboratory, by 2028 more than half of U.S. data center electricity could be used for AI.

The piece also highlights how the rapid growth of AI infrastructure is already reshaping energy planning: utility deals with data centers could shift costs onto ratepayers, and new projects often lean on fossil fuels to meet near-term demand.

🔗 Read the full article on MIT Technology Review