The most important stories, news, and ideas from the world of AI. Join us bi-weekly to learn more about a critical matter in the blossoming AI industry.
Visit handyai.substack.com for article versions of topics discussed here, and up-to-date news on AI and tech.
Hosted on Acast. See acast.com/privacy for more information.

In this episode, we explore the voracious energy consumption of large language models (LLMs). These AI systems consume massive amounts of electricity during both training and inference. A single training run for a model like GPT-3 uses around 1,287 MWh of electricity, producing carbon emissions comparable to 550 round-trip flights between New York and San Francisco. Inference amplifies the problem, with ChatGPT's monthly energy usage estimated at anywhere from 1 to 23 million kWh.
The energy appetite of LLMs mirrors the cryptocurrency mining crisis, consuming enormous power with questionable societal benefits. Closed-source models like GPT-4o and Gemini hide their energy usage, hindering regulation and public accountability. The unchecked expansion of LLMs threatens global efforts to reduce energy consumption and combat climate change. It's time to confront the dangerous appetite of AI.