About this episode
Vijay Gadepally joins Ed and Sara to break down the real energy footprint of AI—and why most people (and companies) are getting it wrong.

They discuss:
- How "agentic" AI systems use an order of magnitude more energy than ChatGPT.
- Whether efficiency gains can keep pace with exploding usage (spoiler: not yet).
- The one simple change that could cut AI energy use by 80%.

Vijay is Senior Scientist at the MIT Lincoln Laboratory Supercomputing Center and Co-Founder of Bay Compute and Radium Cloud. He studies what's actually happening under the hood of AI systems—and has the data to back it up.

If you've been wondering whether AI is derailing the clean energy transition, or whether smarter software design could keep energy use in check, this is the conversation you need to hear.

🎙️ TIMESTAMPS
00:00:00 - Introduction & Cold Open
00:01:17 - Welcome & Guest Introduction
00:02:59 - Agentic AI: The New Energy Problem
00:04:10 - A Brief History of AI: From Expert Systems to LLMs
00:08:43 - Agentic AI vs. LLMs vs. Reasoning Models Explained
00:10:00 - The Energy Reality: One AI Node = 10-15 Homes
00:14:13 - Why Energy Consumption is Unpredictable
00:16:02 - The Big Flip: Training vs. Inference Energy Use
00:26:22 - What Does "Efficient AI" Actually Mean?
00:29:37 - Are Tech Companies Optimizing for Energy or Market Share?
00:36:20 - The Low-Hanging Fruit: Cutting AI Energy Use by 80%

Full notes & references

Send us a text (if you'd like a response, please include your email)

Follow us on:
LinkedIn
Bluesky
X/Twitter
Instagram

Energy vs Climate relies on the support of our generous listeners
Donate to keep Energy vs Climate going

Produced by Bespoke Podcasts