About this episode
In this episode of Environment Variables, host Chris Adams welcomes Scott Chamberlin, co-founder of Neuralwatt and former Microsoft software engineer, to discuss energy transparency in large language models (LLMs). They explore the challenges of measuring AI emissions, the importance of data center transparency, and projects that work to enable flexible, carbon-aware use of AI. Scott shares insights into the current state of LLM energy reporting, the complexities of benchmarking across vendors, and how collaborative efforts can help create shared metrics to guide responsible AI development.

Learn more about our people:
Chris Adams: LinkedIn | GitHub | Website
Scott Chamberlin: LinkedIn | Website

Find out more about the GSF:
The Green Software Foundation Website
Sign up to the Green Software Foundation Newsletter

News:
Set a carbon fee in Sustainability Manager | Microsoft [26:45]
Making an Impact with Microsoft's Carbon Fee | Microsoft Report [28:40]
AI Training Load Fluctuations at Gigawatt-scale – Risk of Power Grid Blackout? – SemiAnalysis [49:12]

Resources:
Chris's question on LinkedIn about understanding the energy usage from personal use of Generative AI tools [01:56]
Neuralwatt Demo on YouTube [02:04]
Charting the path towards sustainable AI with Azure Machine Learning resource metrics | Will Alpine [24:53]
NVApi - Nvidia GPU Monitoring API | smcleod.net [29:44]
Azure Machine Learning monitoring data reference | Microsoft