About this episode
The future of the grid increasingly hinges on where and how data centers get built. To forecast the kind of power infrastructure we need to meet AI’s growing appetite, we first need to understand a long list of variables: data center size, workload type, latency, reliability — even the type of cooling system a data center uses.
So what’s the state of play in data center development today — and how are the trends shaping grid needs?
In this episode, Shayle talks to Chris Sharp, chief technology officer of Digital Realty, a developer, owner and operator of data centers. They cover topics like:
How AI inference workloads are clustering in existing regions, driven by latency and throughput requirements
“Data gravity” and “data oceans”: how large concentrations of data attract more compute infrastructure
What’s driving longer lead times: interconnection delays, equipment bottlenecks, or both?
Large-scale builds vs. incremental additions and densification of existing infrastructure
“Braggawatts” vs. real demand: separating hype from reality
The diverging power needs of training vs. inference, and whether any workloads can run on intermittent power
The evolving role of “bridge power” and why diesel and gas are still in the mix
Resources:
Latitude Media: Google’s new data center model signals a massive market shift
Latitude Media: The future of energy-first data centers takes shape
Latitude Media: Can a new coalition turn data centers into grid assets?
Latitude Media: Do microgrids make sense for data centers?
The New York Times: Wall St. Is All In on A.I. Data Centers. But Are They the Next Bubble?
Catalyst: The case for colocating data centers and generation
Credits: Hosted by Shayle Kann. Produced and edited by Daniel Woldorff. Original mus