
Inside Iceotope’s Mission to Cool AI Sustainably
As AI workloads push data centers to their physical and environmental limits, the industry is waking up to a hard truth: today’s cooling methods can’t sustain tomorrow’s compute demands. The rise of multi-megawatt racks, unprecedented water consumption, and surging energy use are forcing operators to rethink infrastructure from the ground up.
At the 2025 AI Infra Summit in Santa Clara, Jeniece Wnorowski and I sat down with Jonathan Ballon, CEO of Iceotope, for a Data Insights interview on why it’s imperative to consider sustainability in AI infrastructure buildout—and how Iceotope is redefining cooling technology both at the edge and in the data center.
The Efficiency Imperative
AI’s rapid growth has triggered exponential increases in power and water consumption. Traditional air cooling is struggling to keep up with the thermal profiles of modern AI accelerators, while water-intensive cooling towers and evaporative systems are under scrutiny for their environmental impact.
“When you look at the amount of resources that are being consumed right now—whether it’s power or water—it’s unsustainable,” Ballon said.
This urgency was one of the key factors that drew him to lead Iceotope. The company’s data center liquid cooling technologies use 96% less water than conventional systems and reduce cooling power requirements by up to 80%. Its edge AI solutions can be deployed in almost any environment, without relying on existing facility water or dry chillers.
Those kinds of efficiency gains aren’t just good for the planet—they’re essential to the economics of deploying AI workloads. As data center campuses scale into the gigawatt range and low-latency, high-throughput edge applications continue to grow, sustainable cooling is becoming both a moral and a financial imperative.
Rethinking Infrastructure Without Forklifts
For enterprises, the efficiency conversation is tied closely to infrastructure planning. Many organizations are not starting from scratch—they’re retrofitting existing facilities or deploying AI workloads incrementally.
“I think enterprises need to think carefully about how they’re going to invest in new infrastructure,” Ballon explained. “What we’re seeing is enterprises that are looking for retrofit capability that doesn’t require forklift upgrades so that they can adopt liquid cooling gradually rather than having to do...forklift changes either in their existing infrastructure or in their new builds. Looking for technology that allows you to do that is super important. Otherwise, they could be making major infrastructure changes that could be outdated in three to five years, which could be a major risk.”
That message resonates with enterprise IT leaders who must balance innovation with capital discipline. Liquid cooling solutions that integrate into existing footprints without major overhauls offer a pragmatic path forward.
Lessons from Hyperscalers
It’s tempting for enterprises to look at hyperscalers or “neo-clouds” as models for AI infrastructure strategy. But Ballon cautioned against simply trying to emulate the giants.
Instead, he suggests that enterprises use hyperscaler deployments as reference points while tailoring their strategies to their own operational realities. For many organizations, that means exploring modular, flexible solutions that enable growth over time rather than attempting to leap directly to hyperscale patterns.
New Edge Deployment Models
One of the most intriguing threads of the conversation centered on non-traditional deployment locations. Ballon and the hosts discussed the idea of deploying AI infrastructure in underused commercial spaces, like vacant office buildings, as part of an edge strategy.
While a hyperscaler wouldn’t take over an office suite to build something below their usual scale, a retailer might, he said. As enterprises look for ways to bring AI closer to where data is generated, creative use of available real estate could accelerate edge deployments—particularly if efficient, compact, and quiet cooling solutions make it viable. Iceotope’s latest product line focuses on edge solutions that don’t require access to facility water or dry chillers, and their fan-free, quiet operation makes them well suited to office environments.
A Legacy-Driven Mission
For Ballon, Iceotope’s mission goes beyond engineering.
“I’m at the stage of my career where I think about my kids, I think about the environment, and what legacy I want to leave behind,” he said.
By dramatically reducing water and energy use, Iceotope aims to make the environmental footprint of AI infrastructure compatible with a sustainable future. It’s a vision that blends technical innovation with a long-term view of responsibility—one increasingly shared across the data center industry.
The TechArena Take
Cooling is no longer an operational afterthought—it’s becoming a strategic lever for sustainable AI growth. Iceotope’s approach demonstrates how innovations in liquid cooling can enable enterprises to meet surging compute demands while dramatically cutting environmental impact.
The big insight from Ballon’s conversation: enterprises don’t need to mimic hyperscalers to succeed in the AI era. By focusing on retrofit-friendly technologies and purpose-built designs, they can chart their own path—one that balances performance, cost, and environmental stewardship.
As AI infrastructure spreads beyond massive campuses into distributed edge locations, energy-efficient and water-saving cooling solutions will shape the future of data center design. Iceotope is positioning itself squarely at the intersection of sustainability and innovation.