Giving Storage Strategy a Reboot for Efficiency with Solidigm

October 10, 2025

Modern storage infrastructure presents a complex balancing act. As solid-state drives (SSDs) evolve to provide performance levels demanded by artificial intelligence (AI) workloads, power consumption has grown alongside speed, prompting a necessary evolution in how organizations evaluate and optimize their storage investments.

During a recent TechArena Data Insights episode, I spoke about this phenomenon with Jeniece Wnorowski, director of industry expert programs at Solidigm, and Scott Shadley, director of leadership narratives at Solidigm. Our conversation revealed the complex factors affecting storage efficiency and the key areas organizations should consider as they work to optimize their systems.

Redefining Storage Efficiency with Modern Metrics

To set the stage for our conversation about storage efficiency, Scott noted that in his work with customers and partners, what’s critical is “understanding how we manage budgets. And those budgets include power budgets and all the other aspects of building an efficient data center.”

Considering how finite resources are allocated has become increasingly important as modern flash-based storage products are deployed in architectures that demand unprecedented performance levels. Those demands have pushed SSDs, devices originally designed to be both fast and power efficient, to draw more power than anyone expected.

The challenge, in fact, lies not in the technology but in the metrics used to determine the best storage solution for use case requirements. As system demands increase, new measures are necessary to make architecture and procurement decisions. “We’ve always used the same metric, dollar per gigabyte,” Scott explained. “There’s a lot of new metrics that we’re focused on today, like watts per terabyte or terabytes per input/output operations per second…so we’ve evolved the ecosystem to talk through what a modern infrastructure looks like.” These measurements provide a more accurate picture of total system efficiency and help shift decisions away from simply buying the fastest or biggest drive toward choosing the right storage solution for the job.
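
To make those metrics concrete, here is a minimal sketch, using entirely invented drive figures rather than any vendor’s specifications, of how watts per terabyte and IOPS per watt can rank the same two drives differently depending on whether capacity or performance dominates the use case:

```python
# Compare two hypothetical drives on modern efficiency metrics.
# All capacity, power, and IOPS figures are invented for illustration.

drives = [
    {"name": "Drive A (performance-tuned)", "tb": 15.36, "watts": 20.0, "iops": 1_500_000},
    {"name": "Drive B (capacity-tuned)",    "tb": 61.44, "watts": 25.0, "iops": 900_000},
]

for d in drives:
    watts_per_tb = d["watts"] / d["tb"]     # lower is better for dense, cooler storage
    iops_per_watt = d["iops"] / d["watts"]  # higher is better for hot, I/O-bound data
    print(f"{d['name']}: {watts_per_tb:.2f} W/TB, {iops_per_watt:,.0f} IOPS/W")
```

By the capacity metric the denser drive wins; by the performance metric the faster one does, which is exactly why the right answer depends on the job rather than on a single spec-sheet number.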

Engineering Efficiency in Every Part of SSDs

While the legacy of SSDs is already rooted in efficiency, Solidigm is actively working on solutions to further improve storage efficiency. For example, the company has worked with standards bodies and partners to optimize idle times. “These power states that we can put drives in make sure that they make the most of the power available to them. They have fast on, fast off, and things that you just can’t do with other aspects of storage infrastructure,” he explained.
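
A rough back-of-the-envelope sketch shows why those idle power states matter at scale; the power figures and duty cycle below are assumptions for illustration, not measured Solidigm values:

```python
# Estimate fleet power with and without low-power idle states.
# ACTIVE_W, IDLE_W, DRIVES, and IDLE_FRACTION are all hypothetical.

ACTIVE_W = 20.0       # power per drive while servicing I/O
IDLE_W = 5.0          # power per drive in a low-power idle state
DRIVES = 400          # drives across a handful of racks
IDLE_FRACTION = 0.6   # share of time the fleet has no work to do

# Without usable idle states, drives burn active power around the clock.
always_on_kw = DRIVES * ACTIVE_W / 1000

# With fast-entry/fast-exit idle states, idle time is spent at IDLE_W.
blended_w = ACTIVE_W * (1 - IDLE_FRACTION) + IDLE_W * IDLE_FRACTION
with_idle_kw = DRIVES * blended_w / 1000

print(f"Always active:    {always_on_kw:.1f} kW")
print(f"With idle states: {with_idle_kw:.1f} kW "
      f"({1 - with_idle_kw / always_on_kw:.0%} reduction)")
```

The “fast on, fast off” part is what makes the savings real: if entering or leaving a low-power state were slow, latency-sensitive workloads could never afford to use it.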

The architectural innovations extend beyond power states to fundamental design choices. For example, Scott detailed how Solidigm has long focused on optimizing the design of SSD controllers, which can draw significant power if designed inefficiently. For ultra-high-capacity drives like its 122TB models, the company has used architecture and firmware design to keep only the necessary components active at any given time, which becomes critical when hundreds of drives populate enterprise racks.

Tackling System-Level Optimization for Maximum Benefit

Beyond the drives themselves, holistic system changes are critical to optimizing efficiency. Scott emphasized that modernization efforts must address both hardware and software components to realize systems’ full potential. Our discussion revealed a particularly intriguing challenge on the software side: legacy code optimization. Many applications originally designed for spinning media include built-in wait times, which become counterproductive with SSD deployment. These unnecessary delays waste power because systems continue drawing energy while waiting for data that has already arrived.  
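
The pattern is easy to picture. Here is a minimal, hypothetical illustration of the kind of legacy delay described above: a fixed pause once tuned to hard-drive seek times that becomes pure wasted time and energy on flash:

```python
import time

def read_block_legacy(f, offset, size):
    """HDD-era pattern: pad the read with a delay sized to seek latency."""
    f.seek(offset)
    time.sleep(0.008)  # ~8 ms, plausible for spinning media; pure waste on an SSD
    return f.read(size)

def read_block_modern(f, offset, size):
    """Let the drive's actual latency set the pace; no artificial wait."""
    f.seek(offset)
    return f.read(size)
```

Multiplied across millions of I/Os, those padding delays keep CPUs and drives powered while accomplishing nothing, which is why legacy code optimization belongs in any efficiency overhaul.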

Taking that challenge of comprehensive improvement a step further, Scott pointed out that drives are just one component of a larger system that must be considered. “It’s not about the drive,” he said. “It’s about the rack, and what you can do with the rack to make that rack more efficient.” A partnership with Ocient, which builds a rack infrastructure that reduces the physical footprint required, shows the benefits of this approach. Shrinking the footprint cuts server count and rack-level power, which translates into real reductions in total cost of ownership.
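
The consolidation math behind that footprint argument is straightforward. A rough sketch with invented capacity and power figures (not Ocient’s or Solidigm’s actual numbers):

```python
import math

# How higher-capacity drives shrink server count for a fixed capacity goal.
# All figures below are invented for illustration.

TARGET_TB = 10_000        # usable capacity goal (10 PB)
DRIVES_PER_SERVER = 24    # drive slots per storage server
SERVER_OVERHEAD_W = 800   # per-server power excluding the drives themselves

for drive_tb in (15.36, 61.44, 122.88):
    servers = math.ceil(TARGET_TB / (drive_tb * DRIVES_PER_SERVER))
    overhead_kw = servers * SERVER_OVERHEAD_W / 1000
    print(f"{drive_tb:>7.2f} TB drives -> {servers:>2} servers, "
          f"~{overhead_kw:.1f} kW of server overhead")
```

Fewer servers means fewer CPUs, NICs, and fans drawing power just to host drives, which is where the rack-level savings in total cost of ownership come from.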

For organizations beginning efficiency overhauls, Scott recommended focusing on three key areas: software infrastructure optimization to eliminate unnecessary wait times, right-sizing storage performance to actual requirements rather than perceived needs, and leveraging portfolio diversity to match specific use cases with appropriate storage technologies. “Don’t just buy the fastest things, and even sometimes the biggest one isn’t what you need. We’ve got the portfolio to help you make yourself the most efficient system that can also scale,” he said.

The TechArena Take

The evolution of storage efficiency reflects a broader maturation in how enterprises approach infrastructure optimization. While IT teams wrestle with rising power consumption from high-performance storage, Solidigm’s focus on comprehensive efficiency demonstrates that the solution lies in addressing a complex web of factors. The companies that not only deploy efficient, modern drives but also update their purchasing metrics and trade piecemeal optimization for a true systems-thinking approach will see the greatest benefits as workload demands continue to accelerate.

To learn more about Solidigm’s approach to efficiency and storage, connect with Scott Shadley on LinkedIn or explore Solidigm’s efficiency solutions at solidigm.com.

Watch the podcast | Subscribe to our newsletter
