5 Compute Efficiency Takes with WEKA President Jonathan Martin
As TechArena prepares to roll out the 2024 Compute Sustainability Report, I was privileged to sit down with WEKA President Jonathan Martin to discuss how the right data foundation is critical to making GPUs more efficient and improving the sustainability and performance of AI applications and workloads.
WEKA's leading AI-native data platform software solution, the WEKA Data Platform, was purpose-built to deliver the performance and scale required for enterprise AI training and inference workloads across distributed edge, core, and cloud environments.
Jonathan, WEKA’s President, is responsible for the company’s global go-to-market (GTM) functions and operations, which include sales, marketing, strategic partnerships, and customer success.
ALLYSON: Jonathan, thank you for being here today. It used to be that data storage was an arena not exactly known for innovation. With organizations utilizing multiple clouds and looking to do more interesting things with data, that has fundamentally changed. How do you view data platforms today?
JONATHAN: While we are still in the early days of the AI revolution, we’re already seeing how transformative this technology can be across nearly every industry. Enterprises are now adopting AI in droves – and despite being relatively new in the market, generative AI is eclipsing all other forms of AI. A recent global study conducted by S&P Global Market Intelligence in partnership with WEKA found that an astounding 88% of organizations say they are actively exploring generative AI, and 24% say they already see generative AI as an integrated capability deployed across their organization. Just 11% of respondents are not investing in generative AI at all.
This rapid shift to embrace generative AI is forcing organizations to reevaluate their technology stacks, as they struggle to reach enterprise scale. The same study found that in the average organization, 51% of AI projects are in production but not being delivered at scale. Thirty-five percent of organizations cited storage and data management as the top technical inhibitor to scaling AI, outpacing compute (26%), security (23%), and networking (15%).
At WEKA, we believe that every company will need to become an AI-native company to not only survive, but thrive, in the AI era. Becoming AI-native will require that they adopt a disaggregated data pipeline-oriented architectural approach that can span edge, core and cloud environments. A typical AI pipeline has mixed IO workload requirements from training to inference and is extremely data-intensive. Legacy data infrastructure and storage solutions fall short because they weren’t designed to meet the high throughput and scalability requirements of GPUs and AI workloads.
A unified data platform software approach also gives organizations the ultimate flexibility and data portability they need at the intersection of cloud and AI. Instead of vertical storage stacks and data siloes, a data platform creates streaming horizontal data pipelines that enable organizations to get the most value from their data, no matter where it is. A unified data platform is the data foundation of the future.
ALLYSON: WEKA has made a name for itself with sustainability. Why is this important for the way you’ve designed your solutions?
JONATHAN: WEKA’s software was designed for maximum efficiency, which is inherently more sustainable. Legacy data infrastructure contains a lot of inefficiencies, which have a steep environmental cost. Further compounding the problem, AI workloads are incredibly power-hungry and enterprise data volumes are growing, so the environmental impact of AI is a big global issue, and a growing area of concern for businesses.
In fact, in the S&P Global study, 64% of respondents worldwide say their organization is “concerned” or “very concerned” about the sustainability of AI infrastructure, with 30% saying that reducing energy consumption is a driver for AI adoption in their organization.
There are a few ways organizations can start addressing AI’s energy consumption and efficiency issues. The first is GPU acceleration. On average, the GPUs needed to support AI workloads sit idle about 70% of the time, wasting energy and emitting excessive carbon while they wait for data to process. The WEKA Data Platform enables GPUs to run 20x faster and drives massive AI workload efficiencies, reducing their energy requirements and carbon output.
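To make the idle-GPU problem concrete, here is a rough back-of-envelope sketch of the energy at stake. The cluster size, per-GPU power draw, and improved idle fraction below are hypothetical assumptions for illustration only, not WEKA benchmarks; the 70% idle figure is the average cited above.

```python
# Back-of-envelope estimate of energy consumed by GPUs waiting on data.
# All figures are illustrative assumptions, not measured WEKA results.

GPU_COUNT = 1000          # assumed cluster size
GPU_POWER_KW = 0.7        # assumed ~700 W draw per GPU (simplification:
                          # real idle draw is lower than draw under load)
IDLE_FRACTION = 0.70      # GPUs idle ~70% of the time, per the study cited
HOURS_PER_YEAR = 24 * 365

# Energy consumed while GPUs sit idle waiting for data
idle_kwh = GPU_COUNT * GPU_POWER_KW * IDLE_FRACTION * HOURS_PER_YEAR

# If faster data delivery cut idle time to a hypothetical 10%,
# the annual energy no longer spent idling would be:
improved_idle_fraction = 0.10
saved_kwh = (GPU_COUNT * GPU_POWER_KW
             * (IDLE_FRACTION - improved_idle_fraction) * HOURS_PER_YEAR)

print(f"Idle energy at 70% idle: {idle_kwh:,.0f} kWh/year")
print(f"Energy recovered at 10% idle: {saved_kwh:,.0f} kWh/year")
```

Even under these simplified assumptions, a 1,000-GPU cluster idling 70% of the time represents millions of kilowatt-hours per year, which is why feeding GPUs data faster has an outsized sustainability impact.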
Second, traditional data management and storage solutions copy data multiple times to move it through the data pipeline, which is wasteful in multiple ways—it costs time, energy, carbon output, and money. The WEKA Data Platform leverages a zero-copy architecture, helping to reduce an organization’s data infrastructure footprint by 4x-7x through data copy reduction and cloud elasticity.
For WEKA customers, this means they’re not only getting orders of magnitude more performance out of their GPUs and AI model training and inference workloads, but they’re also saving 260 tons of CO2e per petabyte stored annually. When it comes to their data stack, we don’t believe organizations should have to choose between speed, scale, simplicity, and sustainability – we deliver all four benefits in a single, unified solution.
ALLYSON: What do you think is critical for the enterprise data pipeline today, and how is WEKA meeting these challenges with your solutions portfolio?
JONATHAN: The first critical element of a data pipeline is having ultimate flexibility. Most organizations are challenged by growing data volumes and data sprawl, with data in multiple locations. The second thing enterprises must have, and this is also tied to flexibility, is a solution that can grow with them into the future. The third factor is simplicity, because data challenges are becoming more complex in the AI era.
When ChatGPT emerged in late 2022, no one could have predicted just how fast AI adoption would proliferate. As we discussed before, a whopping 88% of organizations say they are now actively exploring generative AI. That’s an astounding rate of adoption in just over a year.
Although WEKA couldn’t predict the dawn of the AI revolution, a decade ago our founders could see that modern high-performance computing and machine learning workloads were on the rise and likely to become the norm, requiring a wholly different approach from traditional data storage and management.
Our founders deliberately designed the WEKA Data Platform as a software solution to provide organizations with the flexibility to deploy anywhere and get the same performance no matter where their data lives, whether on-premises, at the edge, in the cloud, or in hybrid or multicloud environments. It also offers seamless data portability between locations.
While it’s difficult to predict what an organization’s technology requirements will be in five or 10 years, the WEKA Data Platform is designed to scale linearly from petabytes to exabytes to support future growth and keep pace with their innovation goals. Today, WEKA has several customers running at exascale. Tomorrow? The sky's the limit.
Additionally, WEKA saw that many customers were struggling to get AI into production and needed a simplified, turnkey option that enabled them to onramp AI projects quickly. We introduced WEKApod™ at GTC this year to provide enterprises with an easy-button for AI. WEKApod is certified for NVIDIA SuperPOD deployments and combines WEKA Data Platform software with best-in-class hardware, minus the hardware lock-in. Its exceptional performance density improves GPU efficiencies, optimizes rack space utilization, and reduces idle energy consumption and carbon output to help organizations meet their sustainability goals.
ALLYSON: From a sustainability perspective, how do you see AI influencing the data center of the future?
JONATHAN: As we’ve discussed, AI’s energy and performance requirements are already shaping how organizations are evaluating their data center investments. Whether it’s implementing new data architectures, embracing hybrid cloud strategies, shifting infrastructure vendors, or moving AI workloads to specialty GPU clouds, organizations are already reinventing how, when, and where they store and manage their data. Power is shaping up to be the new currency of AI. The data center of the future will need to balance AI’s need for highly accelerated compute with its massive energy needs. There are various ways companies can address this, ranging from using public and GPU clouds powered by renewable energy, to deploying advanced cooling technology, to adopting more efficient data infrastructure solutions that support sustainable AI practices.
ALLYSON: Where can our readers find out more about WEKA solutions in this space and engage the WEKA team to learn more?
JONATHAN: Visit our website at www.weka.io and follow our latest updates on LinkedIn and X.
Source: 451 Research, part of S&P Global Market Intelligence, Discovery Report “Global Trends in AI,” August 2024