MLPerf Inference 5.0 signals the rise of large language models, with Llama 2 70B surpassing ResNet-50 in submissions and driving next-gen AI performance across compute platforms.
MemryX, a provider of edge AI acceleration hardware, recently closed its latest round of funding, serving as a potential bellwether for the next growth edge in AI compute.
From VAST Data to Weka, Graid to Solidigm — storage disruptors shone brightly at NVIDIA GTC 2025. Here’s how storage innovators are redefining AI infrastructure and why it matters to the future of AI.
Deloitte and VAST Data share how secure data pipelines and system-level integration are supporting the shift to scalable, agentic AI across enterprise environments.
This video explores how Nebius and VAST Data are partnering to power enterprise AI with full-stack cloud infrastructure—spanning compute, storage, and data services for training and inference at scale.
Weka’s new memory grid raises fresh questions about AI data architecture—exploring how shifts in interface speeds and memory tiers may reshape performance, scale, and deployment strategies.
From 122TB QLC SSDs to rack-scale liquid cooling, Solidigm and Supermicro are redefining high-density, power-efficient AI infrastructure—scaling storage to 3PB in just 2U of rack space.
At OCP Dublin, Bel Power’s Cliff Gore shares how the company is advancing high-efficiency, high-density power shelves—preparing to meet AI’s demand for megawatt-class rack-scale infrastructure.
At OCP Dublin, ZeroPoint’s Nilesh Shah explains how NeoCloud data centers are reshaping AI infrastructure needs—and why memory and storage innovation is mission-critical for LLM performance.
From full rack-scale builds to ITAD, Circle B is powering AI-ready, sustainable infrastructure across Europe—leveraging OCP designs to do more with less in a power-constrained market.
In this TechArena interview, Avayla CEO Kelley Mullick explains why AI workloads and edge deployments are driving a liquid cooling boom—and how cold plate, immersion, and nanoparticle cooling all fit in.
At OCP Dublin, Sims Lifecycle’s Sean Magann shares how memory reuse, automation, and CXL are transforming the circular economy for data centers—turning decommissioned tech into next-gen infrastructure.
TechArena host Allyson Klein talks with futurist Brian David Johnson about futurecasting and threatcasting, and how taking agency to envision our future places us in the driver's seat to shape it.
TechArena host Allyson Klein chats with WalterPicks co-founder Sam Factor about how AI helps deliver fantasy football lineup recommendations that are 17% better than those of the major recommendation sites.
TechArena host Allyson Klein talks with Lyssn co-founder Zac Imel about how his company intends to change the shape of mental health using artificial intelligence.
In this episode, Eric Kavanagh anticipates AI's evolving role in enterprise for 2025. He explores practical applications, the challenges of generative AI, future advancements in co-pilots and agents, and more.
Peter Dueben of the European Centre for Medium-Range Weather Forecasts explores the role of HPC and AI in advancing weather modeling, tackling climate challenges, and scaling predictions to the kilometer level.
David Kanter discusses MLCommons' role in setting benchmarks for AI performance, fostering industry-wide collaboration, and driving advancements in machine learning capabilities.
Join Allyson Klein and Jeniece Wnorowski in this episode of Data Insights as they discuss key takeaways from the 2024 OCP Summit with Scott Shadley, focusing on AI advancements and storage innovations.
In this episode of Data Insights by Solidigm, Ravi Kuppuswamy of AMD unpacks the company’s innovations in data center computing and how they adapt to AI demands while supporting traditional workloads.
Join host Allyson Klein and co-host Jeniece Wnorowski in this episode of Data Insights as they chat with Gigabyte's Chen Lee about AI innovations and the future of server technology at OCP Summit.