A global surge in submissions reveals the pivotal role of storage in scaling AI training, with new checkpoint tests tackling failure resilience in massive accelerator clusters.
MLCommons launches industry-standard benchmarks for LLM performance on PCs, cutting through marketing hype and giving developers and enterprises the transparent metrics they need.
From Midjourney to Firefly, Part 2 of our ‘AI Zoo’ series breaks down how today’s top image models work—and how TechArena uses them to create powerful, responsible visuals.
As Chinese EV giants like BYD rise, German automakers are forging an unlikely alliance, but history shows such partnerships often crumble within months.
As AI reshapes compute, memory, and networking, chipmakers are racing to rethink design workflows, embrace agentic AI, and overcome the next wave of data, power, and talent constraints.
From Chinese hackers hiding in US power grids for 300 days to AI agents that fight back autonomously, security expert Sean Grimaldi reveals which 2025 predictions hit the mark and what’s coming next.
During day two of the Oregon AI Conference, attendees focused on the ethical implications of AI and how small-to-medium-sized businesses (SMBs) can integrate AI into their operations.
The first day of the inaugural Oregon AI Conference showcased how quickly AI can unite small-to-medium-sized businesses, spotlighted DeepSeek’s evolution, and championed responsible innovation.
Our own Allyson Klein moderates a powerhouse panel on AI ethics, with panelists representing Loyola University Chicago, Google, MLCommons, VAST Data and Momethesis.
In this video from Chiplet Summit, Shekhar Kapoor discusses how Synopsys’ transition to a multi-die approach to chiplet development has allowed the company to innovate beyond the limitations of traditional monolithic chips.
Canada fuels AI innovation with a new partnership between Hypertec Cloud and VAST Data, enhancing AI research with advanced compute capacity and efficient data pipelines.
Intel’s Lynn Comp looks past the hype to explore AI’s real business impact, questioning its future potential: will AI drive ROI, or is it merely middleware destined to be absorbed into the stack?
In this podcast, MLCommons President Peter Mattson discusses their just-released AILuminate benchmark, AI safety, and how global collaboration is driving trust and innovation in AI deployment.
In this episode, Eric Kavanagh anticipates AI's evolving role in the enterprise for 2025. He explores practical applications, the challenges of generative AI, future advancements in co-pilots and agents, and more.
Peter Dueben of the European Centre for Medium-Range Weather Forecasts explores the role of HPC and AI in advancing weather modeling, tackling climate challenges, and scaling predictions to the kilometer level.
David Kanter discusses MLCommons' role in setting benchmarks for AI performance, fostering industry-wide collaboration, and driving advancements in machine learning capabilities.
Join Allyson Klein and Jeniece Wnorowski in this episode of Data Insights as they discuss key takeaways from the 2024 OCP Summit with Scott Shadley, focusing on AI advancements and storage innovations.
In this episode of Data Insights by Solidigm, Ravi Kuppuswamy of AMD unpacks the company’s innovations in data center computing and how AMD is adapting to AI demands while supporting traditional workloads.
State Street’s Anusha Nerella explores AI-driven FinTech infrastructure: scalability, governance, and agentic computing. Interested in seeing Anusha Nerella live at the AI Infra Summit? Find out more here.
In this episode of In the Arena, hear how cross-border collaboration, sustainability, and tech are shaping the future of patient care and innovation.
Tune in to our latest episode of In the Arena to discover how Verge.io’s unified infrastructure platform simplifies IT management, boosts efficiency, and prepares data centers for the AI-driven future.
Join us on Data Insights as Mark Klarzynski from PEAK:AIO explores how high-performance AI storage is driving innovation in conservation, healthcare, and edge computing for a sustainable future.
Untether AI's Bob Beachler explores the future of AI inference, from energy-efficient silicon to edge computing challenges, MLPerf benchmarks, and the evolving enterprise AI landscape.
Explore how OCP’s Composable Memory Systems group tackles AI-driven challenges in memory bandwidth, latency, and scalability to optimize performance across modern data centers.