Open collaboration just leveled up: OCP pushes shared specs from rack to data center—power, cooling, networking, and ops—so AI capacity can scale faster, with less friction and more choice.
Appointment to the Open Compute Project Foundation board of directors and contribution of the Foundation Chiplet System Architecture (FCSA) spec underscore Arm’s ascendancy in hyperscale and AI data centers.
CelLink’s ultrathin flex harnessing ushers in a new era in compute infrastructure innovation, cutting cable volume by up to 90% and boosting density, reliability, and efficiency.
As AI workloads push storage power consumption higher, the path to true storage efficiency demands systems-level thinking, spanning hardware, software, and better metrics for picking the right drives.
Dave Driggers, CEO of Cirrascale, breaks down what “compute efficiency” really means, from GPU utilization and TCO modeling to token-based pricing that drives predictable customer value.
From data center to edge, Arm is enabling full-stack AI efficiency, powering ecosystems with performance-per-watt optimization, tailored silicon, and software portability across environments.
Multi-year agreement makes VAST AI OS CoreWeave’s primary data foundation, aligning roadmaps for instant dataset access and performance across training and real-time inference in global AI cloud regions.
At the OCP Global Summit, Avayla CEO Kelley Mullick reveals how rapid standardization and hybrid cooling strategies are reshaping infrastructure for the AI era.
Open models move fast—but production doesn’t forgive surprises. Lynn Comp maps how to pair open-source AI with a solid CPU foundation and orchestration to scale from pilot to platform.
What modern storage really means, how on-prem arrays compare to first-party cloud services, and a clear checklist to pick the right fit for cost, control, scalability, and resilience.
As part of Flex, JetCool is scaling its microconvective cooling technology to help hyperscalers deploy next-gen systems faster, streamlining cooling deployments from server to rack in the AI era.
Ventiva discusses how hard-won laptop cooling know-how can unlock inside-the-box gains for AI servers and racks—stabilizing hotspots, preserving acoustics, and boosting performance.
Recorded at #OCPSummit25, Allyson Klein and Jeniece Wnorowski sit down with Giga Computing’s Chen Lee to unpack GIGAPOD and GPM, DLC/immersion cooling, regional assembly, and the pivot to inference.
From #OCPSummit25, this Data Insights episode unpacks how RackRenew remanufactures OCP-compliant racks, servers, networking, power, and storage—turning hyperscaler discards into ready-to-deploy capacity.
Allyson Klein and co-host Jeniece Wnorowski sit down with Arm’s Eddie Ramirez to unpack Arm Total Design’s growth, the FCSA chiplet spec contribution to OCP, a new board seat, and how storage fits AI’s surge.
Midas Immersion Cooling CEO Scott Sickmiller joins a Data Insights episode at OCP 2025 to demystify single-phase immersion, natural vs. forced convection, and what it takes to do liquid cooling at AI scale.
From hyperscale direct-to-chip to micron-level realities: Darren Burgess (Castrol) explains dielectric fluids, additive packs, particle risks, and how OCP standards keep large deployments on track.
From OCP San Jose, PEAK:AIO’s Roger Cummings explains how workload-aware file systems, richer memory tiers, and capturing intelligence at the edge reduce cost and complexity.
Join Allyson Klein and Jeniece Wnorowski as they chat with Eddie Ramirez from Arm about how chiplet innovations and compute efficiency are driving AI and transforming data center architecture.
In this episode – recorded live at the OCP Summit – host Allyson Klein catches up with Intel Corporate VP Zane Ball to discuss silicon innovation, AI evolution, and the future of enterprise adoption.
Join us as Rob Campbell from Flex discusses the challenges and innovations in data centers, focusing on power, heat, and scale, while shaping the future of AI and sustainable solutions.
Rob Campbell explains how Flex is driving data center transformation with cutting-edge solutions like liquid cooling, AI-ready infrastructure, and vertical integration for global hyperscalers.
Tune in as Jason Maselino of Circle B discusses the role of Open Compute Project in revolutionizing data centers with energy-efficient solutions, modular designs, and AI-ready infrastructure.
MIPS CTO Durgesh Srivastava shares insights on AI, data centers, automotive edge, and how MIPS is leveraging RISC-V to drive efficient, flexible computing solutions.