A landmark multi-year deal positions AMD as a core compute partner for OpenAI’s expanding AI infrastructure—diversifying OpenAI’s silicon base and reshaping GPU market dynamics.
Rafay Systems is emerging as a key enabler of global AI infrastructure, helping enterprises and neoclouds operationalize large-scale deployments at the dawn of the AI era.
Daniel Wu joins TechArena and Solidigm on Data Insights to share his perspective on bridging academia and enterprise, scaling AI responsibly, and why trustworthy frameworks matter as AI adoption accelerates.
Anusha Nerella joins hosts Allyson Klein and Jeniece Wnorowski to explore responsible AI in financial services, emphasizing compliance, collaboration, and ROI-driven adoption strategies.
From circularity to U.S. assembly, Giga Computing lays out a rack-scale roadmap tuned for the next phase of AI—where inference drives scale and regional supply chains become a competitive edge.
In Part 2 of Matty Bakkeren’s 2026 predictions series, he explores how regulation, sovereignty, and public trust will push data centers to behave more like utilities than tech projects.
Arm’s OCP board seat and new FCSA spec push chiplet interoperability from idea to implementation—enabling mix-and-match silicon and smarter storage so teams can build AI without hyperscaler budgets.
Modern software-defined cars blend multiple links—CAN/LIN, MIPI, SerDes, and Ethernet/TSN—to shrink wiring and cost, manage EMI, and deliver reliable, deterministic timing from sensors to actuators.
Durgesh Srivastava unpacks a data-loop approach that powers reliable edge inference, captures anomalies, and encodes technician know-how so robots weld, inspect, and recover like seasoned operators.
Veteran technologist and TechArena Voice of Innovation Robert Bielby reflects on a career spanning hardware, product strategy, and marketing—and shares candid insights on innovation, AI, and the future of the automotive industry.
As AI inference, edge, and autonomous systems outpace legacy networks, this playbook shows how to combine fiber, RF, FSO, and satellite to tame digital asymmetry and build resilient AI connectivity.
Billions of customer interactions during peak seasons expose critical network bottlenecks, which is why foundational infrastructure decisions must happen before you write a single line of code.
Cornelis CEO Lisa Spelman joins Allyson Klein to explore how focus, agility, and culture can turn resource constraints into a strategic edge in the fast-moving AI infrastructure market.
As GPU racks hit 150kW, throughput per watt has become the efficiency metric that matters, and SSDs are proving their worth over legacy infrastructure with 77% power savings and 90% less rack space.
Equinix’s Glenn Dekhayser and Solidigm’s Scott Shadley discuss how power, cooling, and cost considerations are driving enterprises to embrace co-location as part of their AI infrastructure strategies.
Two decades of action and bold milestones show why Schneider Electric is recognized as the world’s most sustainable company, driving impact across climate, resources, and digital innovation.