At Advancing AI, AMD unveils MI355 with 35× gen-over-gen gains and doubles down on open innovation – from ROCm 7 to Helios infrastructure – to challenge NVIDIA’s AI leadership.
The deal marks a strategic move to bolster Qualcomm’s AI and custom silicon capabilities amid challenging competition and the potential start of a wave of AI silicon acquisitions.
A new partnership combines WEKA’s AI-native storage with Nebius’ GPUaaS platform to accelerate model training, inference, and innovation with microsecond latency and extreme scalability.
As the battle for AI market share continues, AMD’s recent acquisitions signal a strategic move toward optimizing both software and hardware for inference workloads and real-world AI deployment.
The HPE-owned platform combines unified observability, smart alert correlation, and automation to tackle hybrid IT complexity while also working with existing monitoring tools.
AIStor’s stateless, gateway-free design solves legacy storage issues, enabling high-performance object-native infrastructure for exabyte-scale AI and analytics workloads.
Live from OCP Summit, Google Cloud’s Amber Huffman shares insights on AI's future, open standards, and innovation, discussing her journey, data center advancements, and the role of collaboration at OCP.
Live from OCP Summit 2024, this Data Insights podcast explores how Ocient’s innovative platform is optimizing compute-intensive data workloads, delivering efficiency, cost savings, and sustainability.
Join Arne Stoschek, VP of AI and Autonomy at Airbus Acubed, as he discusses the role of AI in aviation, the future of autonomous flight, and innovations shaping the industry at Airbus.
During our latest Data Insights podcast, sponsored by Solidigm, Ian McClarty of PhoenixNAP shares how AI is shaping data centers, discusses the rise of Bare Metal Cloud solutions, and more.
Letizia Giuliano of Alphawave Semi discusses advancements in AI connectivity, chiplet designs, and the path toward open standards at the AI Hardware Summit with host Allyson Klein.
Sean Lie of Cerebras Systems shares insights on cutting-edge AI hardware, including their game-changing wafer-scale chips, Llama model performance, and innovations in inference and efficiency.
Arun Nandi of Unilever joins host Allyson Klein to discuss AI's role in modern data analytics, the importance of sustainable innovation, and the future of enterprise data architecture.
TechArena host Allyson Klein chats with Sema4.ai co-founder Antti Karjalainen about his vision for AI agents and how he sees these powerful tools surpassing even what current AI models deliver today.
TechArena host Allyson Klein and Solidigm’s Jeniece Wnorowski chat with Taboola Vice President of Information Technology and Cyber, Ariel Pisetzky, about how his company is reshaping the marketing landscape with AI-infused customer engagement tools.
TechArena host Allyson Klein chats with EY’s Global Innovation AI Officer, Rodrigo Madanes, about what he’s seeing as clients advance their AI adoption and what this means for the industry’s innovation requirements.
TechArena host Allyson Klein chats with Intel’s Lisa Spelman about how compute requirements are changing for the AI era, where we are with broad enterprise adoption of AI, and how software, tools, and standards are required to help implement solutions at scale.
TechArena host Allyson Klein interviews Netflix’s Tejas Chopra about how Netflix’s recommendation engines demand memory innovation in both performance and efficiency, ahead of his keynote at MemCon 2024 later this month.