
Insights on AI and Data Management from Intercontinental Exchange
In today’s rapidly advancing tech landscape, optimizing infrastructure to handle massive data sets has become more crucial than ever. One noteworthy story emerging from NVIDIA GTC is how Intercontinental Exchange (ICE) is tackling the growing complexity of data management, AI implementation and storage optimization across its vast network of financial exchanges, data services and mortgage technologies. We sat down with Anand Pradhan, the head of the AI Center of Excellence at ICE, and Roger Corell, senior director of leadership marketing at Solidigm, to discuss how ICE is using technology to stay ahead of the curve.
ICE, known for operating the New York Stock Exchange, processes over 700 billion transactions daily. With such massive volumes of data, building and maintaining an optimized, highly redundant infrastructure is essential. It’s not just about the network and servers — the flow of data through these systems makes storage a critical focus in ICE’s technology strategy.
Anand explained that ICE handles around 10 to 12 terabytes of data every single day, timestamped at nanosecond granularity. This data, crucial for tracking financial trades, must be stored and accessed at lightning speed. With millions of trades in flight, real-time analysis and fraud prevention are key, which means both data retrieval and storage must be tuned for maximum efficiency.
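To make that granularity concrete, here is a minimal Python sketch of what a nanosecond-timestamped trade record might look like. The TradeTick type, its field names and the sample values are illustrative assumptions for this article, not ICE’s actual schema.

```python
import time
from dataclasses import dataclass

@dataclass
class TradeTick:
    # Hypothetical trade record; field names are assumptions, not ICE's schema.
    symbol: str
    price: float
    quantity: int
    ts_ns: int  # epoch timestamp in nanoseconds

def record_tick(symbol: str, price: float, quantity: int) -> TradeTick:
    # time.time_ns() returns an integer nanosecond timestamp, avoiding the
    # float rounding that time.time() would introduce at this resolution.
    return TradeTick(symbol, price, quantity, time.time_ns())

print(record_tick("ICE", 158.42, 100))  # sample values for illustration only
```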
One of the biggest challenges is the sheer volume of data and the input/output (I/O) bottlenecks that arise when reading from and writing to storage systems. To address this, Anand’s team works closely with the InfraSolutions architecture team to fine-tune the storage infrastructure, ensuring that it scales easily, remains flexible and is resilient to failure. This involves rigorous testing and investment in systems that deliver fast, uninterrupted data access while minimizing latency and maximizing performance.
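For readers curious how such bottlenecks get quantified, here is a rough, first-order Python sketch of a sequential write-throughput probe. The file path, block size and data volume are arbitrary assumptions, and serious storage benchmarking would use a purpose-built tool such as fio rather than a script like this.

```python
import os
import time

def measure_write_throughput(path: str, total_mb: int = 256, block_kb: int = 1024) -> float:
    """Write total_mb of random data in block_kb chunks and return MB/s."""
    block = os.urandom(block_kb * 1024)
    n_blocks = (total_mb * 1024) // block_kb
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(n_blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force data to the device so we time real I/O
    return total_mb / (time.perf_counter() - start)

# "/tmp/io_probe.bin" is an assumed Unix-style path; point it at the volume under test.
print(f"{measure_write_throughput('/tmp/io_probe.bin'):.1f} MB/s")
```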
But Anand’s insights extend beyond infrastructure; he also highlighted how AI is shaping the company’s approach to data aggregation. At ICE, AI models are primarily used to process unstructured data, such as images of real estate properties. The models extract valuable insights from these photos, identifying key artifacts such as doors, kitchens or even the color of a room. With real estate photos pouring in from across the U.S., this AI-driven processing is a massive undertaking: models are deployed at scale to make sense of the raw images, which are then converted into structured, usable information for the company’s real estate services.
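As a rough illustration of that unstructured-to-structured flow, the Python sketch below turns photo paths into tagged records. The detect_artifacts function is a hypothetical stand-in for whatever vision model actually does the work, and PropertyRecord is an assumed output schema, not ICE’s.

```python
from dataclasses import dataclass, field

@dataclass
class PropertyRecord:
    # Assumed structured output; ICE's real schema is not public here.
    photo_path: str
    artifacts: list[str] = field(default_factory=list)

def detect_artifacts(photo_path: str) -> list[str]:
    # Hypothetical stand-in: a production pipeline would run an
    # object-detection or image-tagging model on the photo instead.
    return ["door", "kitchen"]

def structure_photos(paths: list[str]) -> list[PropertyRecord]:
    return [PropertyRecord(p, detect_artifacts(p)) for p in paths]

for record in structure_photos(["listing_001/front.jpg", "listing_001/kitchen.jpg"]):
    print(record.photo_path, "->", record.artifacts)
```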
As ICE’s AI adoption grows, so too does its need for an optimized storage solution. The storage systems of the future, Anand noted, need to accommodate millions of files — whether flat files, images or video data — and ensure they can be accessed quickly. As more and more workloads move to the AI space, fast access to large datasets and the ability to scale storage seamlessly are becoming essential. This is where storage systems that can horizontally scale, offer fast write speeds and support massive volumes of data will stand out.
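To show what horizontal scaling means in practice, here is a minimal Python sketch that spreads object keys across storage nodes by hash. The node names and file keys are made-up assumptions, and a production system would typically use consistent hashing so that adding a node reshuffles only a fraction of the data.

```python
import hashlib

NODES = ["store-a", "store-b", "store-c"]  # hypothetical storage nodes

def node_for(key: str) -> str:
    # Hash the object key and map it to a node so files spread evenly;
    # capacity grows horizontally by adding entries to NODES.
    digest = hashlib.sha256(key.encode()).digest()
    return NODES[int.from_bytes(digest[:8], "big") % len(NODES)]

for key in ["trades/2024-03-18.parquet", "photos/listing_001.jpg"]:
    print(key, "->", node_for(key))
```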
Looking ahead, ICE’s evolving use of AI and machine learning is transforming its infrastructure and redefining what modern storage systems must deliver. What’s the TechArena take? With growing demands for speed, scale and real-time access, ICE’s journey offers a clear example of how AI is driving a fundamental shift across the industry. As adoption accelerates, organizations at the forefront of tech will need to rethink their approach to storage — those that do will be best positioned to gain a lasting competitive edge.
To learn more about ICE, visit www.ice.com, or find Anand and ICE on LinkedIn.