The road to Artificial Intelligence

Ivo Ivanov, CEO, DE-CIX, on why AI success depends on AI-ready infrastructure.

It’s been a whirlwind year for AI. Record investment, booming user adoption and a surge in generative AI tools such as Grok, Bard and ChatGPT all paint a picture of a technology that is poised to revolutionize every industry.

While there is plenty of hype about AI’s potential benefits – increased productivity, smart automation, data-driven decision-making, and new revenue streams – pragmatists are sounding notes of caution.

In its 2024 AI assessment, Goldman Sachs Research refers to the “very positive signs” that AI will “eventually” significantly improve GDP and productivity.

In other words, AI is coming, but the critical question for CIOs and boardrooms alike is: “Are we ready to get the most out of our AI investments?”

Somewhat surprisingly, the answer lies not in the AI technology itself, but in the foundation on which it is built – data and connectivity infrastructure. Many enterprises underestimate the impact physical infrastructure has on AI success, leading to bottlenecks that hinder progress and stifle return on investment (ROI).

Navigating performance challenges in a data-heavy world

A recent MIT Technology Review survey reveals a familiar scenario: 95% of companies are already utilizing AI in some form, with half aiming for full-scale integration within the next two years. However, the road to AI adoption is fraught with data-related challenges: data quality (49%), data infrastructure/pipelines (44%) and integration tools (40%).

A 2024 EY study on AI found that only 36% of senior leaders are investing in connectivity infrastructure related to the accessibility, quality and governance of data, and that “without a strong infrastructure foundation, efforts to maximize AI’s full potential will fail”.

To overcome these challenges, enterprises have looked to the skies for solutions, or, more specifically, the clouds.

According to Deloitte, out of the companies that have had the most success implementing GenAI, 80% also report higher cloud investments as a direct consequence of their AI strategy.

At the same time, the MIT figures show that 36% of organizations’ AI initiatives are being held back by incomplete cloud migrations.

The rapid pace of data generation means that storing all of this information on-premises is no longer feasible, resulting in a mass migration to cloud-based data lakes and warehouses.

While cloud storage offers scalability and accessibility, it also creates a dependency on seamless integration with AI models, which often already reside in the cloud. This creates a networking challenge: data needs to move off-site for storage, but relying on the public Internet for AI deployments puts performance and security at risk.

Furthermore, typical AI implementations alternate between periods of training and periods of inference.

AI training, the process of teaching AI models to perform tasks by feeding them data, and AI re-training, which organizations must carry out periodically to update their models, require high bandwidth but can tolerate some delay, also known as latency. During AI inference, on the other hand, where the model delivers real-time responses and predictions – such as with customer service chatbots – enterprises need minimal latency for optimal performance.

This means that for AI success, networks need to handle both – high bandwidth for training and low latency for inference.
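To make that trade-off concrete, here is a minimal back-of-envelope sketch in Python. All of the numbers (a 50 TB training dataset, 1 Gbps and 10 Gbps links, a 200 ms model compute budget, 40 ms versus 5 ms one-way network latency) are illustrative assumptions, not figures from the studies cited above:

```python
# Back-of-envelope sketch: training is bandwidth-bound, inference is latency-bound.
# All figures are hypothetical and purely illustrative.

def transfer_hours(dataset_tb: float, bandwidth_gbps: float) -> float:
    """Hours needed to move a training dataset over a link of a given bandwidth."""
    bits = dataset_tb * 8e12                     # terabytes -> bits
    seconds = bits / (bandwidth_gbps * 1e9)      # bits / (bits per second)
    return seconds / 3600

def inference_round_trip_ms(network_latency_ms: float, model_time_ms: float) -> float:
    """End-to-end response time a user experiences: request leg + compute + response leg."""
    return 2 * network_latency_ms + model_time_ms

# Hypothetical (re-)training job: moving 50 TB of data to the cloud
print(f"50 TB at 1 Gbps:  {transfer_hours(50, 1):6.1f} h")    # ~111 h
print(f"50 TB at 10 Gbps: {transfer_hours(50, 10):6.1f} h")   # ~11 h

# Hypothetical chatbot inference call with a 200 ms model compute budget
print(f"Public Internet (40 ms each way):    {inference_round_trip_ms(40, 200):.0f} ms")
print(f"Direct interconnect (5 ms each way): {inference_round_trip_ms(5, 200):.0f} ms")
```

The point of the sketch is simply that a fatter pipe cuts (re-)training transfer time by an order of magnitude, while only a shorter, more direct network path reduces the per-request delay that end users of an inference service actually feel.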

To the cloud and beyond: building an AI-ready network

One thing is clear: continuing to rely on the public Internet or third-party IP transit for data transfer between on-premises hardware, data lakes and cloud-based AI services is detrimental to most enterprises’ AI ambitions.

Why? These connections offer little control over data routing, deliver inconsistent performance and carry increased security risks, leaving businesses at the mercy of their service provider’s limitations. Instead, the best way to control data flows is to control how networks interconnect with each other.

Direct, high-performance network interconnection between a company’s network and cloud platforms, facilitated by strategically located Cloud Exchanges, is vital. These exchanges provide cloud routing capabilities, ensuring a responsive and interoperable multi-cloud or hybrid cloud environment.

Interconnection goes beyond the cloud, too. Connecting with external networks through Internet Exchanges – either via peering or private network interconnects (PNI) – guarantees the most efficient data paths, resulting in secure, low-latency and resilient connectivity. This extends to AI-as-a-Service (AIaaS) networks via AI Exchanges. These allow businesses to outsource AI development and operations to third parties while maintaining performance and security.

What’s next for AI connectivity?

As AI adoption intensifies across the world, businesses are increasingly turning to high-performance interconnection providers such as Internet, Cloud and AI Exchange operators, not only for connectivity solutions but also for strategic network design expertise. By addressing their data bottlenecks through cloud-ready solutions and implementing interconnection strategies, businesses can navigate the growing complexities of connectivity infrastructure, unlock ROI and build a solid foundation for AI success.
