AI continues to dominate business and tech conversations, and it’s set to impact data center and end-user network traffic significantly. A new kind of network infrastructure will be needed to fully deliver on everything that AI promises.
AI will massively impact network infrastructure and connectivity in general. Omdia research shows that in 2023, AI-enriched interactions generated 63 exabytes of global network traffic per month, around a third of the traffic traveling over international networks. That share is forecast to grow to 64% of the worldwide total, or 1,226 exabytes per month, by 2030.
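A quick back-of-envelope calculation shows how steep that forecast is. The sketch below derives the implied annual growth rate from the two Omdia figures cited above; the figures come from the text, while the growth-rate derivation is our own arithmetic.

```python
# Implied annual growth rate of AI-enriched traffic, derived from
# the Omdia figures above: 63 EB/month (2023) to 1,226 EB/month (2030).
start_eb, end_eb = 63, 1226   # exabytes per month
years = 2030 - 2023

cagr = (end_eb / start_eb) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # ≈ 52.8% per year
```

In other words, AI-related traffic would need to grow by roughly half again every year for seven years running to hit that 2030 figure.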
This world of superfast, AI-enabled services and solutions raises the bar on expectations. AI-enabled use cases are expected to respond in seconds or even in real time. These use cases include customer service chatbots that respond instantly to customer enquiries, or healthcare solutions like AI-powered patient monitoring systems and emergency diagnosis tools based on real-time data that speed up treatment.
Other AI use cases range from autonomous vehicles, where AI powers enhanced obstacle detection and avoidance, to cybersecurity, gaming, supply chain and logistics, and more. They all need network infrastructure, data centers and connectivity built to deliver high quality of service (QoS) and bandwidth, low latency, minimized data loss, and scalability to support growth.
AI will significantly impact how data centers operate and the resources they need to deliver on use cases. AI compute intensity is doubling every six to 10 months, and around 20% of global data center capacity is already being used for AI.
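The "doubling every six to 10 months" figure compounds dramatically over a typical capacity-planning horizon. The sketch below makes that concrete; the three-year window is an illustrative assumption, not a figure from the text.

```python
# Cumulative compute growth if AI compute intensity doubles every
# 6-10 months (the range cited above). The 36-month planning
# horizon is an illustrative assumption.

def growth_factor(months: float, doubling_period: float) -> float:
    """Total growth over `months` if capacity doubles every `doubling_period` months."""
    return 2 ** (months / doubling_period)

horizon = 36  # three-year planning window (assumption)
for period in (6, 10):
    factor = growth_factor(horizon, period)
    print(f"Doubling every {period} months -> {factor:.0f}x over 3 years")
```

Even at the slow end of the cited range, that is an order-of-magnitude increase in compute demand within a single infrastructure refresh cycle.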
It’s a big commitment: according to GlobalData, “Training a large language model (LLM) like ChatGPT can take a very long time and multiple graphic processing units (GPUs), or years with a single GPU.” And according to Juniper, “Many, even tens of thousands of GPU servers must be connected to train large models”, at a cost of over $400,000 per server in 2023.
An AI-ready infrastructure must be able to process vast quantities of data – and a bigger infrastructure means more data center space required, which in turn means increased costs. Telcos must now think about deploying equipment, such as servers designed specifically for AI, and about maximizing data center performance and data center interconnect (DCI). As Effect Photonics puts it, “Interconnects between AI nodes are increasingly becoming a bottleneck in the ability of data center architectures to scale and handle the demands of AI models sustainably.”
Sustainability is another big issue when discussing data centers and AI. According to the IEA, worldwide data center electricity use will double between early 2024 and 2026, so sustainability must be built into data center design. For now, it’s something the world is at least aware of, with 451 Research finding that 89% of enterprises say data center sustainability is very or somewhat important to them. But moving forward, data centers should look to renewable sources such as solar, wind, and hydroelectric power, which can significantly reduce their carbon footprints and deliver real sustainability gains.
Orange Wholesale foresees AI impacting connectivity across three distinct areas: bandwidth, latency, and quality of service.
What this means in practice is that to maximize AI's potential, networks will need higher bandwidth. 400G Ethernet is already being deployed in data centers, but networks will need to evolve, pushing standards towards 800G and 1.6T Ethernet and, in time, beyond, to handle the vast amounts of data AI applications generate and process.
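To see why each Ethernet generation matters, consider how long it takes to move a large training dataset over a single link. The dataset size below is an illustrative assumption, and the calculation ignores protocol overhead.

```python
# Time to transfer a 1 PB training dataset (illustrative size) over a
# single link at each Ethernet generation. Ignores protocol overhead.
PETABYTE_BITS = 8e15  # 1 PB = 10^15 bytes = 8 x 10^15 bits

for name, gbps in [("400G", 400), ("800G", 800), ("1.6T", 1600)]:
    hours = PETABYTE_BITS / (gbps * 1e9) / 3600
    print(f"{name} Ethernet: {hours:.1f} hours")
```

Each generation halves the transfer time, which is why the step from 400G to 800G and 1.6T translates directly into shorter training cycles and better GPU utilization.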
Low latency will also be essential: technologies such as cut-through switching and Time-Sensitive Networking (TSN) can ensure data packets are delivered with minimal delay, which is crucial for time-critical AI applications. Enhanced QoS will be needed to ensure AI workloads are given the necessary priority, requiring advanced mechanisms including flow control enhancements and dynamic traffic management. And it will all need to be energy efficient, to reduce power consumption in AI-driven networks and deliver on sustainability targets.
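The priority handling described above can be sketched in miniature as a strict-priority scheduler that always serves latency-sensitive AI traffic ahead of best-effort traffic. This is a minimal illustration of the concept, not a real switch configuration; the class names and priority values are assumptions.

```python
# Minimal sketch of strict-priority QoS scheduling: AI-class packets
# are always dequeued before best-effort packets, FIFO within a class.
# Class names and priority values are illustrative assumptions.
import heapq
from itertools import count

AI_CLASS, BEST_EFFORT = 0, 1  # lower value = higher priority

class PriorityScheduler:
    def __init__(self):
        self._queue = []
        self._seq = count()  # tie-breaker: preserves FIFO order within a class

    def enqueue(self, priority: int, packet: str) -> None:
        heapq.heappush(self._queue, (priority, next(self._seq), packet))

    def dequeue(self) -> str:
        return heapq.heappop(self._queue)[2]

sched = PriorityScheduler()
sched.enqueue(BEST_EFFORT, "bulk-backup-1")
sched.enqueue(AI_CLASS, "ai-inference-1")
sched.enqueue(BEST_EFFORT, "bulk-backup-2")
sched.enqueue(AI_CLASS, "ai-inference-2")

order = [sched.dequeue() for _ in range(4)]
print(order)  # both AI packets drain before any best-effort packet
```

Real networks layer mechanisms such as weighted fair queuing and flow control on top of this basic idea to avoid starving lower-priority traffic entirely.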
A new kind of network is needed to support AI’s specific requirements – an AI network fabric. Leading data center equipment manufacturers are recommending Ethernet to provide the infrastructure needed for AI.
Why is this? Several reasons. For starters, Ethernet can be the foundation for AI fabrics thanks to its economic advantages, scalability, and widespread adoption in data centers. Ethernet also gives organizations interoperability and return on investment (ROI) potential that single-source components cannot match.
Ethernet is also a familiar network technology, which simplifies training, procurement, and support, and offers greater scalability possibilities. And Ethernet has an existing track record of innovation, with speeds increasing from 1Gbps to 400Gbps over the past couple of decades.
According to IDC, the volume of data generated at the edge will grow at a compound annual growth rate (CAGR) of 34% from now to 2027, faster than data generated at the core or on endpoints. Edge data centers can play a crucial role in supporting AI, by bringing compute power closer to data generation.
Edge data centers, situated closer to end-users and devices, process data locally and reduce latency, improving the performance of AI applications like autonomous vehicles, IoT, and smart cities. Furthermore, AI processing at the edge helps avoid transporting vast amounts of data across networks, which is both costly and a congestion risk.
At Orange Wholesale, we’ve anticipated these changes and have been taking steps to adapt our networks to support AI. We offer massive bandwidth through submarine cables and a highly meshed terrestrial network that already supports up to 400G Ethernet. We’ve ensured our network is reliable and has resilience built in: in an uncertain world, everything from natural disasters to geopolitical turbulence to malicious third-party actors can knock a network out.
Our network has also been developed to be simpler, more converged, and fully software-defined and automated, giving operators a telco cloud that enables services at the edge, where end-users are and compute power is needed. We have also focused investment on data centers and concentrated on building relationships with key industry stakeholders in the data center space, from hyperscalers and big tech companies to innovative new entrants, to give the choice and coverage we know operators need. Our data center ecosystem comprises partners who offer the best ratios of QoS to environmental impact, supporting Orange’s carbon emission reduction goals – and those of our customers.
It all adds up to an approach that is AI-ready, with a network in place that empowers operators to maximize AI's possibilities and minimize risk.