Data Centre Magazine January 2026 | Page 70

EDGE COMPUTING

The global race to build AI-ready data centres is accelerating, and network performance has become one of the most critical competitive differentiators.

For operators and hyperscalers, the demand for low-latency, scalable infrastructure capable of supporting high-bandwidth AI workloads is reshaping how networks are designed and managed.
Within its data centre networking portfolio, Nokia offers its 7220 Interconnect Router (IXR) high-performance switches and advanced Artificial Intelligence for Operations (AIOps) features for its Event-Driven Automation (EDA) platform.
Together, these innovations target the growing need for ultra-efficient, low-latency connectivity that can handle the unprecedented data movement required by large-scale AI training and inference environments.

96%

reduction in downtime for network operators using Nokia's EDA AIOps
The AI era demands low-latency infrastructure

The shift towards AI-driven workloads, particularly those powering agentic and generative AI systems, has placed data centres under immense strain.
Processing large language models (LLMs) and other AI applications requires massive parallel computing power across thousands of GPUs or XPUs. But even as processing capability grows, network latency remains a persistent bottleneck.
Latency – the delay before data is transferred – directly impacts the speed of training and inference in AI models. Reducing it is therefore essential to achieving the performance, accuracy and responsiveness modern AI systems require.
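Why small latency savings matter at scale can be sketched with a toy model. The figures below (gradient payload size, link speed, hop counts, step counts) are illustrative assumptions, not Nokia benchmarks: in synchronous training, every gradient exchange pays per-hop latency on top of serialisation time, and that cost repeats on every step.

```python
# Toy model of how per-hop network latency inflates a synchronous
# training step. All numbers are illustrative assumptions, not
# vendor benchmarks.

def step_time_us(compute_us, payload_bits, link_bps, latency_us, hops):
    """One synchronous step: compute, then move gradients across the fabric."""
    serialisation_us = payload_bits / link_bps * 1e6  # time to push bits onto the wire
    network_us = hops * latency_us + serialisation_us
    return compute_us + network_us

# Hypothetical scenario: 1 GB of gradients over an 800 Gb/s link, 3 switch hops
base = step_time_us(compute_us=5_000, payload_bits=8e9, link_bps=800e9,
                    latency_us=10, hops=3)
fast = step_time_us(compute_us=5_000, payload_bits=8e9, link_bps=800e9,
                    latency_us=1, hops=3)

# The per-step saving repeats across every one of potentially
# millions of training steps.
print(f"per-step saving: {base - fast:.0f} us")
```

In this sketch, cutting per-hop latency from 10 µs to 1 µs saves 27 µs per step; over a million-step training run that is tens of seconds of pure waiting, before accounting for the GPU-hours idling behind it.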
Nokia's 7220 IXR-H6 switches were designed specifically to meet this challenge. Capable of delivering up to 102.4 Tbps of throughput with 800 Gigabit Ethernet (GE) and 1.6 Terabit Ethernet (TE) interface speeds, they double both throughput and interface performance compared with the company's previous models. The result is a network fabric optimised for low-latency communication across large AI clusters.
These switches are compliant with Ultra Ethernet Consortium (UEC) standards, which aim to optimise Ethernet for AI and high-performance computing environments.
The standards introduce advanced congestion control and packet management techniques to reduce latency and improve determinism – critical for synchronising workloads across massive distributed systems.
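The general idea behind such congestion control can be sketched with a simple additive-increase, multiplicative-decrease (AIMD) loop. This is a generic textbook scheme shown for illustration only; the actual UEC mechanisms are more sophisticated, and the rates and constants below are made-up values.

```python
# Toy AIMD (additive-increase, multiplicative-decrease) rate controller,
# a generic illustration of congestion control. This is NOT the UEC
# algorithm; all constants are arbitrary illustrative values.

def aimd_step(rate_gbps, congested, max_rate_gbps=800.0,
              increase_gbps=10.0, decrease_factor=0.5):
    """Advance the sending rate by one control interval."""
    if congested:  # e.g. an ECN mark or a dropped packet was observed
        return rate_gbps * decrease_factor  # back off sharply
    return min(max_rate_gbps, rate_gbps + increase_gbps)  # probe gently

# Simulate four control intervals: two clear, one congested, one clear
rate = 400.0
for congestion_signal in [False, False, True, False]:
    rate = aimd_step(rate, congestion_signal)
print(f"rate after 4 intervals: {rate:.0f} Gb/s")
```

The gentle-probe/sharp-backoff asymmetry is what keeps queues short and latency predictable: senders converge towards the available bandwidth without letting switch buffers fill, which is the determinism that tightly synchronised AI workloads depend on.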