Data Centre Magazine July 2025 | Page 80

How Ciena is Powering Regional AI Data Centre Growth

CTO OF THE AMERICAS KEVIN SHEEHAN EXPLAINS HOW CIENA IS SUPPORTING DC BLOX IN ADAPTING ITS DATA CENTRE INFRASTRUCTURE FOR AI CONNECTIVITY DEMANDS
As the global AI race continues to heat up, data centre connectivity requirements are being stretched. Organisations are moving from centralised AI processing to more distributed models, making regional data centres increasingly critical. To address this shift, Ciena, a leading provider of networking systems and software, has been building connectivity infrastructure for AI workloads since early 2024. Kevin Sheehan, Ciena's CTO of the Americas, explains that the company's initial AI focus involved "building up connectivity between data centres where learning was taking place", strengthening the links between facilities to support AI training. He says: "We had to teach all of these engines how to do AI, and they continue to learn whatever aspect they're being targeted for."
Confronting infrastructure challenges

The data centre industry is now witnessing a shift towards distributed AI models – those that bring computation closer to end users. Looking ahead, Kevin reveals that this transition will occur in phases: "Distributed first into regional locations, then into metro locations and finally right out to the edge." This shift has created significant infrastructure challenges for data centre operators: the need for more high-speed connectivity, greater power efficiency and lower latency.
"Whether we're talking about centralised AI or distributed AI, it's about network capacity," Kevin says. "There's huge growth in network capacity related to AI."

Power consumption presents another challenge for data centre companies. As data centres expand their capacity and add more networking equipment, electricity demand soars whilst supply remains constrained. "There's just not enough electricity. It's really important to balance huge increases in capacity without huge amounts of energy usage," Kevin continues. "You want these inference engines to work in as real-time a fashion as possible."
Building a connectivity evolution

Enterprise customers are driving much of the demand for distributed AI infrastructure, with companies across a broad range of industries adopting AI that requires new types of connectivity solutions. However, these organisations increasingly need to access external graphics processing units (GPUs) – the chips essential for AI workloads.
What sets Ciena's approach to AI infrastructure apart is its core strength: delivering the highest network performance, speed and connectivity while also maximising capacity on optical fibres. The company has maintained this leadership