Datacloud Cannes 2025
From MegaWatts to ‘BraggaWatts’: What We Heard and What We Learnt.
Our CEO, Louis Charlton, shares his experience at Datacloud Cannes 2025.
“
We’ve just returned from Datacloud Congress in Cannes, where over 4,500 leaders from across the data centre world gathered to discuss the future of our industry.
If there is one thing everyone agrees on, it’s this: the future isn’t coming gradually; it’s arriving at full speed.
For Global, this year’s event marked a turning point.
We were everywhere: on the front of the Palais des Festivals, on thought leadership panels, in strategic meetings, and in conversations that matter.
I even had the pleasure of attending the Infra Leadership Summit, where I discussed the future of the industry with CEOs and leaders from the major companies in our market.
People are noticing the work we’re doing, and that recognition is building fast. We didn’t just attend; we made an impact. And that’s only possible because of the commitment and execution of our team.
Throughout the event, the dominant message was scale.
AI is transforming everything, from how we design to how we power to where we build. By 2030, AI workloads are expected to drive 75% of global data centre demand. Power consumption is rising to match that of entire nations, with a Japan-level power draw projected within five years.
And we’re already designing around 600kW racks. The scale is no longer hypothetical; it’s operational.
But in the rush to grow, there’s also a need for realism.
A new term, Braggawatts, made the rounds, especially at the Leadership Summit. It refers to flashy announcements that boast massive capacity without any real infrastructure in place: no grid connection, no permits, no fibre. The message was clear: not all capacity is created equal. Substance matters more than spin.
One of the big discussions at the Leadership Summit was that, despite the flexibility AI introduces in how and where computing happens, physical location is still central to data centre strategy. Yes, inferencing workloads, where AI models are used rather than trained, can often be handled further from end users, which opens the door to more remote or cost-efficient locations.
But that doesn’t mean proximity no longer matters. Most customers, particularly those in enterprise, financial services, cloud, and content delivery, still prioritise access to core markets. That’s because so many applications depend on low-latency performance, strong interconnectivity, and access to dense ecosystems of partners and end users.
For these use cases, location remains non-negotiable. That is why cities like London, Frankfurt, and Paris continue to dominate investment flows. They’re not just data hubs; they’re market anchors. But the demand in these areas is now spilling outward. Surrounding cities such as Birmingham, Bristol, and Cambridge in the UK, or regions adjacent to Frankfurt and Paris, are gaining real traction. Their proximity offers the right balance of connectivity, available land, and growing local infrastructure, making them attractive for expansion, secondary capacity, and edge deployments.
AI doesn’t remove the importance of geography; it just reshapes the equation. Training large AI models may happen in remote hyperscale campuses powered by renewable energy and backed by long-term fibre routes. But serving those models efficiently to end users still requires a presence near the population centres that matter most. Location strategy today is no longer just about cost or available power. It’s about matching the right site to the right workload. And that’s a more complex, but ultimately more strategic, conversation.
Another topic that stood out was just how little the public still understands about what we do.
A recent survey of 13,000 people across Europe showed that fewer than half could accurately define what a data centre is, even when given multiple-choice options.
For many, the infrastructure that powers everything from streaming to banking to AI still feels invisible.
And yet, the perception isn’t negative.
Only 7% of respondents held an unfavourable view of the data centre industry. That contrast is telling.
It suggests that while people don’t fully understand our sector, they don’t view it with suspicion either. There’s an opening here. An opportunity to shape public understanding before misconceptions take hold.
As data centres become more prominent, whether due to land use, energy consumption, or their role in powering AI, we’ll be increasingly in the spotlight. That means it’s not just about getting planning approvals or ESG ratings. It’s about engaging with communities, regulators, media, and future talent in a more intentional way. We have a responsibility to explain what data centres actually are and why they matter.
That means being clear about the benefits we bring, including economic growth, local job creation, and support for digital services, as well as being honest about the challenges we’re addressing, such as power use and sustainability. It’s not just about managing reputation. It’s about building a foundation of trust.
And that’s something we all have a part to play in as Global’s visibility continues to grow.
The risk of stranded assets came up repeatedly, and for good reason.
The pace of change in computing hardware, cooling systems, and rack densities is accelerating. What we design and build today could be technically outdated by the time it's commissioned.
That’s not just a theoretical concern; it’s a very real operational and financial risk.
Liquid cooling is a prime example. Not long ago, the industry consensus was that widespread adoption would arrive around 2027. In reality, it’s already here.
Some workloads, particularly in AI training, are demanding power densities that traditional air cooling simply can’t handle. The pressure to pivot quickly is already reshaping how campuses are being designed.
The challenge is that data centres are capital-intensive, long-lifecycle assets. You can’t easily retrofit for new thermal requirements or rack types once the shell is up and systems are locked in.
That’s why flexibility is now a critical design principle.
Facilities need to be built with adaptability in mind: modular infrastructure, high-headroom power and cooling, and the ability to integrate emerging technologies without wholesale redesigns.
This isn't just about tech; it’s about risk management. It’s about ensuring the decisions we make today won’t limit what we can deliver tomorrow. Whether it's power delivery, cooling topologies, or space configuration, we need to plan for evolution, not just execution.
That’s the mindset we’re building into everything we do at Global.
Connectivity is another pressure point.
While power supply has been the headline challenge for years, fibre infrastructure, especially dark fibre, is fast catching up as a serious constraint.
Dark fibre refers to unused fibre-optic cables that have been laid but aren't yet "lit" or connected to active equipment. These fibres offer operators the ability to control and scale their connectivity rather than relying on pre-packaged bandwidth from traditional telecom providers.
For data centre operators, that means greater control over performance, lower latency, and often lower long-term costs. However, lead times for deploying dark fibre are increasing, just like those for power infrastructure. In key markets, competition for routes is high, civil works are complex, and permitting can take months or even years.
As a result, dark fibre is no longer just a technical detail; it’s a make-or-break factor for site selection and long-term viability. Today, location decisions hinge not just on grid capacity and land availability but also on the ability to access and extend fibre networks quickly and cost-effectively. As workloads become more distributed and AI requires denser, faster connectivity between sites, this will only become more critical.
“And while technology drives the build, relationships drive delivery.”
In regions like Greece and Aragón, strong collaboration with local authorities and utilities is unlocking rapid development. Regulatory agility, clear policy, and aligned interests are enabling projects that might have stalled elsewhere. It’s a reminder that success in our space takes more than capital. It takes buy-in.

Southern Europe is one of the major beneficiaries of this dynamic. Cities like Madrid, Milan, and Athens, which were once considered secondary markets, are now securing 30 to 50MW builds at a rapid pace. Their proximity to key cable landing stations and improving infrastructure are turning them into serious nodes in the global data network.

But all of this progress comes with a warning: power lead times are growing. Generation and transmission infrastructure can take up to a decade to deliver. Site selection can no longer be driven just by availability or cost; it has to start with an energy strategy.
During the leadership summit, I also heard a robust debate around data sovereignty, a topic that’s quickly moving up the priority list for governments and regulators. More countries are enacting laws that require data generated within their borders to be stored and processed locally. It’s a trend driven by privacy concerns, national security interests, and a desire for digital independence.
But while the intention is clear, the execution isn’t simple. Compliance with data sovereignty laws comes with significant trade-offs. Building local infrastructure to meet sovereign requirements often means higher costs, not just for land and construction, but for creating true redundancy and meeting uptime expectations in markets that may not yet have mature ecosystems.
There’s also a performance angle. Keeping data local can impact latency or availability if regional infrastructure isn’t robust enough to handle global application demands. For multi-national platforms and AI workloads that rely on aggregated datasets from around the world, keeping everything within national borders becomes technically complex and financially inefficient.

The takeaway from Cannes? There’s no one-size-fits-all solution. Every customer, workload, and jurisdiction requires a tailored approach. Operators will need to balance compliance needs with performance, cost, and user experience while also designing infrastructure that can adapt as these regulations evolve.
Sustainability, long a defining priority in our industry, took on a more nuanced tone at Datacloud this year.
It remains a core focus, especially in Europe, where regulatory pressure and stakeholder expectations continue to push operators toward greener operations. But the reality on the ground is shifting.
The surge in AI demand is bringing new tensions to the surface.
Compute intensity is rising sharply, and delivering the necessary power density at scale often conflicts with carbon neutrality goals. Several discussions acknowledged that the path to net zero is becoming increasingly challenging to navigate, especially at the current market pace.
We also heard how operational realities are testing sustainability commitments. In markets where renewable energy is constrained or grid stability is an issue, operators are reconsidering solutions that were previously off the table, like gas turbines or hybrid backup systems. These may not align perfectly with ESG messaging, but they offer the reliability needed to deliver large-scale capacity without delay.
At the same time, reporting obligations, particularly under EU legislation, are getting stricter and more complex. This is placing additional strain on resources, especially for operators managing multiple jurisdictions.
The sense from Cannes was that while sustainability still matters deeply, the conversation is shifting from ambition to realism. The industry now faces a key question: how do we continue to scale responsibly without slowing down? The answer won’t be simple, and it’s a question for some very clever people to work through. However, it will require smarter energy strategies, improved efficiency across every layer of infrastructure, and more transparent dialogue with customers about what it takes to remain sustainable while staying online.
One final reflection that stuck with many of us came from a speaker who offered a striking comparison.
If a megawatt-second were equivalent to 11 days, then a gigawatt-second would stretch to over 31 years.
It’s not a literal formula, but it drives home the scale shift we’re now facing. Moving from megawatts to gigawatts isn’t just about more power. It means longer lead times, bigger risks, deeper partnerships, and completely different ways of thinking about infrastructure.
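For the curious, here is a rough sketch of the arithmetic behind that analogy. This is my own back-of-the-envelope working, not a formula from the talk: imagine counting one second for every watt of capacity.

```python
# A rough sketch of the analogy: count one second for every watt.
SECONDS_PER_DAY = 60 * 60 * 24             # 86,400 seconds in a day
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25

megawatt = 1_000_000          # watts in a megawatt
gigawatt = 1_000_000_000      # watts in a gigawatt

print(f"1 MW counted in seconds ~ {megawatt / SECONDS_PER_DAY:.1f} days")    # ~11.6 days
print(f"1 GW counted in seconds ~ {gigawatt / SECONDS_PER_YEAR:.1f} years")  # ~31.7 years
```

A million seconds is roughly eleven and a half days; a billion seconds is more than thirty-one years. That thousand-fold jump is the point of the comparison.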
We’re no longer planning in months, we’re planning in decades.
For Global, Cannes confirmed that we’re not just part of this evolution; we’re helping shape it. We stood out, we were heard, and we left with stronger relationships, sharper insights, and renewed momentum. The progress we’re making as a business is being noticed.
That’s because of our people - your work, your effort, and your belief in where we’re going.
Whatever comes next, whether it’s regulatory change, AI’s explosive growth, or shifting expectations around sustainability, we will remain at the forefront. We’ll continue to grow, adapt, and lead alongside our partners. We’re building a business that’s ready for what’s next, and we’re doing it together.
Let’s keep building what the future actually needs.
”
Louis Charlton
Chief Executive Officer
Global Commissioning