Editor’s Note: This blog was originally published in July 2023. It has been updated to include the latest information.
Today’s enterprises face a mandate from customers and regulators alike to make their operations more sustainable. Many companies are simultaneously looking to implement compute-intensive technologies like AI, which can make their sustainability efforts even more challenging.
To get the compute resources they need while also optimizing energy efficiency, some enterprises—and the service providers they partner with—look to position data centers in colder environments, far away from major population centers. For instance, Iceland has attracted a lot of attention from the data center industry due to its unique combination of readily available renewable energy and a colder climate that helps reduce cooling requirements.
However, many modern digital applications require extremely low latency. To make the most of these applications, enterprises need distributed digital infrastructure in locations throughout the world. In some cases, this infrastructure must be deployed at the digital edge, in close proximity to the end users and devices that interact with applications on a regular basis.
In this blog post, we’ll explore how enterprises can balance their need for sustainable operations with their need to support latency-sensitive applications. To do this, they must strategize for sustainability globally while designing and building locally.
Strategize for sustainability globally
Returning to our earlier example of AI, we can see how workload placement plays an essential role in any global sustainability strategy. Different AI workloads have different requirements and should therefore be distributed across different locations for the best possible balance of performance and efficiency.
AI inference workloads are very latency sensitive, as they require a constant stream of near real-time data from many different sources. Moving data back and forth over long distances will inevitably cause delays, degrading the responsiveness and therefore the value of your AI inference results. The only way to keep latency reliably low is to bring compute infrastructure closer to data sources—that is, to deploy at the digital edge. This means you can’t always choose to deploy in a particular location based solely on how energy-efficient it is. You must deploy in all the right locations and then try to make each one as efficient as possible given local conditions.
The requirements of AI training workloads are very different from those of inference workloads. In general, training workloads are much larger, which means their compute requirements are much higher. However, they’re also less sensitive to latency, which means you can run them as batch workloads in a centralized location such as a hyperscale data center or public cloud. Since latency isn’t an issue for these workloads, you can place them in the most energy-efficient locations possible, even if that means deploying hundreds of miles away from the highest concentrations of end users.
For instance, you could deploy in Finland to take advantage of the local climate. This could allow you to run even very large model training workloads without the need for power-hungry cooling systems. You should also consider other factors, such as the availability of renewable energy in a given market and whether you’ll have access to next-generation sustainability innovations like high-density on-chip liquid cooling systems as they become more widely available.
Deploying larger training workloads in efficient core locations and smaller inference workloads in distributed edge locations can help you take advantage of the tremendous potential of AI in a more sustainable manner.
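The placement strategy above can be sketched as a simple routing heuristic. This is an illustrative toy, not an Equinix tool; the site names and workload attributes are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    latency_sensitive: bool  # e.g., real-time inference serving end users
    compute_hours: float     # rough size of the job

# Hypothetical site labels for illustration only
EDGE_SITE = "edge-near-users"
CORE_SITE = "energy-efficient-core"

def place(workload: Workload) -> str:
    """Latency-sensitive inference goes to the distributed edge;
    large, latency-tolerant training runs as batch jobs in the
    most energy-efficient core location available."""
    return EDGE_SITE if workload.latency_sensitive else CORE_SITE

print(place(Workload("chat-inference", True, 10.0)))     # edge-near-users
print(place(Workload("model-training", False, 5000.0)))  # energy-efficient-core
```

A real placement engine would weigh many more factors—data residency, renewable availability, cooling overhead—but the core trade-off is the one shown: latency requirements pin a workload to the edge, and everything else can chase efficiency.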
Design and build for sustainability locally
Since some digital workloads inevitably need to be hosted in warmer climates, enterprises must plan carefully to deploy everywhere they need to be without derailing their global sustainability goals. As an example, let’s consider Singapore. The city-state has emerged as one of the leading digital hubs in the Asia-Pacific region, but due to its consistently warm temperatures, it wouldn’t be your first choice if you were primarily concerned with keeping your cooling requirements low.
However, this is certainly not to suggest that Singapore and other warmer metros can’t be part of an effective global sustainability strategy. Some sustainability and efficiency initiatives can be especially helpful in tropical climates like Singapore’s. To prioritize sustainable development in the country, the government of Singapore created the Singapore Green Plan 2030. Our SG5 Equinix IBX® data center in Singapore was designed to support the principles outlined in the Green Plan.
For instance, SG5 was built using Cool Array, our innovative bespoke surface cooling technology. Cool Array represents the next evolution of the fan wall design. It allows SG5 to support customers with high-density air-cooled workloads in the most efficient way possible. As a result, Cool Array drives the following benefits:
- Enables customers to run more compute capacity on the same footprint
- Reduces the energy required to run fans across the entire facility
Together, these benefits contribute to industry-leading power usage effectiveness (PUE). Like all other Equinix data centers in Singapore, SG5 also has 100% renewable energy coverage.
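For readers unfamiliar with the metric, PUE is simply the ratio of total facility energy to the energy delivered to IT equipment; a value of 1.0 would mean every kilowatt-hour goes to compute. A minimal sketch (the figures below are made up for illustration, not actual SG5 measurements):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by
    IT equipment energy. Lower is better; 1.0 is the theoretical floor."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: 13,000 kWh total, 10,000 kWh to IT equipment
print(round(pue(13_000, 10_000), 2))  # 1.3
```

Cooling is typically the largest non-IT contributor to the numerator, which is why fan-efficiency gains like Cool Array’s translate directly into a lower PUE.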
Cool Array at Equinix SG5
Another key aspect of pursuing sustainability at the local level is investing in the people who will create the next generation of sustainability innovations. This is why Equinix has partnered with the National University of Singapore (NUS) and Singapore Management University (SMU) to fund scholarships for sustainability-focused students. Equinix also joined a research project with NUS to study the viability of hydrogen for sustainable power generation.
Singapore is just one example of what can happen when service providers, enterprises and the public sector come together to pursue shared sustainability goals. Similar sustainability innovation is happening in many different places across the globe. As the world’s digital infrastructure company®, Equinix is proud to help support those efforts. By investing in sustainability at the community level, we can help our customers assemble the global sustainability strategies they need.
Learn more about the challenges of global sustainability and how Equinix is responding
At Equinix, being a global company is part of our DNA. We offer data centers in 70+ metros across all six populated continents because we know our customers need globally distributed digital infrastructure to future-proof their operations.
However, we also prioritize sustainability across all the different locations in which we operate. This commitment helped us achieve a number of milestones in 2023:
- We achieved 96% renewable energy coverage across our global operations for the second consecutive year. More than 235 of our data centers around the world have already achieved 100% renewables coverage.
- We reduced operational emissions (Scope 1 and Scope 2) by 24% from our 2019 baseline.
- We invested $78 million to reduce energy demand across our global footprint.
- Through our heat export initiative, we put 4,000 MWh of recovered heat from our data centers to use in the community.
All our sustainability design principles and innovations come together to support our vision for a cleaner, more efficient data center to meet the needs of tomorrow’s digital businesses. To learn more about how we’re doing it, read our sustainability report.