Today’s enterprises face a mandate from customers and regulators alike to make their operations more sustainable. In the Equinix 2023 Global Tech Trends Survey (GTTS), 68% of global IT leaders said they measure the environmental impact of their IT equipment and infrastructure and actively work to limit it.
Many companies are simultaneously looking to implement compute-intensive technologies like AI, which can make their sustainability efforts even more challenging. To get the compute resources they need while also optimizing energy efficiency, some enterprises—and the service providers they partner with—look to position data centers in colder environments, far away from major population centers. For instance, Iceland has attracted a lot of attention from the data center industry due to its unique combination of readily available renewable energy and a colder climate that helps reduce cooling requirements.
However, many modern digital applications require extremely low latency. To make the most of these applications, enterprises need distributed digital infrastructure in locations throughout the world. In some cases, this infrastructure must be deployed at the digital edge, in close proximity to the end users and devices that interact with applications on a regular basis.
In this blog post, we’ll explore how enterprises can balance their need for sustainable operations with their need to support latency-sensitive applications. To do this, they must strategize for sustainability globally while designing and building locally.
Strategize for sustainability globally
Returning to our earlier example of AI, we can see how workload placement plays an essential role in any global sustainability strategy. Different AI workloads have different requirements and should therefore be distributed across different locations for the best possible balance of performance and efficiency.
AI inference workloads are very latency sensitive, as they require a constant stream of near real-time data from many different sources. Moving data back and forth over long distances inevitably introduces delays, which diminishes the value of the real-time insights those workloads are meant to deliver. The only way to keep latency reliably low is to bring compute infrastructure closer to data sources by deploying at the digital edge. This means you can’t always choose a deployment location based solely on how energy-efficient it is. You must deploy in all the right locations and then make each one as efficient as possible given local conditions.
The requirements of AI training workloads are very different from those of inference workloads. In general, training workloads are much larger, which means their compute requirements are much higher. However, they’re also less sensitive to latency, which means you can run them as batch workloads in a centralized location such as a large colocation data center or public cloud. Since latency isn’t an issue for these workloads, you can place them in the most energy-efficient locations possible, even if that means deploying hundreds of miles away from the highest concentrations of end users.
For instance, you could deploy in Finland to take advantage of the cool local climate. This could allow you to run even very large model training workloads without relying on power-hungry cooling systems. You should also consider other factors, such as the availability of renewable energy in a given market and whether you’ll have access to next-generation sustainability innovations, such as high-density on-chip liquid cooling systems, as they become more widely available.
Deploying larger training workloads in efficient core locations and smaller inference workloads in distributed edge locations can help you take advantage of the tremendous potential of AI in a sustainable manner.
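To make this placement logic concrete, here is a minimal sketch in Python. It is purely illustrative: the site names, PUE values, renewable shares and latency figures are invented assumptions, not Equinix data. The point is simply that a latency budget constrains inference to nearby edge sites, while batch training is free to land wherever efficiency is highest.

```python
# A minimal, purely illustrative sketch of the placement logic described above.
# All site names, PUE values, renewable shares and latency figures are invented
# for the example and are not Equinix data.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Site:
    name: str
    pue: float              # power usage effectiveness (lower = more efficient)
    renewable_share: float  # fraction of energy covered by renewables
    latency_ms: float       # estimated round-trip latency to the workload's data sources


def pick_site(sites: List[Site], max_latency_ms: Optional[float]) -> Site:
    """Return the most efficient site that still meets the latency requirement.

    Latency-sensitive inference workloads pass a tight max_latency_ms and end up
    on nearby edge sites; batch training workloads pass None and are free to land
    on the most efficient core site, wherever it happens to be.
    """
    candidates = [
        s for s in sites
        if max_latency_ms is None or s.latency_ms <= max_latency_ms
    ]
    if not candidates:
        raise ValueError("no site satisfies the latency requirement")
    # Prefer low PUE first, then high renewable coverage.
    return min(candidates, key=lambda s: (s.pue, -s.renewable_share))


sites = [
    Site("edge-local-metro", pue=1.40, renewable_share=1.00, latency_ms=5),
    Site("core-nordic", pue=1.15, renewable_share=0.95, latency_ms=180),
]

print(pick_site(sites, max_latency_ms=10).name)    # inference -> edge-local-metro
print(pick_site(sites, max_latency_ms=None).name)  # training  -> core-nordic
```

In practice, of course, the decision involves far more variables, but the same principle applies: set the latency constraint first, then optimize for efficiency within it.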
Design and build for sustainability locally
Since some digital workloads inevitably need to be hosted in warmer climates, enterprises must plan carefully to deploy everywhere they need to be without derailing their global sustainability goals. As an example, let’s consider Singapore. The city-state has emerged as one of the leading digital hubs in the Asia-Pacific region, but due to its consistently warm temperatures, it wouldn’t be your first choice if you were primarily concerned with keeping your cooling requirements low.
However, this is certainly not to suggest that Singapore and other warmer metros can’t be part of an effective global sustainability strategy. There are some sustainability initiatives that could be especially helpful in tropical locations like Singapore. For instance, blue carbon refers to carbon captured by coastal and marine ecosystems.[i] The coastal ecosystems of Singapore are home to mangrove forests, which have high carbon capture rates. Therefore, investing in coastal conservation as part of a blue carbon initiative could be especially helpful in Singapore.
To prioritize sustainable development in the country, the government of Singapore created the Singapore Green Plan 2030. Equinix SG5, our newest Equinix IBX® data center in Singapore, was designed to support the principles outlined in the Green Plan. For instance, SG5 was built using the Equinix Cooling Array, our innovative bespoke surface cooling technology. This allows SG5 to support customers with high-density workloads while keeping power consumption low. Like all other Equinix data centers in Singapore, SG5 has 100% renewable energy coverage.
Another key aspect of pursuing sustainability at the local level is investing in the people who will create the next generation of sustainability innovations. This is why Equinix partnered with the National University of Singapore (NUS) and Singapore Management University (SMU), pledging SG$160,000 toward sustainability-focused scholarships.
Singapore is just one example of what can happen when service providers, enterprises and the public sector come together to pursue shared sustainability goals. Similar sustainability innovation is happening in many different places across the globe. As the world’s digital infrastructure company®, Equinix is proud to help support those efforts. By investing in sustainability at the community level, we can help our customers assemble the global sustainability strategies they need.
Learn more about the challenges of global sustainability and how Equinix is responding
At Equinix, being a global company is part of our DNA. We offer data centers in 70+ metros across all six populated continents because we know our customers need globally distributed digital infrastructure to execute their digital transformations and future-proof their operations.
However, we also prioritize sustainability across all the different locations in which we operate. This commitment helped us achieve a number of milestones in 2022, as summarized in the most recent edition of the Equinix sustainability report:
- We achieved 96% renewable energy coverage across our global operations—our furthest progress yet toward our goal of 100% coverage by 2030.
- We reduced operational emissions (Scope 1 and Scope 2) by 23% from our 2019 baseline.
- We invested $45 million to reduce energy demand across our global footprint.
- We became the first digital infrastructure company to make an efficient temperature commitment for its data center operations. Supplying air to IT equipment across a wider temperature range reduces the demand for cooling, supporting our goal of operating efficiently even in warmer climates.
All our sustainability design principles and innovations come together to support the data center of the future—our vision for a cleaner, more efficient data center to meet the needs of tomorrow’s digital businesses. To learn more about what’s involved with building the data center of the future, read our white paper The Data Center of the Future: Reaching Sustainability.
[i] “Mitigating Climate Change Through Coastal Conservation,” The Blue Carbon Initiative.