Climate change, geopolitics, and supply chain stress combined to create record energy prices in 2022 and 2023. The Russian invasion of Ukraine set off an energy crisis across Europe, with dramatic fluctuations in energy prices and widespread anxiety about whether energy demand could be met across the continent. Although the continent was graced with a relatively mild winter, and the EU has taken action to accelerate the transition away from Russian fossil fuels, the energy crisis there is unlikely to resolve any time soon.
In the United States, a succession of natural disasters caused high volatility in wholesale electricity prices throughout 2022. These included winter storms and natural gas constraints in New England, heat waves in Texas and the western states, and extreme cold in the Pacific Northwest.
Energy cost volatility can make it difficult for data center operators to plan investments, anticipate profitability, and attract capital.
Availability of grid power and transmission and distribution (T&D) capacity is a growing constraint on the placement of data centers and is further contributing to energy price uncertainty for data center operators. In Ireland, data centers use a whopping 17% of the nation’s electricity – a figure that is expected to rise to 28% by 2030. Generation capacity on the Irish grid has become extremely scarce, creating a de facto moratorium on new data center development, with more than 30 projects put on hold.
In Northern Virginia, another major digital infrastructure hub, the electrical grid is similarly challenged to meet the growing demand from data centers. The state Department of Environmental Quality was preparing to lift restrictions on running diesel generators so that data centers could use their backup power systems to relieve grid stress during emergencies. After significant public outcry over the potential emissions associated with such a scheme, the state retracted the proposal in April.
Around the world, competition not just for electricity but for the water needed to generate it will continue to limit where new data centers can be built, and may stoke public opposition to new projects.
Grid reliability is also increasingly threatened by extreme weather, including wildfires and severe storms. Incidents of sabotage are also on the rise – both physical and cyber attacks on energy infrastructure. On top of supply constraints caused by the Russian invasion of Ukraine, Europe had to deal with widespread and sustained power outages in 2022 from heatwaves and floods. In the United States, power outages caused by severe weather are increasing in frequency and duration. Long-term outages like those precipitated by wildfires in California, or recent freezes in Texas, can challenge a data center’s store of diesel fuel, and these same conditions can both create immense competition for that fuel and impede its delivery. Of course, extreme weather can also impact data center energy infrastructure directly, threatening onsite transformers and energy distribution equipment.
Sustainability performance is an increasingly common condition of investment for data centers. As capital flows tighten, investors are introducing new sustainability criteria for data center operators. In its corporate sustainability report, Vantage shared that, in order to secure two construction loans from Societe Generale, a multinational investment bank, for the expansion of campuses in Quebec and Virginia, it was required to share thorough sustainability data. The bank required formal documentation of sustainability goals, a net-zero 2030 roadmap, ISO certifications, GWP disclosure, PUE calculations, and more.
Customers are likewise increasing sustainability requirements in RFPs. This is true for data center operators evaluating equipment vendors and for tenants selecting a colocation provider. For several years we have seen a sustainability section in RFPs, but traditionally check boxes or boilerplate descriptions of sustainability programs have been all that was required.
This year, we have heard from our IG customers that they will be requiring thorough, meaningful carbon data as part of RFPs. They are likely to require that this data conform to international standards for carbon measurement and reporting, and to make it either a gate for entry into the RFP process or a means of comparison among bids.
Going forward, data center tenants may add resilience to their evaluation criteria as well. Service level agreements will remain the baseline, but savvy data center customers may seek to ensure that the facilities housing their IT are located in areas with lower wildfire, flood, or hurricane risk. They may also ask their data center operator to give them options for energy procurement that are protected against cost volatility.
Liquid cooling of IT equipment has been on the horizon for many years, but 2022–2023 has been a turning point. Applications that require high-powered processors, such as artificial intelligence and machine learning, are increasingly deployed with liquid cooling rather than air cooling. The growing deployment of these high-density systems is expected to increase the presence of liquid cooling in data centers. Operators are grappling with the best way to plan, design, deploy, and manage these systems, especially given wide-ranging growth estimates.
CPU and GPU power has been increasing over the years and is now pushing the limits of air-cooled heat sinks. For example, 400-watt CPUs and 700-watt GPUs are available today. At these chip powers, servers can be configured at up to 1.5 kW per U, resulting in a fully configured rack of greater than 50 kW.
Today, AI systems are easily configured at densities of 25–40 kW per rack. When CPUs exceed 250 watts and GPUs exceed 400 watts, liquid cooling becomes the better option. When air-cooled racks exceed 25 kW, it is difficult to ensure proper airflow, cooling becomes energy intensive, and the result is a noisy, unpleasant environment to work in.
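To make the arithmetic above concrete, here is a minimal sketch that estimates per-rack power from a hypothetical server configuration and applies the rule-of-thumb thresholds quoted in this section. The component counts, servers per rack, and overhead factor are assumptions chosen for illustration, not measurements or vendor guidance.

```python
# Rough rack-power estimate using the thresholds quoted above.
# Component counts and the overhead factor are illustrative assumptions, not vendor data.

CPU_WATTS = 400   # high-end CPU example from the text
GPU_WATTS = 700   # high-end GPU example from the text

def server_power_watts(cpus: int = 2, gpus: int = 4, overhead: float = 1.3) -> float:
    """Estimate per-server power; 'overhead' loosely covers memory, storage, fans, and PSU losses."""
    return (cpus * CPU_WATTS + gpus * GPU_WATTS) * overhead

def rack_power_kw(servers_per_rack: int = 10) -> float:
    """Estimate total rack power in kW for a given number of servers."""
    return servers_per_rack * server_power_watts() / 1000.0

def suggested_cooling(cpu_w: float, gpu_w: float, rack_kw: float) -> str:
    """Apply the rule-of-thumb thresholds cited above (250 W CPU, 400 W GPU, 25 kW rack)."""
    if cpu_w > 250 or gpu_w > 400 or rack_kw > 25:
        return "liquid cooling is likely the better option"
    return "air cooling is likely sufficient"

rack_kw = rack_power_kw()
print(f"Estimated rack load: {rack_kw:.1f} kW")   # ~46.8 kW with these assumptions
print(suggested_cooling(CPU_WATTS, GPU_WATTS, rack_kw))
```

With these assumed values the estimate lands near 47 kW per rack, consistent with the 25–50+ kW range described above and well past the point where air cooling alone becomes impractical.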
In recent years, the focus on implementing Operational Technology (OT)-specific cybersecurity technology and programs in the data center industry has skyrocketed. This is driven by a number of factors, chief among them the exponential increase in connected devices, meters, and software solutions used to manage IoT and data analytics at scale.
While this embrace of technology is a positive thing, it expands the potential threat vectors in a data center if it is not managed properly. Business leaders are occasionally skeptical that cybersecurity breaches can cause downtime or loss of revenue, but this is a significant real-world concern – and it is only one of several key reasons why the data center industry needs to take cybersecurity more seriously.
In a recent survey of over 425 data center companies, Profitwell Research found that data center companies overwhelmingly aspire to reach a high level of OT cybersecurity – specifically, Security Level 3+ as defined in the IEC 62443 standard, which protects against intentional violation using sophisticated means with moderate-to-advanced resources, system-specific skills, and motivation.
With that said, the same survey found that data centers’ current level of OT cybersecurity maturity falls well short of those aspirations, with long timelines to achieve higher levels of security. For example, despite the clear need, the data center sector has not adopted large-scale cybersecurity monitoring and management solutions. Instead, the industry ends up “working around the issue” through a few ineffective means.
These workaround policies lead to more hands-on, on-site management, slower deployment times, and less agility in responding to client requirements. This is particularly painful as our industry faces a talent shortage, “ridiculously short” deployment timeframes, and ever more discerning customers.
The data center industry needs to accelerate the adoption of effective OT cybersecurity technologies, alongside the right personnel and processes, to demonstrate defensibility to investors, customers, regulators, and prospective employees. Cutting-edge cybersecurity solutions exist today to automatically discover devices, unearth potential threats, and enforce cybersecurity policies. These solutions would allow operators to adopt the latest infrastructure management tools and become more agile without sacrificing reliability.
Schneider Electric’s Global Cybersecurity Solutions and Services organization is already hard at work creating a vision for data center OT cybersecurity standardization, drawing on decades of experience across almost all major industries and attack surfaces. This standardization effort involves input from major technology and solution providers in the industry, hyperscalers, colocation providers, and enterprise data center owners.
Data center operators are increasingly looking for ways to accurately identify deeper levels of asset detail in a facility, such as make, model, firmware, serial number, and lifecycle information. In the data center industry, this information is absolutely critical. An accurate, up-to-date OT asset inventory not only shows where real vulnerabilities lie and how to remediate them; it also captures lifecycle status for better modernization planning and can drastically reduce the cost of services through more effective planning and efficiencies.
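As a concrete illustration, the minimal sketch below shows one way such an inventory record might be structured and flagged for follow-up. The field names, the end-of-support check, and the example device are illustrative assumptions, not a description of any particular asset management product.

```python
# Illustrative OT asset inventory record; all field names and values are assumptions.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class OTAsset:
    make: str
    model: str
    serial_number: str
    firmware_version: str
    end_of_support: date                      # lifecycle milestone used for modernization planning
    known_vulnerabilities: list = field(default_factory=list)  # e.g. CVE identifiers

    def needs_attention(self, today: date) -> bool:
        """Flag assets that carry known vulnerabilities or have passed end of support."""
        return bool(self.known_vulnerabilities) or today >= self.end_of_support

# Example: a hypothetical power meter with no recorded vulnerabilities and support through 2026.
meter = OTAsset("ExampleVendor", "PM-1000", "SN123456", "2.4.1", date(2026, 1, 1))
print(meter.needs_attention(date(2025, 6, 1)))  # False until a CVE is recorded or support lapses
```

Even a simple structure like this makes it possible to report, at any time, which assets are running outdated firmware or approaching end of support – exactly the kind of visibility described above.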
While the data center sector leads the way with many innovative technologies in the IT space, it has not adopted cybersecurity monitoring and management solutions at scale in the OT space, despite these clear needs. Ironically, the industry viewed as the leader in digital advancement instead “works around the issue” through a number of archaic means, including air-gapping infrastructure from advisory platforms, deploying burdensome network topologies, and avoiding cloud-based solutions.
If we are to continue to lead the way in digital advancement, we must invest in proactive, OT-specific cybersecurity defense platforms and strategies to prevent downtime and unexpected costs, unearth potential threats, and enforce cybersecurity policies. And as we do with IT, we must integrate those technologies into operations centers that allow facilities to work more efficiently and notify staff of specific actions to take.
Data center companies typically rely on multiple vendors for their cybersecurity needs, including hardware, software, and consultancy services. However, these relationships often remain transactional, and the potential of vendor expertise and partnership is never fully realized. As a result, data centers may miss out on opportunities for enhanced integration, improved efficiencies, and access to real-time data insights.
Managed services in the OT cybersecurity space provide comprehensive solutions to bridge the gap between data center companies and the full utilization of their assets and data. They allow knowledge to be centralized on a global scale, enabling facilities to operate more efficiently with less need for on-site specialization. By working with trusted managed services providers, data centers can leverage vendors’ expertise and capabilities holistically and proactively.
Construction complexity is on the rise and is slowing down your ability to scale your data center projects. Finding new and innovative ways to build is essential to the industry’s growth strategy. Growing demand is leading owners to improve their internal processes, driving efficiency wherever possible.
These are some of the pressures we are seeing in the industry today, and it is these pressures that are driving the need to digitize.
Construction is becoming more complex, and it is more important than ever to complete projects within budget, on time, and to the required quality, all while reducing your carbon footprint. As a result, connecting people, processes, and data in innovative ways to provide real-time visibility has never been more important.
Data center construction projects face increasing headwinds, which means that traditional methods are no longer “fit for purpose”. Digitization enables data collection that leads to improved, fact-based decision making.
Owners must begin targeting specific challenges to improve business outcomes, one at a time, in order to build up an end-to-end digital roadmap for enhanced visibility, communication, quality, and efficiency. This starts with identifying the problem area and implementing a digital solution.
Below is a list of some of these complexities, along with their business impacts, that we believe need modernization and digitization.