Artificial intelligence and machine learning are rapidly transforming the digital landscape, driving demand for high-performance chips and ever-greater power densities. As that demand surges, so do concerns about the power, space, and cooling resources needed to run AI infrastructure. Faced with these challenges, enterprises are actively seeking locations that can both support and provision AI data center campuses. Against the backdrop of a global power shortage in the data center market, as reported by CBRE, operators are exploring specialized solutions for deploying AI initiatives. Norway has emerged as a promising candidate to bridge these gaps and become the next AI powerhouse. To realize that vision, however, Norway must strike a careful balance between power, cooling, and spatial considerations to secure its position as the next AI hub.
European Power & Location Scarcities Hinder AI Advancement
For years, the FLAP hubs (Frankfurt, London, Amsterdam, Paris) have been a primary backbone for global computing. Their prime locations and capabilities have consistently attracted investments, bridging computing gaps on a global scale. However, the rise of AI is placing a substantial burden on European power resources, compelling enterprises to seek alternative locations. Hyperscalers would typically invest in one hub to serve an entire region; in light of AI, however, enterprises are forgoing that strategy and investing in multiple sites in hopes of strengthening their edge networks, reports fDi Intelligence. Consequently, the FLAP hubs are grappling with energy procurement to meet these escalating power demands, underscoring the need for new data centers dedicated to facilitating AI development.
The magnitude of these power constraints has prompted regulations and legislative actions governing data center provisioning in many of the FLAP hubs. In 2022, London experienced a housing crisis because data center construction left no electrical capacity for new housing projects, as reported by DCD. Following London’s power-related challenges, Germany introduced regulations in response to government concerns about limited land availability and the potential impact of data centers’ power demands on local decarbonization efforts.
These concerns have culminated in the European Energy Efficiency Directive, scheduled to take effect in May 2024. The directive mandates that campuses with power requirements exceeding 500 kilowatts report on their energy performance. These energy regulations mark the beginning of a new era, and with just 30% of European data centers currently prepared for compliance, as reported by the Uptime Institute, 2024 is shaping up to be a transformative year for computing practices in Europe.
Norway’s Preparation for Global AI Adoption
In contrast, Norway’s proactive initiatives in the data center market have positioned the nation as a prominent player in AI computing. In 2018, the Norwegian government unveiled a comprehensive plan to elevate the country’s data center industry to world-class status, as highlighted by DCD. Building on this commitment, Norway established NORA, a consortium dedicated to bolstering research and education in artificial intelligence within the country, as reported by Forbes. These endeavors have gained further momentum with the recent injection of 1 billion kroner into AI development, reinforcing Norway’s potential as the next European data center hub, as detailed by Innovation News Network. This substantial investment was strategically aimed at preparing Norway for the impending wave of AI technologies.
Norway is renowned for possessing the fastest internet connectivity speeds in the world, as well as abundant natural resources, as underscored by Forbes, making it an appealing destination for AI enterprises seeking a strategic foothold in the European AI landscape. This appeal has already attracted hyperscalers such as Amazon, IBM, and Microsoft, which are leveraging Norway to expand and strengthen their cloud networks. Moreover, Norway has been vocal about its AI strategy and ethical computing guidelines, fostering a high level of trust in both the business and public sectors. Yet as Norway gears up to host next-generation data centers, one question lingers: how will it effectively address the cooling requirements of AI data centers?
Direct-to-Chip Liquid Cooling: Accelerating Norwegian Data Center Advancements
Ensuring effective liquid cooling for high-density chipsets is transitioning from a luxury to a necessity, and locations unprepared to accommodate diverse customer hardware will inevitably encounter challenges. While the Nordic region boasts abundant resources and robust connectivity, sustaining its advantage depends on implementing long-term, efficient cooling strategies for AI chipsets. Let’s dive into why liquid cooling should migrate alongside AI to the Nordic region.
Image created by DALL·E 3, developed by OpenAI
Enhance High-Performance Processors with Liquid Cooling
As Norway readies itself for high-density demands, liquid cooling will play a crucial role in ensuring that devices and chipsets operate both efficiently and effectively. Alongside the demand for AI comes the need for specialized hardware such as GPU clusters, 3D chip packages, and purpose-built AI data centers, reports The Wall Street Journal. The distinguishing feature of AI data centers lies in their ability to process many computations simultaneously, enabling efficient analysis of vast data volumes. These computing benefits, however, come with increased heat production: racks now exceed 40 kW, pushing traditional air cooling to its maximum capacity as it struggles to keep pace with modern GPUs and high-performance computing (HPC). As businesses incorporate high-density processors, like NVIDIA’s H100, into their upcoming data centers, the importance of delivering efficient cooling across large TDP ranges becomes increasingly evident.
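To see why racks above 40 kW strain air cooling, a back-of-the-envelope estimate helps: the heat-balance relation Q = ṁ·c_p·ΔT gives the airflow a single rack would demand. The sketch below uses textbook air properties and an illustrative 10 K inlet-to-outlet rise; the figures are hypothetical, not vendor data.

```python
# Rough sketch: airflow required to air-cool a single high-density rack,
# from the heat balance Q = m_dot * c_p * delta_T. Values are illustrative.
AIR_DENSITY = 1.2  # kg/m^3, air at roughly 20 degrees C
AIR_CP = 1005.0    # J/(kg*K), specific heat of air

def airflow_m3_per_s(heat_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to remove `heat_w` watts of rack heat
    at an inlet-to-outlet temperature rise of `delta_t_k` kelvin."""
    mass_flow = heat_w / (AIR_CP * delta_t_k)  # kg/s of air
    return mass_flow / AIR_DENSITY             # m^3/s of air

# A 40 kW rack with a 10 K air-side temperature rise:
flow = airflow_m3_per_s(40_000.0, 10.0)
print(f"{flow:.2f} m^3/s of air (~{flow * 2118.9:.0f} CFM) for one rack")
```

Several cubic metres of air per second for a single rack is the kind of figure that makes fan power, raised-floor pressure, and acoustics impractical at scale, which is the gap direct liquid cooling fills.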
Advanced liquid cooling solutions, employing microjet impingement to precisely target processor hotspots, address these cooling concerns of modern GPUs for HPC and AI. By targeting the cooling directly on hotspots, microjet impingement technology can efficiently manage high-density heat fluxes without concern for thermal throttling. This not only enables the support of denser workloads but also leads to improved compute performance and energy savings. As AI migrates to Norway, liquid cooling solutions are predicted to follow suit to address the cooling requirements of AI infrastructure and ensure long-term performance and efficiency.
Prefabricated Liquid Cooling Solutions Expedite Norwegian Data Center Lead Time
Rapid AI expansion has ignited competition to efficiently cool AI infrastructure. However, despite the indisputable advantages of AI, many enterprises are opting for phased approaches to manage risk, acknowledging the uncertainties and fluctuations within the AI market. This strategy extends to cooling infrastructure, resulting in a heightened demand for scalable, prefabricated liquid cooling solutions. As a result, air-assisted liquid cooling or hybrid cooling approaches are gaining traction in the market as a way to mitigate these risks without overcommitting to one solution. These solutions offer a cost-effective means of accommodating the growing needs of Norway’s data centers as they expand. The accelerated deployment times associated with these solutions will help maintain Norway’s competitiveness as it strives to establish itself as an AI powerhouse. Furthermore, closed-loop liquid cooling solutions are location-independent and thrive even in challenging environments with temperatures exceeding 35°C, mitigating the location scarcity risks that have been observed in previous hubs like FLAP.
Warmer Coolants Advance Norway’s Sustainability Initiatives in AI Data Centers
As an energy-rich country, Norway has prioritized sustainability by leveraging its extensive hydropower resources to power energy-intensive industries while minimizing GHG emissions. This strategy has enabled it to establish a renewables-based electricity system, with hydropower accounting for a substantial 92% of generation. Thanks to this resource, Norway is even able to export electricity to neighboring countries. These hydropower benefits translate into low electricity rates in the data center market, fostering competitive and sustainable computing practices. Coupled with relatively affordable land prices, these advantages have attracted data center developments such as the Green Mountain facility at Rennesøy, which derives all its power from renewable sources and employs seawater for server cooling, achieving a PUE (Power Usage Effectiveness) of less than 1.2.
Building on this strong foundation, direct-to-chip liquid cooling can further strengthen Nordic data center sustainability. By raising inlet coolant temperatures to as high as 60°C, single-phase direct-to-chip liquid cooling using a CDU can significantly reduce energy consumption by alleviating the burden on the CRAH and HVAC systems responsible for conditioning the coolant. This single-phase direct-to-chip technology can cool over 1,500 W in a single socket, all without the hazardous fluids associated with two-phase immersion cooling. The strategy improves processor performance and eliminates the need for chillers, cooling towers, and evaporative cooling. As a result, the technology delivers impressive water and energy savings, with PUEs dropping to as low as 1.02.
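PUE itself is a simple ratio, total facility energy over IT equipment energy, so the efficiency gap is easy to illustrate. The sketch below compares a conventional chilled-air plant against a warm-water direct-to-chip loop; the overhead percentages are hypothetical round numbers chosen to land on the article's 1.02 figure, not measurements from any specific facility.

```python
# Illustrative PUE comparison (hypothetical overhead figures, not measured
# data from any Norwegian facility).
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy."""
    return total_facility_kwh / it_equipment_kwh

it_load = 1_000.0  # kWh drawn by servers, storage, and networking

# Chilled-air plant: assume chillers, CRAHs, and fans add ~40% overhead.
air_cooled = pue(it_load * 1.40, it_load)

# Warm-water direct-to-chip loop: assume pumps and CDU add ~2% overhead.
liquid_cooled = pue(it_load * 1.02, it_load)

print(f"air-cooled PUE:     {air_cooled:.2f}")     # 1.40
print(f"direct-to-chip PUE: {liquid_cooled:.2f}")  # 1.02
```

The difference between those two ratios is pure cooling overhead, which is why warmer coolants that bypass chillers translate directly into facility-level energy savings.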
Furthermore, liquid cooling can play a pivotal role in advancing Norway’s heat reuse initiatives. In 2021, the Norwegian government mandated that data centers explore methods to redirect waste heat into district heating systems, as reported by DCD. However, despite these government directives, executing such initiatives efficiently and cost-effectively can be a complex endeavor, as acknowledged by Petter M. Tømmeraas, CEO of AQ Compute. Drawing from his prior experience at Norway’s Green Mountain center, Tømmeraas observed that air cooling poses more challenges for heat reuse compared to water-based systems. Simply put, air is not an efficient source for heat transfer, making it exceptionally challenging to implement heat reuse strategies in a solely air-cooled system.
However, it’s not just coolant type that affects heat reuse; coolant temperature also plays a crucial role in feasibility. The waste heat from typical air-cooled facilities (around 30°C to 40°C) is at too low a temperature for most district heating networks, introducing another layer of complexity. By adopting liquid cooling with warmer inlet temperatures of up to 60°C, Norway can more effectively support heat reuse initiatives while simultaneously enhancing compute performance for its AI servers.
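Tømmeraas's point that air is a poor carrier for heat reuse comes down to specific heat: water holds roughly four times more heat per kilogram than air (and vastly more per cubic metre). A quick sketch using the same Q = ṁ·c_p·ΔT balance makes the comparison concrete; the 100 kW load and 20 K drop at the district-heating exchanger are illustrative assumptions.

```python
# Rough comparison of air vs. water as a heat-reuse carrier: mass flow
# needed to export 100 kW of waste heat with a 20 K temperature drop at
# the district-heating exchanger. Load and delta-T are illustrative.
CP_AIR = 1005.0    # J/(kg*K), specific heat of air
CP_WATER = 4186.0  # J/(kg*K), specific heat of water

def mass_flow_kg_s(heat_w: float, cp: float, delta_t_k: float) -> float:
    """Mass flow required to carry `heat_w` watts: m_dot = Q / (cp * dT)."""
    return heat_w / (cp * delta_t_k)

heat_w, d_t = 100_000.0, 20.0
air_flow = mass_flow_kg_s(heat_w, CP_AIR, d_t)
water_flow = mass_flow_kg_s(heat_w, CP_WATER, d_t)
print(f"air:   {air_flow:.2f} kg/s")
print(f"water: {water_flow:.2f} kg/s")
```

Water's fourfold per-kilogram advantage, compounded by its roughly 800-fold density advantage over air, is why a warm-water direct-to-chip loop can feed a district heating exchanger with modest pipework while an air-cooled hall cannot.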
Suffice it to say, Norway is spearheading several sustainability initiatives in the data center space; deploying efficient direct-to-chip (D2C) liquid cooling that leverages warmer coolant temperatures can further improve data center sustainability and reduce water consumption.
Driving Efficiency and Lower PUEs in Norway’s AI Hubs in 2024
The Norwegian data center industry stands poised for a remarkable future in the era of generative AI and high-performance computing. With exceptional connectivity, access to robust power supplies, abundant renewable energy sources, and a cost-effective operational environment, Norway presents an ideal setting to accommodate the exponential growth of AI and HPC workloads. When combined with the adoption of efficient liquid cooling solutions, Norway emerges as a strong contender to become the preferred destination for the next generation of AI data centers.
Discover how liquid cooling technologies can prepare your data center for sustainable AI computing in our latest data center, high-performance computing, colocation, and federal case studies.