In our Nov/Dec 2024 issue, we sat down with industry experts to find out more about data centre cooling technologies and why they’re important.
We are delighted to have Rittal India Pvt. Ltd.’s Mr. Angelo Barboza, General Manager – IT Infrastructure, Head of Competence Centre, Re-BU IT Business for ASEAN, share his perspective on the importance of data centre cooling.
Q. Why is data centre cooling important?
These days, data centres are at the heart of the IT system in nearly all companies. This is where all the company’s important data and information converge. In recent years, data centres have grown steadily larger, and their energy consumption levels have risen. A higher performance density also means that the servers produce more waste heat that needs to be dissipated. If heat is left to accumulate in the racks, it can lead to system failures. A system failure will interrupt operations and data may be lost, which in turn incurs high costs and loss of earnings for the company. To avoid such situations, it is necessary to use a suitable cooling solution that is capable of dissipating excess heat completely, keeping the climate inside the enclosure at a constant, appropriate level.
Data centres pack a vast amount of computing capacity into the smallest possible space, so they require an extremely powerful cooling solution capable of dissipating the heat loss and maintaining a constant air temperature. Even distribution of the cool air is just as important: because hot air is lighter and therefore rises, so-called “hot spots” can form at considerably higher temperatures. In individual racks, this can leave servers and components in the top rows cooled less effectively than those in the bottom rows. For this reason, a cooling solution that distributes the cold air evenly is essential. The infrastructure of a data centre can consume nearly as much energy as the servers themselves, with cooling being the most energy-intensive component. Out of consideration for rising energy costs and the environment, we need energy-efficient solutions that minimise energy consumption while dissipating rising heat levels.
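The relationship between infrastructure overhead and IT load described above is commonly summarised by the Power Usage Effectiveness (PUE) metric. The sketch below uses purely illustrative numbers (not figures from the interview): infrastructure that draws as much power as the IT equipment corresponds to a PUE of 2.0, while more efficient cooling pushes PUE toward the ideal of 1.0.

```python
# Illustrative PUE calculation; the kW figures are assumptions,
# not data from the interview or from Rittal.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Infrastructure drawing as much power as the IT load itself:
print(pue(total_facility_kw=1000.0, it_equipment_kw=500.0))  # 2.0

# More efficient cooling reduces the infrastructure overhead:
print(pue(total_facility_kw=650.0, it_equipment_kw=500.0))   # 1.3
```

Lowering PUE is precisely where efficient cooling has the largest lever, since cooling dominates the non-IT share of facility power.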
Rising energy costs and the growing power consumption of powerful IT systems cause a significant hike in electricity and operating costs. The escalating operating costs for power and cooling can be significantly reduced with efficient IT infrastructure solutions from Rittal.
Modern data centres, especially those supporting AI workloads, need powerful cooling systems. The high density of computing power in these centres generates a lot of heat. To prevent overheating and system failures, these centres need cooling solutions that can handle the heat while also being energy efficient, using less water, and making less noise.
AI workloads are different from traditional workloads. They can change quickly and unpredictably, so cooling systems need to be flexible to handle these changes. By meeting these requirements, data centres can ensure that their AI-based computing infrastructure works well and is reliable.
Effective data centre cooling is crucial for ensuring the reliable and efficient operation of modern IT systems. With the increasing demand for high-density computing, especially in AI applications, cooling systems must be capable of handling high heat loads while maintaining energy efficiency, reducing water consumption, and minimising noise pollution. Adaptability to dynamic AI workloads is also essential. By investing in advanced cooling solutions, data centres can optimise their performance, minimise downtime, and contribute to a more sustainable IT environment.
Q. What are the most common data centre cooling systems?
We need to consider cooling solutions based on different data centre applications. For Edge DCs, Enterprise DCs, Colocation DCs, and Hyperscale DCs, the cooling requirements vary significantly.
Rittal offers a range of cooling solutions, from cooling units of 300 W to 7 kW for Small Edge DCs, to Modular Rack DCs with In-Rack cooling of up to 55 kW per rack. Colocation Data Centres and large Data Centre customers often use CRAC/CRAH solutions, while Hyperscalers are increasingly adopting Cool Walls, DLCs, and immersion cooling.
While CRAC/CRAH units have traditionally been the most common solution, they have limitations in terms of cooling capacity per rack. As data centre rack loads continue to increase, In-Rack cooling solutions will become more essential to meet high-density cooling requirements.
AI workloads, especially those involving GPU-accelerated deep learning, often demand extremely high levels of computational power, which can generate substantial heat. Direct-to-chip cooling solutions, which cool the individual chips directly, offer a highly efficient and effective way to manage this heat. By minimising the thermal resistance between the chip and the cooling medium, direct-to-chip cooling can significantly improve the performance and reliability of AI systems. This technology is particularly well-suited for high-density GPU clusters and other AI-intensive applications, where efficient heat dissipation is critical.
By understanding the various cooling options and their advantages, data centre operators can select the most appropriate solution to meet their specific needs and ensure the reliability and efficiency of their IT infrastructure.
Q. What cooling technologies do you offer?
Rittal is committed to providing energy-efficient cooling solutions. For Edge DCs, we offer Blue E+ and LCU units with a cooling range of 300 W to 7 kW per rack. For high-density DCs, we offer LCP DX and LCP CW units with a cooling capacity of up to 55 kW per rack. These units utilise closed-loop cooling for high performance and low energy consumption.
The LCP supports the "front to back" cooling principle, where hot air emitted by the server equipment is drawn in at the front by fans and directed through a heat exchanger module. In the heat exchanger, the heated air is cooled by transferring its thermal energy to a cold water system. The cooled air is then routed in front of the 482.6 mm (19") level in the server enclosure.
The LCP features 6 fans, each housed in a separate fan module. The fans operate with linear control from 0% to 100% as needed for cooling.
Efficient fan technology is achieved through the use of EC motors. EC stands for "electronically commutated," referring to the way the motor is driven: the motor windings are energised electronically in sequence, enabling linear speed control across the entire range. This linear speed control with EC technology adapts efficiently to the volumetric flow actually required, resulting in significantly lower electrical power consumption compared to traditional AC technology.
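The energy savings from linear speed control can be sketched with the fan affinity laws: airflow scales roughly linearly with fan speed, while electrical power scales roughly with the cube of speed. The numbers below are a generic illustration of that relationship, not Rittal fan specifications.

```python
# Illustrative fan affinity-law sketch: power draw falls with the
# cube of fan speed, which is why variable-speed EC fans save so
# much energy at partial load. Values are assumptions for
# illustration, not vendor data.

def fan_power_fraction(speed_fraction: float) -> float:
    """Approximate power relative to full speed (affinity law: P ~ n^3)."""
    return speed_fraction ** 3

print(fan_power_fraction(1.0))  # 1.0   -> full speed, full power
print(fan_power_fraction(0.5))  # 0.125 -> half speed, ~12.5% power
```

This is why fans that modulate continuously to the required airflow, rather than cycling between off and full speed, draw far less power over a typical operating day.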
The Rittal monitoring software is a comprehensive software tool for managing ultra-modern data centres. It allows you to visualise alarms, analyse and manage temperatures. The software also monitors fan speeds. Additionally, it provides preventative monitoring analysis for hardware components, ensuring greater security during live operations. Optional integrated functions guarantee the shortest possible response times in case of escalation.
To address the emerging demand for servers operating on DC power and requiring direct-to-chip cooling, Rittal offers a comprehensive range of direct-to-chip cooling CDU solutions. These solutions are designed to support high-density servers in hyperscale data centre environments, providing cooling capacities ranging from 38 kW to 1 MW. Rittal’s CDUs are available in both liquid-to-air and liquid-to-liquid configurations, ensuring optimal heat dissipation and performance for demanding server workloads.
Q. What is the future of cooling technologies as data centres become modern and sustainable?
The future of data centre cooling lies in sustainable and energy-efficient solutions. Water-based cooling technologies offer significant advantages over traditional air-cooled systems thanks to water's far higher heat absorption capacity. This makes them ideal for modern data centres that prioritise energy efficiency and environmental sustainability.
Various water-based cooling technologies, such as liquid cooling and cold plate cooling, are gaining popularity. These technologies can be combined with free cooling strategies to maximise energy efficiency based on ambient conditions.
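The advantage of water over air can be made concrete with the basic heat-transport relation Q = ρ·V̇·c_p·ΔT. The sketch below uses textbook property values (illustrative assumptions, not vendor data) to compare one litre per second of water against a full cubic metre per second of air at the same 10 K temperature rise.

```python
# Rough comparison of heat transport by water vs. air.
# Property values are textbook approximations, not Rittal data:
#   air:   density ~1.2 kg/m^3, cp ~1.005 kJ/(kg.K)
#   water: density ~1000 kg/m^3, cp ~4.186 kJ/(kg.K)

def heat_transport_kw(flow_m3_per_s: float, density_kg_m3: float,
                      cp_kj_per_kg_k: float, delta_t_k: float) -> float:
    """Q = rho * V_dot * cp * dT, returned in kW."""
    return flow_m3_per_s * density_kg_m3 * cp_kj_per_kg_k * delta_t_k

air = heat_transport_kw(1.0, 1.2, 1.005, 10)        # 1 m^3/s of air
water = heat_transport_kw(0.001, 1000.0, 4.186, 10)  # 1 L/s of water
print(round(air, 1), round(water, 1))  # 12.1 41.9
```

A single litre per second of water carries more than three times the heat of a thousand times its volume of air, which is why water-based solutions scale to rack densities that air cooling cannot reach.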
As data centre operators continue to focus on energy efficiency, the need for sustainable cooling solutions will become even more critical. Factors such as rising energy costs, limited energy availability, and environmental concerns will drive the adoption of innovative cooling technologies.
Rittal’s LCP modular cooling systems provide the infrastructure necessary for building modern, energy-efficient data centres. The modular design offers flexibility for expanding and modernising existing data centres, regardless of the scale of the operation.
By investing in advanced cooling technologies and adopting sustainable practices, data centre operators can ensure the long-term viability and efficiency of their facilities. Predictive maintenance, intelligent monitoring systems, and a focus on energy efficiency will be key to building sustainable data centres that meet the demands of the future.
Q. Do you have any case studies of data centre cooling? (preferably from Southeast Asia / Asia Pacific)
Riverside Medical Centre in Bacolod, Philippines
The Riverside Medical Centre, Inc. (RMCI) is the owner and operator of the Dr. Pablo O. Torre Memorial Hospital (DPOTMH), a premier hospital and medical centre located in Bacolod City, and a proud member of Metro Pacific Hospital Holdings, Inc. (MPHHI). Founded in 1954, the hospital has grown from an 8-bed clinic to a 5-storey, 330-bed medical centre and tertiary institution.
With the medical facility’s expansion and growth, RMCI recognised the need to improve its IT infrastructure to digitise and modernise processes. The server room was relocated to a new building (adjacent to the main hospital), and most of the equipment was upgraded. For the IT infrastructure components, RMCI chose Rittal for its Server Cabinets, Power Distribution Units (PDUs), Precision Cooling, and Environment Monitoring Systems (EMS).
Rittal’s solutions have been implemented to enhance availability and improve the overall efficiency of the four-rack data centre. The Rittal LCP cooling system represents a significant improvement over the existing room-based cooling system in the old server room. LCP Chilled Water units are deployed without the need for raised flooring, making preventive maintenance and overall management more convenient for RMCI IT personnel. The cooling units were also designed with redundancy in mind, ensuring high availability.
Temperature, humidity, water leakage, and power consumption are monitored in real time by Rittal PDU and CMC III sensors. All Rittal solutions can be monitored remotely via RMCI's local network, eliminating the need for frequent on-site visits.
Overall, RMCI is pleased with the modernisation of its data centre. The company is now exploring opportunities to continue this partnership and replicate the same approach in other medical facilities within the MPHHI network across the Philippines.
Mr. Angelo Barboza Profile
An electrical engineer with over 18 years of experience in the IT and datacoms industry, Angelo initially worked as a product manager for IT solutions. He is ATD certified and possesses a profound knowledge and understanding of designing and consulting for data centre infrastructure with a holistic approach to ROI, TCO, PUE, and Lifecycle design, including Rack, Power, Cooling, Security, and onsite and remote monitoring.
Angelo has extensive expertise in managing small, mid, and large-scale strategic IT projects, managing stakeholders, cross-functional & complex teams, and ensuring customer satisfaction.
In his current role, his main responsibility lies in Data Centre consulting & designing of data centres with an emphasis on Rittal’s holistic approach to data centre energy efficiency and green data centre solutions with a futuristic technology approach.