How Much Does It Cost to Cool a Data Center? Optimizing Data Center Cooling Costs
The rapid adoption of artificial intelligence (AI) and other compute-intensive applications is driving greater data center densities. A rack of equipment for an AI workload can easily exceed 40 kW. An AI-optimized server can draw up to 10 kW on its own. While most data centers could likely support that one rack, multiple racks of AI equipment can easily strain the cooling infrastructure.
The amount of heat generated by equipment roughly equals the amount of power consumed, and traditional data center cooling systems are simply not designed to handle these densities. Without supplemental cooling, AI equipment can easily overheat. However, cooling and ventilation systems already account for 30 percent to 55 percent of data center power consumption.
Modern liquid cooling systems can cost between $1,000 and $2,000 per kW cooled. Add consulting fees and ongoing operating expenses to that initial investment, and cooling costs add up quickly. Total yearly investment for enterprise data center cooling systems can easily reach the hundreds of thousands of dollars, while hyperscale data centers can incur cooling costs in the millions. That’s why the efficiency of these systems is critically important to the bottom line. Operators need to know they’re doing everything possible to keep cooling costs in check.
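To make those figures concrete, here is a rough back-of-the-envelope sketch using the numbers cited above (a 40 kW AI rack and $1,000 to $2,000 per kW cooled). The function name and the ten-rack scenario are illustrative assumptions, not a sizing tool.

```python
def cooling_capex(it_load_kw: float, cost_per_kw: float) -> float:
    """Rough liquid-cooling capital cost for a given IT load.

    Heat to remove roughly equals power consumed, so the cooling
    system is sized to the IT load itself.
    """
    return it_load_kw * cost_per_kw

# Hypothetical deployment: ten AI racks at 40 kW each.
racks = 10
kw_per_rack = 40
load_kw = racks * kw_per_rack  # 400 kW of heat to remove

low = cooling_capex(load_kw, cost_per_kw=1_000)   # $400,000
high = cooling_capex(load_kw, cost_per_kw=2_000)  # $800,000
print(f"Estimated cooling capex: ${low:,.0f} to ${high:,.0f}")
```

Even this simplified math shows why a handful of AI racks can push cooling spend well into six figures before any operating costs are counted.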
How To Reduce Data Center Cooling Costs
To keep a lid on costs, data center operators must find ways to dissipate heat more effectively. Here are some factors affecting data center cooling costs and techniques operators can use to increase efficiency.
Data Center Site Selection
The climate of the data center’s location directly impacts cooling efficiency. Data centers in cooler, drier climates require less cooling energy than those in hotter, more humid climates. Air-side economizers can bring low-temperature outside air into the data center to reduce the load on the cooling system. Hot exhaust air can be directed outside or mixed with the ambient air to achieve the desired temperature and humidity range.
Water-side economizers work alongside chilled-water cooling systems. They use evaporation to pre-cool water so the mechanical chiller can be shut off. They can reduce cooling costs by up to 70 percent while in operation and provide redundancy if the chiller fails.
Airflow Management and Ventilation
Cooling is wasted if chilled air doesn’t reach the IT equipment or if heat is not exhausted properly. Proper airflow management and ventilation can optimize cooling and reduce data center cooling costs.
Data center best practices call for isolating chilled air from hot exhaust air in a hot-aisle / cold-aisle configuration. With cold-aisle containment, the cold row is capped at the tops of the cabinets and across the aisle, and doors are installed at the ends of the row to contain the cold air. With hot-aisle containment, a physical barrier is constructed to prevent the mixing of hot and cold air and to direct exhaust air into the A/C return or plenum.
In addition, data center operators should install blanking panels in any open spaces within racks and seal the area between the sides of cabinets and the mounting rails. Within a row, operators should block spaces between racks or where racks are missing. In raised floor environments, perforated tiles should be properly placed and cable openings sealed.
Data Center Temperature Settings
Simply raising the cold aisle temperature can significantly reduce energy use. There generally isn’t a need to keep the cold aisle at 70 degrees Fahrenheit — virtually all equipment manufacturers allow for server inlet air temperature settings of around 80 degrees. For facilities using economizers, elevated cold-aisle temperatures enable more free cooling days and, thus, more energy savings.
Running a server at higher temperatures can reduce its lifespan. However, many larger data centers find that replacing the equipment costs less than maintaining a lower temperature. And since IT equipment is refreshed as often as every three years (to prevent technical debt), the shortened lifespan may not have any meaningful impact on costs.
The Importance of Measurement
Measurement is a crucial component in managing data center cooling costs. Data center infrastructure management (DCIM) software can help unlock the insights needed to optimize power consumption and cooling systems. Modern DCIM solutions harness the power of AI not only to monitor but also to manage data center systems in real time. DCIM software can help mitigate risks, optimize performance, and ensure maximum uptime.
How Enconnex Can Help
Enconnex considers cooling throughout our data center and IT solutions. From our EdgeRack micro data center line with self-contained cooling units to our aisle containment solutions, we have the products and solutions to keep your equipment running at the optimal temperature. Our experts can help you develop an effective strategy for cutting cooling costs as data center densities increase. Just get in touch.
Posted by Thane Moore on December 19, 2023
Thane Moore is the Senior Director of Sales Operations & Logistics for Enconnex and has 20 years of experience in the IT infrastructure manufacturing space working for companies such as Emerson and Vertiv.