How 3 Large Data Centers Attack High Cooling Costs

Heat is the primary enemy of any computer system: it slows processors, damages components, and eventually destroys them. Data centers, which often pack thousands of servers into a limited space, are especially vulnerable to heat damage. Cooling can account for more than one-third of a data center's electricity costs, so as data center and colocation volumes grow, providers must rein in energy costs to stay profitable.
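The one-third figure can be sanity-checked with a back-of-envelope estimate based on PUE (power usage effectiveness, total facility power divided by IT power). The PUE value, electricity price, and overhead split below are hypothetical assumptions for illustration, not figures from the article:

```python
# Back-of-envelope estimate of the cooling share of a data center's
# electricity bill. All input figures are hypothetical.

def annual_electricity_cost(it_load_kw, pue, price_per_kwh):
    """Total yearly electricity cost for a facility with the given IT load.

    PUE = total facility power / IT power, so total power = IT load * PUE.
    """
    hours_per_year = 8760
    return it_load_kw * pue * hours_per_year * price_per_kwh

def cooling_share(pue, cooling_fraction_of_overhead):
    """Fraction of total facility power spent on cooling, assuming a given
    share of the non-IT overhead (PUE - 1) goes to cooling."""
    overhead = pue - 1.0
    return (overhead * cooling_fraction_of_overhead) / pue

total = annual_electricity_cost(it_load_kw=1000, pue=2.0, price_per_kwh=0.10)
share = cooling_share(pue=2.0, cooling_fraction_of_overhead=0.7)
print(f"Total bill: ${total:,.0f}/yr, cooling: ${total * share:,.0f}/yr "
      f"({share:.0%} of the bill)")
```

With a (once-typical) PUE of 2.0 and 70 percent of the overhead going to cooling, cooling lands at 35 percent of the bill, consistent with the "over one-third" claim; a modern facility with a lower PUE would see a smaller share.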

Microsoft, Google and Facebook have each addressed data center cooling in unique ways. Many organizations have also proposed practices that can help data centers lower their cooling costs.

Microsoft Takes the Data Center Outdoors

In 2008, Microsoft’s manager of data center services, Christian Belady, speculated computers would be able to brave outdoor elements. He placed a server rack inside a pup tent, where it ran for eight months with an uptime of 100 percent. Based on Belady’s ideas, Microsoft built a roofless data center in Boydton, Virginia.

To protect computers from the elements, Microsoft built boxes that look like shipping crates. The boxes, called IT-PACs, can hold hundreds of servers. Within IT-PACs, side vents capture cool air from the outside that passes through a wet membrane before cooling the actual servers. Microsoft claims its solution uses only 10 percent of the water usually used in other data center cooling strategies.

Google Deploys Close-Coupled Cooling

Last October, Google finally lifted the veil on its data center cooling practices, releasing a photo gallery and a Street View app that offers a look inside the company’s data centers. In the Google approach, the entire room is the cool aisle, while enclosed hot aisles are framed by rows of racks that can hold tens of thousands of servers.

Google keeps its large data centers at 80 degrees Fahrenheit (27 C). Cooling coils filled with chilled water form the ceilings of the hot aisles. Air enters each server's inlet and passes over the components, picking up heat along the way; fans at the rear of the chassis then pull the hot air, which can reach 120 degrees Fahrenheit (49 C), into the enclosed hot aisle between the rack rows.

The hot air rises to the top of the enclosure, where the chilled-water coils cool it back to room temperature before it is exhausted. The piping that feeds each cooling coil descends through a floor opening and runs beneath the raised floor. Google also uses plastic curtains instead of rigid containment to manage airflow.

Facebook Uses Evaporative Cooling Systems

Facebook has replaced its old misting systems, which cooled the incoming air but added humidity, with evaporative cooling systems built around fiberglass media. The top half of Facebook’s data center handles the cooling supply, letting cool air enter from overhead. Outdoor air first passes through a mixing room, where it is blended with exhaust heat to regulate its temperature.

The air then passes through a series of wetted evaporative media, where evaporation cools it further, before reaching a fan wall that pushes it through openings in the data center floor. These openings serve as airshafts leading down to the server area.

Best Practices

While three of the world’s most prominent tech companies have developed innovative data center cooling strategies, any facility can cut cooling costs by following industry best practices:

  • Seal the data center. Sealing regulates humidity, eliminating the need for humidification or dehumidification.
  • Optimize airflow. Arrange racks, air conditioners and cables so air moves in an optimal fashion.
  • Use economizers. Let energy-free cool air from the outdoors provide data center cooling during colder months.
  • Increase efficiency. Achieve better room air conditioning through improved controls and variable-capacity systems.
  • Bring cooling systems closer to the heat source. This step cuts the amount of energy data centers have to spend on air movement.

Implementing best practices requires little effort, pays immediate dividends and can cut cooling costs by 30 to 45 percent. To reduce data center cooling costs even more, consider new technologies and supplemental cooling systems so your facility can support increased machine densities.
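The cited 30 to 45 percent reduction can be turned into a rough dollar range. This sketch assumes cooling is 35 percent of the electricity bill (consistent with the "over one-third" figure earlier in the article); the example bill amount is a hypothetical input:

```python
# Hypothetical estimate of annual savings from cooling best practices.
# Assumes cooling is 35% of the electricity bill and that best practices
# trim 30-45% of cooling costs, per the range cited in the article.

def cooling_savings(annual_bill, cooling_share=0.35, reduction=(0.30, 0.45)):
    """Return the (low, high) range of estimated annual savings."""
    cooling_cost = annual_bill * cooling_share
    return tuple(cooling_cost * r for r in reduction)

low, high = cooling_savings(annual_bill=1_000_000)
print(f"Estimated savings: ${low:,.0f} to ${high:,.0f} per year")
```

For a facility spending $1 million a year on electricity, that works out to roughly $105,000 to $157,500 in annual savings under these assumptions.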

About the Author

Gary Truglio provides consultation services that reduce energy expenditure in data centers. His innovative approaches to data center cooling cut costs while decreasing an organization’s carbon footprint.


© 2024 CEO Hangout. All rights reserved.

