Data Center Cooling: Best Practices

Data centers are the backbone of modern data storage and processing. They house the servers that store data, run applications, and serve the users and systems that connect to them.

However, they consume large amounts of power and generate a lot of heat, which must be dissipated before it causes servers to overheat, malfunction, or even become a safety hazard.

Because of this, it is important to employ an effective data center cooling system. There are a number of different cooling methods, but some are more efficient than others.

In this article, we will discuss the data center cooling best practices you should be employing and the worst practices you should absolutely avoid.


In the article:

  • Implementing Data Center Cooling Technology
  • Airflow and Proper Ventilation
  • Properly Configuring Cooling Equipment
  • Temperature Monitoring
  • Regularly Maintain Equipment
  • Bad Practices
  • Employing Data Center Best Practices

Implementing Data Center Cooling Technology

There are many things to consider when choosing data center cooling technology; the options include air conditioning, water cooling, and liquid nitrogen.

When choosing between these cooling techniques, it is important to consider the climate where the data center will be located, as well as the type of equipment that will be housed within it.


Air conditioning

Air conditioning, or air cooling, is the most commonly used cooling system, and it is effective in most climates. Air conditioning engineers are usually responsible for installing and maintaining this type of unit, which can be expensive. Air cooling systems work on a refrigeration cycle: a compressor circulates refrigerant that chills the air blown through the data center, the air absorbs heat from the equipment, and that heat is then rejected outside where it can dissipate.


Water cooling

Water cooling, a type of liquid cooling, is less common, but it is more efficient and can be used in hot climates where air-based systems are less effective. It is also reliable and does not require a great deal of maintenance. In this system, water is circulated through the server racks to absorb excess heat, cooled by a chiller, and then circulated back to the server racks.
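
To get a feel for the physics at work here, the heat a water loop can carry away is roughly the flow rate multiplied by water's specific heat and the temperature rise (Q = m × c × ΔT). The sketch below is a back-of-the-envelope illustration with made-up numbers, not a design tool:

```python
# Back-of-the-envelope estimate of the water flow rate needed to carry
# away a given heat load, using Q = m_dot * c_p * delta_T.
# All numbers below are illustrative assumptions, not design values.

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K), specific heat capacity of water

def required_flow_rate(heat_load_kw: float, delta_t_c: float) -> float:
    """Water flow rate (kg/s, roughly litres/s) needed to absorb
    heat_load_kw of server heat with a delta_t_c temperature rise."""
    heat_load_w = heat_load_kw * 1000.0
    return heat_load_w / (WATER_SPECIFIC_HEAT * delta_t_c)

# Example: a 50 kW row of racks, with the water allowed to warm by 10 C
print(f"{required_flow_rate(50, 10):.2f} L/s")  # ~1.19 L/s
```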


Liquid nitrogen

Liquid nitrogen is another efficient data center cooling option, and it is becoming more popular because of its relatively low cost and simple maintenance. Produced by separating nitrogen from liquefied air, it cools equipment through evaporation: as the extremely cold liquid boils off, it absorbs heat from the hardware very quickly. Its benefits include that it can be used in any climate, works very fast, which is important in emergencies, and can cool equipment with minimal on-site energy consumption. However, it has serious hazards: the released nitrogen gas can displace oxygen and cause asphyxiation in poorly ventilated spaces, and the extreme cold can cause severe burns, so it must be stored and handled with proper safety precautions.

Each of these cooling systems has its own benefits and drawbacks, so it is important to choose the right one for your specific data center.


Airflow and Proper Ventilation

Data centers must also be properly ventilated in order to operate efficiently. If a data center is not properly ventilated, the servers will overheat and shut down. To avoid this, and to facilitate airflow, the data center cooling design must use the correct layout, and the equipment must be installed correctly.

Overall, the layout should allow for warm air to move away from the working equipment. If the working or cooling equipment is installed in a way that causes poor air circulation, the cooling system will not work properly and the data center will overheat.

To increase airflow, one option is a raised floor. This is a special type of floor made up of panels that can be opened to allow air to circulate underneath the servers. The panels are also removable so that cables and wires can be run through the floor and also easily accessed. This is popular because it allows the servers to be cooled more effectively.

Blanking panels are another option for airflow management. They fill unused rack spaces, preventing the unwanted mixing of warm and cold air and the movement of air into those gaps. They are also easy to remove, so cables and wires remain accessible and servers can be reconfigured without difficulty.

The configuration of your server racks is also very important when determining what types of server rack cooling systems will be required. It is crucial to determine whether you require vertical or horizontal server racks before designing the cooling layout.


Hot and cold aisles

A hot and cold aisle layout is one of the most popular layouts. In data center terminology, aisles are the spaces between rows of racks.

A hot aisle is the aisle that the backs of the server racks face: the servers exhaust their hot air into it. Hot aisles should be positioned so that the cooling equipment can capture this hot air and remove it from the data center.

A cold aisle is the aisle that the fronts of the server racks, where the air intakes are, face. The cooling equipment delivers chilled air into the cold aisles, where it is drawn through the servers.

This layout keeps supply and exhaust air separated, promoting proper airflow and efficient power consumption by both the IT equipment and the cooling equipment.


Properly Configuring Cooling Equipment

When configuring the data center cooling system, it is important to ensure that it is set up for optimal performance. The cooling equipment should be properly sized for the data center and configured to work with the other components. If the cooling equipment is undersized, it will not be able to keep the servers cool; if it is oversized, it will waste energy and unnecessarily increase the data center's operating costs.
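
As a rough starting point, nearly all of the power drawn by IT equipment ends up as heat, so the required cooling capacity tracks the IT load plus a safety margin. The sketch below is only a back-of-the-envelope estimate, and the 20% margin is an assumption for illustration; actual sizing should follow the manufacturer's guidance:

```python
# Rough cooling-capacity estimate from IT load (illustrative only).
# Assumes virtually all IT power becomes heat, plus a safety margin.

BTU_PER_HR_PER_KW = 3412  # 1 kW of heat is roughly 3412 BTU/hr

def required_cooling_btu_hr(it_load_kw: float, margin: float = 0.2) -> float:
    """Estimated cooling capacity in BTU/hr for a given IT load in kW.
    The 20% default margin is a hypothetical example, not a standard."""
    return it_load_kw * BTU_PER_HR_PER_KW * (1.0 + margin)

# Example: 100 kW of IT equipment with a 20% margin
print(f"{required_cooling_btu_hr(100):,.0f} BTU/hr")  # ~409,440 BTU/hr
```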

As well as size, the cooling equipment should be configured for the climate in which the data center is located. In a hot climate, it should be set up to cope with high ambient temperatures; in a cold climate, with low ones. When a data center is not properly cooled, servers can malfunction or suffer permanent damage.

In addition, the cooling equipment should be configured to suit the type of data center. For example, in a raised-floor data center, the cooling equipment should be set up to deliver cool air through the underfloor plenum.


Temperature Monitoring

Another best practice is maintaining the optimal temperature. The room cannot be allowed to get too hot, or it will damage the servers' components. There must be enough cooling capacity for all of the equipment at any given time, regardless of whether some of it is currently in use. The best way to ensure this is proper monitoring, with sensors strategically located throughout the facility.

A data center thermal monitoring system watches the temperature and alerts you when it starts to climb too high, so you can take action before the data center overheats.
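
As a minimal sketch of what that alerting logic might look like: the read_sensor() function below is a hypothetical stand-in for real hardware reads (via IPMI, SNMP, or a building management system), the 27 °C threshold reflects the upper end of the commonly cited ASHRAE recommended inlet range, and the polling interval is arbitrary:

```python
import random
import time

ALERT_THRESHOLD_C = 27.0  # upper end of the ASHRAE recommended inlet range
POLL_INTERVAL_S = 60      # arbitrary polling interval

def read_sensor(location: str) -> float:
    """Stand-in for a real sensor read (IPMI, SNMP, or a BMS API).
    Here it just returns a random plausible temperature."""
    return random.uniform(20.0, 30.0)

def check_sensors(locations: list[str]) -> None:
    """Poll every sensor and flag any reading above the threshold."""
    for location in locations:
        temp_c = read_sensor(location)
        if temp_c > ALERT_THRESHOLD_C:
            print(f"ALERT: {location} at {temp_c:.1f} C "
                  f"(threshold {ALERT_THRESHOLD_C} C)")

if __name__ == "__main__":
    while True:
        check_sensors(["cold-aisle-1", "cold-aisle-2", "rack-A12-inlet"])
        time.sleep(POLL_INTERVAL_S)
```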


Regularly Maintain Equipment

So you have your data center cooling infrastructure and thermal monitoring system in place, but it is important not to forget that the equipment should be regularly maintained. If the cooling equipment is not properly maintained, it can malfunction and cause the servers to overheat.

Maintenance tasks include cleaning the equipment and checking for proper operation. The equipment should also be lubricated and aligned according to the manufacturer’s specifications.
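
If it helps, these tasks can be tracked with a simple reminder script. The sketch below uses hypothetical task names, intervals, and dates purely for illustration; the real schedule should always come from the manufacturer's specifications:

```python
from datetime import date, timedelta

# Hypothetical maintenance log: task -> (interval in days, last performed).
# Intervals and dates are made up; follow the manufacturer's schedule.
MAINTENANCE_LOG = {
    "clean filters and coils": (30, date(2024, 1, 5)),
    "check fans and verify proper operation": (90, date(2023, 12, 1)),
    "lubricate and align per manufacturer spec": (180, date(2023, 9, 1)),
}

def overdue_tasks(today: date) -> list[str]:
    """Return every task whose recommended interval has elapsed."""
    return [
        task
        for task, (interval_days, last_done) in MAINTENANCE_LOG.items()
        if today - last_done > timedelta(days=interval_days)
    ]

for task in overdue_tasks(date.today()):
    print(f"Overdue: {task}")
```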


Bad Practices

Up to this point, we have covered the best practices for cooling systems, but what should you avoid?

One important rule is to avoid overly complicated cooling setups. Extra complexity increases the chance that part of the cooling system fails and causes the data center to overheat.

Another issue is overcooling: cooling the data center more than is necessary. Overcooling can be caused by running too much cooling equipment or by using equipment that is wrong for the space.

It is also important to avoid overusing the equipment. Overuse can make the data center too cold, which wastes energy and can lead to condensation that causes servers to malfunction.
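
A common guard against both overcooling and rapid on/off cycling is deadband (hysteresis) control: cooling switches on above an upper setpoint and off below a lower one, rather than chasing a single exact temperature. Here is a minimal sketch with illustrative setpoints:

```python
# Minimal deadband (hysteresis) thermostat sketch. The setpoints are
# illustrative; real values depend on the facility design.

COOL_ON_C = 26.0   # start cooling above this temperature
COOL_OFF_C = 23.0  # stop cooling below this temperature

def update_cooling(temp_c: float, cooling_on: bool) -> bool:
    """Return the new cooling state for the current temperature.
    Inside the deadband the previous state is kept, which prevents
    rapid on/off cycling and wasteful overcooling."""
    if temp_c >= COOL_ON_C:
        return True
    if temp_c <= COOL_OFF_C:
        return False
    return cooling_on  # within the deadband: no change

# Example: temperature drifting up and back down
state = False
for temp in (22.0, 24.5, 26.5, 25.0, 22.5):
    state = update_cooling(temp, state)
    print(f"{temp:4.1f} C -> cooling {'ON' if state else 'OFF'}")
```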

When thinking about your layout, you should also avoid large open spaces. Open-plan data centers are susceptible to higher temperatures because there is little thermal separation, so the cooling system must work harder, making the facility more expensive to operate. Open spaces are also more prone to dust and dirt, which can clog the cooling equipment and increase the likelihood of overheating.


Employing Data Center Best Practices

By following these best practices and avoiding the bad ones, you can ensure that your data center stays cool and your servers remain safe.

By choosing the right technology, configuring the data center cooling equipment properly, monitoring the temperature and regularly maintaining the equipment, your data center can stay cool and servers can run smoothly.

We have a range of products available to help with your data center cooling.

So get out there and employ these data center best practices.