
How to Calculate Cooling Requirements for a Data Center

August 1, 2023

Heat generation is a normal side-effect of running any electrical equipment, including data center equipment. However, data centers hold your critical information, and a buildup of heat can cause irreparable damage to your servers. They may shut down if temperatures climb too high, and regularly operating under the strain of elevated temperatures can shorten the life of your equipment.

A related problem is high humidity. If the humidity level is too low, it can result in electrostatic discharge — a sudden flow of electricity between two objects that can damage equipment. If the humidity level rises too high, it can cause condensation and corrosion of your equipment. Dust and debris are more likely to gather on your machinery in high humidity.

You can mitigate these risks by abiding by data center cooling requirements, which you can accomplish using data center cooling solutions.

Data Center Cooling Standards

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) publishes guidelines for safe data center temperatures. The most recent recommendation for most classes of information technology (IT) equipment is a temperature between 18 and 27 degrees Celsius (°C), or 64 and 81 degrees Fahrenheit (°F), a dew point (DP) of -9˚C DP to 15˚C DP and a maximum relative humidity (RH) of 60 percent. Those recommendations apply to equipment in the ASHRAE classes A1 to A4.

ASHRAE also provides specific recommendations for its various classes of equipment. These recommendations apply when the equipment is powered on and cover IT rather than power equipment. The short code sketch after the list shows one way to check readings against these ranges.

  • Class A1: The recommended temperature is 15 to 32˚C. The recommended dew point and relative humidity ranges are -12˚C DP and 8 percent RH to 17˚C DP and 80 percent RH.
  • Class A2: The recommended temperature is 10 to 35˚C. The recommended dew point and relative humidity ranges are -12˚C DP and 8 percent RH to 21˚C DP and 80 percent RH.
  • Class A3: The recommended temperature is 5 to 40˚C. The recommended dew point and relative humidity ranges are -12˚C DP and 8 percent RH to 24˚C DP and 85 percent RH.
  • Class A4: The recommended temperature is 5 to 45˚C. The recommended dew point and relative humidity ranges are -12˚C DP and 8 percent RH to 24˚C DP and 90 percent RH.
  • Class B: The recommended temperature is 5 to 35˚C. The recommended dew point and relative humidity ranges are 8 percent RH to 28˚C DP and 80 percent RH.
  • Class C: The recommended temperature is 5 to 40˚C. The recommended dew point and relative humidity ranges are 8 percent RH to 28˚C DP and 80 percent RH.
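
If you log temperature and humidity readings from your data center, a quick script can flag readings that drift outside the range for your equipment class. Below is a minimal Python sketch using the ranges listed above; the dictionary layout and function name are illustrative rather than part of any ASHRAE tooling, and for simplicity it checks only the upper dew point limit alongside the temperature and relative humidity ranges.

    # A minimal sketch (not an ASHRAE tool): encode the class ranges listed
    # above and flag readings that fall outside them. Names are illustrative.
    ASHRAE_CLASSES = {
        # class: (temp_min_c, temp_max_c, rh_min_pct, rh_max_pct, dew_point_max_c)
        "A1": (15, 32, 8, 80, 17),
        "A2": (10, 35, 8, 80, 21),
        "A3": (5, 40, 8, 85, 24),
        "A4": (5, 45, 8, 90, 24),
        "B": (5, 35, 8, 80, 28),
        "C": (5, 40, 8, 80, 28),
    }

    def within_class(equip_class, temp_c, rh_pct, dew_point_c):
        """Return True if a reading sits inside the listed range for a class."""
        t_min, t_max, rh_min, rh_max, dp_max = ASHRAE_CLASSES[equip_class]
        return (t_min <= temp_c <= t_max
                and rh_min <= rh_pct <= rh_max
                and dew_point_c <= dp_max)

    print(within_class("A2", temp_c=24, rh_pct=45, dew_point_c=11))  # True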

Earlier versions of the ASHRAE guidelines recommended a narrower temperature range, reflecting a focus on keeping data centers operational regardless of energy costs. As energy conservation became a higher priority in data centers, ASHRAE published classes that allow a wider temperature range.

Some older equipment may be designed to older versions of the ASHRAE standard. When a data center has a mix of older and newer equipment, it can be more challenging to figure out which recommendations to use. Finding a temperature and humidity range that works for all equipment is essential to ensure longevity and continuity.

Calculating Total Cooling Requirements for Data Centers

Once you decide on an ideal temperature range, you need to quantify the heat output of your system so that you can figure out how much cooling capacity you need. To do this, you estimate the heat output from all IT equipment and other heat sources in your data center. This information will tell you how much cooling power you need.

Determining this will help you choose a cooling system that can reliably meet your needs while avoiding overspending on the capacity you don’t need. Using the method described below, anyone can calculate a data center’s cooling needs to help protect its equipment and data. Here’s how to calculate cooling requirements for your data center.

Measuring Heat Output

Heat can be expressed in various units, including British thermal units (BTU), calories and joules. Heat output, which is a rate, can be measured in BTU per hour, tons of refrigeration or joules per second, which is equal to watts.

Having many different measures of heat and heat output can cause confusion, especially when multiple units are used together. There’s a movement toward making the watt the standard way to measure heat output, with BTU and tons gradually being phased out.

You may still have some data that uses other measurements. If your data mixes units, converting everything into a single format is essential. You can convert to the standard of the watt, or you may want to use whichever measurement is most common in your data. Here’s how to make some conversions you may need, with a short code sketch after the list:

  • To convert BTU per hour into watts, multiply by 0.293.
  • To convert tons into watts, multiply by 3,530.
  • To convert watts to BTU per hour, multiply by 3.41.
  • To convert watts to tons, multiply by 0.000283.
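
If your equipment data mixes units, a few helper functions keep those conversions consistent. The sketch below is just an illustration using the rounded factors above; the function names are not from any standard library.

    # Unit conversion helpers using the rounded factors listed above.
    def btu_per_hour_to_watts(btu_hr):
        return btu_hr * 0.293

    def tons_to_watts(tons):
        return tons * 3530

    def watts_to_btu_per_hour(watts):
        return watts * 3.41

    def watts_to_tons(watts):
        return watts * 0.000283

    # Example: a unit rated at 10,000 BTU per hour outputs roughly 2,930 W.
    print(btu_per_hour_to_watts(10000))  # 2930.0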

Almost all the power your equipment draws from the AC mains turns into heat. The power carried by the data lines is comparatively insignificant, which means a piece of equipment’s thermal output in watts is essentially equal to its power consumption. Data sheets sometimes also list heat output in BTU per hour, but you only need one of these figures in your calculations, and watts are often easier to work with.

One exception to this rule is voice-over-internet protocol (VoIP) routers. As much as one-third of the power these routers consume may be sent to remote terminals, so their heat output is lower than their power consumption. The difference is usually too small to affect your calculation significantly, but you can include it if you want a more precise result.

Determining the Heat Output of a Complete System

To calculate your data center’s total heat output, add the heat outputs of all the system’s components, including the IT equipment, uninterruptible power supply (UPS) systems, power distribution systems and air conditioning units, as well as lighting and people. You can use some simple rules to ascertain the heat output of these components.

The heat output of UPS and power distribution systems consists of a fixed loss and a loss proportional to operating power. These losses are relatively consistent across all brands and models of this type of equipment. You can also use standard values for the heat output of lighting and people. These values are estimates, but they’re consistent enough that they won’t cause significant errors in your cooling requirement calculations.

The fans and compressors in air conditioning units create a substantial amount of heat, which is released into the outdoors rather than the data center. Because of this, air conditioners don’t add to the thermal load of the data center. The heat they produce does, however, impact their efficiency. You should account for this loss in efficiency when sizing your air conditioner.

The other pieces of data you’ll need to calculate the cooling load are the floor area of the center in square feet and the rated electrical system power.

You can conduct an in-depth thermal analysis to determine the exact thermal output of every component in your data center. Still, a quick estimate using the standards listed above is all you need to calculate your data center or server room temperature requirements. The result using the estimate will fall within the typical margin of error of a more detailed analysis. Also, the fact that anyone can conduct the calculation using the estimates without special training is an advantage.

Start with the following calculations to determine your data center’s total heat output:

  • Add up the load power of all of your IT equipment. This number is equal to the heat output.
  • Use the following formula for a UPS system with a battery: (0.04 x Power system rating) + (0.05 x Total IT load power). If a redundant system is used, do not include the capacity of the redundant UPS.
  • Use the following formula for a power distribution system: (0.01 x Power system rating) + (0.02 x Total IT load power).
  • To calculate the heat output of your lighting, use the floor area in square feet or square meters. Then, apply one of the following formulas: 2.0 x floor area in square feet or 21.53 x floor area in square meters.
  • To calculate the heat produced by people in your data center, multiply the maximum number of people who would be in the facility at one time by 100.

Next, add up the subtotals from the calculations listed above. This will give you your room or facility’s total heat source output.
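
As a rough illustration, the sketch below wraps those rules of thumb into a single Python function. The figures in the example are made up; substitute your own IT load, UPS and power distribution ratings, floor area and headcount.

    # Total heat output (watts) using the rules of thumb listed above.
    def total_heat_output_watts(it_load_w, ups_rating_w, pdu_rating_w,
                                floor_area_sqft, max_people):
        it_heat = it_load_w                                # IT heat ~ IT load power
        ups_heat = 0.04 * ups_rating_w + 0.05 * it_load_w  # UPS with battery
        pdu_heat = 0.01 * pdu_rating_w + 0.02 * it_load_w  # power distribution
        lighting_heat = 2.0 * floor_area_sqft              # or 21.53 x square meters
        people_heat = 100 * max_people                     # about 100 W per person
        return it_heat + ups_heat + pdu_heat + lighting_heat + people_heat

    # Example: 20 kW of IT load, 30 kW UPS and power distribution ratings,
    # 500 square feet of floor space and up to 2 people in the room at once.
    print(total_heat_output_watts(20000, 30000, 30000, 500, 2))  # 24100.0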

Other Heat Sources

Until now, we haven’t considered the possibility of heat from sources outside the data center, such as sunlight through windows and heat conducted in through outside walls. This isn’t an issue for many small data centers and server rooms, as many of them have no windows or outside walls. Some small data centers, however, do have these things. Larger data centers typically have windows, walls and a roof exposed to the outside, allowing in additional heat.

If a significant portion of the walls or ceiling of your data center or room is exposed to the outdoors or has a substantial number of windows, consult an HVAC consultant. An HVAC professional can assess the maximum thermal load of the room. Add the load determined by the HVAC consultant to the total heat output you calculated earlier.

Humidification

You’ll also need to consider supplemental humidification when calculating your cooling requirements. Air conditioning systems control humidity in addition to removing heat. In an ideal situation, the system would keep the amount of water in the air constant, eliminating the need for additional humidification.

However, most air conditioner systems have an air cooling function that results in significant condensation, decreasing humidity. It’s best to use supplemental humidification equipment to compensate for this humidity loss. This humidification equipment adds more heat load, which you’ll need to compensate for by increasing the capacity of your cooling equipment.

Under some conditions, though, the air conditioning system might not cause any condensation. In many wiring closets and small data rooms, the air conditioning system uses ducting to separate the bulk return air from the bulk supply air. In this setup, no condensation forms, and you don’t need additional humidification. Instead, the air conditioning unit can operate at 100 percent capacity.

In a large data center with lots of air mixing, the air conditioning system must supply air at lower temperatures to compensate for the recirculation of higher-temperature exhaust air. This causes significant dehumidification and a need for supplemental humidification, which causes the air conditioning system’s performance to decrease. To account for this, you need to oversize your air conditioning system by up to 30 percent.

So, if you have a small system with ducted air return, you likely do not need to account for humidity. If you have a larger system that mixes the air, you may need to oversize your air conditioning system by up to 30 percent.

Other Oversizing Requirements

You’ll also need to supplement your cooling system by adding additional capacity to account for potential equipment failures and load growth.

No piece of equipment is infallible, and eventually, some of your cooling equipment will fail. You can’t afford to let the temperature in your data center climb because of a technical issue with an air conditioning unit. You’ll also need to take each cooling unit offline periodically for maintenance.

You can plan for these needs by adding redundant capacity to your cooling system. The rule of thumb is to add as much redundant capacity as possible, and at a minimum you should have N+1 redundancy, which means one additional unit beyond the minimum requirement.

You should also add extra capacity to accommodate potential future load growth. The amount of data that companies generate is expanding rapidly, and demand for data storage is growing with it. Oversizing your cooling capacity ahead of time will enable you to meet increasing demand more readily and expand more quickly in the future. The amount of oversizing you should add for potential growth depends on the forecasts for your data center.

Determining Air Cooling Equipment Sizes

Once you determine your cooling requirements by considering all of the factors listed above, you can accurately size an air conditioning system. These factors are:

  • The cooling load, or heat output, of your equipment
  • The heat output of your lighting
  • The heat output of personnel
  • The cooling load of your building, if necessary
  • Any oversizing required due to humidification effects
  • Oversizing for redundancy
  • Oversizing for potential future growth

Once you have all the numbers above that are relevant to your data center, simply add them up. The result is the cooling capacity you need for your data center. The required cooling capacity is often about 1.3 times the expected IT load, plus any redundant capacity, especially for smaller server rooms. Your calculated cooling load may differ from this, though, especially if you operate a larger data center.
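
To pull the whole checklist together, here is a rough sizing sketch. The humidification, growth and redundancy figures are placeholders to adjust for your own facility: the 30 percent humidification oversizing comes from the discussion above, while the growth allowance and redundant unit capacity are assumptions made for the example.

    # Rough air conditioning sizing from the checklist above. The default
    # oversizing factors and unit capacity are example assumptions only.
    def required_cooling_capacity_watts(total_heat_w, building_load_w=0,
                                        humidification_oversize=0.30,
                                        growth_oversize=0.20,
                                        redundant_units=1,
                                        unit_capacity_w=10000):
        base_load = total_heat_w + building_load_w
        sized = base_load * (1 + humidification_oversize + growth_oversize)
        return sized + redundant_units * unit_capacity_w

    # Example using the 24,100 W total heat output from the earlier sketch.
    print(required_cooling_capacity_watts(24100))  # 46150.0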

Signs Your Data Center Needs Additional Cooling

Calculating your cooling requirements is essential, but there are additional signs that indicate your data center is running at higher than advisable temperatures. The following signs could mean you need to optimize your data cooling:

  • Overheating server racks: If you go near your servers and feel a significant increase in temperature that isn’t present anywhere else, you could have created a micro-climate in your data center. Installing extra vents that target these hot spots can help regulate the temperature.
  • Lacking redundancy equipment: As we discussed, you should always operate at N+1 redundancy so you have a backup for each power and cooling component.
  • Noticing a smell: If your server room smells musty or you notice signs of mildew on the walls, your data center is too humid. Consider installing a sensor that tracks humidity readings and sends you an alert when humidity levels are too high.
  • Reaching maximum capacity: When you need space to scale but you’ve already met your capacity, you probably need additional cooling.

Cooling Methods

You can use many data center cooling systems to maintain a proper temperature in your data center. The best products for you will depend on your cooling load, the setup of your facility and other factors. Some of the cooling technologies you can choose from are:

  • Chillers: Chillers keep servers cool by removing heat from one element and transferring it to another.
  • Cold aisle containment systems: Some airflow management solutions focus on containing cold air. Cold aisle containment solutions separate the supply airflow from cooling units, enabling more precise temperature control and increasing efficiency.
  • Hot aisle containment systems: Hot aisle containment solutions contain the hot exhaust air and return it directly to the air conditioning unit, preventing it from mixing with the supply air. Returning warmer air to the air conditioning units improves their performance.
  • Blanking panels: Blanking panels block off unused rack space, preventing hot exhaust air from recirculating and helping you maintain a consistent temperature.
  • Directional or high-flow floor tiles: Directional and high-flow floor tiles help direct air toward your equipment, increasing the efficiency of your system and helping you make the most of your cooling capacity.
  • Downflow cooling: Downflow cooling systems direct cool air downwards from the bottom of the unit. Hot exhaust air enters through the top of the unit and then passes over the internal cooling mechanisms before entering the data center.
  • In-row cooling: In-row cooling units are installed in close proximity to the equipment they’re cooling. You can install in-row units on the floor or ceiling. In-row solutions allow for a highly scalable cooling system and enable you to quickly remove high heat loads because of their proximity to your equipment.
  • Portable cooling: Portable cooling units allow you to add flexibility to your cooling system. You can add cooling capacity exactly where you need it at any time. You can use portable cooling units for both spot cooling and area cooling.
  • Rack door heat exchangers: Rack door heat exchangers are affixed directly to server racks. They remove heat from the rack’s exhaust air before discharging the air back into the data center. They can feature active, passive or microchannel heat exchangers.
  • Rack-mounted cooling: You can install air conditioning units directly onto a rack, which enables exceptionally precise cooling.

The way you design your cooling system and place your cooling equipment also has a significant impact on cooling efficiency and effectiveness. You can also design your system to take advantage of natural cooling if the climate where your data center is located allows.


Working With DataSpan

Properly calculating your cooling requirements is crucial to your data center equipment’s reliable, cost-effective operation. Excessive heat and humidity can cause equipment to shut down and reduce its overall lifespan. Using a fairly simple process that requires no specialized training or knowledge, you can calculate the heat load in a room or data center. You must account for your equipment’s cooling load, building-related heat sources, lighting, people, humidification effects, redundancy and potential future growth.

Working with data center experts like those at DataSpan is another smart strategy for ensuring optimal cooling of your data center. We offer the latest cooling technologies and can also help you design, install and maintain your cooling system. We’ll work around your schedule to complete installations and maintenance in a way that minimizes disruptions to your operations. Working with the experts at DataSpan is a sure way to end up with an optimized, efficient and reliable system.

At DataSpan, we deliver customized cooling solutions to meet your company’s unique goals. Want to learn more about how we can help you to optimize your data center cooling? Contact us for more information and start optimizing your cooling system today!
