Heating and cooling
Data center temperature is not constant
Servers can vent and cool themselves, but they are hardly warm-blooded beings. CPU temperature rises by roughly 1 °F for every degree that the ambient temperature rises. In other words, there is a direct connection between data center temperature and the temperature of the rack equipment inside it.
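This one-to-one relationship can be sketched as a simple linear model. The reference ambient temperature and baseline CPU temperature below are hypothetical placeholders, not figures from the article:

```python
def estimated_cpu_temp_f(ambient_f, ref_ambient_f=68.0, cpu_at_ref_f=140.0):
    """Estimate CPU temperature assuming a 1 degree F rise in CPU
    temperature for every 1 degree F rise in ambient temperature.
    Both reference values are illustrative assumptions."""
    return cpu_at_ref_f + (ambient_f - ref_ambient_f)

# A 10 degree F rise in the room shows up as a 10 degree F rise at the CPU.
print(estimated_cpu_temp_f(78.0) - estimated_cpu_temp_f(68.0))  # 10.0
```

The point of the sketch is only the slope: whatever the baseline, ambient heat passes straight through to the silicon.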
When does this become a problem? Depending on the equipment, most CPUs risk overheating and thermal shutdown if a server runs for more than a few minutes at ambient temperatures between 86 and 95 °F.
Most data centers aim for lower ambient temperatures, often within the 64.4 to 80.6 °F range recommended by ASHRAE. This range sits safely below the CPU's point of no return, but in contemporary high-density facilities, temperature is not constant from rack to rack. Hot spots caused by poor ventilation and other disruptions can put isolated pieces of crucial equipment at risk of overheating.
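Spotting those rack-level hot spots amounts to checking each sensor reading against the ASHRAE recommended envelope. A minimal sketch, with invented rack names and inlet readings for illustration:

```python
ASHRAE_LOW_F = 64.4   # ASHRAE recommended lower bound
ASHRAE_HIGH_F = 80.6  # ASHRAE recommended upper bound

def find_hot_spots(rack_temps_f):
    """Return racks whose inlet temperature falls outside the
    ASHRAE recommended range."""
    return {rack: t for rack, t in rack_temps_f.items()
            if not (ASHRAE_LOW_F <= t <= ASHRAE_HIGH_F)}

# Hypothetical inlet readings, one per rack
readings = {"rack-a1": 72.5, "rack-a2": 84.1, "rack-b1": 79.9}
print(find_hot_spots(readings))  # {'rack-a2': 84.1}
```

A facility-wide average can look healthy while a single rack like the one above is already out of range, which is why per-rack visibility matters.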
Furthermore, data center temperature monitoring must consider both what is happening now and what might happen in the future. History is replete with horrifying tales of CRAC (computer room air conditioner) failures that produced deadly temperature spikes.
Warmer may be better
The goal is to maintain temperature visibility. Equipment doesn't complain; it stops working
Running servers at higher temperatures is indeed more efficient; it is both cheaper and more environmentally friendly. Operating nearer the edge, however, means that temperatures will climb to dangerous levels much faster in the event of a CRAC failure.
"The gap between the growing server and storage volumes at data centers and the ability to monitor them keeps expanding," said Sid Nag, research vice president at Gartner. "The risk of doing nothing to resolve these shortcomings is real for enterprises... Data center operations will only increase in complexity as firms move more diverse workloads to the cloud and as the cloud becomes the platform for combined use of advanced technologies such as edge computing and 5G, to name a few."
The goal isn't to discourage data center operators from running equipment warm. Instead, it's meant to motivate them to ensure temperature visibility. They must act if they notice any indication that the rack's temperature is rising above acceptable limits. Equipment in an uncomfortable data center won't complain. It will just stop working, taking crucial operations down with it.
Computer hardware produces a great deal of heat. As a result, cooling systems are required to keep temperatures down and guarantee the machinery's safety. Monitoring temperature helps managers achieve this objective.
How do data centers improve efficiency and earn a green label?
For instance, Clive Longbottom, a contributor to ComputerWeekly, pointed out that spotting hot spots inside the data center is essential to avoiding fires. As national and international attention on energy efficiency intensifies, data center providers are looking for innovative ways to reduce costs and their environmental impact.
It is a difficult task, considering that data centers consumed 91 billion kilowatt-hours of energy in 2013, more than enough electricity to power every household in New York City twice over. Some companies are striving to increase efficiency in response to these rising numbers.
As a result, the market for energy efficiency is more robust than ever. The global green data center market, valued at $25.87 billion in 2014, was anticipated to grow at a compound annual rate of 30.8% from 2015 to 2022, according to a report released by Research and Markets.
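That projection can be sanity-checked with simple compound growth. The sketch below assumes the 30.8% rate compounds annually from the 2014 figure; the report's exact base year and compounding convention are not stated in the article:

```python
def project_market(base_billions, cagr, years):
    """Project a market size forward at a compound annual growth rate."""
    return base_billions * (1 + cagr) ** years

# $25.87B compounding at 30.8% per year over the 8 years from 2014 to 2022,
# an illustrative reading of the report's forecast window
print(round(project_market(25.87, 0.308, 8), 1))
```

Even under this rough reading, the market would grow several-fold over the forecast period, which is the article's underlying point.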
In other words, energy efficiency in data centers is significant right now. But how do data centers improve efficiency and earn a green label?