In a conventional data centre, between 35% and 50% of the electrical energy consumed goes to cooling, compared with 15% in best-practice "green" data centres.

"Virtually all data centres waste enormous amounts of electricity using inefficient cooling designs and systems," says Paul McGuckin, research vice-president at Gartner. "Even in a small data centre, this wasted electricity amounts to more than 1 million kilowatt hours annually that could be saved with the implementation of some best practices."
The overriding reason for the waste in conventional data centre cooling is the unconstrained mixing of cold supply air with hot exhaust air.
"This mixing increases the load on the cooling system and energy used to provide that cooling, and reduces the efficiency of the cooling system by reducing the delta-T (the difference between the hot return temperatures and the cold supply temperature). A high delta-T is a principle in cooling," says McGuckin.
Gartner has identified 11 best practices which, if implemented, could save millions of kilowatt hours annually.
* Plug holes in the raised floor – Most raised-floor environments exhibit cable holes, conduit holes and other breaches that allow cold air to escape and mix with hot air. This single low-tech retrofit can save as much as 10% of the energy used for data centre cooling.
* Install blanking panels – Any unused position in a rack should be covered with a blanking panel, which manages airflow by preventing the hot air leaving one piece of equipment from entering the cold-air intake of other equipment in the same rack. Used effectively, blanking panels can lower supply air temperatures by as much as 22 degrees Fahrenheit, greatly reducing the electricity consumed by fans in the IT equipment and potentially alleviating hot spots in the data centre.
* Co-ordinate CRAC units – Older computer room air-conditioning (CRAC) units operate independently in cooling and dehumidifying the air. These units should be tied together with newer technologies so that their efforts are coordinated, or humidification duties should be removed from them altogether and handled by a newer, dedicated piece of equipment.
* Improve underfloor airflow – Older data centres typically have constrained space beneath the raised floor, which is used not only to distribute cold air but also to route data and power cables. Many older data centres have accumulated such a tangle of cables that airflow is restricted, so the underfloor space should be cleared out to improve airflow.
* Implement hot aisles and cold aisles – In traditional data centres, racks were set up in what is sometimes referred to as a "classroom style," where all the intakes face in a single direction. This arrangement causes the hot air exhausted from one row to mix with the cold air being drawn into the adjacent row, thereby increasing the cold-air-supply temperature in uneven and sometimes unpredictable ways. Newer rack layout practices instituted in the past ten years demonstrate that organising rows into hot aisles and cold aisles is better at controlling the flow of air in the data centre.
* Install sensors – A small number of individual sensors can be placed in areas where temperature problems are suspected. Simple sensors store temperature data that can be manually collected and transferred into a spreadsheet for further analysis (a short analysis sketch follows this list). Even this minimal investment in instrumentation can provide great insight into the dynamics of possible data centre temperature problems, and gives a way of measuring the results of improvements made to data centre cooling.
* Implement cold-aisle or hot-aisle containment – Once a data centre has been organised around hot aisles and cold aisles, dramatically improved separation of cold supply air and hot exhaust air through containment becomes an option. For most users, hot-aisle containment or cold-aisle containment will deliver the single largest payback of any of these energy efficiency best practices.
* Raise the temperature in the data centre – Many data centres are run colder than they need to be. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) has increased the top end of its allowable supply-side air temperature range from 77 to 80 degrees Fahrenheit. Not all data centres should be run at the top of this range, but a step-by-step increase, even to the 75 to 76 degrees Fahrenheit range, would have a beneficial effect on data centre electricity use.
* Install variable speed fans and pumps – Traditional CRAC and CRAH (computer room air handler) units contain fans that run at a single speed. Emerging best practice suggests that variable speed fans be used wherever possible. A reduction of 10% in fan speed cuts the fan's electricity use by approximately 27%, and a 20% speed reduction yields electrical savings of approximately 49% (the arithmetic behind these figures is sketched after this list).
* Exploit "free cooling" -"Free cooling" is the general name given to any technique that cools air without the use of chillers or refrigeration units. The two most common forms of free cooling are air-side economisation and water-side economisation. The amount of free cooling available depends on the local climate, and ranges from approximately 100 hours per year to more than 8 000 hours per year.
* Design new data centres using modular cooling – Traditional raised-floor-perimeter air distribution systems have long been the method used to cool data centres. However, mounting evidence strongly points to the use of modular cooling (in-row or in-rack) as a more-energy-efficient data centre cooling strategy.
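On the sensor point above, even a simple spreadsheet-style export can be analysed with a few lines of code. A minimal sketch, assuming a hypothetical readings.csv file with timestamp, location and temp_f columns collected from the portable sensors (the file name, column names and the 80-degree flag are illustrative assumptions):

```python
import csv
from collections import defaultdict
from statistics import mean

# Hypothetical export from manually collected temperature sensors:
# readings.csv with columns: timestamp, location, temp_f
readings = defaultdict(list)
with open("readings.csv", newline="") as f:
    for row in csv.DictReader(f):
        readings[row["location"]].append(float(row["temp_f"]))

# Summarise each measurement point and flag likely hot spots, using the
# 80 degrees Fahrenheit allowable upper bound cited in the article.
for location, temps in sorted(readings.items()):
    peak, avg = max(temps), mean(temps)
    flag = "  <-- possible hot spot" if peak > 80 else ""
    print(f"{location}: avg {avg:.1f} F, peak {peak:.1f} F{flag}")
```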
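The fan figures in the variable-speed item follow from the fan affinity laws, under which a fan's electrical draw scales roughly with the cube of its speed. A quick check of the article's 27% and 49% numbers:

```python
# Fan affinity law (approximation): power scales with the cube of speed,
# so the fractional saving is 1 - (1 - speed_reduction)^3.
def fan_power_saving(speed_reduction: float) -> float:
    """Approximate fractional electrical saving for a given fractional speed reduction."""
    return 1 - (1 - speed_reduction) ** 3

print(f"{fan_power_saving(0.10):.0%}")  # ~27% saving for a 10% speed reduction
print(f"{fan_power_saving(0.20):.0%}")  # ~49% saving for a 20% speed reduction
```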
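For the free-cooling item, the hours available at a site can be estimated by counting how many hours a year the outdoor temperature sits below the economiser's cut-off. A rough sketch, using randomly generated stand-in temperatures and an assumed 65-degree cut-off in place of real weather records and equipment specifications; a real assessment would also account for humidity and water-side options:

```python
import random

# Stand-in for a year of hourly outdoor dry-bulb temperatures in Fahrenheit
# (8,760 hours); in practice these would come from local weather records.
random.seed(1)
hourly_temps_f = [random.gauss(55, 15) for _ in range(8760)]

ECONOMISER_CUTOFF_F = 65  # assumed temperature below which outside air can do the cooling

free_cooling_hours = sum(t <= ECONOMISER_CUTOFF_F for t in hourly_temps_f)
print(f"Estimated air-side free-cooling hours: {free_cooling_hours} of 8760")
```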
"Although most users will not be able to immediately implement all 11 best practices, all users will find at least three or four that can be immediately implemented in their current data centres," says McGuckin. "Savings in electrical costs of 10% to 30% are achievable through these most-available techniques. Users committed to aggressively implementing all 11 best practices can achieve an annual savings of 1-million kilowatt hours in all but the smallest tier of data centres."