Gartner Says Companies Can Save 1 Million Kilowatt Hours by Implementing 11 Best Practices in the Data Center

Analysts to Examine Power and Cooling Strategies at the Gartner Data Center Conference, December 2-5 in Las Vegas

STAMFORD, Conn., November 13, 2008 – In a conventional data center, 35 percent to as much as 50 percent of the electrical energy consumed is for cooling versus 15 percent in best-practice “green” data centers, according to Gartner, Inc.

“Virtually all data centers waste enormous amounts of electricity using inefficient cooling designs and systems,” said Paul McGuckin, research vice president at Gartner. “Even in a small data center, this wasted electricity amounts to more than 1 million kilowatt hours annually that could be saved with the implementation of some best practices.”

The overriding reason for the waste in conventional data center cooling is the unconstrained mixing of cold supply air with hot exhaust air. “This mixing increases the load on the cooling system and the energy used to provide that cooling, and it reduces the efficiency of the cooling system by reducing the delta-T (the difference between the hot return temperature and the cold supply temperature). Maintaining a high delta-T is a key principle of efficient cooling,” Mr. McGuckin said.
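The delta-T point follows from the sensible-cooling relation Q = m·cp·ΔT: for a fixed heat load, halving the delta-T doubles the airflow the cooling system must move. The sketch below illustrates this with hypothetical numbers (the 100 kW load and the 12 °C versus 6 °C delta-T values are illustrative assumptions, not figures from the report):

```python
# Sensible cooling: heat removed Q = m_dot * c_p * delta_T,
# so the airflow needed for a fixed load is inversely proportional to delta-T.

CP_AIR = 1.006  # specific heat of air, kJ/(kg*K)

def airflow_needed(heat_load_kw: float, delta_t_c: float) -> float:
    """Mass flow of air (kg/s) required to remove heat_load_kw at a given delta-T."""
    return heat_load_kw / (CP_AIR * delta_t_c)

# Same hypothetical 100 kW IT load, with and without hot/cold air mixing:
healthy = airflow_needed(100, 12)  # well-separated air streams, 12 C delta-T
mixed = airflow_needed(100, 6)     # mixing halves the delta-T...
print(mixed / healthy)             # ...so twice the airflow (and fan work) is needed
```

The ratio makes the cost of mixing concrete: every degree of delta-T lost to recirculated exhaust air must be bought back with extra fan power.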

Gartner has identified 11 best practices that, if implemented, could save millions of kilowatt hours annually.

Plug Holes in the Raised Floor – Most raised-floor environments exhibit cable holes, conduit holes and other breaches that allow cold air to escape and mix with hot air. This single low-tech retrofit can save as much as 10 percent of the energy used for data center cooling.

Install Blanking Panels – Any unused position in a rack should be covered with a blanking panel. These panels manage airflow within the rack by preventing hot air leaving one piece of equipment from entering the cold-air intake of another in the same rack. When the panels are used effectively, supply air temperatures are lowered by as much as 22 degrees Fahrenheit, greatly reducing the electricity consumed by fans in the IT equipment and potentially alleviating hot spots in the data center.

Coordinate CRAC Units – Older computer room air-conditioning units (CRACs) operate independently with respect to cooling and dehumidifying the air. These units should be tied together with newer technologies so that their efforts are coordinated, or humidification duties should be removed from them altogether and assigned to a dedicated, newer unit.

Improve Underfloor Airflow – Older data centers typically have constrained space underneath the raised floor that is used not only to distribute cold air but also as a place to run data and power cables. Many old data centers have accumulated such a tangle of cables that airflow is restricted; the underfloor should be cleaned out to improve it.

Implement Hot Aisles and Cold Aisles – In traditional data centers, racks were set up in what is sometimes referred to as a “classroom style,” where all the intakes face in a single direction. This arrangement causes the hot air exhausted from one row to mix with the cold air being drawn into the adjacent row, thereby increasing the cold-air-supply temperature in uneven and sometimes unpredictable ways. Newer rack layout practices instituted in the past 10 years demonstrate that organizing rows into hot aisles and cold aisles is better at controlling the flow of air in the data center.

Install Sensors – A small number of individual sensors can be placed in areas where temperature problems are suspected. Simple sensors store temperature data that can be manually collected and transferred into a spreadsheet, where it can be further analyzed. Even this minimal investment in instrumentation can provide great insight into the dynamics of possible data center temperature problems, and can provide a method for analyzing the results of improvements made to data center cooling.

Implement Cold-Aisle or Hot-Aisle Containment – Once a data center has been organized around hot aisles and cold aisles, dramatically improved separation of cold supply air and hot exhaust air through containment becomes an option. For most users, hot-aisle containment or cold-aisle containment will have the single largest payback of any of these energy efficiency best practices.

Raise the Temperature in the Data Center – Many data centers are run colder than an efficient standard. The American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) has increased the top end of allowable supply-side air temperatures from 77 to 80 degrees Fahrenheit. Not all data centers should be run at the top end of this temperature range, but a step-by-step increase, even to the 75 to 76 degrees Fahrenheit range, would have a beneficial effect on data center electrical use.

Install Variable Speed Fans and Pumps – Traditional CRAC and CRAH units contain fans that run at a single speed. Emerging best practice suggests that variable speed fans be used whenever possible. A reduction of 10 percent in fan speed yields a reduction in the fan’s electrical use of approximately 27 percent, and a 20 percent speed reduction yields electrical savings of approximately 49 percent.
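The savings figures above follow from the fan affinity laws, under which fan power scales with roughly the cube of fan speed. A minimal sketch of that arithmetic (the function name is illustrative, not from the report):

```python
# Fan affinity law: electrical power drawn by a fan scales approximately
# with the cube of its speed, so modest speed reductions yield outsized savings.

def fan_power_savings(speed_fraction: float) -> float:
    """Fractional power saved when a fan runs at speed_fraction of full speed."""
    return 1.0 - speed_fraction ** 3

# A 10 percent speed reduction saves roughly 27 percent of fan power:
print(round(fan_power_savings(0.90) * 100))  # 27
# A 20 percent speed reduction saves roughly 49 percent:
print(round(fan_power_savings(0.80) * 100))  # 49
```

These match the approximately 27 percent and 49 percent figures cited above, which is why variable speed drives pay back quickly even when fans are only throttled slightly.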

Exploit “Free Cooling” – “Free cooling” is the general name given to any technique that cools air without the use of chillers or refrigeration units. The two most common forms of free cooling are air-side economization and water-side economization. The amount of free cooling available depends on the local climate, and ranges from approximately 100 hours per year to more than 8,000 hours per year.

Design New Data Centers Using Modular Cooling – Traditional raised-floor-perimeter air distribution systems have long been the method used to cool data centers. However, mounting evidence strongly points to the use of modular cooling (in-row or in-rack) as a more-energy-efficient data center cooling strategy.

“Although most users will not be able to immediately implement all 11 best practices, all users will find at least three or four that can be immediately implemented in their current data centers,” said Mr. McGuckin. “Savings in electrical costs of 10 percent to 30 percent are achievable through these most readily available techniques. Users committed to aggressively implementing all 11 best practices can achieve an annual savings of 1 million kilowatt hours in all but the smallest tier of data centers.”

Additional information can be found in the Gartner report “How to Save a Million Kilowatt Hours in your Data Center.” This report can be found on Gartner’s Web site at

Mr. McGuckin will provide additional analysis of the data center of the future at the 27th Annual Gartner Data Center Conference, taking place December 2-5 in Las Vegas. The Gartner Data Center Conference is the most comprehensive compilation of sessions and advice on the future of the data center ever held. It offers the latest actionable insights and best practices in all areas affecting the data center, from real-time infrastructure to servers and storage to business continuity and disaster recovery. This event hits the critical spot between strategic planning and tactical advice for IT organizations as they look to implement new technologies into their data centers and maintain the most efficient data center. Additional information is available at

About Gartner:

Gartner, Inc. (NYSE: IT) is the world’s leading information technology research and advisory company. Gartner delivers the technology-related insight necessary for its clients to make the right decisions, every day. From CIOs and senior IT leaders in corporations and government agencies, to business leaders in high-tech and telecom enterprises and professional services firms, to technology investors, Gartner is the indispensable partner to 60,000 clients in 10,000 distinct organizations. Through the resources of Gartner Research, Gartner Consulting and Gartner Events, Gartner works with every client to research, analyze and interpret the business of IT within the context of their individual role. Founded in 1979, Gartner is headquartered in Stamford, Connecticut, U.S.A., and has 4,000 associates, including 1,200 research analysts and consultants in 80 countries. For more information, visit