by James A. Stark, P.E., Senior Director of Engineering, Electronic Environments Co.
As Patrick Gammons, Senior Manager of Data Center Energy and Location Strategy at Google, observed in a recent blog post concerning data center energy efficiency, “The cleanest energy is the energy you don’t use.”
Gammons’ remark coincided with the announcement that Google would be converting a coal power plant in Jackson County, Alabama, scheduled for shutdown in October, to become its fourteenth data center. The Alabama data center, expected to begin construction early next year, will incorporate state-of-the-art energy efficiency technologies, including super-efficient servers and advanced cooling technologies “to squeeze more out of every watt of power” it consumes.
Today, organizations from start-ups to Fortune 500 global enterprises rely on mission-critical facilities to house their vital data, and with data center energy consumption nearly 100 times higher than that of a typical commercial building, data center owners and operators are placing a greater focus than ever before on improving energy efficiency. With the August 3 announcement of the Obama administration’s Clean Power Plan, the country’s first-ever national standard to limit carbon pollution from power plants, it’s highly probable that similar regulatory action to reduce energy consumption in data centers will eventually become policy.
With the proverbial writing on the wall, it’s incumbent upon our industry to develop the most energy-efficient data centers possible, ensuring maximum uptime and minimal costs while reducing their carbon footprint. With these exigencies in mind, here are six critical steps data center owners and operators can take to achieve optimal energy efficiency.
Assessments
Data center assessment professionals are equipped to provide thorough, in-depth analyses and recommendations for cost-effective solutions that correct infrastructure deficiencies and improve maintenance programs. A detailed assessment of your data center’s operational performance will pinpoint the individual areas in which your energy efficiency practices fall short. Airflow management practices, identification of hot spots, and overall temperature regulation are all critical items to assess in order to develop a strategic plan to lower energy costs. A data center assessment professional can also provide insider information to prepare for external audits and contextual data for developing strategies.
Preventive Maintenance
The Ponemon Institute reports that data center downtime costs owner-operators an average of $7,900 per minute. Downtime is the number one issue that adversely affects not only your bottom line but also your company’s reputation as a reliable organization. To ensure maximum uptime, a comprehensive preventive maintenance program should include, but not be limited to, battery testing and maintenance, load bank testing, generator coolant, fuel and oil sampling, and periodic testing of overcurrent protective devices.
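As a rough illustration of the kind of data such an assessment works from, the sketch below flags racks whose inlet temperatures exceed an assumed 27 °C threshold (roughly the upper end of the ASHRAE-recommended inlet range). The rack names, readings, and threshold are hypothetical, not drawn from any actual assessment.

```python
# Minimal sketch: flagging potential hot spots from rack inlet temperature readings.
# The sensor values and the 27 °C threshold are illustrative assumptions.

rack_inlet_temps_c = {
    "rack-A1": 22.5,
    "rack-A2": 24.1,
    "rack-B1": 28.3,  # likely hot spot
    "rack-B2": 26.7,
}

HOT_SPOT_THRESHOLD_C = 27.0  # assumed upper limit for supply air at the rack inlet

hot_spots = {rack: t for rack, t in rack_inlet_temps_c.items() if t > HOT_SPOT_THRESHOLD_C}

for rack, temp in sorted(hot_spots.items()):
    print(f"{rack}: inlet {temp:.1f} C exceeds {HOT_SPOT_THRESHOLD_C} C; investigate airflow")
```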
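To put the Ponemon figure in perspective, here is a minimal back-of-the-envelope sketch of the downtime arithmetic. The outage length and availability targets are assumptions chosen purely for illustration.

```python
# Minimal sketch of downtime cost arithmetic using the Ponemon average cited above.
# The 90-minute outage and the availability targets are illustrative assumptions.

COST_PER_MINUTE = 7_900  # average downtime cost reported by the Ponemon Institute (USD)

outage_minutes = 90  # hypothetical unplanned outage
print(f"Cost of a {outage_minutes}-minute outage: ${outage_minutes * COST_PER_MINUTE:,.0f}")

MINUTES_PER_YEAR = 525_600
for availability in (0.999, 0.9999, 0.99999):  # "three nines" through "five nines"
    downtime = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability:.5f} availability: about {downtime:,.1f} min/yr of downtime, "
          f"roughly ${downtime * COST_PER_MINUTE:,.0f}/yr at risk")
```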
Equipment Upgrades
Equipment upgrades are necessary to make data centers more efficient and to maintain a robust and reliable facility. Fortunately, new technologies are continually being developed that reduce overall energy consumption, such as Economy Mode (eco-mode) Uninterruptible Power Supply (UPS) systems, 380V DC power systems, lighting system retrofits, dynamic cooling systems and efficient chillers.
Eco-mode describes a UPS that runs in bypass mode during normal operation and switches to on-line mode on the inverter when the incoming power deviates from acceptable limits for the IT equipment. This is the inverse of a traditional on-line reverse transfer UPS, where the IT load is normally supported by the inverter. Because in eco-mode the UPS is effectively offline, system efficiencies of 98-99% can be achieved and energy savings can be significant. By knowing what innovative technologies are available and understanding the ROI of these upgrades, you may be able to use them to your advantage within your data center design.
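The ROI case for eco-mode largely comes down to the efficiency delta. The sketch below estimates annual savings under stated assumptions: the 500 kW load, $0.10/kWh rate, and 94% double-conversion efficiency are hypothetical; the 98% eco-mode figure is the low end of the range cited above.

```python
# Minimal sketch: estimated annual savings from running a UPS in eco-mode.
# Load, electricity rate, and the 94% double-conversion efficiency are assumptions;
# 98% reflects the lower end of the 98-99% eco-mode range cited in the article.

it_load_kw = 500              # assumed critical IT load carried by the UPS
hours_per_year = 8_760
rate_usd_per_kwh = 0.10       # assumed utility rate

eff_double_conversion = 0.94  # assumed efficiency of a traditional on-line UPS
eff_eco_mode = 0.98           # lower end of the eco-mode range cited above

def annual_input_kwh(load_kw: float, efficiency: float) -> float:
    """Energy drawn from the utility to deliver the IT load for a full year."""
    return load_kw * hours_per_year / efficiency

saved_kwh = (annual_input_kwh(it_load_kw, eff_double_conversion)
             - annual_input_kwh(it_load_kw, eff_eco_mode))
print(f"Estimated savings: {saved_kwh:,.0f} kWh/yr "
      f"(about ${saved_kwh * rate_usd_per_kwh:,.0f}/yr)")
```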
Airflow Management
Flawed airflow management wastes large amounts of energy and is an immediate financial drain. By adjusting rack layouts, improving supply air delivery, and reducing air recirculation, data center operators can realize substantial savings. Improved airflow management is a simple, low-cost path to improved Power Usage Effectiveness (PUE).
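PUE is simply total facility energy divided by IT equipment energy, so cutting cooling energy through better airflow shows up directly in the ratio. The kWh figures in the sketch below are illustrative assumptions, not measured data.

```python
# Minimal sketch of the PUE calculation: total facility energy / IT equipment energy.
# All kWh values are illustrative assumptions showing how reduced cooling energy
# after airflow improvements lowers PUE.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

it_kwh = 4_000_000          # assumed annual IT equipment energy
cooling_before = 2_400_000  # assumed cooling energy before airflow improvements
cooling_after = 1_800_000   # assumed cooling energy after containment, sealing leaks, etc.
other_kwh = 400_000         # assumed lighting and power distribution losses

print(f"PUE before: {pue(it_kwh + cooling_before + other_kwh, it_kwh):.2f}")
print(f"PUE after:  {pue(it_kwh + cooling_after + other_kwh, it_kwh):.2f}")
```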
Dynamic Cooling Management
Cooling system operation plays a critical role in the energy efficiency of a data center, and since no two data centers are alike, cooling solutions often need to be customized accordingly. Dynamic cooling systems are easy to deploy and can produce immediate energy savings. Rather than controlling temperature and airflow based on overall ambient room conditions, CRAC fan speeds are adjusted based on real-time readings collected through rack temperature sensors and control modules. Redundant CRAC units can be set in standby mode, ready to turn on if conditions in the room change or a primary CRAC unit fails.
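The control logic can be as simple as ramping fan speed with the hottest rack inlet reading. The sketch below is a minimal, hypothetical version of that idea; the setpoint, temperature band, speed limits, and sensor values are all assumptions, not a vendor's actual control algorithm.

```python
# Minimal sketch of dynamic cooling control: CRAC fan speed follows real-time rack
# inlet temperatures instead of ambient room conditions. Setpoint, band, speed
# limits, and sensor readings are illustrative assumptions.

SETPOINT_C = 24.0                   # assumed target rack inlet temperature
BAND_C = 4.0                        # assumed range over which fans ramp from min to max
MIN_SPEED, MAX_SPEED = 0.30, 1.00   # fan speed as a fraction of full speed

def crac_fan_speed(hottest_inlet_c: float) -> float:
    """Proportional control: ramp fan speed as the hottest inlet rises above setpoint."""
    error = hottest_inlet_c - SETPOINT_C
    speed = MIN_SPEED + (MAX_SPEED - MIN_SPEED) * (error / BAND_C)
    return max(MIN_SPEED, min(MAX_SPEED, speed))

# Hypothetical readings from rack temperature sensors.
rack_inlets_c = [23.1, 24.8, 26.5, 25.2]
speed = crac_fan_speed(max(rack_inlets_c))
print(f"Hottest inlet {max(rack_inlets_c):.1f} C -> CRAC fans at {speed:.0%}")

# A redundant CRAC in standby could be started when the primary cannot hold setpoint.
if speed >= MAX_SPEED:
    print("Primary CRAC at full speed; start standby unit")
```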
Baseline Energy Reduction
Sustainable energy sources, including fuel cell, solar and wind power, are becoming increasingly commonplace in data centers as a way to reduce overall carbon footprint and save money on electricity consumption. Baseline reductions can also offer facilities free cooling and renewable energy, translating into immediate savings. To return to the example of Google, after repurposing a 60-year-old paper mill in Hamina, Finland, into a data center, the company reused the existing cooling infrastructure, drawing water from the Gulf of Finland to cool data center equipment.
While not every data center owner-operator will have immediate access to such unconventional energy-efficiency methodologies, deployed in concert the steps above will reap short- and long-term energy reductions, improve reliability, ensure overall performance and reduce expenditures.
About the Author
James A. Stark has been with the Electronic Environments Co. for over twenty years, providing engineering and construction project management in the Mission Critical Facilities Services Group. He has extensive experience in delivering mission critical data center projects, site assessments and power quality studies. Jim has a degree in electrical engineering from Northeastern University and is a registered professional electrical engineer.