It is estimated that the world generates 2.5 quintillion bytes of data every day, and as the ways we live and work evolve in the digital age, this amount is increasing at an exponential rate. All of this data has to be stored somewhere. However, building and operating a data storage facility comes at a cost, both in space and energy use.
In 2018, global data centre electricity demand was an estimated 198 TWh, or almost one percent of global final demand for electricity. Whilst some companies are experimenting with subsea data centres, in most cases cooling is delivered via an HVAC system. This leads to high operating costs, which will likely increase in the future as the cost of energy rises. Delivering the storage capacity necessary to keep pace with modernity, whilst also making it financially viable, requires innovative methods to reduce the amount of energy that data centres consume.
Assessing energy consumption
The largest share of a data centre's energy consumption is the IT load itself, at around 40 percent; the cooling system runs a close second, at up to 40 percent, with the UPS and lighting accounting for the remainder.
The efficiency of a data centre is expressed as a ratio called Power Usage Effectiveness (PUE) – the total power entering the data centre divided by the power used by the IT equipment. The average data centre has a PUE of 1.8, meaning the facility uses 0.8 watts of overhead power for every watt delivered to the IT equipment. An ideal PUE is 1.0, so the average data centre has considerable room for improvement. Since cooling systems account for a large proportion of data centre power usage, this is an area where small improvements can have a large effect on the PUE ratio.
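As a rough illustration, the ratio can be computed directly; the figures below are hypothetical, chosen only to match the average PUE quoted above:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_load_kw

# A hypothetical facility drawing 1,800 kW in total for a 1,000 kW IT load:
print(pue(1800.0, 1000.0))  # → 1.8, i.e. 0.8 W of overhead per watt of IT load
```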
The cooling system consists of a number of fans and pumps, each with their own roles – the supply fan, the return fan, the liquid cooler fans, the condenser water pump, the chiller compressor and the chilled water pump.
For example, as the servers are in constant operation around the clock, one of the highest operating costs is the air conditioning, which is normally set to 20°C and can require up to one watt for every watt used by the computing equipment. Air conditioning controls both the temperature and the humidity in the data centre. The recommended ranges are 20 to 25°C and 40 to 55 percent relative humidity, with a maximum dew point of 17°C.
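The recommended envelope can be written as a simple check. The function name and structure are illustrative; the limits are those quoted above:

```python
def within_envelope(temp_c: float, rh_percent: float, dew_point_c: float) -> bool:
    """True if conditions fall inside the recommended operating envelope:
    20-25 degrees C, 40-55 % relative humidity, dew point no higher than 17 C."""
    return (20.0 <= temp_c <= 25.0
            and 40.0 <= rh_percent <= 55.0
            and dew_point_c <= 17.0)
```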
Unless adequate cooling is applied, the ambient temperature of the data centre will rise, and may even lead to equipment failure. By keeping the air temperature under control, the server components can be kept within the temperature/humidity range specified by the manufacturer.
Cutting the power used by these system components means running pumps and fans at a speed that meets the demands of the cooling system at the time, maintaining the correct temperatures and humidity levels while using an optimum amount of energy. This can most readily be achieved by using variable-speed control to reduce the speed of motors driving pumps and fans.
Eliminating wasted energy
Many existing pump and fan systems are based on throttling arrangements. With these techniques, the motor is driven at full speed while the flow of liquid or air is regulated by mechanical means such as dampers, valves or vanes. Running the motor at full speed like this while throttling its output wastes energy.
By contrast, a variable speed drive (VSD) alters the speed of the motor to match the actual demand of the pump or fan, dramatically cutting energy use. Because the power drawn by a centrifugal pump or fan varies with the cube of its speed, a unit running at 80 percent speed consumes only around half as much energy (0.8³ ≈ 0.51) as one running at full speed.
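The cube relationship behind this figure can be sketched in a few lines; this is an illustrative calculation, not a sizing tool:

```python
def relative_power(speed_fraction: float) -> float:
    """Fan/pump affinity law: shaft power scales with the cube of speed."""
    return speed_fraction ** 3

print(relative_power(0.8))  # roughly 0.512 – 80% speed needs about half the power
print(relative_power(0.5))  # roughly 0.125 – half speed needs about an eighth
```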
Some modern drives offer a range of pre-loaded macros designed to run the cooling and air conditioning applications in a data centre. Selecting a macro sets up the drive for the expected duty. Adjusting the motor speed to the correct operation point can reduce running costs by 20 to 60 percent. Some VSDs incorporate a kilowatt calculator that shows the actual energy consumption on the control panel, together with savings in real money.
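A back-of-the-envelope version of such a savings calculation might look like the sketch below. The electricity tariff and running hours are assumptions for illustration only, not figures from any drive manufacturer:

```python
def annual_saving(motor_kw: float, speed_fraction: float,
                  hours_per_year: float = 8760.0,      # assumed: continuous duty
                  tariff_per_kwh: float = 0.15):        # assumed tariff, GBP/kWh
    """Estimate the energy (kWh) and cost saved by running a centrifugal
    fan/pump at reduced speed rather than full speed, using the cube law."""
    full_kwh = motor_kw * hours_per_year
    reduced_kwh = motor_kw * speed_fraction ** 3 * hours_per_year
    kwh_saved = full_kwh - reduced_kwh
    return kwh_saved, kwh_saved * tariff_per_kwh

# A 15 kW fan motor slowed to 80 percent speed, running year-round:
kwh, cost = annual_saving(15.0, 0.8)
print(f"{kwh:.0f} kWh, £{cost:.0f} saved per year")
```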
Cutting the cost of cooling
An example of what can be achieved comes from a major data centre operator that cut £140,000 from its air conditioning bill, as well as saving 1,300 tonnes of CO2 emissions per annum. The data centre, run by Experian in Nottingham, has a 2,000 square metre production floor with over 2,000 servers. On the production floor, server computers are arranged in 15 rows of 17 computers per row and have a total load of 800 kW.
The 26 air conditioning units blow cooled air under the floor through grills. This air is sent towards the racks via a cold aisle containment system. This encloses the air intakes of the servers with a roof and walls, forming a barrier that ensures the cooled air is directed where needed.
This barrier also prevents hot exhaust air from the server racks being drawn straight back into the cold intake side, so the air conditioning units do not have to overcool the supply air to compensate for recirculated warm air. Providing only the amount of cooling actually needed means the speed of the air conditioning fans can be matched to the real demand from the racks. Previously, the motors running the air conditioning units were operated direct-on-line, with no form of speed control. As the servers run around the clock to serve clients spread across numerous time zones, there is a constant demand for cooled air, with no opportunity to switch the air conditioning units off.
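Matching fan speed to demand could be sketched as a simple proportional rule like the one below. The gain, minimum speed and setpoint handling are assumptions for illustration, not the control logic actually used at the site:

```python
def fan_speed_demand(return_air_c: float, setpoint_c: float,
                     min_speed: float = 0.2,   # assumed floor for airflow
                     gain: float = 0.25) -> float:
    """Illustrative proportional control: raise the fan speed fraction as the
    return-air temperature climbs above the setpoint; clamp to full speed."""
    error = max(return_air_c - setpoint_c, 0.0)
    return min(min_speed + gain * error, 1.0)
```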
The 26 air conditioning units were each fitted with a 15 kW ABB drive for HVAC. The units are controlled by a building management system (BMS), which monitors alarms; their set points are adjusted manually.
As well as the direct energy savings, the VSDs also reduced maintenance of transmission elements such as belts and bearings, lowered noise, reduced stress in the water pipes and allowed more accurate control of temperatures.
For more information, visit www.abb.co.uk/energy