From CRACs to HVAC: Building management systems help mission-critical facilities keep humming

Photo © Bigstock.com

By Kevin Callahan
An essential problem with data centres and other mission-critical facilities is that heat, humidity, and dust are electronics’ ‘kryptonite.’ In other words, without carefully planned measures, computer servers and high-tech hospital equipment can rapidly fail. From small in-house server rooms to the cavernous server farms powering the titans of online commerce and social networking, design professionals have become very sophisticated at limiting heat buildup, and controlling humidity and dust.

One tool increasingly relied on is the building management system (BMS). An advanced BMS monitors and controls equipment ranging from computer-room air-conditioning units (CRACs) to industrial-scale HVAC systems, as well as cooling systems for specialized diagnostic and therapeutic equipment in healthcare facilities. The BMS maintains temperature and humidity at acceptable levels and monitors the filters that keep dust and particulates within acceptable thresholds.
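At its core, this kind of monitoring amounts to comparing each sensor reading against a configured acceptable band and annunciating an alarm when a point drifts outside it. The short Python sketch below illustrates the concept only; the point names, limits, and structure are illustrative assumptions, not any vendor’s actual implementation.

```python
# Illustrative sketch of BMS-style threshold monitoring; all point names and
# limits below are hypothetical, not taken from any specific product.
from dataclasses import dataclass

@dataclass
class PointLimits:
    low: float   # lowest acceptable reading
    high: float  # highest acceptable reading

# Hypothetical acceptable bands for a small server room
LIMITS = {
    "supply_air_temp_c": PointLimits(18.0, 27.0),
    "relative_humidity_pct": PointLimits(25.0, 85.0),
    "filter_diff_pressure_pa": PointLimits(0.0, 250.0),  # high dP suggests a clogged filter
}

def check_points(readings):
    """Return an alarm message for every reading outside its configured band."""
    alarms = []
    for point, value in readings.items():
        limits = LIMITS.get(point)
        if limits and not (limits.low <= value <= limits.high):
            alarms.append(f"{point} out of range: {value} (allowed {limits.low}-{limits.high})")
    return alarms

# One polling cycle: the temperature and filter readings trigger alarms.
print(check_points({
    "supply_air_temp_c": 29.5,
    "relative_humidity_pct": 40.0,
    "filter_diff_pressure_pa": 310.0,
}))
```

In a real system, the same comparison would run on every polling cycle across thousands of points, with the alarms routed to operators and logged for trend analysis.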

A widely cited report by Industrial Market Trends says that mainframe computers and racks of servers generate as much heat as a 2.1-m (7-ft) stack of toaster ovens. (For more, see the article on “computer room air-conditioning unit [CRAC]” at www.techtarget.com.) Desktop computers dissipate their waste heat with built-in fans and heat sinks, but when dozens, or even thousands, of computer servers or other pieces of large electronic equipment are involved, sophisticated cooling systems are required.

Data centres from the Arctic Circle to the Persian Gulf
From cold Arctic regions to the sweltering Middle East, e-commerce companies are opening massive data centres to handle ever-growing Internet traffic. In all locations, keeping the servers cool is one of the highest priorities for ensuring reliable computer up-time.

Going north to stay cool
Contemporary data centres have become industrial-scale facilities, the largest of which cover 46,450 m² (500,000 sf) or more. To cool the thousands of servers housed in row upon row of floor-to-ceiling racks in such facilities, some data centre operators are relocating to northern regions. One such area in northern Sweden (latitude 65 degrees north) brands itself “The Node Pole” and now has 10 data centres.

One of those data centres—operated by one of the world’s largest social networks—cools its thousands of servers with the naturally cool, low-humidity arctic air. The fresh air flows into the facility through numerous louvres, while the waste heat from servers rises to a plenum and is moved outside by large banks of exhaust fans.

The brain behind this HVAC operation is an advanced BMS. The data centre’s managers rely on it to monitor thousands of data points, from spot temperatures throughout the facility, to the operating status of the HVAC equipment, to indoor relative humidity (RH). The system annunciates an alarm if anything goes outside specification, so managers can quickly address the problem. Managers also use data from the BMS to predict potential equipment failures, allowing them to act proactively to keep the servers cool and within humidity tolerances. Beyond maintaining a comfortable indoor environment for the servers, the BMS also helps ensure they are consistently supplied with clean, reliable power.
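A common way to turn monitoring data into a failure prediction is simple trend analysis: if a temperature point is climbing steadily, the system can estimate how long until it breaches its limit and flag the associated equipment before servers overheat. The sketch below illustrates that general idea only; it is not the facility’s actual algorithm, and all names and numbers are assumptions.

```python
# Illustrative trend-based early warning: estimate how many minutes remain
# before a rising temperature reaches its alarm limit (hypothetical values).

def minutes_to_limit(samples, limit_c):
    """samples: (minutes_elapsed, temperature_C) pairs, oldest first.
    Returns estimated minutes until the limit is reached, 0.0 if already
    breached, or None if the temperature is steady or falling."""
    if len(samples) < 2:
        return None
    (t0, temp0), (t1, temp1) = samples[0], samples[-1]
    if temp1 >= limit_c:
        return 0.0
    rate = (temp1 - temp0) / (t1 - t0)  # degrees C per minute
    if rate <= 0:
        return None
    return (limit_c - temp1) / rate

# A hot-aisle sensor climbing from 24 to 26 C over 10 minutes, with a 27 C limit:
history = [(0.0, 24.0), (10.0, 26.0)]
print(minutes_to_limit(history, limit_c=27.0))  # -> 5.0 minutes at the current rate
```

A production BMS would use more robust statistics and many more points, but the principle is the same: a trend that will soon cross a limit is worth an operator’s attention before it becomes an outage.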

Keeping cool in the Middle East
In a climate radically different from the Arctic, one of the largest data centre infrastructure providers in the United Arab Emirates (UAE) opened two state-of-the-art facilities in Dubai and Abu Dhabi in 2015. In a region where average high temperatures exceed 38 C (100 F) for months at a time, the Khazna data centre project team set a goal of 99.997 per cent up-time, no small feat even in a cool climate.

Ensuring servers in the Khazna facilities experience less than 16 minutes of unplanned outages annually in a hot, dry climate requires aggressive monitoring and control of the buildings’ climate systems. To maintain an indoor temperature between 18 and 27 C (64 and 81 F) and humidity between 25 and 85 per cent (with only 15 minutes allowed out of range per breach), the project team specified an advanced BMS. The BMS monitors and controls a wide range of equipment, from CRAC units to chillers and air-handling systems: some 3600 controllers handling 55,000 data points.
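Enforcing an allowance such as “15 minutes out of range per breach” requires the BMS to time how long a point has been continuously outside its band, not merely check it at a single poll. The sketch below shows that bookkeeping in simplified form, using the published temperature and humidity bands; the class and logic are assumptions for illustration, not Khazna’s actual controls.

```python
# Illustrative excursion-duration tracking (simplified; all code hypothetical).
import time

TEMP_BAND = (18.0, 27.0)      # degrees C, per the published design range
HUMIDITY_BAND = (25.0, 85.0)  # per cent RH, per the published design range
MAX_EXCURSION_S = 15 * 60     # 15 minutes allowed out of range per breach

class ExcursionTracker:
    """Tracks how long a reading has been continuously outside its band."""
    def __init__(self, low, high):
        self.low, self.high = low, high
        self.breach_started = None  # timestamp when the current excursion began

    def update(self, value, now=None):
        """Return True if the 15-minute excursion allowance has been exceeded."""
        now = time.monotonic() if now is None else now
        if self.low <= value <= self.high:
            self.breach_started = None          # back in range: excursion over
            return False
        if self.breach_started is None:
            self.breach_started = now           # excursion just began
        return (now - self.breach_started) > MAX_EXCURSION_S

temp_tracker = ExcursionTracker(*TEMP_BAND)
rh_tracker = ExcursionTracker(*HUMIDITY_BAND)    # same logic applies to humidity
print(temp_tracker.update(22.0, now=0))              # False: in range
print(temp_tracker.update(28.5, now=60))             # False: excursion just started
print(temp_tracker.update(28.5, now=60 + 16 * 60))   # True: allowance exceeded
```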

The data centre managers experienced a real-world test of the protection provided by the BMS in spring 2015, when a weather station in the area failed. In less than 10 minutes, the temperature in the data halls rose from 19 to 28 C (66 to 82 F), and projections indicated it would reach 40 C (104 F) within 30 minutes unless aggressive action was taken. The BMS annunciated an alarm immediately, enabling the facility operators to quickly correct the problem and ensure continued server up-time.
