Learn more about achieving maximum energy efficiency through cooling system design. This presentation was given during the Spring 2012 Data Center World Conference in Las Vegas, NV. Learn more by visiting www.datacenterworld.com.
Lowering operating costs through cooling system design
1. Interested in data center power and cooling?
Learn about the data center efficiency/power & cooling sessions offered at the
upcoming Fall 2012 Data Center World Conference at:
www.datacenterworld.com.
This presentation was given during the Spring 2012 Data Center World Conference and Expo.
Contents are owned by AFCOM and Data Center World and can only be reused with the
express permission of AFCOM. For questions or permission, contact: jater@afcom.com.
2. Lowering Operating Costs Through
Cooling System Design
Paul Bemis
President
Applied Math Modeling Inc
3. Data Center PUE
(Power Usage Effectiveness)
PUE = Total Facility Energy / IT Energy
How effectively is my facility delivering
cooling to my IT equipment?
[Diagram: Total Facility Power vs. IT Equipment energy (servers, storage, telco, etc.)]
4. PUE and DCiE
Power Usage Effectiveness (PUE)
PUE = Total Facility Energy / Total IT Energy
Data Center Infrastructure Efficiency (DCiE)
DCiE = (Total IT Energy / Total Facility Energy) x 100%
5. Some Facts About PUE
• Dominant parameters are IT and cooling energy.
– CRAC/chiller energy makes up 75% of the total "non-IT" load
• UPS, PDU, and switchgear are all second-order effects
Source: Uptime Institute
6. Data Center PUE
(Power Usage Effectiveness)
PUE = (Cooling Energy + IT Energy) / IT Energy
How effectively is my facility delivering
cooling to my IT equipment?
• Focusing on cooling efficiency provides the largest payback
• Understanding how cooling systems operate is key
7. Cooling System Energy Efficiency
• Cooling Units (heat pumps)
– Use mechanical shaft power provided by an electric motor
to perform the work necessary to move heat from one
location (inside) to another (outside).
– The Coefficient of Performance (COP) of a heat pump is the
ratio of the heat pumped (moved) to the supplied work.
– The COP for Data Center Cooling ranges from 2-5*
COP = ∆Q / ∆W, where ∆Q is the thermal load in the room and ∆W is the energy
required to drive the heat pump.
* Patnaik, Marwah, Sharma, and Ramakrishnan, Department of Computer Science, Virginia
Tech, "Sustainable Operation and Management of Data Center Chillers"
8. Data Center PUE
(Power Usage Effectiveness)
PUE = (IT Energy / COP + IT Energy) / IT Energy
=> PUE = 1/COP + 1 (a good approximation)
• The COP of the data center cooling system is the dominant
parameter in the PUE calculation.
• The COP can be easily measured in existing data centers.
– Ratio of total IT load to ATS switch load
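To make the relationship concrete, here is a minimal Python sketch (not part of the original slides; the function names are illustrative) applying the PUE ≈ 1/COP + 1 approximation above together with the DCiE definition from slide 4:

```python
def pue_from_cop(cop: float) -> float:
    """Approximate PUE when cooling dominates the non-IT load: PUE ~= 1/COP + 1."""
    return 1.0 / cop + 1.0


def dcie_from_pue(pue: float) -> float:
    """Data Center Infrastructure Efficiency, as a percentage: DCiE = 100 / PUE."""
    return 100.0 / pue


if __name__ == "__main__":
    for cop in (2.0, 3.0, 5.0):  # the 2-5 COP range quoted on slide 7
        pue = pue_from_cop(cop)
        print(f"COP={cop}: PUE ~ {pue:.2f}, DCiE ~ {dcie_from_pue(pue):.0f}%")
```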
9. How does COP relate to “Set Points”?
• Increasing the cold air supply temperature increases COP at a rate of
3.5% for every 1.8 F
From ASHRAE 90.1-2004, Table 6.8.1I
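As a hedged illustration of this rule of thumb, the sketch below compounds a 3.5% COP gain for each 1.8 F increase in supply temperature; the multiplicative compounding and the function name are assumptions of the example, not taken from the slide:

```python
def scaled_cop(base_cop: float, base_supply_f: float, new_supply_f: float,
               pct_per_step: float = 3.5, step_f: float = 1.8) -> float:
    """Scale a baseline COP assuming ~3.5% improvement (compounded) for every
    1.8 F increase in cold air supply temperature."""
    steps = (new_supply_f - base_supply_f) / step_f
    return base_cop * (1.0 + pct_per_step / 100.0) ** steps


# Example: raising supply air from 60.75 F to 70 F, starting from a COP of 3.0
print(round(scaled_cop(3.0, 60.75, 70.0), 2))  # ~3.58 (the case study later reports 3.54)
```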
10. Schematic and T-s Diagram for Ideal
Vapor-Compression Refrigeration Cycle
Reducing compressor "lift" will reduce energy consumption.
11. Techniques for Improving the Efficiency
of Data Center Cooling Systems
• Maximize the temperature difference across
all heat exchangers
– Increase hot air return temperature in data center
• Raise the cold air supply temperature as high as
possible, while maintaining ASHRAE
inlet temperature guidelines
12. Maximizing Heat transfer in the Data Center
• Each CRAC/CRAH utilizes a fan/coil system for
transferring heat from the data center.
• The governing equation for heat transfer is:
Q = m cp (TR − TS)
where Q is the heat removed, m is the mass flow rate, cp is the specific heat of air,
TR is the return air temperature, and TS is the supply air temperature.
• It is important to maximize the temperature difference
across the coil.
• It is also important to balance the mass flow rates
between the IT load and the CRAC/CRAH units
13. Steps to optimize cooling efficiency
(and reduce operating costs)
• Determine the total amount of airflow required
to satisfy IT thermal load.
– Use 156 CFM/kW as a rule of thumb (see the airflow sketch after this slide)
• Strive to reduce the supply airflow to match only
what is needed by the servers
– Maximizes return air temperatures and saves
operation costs on fan energy.
• Begin to increase air supply temperature until
worst case rack/server reaches 80.6F
– Use containment to unify rack inlet temperatures
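The sketch below (not part of the deck) shows how the 156 CFM/kW rule of thumb follows from the heat-transfer equation on slide 12, assuming standard-air properties and a roughly 20 F temperature rise across the servers; the 1.08 factor and the 20 F delta-T are assumptions of the example, not values from the presentation:

```python
BTU_PER_HR_PER_KW = 3412.0  # 1 kW expressed in BTU/hr
AIR_FACTOR = 1.08           # 0.075 lb/ft^3 * 0.24 BTU/(lb*F) * 60 min/hr, standard air


def cfm_per_kw(delta_t_f: float) -> float:
    """Airflow per kW of IT load implied by Q = m*cp*(TR - TS), imperial units."""
    return BTU_PER_HR_PER_KW / (AIR_FACTOR * delta_t_f)


def required_airflow_cfm(it_load_kw: float, delta_t_f: float = 20.0) -> float:
    """Total server airflow demand for a given IT load and server delta-T."""
    return it_load_kw * cfm_per_kw(delta_t_f)


# A ~20 F delta-T across the servers roughly reproduces the 156 CFM/kW rule:
print(round(cfm_per_kw(20.0)))             # ~158 CFM/kW
print(round(required_airflow_cfm(144.7)))  # ~22,860 CFM (deck lists 22,559 CFM demand)
```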
14. Return Temperature Index (RTI)
RTI = (Rack Flow Rate / Air Handler Flow Rate) x 100
or
RTI = (Air Handler Delta T / Rack Delta T) x 100
RTI is a measure of net by-pass air or
net recirculation air in the data center
Return Temperature Index (RTI) is a trademark of ANCIS Incorporated (www.ancis.us).
All rights reserved. Used under authorization.
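A minimal sketch of the flow-rate form of RTI above, applied to the base-case airflow figures that appear later in the deck (the function name is illustrative):

```python
def rti_from_flows(rack_cfm: float, air_handler_cfm: float) -> float:
    """Return Temperature Index from airflow rates: 100% means supply and demand
    airflow are balanced; below 100% indicates net by-pass air."""
    return rack_cfm / air_handler_cfm * 100.0


# Base case from the deck: 22,559 CFM of rack demand vs. 39,500 CFM of CRAC supply
print(round(rti_from_flows(22_559, 39_500)))  # ~57 (%), matching the baseline RTI
```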
15. Rack Cooling Index (RCI)
RCI(HI) = (1 - Total Over-Temp / Max Allowable Over-Temp) x 100
RCI(LO) = (1 - Total Under-Temp / Max Allowable Under-Temp) x 100
ASHRAE specifications: Recommended 64.4 - 80.6 F, Allowable 59 - 89.6 F
RCI is a measure of compliance with
the ASHRAE thermal guideline
Rack Cooling Index (RCI) is a registered trademark of ANCIS Incorporated (www.ancis.us).
All rights reserved. Used under authorization.
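The sketch below is a simplified reading of the RCI definitions above, in which the total over- or under-temperature is summed across rack inlets and the maximum allowable value is the gap between the recommended and allowable limits multiplied by the number of inlets; the function names and example temperatures are illustrative, not from the deck:

```python
def rci_hi(inlet_temps_f, rec_max=80.6, allow_max=89.6):
    """High-side Rack Cooling Index: 100% means no rack inlet exceeds the
    ASHRAE recommended maximum; lower values indicate over-temperature inlets."""
    over = sum(t - rec_max for t in inlet_temps_f if t > rec_max)
    max_over = (allow_max - rec_max) * len(inlet_temps_f)
    return (1.0 - over / max_over) * 100.0


def rci_lo(inlet_temps_f, rec_min=64.4, allow_min=59.0):
    """Low-side Rack Cooling Index: 100% means no rack inlet is below the
    ASHRAE recommended minimum."""
    under = sum(rec_min - t for t in inlet_temps_f if t < rec_min)
    max_under = (rec_min - allow_min) * len(inlet_temps_f)
    return (1.0 - under / max_under) * 100.0


# Example: three inlets within the recommended range, one too cold
inlets = [75.0, 78.0, 80.0, 62.0]
print(round(rci_hi(inlets)), round(rci_lo(inlets)))  # 100 89
```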
16. Example Data Center
• Data Center is a raised floor design of 1600 sq ft, 145 kW of IT
load, and natural convection return.
• DX-based perimeter downflow cooling capacity is 325 kW, 255 kW
nominal (N+1).
– Ratio of 1.76x required cooling capacity
– Units are controlled individually and are constant-volume devices
• Interested in reconfiguring the data center to improve efficiency
– Lower operational costs
– Increase "headroom" for future IT load
• Would like to predict the ROI of proposed changes:
– Cold/hot aisle containment
– VFDs to improve the balance of supply/return airflow
17. Analysis Outline
• Perform baseline analysis on existing equipment
– Currently uses 4 Lieberts (2-DH412W, 2-DH315W)
• Validate CFD Model
• Perform baseline energy calculations
• Study alternatives for optimizing RTI
– Investigate shutting down CRAC units or implementing VFDs
– Investigate the need to add containment curtains
• Begin to increase the cold air supply temperature
– Watch rack inlets until the worst-case temperature reaches 80 F
20. Base Case Colo Facility
Specifications
Room dimensions: 28 x 58.5 x 10 feet
Room floor area: 1625.5 sq. ft.
Supply plenum height: 1.5 feet
Number of downflow CRAC units: 4
Number of rack rows: 20
Total number of racks: 59
Number of tile rows: 10
Total number of tiles: 59
Airflow Assessment
Supply air flow rate from downflow CRACs: 39,500 CFM
Demand air flow rate from racks: 22,559 CFM
Total demand air flow rate: 22,559 CFM
Supply air flow ratio: 175% (75% in excess of demand; energy savings opportunity)
Estimated average flow rate through perforated tiles: 669 CFM
Estimated average pressure drop across the tiles: 0.281 lbf/ft2
Thermal Assessment
Estimated heat density: 89.02 W/sq. ft. of room area
Estimated downflow CRAC cooling capacity: 254.94 kW
Estimated supply air total cooling capacity: 254.94 kW
Estimated total rack heat load: 144.7 kW
Total IT cooling load: 144.7 kW
Cooling capacity to heat load ratio: 176% (76% in excess of cooling demand)
Estimated temperature rise of supply air: 11.35 F
Estimated average temperature of hot air: 71.35 F (for 60 F supply air temperature)
21. 144 KW Base Case Selected Results
Base Case
144 KW DX
IT Heat Load 144.7 kW
5% IT Infrastructure 7 kW
Total IT Heat Load 152 kW
Total Cooling Power 143 kW
Total Supply Air Flow 39,500 CFM
Total Demand Air Flow 22,559 CFM
Fan Power 67 kW
Average Supply Temp (F) 60.75 F
COP 1.53
RTI 57% Nearly 2x the amount of airflow required
RCI hi 100% No racks out of ASHRAE High Temp guidelines
RCI lo 44% More than half out of ASHRAE Low Temp guidelines
Total Facility Power 362 kW
PUE 2.38 Baseline PUE estimate
Assumed Cost of Electricity 0.08 $/kWh
Annual Cost $253,751
Costing as much to cool the servers as to power them…
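A quick sanity check of the baseline figures in the table above, using only values already listed there:

```python
# Sanity check of the baseline table above, using the deck's own figures.
total_facility_kw = 362.0
total_it_kw = 152.0
electricity_rate = 0.08     # $/kWh
hours_per_year = 8760

pue = total_facility_kw / total_it_kw
annual_cost = total_facility_kw * hours_per_year * electricity_rate
print(f"PUE ~ {pue:.2f}")                    # ~2.38
print(f"Annual cost ~ ${annual_cost:,.0f}")  # ~$253,700 (deck lists $253,751)
```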
23. Motives for Change
• RTI of existing data center is 57%, indicating nearly 100%
bypass airflow.
• Current DX units do not support VFDs, so reducing supply
airflow is limited to shutting off CRACs
– Would require backflow prevention and ICON control
• Moving to new DX CRACs will provide the following benefits:
– Allows VFD plug fans so demand/supply airflow match (RTI = 100%)
– Provides increased upside cooling capacity
– Increases COP to at least 3
• Which significantly reduces the energy consumption of the cooling system.
Modeling can be used to accurately predict results prior to implementation…
27. Step 1: New DX Units
Fans @80% and Contain Cold Aisle
Base Case Contained Case
144 KW DX 144 KW DX
IT Heat Load 144.7 144.7 kW
5% IT Infrastructure 7 7 kW
Total IT Heat Load 152 152 kW
Total Cooling Power 143 64 kW
Total Supply Air Flow 39,500 32,000 CFM Lowered by 20%
Total Demand Air Flow 22,559 22,559 CFM
Fan Power 67 40 kW Lowered by 40%
Average Supply Temp (F) 60.75 60.75 F
COP 1.53 3.00 Improved by factor of 2x
RTI 57% 70% Improved by 23%
RCI hi 100% 100% No racks exceed high temps
RCI lo 44% 34% RCI dropped, racks too cold
Total Facility Power 362 255 kW
PUE 2.38 1.68 PUE headed in right direction
Assumed Cost of Electricity 0.08 0.08 $/kWh
Annual Cost $253,751 $179,027
Projected Savings $74,724
% Savings per Year 29%
Can we drive up the air supply temperature?
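For context, the fan affinity laws predict fan power falling roughly with the cube of fan speed. The sketch below illustrates that general rule; it is not the deck's CFD/energy model, which reports a somewhat smaller reduction (67 kW to 40 kW) for fans at 80% speed:

```python
def fan_power_kw(rated_power_kw: float, speed_fraction: float) -> float:
    """Fan affinity law: shaft power scales roughly with the cube of fan speed."""
    return rated_power_kw * speed_fraction ** 3


# Base-case fan power of 67 kW with VFD plug fans slowed to 80% speed
print(round(fan_power_kw(67.0, 0.80), 1))  # ~34.3 kW (the deck's model reports ~40 kW)
```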
29. Step 2: Lower Fans 20%, Contain Cold Aisle, and
Increase Supply Temp to 70F
Base Case Contained Case Contained Case
144 KW DX 144 KW DX 144KW 70F
IT Heat Load 144.7 144.7 144.7 kW
5% IT Infrastructure 7 7 7.2 kW
Total IT Heat Load 152 152 151.9 kW
Total Cooling Power 143 64 54 kW
Total Supply Air Flow 39,500 32,000 32,000 CFM
Total Demand Air Flow 22,559 22,559 22,559 CFM
Fan Power 67 40 40 kW
Average Supply Temp (F) 60.75 60.75 70.00 F
COP 1.53 3.00 3.54
RTI 57% 70% 70%
RCI hi 100% 100% 100% No racks exceed high temps
RCI lo 44% 34% 100% No racks exceed low temp
Total Facility Power 362 255 246 kW
PUE 2.38 1.68 1.62 PUE continues to improve
Assumed Cost of Electricity 0.08 0.08 0.08 $/kWh
Annual Cost $253,751 $179,027 $172,204
Projected Savings $74,724 $81,547
% Savings per Year 29% 32%
Can we improve RTI a little more?
33. Analysis Summary
• The combination of VFDs and cold aisle containment
yields the following benefits:
– Deploying newer-technology DX units and lowering fan speeds
improves RTI, RCI, and PUE, and reduces energy costs considerably.
– Installing cold (or hot) aisle containment improves air
management and allows the cold air supply temperature to be
increased, which improves cooling efficiency.
• Doing both of these things reduces overall energy
consumption by an estimated 39%, saving ~$100k per year
in operational costs.
• Estimated cost to implement these changes is $300k,
providing an ROI of about 3 years (excluding energy rebate credits).
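A simple payback sketch using the figures quoted above (illustrative only; it excludes rebate credits and any change in IT load):

```python
# Simple payback check for the summary above (deck figures, no rebate credits).
capex = 300_000.0                 # estimated implementation cost, $
baseline_annual_cost = 253_751.0  # base case annual energy cost, $
savings_fraction = 0.39           # combined savings estimated in the deck

annual_savings = baseline_annual_cost * savings_fraction
print(f"Annual savings ~ ${annual_savings:,.0f}")              # ~$99,000
print(f"Simple payback ~ {capex / annual_savings:.1f} years")  # ~3.0 years
```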
34. For more information
• Questions Regarding This Presentation and/or
Modeling Methods
– Paul Bemis
– Paul.Bemis@CoolSimSoftware.com
• For Information, please contact:
– Applied Math Modeling Inc,
– Jennifer Beliveau
– Jennifer.Beliveau@CoolSimSoftware.com
35. Interested in data center power and cooling?
Learn about the data center efficiency/power & cooling sessions offered at
the upcoming Fall 2012 Data Center World Conference at:
www.datacenterworld.com.
This presentation was given during the Spring 2012 Data Center World Conference and Expo.
Contents are owned by AFCOM and Data Center World and can only be reused with the
express permission of AFCOM. For questions or permission, contact: jater@afcom.com.