
Page 1: Data Center Thermal Management and Efficiency

Data Center Thermal Management and Efficiency

Jay Ries
Regional Sales Manager
Liebert Thermal Management
Emerson Network Power

Page 2: Data Center Thermal Management and Efficiency

Agenda
Where is energy consumed in the data center?
Energy consumption example
– Cooling energy consumption breakdown
Strategies for saving energy
– Low cost strategies
– Medium cost strategies
– Higher cost strategies
Taking it a step further (beyond cooling)
Summary

Page 3: Data Center Thermal Management and Efficiency


Page 4: Data Center Thermal Management and Efficiency

Where is Energy Consumed in the Data Center?

[Bar chart: share of data center energy consumption by component – Processor, Other Services, Server Power Supply, Storage, Communication Equipment, Cooling, UPS, MV Transformer and Switchgear, Lighting, PDU (y-axis 0%–40%)]

48% is consumed by power and cooling support
52% is consumed by IT equipment

Page 5: Data Center Thermal Management and Efficiency

Energy Consumption Example

Baseline building design
– Existing building
  • Limitations on the physical changes that can be made
  • Best suited for modifications to existing equipment
  • Full equipment replacement is a last resort
– 1MW of facility power usage (all data center)

Baseline cooling design
– Centrifugal water cooled chiller
– No economization
– Standard computer room cooling units
  • No variable speed fans or advanced controls
  • Return air control
  • 45° F chilled water
  • 72° F return air, 50% RH

Page 6: Data Center Thermal Management and Efficiency

Energy Consumption Example

Power usage
– Processors – 150kW
– Other Services – 150kW
– Server Power Supply – 140kW
– Storage – 40kW
– Communication Equipment – 40kW
– Cooling – 380kW
– UPS – 50kW
– MV Transformer and Switchgear – 30kW
– Lighting – 10kW
– PDU – 10kW

IT Power Usage = 520kW
Support Power Usage = 480kW
Total Facility Power Usage = 1000kW
Annualized Facility PUE = 1.92 – the goal is to work our way down to 1.35

Cooling is the only area that will be modified; in the real world, each variable would also have an impact on the others. The sketch below works through the PUE arithmetic.
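As an illustration of the PUE arithmetic on this slide, here is a minimal sketch (the component figures come from the slide; the helper function is my own):

```python
# Baseline loads from the slide, in kW.
it_loads = {
    "Processors": 150, "Other Services": 150, "Server Power Supply": 140,
    "Storage": 40, "Communication Equipment": 40,
}
support_loads = {
    "Cooling": 380, "UPS": 50, "MV Transformer and Switchgear": 30,
    "Lighting": 10, "PDU": 10,
}

def pue(it_kw: float, support_kw: float) -> float:
    """PUE = total facility power / IT power."""
    return (it_kw + support_kw) / it_kw

it_kw = sum(it_loads.values())            # 520 kW
support_kw = sum(support_loads.values())  # 480 kW
print(f"PUE = {pue(it_kw, support_kw):.2f}")  # PUE = 1.92
```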

Page 7: Data Center Thermal Management and Efficiency

Cooling Energy Consumption Breakdown

[Charts: cooling energy consumption breakdown for an air cooled system, a water cooled system, and a chilled water system]

Page 8: Data Center Thermal Management and Efficiency

Low Cost Strategies
1. Implement best practices
2. Adjust the unit control methods
   – Dew point control
   – Unit operating range
3. Change to supply air control
4. Run at higher chilled water temperatures

Page 9: Data Center Thermal Management and Efficiency

Low Cost Strategies
1. Implementing Best Practices

If you have a raised floor, use it properly: underfloor resistance wastes energy.
Utilize hot aisle / cold aisle, regardless of whether you have a raised floor.

Page 10: Data Center Thermal Management and Efficiency

Low Cost Strategies
1. Implementing Best Practices

Get air where it is supposed to go.
– Blanking panels
– Fix unplanned outside infiltration and any unnecessary gaps in the raised floor
– Return plenums to the cooling unit
Isolate the room, particularly if you want to control humidity.

Page 11: Data Center Thermal Management and Efficiency

Low Cost Strategies
2. Adjust Unit Settings

Dew point
– Standard design points used to be 72° F return air temperature and 50% relative humidity (RH)
– Newer, more aggressive design points can be 90° F+ return air temperature with an unspecified relative humidity
– Why shouldn't you fix at 50% RH? (a quick computational check follows below)
  • Dew point @ 72° F, 50% RH = 52° F
  • Dew point @ 95° F, 50% RH = 74° F
– If the return temperature is increased at a fixed RH, the dew point rises, requiring the equipment to waste energy removing moisture that didn't need to be there in the first place.
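The dew point figures above can be sanity-checked with the Magnus approximation. This is a sketch of my own, not a formula from the presentation:

```python
import math

def dew_point_f(temp_f: float, rh_pct: float) -> float:
    """Approximate dew point (°F) using the Magnus formula."""
    a, b = 17.27, 237.7                 # common Magnus constants (°C scale)
    t_c = (temp_f - 32) * 5 / 9         # °F -> °C
    gamma = a * t_c / (b + t_c) + math.log(rh_pct / 100)
    dew_c = b * gamma / (a - gamma)
    return dew_c * 9 / 5 + 32           # °C -> °F

print(round(dew_point_f(72, 50)))  # 52, matching the slide
print(round(dew_point_f(95, 50)))  # 73, within a degree of the slide's 74
```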

Page 12: Data Center Thermal Management and Efficiency

Low Cost Strategies
2. Adjust Unit Settings

Unit operation settings
– Expanding the operating range for temperature and humidity keeps unit components from cycling too frequently.
– Higher return air temperatures allow CRAH units to run more efficiently
  • Capacity increase of up to 70% for chilled water units
  • Capacity increase of up to 50% for compressor based units
  • The more efficiently the units operate, the fewer are required to control the space, saving energy.

Page 13: Data Center Thermal Management and Efficiency

Low Cost Strategies
2. Adjust Unit Settings

[Chart: increased capacity at higher temperatures]

Page 14: Data Center Thermal Management and Efficiency

Low Cost Strategies
3. Supply Air Control

Supplies a consistent temperature to the cold aisle.
Saves energy because it allows the return air temperature to be increased, letting the CRAH unit run more efficiently.

Page 15: Data Center Thermal Management and Efficiency

Low Cost Strategies
4. Running at Higher Water Temperatures

45° F chilled water has been the standard design point for many years.
Higher chilled water temperatures are becoming more prevalent.
Why? At higher temperatures there are large potential savings on the chiller (see the sketch below):
– For every 1° F increase in the chilled water supply temperature, a 2% energy savings can be realized on the chiller plant
– 45° F chilled water = baseline
– 55° F chilled water = 20% energy savings
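A minimal sketch of that rule of thumb, assuming the linear 2%-per-degree relationship stated above holds over this range (the function name is mine):

```python
def chiller_savings_pct(supply_temp_f: float, baseline_f: float = 45.0) -> float:
    """Estimated chiller plant savings vs. the 45° F baseline,
    using the ~2% per 1° F rule of thumb from the slide."""
    return 2.0 * (supply_temp_f - baseline_f)

print(chiller_savings_pct(50))  # 10.0 -> 10% savings at 50° F
print(chiller_savings_pct(55))  # 20.0 -> 20% savings at 55° F
```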

Page 16: Data Center Thermal Management and Efficiency

Low Cost Strategies
The Results of Implementation

Applying low cost strategies – changes to the cooling system:
• Best practices implemented
• Supply air control
• 50° F chilled water
• 85° F return air with dew point control

Before: Support Power Usage = 480kW; Total Facility Power Usage = 1000kW; Annualized Facility PUE = 1.92
After: Support Power Usage = 414kW; Total Facility Power Usage = 934kW; Annualized Facility PUE = 1.79

Total cooling power usage drops from 380kW to 314kW. The number of units stays the same, but some units can be turned off.

Page 17: Data Center Thermal Management and Efficiency

Medium Cost Strategies
1. Variable speed fan retrofits (EC Fan / VFD)
2. Aisle containment
3. Control retrofits
4. Rack level sensors

Page 18: Data Center Thermal Management and Efficiency

Medium Cost Strategies
1. Variable Speed Fan Retrofits (EC Fan / VFD)

Floor-mount cooling fans typically run at 100% of rated rpm.
By utilizing variable speed technology, fan speed can be varied based upon room conditions.
Energy savings based on a single 10HP motor (see the sketch below):

Fan Speed | Energy Consumed | Savings
100%      | 8.1 kWh         | –
90%       | 5.9 kWh         | 27%
80%       | 4.2 kWh         | 48%
70%       | 2.8 kWh         | 65%
60%       | 1.8 kWh         | 78%
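Those table values track the fan affinity laws, under which fan power scales roughly with the cube of speed. A minimal sketch of my own, treating the table's hourly energy figures as kW of draw:

```python
def fan_power_kw(full_speed_kw: float, speed_fraction: float) -> float:
    """Fan affinity law: power scales roughly with the cube of speed."""
    return full_speed_kw * speed_fraction ** 3

full_power = 8.1  # draw of the 10HP motor at 100% speed, from the table
for pct in (100, 90, 80, 70, 60):
    power = fan_power_kw(full_power, pct / 100)
    savings = (1 - power / full_power) * 100
    print(f"{pct}% speed: {power:.1f} kW, {savings:.0f}% savings")
# Output matches the table to within rounding (e.g. 90%: 5.9 kW, 27% savings).
```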

Page 19: Data Center Thermal Management and Efficiency

Medium Cost Strategies
2. Aisle Containment

Allows for proper air separation.
Can be done on either the hot or cold aisle, though it is easier to retrofit the cold aisle of an existing room.
Physical containment varies from simple curtains to a prefabricated system designed to match the racks.

Page 20: Data Center Thermal Management and Efficiency

Medium Cost Strategies
2. Aisle Containment

Containment strategies
Contained hot aisle
– Requires full containment to trap hot air
– Can be difficult to retrofit in perimeter designs
– Easier to retrofit in row cooling designs
– Overhead fire suppression concerns with full containment
Contained cold aisle
– Multiple containment options
  • Doors only
  • Curtains only
  • Full containment
– Can be easier to retrofit in all cooling designs
– Overhead fire suppression concerns with full containment

Page 21: Data Center Thermal Management and Efficiency

Medium Cost Strategies
3. Control Retrofits

Allows for upgraded control schemes that save energy.
New controls allow units to be networked together:
– Give more visibility into the full system
– Eliminate units fighting each other – one cooling while another is heating

Page 22: Data Center Thermal Management and Efficiency

Medium Cost Strategies
4. Remote Sensors

Usually associated with a control retrofit or a scheme designed through a building management system.
Increased visibility and quicker reaction to changes at the rack.
Generally applied with supply air sensors.

[Chart: the "bath tub effect"]

Page 23: Data Center Thermal Management and Efficiency

Low + Medium Cost Strategies
The Results of Implementation

Applying low + medium cost strategies – changes to the cooling system:
• Best practices implemented
• Supply air control
• + 55° F chilled water
• + 90° F return air with dew point control
• + Remote sensors
• + Aisle containment
• + Variable speed fans
• + Control retrofits

Before: Support Power Usage = 414kW; Total Facility Power Usage = 934kW; Annualized Facility PUE = 1.79
After: Support Power Usage = 284kW; Total Facility Power Usage = 804kW; Annualized Facility PUE = 1.55

Total cooling power usage drops from 314kW to 184kW. All units are now on, running at reduced speed.

ROI is generally less than 1 year for these strategies.

Page 24: Data Center Thermal Management and Efficiency

Higher Cost Strategies (Major Capital Expenditures)

1. Bringing cooling closer to the source
2. Variable capacity compressors
3. Economization
   – Air economizers
   – Water economizers
   – Refrigerant economizers

Page 25: Data Center Thermal Management and Efficiency

Higher Cost Strategies
1. Bringing Cooling Closer to the Source

Bringing the cooling closer minimizes the need for large fans, reducing energy use.
Some rear door designs have no fans, instead utilizing the server fans to move the air.
These generally produce a better sensible cooling to power ratio than a typical system – more cooling for less energy.

[Diagrams: row-based, rack-based, and rear door configurations]

Page 26: Data Center Thermal Management and Efficiency

Higher Cost Strategies
1. Bringing Cooling Closer to the Source
Rack Based Solutions – Pumped Refrigerant Technology

– Base infrastructure (160 kW)
– Cooling modules (mix and match)
– Dew point controlled pumped refrigerant cooling

Page 27: Data Center Thermal Management and Efficiency

Higher Cost Strategies
1. Bringing Cooling Closer to the Source
Rear Door Solutions

Refrigerant based rear door
• Refrigerant based rear door heat exchanger
• A rear door with 10kW to 40kW of cooling
• Connect up to 16 doors onto a single pumped refrigerant loop
• Designed to accommodate various racks
• Energy story – a passive door (no fans) that uses the server fans to transfer air through the coil

Performance
• Provides room-neutral, high density rack cooling
• Applicable for atypical room layouts and rooms without a hot aisle / cold aisle configuration

Page 28: Data Center Thermal Management and Efficiency

Higher Cost Strategies
1. Bringing Cooling Closer to the Source
Rear Door Solutions

Chilled water based rear door
• Chilled water based rear door heat exchanger
• A rear door with 16kW to 35kW of cooling
• Designed to accommodate various racks
• Energy story – a passive door (no fans) that uses the server fans to transfer air through the coil

Performance
• Provides room-neutral, high density rack cooling
• Applicable for atypical room layouts and rooms without a hot aisle / cold aisle configuration

Page 29: Data Center Thermal Management and Efficiency

Higher Cost Strategies
1. Bringing Cooling Closer to the Source
Row Based Solutions

• Precise temperature and humidity control
• 12" or 24" wide designs
• Air, water, glycol and chilled water models
• Energy efficient load matching
  – Digital scroll compressor, 20–100% cooling capacity modulation
  – Variable speed EC plug fans

Performance
• Real-time environment control
• Automatic performance optimization
• Adaptive component monitoring
• Adjustable air baffle direction

Page 30: Data Center Thermal Management and Efficiency

Higher Cost Strategies
1. Bringing Cooling Closer to the Source

Fan energy for 30kW of cooling:
– Perimeter unit = 4.24 kW
– Row-based unit = 1.38 kW
– Rack-based unit = 0.54 kW
– Rear door = 0.00 kW (no fans)

Page 31: Data Center Thermal Management and Efficiency

Higher Cost Strategies
2. Variable Capacity Compressors

Digital scroll compressors
– Match room load in unlimited step increments
– Reliable
– Not field repairable; must be replaced
4-step semi-hermetic compressors
– Match room load in 4 step increments
– Reliable
– Field repairable
Compressors with VFD control
– Match room load in unlimited step increments
– Reliable
– Usually not field repairable

Intended for partially loaded rooms. May be used in conjunction with variable speed fans for even greater energy savings.

Page 32: Data Center Thermal Management and Efficiency

Higher Cost Strategies
3. Economization

Air side economizers
– For chilled water or compressorized systems
– Utilize outside air based on dew point, minimizing compressor and/or chiller usage
Water side economizers
– For chilled water systems
– Use water cooled by a cooling tower or a dry cooler (fluid cooler) in low temperature conditions to minimize chiller operation
Pumped refrigerant economizers
– New technology for compressorized systems
– Use refrigerant cooled in low temperature conditions to minimize condenser and compressor operation
– Similar utilization as water side economizers

Page 33: Data Center Thermal Management and Efficiency

Higher Cost Strategies
3. Economization – Pumped Refrigerant

Liebert DSE with EconoPhase Pumped Refrigerant Economizer
Cooling PUE: 1.3 – 1.05

[Bar chart: annual energy usage as annual utility cost ($1000's, scale 0–450) for DX with a water-side economizer, chilled water with an air-side economizer, and the Liebert DSE with EconoPhase; a 60% difference is called out in favor of the DSE]

Reliable, low-maintenance operation
– No water usage
– No water treatment
– No outside air contamination
– No dampers and louvers to maintain
– Instant, automatic economizer changeover

Liebert DSE – The Most Efficient DX Data Center Cooling System

Page 34: Data Center Thermal Management and Efficiency

Higher Cost Strategies
3. Economization – Pumped Refrigerant
Liebert DSE System Overview

– Liebert DSE indoor unit: next generation data center cooling system
– Liebert EconoPhase: first ever pumped refrigerant economizer
– Liebert MC: intelligent, high efficiency condensers
– Thermal System Manager with iCOM: Liebert proprietary data center management intelligence and Optimized Aisle

Page 35: Data Center Thermal Management and Efficiency

Liebert DSE System: DX Operation Mode

[Diagram: two-circuit DSE refrigeration system – compressor, evaporator, electronic expansion valve, refrigerant pump, solenoid valve, check valves, and MC condenser – with component power draws annotated]

Cooling Mode | Outdoor Temp | Cooling pPUE | SCOP | System kW
DX           | 95° F        | 1.26         | 3.8  | 24.9

Page 36: Data Center Thermal Management and Efficiency

Liebert DSE System: DX + Pump Operation Mode

[Diagram: same two-circuit system with the EconoPhase refrigerant pump active on one circuit; component power draws annotated]

Cooling Mode | Outdoor Temp | Cooling pPUE | SCOP | System kW
DX           | 95° F        | 1.26         | 3.8  | 24.9
Partial      | 60° F        | 1.14         | 7.0  | 13.6

Page 37: Data Center Thermal Management and Efficiency

Liebert DSE System: Pump Operation Mode

[Diagram: both circuits running on the EconoPhase pumps with compressors off; component power draws annotated]

Cooling Mode | Outdoor Temp | Cooling pPUE | SCOP | System kW
DX           | 95° F        | 1.26         | 3.8  | 24.9
Partial      | 60° F        | 1.14         | 7.0  | 13.6
Full         | 45° F        | 1.09         | 10.6 | 9.0

Page 38: Data Center Thermal Management and Efficiency

Liebert DSE System: Pump Operation Mode

[Diagram: full pump operation at a lower outdoor temperature; component power draws annotated]

Cooling Mode | Outdoor Temp | Cooling pPUE | SCOP | System kW
DX           | 95° F        | 1.26         | 3.8  | 24.9
Partial      | 60° F        | 1.14         | 7.0  | 13.6
Full         | 45° F        | 1.09         | 10.6 | 9.0
Full         | 30° F        | 1.05         | 20.7 | 4.6
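The pPUE and SCOP columns in the table are two views of the same ratio: with SCOP defined as cooling delivered per kW of cooling-system input, pPUE = (load + system kW) / load = 1 + 1/SCOP. A small sketch of my own checking the table rows (the roughly 95 kW implied cooling load is inferred from SCOP × system kW, not stated on the slide):

```python
# (pPUE, SCOP, system kW) rows from the table above.
rows = [(1.26, 3.8, 24.9), (1.14, 7.0, 13.6), (1.09, 10.6, 9.0), (1.05, 20.7, 4.6)]

for ppue, scop, system_kw in rows:
    implied_load = scop * system_kw   # cooling delivered; ~95 kW in every row
    derived_ppue = 1 + 1 / scop       # pPUE = 1 + 1/SCOP
    print(f"load ≈ {implied_load:.0f} kW, pPUE ≈ {derived_ppue:.2f} (table: {ppue})")
```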

Page 39: Data Center Thermal Management and Efficiency

Minneapolis, MN Bin Data – EconoPhase, Partial, Compressor

[Histogram: annual hours per 5° F outdoor temperature bin, from below 5° F to above 95° F (y-axis 0–900 hours), indicating which bins run in full EconoPhase, partial, or compressor mode; companion charts show Dayton, OH and Grand Rapids, MI]

Page 40: Data Center Thermal Management and Efficiency

Higher Cost Strategies
3. Economization

Scenario: 1MW of IT load; 90° F return air; 20%+ redundancy; no humidity control
Which is best? It depends on the customer drivers:
– First cost / capital cost
– Energy savings / PUE
– Total cost of ownership
– Redundancy / availability
– Reliability

[Comparison graphic: LIEBERT® DSE]

Page 41: Data Center Thermal Management and Efficiency

Low + Medium + Higher Cost Strategies
The Results of Implementation

Applying low + medium + higher cost strategies – key cooling system features:
• Supply air control
• 90° F return air with dew point control
• Rack level sensors
• Aisle containment
• Variable speed fans
• Advanced controls
• + Pumped refrigerant economizers
• + Variable capacity compressors

Before: Support Power Usage = 284kW; Total Facility Power Usage = 804kW; Annualized Facility PUE = 1.55
After: Support Power Usage = 183kW; Total Facility Power Usage = 703kW; Annualized Facility PUE = 1.35

Total cooling power usage drops from 184kW to 83kW (the sketch below traces the PUE progression across all stages). All CW units have been replaced with new units.

ROI is generally less than 3 years for these strategies.
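Putting the stages together, the PUE progression follows directly from the cooling figures, since the non-cooling support load stays fixed across the slides. A minimal sketch of my own:

```python
# IT load (520kW) and non-cooling support (100kW: UPS 50 + MV transformer and
# switchgear 30 + lighting 10 + PDU 10) are fixed; only cooling power changes.
it_kw, other_support_kw = 520, 100
stages = [("Baseline", 380), ("Low cost", 314),
          ("Low + medium", 184), ("Low + medium + higher", 83)]

for label, cooling_kw in stages:
    total_kw = it_kw + other_support_kw + cooling_kw
    print(f"{label}: {total_kw}kW total, PUE = {total_kw / it_kw:.2f}")
# Prints 1.92, 1.80, 1.55, 1.35 (the slides round the second stage to 1.79).
```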

Page 42: Data Center Thermal Management and Efficiency

Taking It a Step Further

The annualized cooling-only PUE for the last scenario is 1.09. Why is the overall PUE 1.35?
– Virtualization has not been implemented on the servers
– Inefficiencies in the power distribution:
  • UPS modules
  • PDUs
  • Generators
  • Batteries
  • Switchgear
  • Lighting
– Lack of monitoring
  • Without real-time data you cannot react quickly

Page 43: Data Center Thermal Management and Efficiency

Taking It a Step Further

How can I get an even better cooling PUE?
– Raise water and air temperatures even higher
– Implement alternate technologies that remove or greatly reduce cooling
– Improve server monitoring

[Graphic: trade-offs – lower PUE vs. availability risk, and PUE vs. server loads]

Page 44: Data Center Thermal Management and Efficiency

Implementing the Strategies

Multiple strategies to consider
– Low cost
– Medium cost
– Higher cost
– A combination of any or all of the above
Implementing any of these strategies can be somewhat difficult
– Where do I start?
– What can I implement?
  • Can the current equipment be upgraded?
  • Do I have budget for equipment upgrades?
  • Do I need outside help?

Page 45: Data Center Thermal Management and Efficiency

Summary

You don't have to spend a fortune to get energy savings.
However, to get to a world class level, major changes generally have to be made.
Total energy consumption needs to be considered along with PUE.
Focusing only on PUE can increase risk and reduce availability
– This works with some data center models, but not for all

For more information on this topic, please check out the updated, vendor neutral Energy Logic 2 white paper, available on the Emerson Network Power website.

Page 46: Data Center Thermal Management and Efficiency

Thank you!

Questions? [email protected]
Or call Uptime Solutions Inc. at 937-237-3400