Energy Saving Server Solutions
Presented by MicroAge
© 2006 MicroAge
Programme
Introduction to MicroAge
– Brett Beranek, Marketing Director, MicroAge Canada
Energy-Saving Server Solutions
– Martin Chagnon, System x Specialist, IBM Canada
Q & A
MicroAge at a glance
Canada’s leading network of independently owned systems integrators and value-added resellers
Over 40 locations across Canada, many in operation since 1981
Specializing in multi-vendor turnkey solutions: hardware + software + technical & professional services
Vast experience in the SME and public sectors
MicroAge Delivers
IT Solutions
Industry-leading IT products
IT consulting
Procurement planning
Training
Technical support
Flexible leasing/financing options
MicroAge: your trusted business technology partner
Highly qualified local IT personnel
Proven experience and IT knowledge breadth and depth
Strength of national network
Commitment to customer service excellence
© 2008 IBM Corporation
IBM Big Green Overview
A Focus on Energy Efficiency in the Data Center
Top IT Business Priorities
Virtualization now the norm in most datacenters
Power / Heat is top priority
Blades everywhere
10 Gb Ethernet starting to break through
Multi-core processors (six-core now; octo-core in 2009)
Server Consolidation
“Green” technologies
Technologies that Matter
Hot-swap SATA Drives
Hot-swap fans in 1U servers
Light-path diagnostics
Hot-spare / mirrored memory
Blade Expansion Options
SFF SCSI drives
ServeRaid module
Cool-Blue™ Portfolio
PFA on more components
eX4 Architecture
High-availability features: innovation, manageability, serviceability, reduced failures, scalability, performance, redundancy, flexibility
Power in, $ out
According to IDC, by 2010 every $1 spent on hardware will carry 70 cents of power and cooling cost; by 2012 the ratio reaches $1 of power and cooling for every $1 of hardware.
Data centers typically consume 15 times more energy per square foot than a typical office building
How is energy typically used in the data center?
Data center: 55% power and cooling, 45% IT load (U.S. Department of Energy statistics)
Where does the energy go in a typical data center?
(Average data center > 2,000 sq ft and 3 years or older)
Of 100 units of power input, 55% goes to power and cooling (chillers, humidifiers, CRAC, PDU, UPS, lights, and power conversion/distribution) and 45% reaches the IT load; roughly 35 units emerge from the power chain, of which about 33 units are delivered to server, storage, and network operations.
Data source: Creating Energy-Efficient Data Centers, U.S. Department of Energy, May 18, 2007
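The 55/45 split above corresponds to the standard PUE (Power Usage Effectiveness) metric, which the slide does not name; a minimal sketch of the calculation:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_load_kw

# Slide's energy flow: 100 units of power in, 45 units reaching the IT load.
print(round(pue(100, 45), 2))  # → 2.22
```

A PUE above 2 means the facility burns more than one watt of overhead for every watt of useful IT load.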
Why green data centers?
• Highly energy-intensive and rapidly growing
• Consume 10 to 100 times more energy per square foot than a typical office building
• Large potential impact on electricity supply and distribution
• Used about 45 billion kWh in 2005
• At current rates, power requirements will double in 5 years
Data source: Creating Energy-Efficient Data Centers, U.S. Department of Energy, May 18, 2007
Typical data center cooling conversion: 55% power and cooling vs. 45% IT load. A 10% improvement could save 20 billion kWh in the USA.
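The “double in 5 years” projection implies a compound annual growth rate, which can be back-calculated (a sketch; the DOE source states only the doubling claim):

```python
# Doubling in 5 years implies this compound annual growth rate.
annual_growth = 2 ** (1 / 5) - 1
print(round(annual_growth * 100, 1))  # → 14.9 (% per year)

# Starting from the slide's 45 billion kWh (2005), projected usage by 2010:
print(45 * 2)  # → 90 (billion kWh)
```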
IBM response – Project Big Green
“IBM to reallocate $1 billion a year” . . . Armonk, May 10, 2007
• Create an 850-member worldwide IBM “Green Team” of energy-efficiency specialists.
• Plan, build, or prepare our facilities to be Green Data Centers based on IBM best practices and innovative technologies in power and cooling.
• Use virtualization as the technology accelerator for our Green Data Centers, to drive utilization up and annual power cost per square foot down.
Re-affirmed IBM’s long-standing commitment to environmental leadership:
1. IBM energy conservation efforts from 1990 to 2005 resulted in a 40% reduction in CO2 emissions and $250 million in energy savings. IBM is committed to an additional 12% CO2 savings by 2012.
2. IBM will double the compute capacity in our Green Data Centers by 2010 without increasing power consumption or carbon footprint, thus avoiding 5 billion kilowatt-hours per year.
RTP Green Demo Center
Description: extension of the Systems and Technology Group’s RTP Executive Briefing Center, providing a demonstration showcase for IBM and partner energy-efficient data center elements:
– Rack-dense blades, servers, and storage solutions
– Big Green partnership solutions: APC, Eaton, Emerson, and GE
– Power/workload management hardware and software innovations: Tivoli solutions, IBM Systems Director Active Energy Manager
– Rack-level cooling
Estimated size: 2,000 sq ft
Value proposition: an environment to showcase IBM and partner innovation in energy-efficient data center solutions, promote leadership technologies, and demonstrate IBM’s and partners’ “green” capabilities to customers.
– Co-location at the IBM EBC allows broad exposure to EBC events and integration into focused power-and-cooling-themed events.
– On-site proximity to the GTS, SWG, and Retail Executive Briefing Centers enables cross-platform sales activity.
Business impact: drive awareness of IBM’s leadership in energy-efficient data center technology; promote sales of IBM solutions and data center services.
RTP Green Solution Center Layout
[Floor-plan diagram: zones A through G]
Broad energy ecosystem to holistically address the issue
Partnering with the world’s leading data center power and cooling technology providers
Governments and energy utilities are also helping clients improve their overall energy efficiency
RTP EBC Green Solutions Center Showcase – April 2008
Innovations to build an energy-efficient data center
IBM Data Center Stored Cooling Solution – the “Cool Battery”: increase cooling capacity in an existing or new data center; cut energy costs
IBM Optimized Airflow Assessment for Cabling: replace cabling systems with high-performance fiber transport systems; improve cooling and reduce energy usage across the data center
IBM Scalable Modular Data Center: get racks, power, cooling, security, and monitoring; deploy 500 and 1,000 sq ft data centers in 8 to 12 weeks; IBM manages integration and coordination; save 15% over the price of a traditional approach
IBM Thermal Analysis for High Density Computing: identify and resolve existing and potential heat-related issues; prevent outages and provide options for power savings and expansion
Example – IBM Data Center Stored Cooling Solution
Thermal storage solution to improve cooling-system efficiency by 40-50% and reduce energy costs
– Shift energy usage to off-peak hours, saving up to 30%
– Provide extra cooling capacity to enable growth and survive grid failures
Cooling system with PCM (phase-change material): a thermal storage device sits between the computer-room air conditioners and the chillers. [Diagram: heat flows from the HVAC unit through PCM storage to the chiller and cooling tower]
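The off-peak saving can be illustrated with a blended-rate calculation (a sketch; the tariff rates and the 60% shifted fraction below are hypothetical, not from the slide):

```python
def cooling_energy_cost(kwh: float, peak_rate: float,
                        offpeak_rate: float, offpeak_fraction: float) -> float:
    """Blended annual cost when a fraction of cooling energy is shifted off-peak."""
    blended = (1 - offpeak_fraction) * peak_rate + offpeak_fraction * offpeak_rate
    return kwh * blended

baseline = cooling_energy_cost(100_000, 0.15, 0.08, 0.0)  # all cooling on-peak
shifted = cooling_energy_cost(100_000, 0.15, 0.08, 0.6)   # 60% shifted off-peak
print(round(1 - shifted / baseline, 2))  # → 0.28 (fraction saved)
```

With these assumed rates, shifting 60% of cooling energy off-peak lands near the “up to 30%” saving the slide claims.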
How is energy typically used in the data center?
Data center: 55% power and cooling, 45% IT load
Server hardware: 30% processor, 70% power supply, memory, fans, planar, drives . . .
$1B investment in energy initiatives
Technologies across the full STG family
Recognized as an industry leader
Modular systems contributions: Rear Door Heat eXchanger, bezel designs, back-to-back fans, vectored cooling, thermal sensors on planar, low-voltage processors, 91%-efficient power supplies, flash drives, Active Energy Manager
See BLUE, think GREEN
IBM’s Vision for Green Data Centers of Tomorrow
By the end of the decade, IBM will:
– Decrease server power consumption by 50%: cut expenses with leadership energy efficiency
– Eliminate the need for data center air conditioning: cap the carbon footprint for data centers
– Increase compute density by a factor of 10: scale data center computing with no new construction
– Eliminate servers from landfills: completely recycle old servers
Less Power and Cooling with IBM BladeCenter
Nearly 10% lower power and cooling costs with BladeCenter H
Smarter thinking around power and cooling with IBM can lead to savings greater than 30% over traditional thinking:
– LV processors
– BladeCenter E (super-efficient chassis)
– Larger DIMMs
– Solid-state drives
BladeCenter can deliver 10-30% more processors per rack in power-restricted environments
Edison Group Blade Power Study (Nov 7, 2007). Comparable configurations, IBM HS21XM vs. HP BL460c: 2 x 1.86 GHz processors, 8 x 1 GB of memory, RAID-1 internal storage. Test exerciser: SPECjbb2005. Ambient temperature 75°F +/- 2°F. The 30% claim uses 1.86 GHz LV processors, 4 x 2 GB DIMMs, and a 16 GB SSD.
Edison Power Study Summary:
http://www-03.ibm.com/systems/bladecenter/new/power/Edison_Blade_Center_Power_Study.pdf
Systems compared: IBM BladeCenter H (local storage) vs. HP BladeSystem c7000 (local storage)
Server blades per chassis: 14 vs. 16
Peak power consumption per server blade: 300.63 W vs. 333.42 W (IBM advantage: 9.84% less power per blade)
Server blade BTU/hr: 1,025.14 vs. 1,136.97 (IBM advantage: 9.84% less BTU/hr per blade)
Combined server and cooling power consumption: 134.68 kWh vs. 149.37 kWh
Combined (server & cooling) cost per year, uniform configuration (224 blades): $110,902.12 vs. $122,999.76
BladeCenter Power
There are two kinds of power:
– DC: the type of power the server components run on
– AC: the type of power distributed in the data center
The power supply converts AC to DC.
Typical power supply: 220 V AC in, 12 V DC out. Example: 2,000 W AC in at 70% efficiency = 1,400 W DC out; the rest is lost as heat.
BladeCenter power supply: 220 V AC in, 12 V DC out. Example: 2,000 W AC in at 91% efficiency = 1,820 W DC out.
Because BladeCenter power supplies are over 90% efficient, far less power is wasted converting AC to the 12 V DC the servers run on.
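The two worked examples above reduce to a one-line efficiency calculation; a minimal sketch:

```python
def dc_output_w(ac_input_w: float, efficiency: float) -> float:
    """DC power delivered by a supply drawing ac_input_w at the given efficiency."""
    return ac_input_w * efficiency

for eff in (0.70, 0.91):          # typical supply vs. BladeCenter supply
    dc = dc_output_w(2000, eff)
    heat = 2000 - dc              # everything not delivered as DC is shed as heat
    print(round(dc), round(heat)) # → 1400 600, then 1820 180
```

Note the 91%-efficient supply wastes 180 W instead of 600 W on the same 2,000 W draw, and that waste heat must itself be removed by the cooling plant.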
IBM Solid State Drives: less power, more savings
Average power per blade: 350 W per blade; 3,500 W per BladeCenter chassis; 16,800 W per 4-chassis rack
Power consumption reduced by replacing two 3.5" SAS drives with two solid-state drives (16 W - 2 W = 14 W x 2 drives): 28 W per blade; 392 W per chassis; 1,568 W per rack
Cost savings per year (at $0.10 USD per kWh): $27.99 (1) per blade; $391.90 per chassis; $1,959.53 per rack
Solid-state drive power consumption is extremely low: 87% less than that of conventional drives
SSDs generate practically no noise compared to traditional HDDs
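The per-blade figure can be sanity-checked from the watt delta (a sketch; the slide’s $27.99 is somewhat higher than the raw-energy number, perhaps because footnote (1) folds in cooling overhead):

```python
def annual_cost_usd(watts: float, usd_per_kwh: float = 0.10) -> float:
    """Yearly electricity cost of a constant load running 24x7."""
    return watts / 1000 * 24 * 365 * usd_per_kwh

per_blade_w = (16 - 2) * 2  # two 16 W SAS drives replaced by two 2 W SSDs
print(round(annual_cost_usd(per_blade_w), 2))  # → 24.53 (drive energy only)
```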
IBM “Cool Blue” Technologies
[Diagram relating drivers and responses: the imperative for dual/quad-core, Moore’s Law, current leakage, airflow increases, utilization, function density, and power & cooling are addressed through server-level and rack-level cooling, patented “Flo-thru” advanced cooling, low-voltage processors, smarter design, virtualization, power management, Power Executive, heat sinks, efficient power supplies, lower noise/power levels, and data center improvement]
IBM “Cool Blue” Technologies (continued)
[Repeat of the diagram, highlighting bezel design, the Rear Door Heat eXchanger, and hot-swap fans]
IBM Rear Door Heat eXchanger
– Ideal solution to help customers deal with increased BTU output from increasingly dense server deployments
– Removes up to 50,000 BTU/hr (about 14 kW) per rack
– Fits on the IBM Enterprise rack
– Runs on customer-supplied water (within IBM specs)
[Diagram: cold air enters the rack front from perforated tiles fed by underfloor chilled air; hot exhaust at the rack back passes through the water-cooled Rear Door Heat eXchanger, with water lines routed through the subfloor cable opening]
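The rack capacity quoted in BTU/hr converts directly to kilowatts (a sketch; the original slide’s “14 KVa” label reads more naturally as kW of heat removed):

```python
WATTS_PER_BTU_PER_HR = 0.29307107  # 1 BTU/hr expressed in watts

kw_removed = 50_000 * WATTS_PER_BTU_PER_HR / 1000
print(round(kw_removed, 1))  # → 14.7 (kW of heat removed per rack)
```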
Temperature Gradient on RDHx Over Time
[Chart: air temperature (°C, 0-70) exiting the RDHx over time (0-14 minutes), showing the temperature prior to water circulation and the drop after water flow begins]
Innovation – IBM Rear Door Heat eXchanger: helps cut exhaust heat by up to 50-60%
Solving today’s data center issues with mainframe thinking: IBM used water cooling in the mainframe back then.
Helps:
– Increase density easily
– Solve “hot spots” in the data center
– Avoid the cost of purchasing another AC unit
– Potentially postpone spending on major data center renovations
[Before/after thermal images]
Georgia Tech and Cool Blue savings
On Feb 8, 2006, the Georgia Institute of Technology dedicated its new supercomputer (named Razor), which, according to the press release, became the 41st-fastest computer in the world. Razor features 1,000 IBM BladeCenter LS20 server blades with 2,000 dual-core AMD processors (4,000 processing cores) in a 1,000 sq ft space. Using 12 Rear Door Heat eXchangers allowed cooling with the existing CRAC units, avoiding a costly retrofit of the computer room: savings of $160,000*
*http://www-03.ibm.com/press/us/en/pressrelease/19231.wss
How is energy typically used in the data center?
Data center: 55% power and cooling, 45% IT load
Server hardware: 30% processor, 70% power supply, memory, fans, planar, drives . . .
Server loads: 80% idle, 20% resource usage rate
Typical Server Utilization Rates
[Chart: used vs. wasted capacity for mainframe, UNIX®, and x86 servers; typical utilization < 20%]
Server consolidation conserves energy
Before consolidation: four systems (System 1 through System 4, each hosting two applications, APP 1 through APP 8), each 10% busy and drawing 2 kW; total power 8 kW.
After consolidation: virtualization software runs all eight applications on a single system at 70% busy, drawing 4 kW; total power 4 kW.
Server consolidation exploiting virtualization is a very effective tool for reducing energy costs.
Virtualization – economic engine of a green data center
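The before/after figures translate directly into annual energy and cost savings (a sketch; the $0.10/kWh rate is assumed, borrowed from the SSD slide earlier in the deck):

```python
HOURS_PER_YEAR = 24 * 365
USD_PER_KWH = 0.10            # assumed rate, matching the earlier SSD example

before_kw = 4 * 2.0           # four servers at 2 kW each, only 10% busy
after_kw = 4.0                # one virtualized server at 70% busy
saved_kwh = (before_kw - after_kw) * HOURS_PER_YEAR
print(saved_kwh, round(saved_kwh * USD_PER_KWH, 2))  # → 35040.0 3504.0
```

Halving the power draw saves about 35,000 kWh a year per four-server group, before even counting the reduced cooling load.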
How could virtualization help you lower costs?
Immediate savings potential:
– Reduce the number of devices in the data center
– Improve utilization of existing resources
– Save on data storage costs
– Reduce the number of software licenses to monitor and pay for
– Recapture floor space for more profitable use
– Increase power and cooling efficiency
– Cut administrative expenses
Active Energy Manager
– Manage power at the rack, server, blade, storage, switch, and iPDU level
– Compare actual vs. nameplate power at the system level
– Trend power use over time and view current data
– Set power caps without performance throttling
IBM offerings help across the board
Data center (55% power and cooling, 45% IT load):
– Scalable Modular Data Center
– Data Center Facilities Design
– Energy Efficiency Assessments
– Thermal Analysis
– Server Consolidation Services
Server hardware (30% processor, 70% power supply, memory, fans, planar, drives . . .):
– IBM BladeCenter
– IBM Cool Blue: Rear Door Heat eXchanger, vectored cooling, back-to-back fans, efficient power supplies
Server loads (80% idle, 20% resource usage rate):
– Active Energy Manager
– Storage and server virtualization leadership
Take advantage of innovative technologies
IBM has 40 years of experience in delivering energy efficiency:
– IBM BladeCenter®
– IBM X-Architecture®
– Active Energy Manager
Questions?