© CSC - IT Center for Science Ltd. (Tero Tuononen)
Elektroniikkayhdistys (Electronics Association), 13 Jan 2009
Agenda: Green IT (in English), CSC's supercomputing environment (in Finnish), machine room tour (sign-language interpreted)
Welcome to CSC!
Imagine your toaster being the size of a matchbox!
• 50–150 W on a postage-stamp-sized chip; Watts per socket stay roughly constant
• Multicores demand memory: capacity and bandwidth
• Each memory DIMM takes 5–15 W
• Sockets per rack are increasing
[Chart: Flops vs. Watts growth across system generations]
“Virtualization may offer significant energy savings for volume servers because these servers typically operate at an average processor utilization level of only 5 to 15 percent” (Dietrich 2007, US EPA 2007).
“The typical U.S. volume server will consume anywhere from 60 to 90 percent of its maximum system power at such low utilization levels” (AMD 2006, Bodik et al. 2006, Dietrich 2007).
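The two quotes above add up to a simple consolidation argument, sketched numerically below. The utilization and power fractions come from the quoted ranges (midpoints); the per-server wattage, fleet size and consolidation ratio are illustrative assumptions, not figures from the slides:

```python
# Sketch: why consolidating underutilized volume servers saves energy.
P_MAX_W = 400      # assumed max system power of one volume server
IDLE_DRAW = 0.75   # midpoint of the quoted 60-90% of max power at low load
UTIL = 0.10        # midpoint of the quoted 5-15% average utilization

n_servers = 10                              # illustrative fleet
before_w = n_servers * P_MAX_W * IDLE_DRAW  # ten lightly loaded servers
after_w = 2 * P_MAX_W * 1.0                 # assumed: virtualized onto 2 busy hosts

print(f"before: {before_w:.0f} W, after: {after_w:.0f} W, "
      f"saving: {1 - after_w / before_w:.0%}")
# before: 3000 W, after: 800 W, saving: 73%
```

The point is that the "before" fleet pays 75% of max power to do roughly one server's worth of work, which is exactly the waste virtualization targets.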
Then multiply your problem by thousands
• A standard rack cabinet occupies 0.77 m² / 1.44 m³
• HPC rack density (CPUs & RAM per rack) keeps increasing
• Enter the power! Current system cabinets draw 25–40 kW, this year 60 kW/cabinet, and vendors predict 80–100 kW racks in 2–3 years
• It becomes impossible to feed enough air through the cabinet (a wind-speed issue)
• Water is ~20 times more efficient a coolant than air (in practice)
• Hence liquid cooling and massive 2-tonne-plus racks
• Machine rooms face yet another challenge: the sheer mass of the computing infrastructure
[Image: Helo "Pikku-tonttu" electric sauna heater]
[Chart: power density (kW/m³) on a 0–70 scale, rising across generations: CSC today, Cray, Cray NG, Future]
Source: IDC, U.S. Environmental Protection Agency
[Chart: ICT energy consumption (TWh) on a 0–70 scale, EU 2007 vs. U.S. 2006]
“At a datacenter level we estimate consumption levels in Western Europe to have exceeded 40 TWh in 2007 and this is expected to grow to more than 42 TWh in 2008 … which translated into €4.4 billion for entire datacenters.”
IDC, London, October 2, 2008
Why has GREEN IT become an issue? Is it just the price of energy we are talking about?
GREEN machine rooms?
PUE = Power Usage Effectiveness = total facility power / IT power
DCiE = Data Center Infrastructure Efficiency = (1 / PUE) × 100%
And maybe one more coming:
DCP = Data Center Productivity = useful work / total facility power
Source: the green grid, *U.S. Environmental Protection Agency (2007)
[Chart: datacenter energy (TWh, 0–45 scale) split into IT load and overhead at PUE* values of 1.0, 1.4, 1.8, 2.0 and 3.0]
Worse yet! Remember Dietrich: average IT utilization of 5–15% already takes 60–90% of system max power; then add the overhead!
Metrics and equations for machine room efficiency
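The metrics above can be written down as two tiny functions; the example wattages are invented for illustration, not measured values:

```python
def pue(total_facility_w: float, it_w: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_w / it_w

def dcie(total_facility_w: float, it_w: float) -> float:
    """Data Center Infrastructure Efficiency: (1 / PUE) * 100%."""
    return 100.0 / pue(total_facility_w, it_w)

# Illustrative: a room drawing 1.6 MW in total to feed 1.0 MW of IT load.
print(pue(1_600_000, 1_000_000))   # 1.6
print(dcie(1_600_000, 1_000_000))  # 62.5 (%)
```

In this example only 62.5% of the energy entering the building reaches the IT equipment; the rest is cooling and power-distribution overhead.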
About machine room efficiency…
Source: the green grid, EPA*
Q: Why can't PUE reach 1.0 (the theoretical minimum)?
A: In order to guarantee the operational environment you need to:
• Provide a reliable power supply in the form of uninterruptible power supplies (UPS), generators, backup batteries, switchgear, cables, rails, …
In general, any electrical component has power losses and an efficiency below 100%; the more of them you put into use, the more power you lose.
• Create coolant (cool water & air), which usually requires loads of extra energy consumed by cooling chillers, computer room air conditioning units (CRAC), cooling towers, humidification units, pumps, direct exchange units, …
How to improve machine room efficiency!
Facility power:
• Reduce redundancy wherever applicable! (Tiers 1–4)
• State-of-the-art transformers (>98%)
• State-of-the-art UPS systems (>95%)
• State-of-the-art switchgear, power cables, rails, …
• Variable-speed chillers, fans, pumps and CRACs
• Modular, upgradable facility approach
Facility cooling (interior):
• Do not over-cool! Tune the air and water temperatures as high as possible *
• Do not over-size your cooling gear; efficiency is worse at low usage levels
• Hot/cold aisle approach (air)
• Reduce the area / air volume to be cooled
• Liquid-cooled / closed racks (water is ~15x more efficient than air)
*ASHRAE
And improving further…
Facility cooling (exterior): access to a cold water supply, hence no need for large chillers
• District/remote cooling from the local energy company?
• Heat dissipation fed back into the district heating system, etc. (more complex?)
• Large HPC sites consider CHP plants of their own to create power and cooling
• Economizer or water-side free cooling (in moderate or mild climate regions)
• Get the cool water from a river, deep lake, sea or groundwater source and maybe return it slightly warmer
• Cool/cold (<15 °C / <7 °C) outside air (nights, winter time)
• Permafrost, ice/snow: how likely? Feasibility?
Green computing systems? (Green500.org)
In computing technology, GREEN is defined as computational operations achieved per watt consumed, i.e. the MFlops/Watt ratio (the higher, the greener).
State-of-the-art technology exceeds 530 MFlops/Watt:
• IBM PowerXCell and BlueGene systems
• Top result of 535 MFlops/Watt
Top-dog petascale systems out there: see how the difference in architectures affects power consumption:
• IBM (hybrid Cell, AMD, PowerPC) Roadrunner (2.5 MW): 445 MFlops/W
• Sun (AMD) Ranger (7 MW): 152 MFlops/W
• The hybrid system is ~3x more energy efficient than a traditional x86-based one
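The ~3x claim follows directly from the two quoted efficiency figures, and the same numbers imply each machine's sustained performance. Only the MFlops/W and MW values come from the slide; the derived performance is a back-calculation, not a benchmark result:

```python
# Efficiency ratio of the two petascale systems quoted on the slide.
roadrunner_mflops_per_w = 445   # IBM Roadrunner, 2.5 MW
ranger_mflops_per_w = 152       # Sun Ranger, 7 MW

ratio = roadrunner_mflops_per_w / ranger_mflops_per_w
print(f"hybrid vs. x86 efficiency: {ratio:.1f}x")  # 2.9x

# Implied sustained performance: MFlops/W * W, converted to PFlops (1e9 MFlops).
roadrunner_pflops = roadrunner_mflops_per_w * 2.5e6 / 1e9
print(f"Roadrunner implied performance: {roadrunner_pflops:.2f} PFlops")  # 1.11
```

The ~1.1 PFlops back-calculation matches Roadrunner's well-known status as the first petaflop system, which is a useful sanity check on the slide's figures.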
Trends in 2011–2015: hosting a petaclass system under different scenarios
(assuming 3 MW and 1 MW systems, with facility efficiencies of 1.6 and 1.25)
• Current technology approach: 14.5 M€
• Current computing approach with enhanced hosting: 11.4 M€
• State-of-the-art computing with current hosting tech: 4.8 M€
• State-of-the-art approach: 3.8 M€
[Charts: annual energy costs (€/year, 2011–2015) and cumulative five-year totals for the four scenarios: x86 and Cell systems at facility efficiencies 1.25 and 1.60]
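The four totals can be reproduced with a back-of-the-envelope calculation. The IT powers and facility efficiencies are from the slide; the flat electricity price of ~0.069 €/kWh is an assumption inferred to make the figures match, not a number given in the presentation:

```python
# Five-year energy cost for a petaclass system under four hosting scenarios.
PRICE_EUR_PER_KWH = 0.069   # assumed flat electricity price (inferred)
HOURS_PER_YEAR = 8760
YEARS = 5                   # 2011-2015

scenarios = {
    "current tech (3 MW IT, PUE 1.60)":      (3.0, 1.60),
    "enhanced hosting (3 MW IT, PUE 1.25)":  (3.0, 1.25),
    "efficient system (1 MW IT, PUE 1.60)":  (1.0, 1.60),
    "state-of-the-art (1 MW IT, PUE 1.25)":  (1.0, 1.25),
}
totals = {}
for name, (it_mw, pue) in scenarios.items():
    kwh = it_mw * 1000 * pue * HOURS_PER_YEAR * YEARS  # facility kWh over 5 yrs
    totals[name] = kwh * PRICE_EUR_PER_KWH / 1e6       # five-year cost in M€
    print(f"{name}: {totals[name]:.1f} M€")
```

This reproduces 14.5, 4.8 and 3.8 M€ and lands at ~11.3 M€ for the second scenario, within rounding of the slide's 11.4 M€.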
Some conclusions on Green IT
• Make every Flop count! Optimized code on energy-efficient hardware.
• Make every Watt count! Improve facility efficiency.
• Make every € count! Reasonable investments, and buy GREEN energy!
CSC's supercomputing environment
“LOUHI, Pohjan Akka” (the Mistress of the North): the flagship of Finnish scientific computing, photographed in the new Pohja machine room in October 2008.
What is Louhi and what can it do?
• A massively parallel supercomputer built by a traditional supercomputer company (Cray Inc.)
• Uses ordinary AMD-made processors, just like home PCs (about 2,500 of them)
• The operating system is a fine-tuned Linux
• Deployed in phases starting from April 2007, now at full scale
• Price approx. €7M
• Effective lifetime approx. 4 years
• 31st in the world and 9th in Europe by computing power
• Equivalent to about 5,000 powerful PCs
• Theoretical computing capacity of about 16,000 calculations per second for every person on Earth
• Main memory approx. 11 TB
• Disk system 70 TB (hundreds of hard drives)
XT4 compute blade
[Diagram: CPU, memory and network interface on the blade]
One rack contains 3 × 8 blades with 4 or 8 CPUs each, plus a fan moving 1.4 m³/s of air.
Louhi's physical dimensions and placement in the hall
• Footprint 3.6 × 6 m (approx. 21.5 m²), height 2 m, mass 15,000 kg
• The entire system stands on 60 × 60 cm floor tiles raised 80 cm on steel legs; load capacity 600 kg per leg, tile point-load tolerance 9 kN
• 2 × 10 compute racks, 2 data racks
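A quick sanity check of the floor loading against the quoted 9 kN point-load tolerance, assuming each ~2-tonne rack stands on four feet (the feet-per-rack count is an assumption, not stated on the slide):

```python
# Point-load check for one heavy rack on the raised floor.
G = 9.81               # gravitational acceleration, m/s^2
rack_mass_kg = 2000    # a "2-tonne+" liquid-cooled rack
feet_per_rack = 4      # assumed

force_per_foot_kn = rack_mass_kg * G / feet_per_rack / 1000
print(f"{force_per_foot_kn:.1f} kN per foot (tile tolerates 9 kN)")  # 4.9 kN
```

Under these assumptions each foot loads the tile to roughly half its point-load limit, which is consistent with the slide presenting the floor as adequate.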
Power feed principle
• Louhi's electrical power draw is 300–520 kW, fed from two UPS-protected (10 min) distribution boards
• 72 h backup power: a 2,500 hp / 2 MW generator
• 63 A branch circuits; 3,000 A busbars weighing 100 kg per running metre
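As a rough sketch of why busbars this heavy are needed, the three-phase current at the peak draw can be estimated. The 400 V line voltage and 0.95 power factor are assumptions typical of European low-voltage distribution, not figures from the slide:

```python
import math

# Three-phase current needed to deliver Louhi's peak electrical draw.
P_W = 520_000   # peak power from the slide
V_LL = 400      # assumed line-to-line voltage, V
PF = 0.95       # assumed power factor

current_a = P_W / (math.sqrt(3) * V_LL * PF)
print(f"{current_a:.0f} A in total")  # 790 A, split across the two boards
```

Even at peak the load is well under the 3,000 A busbar rating, leaving headroom for the rest of the room and for redundancy.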
475 kW of electrical power equals the heat output of ~80 electric sauna stoves, running 24/7, and it all has to be moved away: from the equipment into air, and from the air into water.
• Air: in at 13–15 °C, out at 30–35 °C; 75 m³/s in total, 1.4 m³/s per rack
• Water: in at approx. 9 °C, out at approx. 17 °C, flow approx. 40 l/s
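A rough heat-transport sanity check using the figures above. The densities and heat capacities are standard textbook values, not from the slides, and the loops serve the whole machine room, so their capacity should exceed Louhi's own 475 kW:

```python
# Heat carried by the air and water loops: Q = density * flow * c_p * delta_T.
rho_air, cp_air = 1.2, 1005        # kg/m^3, J/(kg*K), standard values
rho_water, cp_water = 1000, 4186   # kg/m^3, J/(kg*K), standard values

q_air_w = rho_air * 75 * cp_air * (32.5 - 14.0)      # 75 m^3/s, midpoint temps
q_water_w = rho_water * 0.040 * cp_water * (17 - 9)  # 40 l/s, 8 K rise

print(f"air loop: {q_air_w/1e6:.2f} MW, water loop: {q_water_w/1e6:.2f} MW")
```

Both loops carry well over 475 kW, and the ~1.34 MW water-side figure lines up nicely with the 1.3 MW compressor chiller mentioned on the next slide.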
…from water to alcohol (glycol) and up to the roof
• Compressor chiller, 1.3 MW
• Glycol pipes up to the roof (12th floor)
• Roof condensers
Thank you for your attention. Questions?