Transcript of Philosophy 301
© Copyright 2012 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice.
But can you “handle the truth”?
Philosophy 301 Nicolas Dubé, Ph.D. HyperScale Business Unit, HP
Proportion of renewable energy grew from 1.12% to 2.3% worldwide from 1990 to 2010.
Fact.
=> Therefore the world is more environmentally friendly
?
Worldwide Energy Situation
Not quite.
• The bad side of ratios:
• The renewables percentage is growing
• But absolute dependency on fossil fuels is still increasing
The “holy grail” of corporate IT?
Cave “truth” #1: Power Usage Effectiveness (PUE)
[Chart: PUE broken down by component — Computer, Cooling, Power Distribution, Networking, Lighting — for Median 2009, Average 2010, Best Practice, and Free Air datacenters; y-axis 0 to 2]

PUE = Total Energy / IT Energy
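As a minimal sketch of the metric, PUE is just the ratio of total facility energy to the energy delivered to IT equipment (the numbers below are illustrative, not from a specific facility):

```python
def pue(total_energy_kwh, it_energy_kwh):
    """Power Usage Effectiveness: total facility energy over IT energy."""
    return total_energy_kwh / it_energy_kwh

# Illustrative: 13,722 kWh total facility energy against 10,000 kWh of IT energy
print(round(pue(13_722, 10_000), 2))  # 1.37
```

A PUE of 1.0 would mean every joule entering the facility reaches the IT load.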
PUE: "the untold story"
When you CAN'T afford Free Air Cooling

[Chart: Server Environmentals vs Air Inlet Temperature. Chassis airflow (CFM, 0–500) and chassis power (Watts, 2600–3400) plotted against inlet temperatures of 20C (68F), 25C (77F), 30C (86F), 35C (95F), 40C (104F), and 45C (113F), with the ASHRAE recommended range and classes A1–A4 marked. From 20C to 35C, chassis power rises +15.3% while chassis airflow rises +60%.]
Free Air Cooling's "dirty little secret"

Fan power ∝ (fan speed)³

[Chart: Facility Fan + IT Power vs IT Power alone (Watts, 2500–3900) at inlet temperatures of 20C (68F), 25C (77F), 30C (86F), 35C (95F). From 20C to 35C, IT power rises +15.3%, but total power rises +25.7%. Baseline: 8% facility fan power @ 25C]
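The cubic fan affinity law above is why fan power runs away at higher inlet temperatures. A minimal sketch (RPM figures are illustrative):

```python
def fan_power(base_power_w, base_speed_rpm, speed_rpm):
    """Fan affinity law: power scales with the cube of fan speed."""
    return base_power_w * (speed_rpm / base_speed_rpm) ** 3

# A fan pushed from 3000 to 3600 RPM (+20% speed) draws ~73% more power
print(round(fan_power(100.0, 3000, 3600), 1))  # 172.8
```

A modest bump in airflow demand therefore costs far more than proportional energy.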
Looking at Energy per Compute Operation (free air cooling)
What a ratio like PUE does not show
[Chart: Joules per Gflop broken down by component (Computer, Cooling, Power Distribution, Networking, Lighting) at inlet temperatures of 20C (68F), 25C (77F), 30C (86F), 35C (95F). Relative energy per Gflop: 1.18, 1.24, 1.33, 1.47 — the 35C point ≈ the energy of a 20C server inlet in a DC with a PUE of 1.37]
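The effect can be sketched numerically: if server power rises with inlet temperature while throughput stays fixed, energy per operation rises even when the facility overhead (PUE) improves. All numbers below are illustrative assumptions, not measured data:

```python
def joules_per_gflop(it_power_w, overhead_factor, gflops_per_s):
    """Energy per operation: (IT power x facility overhead) / throughput."""
    return it_power_w * overhead_factor / gflops_per_s

# Hypothetical server sustaining 400 Gflop/s
cool = joules_per_gflop(2800, 1.18, 400)  # 20C inlet, higher facility overhead
hot = joules_per_gflop(3228, 1.06, 400)   # 35C inlet: +15.3% IT power, lower PUE
print(hot > cool)  # True: more energy per operation despite the better PUE
```

This is exactly the story a ratio like PUE does not show.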
Why target “1”?
PUE = Total Energy / IT Energy

ERE = (Total Energy - Reuse Energy) / IT Energy

NZE = (Total Energy - Reuse Energy - Site Production) / DC IT Energy

Shouldn't the objective function be "0"?
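The difference between PUE and ERE can be sketched with the CLUMEQ figures from the slide that follows: re-used heat is subtracted from the numerator, which is what lets the metric drop below 1:

```python
def ere(total_kwh, reuse_kwh, it_kwh):
    """Energy Reuse Effectiveness: like PUE, but credits re-used heat."""
    return (total_kwh - reuse_kwh) / it_kwh

# CLUMEQ Silo annual figures: ~1.45 GWh of heat re-used for building heating
print(round(ere(3_123_218, 1_445_400, 2_890_800), 2))  # 0.58
```

With zero reuse, ERE reduces to PUE; with enough reuse and on-site production, the net figure heads toward zero.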
CLUMEQ Silo Datacenter
Quebec City, Canada

CLUMEQ Silo PUE data: operating hours 8760 · chilled water hours 720 · free air cooling hours 3660 · free water cooling / re-use hours 4380

Load                      kW      summer  free-air  free-water  hours  energy (kWh)
IT load                   330.00  X       X         X           8760   2890800
Fans load                 3.60    X       X         X           8760   31536
Chiller load              56.30   X                             720    40536
Water tower               9.38    X                             720    6756
Water pumps               3.75    X                 X           5100   19142
Ultrasonic humidifying    1.75    X       X         X           240    420
Transformer loss          6.60    X       X         X           8760   57816
UPS loss (60 kVA)         7.20    X       X         X           8760   63072
Lighting (daily average)  1.50    X       X         X           8760   13140
Total                     420.09                                       3123218
Energy re-use             330.00                                4380   1445400
Net energy consumption                                                 1677818

Summer PUE: 1.27 · Free air-side PUE: 1.06 · Free water-side PUE (heat re-use): 1.07
PUE (year average): 1.08 · rPUE / ERE (year average): 0.58
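The year-average PUE falls straight out of the table's energy column — a minimal sketch using those figures:

```python
# Annual energy (kWh) per line item, from the CLUMEQ table above
energy = {
    "IT load": 2_890_800, "Fans": 31_536, "Chiller": 40_536,
    "Water tower": 6_756, "Water pumps": 19_142, "Humidifying": 420,
    "Transformer loss": 57_816, "UPS loss": 63_072, "Lighting": 13_140,
}
total = sum(energy.values())
pue_year = total / energy["IT load"]
print(total, round(pue_year, 2))  # 3123218 1.08
```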
Datacenter ecosystem

[Diagram: Dairy farm → manure (biological waste) → manure handling → digester and biogas handling → power generation (biogas, steam, hot water, power) → data center and cooling infrastructure; data center waste heat fed back to the digester; effluent handling and storage closes the loop.]

Source: Martin Arlitt, HP Labs, ASME 4th International Conference on Energy Sustainability, May 17–22, 2010, Phoenix, Arizona
From Cow Manure to kWh
• Cow daily production: 54.7 kg / day ≈ 20 metric tons / year
• Anaerobic digester: 1 cow = 15 kWh / day
• A 2 000-cow dairy => 30 000 kWh / day
• 30 MWh / 24 h = 1.25 MW
• Enough power for a 1 MW datacenter with PUE 1.25
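The arithmetic above can be sketched directly:

```python
def farm_power_mw(cows, kwh_per_cow_per_day=15):
    """Average continuous power from a dairy farm's anaerobic digester."""
    return cows * kwh_per_cow_per_day / 24 / 1000  # kWh/day -> average MW

print(farm_power_mw(2000))  # 1.25
```

1.25 MW of continuous generation covers a 1 MW IT load plus its overhead at a PUE of 1.25.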
Cave “truth” #2: “Clean” coal
http://www.thisisreality.org http://www.youtube.com/watch?v=W-_U1Z0vezw
Energy Sources CO2 Footprint

Source    gCO2eq / kWh
Coal      1000
Oil       650
Gas       500
Biomass   93
Solar     58
Marine    25
Hydro     7
Nuclear   5
Wind      5

Clean coal?

Source: Carbon Footprint of Electricity Generation, UK Parliament POST, Oct 2006, www.parliament.uk/documents/post/postpn268.pdf
IT Carbon Footprint
• A 20 MW data center running on coal emits 175 200 tons of CO2eq / year
• The equivalent annual carbon footprint of 43 800 people
• The same data center on hydro power: 876 tons, or 219 people
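These figures follow from running the facility flat-out all year (the 876-ton figure implies a carbon intensity of about 5 gCO2eq/kWh for the hydro case):

```python
def annual_co2_tons(power_mw, g_co2_per_kwh):
    """Annual CO2-equivalent emissions for a data center at constant load."""
    kwh_per_year = power_mw * 1000 * 8760  # MW -> kWh over 8760 hours
    return kwh_per_year * g_co2_per_kwh / 1e6  # grams -> metric tons

print(annual_co2_tons(20, 1000))  # coal: 175200.0
print(annual_co2_tons(20, 5))     # low-carbon source: 876.0
```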
Cave “truth” #3: water is free and available

• Colorado Basin situation: a 50% chance of drying up by 2021
• Lake Mead reservoir
• Lake Powell
Path to Wisdom: Zero is the target, not “1”

1. Drive towards net-zero energy consumption
• Stop metering on watts; energy is the real thing
• The lower the PUE, the better (of course)
• But PUE is not enough, and can be quite misleading
• Time to consider auto-production / net metering for datacenters

2. Shoot for ZERO carbon emissions
• The electricity source is critically important vs. embedded carbon
• Carbon taxes are emerging; kWh are not all made equal

3. Use no water
• No more evaporative chillers, nor boiling up the ocean
Why use air, a commonly used “insulator”, as the default heat removal mechanism?
Think different.
h = Q / (A · ∆T)

h: heat transfer coefficient
Q: heat input (W)
A: heat transfer surface area (m²)
∆T: temperature difference (K)

h_water ≈ 100 × h_air — water's heat transport capacity is up to 100 times better than air's.
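Rearranging the relation shows what that factor of ~100 buys: for the same heat load and the same ∆T, water needs roughly 1/100th of the transfer surface. The coefficient values below are illustrative assumptions:

```python
def required_area_m2(q_watts, h, delta_t_k):
    """Surface area needed to move Q watts: A = Q / (h * dT)."""
    return q_watts / (h * delta_t_k)

h_air, h_water = 30.0, 3000.0  # illustrative coefficients, h_water = 100 x h_air
print(required_area_m2(300, h_air, 10))    # 1.0 m2 of finned surface in air
print(required_area_m2(300, h_water, 10))  # 0.01 m2 with water
```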
Back to the Future
Thoughtful cable management:
• Wiring connected before the server is dropped in
• Guides align wires as the server is lowered
• Coolant actually enhances electrical connections

(Front view / section view labels: Ethernet / Fiber, PDU, Power)

Hardcore Computer's Liquid Blade™ Server

As shown in Figure 1, the Liquid Blade™ is a server contained in an extruded aluminum enclosure. All server components, including processors, power supplies, and graphics cards, are immersed in a dielectric liquid inside the sealed case. Racked Liquid Blade™ servers with a liquid distribution manifold system are shown in Figures 2 and 3. Eight Liquid Blade™ servers fit in a row that is 5U in height. Importantly, a Liquid Blade™ server costs about the same as a comparably equipped air-cooled server because blade fans are eliminated and chassis design complexity is reduced. This simplified design results in reduced manufacturing costs.

The Liquid Blade™ server is the focus of this paper. In the following sections, the performance of the Liquid Blade™ will be evaluated and compared with a traditional air-cooled server. The results will be analyzed and, using Jonathan Koomey's model for determining the TCO for data centers, the financial and energy cost reduction benefits of this revolutionary technology will be developed and discussed.4

4 Koomey, J. “A Simple Model for Determining True TCO for Data Centers.” Uptime Institute. http://www.missioncriticalmagazine.com/MC/Home/Files/PDFs/(TUI3011B)SimpleModelDetermingTrueTCO.pdf

Figure 1. Liquid Blade™ server CAD with chassis transparency showing internal component detail.
Figure 2. A standard 19” rack equipped with Liquid Blade™ shelving and dripless inflow and outflow quick-disconnect manifolds.
Figure 3. A row of installed Liquid Blades powered on.

1985 → 2012
Democratizing component level liquid cooling
HP’s Liquid Cooling Today – well, quite soon
Maximize heat re-use
• Hot water outlet
• Optimized water flow rate

Warm water cooling
• Eliminate chillers
• Minimize evaporative cooling
• Leverage dry cooling

Component level cooling
• Improved silicon thermals
• Lower leakage currents
• Facilitated “turbo” modes
High-Voltage AC to the Rack: Limiting conversion steps

Power Distribution Efficiency

[Diagram: two power chains from a 480VAC source to the 12VDC server load.
• Typical: 480VAC source → UPS (rectifier 97% conversion, batteries, inverter 97% conversion) → 480–208V transformer (98% conversion) → PDU (99% distribution) → server PSU rectifier (98% conversion) → DC-DC step-down (97% conversion) → 12VDC server. Net: ≈87% efficiency.
• HVAC to rack: 480VAC source → 480V rectifier at the rack (98% conversion) → HVDC bus, with battery backup unit (BBU) and alternate inputs → DC-DC step-down (98% conversion) → 12VDC server. Net: ≈95% efficiency.]
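End-to-end distribution efficiency is simply the product of the stage efficiencies, which is why removing conversion steps matters so much. The per-stage values below are illustrative assumptions chosen to match the slide's rough totals:

```python
from math import prod

def chain_efficiency(stages):
    """Overall efficiency of a power chain: product of stage efficiencies."""
    return prod(stages)

# Typical chain: UPS rectifier, inverter, 480-208V XFMR, PDU, PSU, step-down
typical = chain_efficiency([0.97, 0.97, 0.98, 0.99, 0.98, 0.97])
# HVAC to rack: rack rectifier, distribution, step-down
hvac_to_rack = chain_efficiency([0.98, 0.99, 0.98])
print(round(typical, 2), round(hvac_to_rack, 2))  # 0.87 0.95
```

Each "merely 97–98% efficient" stage compounds; six stages lose about an eighth of the power before it reaches silicon.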
10 MW Datacenter Design Match-up

kWatts                     Conventional  Free Air @ 20C  Free Air @ 35C  NREL + HP Rack
IT Load                    10 000        10 000          11 530          10 000
DC Fan Load                400           400             1 614           0
Chiller Load               1 706         0               0               0
Evap. Towers               0             0               0               284
Water Pumps                114           0               0               40
UPS Losses                 500           0               0               0
Power Distribution Losses  900           900             1 038           400
Humidification / DeHum     100           200             231             0
Lighting                   2             2               2               2
Total Power Consumption    13 722        11 502          14 415          10 726
IT Load PUE                1.37          1.15            1.25            1.07

$2.6M in annual energy savings (@ 10 cents / kWh)!
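The PUE row and the savings claim both follow from the table — a quick check using its total and IT figures:

```python
designs = {  # total facility kW, IT kW, from the match-up table above
    "Conventional": (13_722, 10_000),
    "Free Air @20C": (11_502, 10_000),
    "Free Air @35C": (14_415, 11_530),
    "NREL + HP Rack": (10_726, 10_000),
}
for name, (total_kw, it_kw) in designs.items():
    print(name, round(total_kw / it_kw, 2))  # PUE per design

# NREL + HP Rack vs Conventional, year-round at $0.10/kWh
savings = (13_722 - 10_726) * 8760 * 0.10
print(round(savings / 1e6, 1))  # ~2.6 million dollars per year
```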
NREL – Energy Systems Integration Facility National Renewable Energy Labs – Golden, CO
• 1.06 PUE target
• ERE << 1
• Computer used as a furnace in winter: building heating and snow melting
• “Warm water cooling”
• HV-AC power distribution
• Heat re-use aware scheduler

See you at SC13 in Denver next year!
Thank you