How Green Is Your Data Center
How green is your Data Center?
David Gillespie, AIA, CSI, LEED AP
Director: Communications & Technology
CRS Engineering & Design Consultants
• Green as in environmentally conscious
• Green as in money (saved or spent)
• Green as in solar powered
• Green as in reduced carbon footprint
• Green as in those little green lights
• Green as in ???
Answer: YES
Commercial Building Resource Impact
• Energy experts estimate that data centers gobble up somewhere between 1.5 percent and 3 percent of all electricity generated in the United States.
• According to the Uptime Institute, more than 60 percent of the power used to cool equipment in the data center is completely wasted.
Perception is NOT Reality
Data Center “Deficiencies”
1. Insufficient power 29%
2. Excessive heat 29%
3. Insufficient raised floor area 21%
4. Other factors* 13%
5. Poor location 6%
6. Excessive facility cost 3%
Source: © 2005 Gartner
* Other factors (read: human error) include absent or ineffective management policies and practices.
Opportunities

Where's the Problem?
Competing priorities
IT Department
• Computing Services
• Applications and Servers
• Data Storage/ Retrieval
• Services Management
• Disaster Avoidance/ Recovery
• Business Continuity
• Commissioning and Testing
Facilities Management
• Life Safety/ Fire Protection
• Security
• Building systems: M/E/P
• Infrastructure support
• Control and Monitoring
• O&M
• Disaster Avoidance/ Recovery
Mind the Gap
Network-Critical Physical Infrastructure (NCPI)
(Diagram: NCPI elements, including fire & safety.)
The Numbers*
• 90% of US data centers will experience a service interruption that will affect business operations in the next 5 years.
• 80% of data centers surveyed have experienced a service interruption in the last 5 years, and of those, 82% were power related.
• 77% of US data center managers believe they will have to make major physical improvements or be forced to relocate their data center in the next 10 years.
• 53% of US data centers are planning for an expansion in the next 5 years.
• 42% believe business growth will be the largest factor in data center changes over the next 10 years, followed by facility age and regulatory compliance.
• Survey says: the two largest data center problems are power and heat.
* © 2006 Data Center Institute Keynote Address
We have a problem…
• Original data center layout sketch from client
• Space is overheating
• Both CRAC units are in operation 24/7. (2 units 10 Tons each)
• Power panels have capacity
• UPS is only loaded to 38.5%
• FYI: Two new blade server racks ship in three days.
CRAC units: 10 tons each
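Before the formal data collection described on the next slide, a back-of-the-envelope check can frame the question: is this a raw-capacity problem or an air-distribution problem? The sketch below is only illustrative; the 80 kVA UPS size and 0.9 power factor are hypothetical assumptions, while the 38.5% UPS load and the two 10-ton CRAC units come from the slide.

```python
# Back-of-the-envelope cooling check for the room above.
# Assumptions (not from the deck): 80 kVA UPS, 0.9 power factor.
# From the slide: UPS loaded to 38.5%, two 10-ton CRAC units installed.

BTU_PER_WATT = 3.412        # 1 W of IT load ~ 3.412 BTU/hr of heat
BTU_PER_TON = 12_000        # 1 ton of cooling = 12,000 BTU/hr

ups_kva = 80.0              # hypothetical UPS capacity
power_factor = 0.9          # assumed
load_fraction = 0.385       # "UPS is only loaded to 38.5%"

it_load_w = ups_kva * 1_000 * power_factor * load_fraction
cooling_needed_tons = it_load_w * BTU_PER_WATT / BTU_PER_TON
installed_tons = 2 * 10

print(f"Estimated IT load:  {it_load_w / 1_000:.1f} kW")
print(f"Cooling required:   {cooling_needed_tons:.1f} tons")
print(f"Cooling installed:  {installed_tons} tons")
```

Under these assumed numbers the nominal load sits well below the installed 20 tons, which hints that air distribution (not just tonnage) may be part of the problem; the data collection and thermal modeling that follow are how that hypothesis gets tested.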
Where can we start?
Techniques for data collection
• Template calculators (IT electrical, cooling, service, power factors, generators, UPS, growth, etc.)
• Summarize nameplate data: inventory all IT equipment and assign power factors.
• Field measurement: collect temperature, power consumption and air flow in the field. (timing is critical)
• Consumption records. (UPS data)
• Trend analysis of computing schedules.
• Interview stakeholders.
• RESULT: total power (in watts) consumed by the DC.
Power consumed = Heat generated = Cooling required (a minimal calculator sketch follows).
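A minimal sketch of the kind of template calculator listed above, assuming hypothetical device names, nameplate watts, and derating factors; the structure, not the numbers, is the point.

```python
# Minimal "template calculator" sketch: sum inventoried IT loads and
# convert to required cooling. Device names, wattages, and derating
# factors are hypothetical placeholders, not values from the deck.

BTU_PER_WATT = 3.412        # 1 W of IT load ~ 3.412 BTU/hr of heat
BTU_PER_TON = 12_000        # 1 ton of cooling = 12,000 BTU/hr

# (device, quantity, nameplate watts, derating factor: measured vs nameplate)
inventory = [
    ("1U server",      12,  350, 0.65),
    ("Blade chassis",   2, 4500, 0.70),
    ("Network switch",  4,  120, 0.80),
    ("Storage array",   1, 1100, 0.75),
]

total_w = sum(qty * watts * derate for _, qty, watts, derate in inventory)
heat_btu_hr = total_w * BTU_PER_WATT
tons = heat_btu_hr / BTU_PER_TON

print(f"Total IT load:    {total_w / 1_000:.1f} kW")
print(f"Heat generated:   {heat_btu_hr:,.0f} BTU/hr")
print(f"Cooling required: {tons:.1f} tons")
```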
Network Inventory
That’s a nice spreadsheet but what does all this mean?
HELP…
To help determine what caused the current problems, and to help avoid future ones, the IT department forwards the latest equipment inventory for the data center. This is a typical IT management document.
Equipment Inventory
Example: Single Rack

(Rack elevation drawings, front and rear: B-LINE Systems Access rack, 47 U, 19" EIA, 23" wide.)

Installed equipment:
• Dell PowerEdge 4600 (4)
• Dell PowerEdge 3250 (4)
• Dell PowerVault 745N (1)
• Dell PowerVault 124T (4)
• Dell 310-4227 (1)
• Dell PowerConnect 2216 (3)
• Dell RPS-600 (3)

Rack totals: 43.0 Amps, 12,040 BTU/hr, 700 lbs.
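One way to make an inventory like this answer the "what does all this mean" question is to roll nameplate data up per rack and flag circuits that are running hot. The sketch below uses hypothetical device amp draws and a hypothetical 30 A branch circuit per rack; the 80% continuous-load limit is common practice, not a figure from the deck.

```python
# Sketch: roll a per-rack inventory up to branch-circuit headroom.
# Device amp draws and the 30 A circuits are hypothetical examples.

CONTINUOUS_LOAD_LIMIT = 0.80   # keep continuous load <= 80% of breaker rating

racks = {
    "Rack A": {"circuit_amps": 30,
               "devices": [("PowerEdge 4600", 4, 5.5),     # (name, qty, amps each)
                           ("PowerConnect 2216", 2, 0.5)]},
    "Rack B": {"circuit_amps": 30,
               "devices": [("PowerEdge 3250", 8, 3.2)]},
}

for name, rack in racks.items():
    amps = sum(qty * a for _, qty, a in rack["devices"])
    usable = rack["circuit_amps"] * CONTINUOUS_LOAD_LIMIT
    status = "OK" if amps <= usable else "OVERLOADED"
    print(f"{name}: {amps:.1f} A drawn vs {usable:.1f} A usable -> {status}")
```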
Example: Generic Data Center
• Existing layout
Modeling: Existing Conditions
• Thermal model initial results: Imminent Failure
(2) 10 ton CRAC units – 20 Tons
Air flow
Modeling: Hot/ Cold Aisle
• Solution: Add more cooling, change equipment orientation
• Result: Improved environment, not ideal
(Layout: cold aisle / hot aisle / cold aisle)
(3) 10 ton CRAC units – 30 Tons
Solutions
• Alternate layout and equipment adjustments
(4) 7.5 ton CRAC units – 30 tons
(Layout: four cold aisles)
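A quick way to see why four 7.5-ton units can be preferable to three 10-ton units at the same 30-ton total is to ask what happens when one unit is down for failure or maintenance. The room heat load below is an assumed figure for illustration only.

```python
# Compare the two 30-ton alternatives above with one CRAC unit out of service.
# The 22-ton design heat load is a hypothetical example, not from the deck.

options = {
    "(3) 10-ton units": (3, 10.0),
    "(4) 7.5-ton units": (4, 7.5),
}
design_load_tons = 22.0   # assumed room heat load for illustration

for name, (count, size) in options.items():
    total = count * size
    after_failure = (count - 1) * size
    verdict = "covers" if after_failure >= design_load_tons else "falls short of"
    print(f"{name}: {total:.0f} tons installed, "
          f"{after_failure:.1f} tons with one unit down "
          f"({verdict} a {design_load_tons} ton load)")
```

Equal total capacity, but the modular option keeps the room covered during a single-unit outage, which is the "modular and balanced" deployment point on the next slide.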
Getting it Done
• Critical Issues: Power, Heat, Space
• Solutions: Infrastructure Matching
• Assessment – power, cooling, sequencing
• Forecast – growth (and gotchas)
• Budget – short term, long term
• Schedule – near view, far view
• Deploy – modular and balanced
• Reality: Budget and schedule are fixed (too small, but fixed)
• Small steps, big finish
So?
The big “Ta-Da”
• Alternatives: nothing is free, but these trade-offs can work
• Consolidate – no more single-app boxes!
• Right sizing – everything from power supplies to rack enclosures
• Deploy power management
• Optimize air flow – hot aisle, cold aisle
• Know your thresholds – temperature and power
• Use energy-efficient servers (some use 40% less power)
• Use high-efficiency power supplies (see right sizing above; a rough example follows this list)
• Bridge the IT/Facilities gap
• Do your homework – follow standards and benchmark performance
• Get the people out!
• Push harder – advocate for change from all sources: suppliers, manufacturers, providers, etc. If you don’t ask…
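As a rough illustration of the power-supply point above (the efficiencies, IT load, and electricity rate are assumed figures, not values from the presentation):

```python
# Rough savings estimate for replacing low-efficiency power supplies with
# high-efficiency units. Efficiencies, load, and rate are illustrative.

it_load_kw = 40.0          # assumed IT load served by the power supplies
eff_old, eff_new = 0.75, 0.92
rate_per_kwh = 0.10        # assumed electricity cost, $/kWh
hours_per_year = 8760

def wall_power(load_kw: float, efficiency: float) -> float:
    """Power drawn from the wall to deliver `load_kw` to the IT equipment."""
    return load_kw / efficiency

saved_kw = wall_power(it_load_kw, eff_old) - wall_power(it_load_kw, eff_new)
annual_kwh = saved_kw * hours_per_year
print(f"Power saved:   {saved_kw:.1f} kW")
print(f"Annual saving: {annual_kwh:,.0f} kWh (~${annual_kwh * rate_per_kwh:,.0f}/yr)")
```

And because every watt no longer lost in the power supply is also a watt of heat the CRAC units no longer have to remove, the cooling savings compound on top of this.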
The Answer?
Planning
System Redundancy
Awareness
Right Sizing
Management
Modular Design
All of the above
Radical Common Sense
You already know this
• Model space for temperature, air flow, power distribution and capacity.
• Identify single points of failure, infrastructure capacity issues or service interruption threats.
• Utility assessment models: current, imminent, future.
• Adjust equipment orientation (hot aisle / cold aisle).
• Segregate power and data cabling.
• Introduce redundancy.
• Establish modular equipment standards. (and follow them)
• Construct and maintain a disaster avoidance/ recovery plan.
• Review deployment strategies.
• Clean up! Eliminate all non-IT equipment and people.
• Look, Plan, Build, then Deploy (in that order)
How green is your Data Center?
David Gillespie, AIA, CSI, LEED AP
Director: Communications & Technology
CRS Engineering & Design Consultants
[email protected]