Satellite Server Rooms - sustainable.stanford.edu

Page 1:

Satellite Server Rooms: “Taming the Beast”

August 12, 2011

Joyce Dickerson, Director, Sustainable IT, Stanford University

Page 2:

Initiatives and Results

1. The Big Picture

2. Data Center Retrofit

3. Server Room Study

4. Server Room Strategy

Page 3:

Datacenters Come in All Shapes and Sizes

Server Closet: secondary computer location, often outside of IT control, or may be a primary site for a small business. Standard room air-conditioning, no UPS. Point-specific applications. Under 200 sq ft; downtime response within one day. US data centers (2009 est.): 1,345,741; total servers (2009 est.): 2,135,538; average 2 servers per location.

Server Room: secondary computer location, under IT control, or may be a primary site for a small business. Upgraded room air conditioning, single UPS. Departmental or point-specific applications. Under 500 sq ft; downtime response within four hours. US data centers: 1,170,399; total servers: 3,057,834; average 3 servers per location.

Localized Data Center: primary or secondary computer location, under IT control. Maintained at 17°C; some power and cooling redundancy. Some enterprise-wide, business-critical applications. Under 1,000 sq ft; downtime response within two hours. US data centers: 64,229; total servers: 2,107,592; average 32 servers per location.

Mid-tier Data Center: primary computing location, under IT control. Maintained at 17°C; some power and cooling redundancy. Some enterprise-wide, business-critical applications. Under 5,000 sq ft; downtime response within minutes, may have hot site for redundancy. US data centers: 9,758; total servers: 1,869,595; average 192 servers per location.

Enterprise-Class Data Center: primary computing location, under IT control. Maintained at 17°C; at least N+1 power and cooling redundancy. Enterprise-wide, mission-critical applications. Over 5,000 sq ft; immediate downtime response, has hot site for redundancy. US data centers: 7,006; total servers: 3,604,678; average 515 servers per location.

Chart annotations: closets, rooms, and localized data centers total 2,580,369 locations with 7,300,964 servers; mid-tier and enterprise-class data centers total 16,764 locations (0.6%) with 5,474,273 servers (43%).

Source: IDC Special Study, Data Center of the Future, Michelle Bailey et al., April 2006, IDC #06C4799
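As a quick arithmetic check, the chart annotations are just sums over the IDC columns; a minimal sketch in Python, with the figures copied from the table above:

```python
# Aggregate the IDC 2009 estimates the way the chart annotations above do.
locations = {  # US data centers (2009 est.)
    "closet": 1_345_741, "room": 1_170_399, "localized": 64_229,
    "mid_tier": 9_758, "enterprise": 7_006,
}
servers = {  # total servers (2009 est.)
    "closet": 2_135_538, "room": 3_057_834, "localized": 2_107_592,
    "mid_tier": 1_869_595, "enterprise": 3_604_678,
}

small = ("closet", "room", "localized")  # the 'server rooms'
large = ("mid_tier", "enterprise")       # true data centers

small_locs = sum(locations[k] for k in small)  # 2,580,369
large_locs = sum(locations[k] for k in large)  # 16,764
small_srv = sum(servers[k] for k in small)     # 7,300,964
large_srv = sum(servers[k] for k in large)     # 5,474,273
total_srv = small_srv + large_srv

print(f"{small_locs:,} 'server rooms' hold {small_srv:,} servers ({small_srv / total_srv:.0%})")
print(f"{large_locs:,} data centers ({large_locs / sum(locations.values()):.1%} of locations) "
      f"hold {large_srv:,} servers ({large_srv / total_srv:.0%})")
```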

Page 4:

Widely Distributed and Tough to Manage

43% of servers are in 0.6% of datacenters (enterprise & mid-tier):
• Concentrated and easy to find
• Staffs of electrical & mechanical engineers to design & construct efficient data centers

57% of servers are in 99.4% of rooms:
• More than 2.5 million 'server rooms' in hospitals, hotels, universities, utilities, banks, city halls, chain stores, and departments
• Operators struggle with heat/space/power problems without much internal expertise
• Widely distributed, and often hidden

Source: EPRI Analysis of IDC Special Study, Data Center of the Future

Page 5:

Initiatives and Results

1. Server Rooms Globally

2. Data Center Retrofit

3. Server Room Study

4. Server Room Strategy

Page 6:

Datacenter: Forsythe Energy Efficiency Project

2008:
• 20,000 square foot datacenter
• Raised floor
• Cooling: 13 CRAH units
• Mostly admin computing, some research
• Rack orientation: some hot/cold
• Little focus on energy use
• Minimal environmental monitoring: three temperature gauges on the wall

2009: kicked off Energy Efficiency Project; measured PUE: 1.8

Measures taken:
• Monitoring
  – Temperature monitors throughout
  – Datacenter dashboard
• Air flow & temperature
  – VFDs on CRAH units, running at 75%
  – Percentage of outside air increased
  – Water-chilled racks raised to 74°F
  – Average ambient air temperature raised to 74°F
• Containment
  – Replaced aging ceiling tiles (cleanroom)
  – Better aligned floor and ceiling tiles; relentless use of blanking panels
  – End-aisle doors (bathtub)
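The reason running CRAH fans at 75% via VFDs pays off so well is the fan affinity law: fan power scales roughly with the cube of fan speed. A minimal sketch of that arithmetic; the cube-law exponent is the textbook idealization, not a measured Stanford value:

```python
# Fan affinity law: power ~ speed^3 (idealized; real fans deviate somewhat).
def fan_power_fraction(speed_fraction: float, exponent: float = 3.0) -> float:
    """Fraction of full-speed fan power drawn at a given speed fraction."""
    return speed_fraction ** exponent

speed = 0.75  # CRAH fans slowed to 75% by the VFDs
print(f"Fan power at {speed:.0%} speed: {fan_power_fraction(speed):.0%} of full power")
# -> about 42% of full fan power, i.e. roughly 58% fan-energy savings
```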

Page 7:

Dashboard – Temperature maps
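The original slide shows the dashboard's heatmap screenshots. As a rough illustration of the idea, here is a minimal sketch that renders a temperature map from a grid of sensor readings; the sensor grid and values are hypothetical, and numpy/matplotlib are assumed:

```python
# Render a dashboard-style temperature map from gridded sensor readings.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
temps = 70 + 8 * rng.random((6, 10))  # hypothetical 6 rows x 10 rack positions, in °F
temps[2:4, 4:7] += 6                  # an artificial hot spot, to make one visible

fig, ax = plt.subplots()
im = ax.imshow(temps, cmap="coolwarm", vmin=65, vmax=90)
ax.set_xlabel("Rack position")
ax.set_ylabel("Row")
ax.set_title("Inlet temperature map (°F)")
fig.colorbar(im, ax=ax, label="°F")
plt.show()
```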

Page 8:

Datacenters – How it all turned out

Monthly usage, indexed to Jan 2009 = 1.00:

Month:          Jan-09 Feb-09 Mar-09 Apr-09 May-09 Jun-09 Jul-09 Aug-09 Sep-09 Oct-09 Nov-09 Dec-09 Jan-10 Feb-10 Mar-10
Electricity:      1.00   1.00   0.91   0.99   0.99   0.99   0.98   0.97   1.00   1.03   1.01   0.95   1.00   1.04   1.02
Chilled Water:    1.00   1.02   1.13   1.19   1.28   1.25   1.34   1.36   1.36   1.40   1.28   0.82   0.55   0.58   0.67
IT Load:          1.00   1.01   1.01   1.02   1.01   1.00   1.02   1.02   1.04   1.06   1.07   1.07   1.07   1.10   1.09

Chart annotations: "Sensors & VFDs operating; roof work started" and "Roof complete."

PUE reduced from 1.8 to 1.4:
• IT load up 9%
• Electricity ~flat
• Chilled water down 30-45%
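Since PUE is total facility energy divided by IT energy, dropping from 1.8 to 1.4 halves the non-IT overhead per unit of IT load. A minimal sketch of what that means for total energy, using the slide's 9% IT-load growth and an illustrative baseline:

```python
# PUE = total facility energy / IT energy, so facility energy = IT energy * PUE.
it_before = 1_000_000.0      # illustrative baseline IT energy, kWh (not from the slide)
it_after = it_before * 1.09  # IT load up ~9% (from the slide)

facility_before = it_before * 1.8  # PUE before retrofit
facility_after = it_after * 1.4    # PUE after retrofit

print("Overhead per IT kWh: 0.8 -> 0.4 (halved)")
print(f"Total facility energy change: {facility_after / facility_before - 1:+.1%}")
# -> about -15%, even though IT load grew 9%
```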

Page 9:

Ongoing Results: Forsythe Datacenter

[Chart: IT Load Growth vs. Energy Use per IT Load, Jan 2007 through Mar 2011; y-axis in kWh per kW IT load (PDU load = IT load). Series: CW+Elec kWh/IT Load, Elec kWh/IT Load, Ton-hr/IT Load. Annotations: start of efficiency work, Jan 2009; outside air cooling, Nov 2009.]

PUE holding at 1.4

Page 10:

Financial Impact

Total Cost of the Project: $336,428
Estimated Annual Energy Savings: $175,000/year
Simple Payback: 1.9 years

Incentives from PG&E: $36,428
Incentives from Stanford: $300,000
Net Cost to Department: $0

Annual kWh Savings: 1,842,105 kWh
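These figures reduce to simple division; a minimal check (the implied electricity rate is derived here, not stated on the slide):

```python
total_cost = 336_428          # project cost, $
annual_savings = 175_000      # estimated annual energy savings, $/yr
pge_incentive = 36_428        # PG&E incentive, $
stanford_incentive = 300_000  # Stanford incentive, $
annual_kwh = 1_842_105        # annual kWh savings

print(f"Simple payback: {total_cost / annual_savings:.1f} years")  # ~1.9
print(f"Net cost to department: ${total_cost - pge_incentive - stanford_incentive:,}")  # $0
print(f"Implied electricity rate: ${annual_savings / annual_kwh:.3f}/kWh")  # ~$0.095
```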

Page 11:

Initiatives and Results

1. Server Rooms Globally

2. Data Center Retrofit

3. Server Room Study

4. Server Room Strategy

Page 12:

How Efficient are Satellite Server Rooms?

Credit: Wikimedia Commons

Satellite server rooms:
• Independent server rooms managed by departments, schools, and researchers
• Identified by Facilities on the main campus
• 70 identified, but definitely more undiscovered, and rapidly increasing in number

Study conducted Spring/Summer 2009:
• Rooms selected by type of cooling
• Comparisons computed: PUE, energy use, annual cost, and cost per kWh of IT load

Page 13:

Satellite Server Rooms – Summary Table

Closet (Admin): cooling: fan coil + house air. IT load 10 kW; 83 IT watts/sq ft. Operating PUE 2.36; target norm PUE 1.65. 0.2% of building, 7% of building energy. Annual utility cost to run: $19,029; average daily utility cost per kW IT load: $5.11. Actions taken: 1) removed equipment (backups); 2) redirected airflow from back to front; 3) raised temperature.

Growing High Density (Research): cooling: Liebert water-cooled racks. IT load 41 kW; 34 IT watts/sq ft. Operating PUE 2.00; target norm PUE 1.99. 12% of building, 14% of building energy. Annual cost: $62,875; daily cost per kW IT load: $4.19. Status: no change.

Lights Out Cinderblock (Admin): cooling: DX. IT load 44 kW; 30 IT watts/sq ft. Operating PUE 1.70; target norm PUE 1.54. 100% of building, 100% of building energy. Annual cost: $71,995; daily cost per kW IT load: $4.44. Status: no change.

Mini Datacenter (Mixed Use): cooling: raised floor & CRAH units. IT load 59 kW; 50 IT watts/sq ft. Operating PUE 3.14; target norm PUE 2.63. 15% of building, 22% of building energy. Annual cost: $141,918; daily cost per kW IT load: $6.55. Status: retrofit in queue.

High Density (Research): cooling: APC hot-aisle containment. IT load 223 kW; 278 IT watts/sq ft. Operating PUE 1.27; target norm PUE 1.38. 2.7% of building, 41% of building energy. Annual cost: $261,387; daily cost per kW IT load: $3.21. Status: in design.
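The "average daily utility cost per kW IT load" figure is annual cost divided by IT kW times 365; a minimal check against the table (small differences come from the IT loads being rounded to whole kW):

```python
# Daily utility cost per kW of IT load = annual utility cost / (IT kW * 365).
rooms = {  # room: (annual utility cost $, IT load kW) from the summary table
    "Closet": (19_029, 10),
    "Growing High Density": (62_875, 41),
    "Lights Out Cinderblock": (71_995, 44),
    "Mini Datacenter": (141_918, 59),
    "High Density": (261_387, 223),
}
for name, (annual_cost, it_kw) in rooms.items():
    print(f"{name}: ${annual_cost / (it_kw * 365):.2f}/kW-day")
# Table lists $5.11, $4.19, $4.44, $6.55, $3.21; rounding of the kW loads
# accounts for the small differences.
```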

Page 14:

Initiatives and Results

1. Server Rooms Globally

2. Data Center Retrofit

3. Server Room Study

4. Server Room Strategy

Page 15:

Taking on the Server Room Challenge

What we learned:
– Not all server rooms are created equal; some are sufficiently efficient
– Departments are motivated to save energy and cost
– Server rooms are expanding rapidly, especially with High Performance Computing
– No available guidelines
– Some server rooms' total wattage is higher than the main datacenter's

Path forward:
– Create a Server Room Design Guide
– Pilot a server room retrofit
– Track costs and savings; make it repeatable
– Develop a retrofit path

Page 16:

Server Room Design Guide

http://bgm.stanford.edu/server_telecom

Developed by Facilities, IT Services, and Sustainable IT

Designed for staff and contractors

Educational as well as prescriptive:
– ASHRAE standards and temperatures
– Design worksheet
– Design matrix
– General guidelines, and "Stanford Recommends"
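One calculation a design worksheet like this typically walks through is sizing cooling to the IT load. A minimal sketch using the standard conversion (1 ton of refrigeration removes 3.517 kW of heat); the 20% margin is an illustrative assumption, not a figure from the guide:

```python
# Rough cooling sizing: 1 ton of refrigeration removes 3.517 kW of heat.
KW_PER_TON = 3.517

def cooling_tons(it_load_kw: float, margin: float = 1.2) -> float:
    """Tons of cooling for a given IT load, with an illustrative 20% margin."""
    return it_load_kw * margin / KW_PER_TON

# e.g. a 59 kW room (the Mini Datacenter's IT load) needs roughly 20 tons.
print(f"{cooling_tons(59):.1f} tons")
```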

Page 17:

Pilot server room retrofit: Mini-Datacenter

Goal: develop a 'turnkey' retrofit for server rooms
– Engage PG&E: consultant recommends actions and maps them to savings & incentives
– Engineering design: estimate costs based on the PG&E report
– Solicit bids
– Select contractor and implement
– PG&E post-project savings confirmation
– Incentive check received; monthly energy savings begin
– Publish 'turnkey' program
– Apply across campus

Page 18:

Server room retrofit: Mini-Datacenter (PUE 3.14 operating / 2.63 target)

Measures to implement (based on the PG&E report and learnings from the datacenter retrofit):
– Monitoring
  • Limited temperature/pressure points
  • Connected to central monitoring and control system (EMCS)
– Airflow and temperature
  • VFDs on CRAH units
  • Temperatures increased
– Containment
  • Blanking panels & gap fillers
  • Ceiling plenum opened: better aligned floor and ceiling tiles; CRAH unit chimney to plenum
  • End-aisle doors

Open items (as of 8/12/2011):
– If containment can't go to the ceiling, is it worth opening the plenum?
– How high should the CRAH chimney be?
– Cold- or hot-aisle containment? Cold = bathtub, focused; hot = comfort.

Page 19:

Financial Impact: Mini Datacenter Efficiency Retrofit

** These are estimates based on PG&E analysis. Project not yet implemented. **

Total Cost of the Project: $79,000
Estimated Annual Energy Savings: $10,484/year
Simple Payback: 7.5 years

Incentives from PG&E: $11,814
Simple Payback after incentive: 6.4 years

Funds from Stanford: $59,070
Net Cost to Department: $8,116
Simple Payback to department: 0.8 years

Annual kWh Savings: 116,489
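As with the Forsythe project, the paybacks here follow from simple division; a minimal check:

```python
total_cost = 79_000      # estimated project cost, $
annual_savings = 10_484  # estimated annual energy savings, $/yr
pge_incentive = 11_814   # PG&E incentive, $
stanford_funds = 59_070  # Stanford funds, $

print(f"Simple payback: {total_cost / annual_savings:.1f} years")  # ~7.5
print(f"After PG&E incentive: {(total_cost - pge_incentive) / annual_savings:.1f} years")  # ~6.4
net_dept = total_cost - pge_incentive - stanford_funds  # $8,116
print(f"Department payback: {net_dept / annual_savings:.1f} years")  # ~0.8
```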

Page 20:

Server Room Retrofit: Prioritize the Rooms

Start with the division with the greatest opportunity:
– Meet with central staff and Facilities
– 'Find' the rooms
– Develop room options:
  • Move to the Admin Datacenter (PUE 1.4)
  • Move to the Research Compute Facility (est. PUE <1.2)
  • Move to another campus location
  • Retrofit for efficiency
– Rank by biggest savings opportunity
– Apply learnings from the pilot and the FDG
– Fund with PG&E and internal incentives
– Measure, implement, measure & report

Page 21:

Sustainable IT @ Stanford – The Big Picture

Electricity use, total campus: 6% personal computers, 1% administrative computers, 7% research computing, 86% non-IT.

[Chart: IT, expected growth, 2008-2011 - campus electricity use with vs. without Sustainable IT (kWh), highlighting electricity savings from Sustainable IT.]

[Chart: 2008-2011 chilled water use with vs. without Sustainable IT (ton-hours), highlighting chilled water savings from Sustainable IT.]

Page 22:

Credit: Wikimedia Commons

Questions and Answers

Joyce Dickerson [email protected]