LiquidCool Solutions Value for Facebook


Transcript of LiquidCool Solutions Value for Facebook

REVOLUTION IN THE DATA CENTER: MOVING FROM AIR COOLING TO LIQUID COOLING

Proprietary & Confidential 2016, LiquidCool Solutions, Inc.

LiquidCool Solutions: the market leader!

» Pioneered the liquid submersion cooling movement

» World’s first commercially available liquid submersion computing platforms

» Over 40 patents around submersion cooling

» The only company to make submersion cooling practical, scalable & award winning

The Evolution of LiquidCool Solutions (2002-2015)

LiquidCool Solutions Milestones:

• Hardcore Computer, Inc. Founded
• World’s First Submersion Desktop PC, “Reactor”
• World’s First Submersion Workstation, “Detonator”
• Hardcore Computer becomes LiquidCool
• World’s First Submersion Rack Server, “LSS”
• World’s First Submersion Data Center Module
• Wells Fargo Invests
• Intel Partnership
• Rugged Tier Computing
• TROPEC Validation
• NREL Validation
• Core Coolant Developed
• 40 Patents

Market Milestones:

• Efficient Server Power
• On-Board UPS
• Carbon Neutral
• Warehouse Scale Datacenter
• Containerized Datacenter
• First Public PUE Reporting
• Facebook Open Compute Project
• Free Air Cooling
• Cloud Computing Consolidation
• IBM 2-Phase Cooling
• Google Dataflow

Over the past 10+ years, LiquidCool has built the most energy-efficient, cost-effective and practical approach to data center cooling.

Fans are the problem

Fans Waste Energy

• 15% of total datacenter energy is used to move air

• Fans in the chassis can use up to 20% of power at the device level

• Fans are inefficient and generate heat that must be removed

Fans Waste Space

• Racks need room to breathe

• CRAC units require space around the racks

Fans Reduce Reliability

• Fans fail

• Thermal fluctuations drive solder joint failures

• Vibration frets electrical contacts

• Fans expose electronics to air

• Oxidation/corrosion of electrical contacts

• Sensitivity to ambient particulate, humidity and temperature conditions

Air is an insulator, not a coolant

» Cooling with Air is Expensive

» Air Cooling Wastes Space

» Air Cooling Limits Power Density

» Air Cooling Reduces Reliability

• Fans Fail

• Air Causes Oxidation and Corrosion

• Air Enables Large Temperature Swings

» Air Cooling Wastes Energy

• Up to 15% of Total Power is used to Move Air

• Up to 20% More Power is used by Rack Fans

• Fans Generate Heat that must be Dissipated
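To make those percentages concrete, here is a minimal back-of-the-envelope sketch in Python. The 1 MW IT load is an assumed illustrative figure, and treating the facility-level 15% and device-level 20% numbers as independent overheads is an interpretation of the slide, not LiquidCool data.

```python
# Back-of-the-envelope sketch of the fan overheads quoted above.
# The 1 MW IT load and the independence of the two percentages are
# illustrative assumptions, not figures from LiquidCool.

it_load_kw = 1000.0          # assumed IT load drawn by the server silicon (kW)
chassis_fan_fraction = 0.20  # "up to 20% of power at the device level"
facility_air_fraction = 0.15 # "15% of total datacenter energy is used to move air"

# Power burned by fans inside the chassis, on top of the silicon's own draw.
chassis_fan_kw = it_load_kw * chassis_fan_fraction

# Treat the 15% figure as the share of total facility energy spent moving air
# (CRAC/CRAH blowers, etc.).  If air movement is 15% of the total, then:
#   total = (IT + chassis fans) / (1 - 0.15)
total_kw = (it_load_kw + chassis_fan_kw) / (1.0 - facility_air_fraction)
air_moving_kw = total_kw * facility_air_fraction

print(f"Chassis fans:        {chassis_fan_kw:,.0f} kW")
print(f"Facility air movers: {air_moving_kw:,.0f} kW")
print(f"Energy spent just moving air: "
      f"{(chassis_fan_kw + air_moving_kw) / total_kw:.0%} of the facility total")
```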

LiquidCool’s Elegant Solution

Total liquid immersion in an eco-friendly dielectric fluid

• “Cool” liquid is circulated directly to the hottest components first. Remaining components are cooled by bulk flow as the dielectric liquid exits the chassis.

• Directed flow is a key differentiator

• No fans or other moving parts in the chassis

• Rack-mounted devices are easy to maintain

• Off-the-Shelf Components

Chassis flow diagram: dielectric fluid enters at up to 110°F and exits at up to 125°F; no moving parts in the chassis; flow rate 0.25 gal/min.
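As a rough sanity check on the flow figures above, the sketch below applies Q = ṁ·cp·ΔT to the quoted 0.25 gal/min and the 110°F-to-125°F temperature rise. The fluid density and specific heat are generic dielectric-coolant assumptions (Core Coolant properties are not published in this deck), so the result is illustrative only.

```python
# Minimal sketch: heat carried away by a single directed-flow chassis at the
# quoted flow rate and temperature rise.  Fluid properties are generic
# dielectric-coolant assumptions, not Core Coolant specifications.

GAL_TO_L = 3.785

flow_l_per_s = 0.25 * GAL_TO_L / 60.0    # 0.25 gal/min -> litres per second
density_kg_per_l = 0.83                  # assumed dielectric fluid density
cp_j_per_kg_k = 2000.0                   # assumed specific heat capacity
delta_t_k = (125.0 - 110.0) * 5.0 / 9.0  # 15 °F rise expressed in kelvin

mass_flow_kg_per_s = flow_l_per_s * density_kg_per_l
heat_removed_w = mass_flow_kg_per_s * cp_j_per_kg_k * delta_t_k  # Q = m_dot * cp * dT

print(f"Heat carried away per chassis: ~{heat_removed_w:.0f} W")
# Higher-power nodes would simply run a larger per-chassis flow or a wider
# temperature rise; the relationship is linear in both.
```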

20 GRANTED PATENTS

20 PATENTS PENDING

Power-to-Cool reduced by up to 98%

Source: LiquidCool Solutions Power-to-Cool White Paper, January 2013

No water needed

No water used for cooling when the ambient dry bulb temperature is less than 110°F

LCS Cooling System Elements

• Pump supplies “cool” dielectric liquid to multiple IT racks

• If there is no energy recycling option, “hot” fluid is circulated to an evaporative fluid cooler

• Incoming “cool” fluid can be as warm as 110°F for most applications, which eliminates the need to use cooling water in almost any location in the world
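A minimal control-logic sketch of the heat-rejection choice just described. The function name, threshold constant, and return labels are illustrative assumptions, not LiquidCool software.

```python
# Hypothetical sketch of the heat-rejection choice described above.
# Names and thresholds are illustrative; this is not a LiquidCool API.

SUPPLY_LIMIT_F = 110.0  # fluid returning to the racks may be as warm as 110 °F

def heat_rejection_path(ambient_dry_bulb_f: float,
                        heat_recovery_available: bool) -> str:
    """Pick where the 'hot' dielectric fluid goes after leaving the racks."""
    if heat_recovery_available:
        # Preferred: recycle the heat (e.g. hot water for building heat).
        return "energy-recovery heat exchanger"
    if ambient_dry_bulb_f < SUPPLY_LIMIT_F:
        # A fluid cooler can meet the 110 °F supply limit running dry,
        # i.e. without consuming water, in almost any climate.
        return "fluid cooler (dry operation, no water)"
    # Only in the rare hours above 110 °F dry bulb would evaporative
    # (wetted) assistance be needed.
    return "fluid cooler (evaporative assist)"

print(heat_rejection_path(95.0, heat_recovery_available=False))
```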

Cooling Without Water

Saves Space

Liquid cooled: (1) 25 kW 42U rack, 64 400 W servers – can go anywhere.

Air cooled: 4-8 kW 42U racks, 156-200 W servers (7 kW for onboard fans) – needs an air-conditioned room.

Eliminate Costly Infrastructure

What LCS Does Not Need

• Mechanical Refrigeration

• Cooling Towers

• Air Handlers

• Humidity Control

• Blade and Rack Fans

• Mixing Dampers

• High Ceilings

• Hot Aisles

• Noise

• Water

• Much floor space

Case Study: Facebook (Altoona, IA Data Center)

Facebook Data Center after LCS

LCS saves Facebook millions

» LCS eliminates the need for a 3 story building

» LCS uses a fraction of the floor space

» LCS eliminates the need to move 1.4 million cubic feet of air per minute

» LCS eliminates DX units for dehumidification when indoor RH exceeds 85%

» LCS eliminates evaporative cooling when outdoor temperature is over 85°F

» LCS saves not only upfront capex but also the energy to run the facility

LCS eliminates all of that!

A Custom Solution for Facebook

LCS custom server for Facebook

» Modified OCP Rack w/ 7 Processors and 14 Storage Servers

» Ultra-high-density enclosure

» Proprietary cooling for spindle drive bays and in-rack storage

LiquidCool Clamshell Server

Increased Reliability and Longevity

Lower Maintenance Cost

Hearing Safety

3rd Party Validation

Lawrence Berkeley National Laboratory

• Lawrence Berkeley National Laboratory assessed LiquidCool technology for the US Navy’s Transformative Reductions in Operational Energy Consumption (TROPEC) program for continuous operation of servers and communication/network equipment in tropical environments with no mechanical cooling.

• The Coefficient of Performance for the LCS system was 49.9 at full load compared to 1.6 for current practice, 30 times better.
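A quick arithmetic sketch of what those two COP values imply for cooling power per kilowatt of IT load, and how that squares with the “Power-to-Cool reduced by up to 98%” claim earlier in the deck. The comparison is a rough consistency check, not an LBNL result.

```python
# Linking the measured COPs to cooling power per kW of IT load.
# COP = heat removed / cooling energy input, so cooling power = IT power / COP.

cop_lcs = 49.9          # LBNL-measured COP for the LCS system at full load
cop_baseline = 1.6      # LBNL figure for current practice

kw_cooling_per_kw_it_lcs = 1.0 / cop_lcs        # ~0.020 kW per kW of IT
kw_cooling_per_kw_it_base = 1.0 / cop_baseline  # ~0.625 kW per kW of IT

reduction = 1.0 - kw_cooling_per_kw_it_lcs / kw_cooling_per_kw_it_base
print(f"LCS cooling power:       {kw_cooling_per_kw_it_lcs:.3f} kW per kW IT")
print(f"Baseline cooling power:  {kw_cooling_per_kw_it_base:.3f} kW per kW IT")
print(f"Power-to-cool reduction: {reduction:.0%}")  # ~97%, broadly in line with
                                                    # the 'up to 98%' claim
```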

National Renewable Energy Laboratory

• Preliminary data from the testing program, currently underway, suggest that more than 85% of the electric energy used by LCS servers can be recycled to create hot water for building heat.

• The test program will conclude in November at which time the LCS rack and servers will be moved to NREL’s high performance computing center.

Intel Corporation

• Confirmed that the LCS dielectric fluid does not have a deleterious effect on Intel processors, and all processor cores and memory operated well within the normal operating range even at coolant temperatures of 122°F.

• As a result of this study and other investigations it undertook, Intel committed to extend normal warranty terms to processors immersed in the dielectric fluid LCS employs.

Competition

Three Liquid Cooling Technologies:

1. Cold Plates – Requires Fans
– Shifts heat from processors but does not save much energy
– Mechanical cooling needed above 85°F
– Maximum rack density is 30 kilowatts

2. In-Row Cooling – Requires Fans
– Makes the room smaller
– Mechanical cooling above 80°F
– Maximum rack density is 35 kilowatts

3. Total Immersion in a Dielectric Fluid – No Fans; >75 kW rack density
– Green Revolution Cooling
– Iceotope
– 3M (Allied Control and SGI)
– LiquidCool Solutions

The LiquidCool Advantages

» LiquidCool Can Have Any Shape or Size – vs. Green Revolution, Iceotope, 3M

» LiquidCool Requires No Water in the Datacenter – vs. Iceotope, 3M

» LiquidCool Scales Vertically or Horizontally – vs. Green Revolution, 3M

» LiquidCool is Easy to Swap & Maintain Devices – vs. Green Revolution, Iceotope, 3M

» LiquidCool Has Harsh Environment Deployments – vs. Green Revolution, 3M

» LiquidCool is on Par with Air Cooling (Low Cost) – vs. Iceotope, 3M

Customer Installations

Liquid Submerged Server Rack:
• University of Minnesota Supercomputing Institute
• Microsoft Technology Center – Minneapolis
• Dedicated Computing
• National Renewable Energy Laboratory

Rugged Terrain Computer (started shipping in August 2015):

Installed:
• JBS Tannery
• George Washington University
• Stanford SLAC National Accelerator Laboratory
• Colorado Judicial Courts
• Brookdale Health

Scheduled:
• CISCO Systems
• MITRE
• CoreFocus
• United Healthcare
• Wisconsin Electric

Product Mix

Rugged Terrain Computer (RT)

High performance computer that is ideal for factory floors, construction sites, etc.

Manufactured by Dedicated Computing.

Liquid Submerged Server (LSS Rack)

High density, easy to maintain, Cooling PUE<1.02

Manufactured by Dedicated Computing.

Explorer

Rugged, mobile 4kW compute system that is ideal for field data processing

Three heat rejection options

Manufactured by Dedicated Computing

Clamshell Server

Low cost Open Compute Project servers for cloud datacenters

Manufactured by Herold Precision Metals

200 Kilowatt Prefabricated Module (PFM)

Unique computing appliance that can be transported by cargo plane

Johnson Controls is the module manufacturer and project integrator

750 Kilowatt PFM

Unique computing appliance that replaces five air-cooled PFMs

Johnson Controls is the module manufacturer and project integrator

LSC-RT Rugged Terrain Computer

LSC-RT features:

• Internal High-Efficiency Pump

• Sealed and fanless system

• IP67-rated I/O and Power connections

• Supports Intel Core i3/i5/i7 CPUs up to 84W

• Operating temperature range: 10°F - 110°F

• Dual Passive Radiators

• USB 2.0, HDMI, 1Gb Ethernet ports

LSC-RT Rugged Terrain Computer Advantages

Explorer Server Rack

LCS Explorer: 8 or 16 High-performance servers in a modular configuration of shock-isolating rack cases

Redundant 8kW CDUs

Room for rugged switch and storage

8X LSS Servers

Battlefield Tested and Ready

LiquidCool Solutions Participated In Joint Military Exercises at Emerald Warrior 2015

LSS Rack

Markets: Greenfield and Datacenter Retrofits, Prefabricated IT Modules

Features:

• 36-kilowatt rack, equivalent to four air-cooled racks

• Electronics protected from humidity, dust, oxidation, and corrosive gases

• Energy Efficient – Cooling PUE < 1.02 (see the sketch after this list)

• Scalable

• Easy to Maintain

• No Moving Parts

• No Water
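The “Cooling PUE < 1.02” figure above is easiest to read as a partial PUE for the cooling path, i.e. (IT power + cooling power) / IT power. A minimal sketch under that interpretation, using the 36-kilowatt rating from this feature list; the interpretation itself is an assumption, not a definition given in the deck.

```python
# Sketch of what a cooling PUE of 1.02 implies for a 36 kW LSS rack.
# Interpreting cooling PUE as (IT power + cooling power) / IT power is an
# assumption; the 36 kW rating comes from the feature list above.

it_kw = 36.0
cooling_pue = 1.02

cooling_kw = it_kw * (cooling_pue - 1.0)  # power spent on pumps / fluid cooler
print(f"Cooling overhead for a {it_kw:.0f} kW rack: ~{cooling_kw:.2f} kW "
      f"({cooling_pue - 1.0:.0%} of the IT load)")
```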

Modular Data Centers

Global Modular Datacenter Market Expected to Grow from $8.37 Billion in 2015 to $35.11 Billion by 2020 at a CAGR of 33.2%

--- Research and Markets Organization 6/25/15
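A one-line check that the quoted market figures are internally consistent with the stated compound annual growth rate.

```python
# Checking the quoted modular datacenter market numbers against the stated CAGR.
start_2015_b = 8.37    # global modular datacenter market, $B (2015)
cagr = 0.332
years = 5              # 2015 -> 2020

projected_2020_b = start_2015_b * (1.0 + cagr) ** years
print(f"Projected 2020 market: ${projected_2020_b:.2f}B")  # ~$35B, close to the
                                                           # quoted $35.11B figure
```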

LiquidCool Clamshell Rack

Target Market: Cloud Datacenters

Features:

• Up to 75-kW and 120 servers per rack

• Open Compute Project architecture

• Power and I/O outside the wet area

• Energy Efficient – Cooling PUE<1.03

• No water

• Easy to Maintain

• No Moving Parts

LiquidCool 200-Kilowatt Modular Data Center

Markets: Temporary Deployments, Military, Catastrophic Weather Events, Filmmaking

Features:

• 200-kilowatt Prefabricated Module designed to be transported by cargo plane

• Rapid field set up

• Energy Efficient – Cooling PUE<1.03

• No water

• Easy to Maintain

• Electronics protected from humidity, dust, oxidation, and corrosive gases

Overall dimensions: 12’ x 12’ x 20’

LiquidCool 750-KW Prefabricated Module

Overall dimensions: 12’ x 12’ x 42’

Markets: Enterprise Datacenters, Research

Features:

• Replaces four air-cooled modules

• Energy Efficient – Cooling PUE<1.03

• No water

• Easy to Maintain

• No Moving Parts

• Electronics protected from humidity, dust, oxidation, and corrosive gases

Data Center Energy Savings

• In a typical datacenter, roughly 38% of energy costs go to HVAC cooling and fans (ASHRAE and DOE, 2012)

• With LiquidCool technology, device fans are eliminated and the need for HVAC cooling and fans is greatly reduced

• As a result, datacenter energy costs can be reduced by 50% or more by using LiquidCool technology
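One way to arrive at a figure of roughly 50%, under the generous assumption that both the 38% HVAC share and the up-to-20% chassis-fan share are recovered in full. This is a sketch of the arithmetic, not a measured result.

```python
# Rough reconstruction of how a '50% or more' figure can be reached from the
# numbers quoted in this deck.  Treating the 38% HVAC share and the up-to-20%
# device-fan share as fully recoverable is an assumption.

hvac_share = 0.38        # share of datacenter energy going to HVAC cooling and fans
it_share = 1.0 - hvac_share
device_fan_share = 0.20  # up to 20% of device-level power goes to chassis fans

savings = hvac_share + it_share * device_fan_share
print(f"Upper-bound energy saving: {savings:.0%}")  # ~50% of total facility energy
```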

Waste Heat Can Be Reclaimed & Used

The Largest Liquid Cooling Patent Portfolio

20 Issued Patents

• Liquid tight server case with dielectric liquid for cooling electronics

• Extruded server case used with liquid coolant of electronics

• Computer with fluid inlets and outlets for direct-contact liquid cooling

• Case and rack system for liquid submersion cooling of an array of electronic components

• Computer case for liquid submersion cooled components

• Liquid submersion cooled computer with directed flow

• Gravity assisted directed liquid cooling

20 Additional Patents Pending

Executive Team


• Herb Zien (CEO) has over 30 years of experience in project development, engineering management, power generation and energy conservation. He was co-founder of a firm that grew to become the largest owner and operator of District Energy Systems in the U.S.

• Rick Tufty (VP Engineering) is an experienced executive who has worked for Dell, Maxtor, HP and IBM. He holds 11 U.S. and international patents.

• David Roe (Program Manager) has over 20 years of technical development experience and is an expert in computer and cooling technologies.

• Daren Klum (Co-Founder) is an executive with 15 years of experience building industry-leading technology products and bringing them to market, spanning product strategy, software and hardware development, and executive leadership.

• Stephen Einhorn (Chairman) is a partner at Capital Midwest Fund, the largest shareholder.

Stephen Einhorn

[email protected]

414-453-4488

David Roe, Program Manager

[email protected]

507-250-4727

Herb Zien, CEO

[email protected]

414-289-7171

Rick Tufty, VP Engineering

[email protected]

507-535-5829

Daren Klum, Co-Founder

[email protected]

651-324-4440