
Scientific Computing at SLAC:

The Transition to a Multiprogram Future

Richard P. Mount

Director: Scientific Computing and Computing Services

Stanford Linear Accelerator Center

January 10, 2007


SLAC Scientific Computing: Balancing Act

• Aligned with the evolving science mission of SLAC

but neither

• Subservient to the science mission

nor

• Unresponsive to SLAC mission needs


DOE Scientific Computing Funding at SLAC circa 2005

• Particle and Particle Astrophysics

– $14M SCCS

– $5M Other

• Photon Science

– $0M SCCS

– $1M SSRL

• Computer Science

– $0.5M to $1.5M


SLAC Scientific Computing Future: Science Goals and Computing Techniques

• BaBar Experiment (winds down 2009-2012)

– Science goals: Measure billions of collisions to understand matter-antimatter asymmetry (why matter exists today)

– Computing techniques: High-throughput data processing, trivially parallel computation, heavy use of disk and tape storage (a minimal sketch of this pattern follows the list)

• Experimental HEP (LHC/ATLAS, ILC …)

– Science goals: Analyze petabytes of data to understand the origin of mass

– Computing techniques: High-throughput data processing, trivially parallel computation, heavy use of disk and tape storage

• Accelerator Science

– Science goals: Current: world-leading simulation of electromagnetic structures; Future: simulate accelerator (e.g. ILC) behavior before construction and during operation

– Computing techniques: Parallel computation, visual analysis of large data volumes

• Particle Astrophysics (mainly simulation)

– Science goals: Star formation in the early universe, colliding black holes, …

– Computing techniques: Parallel computation (SMP and cluster), visual analysis of growing volumes of data

• Particle Astrophysics Major Projects (GLAST, LSST …)

– Science goals: Analyze terabytes to petabytes of data to understand the dark matter and dark energy riddles

– Computing techniques: High-throughput data processing, very large databases, visualization

• Photon Science (biology, physics, chemistry, environment, medicine)

– Science goals: Current: SSRL, a state-of-the-art synchrotron radiation lab; Future: LCLS, femtosecond x-ray pulses, “ultrafast” science, structure of individual molecules …

– Computing techniques: Current: medium-scale data analysis and simulation, visualization; Future: high-throughput data analysis and large-scale simulation, advanced visualization

• New Architectures for SLAC Science

– Radical new approaches to computing for Stanford-SLAC data-intensive science

– Current focus: massive solid-state storage for high-throughput, low-latency data analysis
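The "trivially parallel" pattern named above is the workhorse of BaBar-style and LHC-style data processing: independent event files are farmed out to many workers that never need to communicate, so throughput scales with the number of workers. The following is a minimal sketch of that idea in Python, not SLAC's production system; the file names, the selection cut, and the use of a local multiprocessing pool (rather than a batch farm) are illustrative assumptions.

# Minimal sketch of trivially parallel, high-throughput data processing:
# independent "event files" are handed to workers that never communicate.
# File names, contents, and the selection cut are illustrative assumptions.
import os
from multiprocessing import Pool

def reconstruct_file(path):
    """Process one event file independently of all others."""
    n_selected = 0
    with open(path) as f:
        for line in f:                 # placeholder: one "event" per line
            if int(line) % 2 == 0:     # placeholder selection cut
                n_selected += 1
    return path, n_selected

if __name__ == "__main__":
    # Create a few tiny dummy input files so the sketch is self-contained.
    files = []
    for i in range(8):
        path = "run_%04d.txt" % i
        with open(path, "w") as f:
            f.write("\n".join(str(j) for j in range(100)) + "\n")
        files.append(path)

    with Pool(processes=4) as pool:
        # Throughput scales with the number of workers because the tasks are
        # completely independent; a real farm dispatches whole files to nodes.
        for path, n in pool.map(reconstruct_file, files):
            print(path, n)

    for path in files:
        os.remove(path)

On a production farm the same independence is what allows thousands of batch jobs to run concurrently against disk and tape storage.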


Scientific Computing Equipment (2007 on: estimate of what SLAC will propose)

[Chart: SLAC Scientific Computing Equipment Purchases, $M per year, 1997-2018]


Used/Required Power (twice the power dissipated by the Computing Equipment)

[Chart: Total Power (twice the power consumed by the computing equipment), MW, 1997-2018]
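The chart's sizing rule is simple arithmetic: total facility power is taken to be twice the power dissipated by the computing equipment, the extra factor presumably covering cooling and power distribution. A minimal sketch of that rule follows; the equipment figure used here is hypothetical, not a value read off the chart.

# Sketch of the slide's rule of thumb: total facility power is assumed to be
# twice the power dissipated by the computing equipment.
# The equipment load below is hypothetical, not taken from the chart.
def total_facility_power_mw(equipment_power_mw, overhead_factor=2.0):
    """Return the total power (MW) implied by the factor-of-two rule."""
    return overhead_factor * equipment_power_mw

if __name__ == "__main__":
    equipment_mw = 1.5                              # hypothetical equipment load
    print(total_facility_power_mw(equipment_mw))    # prints 3.0 (MW)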


Scientific Computing: The relationship between Science and the components of Scientific Computing

• Application Sciences: High-energy and Particle-Astro Physics, Accelerator Science, Photon Science …

• Issues addressable with “computing”: Particle interactions with matter, Electromagnetic structures, Huge volumes of data, Image processing …

• Computing techniques: PDE solving, Algorithmic geometry, Visualization, Meshes, Object databases, Scalable file systems …

• Computing hardware: Processors, I/O devices, Mass-storage hardware, Random-access hardware, Networks and Interconnects …

• Computing architectures: Single system image, Low-latency clusters, Throughput-oriented clusters, Scalable storage …

SCCS FTE: ~20, ~26


Challenges (1)

• Very strong, unquestionably legitimate drivers of diversity: e.g.

– Development of new astrophysics and cosmology models is helped by huge SMPs

– MPI and low-latency interconnects are really needed (see the sketch below)
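To make the second bullet concrete: in a tightly coupled simulation every rank exchanges partial results at every step, so per-message latency, not just bandwidth, sets the pace of the whole calculation. The toy program below is a minimal sketch assuming the mpi4py bindings and an MPI launcher (e.g. mpirun) are available; it is not drawn from any SLAC code.

# Toy tightly coupled calculation: all ranks combine partial results at every
# step, so interconnect latency limits how fast the steps can proceed.
# Assumes mpi4py; run with, e.g.:  mpirun -n 4 python this_sketch.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

global_sum = 0.0
for step in range(10):
    # Each rank does a little local work ...
    local = float(rank + 1) * (step + 1)
    # ... then every rank must wait for the collective exchange before the
    # next step can start; this is where low-latency interconnects pay off.
    global_sum = comm.allreduce(local, op=MPI.SUM)

if rank == 0:
    print("ranks:", size, "final global sum:", global_sum)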


Challenges (2)

• Very strong, but more frustrating drivers of diversity: e.g.

– Macs

– Postdoc-supported clusters

– Inability to live with HEP security models

• Making the environment attractive to world-leading faculty candidates and users in general

• Photon Science history (users bring a sample, take data for 3 days, and then take the data away with them)

• Brave new scientific worlds where nobody has any idea how understanding will be extracted from petabytes (but the hardware and software must be ready in 18-36 months and detailed costing is needed yesterday).


Or in Other Words

• It’s a very exciting future.