Designing a cluster for geophysical fluid dynamics applications


Transcript of Designing a cluster for geophysical fluid dynamics applications

Page 1: Designing a cluster for geophysical fluid dynamics applications

Designing a cluster for geophysical fluid dynamics applications

Göran Broström, Dep. of Oceanography, Earth Science Centre, Göteborg University.

Page 2: Designing a cluster for geophysical fluid dynamics applications

Our cluster (me and Johan Nilsson, Dep. of Meteorology, Stockholm University)

• Grant from the Knut & Alice Wallenberg foundation (1.4 MSEK)

• 48 CPU cluster
• Intel P4 2.26 GHz
• 500 MB 800 MHz RDRAM
• SCI cards

• Delivered by South Pole
• Run by NSC (thanks Niclas & Peter)

Page 3: Designing a cluster for geophysical fluid dynamics applications

What we study

Page 4: Designing a cluster for geophysical fluid dynamics applications

Geophysical fluid dynamics

• Oceanography
• Meteorology
• Climate dynamics

Page 5: Designing a cluster for geophysical fluid dynamics applications

Thin fluid layers: large aspect ratio
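A rough order-of-magnitude estimate (my illustrative numbers, not from the slides), assuming a typical ocean depth of about 4 km and a horizontal basin scale of about 4000 km:

\[
\frac{H}{L} \sim \frac{4\ \mathrm{km}}{4000\ \mathrm{km}} \sim 10^{-3},
\]

so the horizontal extent exceeds the depth by roughly three orders of magnitude.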

Page 6: Designing a cluster for geophysical fluid dynamics applications

Highly turbulent. Gulf Stream: Re ~ 10^12
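A hedged back-of-the-envelope check of that number, assuming a velocity scale U ~ 1 m/s, a length scale L ~ 1000 km and a molecular viscosity of about 10^-6 m^2/s (illustrative values, not from the slides):

\[
\mathrm{Re} = \frac{U L}{\nu} \sim \frac{1\ \mathrm{m\,s^{-1}} \times 10^{6}\ \mathrm{m}}{10^{-6}\ \mathrm{m^{2}\,s^{-1}}} \sim 10^{12}.
\]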

Page 7: Designing a cluster for geophysical fluid dynamics applications

Large variety of scales

Parameterizations are important in geophysical fluid dynamics
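One standard example of such a parameterization (a textbook closure, not something specific to these slides) is the down-gradient representation of unresolved eddy fluxes, where everything the grid cannot resolve is absorbed into an eddy diffusivity $\kappa$:

\[
\overline{u'c'} = -\kappa\,\frac{\partial \bar{c}}{\partial x},
\]

so the choice of $\kappa$ stands in for the turbulent scales the model does not compute explicitly.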

Page 8: Designing a cluster for geophysical fluid dynamics applications

Timescales

• Atmospheric low pressures: 10 days
• Seasonal/annual cycles: 0.1-1 years
• Ocean eddies: 0.1-1 year
• El Niño: 2-5 years
• North Atlantic Oscillation: 5-50 years
• Turnover time of the atmosphere: 10 years
• Anthropogenic forced climate change: 100 years
• Turnover time of the ocean: 4,000 years
• Glacial-interglacial timescales: 10,000-200,000 years

Page 9: Designing a cluster for geophysical fluid dynamics applications

Some examples of atmospheric and oceanic low-pressure systems.

Page 10: Designing a cluster for geophysical fluid dynamics applications

Timescales

• Atmospheric low pressures: 10 days
• Seasonal/annual cycles: 0.1-1 years
• Ocean eddies: 0.1-1 year
• El Niño: 2-5 years
• North Atlantic Oscillation: 5-50 years
• Turnover time of the atmosphere: 10 years
• Anthropogenic forced climate change: 100 years
• Turnover time of the ocean: 4,000 years
• Glacial-interglacial timescales: 10,000-200,000 years

Page 11: Designing a cluster for geophysical fluid dynamics applications

Normal state

Page 12: Designing a cluster for geophysical fluid dynamics applications

Initial ENSO state

Page 13: Designing a cluster for geophysical fluid dynamics applications

The ENSO state

Page 14: Designing a cluster for geophysical fluid dynamics applications

The ENSO state

Page 15: Designing a cluster for geophysical fluid dynamics applications

Timescales

• Atmospheric low pressures: 10 days
• Seasonal/annual cycles: 0.1-1 years
• Ocean eddies: 0.1-1 year
• El Niño: 2-5 years
• North Atlantic Oscillation: 5-50 years
• Turnover time of the atmosphere: 10 years
• Anthropogenic forced climate change: 100 years
• Turnover time of the ocean: 4,000 years
• Glacial-interglacial timescales: 10,000-200,000 years

Page 16: Designing a cluster for geophysical fluid dynamics applications

Positive NAO phase Negative NAO phase

Page 17: Designing a cluster for geophysical fluid dynamics applications
Page 18: Designing a cluster for geophysical fluid dynamics applications

Positive NAO phase Negative NAO phase

Page 19: Designing a cluster for geophysical fluid dynamics applications
Page 20: Designing a cluster for geophysical fluid dynamics applications

Timescales

• Atmospheric low pressures: 10 days
• Seasonal/annual cycles: 0.1-1 years
• Ocean eddies: 0.1-1 year
• El Niño: 2-5 years
• North Atlantic Oscillation: 5-50 years
• Turnover time of the atmosphere: 10 years
• Anthropogenic forced climate change: 100 years
• Turnover time of the ocean: 4,000 years
• Glacial-interglacial timescales: 10,000-200,000 years

Page 21: Designing a cluster for geophysical fluid dynamics applications

Temperature in the North Atlantic

Page 22: Designing a cluster for geophysical fluid dynamics applications

Timescales

• Atmospheric low pressures: 10 days
• Seasonal/annual cycles: 0.1-1 years
• Ocean eddies: 0.1-1 year
• El Niño: 2-5 years
• North Atlantic Oscillation: 5-50 years
• Turnover time of the atmosphere: 10 years
• Anthropogenic forced climate change: 100 years
• Turnover time of the ocean: 4,000 years
• Glacial-interglacial timescales: 10,000-200,000 years

Page 23: Designing a cluster for geophysical fluid dynamics applications

Ice coverage, sea level

Page 24: Designing a cluster for geophysical fluid dynamics applications

What model will we use?

Page 25: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 26: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model
• General fluid dynamics solver
• Atmospheric and ocean physics
• Sophisticated mixing schemes
• Biogeochemical modules
• Efficient solvers
• Sophisticated coordinate system
• Automatic adjoint schemes
• Data assimilation routines
• Finite difference scheme (a minimal illustrative sketch follows below)
• F77 code
• Portable
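The sketch below is not MITgcm code; it is only meant to show what a single explicit finite-difference update looks like in the fixed-form F77 style such a model is written in. The program name, grid size, field names and constants are all hypothetical.

c     Minimal sketch (not from MITgcm): one explicit finite-difference
c     time step of 2-D tracer diffusion. Grid size, variable names and
c     constants are hypothetical illustrations only.
      program fdstep
      integer nx, ny
      parameter (nx=60, ny=60)
      real t(nx,ny), tnew(nx,ny)
      real kappa, dx, dt
      integer i, j
c     assumed eddy diffusivity (m2/s), grid spacing (m), time step (s)
      kappa = 1.0e3
      dx    = 1.0e3
      dt    = 100.0
c     initial condition: a single warm anomaly in the domain centre
      do 11 j = 1, ny
         do 10 i = 1, nx
            t(i,j) = 0.0
 10      continue
 11   continue
      t(nx/2,ny/2) = 1.0
c     explicit diffusion update on interior points (boundaries fixed)
      do 21 j = 2, ny-1
         do 20 i = 2, nx-1
            tnew(i,j) = t(i,j) + dt*kappa/(dx*dx)*
     &        (t(i+1,j) + t(i-1,j) + t(i,j+1) + t(i,j-1) - 4.0*t(i,j))
 20      continue
 21   continue
      write(*,*) 'centre value after one step:', tnew(nx/2,ny/2)
      end

The real model of course adds much more (advection, the pressure/free-surface solve, and the exchange of tile boundaries between nodes), which is where the interconnect tests later in the talk come in.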

Page 27: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Spherical coordinates “Cubed sphere”

Page 28: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model
• General fluid dynamics solver
• Atmospheric and ocean physics
• Sophisticated mixing schemes
• Biogeochemical modules
• Efficient solvers
• Sophisticated coordinate system
• Automatic adjoint schemes
• Data assimilation routines
• Finite difference scheme
• F77 code
• Portable

Page 29: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 30: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 31: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 32: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 33: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 34: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 35: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 36: Designing a cluster for geophysical fluid dynamics applications

MIT General circulation model

Page 37: Designing a cluster for geophysical fluid dynamics applications

Some computational aspects

Page 38: Designing a cluster for geophysical fluid dynamics applications

Some tests on INGVAR

(32-node AMD 900 MHz cluster)

Page 39: Designing a cluster for geophysical fluid dynamics applications

Experiments with 60*60*20 grid points

Page 40: Designing a cluster for geophysical fluid dynamics applications

Experiments with 60*60*20 grid points

Page 41: Designing a cluster for geophysical fluid dynamics applications

Experiments with 60*60*20 grid points

Page 42: Designing a cluster for geophysical fluid dynamics applications

Experiments with 120*120*20 grid points

Page 43: Designing a cluster for geophysical fluid dynamics applications

MM5 Regional atmospheric model

Page 44: Designing a cluster for geophysical fluid dynamics applications

MM5 Regional atmospheric model

Page 45: Designing a cluster for geophysical fluid dynamics applications

MM5 Regional atmospheric model

Page 46: Designing a cluster for geophysical fluid dynamics applications

Choosing CPUs, motherboard, memory, connections

Page 47: Designing a cluster for geophysical fluid dynamics applications

SPECfp (swim)

(chart: run time)

Page 48: Designing a cluster for geophysical fluid dynamics applications

Run time on different nodes


Page 49: Designing a cluster for geophysical fluid dynamics applications

Choosing interconnection

(requires a cluster to test)

Based on earlier experience we use SCI from Dolphinics (SCALI)

Page 50: Designing a cluster for geophysical fluid dynamics applications

Our choice

• Named Otto
• SCI cards
• P4 2.26 GHz (single CPUs)
• 800 MHz RDRAM (500 MB)
• Intel motherboards (the only ones available)
• 48 nodes
• NSC (nicely in the shadow of Monolith)

Page 51: Designing a cluster for geophysical fluid dynamics applications

Otto (P4 2.26 GHz)

Page 52: Designing a cluster for geophysical fluid dynamics applications

Scaling

Otto (P4 2.26 GHz) Ingvar (AMD 900 MHz)
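For reading scaling plots like this one, the usual definitions (standard textbook quantities, not taken from the slides) relate the run time on one node, T(1), to the run time on n nodes, T(n):

\[
S(n) = \frac{T(1)}{T(n)}, \qquad E(n) = \frac{S(n)}{n},
\]

with ideal scaling S(n) = n, i.e. E(n) = 1; communication overhead on small per-node grids pulls E(n) below 1.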

Page 53: Designing a cluster for geophysical fluid dynamics applications

Why do we get this kind of result?

Page 54: Designing a cluster for geophysical fluid dynamics applications

Time spent on different “subroutines”

60*60*20 120*120*20

Page 55: Designing a cluster for geophysical fluid dynamics applications

Relative time Otto/Ingvar

Page 56: Designing a cluster for geophysical fluid dynamics applications

Some tests on other machines

• INGVAR: 32 nodes, AMD 900 MHz, SCI
• Idefix: 16 nodes, dual PIII 1000 MHz, SCI
• SGI 3800: 96 proc., 500 MHz
• Otto: 48 nodes, P4 2.26 GHz, SCI
• ? MIT, LCS: 32 nodes, P4 2.26 GHz, Myrinet

Page 57: Designing a cluster for geophysical fluid dynamics applications

Comparing different systems (120*120*20 grid points)

Page 58: Designing a cluster for geophysical fluid dynamics applications

Comparing different systems (120*120*20 grid points)

Page 59: Designing a cluster for geophysical fluid dynamics applications

Comparing different systems (60*60*20 grid points)

Page 60: Designing a cluster for geophysical fluid dynamics applications

SCI or Myrinet?

120*120*20 grid points

Page 61: Designing a cluster for geophysical fluid dynamics applications

SCI or Myrinet?

120*120*20 grid points (60*60*20 grid points)

(oops, I used the ifc compiler for these tests)

Page 62: Designing a cluster for geophysical fluid dynamics applications

SCI or Myrinet?

120*120*20 grid points (60*60*20 grid points)

(oops, I used the ifc compiler for these tests)

(1066 MHz RDRAM?)

Page 63: Designing a cluster for geophysical fluid dynamics applications

SCI or Myrinet? (time spent in pressure calc.)

120*120*20 grid points (60*60*20 grid points)

(oops, I used the ifc compiler for these tests)

(1066 MHz RDRAM?)

Page 64: Designing a cluster for geophysical fluid dynamics applications

Conclusions

• Linux clusters are useful in computational geophysical fluid dynamics!!
• SCI cards are necessary for parallel runs >10 nodes.
• For efficient parallelization: >50*50*20 grid points per node!
• Few users - great for development.
• Memory limitations: for 48 processors with 500 MB each, 1200*1200*30 grid points is the maximum (eddy-resolving North Atlantic, Baltic Sea); a rough consistency check follows below.
• For applications similar to ours, go for SCI cards + CPUs with a fast memory bus and fast memory!!
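A rough consistency check of the quoted memory bound (my arithmetic, assuming double-precision 8-byte values; not from the slides):

\[
1200 \times 1200 \times 30 \approx 4.3\times 10^{7}\ \text{grid points}, \qquad 48 \times 500\ \mathrm{MB} = 24\ \mathrm{GB},
\]
\[
\frac{2.4\times 10^{10}\ \mathrm{bytes}}{4.3\times 10^{7}\ \text{points}} \approx 550\ \text{bytes per point} \approx 70\ \text{double-precision 3-D fields},
\]

which is roughly the number of state, tendency and work arrays a primitive-equation ocean model carries, so the stated limit is plausible.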

Page 65: Designing a cluster for geophysical fluid dynamics applications

Experiment with low resolution (eddies are parameterized)

Page 66: Designing a cluster for geophysical fluid dynamics applications
Page 67: Designing a cluster for geophysical fluid dynamics applications
Page 68: Designing a cluster for geophysical fluid dynamics applications

Experiment with low resolution (eddies are parameterized)

Page 69: Designing a cluster for geophysical fluid dynamics applications

Thanks for your attention