Visit of Professor Jerzy Szwed, Under Secretary of State, Ministry of Science and Higher Education
CERN IT Department
CH-1211 Genève 23
Switzerland
www.cern.ch/it
Visit of Professor Jerzy Szwed
Under Secretary of State
Ministry of Science and Higher Education
Poland
Tuesday 23rd February 2010
The LHC Computing Grid
Frédéric Hemmer
IT Department Head
7000 tons, 150 million sensors generating data 40 million times per second
i.e. a petabyte/s
The ATLAS experiment
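The quoted petabyte-per-second figure can be checked with back-of-envelope arithmetic; the bytes-per-reading value below is an assumption for order-of-magnitude purposes only, not an ATLAS specification:

```python
# Order-of-magnitude check of the raw ATLAS data rate quoted on the slide.
sensors = 150e6          # ~150 million sensors
crossings_per_s = 40e6   # data generated 40 million times per second
bytes_per_reading = 1    # assumed value, purely illustrative

rate_bytes_per_s = sensors * crossings_per_s * bytes_per_reading
rate_pb_per_s = rate_bytes_per_s / 1e15
print(f"~{rate_pb_per_s:.0f} PB/s raw, i.e. of order a petabyte per second")
```

Even at one byte per sensor per crossing, the raw rate lands in the petabyte-per-second range, which is why only a tiny, triggered fraction of the data is recorded.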
The LHC Computing Grid, February 2010
Frédéric Hemmer
A collision at the LHC
Tier 0 at CERN: Acquisition, First-pass Processing, Storage & Distribution
1.25 GB/sec (ions)
The LHC Data Challenge
• The accelerator will run for 10-15 years
• Experiments will produce about 15 Million Gigabytes of data each year (about 20 million CDs!)
• LHC data analysis requires a computing power equivalent to ~100,000 of today's fastest PC processors
• Requires many cooperating computer centres, as CERN can only provide ~20% of the capacity
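The slide's figures are easy to verify; the CD capacity used below is the conventional ~700 MB, assumed here rather than stated on the slide:

```python
# Sanity check of the annual data volume and the "20 million CDs" comparison.
data_per_year_gb = 15e6   # 15 million gigabytes per year
cd_capacity_gb = 0.7      # ~700 MB per CD (assumed)

petabytes = data_per_year_gb / 1e6
cds = data_per_year_gb / cd_capacity_gb
print(f"{petabytes:.0f} PB/year, roughly {cds / 1e6:.0f} million CDs")
```

15 million gigabytes is 15 petabytes per year, which at ~700 MB per disc comes out at a little over 21 million CDs, consistent with the "about 20 million" on the slide.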
Solution: the Grid
• Use the Grid to unite computing resources of particle physics institutes around the world
The World Wide Web provides seamless access to information that is stored in many millions of different geographical locations
The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe
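The idea can be illustrated with a toy dispatcher; this is not WLCG middleware, and the site names and slot counts below are invented for the example:

```python
# Toy illustration of the Grid idea: jobs are matched to whichever
# participating site currently has free capacity. Not a real broker.
from collections import deque

sites = {"Site-A": 2, "Site-B": 1, "Site-C": 1}   # free job slots (invented)
jobs = deque(f"job-{i}" for i in range(4))

assignments = {}
while jobs:
    # pick the site with the most free slots, a simple stand-in
    # for a real resource broker's matchmaking
    site = max(sites, key=sites.get)
    if sites[site] == 0:
        break
    sites[site] -= 1
    assignments[jobs.popleft()] = site

print(assignments)
```

The point of the sketch is only that the user submits jobs without caring where they run; the infrastructure finds the capacity, wherever on the globe it sits.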
Tier 0 – Tier 1 – Tier 2
Tier-0 (CERN):
• Data recording
• Initial data reconstruction
• Data distribution
Tier-1 (11 centres):
• Permanent storage
• Re-processing
• Analysis
Tier-2 (~130 centres):
• Simulation
• End-user analysis
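The tiered model above can be written out as plain data; this is a sketch for illustration, not an official WLCG configuration format:

```python
# The WLCG tier model from the slide, expressed as a data structure.
tiers = {
    "Tier-0": {"centres": 1, "roles": ["data recording",
                                       "initial data reconstruction",
                                       "data distribution"]},
    "Tier-1": {"centres": 11, "roles": ["permanent storage",
                                        "re-processing", "analysis"]},
    "Tier-2": {"centres": 130, "roles": ["simulation",
                                         "end-user analysis"]},
}

# Data flows down the hierarchy: CERN to the Tier-1s, then to the Tier-2s.
for name, info in tiers.items():
    print(f"{name}: ~{info['centres']} centre(s): {', '.join(info['roles'])}")
```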
The CERN Tier-0
– 24x7 operator support and System Administration services to support 24x7 operation of all IT services
– Hardware installation & retirement (~7,000 hardware movements/year)
– Management and Automation framework for large-scale Linux clusters
– Installed Capacity
  • 6’300 systems, 39’000 processing cores
    – CPU servers, disk servers, infrastructure servers
    – Tenders planned or in progress: 2’400 systems, 16’000 processing cores
  • 13’900 TB usable on 42’600 disk drives
    – Tenders planned or in progress: 19’000 TB usable on 20’000 disk drives
  • 34’000 TB on 45’000 tape cartridges
    – (56’000 slots), 160 tape drives
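The installed-capacity figures imply per-device sizes typical of 2010-era hardware; the arithmetic below is purely an order-of-magnitude check of the quoted numbers:

```python
# Rough per-device figures implied by the quoted Tier-0 installed capacity.
disk_tb, disk_drives = 13_900, 42_600
tape_tb, tape_carts = 34_000, 45_000

gb_per_drive = disk_tb * 1000 / disk_drives
tb_per_cartridge = tape_tb / tape_carts
print(f"~{gb_per_drive:.0f} GB per disk drive, "
      f"~{tb_per_cartridge:.2f} TB per tape cartridge")
```

Roughly 330 GB per disk drive and three quarters of a terabyte per tape cartridge, both plausible for the period.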
The European Network Backbone
• LCG working group with Tier-1s and national/regional research network organisations
• New GÉANT 2 – research network backbone
Strong correlation with major European LHC centres
• Swiss PoP at CERN
Overall summary
• November
  – Ongoing productions
  – Cosmics data taking
• November – December
  – Beam data and collisions
  – Productions + analysis
• December – February
  – Ongoing productions
  – Cosmics
• WLCG service has been running according to the defined procedures
  – Reporting and follow-up of problems at same level
  – Middleware process – updates & patches – as planned
2009 Physics Data Transfers
• Final readiness test (STEP’09)
• Preparation for LHC startup
• LHC physics data
• Nearly 1 petabyte/week
• More than 8 GB/s peak transfers from CASTOR file servers at CERN
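The weekly volume and the peak rate quoted on the slide can be compared directly; converting a petabyte per week into a sustained rate shows how much headroom the 8 GB/s peaks represent:

```python
# Sustained rate implied by "nearly 1 petabyte/week" vs. the 8 GB/s peak.
pb_per_week = 1.0
seconds_per_week = 7 * 24 * 3600

sustained_gb_s = pb_per_week * 1e6 / seconds_per_week   # 1 PB = 1e6 GB
print(f"~{sustained_gb_s:.1f} GB/s sustained vs. 8 GB/s peak")
```

A petabyte a week averages out to under 2 GB/s, so the 8 GB/s peaks are several times the sustained load.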
GRID COMPUTING NOW
Archeology, Astronomy, Astrophysics, Civil Protection, Comp. Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences, …
>250 sites, 48 countries, >50,000 CPUs, >20 PetaBytes, >10,000 users, >150 VOs, >150,000 jobs/day
• LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE)
• EGEE is now a global effort, and the largest Grid infrastructure worldwide
• Co-funded by the European Commission (Cost: ~170 M€ over 6 years, funded by EU ~100M€)
• EGEE already used for >100 applications, including…
Impact of the LHC Computing Grid in Europe
Health-e-Child
Similarity Search
Temporal Modelling
Visual Data Mining
Genetics Profiling
Treatment Response
Inferring Outcome
Biomechanical Models
Tumor Growth Modelling
Semantic Browsing
Personalised Simulation
Surgery Planning
RV and LV Automatic Modelling
Measurement of Pulmonary Trunk
Sustainability
• Need to prepare for permanent Grid infrastructure
• Ensure a high quality of service for all user communities
• Independent of short project funding cycles
• Infrastructure managed in collaboration with National Grid Initiatives (NGIs)
• European Grid Initiative (EGI)
For more information about the Grid:
Thank you for your kind attention!
www.cern.ch/lcg www.eu-egee.org
www.eu-egi.org/
www.gridcafe.org