Transcript of "GridPP – Year 2 to Year 3", Tony Doyle ([email protected]), Collaboration Meeting, Bristol, 22 September 2003

Tony Doyle - University of Glasgow

Outline

• Challenges

• Management

• The Project Map

• GridPP Status: two years on…

• UK Grid Users

• Deployment – LCG and UK perspective

• Current Resources

• EDG 2.0/LCG 1.0 Deployment Status

• Accounting

• Today’s Operations

• Future Operations Planning

• Middleware Status

• Middleware Evolution

• GridPP2 Planning Status

• Dissemination

• Summary

The Challenges Ahead: Event Selection

[Figure: the event-selection challenge. LEVEL-1 trigger: hardwired processors (ASIC, FPGA), pipelined and massively parallel. HIGH LEVEL triggers: farms of processors. Reconstruction & analysis at Tier-0/1/2 centres. Timescales run from 25 ns through 3 µs, ms, sec and hour to year (10^-9 to 10^3 s), spanning on-line to off-line and Giga/Tera/Petabit data scales: selecting the Higgs from all interactions covers 9 orders of magnitude.]

The Challenges Ahead: Complexity

Understand/interpret data via numerically intensive simulations:

• e.g. ATLAS Monte Carlo (gg → H → bb̄): 182 sec and 3.5 MB per event on a 1 GHz Linux box

• Many events:
– ~10^9 events/experiment/year
– ≳1 MB/event raw data
– several passes required

Worldwide Grid computing requirement (2007): ~300 TeraIPS (100,000 of today's fastest processors connected via a Grid)
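The 2007 requirement follows from simple arithmetic on the figures above; a back-of-envelope sketch (ours, using only the numbers quoted on this slide):

```python
# Rough scale check using the figures quoted above.
SEC_PER_EVENT = 182          # ATLAS MC (gg -> H -> bbbar) on a 1 GHz box
EVENTS_PER_YEAR = 1e9        # ~10^9 events/experiment/year
SECONDS_PER_YEAR = 3.15e7

cpu_years = SEC_PER_EVENT * EVENTS_PER_YEAR / SECONDS_PER_YEAR
print(f"~{cpu_years:,.0f} dedicated 1 GHz CPUs for one simulation pass")
# ~5,800 CPUs per experiment per pass; several passes, several experiments,
# plus reconstruction and analysis lead to the ~100,000-processor
# (~300 TeraIPS) worldwide estimate for 2007.
```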

[Figure: detector readout and trigger chain. 40 MHz collision rate; detectors deliver charge, time and pattern from 16 million channels (energy, tracks); 3 Gigacell buffers; 100 kHz LEVEL-1 trigger; 500 readout memories with 200 GByte buffers; event builder at 500 Gigabit/s, fed by networks carrying 1 Terabit/s over 50,000 data channels; event filter at 20 TeraIPS; 1 MByte event data; Gigabit/s service LAN; PetaByte archive; Grid computing service at 300 TeraIPS.]
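The chain in the figure can be checked from the slide's numbers alone; a minimal sketch (our arithmetic, not from the slide):

```python
# Trigger-chain arithmetic from the figures above.
collision_rate = 40e6    # Hz into LEVEL-1
level1_rate = 100e3      # Hz out of LEVEL-1
event_size_mb = 1.0      # MByte of event data per built event

print(f"LEVEL-1 keeps 1 crossing in {collision_rate / level1_rate:.0f}")
print(f"Data rate after LEVEL-1: {level1_rate * event_size_mb / 1e3:.0f} GB/s "
      f"into the event builder and filter farm")
```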

The Challenges Ahead: Experiment Requirements: UK only

[Charts: UK experiment requirements by year, 2004–2007. Three panels: CPU (kSI2000·year), Disk (TB) and Tape (TB), stacked by experiment (ATLAS, CMS, LHCb, ALICE, Phenomenology, ZEUS, UKQCD, UKDMC, MINOS, MICE, LISA, D0, CRESST, CDF, BaBar, ANTARES) and grouped into LHC and non-LHC.]

Total Requirement:

Year           2004   2005   2006   2007
CPU [kSI2000]  2395   4066   6380   9965
Disk [TB]       369    735   1424   2285
Tape [TB]       376    752   1542   2623

GridPP Management

CB (20 members) meets half-yearly to provide the Institutes' overview

PMB (12 members) meets weekly [via VC] to provide management of the project

TB (10 members) meets as required in response to technical needs, and regularly via phone

EB (14 members) meets quarterly to provide the experiments' input

GridPP Project Overview

[Diagram: GridPP project map overview. A grid of numbered elements (1.1 up to 7.3), navigable up and down, with external links and links to goals. The seven areas and their recoverable labels:
1 LCG Creation: Applications, Computing Fabric, Grid Technology, Grid Deployment
2 CERN DataGrid: WP1, WP2, WP3, WP4, WP5, WP6, WP7, WP8
3 Applications: ATLAS/LHCb (GANGA/Gaudi and related areas), CMS (Monte Carlo), BaBar, CDF/D0 (SAM framework), UKQCD (QCD application), other data analysis
4 Infrastructure: Tier-1 Centre, Tier-A Centre, Tier-2 Centres, UK Testbed, UK Grid Rollout, Data Challenges
5 Interoperability: standards, open-source implementation, worldwide integration, participation in UK e-Science
6 Dissemination: presentation of GridPP, engagement of UK groups, attracting new resources
7 Resources: deployment of resources, monitoring of resources

GridPP Goal: To develop and deploy a large scale science Grid in the UK for the use of the Particle Physics community.]

GridPP Status: The Project Map

[Diagram: the full Project Map (status date 11-Sep-03), expanding each element of the overview into its numbered tasks and metrics (1.1.1 … 7.3.4), colour-coded per the legend: Metric OK; Metric not OK; Task complete; Task overdue; Due within 60 days; Task not due soon; No longer active; No task or metric. Navigation: up/down, external links, links to goals, Update and Clear controls. Column groups match the overview: LCG Creation (Applications, Fabric, Technology, Deployment); CERN DataGrid (WP1–WP8); Applications (ATLAS/LHCb, CMS, BaBar, CDF/D0, UKQCD, Other); Infrastructure (Tier-1, Tier-A, Tier-2, Testbed, Rollout, Data Challenges); Interoperability (Int. Standards, Open Source, Worldwide Integration, UK Integration); Dissemination (Presentation, Participation, Engagement, Developing); Resources (Deployment, Monitoring). GridPP Goal: To develop and deploy a large scale science Grid in the UK for the use of the Particle Physics community.]

Quarterly Reporting

[Table: per-post quarterly effort report for Q9–Q12 (quarter start dates 1-Jan-03, 1-Apr-03, 1-Jul-03 and 1-Oct-03), with Actual and Planned FTE columns for each quarter. Most posts report at their planned level (typically 1.50 or 3.00 per quarter), with a handful running above or below plan.]

Funded posts (Initial Surname, WG/Expt, Post/Description, Institute):
J Gordon, WP5, eSC, RAL; L Cornwall, WP3, PPD, RAL; L Cornwall, WP7, PPD, RAL; S Fisher, WP3, PPD, RAL; D Kelsey, WP7, PPD, RAL; D Kelsey, WP6, PPD, RAL; B Saunders, WP6, PPD, RAL; T Folkes, WP5, eSC, RAL; A Sansum, WP6, eSC, RAL; R Tasker, WP7, BITD, RAL; R Tam, WP5, PPD, RAL; S Traylen, WP6, PPD, RAL; J Jensen, WP5, eSC, RAL; G Kuznetsov, LHCb, PPD, RAL; S Burke, WP8, EUDG - WP8 topup, RAL; T Shah, WP5, PPD, RAL; D Colling, WP1, DGPP: WP6/WP1, Imperial; D Colling, WP6, DGPP: WP6/WP1, Imperial; W Bell, WP2, DGPP: WP2, Glasgow; R Cordenonsi, WP3, DGPP: WP3, QMUL; A McNab, WP6, DGPP: WP6, Manchester; C Cioffi, WP8/LHCb, DGPP: WP8, Oxford; M Gardner, WP8/Atlas, DGPP: WP8, RHUL; H Talini, WP8/CMS, DGPP: WP8, Imperial; F Brochu, WP8/Atlas, DGPP: WP8, Cambridge; A Washbrook, WP4, DGPP: WP5/WP4, Liverpool; A Washbrook, WP5, DGPP: WP5/WP4, Liverpool; O Moroney, WP6, DGPP: WP6, Bristol; O Moroney, WP8/CMS, DGPP: WP8, Bristol; P Mealor, WP7, DGPP: WP7, UCL; G Fairey, WP7, DGPP: WP7 + sppt, Manchester; G McCance, WP2, DGPP: WP2, Glasgow; A Holt, WP4, DGPP: WP4, Edinburgh; A Soroko, ATLAS, GANGA1, Oxford; K Harrison, LHCb, GANGA2, Cambridge; A Tan, ATLAS, Birmingham; B MacEvoy, CMS, Imperial; J Nebrensky, CMS, Brunel; T Barrass, CMS, Bristol; T Barrass, BaBar DB, Bristol; J Martyniak, BaBar JS, Imperial; A Forti, BaBar MD, Manchester; R Walker, D0, SAM 1, Imperial; A Flavell, CDF, SAM 2, Glasgow; S Stonjek, CDF, SAM 3, Oxford; D Evans, D0, Lancaster; J Perry, UKQCD, Edinburgh; M Egbert, UKQCD, Edinburgh.

Quarterly reporting allows comparison of delivered effort with expected effort

Feedback loop as issues arise
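The check these reports support is mechanical; a minimal sketch of it (hypothetical rows patterned on the table above, not the real register):

```python
# Compare delivered effort against plan, quarter by quarter (example data).
posts = {
    "Post A": {"Q9": (1.50, 0.60), "Q10": (1.30, 0.60), "Q11": (0.60, 0.60)},
    "Post B": {"Q9": (0.75, 1.50), "Q10": (0.50, 1.50), "Q11": (1.50, 1.50)},
}
for post, quarters in posts.items():
    for q, (actual, planned) in quarters.items():
        if actual != planned:
            print(f"{post} {q}: delivered {actual:.2f} vs planned {planned:.2f}")
```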

Funded Effort Breakdown (Snapshot 2003Q3)

[Pie chart: funded effort by area. LCG 35%; EDG 30%; Applications 18%; Tier-1/A 11%; Management 6%.]

LCG effort is the largest single area of GridPP

Future project priorities are focussed on LCG and EGEE

GridPP Status: Summary

• GridPP1 has now completed 2 of its 3 years

• All metrics are currently satisfied

• 103 of 182 tasks are complete

• 70 tasks are not yet complete but are not overdue

• 9 tasks are overdue:
– 6 are associated with LCG:
  • 2 of these are trivial (definition of future milestones)
  • 4 are related to the delay of LCG-1
– 2 are associated with applications (CMS and D0)
– 1 is associated with the UK infrastructure (test of a heterogeneous testbed)

Risk Register (Status April 03)

Each risk is scored against up to six areas (GridPP, LCG, EDG, Applications, Infrastructure, Interoperability) as (Likelihood, Impact, Risk = Li × Im):

R1 Recruitment/retention difficulties: (1,2,2) (2,2,4) (2,2,4) (2,2,4)
R2 Sudden loss of key staff: (1,4,4) (1,3,3) (1,3,3) (1,3,3)
R3 Minimal contingency: (1,2,2) (2,2,4) (1,2,2)
R4 GridPP deliverables late: (1,3,3) (4,2,8)
R5 Sub-components not delivered to project: (2,4,8) (2,3,6) (2,3,6) (2,3,6)
R6 Non take-up of project results: (2,1,2) (1,4,4) (1,4,4)
R7 Change in project scope: (1,1,1) (2,2,4) (1,1,1)
R8 Bad publicity: (2,2,4) (1,3,3) (1,3,3)
R9 No publicity: (2,1,2)
R10 External software dependence: (2,4,8) (1,4,4) (2,3,6)
R11 Lack of monitoring of staff: (1,2,2) (2,2,4) (1,3,3) (1,2,2)
R12 Withdrawal of an experiment: (1,4,4) (1,4,4) (1,2,2)
R13 Lack of cooperation between Tier centres: (1,4,4) (1,2,2)
R14 Scalability problems: (3,3,9) (2,3,6) (1,3,3)
R15 Software maintainability problems: (3,2,6) (1,3,3)
R16 Technology shifts: (1,3,3) (1,4,4) (1,3,3)
R17 Repetition of research: (3,2,6)
R18 Lack of funding to meet LCG Phase-1 goals: (1,3,3)
R19 Adequate persistency solution not ready: (1,3,3)
R20 Conflicting software requirements: (4,1,4) (2,2,4)
R21 Tier-A hardware fails to meet requirements: (1,3,3)
R22 Other hardware fails to meet requirements: (1,1,1)
R23 Hardware physical risk (large scale): (1,4,4)
R24 Hardware physical risk (small scale): (2,2,4)
R25 Hardware procurement problems: (2,2,4)
R26 LAN bottlenecks: (2,1,2)
R27 Tier-2 organisation fails: (1,2,2)
R28 Tier-2 hardware not used as planned: (2,1,2)
R29 SYSMAN effort inadequate: (3,3,9)
R30 Firewalls interfere with Grid: (1,3,3)
R31 Inability to establish trust relationships: (2,2,4)
R32 Security inadequate to operate Grid: (1,3,3)
R33 GGF does not establish standards: (1,3,3)
R34 Minimal open source code development: (2,2,4)
R35 Failure of international cooperation: (1,4,4) (1,4,4)
R36 e-Science and GridPP divergence: (2,2,4)
R37 Institutes do not embrace Grid: (1,3,3)
R38 Grid is not stable enough for use: (4,2,8)
R39 Delay of the LHC: (2,2,4)
R40 Lack of future funding: (2,4,8) (2,2,4)
R41 Network backbone failure: (0,4,0)
R42 Network backbone bottleneck: (1,2,2)

Highest-scoring risks flagged on the slide:
• Scaling up to a production system (LCG-1 deployment)
• System management effort at UK Tier-2 sites (being addressed as part of GridPP2)
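The scores above follow Risk = Likelihood × Impact; a minimal sketch of the scoring and flagging (the threshold is our illustrative choice):

```python
# Risk = Likelihood x Impact, as in the (Li, Im, Risk) triples above.
def risk_score(likelihood: int, impact: int) -> int:
    return likelihood * impact

register = {                                  # worst-case triples from above
    "R14 Scalability problems": (3, 3),
    "R29 SYSMAN effort inadequate": (3, 3),
    "R38 Grid is not stable enough for use": (4, 2),
}
for name, (li, im) in register.items():
    score = risk_score(li, im)
    flag = "  <-- high" if score >= 8 else ""
    print(f"{name}: {li} x {im} = {score}{flag}")
```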

UK Certificates and VO Membership

1. UK e-Science CA now used in production EDG testbed

2. PP “users” engaged from many institutes

3. UK participating in 6 of 9 EDG VOs

[Bar chart: total members vs. UK members for each EDG VO (BaBar, EObs, ITeam, LHCb, Alice, BioMed, CMS, Atlas, WP6), on a scale of 0–120 members.]

LHC Computing Grid Service

• Certification and distribution process established

• Middleware package – components from:
– European DataGrid (EDG 2.0)
– US (Globus, Condor, PPDG, GriPhyN) Virtual Data Toolkit, VDT 1.1

• Agreement reached on principles for registration and security

• RAL to provide the initial grid operations centre

• FZK to operate the call centre

• Initial service being deployed now to 10 centres in the US, Europe and Asia: Academia Sinica Taipei, BNL, CERN, CNAF, FNAL, FZK, IN2P3 Lyon, Moscow State Univ., RAL, Univ. Tokyo

• Expand to other centres as soon as the service is stable

UK Deployment Overview

• Significant resources within EDG, currently being upgraded to EDG2.

• Integrating EDG on farms has been repeated many times, but it is difficult.

• Sites are keen to take part in EDG2 now, with LCG1 deployment to follow.

• By the end of the year many HEP farms plan to be contributing to LCG1 resources.

• Basis of the deployment input to the LCG plan.

• Input from the Tier-1 (~50%) initially and four distributed Tier-2s (50%) on a ~Q1 2004 timescale.

LCG resources committed for 1Q04:

Country         CPU (kSI2K)  Disk (TB)  Support (FTE)  Tape (TB)
CERN                    700        160           10.0       1000
Czech Republic           60          5            2.5          5
France                  420         81           10.2        540
Germany                 207         40            9.0         62
Holland                 124          3            4.0         12
Italy                   507         60           16.0        100
Japan                   220         45            5.0        100
Poland                   86          9            5.0         28
Russia                  120         30           10.0         40
Taiwan                  220         30            4.0        120
Spain                   150         30            4.0        100
Sweden                  179         40            2.0         40
Switzerland              26          5            2.0         40
UK                     1656        226           17.3        295
USA                     801        176           15.5       1741
Total                  5600       1169          120.0       4223
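The UK is the largest single CPU contribution in the table; its share of the totals (our arithmetic on the table's own figures):

```python
# UK share of the LCG resources committed for 1Q04 (numbers from the table).
total = {"CPU (kSI2K)": 5600, "Disk (TB)": 1169, "Tape (TB)": 4223}
uk = {"CPU (kSI2K)": 1656, "Disk (TB)": 226, "Tape (TB)": 295}
for k in total:
    print(f"UK share of {k}: {100 * uk[k] / total[k]:.0f}%")
# -> roughly 30% of CPU, 19% of disk and 7% of tape
```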

EDG 2.0 Deployment Status (12/9/03)

• RAL (Tier1A): Up and running with 2.0.1. UI gppui04 available (as part of CSF), and an offer to give access to an LCFGng node to help people compare with their own LCFGng setup.
• IC: Existing WP3 testbed site is at 2.0.0. Standard 2.0 RB available.
• UCL: Trying to go to 2.0: SE up so far.
• QMUL: 2.0 installation ongoing.
• RAL (PPD): 2.0.0 site up and running.
• Oxford: wait until October for 2.0.
• Birmingham: Working on getting a 2.0 site up next week.
• Bristol: WP3 testbed site at 2.0.0. Also doing a new 2.0 site install: UI and MON up, still doing CE, SE and WN.
• Cambridge to follow.
• Manchester: Trying to get 2.0.1 set up.
• Glasgow: Concentrating on commissioning new hardware during the next month; will wait until then before going to 2.0.
• Edinburgh to follow.

Tier-1 @ RAL

[Diagram: grid nodes at RAL.]
• LCG Testbed: CE, SE, 5×WN (LCG 1.0/EDG 2.0)
• Tier-1/A: CE, 230×WN (LCG 1.0/EDG 2.0)
• WP3 Testbed: CE, SE, MON (EDG 2.0)
• EDG Dev Testbed: CE, SE, MON, 1×WN (EDG 2.0), with an SE attached to the ADS
• LCG0 Testbed: CE, SE, 1×WN

Also:
• UI within CSF.
• NM for EDG2.
• Top-level MDS for EDG.
• Various WP3 and WP5 dev nodes.
• VOMS for the Dev TB.
• http://ganglia.gridpp.rl.ac.uk/

London Grid: Imperial College

[Diagram: grid nodes at Imperial College.]
• EDG Testbed: CE, SE, WNs (EDG 2.0)
• BaBar Farm: CE, WNs (EDG 2.0)
• CMS-LCG0: CE, SE, WN
• WP3 Testbed: CE, SE, MON, 1×WN (EDG 2.0)

Also:
• RB for EDG 2.0.
• Plan to be in LCG1 and other testbeds.

London Grid: Queen Mary and UCL

Queen Mary:
• EDG Testbed: CE, SE, 1×WN (EDG 1.4), plus 32×WN; the CE also feeds EDG jobs to the 32-node e-Science farm.
• Plan to have LCG1/EDG2 running by the end of the year.
• Expansion with SRIF grants (64 WN + 2 TB in Jan 2004; 100 WN + 8 TB in Dec 2004).
• http://194.36.10.1/ganglia-webfrontend

UCL:
• EDG Testbed: CE, SE, 1×WN (EDG 1.4).
• Network Monitors for WP7 development.
• SRIF bid in place for 200 CPUs by the end of the year, to join LCG1.

Southern Grid: Bristol

[Diagram: grid nodes at Bristol.]
• EDG Testbed: CE, SE, 1×WN (EDG 2.0)
• WP3 Testbed: CE, SE, MON, 1×WN (EDG 2.0)
• CMS/LHCb Farm: CE, SE, 24×WN (CMS-LCG0)
• BaBar Farm: CE, SE, 78×WN (EDG 1.4)

Also:
• GridPP RC.
• Plan to join LCG1.

Southern Grid: Cambridge and Oxford

Cambridge:
• EDG Testbed: CE, SE, 15×WN (EDG 1.4).
• Farm shared with local NA-48 and GANGA users.
• Some RH73 WNs for the ongoing ATLAS challenge.
• 3 TB GridFTP SE.
• Plan to join LCG1/EDG2 later in the year with an extra 50 CPUs.
• EDG jobs will be fed into the local e-Science farm.
• http://farm002.hep.phy.cam.ac.uk/cavendish/

Oxford:
• EDG Testbed: CE, SE, 2×WN (EDG 1.4).
• Plan to join EDG2/LCG1.
• Nagios monitoring has been set up (RAL is also evaluating Nagios).
• Planning to send EDG jobs into the 10-WN CDF farm.
• 128-node cluster being ordered now.

Southern Grid: RAL PPD and Birmingham

RAL PPD:
• EDG Testbed: CE, SE, 9×WN (EDG 2.0)
• WP3 Testbed: CE, SE, MON, 1×WN (EDG 2.0)
• PPD User Interface.
• Part of the Southern Tier-2 Centre within LCG1.
• 50 CPUs and 5 TB of disk expected by the end of the year.

Birmingham:
• EDG Testbed: CE, SE, 1×WN (EDG 1.4)
• Expansion to 60 CPUs and 4 TB.
• Expect to participate in LCG1/EDG2.

NorthGrid: Manchester and Liverpool

Manchester:
• EDG Testbed: CE, SE (1.5 TB), 80×WN (EDG 1.4)
• BaBar Farm: CE, SE (5 TB), 60×WN (EDG 1.4)
• DZero Farm: CE, SE, 9×WN (EDG 1.4)
• GridPP and BaBar VO servers.
• User Interface.
• Plan that the DZero farm will join LCG.
• SRIF bid in place for significant HEP resources.

Liverpool:
• EDG Testbed: CE, SE, 1×WN (EDG 1.4)
• Plan to follow EDG 2, possibly integrating the newly installed Dell cluster (funded by the NW Development Agency) and the BaBar farm. Largest single Tier-2 resource.

ScotGrid: Glasgow, Edinburgh and Durham

Glasgow:
• ScotGRID: CE, SE, 59×WN (EDG 1.4); shared resources (CDF, LHC and Bioinformatics).
• WNs on a private network with outbound NAT in place.
• Various WP2 development boxes.
• 34 dual blade servers just arrived; 5 TB FastT500 expected soon.
• WP3 Testbed: CE, SE, MON (EDG 2.0).

Edinburgh and Durham:
• Edinburgh: 24 TB FastT700 and an 8-way server just arrived.
• Durham: existing farm available.
• Plan to be part of LCG.

Testbed Status, Summer 2003

UK-wide development using EU-DataGrid tools (v1.47). Deployed during Sept 02–03. Currently being upgraded to v2.0. See http://www.gridpp.ac.uk/map/

Meeting Current LHC Requirements: Experiment Accounting

[Bar chart: number of normalised processors per country, across 19 regions: 48, 9, 765, 900, 0, 136, 105, 332, 15, 154, 297, 218, 326, 0, 256, 348, 127, 1156, 758.]

Production by regional centre, per stage:

Regional Centre  Simulation  Hits  No Pile-Up  2x10^33  2x10^34  NassID
Bristol/RAL        0.55      0.33     0.04       0.06     0.02     20
Caltech            0.17      0.15     0.00       0.15     0.00      6
CERN               0.89      2.20     1.40       2.66     2.25    300
Fermilab           0.35      0.41     0.00       0.25     0.33     70
ICST&M             0.88      0.59     0.50       0.15     0.12     84
IN2P3              0.20      0.00     0.00       0.00     0.00      1
INFN               1.55      1.18     0.40       0.72     0.71     99
Moscow             0.43      0.14     0.14       0.00     0.00     41
UCSD               0.34      0.30     0.00       0.29     0.30     80
UFL                0.54      0.04     0.00       0.04     0.04     11
USMOP              0.00      0.00     0.00       0.00     0.00      1
Wisconsin          0.07      0.08     0.00       0.06     0.00     12
TOTAL              5.94      5.40     2.47       4.36     3.77

Experiment-driven project. Priorities determined by the Experiments Board.
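"Normalised processors" scales raw CPU counts by benchmark speed so that unlike farms can be compared; a minimal sketch, assuming a kSI2K-style reference (the slide does not state the exact reference used):

```python
# Normalise a CPU count to a reference benchmark speed (assumed kSI2K-style).
def normalised_processors(n_cpus: int, cpu_si2k: float,
                          reference_si2k: float = 1000.0) -> float:
    """Scale a raw CPU count by its benchmark speed relative to the reference."""
    return n_cpus * cpu_si2k / reference_si2k

# 200 CPUs rated at 650 SI2K each count as 130 "normalised" processors
print(normalised_processors(200, 650.0))
```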

Tier-1/A Accounting

[Charts: Tier-1/A usage by experiment: LHCb, ATLAS, CMS and BaBar.]

Monthly accounting: online Ganglia-based monitoring, see http://www.gridpp.ac.uk/tier1a/. Last month: CMS and BaBar jobs.

Annual accounting: ATLAS, CMS and LHCb jobs; generally dominated by BaBar since January.

Today’s Operations

1. Support Team
• Built from sysadmins: 4 funded by GridPP to work on EDG WP6; the rest are site sysadmins.

2. Methods
• Email list, phone meetings, personal visits, job-submission monitoring.
• RB, VO, RC for UK use, to support non-EDG use.

3. Rollout
• Experience from RAL in the EDG dev testbeds, and from IC and Bristol in the CMS testbeds.
• 10 sites have been part of the EDG application testbed at one time.

GridPP2 Operations

[Chart: operations staffing, 0–10 FTE, GridPP1 vs. GridPP2. The GridPP1 Testbed Team (University WP6 posts, a RAL WP6 post and a WP8 post) grows into the GridPP2 Production Team (an Operations Manager, four Tier-2 Experts, two Tier-1 Experts and an Applications Expert).]

• To move from testbed to production, GridPP plans a bigger team with a full-time Operations Manager

• Manpower will come from the Tier-1 and Tier-2 Centres, which will contribute to the Production Team

• The team will run a UK Grid which will belong to various grids (EDG, LCG, ..) and also support other experiments

[Diagram: middleware build, certification and deployment flow. WPs add unit-tested code to the repository; the Build System runs nightly builds and automated tests, with individual WP tests on the Development Testbed (~15 CPU). The Integration Team runs overall release tests to produce release candidates (tagged releases). A tagged release is selected for certification: the Test Group runs grid certification and application certification on the Certification Testbed (~40 CPU), with problem reports fed back and problems fixed, yielding certified releases. With input from the applications representatives, a certified release is selected for deployment as a certified public release for use by the applications on the Production Testbed (~1000 CPU, 24×7), delivered to Users as a tagged package.]

LCG Operations

• RAL has led the project to develop an Operations Centre for LCG1:
– Applied GridPP and MapCenter monitoring to LCG1
– Dashboard combining several types of monitoring
– Set up a web site with contact information
– Developing the Security Plan
– Accounting (the current priority, building upon resource centre and experiment accounting)

• RAL is also leading the LCG Security Group:
– Written 4 documents setting out procedures and User Rules
– Working with the GOC task force on Security Policy
– Risk analysis and further planning for LCG in 2004


EGEE Technical Annex: nearing completion

EGEE

[Diagram: UK operations effort. Tier-1 (16.5 FTE); UK Team (8 FTE); UK GSC (2 FTE); EGEE ROC (5 FTE); EGEE CIC (4.5 FTE); plus (2 FTE).]

• The UK Production Team will be expanded as part of EGEE ROC and CIC posts to meet EGEE requirements

• To deliver an EGEE grid infrastructure that must also deliver to other communities and projects

• Could do this just within PP (matching funding available) but also want to engage fully with UK Core programme

• Ongoing discussions…

Possibility for future ~1 year timescale (10/7/03, PC)

[Diagram: an L2 Grid (application on GT2) and a PP Grid (PP application on GT2) feeding into EGEE-0 (GT2).]

If the UK is backing EGEE (which it is), then it probably makes sense to embrace EGEE-0, which will be based upon GT2. This way we would influence development and reap the benefit of leverage.

“The Italian Job”: Management structure for the production Grid (~36 people)

[Diagram: Experiments and Grid projects feed a Coordination Committee. Management Operations covers planning/deployment, resource-usage policy and management tools, alongside Grid Resource Coordination. Teams: Central Team (monitoring, authorisation, site management); Grid Service support; VO support; User support; application support for experiments or research organisations; release/configuration management (release distribution, documentation and porting); EGEE/LCG coordination. Team sizes shown: 4p, 6p, 8p, 4p, 4p, 2p, 2p.]

Tier-1/A Services [FTE]

• High quality data services
• National and International Role
• UK Focus for International Grid development
• Highest single priority within GridPP2

Current Planning [FTE]:
CPU            2.0
Disk           1.5
AFS            0.0
Tape           2.5
Core Services  2.0
Operations     2.5
Networking     0.5
Security       0.0
Deployment     2.0
Experiments    2.0
Management     1.5
Total         16.5

Tier-2 Services [FTE]

• Four Regional Tier-2 Centres:
– London: Brunel, Imperial College, QMUL, RHUL, UCL.
– SouthGrid: Birmingham, Bristol, Cambridge, Oxford, RAL PPD.
– NorthGrid: CLRC Daresbury, Lancaster, Liverpool, Manchester, Sheffield.
– ScotGrid: Durham, Edinburgh, Glasgow.

• Hardware provided by Institutes

• GridPP provides added manpower

Role                 FTE
Tier Centre Experts  4 (taken from those below)
Hardware Support     8
Core Services        4
User Support         2
Specialist Support   6
Total                20

Current Planning [FTE]:
                    Y1     Y2     Y3
Hardware Support    4.0    8.0    8.0
Core Services       4.0    4.0    4.0
User Support        1.0    2.0    2.0
Specialist Services:
  Security          1.0    1.0    1.0
  Resource Broker   1.0    1.0    1.0
  Network           0.5    0.5    0.5
  Data Management   2.0    2.0    2.0
  VO Management     0.5    0.5    0.5
Subtotal           14.0   19.0   19.0
Existing Staff     -4.0   -4.0   -4.0
GridPP2            10.0   15.0   15.0
Total: 40.0 staff-years

Operational Roles

• Core Infrastructure Services (CIC):
– Grid information services
– Monitoring services
– Resource brokering
– Allocation and scheduling services
– Replica data catalogues
– Authorisation services
– Accounting services

• Still to be defined fully in EGEE

• Core Operational Tasks (ROC):
– Monitor infrastructure, components and services
– Troubleshooting
– Verification of new sites joining the Grid
– Acceptance tests of new middleware releases
– Verify suppliers are meeting SLAs
– Performance tuning and optimisation
– Publishing use figures and accounts

LCG Level 1 Milestones proposed to LHCC

M1.1 - June 03: First Global Grid Service (LCG-1) available (this milestone and M1.3 defined in detail by end 2002)
M1.2 - June 03: Hybrid Event Store (Persistency Framework) available for general users
M1.3a - November 03: LCG-1 reliability and performance targets achieved
M1.3b - November 03: Distributed batch production using grid services
M1.4 - May 04: Distributed end-user interactive analysis (detailed definition of this milestone by November 03)
M1.5 - December 04: “50% prototype” (LCG-3) available (detailed definition of this milestone by June 04)
M1.6 - March 05: Full Persistency Framework
M1.7 - June 05: LHC Global Grid TDR

RB Stress Tests (9/9/03): Benchmark for LCG Deployment

• Status (LCG report by Ian Bird):

• The RB never crashed

• It ran without problems at a load of 8.0 for several days in a row: 20 streams with 100 jobs each (a typical error rate of ~2% still present)

• RB stress test in a job storm of 50 streams of 20 jobs each:
– 50% of the streams ran out of connections between the UI and the RB (a configuration parameter, but subject to machine constraints)
– The remaining 50% of the streams finished normally (2% error rate)
– The time between job-submit and return of the command (acceptance by the RB) is 3.5 seconds, independent of the number of streams

• NOTE: the RB interrogates all suitable CEs, a wide-area delay-killer
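The "job storm" above is easy to reproduce in outline; a sketch of a storm driver (ours: edg-job-submit is the standard EDG submission command, but the JDL file name and this harness are hypothetical):

```python
# Drive N parallel submission streams of M jobs each through the EDG CLI.
import subprocess
from concurrent.futures import ThreadPoolExecutor

STREAMS, JOBS_PER_STREAM = 20, 100      # the load-8.0 configuration above

def stream(_):
    failures = 0
    for _ in range(JOBS_PER_STREAM):
        result = subprocess.run(["edg-job-submit", "hello.jdl"],
                                capture_output=True)
        failures += result.returncode != 0
    return failures

with ThreadPoolExecutor(max_workers=STREAMS) as pool:
    failed = sum(pool.map(stream, range(STREAMS)))
print(f"{failed}/{STREAMS * JOBS_PER_STREAM} submissions failed "
      f"(~2% was typical in the test)")
```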

Next m/w steps (9/9/03)

• Status (LCG report by Ian Bird). Next LCG-1 upgrades:
• The same software recompiled with gcc 3.2
• New VDT? Based on Globus 2.4? LCG will work on this issue.
• Add VOMS
• Add RLI? – see discussion of RLS
• R-GMA now seems to be off the table *Corrective Action 16/9/03*
• Working group to resolve data access issues:
– Components exist: SRM, GFAL, RLS, GridFTP
– Need to make them coherent based on use-cases and integrate with MSSs

For more features, apply the simple rule: add a feature if and only if
– it is proven to work (by EDG and LCG together)
– it adds some desirable feature, or a feature requested by users
– it makes the user’s application setup significantly simpler or more practical
– it is required by new user applications

Bug fixing will always have the highest priority. Target release date for the 2004 system: November.

RGMA – Status (16/9/03)

• Running on the WP3, EDG-development and EDG-application testbeds
• Application deployment: 29 CEs, 11 SEs, 10 sites in 6 countries
– RGMA browser access in < 1 sec
• Monitoring scripts are being run on the testbeds, with results linked from the WP3 web page: http://hepunx.rl.ac.uk/edg/wp3/
• Registry replication is being tested on the WP3 testbed
– Better performance and higher reliability required
• Authentication successfully tested on the WP3 testbed
• Two known bugs remain:
– Excessive threads requiring a GOUT machine restart. New code with extensive unit tests has been developed and is now being tested on the WP3 testbed; it will support at least 90 sites.
– The latest-Producer choosing algorithm fails to reject bad LPs, which shows up as intermittent absence of information. A revised algorithm needs coding (a localised change).

RGMA - Users

• Users and interfaces to other systems:
– Resource Broker
– CMS (BOSS)
– Service and service status for all EDG services
– Network Monitoring & Network Cost Function
– MapCenter
– Logging & Bookkeeping
– UK e-Science, CrossGrid and BaBar (evaluating)
– Replica Manager
– MDS (GIN/GOUT)
– Nagios
– Ganglia (Ranglia)

• Future: direct RB use of RGMA (no GOUT), for better performance and reliability
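R-GMA presents monitoring as a relational model: producers insert tuples, consumers issue SQL-style queries. A schematic sketch of that pattern only, using a hypothetical in-memory stand-in (the real R-GMA APIs differ in detail):

```python
# Toy producer/consumer model in the spirit of R-GMA (not its real API).
class ToyRegistry:
    """In-memory stand-in for the R-GMA registry plus producers."""
    def __init__(self):
        self.tables = {}
    def insert(self, table, row):           # what a Producer does
        self.tables.setdefault(table, []).append(row)
    def select(self, table, predicate):     # what a Consumer does
        return [r for r in self.tables.get(table, []) if predicate(r)]

rgma = ToyRegistry()
# e.g. a site publishes service status; the RB or MapCenter consumes it
rgma.insert("ServiceStatus", {"site": "RAL", "service": "CE", "up": True})
print(rgma.select("ServiceStatus", lambda r: r["site"] == "RAL"))
```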

SRB for CMS

• UK eScience has been interested in SRB for several years.

• CCLRC has gained expertise for other projects and is collaborating with SDSC

• Now hosting MCAT for worldwide CMS pre-DC04

• Interfaced to the RAL Datastore – service started 1 July 2003

• 183,000 files registered

• 10 TB of data stored in system

• Used across 13 sites worldwide including CERN and Fermilab

• 30 Storage resources managed across the sites

[Diagram: SRB architecture. The SRB client queries the MCAT server (backed by the MCAT database) to locate data, then transfers files directly with SRB servers A and B (steps a–g).]
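A sketch of the client's-eye view under this architecture, using the standard SRB "Scommands" client tools (file and collection names here are hypothetical, and this is illustrative rather than the exact CMS procedure):

```python
# Store and retrieve a file via SRB; the MCAT records where replicas live.
import subprocess

def srb(*args):
    """Run an SRB S-command, failing loudly on error."""
    subprocess.run(args, check=True)

srb("Sinit")                                    # authenticate to the MCAT
srb("Sput", "hits_001.root", "/cms/preDC04/")   # store; MCAT records location
srb("Sls", "/cms/preDC04/")                     # browse the logical collection
srb("Sget", "/cms/preDC04/hits_001.root", ".")  # fetch from whichever server holds it
srb("Sexit")                                    # close the session
```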

EDG StorageElement

• Not initially adopted by LCG1

• Since then limited SRM functionality has been added to support GFAL – available for test by LCG

• Full SRMv1 functionality has been developed and is currently being integrated on internal testbed

• GACLs being integrated

[Diagram: network bandwidths: 622 Mb/s; UK internal; 2×2.5 Gb/s; 2×2.5 Gb/s.]

OptorSim: File Replication Simulation

Test P2P file replication strategies, e.g. economic models:

1. Optimisation principles applied to the GridPP 2004 testbed with realistic PP use patterns/policies.

2. Job scheduling: queue access cost takes into account queue length and network connectivity. Anticipate replicas needed at “close” sites using three replication algorithms.

3. Build in realistic JANET background traffic.

4. Replication algorithms optimise CPU use/job time as replicas are built up on the Grid.
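The economic model reduces to a cost/value comparison; a toy version of the decision rule (the numbers and value model are illustrative, not OptorSim's actual implementation):

```python
# Replicate a file locally when predicted future accesses outweigh the
# transfer cost (toy version of an economic replication strategy).
def should_replicate(predicted_accesses: float,
                     remote_access_cost_s: float,
                     transfer_cost_s: float) -> bool:
    value_of_replica = predicted_accesses * remote_access_cost_s
    return value_of_replica > transfer_cost_s

# e.g. 12 expected reads at 30 s each over the WAN vs a 90 s transfer
print(should_replicate(12, 30.0, 90.0))   # True -> replicate locally
```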

Middleware, Security and Network Service Evolution

[Charts: middleware effort, GridPP1 → GridPP2, up to ~30 FTE, by area: Security, Networking, Data Management, Information Services, Workload Management (formerly EDG WP1–WP5 and WP7), split into GridPP and non-GridPP shares.]

• Information Services [5+5 FTE] and Networking [3.0+1.5+1.5 FTE]: strategic roles within EGEE
• Security expands to meet requirements
• Data and Workload Management continue
• No further configuration-management development

• Programme defined by:
– mission criticality (driven by experiment requirements)
– international/UK-wide lead
– leverage of EGEE, UK core and LCG developments

• Redefine the (EDG-based) UK Work Packages into five Development Groups

Current Planning [FTE]: Security 3.5; Info-Mon. 4.0; Data & Storage 4.0; Workload 1.5; Networking 3.0; TOTAL 16.0

Application Interfaces - Service Evolution

Experiment: present Grid activities [present posts] → proposed new Grid activities

ATLAS: MC production; EDG integration [2.5] → MC production; metadata; LCG integration; persistency and data management
CMS: MC production; persistency and data management; workload management; monitoring [3] → MC production; persistency and data management; workload management; monitoring
LHCb: MC production; metadata; persistency [2] → MC production; metadata; LCG integration; persistency and data management
GANGA (ATLAS+LHCb): common Grid user interface [2] → common Grid user interface; LCG integration; monitoring
BaBar: job submission; replica catalogue; persistency [2.5] → job submission; replica catalogue; persistency; MC production
DØ: MC production; SAM development [2] → MC production; SAM deployment
CDF: MC production; SAM development [2] → MC production; SAM deployment
UKQCD: data access; job submission [1] → data federation; data binding
WP8: applications integration [1] → applications integration

New activities (no present posts): UKDMC (MC production; analysis); ZEUS (MC production); PhenoGrid (metadata); MICE (MC production; data handling); ANTARES (data handling); CALICE (MC production); LC-ABD (metadata)

Total present posts: 18

• Applications:
– 18 FTEs: the ongoing programme of work can continue
– Difficult to involve experiment activity not already engaged within GridPP

• The project would need to build on cross-experiment collaboration; GridPP1 already has experience:
– GANGA: ATLAS & LHCb
– SAM: CDF & D0
– Persistency: CMS & BaBar

• Encourage new joint developments across experiments

Current planning based upon £19.6m Funding Scenario

[Chart: GridPP2 cost breakdown across Tier-0 Hardware, Tier-0 Staff, Tier-1 Hardware, Tier-1 Staff, Tier-2 Hardware, Tier-2 Staff, Application Integration, LHC Application Development, Non-LHC Application Development, Middleware/Security/Network and Operations/Management/Dissemination; bar values £0.24m, £2.21m, £2.41m, £4.01m, £3.05m, £3.14m, £4.82m, £2.25m, £3.16m (scale £0m–£7m). Annotations: 23% Institutes; 27% EGEE + Others; Experiment Collaboration Bids; 98% non-PPARC; ?50% of £5m SRIF-1/SRIF-2; 18% CLRC.]

[Chart: spend profile 2003–2008 (£0m–£14m per year), split between the GridPP2 Proposal (Tier-0/1/2 hardware and staff, Application Integration, Middleware/Security/Network, Dissemination, Travel and Operations, Management) and externally funded items (Application Development, Tier-1 and 2 staff, Middleware/Security/Network, Tier-2 Hardware).]

PPARC Review Timeline:
• Projects Peer Review Panel (14-15/7/03)
• Grid Steering Committee (28-29/7/03)
• Science Committee (October 03)

Dissemination: e-Science and Web

e-Science ‘All Hands’ Meeting held at Nottingham, 2-4 September 2003

– ~ 500 people in total

– ~ 20 GridPP People attended

– ~ 17 GridPP Abstracts accepted

– ~ 10 GridPP Papers published + posters displayed

– 6 GridPP Invited talks

– 3 GridPP Demonstrations

• Next step: SC2003 (DC et al.)

• … and a dissemination officer…

GridPP Web Page Requests:

Currently ~35,000 per month

month      reqs     pages
Nov 2001   29918     4374
Dec 2001   29315     5576
Jan 2002   50892     7594
Feb 2002   66166     7724
Mar 2002  135683     8180
Apr 2002  222939    12008
May 2002  249830    11879
Jun 2002  205480    12679
Jul 2002  148604    14125
Aug 2002  186690    21449
Sep 2002  266853    24318
Oct 2002  237031    26893
Nov 2002  243710    29796
Dec 2002  204119    27101
Jan 2003  251185    27291
Feb 2003  295381    30002
Mar 2003  419985    35650
Apr 2003  316816    34548

LCG Press Release…

• LHC Computing Grid goes Online

• CERN, Geneva, 29 September 2003.

• The world's particle physics community today announces the launch of the first phase of the LHC computing Grid (LCG-1). Designed to handle the unprecedented quantities of data that will be produced at CERN's Large Hadron Collider (LHC) from 2007 onwards, the LCG will provide a vital test-bed for new Grid computing technologies. These are set to revolutionise the way we use the world's computing resources in areas ranging from fundamental research to medical diagnosis.

• “The Grid enables us to harness the power of scientific computing centres wherever they may be to provide the most powerful computing resource the world has to offer” (Les Robertson)

Conclusions

• Visible progress this year in GridPP1

• Management via the Project Map and Project Plan

• High level tasks and metrics: under control

• Major component is LCG:
– We contribute significantly to LCG and our success depends critically on LCG

• Middleware components on critical path w.r.t. LCG adoption

• Deployment – high and low level perspectives merge via monitoring/accounting

• Resource centre and experiment accounting are both important

• Today’s operations in the UK are built around a small team

• Future operations planning expands this team: Production Manager being appointed

• Middleware deployment: focus on Information Service performance

• Security (deployment and policy) is emphasised

• “Production Grid” will be difficult to realise: need to start GridPP2 planning now (already underway)

• GridPP2 proposal: formal feedback in November

• Transition period for:
– Middleware/Security/Networking Groups
– Experiments Phase II
– Production Grid Planning

GridPP8 - Welcome

• Monday September 22nd
• 10:30-11:00 Arrive - Coffee
• Opening Session (Chair: Nick Brook)
• 11:00-11:30 Welcome and Introduction - Steve Lloyd
• 11:30-12:00 GridPP2 Next Stages - Tony Doyle
• 12:00-12:30 LCG Overview - Tony Cass
• 12:30-13:30 Lunch
• Application Developments I: Grid User Interfaces (Chair: Roger Barlow)
• 13:30-14:00 The GANGA Interface for ATLAS/LHCb - Alexander Soroko
• 14:00-14:30 GANGA for BaBar - Janusz Martyniak
• 14:30-15:00 UKQCD interfaces - James Perry
• 15:00-15:30 CMS analysis framework - Hugh Talini
• 15:30-16:00 Coffee
• Application Developments II (Chair: Roger Jones)
• 16:00-16:25 Control/Monitoring for LHCb Production - Gennady Kuznetsov
• 16:25-16:50 BaBar Grid status - Alessandra Forti
• 16:50-17:10 JIM and the Gridification of SAM - Morag Burgon-Lyon
• 17:10-17:30 UKQCD progress - Lorna Smith
• 17:30-17:50 CMS Status - Peter Hobson
• 17:50-18:10 D0 Status - Rod Walker
• 18:10-18:30 ATLAS Status - Alvin Tan
• 18:30-18:35 Logistics - Dave Newbold
• 19:00 Collaboration Dinner at Goldney Hall

• Tuesday 23rd September
• Bristol e-Science, GSC Operation and Tier Centre Reports (Chair: David Britton)
• 09:00-09:15 Bristol e-Science - Tim Phillips, Deputy Director of Information Services
• 09:15-09:30 HP Grid Strategy - Paul Vickers
• 09:30-10:00 Grid Support and Operations Centre - John Gordon
• 10:00-10:30 Tier-1 report - Andrew Sansum
• 10:30-10:45 London Tier-2 Report - Dave Colling
• 10:45-11:00 Southern Tier-2 Report - Jeff Tseng
• 11:00-11:30 Coffee
• Tier-2 Centre Reports and Planning (Chair: John Gordon)
• 11:30-11:45 Northern Tier-2 Report - Andrew McNab
• 11:45-12:00 ScotGrid Tier-2 Report - Akram Khan
• 12:00-12:30 Tier-2 Centre Development Plans - Steve Lloyd
• 12:30-13:30 Lunch
• Middleware Development (Chair: Steve Lloyd)
• 13:30-14:00 UK e-Science BOF session summary - Robin Middleton
• 14:00-14:20 Workload Management - Dave Colling
• 14:20-14:40 Data Management - Paul Millar
• 14:40-15:00 Information & Monitoring - Steve Fisher
• 15:00-15:20 Security - Dave Kelsey / Andrew McNab
• 15:20-15:40 Networking - Peter Clarke
• 15:40-16:00 Fabric Management - Lex Holt