UK e-Infrastructure for Research - UK/USA HPC Workshop, Oxford, July 2015


Transcript of UK e-Infrastructure for Research - UK/USA HPC Workshop, Oxford, July 2015

Page 1: UK e-Infrastructure for Research - UK/USA HPC Workshop, Oxford, July 2015


UK e-Infrastructure for Research

Michael Ball, BBSRC
Frances Collingborn, NERC
Martin Hamilton, Jisc
David de Roure, ESRC / University of Oxford

Photo credit: STFC

Page 2

UK e-Infrastructure for research

1. UK e-Infrastructure for research
– Public funding for major science facilities and institutes
– Support for translation from R&D into business

2. e-Infrastructure survey
– Build inventory of the e-Infrastructure
– Operating systems and software environment
– Funding and budgeting models
– Training and support arrangements
– Academic and industrial impact

3. RCUK e-Infrastructure roadmap
– Vision and aspirations
– Investment plan

Photo credit: EPCC / EPSRC

Page 3


1. UK e-Infrastructure for research

Page 4

UK e-Infrastructure for research

HPC Project | RC | Amount/£M
National Service | EPSRC, NERC | 43
Hartree Centre | STFC | 30
DiRAC | STFC | 15
GridPP | STFC | 3
The Genome Analysis Centre (TGAC) | BBSRC | 8
Monsoon | NERC/Met Office | 1
JASMIN2 & CEMS | NERC & UKSA | 7.75
Regional Centres (N8, SES5, MID+, HPC Midlands, ARCHIE-WeSt) | EPSRC | 6.5
JANET Network and Authentication Moonshot | Jisc | 31
HPC Data Storage | EPSRC, STFC | 15
Total | | 160

Investments by BIS, the Research Councils and HEIs have resulted in core elements of the national e-Infrastructure being put in place.

» 2011-2012 - £160m

Investments were made in core HPC and networking infrastructure. In addition, investments were made in the Moonshot authentication infrastructure (now known as Jisc Assent).

» 2012-2013 - £189m

» 2014-2015 - £257m

Page 5

UK e-Infrastructure for research

From www.jasmin.ac.uk

Page 6

UK e-Infrastructure for research

Big Data Project | RC | Amount/£M
Digital transformations in arts and humanities | AHRC | 8
E-infrastructure for biosciences | BBSRC | 13
Research data facility and software development | EPSRC | 8
Administrative data centres | ESRC | 36
Understanding populations | ESRC | 12
Business datasafe | ESRC | 14
Biomedical informatics | MRC | 55
Environmental virtual observatory | NERC | 13
Square Kilometre Array | STFC | 11
Energy Efficiency Computing, Hartree Centre | STFC | 19
Total | | 189

Investments by BIS, the Research Councils and HEIs have resulted in core elements of the national e-Infrastructure being put in place.

» 2011-2012 - £160m

» 2012-2013 - £189m

Big Data projects using funds announced by the Government in December 2012 were funded at this time. Major awards were made to 18 centres in the UK, 16 of which are HEIs. These awards emphasise the pre-eminent role of HEIs in managing and providing national and large specialist data and compute services to UK academia.

» 2014-2015 - £257m

Page 7

UK e-Infrastructure for research

Diagram: RCUK Big Data, 21st century raw material. Data types ranging from open to securely held, and from commercial to research use (de-identified admin data including health, business data, open public-sector data, social media data, research data, longitudinal survey data, clinical data, environment data, archive data), are mapped to the investments: Energy Efficient Computing Infrastructure (STFC), Business Datasafe (ESRC), Admin Data Research Centres (ESRC), High Performance Data Environment (NERC), Medical Bioinformatics (MRC), Understanding Populations (ESRC), Clinical Practice Datalink (MHRA, NIHR), 100,000 Genomes Project (NHS), Research Data Facility (EPSRC), European Bioinformatics Institute (EMBL), Bioscience E-Infrastructure (BBSRC), Square Kilometre Array (STFC), Digital Transformations (AHRC), and the Open Data Institute.

Page 8

UK e-Infrastructure for research

ESRC Big Data Network

Page 9

UK e-Infrastructure for research

Investments by BIS, the Research Councils and HEIs have resulted in core elements of the national e-Infrastructure being put in place.

» 2011-2012 - £160m

» 2012-2013 - £189m

» 2014-2015 - £257m

Three major investments dominated this period:

» Centre for Cognitive Computing at the Hartree Centre. This was funded at the £115M level with a further £230M from IBM

» A 10 Pflop Supercomputer for the Met Office (£100M)

» Alan Turing Centre for Data Science (£42M)

In addition it was announced that a further £100M would be made available to the SKA Project as part of Big Data Investments.

Photo credit: EPSRC

Page 10

April 2015 BBSRC bioscience big data infrastructure funding:

» £1.79M to build a next-generation image repository, making available the original scientific image data that underpins life sciences research.

» £2M for big data infrastructure for crop genomics, stimulating new opportunities in crop development to help improve some of the world's most important crops.

» £1.9M to establish infrastructure for functional annotation of farmed animal genomes, providing an important framework for discovering genetic variation in domesticated animals and how it influences their characteristics, helping to feed us in the future.

» £1.78M to create cyber infrastructure for the plant sciences: a UK iPlant node that will help spread expertise and best practice between the UK and US, in collaboration with the University of Arizona and the Texas Advanced Computing Center.

UK e-Infrastructure for research

Page 11

UK e-Infrastructure for research

bit.ly/dowlingreport bit.ly/bis8great

Context:

› Reviews, e.g. Pearce, Diamond, Dowling, Shadbolt
– Demonstrable efficiency, effectiveness and productivity

› UK Government Industrial Strategy
– 8 Great Technologies
– Catapult Centres

› Cultural shifts
– Open Science
– Open Access
– Open Research Data

Page 12

UK e-Infrastructure for research

bit.ly/hauserreport bit.ly/jischpc

Drivers:

› Shared facilities and industry access
– Finding them
– Using them (kit & people)

› Big push for translation and consolidation
– New Catapult Centres
– Farr Institute, Francis Crick Institute, Alan Turing Institute

› Impact of Austerity 2.0
– Comprehensive Spending Review, Autumn 2015

Page 13


2. e-Infrastructure Survey

Page 14

e-Infrastructure Survey

What we did:

› Build an inventory of UK research e-Infrastructure
– Including interconnects, storage, accelerators etc.
– Gathering data on use of cloud technologies

› Itemize operating environment
– e.g. OS distributions, schedulers, filesystems, authentication & authorization

› Funding and budgeting models
– Power costs, PUE, split between CAPEX/OPEX, location of scientific computing in the institution

› Training and support arrangements
– Where support effort is spent, role of women in HPC

› Academic and industrial impact
– Grants, papers, businesses using the facilities
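The PUE (Power Usage Effectiveness) figure gathered under the funding and budgeting questions is simply total facility power divided by power delivered to the IT equipment; a minimal sketch, with illustrative figures rather than survey data:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power.

    1.0 is the theoretical ideal (no overhead for cooling, power
    distribution or lighting); real data centres sit above that.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative figures only, not taken from the survey:
print(pue(1500.0, 1000.0))  # 1.5
```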

Photo credit: CC-BY HPC Midlands

Page 15

e-Infrastructure Survey

bit.ly/nei2013 bit.ly/nei2014

Page 16

e-Infrastructure Survey

› Top 9 Large & Specialist (by core count)

Page 17: e-Infrastructure Survey (repeat of page 16)

Page 18

e-Infrastructure Survey

› Top 9 Large & Specialist (by size of storage)

Page 19: e-Infrastructure Survey (repeat of page 18)

Page 20

e-Infrastructure Survey

› Top 9 Large & Specialist (by number of users)

1. Large and Specialist Services

Organisation name | System name | Top three research areas | Cores | Usable storage (TB) | Registered users | Peak (Tflop/s)
NERC (operated by STFC) | JASMIN | Climate science, Earth observation, environmental genomics | 4,500 | 25 | Over 10,000 |
STFC Hartree Centre | Blue Wonder | Modelling & simulation (CFD, materials, computer-aided formulation) | 24,000 | 9,000 | 750 - 1,000 | 200
Norwich Bioscience Institutes (TGAC, JIC, IFR, TSL) | | Bioinformatics, mathematical modelling | 9,000 | 4,000 | 750 - 1,000 |
DiRAC @ University of Cambridge (HPCS) | Darwin | Life sciences, atomic structure, computational fluid dynamics | 9,600 | 2,847 | 750 - 1,000 | 200
STFC Scientific Computing Division | UK e-Science Certification Authority | Supports all UK research; major users particle physics | | | 750 - 1,000 |
STFC Scientific Computing Division | SCARF | Computational chemistry, plasma physics, satellite image processing; supports ISIS, CLF, RAPSP, DLS user communities | 7,000 | 320 | 500 - 750 | 165
STFC Hartree Centre | Blue Joule | Modelling & simulation (CFD, materials, computer-aided formulation) | 98,000 | 6,000 | 200 - 500 | 1,200
EMBL-EBI (European Bioinformatics Institute) | Embassy Cloud | Life science research | 31,000 | 3,200 | 200 - 500 |
DiRAC @ EPCC | DIRAC BG/Q | QCD, soft matter physics | 98,304 | 1,000 | 200 - 500 | 1,258
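The theoretical peak column in tables like this is conventionally cores × clock × floating-point operations per core per cycle. A minimal sketch of the arithmetic; the 2.6 GHz clock and 8 FLOPs/cycle below are illustrative assumptions, not values reported by the survey:

```python
def peak_tflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak performance in Tflop/s.

    cores * clock_ghz * flops_per_cycle gives Gflop/s (GHz = 1e9
    cycles per second), so divide by 1,000 for Tflop/s.
    """
    return cores * clock_ghz * flops_per_cycle / 1000.0

# Illustrative: 9,600 cores at an assumed 2.6 GHz, 8 FLOPs/cycle
print(round(peak_tflops(9600, 2.6, 8)))  # 200
```

Sustained application performance is, of course, usually well below this figure.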

Page 21: e-Infrastructure Survey (repeat of page 20)

Page 22

e-Infrastructure Survey

› Regional centres (by total cores)

2. Regional Systems

Organisation name | System name | Top three research areas | Cores | Usable storage (TB) | Registered users | Peak (Tflop/s)
HPC Wales | Various (distributed system) | Advanced materials & manufacturing, life sciences, energy & environment | 16,816 | 702 | 2,000 - 5,000 | 319
N8 HPC | Polaris | | 5,312 | 175 | 200 - 500 | 138
ARCHIE-WeSt | ARCHIE | Molecular dynamics, CFD, plasma physics | 3,920 | 148 | 200 - 500 | 38
HPC Midlands | Hera | Advanced materials, energy-efficient transport | 3,008 | 120 | 100 - 200 | 48

Page 23: e-Infrastructure Survey (repeat of page 22)

Page 24

e-Infrastructure Survey

› Top 8 HEIs (by total cores)

3. HEI Systems

Organisation name | System name | Top three research areas | Cores | Usable storage (TB) | Registered users | Peak (Tflop/s)
Imperial College London | cx1 | | 21,558 | 2,000 | 750 - 1,000 |
University of Bristol | BlueCrystal | Chemistry, aerospace engineering, geographical sciences | 9,000 | 740 | 750 - 1,000 | 240
University College London | Legion | Chemistry, physics, biological sciences (by REF categories) | 7,816 | 356 | 500 - 750 | 115
Imperial College London | cx2 | | 7,000 | 500 | 0 - 100 | 60
University of Manchester | Computational Shared Facility | Computational chemistry / MD, CFD, FEA | 6,288 | 750 | 750 - 1,000 | 111
Durham University | Hamilton | Condensed matter, molecular dynamics, fluid dynamics | 5,600 | 350 | 200 - 500 | 75
University of Oxford | Arcus-B | | 5,440 | 432 | 2,000 - 5,000 | 538
Lancaster University | HEC (High End Cluster) | High energy physics, condensed matter theory, CFD | 4,784 | 1,530 | 200 - 500 |

Pages 25-29: e-Infrastructure Survey (chart slides)

Page 30


3. RCUK e-Infrastructure roadmap

Page 31

RCUK e-Infrastructure roadmap

From the roadmap document:

“Our aspiration is for the UK to have an integrated e-infrastructure: one that is run and managed as a whole without silos or boundaries, where there are simple processes by which users can get access to the e-infrastructure they need across the eco-system, as appropriate for the type or stage of research they are doing. We need to consider how best to integrate:

» Vertically up and down the eco-system pyramid, so users have easy access to the most appropriate type of e-infrastructure they need;

» Horizontally across the different elements, as shown in the diagram;

» Across the different research communities and the different stakeholders;

» Internationally, across other national e-infrastructures to deliver end-to-end services in the global environment of collaborative research.”

bit.ly/eroadmap

Page 32

RCUK e-Infrastructure roadmap

bit.ly/eroadmap

Page 33


That’s all, folks…

Except where otherwise noted, this work is licensed under CC-BY

Martin Hamilton, Futurist, Jisc, London

@[email protected]