Copyright Notice

Copyright James Kent Blackburn 2007. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.

Page 2:

Open Science Grid

James “Kent” Blackburn
OSG Resources Manager
Senior Scientist
LIGO Laboratory
California Institute of Technology

Page 3:

OSG’s Coverage of the CI “Bubble” Diagram

[Figure: the cyberinfrastructure “bubble” diagram, annotated with the areas OSG covers. Labels include: Instrumentation (security, control, data generation); Computation (analysis, simulation, program security); Security and Access Management (authentication, authorization, access control); Display and Visualization (3D imaging, display tools, viewing security); Collaboration Tools and Publishing; Human Support (help desk); Policy and Funding (resource providers, funding agencies, campuses); Data (storage, search, retrieval, input, schema/metadata, data directories, ontologies, archive); Education and Outreach; Network; Training. Legend: OSG; OSG Consortium.]

Page 4:

The Open Science Grid

• The Open Science Grid’s mission is to help satisfy the ever-growing computing and data management requirements of researchers by enabling them to share a greater percentage of available computer cycles and software with less effort.

• The OSG is a distributed, common cyberinfrastructure spanning campus, regional, national and international boundaries. At over 50 provider sites, independently owned and managed resources make up the distributed facility; agreements between members provide the glue; their requirements drive the evolution; their effort helps make it happen.

• The facility is dedicated to high throughput computing and is open to researchers from all domains.
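High throughput computing of this kind is typically driven through HTCondor-style job submission, the workload manager underlying much of the OSG stack. As an illustrative sketch only (the executable and file names below are invented, not from this talk), a submit description for a bag of independent single-core jobs might look like:

```
# Hypothetical HTCondor submit description file: 1,000 independent
# single-core jobs, the classic high-throughput pattern.
universe       = vanilla
executable     = analyze.sh
arguments      = $(Process)
output         = out.$(Process)
error          = err.$(Process)
log            = jobs.log
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
queue 1000
```

Each job receives a distinct `$(Process)` index, so the same executable can be fanned out over many inputs with a single submission.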

OSG is a Cyberinfrastructure for Research

The OSG is a framework for large-scale distributed resource sharing, addressing the technology, policy and social requirements of sharing.

Page 5:

OSG Consortium Partners

Academia Sinica

Argonne National Laboratory (ANL)

Boston University

Brookhaven National Laboratory (BNL)

California Institute of Technology

Center for Advanced Computing Research

Center for Computation & Technology at Louisiana State University

Center for Computational Research, The State University of New York at Buffalo

Center for High Performance Computing at the University of New Mexico

Columbia University

Computation Institute at the University of Chicago

Cornell University

DZero Collaboration

Dartmouth College

Fermi National Accelerator Laboratory (FNAL)

Florida International University

Georgetown University

Hampton University

Indiana University

Indiana University-Purdue University, Indianapolis

International Virtual Data Grid Laboratory (iVDGL)

Thomas Jefferson National Accelerator Facility

University of Arkansas

Universidade de São Paulo

Universidade do Estado do Rio de Janeiro

University of Birmingham

University of California, San Diego

University of Chicago

University of Florida

University of Illinois at Chicago

University of Iowa

University of Michigan

University of Nebraska - Lincoln

University of New Mexico

University of North Carolina/Renaissance Computing Institute

University of Northern Iowa

University of Oklahoma

University of South Florida

University of Texas at Arlington

University of Virginia

University of Wisconsin-Madison

University of Wisconsin-Milwaukee Center for Gravitation and Cosmology

Vanderbilt University

Wayne State University

Kyungpook National University

Laser Interferometer Gravitational Wave Observatory (LIGO)

Lawrence Berkeley National Laboratory (LBL)

Lehigh University

Massachusetts Institute of Technology

National Energy Research Scientific Computing Center (NERSC)

National Taiwan University

New York University

Northwest Indiana Computational Grid

Notre Dame University

Pennsylvania State University

Purdue University

Rice University

Rochester Institute of Technology

Sloan Digital Sky Survey (SDSS)

Southern Methodist University

Stanford Linear Accelerator Center (SLAC)

State University of New York at Albany

State University of New York at Binghamton

State University of New York at Buffalo

Syracuse University

T2 HEPGrid Brazil

Texas Advanced Computing Center

Texas Tech University

Page 6:

What The OSG Offers

• Low-threshold access to many distributed computing and storage resources

• A combination of dedicated, scheduled, and opportunistic computing

• The Virtual Data Toolkit software packaging and distributions

• Grid Operations, including facility-wide monitoring, validation, information services and system integration testing

• Operational security

• Troubleshooting of end-to-end problems

• Education and Training

Page 7:

The OSG as a Community Alliance

• The OSG is a grass-roots endeavor bringing together research institutions throughout the U.S. and the World. The OSG Consortium brings together the stakeholders. The OSG Facility brings together resources and users.

• The OSG’s growing alliance of universities, national laboratories, scientific collaborations and software developers contribute to the OSG, share ideas and technologies, and reap the benefits of the integrated resources through both agreements with fellow members and opportunistic use.

• An active engagement effort adds new domains and resource providers to the OSG Consortium.

• Training is offered at semi-annual OSG Consortium meetings and through educational activities organized in collaboration with TeraGrid. One- to three-day hands-on training sessions are offered around the U.S. and abroad for users, administrators and developers.

Page 8:

OSG Community Structure: Virtual Organizations (VOs)

• The OSG community shares and trades resources in groups (VOs), not as individuals

• VO management services allow registration, administration and control of members within VOs

• Facilities trust and authorize VOs

• Compute and storage services prioritize according to VO group membership
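On the compute side, VO-based prioritization is commonly implemented with scheduler accounting groups. A hypothetical HTCondor negotiator configuration fragment, with group names and shares invented purely for illustration, might look like:

```
# Hypothetical fair-share configuration by VO accounting group.
GROUP_NAMES = group_cms, group_ligo, group_other
GROUP_QUOTA_DYNAMIC_group_cms   = 0.5
GROUP_QUOTA_DYNAMIC_group_ligo  = 0.3
GROUP_QUOTA_DYNAMIC_group_other = 0.2
# Allow idle shares to be consumed opportunistically by busy groups.
GROUP_ACCEPT_SURPLUS = True
```

The surplus setting is what makes opportunistic use possible: a VO's unused share does not sit idle but flows to whichever groups have work queued.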

[Figure: VO architecture diagram. A VO Management Service handles registration and membership; VO management and applications connect a campus grid and an experimental project grid, via the OSG and WAN, to the set of available resources. Image courtesy: UNM]

Page 9:

Campus Grids

• They are a fundamental building block of the OSG. The multi-institutional, multi-disciplinary nature of the OSG is a macrocosm of many campus IT cyberinfrastructure coordination issues.

• Currently OSG has three operational campus grids on board: Fermilab, Purdue and Wisconsin. Work is underway to add Clemson, Harvard and Lehigh.

• Elevation of jobs from Campus CI to OSG is transparent

• Campus scale brings value through:

  - Richness of a common software stack with common interfaces

  - A higher common denominator that makes sharing easier

  - Greater collective buying power with vendors

  - Synergy through common goals and achievements

Page 10:

Current OSG Resources

• OSG has more than 50 participating institutions, including self-operated research VOs, campus grids, regional grids and OSG-operated VOs

• Provides about 10,000 CPU-days per day in processing

• Provides 10 Terabytes per day in data transport

• CPU usage averages about 75%

• OSG is starting to offer support for MPI
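As a back-of-the-envelope check (my arithmetic, not a figure from the talk): if the facility delivers about 10,000 CPU-days of work per day at roughly 75% average utilization, the implied size of the pool follows directly.

```python
# Back-of-the-envelope estimate of the pool size implied by the
# capacity figures above. Assumption (mine): "10,000 CPU-days per day"
# is delivered work, and 75% is average utilization of the pool.
delivered_cpu_days_per_day = 10_000
avg_utilization = 0.75

# A fully busy core delivers exactly 1 CPU-day per day, so:
implied_cores = delivered_cpu_days_per_day / avg_utilization
print(f"Implied pool size: about {implied_cores:,.0f} cores")
```

That is roughly 13,000 cores, consistent with a facility of 50+ sites of mixed size.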

Page 11:

Weekly OSG Process Hours

Page 12:

Facts and Figures from First Year of Operations

• OSG contributed an average of over one thousand CPU-days per day for two months to the D0 physics experiment

• OSG provided the LHC collaboration more than 30% of their processing cycles worldwide, in which up to 100 Terabytes per day were transferred across more than 7 storage sites

• LIGO has been running workflows of more than 10,000 jobs across more than 20 different OSG sites.

• A climate modeling application has accumulated more than 10,000 CPU days of processing on the OSG.

• The Kuhlman Lab completed structure predictions for ten proteins, consuming more than 10,000 CPU-days on the OSG.
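The "up to 100 Terabytes per day" of LHC transfers implies a substantial sustained network rate; a rough conversion (my arithmetic, assuming decimal terabytes and a uniform rate over the day) gives the aggregate figure.

```python
# Rough sustained rate implied by "up to 100 Terabytes per day".
# Assumptions (mine): decimal terabytes (1 TB = 1e12 bytes) and a
# uniform transfer rate over a 24-hour day.
tb_per_day = 100
seconds_per_day = 86_400

bytes_per_second = tb_per_day * 10**12 / seconds_per_day
gbit_per_second = bytes_per_second * 8 / 10**9
print(f"Sustained aggregate rate: ~{gbit_per_second:.1f} Gbit/s")
```

That works out to roughly 9 Gbit/s sustained, spread across the 7+ storage sites.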

Page 13:

Facing the CI Challenge Together

• OSG is looking for a few partners to help deploy campus-wide grid infrastructure that integrates with local enterprise infrastructure and the national CI

• OSG’s Engagement Team is available to help scientists get their applications running on OSG:

  - A low-impact starting point

  - Helps your researchers gain significant compute cycles while exploring OSG as a framework for your own campus CI

• Send your inquiries to [email protected]

• Learn more about the OSG at http://www.opensciencegrid.org