Site Report

Transcript of the "Site Report" presentation.
SITE REPORT

University of Johannesburg, South Africa

Stavros Lambropoulos
Network Engineer, I.C.S Department
OVERVIEW
• History of the UJ Research Cluster
• User Groups
• Hardware
• South African Compute Grid (SA Grid)
• Status
• Applications
• Issues
• Future
• Links
• Contributions
HISTORY OF UJ RESEARCH CLUSTER
• UJRC started as an initiative of the High Energy Physics Group
• March 2009 – the UJ-OSG Compute Element passes validation and is registered on VORS (Resource Selector)
• March 2009 – 56 CPU cores available
• April 2009 – UJ hosted Grid School
USER GROUPS
• High Energy Physics (Physics)
• Astrophysics (Physics)
• Molecular Dynamics (Chemistry)
• Quantum Chemistry (Chemistry)
• Applied Mathematics
• Numerical Studies (Engineering)
HARDWARE
• 1 Head Node comprising:
  Dell 2950, 2 x 4-core Xeon processors
  16 GB RAM
  900 GB – RAID5
  Scientific Linux 4, 64-bit
  Hosts: NFSv4, accounts, Torque, Ganglia
HARDWARE (CONTD)
• Separate virtual machines (VMware Server) for:
  OSG CE (1 GB RAM)
  OSG UI (submit node) – 2 GB RAM
  gLite CE (1 GB RAM)
  gLite UI (submit node) – 1 GB RAM
HARDWARE (CONTD)
• 7 Worker Nodes comprising:
  Dell 1425, 2 x 4-core Opteron processors
  16 GB RAM
  Scientific Linux 4, 64-bit
  gLite software installed locally
  OSG software from NFS
• Alcatel 6400 Gigabit switch
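Since the worker nodes mount the OSG software area from the head node over NFSv4, the head node needs a corresponding NFSv4 export tree. A hypothetical /etc/exports sketch of that setup (all paths and the subnet are illustrative, not taken from the site's actual configuration):

```
# /etc/exports on the head node (paths and subnet are illustrative)
# fsid=0 marks the NFSv4 pseudo-root on SL4-era nfs-utils
/export        10.0.0.0/24(ro,fsid=0,sync,no_subtree_check)
/export/osg    10.0.0.0/24(ro,sync,no_subtree_check)
/export/home   10.0.0.0/24(rw,sync,no_subtree_check)
```

Worker nodes would then mount paths relative to the pseudo-root, e.g. `mount -t nfs4 head:/osg /opt/osg`.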
SA GRID
• The South African National Grid is a project to provide a national grid computing infrastructure to support scientific computing and collaboration. The project is managed by a consortium of universities, national laboratories and the Meraka Institute, under the cyber-infrastructure programme, and is based on the gLite middleware.
STATUS
• OSG is operational on an SL4 base
STATUS (CONTD)
• Started discussions on the choice of reference Tier-1/Tier-2 for ATLAS and ALICE
• WNs, CEs and UIs to be updated to SL5, as requested by the LHC Computing Grid
• Cobbler and Puppet to be used for the new SL5 node installation and management
• Updating of the Head Node from SL4/VMware to SL5/Xen is planned
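In the planned setup, Cobbler would drive the network (kickstart) installs of SL5 nodes while Puppet keeps their configuration consistent afterwards. A minimal sketch of the kind of Puppet manifest involved (the node name, package and service names here are illustrative, not the site's actual manifest):

```puppet
# Hypothetical manifest for an SL5 worker node.
node 'wn01.grid.uj.ac.za' {
  # Keep the Torque execution daemon installed and running.
  package { 'torque-mom':
    ensure => installed,
  }
  service { 'pbs_mom':
    ensure  => running,
    enable  => true,
    require => Package['torque-mom'],
  }
}
```

One manifest like this, applied by every node's Puppet agent, replaces hand-configuring each worker after installation.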
APPLICATIONS
• 2 commercial applications running locally:
  ANSYS FLUENT – flow modelling software
  Star-CCM+ – computational fluid dynamics
• Other local applications:
  Geant4 for NA63, MineralPET
  NA63 dedicated simulation code
  Diamond lattice deformation
APPLICATIONS (CONTD)
• On OSG:
  Full ATLAS VO support
  ENGAGE VO runs a few jobs
  Local ATLAS users submit remote jobs from the local UI
  Initial discussions have started to allow the DOSAR VO
• On SAGrid – will allow the SAGrid VOs:
  ALICE VO
  ATLAS
  e-NMR VO
  WISDOM VO
  GILDA
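On the gLite side, users describe their work in a JDL file and hand it to the workload management system from a UI node. A minimal, hypothetical example (the file name and all attribute values are illustrative):

```
// job.jdl -- minimal gLite job description (values are illustrative)
[
  Executable    = "/bin/hostname";
  StdOutput     = "std.out";
  StdError      = "std.err";
  OutputSandbox = {"std.out", "std.err"};
]
```

A job like this would typically be submitted from the UI with `glite-wms-job-submit -a job.jdl` and its output retrieved later with `glite-wms-job-output`.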
THE UJ RESEARCH CLUSTER AND THE OSG GRID
UJ – Physics: High Energy Physics, ATLAS experiment at CERN
Ketevi Assamagan, Simon Connell, Sergio Ballestrero, Claire Lee, Neil Koch, Phineas Ntsoele
ATHENA installed, using the Pythia event generator to study various Higgs scenarios.
UJ – Physics: Diamond Ore Sorting (Mineral-PET)
Sergio Ballestrero, Simon Connell, Norman Ives, Martin Cook, Winile Sibande
(Figures: GEANT4 Monte Carlo simulation; online diamond detection)
ISSUES
• Limited international bandwidth
  Currently using 11 Mb/s
  To be upgraded early next year with the SEACOM cable
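To put the 11 Mb/s figure in perspective, a quick back-of-the-envelope calculation of transfer times over the shared international link (the dataset sizes are illustrative):

```python
# Rough transfer-time estimate over the 11 Mb/s international link
# quoted above. Dataset sizes are illustrative.
def transfer_minutes(size_gb: float, link_mbps: float = 11.0) -> float:
    """Minutes to move size_gb gigabytes over a link of link_mbps megabits/s."""
    bits = size_gb * 8e9              # 1 GB = 8e9 bits (decimal units)
    seconds = bits / (link_mbps * 1e6)
    return seconds / 60.0

if __name__ == "__main__":
    # A single 1 GB dataset occupies the full link for roughly 12 minutes.
    print(round(transfer_minutes(1.0), 1))
```

Even a modest 100 GB of experiment data would tie up the entire link for close to a day, which is why the SEACOM upgrade matters for grid participation.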
ISSUES (CONTD)
• Research funding – for HW and training
• Additional complexity to manage both OSG and gLite
• Lack of caching by the OSG installer, partially solved with a local Squid cache
• No automated install & config system yet; starting to work on Cobbler and Puppet
• NFSv4 problematic on SL4
• Monitoring – need to add detailed job monitoring/stats for Torque
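One way to start on the missing Torque job statistics is to parse the full-form `qstat -f` output into per-job records. A minimal sketch (the sample record below is invented for illustration; in practice the text would come from running `qstat -f` on the head node):

```python
# Minimal parser for Torque `qstat -f` output.
# The sample record is illustrative; real input would come from e.g.
#   subprocess.check_output(["qstat", "-f"], text=True)
SAMPLE = """\
Job Id: 1234.head.grid.uj.ac.za
    Job_Name = atlas_sim
    Job_Owner = claire@ui.grid.uj.ac.za
    resources_used.cput = 02:10:33
    resources_used.walltime = 02:15:01
    job_state = R
"""

def parse_qstat(text):
    """Split `qstat -f` output into one {attribute: value} dict per job."""
    jobs = []
    for line in text.splitlines():
        if line.startswith("Job Id:"):
            # Each "Job Id:" line starts a new job record.
            jobs.append({"Job Id": line.split(":", 1)[1].strip()})
        elif " = " in line and jobs:
            key, _, val = line.strip().partition(" = ")
            jobs[-1][key] = val
    return jobs

jobs = parse_qstat(SAMPLE)
print(jobs[0]["Job_Name"], jobs[0]["job_state"])  # -> atlas_sim R
```

Aggregating these dicts by `Job_Owner` or `job_state` would give the per-group usage statistics the slide asks for.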
ISSUES (CONTD)
• Manpower – grid services are not the primary job/role for the 3 people; addressing the problem with a single national Operations Team
• Low usage – marketing of services and availability has been done, but researchers are slow to start
• No experience gathered yet on utilisation of the resource in terms of constraints on memory, disk, CPU and network
• Final VO acceptance policy required
FUTURE
• Hardware upgrade:
  Additional 4 x WNs being configured
  1 x Dell MD1000 storage shelf (6 TB raw) to be connected to the Head Node – ordered
  16 x WNs (Dell M605 blade chassis, with 2 x 6 cores, 32 GB RAM) – ordered
  224 cores will be available
• DOSAR Workshop in South Africa in 2010
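The 224-core figure is consistent with the new worker nodes alone, assuming the 4 additional WNs match the existing 2 x 4-core configuration (that per-node count is an assumption, not stated on the slide):

```python
# Checking the planned core count, assuming "224 cores" counts only the
# new worker nodes (the existing 7 WNs x 8 cores = 56 stay as they are).
extra_wns   = 4 * 8    # 4 additional WNs at 2 x 4 cores each (assumed)
blade_wns   = 16 * 12  # 16 Dell M605 blades at 2 x 6 cores each
print(extra_wns + blade_wns)  # -> 224
```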
LINKS
• University of Johannesburg – http://www.uj.ac.za
• UJ Physics – http://physics.uj.ac.za/cluster
• South African Grid – http://www.sagrid.ac.za
CONTRIBUTIONS
• Prof. S. Connell – UJ Physics Department
• Sergio Ballestrero – UJ Physics & CERN ATLAS TDAQ
• Bruce Becker – SA Grid Co-ordinator
• Francois Mynhardt – UJ I.C.S Department
Questions