Comparison of NREN Service Level Agreements
Transcript of Comparison of NREN Service Level Agreements
Life Cycle and Portfolio Management Workshop
Terena, 23/11/05
Ann Harding
HEAnet Network Operations Manager
Topics
• Why
• Scope
• Deliverables
• Actions
• Resources
• Case study - HEAnet
Why
• Internal factors
  – Technical management, e.g. upgrades
  – Technical considerations
• External factors
  – European and international projects
  – Operating environment
  – Client demand
• A controlled introduction is better than a hasty imposition
Scope
• “Comparison of NREN Service Level Agreements”
  – Narrow?
  – Achievable?
  – Useful?
  – Internal or external?
• Need to include analysis and recommendations, or processes?
Deliverables
• Taxonomy of service types?
• List of NRENs with SLAs for each service type?
• Framework for investigating need for SLA provision?
• Framework for defining agreed service levels?
• Identify shared services which may need a shared SLA?
Actions
• Today
  – Finalise scope, deliverables
  – Identify interested parties
• Before end 2005
  – Identify individual actions for each deliverable
• Start today?
  – Assign deliverable priorities
  – Assign work!!!
Resources
• WI 1 “Comparison of NREN service portfolios”
• Access to appropriate contacts in NRENs
• Access to appropriate contacts in Terena/GN2/Other
• ...
• Our time
Case Study
HEAnet Strategic Objective CS1: Monitoring Service Levels to Ensure Excellence.
Actions
1. Identify an appropriate set of operational benchmarks and service metrics to measure our performance for clients.
2. Benchmark against other NRENs.
3. Benchmark against ‘competitors’.
4. Benchmark operational performance on an ongoing basis.
Deliverables
1. Definitions of types of measurements and performance thresholds.
2. List of client requirements.
3. Comparison of proposed benchmarks and metrics.
4. Recommendations for tools.
5. Methods for communicating data to clients.
6. Actions in the event of failure to meet thresholds.
Resources
• HEAnet teams, Schools & NOC.
• Client Contacts.
• NRENs & Terena.
• HEAnet CTO & Admin team.
• Data from Contracts & CfTs.
• Data and analysis from existing monitoring tools.
Work in Progress
• Example list of Key Performance Indicators (KPIs) assembled.
• Analysis of JANET SLA document.
• New monitoring tools under trial.
• Initial rough availability stats compiled for Q1 2005.
• First run of draft client questions compiled.
• Other resources such as Sonas report and OS1 work identified.
Initial KPIs
• Client accessibility, both v4 and v6.
• Outbound access to Géant2 & general Internet.
• Service availability.
• Mean Time Between Failures (MTBF).
• RTTs to clients and other sites.
• Throughput on connections.
• NOC response times
  – Per service type
  – Per request type
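The availability and MTBF figures named in the KPI list above are simple arithmetic over an outage log. As an illustration only (the outage data and time window here are hypothetical, not taken from the HEAnet Q1 2005 stats mentioned later), a minimal Python sketch might compute them like this:

```python
from datetime import datetime, timedelta

# Hypothetical unplanned outages for one client connection,
# each recorded as a (start, end) pair.
outages = [
    (datetime(2005, 1, 14, 9, 0),  datetime(2005, 1, 14, 9, 45)),
    (datetime(2005, 2, 3, 22, 10), datetime(2005, 2, 4, 0, 10)),
    (datetime(2005, 3, 20, 13, 0), datetime(2005, 3, 20, 13, 30)),
]

# Measurement window: Q1 2005 (90 days).
period_start = datetime(2005, 1, 1)
period_end = datetime(2005, 4, 1)
period = period_end - period_start

# Availability = fraction of the window the service was up.
downtime = sum((end - start for start, end in outages), timedelta())
availability = 1 - downtime / period

# MTBF taken here as the mean gap between successive failures
# (end of one outage to the start of the next).
gaps = [outages[i + 1][0] - outages[i][1] for i in range(len(outages) - 1)]
mtbf = sum(gaps, timedelta()) / len(gaps)

print(f"Downtime:     {downtime}")
print(f"Availability: {availability:.4%}")
print(f"MTBF:         {mtbf}")
```

Whether planned maintenance counts as downtime, and whether MTBF is measured start-to-start or end-to-start, are exactly the definitional choices the “Definitions of types of measurements and performance thresholds” deliverable would have to pin down.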
Initial Questions
• Questions for clients broken into ‘hard’ & ‘soft’.
• Start with softer questions to gauge expectations before asking harder, direct and enumerable questions.
• E.g. Soft – “Are you happy with your current levels of service with HEAnet?”
• Hard – “In scenario X, what availability would you expect from HEAnet?”
New Tools
• Cricket – potential MRTG replacement.
  – Easier to configure.
  – More detailed information output.
  – Greater capabilities.
• Nagios – potential Netsaint replacement.
  – Greater feature list.
  – More detailed reporting.
  – Greater compatibility with latest webservers.
Still to do
• Expand comparison with other NRENs
• Refine taxonomy for service types
  – Allow for future flexibility within limits
• Link specific benchmarks and metrics to service types
• Define operational policies