ServiceNow Event 15.11.2012 / ITIL for the Enterprise @CERN
Transcript of ServiceNow Event 15.11.2012 / ITIL for the Enterprise @CERN
How service management best practice can be applied beyond IT
Reinoud Martens CERN GS/SMS Maienfeld 15/11/2012
Agenda
• Introduction
  • CERN
  • CERN Service Management environment
• The project
  • Definition
  • Implementation
  • Review
• Current situation
• Conclusion
About CERN
• World’s largest particle physics centre
• World’s largest scientific instrument
• 1954 - Europe’s first joint venture
• 2012 - 20 member states, 2 associate members & 7 observers
• 1.2 bn CHF budget
• ~2300 staff
• >1000 fellows, students & project associates
• >10000 visiting scientists of over 100 nationalities (half of the world’s particle physicists)
LHC
CERN’s missions Seeking answers to questions about the Universe.
What is it made of? How did it come to be the way it is?
Advancing the frontiers of technology and engineering. Uniting nations through science.
Today >10000 visiting scientists from more than 100 countries. Training young scientists and engineers who will be the experts of tomorrow.
CERN: Facts & Fiction
• The World Wide Web was invented at CERN in 1989 by the British scientist Tim Berners-Lee.
• Some of the spin-offs: Improving cancer therapy technology, medical and industrial imaging, radiation processing, electronics, measuring instruments, new manufacturing processes and materials,..
• CERN unfortunately does not own an X-33 aircraft, as suggested in Dan Brown’s book “Angels and Demons”.
• The LHC allows us to look at microscopic big bangs to understand the fundamental laws of nature.
• Which questions?
  • Why do particles have mass?
    • Newton could not explain it and neither can we…
  • What is 96% of the Universe made of?
    • We only ‘know’ 4% of it!
  • Why is there no antimatter left in the Universe?
    • Nature should be symmetrical
  • What was matter like during the first second of the Universe’s life, right after the "Big Bang"?
    • A journey towards the beginning of the Universe will give us deeper insight
The Large Hadron Collider (LHC) helps to find answers to fundamental questions
The Physics Challenge
The universe is a tough nut to crack. By smashing pieces of matter together, creating energies and temperatures not seen since the universe's earliest moments, the LHC could reveal the particles and forces that wrote the rules for everything that followed. It could help answer one of the most basic questions for any being in our universe: What is this place?
The particle physicist’s toolbox
[Diagram: Particle accelerator → collisions/events → Experiments → Data → Analysis]
The particle physicist’s toolbox
• LHC - the world’s most powerful accelerator:
  • A 27 km long tunnel filled with high-tech instruments
  • Equipped with thousands of superconducting magnets
  • Accelerates particles to energies never obtained before
  • Produces particle collisions
The accelerator complex
The particle physicist’s toolbox
4 Experiments: very large, sophisticated detectors
§ ATLAS experiment during construction: 7000 tons
§ Hundred million measurement channels
§ Data acquisition systems treating Petabytes per second
The particle physicist’s toolbox
• Computing infrastructure to store, distribute and analyse the data
• A Computing Grid linking ~200 computer centres around the globe
• Sufficient computing power and storage to handle 15 Petabytes per year, making them available to thousands of physicists for analysis
The fastest racetrack on the planet…
Trillions of protons race around the 27 km ring in opposite directions over 11,000 times a second, travelling at 99.9999991 per cent of the speed of light.
CERN, a place of extremes
The emptiest space in the solar system…
To accelerate protons to almost the speed of light requires a vacuum as empty as interplanetary space. There is 10 times more atmosphere on the moon than there is in the LHC.
CERN, a place of extremes
One of the coldest places in the universe..
With an operating temperature of about -271 degrees Celsius, just 1.9 degrees above absolute zero, the LHC is colder than outer space.
CERN, a place of extremes
The hottest spots in the galaxy…
When two beams of protons collide, they generate temperatures 1000 million times hotter than the heart of the sun, but in a minuscule space.
CERN, a place of extremes
The biggest most sophisticated detectors ever built…
ALICE
To sample and record the debris from up to 600 million proton collisions per second, scientists are building gargantuan devices that measure particles with micron precision.
CERN, a place of extremes
One of the most extensive computer systems in the world…
To analyze the data, tens of thousands of computers around the world are being harnessed in the Grid. The laboratory that gave the world the web is now taking distributed computing a big step further.
CERN, a place of extremes
CERN latest events
CERN experiments observe particle consistent with long-sought Higgs boson. Geneva, 4 July 2012. At a seminar held at CERN today as a curtain raiser to the year’s major particle physics conference, ICHEP2012 in Melbourne, the ATLAS and CMS experiments presented their latest preliminary results in the search for the long-sought Higgs particle. Both experiments observe a new particle in the mass region around 125-126 GeV.
CERN: a laboratory with extreme requirements in many domains.
How about service management?
Service management – What we think it IS
• Established industry best practice, used with success by thousands of organisations worldwide (“de facto” standard)
• A strategic framework, covering all services (not only IT)
• Business/customer/user focussed (focus on WHAT, not HOW)
• A set of management processes covering the complete service lifecycle
• An approach to ‘adopt and adapt’ to ensure service solutions provide the best possible fit to the specific requirements of the organization
Service management – What it IS NOT
• A tool (e.g. ServiceNow)
• A service desk
• A conspiracy to monitor people
• Our users!
  • engineers, physicists, technicians, administrators, computer scientists, craftspeople, mechanics, support personnel, …
The Particle Physics community is bringing the world together …
CERN Service Management Environment
• … is providing us every type of user!
The freedom of science…
CERN Service Management Environment
Services we are covering (1) :
§ General IT services
§ Physics specific IT services (including Grid)
§ Medical Services & Fire Protection Services
§ Civil Engineering & Facility Management
§ Registration, Access & Safety Services
§ Alarm System Services
CERN Service Management Environment
Services we are covering (2) :
§ Visits & Outreach Services
§ Material & Storage Services
§ Mail & Shipping Services
§ Library & Archive Services
§ Housing Services
§ HR, Finance & Legal Services
CERN Service Management Environment
Some numbers
§ 495 hotel rooms, 3 restaurants
§ 2 Sites, 657 Buildings, 238 Barracks
§ 15000 active access cards
§ > 1000 cars
§ 5500 PCs & 1500 Mac desktops
§ 6900 servers with 41000 cores
§ 14 PB disk space
§ 48 PB tape storage
§ 70000 network ports feeding 34000 hosts
CERN Service Management Environment
[Diagram: SERVICES underpinning the three domains - Physics & Experiments, Accelerators & Technology, Infrastructure & Support]
CERN Service Management Environment
Budget split: Services ~25%, Accelerators ~45%, Physics ~30%

Director-General - Rolf-Dieter Heuer (~6%)
• DG services (Safety, Legal, Audit, Planning, VIP, etc.) ~6% [X]
Administration and general infrastructure - Sigurd Lettow (~14%)
• FP Finance, Procurement and Knowledge Transfer - T. Lagrange ~2.5% [X]
• GS General infrastructure services - T. Pettersson ~9% [X]
• HR Human resources - A.-S. Catherin ~2.5% [X]
Research and scientific computing - Sergio Bertolucci (~33%) + CERN USERS
• IT Information technology - F. Hemmer ~9% [X X]
• PH Physics - P. Bloch ~25% [X]
Accelerators and technology - Steve Myers (~46%)
• BE Beams - P. Collier ~16% [X]
• EN Engineering - R. Saban ~14% [X X]
• TE Technology - F. Bordry ~15% [X]
CERN Service Management Environment
The situation 3 years ago:
• Infrastructure and services neglected in favor of the LHC project
• Resources under scrutiny → desire for benchmarking, metrics, KPIs
• CERN a victim of the LHC’s success → do more with less
• Realization of (lack of) maturity → wish to grow up
• Administration → Management Cockpit Project
Awareness → Project
• Objectives
• Standard or framework
• Service structure
• Process definitions
• Roles of the people
• Implementation
• Review
• Current situation
• Conclusion
Service Management for CERN
Objectives
1. One Service Desk for CERN (one number to ring, one place to go, 24/7 coverage)
2. Standard Processes for all Service Providers at CERN (one behavior)
3. Services defined from a User’s point of view
4. Services easy to find by everybody, without knowledge of CERN internal structures
5. Service and process quality measurable
6. Improved collaboration across the boundaries of sections, groups and departments (break down silos)
7. Very high level of automation of all known procedures
8. Framework for continuous improvement in the fields of efficiency and effectiveness
Which Standard / Framework ?
Having different best practices for IT and non-IT services doesn’t seem like a brilliant idea.

Choice: the ITIL V3 framework, but
• Remove all references to IT (for the non-IT people)
• Stay PRAGMATIC (only take what is useful; leave the rest for ‘later’)
• No extremism, no over-engineering
Service Management team
Service structure in 2009
Users (Institutes Experiments)
Administrative Users
Technical Expert Users
§ High-level experts in all areas
§ Functional “elements” scattered around
§ Different types of users with different interests
§ No comprehensible communication framework
§ No structured service offering
§ No central contact point to find what you need
• Customer Services & Service Elements
  • From the user’s point of view (new for CERN)
  • Different for different types of users
  • Combination of functional elements to provide a complete functionality for users
• New „Service Owner“ roles representing Services
  • Related to users
Service structure (Business Service Catalogue)
• Functional Services
  • Lists all technical services, activities & functions
  • Group and Section leaders in charge of all quality and resource related topics
  • Related to „support groups“ - groups of experts that perform 2nd and 3rd line support
Service structure (Business Service Catalogue)
Service structure (Business Service Catalogue)
Culture Change Shift from How à What
Shift from Function à Service
2 dimensional Service Catalogue
• Covers all Services provided
• Lists all Functional Services
• Connecting both sides of the catalogue
• Contains classification to show level of importance
• Foundation for Process Automation & Service Portal
• Contains: • Services (240) • Functions (462) • Relations (~1800)
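The two-dimensional catalogue described above (user-facing services on one axis, functional services on the other, linked by many-to-many relations) can be sketched as a pair of mappings. The class and service names below are hypothetical illustrations, not CERN’s actual data model:

```python
# Minimal sketch of a two-dimensional service catalogue: user-facing
# services and functional services linked by many-to-many relations,
# so the catalogue can be browsed from either side.
from collections import defaultdict

class ServiceCatalogue:
    def __init__(self):
        self.functions_of = defaultdict(set)   # service  -> functional services
        self.services_of = defaultdict(set)    # function -> user-facing services

    def relate(self, service: str, function: str) -> None:
        """Record one of the ~1800 service<->function relations."""
        self.functions_of[service].add(function)
        self.services_of[function].add(service)

cat = ServiceCatalogue()
cat.relate("Mail Service", "Mailbox administration")   # hypothetical entries
cat.relate("Mail Service", "Network operations")
cat.relate("Desktop Service", "Network operations")

# Service -> Functions: which support groups stand behind a user-facing service?
print(sorted(cat.functions_of["Mail Service"]))
# Function -> Services: which user-facing services depend on this function?
print(sorted(cat.services_of["Network operations"]))
```

Keeping both directions indexed is what makes the “Access from both dimensions” navigation cheap: each lookup is a single dictionary access.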
Service structure (Service Desk & Portal)
[Diagram: users in projects and experiments, administrative users and expert users all reach services through the Service Web Portal and the Service Desk]
Common Tool
Priority Matrix

Impact:
  1 Down      - critical adverse impact on the service
  2 Degraded  - major adverse impact on the service
  3 Affected  - minor adverse impact on the service
  4 Disrupted - small number of the population affected

Urgency:
  1 High   - the damage caused by the Incident increases rapidly
  2 Medium - the damage caused by the Incident increases considerably over time
  3 Low    - the damage caused by the Incident only marginally increases over time

Priority (Urgency vs. (Business) Impact):
              1 Down       2 Degraded   3 Affected   4 Disrupted
  1 High      1 Major      2 High       3 Moderate   4 Low
  2 Medium    2 High       3 Moderate   4 Low        5 Planning
  3 Low       3 Moderate   4 Low        5 Planning   6 Very Low
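A tool only needs a lookup table to implement such a matrix. The sketch below encodes the impact/urgency grid above; it is an illustration, not the actual ServiceNow configuration:

```python
# Priority lookup derived from the matrix: priority = f(impact, urgency).
# Impact: 1=Down, 2=Degraded, 3=Affected, 4=Disrupted
# Urgency: 1=High, 2=Medium, 3=Low
PRIORITY = {
    (1, 1): "1 Major",    (2, 1): "2 High",     (3, 1): "3 Moderate", (4, 1): "4 Low",
    (1, 2): "2 High",     (2, 2): "3 Moderate", (3, 2): "4 Low",      (4, 2): "5 Planning",
    (1, 3): "3 Moderate", (2, 3): "4 Low",      (3, 3): "5 Planning", (4, 3): "6 Very Low",
}

def priority(impact: int, urgency: int) -> str:
    """Map an incident's impact (1-4) and urgency (1-3) to its priority."""
    return PRIORITY[(impact, urgency)]

print(priority(1, 1))  # -> "1 Major": a 'Down' service whose damage grows rapidly
print(priority(4, 3))  # -> "6 Very Low"
```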
Process Definition: Incident & Request
Process Definition: Change
Process Definition (status) Finalized:
• Incident, Request & Change
• Business Service Catalogue management
• Knowledge management
• Service level management
In preparation:
• Major incident handling
• Outage handling & Status boards
• Event management
Service Management Role definitions
[Diagram: Service Management roles. Users use Services and get Assistance from the Service Desk; the Service Desk Manager supervises the Service Desk, which assigns work to the Support Groups (functions); the Functional Service Manager coordinates and supervises the Support Groups; Service Owners represent Services towards Customers; the Service Manager On Duty coordinates across Services.]
Tool Selection (after process and catalogue definition!)
• 40 tools evaluated in the pre-selection phase
• 6 tools evaluated in detail (300 questions)
• 2 tools in the final competition (on-site demo, hands-on, and reference visits)
• Tool selected: 100% web based - SaaS
• Considered requirements:
  • Processes
  • Service Catalogue
  • Measurement & Reporting
  • Technical (Architecture)
  • Interface
  • Future Needs
[Evaluation spreadsheet - condensed. For each candidate tool the sheet recorded, per criteria group, a Score, an optional Configuration/programming effort estimate, and Additional comments.]

Criteria group weights (% of overall score):
• Current Process Requirements - 35
• Future Use Requirements - 10
• Interface Requirements - 15
• Technical Requirements (requirements that are not applicable can be ignored) - 10
• Measurement Requirements - 5
• Reporting Requirements - 5
• Hosting schemes (no impact on final score, as one or more of these schemes are always present) - 0
• Licence Model and costs (info to be provided in cost and comments column) - 10
• General Quality Factors - 10

Pre-selection criteria (% of score):
• Customer experience - 20
• Implementation effort (feasible roll-out within 4 months) - 15
• Provider’s viability and completeness of vision - 15
• Native relevant ITIL best-practice content - 10
• Fully Web 2.0 based (back office & portal) - 10
• Technology stack compatibility - 10
• Product design quality & maturity - 20

Scoring legend:
• “Score”: rating from 0 (feature not available, not programmable) to 5 (full functionality provided out of the box, no extra effort required). Score values: 5 = out of the box, 4 = customisation/configuration, 3 = scripting, 2 = minor programming, 1 = application programming, 0 = not possible; effort in man-days.
• “Configuration/programming effort”: optional field for defining customization/programming effort; used mainly for detailed distinction in case of indecision.
• “Additional comments”: optional field for additional helpful comments aimed at progressing tool selection.
• Overall score is determined by multiplying Weight and out-of-the-box rating; a 0 rating for a “must have” feature leads to automatic failure. Weight values: 5 = must have, 3 = should have, 1 = nice to have.

Tools considered included ServiceNow, BMC Remedy, BMC Service Desk Express, iET ITSM, Assyst (Axios Systems), HP Service Manager, IBM Tivoli, USU Valuemation, Wendia POB, CA Service Desk Manager, Omnitracker (Omninet), ServiceDesk Plus (ManageEngine), EasyVista (Staff & Line), OTRS ITSM, TopDesk, Matrix42, and others; Pink Verify 2.0/3.0/3.1 certification status was tracked per toolset.
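The scoring rule described in the legend (weight times rating, with automatic failure on a must-have rated 0) can be sketched as below. The weights follow the criteria groups of the evaluation; the function, the normalization to a 0-5 total, and the must-have set are illustrative assumptions:

```python
# Sketch of the weighted scoring rule from the evaluation legend.
# Each criteria group has a weight (summing to 100) and a 0-5 rating;
# a rating of 0 on a "must have" group fails the tool outright.
WEIGHTS = {
    "Current Process Requirements": 35, "Future Use Requirements": 10,
    "Interface Requirements": 15, "Technical Requirements": 10,
    "Measurement Requirements": 5, "Reporting Requirements": 5,
    "Licence Model and costs": 10, "General Quality Factors": 10,
}

def total_score(ratings: dict, must_have: frozenset = frozenset()) -> float:
    """Weighted average normalized to the 0-5 scale; 0 on a must-have -> fail."""
    for crit in must_have:
        if ratings.get(crit, 0) == 0:
            return 0.0                      # automatic failure
    return sum(WEIGHTS[c] * r for c, r in ratings.items()) / sum(WEIGHTS.values())

ratings = {c: 4 for c in WEIGHTS}           # a tool rated 4/5 everywhere
print(total_score(ratings))                 # -> 4.0
```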
Tool Implementation
Phase 1 - Do the basics to get operational (autumn 2010):
• Business Service Catalogue (CERN design & implementation)
• Incident & Request (existing basic processes - CERN catalogue automation)
• Service portal (CERN design & implementation)
Phase 2 - Grow the system while acquiring experience (2011):
• Service level management (out of the box for incident / CERN build for request)
• Reporting (ok for operational reports, disappointing for …)
• Functional improvements (many requirements / …)
• Interfaces to other “ticket” systems (GRID computing, asset management, …)
• Knowledge management (adapted to the business service catalogue)
• Change management (CERN-built simplified process)
Phase 3 (2012):
• Process & tool improvements
• Event management (CERN implementation)
Tool Implementation
• Business Service Catalogue driven
• Access from both dimensions
  • Service → Function
  • Function → Services
• Highly automated
  • Support assignments
  • SLA & OLA
  • Performance reporting
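As an illustration of the SLA/OLA performance reporting mentioned above, the following sketch computes per-support-group compliance against a resolution-time target. The group names and targets are hypothetical, not CERN’s actual OLAs:

```python
# Hypothetical sketch of OLA-based performance reporting: compare each
# support group's resolution times against its agreed target and report
# the fraction of tickets resolved within that target.
from datetime import timedelta

OLA_TARGET = {"Service Desk": timedelta(hours=4),     # illustrative targets
              "Network Ops": timedelta(days=2)}

def compliance(tickets):
    """tickets: list of (support_group, resolution_time) pairs.
    Returns per-group fraction of tickets resolved within the OLA target."""
    met, total = {}, {}
    for group, took in tickets:
        total[group] = total.get(group, 0) + 1
        if took <= OLA_TARGET[group]:
            met[group] = met.get(group, 0) + 1
    return {g: met.get(g, 0) / total[g] for g in total}

sample = [("Service Desk", timedelta(hours=2)),
          ("Service Desk", timedelta(hours=6)),
          ("Network Ops", timedelta(days=1))]
print(compliance(sample))   # -> {'Service Desk': 0.5, 'Network Ops': 1.0}
```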
Tool Implementation (Service Portal)
§ User access to all services
§ Search function
§ Browse the catalogue
§ Report issues
§ Follow issues
§ Access knowledge base
Service Management for CERN How did we go about it?
§ External ITIL expert knowledge and experience
  § Ncc Management Consultants (experts in ITIL applied to non-IT)
  § 2-dimensional Service Matrix proven in practice
  § Specific experience in huge organizations
§ Tool implementation
  § Aspediens: expert coaching
  § ServiceNow: good relations with the tool manufacturer
§ Initially 2 departments (IT & General Services)
§ Initially 2 processes (Incident, Request)
§ Grow and improve gradually
  § More departments (Finance, Human Resources)
  § More processes (Change, Problem, Event, Knowledge, …)
Service Management roadmap
Concepts
• Service Catalogue & Matrix • Service Role (e.g. Service Owner) Definitions • Process Design
Data and Tools
• Web Portal & Service Repository • Service Descriptions • Service Management Tool Evaluation
Implementation
• SM Tool Implementation (Oct 2010 .. Feb 2011) • Service Desk Planning & Staffing • Role Assignments
Rollout
• Training • Consolidation • Change, Knowledge & Service level management
Operation
• Improve • More Processes (Event, Problem & Outage) • Extended Scope
(Timeline: 2009 - 2012)
Review
Project Review § Quality review by ITIL senior consultant (Colin Rudd - autumn 2011)
§ Approach:
§ Interviews with key staff involved in all areas of service management activities
§ Interviews with representatives of the user community
§ Identify and quantify the current level of service management maturity
§ Results:
§ Report on strengths, weaknesses and gaps with prioritized recommendations for improvements
Assessment summary: Process & Functions
[Radar chart: maturity scores on a 0-5 scale for Incident, Request, Event, Problem, Change, Knowledge, SLM, SCM, Supplier, Desk, Technical and Portal]
Colin Rudd Service Management consultant, mentor and coach ITIL Author Chairman of itSMF UK [email protected]
Assessment summary: Process & Functions
[Bar chart: maturity scores on a 0-5 scale for Processes and Functions]
Assessment summary: The bigger picture
[Radar chart, 0-5 scale: Vision and governance 2.0; Strategy and steering 2.0; Processes 2.9; People 2.8; Products and technology 3.0; Culture, service and attitude 2.4; Organisation, communication & relationships 1.6]
Assessment summary: Industry comparison
[Radar chart, 0-5 scale, comparing CERN against the industry Max / Average / Min across: Vision and governance; Strategy and steering; Processes; People; Products and technology; Culture, service and attitude; Organisation, communication]
Assessment: Recommendations
[Diagram: key and important recommendations plotted over time]
• Establish a coalition
• Governance policy
• Vision
• Management reports
• Strategy & roadmap
• Proactive capability
• Continual improvement
• Enterprise architecture
• Continual learning
• Knowledge sharing
• Customer-driven processes
• Transform culture
• Improved communication
• Improved inter-working
• Problem management
• Change management
• Service level management
• Major incidents
• Service criticality
• Service desk schedule
• Knowledge transfer
• Service portal
18 months of operation: some figures
§ 77000 incidents
§ 98000 Requests
§ 1300 Knowledge articles
§ 291 Services
§ 521 Functions (support groups)
§ >900 concurrent sessions
§ Portal popularity grows
§ Service quality improves
What's next: § Extend scope (More CERN Services)
§ Improve service quality (process & people)
§ Coach support teams using SLA / OLA / UC
§ Include risk matrix in the system
§ Change management
§ Event management
§ Implement the Colin Rudd recommendations
Conclusion
• CERN’s mission is to provide tools and infrastructure to its ‘users’: the particle physicists of the world
• This infrastructure was built over a 50-year period, represents an enormous investment, and will be exploited for many years to come
• For our ‘users’ to be able to do their job efficiently, they must be supported by a “best in class” service organization, using simple, comprehensive, coherent, smooth and efficient processes and tools (covering both IT AND non-IT services)
• While trying to implement this ‘vision’ we learned a lot:
  1. Higher-than-expected benefits for the non-IT area
  2. Underestimation of the cultural differences (IT <-> non-IT)
• We are convinced the result will be a positive and necessary change for CERN, its users and support staff.
Questions