SIMPDA 2014 – A living story: measuring quality of developments in a large industrial software factory
www.eng.it
Measuring quality of developments in a large industrial software factory
A living story with Open Source Software
Gabriele Ruffatti, Director
Architectures & Consulting Services; Big Data & Open Source Competency Centers; Methodologies, Processes & Services for Engineering's Software Labs
Engineering Group
www.eng.it – SIMPDA 2014, Milan, Italy, November 20th, 2014
International offices: Wilmington (USA); São Paulo, Rio de Janeiro, Recife, Belo Horizonte, Curitiba (Brazil); Brussels (Belgium); Buenos Aires (Argentina); Belgrade (Republic of Serbia)
Business integration, consulting, outsourcing, products and solutions
A global player:
● 31 branches in Italy
● about 7,300 professionals
● 1,000 large accounts
● 822.8 mn€ (7.2% of the Italian market)
Research and innovation cycle: ideas for research projects feed research; research project results undergo experimental checks and flow into the production of technological solutions & innovative applications.
Participation in European research programs and creation of a network of collaborations.
25 mn€/year investments in innovation.
ENGINEERING GROUP
Economic efficiency, technical efficiency, strategic efficiency, social efficiency
OPEN SOURCE
Source: OW2, Cédric Thomas, 2014
OPEN SOURCE
INTEGRATOR
INNOVATOR
knowledge sharing
collaborative projects
Competitive lever
PURE PLAYER
global communities
DIGITAL AGENDA FOR EUROPE
OPEN SOURCE @Engineering
● Cost reduction
● Flexibility
● Innovation
System integrators DO NOT sell “licenses” but skills and know-how
OPEN SOURCE @Engineering INTEGRATOR
Knowledge as a Common
www.spagoworld.org
OPEN SOURCE @Engineering PURE PLAYER
GLOBAL COMMUNITIES
100% open source software forever
user-oriented, flexible and scalable
A comprehensive business intelligence suite
Innovative themes and solutions
OPEN SOURCE @Engineering SPAGOBI
OPEN SOURCE @Engineering SPAGOBI
Cloud Computing
Big Data
Future Internet
Privacy/Security
OPEN SOURCE @Engineering INNOVATOR
ENGINEERING TECHNICAL UNIT
Competency Centers, Software Laboratories, R&D
• Automation & Control
• BI & DataWarehouse
• ECM
• ERP
• GIS
• Managed Operations
• Mobile
• Big Data
• Open Source & SpagoBI
BUSINESS UNITS (MARKETS)
PA & Healthcare, Industry & Services, Telco & Utilities, Finance
ENGINEERING GROUP ORGANIZATION
Is there REALLY a way to measure performance?
Is my project on track?
How can I improve the development process?
Which is the quality level of my product?
Which are the corporate audit results?
Which is the users' and customers' level of satisfaction?
How productive is my organization?
How can I improve performance?
How can I compare different labs?
Top Manager, Quality Manager, Project Manager
"I want to know the productivity of my software factory."
REQUIREMENTS MANAGERS’ NEEDS
● Continuous quality improvement in Engineering Group's projects
● A unified infrastructure supporting quality processes, granting flexibility and adaptability to Engineering's Software Labs
● CMMI-DEV and ISO certifications, as an independent method to validate the compliance of processes and infrastructures with quality standards
REQUIREMENTS COMPLIANCE TO QUALITY STANDARDS
BACKGROUND MODEL & TOOL
● QEST nD model, a conceptual framework for measuring process performance based on multiple analysis dimensions
● Spago4Q, the open source SpagoBI analytic for Quality and Performance Improvement
Source: Buglione L. & Abran A., QEST nD: n-dimensional extension and generalisation of a Software Performance Measurement Model, International Journal of Advances in Engineering Software, Elsevier Science Publisher, Vol. 33, No. 1, January 2002, pp. 1-7
Method: performance is expressed as the combination of the specific ratios selected for each of the 3 dimensions of the quantitative assessment (Productivity, PR) and the perceived product quality level of the qualitative assessment (Quality, Q):
Performance = PR + Q
Model: QEST (Quality factor + Economic, Social & Technical dimensions) is a “structured shell” to be filled according to management objectives in relation to a specific project. The model can handle independent sets of dimensions without predefined ratios and weights, and is therefore referred to as an open model.
BACKGROUND QEST MODEL
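As a minimal illustration (our own sketch, not Spago4Q code), the Performance = PR + Q combination for one dimension can be written as a clamped sum of the quantitative and qualitative parts; the example values are hypothetical:

```python
def dimension_performance(pr: float, q: float) -> float:
    """Performance = PR + Q, clamped to the normalized [0, 1] range.

    pr: quantitative productivity ratio for the dimension
    q:  perceived quality level from the qualitative assessment
    """
    return min(1.0, max(0.0, pr + q))

# Example: productivity ratio 0.55 plus perceived quality 0.30
p_econ = dimension_performance(0.55, 0.30)  # 0.85
```

How PR and Q are scaled before summing is part of the model configuration; the clamp simply keeps the result in the normalized range used by the geometric interpretation.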
It is possible to measure performance considering at least 3 distinct geometrical concepts:
• Distance between the tetrahedron base center of gravity and the center of the plane section along the tetrahedron height – the greater the distance from 0, the higher the performance level;
• Area of the sloped plane section – the smaller the area, the higher the performance level;
• Volume of the lowest part of the truncated tetrahedron – the greater the volume, the higher the performance level.
● Target: measuring project performance (p) using 3 distinct viewpoints
● Input data: list of weighted ratios for each dimension and quality questionnaires
● Output data: an integrated normalized value of performance
BACKGROUND QEST MODEL
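The three geometric readings can be sketched under a simplifying assumption of ours (not part of the model): if the three dimension values are equal to p, the section plane is horizontal, and for a unit-height, unit-base-area, unit-volume tetrahedron with the apex on top the three quantities have closed forms:

```python
def geometric_views(p: float) -> dict:
    """Three geometric readings of performance p in [0, 1], assuming the
    section plane is horizontal (equal values on all three dimensions)."""
    assert 0.0 <= p <= 1.0
    return {
        # distance from the base to the section plane: higher is better
        "distance": p,
        # the piece above the cut is similar with scale (1 - p), so the
        # section area shrinks quadratically: smaller is better
        "area": (1.0 - p) ** 2,
        # volume of the truncated lower part = total minus the similar
        # top piece, which shrinks cubically: greater is better
        "volume": 1.0 - (1.0 - p) ** 3,
    }

views = geometric_views(0.5)  # distance 0.5, area 0.25, volume 0.875
```

All three readings are monotone in p, which is why the model can use any of them as the integrated performance value.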
BACKGROUND SPAGO4Q
THE PROCESS PMAI APPROACH
The procedure is coherent with the PMAI (Plan-Measure-Assess-Improve) cycle:
● PLAN: define a set of metrics, based on the GQM approach, and the possible dimensions of analysis
● MEASURE: collect data and compute metric values and the global performance value
● ASSESS: present results through dashboards and reports
● IMPROVE: analyze in detail each value below expected thresholds in order to find possible problems or bottlenecks from a process-based viewpoint
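A hypothetical skeleton of one PMAI iteration (all names are placeholders of ours, not Spago4Q APIs):

```python
def plan(goals):
    # PLAN: derive one metric per goal via a GQM-style refinement (stub)
    return [f"metric_for_{g}" for g in goals]

def pmai_cycle(goals, collect, thresholds):
    metrics = plan(goals)                          # PLAN
    values = {m: collect(m) for m in metrics}      # MEASURE
    below = {m: v for m, v in values.items()       # ASSESS: flag values
             if v < thresholds.get(m, 0.0)}        # under their threshold
    return below                                   # IMPROVE: these are the
                                                   # candidates to analyze
```

The IMPROVE step itself is human analysis; the cycle only surfaces which metrics fell below the expected thresholds.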
Declaration of a complete GQM, with the definition of:
● the analysis dimensions
● the concepts to measure
● the metrics to apply to the project's work-products
THE PROCESS S1. METRICS & MODEL DEFINITION
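Such a GQM declaration could be modeled, purely as an illustration (the field names are our assumptions, not Spago4Q's schema), as nested records of dimensions, concepts, and metrics:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    work_product: str   # e.g. "source code", "design document"
    formula: str        # human-readable description of the ratio

@dataclass
class Concept:
    question: str       # the GQM question this concept answers
    metrics: list = field(default_factory=list)

@dataclass
class Dimension:
    name: str           # e.g. "Economic", "Social", "Technical"
    concepts: list = field(default_factory=list)

# Illustrative declaration for one dimension
technical = Dimension("Technical", [
    Concept("How effective are peer reviews?",
            [Metric("peer_review_coverage", "deliverables",
                    "no. peer reviews / no. deliverables")]),
])
```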
Couple each metric with its respective weight
● indicates the importance that the concept plays in the dimension it belongs to
Define the specific thresholds
● to evaluate each value with respect to organization policies
Assign (if provided) a Quality Factor (QF) to each dimension
● gives each dimension a quality evaluation
THE PROCESS S2. WEIGHTS & THRESHOLDS DEFINITION
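A sketch of how weights and thresholds might be attached (the structure, numbers, and metric names are illustrative assumptions, not Spago4Q configuration):

```python
def evaluate(value: float, low: float, high: float) -> str:
    """Classify a normalized metric value against policy thresholds."""
    if value < low:
        return "below expectations"
    if value < high:
        return "acceptable"
    return "good"

# Weights within one dimension express relative importance;
# a common convention is that they sum to 1.0.
weights = {"peer_review_coverage": 0.6, "defect_density": 0.4}
assert abs(sum(weights.values()) - 1.0) < 1e-9

verdict = evaluate(0.3, low=0.4, high=0.7)  # "below expectations"
```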
Measures are taken directly from the Spago4Q data warehouse:
● the DB is filled with data automatically collected by extractors accessing process work-products (code packages, text documents, project information, …)
Metrics are described in terms of:
● name of the model to which the metric is assigned
● default value
● minimum and maximum values (for normalization)
● KPI computation algorithm
THE PROCESS S3. DATA GATHERING
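Normalization against the declared minimum and maximum could look like this (a sketch under our assumptions; the actual KPI algorithms are model-specific):

```python
def normalize(value, minimum, maximum, default=0.0):
    """Map a raw measure into [0, 1]; fall back to the default if absent."""
    if value is None:
        value = default          # missing measure: use the default value
    if maximum == minimum:
        return 0.0               # degenerate range, nothing to scale
    clipped = max(minimum, min(maximum, value))
    return (clipped - minimum) / (maximum - minimum)

normalize(30.0, minimum=0.0, maximum=40.0)   # 0.75
normalize(None, 0.0, 40.0, default=20.0)     # 0.5
```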
Overall and dimension-wise performance indexes are computed as KPIs that take as input configuration data and the results of the metrics.
The performance value of each dimension is calculated as the weighted sum of each selected measure multiplied by its assigned weight for that dimension.
THE PROCESS S4. PERFORMANCE CALCULATION
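The weighted sum described above, as a minimal sketch (metric names and values are hypothetical):

```python
def dimension_value(measures: dict, weights: dict) -> float:
    """Weighted sum of normalized measures for one dimension."""
    return sum(weights[name] * measures[name] for name in weights)

measures = {"peer_review_coverage": 0.8, "defect_density": 0.5}
weights = {"peer_review_coverage": 0.6, "defect_density": 0.4}
perf = dimension_value(measures, weights)  # 0.6*0.8 + 0.4*0.5 = 0.68
```

With weights summing to 1 and measures normalized to [0, 1], the dimension value stays in [0, 1] and can feed the QEST geometric representation directly.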
Sets of reports and dashboards can be defined and configured to satisfy reporting and managerial needs.
Spago4Q provides methods and interfaces to directly configure and create new reports using all the facilities provided by the SpagoBI suite of analytical tools.
THE PROCESS S5. REPORTING
CASE STUDY AM IN FINANCE
Application Management services
● Software Maintenance (Corrective, Adaptive, Perfective, Preventive) for a large mission-critical system in a Finance Institute
Services started in 2006 (analysis period: January 2008 – June 2010)
● Verify QEST nD applicability and results in a context of AM Services
● Define a QEST nD model aligned to the AM services' goals
● Monitor the effectiveness of improvement actions with specific goals and metrics
Goals:
● EC-G3 Reduce the rework (intended as the impact of defects in the UAT or production environment)
● TE-G1 Improve the deploy process
● TE-G5 Improve the effectiveness of peer reviews
Improvement actions:
● Deploy process automation and automatic analysis of source code
● Progressive increase of the number of peer reviews on critical work products
● Specific tasks included in the impact analysis phase with the aim to:
  ● classify and identify critical work products to be reviewed
  ● assign an owner to solve complex defects impacting different development streams
  ● perform root-cause analysis of the recurring defects
Four analysis dimensions:
1. Economical (E)
2. Technical (T)
3. Resource Usage (RS)
4. Customer Satisfaction (CS)
Each dimension is characterized by a specific metric set for process evaluation.
Performance values for each dimension make it possible to identify process areas that need improvement.
CASE STUDY DIMENSIONS OF ANALYSIS
Four analysis dimensions and goals as follows:
1. Economical (E)
● E.G1 Reduce the effort of corrective maintenance
● E.G2 Reduce the number of delayed deliverables
● E.G3 Reduce the rework (intended as the impact of defects in the UAT or production environment)
2. Technical (T)
● T.G1 Improve the deploy process
● T.G2 Reduce the resolution time for defects and technical issues
● T.G3 Improve the quality of documents and source code
● T.G4 Reduce the rework (intended as the impact of defects during the development phase)
● T.G5 Improve the effectiveness of peer reviews
● T.G6 Improve non-regression testing
3. Resource Usage (RS)
● RS.G1 Reduce the impact of human resource management issues
● RS.G2 Improve hardware system availability
4. Customer Satisfaction (CS)
● CS.G1 Improve user satisfaction with training courses and application services
CASE STUDY GOALS & METRICS
Metrics per dimension (description: formula):
Economical (E)
● Incidence of corrective maintenance effort w.r.t. maintained code size: Corrective Maintenance Effort / KLOC
● Ratio of corrective to adaptive maintenance effort: Corrective Maintenance Effort / Adaptive Maintenance Effort
● Incidence of delayed deliverables w.r.t. total number of deliverables: no. Delayed Deliverables / no. Deliverables
● Incidence of defects after system test w.r.t. total number of defects: no. Defects in UAT or production / total no. of Defects
Resource Usage (RS)
● Human resources management issues w.r.t. total number of issues admitted for working group size: no. HR issues / no. issues for group size
● Hardware system availability: percentage system availability
Technical (T)
● Technical management issues w.r.t. total number of issues admitted: no. Technical issues / no. issues admitted
● Issues mean resolution time: total resolution time / no. issues
● Document quality (respect of document quality standard): percentage of positive responses to a checklist
● Software complexity: results of automatic static code analysis
● Coding rules non-conformity level: results of automatic static code analysis
● Software maintainability: results of automatic static code analysis
● Incidence of peer reviews w.r.t. total number of deliverables: no. Peer Reviews / no. Deliverables
● Number of defects discovered by peer reviews w.r.t. total number of defects: no. Peer Review Defects / total no. Defects
● Defects per function point: no. Defects / FP
● Incidence of defects due to design phase w.r.t. total number of defects: no. Defects (design phase) / total no. Defects for any phase
● Test coverage w.r.t. requirements: no. Test Cases / no. Requirements
● Production defects mean resolution time: total resolution time / no. Defects
Customer Satisfaction (CS)
● Training services: questionnaire results
● User satisfaction: questionnaire results
CASE STUDY GOALS & METRICS
CASE STUDY RESULT – QEST DASHBOARD
Trend for each dimension
Last results for each dimension
CASE STUDY RESULT – DIMENSIONS TREND ANALYSIS
CASE STUDY RESULT – GLOBAL & TECHNICAL % INCREASE
[AM-EC-M.04] Defects reduction in UAT and production environment
[AM-TE-M.11] Defects mean resolution time reduction
CASE STUDY RESULT – SAMPLE
[AM-TE-M.01] Technical issues reduction: specifically related to deployment process
[AM-TE-M.02] Technical issues mean resolution time
CASE STUDY RESULT – SAMPLE
[AM-TE-M.07] Number of peer reviews actually executed vs. number of critical Work Products
[AM-TE-M.08] Defects or potential defects discovered during peer reviews
[AM-TE-M.09] Incidence of defects due to design phase
CASE STUDY RESULT – SAMPLE
ENGINEERING’S SOFTWARE LABS TECHNICAL INFRASTRUCTURE
Application Lifecycle Management:
Collaboration (Forum, Blog, Wiki), Document Management, Problem Management, Change Management, Request Management, Risk Management, Program Management, Requirement Management, Development Management, Software Quality, Test Management, Deploy Management, Configuration Management, Document & KB Repository, Reporting System and SLA Dashboards
Service Management:
Service Desk, Knowledge Management, System Monitoring, CMDB, Incident Management, Customer Satisfaction, Reuse Catalog, Customer Sites
Software Factory (ALM):
SCM, IDE, Continuous Integration, Test Automation, Agile Project Management, Monitoring & Control
ENGINEERING’S SOFTWARE LABS SOFTWARE INFRASTRUCTURE
Quality emerges as the result of three dimensions of analysis:
● Economic (E)
● Social (S)
● Technical (T)
Performance values for each dimension make it possible to identify process areas that need improvement.
PRODUCTIVITY INTELLIGENCE INFRASTRUCTURE
"I want to know the productivity of my software factory."
● Social Dimension is a first-class citizen
● Quantitative data about how people adhere to corporate processes
● Qualitative data from LimeSurvey about the satisfaction level of customers, integrators, developers
● Net Promoter Score approach
PRODUCTIVITY INTELLIGENCE SOCIAL ANALYSIS
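The Net Promoter Score computation is a standard one: promoters (ratings 9-10) minus detractors (ratings 0-6), as a percentage of respondents. A sketch with hypothetical survey answers:

```python
def net_promoter_score(ratings):
    """NPS over 0-10 survey ratings: %promoters minus %detractors."""
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)    # ratings 9-10
    detractors = sum(1 for r in ratings if r <= 6)   # ratings 0-6
    return 100.0 * (promoters - detractors) / n      # passives (7-8) ignored

net_promoter_score([10, 9, 8, 7, 6, 3])  # (2 - 2) / 6 -> 0.0
```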
Top Manager (TM)
● Level 1: ESL Chief Manager
● Level 2: ESL Lab Manager
● Level 3: Project Manager (PM)
Engineering's Software Labs (ESL 1, ESL 2, ESL 3), each running projects PRJ 1 … PRJ n (Project Development, Application Maintenance)
PRODUCTIVITY INTELLIGENCE DRILL DOWN VIEWS
PRODUCTIVITY INTELLIGENCE GENERAL DASHBOARD
PRODUCTIVITY INTELLIGENCE QEST ANALYSIS
PRODUCTIVITY INTELLIGENCE QEST DASHBOARD
PRODUCTIVITY INTELLIGENCE SINGLE INDICATOR
"Finally we can REALLY measure performance!"
"Now I can compare Labs' performance!"
"Productivity Intelligence enables performance improvement!"
"Now I know how productive my organization is!"
"Users' & customers' feedback is now integrated with corporate data!"
"Through audit dashboards, corporate QA is under control!"
"I can monitor the quality level of my product!"
"I know if my project is on track & I can identify issues!"
"The development process is under control and I can improve it!"
Top Manager, Quality Manager, Project Manager
PRODUCTIVITY INTELLIGENCE ANSWERS TO MANAGERS’ NEEDS
PRODUCTIVITY INTELLIGENCE RESULTS & NEXT STEPS
INTEGRATOR
INNOVATOR
Skill consolidation; SW factory effectiveness; new market sector: ALM, PRJ AUTO
PURE PLAYER
ProductImprovement
Collaborative projectsResearch developments
Projects: RISCOSS. Conferences and publications (ISSRE, IT Confidence, ICSOB)
Prepare the environment and build the ecosystem
Stimulate creativity
Help bring innovation into market
Deliver market-ready offerings
Measure, assess, and value the results
LET’S MAKE IT HAPPEN!
OPEN SOURCE HAS NO INTRINSIC VALUE PER SE
PRODUCTIVITY INTELLIGENCE CONCLUSIONS
We care about your problems and we have a solution in mind.
Thanks for your attention!
resources: www.spago4q.org
ecology of value: www.spagoworld.org/blog/
comments: www.linkedin.com (group: SpagoWorld)
www.twitter.com (@gruffatti, @spagoworld, @engineeringspa)
contacts: [email protected]