© 2013 Carnegie Mellon University
Empirical Study of Software Engineering Results
Software Engineering Institute Carnegie Mellon University Pittsburgh, PA 15213 TSP Initiative, SEAP/SDD Sept. 18, 2013
If you are a data lover …
… then this interactive is for you.
TSP Initiative Kasunic & Chick │ Sept. 18, 2013 © 2013 Carnegie Mellon University
If you are a data-lover, don’t be shy …
… because you are in great company.
W. Edwards Deming
“When it comes to the really important decisions, data trumps intuition every time.”
Jeff Bezos
12 petabytes for data storage … that’s 12 quadrillion, or 12 × 10¹⁵, bytes.
TSP Symposium Attendee
Topics
• Introduction
• Format of the “Interactive”
• Data Provenance
• The Data
• Next Steps
Please ask questions!
And … we will be asking you questions!
Looking for FRESH ideas …
All comments are welcome!
What is the single most important question that you would want to be addressed through the analysis of TSP data?
Before
Topics
• Introduction
• Format of the “Interactive”
• Data Provenance
• The Data
• Next Steps
What Is Data Provenance?
Provenance, from the French provenir, "to come from," refers to the chronology of the ownership, custody, or location of a historical object.
Data provenance refers to a record trail that accounts for the origin of a piece of data (in a document, repository, or database) together with an explanation of how it got to the present place.
Data provenance assists scientists with tracking their data through all transformations, analyses, and interpretations.
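As a loose illustration only (not any actual SEI tooling), a provenance record of the kind described above pairs a data origin with the ordered chain of transformations applied to it:

```python
# Hypothetical sketch: a minimal provenance record pairing a data
# origin with the transformations applied on the way to the present.
from dataclasses import dataclass, field

@dataclass
class ProvenanceRecord:
    origin: str                                 # where the data came from
    steps: list = field(default_factory=list)   # transformations, in order

    def add_step(self, description: str) -> None:
        self.steps.append(description)

rec = ProvenanceRecord(origin="TSP Data Repository, Archive #1")
rec.add_step("imported with a format-specific utility")
rec.add_step("aggregated into a team composite record")
print(rec.origin, "->", " -> ".join(rec.steps))
```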
Master Jim at work … prying the data from the tool.
The TSP Data Repository includes two archives of approximately 50,000 files.
• 20-25% of those are supporting documents for a launch or postmortem (e.g., presentations, analysis spreadsheets, lists, surveys, etc.).
• 75-80% of those are TSP team performance data files.
There are more than 50 different file formats.
• At this time, only 22 of the needed file import utilities have been developed.
• Only those that contain TSP cycle or postmortem data were processed.
• About 60% of Archive #1 fit the criteria and have been processed. Archive #2 has not been processed yet.
Tests were conducted to ensure that extracted data represented unique projects.
Analysis Based on Teams’ Composite Data

The data is composite team data; each record represents data that has been aggregated from individual team member data.

[Diagram: Individual team member data → Team composite data → To Analysis]
[Diagram: Individual team member data → Team composite data → Statistical Analysis → To Analysis]
Outlier Analysis
Exploratory analysis yielded some outliers that were removed from some of the analyses.
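The deck does not say which outlier criterion was used; one common choice is Tukey's 1.5 × IQR fences, sketched here on invented data:

```python
# Illustrative only: flag outliers with Tukey's 1.5*IQR fences.
# (The criterion actually used for the TSP analyses is not stated.)
import statistics

def iqr_outliers(values):
    """Return the values lying outside the 1.5*IQR fences."""
    xs = sorted(values)
    n = len(xs)
    q1 = statistics.median(xs[: n // 2])         # median of lower half
    q3 = statistics.median(xs[(n + 1) // 2 :])   # median of upper half
    fence = 1.5 * (q3 - q1)
    return [x for x in xs if x < q1 - fence or x > q3 + fence]

print(iqr_outliers([5, 6, 7, 8, 9, 50]))  # the 50 is flagged
```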
Topics
• Introduction
• Format of the “Interactive”
• Data Provenance
• The Data
  • Team and Project Characteristics
  • Product Size
  • Schedule Performance
  • Quality Indicators
Country Source

[Bar chart: United States = 93, Mexico = 20]
Project Start Date

[Histogram: Project Start Date, Jul-2000 to Jul-2012, vs. Frequency; n = 113]
Project Duration (Calendar Days)

[Histogram: Project Duration (Days) vs. Frequency; Mean = 119.4, Median = 91.0, n = 113]
Project Duration (Weeks)

[Histogram: Duration (Weeks) vs. Frequency; Mean = 16.9, Median = 13.0, n = 113]
Team Size

[Histogram: Team Size vs. Frequency; Mean = 8.2, Median = 8.0, n = 111]
What other types of “team and project characteristics” analyses would you find valuable?
Topics
• Introduction
• Format of the “Interactive”
• Data Provenance
• The Data
  • Team and Project Characteristics
  • Product Size
  • Schedule Performance
  • Quality Indicators
• Next Steps
Actual Total Size

[Histogram: Total Size (KLOC) vs. Frequency; Mean = 24.0, Median = 3.8, n = 111]
Actual Added and Modified Size

[Histogram: Actual Added & Modified Code (KLOC) vs. Frequency; Mean = 11.7, Median = 6.6, n = 112]
Another View of the Size Data

From the data, calculate the
1. log of each value
2. mean (x̄) of the values from #1
3. standard deviation (s) of the values from #1
4. exponent of the values from #2 and #3.

Values of the relative size table (bin width = 1 SD in log space, each boundary exponentiated back to original units):

| Very Small | Small | Medium | Large | Very Large |
| --- | --- | --- | --- | --- |
| x̄ − 2s | x̄ − s | x̄ | x̄ + s | x̄ + 2s |
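The four-step calculation can be sketched as follows; the size values below are invented for illustration, not the TSP data:

```python
# Sketch of the relative-size binning described above; assumes sizes
# are roughly log-normal. Input values are made up for illustration.
import math
import statistics

sizes = [800, 1500, 3200, 6000, 12000, 30000, 95000]  # hypothetical A&M LOC
logs = [math.log(v) for v in sizes]          # 1. log of each value
xbar = statistics.fmean(logs)                # 2. mean of the logs
s = statistics.stdev(logs)                   # 3. standard deviation of the logs
# 4. exponentiate xbar - 2s ... xbar + 2s to get the VS / S / M / L / VL values
bins = {label: math.exp(xbar + k * s)
        for label, k in zip(["VS", "S", "M", "L", "VL"], range(-2, 3))}
for label, loc in bins.items():
    print(f"{label}: {loc:,.0f} LOC")
```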
Added & Modified Code – Relative Sizes

[Histogram: Added and Modified Lines of Code vs. Frequency; relative-size values VS = 287, S = 1,265, M = 5,571, L = 24,532, VL = 108,022]
What would you like to see in terms of analyses associated with product size?
Topics
• Introduction
• Format of the “Interactive”
• Data Provenance
• The Data
  • Team and Project Characteristics
  • Product Size
  • Schedule Performance
  • Quality Indicators
• Next Steps
What do you think is the average number of weekly task hours that teams are able to accomplish?
Mean Team Member Weekly Task Hours

[Histogram: Mean Team Member Weekly Task Hours vs. Frequency; Mean = 10.3, Median = 9.0, n = 111]
Productivity

[Histogram: Productivity (LOC/Hr) vs. Frequency; Mean = 10.3, Median = 7.1, n = 112]
Let’s Look At Some Scatter Plots

Using linear regression: y = mx + b, where m is the slope and b is the y-intercept.

r is a measure of the correlation between the x values and the y values. Values of r² ≥ 0.5 indicate a meaningful relationship.
There Are A Few Rules …
Plan Task Hours Vs. Actual Task Hours

[Scatter plot: Plan Task Hours vs. Actual Task Hours; R² = 0.8038, n = 113]
Duration: Planned Weeks Vs. Actual Weeks

[Scatter plot: Duration in Planned Weeks vs. Actual Weeks; R² = 0.7401, n = 113]
Actual Task Hours Vs. Added & Modified LOC

[Scatter plot: Actual Added & Modified LOC vs. Actual Task Hours; R² = 0.3012, n = 113]
Team Size vs. Productivity

[Scatter plot: Productivity (LOC/Hr) vs. Team Size; R² = 0.0297, n = 111]
Actual Added & Modified LOC Per Staff Week

[Histogram: Actual A&M LOC Per Staff Week vs. Frequency; Mean = 204.9, Median = 100.9, n = 109]
Plan Vs. Actual Hours for Completed Parts

[Scatter plot: Plan Hours vs. Actual Hours for Completed Parts; R² = 0.952, n = 113]
Schedule Growth Beyond Baseline

[Histogram: Schedule Growth from Baseline (hours) vs. Frequency; Mean = 2.8, Median = 0.0, n = 113]
Final Earned Value

[Histogram: Final Earned Value vs. Frequency; Mean = 85.2, Median = 94.3, n = 110]
Are there other types of “schedule performance” analyses that you would like to see?
Topics
• Introduction
• Format of the “Interactive”
• Data Provenance
• The Data
  • Team and Project Characteristics
  • Product Size
  • Schedule Performance
  • Quality Indicators
• Next Steps
Total Defects Injected Per KLOC

[Histogram: Defects Injected per KLOC vs. Frequency; Mean = 54.7, Median = 31.5, n = 79]
Defect Density – DLD Review

[Histogram: Defects Per KLOC – DLD Review vs. Frequency; Mean = 5.2, Median = 2.2, n = 106]
Defect Density – Code Review

[Histogram: Defects Per KLOC – Code Review vs. Frequency; Mean = 8.4, Median = 5.2, n = 106]
Defect Density – Code Inspection

[Histogram: Defects Per KLOC – Inspection vs. Frequency; Mean = 6.7, Median = 3.3, n = 106]
Defect Density – Unit Test

[Histogram: Defects Per KLOC – Unit Test vs. Frequency; Mean = 5.0, Median = 3.8, n = 106; annotated "< 5"]
Defect Density – Build and Integration Test

[Histogram: Defects Per KLOC – Build and Integration Test vs. Frequency; Mean = 1.7, Median = 0.7, n = 106; annotated "< 0.5"]
Defect Density – System Test

[Histogram: Defects Per KLOC – System Test vs. Frequency; Mean = 1.5, Median = 0.15, n = 106; annotated "< 0.2"]
Defect Density – Summary

[Six-panel summary of the Defects Per KLOC histograms: DLD Review, Code Review, Inspection, Unit Test, Build and Integration Test, System Test]
Defect Density – Median of Defects Per KLOC

| Phase | Median Defects Per KLOC |
| --- | --- |
| DLD Review | 2.2 |
| Code Review | 5.2 |
| Code Inspection | 3.3 |
| Unit Test | 3.8 (annotated "< 5") |
| Build/Integration Test | 0.7 (annotated "< 0.5") |
| System Test | 0.15 (annotated "< 0.2") |
Actual UT Defect Density Vs. ST Defect Density

[Scatter plot: System Test Defect Density vs. Unit Test Defect Density; R² = 0.0031, n = 111]
![Page 62: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/62.jpg)
The Yield Quality Measure

[Diagram, repeated for three successive phases: a development phase injects defects into an intermediate product, and a defect-removal phase then takes some of them out; the removal phase's effectiveness is its phase yield.]

The yield of a phase is the percentage of defects removed in that phase.
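The definition can be sketched in a few lines. Note that in practice the defects that escape a phase are only counted later, when they surface in downstream phases, so yield is always a retrospective measure; the names and numbers here are illustrative:

```python
def phase_yield(removed_in_phase: int, escaped_from_phase: int) -> float:
    """Percentage of the defects entering a phase that the phase removes."""
    entering = removed_in_phase + escaped_from_phase
    return 100.0 * removed_in_phase / entering if entering else 0.0

# Hypothetical: a code review removes 12 defects; 28 escape to later phases.
print(phase_yield(12, 28))  # 30.0, in line with the ~30% code-review yields reported below
```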
![Page 63: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/63.jpg)
Yield: Detailed-Level Design Review

[Histogram of yield (0.0–1.0) vs. frequency. Mean = 31.6%, Median = 29.3%, n = 98.]
![Page 64: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/64.jpg)
Yield: Detailed-Level Design Inspection

[Histogram of yield (0.0–1.0) vs. frequency. Mean = 49.3%, Median = 50.0%, n = 83.]
![Page 65: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/65.jpg)
Yield: Code Review

[Histogram of yield (0.00–0.90) vs. frequency. Mean = 29.2%, Median = 30.1%, n = 109.]
![Page 66: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/66.jpg)
Yield: Code Inspection

[Histogram of yield (0.00–0.90) vs. frequency. Mean = 30.3%, Median = 26.1%, n = 110.]
![Page 67: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/67.jpg)
Yield: Unit Test

[Histogram of yield (0.0–1.0) vs. frequency. Mean = 49.7%, Median = 46.2%, n = 106.]
![Page 68: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/68.jpg)
Summary: Median Phase Yields

• DLD Review: 29.3%
• DLD Inspection: 50.0%
• Code Review: 30.1%
• Code Inspection: 26.1%
• Unit Test: 46.2%
![Page 69: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/69.jpg)
Process Yield

[Histogram of process yield (0.4–1.0) vs. frequency. Mean = 73.6%, Median = 73.7%, n = 77.]
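As a rough illustration of how phase yields relate to an overall process yield, if phases acted independently their yields would compose as below. That independence assumption is a simplification: the composed figure overshoots the observed 73.7% median because real phases also inject new defects downstream, so treat this strictly as a sketch:

```python
from math import prod

def process_yield(phase_yields_pct):
    """Combined yield under an independence assumption: a defect escapes the
    whole process only if it escapes every individual phase."""
    escape = prod(1 - y / 100 for y in phase_yields_pct)
    return 100 * (1 - escape)

# Median phase yields from the summary slide (percent):
# DLD review, DLD inspection, code review, code inspection, unit test
medians = [29.3, 50.0, 30.1, 26.1, 46.2]
print(round(process_yield(medians), 1))  # 90.2, above the observed 73.7% median,
# because real phases inject new defects and their yields are not independent
```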
![Page 70: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/70.jpg)
Review of Some Definitions …
In the TSP:
Appraisal COQ = (Review & Inspection Time / Total Development Time) × 100

Failure COQ = (Test Time / Total Development Time) × 100
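A minimal sketch of the two cost-of-quality definitions above, assuming time is tracked in hours; the function names and sample numbers are hypothetical:

```python
def appraisal_coq(review_hours: float, inspection_hours: float,
                  total_dev_hours: float) -> float:
    """Appraisal COQ: share of total development time spent in reviews and inspections."""
    return 100 * (review_hours + inspection_hours) / total_dev_hours

def failure_coq(test_hours: float, total_dev_hours: float) -> float:
    """Failure COQ: share of total development time spent in test."""
    return 100 * test_hours / total_dev_hours

# Hypothetical project totals, in hours
print(appraisal_coq(60, 48, 400))  # 27.0
print(failure_coq(68, 400))        # 17.0
```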
![Page 71: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/71.jpg)
Appraisal Cost of Quality

[Histogram of Appraisal Cost of Quality in percent (0–50) vs. frequency. Mean = 26.6%, Median = 27.0%, n = 86.]
![Page 72: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/72.jpg)
Failure Cost of Quality

[Histogram of Failure Cost of Quality in percent (0–75) vs. frequency. Mean = 22.0%, Median = 16.9%, n = 86.]
![Page 73: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/73.jpg)
Appraisal: COQ vs. Defects Removed

[Scatter plot of Appraisal Cost of Quality in percent (x-axis, 0–60) against Defects Per KLOC Removed During Appraisal Phases (y-axis, 0–140). R² = 0.0347, n = 81.]
![Page 74: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/74.jpg)
Time in Code Review Vs. Defects Removed

[Scatter plot of Actual Time in Code Review in hours (x-axis, 0–500) against Defects Removed in Code Review (y-axis, 0–700). R² = 0.495, n = 112.]
![Page 75: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/75.jpg)
Time in Inspection Vs. Defects Removed

[Scatter plot of Actual Time in Inspection in hours (x-axis, 0–600) against Defects Removed During Inspection (y-axis, 0–600). R² = 0.6272, n = 112.]
![Page 76: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/76.jpg)
Time in Unit Test Vs. Defects Removed

[Scatter plot of Actual Time in Unit Test in hours (x-axis, 0–1200) against Defects Found (y-axis, 0–350). R² = 0.3505, n = 112.]
![Page 77: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/77.jpg)
Time in System Test Vs. Defects Removed

[Scatter plot of Actual Time in System Test in hours (x-axis, 0–700) against Defects Found (y-axis, 0–300). R² = 0.1996, n = 29.]
![Page 78: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/78.jpg)
Summary: Time in Phase Vs. Defects Removed (R² correlations)

• Appraisal phases: Code Review 0.50, Inspection 0.63
• Test phases: Unit Test 0.35, System Test 0.20

The slide flags the stronger appraisal-phase correlations as predictive.
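The correlations above are plain R² values from a simple linear regression, which can be computed without a statistics package; the data points below are hypothetical, chosen only to show the calculation:

```python
def r_squared(xs, ys):
    """R^2 of a least-squares line of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)  # equals R^2 for a one-variable linear fit

# Hypothetical (hours in phase, defects removed) pairs
hours = [10, 40, 80, 120, 200]
defects = [5, 30, 45, 90, 130]
print(round(r_squared(hours, defects), 2))  # 0.98
```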
![Page 79: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/79.jpg)
Code Review to Code – Actual Time in Phase

[Histogram of the ratio of actual code-review time to coding time (0.00–0.90). Mean = 0.3, Median = 0.3, n = 113; guideline: > 0.5.]
![Page 80: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/80.jpg)
Design to Code – Actual Time in Phase

[Histogram of the ratio of actual design time to coding time (0.00–4.50). Mean = 0.9, Median = 0.8, n = 113; guideline: > 1.0.]
![Page 81: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/81.jpg)
Design Review to Design – Actual Time in Phase

[Histogram of the ratio of actual design-review time to design time (0.0–1.2). Mean = 0.28, Median = 0.23, n = 113; guideline: > 0.5.]
![Page 82: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/82.jpg)
Req. Inspection to Req. – Actual Time in Phase

[Histogram of the ratio of actual requirements-inspection time to requirements time (0.00–3.75). Mean = 0.88, Median = 0.46, n = 19; guideline: > 0.25.]
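A small sketch of how such phase-time ratios could be checked against the guideline thresholds shown on these slides; the dictionary keys and hour totals are hypothetical, not the study's field names:

```python
# Guideline thresholds from the preceding slides, expressed as time ratios
GUIDELINES = {
    "design_to_code": 1.0,
    "design_review_to_design": 0.5,
    "code_review_to_code": 0.5,
    "req_inspection_to_req": 0.25,
}

def check_ratios(hours: dict) -> dict:
    """Compare a project's phase-time ratios with the guideline thresholds.
    Returns {indicator: (ratio, meets_guideline)}."""
    ratios = {
        "design_to_code": hours["design"] / hours["code"],
        "design_review_to_design": hours["design_review"] / hours["design"],
        "code_review_to_code": hours["code_review"] / hours["code"],
        "req_inspection_to_req": hours["req_inspection"] / hours["requirements"],
    }
    return {name: (round(r, 2), r > GUIDELINES[name]) for name, r in ratios.items()}

# A hypothetical project roughly matching the medians above
print(check_ratios({"design": 80, "code": 100, "design_review": 18,
                    "code_review": 30, "req_inspection": 10, "requirements": 20}))
```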
![Page 83: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/83.jpg)
For “quality indicators,” what additional analyses would you like to see?
![Page 84: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/84.jpg)
What is the single most important question that you would want addressed through the analysis of TSP data?
![Page 85: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/85.jpg)
Topics
Introduction
Format of the “Interactive”
Data Provenance
The Data
• Team and Project Characteristics
• Product Size
• Schedule Performance
• Quality Indicators
Next Steps
![Page 86: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/86.jpg)
• Review your feedback from today and adjust the analysis approach accordingly.
• Extract the data from Process Dashboard tool submitted files.
• Extract and analyze individual team member data.
• Continue with the data analysis. Publish the results.
![Page 87: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/87.jpg)
Contact Information
Mark Kasunic
Senior Member of Technical Staff, TSP Initiative
Telephone: +1 412-268-5863
Email: [email protected]

U.S. Mail:
Software Engineering Institute
Customer Relations
4500 Fifth Avenue
Pittsburgh, PA 15213-2612
USA

Web:
www.sei.cmu.edu
www.sei.cmu.edu/contact.cfm

Customer Relations Email: [email protected]
Customer Relations Telephone: +1 412-268-5800
SEI Fax: +1 412-268-6257
![Page 88: Empirical Study of Software Engineering Results · Empirical Study of Software Engineering Results . Software Engineering Institute . Carnegie Mellon University . Pittsburgh, PA 15213](https://reader030.fdocuments.us/reader030/viewer/2022041011/5ebc27831c99c104bf30ecfc/html5/thumbnails/88.jpg)
Copyright 2012 Carnegie Mellon University.
This material is based upon work supported by the Department of Defense under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center.
Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Department of Defense.
NO WARRANTY
THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.
This material has been approved for public release and unlimited distribution except as restricted below.
Internal use:* Permission to reproduce this material and to prepare derivative works from this material for internal use is granted, provided the copyright and “No Warranty” statements are included with all reproductions and derivative works.
External use:* This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other external and/or commercial use. Requests for permission should be directed to the Software Engineering Institute at [email protected].
*These restrictions do not apply to U.S. government entities.