Good Projects Gone Bad: an Introduction to Process Maturity



Introduction to Process Maturity
AAM Annual Conference

April 27, 2008

Michael Edson
Director, Web and New Media Strategy

Smithsonian Institution

From the session

Good Projects Gone Bad
Managing and Delivering Complex Technology Projects

Michael Edson
Director, Web and New Media Strategy

Smithsonian Institution

Nik Honeysett
Head of Administration
J. Paul Getty Museum


Table of Contents

Abstract
Projects in Trouble
Technology management as core competency
Process Maturity and Capability Maturity Model Integration
Understanding different levels of capability maturity
Using CMMI
    First, figure out where you are
    Ratchet up one level at a time
    Don't try to skip levels
    Don't slip back
    Pick projects appropriate for your level
    Assign responsibility and measure, measure, measure
Some Practical Ways to increase process maturity
    Classic mistakes avoidance
    Transparency through standardized reporting
    Governance Structure
Consequences and phenomena
    Web 2.0: Lightweight Development Frameworks
    Governance and Control
    Capability mismatch
Real World Examples
    Capability Mismatch: Smithsonian Multimedia Guide
    Lightweight Software Development: SAAM's Eye Level Blog
    Matching goals to capacity and maturity: SAAM "Findability" project
Conclusion
Smithsonian Web and New Media Strategy


Abstract

Museum Web and New Media software projects offer tantalizing rewards, but the road to success can be paved with uncertainty and risk. To small organizations these risks can be overwhelming, and even large organizations with seemingly limitless resources can flounder in ways that profoundly affect staff morale, public impact, the health and fitness of our partners in the vendor community, and our own bottom lines. Something seems to happen between the inception of projects, when optimism and beneficial outcomes seem clear and attainable, and somewhere down the road when schedules, budgets, and outcomes go off course. What is it? And what can we do to gain control?

This paper, created for the 2008 annual conference of the American Association of Museums, describes some common ways that technology projects get into trouble. It examines a proven project-process framework called the Capability Maturity Model and how that model can provide insight and guidance to museum leaders and project participants, and it tells how to improve real-world processes that contribute to project success. The paper includes three brief case studies and a call-to-action which argues that museum leaders should make technology stewardship an urgent priority.

The intended audience is people who are interested in understanding and improving how museum-technology gets done. The paper’s primary focus is Web and New Media software projects, but the core ideas are applicable to projects of all kinds.

A note on style and formatting

I decided to write this paper, instead of just creating PowerPoint slides, to force myself to delve deeper into these ideas, tie them together, and give them a home on the Web where others can find them, use them, critique them, and improve them. I'm not a trained academic writer and I haven't benefited from editorial assistance, so I ask the reader's forgiveness for the errors in consistency and style that they will certainly find. I suppose my paper-writing process maturity is "evolving"—a joke that will be funnier when you've reached the final page.


Projects in Trouble

Back in the 1980s the Federal Government had a problem. Software projects were failing—expensively, painfully, publicly failing.

The US General Accounting Office, in a report titled "Mission-Critical Systems: Defense Attempting to Address Major Software Challenges,"1 observed:

As systems become increasingly complex, successful software development becomes increasingly difficult. Most major system developments are fraught with cost, schedule, and performance shortfalls. We have repeatedly reported on costs rising by millions of dollars, schedule delays of not months but years, and multibillion-dollar systems that don’t perform as envisioned.

The problem wasn't just that the government couldn't complete software projects on time or on budget, or that it couldn't predict which of its current projects would succeed or fail—though these were both significant and severe problems. Most worrisome, from my perspective, is that it couldn't figure out which new projects it was capable of doing in the future. If a business case or museum mission justifies an investment in technology, that justification is based on the assumption that the technology can be competently implemented. If instead the assumption is that project execution is a crap shoot, the business case and benefit-to-mission arguments crumble and managers are stuck, unable to move forward (because of the risk of failure) and unable to not move forward (because business and mission needs still call).

The pace of change in foundational technologies and assumptions (or dreams) about what could be done with software makes it very difficult for managers to know what skills, competencies, and capacities they need to ensure success.2 There's little in most museum employees' training or experience to prepare them for making technology decisions, and there are few patterns to follow. Unlike the building-construction and maintenance trades, software engineering is a relatively new profession in museums and doesn't benefit from generations of established practice, training, certification, standards, and lessons learned through trial and error. One wouldn't try to run a museum building without a building manager and building engineer—your insurance company would probably cancel your policy if you tried—yet few people would raise a red flag if you ran a museum without the equivalent technology expertise.

1 General Accounting Office, 1992, IMTEC-93-13 Mission-Critical Systems: Defense Attempting to Address Major Software Challenges, http://archive.gao.gov/d36t11/148399.pdf accessed 4/21/2008

2 At the 2006 Gilbane Conference on Content Technologies in Government (Washington, DC, June 2006), CIOs confessed that they had given up multi-year planning because the pace of change in the industry was just too great. (I attended and presented at this conference.)


Technology management as core competency

As a case in point, the AAM accreditation process requires museums to provide detailed information about the operation and maintenance of physical facilities—including detailed documentation of physical security, health and safety programs, grounds and land-management plans, and something called an "RC-AAM Standard Facility Report for the museum's buildings"—but there are no questions about information technology beyond a single item on "Internet-related interpretive activities." It would appear that a museum can become AAM accredited without an exploration of its information-technology operations, or at the very least that the process does not draw museums' attention to IT best practices in the same way that it does for buildings and grounds.3 I expect this sends a message to museum leaders that information technology is not significantly important to their long-term success.

But it is important—critically so. Even if you're a small museum with fewer than ten employees, somebody is involved in software development. They're making Excel spreadsheets, Access databases, collections of Word documents and forms. Maybe they've got you involved in YouTube, Flickr, or Facebook. And if your staff isn't doing this, your visitors are. I'd be surprised to find an AAM member museum that doesn't have a single technology initiative, and I'm sure that you could not find a museum whose audiences were indifferent to technology.

And museums can’t choose not to focus on technology. Witness the story of Doug Morris, Chair and CEO of Universal Music Group, which I offer as a cautionary tale.

Doug Morris, Chair and CEO Universal Music Group (Photo: Getty Images)4

Mr. Morris, by all appearances, is a successful tycoon, running a $7 billion-a-year pop culture empire5 and hobnobbing with the rich and famous—he would be recognizable and comfortable as a donor and member on museum boards. (He was a director of the Rock and Roll Hall of Fame.6)

3 Based on the AAM Accreditation Self-Study Questionnaire (2007) and the author’s recent experience with the AAM Accreditation process.

4 http://www.wired.com/entertainment/music/magazine/15-12/mf_morris, accessed 4/26/2008

5 Universal's CEO Once Called iPod Users Thieves. Now He's Giving Songs Away, Wired, 11/27/2007, http://www.wired.com/entertainment/music/magazine/15-12/mf_morris accessed 4/21/2008


Mr. Morris is also a creative person: he wrote "Sweet Talkin' Guy" for The Chiffons in 1966 and produced "Smokin' In the Boys Room" for Brownsville Station in 1973.7

But at the helm of his $7 billion-a-year business, Mr. Morris chose to opt out of the technology business in the 1990s, just when digital music and the Internet went supernova. The awkward stumbling of the music business in the last 15 years, the acrimony caused by its relentless pursuit of its own customers, and a cascade of technology failures, missed boats, and squandered opportunities were the result.

From a Wired Magazine interview:

"There's no one in the record company that's a technologist," Morris explains. "That's a misconception writers make all the time, that the record industry missed this. They didn't. They just didn't know what to do. It's like if you were suddenly asked to operate on your dog to remove his kidney. What would you do?"

"We didn't know who to hire," he says, becoming more agitated. "I wouldn't be able to recognize a good technology person — anyone with a good bullshit story would have gotten past me.8"

New York Entertainment's blog Vulture observed this about Mr. Morris's confession:

Even though we shouldn't be, we're actually a little shocked. We'd always assumed the labels had met with a team of technology experts in the late nineties and ignored their advice, but it turns out they never even got that far — they didn't even try!

New York Entertainment continues:

Understanding the Internet certainly isn't easy — especially for an industry run by a bunch of technology-averse sexagenarians — but it's definitely not impossible. The original Napster hit its peak in 1999 — kids born since then have hacked into CIA computers. Surely it wouldn't have taken someone at Universal more than a month or two to learn enough about the Internet to know who to call to answer a few questions. They didn't even have any geeky interns?9

6 Vivendi board bio http://www.vivendi.com/corp/en/governance/dir_morris.php accessed 4/21/2008

7 Universal's CEO Once Called iPod Users Thieves. Now He's Giving Songs Away, Wired, 11/27/2007, http://www.wired.com/entertainment/music/magazine/15-12/mf_morris accessed 4/21/2008

8 Universal's CEO Once Called iPod Users Thieves. Now He's Giving Songs Away, Wired, 11/27/2007, http://www.wired.com/entertainment/music/magazine/15-12/mf_morris accessed 4/21/2008

9 Apropos of Nothing, New York Entertainment, http://nymag.com/daily/entertainment/2007/11/universal_music_ceo_doug_morris.html accessed 4/19/2008.


So what's the headline here? It's that businesses large and small have a lot to gain from focusing on how to get good and stay good at technology, that nobody is immune from failure, and that nobody gets to opt out. The irony is that many museums are drawn to complex technology initiatives, and to the risk of getting in over their heads, just as they reach the point where successful technology projects can have a positive impact.10

Process Maturity and Capability Maturity Model Integration

Capability Maturity Model Integration (CMMI) is the successor to the Capability Maturity Model (CMM), developed by the Software Engineering Institute (SEI) at Carnegie Mellon University (http://www.sei.cmu.edu) in 1991 to help the Federal Government understand the capabilities of its software vendors and deal proactively with the problem of out-of-control software projects. It became and remains a best-practice software-development framework,11 and its core ideas can help organizations of all kinds escape from what Steve McConnell, in his software development bible Rapid Development (Microsoft Press, 1996), calls the Gilligan's Island cycle of under-performing projects.

Figure 1. Use CMMI to help your team escape from Gilligan's Island

CMM posits that organizations, or groups or processes within organizations, function at one of five levels of process maturity, with level 1 being the lowest or least mature level and level 5 the highest or most mature level.12

1. Initial – Processes, if they are defined at all, are ad hoc. Successes depend on individual heroics and are generally not repeatable.

10 Edson, Michael, Data Access Strategy, in J. Trant and D. Bearman (eds.). Museums and the Web 2006: Proceedings, Toronto: Archives & Museum Informatics, published March 1, 2006 at http://www.archimuse.com/mw2006/papers/edson/edson.html

11 Gartner Research: CMMI Remains the Standard for Software Process Frameworks. ID Number: G00156315. Published 4/18/2008.

12 The descriptions from this list are from Paulk, M., et al (1995). The Capability Maturity Model: Guidelines for Improving the Software Process, New York: Addison-Wesley Professional, but the names of the five levels show the updated terminology established by the Software Engineering Institute in 2003.


2. Managed – Basic project management practices are established and the discipline is in place to repeat earlier successes with similar projects.

3. Defined – Processes are documented and standardized and all projects use approved, tailored versions of the standard processes.

4. Quantitatively Managed – The performance of processes and the quality of end-products are managed with quantitative measurement and analysis.

5. Optimizing – Continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas.

Figure 2. The five levels of the Capability Maturity Model

The five levels should be understood as a kind of staircase, lowest maturity on the bottom and highest on the top, with each level serving as the foundation for the level above (figure 2).

Understanding different levels of capability maturity

Paulk et al., in The Capability Maturity Model: Guidelines for Improving the Software Process (New York: Addison-Wesley Professional, 1995), lay out a very useful chart that helps bring into focus how the model relates to our own organizations and what our work worlds would look like if capability and maturity were improved. (See Table 1.)

Typical in the low-maturity column (Level 1) are phrases like

success depends on individual heroics

and,

few stable processes exist or are used

Typical in the high-maturity column are phrases like

strong sense of teamwork exists across the organization


and,

everyone is involved in process improvement

I myself have worked on more than a few projects that functioned at level 1. At level 1, not much is written down; nobody is sure who is doing what or who "owns" various parts of the project; nobody can really tell what the schedule is or whether you are on schedule or running late; meetings are held but nobody takes notes or records actionable assignments; and products are often only partially finished, with lots of surprises and defects discovered at the last minute before "completion."

This isn't to say that you can't have successes at level 1. I worked for a museum that produced several award-winning technology projects this way, but the successes were built on individual heroics and the effort almost killed them. (Just after winning awards for a major Web site this organization spent over two years trying to complete what was to have been a six-month redesign.) Functioning at this maturity level certainly diminished their willingness to stay together as a team or build on their success, and it bred a distrust of technology-content initiatives in general that made it difficult to get buy-in for urgent and necessary new projects.


Table 1. Implications of advancing through CMM levels. Which columns best describe your organization? (This table was very slightly modified to enhance clarity for non-software professionals.)

People
  Level 1: Success depends on individual heroics. "Fire fighting" is a way of life. Relationships between disciplines are uncoordinated, perhaps even adversarial.
  Level 2: Success depends on individuals. Commitments are understood and managed. People are trained.
  Level 3: Project groups work together, perhaps as an integrated team. Training is planned and provided according to roles.
  Level 4: Strong sense of teamwork exists within each project.
  Level 5: Strong sense of teamwork exists across the organization. Everyone is involved in process improvement.

Processes
  Level 1: Few stable processes exist or are used. "Just do it!"
  Level 2: At the individual project level, documented and stable estimating, planning, and commitment processes are used. Problems are recognized and corrected as they occur.
  Level 3: Integrated management and engineering (how things get built) processes are used across the organization. Problems are anticipated and prevented, or their impacts are minimized.
  Level 4: Processes are quantitatively understood and stabilized. Sources of individual problems are understood and eliminated.
  Level 5: Processes are continuously and systematically improved. Common sources of problems are understood and eliminated.

Measurement
  Level 1: Data collection and analysis are ad hoc.
  Level 2: Planning and management data used by individual projects.
  Level 3: Data are collected and used in all defined processes. Data are systematically shared across projects.
  Level 4: Data definition and collection are standardized across the organization. Data are used to understand work processes quantitatively and stabilize them.
  Level 5: Data are used to evaluate and select process improvements.

Technology
  Level 1: Introduction of new technology is risky.
  Level 2: Technology supports established, stable activities.
  Level 3: New technologies are evaluated on a qualitative basis.
  Level 4: New technologies are evaluated on a quantitative basis.
  Level 5: New technologies are proactively pursued and deployed.


Using CMMI

Using CMMI can be a relatively informal process that involves understanding and applying process-improvement best practices to your organization. Or it can be a formal process that involves extensive training, creation of a process-improvement infrastructure, appraisals, and more.

To avoid confusing people who are familiar with heavy-duty process-improvement efforts, I must draw a distinction between the formal CMMI process defined by the Software Engineering Institute and what I'm talking about here. In this paper I argue that many organizations can benefit from what CMMI has to offer, but I am not advocating a full-fledged CMMI program, which typically involves formal assessment teams, rigid interpretations of CMMI, and a great deal of work: these kinds of efforts don't deliver good return on investment for organizations at emerging maturity levels.13 What I advocate is a kind of CMMI-Lite in which organizations borrow the most useful aspects of CMMI without becoming overly bound to the formal doctrine. As Gartner, Inc. says, "Organizations should use CMM as a guidebook, not a 'cookbook.' Results-based improvement should be the key."14

First, figure out where you are

Unless you're working with a formal CMM assessment team, the first step to understanding and improving your capability maturity is to look at Table 1 and identify the statements that best describe how your team does work. You don't have to think across every kind of project your organization does: pick one or two projects or activities that you think would benefit from some improvement. Note that it's not uncommon for organizations to have some processes that are very mature and some that are very immature. CMMI orthodoxy recognizes this and encourages a methodology of continuous improvement at varying levels of maturity.
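To make that self-assessment concrete, here is a minimal sketch in Python (not part of CMMI or of this paper's original material): the statements are abridged from Table 1, the yes/no answers stand in for a hypothetical team's responses, and the simple tally is illustrative, not a formal appraisal.

```python
# Minimal self-assessment sketch: tally which CMM level's statements
# best describe your team. Statements abridged from Table 1; the
# answers below are a hypothetical team's responses.
from collections import Counter

# (statement, level) pairs abridged from Table 1
statements = [
    ("Success depends on individual heroics", 1),
    ("'Fire fighting' is a way of life", 1),
    ("Commitments are understood and managed", 2),
    ("Planning and management data used by individual projects", 2),
    ("Data are systematically shared across projects", 3),
    ("Training is planned and provided according to roles", 3),
    ("Processes are quantitatively understood and stabilized", 4),
    ("Processes are continuously and systematically improved", 5),
]

# Hypothetical answers: True where the statement rings true for the team
answers = [True, True, True, False, False, False, False, False]

votes = Counter(level for (text, level), yes in zip(statements, answers) if yes)
estimate = votes.most_common(1)[0][0] if votes else 1
print(f"Rough starting estimate: Level {estimate}")  # -> Level 1
```

The point is not the arithmetic but the conversation: having a team answer the statements together usually surfaces the disagreements that matter.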

You may find it useful to modify Table 1 or the overarching CMMI levels of maturity listed above and cast them in terms that better describe your organization or your project.

13 Phone interview with Sanjeev Sharma, CMM specialist and IT Manager Safety & Mission Assurance, NASA Goddard Space Flight Center in Greenbelt, MD. 3/11/08

14 Gartner Research, CMMI Remains the Standard for Software Process Frameworks. Article ID #G00156315, 4/16/2008. The exact quote is “Internal development organizations should use CMMI as a guidebook, not a ‘cookbook.’ Results-based improvement should be the key.”


Figure 3. Figure out where you are on the Capability Maturity Model

For example, in 2006 I modified the out-of-the-box CMM level definitions to be more meaningful to a data-strategy project at the Smithsonian American Art Museum.15 The definitions shown below helped me understand the roadmap and projects that were needed to get us from where we were (level 2) to where we wanted to be (levels 3 and 4).

Level 1 – Limited data federation; often with redundant and inconsistent data. Data strategy is not even on the organizational radar.

Level 2 – Limited data consolidation; documenting redundancies and inconsistencies. Some isolated departments are trying to raise awareness and initiate projects.

Level 3 – Data integration initiated; new ‘disintegration’ is discouraged. Multi-departmental teams begin working on policies and procedures to advance a data strategy.

Level 4 – Data integration widely adopted; 'disintegration' is penalized. All projects in the organization adhere to data integration policies and managers are held accountable for variances.

If you conclude that you're at a low level of maturity, you're not alone. Gartner research finds that most organizational software development teams function at Level 1 or Level 2, "which means that, at best, they have some reasonably good project management practices," and that less than 25% of teams function at level 3 or higher (Hotle, 'Just Enough Process' for Applications). Taken at face value, this means that most software development efforts can be expected to produce inconsistent results with little control of budget and timelines. Though this is appalling, the good news is that basic process improvement initiatives could have a dramatic effect on the productivity and predictability of a great many software projects.

15 Edson M., Data Access Strategy, in J. Trant and D. Bearman (eds.). Museums and the Web 2006: Proceedings, Toronto: Archives & Museum Informatics, published March 1, 2006 at http://www.archimuse.com/mw2006/papers/edson/edson.html. I’m not sure why I described four levels instead of five.


Ratchet up one level at a time

If you're at level 1, what small steps can you take to get to level 2? The Software Engineering Institute says that you can get from level 1 to level 2 just by establishing sound project management practices (CMMI for Acquisition, 2007). Such practices might include tracking and communicating project status, measuring effort and outcomes, or ensuring roles and responsibilities are adequately defined.

Figure 4. Improve maturity gradually, one level at a time

These process-improvement efforts don't need to take a lot of time and effort. Matt Hotle of Gartner says that he "very seldom sees a basic process improvement effort that takes more than a couple of weeks" (interview with the author, 4/24/08).

The Software Engineering Institute notes that improvements that move a group from level 1 to level 2 may depend on “heroics” of individual staff members until the concepts of process improvements are more widely understood and supported (CMMI for Acquisition, 2007).

Don't try to skip levels

It's very tempting to try to skip from low levels of maturity to high ones without going through the intermediate steps. For example, if your organization really wants to use new technologies on the cutting edge, but your current state is that the "introduction of new technology is risky" (Level 1 from Table 1), then you would be well served to work first on ratcheting your technology-adoption capabilities up to level 2, "technology supports established, stable activities," and see how that goes.


Figure 5. Avoid the temptation to skip steps. It’s risky.

Trying to leapfrog from level 1 to level 4 or 5 doesn't give your organization time to establish the core competencies needed to succeed at high levels of expected performance. The Software Engineering Institute (SEI) says, "Because each maturity level forms a necessary foundation for the next level, trying to skip maturity levels is usually counterproductive" (CMMI Project Team, 2007). The SEI further notes that "processes without the proper foundation may fail at the point they are needed most—under stress." John P. Kotter, writing in the Harvard Business Review, notes that "Skipping steps creates only an illusion of speed and never produces a satisfying result" (Kotter, 1995).

Don't slip back

A recent book on evolution16 stated that Charles Darwin's greatest contribution was not that he thought up descent with modification (natural selection), but that his research and writing tied the idea down so firmly that it could never drift away. There's an important lesson here for process improvement: try to ensure that whatever improvements you make to software development processes become codified and formalized, so that as staff and managers come and go and teams adapt and change, your hard-won progress doesn't atrophy. Remember that every level is a foundation for the one that comes next.

Figure 6. Solidify gains in maturity so that they're permanent.

16 I read this somewhere recently but have not been able to track down the citation!


Pick projects appropriate for your level

This is related to the "don't skip steps" pattern, but is more focused on tailoring what you need to get done to what you're capable of doing. At lower levels of maturity this usually means breaking ambitious visions into smaller, less costly, and less risky sub-projects that, together, achieve the vision. This approach is harmonious with a lot of recent thinking, particularly in Web application development, and it has significant beneficial consequences for organizations at all levels of maturity. (More on this later.)

Figure 7. Pick projects appropriate for your current capability maturity level

Assign responsibility and measure, measure, measure

Matt Hotle, Gartner's CMMI expert, states that assigning responsibility for process improvement initiatives, and measuring those efforts, are among the highest-value steps an organization can take.17

Of measurement, Hotle writes:

Application people are generally terrible with measurement. This actually may be a kind statement, because we believe that fewer than 20% of application organizations have a usable measurement system. From a governance perspective, it's a sin to have a set of processes that have been defined, but have no feedback loop to understand whether the processes are doing well. (The 'Seven Deadly Sins' of Application Governance. ID Number: G00155896, 2008)

So what should you measure at lower levels of maturity? For typical museum Web development projects, start by measuring staff-hour projections, actual staff-hours spent, defects (bugs and errors), and the movement of content through development, review, and approval processes.
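As an illustration of what such measurement might look like in practice, here is a minimal Python sketch (the task names, hours, and defect counts are hypothetical, not drawn from any real project):

```python
# Minimal sketch of level-appropriate measurement: compare projected
# staff-hours with actuals and track defects per task. All data below
# are hypothetical.
from dataclasses import dataclass

@dataclass
class TaskMetrics:
    name: str
    projected_hours: float
    actual_hours: float
    defects_found: int

tasks = [
    TaskMetrics("Homepage templates", projected_hours=40, actual_hours=62, defects_found=7),
    TaskMetrics("Collection search", projected_hours=80, actual_hours=75, defects_found=3),
]

for t in tasks:
    variance = (t.actual_hours - t.projected_hours) / t.projected_hours
    print(f"{t.name}: {variance:+.0%} hours vs. estimate, {t.defects_found} defects")
# Reviewing these variances week over week is exactly the feedback
# loop that Hotle says most application organizations lack.
```

Even a spreadsheet with these four columns, reviewed on a fixed schedule, satisfies the spirit of the exercise; the tooling matters far less than the habit.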

17 Phone interview with the author, 4/24/2008


Some Practical Ways to increase process maturity

Classic mistakes avoidance

Steve McConnell, in his classic book Rapid Development: Taming Wild Software Schedules (Microsoft Press, 1996), uses the concept of Classic Mistakes to help software developers avoid commonly encountered, and repeated, errors. Classic Mistakes identify things that often go wrong with People, Processes, and Technology, and they are often related to immature work processes. Avoiding Classic Mistakes is one of the best ways to move toward successful technology development.

The following list of classic mistakes is adapted from Rapid Development (McConnell, 1996).


Classic mistakes enumerated

• Process-Oriented Mistakes

– Lack of project management plan: failure to define, up front, what project management practices will be used.

– Failure to follow through on project management plan: good plan at the start of the project, but not followed and implemented day-to-day.

– Failure to define requirements up front: team fails to define, in writing, what is to be delivered.

– Failure to accurately estimate time and resources: related to requirements gathering.

– Micromanagement: by project sponsors or managers.

– Failure to define roles and responsibilities: who is responsible for what?

– Failure to develop creative brief: lack of codified creative direction leads to stress with sponsors and partners. Related to requirements.

– Failure to empower creative team: creative team hobbled by unclear sponsorship.

– Failure to maintain project visibility: related to lack of project management plan.

– Wishful thinking: related to lack of requirements gathering, estimation.

– Overly optimistic schedules: related to wishful thinking, failure to estimate accurately.

– Insufficient risk management: known and obvious risks are not accounted for in the management plan.

– Wasted time upstream: most projects waste time at the beginning of the project.

– Insufficient quality assurance: failure to produce and follow a test plan. (Stems from failure to define requirements in advance.)

– Feature creep: related to lack of management controls.

– Insufficient management controls: project metrics lacking, deliverables unclear, visibility poor.

– Failure to produce a design: "design" in the architecture and requirements sense, not graphic design.

– Gold-plating requirements: unrealistic desire to have all bells and whistles.

– Ineffective management of contractors: related to requirements gathering, management plan.

• People-Oriented Mistakes

– Friction within team: unaddressed problem relationships within the team lower productivity and morale for the entire project team.

– Friction with customers/partners: the #2 complaint of software development teams.

– Weak personnel: the #1 complaint of software development teams.

– Reliance on heroics to complete a project: related to wishful thinking and lack of requirements, management controls, etc.

– Unrealistic expectations: "We can just code like hell and get this done." Related to lack of requirements, management controls, etc.

– Lack of effective project sponsorship: ambiguous or inconsistent direction/participation from sponsors.

– Lack of stakeholder buy-in: ramming a project down a stakeholder's throat. Related to sponsorship.

– Lack of user input: failure to maintain relationship with customers.

• Technology-Oriented Mistakes

– Switching tools or technologies in the middle of a project: false promises of productivity or performance improvements often derail projects.

– Lack of content or source-code control: developers/authors overwrite each other's documents.

– Silver-bullet syndrome: too much faith put in the benefits of a new technology, and not enough thought put into how well it would do in your organization.

– Overestimated savings from new tools or methods: organizations seldom improve in giant leaps.


In 2004 I surveyed an experienced, award-winning project team about which classic mistakes they felt were likely to occur during a software project we were initiating. The results were sobering: team members identified 26 classic mistakes that they thought had a one-in-three or greater chance of occurring during the course of the project. As a result, the top ten most-likely classic mistakes, and how to avoid them, were described in the project's management plan.

Table 2. Top-ten Classic Mistakes, from a 2004 project management plan (rank, estimated probability of occurrence, classic mistake, and action to take)

1. (68%) Lack of content or source-code control. Action: implement source-code control practices.
2. (60%) Failure to produce a design document. Action: produce a design, ex post facto, starting week of August 25th.
3. (60%) Lack of project management plan. Action: project plan v1.0 completed August 20th.
4. (60%) Failure to maintain project visibility. Action: project visibility addressed in project plan.
5. (60%) Feature creep. Action: produce a design; prioritize feature set.
6. (58%) Wasted time upstream. Action: the cow is already out of the barn on this one!
7. (57%) Reliance on heroics to complete a project. Action: define roles and responsibilities; emphasize accurate estimation; implement management controls to track progress and anticipate delays.
8. (53%) Friction within team. Action: address proactively with team members and management.
9. (53%) Failure to accurately estimate time and resources. Action: related to lack of design; having a project management plan should help; managers must ensure staff accurately defines and estimates tasks.
10. (50%) Failure to define requirements up front. Action: create requirements doc, ex post facto.
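A survey like this is straightforward to tabulate. The following Python sketch (with hypothetical mistakes and respondent estimates, not the 2004 team's actual data) averages each mistake's probability estimates across respondents and ranks the results, roughly as was done for Table 2:

```python
# Sketch: aggregate team members' probability estimates for each
# classic mistake and rank the riskiest. Data are hypothetical.
from statistics import mean

# mistake -> list of probability estimates (0-1), one per respondent
survey = {
    "Lack of content or source-code control": [0.70, 0.60, 0.75],
    "Feature creep": [0.50, 0.60, 0.70],
    "Wishful thinking": [0.30, 0.40, 0.20],
}

ranked = sorted(survey.items(), key=lambda kv: mean(kv[1]), reverse=True)
for rank, (mistake, estimates) in enumerate(ranked, start=1):
    print(f"{rank}. {mistake}: {mean(estimates):.0%} estimated probability")
```

The ranking itself is less important than the follow-through: each high-probability mistake should map to a named countermeasure in the management plan, as in Table 2.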

Use Spiral Project Plans

If you're not familiar with any particular project management framework, you might want to start with a Spiral Project Plan. Spiral project plans are described by Steve McConnell as an iterative project-management approach that is particularly appropriate when you're not exactly sure of scope and functionality at the start of a project. (This is often the case with small-scale Web development projects.)

Spiral project plans are organized around loops of increasing effort and complexity. Initial loops are brief: subsequent loops last longer, take more effort, and have more impact. Each loop includes activities where requirements are described and analyzed, some tangible product is created, results are evaluated, decisions are made, and the next loop is planned. In early loops the products created may be simple purpose statements or paper prototypes that are tested quickly on sample users. Later loops may involve significant blocks of code and functionality that are tested with automated test scripts or in usability labs, or, the project may transition into some other project-management framework (McConnell, 1996).

Figure 8. Spiral Project Plan, sometimes called "the cinnamon roll" because of the distinctive shape of the spiral
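As a rough sketch of the loop structure described above (the deliverables and durations are hypothetical, and McConnell's spiral model is richer than this), each pass repeats the same activities at a larger scale, ending with a checkpoint:

```python
# Sketch of a spiral plan: every loop repeats the same activities at
# larger scale, ending with a go/no-go checkpoint. Loop contents are
# hypothetical examples for a small museum Web project.
loops = [
    {"deliverable": "One-page purpose statement", "weeks": 1},
    {"deliverable": "Paper prototype tested on five sample users", "weeks": 2},
    {"deliverable": "Working search page with real collection data", "weeks": 4},
]

for n, loop in enumerate(loops, start=1):
    print(f"Loop {n} ({loop['weeks']} weeks): {loop['deliverable']}")
    for activity in ("analyze requirements", "build", "evaluate results", "plan next loop"):
        print(f"  - {activity}")
    # Stakeholders decide at this point whether the project is still
    # on course before the next, larger loop begins.
```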

The beauty of spiral project plans is apparent in three ways. First, they provide a flexible, lightweight process that practically any team of adults can implement: it doesn't take a Project Management Institute certified engineer to work this way. Second, teams can use this process to structure projects at their earliest moments of planning, well before funds are committed and programmers are hired, when rational processes can have their greatest effect. Third, they provide a mechanism for reality checks at the end of each loop, where stakeholders can provide input on whether the project is still aligned with goals. This enables teams to make adjustments before outcomes are set in concrete.

Roles and Responsibilities

Most tasks that fail to get done fail because of unclear or nonexistent ownership, and friction within projects is frequently caused by ambiguous responsibilities. Conversely, tasks that have clear owners are likely to get done. One of my favorite techniques for making basic project management improvements is to get project teams to define roles and responsibilities formally, before work begins in earnest.

I developed the following list of role definitions to clarify roles-and-responsibilities for Web projects at the Smithsonian.


Excerpt from Roles and Responsibilities Definition (Edson, Smithsonian)

• Managerial Roles

– Sponsor

• Internal client(s) for whom we’re producing the project. Defines goals. Supervises Project Owner and provides resources and direction to Project Owner and team. Provides “head above the trees” perspective of overall effort.

– Project Owner

• Responsible for:

– high-level organization and execution of project
– requirements analysis
– creative brief
– interface with project sponsors
– team selection
– high-level definition of project lifecycle
– monitoring and periodic reviews of content/functionality over entire project lifecycle

• Usually reports to the Project Management Team


Sample Roles and Responsibilities Template

Role and Responsibility Assignments

Roles are assigned to individuals for the purpose of a) ensuring that all roles have someone to play them, and b) promoting clarity for the purpose of project management. Many team members will have more than one role. In general, individuals are encouraged to participate/collaborate/contribute beyond their strict role assignments! (The table is partially filled out as an example. Columns are team members: Amy, Bob, Cathy, Dennis. Add team members as appropriate.)

Role                                  Amy   Bob   Cathy   Dennis

Managerial Roles
  Sponsor                              X
  Project Owner                              X
  Project Management Team              X     X
  Project Manager                                  X
  Technical Director                                       X
  Quality Control Manager

Content Production Roles
  Partner
  Content Provider
  Creative Director
  Lead Writer/Editor
  Creative Producer
  Writer/Editor
  Graphics Producer

Technical Production Roles
  Graphic Designer
  Graphical User Interface Designer
  Information Architect
  Software Analyst
  Programmer
  Database Designer
  Image Production
  System Architect
  Web Server Administrator
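As a minimal illustration of why a filled-in matrix like this earns its keep, the following Python sketch flags roles that have no owner (the assignments echo the hypothetical example above); unowned roles are where tasks fall through the cracks:

```python
# Sketch: flag roles that no team member has been assigned. The
# assignments are the hypothetical example from the template above.
assignments = {
    "Sponsor": ["Amy"],
    "Project Owner": ["Bob"],
    "Project Management Team": ["Amy", "Bob"],
    "Project Manager": ["Cathy"],
    "Technical Director": ["Dennis"],
    "Quality Control Manager": [],
}

for role, owners in assignments.items():
    if not owners:
        print(f"UNASSIGNED ROLE: {role}")
# Output: UNASSIGNED ROLE: Quality Control Manager
```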


Transparency through standardized reporting

Many projects are only transparent at their inception and completion. The goal of standardized reporting is to give managers and participants insight into project status and direction throughout, so they can make decisions and manage.

Some examples of simple project reporting methods are shown below.

Example 1: a weekly project-status PowerPoint file for general consumption by stakeholders. This template was filled out weekly by the project manager. The PowerPoint format encouraged brevity and focus on the most important points.

Figure 9. Two slides showing project status for a Website redesign project

Example 2: a bi-weekly status report for parallel projects. This Microsoft Word template was used by 14 senior managers to report on the status of their projects for the reopening of the Smithsonian American Art Museum in 2006. Each manager had their own document in a network folder, and the individual documents were rolled up into a "master" document (using Word's linking feature) for a bi-weekly progress-review meeting.


Figure 10. Bi-weekly status reports

Example 3: weekly meeting minutes emphasizing assignments and decisions made. I used this type of report for a network installation project. Note the use of the term "Action Required" to call attention to specific assignments; the creation and tracking of action items is a highly effective process improvement (a minimal tracking sketch follows Figure 11). These reports were typed in Microsoft Word's Outline view as the meeting progressed. Reports were distributed to team members and uploaded to a project extranet site.

Figure 11. Weekly meeting minutes emphasizing actions required and decisions made
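To show why action items are such a high-leverage habit, here is a minimal Python sketch of action-item tracking (the names, dates, and fields are hypothetical): anything with an owner and a due date can be surfaced automatically when it slips.

```python
# Sketch of action-item tracking from meeting minutes: every item has
# an owner and a due date, and overdue items surface automatically.
# Names and dates are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionItem:
    description: str
    owner: str
    due: date
    done: bool = False

items = [
    ActionItem("Confirm switch delivery date with vendor", "Amy", date(2008, 5, 2)),
    ActionItem("Draft cable-labeling scheme", "Bob", date(2008, 5, 9), done=True),
]

today = date(2008, 5, 5)
for item in items:
    if not item.done and item.due < today:
        print(f"OVERDUE: {item.description} (owner: {item.owner}, due {item.due})")
```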


Governance Structure

Many organizations lack a formal and uniformly understood mechanism for gathering input on proposed technology projects and determining which should be submitted for consideration and approval by senior decision makers. It doesn't take a lot of process to be effective in this area—"just enough," as Gartner says. At the Smithsonian American Art Museum, I instituted a simple Web site proposal form that asked people initiating projects to answer basic questions about their goals and processes.

This process had several beneficial outcomes. First, it ensured that everyone involved in a project agreed on the project's scope and assumptions before it began. Second, it forced stakeholders to discuss priorities, content, direction, and timing before resources were committed. Third, it elevated the discussion of previously under-valued processes such as roles-and-responsibilities and maintenance lifecycles. And finally, it provided a single, transparent gateway for all recommendations going to the Director. Was the process perfect? No. But it was "just enough" process to allow Web-development projects to begin to be managed, rather than run ad hoc.

Sample Document: Web-site proposal form.

Purpose

The purpose of this form is to provide an overview of proposed objectives and production/maintenance lifecycles for new Web content. This form requires information needed to support the editorial decision-making process. A completed form serves as a contract between project sponsors, team members, and SAAM decision makers.

Process

This section is written with the project manager/project leader in mind.

1. Somebody generates an idea and you take ownership of it: you are the project leader. You discuss the idea with potential partners, team members, and SAAM management. You define a project and walk it through the approval process.

2. You discuss the idea/project at the SAAM Web Weekly, and (optionally) at the SAAM Web Quarterly.

3. If the idea passes through informal discussions, you formalize the creative and management aspects of the project and fill out this form.

4. You present the project and this form to the SAAM Web Quarterly and lead a discussion. You can review simple projects via e-mail; more complex projects require a meeting of the Web Quarterly, and may require several meetings.

5. The SAAM Web Quarterly approves the idea (or engages you in an iterative process of questions, comments, and review) and makes a recommendation to the Director.

6. The Director approves the idea.

7. You begin the next stages of planning and execution. From this point on, project management is handled at a detailed level by a Project Management Plan.

What kinds of projects should use this process?

It is hard to describe this categorically. We'll be using common sense, case by case.

Web Site Proposal Form


1. Who will be leading this idea through the approval process?
2. Who will be the project sponsor?
3. Who will be the project owner?
4. What other project "roles" are defined?
5. What is the title of the idea?
6. Please give an overview of the idea as you would pitch it to the Director and the Web Quarterly.
7. What deadlines are associated with this idea?
8. What partners (internal or external) will be involved?
9. Please describe the 3-year lifecycle of this idea.
10. What staff resources will be required for the 3-year lifecycle?
11. What financial resources will be required for the 3-year lifecycle?
12. What technological resources will be required for the 3-year lifecycle?

Consequences and phenomena

Three consequences and phenomena related to the pursuit of process improvements for museum-technology projects are worth noting: capability mismatches, the difficulty of getting buy-in for governance and control efforts, and what "lightweight" Web 2.0 software development practices have to offer museums.

Capability mismatch

Capability mismatch describes a situation in which different groups on a project have incompatible processes or radically different levels of process maturity. For example, capability mismatches often occur when small to medium-sized museums with few defined processes and not much project management expertise hire accomplished outside technology companies. Successful technology companies tend to be very process- and results-oriented and often have staff with formal training and advanced certification in project management, software development, measurement and analysis, and business-process engineering. These people speak a different language than most museum teams, which is not to say that they are always right, but the disconnect between intuitive decision-making cultures and structured business cultures can cause problems.

Capability mismatches aren’t found only in internal-external relationships. Mismatches are also found between work groups within museums. In mature organizations it would be the responsibility of a Project Management Office (PMO) to establish standard practices and resolve mismatches, but museum technology projects seldom benefit from this kind of function.


Figure 12. Capability maturity mismatches create a disruptive shearing effect

In a mismatched engagement, technology vendors working with museum clients often see behavior on the museum side such as

conflicting institutional voices/opinions (client doesn’t speak with one voice) adversarial relationships (“I don’t feel like we’re on the same team”) wrong people in key positions unrealistic expectations content-approval deadlines are not met undefined decision-making processes little or no measurement of key performance indicators insufficient staffing for the task at hand completed projects are not maintained after delivery

I have interviewed vendors of all sizes to gain insight into this phenomenon. Most say the things they want most from their museum clients are a unified decision-making process and a willingness of senior managers to "hear what's realistic and act accordingly" when confronted with evidence of flawed internal processes or unrealistic expectations.

I have seen more than one museum technology project struggle, under-perform, or fail because of capability mismatches, and this is an area where vendors and clients need to help each other out.

Figure 13. Mismatches are caused by differences in the way groups approach work


Governance and Control

Many work groups and departments balk at the idea of new rules, procedures, controls, or governance structures being imposed on them. As one museum professional I interviewed put it: "Museum workers often have a kind of entrenched eccentricity that treats all efforts to institute standard procedures as infringements on creativity" (anonymous interview, 4/27/2008). And museums are not alone. Matt Hotle writes, "Most [software] development organizations seem to have a clear avoidance mechanism when it comes to 'process.' However, using a 'just enough' approach to processes enables an organization's behaviors to match its goals" ('Just Enough Process' for Applications, 2007).

Gartner's "just enough" approach encourages managers to keep rules and governance to the absolute minimum required to help get products completed the "right" way, and I have found that use of the "just enough" phrase itself sends a positive and soothing message to concerned stakeholders. Governance and control efforts need internal marketing and wise stewardship to get buy-in and acceptance, but ultimately governance and control will be accepted by teams when they see that the new rules and procedures benefit their work, reduce errors and rework, and free them up to perform more creative and rewarding tasks. Most wisdom on this topic asserts that a light hand, "more carrots, fewer sticks" (positive incentives rather than the threat of punishment), is the most successful way to bring governance structures into an organization.

Web 2.0: Lightweight Development Frameworks

The way Web sites are built and improved has changed dramatically in the last few years, and these changes are good for small organizations wanting to have a greater impact online. In the client-server and mainframe computing eras, software applications were meticulously planned in excruciating detail months or years ahead of delivery; the final software product worked for the task it was designed for (or not), and that was more or less the end of the story. If requirements changed or new opportunities arose, not much could be done in the short term, and end users had little or no opportunity to add or change the product's functionality to suit their own distinctive needs. Making software this way required large teams working at high levels of process maturity. It made Microsoft rich in the '80s and '90s, but there are new models now.

There’s a phenomenal amount of hope and hype around the term Web 2.0, which is typically associated with social networking Web sites, tagging, and user-created content. But publisher Tim O’Reilly sees something deeper going on here in the way that these kinds of sites are being developed. In his manifesto on the subject, What is Web 2.0: Design Patterns and Business Models for the Next Generation of Software,18 O’Reilly describes how powerful, effective, and wildly profitable Web applications can (and should) be built using lightweight, rapid-development processes and continuous improvement and innovation fueled by interaction with (and contributions by) customers. In contrast with previous practices, these sites go public with basic functionality and are constantly modified, adjusted, and expanded according to what works and what doesn’t.

18 http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html, accessed 4/27/2008


O’Reilly writes,

Cal Henderson, the lead developer of Flickr, recently revealed that they deploy new builds up to every half hour. This is clearly a radically different development model! While not all web applications are developed in as extreme a style as Flickr, almost all web applications have a development cycle that is radically unlike anything from the PC or client-server era. It is for this reason that a recent ZDnet editorial concluded that Microsoft won't be able to beat Google: "Microsoft's business model depends on everyone upgrading their computing environment every two to three years. Google's depends on everyone exploring what's new in their computing environment every day."

I wouldn’t argue that Google doesn’t require mature software development processes—quite the contrary, in fact—but lightweight framework models do demonstrate that valuable software can be built in small, manageable pieces by small, manageable teams. The acceptance and mainstreaming of free and open source software further lowers the barriers to entry: a team of two or three developers can produce flexible and high-quality Web applications in small iterative steps at low cost and low risk. (Or museums can opt out of software development altogether and adapt the same blog, wiki, tagging, and file-sharing software that’s available to the general public right now.)
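The mechanics of this model are simple enough to sketch. Below is a minimal, hypothetical release loop in Python, in the spirit of the small-batch deployments O’Reilly describes; the test, deploy, and rollback commands are placeholders for whatever tooling a team actually uses, not a description of Flickr’s (or anyone else’s) real pipeline.

```python
import subprocess
import sys

def run(cmd):
    """Run a shell command; return True if it exits cleanly."""
    return subprocess.call(cmd, shell=True) == 0

def release():
    """One small, low-risk release: test, deploy, verify, roll back on failure.

    Every command below is a stand-in for a team's real build,
    deploy, and smoke-test tooling.
    """
    if not run("python -m pytest tests/"):            # continuous testing
        sys.exit("Tests failed; nothing was deployed.")
    if not run("rsync -a build/ www@host:/srv/app"):  # small incremental deploy
        sys.exit("Deploy failed.")
    if not run("python smoke_test.py"):               # verify the live site
        run("rsync -a previous/ www@host:/srv/app")   # cheap rollback
        sys.exit("Smoke test failed; rolled back.")

if __name__ == "__main__":
    release()  # cheap enough to run many times a day
```

The point is less the specific commands than the shape of the cycle: each release is small, tested, and cheap to undo, which is what makes deploying “up to every half hour” a survivable practice.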

A full discussion of the Web 2.0 platform is beyond the scope of this paper, but Tim O’Reilly’s article is required reading for anybody thinking seriously about the future of software development.

Real World Examples

The following brief case studies give concrete examples of how process maturity and an understanding of capability maturity models can affect the direction and outcome of projects.

Capability Mismatch: Handheld Multimedia Guide

Overview

In 2004 the Smithsonian issued a request for proposals (RFP) for a pilot project for multimedia handheld tours at six museums, with the hope that the successful system would eventually be extended to all Smithsonian museums. (Four Smithsonian museums were in the process of implementing their own handheld-guide pilot projects or had just completed them.)

A press release described the intended functionality of the device:

Visitors to the Smithsonian will have the option to rent a lightweight, wireless handheld device that is the heart of the SIguide solution. With the SIguide handheld device, Smithsonian patrons will be able to take preset tours or customized tours that match their interests; view multimedia content such as documents, photos, and audio and video clips; locate and be directed to exhibits, landmarks or other members of their group; communicate with someone or everyone in their group; create a schedule of activities and receive reminders when events are due to begin; save content, messages, sketches and notes to a scrapbook they subsequently can access via the Web; and much more.19

The RFP stipulated that the Smithsonian would contribute content and staff-hours to the project, but no capital funds: development costs would be borne by the vendor and recouped by sharing revenue from device rentals. Technical specifications and requirements were assembled from needs and wish-lists submitted by participating Smithsonian museums. The project had executive-level sponsorship and high visibility throughout the Institution.

The contract was awarded to a small startup with a compelling vision. The company’s founders demonstrated a history of successful involvement in museum content/technology deployments, but no track record of delivering the technology required by this project on this scale. The technology specification included installing multiple wireless networks; developing a system of automated kiosks to distribute, charge, and synchronize the inventory of handheld devices; building a complex database infrastructure; integrating with e-commerce and customer databases; and deploying “wireless positioning” technology to relate the moment-to-moment location of each handheld device to maps of artifacts and related content.

Process Maturity

To buffer the project from risk, the awardee created a project management office (PMO) through a sub-contract with a local technology company that had a highly mature project management group. The intent of the PMO was to provide a system of checks and balances that could match the realities of day-to-day execution and decision making with the idealized vision of the project. However, there was a significant capability mismatch between the startup’s culture and that of the PMO, and after several weeks the awardee disbanded the PMO (with the Smithsonian’s approval). No similarly mature project-management expertise was established to replace what was lost.

Technology development was not vetted through previously defined processes but was fast-tracked. (Part of the rationale for this was, perhaps, the fact that the Smithsonian did not have capital investments at risk.) Furthermore, project success was contingent on several critical assumptions holding true: a) the accuracy and performance of the wireless positioning system, b) the performance and reliability of the automated kiosks, c) revenue projections, and d) operating costs. These assumptions were not rigorously tested, and risk-mitigation or contingency plans, if they existed, were not well known.
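One lightweight way to keep assumptions like these from going untested is an explicit risk register that pairs each assumption with a planned test and a contingency. The sketch below (in Python) illustrates the idea; the entries paraphrase the assumptions above, but the tests and contingencies are purely hypothetical, not the project’s actual plans.

```python
# A minimal risk register: every critical assumption gets a test
# and a fallback. Tests and contingencies here are illustrative only.
risks = [
    {"assumption": "wireless positioning is accurate enough for wayfinding",
     "test": "pilot the positioning system in a single gallery first",
     "contingency": "fall back to visitor-entered stop numbers"},
    {"assumption": "automated kiosks reliably charge and sync devices",
     "test": "soak-test one kiosk under real visitor traffic",
     "contingency": "staff a manual distribution desk"},
    {"assumption": "rental revenue covers the vendor's development costs",
     "test": "compare projections against the four earlier museum pilots",
     "contingency": "renegotiate the revenue share or reduce scope"},
]

for risk in risks:
    print(f"ASSUMPTION:  {risk['assumption']}")
    print(f"  TEST:        {risk['test']}")
    print(f"  CONTINGENCY: {risk['contingency']}\n")
```

Even a table this simple forces the question the project never rigorously answered: what happens if the assumption turns out to be wrong?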

Outcome and Lessons Learned

This was a complex project and a full description of its promise and flaws is beyond the scope of this paper and the knowledge of its author. But it’s important to try to learn from what went wrong, and two process-maturity mistakes are apparent. Ineffective process controls allowed stakeholders to define a project that was beyond the process maturity level of both the vendor and the Institution, and the lack of a PMO (caused by a capability mismatch) allowed early warning signs to go unnoticed or insufficiently addressed. Given the Smithsonian’s highly mature content-creation processes and the recognizably bleeding-edge nature of this project’s technology and business models, a better approach might have been to gradually increase investment in successful museum-based pilot projects to test theories about audience acceptance, technology, and operations in a more controlled manner. This kind of evolutionary roadmap has been tried successfully at several other institutions.

19 Press release, labeled as coming from the Smithsonian, http://www.lynx-net.net/web/SIguide.php, accessed 4/26/2008. Also see From the Secretary: Guiding Light at http://www.smithsonianmag.com/travel/10013371.html, accessed 4/26/2008

Lightweight Software Development: SAAM’s Eye Level Blog

Overview

In May of 2005 the Smithsonian American Art Museum’s (SAAM’s) New Media Initiatives department proposed creating a blog for SAAM’s reopening, which was fourteen months away. The New Media team knew it needed to establish a new Web site to support the outreach goals of reopening and build buzz leading up to opening day, but the museum’s normal content-creation teams were pinned down with day-to-day tasks pertaining to the bricks-and-mortar reopening and money was tight. In addition, museum managers realized that other high-visibility projects (such as Web and kiosk applications for the Luce Foundation Center for American Art) would leave little capacity for complex software and content development efforts.

Process Maturity

The blog was identified as an achievable objective specifically because it required a low level of process maturity, had a very small budget impact, and had a low risk of failure (and modest consequences if it did fail). Through a structured governance process the project’s goals, risks, roles and responsibilities, and project management methodology were articulated and reviewed. Some project stakeholders were uncomfortable with unknowns in the content-creation and editorial process, so the project was approved for a trial run in which the blog was published on a password-protected page available only to SAAM employees. After a short period running internally, stakeholders became comfortable with the production processes and the blog was approved for external publication.
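The paper does not record how SAAM implemented its employees-only page, but the mechanism can be as simple as HTTP basic authentication in front of the trial site. Here is a minimal illustrative sketch using Python’s standard library; the credentials, port, and directory layout are invented for the example.

```python
import base64
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Hypothetical staff-only credentials for the internal trial run.
STAFF_CREDENTIALS = base64.b64encode(b"saam-staff:trial-password").decode()

class StaffOnlyHandler(SimpleHTTPRequestHandler):
    """Serve the trial blog (files in the current directory) only to
    visitors who present the staff username and password."""

    def do_GET(self):
        auth = self.headers.get("Authorization", "")
        if auth == "Basic " + STAFF_CREDENTIALS:
            super().do_GET()          # credentials match: serve the page
        else:
            self.send_response(401)   # otherwise, prompt for a password
            self.send_header("WWW-Authenticate", 'Basic realm="SAAM trial"')
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), StaffOnlyHandler).serve_forever()
```

Whatever the actual mechanism, the design point is the same: the gate let stakeholders watch real production processes at work before committing to a public launch.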

Lessons Learned

This approach was effective for SAAM: by matching the project’s scope to the team’s capacity and process maturity, and by using a low-risk internal trial run to build stakeholder confidence, the museum launched a new public-facing site at minimal cost and risk.

Matching Goals to Capacity and Maturity: SAAM “Findability” Project

Overview

The creation of new Web sites for the Smithsonian American Art Museum’s (SAAM’s) reopening in 2006 also created problems with navigation, branding, and information architecture. Rather than initiate a redesign, SAAM chose to take an iterative approach to Web site improvement by focusing on findability (making SAAM’s Web content easier to find) and structuring work so that results would be achieved through a series of short, low-risk sub-projects (rather than one large, monolithic project as SAAM had done in the past).

Process Maturity

SAAM was extremely focused on conducting a controlled and managed development process that avoided the pitfalls and distractions of traditional Web redesign projects. The SAAM Web team was aware of its capability and maturity weaknesses and did not have great confidence in its capacity to manage a large Web redesign. The team was more comfortable and experienced with smaller projects of two or three months’ duration, so the RFP it issued explicitly required vendors to structure work into short sub-projects and to use processes similar to the Spiral Project Plan. (See the text box below.)

Excerpt from RFP

C.1.2. STRUCTURE WORK TO REDUCE RISK

Proposed work plans shall be designed to achieve desired results through a series of contained, low-risk sub-projects (as opposed to a monolithic, all-or-nothing methodology). Methodologies should include continuous testing, measurement, assessment, and refinement. As the saying goes, “teach us to fish” rather than build us some fancy boats and go away. This is especially important if parts of the work plan include community-building or visitor-created content.

In addition, the RFP intentionally avoided the term “redesign” and instead focused attention on making measurable improvements to end-user perceptions of findability, including the performance of search engines, information architecture, labeling, and overall usability, but only if those facets could be tied back to findability. Use of the word “redesign” was discouraged in project meetings, documents, and discussion.
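Structuring work this way can be made concrete even at the planning stage. The sketch below (Python) shows one way to express a plan as a series of short, measurable sub-projects; the names, durations, and metrics are hypothetical stand-ins, not SAAM’s actual work plan.

```python
from dataclasses import dataclass

@dataclass
class SubProject:
    """One contained, low-risk iteration of the findability work."""
    name: str
    duration_weeks: int
    success_metric: str  # how improvement will be measured

# A spiral-style plan: each cycle is tested, measured, and assessed
# before the next is committed to. All entries are illustrative.
plan = [
    SubProject("Baseline search-engine performance", 4, "test-user find rate"),
    SubProject("Improve page titles and labeling", 6, "click-through from search"),
    SubProject("Refine navigation and information architecture", 8,
               "task completion time"),
]

for sub in plan:
    print(f"{sub.name}: {sub.duration_weeks} weeks, "
          f"measured by {sub.success_metric}")
    # A review happens here; later sub-projects can be re-scoped or
    # dropped based on what the measurements show.
```

Because every sub-project carries its own success metric, “measurable improvements to findability” stops being a slogan and becomes a series of pass/fail checkpoints.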

Lessons Learned

At this writing the project has not concluded and its outcomes are not yet clear.

Conclusion

[Figure: “The Road to Efficient Development Town” map, redrawn from Rapid Development, Steve McConnell, Microsoft Press, 1996]

References

Carroll, Sean B. 2005. Endless Forms Most Beautiful: The New Science of Evo Devo. New York: Norton, 2005.

CMMI Project Team. 2007. CMMI for Acquisition, Version 1.2: Improving Processes for Acquiring Better Products and Services. Pittsburgh: Software Engineering Institute, Carnegie Mellon University, 2007.

—. 2006. CMMI for Development, Version 1.2. Pittsburgh: Software Engineering Institute, Carnegie Mellon University, 2006.

Edson, Michael. 2006. Data Access Strategy. Museums and the Web: Conference Proceedings. [Online] March 1, 2006. [Cited: April 22, 2008.] http://www.archimuse.com/mw2006/papers/edson/edson.html.

General Accounting Office. 1992. IMTEC-93-13 Mission-Critical Systems: Defense Attempting to Address Major Software Challenges. [Online] 1992. [Cited: April 22, 2008.] http://archive.gao.gov/d36t11/148399.pdf.

Hotle, Matthew. 2007. 'Just Enough Process' for Applications. ID Number: G00145561. s.l.: Gartner, Inc., 2007.

—. 2007. The Little Big Application Organization: How a Big Organization Can Still Remain Small. ID Number: G00146962. s.l.: Gartner, Inc., 2007.

—. 2008. The 'Seven Deadly Sins' of Application Governance. ID Number: G00155896. s.l.: Gartner, Inc., 2008.

Jones, Peter. 2008. We Tried To Warn You, Part 2: Failure is a matter of timing. Boxes and Arrows. [Online] March 26, 2008. [Cited: April 22, 2008.] http://www.boxesandarrows.com/view/we-tried-to-warn-you32.

Keen, Andrew. 2007. The Cult of the Amateur: How today's internet is killing our culture. New York: Doubleday, 2007.

Kopcho, Joanne and Hotle, Matthew. 2008. CMMI Remains the Standard for Software Process Frameworks. ID Number: G00156315. Gartner.com. [Online] April 18, 2008. [Cited: April 22, 2008.] http://gartner.com.

Kotter, John P. 1995. Leading Change: Why Transformation Efforts Fail. Harvard Business Review, Mar/Apr 1995, Vol. 73, Issue 2, pp. 59-67.

McConnell, Steve. 1996. Rapid Development: Taming Wild Software Schedules. Redmond: Microsoft Press, 1996.

O'Reilly, Tim. 2005. What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software. O'Reilly.com. [Online] September 30, 2005. [Cited: April 24, 2008.] http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html.

2007. Universal Music CEO Doug Morris Speaks, Recording Industry in Even Deeper Shit Than We Thought. Apropos of Nothing. [Online] November 26, 2007. [Cited: April 22, 2008.] http://nymag.com/daily/entertainment/2007/11/universal_music_ceo_doug_morris.html.

2007. Universal's CEO Once Called iPod Users Thieves. Now He's Giving Songs Away. Wired.com. [Online] November 27, 2007. [Cited: April 22, 2008.] http://www.wired.com/entertainment/music/magazine/15-12/mf_morris.

Vivendi Board Biography of Doug Morris. [Online] [Cited: April 22, 2008.] http://www.vivendi.com/corp/en/governance/dir_morris.php.
