Organization at the Limit


  • Organization at the Limit

  • Organization at the Limit

    Lessons from the Columbia Disaster

    EDITED BY WILLIAM H. STARBUCK

    AND MOSHE FARJOUN

  • © 2005 by Blackwell Publishing Ltd except for editorial material and organization
    © 2005 by William H. Starbuck and Moshe Farjoun

    BLACKWELL PUBLISHING
    350 Main Street, Malden, MA 02148-5020, USA
    9600 Garsington Road, Oxford OX4 2DQ, UK
    550 Swanston Street, Carlton, Victoria 3053, Australia

    The right of William H. Starbuck and Moshe Farjoun to be identified as the Authors of the Editorial Material in this Work has been asserted in accordance with the UK Copyright,

    Designs, and Patents Act 1988.

    All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by the UK Copyright, Designs, and Patents Act 1988, without the

    prior permission of the publisher.

    First published 2005 by Blackwell Publishing Ltd

    1 2005

    Library of Congress Cataloging-in-Publication Data

    Organization at the limit : lessons from the Columbia disaster / edited by William H. Starbuck and Moshe Farjoun.

    p. cm.
    Includes bibliographical references and index.

    ISBN-13: 978-1-4051-3108-7 (hardback : alk. paper)
    ISBN-10: 1-4051-3108-X (hardback : alk. paper)

    1. Columbia (Spacecraft)--Accidents. 2. Corporate culture--United States--Case studies.
    3. Organizational behavior--United States--Case studies. 4. United States. National
    Aeronautics and Space Administration. I. Starbuck, William H., 1934-  II. Farjoun, Moshe.

    TL867.O74 2005
    363.1240973--dc22
    2005006597

    A catalogue record for this title is available from the British Library.

    Set in 10/12½pt Rotis Serif by Graphicraft Limited, Hong Kong

    Printed and bound in the United Kingdom by TJ International, Padstow, Cornwall

    The publisher's policy is to use permanent paper from mills that operate a sustainable forestry policy, and which has been manufactured from pulp processed using acid-free and elementary chlorine-free practices. Furthermore, the publisher ensures that the text paper and cover board used have met acceptable environmental accreditation standards.

    For further information on Blackwell Publishing, visit our website:

    www.blackwellpublishing.com


  • Contents

    Notes on Contributors viii

    Preface xvii
    Sean O'Keefe

    Part I Introduction 1

    1 Introduction: Organizational Aspects of the Columbia Disaster 3
    Moshe Farjoun and William H. Starbuck

    Synopsis: NASA, the CAIB Report, and the Columbia Disaster 11
    Moshe Farjoun and William H. Starbuck

    Part II The Context of the Disaster 19

    2 History and Policy at the Space Shuttle Program 21
    Moshe Farjoun

    3 System Effects: On Slippery Slopes, Repeating Negative Patterns, and Learning from Mistake 41

    Diane Vaughan

    4 Organizational Learning and Action in the Midst of Safety Drift: Revisiting the Space Shuttle Program's Recent History 60

    Moshe Farjoun

    5 The Space Between in Space Transportation: A Relational Analysis of the Failure of STS-107 81

    Karlene H. Roberts, Peter M. Madsen, and Vinit M. Desai


  • Part III Influences on Decision-Making 99

    6 The Opacity of Risk: Language and the Culture of Safety in NASA's Space Shuttle Program 101

    William Ocasio

    7 Coping with Temporal Uncertainty: When Rigid, Ambitious Deadlines Don't Make Sense 122

    Sally Blount, Mary J. Waller, and Sophie Leroy

    8 Attention to Production Schedule and Safety as Determinants of Risk-Taking in NASA's Decision to Launch the Columbia Shuttle 140

    Angela Buljan and Zur Shapira

    Part IV The Imaging Debate 157

    9 Making Sense of Blurred Images: Mindful Organizing in Mission STS-107 159

    Karl E. Weick

    10 The Price of Progress: Structurally Induced Inaction 178
    Scott A. Snook and Jeffrey C. Connor

    11 Data Indeterminacy: One NASA, Two Modes 202
    Roger Dunbar and Raghu Garud

    12 The Recovery Window: Organizational Learning Following Ambiguous Threats 220

    Amy C. Edmondson, Michael A. Roberto, Richard M.J. Bohmer, Erika M. Ferlins, and Laura R. Feldman

    13 Barriers to the Interpretation and Diffusion of Information about Potential Problems in Organizations: Lessons from the Space Shuttle Columbia 246

    Frances J. Milliken, Theresa K. Lant, and Ebony N. Bridwell-Mitchell

    Part V Beyond Explanation 267

    14 Systems Approaches to Safety: NASA and the Space Shuttle Disasters 269

    Nancy Leveson, Joel Cutcher-Gershenfeld, John S. Carroll, Betty Barrett, Alexander Brown, Nicolas Dulac, and Karen Marais

    15 Creating Foresight: Lessons for Enhancing Resilience from Columbia 289
    David D. Woods

    16 Making NASA More Effective 309
    William H. Starbuck and Johnny Stephenson


  • 17 Observations on the Columbia Accident 336
    Henry McDonald

    Part VI Conclusion 347

    18 Lessons from the Columbia Disaster 349
    Moshe Farjoun and William H. Starbuck

    Index of Citations 364

    Subject Index 370


  • Notes on Contributors

    Betty Barrett is currently a Research Scientist with the Massachusetts Institute of Technology. Before going to the Massachusetts Institute of Technology she worked on the faculty of Michigan State University's School of Industrial Relations and Human Resource Management. Her research interests include the impact of instability on workers in the aerospace industry, globally dispersed teams, system safety, workplace knowledge creation, and organizational learning. She has published work on aerospace workforce and employment, team-based work systems, and alternative dispute resolution, and is co-author of Knowledge-Driven Work (Oxford University Press, 1998).

    Sally Blount is the Abraham L. Gitlow Professor of Management at the Leonard N. Stern School of Business, New York University. She focuses on the study of managerial cognition and group behavior and is best known for her research in the areas of negotiation, decision-making, and time. Her research has been published in a wide variety of psychology and management journals, including Academy of Management Review, Administrative Science Quarterly, Journal of Personality and Social Psychology, Organizational Behavior and Human Decision Processes, Psychological Bulletin, and Research in Organizational Behavior. Dr. Blount is currently writing a book entitled Time in Organizations.

    Richard M.J. Bohmer is a physician and an Assistant Professor of Business Administration at Harvard University. His research focuses on the management of clinical processes and the way in which health-care teams learn to improve outcomes, prevent error, and reduce adverse events. He has studied catastrophic failures in health care, the adoption of new technologies into medical practice, and more recently the way in which health-care delivery organizations deal with custom and standard operations concurrently. He holds a medical degree from the University of Auckland, New Zealand, and an MPH from the Harvard School of Public Health.

    Ebony N. Bridwell-Mitchell is a doctoral candidate at New York University's Stern School of Business in the Department of Management and Organizations. Her research focuses on the effects of social assessments and influence processes at group, organizational, and inter-organizational levels. Her most recent project is a four-year NSF-funded study that examines how the social dynamics of the professional community in New York City public schools affect organizational change. In addition to training as an organizational scholar, she has a Master's degree in public policy from the Harvard John F. Kennedy School of Government and a BA, summa cum laude, from Cornell University in American policy studies. She has over ten years' experience in educational research, consulting, and practice in organizations such as the US Department of Education, the Peruvian Department of the Interior, the Navajo Nation Tribal (Diné) College, and the New York City Department of Education.

    Alexander Brown is a graduate student in Massachusetts Institute of Technology's Program in Science, Technology and Society. His research examines engineering practice from the 1960s to the 1990s. Using accidents/failures and their subsequent investigations as a window into the black box of engineering, he examines the changing cultures of engineering within NASA. He is tracking changes in engineering practices from Apollo 1 to Challenger to Columbia.

    Angela Buljan is a Strategic Planning Director at McCann Erickson Croatia and a pre-doctoral researcher at the University of Zagreb. She plans to start her Ph.D. program in Management and Organization at the University of Zagreb, where she received a B.S. degree in psychology and a Master's degree in marketing. Her research interests include managerial risk-taking, organizational decision-making, and consumer decision-making. In 2004 she was a guest researcher at the Management and Organizations Department at the Stern School of Business, New York University, where she participated in research projects on risk-taking under the supervision of Zur Shapira. One of these is presented in this book.

    John S. Carroll is Professor of Behavioral and Policy Sciences at the Massachusetts Institute of Technology Sloan School of Management and the Engineering Systems Division. He is co-director of the MIT Lean Aerospace Initiative. He taught previously at Carnegie-Mellon University, Loyola University of Chicago, and the University of Chicago. He received a B.S. (physics) from MIT and a Ph.D. (social psychology) from Harvard. His research has focused on individual and group decision-making, the relationship between cognition and behavior in organizational contexts, and the processes that link individual, group, and organizational learning. Current projects examine organizational safety issues in high-hazard industries such as nuclear power, aerospace, and health care, including self-analysis and organizational learning, safety culture, leadership, communication, and systems thinking. He is also part of a research team working collaboratively with the Society for Organizational Learning Sustainability Consortium, a cross-industry group of companies developing sustainable business practices.

    Jeffrey C. Connor is a Lecturer in Organizational Behavior at the Harvard Medical School. He has previously been on the faculty of the Graduate School of Education at Harvard University, where he co-taught the Organizational Diagnosis seminar. He is an independent contractor for senior leadership development in the intelligence community of the US government and consults with professional service organizations and businesses on executive leadership development and organizational change. He received a Master's degree in psychology from Boston College, and a Ph.D. in administration, policy, and research from Brandeis University.

    Joel Cutcher-Gershenfeld is a senior research scientist in the Massachusetts Institute of Technology's Sloan School of Management and Executive Director of its Engineering Systems Learning Center. He is co-author of Valuable Disconnects in Organizational Learning Systems (Oxford University Press, 2005), Lean Enterprise Value (Palgrave, 2002), Knowledge-Driven Work (Oxford University Press, 1998), Strategic Negotiations (Harvard Business School Press, 1994), and of three additional co-authored or co-edited books, as well as over 60 articles on large-scale systems change, new work systems, labor-management relations, negotiations, conflict resolution, organizational learning, public policy, and economic development. He holds a Ph.D. in industrial relations from MIT and a B.S. in industrial and labor relations from Cornell University.

    Vinit M. Desai is a doctoral student and researcher in organizational behavior and industrial relations at the Walter A. Haas School of Business, University of California at Berkeley. His research interests include learning, decision-making, and the study of organizations in which error can have catastrophic consequences. He works with colleagues to examine organizations that operate with hazardous technologies yet experience extremely low error rates, and his work spans various industries, including space exploration, health care, telecommunications, naval aviation, and natural gas. He has worked in the private and public sectors.

    Nicolas Dulac is a doctoral student in the Department of Aeronautics and Astronautics at the Massachusetts Institute of Technology. His current research interests span system engineering, system safety, visualization of complex systems, hazard analysis in socio-technical systems, safety culture, and dynamic risk analysis. He holds an M.S. degree in aeronautics and astronautics from MIT, and a B.S. degree in mechanical engineering from McGill University.

    Roger Dunbar is a Professor of Management at the Stern School of Business, New York University. He is interested in how understandings develop in support of particular perspectives in organizations, and how this basis for stability makes it difficult for change to occur. His research explores this theme in different contexts. One example is the dialog that took place in the Journal of Management Inquiry, 5 (1996) around two papers: "A Frame for Deframing in Strategic Analysis" and "Run, Rabbit, Run! But Can You Survive?", with Raghu Garud and Sumita Raghuram. He is currently a senior editor of Organization Studies.

    Amy C. Edmondson is Professor of Business Administration, Harvard Business School, and investigates team and organizational learning in health care and other industries. Her research examines leadership, psychological safety, speaking up, and experimentation in settings ranging from hospitals to corporate boardrooms. Recent publications include "Framing for Learning: Lessons in Successful Technology Implementation" (California Management Review, 2003) and "The Local and Variegated Nature of Learning in Organizations" (Organization Science, 2002). With co-authors Edmondson developed both a multimedia and a traditional teaching case on the Columbia shuttle tragedy (HBS Publishing, 2004), designed to deepen students' appreciation of the organizational causes of accidents. She received her Ph.D. in organizational behavior from Harvard University in 1996.

    Moshe Farjoun is an associate professor at the Schulich School of Business, York University, Toronto. While editing this book, he was a visiting associate professor at the Stern School of Business, New York University. His research interests lie in the intersection of strategic management and organization. His research has explored market and organizational dynamics, particularly as they pertain to the processes of strategy formulation, implementation, and change. In studying these topics, he builds on his background in economics, behavioral sciences, and system analysis and emphasizes process, interaction, and synthesis. He is particularly attracted to the themes of learning, tension, and complexity and studies them across different levels of analysis and using diverse methodologies. His research has appeared in Strategic Management Journal, Academy of Management Journal, Organization Science, and Academy of Management Review. A recent paper was a finalist (top three) in the 2002 AMJ best paper competition. Professor Farjoun received his Ph.D. in organization and strategy from the Kellogg Management School of Northwestern University.

    Laura R. Feldman is a developer and fundraiser for a nonprofit youth mentoring organization. While a research associate at Harvard Business School, Feldman contributed to research on psychological safety and team learning in health-care operations. In addition to the traditional and multimedia Columbia case studies, she has co-authored with Amy Edmondson a series of case studies on the decisive meeting between NASA and its subcontractor Morton Thiokol on the eve of the Challenger shuttle tragedy. Feldman graduated cum laude from Wellesley College with a B.A. in sociology.

    Erika M. Ferlins is a research associate in general management at the Harvard Business School. Her research examines leadership, teams, and decision-making in high-stakes situations. Recent research includes firefighting, health care, space flight, and pharmaceutical catastrophes. Ferlins and her co-authors also developed both a multimedia and a traditional case study on the Columbia shuttle tragedy ("Columbia's Final Mission: A Multimedia Case," Harvard Business School case N9-305-032, and "Columbia's Final Mission," Harvard Business School case 9-304-090), designed to illustrate the complex causes of disasters.

    Raghu Garud is Associate Professor of Management and Organizations at the Stern School of Business, New York University. He is co-editor of Organization Studies and an associate editor of Management Science. Currently he is co-editing (with Cynthia Hardy and Steve Maguire) a special issue of Organization Studies on Institutional Entrepreneurship.

    Theresa K. Lant is an Associate Professor of Management at the Stern School of Business, New York University. She received her Ph.D. from Stanford University in 1987, and her A.B. from the University of Michigan in 1981. She has served as a senior editor of Organization Science, is currently an associate editor of non-traditional research at the Journal of Management Inquiry, and serves on the editorial review boards of Strategic Organization and Organization Studies. She has served in a variety of leadership roles in the Academy of Management and the INFORMS College on Organization Science, including, most recently, serving as Chair of the Managerial and Organizational Cognition Division of the Academy of Management. Professor Lant's research focuses on the processes of managerial cognition, organizational learning, and strategic adaptation.

    Sophie Leroy is a Ph.D. student in organizational behavior at the Stern School of Business, New York University. Prior to enrolling at NYU, she earned an MBA from HEC (France), part of which was completed at Columbia Business School. She is interested in understanding how individuals are affected by and manage dynamic work environments, how people experience working under extreme time pressure, and how managing multiple projects under time pressure affects people's engagement with their work and their performance. She is currently working with Professor Sally Blount on understanding how people's perception and valuation of time influence the way they synchronize with others.

    Nancy Leveson is Professor of Aeronautics and Astronautics and Professor of Engineering Systems at the Massachusetts Institute of Technology. She has worked in the field of system safety for 25 years, considering not only the traditional technical engineering problems but also the cultural and managerial components of safety. She has served on many NASA advisory committees, including the Aerospace Safety Advisory Panel, as well as working with other government agencies and companies in the nuclear, air transportation, medical devices, defense, automotive, and other industries to help them write safety standards and to improve practices and organizational safety culture. Professor Leveson is an elected member of the National Academy of Engineering and conducts research on system safety, software engineering and software safety, human-automation interaction, and system engineering. She has published 200 research papers and is the author of Safeware: System Safety and Computers.

    Peter M. Madsen is a doctoral student at the Walter A. Haas School of Business, University of California, Berkeley. His research interests focus on organizational reliability and on the interrelationship between organizational and environmental change. His current research deals with high-reliability organizations and institutional and technological change, examining these issues in the aerospace, health-care, and insurance industries.

    Karen Marais is a doctoral candidate in the Department of Aeronautics and Astronautics at the Massachusetts Institute of Technology. Her research interests include safety and risk assessment, decision-making under uncertainty, and systems architecture.


  • Henry McDonald is the Distinguished Professor and Chair of Computational Engineering at the University of Tennessee in Chattanooga. Prior to this appointment, from 1996 until 2002 he was the Center Director at NASA Ames Research Laboratory. Educated in Scotland in aeronautical engineering, he worked in the UK aerospace industry before emigrating to the US, where, after working as a staff member in a large corporate research laboratory, he formed a small research and development company. Professor McDonald subsequently held a number of academic posts at Penn State and Mississippi State universities before joining NASA as an IPA in 1996. He is a member of the National Academy of Engineering and a Fellow of the Royal Academy of Engineering.

    Frances J. Milliken is the Edward J. Giblin Faculty Fellow and a Professor of Management at the Stern School of Business, New York University. She was the co-author, with William Starbuck, of a paper on the causes of the space shuttle Challenger accident (Journal of Management Studies, 1988). Her chapter in the present volume thus represents a second foray into trying to understand decision-making at NASA. Her most recent research interests include understanding how diversity affects the functioning of groups and of organizations, the dynamics of upward communication processes in organizations, as well as the relationship between individuals' work and non-work lives. She is currently on the editorial board of the Academy of Management Review and the Journal of Management Studies.

    William Ocasio is the John L. and Helen Kellogg Distinguished Professor of Management and Organizations at the Kellogg School of Management, Northwestern University. He received his Ph.D. in organizational behavior from Stanford University and his MBA from the Harvard Business School, and was previously on the faculty of the Massachusetts Institute of Technology Sloan School of Management. His research focuses on the interplay of power, communication channels, and cognition in shaping organizational attention, decision-making, and corporate governance. He has published in the Administrative Science Quarterly, Advances in Strategic Management, American Journal of Sociology, Research in Organizational Behavior, Organization Science, Organization Studies, and the Strategic Management Journal, among others. Recently he has been studying how specialized vocabularies of organizing shape the way in which organizations categorize their experiences and practices; how these evolving vocabularies influence organizational strategies; and how networks of formal communication channels shape strategy formulation, implementation, and performance in multi-business organizations.

    Sean O'Keefe is Chancellor of Louisiana State University and A&M College; he assumed this office on February 21, 2005. He has been a Presidential appointee on four occasions. Until February 2005, he served as the Administrator of the National Aeronautics and Space Administration. Earlier, he was Deputy Director of the Office of Management and Budget, Secretary of the Navy, and Comptroller and Chief Financial Officer of the Department of Defense. He has also been Professor of Business and Government Policy at Syracuse University, Professor of Business Administration and Dean of the Graduate School at Pennsylvania State University, staff member for the Senate Committee on Appropriations, and staff director for the Defense Appropriations Subcommittee, as well as a visiting scholar at Wolfson College, University of Cambridge. He is a Fellow of the National Academy of Public Administration, a Fellow of the International Academy of Astronautics, and a member of the Naval Postgraduate School Board of Advisors. He has received the Distinguished Public Service Award from the President, the Chancellor's Award for Public Service from Syracuse University, the Navy's Public Service Award, and five honorary doctorate degrees. He is the author of several journal articles, and co-author of The Defense Industry in the Post-Cold War Era: Corporate Strategies and Public Policy Perspectives.

    Michael A. Roberto is Assistant Professor of Business Administration, Harvard Business School, where he examines organizational decision-making processes and senior management teams. More recently, he has studied the decision-making dynamics involved in catastrophic group or organizational failures such as the Columbia space shuttle accident and the 1996 Mount Everest tragedy. His recent book, Why Great Leaders Don't Take Yes for an Answer: Managing for Conflict and Consensus, was published in June 2005 by Wharton School Publishing. In addition to his teaching and research duties, Professor Roberto has developed and taught in leadership development programs at many leading companies over the past few years. He received his doctorate from Harvard Business School in 2000 and earned his MBA with high distinction in 1995.

    Karlene H. Roberts is a professor in the Haas School of Business at the University of California, Berkeley. She received her Ph.D. in psychology from the University of California, Berkeley. Her research concerns the design and management of organizations that achieve extremely low accident rates because errors could have catastrophic consequences. Her findings have been applied to US Navy and coastguard operations, the US Air Traffic Control System, and the medical industry, and she has contributed to committees and panels of the National Academy of Sciences regarding reliability enhancement in organizations. She has advised the National Aeronautics and Space Administration and testified before the Columbia Accident Investigation Board. She is a Fellow in the American Psychological Association, the Academy of Management, and the American Psychological Society.

    Zur Shapira is the William Berkley Professor of Entrepreneurship and Professor of Management at the Stern School of Business, New York University. His research interests focus on managerial attention and its effects on risk-taking and organizational decision-making. Among his publications are Risk Taking: A Managerial Perspective (1995), Organizational Decision Making (1997), Technological Learning: Oversights and Foresights (1997), with R. Garud and P. Nayyar, and Organizational Cognition (2000), with Theresa Lant.

    Scott A. Snook is currently an Associate Professor of Organizational Behavior at the Harvard Business School. Prior to joining the faculty at Harvard, he served as a commissioned officer in the US Army for over 22 years, earning the rank of colonel before retiring. He has led soldiers in combat. He has an MBA from the Harvard Business School and a Ph.D. in organizational behavior from Harvard University. Professor Snook's book Friendly Fire was selected by the Academy of Management to receive the 2002 Terry Award. His research and consulting activities have been in the areas of leadership, leader development, change management, organizational systems and failure, and culture.

    William H. Starbuck is ITT Professor of Creative Management in the Stern School of Business at New York University. He has held faculty positions at Purdue, Johns Hopkins, Cornell, and Wisconsin-Milwaukee, as well as visiting positions in England, France, New Zealand, Norway, Oregon, and Sweden. He was also a senior research fellow at the International Institute of Management, Berlin. He has been the editor of Administrative Science Quarterly; he chaired the screening committee for senior Fulbright awards in business management; he was the President of the Academy of Management; and he is a Fellow in the Academy of Management, American Psychological Association, American Psychological Society, British Academy of Management, and Society for Industrial and Organizational Psychology. He has published more than 120 articles on accounting, bargaining, business strategy, computer programming, computer simulation, forecasting, decision-making, human-computer interaction, learning, organizational design, organizational growth and development, perception, scientific methods, and social revolutions.

    Johnny Stephenson serves as the implementation lead for the One NASA initiative, whose end result is to be a more highly unified and effective NASA organization. In this capacity, he served on NASA's Clarity team, whose recommendations led to the 2004 reorganization; led the effort to engage employees in NASA's transformational activities; was chief architect of The Implementation of the NASA Agency-Wide Application of the Columbia Accident Investigation Board Report: Our Renewed Commitment to Excellence, which addresses the implementation of agency-wide issues from the CAIB report; led the study on inter-center competition within NASA that is now being implemented; and leads an effort focused on integrating numerous collaborative tools within the agency. He was selected for NASA's Senior Executive Service Candidate Development Program in May 2002. He has been the recipient of NASA's Exceptional Achievement Medal and the Silver Snoopy Award.

    Diane Vaughan is Professor of Sociology at Boston College. She is the author of Controlling Unlawful Organizational Behavior, Uncoupling: Turning Points in Intimate Relationships, and The Challenger Launch Decision. Much of her research has investigated the dark side of organizations: mistake, misconduct, and disaster. She is also interested in the uses of analogy in sociology, now materializing as Theorizing: Analogy, Cases, and Comparative Social Organization. She is currently engaged in ethnographic field work at four air traffic control facilities for Dead Reckoning: Air Traffic Control in the Early 21st Century. Related writings are "Organization Rituals of Risk and Error," in Bridget M. Hutter and Michael K. Power, eds., Organizational Encounters with Risk (Cambridge University Press, forthcoming); and "Signals and Interpretive Work," in Karen A. Cerulo (ed.), Culture in Mind: Toward a Sociology of Culture and Cognition (New York: Routledge, 2002).


  • Mary J. Waller is an Associate Professor of Organizational Behavior in Tulane University's A.B. Freeman School of Business. She earned her Ph.D. in organizational behavior at the University of Texas at Austin. Prior to obtaining her graduate degree, Professor Waller worked for Amoco Corporation, Delta Air Lines, and Columbine Systems. Her research focuses on team dynamics and panic behaviors under crisis and in time-pressured situations. Her field research includes studies of commercial airline flight crews, nuclear power plant crews, and air traffic controllers, and has been funded by NASA and the Nuclear Regulatory Commission. She has received awards for her research from the Academy of Management and the American Psychological Association, and is the recipient of Tulane's Irving H. LaValle Research Award. Her work has appeared in the Academy of Management Journal, Academy of Management Review, Management Science, and other publications.

Karl E. Weick is the Rensis Likert Distinguished University Professor of Organizational Behavior and Psychology at the University of Michigan. He holds a Ph.D. in social and organizational psychology from Ohio State University. He worked previously at the University of Texas, Austin, Seattle University, Cornell University, the University of Minnesota, and Purdue University. He has received numerous awards, including the Society of Learning's scholar of the year and the Academy of Management's award for distinguished scholarly contributions. His research interests include collective sensemaking under pressure, medical errors, handoffs in extreme events, high-reliability performance, improvisation, and continuous change. Inc. Magazine designated his book The Social Psychology of Organizing (1969 and 1979) one of the nine best business books. He expanded the formulation of that book into a book titled Sensemaking in Organizations (1995). His many articles and seven books also include Managing the Unexpected (2001), co-authored with Kathleen Sutcliffe.

David D. Woods is Professor in the Institute for Ergonomics at Ohio State University. He has advanced the foundations and practice of cognitive systems engineering since its origins in the aftermath of the Three Mile Island accident. He has also studied how human performance contributes to success and failure in highly automated cockpits, space mission control centers, and operating rooms, including participation in multiple accident investigations. Multimedia overviews of his research are available at http://csel.eng.ohio-state.edu/woods/, and he is co-author of the monographs Behind Human Error (1994), A Tale of Two Stories: Contrasting Views of Patient Safety (1998), and Joint Cognitive Systems: Foundations of Cognitive Systems Engineering (2005). Professor Woods's research has won the Ely Award for best paper in the journal Human Factors (1994), a Laurels Award from Aviation Week and Space Technology (1995), and the Jack Kraft Innovators Award from the Human Factors and Ergonomics Society (2002).



• Preface
Sean O'Keefe

In each of our lives there are a few events that forever serve as reminders of what was, what is, and what ultimately can be. Those few events and the dates on which they occurred serve as lenses through which we judge the successes of yesterday, gauge the relative importance of decisions facing us today, and ultimately decide the course we set for tomorrow. February 1, 2003 serves as one such date for me; the event was NASA's tragic loss of the space shuttle Columbia and her crew.

On that particular day, I expected to welcome home seven courageous individuals who chose as their mission in life to push the boundaries of what is and what can be, explorers of the same ilk and fervor as Lindbergh, Lewis and Clark, Columbus, and the Wright Brothers. But on that particular day I witnessed tragedy. We were reminded that exploration is truly a risky endeavor at best, an endeavor that seven individuals considered worthy of risking the ultimate sacrifice as they pursued the advances in the human condition that always stem from such pursuits.

And there on the shuttle landing strip at the Kennedy Space Center, as I stood with the Columbia families, I also witnessed extraordinary human courage. Their commitment to the cause of exploration served as inspiration in the agonizing days, weeks, and months that were to come.

For NASA, that date initiated intense soul-searching and in-depth learning. We sought answers for what went wrong. We asked ourselves what we could have done to avoid such a tragedy, and we asked what we could do to prevent another such tragedy. We never questioned whether the pursuit of exploration and discovery should continue, as it seems to be an innate desire within the human heart, one that sets humanity apart from other life forms in that we don't simply exist to survive. We did, however, question everything about how we approached the high-risk mission of exploration.

In the final analysis, what we found was somewhat surprising, although in retrospect it should not have been. The cause of the tragedy was determined to be twofold. The physical cause of the accident was foam insulation that separated from the external tank and struck the wing's leading edge, creating a fissure in the left, or port, side of the shuttle orbiter. But we also found the organizational cause, which proved just as detrimental in the end. The organizational cause was the more difficult for us to grasp because it questioned the very essence of what the NASA family holds so dear: our can-do attitude and the pride we take in our skills to achieve those things once unimagined. The organizational cause lay in the very culture of NASA, and culture wasn't a scientific topic NASA was accustomed to considering when approaching its mission objectives.

We found that the culture we had created over time allowed us (1) to characterize a certain risk (foam shedding) as normal simply because we hadn't yet encountered a negative outcome from previous shedding; (2) to grow accustomed to a chain of command that wasn't nearly as clear as we thought; and (3) to more readily accept the qualified judgments of those in positions of authority rather than seriously considering the engineering judgments of those just outside those positions. In short, we were doing what most of us do at some point in time: trusting what is common and supposedly understood rather than continually probing for deeper understanding. The same thing can happen within any industry or organization over time, and we thus limit what can be by establishing as a boundary what currently is. That happened within NASA. But this tendency is present in most of us.

The more frequently we see events, conditions, and limitations, the more we think of them as normal and simply accept them as a fact of life. Such is human nature. For most Americans, encountering the homeless on any city block in any metropolitan area is unremarkable. Few among us would even recall such an encounter an hour later, even if an expansive mood had prompted a modest donation. Sadly, this condition has become a common occurrence in our lives and not particularly notable. And while many of us may have become numb to this condition, it is still a tragedy of great proportions that must be addressed.

But consider the reaction of someone who had never encountered a homeless person forced to live on the streets. Likely, this uninitiated person would come to the aid of the first helpless soul encountered, driven by the desire to do something. Such emotion would be inspired by witnessing the same tragedy most urban dwellers see each and every day. But because it would be the first time, the event would prompt extraordinary action. Indeed, such an encounter would likely force one to wonder how a civilized society could possibly come to accept such a condition for anyone among us. It would be a remarkable event because it had never been witnessed before.

The more we see abnormality, the more dulled our senses become. The frequency of foam insulation strikes to the orbiter was sufficiently high that they were dismissed as unremarkable and of limited consequence. Why are we surprised when aerospace engineers react just like the rest of us?

But the price for yielding to this human tendency can be horrible tragedy, just as it was on the morning of February 1, 2003. The challenge is to blunt the tendency to react based on frequency of incident and to seek to explain and understand each event. That requires an extraordinary diligence, sensitivity, and awareness uncharacteristic of most humans. It is the rare person who possesses such traits. But the stakes are too high to settle for anything less.

    xviii Preface


• We were offered the rare opportunity to learn from our tragedies just as profoundly as we do from our triumphs. That was certainly true of the Columbia tragedy. At NASA, the self-reflection that resulted from that event led us to recalibrate: it revived that natural curiosity within us and served as a lens for gauging the importance of issues facing NASA on a daily basis, such that we continually sought to ask the right questions and to secure the right data before making the important decisions. In the end, NASA will be a stronger organization for having gone through such intense self-examination and public scrutiny.

Those looking at NASA from just outside its gates have the greatest opportunity of all: to learn from the hard lessons of others without experiencing the pain as deeply for themselves. The analyses contained within this book capture the collective work of 35 distinguished individuals representing 12 respected organizations of learning, each an authority in their area of authorship, yet all bound by one common belief: that there is more to be learned from the Columbia tragedy than what is already being applied within NASA. Each chapter analyzes the tragedy from a different perspective, and each chapter's ensuing commentary is worthy of careful consideration by many organizations today. To be sure, not all of the commentary endorses the actions taken within NASA, and some comments surely surface issues that merit further thought. Similarly, there are conclusions and critiques herein that I do not necessarily support or concur with. But there is great value in these divergent perspectives and assessments. Our Columbia colleagues and their families deserve no less than this rigorous debate. This work also holds great value for other organizations. While using NASA as a case study, this work, and many of the trenchant observations contained herein, will certainly serve to promote and ensure the success of any organization involved in very complex, high-risk endeavors. It is my belief that this study will serve as one of those lenses by which many organizations chart their course for tomorrow.




  • Part I

    INTRODUCTION


  • 2 Farjoun and Starbuck


  • Introduction 3

    1

INTRODUCTION: ORGANIZATIONAL ASPECTS OF THE COLUMBIA DISASTER

Moshe Farjoun and William H. Starbuck

On February 1, 2003, the space shuttle Columbia disintegrated in a disaster that killed its crew. When Columbia began its descent, only a handful of NASA engineers were worried that the shuttle and its crew might be in danger. Minutes later, a routine scientific mission became a nonroutine disaster.

Disasters destroy not only lives but also reputations, resources, legitimacy, and trust (Weick, 2003). However, disasters also dramatize how things can go wrong, particularly in large, complex social systems, and so they afford opportunities for reflection, learning, and improvement. Within two hours of losing the signal from the returning spacecraft, NASA's Administrator established the Columbia Accident Investigation Board (CAIB) to uncover the conditions that had produced the disaster and to draw inferences that would help the US space program to emerge stronger than before (CAIB, 2003). Seven months later, the CAIB released a detailed report that includes its recommendations.

The CAIB identified the physical cause of the accident as a breach in the thermal protection system on the leading edge of the left wing, caused by a piece of insulating foam that struck the wing immediately after launch. However, the CAIB also said that the accident was a product of long-term organizational problems, so its report provided an account not only of the technical causes of the Columbia accident but also of its organizational causes. The CAIB asked: Why did NASA continue to launch spacecraft despite many years of known foam debris problems? Why did NASA managers conclude, despite the concerns of their engineers, that the foam debris strike was not a threat to the safety of the mission? Tragically, some of the problems surfaced by the CAIB had previously been uncovered during the Challenger investigation in 1986. How could NASA have forgotten the lessons of Challenger? What should NASA do to minimize the likelihood of such accidents in the future?

Although the CAIB's comprehensive report raised important questions and offered answers to some of them, it also left many major questions unanswered.



For example, why did NASA consistently ignore the recommendations of several review committees that called for changes in safety organization and practices? Did managerial actions and reorganization efforts that took place after the Challenger disaster contribute, both directly and indirectly, to the Columbia disaster? Why did NASA's leadership fail to secure more stable funding and to shield NASA's operations from external pressures? This book reflects its authors' collective belief that there is more to be learned from the Columbia disaster. We dissect the human, organizational, and political processes that generated the disaster from more perspectives than the CAIB report did, and we try to extract generalizations that could be useful for other organizations engaged in high-risk ventures, such as nuclear power plants, hospitals, airlines, armies, and pharmaceutical companies. Some of our generalizations probably apply to almost all organizations.

Indeed, although the CAIB said a lot about the human, organizational, and political causes of the Columbia disaster and the necessary remedies in those domains, it appears that it may not have said enough. At least, NASA appears to be discounting the CAIB's concerns in these domains. In February 2005, two years after the disaster, the New York Times reported that NASA was intending to resume launches before it had made all the corrections that the CAIB had deemed essential, and NASA's management seemed to be paying more attention to its technology than to its organization. According to this report, NASA was rushing back to flight because of President Bush's goal of completing the International Space Station and beginning human exploration of the Moon and Mars (Schwartz, 2005). In other words, NASA is again allowing its political environment, which has no technological expertise whatever, to determine its technological goals and schedules. This pattern has repeated throughout NASA's history, and it was a major factor in both the Challenger and Columbia disasters.

This book enlists a diverse group of experts to review the Columbia disaster and to extract organizational lessons from it. Thanks to the documentation compiled by the CAIB, as well as other NASA studies, this endeavor involves a rich and multifaceted exploration of a real organization. Because disasters are (thankfully) very unusual, we need to use multiple observers, interpretations, and evaluation criteria to experience history more richly (March et al., 1991). Some contributors to this book draw conclusions very different from the CAIB's.

As the CAIB concluded, the accident did not have simple and isolated causes. There were many contributing factors, ranging from the environment, to NASA's history, policy, and technology, to organizational structures and processes, and to the behaviors of individual employees and managers. The breadth and complexity of these factors call for a research inquiry that examines both specific factors and their combined effects. The unfortunate precedent of the Challenger disaster in 1986 provides an opportunity to compare two well-documented accidents and consider how NASA developed over time.

This book is very unusual in the field of organization studies because it is a collaborative effort to dissect a decision-making situation from many perspectives. The nearest forerunners are probably Allison and Zelikow's (1999) book on the Cuban missile crisis and Moss and Sills's (1981) book about the accident at Three Mile Island, which also used multiple lenses to interpret single chronologies of events. Overall, there are almost no examples of organizational research that bring together such a diverse group of experts to discuss a specific event and organization, so this project is the first of its kind.

Columbia exemplifies events that have been occurring with increasing frequency, and NASA exemplifies a kind of organization that has been growing more prevalent and more important in world affairs. Humanity must come to a better understanding of disasters like Columbia and must develop better ways of managing risky technologies that require large-scale organizations. Although many humans embrace new technologies eagerly, they are generally reluctant to accept the risks of real-life experimentation with new technologies. Some of these new technologies, like the space shuttle, involve degrees of complexity that exceed our abilities to manage them, and our efforts to manage these technologies create organizations that, so far, have been too complex to control effectively. NASA and the space shuttle program have surpassed organizational limits of some sort. The space shuttle missions are complex phenomena in which technical and organizational systems intertwine. On top of this complexity, NASA was operating under challenging conditions: budgetary constraints, severe time pressures, partially inconsistent efficiency and safety goals, personnel downsizing, and technological, political, and financial uncertainty. However, some organizations appear to be less prone to failure and others more so. What produces these differences? Are well-meaning people bound to produce bad outcomes? Finally, can societies, organizations, and people learn from failures and reduce or remove dangers? How can organizations, medium and large, limit their failures, and how can organizations and people increase their resilience when operating at their limits?

    CHAPTER OVERVIEW

The book has four main sections. Part II examines the context in which the Columbia disaster occurred. It includes a historical overview, a comparison of the Challenger and the Columbia disasters, a focused examination of the shuttle program's recent history, and an examination of the disaster in the larger context of space transportation. Part III examines three major influences on decision-making in the shuttle program: language, time, and attention. These influences were not limited to a particular decision but played out in several decision episodes preceding the disaster. Part IV focuses on a controversial part of the disaster: the failure to seek additional photographic images of the areas of Columbia that had been hit by debris during liftoff. Part V of the book moves beyond explanation of the Columbia disaster to suggest ways in which NASA and other organizations can decrease the likelihood of failure and become more resilient.

There is some redundancy among chapters because the authors want their chapters to be readable independently of one another.

    Part II: The Context of the Disaster

In chapter 2, Moshe Farjoun provides a historical analysis of the space shuttle program at NASA. He focuses on key events and developments that shed light on the Columbia disaster and its aftermath. The historical analysis underscores how early policy and technological decisions became relatively permanent features of the shuttle's working environment. Farjoun argues that aspects of that working environment, such as the tight linkage between the shuttle program and the International Space Station, schedule pressures, the technological design of the shuttle, and its characterization as an operational rather than developmental vehicle, have long historical roots. He concludes that history may repeat itself because the governing policies and resource decisions remain relatively intact. Patterns repeat and lessons go unlearned not only because learning processes are ineffective but also for motivational and political reasons.

In chapter 3, Diane Vaughan also considers whether NASA has learned from mistakes. She compares NASA's two space shuttle disasters, the Challenger's and the Columbia's. Her analysis, like that of subsequent chapters, identifies many similarities between the causes and contributing factors of these two disasters. In particular, she discusses how, for years preceding both accidents, technical experts defined risk away by repeatedly normalizing technical anomalies that deviated from expected performance. Based on her reviews of the changes NASA made after the two accidents, she argues that in order to reduce the potential for gradual slides and repeating negative patterns, organizations must go beyond the easy focus on individual failure to identify the social causes in organizational systems, a task requiring social science input and expertise.

In chapter 4, Moshe Farjoun revisits the period from 1995 to 2003 that preceded the Columbia disaster. He identifies this as a period in which the shuttle program encountered a safety drift, incrementally sliding into increasing levels of risk. He infers that, in 1999, less than four years before the Columbia disaster, NASA missed two major learning opportunities to arrest or reverse the safety drift: the STS-93 Columbia mishaps and the Jet Propulsion Laboratory (JPL) Mars robotic failures. Many of the factors contributing to these failures also contributed to the Columbia disaster. Farjoun examines the impediments to learning and corrective action during this safety drift. He lists several potential reasons why NASA failed to exploit these learning opportunities, including faulty knowledge-transfer mechanisms between programs, incomplete learning processes, and problematic leadership transition.

Chapter 5, by Karlene H. Roberts, Peter Madsen, and Vinit M. Desai, examines the failure of the Columbia STS-107 mission in the larger context of space transportation. The authors argue that the Columbia disaster was an instance of a broad phenomenon underlying organizational failures that they call "the space between." Drawing on previous research on high-reliability organizations, they argue that organizations encounter problems by neglecting coordination and failing to ensure independence of activities. They use an interesting comparison between the shuttle program and Aerospace Corporation, a private organization that provides launch verification and other services to the US Air Force, to show how projects can guarantee true independence in safety organization.

    Part III: Influences on Decision-Making

In chapter 6, William Ocasio examines the interplay between language and culture in the Columbia disaster. Using historical and archival analysis, Ocasio examines how the vocabulary of safety contributed to the disaster. He finds that, within the culture of the space shuttle organization, the meaning of "safety of flight" was ambiguous, and people viewed safety as a minimal constraint to satisfy rather than a goal to raise. Organizational culture and ambiguous linguistic categorizations made risk more opaque, but the opacity of risk was not a root cause of the Columbia disaster. Ocasio uses the Columbia case to extract important lessons for managing organizations with risky technologies, with special emphasis on the role of language at different levels of organizations: corporate strategy, ongoing operations, and accident response.

In chapter 7, Sally Blount, Mary Waller, and Sophie Leroy focus on a key contributing factor in the Columbia disaster: the existence of time pressure and its particular manifestation in overly ambitious and rigid deadlines. They argue that ongoing time pressure and associated time stress, as well as the time-urgent culture that ultimately emerged, sowed the seeds of disaster. The authors specifically examine the organizational effects of the now notorious February 19, 2004 deadline: the cognitive focus became time, time stress rose, and information-processing and decision-making capabilities deteriorated. They conclude: "Time, rather than safety or operational excellence, became the most valued decision attribute. Thus, when ambiguous information was encountered, safety risks were systematically underestimated, while the costs of delay were overestimated. And in the end, bad decisions were made by good people."

Chapter 8, by Angela Buljan and Zur Shapira, examines how attention to the production schedule, as opposed to safety, served as a determinant of risk-taking in NASA's decision to launch Columbia. The authors argue that the decision to launch Columbia without ascertaining the proper functioning of the heat insulation replicates the disastrous decision to launch the Challenger. They use a model of risk-taking behavior based on how managers allocate attention between conflicting safety and time targets. Using this model, Buljan and Shapira demonstrate how the pressure to meet the target launch date became the focus of attention at the expense of attention to safety, both before the Columbia flight and during the flight itself.

    Part IV: The Imaging Debate

In chapter 9, Karl E. Weick traces the fate of an equivocal perception: a blurred puff of smoke at the root of the left wing of the shuttle 82 seconds after takeoff. He argues that units and people within NASA made sense of this equivocal perception in ways that were more and less mindful. Had mindfulness been distributed more widely, supported more consistently, and executed more competently, the outcome might well have been different. Karl's analysis of the imaging events demonstrates that decision-making is not so much a stand-alone, one-off choice as an interpretation shaped by abstractions and labels that are part of ongoing negotiations about the meaning of a flow of events.

In chapter 10, Scott A. Snook and Jeffrey C. Connor examine the imaging episode from a more structural perspective. They see striking similarities between three seemingly different tragedies: Children's Hospital in Boston, friendly fire in northern Iraq, and the Columbia imagery decision. All three cases exemplify "best in their class," highly admired and complex organizations. Yet all three instances involved a troubling pattern that the authors call "structurally induced inaction": despite the multiplicity of experts, nobody acts at a crucial moment. These instances of inaction are tragic. The authors identify conditions and mechanisms that seem to increase the likelihood of this pattern of failure. The chapter concludes by discussing how decision processes can counter structurally induced inaction.

In chapter 11, Raghu Garud and Roger Dunbar focus on an aspect of ambiguous threats that they call "data indeterminacy." They view data as indeterminate if multiple perspectives within an organization generate ambiguities that obscure the significance of events in real time. They argue that NASA and the Columbia disaster illustrate a tension between two modes of operation: normal and exploratory. Each of these modes constitutes a different organizing mode for distributed knowledge, and combining the modes produces data indeterminacy. Garud and Dunbar use the imaging story to show how attempting to accommodate both organizing modes simultaneously makes the significance of available real-time data indeterminate, so that ways to react become impossible to discern. They conclude that, in high-risk situations, the emergence of indeterminacy can have disastrous consequences, as was the case with STS-107.

In chapter 12, Amy C. Edmondson, Michael A. Roberto, Richard M.J. Bohmer, Erika M. Ferlins, and Laura R. Feldman introduce the notion of the "recovery window" to examine how high-risk organizations deal with ambiguous threats. They define a recovery window as a period following a threat in which constructive collective action is feasible. Their analysis characterizes Columbia's recovery window (the period between the launch of the shuttle, when shedding debris presented an ambiguous threat, and the disastrous outcome 16 days later) as systematically under-responsive. Based on their analysis, the authors propose that a preferred response to ambiguous threats in high-risk systems would be an exploratory response characterized by over-responsiveness and a learning orientation.

Chapter 13, by Frances Milliken, Theresa K. Lant, and Ebony Bridwell-Mitchell, uses the imaging episode to examine barriers to effective learning about potential problems in organizations. Using an organizational learning lens, the authors discuss how an organizational context beset with complexity and ambiguity can make effective learning extremely difficult. They argue that under these conditions formal and informal power relations often determine which interpretive frame wins. The authors discuss how at least two different interpretation systems were at work in the imaging decision. Their chapter specifies the mechanisms by which this interpretive conflict was resolved and suggests ways in which organizations can use constructive conflict to improve learning and interpretation under trying conditions.

    Part V: Beyond Explanation

Chapter 14, by Nancy Leveson, Joel Cutcher-Gershenfeld, John S. Carroll, Betty Barrett, Alexander Brown, Nicolas Dulac, Lydia Fraile, and Karen Marais, opens the last part of the book by examining system approaches to safety. The authors argue that traditional ways of reducing risks focus on components rather than interdependent systems, and they offer a framework drawn from engineering systems and organization theory to understand accidents and safety in a more comprehensive way. They use the NASA shuttle disasters, Challenger and Columbia, as a window onto complex systems and systems approaches to safety. In particular, they examine the role of professional groups such as engineers and managers in the context of interdependent technical, social, and political systems.

Chapter 15, by David D. Woods, examines patterns present in the Columbia accident in order to consider how organizations in general can learn and change before dramatic failures occur. David argues that the factors that produced the holes in NASA's organizational decision-making are generic vulnerabilities that have contributed to other failures and tragedies across other complex industrial settings. Under the umbrella of what he calls "resilience engineering," Woods discusses ways in which organizations can better balance safety and efficiency goals and can establish independent, involved, informed, and informative safety organizations.

Chapter 16, by William Starbuck and Johnny Stephenson, provides a blueprint for making NASA more effective. The chapter reviews key properties of NASA and its environment and the organizational-change initiatives currently in progress within NASA, and then attempts to make realistic assessments of NASA's potential for future achievement. In the authors' opinion, some environmental constraints make it difficult, if not impossible, for NASA to overcome some of the challenges it faces, but there do appear to be areas that current change efforts do not address, and areas where some current efforts appear to need reinforcement.

Chapter 17 was written by Henry McDonald, who served as a center director at NASA and who headed the Shuttle Independent Assessment Team (SIAT) that was formed to study increases in shuttle failures around 1999. The SIAT report anticipated many of the contributing factors of the Columbia disaster. Based on his review of all the other chapters in this volume, Henry McDonald offers his observations on NASA and on the lessons it should draw from this volume. He offers a view of the events preceding the disaster, and he particularly discusses the extent to which NASA has implemented the SIAT report. He comments on how the different chapters in this book reinforce or deviate from the CAIB report, and discusses potential lessons NASA could and should have drawn from organization and management theory.

    ACKNOWLEDGMENTS

    This book project has benefited from the insights of Greg Klerkx, Robert Lamb, and Stephen Garber. Several NASA personnel attended a conference on organization design in June 2004, and although the conference did not discuss the Columbia disaster as such, the NASA personnel helped several of the book's authors to better understand NASA. As well, the New York University Department of Management and Organizations gave financial support for a meeting of the authors.

    OATC01 06/14/2005, 10:50 AM9

  • 10 Farjoun and Starbuck

    REFERENCES

    Allison, G.T., and Zelikow, P. 1999. Essence of Decision: Explaining the Cuban Missile Crisis, 2nd edn. Longman, New York.

    CAIB (Columbia Accident Investigation Board). 2003. Report, 6 vols. Government Printing Office, Washington, DC. www.caib.us/news/report/default.html.

    March, J.G., Sproull, L.S., and Tamuz, M. 1991. Learning from samples of one or fewer. Organization Science 2(1), 1–13.

    Moss, T.H., and Sills, D.L. (eds.) 1981. The Three Mile Island Accident: Lessons and Implications. New York Academy of Sciences, New York.

    Schwartz, J. 2005. Critics question NASA on safety of the shuttles. New York Times, February 7.

    Weick, K.E. 2003. Positive organizing and organizational tragedy. In K.S. Cameron, J.E. Dutton, and R.E. Quinn (eds.), Positive Organizational Scholarship: Foundations of a New Discipline. Berrett-Koehler, San Francisco, ch. 5.


  • Introduction 11

    SYNOPSIS: NASA, THE CAIB REPORT, AND THE COLUMBIA DISASTER

    NASA AND THE HUMAN SPACE FLIGHT PROGRAM

    The National Aeronautics and Space Administration (NASA) formed on October 1, 1958 in response to the launch of Sputnik by the Soviet Union. Almost immediately it began working on options for manned space flight. NASA launched the first space shuttle mission in April 1981. In addition to the human space flight program, NASA also maintains an active (if small) aeronautics research program, a space-science program, and an Earth-observation program, and it conducts basic research in a variety of fields (CAIB, 2003: vol. 1, 16).

    There are three major types of entities involved in the human space flight program: NASA field centers, NASA programs carried out at those centers, and industrial and academic contractors. The centers provide the infrastructure and support services for the various programs. The programs, along with field centers and headquarters, hire civil servants and contractors from the private sector to support aspects of their enterprises.

    NASA's headquarters, located in Washington, DC, is responsible for leadership and management across NASA's main enterprises and provides strategic management for the space shuttle and International Space Station (ISS) programs. The Johnson Space Center in Houston, Texas, manages both the space shuttle and the space station. The Kennedy Space Center, located on Merritt Island, Florida, adjacent to the Cape Canaveral Air Force Station, provides launch and landing facilities for the space shuttle. The Marshall Space Flight Center, near Huntsville, Alabama, operates most of NASA's rocket propulsion efforts. Marshall also conducts microgravity research and develops payloads for the space shuttle. The two major human space flight efforts within NASA are the space shuttle program and ISS program, both headquartered at Johnson although they report to a deputy associate Administrator at NASA headquarters. The Space Shuttle Program Office at Johnson is responsible for all aspects of developing, supporting, and flying the space shuttle. To accomplish these tasks, the program maintains large workforces at various NASA centers. The Space Shuttle Program Office also manages the Space Flight Operations Contract



    with United Space Alliance, a joint venture between Boeing and Lockheed Martin that provides most of the contractor support at Johnson and Kennedy, as well as a small amount at Marshall (CAIB, 2003: vol. 1, 16).

    THE COLUMBIA AND THE STS-107 MISSION

    NASA launched the space shuttle Columbia on its STS-107 mission on January 16, 2003. On February 1, 2003, as it descended to Earth after completing a 16-day scientific research mission, Columbia broke apart over northeastern Texas. All seven astronauts aboard were killed. They were commander Rick Husband; pilot William McCool; mission specialists Michael P. Anderson, David M. Brown, Kalpana Chawla, and Laurel Clark; and payload specialist Ilan Ramon, an Israeli (Smith, 2003).

    The Space Transportation System (STS), the space shuttle, consists of an airplane-like orbiter, two solid rocket boosters (SRBs) on either side, and a large cylindrical external tank that holds cryogenic fuel for the orbiter's main engines. The SRBs detach from the orbiter 2.5 minutes after launch, fall into the ocean, and are recovered for reuse. The external tank is not reused. It is jettisoned as the orbiter reaches Earth orbit, and disintegrates as it falls into the Indian Ocean (Smith, 2003).

    Designated STS-107, this was the space shuttle program's 113th flight and Columbia's 28th. Columbia was the first space-rated orbiter, and it made the space shuttle program's first four orbital test flights. Unlike the orbiters Challenger, Discovery, Atlantis, and Endeavour, Columbia's payload capacity was insufficient to make it cost-effective for space station missions. Therefore, Columbia was not equipped with a space station docking system. Consequently, Columbia generally flew science missions and serviced the Hubble space telescope.

    THE CAIB INVESTIGATION

    Within hours of the Columbia break-up, NASA Administrator Sean O'Keefe appointed an external group, the Columbia Accident Investigation Board (CAIB), to investigate the accident. Chaired by Admiral (ret.) Harold Gehman, the CAIB released its report on August 26, 2003, concluding that the tragedy was caused by technical and organizational failures. The CAIB report included 29 recommendations, 15 of which the CAIB specified must be completed before shuttle flights could resume. The 248-page report is available at the CAIB's website (http://www.caib.us).

    The CAIB's independent investigation lasted nearly seven months. The CAIB's 13 members had support from a staff of more than 120 and around 400 NASA engineers. Investigators examined more than 30,000 documents, conducted more than 200 formal interviews, heard testimony from dozens of expert witnesses, and reviewed more than 3,000 inputs from the general public. In addition, more than 25,000 searchers combed vast stretches of the western United States to retrieve the spacecraft's debris. In the process, Columbia's tragedy was compounded when two



    debris searchers with the US Forest Service perished in a helicopter accident (CAIB, 2003: vol. 1, 9).

    THE CAIB'S OBSERVATIONS, CONCLUSIONS, AND RECOMMENDATIONS

    The CAIB recognized early on that the accident was probably not an anomalous, random event, but rather likely rooted to some degree in NASA's history and the Human Space Flight Program's culture. Accordingly, the CAIB broadened its mandate at the outset to include a wide range of historical and organizational issues, including political and budgetary considerations, compromises, and changing priorities over the life of the Space Shuttle Program (CAIB, 2003: vol. 1, 9).

    The physical cause of the loss of Columbia and its crew was a breach in the thermal protection system on the leading edge of the left wing. A 1.7 pound piece of insulating foam separated from the external tank at 81.7 seconds after launch and struck the wing, making a hole in a reinforced carbon-carbon panel. During re-entry this breach in the Thermal Protection System allowed superheated air to penetrate through the insulation and progressively melt the aluminum structure of the left wing, weakening the structure until aerodynamic forces caused loss of control, failure of the wing, and breakup of the Orbiter. This breakup occurred in a flight regime in which, given the current design of the Orbiter, there was no possibility for the crew to survive (CAIB, 2003: vol. 1, 9). Figure A1 diagrams the physical cause of the accident.

    The flight itself was close to trouble-free (CAIB, 2003: vol. 1, 11). The foam strike event was not detected by the crew on board or seen by ground-support teams until the day after launch, when NASA conducted detailed reviews of all launch camera photography and videos. This foam strike had no apparent effect on the daily conduct of the 16-day mission, which met all its objectives (CAIB, 2003: vol. 1, 11).

    Chapter 6 of the CAIB report, titled "Decision Making at NASA," focuses on the decisions that led to the STS-107 accident. Section 6.1 reveals that the shedding of foam from the external tank, the physical cause of the Columbia accident, had a long history. It illustrates how foam debris losses that violated design requirements came to be defined by NASA management as an acceptable aspect of shuttle missions, a maintenance turnaround problem rather than a safety of flight concern. Table A1, adapted from figure 6.1-7 of the CAIB report, provides the history of foam debris losses up to the Columbia disaster.

    Section 6.2 of the CAIB report shows how, at a pivotal juncture just months before the Columbia accident, the management goal of completing Node 2 of the ISS by February 19, 2004, encouraged shuttle managers to continue flying, even after a significant bipod foam debris strike on STS-112. Section 6.3 discusses NASA's failure to obtain imagery from Department of Defense (DOD) satellites to assess the damage caused by the foam debris. It notes the decisions made during STS-107 in response to the bipod foam strike, and reveals how engineers' concerns about risk and safety were competing with and were defeated by management's belief that foam could



    Figure A1  The physical cause of the accident. The space shuttle Columbia lifts off from launch pad 39-A at the Kennedy Space Center, Florida, at 9.39 am on January 16 to begin the STS-107 mission. A 1.7 pound piece of insulating foam separates from the External Tank at 81.7 seconds after launch and strikes the wing, making a hole in a reinforced carbon-carbon panel. Solid Rocket Boosters: these detach from the orbiter 2.5 minutes after the launch, fall into the ocean, and are recovered for reuse. External Fuel Tank: a large cylindrical tank that holds cryogenic fuel for the orbiter's main engines; it is not reused but disintegrates as it falls back into the Indian Ocean. Left bipod ramp: this foam ramp insulates one of the forward connections anchoring the external fuel tank to the space shuttle; it was this foam that broke off after launch.



    Table A1  14 flights that had significant thermal protection system damage or major foam loss

    STS-1    April 12, 1981     Lots of debris damage. 300 tiles replaced.
    STS-7    June 18, 1983      First known left bipod ramp foam-shedding event.
    STS-27R  December 2, 1988   Debris knocks off tile; structural damage and near burn-through results.
    STS-32R  January 9, 1990    Second known left bipod ramp foam event.
    STS-35   December 2, 1990   First time NASA calls foam debris "a safety of flight issue, and a re-use or turnaround issue."
    STS-42   January 22, 1992   First mission after which the next mission (STS-45) launched without debris in-flight anomaly closure/resolution.
    STS-45   March 24, 1992     Damage to wing RCC Panel 10-right. Unexplained anomaly, most likely orbital debris.
    STS-50   June 25, 1992      Third known bipod ramp foam event. Hazard Report 37: an accepted risk.
    STS-52   October 22, 1992   Undetected bipod ramp foam loss (fourth bipod event).
    STS-56   April 8, 1993      Acreage tile damage (large area). Called "within experience base" and considered "in-family."
    STS-62   October 4, 1994    Undetected bipod ramp foam loss (fifth bipod event).
    STS-87   November 19, 1997  Damage to orbiter thermal protection system spurs NASA to begin nine flight tests to resolve foam-shedding. Foam fix ineffective. In-flight anomaly eventually closed after STS-101, classified as "accepted risk."
    STS-112  October 7, 2002    Sixth known left bipod ramp foam loss. First time major debris event not assigned an in-flight anomaly. External tank project was assigned an Action. Not closed out until after STS-113 and STS-107.
    STS-107  January 16, 2003   Columbia launch. Seventh known left bipod ramp foam loss event.

    Source: Quoted from CAIB, 2003: vol. 1, fig. 6.1-7.

    not hurt the orbiter, as well as the desire to keep on schedule. Table A2, adapted from pages 166–7 of the CAIB report, summarizes the imagery requests and missed opportunities.

    In relating a rescue and repair scenario that might have enabled the crew's safe return, Section 6.4 grapples with yet another latent assumption held by shuttle managers during and after STS-107. They assumed that, even if the foam strike had been discovered, nothing could have been done (CAIB, 2003: vol. 6, 121). There were two main options for returning the crew safely if NASA had understood the damage early in the mission: repairing the damage in orbit, or sending another shuttle to rescue the crew. The repair option, while logistically viable, relied on so many uncertainties that NASA rated this option as high-risk (CAIB, 2003: vol. 6,



    Table A2  Imagery requests and missed opportunities

    Imagery requests
    1. Flight Day 2. Bob Page, chair, Intercenter Photo Working Group, to Wayne Hale, shuttle program manager for launch integration at Kennedy Space Center (in person).
    2. Flight Day 6. Bob White, United Space Alliance manager, to Lambert Austin, head of the Space Shuttle Systems Integration at Johnson Space Center (by phone).
    3. Flight Day 6. Rodney Rocha, co-chair of Debris Assessment Team, to Paul Shack, manager, Shuttle Engineering Office (by email).

    Missed opportunities
    1. Flight Day 4. Rodney Rocha inquires if crew has been asked to inspect for damage. No response.
    2. Flight Day 6. Mission Control fails to ask crew member David Brown to downlink video he took of external tank separation, which may have revealed missing bipod foam.
    3. Flight Day 6. NASA and National Imagery and Mapping Agency personnel discuss possible request for imagery. No action taken.
    4. Flight Day 7. Wayne Hale phones Department of Defense representative, who begins identifying imaging assets, only to be stopped per Linda Ham's orders.
    5. Flight Day 7. Mike Card, a NASA headquarters manager from the Safety and Mission Assurance Office, discusses imagery request with Mark Erminger, Johnson Space Center Safety and Mission Assurance. No action taken.
    6. Flight Day 7. Mike Card discusses imagery request with Bryan O'Connor, associate Administrator for safety and mission assurance. No action taken.
    7. Flight Day 8. Barbara Conte, after discussing imagery request with Rodney Rocha, calls LeRoy Cain, the STS-107 ascent/entry flight director. Cain checks with Phil Engelauf, and then delivers a "no" answer.
    8. Flight Day 14. Michael Card, from NASA's Safety and Mission Assurance Office, discusses the imaging request with William Readdy, associate Administrator for space flight. Readdy directs that imagery should only be gathered on a "not-to-interfere" basis. None was forthcoming.

    Source: Quoted from CAIB, 2003: vol. 1, pp. 166–7.

    173). NASA considered the rescue option "challenging but feasible" (CAIB, 2003: vol. 6, 174).

    The organizational causes of this accident are rooted in the space shuttle program's history and culture, including the original compromises that were required to gain approval for the shuttle from the White House and Congress, subsequent years of resource constraints, fluctuating priorities, schedule pressures, mischaracterization of the shuttle as operational rather than developmental, and lack of an agreed national vision for human space flight. Cultural traits and organizational practices detrimental to safety were allowed to develop. NASA relied on past success as a substitute for sound engineering practices such as testing to understand why systems were not performing in accordance with requirements. Organizational barriers prevented effective communication of critical safety information and stifled professional



    differences of opinion. Management was insufficiently integrated across program elements. An informal chain of command evolved, together with decision-making processes that operated outside the organization's rules (CAIB, 2003: vol. 1, 9).

    The CAIB judged that there is a "broken safety culture" at NASA (CAIB, 2003: vol. 1, 184–9). Other factors included schedule pressure (CAIB, 2003: vol. 6, 131–9) related to the construction of the ISS, budget constraints (CAIB, 2003: vol. 5, 102–5), and workforce reductions (CAIB, 2003: vol. 5, 106–10). The CAIB concluded that the shuttle program "has operated in a challenging and often turbulent environment" (CAIB, 2003: vol. 5, 118), and that "it is to the credit of Space Shuttle managers and the Shuttle workforce that the vehicle was able to achieve its program objectives for as long as it did" (CAIB, 2003: vol. 5, 119).

    Former astronaut Sally Ride served both on the Rogers Commission that investigated the January 1986 Challenger accident and on the CAIB. During the Columbia investigation, she said she heard "echoes" of Challenger as it became clear that the accident resulted from NASA failing to recognize that a technical failure that had occurred on previous shuttle flights could have safety of flight implications even though the earlier missions had been completed successfully. In the case of Challenger, the technical failure was erosion of seals (O-rings) between segments of the solid rocket booster. Some engineers warned NASA not to launch Challenger that day because unusually cold weather could have weakened the resiliency of the O-rings. They were overruled. In the case of Columbia, the technical failure was shedding of foam from the external tank. The CAIB concluded that both accidents were failures of foresight, and that their similarity demonstrated that "the causes of the institutional failure responsible for Challenger have not been fixed" and "if these persistent, systemic flaws are not resolved, the scene is set for another accident" (CAIB, 2003: vol. 1, 195).

    The CAIB report concludes with recommendations, some of which are specifically identified as "before return to flight." These recommendations are largely related to the physical cause of the accident, and include preventing the loss of foam, improved imaging of the space shuttle from liftoff through separation of the external tank, and in-orbit inspection and repair of the thermal protection system. Most of the remaining recommendations stem from the CAIB's findings on organizational causes. While these are not "before return to flight" recommendations, they capture the CAIB's thinking on what changes are necessary to operate the shuttle and future spacecraft safely (CAIB, 2003: vol. 1, 9).

    The report discusses the attributes of an organization that could more safely and reliably operate the inherently risky space shuttle, but does not provide a detailed organizational prescription. Among those attributes are: (1) a robust and independent program technical authority that has complete control over specifications and requirements; (2) an independent safety assurance organization with line authority over all levels of safety oversight; and (3) an organizational culture that reflects the best characteristics of a learning organization (CAIB, 2003: vol. 1, 9).

    These recommendations reflect both the CAIB's strong support for return to flight at the earliest date consistent with the overriding objective of safety, and the CAIB's conviction that operation of the space shuttle, and all human space flight, is a developmental activity with high inherent risks (CAIB, 2003: vol. 1, 9).



    REFERENCES

    CAIB (Columbia Accident Investigation Board). 2003. Report, 6 vols. Government Printing Office, Washington, DC. www.caib.us/news/report/default.html.

    Smith, M.S. 2003. NASA's space shuttle Columbia: synopsis of the report of the Columbia Accident Investigation Board. Congressional Research Service, Library of Congress, Order Code RS21606.


    Part II

    THE CONTEXT OF THE DISASTER

    OATC02 06/14/2005, 10:50 AM19

  • 20 Farjoun


  • History and Policy 21

    2

    HISTORY AND POLICY AT THE SPACE SHUTTLE PROGRAM

    Moshe Farjoun

    If you would understand anything, observe its beginning and its development.

    Aristotle

    The February 2004 deadline for the Core Complete phase of the International Space Station (ISS) contributed to the Columbia accident in many ways: it pressured the already stressful space shuttle program, affected the ways information was gathered and interpreted, competed with engineers' concerns for safety, and affected other decision-making priorities (CAIB, 2003: ch. 6; chapter 7 this volume). However, the Columbia STS-107 mission was also the first flight in two years that was not actually serving the ISS. In order to understand this apparent disconnect, one needs to examine the larger historical context.

    Despite the many important changes made at NASA after the Columbia disaster, several of these risky conditions still persist. NASA uses the same complex and risky technology without adequate substitutes other than foreign spacecraft. The space shuttle program is still intimately tied to the ailing ISS and needs to serve its operational needs. And it does all this without significantly higher levels of resources, while still facing skill shortages, and while operating three of the four shuttles it had before the disaster. Consequently, a historical analysis can teach us not only about the context and environment in which the Columbia accident occurred but also how risky conditions develop and are perpetuated.

    The CAIB report, specifically Dr. John Logsdon's contribution, provides excellent historical background, explaining the evolution of the space shuttle program. I build on this account and incorporate information from other primary and secondary historical sources detailed at the end of this chapter. My intent is not to provide a detailed organizational and technological history of NASA but to focus on key events and developments that shed light on the Columbia disaster and more recent developments. I followed several recommended practices of historical analysis, such as obtaining contemporaneous sources when possible, and validating the data using multiple sources of evidence (e.g., Lawrence, 1984; Stratt and Sloan, 1989). Because of differences in sources, focus, and time frame, the key observations that I derive


    Table 2.1  Major events and developments in the space shuttle program and NASA

    Year     Development

    1958     NASA established
    1961     President Kennedy's commitment to lunar landing; the Apollo era begins
    1962     John Glenn is the first American to circle the Earth
    1967     January: Apollo 1 disaster
    1969     July: Apollo 11's successful mission to the moon
    1970     Apollo 13: a near disaster
    1972     The space shuttle era: President Nixon's decision about future spacecraft; later becomes the space shuttle (1981)
    1961–75  After NASA completes several major programs without losing any astronauts during a space flight, its accomplishments became synonymous with high reliability
    1981     First shuttle flight (Columbia)
    1983     Foam loss events start
    1984     President Reagan's announcement about building a space station within a decade
    1986     January: The Challenger disaster and follow-up investigation by the Rogers Committee
    1988     Return to flight (Discovery)
    1990     The Hubble telescope mirror incident; the Augustine Committee established
    1992     Daniel Goldin's tenure as NASA Administrator begins; during the next decade, NASA's budget is reduced by 40%
    1993     A highly successful mission to repair the Hubble telescope; important vote in Congress on the future of the space station
    1995     The Kraft report gives legitimacy to the operational status of the shuttle and to the "faster, better, cheaper" (FBC) approach
    1996     The Space Flight Operations Contract (SFOC) is signed; Goldin starts the