
should be provided to their users on different levels of formality and different levels of detail’ to eliminate the problem of the Subject Matter Expert being unfamiliar with notations based on the predicate calculus.) But we should not too hastily cast aside this approach. It might only be the analyst and programmer who end up using the specifications. Nonetheless, there is a tremendous advantage in capturing highly complex relationships that could otherwise be represented only in diagrams too difficult to read, or that would require the analyst to leap to alternative methods when the modelling requires more detail. Yourdon dataflow diagramming and DeMarco structure charting both require the analyst to leap to structured English and alternative diagrammatic expressions when the decomposition gets close to the system details. Chen diagrams leap to expressing these relationship details in cardinalities, mandatory/optional existence flags, attribute definitions, and other ancillary notations. The Kilov and Ross recommendation eliminates this leap by maintaining a consistent and precise notation throughout the modelling process. Moreover, because the same formal semantics are used throughout the modelling effort, it is far easier to re-use generic concepts or to reduce complex structures to a more fundamental library of elementary associations.

Information Modeling provides that fundamental library. The authors first set the stage by explaining how to specify a ‘contract’ that expresses the bond between objects, and then define the elementary association from which all possible associations among objects can be built. In a complete chapter, they then prove (with the aid of contributions from Joseph Morabito and Naidu Guttapelle) that higher-level associations can be modelled from the elementary forms. But not resting there, they follow up with a complete example of how to go about building an object-oriented information model, complete with a sample dialogue between an object modeller and a Subject Matter Expert. They teach us well how to focus on specifying the contract and to treat cardinalities as less important details, and then go on to show the reasoning behind the steps taken in their extended example.
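The general idea of a contract with pre- and postconditions can be sketched in executable form. The sketch below is purely illustrative and is not taken from Kilov and Ross (their book uses formal specification notations such as Object Z); the `deposit` operation and its conditions are hypothetical examples.

```python
# Illustrative sketch of a 'contract': an operation is wrapped so that
# its precondition is checked before execution and its postcondition
# after. Hypothetical example, not Kilov and Ross's actual notation.
def contract(pre, post):
    """Wrap an operation with precondition and postcondition checks."""
    def decorate(op):
        def wrapped(*args, **kwargs):
            assert pre(*args, **kwargs), "precondition violated"
            result = op(*args, **kwargs)
            assert post(result, *args, **kwargs), "postcondition violated"
            return result
        return wrapped
    return decorate

@contract(pre=lambda balance, amount: amount > 0,
          post=lambda result, balance, amount: result == balance + amount)
def deposit(balance, amount):
    # The contract, not the body, is the specification of the bond
    # between the objects involved.
    return balance + amount
```

The point the authors make is that the contract, stated precisely and uniformly, is the primary artefact; details such as cardinalities come later.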

The book concludes with an overview of the standardization efforts that the International Standards Organization and the American Standards Association have made toward building an implementation-independent way of specifying objects. This effort is critical to the objective of creating peripatetic objects that can be distributed everywhere, regardless of platform. The authors want us to know that their work is not only informed by the standards efforts, but that they have also influenced those standards. Their association with Bellcore, the American telecommunications entity that was spun off by AT&T, has certainly provided them with opportunities to refine their approach.

Book reviews

Most unfortunately, the book does not live up to its promise of providing a way to specify objects and their relations mathematically. Almost as soon as the authors have won the reader over to a rigorous pursuit of formal methods, they themselves turn to the natural-language and pictographic formulations they decry. They equivocate on the best method by saying that it doesn’t matter so long as it is precise. And in an appendix in which they provide an example of a complete, formal specification, they turn to Object Z (a specification language that has many adherents internationally), while still remarking that the ‘particular choice of a notation is not the most important aspect. Any notation that supports formal and disciplined specifications is acceptable’.

The central point of Information Modeling: An Object-Oriented Approach stands. While a mathematical approach to object definition might not succeed in analysis sessions with end-users, it seems essential when analysis moves to design. When object-oriented databases must be constructed so that an object may be instantiated on any vendor’s computing environment, a mathematically precise specification of those objects and object classes will be vital.

Robert J Vitello, Wadsworth Center for Laboratories and Research, US

Computers in Context: The Philosophy and Practice of Systems Design
Bo Dahlbom and Lars Mathiassen
NCC Blackwell (1993) 306 pp (inc. bibliography and index) £37.50 hardback, £14.99 paperback
ISBN 1 55786 405 5

‘The goal is to take a few steps in the direction of a richer, clearer, more personal understanding of the practice of developing computer systems, in order to improve that practice.’ This aim could be more accurately stated as giving an appreciation of the philosophy of developing computer systems, considering that the book only occasionally discusses the practice of developing computer systems.

The stated audience is students of computer science or information systems at advanced undergraduate or postgraduate level and practitioners and professionals in the computing field.

The authors make extensive references to theories from old and new philosophers, from Socrates to Sartre and Plato to Pirsig.

Throughout the book discussions are reiterated, each time from a slightly different point of view or to introduce a further complication. This makes parts of the book laborious, but it must be noted that the authors explicitly use this approach and justify it by pointing to the Platonic technique of dialectics, the details of which are provided in Chapter 2.

Each section begins with a page or so of introduction. This states the aims of each section and outlines the discussion to follow. These introductions provide a good road map through the diversity of material presented and help to clarify the purpose of some of the more oblique inclusions.

The opening section, entitled ‘Systems’, introduces two views of the world that are frequently referred to through the rest of the book: the mechanistic and the romantic world views. The development of these views and the underlying philosophies, from the Greeks to modern philosophers, is discussed. The authors go on to say that the world of computers has traditionally been dominated by the mechanistic view and that there is a lot that can be learned by applying some of the ideas of the romantics in the world of computers. ‘These two world views are clearly contradictory, but they can enrich one another only if they are allowed to encroach on each other’s territory. It is when we confront a mechanistic world view of organisations with a romantic view, or a romantic view of computing machines with mechanistic ideas that interesting things begin to happen.’ (p 44.)

The final chapter in this section toils through the notions of hard systems thinking, soft systems thinking and what the authors call ‘dialectic systems thinking’. The ideas are presented with few tangible examples, apart from one example that raises more issues than it clarifies. It could be argued that this is intentional on the part of the authors.

The second section, called ‘Development’, deals with three approaches to system implementation: construction (traditional development), evolution (a style of prototyping) and intervention (where computer systems development is the key agent for organizational change). The authors then proceed to compare the three approaches from a number of positions and invite the reader to decide when it would be appropriate to use each method. The concepts in this section are presented clearly and solid examples complement the theory.

The third section discusses the notion of ‘Quality’ in systems development. First the idea of quality of artefacts, in terms of functional, aesthetic and symbolic quality, is argued. Then quality in the context of an organization’s culture is discussed, with a useful example of a quality software development process. Finally, quality is discussed in the context of the various stakeholders of a given system. The authors propose that different stakeholders will have divergent views on what constitutes a quality system. Specifically, systems often give certain stakeholders more political power over others. For example, a production control system may enable managers to monitor closely the activity of operatives. Both groups are real users of a system but may have conflicting ideas on what makes a good system. The system developer who appreciates these issues is called a political agent. ‘The political agent cannot say “good” without asking, good for whom?’ (p 183.) The section concludes with a discussion of these three aspects of quality: artefacts, culture and power. The discussion raises the issue of the social responsibility of technologists: ‘[Computer professionals] have an expertise that makes them morally obligated to speak up against the development of low quality computer systems and the irresponsible use of computer technology.’ (p 199.)

The book concludes with a section entitled ‘Practice’. Unfortunately several passages in this section are rather opaque and it takes a number of readings to penetrate the depths. Some of the links between the theories presented and the practice of systems development are tenuous and the general argument lacks the lucidity of earlier sections. The final chapter of the book, however, makes up for the difficulties of Chapters 10 and 11 by fulfilling the promise of ‘illustrating the relevance of philosophical reflection to the very concrete problems of practice’. (p 201.) In particular the dynamics of system development and the fact that systems often introduce significant organizational change are reiterated. The text concludes as follows: ‘The very idea of systems development is to change organisations . . . the only way that this can be rationally done is by forcing our philosophy to confront our practice, and our practice to confront our philosophy.’ (p 270.)

There are a number of appendices that greatly enhance the value of the book. The ‘Further Reading’ guide on a chapter-by- chapter basis provides an excellent reference into source material, as does the com- prehensive bibliography. There are notes for instructors that provide some pointers on how the text might be used on an academic course. The end of the book has five or six discussion questions per chapter that could be used in a seminar forum with students.

The typesetting is good, with line spacing bigger than usual, and this mostly improves readability. The text is infrequently broken by a cartoon; a few more graphics would help to ease the reader through the longer passages of text. After a little use the cover of the paperback tends to curl wide open when the book is closed.

The authors have tried to use informal English where possible and this has the overall effect of making the material easier to understand. Occasionally they use awkward phrases, causing certain passages to be hard to comprehend and introducing unnecessary ambiguity. For example, the authors refer to the ‘dequalification’ of work when referring to automation and the removal of skill from work. In the final chapter the authors use the word ‘arsonist’ when referring to a person with an entrepreneurial nature. Now and again inappropriate English idiom is used when many other phrases could be used to equal effect. Indeed, in the days of Salman Rushdie, light-hearted references to religious texts and faiths could be described as ill-advised.

The authors advocate a very radical approach to design and development: a reflection on the meaning and purpose of computer systems to understand more fully the problems at hand. The book raises a whole range of questions and invites and challenges the reader to reflect upon the issues and form an opinion. It provides a new perspective worth consideration by both computer professionals and students, although its distinct lack of prescription may make the material difficult to grasp for some readers. The book is certainly worth buying for those looking for something more than the latest buzzwords.

Vincent Jordan Trinity College, Dublin

Systems, Software, and Quality Engineering: Applying Defect Behavior Theory to Programming
Arthur E Ferdinand
Van Nostrand Reinhold, New York (1993) 416 pp £51.50
ISBN 0 442 01730 8

This impressive book on systems and software quality deserves to be read and used by software engineers and managers, but it is not for the faint of heart. As the author states, it is intended ‘to be useful to product and software developers, scientists and graduate students, and to the game executive who, through the overview sections, could understand the nature and basis of quality and how to influence it.’ [Emphasis added]. It succeeds in this goal largely because it can be read on two levels: either by the graduate student who examines and verifies each mathematical step, or by the practitioner who considers its recommendations through a more cursory examination of the supporting equations. It will be especially useful for software engineers and managers who are looking for ways to increase quality and productivity through process improvement. It is a finely crafted book with many helpful charts and graphs.

The book itself, and each chapter after the first, is structured as an overview followed by detailed supporting argument. The first chapter introduces the principal topics of the book, and motivates further reading. Even though this chapter contains no equations per se, it may nonetheless come as something of a shock to the reader whose mathematical abilities have atrophied. For example, the author suggests that for small projects the ‘work effort’ required for system development is an amount proportional to n^(p/(2p−1)) (n being the size and p being the ‘complicatedness’, typically 1.5 ≤ p ≤ 2.5), whereas for larger projects the work effort is proportional to n^(p²/(2p−1)). While it is not necessary to have mathematical training to understand statements like these, it can be difficult to appreciate their full significance without it. The reader who does not have an intuitive feeling for the characteristics of such functions will find it hard to understand this book’s arguments. The author’s own experience in executive management at IBM lends credibility to his claim that the ‘game executive’ will have sufficient background, though I must admit some scepticism.
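The character of such power-law effort functions can be sampled numerically. The sketch below is illustrative only: the exponent forms n^(p/(2p−1)) for small projects and n^(p²/(2p−1)) for large ones are an assumption reconstructed from a poorly reproduced original, not necessarily Ferdinand's exact model.

```python
# Illustrative only: the exponent forms below are an assumed
# reconstruction of the work-effort scaling discussed in the review,
# not a verified statement of Ferdinand's model.
def work_effort(n, p, large_project=False):
    """Relative work effort for a system of size n with 'complicatedness' p."""
    exponent = (p * p if large_project else p) / (2 * p - 1)
    return n ** exponent

# With p = 2, small projects scale as n^(2/3) and large ones as n^(4/3),
# so large-project effort grows much faster than system size:
for n in (10, 100, 1000):
    print(f"n={n:5d}  small={work_effort(n, 2.0):9.1f}"
          f"  large={work_effort(n, 2.0, large_project=True):9.1f}")
```

Even this toy version shows why an intuitive feel for such functions matters: under the assumed exponents, doubling n raises large-project effort superlinearly while small-project effort grows sublinearly.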

The book presents parameterized, predictive models of defect behaviour in software and systems. The first two chapters introduce, by analogy to statistical

398 Information and Software Technology 1995 Volume 37 Number 7