

American Society for Quality

Software Division

BY RON McCLINTIC

The burgeoning growth rate of Web-based systems has caused changes to the role of software quality assurance (SQA). As if release and schedule pressures were not already high, Internet speed has pushed them even higher. The trend to Internet systems has also placed new emphasis on application speed. Primarily this has been driven by the competition to increase end-user satisfaction. This is, of course, a traditional retail objective manifesting itself in the Internet world.

Given the need for speed and the increasing demands on SQA resources, it is only natural that many SQA teams are adopting sophisticated tools and processes designed to automate as much of the SQA function as possible. One obvious type of tool enjoying widespread success is automated load, performance, and scalability (LPS) tools, such as Mercury Interactive's LoadRunner and Segue's Silk Performer. The load test tools are extremely valuable in ensuring that an Internet system can meet service level expectations before it is released. However, keeping the initial service level in production or through successive releases can be challenging.

In preproduction the soon-to-be production environment can function as the load and performance test lab. This approach is ideal from many perspectives. First, it is economical. No separate test environment must be built. Second, the actual production application/system is tested on the actual production platforms. Third, an entire promotion cycle is eliminated, thus eliminating the opportunity for defects to be injected during the final installation, configuration, and start-up phase.

However, once the system is in production, this approach tends to fall apart. Unless the application owner is willing to risk installing the new releases and then testing them in the production environment, load testing (as well as other types of testing) becomes problematic. This paper explores some of the issues, as well as possible resolutions to this problem.

The simplest and most thorough approach is to create a test environment that exactly duplicates the production environment. However, this can be very expensive, especially for Web-based businesses. Many Web-based companies are true intellectual firms, outsourcing the Web site to a commercial hosting facility. Others are in a true venture capital start-up mode, and every dollar is critical. In these situations a duplicate production environment is usually cost prohibitive.

After a Web application is in production, two conditions can cause the need to perform additional LPS testing. The first is growth in the user base, either through acquisition of new users, increased frequency of use by the exist-

(cont. on p. 3)

BY LINDA WESTFALL

I hope you all had a great holiday season! This is my first Chair's Corner article as your new Software Division chair. On July 1 Sharon Miller (treasurer), Tim Surratt (secretary), Mike Kress (chair-elect), and I started our two-year term as the officers of the division. We are excited about this opportunity and are looking forward to continuing the tradition of increasing the value of the division to you, its members.

Growing the value you receive from your Software Division membership was the focus that Jayesh Dalal, immediate past chair, set at the beginning of his term in 1998. I would like to thank him and other council members for their excellent and dedicated service to the division. I would also like to remind everyone of some of their accomplishments and point out opportunities for you to become more involved in the Software Division.

In 1998 the Software Division introduced the Software Quality Professional, the first journal that focused on you, the software quality professional. Taz Daughtrey, journal editor, its editorial board, and its contributors have done an outstanding job in delivering a product that is both thought provoking and full of information that is directly useful to us in our chosen profession. You can help with the continued success of Software Quality Professional:

• If you have not already done so, purchase a subscription to the journal or solicit corporate subscriptions from your employer organization.

• Submit an article to the journal. Has your company embarked on a process improvement journey? Tell us about your lessons learned. Have you adopted a useful new method, technique, tool, or metric? Share your success story and help others benefit from your experiences. Have you read a good software book that provided you with useful information? Write a book review.

During Jayesh's term of office the Software Division held two excellent conferences, the 8th and 9th International Conferences on Software Quality (ICSQ). We have already repeated this success with 10ICSQ. Thanks go to Theresa Hunt, program chair; to the conference chairs Terry Deupree (8ICSQ), John Pustaver (9ICSQ), and Rusty Perkins (10ICSQ); to their committees; and to all the tutorial and paper presenters and keynote speakers. David Zubrow, 11ICSQ conference chair, and his committee are already hard at work planning next year's conference in Pittsburgh. Here's how you can contribute to the continued success of the International Conference on Software Quality:

• Submit a paper or tutorial for the conference. The call for papers is included in this newsletter.

• Contact Theresa Hunt or David Zubrow and volunteer to be part of the conference committee. We need people to review papers and tutorials, help solicit exhibitors and sponsors, and provide other logistical help with the conference.

• Attend 11ICSQ in Pittsburgh, October 22-24, 2001. Watts Humphrey has already been confirmed as one of our keynoters. Our conferences are wonderful places to network with other software quality professionals, learn about what is working in other organizations, or to simply get reinspired.

Over the past two years, Tom Griffin, publications chair, and his committee have continued to improve the technical content and value of our Software Division's newsletter, Software Quality. Regular features include technical articles, the "Measuring Up" column about software metrics, information about what is going on in the standards arena, and reports from the Software Division committees. New with this newsletter is a set of practice certified software quality engineer (CSQE) review questions and their answers. You can contribute to the Software Division newsletter in these ways:

• Submit a technical article, "Measuring Up" column, one or more CSQE review questions, or a letter to the editor to Tom Griffin.

• Send in suggestions on what topics you would like to see covered in the newsletter.

• If you attend a conference or meeting that has a particularly good paper or speakers, talk to them about contributing to our newsletter and put them in contact with us.

Under the leadership of Doug Hamilton, certification chair, and his committee, the Software Division continues to strengthen the certified software quality engineer program. The CSQE exam has now reached its fifth birthday and the body of knowledge is being revisited. You can contribute to the success of the CSQE program:

• If you are not already a CSQE, consider taking the examination and becoming certified. The next application deadline is April 6, 2001, for the June 2, 2001 exam.

• If you are a CSQE, contact Doug Hamilton and become involved in the exam preparation process.

• If you are one of the individuals who receive a survey about the new CSQE body of knowledge, take the time to provide us with your input.

• Promote the CSQE certification and its value with your employer and your peers.

The Software Division continues to strengthen the role of its regional councilors. Over the past two years, our regional councilors have been involved with successful local conferences and other events that help spread the software quality message at the local level. You can help represent the Software Division in your area:

• Contact your regional councilor and volunteer to be a deputy regional councilor for your local ASQ section.

• Become active with local professional organizations like your ASQ section, Software Process Improvement Network (SPIN), or other software special interest groups. Volunteer to give a presentation on software quality to one of these organizations.

• Network with other software and quality professionals in your organization or in your community and tell them about the benefits of ASQ and the Software Division.

I am excited about serving as your chair. Your Software Division Council and I will do everything we can to continue increasing the value of our division to you, its members. But we continue to need your input and help. Why don't you make it one of your new year's resolutions to become more active in the ASQ Software Division? I hope this article gives you some ideas on how you can both contribute to and benefit from your Software Division membership. Help us set the direction of the Software Division for the new millennium. CALL! PROVIDE YOUR INPUT! GET INVOLVED!


ing user base, or both. The second is the introduction of new features and functionality.

As the user base and/or frequency of use grows, eventually the original performance characteristics will deteriorate. In other words, new system bottlenecks will be found. They may be hardware based, application architecture based, or actually rooted in the operating systems. If enough user load is placed on any system, the bottlenecks will be found.

New functionality can also uncover performance issues. If nothing else, new functionality should be expected to increase both the number of users and their frequency of use. But new functionality often is used as an opportunity to introduce new technology (or upgrade existing technology). Performance and scalability is always at risk when new technology is introduced. There may be a learning curve associated with proper implementation of the new technology, or the new technology may just not yet be stable.

As if LPS testing of planned or anticipated change were not enough, we must also consider the unexpected test needs. Just as the SQA function has felt the pressure of Web speed, so has the repair and maintenance (R&M) function. One excellent technique for economizing on such resources is to perform exhaustive preproduction LPS testing to predict the likely bottlenecks. This allows the R&M team to get a head start in addressing these bottlenecks before they are uncovered in production. However, any change made to the system as a result of emergency repairs has the same testing requirements as the planned introduction of new functionality. The issue still boils down to how the testing is going to be performed in an efficient and effective manner.

There are two techniques that, when used in conjunction, will alleviate these financial concerns. The first is to apply scaling and benchmarking to the production environment in conjunction with a scaled down test environment. The second is to use the appropriate production monitoring tools. To be effective these approaches require application of sophisticated SPC techniques, namely control charts, to the data generated from these techniques.

The first technique is to use a scaled down test environment. One typical situation is the case of a Web application being hosted in a commercial hosting center. The organization creating the application will either acquire and maintain a scaled down version of the hosted production environment, or will actually lease such an environment from the hosting center. Obviously there is additional cost associated with the duplicate environment. However, this second environment can serve two purposes. First, and often overlooked, there can be a substantial handicap to the development team when developing a multi-server scalable Web application on desktop machines. Having access to a more robust development environment during development is a great idea. The second benefit is the ability to perform load testing as early as the architectural prototype phase of development.

To use the scaled down environment for load testing, it must be benchmarked against the production environment. Obviously this implies that the load testing on the scaled down environment will be predictive, not definitive. A slightly lower reliability in the test results is offset by the decreased costs of using a scaled down test environment, as opposed to acquiring a production equivalent test environment. However, done correctly, this is a reasonable return on the investment in the scaled down test environments.

The second approach to load testing is to proactively monitor the production environment. In other words, the trends in real-time data, gathered on a continuous basis, become a predictor of application performance, as well as future performance constraints.

Why are statistical techniques needed to make these two approaches feasible? Simply put, both techniques are proxy indicators of the actual production system. As such there is an element of uncertainty in the results. Generally speaking, a classic use of probability and statistics is when a predictive measure must be used to represent the actual system in question. The best method we have used yet in these situations is the application of control charts, specifically the X-bar R chart.
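To make the mechanics concrete, here is a minimal sketch of how X-bar and R control limits could be computed from subgrouped response-time data. The subgroup data and variable names are hypothetical, not from this article; A2, D3, and D4 are the standard tabulated control-chart factors for subgroups of five.

    # Minimal sketch (hypothetical data): constructing X-bar and R control
    # limits from subgrouped response-time measurements.
    from statistics import mean

    # Each subgroup holds five response times (seconds) from one
    # monitoring interval, collected while the system is believed stable.
    subgroups = [
        [5.7, 5.9, 5.8, 6.0, 5.6],
        [5.8, 5.7, 5.9, 5.8, 5.7],
        [6.0, 5.8, 5.6, 5.9, 5.8],
        [5.9, 5.8, 5.7, 6.0, 5.9],
    ]

    # Standard control-chart factors for subgroup size n = 5.
    A2, D3, D4 = 0.577, 0.0, 2.114

    xbars = [mean(g) for g in subgroups]           # subgroup averages
    ranges = [max(g) - min(g) for g in subgroups]  # subgroup ranges

    xbar_bar = mean(xbars)  # grand average: centerline of the X-bar chart
    r_bar = mean(ranges)    # average range: centerline of the R chart

    ucl_x = xbar_bar + A2 * r_bar  # X-bar chart limits
    lcl_x = xbar_bar - A2 * r_bar
    ucl_r = D4 * r_bar             # R chart limits
    lcl_r = D3 * r_bar

    print(f"X-bar chart: LCL={lcl_x:.2f} center={xbar_bar:.2f} UCL={ucl_x:.2f}")
    print(f"R chart:     LCL={lcl_r:.2f} center={r_bar:.2f} UCL={ucl_r:.2f}")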

As any experienced QA professional knows, control charts have been used for years in manufacturing to serve just this purpose. But SQA professionals also know the software industry has been slow to adopt sophisticated SPC techniques. Why this is so is out of the scope of this paper, but it could be as simple as lack of education. Typically, few computer science degree programs address SPC techniques.

From management's perspective there is great value in being able to differentiate between common cause and special cause problems. One reason for this value we have already alluded to: the potential for cost savings by scaling down the test environment. But the second, and often overlooked, reason is the ability to base resource and schedule planning on actual data. Assigning the correct resource to a problem is always more efficient than just assigning any available resource. Differentiating between common cause and special cause problems is a key determinant for resource allocation.

For those unfamiliar with control charts, the basic premise is that once a system is under statistical control, upper and lower control limits can be calculated. Any normal variations within these limits are considered to be due to common causes. In other words, they are due to random variation, or noise. Variation outside of the control limits is considered due to special or one-time causes.

These concepts are particularly applicable to Internet traffic. A simple example of common cause variation would be the normal ebb and flow of traffic to a Web site within a 24-hour period. (Our office certainly experiences performance issues with our Internet connection during the lunch hour!) An example of special cause variation would be the difference in traffic after the Super Bowl commercial advertising a new site. With a little thought many examples of common and special cause variation can be found.

One fundamental concept with common cause and special cause variation is that management owns the process (i.e., has the responsibility for common cause variation) while the staff owns the special cause. While there are exceptions to both of these concepts, they really do hold true most of the time.

Every process or system has an inherent capability to perform. In the case of a Web-based retail system, the performance and concurrent user load capacity will be directly correlated to the time, money, and effort put into designing and building the system. If accurate assumptions are made regarding user demand, then for a given amount of investment, a given amount of performance will be achieved. The realized level of performance can be affected by several variables. They range from the robustness of the hardware the system runs on to the talent level of the development staff. The resource allocation decisions are management decisions. Once those decisions are made, and the best system that can be built given those resource constraints is created, then the day-in, day-out performance of the system is set.

This is represented in the sample control chart seen in Figure 1.

Figure 1. Sample X-bar control chart of response time in seconds, showing the upper control limit (UCL), the X-bar centerline, and the lower control limit (LCL).

In Figure 1 we see that after implementation of the system, the gathering of initial performance data, and construction of the control chart, the average response time was 5.8 seconds. The upper and lower control limits (UCL and LCL) are 6.28 and 5.3 seconds, respectively. This means that as long as production response time falls within the UCL and the LCL, the system is stable and performing as designed. (In actuality there are more rules dealing with the trend characteristics of the data, but that is out of scope for this discussion.)
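As a small illustration of reading such a chart, the sketch below flags points that fall outside the limits quoted above (UCL = 6.28 seconds, LCL = 5.3 seconds). The observation values are invented for the example.

    # Sketch: classifying observations against the Figure 1 limits.
    # The limits come from the article; the observations are invented.
    UCL, LCL = 6.28, 5.3

    observations = [5.7, 5.9, 6.1, 5.6, 6.4, 5.8]  # response times (seconds)

    for i, x in enumerate(observations, start=1):
        if LCL <= x <= UCL:
            print(f"point {i}: {x:.2f}s within limits (common-cause variation)")
        else:
            print(f"point {i}: {x:.2f}s OUTSIDE limits; investigate special cause")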

Any variation within the UCL and LCL is attributed to common causes. In other words, it is attributed to the normal randomness of the real world. Most of the subsequent data shown plotted here fell within the limits. If the performance characteristics of the system (the mean and variation) do not meet the original specifications, the only solution is to begin major rework, or perhaps start over. This is why management is said to own the common cause variation.

However, one data point was above the UCL. This variation was likely caused by a special cause. That is, it is attributable to a real-world condition that was unanticipated by the original specifications. It could have been a peak load that was just unusually large. Or it could have been an internal issue. For example, perhaps a DLL was upgraded, and the upgrade failed. Regardless, it was a one-time situation that was unanticipated. When such a variation occurs, the maintenance team should investigate. In the case of the unusual peak, little can be done but to note it for future reference. If it happens again with some frequency, perhaps a pattern will arise to suggest the cause. However, in the case of a failed upgrade, the process can be modified to prevent such a failure in the future. This is why the operators are said to own special cause variations.

So how does this aid the resource management process? SPC theory suggests that management has created the upper and lower control limits through choices made in the creation of the process. Mean performance statistics, as well as the upper and lower control limits, are a direct result of the time, effort, resources, and money placed into all aspects of the system development. Management controls the resource allocation process. If the resulting mean and control limits do not meet the user's expectations, then SPC theory suggests the answer is to create a new process. This is not a quick, or inexpensive, fix. Nor is it in the domain of an R&M team. Taking the example of the daily peak usage variation, it is management that decided to either a) build a system that met user expectations of the time, or b) live with the fact that occasionally normal peaks would cause a temporary performance drop.

Unfortunately we often see resources deployed in exactly the opposite manner that SPC theory suggests. We believe the reason for this is a fundamental lack of understanding of process capability. If a software system is performing to its capability and the results are not up to users' expectations, the first reaction is to try to fix it. However, if the control chart indicates that it is performing within the control limits, it is not broken. You can replace it with a new system, but a quick fix is not the answer.

Over time a system's performance will degrade. There are several reasons for this. One, of course, is increased user load. But it can also be a growing database, memory leaks, or other applications demanding resources in a shared environment. This last point is particularly worrisome in large shared data centers as well as commercial hosting centers. In either case, it is the result of trying to get economy of scale in capital investments. In all of these situations a control chart is invaluable. Here is a typical scenario.

In the preproduction testing, an automated tool is used to simulate expected production conditions. An X-bar chart is constructed. This control chart is used for two purposes. First, if the test environment is different from the production environment (i.e., scaled down for cost reasons), the test control chart can be compared with the ultimate production chart as a means to calibrate the test environment. Second, the test chart can be used to determine if proposed alterations to the system actually improved the system capability or not. This is accomplished by establishing a control chart in the test environment using the current production version of the system. Then add the proposed changes, and redo the chart. True process improvement will be visible. While the exact magnitude of improvement is unknown, the fact that positive improvement was made can be proved. (Alternatively, the avoidance of a negative impact to the system can also be proved.)
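A hedged sketch of that before-and-after comparison follows: summarize the chart built from the current production version, rebuild it with the proposed changes applied, and compare centerlines and average ranges. All data and names here are illustrative assumptions, not the article's tooling.

    # Illustrative sketch of the before/after comparison described above.
    from statistics import mean

    def chart_summary(subgroups):
        """Return (grand average, average range) for subgrouped data."""
        xbar_bar = mean(mean(g) for g in subgroups)
        r_bar = mean(max(g) - min(g) for g in subgroups)
        return xbar_bar, r_bar

    # Baseline: current production version under a simulated load.
    baseline = [[5.8, 6.0, 5.7], [5.9, 5.8, 5.8], [6.0, 5.7, 5.9]]
    # Candidate: same simulated load after the proposed change is applied.
    modified = [[4.1, 4.9, 3.6], [4.5, 3.8, 4.8], [4.0, 5.0, 3.7]]

    base_center, base_spread = chart_summary(baseline)
    mod_center, mod_spread = chart_summary(modified)

    print(f"centerline: {base_center:.2f}s -> {mod_center:.2f}s")
    print(f"avg range:  {base_spread:.2f}s -> {mod_spread:.2f}s")
    # A lower centerline with similar spread suggests a real improvement;
    # a lower mean with a much wider range mirrors the ODBC driver example
    # later in this article: faster on average, but less predictable.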

Once in production, the application is monitored and a control chart is constructed. Final production performance results are compared to initial user requirements as well as actual user satisfaction. If all is as expected, the charts are published and archived.

Over time the system capability is continually updated, and monitored, with production data. The goal is that the system administrators will detect decreases in system capability before the user perception of decreasing performance occurs. If a special cause problem has occurred, the R&M team must attack.


But if system capability is degrading, the control charts should aid in convincing the application owner that a system issue has occurred, and to approve the appropriate action.
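One plausible way to automate that early detection, sketched below under assumed names and invented data, is to watch a rolling window of production subgroup averages for a sustained run on one side of the centerline; a run of eight points is a commonly cited supplementary control-chart test.

    # Assumed approach (not from the article): flag a sustained run of
    # production subgroup averages above the centerline before users
    # notice the degradation.
    def sustained_shift(xbars, centerline, run_length=8):
        """True if the last run_length subgroup averages all exceed the
        centerline, suggesting degrading capability rather than noise."""
        recent = xbars[-run_length:]
        return len(recent) == run_length and all(x > centerline for x in recent)

    history = [5.8, 5.7, 5.9, 5.9, 6.0, 5.9, 6.0, 6.1, 6.0, 6.1]  # invented
    if sustained_shift(history, centerline=5.8):
        print("Sustained shift above centerline: capability may be degrading.")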

Here is an example of a fix that was no fix at all. A developer proposed switching the ODBC driver as a possible solution to performance issues. In the development environment it was found to be about 30% faster. In a preproduction LPS test in the test environment, it was found that on average the modified application was 30% faster. However, the variability was two times greater. The users, when asked which they preferred, chose to stay with the old driver. They preferred predictability. This probably was the correct answer for other reasons. It has seemed that systems with wider-ranging data are more fragile, but this is anecdotal.

In conclusion, we have learned that control charts do have applicability in software development, in at least the following ways:

1. Resource allocation.

2. Test lab calibration.

3. Prediction of production performance changes due to new development or R&M without the need for a full-scale test lab.

4. Monitoring production system performance through control charts.

The briefness of this paper has perhaps caused more questions to be raised than were answered, depending on the reader's knowledge of SPC as well as software development and testing. Our experience applying control charts has been overwhelmingly positive, but by no means exhaustive. If this paper inspires others to experiment with control charts, then our goal has been met.

Two good references for the construction of control charts are:

Statistical Quality Control Using Excel, Zimmerman & Icenogle, ASQ Press.

Juran's Quality Handbook, Juran & Godfrey, McGraw Hill.

Ron McClintic is director, QA/Testing Center of Excellence for PSINet, the Internet Carrier. He is a CSQE and can be reached at [email protected].

BY TOM GRIFFIN

Your newsletter will continue to arrive by mail for the foreseeable future. As I noted in the last issue of Software Quality, technology does offer us the opportunity to do more with less, and we will continue to move toward delivering the benefits of electronic distribution to you. But we aren't planning to leave anyone behind who is "addicted" to paper. Many interesting comments about our dilemma were expressed in e-mail to me.

Some of the suggestions that came in are given below.

Jack Faulk suggested we "Eliminate the hard copy of the newsletter and instead send a pdf file to each member. You should also provide a copy on the Web so members without up-to-date e-mail accounts can still access the newsletter. The Web copy should be password protected."

Cathy Strabala said, "I like the Web delivery with an e-mail notification idea. This is ideal for the searchability aspect. I hate the waste of paper and paying the USPS when we can get the info electronically for free. Who in the S/W field doesn't have e-mail and/or the Internet? Also, dues to the division could be reduced, even encouraging more folks to join."

Tom Noteboom wrote: "Just a thought about ASQ's Software Quality newsletter. I enjoy the use of my Palm Pilot as a means of storing reading material. I access www.avantgo.com to support my habit. There are likely others out there that would allow me to download daily updates to that reading list. Perhaps an arrangement might be made with such an organization to have 'members' be able to download routine articles. Stop the quarterly process and go real time with Web-based information and on-the-go tools. Just a thought."

I checked the site Tom Noteboom mentioned and found that we could create a channel for the newsletter without too much trouble.

Jerry Obenour made good points about flexible delivery options. "I much prefer the current hardcopy mailing. My time is very expensive and any other modality requires my time to download, read, and/or print. My cost would be much higher than whatever modest fee I am currently paying for the newsletter. I think the current deal is a real bargain. I edit a newsletter in another life and I let my members choose: the Internet version (browser version, password protected) is 'free'; hardcopy mailing is available at my cost. About half of my customers pay for the hard copy. Let the marketplace decide."

Rick Gonser writes, "I am completely satisfied with the Software Division newsletter, and have been since its inception. However, I could probably be persuaded to switch. Off the top of my head, I can see problems that would affect me personally, as far as an electronic copy is concerned. These are in addition to the ones you've carefully mentioned in your editorial.

"All of my URLs have file-size limitations. Therefore, any file(s) greater than 1MB might not make the trip. In addition, if pdf format is used, then I'd have a printer problem (and I'm sure I'm not alone). Our typical printers at work are HP 5si's, which are OK, until you direct more than 33 pages of pdf their way. Our best network printers are new HP 8000s. However, they're not too happy with large pdf files either. Some of this could be Novell NetWare related, even though we have a 1GB backbone. Since I see no foreseeable change with our current setup in the near future, I'd be worried if you go to an electronic copy."

Based on Rick Gonser's thoughts I checked the Fall 2000 newsletter issue and found that the pdf file was only 300Kb. This was a relief since I really hadn't thought about file-size problems. If anyone would like me to e-mail this file to them, just send me an e-mail with "pdf test" in the "subject" field and I'll send it back to you. This will enable you to test to see if you can handle the file. I've also set up a download test at http://www-biz.aum.edu/tomgriffin/pdftest which will allow you to test downloading the file from the Internet.

Please keep your comments on this issue coming. Thank you to all who responded with helpful suggestions and thoughts.

By the time you read this, the next U.S. TAG meeting on ISO Software Engineering standards will have taken place. At the time of this writing (early December), though, here is the status of standards matters relative to software that have changed since the last report:


ANSI NATIONAL STANDARDS STRATEGY

ANSI, which is the member body for the United States in ISO, has developed a strategy for pursuing standards. This National Standards Strategy, though not software-specific, still seems to me to be important for us as a division to know about and discuss as to how it can be implemented in the software industry. To get a copy of the NSS document, you can go to http://web.ansi.org/public/nss.html and look for a downloadable version of the document.

ISO 15504-2

This is the normative document for the revised ISO 15504 document set. According to the disposition of comments report, the Committee Draft (CD) version of this document was disapproved by an 8 to 7 vote (with 12 nations not voting at all, i.e., not even a formal abstention). I believe revisions will be made and another CD ballot will be generated. A 2001 target for achieving IS status is still planned, however. The United States, by the way, though it had comments, voted to approve.

The ISO 15504-3 document (guidance on conducting an assessment) is targeted for either Working Draft (WD) or CD balloting in 2001 with a (late) 2001 or a 2002 publication as an approved International Standard (IS). Parts 4 and 5 are not yet at the WD level.

ISO 9000:2000

ANSI will have the formal IS versions of these available December 15, 2000. The cost of 9000, 9001, and 9004 will be $62, $54, and $84, respectively. A "kit" consisting of these three plus the Transition Guide ($10 on its own) will be offered at $135. At this time, ASQ is saying it will have versions available (the "kit") to ASQ members for $96 by the end of December, and for $120 for nonmembers.

ISO 9000-3

It would appear that, though formal turnover has not occurred, TC176 has voted to turn ISO 9000-3 over to the SC7 organization. In the meantime, "TickIT has issued a document that is an apparent replacement for 9000-3," according to reports from a joint SC7 Working Group meeting held in October. Thus, TickIT, it would seem, will go ahead with their registration/certification scheme without, for the time being at least, an updated version of ISO 9000-3. It does seem clear that TC176's Subcommittee 2 (SC2) document numbered N442 will be used as a basis for some of the revision work.

Since the Software Division is represented as a voting member of the U.S. TAG once again, I will be able to keep the division updated on the progress of the revision work on 9000-3 and carry its opinions with regard to the standard to the TAG meetings.

ISO 14143-2

This is the standard on Functional Size Measurement and it has reached Final Draft International Standard (FDIS) status. To be balloted at the FDIS level means that a "Yes/No" vote occurs without comments to agree (or disagree) that the document should be adopted as a formal ISO IS.

AN OPINION SEEKING DIRECTION

To close, I'd like to solicit some feedback on a subject relative to software quality and standards. Much of the standards material seems to focus on deliverable content and/or life cycle and process related matters. However, it is my personal view that what industry and the marketplace really need is guidance relative to product quality standards. For this, there is the model of ISO 9126. However, much of the effort to formally audit/certify software development is at the process, not product, level.

My own view is that process/life cycle standards are important to the providers/developers of software products as process improvement tools. What should be important to customers is a way to know something about the quality of the products they will be buying. That may include some process information, but I think, in many ways, it involves more of what ISO 9126 tries to address. What kinds of data about what sorts of things could a supplier provide to a customer that would make the customer feel more secure in the purchase of the product?

I'd like to get feedback from Software Division members about this since, as noted in the last newsletter, there is an international effort to address software quality from a product characteristics perspective (the SQUARE effort). With all the ISO 9000:2000 talk and the concern over what will happen to ISO 9000-3, perhaps the marketplace should be devoting more energy, not to process/system audits, but to some form of product certification?

Scott Duncan can be reached at [email protected] or [email protected] or @acm.org or @ieee.org, or by phone at 706-565-9468.

Taz Daughtrey is New ASQ Fellow

"For outstanding contributions to ASQ in the formation and development of the Software Division; for the advancement of professional recognition as founding editor of the journal Software Quality Professional; and for continuing activity as a contributor to the wider application of software quality principles as mentor, teacher, and advocate."

Taz Daughtrey was a vice chair of the original Software Quality Technical Committee in 1988, was the chair-elect when the technical committee became the Software Division, and served the first full two-year term as division chair from 1990 to 1992.

Taz was appointed founding editor-in-chief of the journal in 1997 and recruited associate editors, the editorial board, review panelists, and contributors in order to begin publishing in December 1998. The following year, SQP also became the first ASQ journal with an online presence.

Taz continues to teach public and on-site software quality engineering courses for the Society. His latest project is the editing of a reprint collection of key articles in the most challenging CSQE body of knowledge areas. [See appended portion of his SQP editorial for the March issue.]

New CSQEs

David L. Hough of Section 1113, Raleigh.

Israel Rose of Section 1502, Greater Atlanta.

Member is Published

Congratulations to Kim Cobler, whose recent article "Methods" (Software Quality, No. 3, 1999-2000, Spring 2000) was reprinted in KCQAA, the newsletter of the Kansas City Quality Assurance Association.

Keep Us Informed

Let us know when you or your fellow Software Division members should be recognized for worthwhile contributions to the profession. Your efforts may encourage others or give someone else an idea that will advance the profession.


55th ANNUAL QUALITY CONGRESS AND EXPOSITION

STRENGTHEN YOUR COMPETITIVE POSITION

MAY 7-9, 2001
CHARLOTTE CONVENTION CENTER
CHARLOTTE, NC

Each year in May, the Annual Quality Congress and Exposition (AQC) provides a forum for business and quality professionals to learn, exchange, and share extraordinary practices with those who have experienced and advanced individual and organizational performance excellence. Plan now to network with nearly 3,000 individuals and 200 exhibitors from more than 50 countries worldwide. Make a commitment to excellence... strengthen your competitive position.

To register or for more information, call ASQ at 800-248-1946. For a copy of the 55th AQC preliminary program, ask for item #80166.

American Society for Quality (ASQ)

Education
An outstanding educational program is the foundation of the Annual Quality Congress. The 55th AQC will offer more than 100 hours of dynamic learning opportunities. We've listened to our attendees and have designed instructional sessions for the advanced quality expert, the entry-level practitioner, and any business professional seeking world-class results.

Networking
We've been told that "networking," not just education, is a key reason thousands of professionals attend the Annual Quality Congress year after year. The 55th AQC promises even more opportunities for you to connect with professionals from around the world, and your backyard, who have similar challenges and goals as you and your organization. We know you have an unwavering commitment to excellence... join others who do too!

Case Studies
Learn from professionals like you who have made it happen! The 55th AQC will feature dozens of case studies and real-world quality applications that are proven models of business excellence. Discover how to apply these success stories into substantial improvements and bottom-line success for your organization.

Professional Development
It's safe to say that the Annual Quality Congress has never had more opportunities for professional growth and career enhancement than will be available in Charlotte! You can make an investment in your career by becoming ASQ certified at the 55th AQC. On-site certification exams will allow you to demonstrate your proficiency within a specified body of knowledge. At the ASQ Career Fair, meet face-to-face with hiring managers representing leading companies from around the country who are recruiting various levels of quality professionals.

Visit the official Web site of the 55th Annual Quality Congress and Exposition: http://aqc.asq.org (Priority Code CEAFOP1)


Several opportunities are available to serve the Software Division. Please find a place where you can make a contribution and help out.

Awards chair: This position on the Division Council is responsible for recognizing the contributions of our members in a variety of ways. Contact Linda Westfall, division chair.

Regional councilors for Regions 4 and 7. Contact Linda Westfall, division chair.

11ICSQ volunteers: 11ICSQ will be held in Pittsburgh, October 22-24, 2001, and the planning is just getting under way. If you would like to participate on the program committee (contact Theresa Hunt) or assist with local arrangements and support (contact David Zubrow), now is the time to get involved.

Software Division Marketing Committee: This is a new committee responsible for promoting the Software Division and various events. Contact Linda Westfall, division chair.

BY SUE CARROLL

Welcome to the new millennium. Remember back when 2001 was a movie and not the current year? Times change. We are on Internet time and things change fast. Try to keep solid processes as part of your Web activities. It is so easy to get caught up in the speed and forget all the lessons we have learned. Poor quality is still unacceptable to customers, especial­ly when they have several other choices already bookmarked.

I want to welcome Milt Boyd to the Web team. He is going to help us enhance the Web site. If you haven't visited our page, please go to http://www.asq-software.org/. We are having a challenge with pages that don't work on Netscape but do work on Internet Explorer. We could use a volunteer to help us with this challenge. Please send e-mail to me at [email protected].

Please point out the jobs page to any of your colleagues who are job hunting or searching for employees. We can post open software quality positions so they are available to those seeking employment. This is a free service and I hope it is useful to you.

I look forward to an exciting 2001. Please send any ideas for the Web page to me or any other member of the Web team.

BY DOUGLAS HAMILTON

It is time to start the process of updating the Certified Software Quality Engineer exam. On November 10 and 11, 2000, a Job Analysis Workshop was held in Milwaukee. I would like to thank the participants: Esther Alessio, Eric Cicmanec, Theresa Deupree, Clayton Dryer, Lori Duning, Katharine Harris, Theresa Hunt, Aniz Sabuwala, Kamal El-Sheikh, and Robert Stoddard. This workshop updated the body of knowledge for the exam. The updated BoK will now be sent to more practitioners for review, input, and final revision. Watch this column in future newsletters for additional activities for the CSQE revisions.


PLAN NOW TO TAKE THE CSQE EXAM. THE SCHEDULE IS:

Exam Date / Registration Deadline:
May 6, 2001 (exam is administered at AQC in Charlotte) / April 20, 2001
June 2, 2001 / April 6, 2001
December 1, 2001 / October 5, 2001
June 1, 2002 / April 5, 2002
December 7, 2002 / October 4, 2002

BY THERESA HUNT

Planning for the Software Division's 11th International Conference on Software Quality (11ICSQ), October 22-24, 2001, in Pittsburgh, PA, is under way, and now is the time to begin submitting papers or tutorial proposals for presentation at 11ICSQ. If selected by the Program Committee, your paper will be published in the conference proceedings. Dave Zubrow is serving as the 11ICSQ conference chair; please visit our Web site at www.asq-software.org for more information.

As you know if you attended, 10ICSQ in New Orleans was a huge success. Many thanks to 10ICSQ conference chair Rusty Perkins and our standing Program Committee members for the hard work they put into making this conference successful.

We are once again cooperating with Software Quality Engineering's Applications in Software Measurement/Software Management Conference, February 12-16, 2001, in San Diego. Our division chair, Linda Westfall, will be presenting at the conference. Visit the conference Web site at www.sqe.com.

Keep the Program Committee in mind if you have volunteer time to share; we are always grateful to have new members join our team. We must share the workload or risk overburdening our active contributors. See you in Pittsburgh!


The ASQ Software Division recently entered into an agreement with a European consortium, the EuroSPI [European Software Process Improvement] Partnership, for mutual support and collaboration in professional conferences, publications, and Web sites. The past two issues of Software Quality Professional have featured papers based on presentations made at the most recent annual EuroSPI conference.

EuroSPI Partnership

The EuroSPI partnership was formed in 1998 to continue with the conference series and training actions previously organized in partnership with ISCN, ESI, and SP. EuroSPI is a consortium of Sintef (Norway), IVF (Sweden), Delta (Denmark), STTF (Finland), DERA (UK), ASQF (Germany), and ISCN (Ireland and Austria). It is based on a defined model for conference organization, standards for conference performance, and standard types of agreements. It is EuroSPI's policy to grow over the years into an organization with one representative per European country and with a conference series securely managed at least until 2012.

You can find details about the current partnership at http://www.iscn.ie.


The Software Division of the American Society for Quality presents

11ICSQ International Conference on Software Quality

October 22-24, 2001

Sheraton Hotel Station Square • Pittsburgh, PA, USA

Call for Participation, Papers, Tutorials, and Workshops

Technical Program
Technical papers and panels should be practitioner oriented. They may be based on research of interest to practitioners or on experiences that contribute to the software quality body of knowledge. One complimentary admission to the two-day technical program is given for each technical paper or panel session.

Tutorial/Workshop Sessions
Sessions are either half-day or full-day sessions that provide practical knowledge to participants. Workshops that promote the active participation of learners through problem solving, case studies, or other interactive learning methods will be favored. A complimentary admission to the two-day technical program is given for each half-day tutorial in addition to other compensation that may be provided.

Suggested Conference Topics:

• Software Quality Management

• Software Processes

• Software Project Management

• Software Metrics, Measurement, and Analytical Methods

• Software Inspection, Testing, Verification, and Validation

• Software Audits and Standards

• Software Configuration Management

Technical Papers
Papers should be complete drafts of the final paper, not an abstract or presentation slides. Completed papers will be accepted until March 19. Notification letters will be sent April 20. Presentation files will be due August 1. Final versions of papers and presentations will be due on disk at the conference.

Proposal for a Panel
Send one to two pages describing the topic of the panel discussion, as well as speaker biographies and an author information form, by March 19. Hot topics: e-Commerce, Security, Mobile Networking.

Volunteers to serve as reviewers, session coordinators, and other positions are invited to contact Theresa Hunt, programs chair, at [email protected].

Sponsors for conference events should contact David Zubrow, 11ICSQ conference chair, at [email protected].

Visit the 11ICSQ Web site, http://www-biz.aum.edu/tomgriffin/icsq/11icsq.htm.


Why do we care?

Particularly in these days of diminishing resources and outsourcing, an active program of measurement and analysis is often regarded as critical to the success of software development, maintenance, and acquisition efforts. Those of us who work in the field typically take it as a matter of faith that a robust measurement program can add substantial value. The whole reason we do it is to inform essential decisions, and to improve organizational performance and subsequent product quality.

But what do we really know, and how do we know it?

We do know a great deal about the technical issues of data gathering and applied statistics. Well-grounded guidance dates back to the development of the statistical sciences in the last century, with much older philosophical underpinnings. But less is known about what it takes to implement and sustain a successful software measurement program. In fact a good deal of anecdotal evidence suggests that such efforts often fail.

Of course useful expert guidance does exist. But the experts too often disagree and a great deal more remains to be learned. Precious little empirical evaluation and defensible data are available in a field that prides itself on the importance of measurement.

About the study

The results described here are drawn from a broad-based survey of 228 practitioners and users of software measurement from both government and commercial enterprises. The survey was administered via the World Wide Web from late November 1998 through February 1999. The response rate of about 60% is reasonably high by current standards in survey research, especially given the time of the year when it was done.

What do we mean by success?

By success we mean more than longevity and persistence over time. Measurement results must be used regularly to inform management and in technical decision making. Moreover, demonstrable impact on business value and organizational performance (e.g., defect density, cycle time, accuracy in forecasting budget and schedule, or operational availability) is also necessary to justify continued investment in the measurement program.

We created two composite indices of measurement program success: (1) use in decision making and management, and (2) organizational performance (Table 1). Our focus here is on the first one, namely the use of measurement and analysis in informing management and in technical decision making.

As a matter of fact, there is a rather strong relationship between reported use in organizational decision making and the subsequent improvement of organizational performance (r2 = .46, p < .0001).
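For readers unfamiliar with the statistic, the following sketch shows how an r-squared like the one just reported could be computed from two composite indices; the scores are synthetic stand-ins, not the survey's data.

    # Sketch: computing an r-squared between two composite indices.
    from statistics import mean

    def r_squared(xs, ys):
        """Square of the Pearson correlation between two paired lists."""
        mx, my = mean(xs), mean(ys)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        syy = sum((y - my) ** 2 for y in ys)
        return (sxy * sxy) / (sxx * syy)

    use_index = [1.2, 2.5, 0.8, 3.1, 2.0, 1.7]   # use in decision making
    perf_index = [1.0, 2.2, 1.1, 2.8, 1.6, 1.9]  # organizational performance
    print(f"r^2 = {r_squared(use_index, perf_index):.2f}")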

There is wide variation on both composite variables, and the respondents often characterize lack of success in their measurement programs. We can be reasonably confident that the respondents are answering candidly.

So what does it take to succeed?

The larger study of which this paper is a part examines a wide range of possible explanatory variables. Here we limit our concern to three sets.

Alignment with business and technical goals. The importance of aligning the measurement program with the organization's goals and other information needs is a fundamental tenet of software measurement practitioners. We asked our respondents a series of questions about the involvement of various potential stakeholders in setting their organizations' agendas for software measurement (Table 2).

Not surprisingly given Goal-Question-Metric and related theory, a measure of involvement of the intended users is strongly related to successful use of measurement results in informing management and technical decision making (r2 = .42, p < .0001).

Organizational commitment and resource sufficiency. The importance of management commitment and the existence of sufficient organizational resources are commonly emphasized as being crucial for software process improvement. Their role in the success of software measurement efforts appears to be no less important.

TABLE 1: QUESTIONNAIRE ITEMS AND RELIABILITY SCORES FOR MEASURES OF SUCCESS

Use in decision making and management (Cronbach's Alpha = 0.74). How widely are software measurements actually used in making management and development decisions?
• Monitoring and managing individual projects or similar work efforts
• Use of historical data for project planning and estimation
• Rolled up for larger organization and enterprisewide purposes
• For use by individual engineers, programmers, and other practitioners
• Changes are made to technologies, business, or development processes as a result of our software measurement efforts
• Staffing and personnel changes are made because of measurement efforts in our organization

Organizational performance (Cronbach's Alpha = 0.94). In your judgment, how much has the use of software measurement improved your organization's performance?
• More accurate budget estimates or ability to reduce costs
• More accurate schedule estimates or ability to reduce cycle time
• Better adherence to customer or user requirements or improved customer satisfaction
• Fewer software defects, faults, or failures
• Better functionality or user interface
• Better overall quality of products and services
• Improved staff productivity or reduced rework
• More informed judgments about the adoption or improvement of work processes and technologies
• Better work processes
• Better strategic decision making
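The reliability scores in Table 1 are Cronbach's alpha values. The sketch below shows one way such an alpha might be computed from item-level responses; the response matrix is invented for illustration.

    # Sketch: Cronbach's alpha from item-level responses (rows are
    # respondents, columns are questionnaire items). Data are invented.
    from statistics import pvariance

    def cronbach_alpha(rows):
        k = len(rows[0])               # number of items
        items = list(zip(*rows))       # item scores, column-wise
        item_var = sum(pvariance(col) for col in items)
        total_var = pvariance([sum(r) for r in rows])  # total-score variance
        return (k / (k - 1)) * (1 - item_var / total_var)

    responses = [
        [4, 5, 4, 4],
        [2, 3, 3, 2],
        [5, 4, 5, 5],
        [3, 3, 2, 3],
    ]
    print(f"alpha = {cronbach_alpha(responses):.2f}")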


TABLE 2: ALIGNED WITH INTENDED USERS (CRONBACH'S ALPHA = 0.70)

How would you characterize the involvement of various potential stakeholders in setting goals and deciding on plans of action for measurement in your organization?
• Senior enterprise and organization level managers
• Project level managers
• Individual engineers, programmers, or other practitioners
• Business support units, e.g., finance, marketing

TABLE 3: MANAGEMENT COMMITMENT (CRONBACH'S ALPHA = 0.83)
• Management regularly monitors the progress of software measurement activities
• Management clearly demonstrates commitment to measurement

TABLE 4: USE OF ANALYTIC METHODS (CRONBACH'S ALPHA = 0.76)
• Comparisons are regularly made between current project performance and previously established performance baselines and goals
• Sophisticated methods of analyses are used on a regular basis
• Statistical analyses are done to understand the reasons for variations in performance
• Experiments and/or pilot studies are done prior to widespread deployment of major additions or changes to development processes and technologies
• Evaluations are done during and after full-scale deployments of major new or changed development processes and technologies

Figure 1. Summary multiple analysis of variance: predicted versus observed use in decision making. R2 = .66, F = 142.25, p < .0001, n = 220. Standardized betas: aligned with intended users 0.27, management commitment 0.33, use of analytic methods 0.37 (each p < .0001).

Figure 2. Characteristic bivariate relationships with use in decision making: aligned with intended users (r2 = .42, p < .0001, n = 221); use of analytic methods (r2 = .48, p < .0001, n = 222); management commitment (r2 = .47, p < .0001, n = 221).


As expected, a measure of management commitment is in fact rather strongly related to use of software measurement results (Table 3; r2 = .47, p < .0001).

Technical characteristics of the measurement program. Finally, we examined a series of technical characteristics of the measurement program itself. The extent of use of a variety of data analytic methods is the most strongly related of four such measures we examined (Table 4; r2 = .48, p < .0001).

Putting it all together: An initial multivariate analysis

Based on these bivariate results and preliminary multivariate analyses, we settled on a single, simple multiple analysis of variance (MANOVA) to summarize variation in reported use of software measurement in our respondents' organizations. The model includes only the three predictor variables that we have just discussed. As seen in Figure 1, the main effects of these three variables account for almost two thirds of the observed variance in our criterion index of use of software measurement results.

There is some multicollinearity among the three predictors, but the variance explained is noticeably higher than for any of the single strongest bivariate relationships (Figure 2). We are unable to identify any significant interaction effects. Moreover, adding other main effects into a more complex model adds essentially no improvement in overall explanatory power (R2 = .68).
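
Although the authors summarize the model as a MANOVA, the reported R2 is the familiar least-squares quantity, so the fit can be sketched as an ordinary least-squares regression of the use index on the three predictor indices. The data below are fabricated for illustration; only the structure of the computation reflects the text:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 220  # same order of sample size as the reported model

# Hypothetical indices for the three predictors, on the survey's scale.
X = rng.uniform(0.0, 3.5, size=(n, 3))
# Hypothetical criterion index, built so the fit has something to find.
y = X @ np.array([0.27, 0.33, 0.37]) + rng.normal(0.0, 0.4, size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# R^2: share of criterion variance captured by the fitted values.
y_hat = A @ coef
r2 = 1.0 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R^2 = {r2:.2f}; fitted coefficients = {np.round(coef[1:], 2)}")
```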

Conclusions and next steps

All told, then, one can explain a good deal of the variation in the success of software measurement programs by three variables, namely:

• the extent of use of a variety of data analytic methods,
• management commitment, and
• involvement of the intended users in setting the measurement agenda.

While these results may seem fairly intuitive, we have added a better quantitative description than was available previously.

Moreover, we have failed to find evidence to support other commonly stated assertions about what it takes to establish a successful software measurement program. For example, a measure of the degree of cooperation and support from the organization's technical people is only weakly related to our criterion index of measurement use¹ (r2 = .10, p < .0001), and it does not contribute to our multivariate results. Similarly, we find little evidence that the existence of a well-respected measurement "guru" typically has much impact on the success of the measurement program² (r2 = .07, p < .0001).

Of course, a great deal more remains to be done. One certainly ought not to conclude that these three variables are all that matter, nor take these results as the definitive answer. Much more is needed for a fuller explanation of what it takes to establish a successful software measurement program.

Acknowledgments

This paper relies heavily on earlier analyses done with Anandasivam Gopal, David White, and Tridas Mukhopadhyay. See especially "Determinants of Success in Software Measurement Programs: Initial Results," in Proceedings of the 5th IEEE International Software Metrics Symposium (Metrics 1999).

The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the U.S. Department of Defense and operated by Carnegie Mellon University.

1. "The effort required for people to submit data is often considered to be onerous or burdensome (reverse-scored)"; "People consistently provide information as planned and when requested"; "The way software measurement data are collected and used is often considered to be inappropriate by the people who must provide the information (reverse-scored)"; "There is resistance to doing measurement here (reverse-scored)."

2. "Measurement has been championed by a well-respected 'guru' (or gurus) who also knows the organization and its business."

Dennis R. Goldenson is with the Software Engineering Institute at Carnegie Mellon University, Pittsburgh, PA, [email protected].

Region 4-Stephen White

Region 4 is willing to host the 12ICSQ in Canada in 2002; when can we start?! The Ottawa Valley Section 407 is embarking on a NEW program focused specifically on the software life cycle, from project initiation right through to maintenance and servicing, as a service to the local industry and to its qualified ASQ members. The program involves peer support discussion sessions held in the Ottawa area. The first session is scheduled for April 2001.

The Region 4 Web page at http://www.cyberus.ca/~swhite/ is being updated.

Region 5-Joel Glazer

A one-day IT Excellence Symposium is planned for February 14, 2001, at Johns Hopkins University's Applied Physics Laboratory. The goal of the symposium is to promote excellence in today's fast-paced development environments within the greater Maryland, Virginia, and DC area, encompassing the following disciplines:

• Quality
• Management
• Software engineering
• Process improvement

Sessions will include topics on:

• CMMI
• Security
• New technologies

See www.asq509.org for more information.

Region 6-Tom Gilchrist

Region 6 needs reporters from the far-flung reaches of the empire. If you live within the 10 states that make up the largest region, and you know about, participate in, or are aware of quality activities, conferences, or meetings, please forward such information to me. I'm looking for some "cub" reporters out there who just don't have enough to do already!

Congratulations to the Pacific Northwest Software Quality Conference (pnsqc.org), which met October 16-18 in Portland, OR. Though I was not able to attend (I was at 10ICSQ), I understand it was very well attended. The Software Division co-sponsors this event, which is perfect for those who can't get to the International Conference on Software Quality. Also, if you know of a leader in quality assurance in the Northwest, PNSQC is looking for nominees for its "2001 Software Excellence Award." The award is presented each year to recognize and reward software practitioners from the area. If you have someone you think should be considered, go to pnsqc.org for nomination information. Plans for the PNSQC spring tutorials (held in Seattle and Portland) are being made now. If you have a tutorial you are interested in presenting, let them know. The spring tutorials are held in May.

The Seattle Area Software Quality Assurance Group (SASQAG.org) held its first annual Software Testing Tools Fair (STIF-2000) in Bellevue, WA, November 3-4. There was a total of 13 exhibitors, including a Software Division booth. A banner and a wealth of handout material on the division, the Software Quality Professional journal, and CSQE certification were supplied by the division. If you have a conference you would like similar support for, contact our leadership. Looks like November 5 will be the date of STIF-2001!

The E-Commerce IT Applications Conference is being co-sponsored by SASQAG in Seattle and will be held June 10-13, 2001. There will be three tracks, including Testing E-Commerce Software, Building Quality Into E-Commerce Systems, and Test Tools. Right now, the organizers are looking for presenters. If you are interested, information for submissions is at www.sasqag.org/juneconf .

Of course, if you are in the Seattle area on the third Thursday of the month, make plans to attend the SASQAG public meeting at 6 p.m. Information and directions are at sasqag.org.

Region 11-Dave Williamson

I'd like to encourage all ASQ Software Division members and other software quality professionals in Region 11 to get involved with their local chapter of the Software Engineering Institute's (SEI) Software Process Improvement Network (SPIN).

SPIN groups are voluntary organizations made up of people who want to improve software engineering practice. SPINs meet regularly to exchange information and experiences with software process improvement. A typical meeting has a networking session, a speaker, and an interactive discussion period. SPIN members are also invited to attend an annual Software Engineering Process Group Conference, which is co-sponsored by SPIN and the SEI.

Currently, there are active SPIN chapters in Washington, DC, Hampton Roads, Charlotte, and Raleigh, and an emerging chapter in Memphis. Several Roanoke and Blacksburg companies are investigating starting a SPIN chapter in Southwest Virginia (contact Dave Williamson at [email protected] for more information). If there is not a chapter in your area, consider getting involved with another chapter and then maybe starting a local one. See the Web site below for a list of chapters and for specific contact information. SPIN meetings also earn ASQ recertification points: 0.3 points per meeting can be awarded, as long as attendees provide written proof of attendance (a journal of attendance).

For more information on SPINs, see the SEI Web site at http://www.sei.cmu.edu/collaborating/spins/spins.html.

Region 25-Deependra Moitra

Recruited two deputy regional councilors: I have recruited two deputy regional councilors to support and strengthen activities of ASQ's Software Division in the International Region: Toma Geber in Israel ([email protected]) and Tan Kian Hee in Malaysia ([email protected]). Both have assumed an active role and are trying their best to promote the ASQ Software Division in their respective regions.

Promotion at the conferences: SEPG 2000 in India, an international conference, was held in Bangalore in February 2000, and I took the opportunity to promote the ASQ Software Division, and especially the Software Quality Professional journal, during the conference. I distributed sample copies of SQP with the intent to improve its readership as well as subscriptions in India and the neighboring Asian countries. An effort was also made to encourage prospective authors to submit their papers for possible publication in SQP. Toma Geber in Israel is also actively involved in promoting the Software Division as well as SQP through the planning committee of a forthcoming international conference on quality taking place in Jerusalem. An attempt was also made to link up with and promote the division through the First Asia-Pacific Conference on Quality Software (APAQS 2000), held in Hong Kong, October 30-31, 2000.

Sourcing papers for SQP: I have actively tried to recruit papers for the SQP journal, and as a result, in the last five to six months approximately 10 papers have been submitted to SQP from India alone.


Whether you are preparing for the Certified Software Quality Engineer (CSQE) examination or just testing out your knowledge of software quality engineering, why don't you sit back and let your brain do its thing. The answers can be found on p. 14.

Note: The items in this quiz are NOT from past CSQE examinations, NOR were they created as part of the CSQE exam development process.

1. The ISO/IEC 12207 standard establishes:

A. minimum acceptable requirements for the preparation and contents of software quality assurance plans.
B. a common framework for software life cycle processes.
C. guidelines to facilitate the application of ISO 9001 to the development, supply, and maintenance of software.
D. standards for software process assessments.

SOFTWARE QUALITY /Winfer 2001

2. Which of the following is NOT an example of a quality record?

A. Process definition

B. Change request

C. System test log

D. Audit report

3. Which of the following software life cycle models emphasizes identifying and resolving risks?

A. Waterfall
B. V
C. Spiral
D. Incremental

4. Which of the following is an example of adaptive maintenance?

A. Modifying a tax calculation software system to include changes to the tax code.
B. Modifying an energy management software system to add alarm screens and printed reports that handle fire and security alarms.
C. Modifying a telecommunications software system to include a new call forwarding feature.
D. Modifying an inventory control software system to correct a defect in the reorder algorithm.

5. Which of the following is an ordinal scale metric?

A. Defect density

B. Defect root cause

C. Defect detection efficiency

D. Defect severity

6. Which of the following types of testing is performed to determine if the software has any problems meeting throughput and response time requirements?

A. Usability testing

B. Performance testing

C. Stress testing

D. Volume testing

7. Which of the following is an example of a process audit?

A. The evaluation of a sample set of source code to determine if it meets coding standards and naming conventions.
B. The examination of a company's quality policies, procedures, and records to determine if it has the capability to produce a product of the required quality.
C. The examination of software training materials to determine if they meet contractual requirements.
D. The observation of a sample set of code inspections to verify that they are being conducted in accordance with documented procedures.

8. Which of the following are typically kept under configuration control?

I. Requirements specifications
II. Test logs
III. User's manuals
IV. Software metrics reports

A. I and III only
B. II and IV only
C. I, II, and III only
D. I, II, III, and IV



Answers to the Software Quality Engineering Quiz

1. Answer B is correct. The ISO 12207 standard establishes a common framework for software life cycle processes. IEEE Std 730 establishes minimum acceptable requirements for the preparation and contents of software quality assurance plans. ISO 9000-3 establishes guidelines to facilitate the application of ISO 9001 to the development, supply, and maintenance of software. ISO/IEC 15504 establishes standards for software process assessments.

2. Answer A is correct. Quality records provide the evidence that the appropriate quality activities took place and that the execution of those activities met required standards. Examples of quality records include documented test results, problem reports and change requests, annotated documents, review records, minutes, and audit reports. A process definition specifies part of the quality system requirements but provides no evidence that those requirements were implemented.

3. Answer C is correct. The spiral model divides each major life cycle phase (i.e., concept, requirements, design, and implementation/testing) into four quadrants. The first quadrant determines the objectives, alternatives, and constraints. The second quadrant evaluates alternatives and identifies and resolves risks. The third quadrant develops and verifies the next level of the product. The fourth quadrant plans the next phase.

4. Answer A is correct. Adaptive maintenance does not add new functionality to the software but changes it to adapt to changes in the environment in which it is running. Answers B and C are perfective maintenance, which is the modification of the software to add new features or functionality. Answer D is corrective maintenance, which is the modification of the software to repair defects.

5. Answer D is correct. Ordinal scale metrics are comparisons by order, with no assumption about the magnitude of the difference. Defect severity is an ordinal scale metric. Defect density and defect detection efficiency are ratio scale metrics. Defect root cause is a nominal scale metric.

6. Answer B is correct. Performance testing is done to determine if the software has any problems meeting performance requirements like throughput or response time. Usability testing is done to determine if the software has any areas that will be difficult or inconvenient for the users. Stress testing is done to determine if the software has any problems when subjected to peak load conditions. Volume testing is done to determine if the software has any problems handling the required volumes of data, transactions, requests, etc.

7. Answer D is correct. A process audit looks at the processes used to create products or services and evaluates them to determine whether the processes exist and are adequate to meet the required quality objectives, whether the processes are being implemented correctly and with due diligence, and whether the processes really work. Answers A and C are examples of product audits, and answer B is an example of a quality system audit.

8. Answer A is correct. Externally delivered software products, such as the user's manual, should always be placed under configuration control. Internal software work products, such as the requirements specification, are also typically placed under configuration control. Quality records, such as test logs and software metrics reports, are not typically placed under configuration control.


The Software Division would like to honor the following members for their loyalty and support of the Software Division. The division had its 10th birthday as a division on July 1, 2000. The following members have been members of the Software Division since that time. Congratulations to each of you for helping to make the Software Division what it is today. The 10-year Software Division members are:

Faisal M. Abdullah 701 Orange Empire
Richard R. Allen 1104 Richmond
Frank T. Anbari 505 Philadelphia
Rao S. Anumolu 303 Long Island
Gary A. Arthurs 704 Phoenix
Katsutoshi Ayano 2500 Chapter #1
Leroy A. Babbitt 1301 Kansas City
Barry Bailin 2500 Chapter #1
Larry A. Bannister 607 Portland
Joseph Basala 1204 Racine-Kenosha
George R. Bateman 1201 Chicago
John H. Baumert 300 Metropolitan
Barry B. Beaman 1205 Rockford
Ronald V. Bedgood 707 Tucson Old Pueblo
Maria Angeles B. Bernardez-Farrar 1416 Greater Fort Worth
Richard E. Biehl 1509 Orlando
Robert C. Birss 613 Santa Clara Valley
Michael A. Blackledge 1400 Albuquerque
Deborah L. Blakeney 107 Rhode Island
Linda H. Borkowski 1113 Raleigh
Edward R. Bowling 711 Inland Empire Section
Don C. Boyle 903 Indianapolis
Carol M. Bradbury 304 North Jersey
Elizabeth B. Brown 810 Akron-Canton
Laurel A. Brussel 303 Long Island
John E. Caldecott 2500 Chapter #1
John E. Calvetti 1001 Grand Rapids
Giuseppe Canepa 2500 Chapter #1
Sue Carroll 1110 Charlotte
David Cassafer 605 Sacramento
Charles W. Champ 1304 St. Louis
David J. Christensen 1208 Fox Valley Section
Francois R. Coallier 401 Montreal
Don C. Cochrane 1201 Chicago
Jayesh G. Dalal 307 Princeton
Taz Daughtrey 1120 Lynchburg
Raymond Day 1530 Manasota
James R. Dildine 606 Seattle
Marvin B. Doran 407 Ottawa Valley


Clayton C. Dryer 509 Washington
John E. Duckworth 511 Northern Virginia
Jose A. Duran 1500 Puerto Rico
Frederick J. Eberhart Jr. 613 Santa Clara Valley
William C. Eisentraut 1217 Madison
James C. Elliott 502 Baltimore
Raymond W. Engelman 502 Baltimore
Richard L. Eppig 100 Boston
M. Hosein Fallah 307 Princeton
Vincent S. Fesunoff 702 San Gabriel Valley
Brenda M. Fisk 402 Toronto
Mark A. Folkerts 606 Seattle
William T. Folsom 305 New Haven
Roger G. Fordham 704 Phoenix
Kirby K. Fortenberry 1405 Greater Houston
Herbert Foss 1201 Chicago
John H. Fowler 1300 Denver
William M. Frank 706 San Fernando Valley
John Friedhoff 900 Cincinnati
Brent J. Garback 1000 Greater Detroit
David A. Gardner 1005 Michiana
Susan J. Garza 505 Philadelphia
Nancy F. George 502 Baltimore
Joel Glazer 502 Baltimore
Robert E. Glazier Sr. 303 Long Island
Bud Glick 1212 Northeastern Illinois
Robin F. Goldsmith 100 Boston
Roderick S. Goult 2500 Chapter #1
Praveen K. Gupta 1201 Chicago
Harlin L. Hamilton 1402 Dallas
Dale A. Harmon 802 Pittsburgh
Janet S. Harrison 1416 Greater Fort Worth
Nancy C. Heinsz 1304 St. Louis
Dean L. Hendrickson 1313 Boulder
Michael R. Herot 505 Philadelphia
Charles P. Hollocker 1402 Dallas
Michael J. Honea 1412 West Texas
John W. Horch 1503 Huntsville
Jeffrey W. Jackson 1010 Ann Arbor
Michael J. Jaques 407 Ottawa Valley
Erwin Jaumann 509 Washington
Larry G. Jensen 501 North Central PA
James R. Jones Jr. 506 Delaware
Robert G. Kain 1213 Illiana
Pasi Kantelinen 2500 Chapter #1
Chris K. Kaufman 1201 Chicago
David T. Kenney 607 Portland
Joyce E. Kepner 802 Pittsburgh
Takami Kihara 306 Greater Danbury
Terry L. Knox 701 Orange Empire
Nicholas F. Kokot 201 Buffalo
Kenneth W. Kolence 613 Santa Clara Valley
Imants Krauze 502 Baltimore
Shirley A. Krentz 1202 Milwaukee
Michael P. Kress 606 Seattle
Vincent T. Lam 402 Toronto
Rebecca A. Lamb 909 Dayton
Paul T. Lambert 1201 Chicago
Sandra Landes 613 Santa Clara Valley
Dean M. Lapp 802 Pittsburgh
Russell K. Lew 1521 Baton Rouge
Paula L. Liebrecht 509 Washington
Kurt R. Linberg 1203 Minnesota
George H. Loehwing 304 North Jersey
John E. Lowe 909 Dayton
Bonnie L. Lowery 1402 Dallas
W. Mark Manduke 511 Northern Virginia
Russell K. Marcks 909 Dayton
Raymond S. Markowski 304 North Jersey
Philip C. Marriott 617 Redwood Empire
Kevin J. Marston 2500 Chapter #1
LeRoy T. Mattson 1203 Minnesota
Peter D. Mauch 1201 Chicago
Cheryle F. May 1113 Raleigh
Brenda L. McCall 905 Northeastern Indiana
Jack McKissick 206 Syracuse
Ted Mercer 1405 Greater Houston
Denis C. Meredith 502 Baltimore
Ichiro Miyauchi 2500 Chapter #1
Jeanne E. Moldenhauer 1212 Northeastern Illinois
Thomas P. Monkus 1508 St. Pete-Tampa
Gilbert Montes 1401 Greater El Paso
Hector Mujica 2500 Chapter #1
Jean-Pierre Mulley 203 Allegheny Mountain
S. Dev Nanda 909 Dayton
Antonio Napolitano Jr. 2500 Chapter #1
John T. Neville 606 Seattle
Thomas C. O'Connor 107 Rhode Island
Lee E. Olson 706 San Fernando Valley
Joseph A. Ondrechen Jr. 703 San Diego
Richard M. Opolski 2500 Chapter #1
Karen L. Owens 700 Los Angeles
Peter Papakostantinu 402 Toronto
David R. Parker 2500 Chapter #1
Daniel R. Pea 903 Indianapolis
Scott J. Pease 1202 Milwaukee


Ruth Pennoyer 300 Metropolitan
Russell L. Perkins 1422 Bay Area
Patricia E. Pierce 1414 Austin Area
Jacklyn Powers 511 Northern Virginia
Shel Prince 604 San Francisco
Patte Pyle 1001 Grand Rapids
Edward A. Raether 618 Golden Gate
Gene D. Redig 1203 Minnesota
Jeffery A. Reed 1203 Minnesota
Gene M. Roske 706 San Fernando Valley
Mark L. Sadler 1212 Northeastern Illinois
Andrew J. Sanchez 307 Princeton
Johan M. Schaap 405 Kitchener
Klaus Schielke 2500 Chapter #1
Thomas J. Scurlock Jr. 304 North Jersey
Nancy A. Selby 1300 Denver
Zvi E. Sella 622 Southern San Joaquin Valley
Paul E. Sidney 401 Montreal
Louis A. Silva 505 Philadelphia
Michael W. Smith 1402 Dallas
Steven L. Soos 606 Seattle
Gregory D. Springer 701 Orange Empire
John M. Starr 706 San Fernando Valley
Nick P. Stewart 1402 Dallas
Clarence S. Stoltzfus 618 Golden Gate
Chris Stylianides 2500 Chapter #1
Richard R. Taylor 103 Hartford
Terry Taylor 303 Long Island
Helmut H. Thiemann 505 Philadelphia
Lee Tiam Hock 2500 Chapter #1
Donald D. Tice 702 San Gabriel Valley
Ivan L. Trauernicht 1208 Fox Valley Section
Rufus A. Turpin 407 Ottawa Valley
Estil L. Vandament 505 Philadelphia
James C. Vetricek 1508 St. Pete-Tampa
Minhwei Wang 307 Princeton
K. Fred Wehmeyer 505 Philadelphia
Gary W. Weyandt 704 Phoenix
Larry W. Whittington 1502 Greater Atlanta
Alan Wiegmann 1205 Rockford
Jerry L. Wisdom 1414 Austin Area
David H. Wolen 613 Santa Clara Valley
Tracy A. Wood 701 Orange Empire
James T. Zurn 1313 Boulder

This is an excerpt from "Injecting Test Process Into Software Engineering Cycle," a paper presented by the author at the 10th International Conference on Software Quality.

The tasks for the software coding and testing process include the development of each software unit (or software configuration item, if identified) and the test procedures and data for testing each software unit. For most software projects, the software development group consists of developers and development testers. Usually, test procedures and test cases are developed by testers. While developers can concentrate on items like external consistency with requirements, design of software items, and appropriateness of the coding methods and standards used, testers can focus on test cases that provide maximum test coverage for software units.

Even though it is not spelled out in any software engineering standard documentation, maintaining a high level of cooperation and communication between developers and testers is essential for the successful and timely completion of this stage. For example, once the initial test coverage data for a software unit is obtained, developers and testers should work together to review the results, assess the further test coverage requirements, and write new tests for better test coverage. This cycle can be repeated until a comfortable level of test coverage of software modules is obtained. One good practice is to have testers develop simple and user-friendly unit testing tools for developers so that developers can run test cases as needed without heavy investment in testing methods and tools. For developers, unit testing tools become more useful for defect reproduction and investigation. For testers, they are invaluable in defect fix verification.
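
The practice described here is easy to picture in code. Below is a minimal sketch of the kind of simple, developer-friendly unit-test harness the author describes, written with Python's standard unittest module; the module under test, its function, and the test values are hypothetical illustrations, not the author's system:

```python
import unittest

def reorder_point(daily_demand: float, lead_time_days: float,
                  safety_stock: float) -> float:
    """Hypothetical unit under test: an inventory reorder threshold."""
    return daily_demand * lead_time_days + safety_stock

class ReorderPointTests(unittest.TestCase):
    """Tester-written cases that developers can run on demand."""

    def test_typical_values(self):
        self.assertEqual(reorder_point(10, 3, 5), 35)

    def test_zero_lead_time_needs_only_safety_stock(self):
        self.assertEqual(reorder_point(10, 0, 5), 5)

if __name__ == "__main__":
    # One command for developers: python test_reorder.py -v
    unittest.main(verbosity=2)
```

Because the whole suite runs with a single command, the cost to a developer of rerunning the tests after each change stays close to zero, which is exactly what makes the review-and-extend coverage cycle above practical.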

Developer and Tester Relationships

Good professional working relationships between developers and testers can be achieved and maintained when they recognize the value of each other's role and contribution. I have learned from experience that when testers provide easy-to-use interactive testing tools (and test cases) to developers, developers can run some basic tests by themselves and catch many primitive defects in the earlier stages of the process. More important, it builds strong and productive working relationships throughout the software engineering cycle.

One Good Practice: Pushing Code and Test Cases on the Same Day

The advent of the World Wide Web has definitely changed, and helped a great deal in organizing, software engineering documentation efforts. A software requirement specification and its test plan can be linked together on the same page and shared by developers and testers instantly. Software coding and test case development can be done in parallel, providing ample opportunities to find problems during coding. In my organization, we put this into practice, and one of our development managers rightly claims that:


"\Ve will push the fully tested source codes and test cases to source management track on the same day. In other words, unit testing gets completed as the coding is done ."

I believe this is one powerful statement that represents success in software testing as well as software engineering.

References

1. ISO/IEC 12207, International Standard, "Information technology - Software life cycle processes," First Edition, 1995-08-01.
2. Mark C. Paulk et al., "The Capability Maturity Model: Guidelines for Improving the Software Process," SEI Series in Software Engineering, Addison-Wesley, 1995.
3. J. Musa, "More Reliable, Faster, Cheaper Testing Through Software Reliability Engineering - Overview," Testing Computer Software Conference, Washington, D.C., June 1998.
4. Ilene Burnstein et al., "Using the Testing Maturity Model (TMM) to Assess and Improve Your Software Testing," Quality Week '99 Conference, San Jose, CA, May 1999.
5. H. Park, "Test Automation for Multi-platform Client/Server Software," STAR'99 East Conference, Orlando, FL, May 1999.
6. B. Langston, "How to Have a Perfect 'T' Party," STAR'98 East Conference, Orlando, FL, May 1998.
7. P. Jorgensen, "Software Testing: A Craftsman's Approach," CRC Press, 1995.

BY TAZ DAUGHTREY

One guideline for the Software Quality Professional journal has been to offer material that fully spans the CSQE body of knowledge. When we have not received submissions in certain areas, such as configuration management, verification and validation, or auditing, we have sought out contributors, and we will continue to do so. However, not all subject areas are created equal. Not all are equally difficult to understand or apply, and not all are equally well represented in the professional literature. For whatever reason, it is now clear that certain subjects need additional special emphasis.

Let's be data-driven here. The results of a year's worth of CSQE exams show a distinct pattern. Of the eight subject areas, the same three had the lowest success rate both for those who passed and for those who did not pass the overall exam. Interestingly, those three areas also showed the widest discrepancy between success rates. For instance, 67% of the software quality management questions were answered correctly by those who passed the exam (the lowest score in any section), but that also marked the largest separation from the success rate (48%) of those who did not pass. Clearly, these subjects were the areas that differentiated those who had mastered the material from those who had not.

What is the content of these body of knowledge areas? Software quality management addresses planning (including customer requirements, security, safety, and hazard analysis), tracking (including corrective action), and training. Software processes covers development and maintenance methods, as well as process and technology change management. Software metrics, measurement, and analytical techniques includes measurement theory, analytical techniques (such as statistical and graphical concepts), and measurement (of process, product, resources, quality attributes, and defect detection effectiveness).

These are, objectively speaking, the subject-matter areas that need the most attention in developing the software quality profession. SQP will emphasize these topics in future issues. As appropriate, we might have a special theme section containing several related presentations. In addition, the ASQ Software Division is supporting the development of special reprint collections that gather key papers from SQP and other ASQ conferences and publications. The first collection will be targeted at software quality management, processes, and measurement. Look for more details in your membership renewal notice.

The Government Division will sponsor two internationally recognized speakers and an international group from IBM to present at the AQC in Charlotte, May 7-9, 2001.

These speakers have positively influenced their organizations, and in some cases countries, by facilitating and leading major change initiatives. The initiatives have included utilizing technologies, public-private partnerships, and very aggressive entrepreneurial joint venture agreements between public and private organizations to create innovative solutions. Their efforts have had a tremendous positive impact on the bottom line.

On Monday, May 7, 1:30 p.m.-3:00 p.m., Janet Caldow and a group from IBM Corporation will share world-class examples of e-government applications from the United States and Canada to illustrate how IBM's clients have been able to revolutionize public service. The session number is M103: E-Government and the Bottom Line: IBM's Innovative Approach.

Also on Monday, May 7, 3:30 p.m.-5:00 p.m., Art Daniels, Government of Ontario, Canada, an internationally recognized speaker and author, will present the secrets to making private-public partnerships win-win propositions. Daniels will provide valuable insights by presenting the case of Teranet, a private-public sector partnership. This partnership spawned a multimillion-dollar global firm specializing in geographical information services, which draws on the strengths of the public and private sectors. The session number is M203: How to Build Successful Private/Public Partnerships: A Case Study of a $100 Million Global Enterprise.

Finally, on Wednesday, May 9, 9:15 a.m.-10:45 a.m., Chan Meng Khoong, founder and managing director of SCS Foresight, a leading management consulting group in Singapore, will discuss e-business models in government and industry. He has published three books and more than 100 articles in professional periodicals, and serves on the editorial boards of five international management journals. He is a National Computer Board scholar with a distinguished service record. The session number is W111: How e-Business Models Are (Finally) Reinventing Government, Industry, and Society.


Thinking about taking the CSQE Exam?

The Westfall Team can help you get ready.

CSQE Refresher Course
April 23-27, 2001, Dallas, Texas
or
October 15-19, 2001, Pittsburgh, PA
(** take the special offering of the CSQE exam Sunday, October 21, prior to 11ICSQ)

You will receive:
• In-depth review of all areas of the CSQE Body of Knowledge
• All presentation materials with annotated notes
• Sample questions to practice taking the exam

For more information call 214-544-3694 or visit www.westfallteam.com
3000 Custer Road, Suite 270, PMB 383, Plano, TX 75075-4499
e-mail: [email protected]


DIVISION COUNCIL MEMBERS 2000-01

Linda Westfall, Chair, 972-867-1172, [email protected]
Michael Kress, Chair-elect, 425-717-7038, [email protected]
Jayesh Dalal, Past Chair, 732-949-7064, [email protected]
Sue McGrath Carroll, Internet Liaison, 919-677-8000 ext 7032, [email protected]
Sharon Miller, Treasurer, 972-985-5017, [email protected]
G. Timothy Surratt, Secretary, 630-713-5522, [email protected]
Scott Duncan, Standards Chair, 706-565-9468, [email protected]
Doug Hamilton, Certification, 847-714-3306, [email protected]
Claire L. Lohr, Education & Training, 703-391-9007, [email protected]
Tom Griffin, Publications, 334-244-3304, [email protected]
Selim Aissi, Strategic Planning Chair, 503-264-8510, [email protected]
Taz Daughtrey, Liaison, Journal Editor, 804-237-2723, [email protected]
David Zubrow, Metrics, Measurement, and Analytical Methods, 412-268-5243, [email protected]
Kim Cobler, Methods, 801-363-7770, [email protected]
Pam Case, Membership, 301-261-3805, [email protected]
Theresa Hunt, Programs, 407-859-7414 ext 2306, [email protected]
Patricia McQuaid, World Congress, 805-756-1473, [email protected]
Awards & Recognition Chair: OPEN

[Region map legend: 4 = Canada; * includes Alaska and Hawaii; ** includes Mexico.]


REGIONAL COUNCILORS

Region 1 - John Pustaver, 978-443-4254, [email protected]
Region 2 - Jean Burns, 607-779-7868, [email protected]
Region 3 - Bill Folsom, 203-385-4339
Region 4 - Stephen White, 613-727-1304 x1668
Region 5 - Joel Glazer, 410-765-2346, [email protected]
Region 6 - Tom Gilchrist, 425-234-4865, [email protected]
Region 7 - OPEN
Region 8 - Ralph Mohr, 614-464-3360, [email protected]
Region 9 - John Lowe, 937-429-6458, [email protected]
Region 10 - David Walker, 800-831-6314
Region 11 - Dave Williamson, 540-344-9205 x1135, [email protected]
Region 12 - Bob Colby, 630-979-6783
Region 13 - Michael Suelzer, 785-550-0006, [email protected]
Region 14 - OPEN
Region 15 - Carol A. Dekkers, 813-393-6048
Region 25 - Deependra Moitra, +91-80-527-1771, [email protected]


CSQE Exam: June 2, 2001. Application deadline is April 6, 2001.

EDITOR
Dr. Tom F. Griffin III, AUM, IS & DS
P.O. Box 244023, Montgomery, AL 36124-4023
voice: 334-244-3304 (business); fax: 334-244-3792
e-mail: [email protected]

EDITORIAL REVIEW BOARD
Linda Westfall, Chair; Michael Kress, Chair-elect; Tom Griffin, Publications Chair; John Horch, Associate Editor; Ann Britt, Associate Editor; Larry Thomas, Associate Editor; Dave Zubrow, Associate Editor

Submit articles for the next issue of Software Quality by April 1, 2001.

EDITORIAL POLICY
Unless otherwise stated, bylined articles, editorial commentary, and product and service descriptions reflect the author's or firm's opinion. Inclusion in Software Quality does not constitute endorsement by ASQ or the Software Division.

ADVERTISING
Full page: $500 per issue; 1/2 page: $250; 1/4 page: $125
Contact: Dr. Tom F. Griffin III, phone: 334-244-3304, fax: 334-244-3792, e-mail: [email protected]

Yes! Please enter my subscription to Software Quality Professional, a quarterly publication focusing on the needs of professionals in software quality.

Member Number / Name / Company Name/Agency / Title / Address, Apt./Suite # / City, State/Province, Zip+4/Postal Code, Country / Telephone, Fax / E-mail

Payment options: All orders must be paid in U.S. currency. Please make checks payable to ASQ. Checks and money orders must be drawn on a U.S. financial institution. All prices are subject to change without notice.
Payment enclosed: Check / Money Order (Amt. Paid)
Please charge: VISA / MasterCard / American Express (Charge Card No., Exp. Date, Cardholder Name, Signature, Cardholder Address)

Subscribe by:
Phone: 800-248-1946 or 414-272-8575 (outside North America)
Fax: 414-272-1734
Mail: ASQ Customer Service Center, P.O. Box 3005, Milwaukee, WI 53201-3005

Rates:
              ASQ Members   Nonmembers   Institutional
U.S.          $40.00        $70.00       $120.00
International $60.00        $95.00       $150.00
Canada        $60.00        $95.00       $150.00

Software Quality Professional is published in December, March, June, and September.
