ASQ - Software Division

Copyright Notice This newsletter is published as nonprofit, educational work, Copyright (c) 2008 by American Society for Quality. Permission to make digital or hard copies of part or all of this work for personal or academic classroom use is granted without fee, provided that copies are not made or distributed for financial profit or commercial advantage and those copies bear this notation and full citation. Copyright for components of this work owned by others than ASQ must be honored. Abstracting in part with due credit citation is permitted. To copy this work otherwise, to republish, to post on servers, or redistribute to lists or list servers, requires prior specific permission.

 

Message from the Chair – W. L. ‘Bill’ Trest, ASQ CSQE

It is with humility, gratitude and excitement that I begin service as Chair of the ASQ Software Division. I wish to thank David Walker for his past and continued service. During his term as chair, the Software Division grew significantly in both membership and member value. David’s leadership also produced a marked increase in the division budget, which continues today thanks to a series of inspired technical conferences and annual events, with a constant focus on member value.

As a Systems/Software Quality practitioner with many years’ experience, I have seen profound changes and lessons in software products, processes and technology that I hope to share as the incoming chair.

By way of introduction, this I believe:

  1. A higher level of software development process performance yields higher-quality products.
  2. Software management and professionals constitute different cultures; neither is “right” or “better” and both are necessary to the organization.
  3. Organizations can improve both morale and teamwork by understanding that professionals wish to be judged in terms of achievement, rather than evaluated in the same manner as managers, administrative staff or factory workers.
  4. Key to achieving systems and software quality success is asserting professional knowledge, skills and attitude, as well as applying such attributes as one team.

Above all, I plan to be as approachable and accessible to ASQ SD members as necessary. I’m generally an easy person to find in Texas, but if you want to contact me, I can be reached at this e-mail: bill.l.trest@lmco.com.

Don’t become a stranger!

Regards: Bill Trest, Chair

Farewell Message from the Immediate Past Chair – David Walker

It is with great confidence that I pass the baton to Bill Trest to lead the Software Division into 2009-10. Bill has a wealth of experience in the software industry, has applied software quality principles through the evolution of technology, has a collaborative leadership style, and has a very strong commitment to software quality and safety.

Over the last two years, I have focused on recruiting new volunteers, providing a focus on annual objectives, achieving operating discipline with respect to ASQ division management practices, and advancing membership value. I do hope that members are experiencing the effects of this work.

It has been thrilling to watch knowledge being transferred through webinars, conferences, courses, journals, newsletters, and discussion boards. It is my vision that the more members get involved and contribute, the more knowledge they harvest.

Please remember that you always have a direct communication link to the Software Division Management Committee through this email swd@asq.org. I will be assuming the roles of Nominations Chair, Audit Chair, and Regional Councilor Coordinator. So, I will still be very involved in the Software Division’s direction.

I wish everyone a great summer!



Software Division Reaches Agreement with CAI for Free Webinars and Big Discounts on Conferences

ASQ Software Division has reached an agreement with Computer Aid Inc.’s IT Metrics and Productivity Institute to bring members two great benefits.

  1. Software best practices webinars. Registration is free, and sessions are scheduled nearly every week: http://www.itmpi.org/webinars/
  2. The 2008 Software Best Practices Conferences at a discounted rate of $395 (regular $695).

The following discount code is needed for the on-line registration to the Software Best Practices Conferences: “ASQyy8”



Plan to Attend the 4th World Congress for Software Quality (WCSQ)

Are you passionate about building better software? Do you want to be part of a major new effort, partnered with leading experts in software quality, to define how the practice of software quality should evolve to meet ever-increasing demands? If so, plan to attend the 4th World Congress for Software Quality (WCSQ) at the Hyatt Regency in Bethesda, Maryland from 15-18 September 2008. Not only is this exciting event a partnership between the Software Division of the American Society for Quality, the Software Group of the European Organization for Quality, and the Union of Japanese Scientists and Engineers, it also provides the opportunity to attend VERIFY, a leading conference on testing topics. See www.asq.org/conferences/wcsq for details.



Evolving Software Quality Through 2015
Article by Nicole Radziwill, ASQ Software Division Chair-Elect, 2008-2010

(Nicole Radziwill is the Assistant Director (VP) for End to End Operations at the National Radio Astronomy Observatory in Charlottesville, VA, where she oversees software development at the executive level. She has an MBA, is a PhD candidate in Technology Management with a Quality Systems specialization at Indiana State University, and is co-chair, innovation research group of the Network Roundtable at the University of Virginia.)

In an excellent article in the June 2008 CrossTalk, The Journal of Defense Software Engineering, Dr. Watts Humphrey discusses the current state of software quality. He points out how common defects are in even the most complex, well-tested systems and, through a very thoughtful analysis, concludes that “the current testing-based quality strategy has reached a dead end”. Because we “have complex life-critical [software] systems [now]… and will have much larger ones in the relatively near future… we must do something, but what?” This “something” refers to improving the discipline of software quality, a task in which the ASQ Software Division is keenly interested as it formulates its strategy for the upcoming years.

It is possible that we can uncover some solutions for ourselves by examining the progress of the quality discipline in manufacturing since its inception. According to Conti, Kondo & Watson (2003), there have been four “learning cycles” of quality improvement observed in U.S. industry over the past two centuries. The first cycle involved transitioning from the craftsman model to the mode of mass production, which occurred during the early part of the 20th century. It was fueled by the availability of reusable parts. The second cycle, from the 1940’s through the 1970’s, occurred when the emphasis shifted from inspection to prevention, and designing defect-free processes and products became just as important as inspecting problems out. The defining characteristic of the third cycle was recognizing that a definition of “defect” had to factor in a customer’s perception of what constituted quality. This is embodied by Juran’s definition of quality as “fitness for use” that was popularized in the 1980’s. The fourth cycle, which emerged during the 1990’s, focused on involving the entire organization in the process and mechanics of quality improvement.

The shift to modular, reusable code that could be obtained from accessible repositories could be considered the software industry’s equivalent of the first learning cycle. However, have we, as software professionals, collectively shifted our focus from inspection to prevention? Have we acknowledged the many definitions of defect that might impact how we structure, manage and monitor the software development life cycle? Is software quality yet an organizational imperative? Furthermore, how do we balance the structures and systems required to deliver high-quality software while preserving a culture where innovation can flourish and where our efforts can directly contribute to the competitiveness of our companies?

These unique challenges require that we, as software professionals, expand our personal perspectives on quality and seek out the benefits and insights that our non-software peers in ASQ can provide. At the same time, we must raise the profile of software quality throughout ASQ to ensure organization-wide participation, since software-intensive systems are poised to remain an essential part of our lives and societies. Finally, we must expand our own awareness and expertise in improving quality by means other than inspection. We invite any interested Software Division members to work with us to devise and execute actionable steps to advance software quality as a discipline over the next few years. Contact Nicole at nicole.radziwill@gmail.com to participate.

References
Conti, T., Watson, G. H. & Kondo, Y. (2003). Quality into the 21st century: perspectives on quality and competitiveness for sustained performance. Milwaukee, WI: Quality Press.
Humphrey, W. (2008). The software quality challenge. CrossTalk: The Journal of Defense Software Engineering, June 2008.



Aerospace Corner
Article by W. L. "Bill" Trest ASQ CSQE; Chair, ASQ Software Division

The software quality ‘ROMP’…

Software quality seems a simple concept from a software provider/developer’s point of view -- identify requirements, assure the requirements are achieved, and deliver the required products on time to the right customers. In other words, from a provider/developer’s point of view, software quality is conformance to specifications. Never mind that the quality of the software specification itself may not be as complete and correct as necessary for the software quality the customer really expected or reasonably believed was needed. From the customer’s point of view, on the other hand, quality is perceived fitness for use; in other words, whatever satisfies the customer, right? The problem here is that quality rapidly becomes subjective, and even the most well-meaning software folks risk losing management control of software quality, cost, and schedule altogether. The opportunity for risk management is also perishable! For as often as well-meaning, reasonably skilled software folks may lose control of development, they may just as well run into an opportunity for quality, cost, and schedule improvements.

Risk opportunity management planning (ROMP) is the act or practice of dealing with risks and opportunities (R/O). It includes planning for R/O, assessing, identifying and analyzing areas of R/O, developing R/O handling options, monitoring to determine how R/Os have changed, and (probably most important) documenting the overall ROMP efforts. The majority of ROMP efforts can be accomplished without special tools since the day-to-day activity of ROMP involves team member meetings. Having a risk manager on an organization chart looks great, but successful ROMP is not a one-person show. Everyone must be active to identify and rank R/Os for their area of responsibility as well as to develop and execute ROMP activity. The best insight into effective R/Os comes from people most directly involved.
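Since ROMP deliberately requires no special tooling, the register itself can be as simple as a ranked list maintained in team meetings. The sketch below is purely illustrative -- the item names, owners, and values are made up, and the probability-times-impact "exposure" metric is just one common way to rank risk/opportunity items:

```python
from dataclasses import dataclass

@dataclass
class RiskOpportunity:
    """One risk (threat) or opportunity item in a ROMP register."""
    title: str
    owner: str            # team member responsible for this area
    probability: float    # likelihood of occurrence, 0.0-1.0
    impact: int           # cost/schedule/quality impact, e.g. 1 (low) to 5 (high)
    handling: str         # chosen handling option: avoid, mitigate, accept, exploit, ...

    @property
    def exposure(self) -> float:
        """A common ranking metric: probability times impact."""
        return self.probability * self.impact

def rank_register(items):
    """Return items ordered highest-exposure first, for review in team meetings."""
    return sorted(items, key=lambda item: item.exposure, reverse=True)

# Hypothetical register entries, one per area of responsibility.
register = [
    RiskOpportunity("Late requirements baseline", "Systems lead", 0.6, 4, "mitigate"),
    RiskOpportunity("Reusable test harness from prior program", "Test lead", 0.8, 3, "exploit"),
    RiskOpportunity("Compiler vendor end-of-life", "SW lead", 0.2, 5, "accept"),
]

for item in rank_register(register):
    print(f"{item.exposure:4.1f}  {item.title} ({item.handling})")
```

The point of the sketch is the process, not the arithmetic: each item has an owner, a handling decision, and a rank that the whole team revisits as risks and opportunities change.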

Finally, anything worth doing is worth periodic check-ups. Two things will make a ROMP really terrible: (1) management failure to use the ROMP process for program management, and (2) team member failure to participate effectively. The fundamental questions for an independent Systems/Software Quality program to evaluate are as follows: Is the Risk/Opportunity Management Process used to manage the program, or simply performed while the program is being “managed”? And do any team members -- for whatever reasons -- “withhold” issues and concerns from ROMP and attempt to resolve matters on their own? In either case, the ROMP failure may be attributed to inadequate human factors and/or inadequate process deployment.

(W. L. "Bill" Trest is the 2008-2010 Chair, ASQ Software Division, as well as an ASQ Certified Software Quality Engineer (ASQ CSQE) employed by Lockheed Martin Aeronautics Company in Fort Worth, Texas, with 25+ years’ experience in software. Bill is also a Senior Member of the Greater Fort Worth Section 1416 of the American Society for Quality.)



What Exactly Is Automated Software Testing?
Article by Kenneth White, ASQ CQIA

What do the following three things have in common — Bigfoot, the Loch Ness Monster, and Automated Software Testing? Answer: Everyone has heard of them, but no one actually has any proof they exist. At least there are, reportedly, pictures of the first two and opinions about the last.

Nearly every company that writes software has, at one time or another, flirted with the idea of automation to test their software. However, many of these projects end in failure because the company jumps into Automated Software Testing head first not knowing what to expect, or worse, expecting the wrong result.

As described during the 2007 International Conference on Software Quality, Automated Software Testing (AST) is using one software program to ‘drive’ another, either by mimicking human behavior through a User Interface (UI) or by interacting directly with the application under test via an Application Programming Interface (API). This article focuses upon UI-oriented automation testing tools.

Typical UI-oriented AST tools have two basic pieces: 1) a way to recognize and interact with the software item under test, and 2) a language in which to write automated scripts. The methods of interaction are as varied as the available tools, but typically the AST will take control of the computer system and move the mouse or provide input much as the user or subsystem would. It finds the ‘correct’ place to click, type, or provide input (stimulus) by recognizing the object types (buttons, commands, text boxes, etc.) and interacting with them as necessary. How well the objects are found and how ‘appropriate’ the interactions are will vary from tool to tool.

Regardless, behind the scenes there is normally a set of instructions written in the tool’s language. This set of instructions is most commonly referred to as a “script”, even though “script” is not technically accurate in some cases. Scripts can be very much like application source code, simply a set of actions in a table, or some hybrid of both.
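To make the "set of actions in a table" idea concrete, here is a deliberately toy sketch: a dictionary stands in for the real UI, and the action keywords are invented for illustration, not taken from any real tool. It shows the shape of a table-style script and the small driver that interprets each row:

```python
# The "application under test", simulated here as plain UI state.
app_state = {"username": "", "password": "", "logged_in": False}

def type_into(field, text):
    """Simulates typing text into a UI field."""
    app_state[field] = text

def click(button):
    """Simulates clicking a button; 'login' checks the entered credentials."""
    if button == "login":
        app_state["logged_in"] = (app_state["username"] == "admin" and
                                  app_state["password"] == "secret")

def verify(field, expected):
    """A checkpoint row: fail the test if the UI state is not as expected."""
    assert app_state[field] == expected, f"{field}: expected {expected!r}"

# The "script": a table of (action, arguments) rows rather than program code.
script = [
    ("type",   ("username", "admin")),
    ("type",   ("password", "secret")),
    ("click",  ("login",)),
    ("verify", ("logged_in", True)),
]

ACTIONS = {"type": type_into, "click": click, "verify": verify}

def run(script):
    # The driver dispatches each table row to the matching action keyword.
    for action, args in script:
        ACTIONS[action](*args)

run(script)
```

A real AST tool replaces the simulated functions with actual mouse, keyboard, and object-recognition machinery, but the table-plus-driver structure is the same.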

It is not the purpose of this article, or of the ASQ Software Division, to say which AST tools are best or better than another; however, only a few universally ‘excellent’ tools, such as Test Data Generators, File Comparison Tools, and Simulation Tools, appear to be available. As with the UI portion, the usability and flexibility of the automation language will vary from tool to tool.

(Kenneth White is a Test Automation Architect with Advanced Solutions International, Austin, Texas, a member of ASQ Section 1414, and an ASQ CQIA.)



Standards Chair Report
Article by Theresa Hunt, CSQE, CSTE, Chair, ASQ Software Division Standards

This article summarizes the current activities of the US Technical Advisory Group (TAG) to JTC1/SC7, covering the most recent meeting of the TAG (meeting #58, held March 25-27, 2008, in Bethesda, Maryland), and addresses other news from the Software Division Standards Committee.

For those who may be unfamiliar with how the JTC1/SC7 TAG is organized, or just need a refresher: Sub-Committee Seven (SC7 - Software and Systems Engineering) is part of Joint Technical Committee One (JTC1 - Information Technology) of the International Organization for Standardization and the International Electrotechnical Commission (ISO/IEC). The U.S. secretariat for JTC1 is the American National Standards Institute (ANSI). ANSI is the United States’ member body of ISO. JTC1/SC7 is currently comprised of the following active Working Groups (WGs):

WG2 – Systems and Software Documentation
WG4 – Tools and CASE Environments
WG6 – Measurement and Metrics
WG7 – Life Cycle Management
WG10 – Process Assessment
WG19 – ODP and Modeling Languages
WG20 – Software Engineering Body of Knowledge
WG21 – Software Asset Management
WG22 – Software and Systems Engineering Consolidated Vocabulary
WG23 – Systems Quality Management
WG24 – Software Lifecycles for Very Small Enterprises
WG25 – Information Technology Services Management
WG26 – Testing
WG42 – Architecture
WG1A – IT Governance
JWG with ISO/TC159/SC4 – Common Industry Format for Usability

The following summarizes the work of the WG Task Groups (TGs) who participated in meeting #58:

Task Group #2 – Systems & Software Documentation
New Work Item Proposal (NWIP) – Software and Systems Engineering – Requirements for acquirers and suppliers of user documentation (proposed ISO/IEC 26512): TG2 recommended and the TAG voted to accept the NWIP and to support the addition of this new work item into the joint technical committee.

Committee Draft (CD) 26513 Testing and Assessment of User Documentation: TG2 recommended and the TAG voted to approve with comments, mainly to reflect consistency with ISO/IEC 12207:2008 and ISO/IEC 15288:2008.

Other WG2 actions:

  • Began Working Draft (WD) revisions to ISO/IEC 15289, content of systems and software life cycle process information products (documentation) – the objectives, process, and schedule were reviewed with TG7. Revisions will reflect consistency with ISO/IEC 12207:2008 and ISO/IEC 15288:2008.
  • Project 26514, Systems and software engineering – Requirements for designers and developers of user documentation, was re-sent for Final Draft International Standard (FDIS) ballot on 1/20/2008.

Task Group #6 – Measurement and Metrics
Final Committee Draft (FCD) 25012.2: Software Engineering - Software product Quality Requirements and Evaluation (SQuaRE) - Data Quality Model: TG6 recommended ballot disapproval unless submitted changes are made. Mainly, the document attempts to categorize characteristics into artificial categories (Intrinsic/Extended) in a confusing way that does not add value.

CD 25040.2: Software engineering - Software product Quality Requirements and Evaluation (SQuaRE) – Evaluation reference model and guide: TG6 recommended ballot approval with changes, mainly a concern that the scope creeps into 25030.

Combined WD circulation and CD registration, WD25045, Software Engineering – Software product Quality Requirements and Evaluation (SQuaRE) – Evaluation Module for Recoverability: TG6 recommended ballot approval with comments; mainly the Technical Report (TR) contains a conformance clause.

Combined WD circulation and Preliminary Draft Technical Report (PDTR) registration, WD25060, Software product Quality Requirements and Evaluation (SQuaRE) — Common Industry Format (CIF) for Usability — General Framework for Usability-related Information: TG6 recommended ballot approval with comments.

NWIP for the revision of IS 19761: Software Engineering - COSMIC: A functional size measurement method, Specification of Data Value Domain: TG6 approved with comments and recommended that the document be moved to FCD status and remain within the purview of WG6.

Task Group #7 - Life Cycle Management
ISO/IEC 15288, System Life Cycle Processes, ed. 2, and ISO/IEC 12207, Software Life Cycle Processes, ed. 2, were both published in March 2008. Both works were approved with 90% ballot approval.

ISO/IEC 24748 TR, Guide for Life Cycle Management is currently going through the PDTR balloting process. TG7 recommended approval with no comments. Revisions will be coordinated with IEEE CS in the same manner as is done with 15288 and 12207.

ISO/IEC 19760 TR, A guide for the application of 15288 and ISO/IEC 15271 TR, A guide for the application of 12207. TG7 discussed the annotated outlines and reached an agreement on content outline and population of sections, primarily only short summary information from 24748 should be included with the focus being on unique information in these guides with pointers back to 24748 for common information and definitions.

ISO/IEC 15026, Systems and Software Assurance. This standard is being broken up into four parts. TG7 recommended approval of the proposal for subdivision of the standard, noting that parts 2 and 4 need to be worked in parallel and approved in roughly the same time frame to ensure adequate integration between the parts. The recommendation was approved by the TAG.

ISO/IEC 16326, Life cycle processes - Project Management. The latest publication of this standard is a result of its harmonization with IEEE Std 1058. The FCD ballot closed in December 2007, and comments were submitted with a recommendation to approve with comments. The TG7 lead will request an advance copy of the proposed comment dispositions from the WG7 convener.

ISO/IEC/IEEE 26702, Application and Management of the System Engineering Process. There is a proposal for revision of 26702 to align with 15288. A draft New Work Item Proposal (NWIP) will be submitted at the 2009 plenary meeting.

NWIP – Proposed fast track of INCOSE Systems Engineering Handbook V3.1. Analysis shows significant consistency and compatibility issues with 15288:2008. WG7 recommends follow-up with INCOSE to discuss the identified issues and to decline submission of the proposed NWIP and contribution of the document until the issues are resolved. The TAG approved the recommendation.

Task Group #10 – Process Assessment
Theresa Hunt will be the new TG10 Lead with John Phippen assuming responsibilities for the international meetings.

NWIP N3915 – Benchmarking Framework and NWIP N3910 – IT Process Assessment Capability target profiles were approved by the TAG.

ISO/IEC 15504-8, Information technology – Process Assessment – Part 8: An exemplar process assessment model for IT service management. This new project was approved; however, TG10 and the US TAG maintain their opposition to the ISO development of a process assessment model, maintaining that it should be left to the marketplace.

Following is the current status of the 15504 series:

  • ISO/IEC 15504-1:2004 Information technology -- Process assessment -- Part 1: Concepts and vocabulary - Published Standard, Edition 1, Stage 90.60 (close of review)
  • ISO/IEC 15504-2:2003 Information technology -- Process assessment -- Part 2: Performing an assessment - Published Standard, Edition 1, Stage 60.60 (IS Published)
  • ISO/IEC 15504-2:2003/Cor 1:2004 - Published Standard, Edition 1, Stage 60.60 (IS Published)
  • ISO/IEC 15504-3:2004 Information technology -- Process assessment -- Part 3: Guidance on performing an assessment - Published Standard, Edition 1, Stage 90.60 (close of review)
  • ISO/IEC 15504-4:2004 Information technology -- Process assessment -- Part 4: Guidance on use for process improvement and process capability determination - Published Standard, Edition 1, Stage 60.60 (IS Published)
  • ISO/IEC 15504-5:2006 Information technology -- Process Assessment -- Part 5: An exemplar Process Assessment Model - Published Standard, Edition 1, Stage 60.60 (IS Published)
  • ISO/IEC PRF TR 15504-6 Information technology – Process Assessment – Part 6: An exemplar System Life Cycle Process Assessment Model - Under Development - Edition 1, stage 50.60 (close of voting. Proof returned by secretariat)
  • ISO/IEC DTR 15504-7 Information technology – Process Assessment – Part 7: Assessment of Organizational Maturity - Under Development - Edition 1, stage 40.99 (Full report circulated: DIS approved for registration as FDIS)
  • ISO/IEC NP TR 15504-8 Information technology – Process Assessment – Part 8: An exemplar process assessment model for IT service management - Under Development - Edition 1, stage 10.99 (New project approved). The US TAG maintains its opposition to the ISO development of this process assessment model, maintaining that it should be left to the marketplace.

Task Group #21 – Asset Management

In brief:
ISO/IEC 19770-1:2006, Information technology – Software asset management – Part 1: Processes. This standard is up for review this year.

ISO/IEC 19770-2 is circulating as a working draft and will move to the final draft stage in May 2008. Regarding the IBSMA publication of the 19770-2 draft:

  • Copyright has been transferred to ISO
  • ISO accepted release of the draft that was published in July 2007

ISO/IEC 19770-3 has no activity.

TG21 is forming a study group to look at the Assessment Model under the 15504 framework.

Task Group #22 – Software & Systems Consolidated Vocabulary
ISO/IEC FCD 24765, Systems and software engineering vocabulary. The vote to approve FCD 24765 is in process with balloting due on July 17, 2008. The online vocabulary status is currently 3358 terms, 4169 definitions, 173 terms with definitions, and 103 sources. For this project, IEEE contributed IEEE Std 610.12, SC 7 contributed its vocabulary aggregations, and other sources, e.g. PMI, made contributions. The IEEE Computer Society (CS) has developed and is hosting a database application providing public web access: http://www.computer.org/sevocab. SC 7/WG 22 will maintain the database and will facilitate the consolidation of alternative definitions. Occasional snapshots of the database will be published as ISO/IEC and IEEE 24765.

Task Group 25 – IT Service Management
ISO/IEC 20000-2:2005, Information technology – Service management – Part 2: Code of practice. The US considers the call for comment at this time to be unconventional and requests deferral until completion of comment resolution and readiness for publication of part 1 (Specification).

ISO/IEC CD 20000-3, Information technology -- Service management -- Part 3: Guidance for the scoping and applicability of ISO/IEC 20000-1. TG25 provided comments and recommended ballot disapproval, noting that fundamental problems with parts 1 and 2 should be addressed before additional parts are produced and resources diverted. The recommendation was approved by the TAG.

Proposed ISO/IEC 20000-4, Information technology – Process Reference Model (PRM). The US maintains its opposition to the ISO development of these artifacts (PRMs), maintaining that such development should be left to the marketplace.

NWIP for ISO/IEC IT Governance. TG25 recommended approval of this NWIP.

Task Group 26 – Testing
The NP ballot for this new series of standards was approved in May 2007 and IEEE-CS will provide the editor. IEEE has contributed both IEEE 829 and IEEE 1008 and BSI has contributed both BS 7925-1 and BS 7925-2 as base documents for this project. There will be four parts to the new standard which will cover a larger scope than the current standards. Parts 2 and 3 will be developed in parallel and parts 1 and 4 will be developed in parallel.

Task Group 42 – Architecture
ISO/IEC 42010:2007 Systems and software engineering -- Recommended practice for architectural description of software-intensive systems – Working Draft 2. TG42 reviewed and dispositioned comments on WD2 of 42010. Release of a Committee Draft is planned for October 2008. TG42 and TG 7 met to discuss the integration of 42010 and 15288 and to identify/align 15288 processes that produce inputs for 42010 descriptions.

Ad Hoc Task Group
FCD ISO/IEC 18018, Information Technology – Guidance to Configuration Management Tool Capabilities. The TAG voted to approve the FCD ballot with comments, mainly a request to update Annex A to reflect the revised content of 12207:2008 and 15288:2008.

IEEE Liaison
Two of the largest collections of Software and Systems Engineering standards belong to SC7 and IEEE. Some IEEE standards are adopted by SC7 and some SC7 standards are adopted by IEEE. Sometimes, IEEE and SC7 merge their respective standards or perform a coordinated development of a new standard or a revision. IEEE is both a member of the US TAG to SC 7 and a Category A liaison to SC7.

New projects underway in IEEE:

  • Adopt ISO/IEC 15289 to replace IEEE 12207.1
  • Adopt ISO/IEC 20000-1 and 20000-2 to provide a shared basis for work on IT Service Management
  • Rejected adoption of ISO/IEC 25051, Software engineering -- Software product Quality Requirements and Evaluation (SQuaRE) -- Requirements for quality of Commercial Off-The-Shelf (COTS) software product and instructions for testing. Comments will be sent to WG6 for consideration in revising the standard. The balloting group also decided to withdraw the competing IEEE standard.

Upcoming U.S. TAG Meetings
TAG Meeting #59: September 16-18, 2008, IDA (Institute for Defense Analyses), Alexandria, VA

TAG Meeting #60: March 2009, MSI Systems Integrators, Portland, OR

International: Interim meeting (dates to be determined), China; SC7 Plenary meeting (dates to be determined), India

Other News on Standards
I am pleased to announce that the Software Division now has representation on the Association for the Advancement of Medical Instrumentation (AAMI) Medical Device Software Standards Committee. David Walker will be the point of contact for the standards committee. (See next article.) This broadens the standards committee’s involvement in industry-specific standards work, allowing us to represent the interests of even more of our membership.

Those interested in any of the topics mentioned above, or other standards-related issues, can send email to Theresa Hunt, the Software Division Standards Chair, at theresahunt@cfl.rr.com or Theresa.Hunt@GDIT.com



ASQ SD Membership on Medical Device Software Standards
(Article by David Walker, Immediate Past Chair, ASQ Software Division)

In June, 2008, the Software Division established a liaison with the AAMI (Association for the Advancement of Medical Instrumentation) Medical Device Software Standards Committee. I have been granted membership with voting rights and will be providing a voice for ASQ on this committee.

This is a very important win/win relationship, as ASQ has the knowledge and skills in applying safety and quality principles to product development and AAMI has the knowledge and skills in the medical device domain. Software is of special importance as it increasingly controls electronic and mechanical systems in medical devices. International standards are in work that will drive the future of how software-intensive medical devices are developed. In 2005, I served on this committee in the domestic review of the IEC 62304 Medical Device Software Lifecycle. I am excited to play a key role in this effort and will be providing reports in future newsletters to keep you all informed of new developments.

First up is a new IEC/TR 80002, Medical device software - Guidance on the application of ISO 14971 to medical device software, which has recently been issued to AAMI/SW for ballot as a CD. This international standard reinforces some of the principles and philosophies from the AAMI TIR 32 Software Risk Management but aligns better with IEC 62304. I will provide a summary of this new important standard in our next newsletter.
Don’t be afraid to log into our discussion boards, too. We have had some good medical device software discussions there: www.asq.org/software/discuss/index.html



New IEEE Std 829, System and Software Test Documentation
Article by Eva Freund, IEEE CSDP, ASQ CSQE
Edited by Claire Lohr [Lohr Systems], Chair, 829 Working Group

On March 27, 2008, the IEEE Standards Association approved for publication a brand-new version of an existing IEEE standard. It is estimated that in July 2008 the community of system and software testers will have available the new IEEE Standard for System and Software Test Documentation. The prior standard solely described the format and content of numerous items of test documentation. The new standard removes some items of test documentation and modifies the format and content of the remaining items. In addition, the new 829 standard provides the following changes:

New directions

  • Introduces the concept that the test effort has tasks to accomplish during the entire development life cycle, not merely during the test activity.
  • Moves from a document focus to a process focus. This is in keeping with the IEEE Standards Association direction.

New test related documentation

  • Adds a Master Test Plan, which governs the management of a large and/or complex test effort.
  • Adds a Master Test Report, which may summarize the results of the tasks identified in the Master Test Plan and may consolidate the results of multiple Level Test Reports.
  • Adds a Level Interim Test Status Report, which is used during the test execution activity.
  • Moves away from stand-alone documents. The standard recognizes that a project may want some documents stand-alone and some combined, and it allows any combination of plan, design, test cases, and test procedures within test levels.
  • Adds a process for choosing appropriate documentation and contents.
  • Moves away from requiring identical documentation on every project. The standard provides for documentation based on the integrity level of the project and identifies the minimum recommended tasks for that integrity level.

New processes

  • Introduces the concept of integrity levels and provides a mechanism by which projects can identify their integrity level. The higher the integrity level, the more test tasks are recommended.
  • Introduces the concept of test management. Describes tasks that are exclusive to those who manage a test effort.

The following key concepts are emphasized in this new 829 Standard:

  • Integrity Levels. Defines four integrity levels that describe how important an aspect of the software or system is to its users. The process of identifying the integrity level is the criticality analysis: each project or organization identifies the aspects of the system or software that matter most.
  • Recommended minimum testing tasks for each integrity level. Defines the recommended minimum testing tasks for each of the four integrity levels. Includes a table of optional testing tasks for tailoring the test effort to meet project needs and application-specific characteristics. A low integrity level project, such as an internal bug-tracking program, requires fewer test tasks than would a high integrity level project, such as one developing software/firmware for medical devices.
  • Intensity and rigor applied to testing tasks. Introduces the notion that the intensity and rigor applied to testing tasks vary with the integrity level; higher integrity levels require greater intensity and rigor. A high integrity level project, such as one developing medical devices, may execute a myriad of tests at the unit level as well as for integration and system/acceptance testing, probing each test level for every conceivable deficiency, while a low integrity level project may perform only acceptance testing against the primary functionality rather than system testing against the requirements.
  • Detailed criteria for testing tasks. Defines specific criteria for each testing task including minimum recommended criteria for correctness, consistency, completeness, accuracy, readability, and testability.
  • Systems viewpoint. Includes recommended minimum testing tasks to respond to system needs. Recognizes that software does not exist in isolation and that much of current software development may actually be for software intensive systems or for embedded firmware. Thus the entire system needs to be taken into account when identifying the system integrity level and the resultant test tasks.
  • Selection of test documentation. Both the types of test documentation and the content topics within each documentation type need to be selected based on the testing tasks associated with the identified integrity level. The prior standard required every project to use the same test documents and to include the same information. The current standard provides for tailoring based on the integrity level. Thus a high integrity level project (e.g., medical devices) will require the full range of test documentation and contents as described in the 829-2008 standard. Conversely a low integrity level project may require only a minimum quantity of test plan information and a full range of test case and test procedure information.
  • Compliance with International and IEEE Standards. The standard is mapped to the specific content requirements of IEEE/EIA 12207.0-1996 and IEEE/EIA 12207.1-1997. In addition, it conforms to IEEE Std 1012-2004 and is applicable for use with ISO/IEC 15288.
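The integrity-level idea above can be sketched in code. The level names, task lists, and the cumulative-inheritance rule in this sketch are illustrative assumptions, not the actual tables from IEEE Std 829-2008:

```python
# Illustrative sketch only: the integrity levels, task lists, and the rule
# that higher levels inherit all lower-level tasks are hypothetical examples,
# not the actual selection tables from IEEE Std 829-2008.

INTEGRITY_LEVELS = {4: "catastrophic", 3: "critical", 2: "marginal", 1: "negligible"}

# Hypothetical minimum test tasks added at each level.
BASE_TASKS = {
    1: ["acceptance test"],
    2: ["system test"],
    3: ["integration test", "component test"],
    4: ["component test procedure review", "independent test witnessing"],
}

def recommended_tasks(level: int) -> list[str]:
    """Return the cumulative recommended test tasks for an integrity level."""
    if level not in INTEGRITY_LEVELS:
        raise ValueError(f"integrity level must be 1-4, got {level}")
    tasks = []
    for lvl in range(1, level + 1):  # each level builds on all lower levels
        tasks.extend(BASE_TASKS[lvl])
    return tasks
```

Under these assumptions, a level-1 project (e.g., an internal bug tracker) gets only acceptance testing, while a level-4 project (e.g., medical device firmware) accumulates every lower-level task plus its own; the same lookup pattern could drive the selection of test documentation types.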

Submitted by Eva Freund [The IV&V Group, Inc.], Vice-chair, 829 Working Group; edited by Claire Lohr [Lohr Systems], Chair, 829 Working Group.

Back to top


FROM THE REGIONS

The following links will provide you with a snapshot of the latest activities in the regions.

Region 4 (Canada): Chris Fitzgibbon

This column provides members with information on relevant association meetings, conferences and other events in Canada. Be sure to visit the Software Division website for updated information: http://www.asq.org/software/. If you have information that you would like to share with fellow ASQ Software Division members, or you would like to get involved with the Division, contact me at chris@orioncanada.com or (613) 563-9000.

National
The Software Division is a proud partner of the World Congress for Software Quality (WCSQ) that is being held in Bethesda, Maryland (Washington, D.C. area) on September 15 to 18, 2008. This is a major international gathering of software quality professionals that is definitely worth attending. More information is available at: http://www.asq.org/conferences/wcsq/index.html.

Other upcoming conferences include:
CAST 2008, the Conference of the Association for Software Testing, on the theme "Software Quality & Testing within the Agile Development Process," in Toronto on July 14-16, 2008 (http://www.cast2008.org)
Agile 2008 Conference in Toronto on August 4 - 8, 2008 (http://agile2008.org)
QUEST Toronto 2008 (formerly the Toronto Quality Conference) at the Toronto Hilton, September 22-26, 2008 (http://www.qaiquest.org/toronto/)

Western Canada
I recently met with an organizer of the Software Quality Assurance Vancouver User Group (VanQ). The last meeting of the season was held at the end of May; the topic was "Increasing Test Effort Estimation Effectiveness," presented by Trevor Atkins of Silverpath Technologies. April's presenter, Jay Marino of Eclipsys Corporation, discussed advanced tools for application performance testing. VanQ meetings are typically held at the Burnaby campus of the British Columbia Institute of Technology (BCIT). Watch their website for the fall schedule of events: www.vanq.org.

The Calgary-based IEEE/ASQ Discussion Group for Software Quality resumes meetings in September 2008. Check the website (www.software-quality.ab.ca) for registration, location and upcoming topics.

Eastern Canada
The topic was “Six Short Talks about Software Testing” at the May event of the Toronto Association of Systems and Software Quality (TASSQ). The general meeting was held in June. TASSQ typically has dinner meetings at the Sheraton Centre Toronto Hotel on the last Tuesday of each month. Additional information on upcoming events is available on their website: www.tassq.org.

The Toronto SPIN closed its year with a presentation on "Use Cases and Estimation" on June 19, 2008. The fall line-up will be posted on their website (www.torontospin.com) later this summer.

The Montreal Software Process Improvement Network (SPIN) is currently planning the fall line-up. The website is a good source for French language information on software quality: http://www.spin-montreal.org/.

The Ottawa Software Quality Association (OSQA) and Ottawa SPIN event for May was titled “Can I Have My Requirements and Test Them Too?”. Meetings will resume in the fall. Monthly meetings are typically held at the Travelodge Hotel & Convention Centre in Ottawa. Additional information is available on the Ottawa SPIN website: www.spin.org.

The Information Systems Audit and Control Association’s (ISACA) Ottawa Valley Chapter had its annual general meeting on June 19, 2008. The Ottawa Valley Chapter celebrates its 30th year in 2008/2009.

Region 8: Greg Zimmerman

Hello from the new guy! I’m the new Regional Councilor for Region 8. If you have any information on local software related events or would like to get the word out to Region 8 members on upcoming conferences and volunteer organizing opportunities, please drop me a note.

The Software Engineering Institute lives in our region, yet there are very few SPIN groups in the area. If you are interested in participating in or helping to start up a SPIN, particularly in the Columbus or Cleveland metro areas, I’d like to hear from you.

The Central Ohio Quality Assurance Association (COQAA) is offering refresher courses for software certifications from the Quality Assurance Institute (QAI). QAI is a US-based organization specializing in software quality. Visit www.coqaa.org for more info.

Greg Zimmerman – Region 8, ASQ Software Division gregz@appliedqualitysolutions.com

Region 10: Louise Tamres

The Great Lakes Software Excellence Conference will be held Nov. 4-5 in Grand Rapids, Michigan. Potential speakers can submit their proposals by July 31. Conference information is available at glsec.org.

The Great Lakes SPIN is on summer hiatus. Find out about their upcoming programs and events at gl-spin.org.

The Southeastern Michigan Software Quality Assurance Association (SEMISQAA) is sponsoring QAI training sessions. QAI is a national organization specializing in software quality. Information available at semisqaa.org.

Ann Arbor Software Quality Professionals (AASQP) has suspended its monthly meetings due to low participation. The mailing list remains active as a source of communication and information. Access is through the Yahoo group tech.groups.yahoo.com/group/aa-sqp.

ASQ chapters in southeastern Michigan provide programs primarily related to manufacturing. Any programs focusing on software and software quality will definitely be highlighted when available.

If you have any information for Region 10 members or need referrals for software quality matters, please give me a shout at: l.tamres@computer.org

Back to top


Questions or comments?
Contact:
Software Division Web Site
Software Division Leadership

How can we improve Software Quality Live? Did this issue provide helpful information? Let us know!
Kathy Trest, Newsletter Editor

