
Software Quality Assurance
McCall's factor model tree
Define and give an example of each
Usability
Integrity
Efficiency
Correctness
Reliability
Maintainability
Testability
Flexibility
Reusability
Portability
Interoperability
The factor model tree pairs each factor with the criteria (sub-factors) that measure it:
Usability - Operability, Training, Communicativeness, Input/Output volume, Input/Output rate
Integrity - Access control, Access audit
Efficiency - Storage efficiency, Execution efficiency
Correctness - Traceability, Completeness
Reliability - Accuracy, Error tolerance, Consistency
Maintainability - Simplicity, Conciseness
Testability - Instrumentation
Flexibility - Expandability, Generality
Reusability - Self-descriptiveness, Modularity
Portability - Machine independence, Software system independence
Interoperability - Communications commonality, Data commonality
Verifiability - refers to design and programming features that enable efficient verification of the design and programming (e.g. modularity, simplicity, and adherence to documentation and programming guidelines).

Expandability - refers to the future efforts that will be needed to serve larger populations, improve service, or add new applications in order to improve usability.

Safety - meant to eliminate conditions hazardous to operators of equipment as a result of errors in process control software.

Survivability - refers to the continuity of service (e.g. the minimum time allowed between failures of the system, and the maximum time permitted for recovery of service).
Who is interested in defining
Software Quality Requirements?
The client is not the only party interested in thoroughly defining the
requirements that assure the quality of the software product.

The developer is often interested in adding requirements that represent his own interests, such as reusability, verifiability and portability requirements. These may not, however, be of interest to the client.

Thus, one can expect that a project will be carried out according to
two requirements documents:
The client's requirements document
The developer's additional requirements document.
Formal design reviews (DRs)
- require formal professional approval of the quality of the reviewed design documents, as stipulated in the development contract and demanded by the procedures applied by the software developer. It should be emphasized that the developer can continue to the next phase of the development process only on receipt of formal approval of these documents.
Peer reviews (inspections and walkthroughs)
- directed at reviewing short documents, chapters or parts of a
report, a coded printout of a software module, and the like.
Expert opinions
- support quality assessment efforts by introducing additional external capabilities into the organization's in-house development process.
Software Testing
based on a prepared list of test cases that represent a variety of expected scenarios.
Objectives of the software tests:
- detection of software faults and other failures to fulfill the requirements
- formal approval of a module or integration setup, so that either the next programming phase can begin or the completed software system can be delivered and installed.
Software maintenance services

Corrective maintenance - User support services and correction of software code and documentation failures.

Adaptive maintenance - Adaptation of current software to new circumstances and customers without changing the basic software product. These adaptations are usually required when the hardware system or its components undergo modification (additions or changes).

Functionality improvement maintenance - The functional and performance-related improvement of existing software, carried out with respect to limited issues.
Model for SQA defect removal
effectiveness and cost

The model's quantitative results:
a. The SQA plan's total effectiveness in removing project defects
b. The total costs of removal of project defects
Defects originating
and defect removal costs
Software development phase | Average % of defects originating in phase | Average relative defect removal cost
Requirement specification | 15% | 1
Design | 35% | 2.5
Unit coding | 30% | 6.5
Integration coding | 10% | 16
Documentation | 10% | 40
System testing | --- | 40
Operation | --- | 110
Defects removal effectiveness
for quality assurance plans
Quality assurance activity | Defect removal effectiveness, standard SQA plan | Defect removal effectiveness, comprehensive SQA plan
Requirement specification review | 50% | 60%
Design inspection | --- | 70%
Design review | 50% | 60%
Code inspection | --- | 70%
Unit test | 50% | 40%
Integration tests | 50% | 60%
Documentation review | 50% | 60%
System test | 50% | 60%
Operation phase detection | 100% | 100%
Defect removal effectiveness and average relative defect removal cost,
by defect origination phase
Defect removal phase | Defect removal effectiveness | Req | Des | Uni | Int | Doc
(the last five columns give the average relative defect removal cost, in cost units, for defects originating in the requirement specification, design, unit coding, integration and documentation phases)
Requirement specification (Req) | 50% | 1 | --- | --- | --- | ---
Design (Des) | 50% | 2.5 | 1 | --- | --- | ---
Unit coding (Uni) | 50% | 6.5 | 2.6 | 1 | --- | ---
Integration (Int) | 50% | 16 | 6.4 | 2.5 | 1 | ---
System documentation (Doc) | 50% | 16 | 6.4 | 2.5 | 1 | 1
System testing / Acceptance testing (Sys) | 50% | 40 | 16 | 6.2 | 2.5 | 1
Operation by customer (after release) | 100% | 110 | 44 | 17 | 6.9 | 2.5
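A worked reading of the Req column: a defect that originates during requirement specification costs 1 cost unit to remove if it is caught by the specification review, 16 units if it survives until integration, and 110 units if it is only detected by the customer during operation - letting it escape to the field multiplies its removal cost by two orders of magnitude.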
The standard quality assurance plan
The process of removing 100 defects

POD = Phase Originated Defects
PD = Passed Defects (from the former phase or former quality assurance activity)
%FE = % of Filtering Effectiveness (also termed % screening effectiveness)
RD = Removed Defects
CDR = Cost of Defect Removal
TRC = Total Removal Cost. TRC = RD x CDR.
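To make the arithmetic of the model concrete, here is a minimal sketch in Python of the filtering process defined by these terms. The function name run_filter_chain and the data layout are illustrative assumptions, not part of the source; the logic simply applies RD = incoming x %FE and TRC = RD x CDR at every quality assurance activity.

```python
# Minimal sketch of the defect-filtering model: each quality assurance activity
# receives the defects passed on by earlier activities plus the defects that
# originate in its own phase (POD), removes the fraction %FE of them (RD), and
# passes the remainder on (PD).  The cost of each removal (CDR) depends on where
# the defect originated and where it is removed; TRC sums RD x CDR per activity.
# The function name and data layout are illustrative, not taken from the source.

def run_filter_chain(activities, cdr):
    """activities: list of dicts with 'name', 'pod' (defects originated in this
                   phase) and 'fe' (filtering effectiveness, 0.0-1.0), in order.
       cdr:        dict mapping (removal activity, origination phase) -> relative
                   cost of removing one defect, in cost units.
       Returns one summary dict per activity (incoming, removed, passed, TRC)."""
    passed = {}                       # defects passed on so far, keyed by origination phase
    report = []
    for act in activities:
        incoming = dict(passed)
        if act['pod']:
            incoming[act['name']] = incoming.get(act['name'], 0) + act['pod']
        removed = {origin: n * act['fe'] for origin, n in incoming.items()}            # RD
        trc = sum(rd * cdr[(act['name'], origin)] for origin, rd in removed.items())   # TRC
        passed = {origin: n - removed[origin] for origin, n in incoming.items()}       # PD
        report.append({'activity': act['name'],
                       'incoming': round(sum(incoming.values()), 1),
                       'removed': round(sum(removed.values()), 1),
                       'passed': round(sum(passed.values()), 1),
                       'trc': round(trc, 1)})
    return report
```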
Defect correction effectiveness and cost - the standard plan: model of the process of correcting 100 defects (Slide 7.12a; relates to updated Section 7.4)

Quality assurance activity | POD | Incoming defects | %FE | RD (removed) | PD (passed on) | TRC
Requirement specification review | 15 | 15 | 50% | 7.5 | 7.5 | 7.5 cu
Design reviews | 35 | 42.5 | 50% | 21.2 | 21.3 | 26.8 cu
Unit tests | 30 | 51.3 | 50% | 25.6 | 25.7 | 50 cu
Integration tests | 10 | 35.7 | 50% | 17.8 | 17.9 | 66.3 cu
Documentation review | 10 | 27.9 | 50% | 13.9 | 14.0 | 38.9 cu
System tests | 0 | 14.0 | 50% | 7.0 | 7.0 | 50.9 cu
Operation by customer | 0 | 7.0 | 100% | 7.0 | 0 | 139.2 cu

In each activity the incoming and removed defects are broken down by origination phase (Req, Des, Uni, Int, Doc), and TRC is obtained by multiplying the removed defects of each origin by the corresponding CDR from the cost table above.
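As a check on the reasoning, the standard-plan walk-through above can be fed into the run_filter_chain sketch given earlier. The phase keys, POD values and cost figures below are transcribed from the tables and the worked example in this section (where the flattened cost table and the worked example disagree on a documentation-origin cell, the worked example's value is used); small differences in the computed results come from rounding in the original slide.

```python
# Standard SQA plan: 100 defects originated across five phases, every filter at
# 50% effectiveness, with operation by the customer detecting whatever remains.
STANDARD_PLAN = [
    {'name': 'Req', 'pod': 15, 'fe': 0.5},   # requirement specification review
    {'name': 'Des', 'pod': 35, 'fe': 0.5},   # design reviews
    {'name': 'Uni', 'pod': 30, 'fe': 0.5},   # unit tests
    {'name': 'Int', 'pod': 10, 'fe': 0.5},   # integration tests
    {'name': 'Doc', 'pod': 10, 'fe': 0.5},   # documentation review
    {'name': 'Sys', 'pod': 0,  'fe': 0.5},   # system tests
    {'name': 'Op',  'pod': 0,  'fe': 1.0},   # operation by customer (remaining defects surface)
]

# Relative removal cost (CDR) per (removal activity, defect origin), in cost
# units, as used in the worked standard-plan example above.
CDR = {
    ('Req', 'Req'): 1,
    ('Des', 'Req'): 2.5, ('Des', 'Des'): 1,
    ('Uni', 'Req'): 6.5, ('Uni', 'Des'): 2.6, ('Uni', 'Uni'): 1,
    ('Int', 'Req'): 16,  ('Int', 'Des'): 6.4, ('Int', 'Uni'): 2.5, ('Int', 'Int'): 1,
    ('Doc', 'Req'): 16,  ('Doc', 'Des'): 6.4, ('Doc', 'Uni'): 2.5, ('Doc', 'Int'): 1,   ('Doc', 'Doc'): 1,
    ('Sys', 'Req'): 40,  ('Sys', 'Des'): 16,  ('Sys', 'Uni'): 6.2, ('Sys', 'Int'): 2.5, ('Sys', 'Doc'): 2.5,
    ('Op',  'Req'): 110, ('Op',  'Des'): 44,  ('Op',  'Uni'): 17,  ('Op',  'Int'): 6.9, ('Op',  'Doc'): 6.9,
}

for row in run_filter_chain(STANDARD_PLAN, CDR):
    # Per-activity TRC values come out in the same ballpark as the slide's
    # 7.5, 26.8, 50, 66.3, 38.9, 50.9 and 139.2 cu (the slide rounds intermediate values).
    print(row)
```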
