
UNIT IV

MANAGED AND OPTIMIZED PROCESS

Data Gathering and Analysis

The principles of data gathering:
- The data are gathered in accordance with specific objectives and plans.
- The choice of data to be gathered is based on a model or hypothesis about the process being examined.
- The data gathering process must consider the impact of data gathering on the entire organization.
- The data gathering plan must have management support.

The objectives of data gathering are: Understanding, Evaluation, Control, and Prediction.

BUT, BE CAREFUL: George Miller, a famous psychologist, once said: "In truth, a good case can be made that if your knowledge is meager and unsatisfactory, the last thing in the world you should do is make measurements. The chance is negligible that you will measure the right things accidentally." [NOTE: This is probably in response to the famous quotation from Lord Kelvin: "When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely advanced to the stage of science."]

Mark Twain: "Collecting data is like collecting garbage. You need to know what you're going to do with it before you collect it." (paraphrase) Given enough random numbers and enough time, one can draw inferences from data and events that are most likely independent, such as "the stock market goes up in years when the National League team wins the World Series."

THE IMPACT OF DATA GATHERING ON AN ORGANIZATION

Data gathering must consider the effects of measurements on the people and the effects of people on the measurements. When people know they are being measured, their performance will change (the Hawthorne Effect); they will give top priority to improving the measure. In the software business, employees must know the numbers will not be used against them; otherwise, they will make the numbers look good regardless of reality. Collecting data is tedious and must generally be done by software professionals who are already very busy. Unless they and their immediate managers are convinced that the data are important, they either won't do it or will not be very careful when they do. If professionals are shown how the data will help them, they will be interested in the results and exercise greater care.

For reasons of objectivity and cost, it is important to automate as much as possible. Special data gathering instrumentation is expensive, so it is wise to make data collection a part of the existing software engineering processes whenever possible. If portions of the SCM function are automated, they can provide key data with relatively little extra effort. Management support is critical. Data gathering must be viewed as an investment. Once useful results are evident, project managers will be eager to support the data collection.

A study at the NASA Goddard Space Flight Center's Software Engineering Laboratory shows that data collection can take 15% of a development budget. The main reason is that the work is manual and done by inexperienced people. With improved tools, these costs can go down. Even so, it is crucial that data be defined precisely, to ensure the right information is obtained and validated and to ensure it accurately represents the process.

TYPICAL PROBLEMS AND CAUSES IN DATA GATHERING

Problem: Data are not correct. Typical cause: Raw data entered incorrectly; data were generated carelessly.
Problem: Data not timely. Typical cause: Data not generated rapidly enough.
Problem: Data not measured or indexed properly. Typical cause: Raw data not gathered consistently with the purpose of the analysis.
Problem: Too much data needed. Typical cause: Too many coefficients in the model.
Problem: Needed data do not exist. Typical cause: No one retained the data; the data don't exist.

DATA GATHERING PLAN

The plan should be developed by the SEPG with the help of those who will gather the data. It should cover the following topics:
- What data is needed, by whom, and for what purpose? Make it clear the data will not be used for personnel evaluation.
- What are the data specifications? All definitions must be clear.
- Who will support data gathering?
- How will the data be gathered? Appropriate forms and data entry facilities must be available.
- How will the data be validated? Software process data are error-prone, so they should be validated as quickly as possible.
- How will the data be managed? This requires a reliable staff with suitable facilities.

Basili reported that software process data may be as much as 50% erroneous, even when recorded by the programmers at the time they first found the errors. BE CAREFUL!
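Given error rates like the one Basili reports, one practical response is to check each record at the moment it is submitted, while the submitter still remembers the circumstances. Below is a minimal Python sketch of such an entry-time check; the field names and valid values are hypothetical stand-ins for whatever the data specifications actually define.

```python
# Hypothetical sketch: validate each defect report as soon as it is
# submitted, rather than at analysis time. Field names and valid
# values are illustrative, not a standard.

REQUIRED_FIELDS = {"defect_id", "date_found", "phase_found", "severity"}
VALID_PHASES = {"design", "code", "unit test", "system test", "operation"}

def validate_report(report: dict) -> list[str]:
    """Return a list of problems found in one defect report."""
    problems = []
    missing = REQUIRED_FIELDS - report.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    phase = report.get("phase_found")
    if phase is not None and phase not in VALID_PHASES:
        problems.append(f"unknown phase: {phase!r}")
    if "severity" in report and not isinstance(report["severity"], int):
        problems.append("severity must be an integer code")
    return problems

# A report that fails validation goes straight back to the submitter.
print(validate_report({"defect_id": "D-101", "phase_found": "coding"}))
# ["missing fields: ['date_found', 'severity']", "unknown phase: 'coding'"]
```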

SOFTWARE MEASURES

Data characteristics:
- The measures should be robust. They should be repeatable, precise, and relatively insensitive to minor changes in tools, methods, and product characteristics. Otherwise, variations could be caused by measurement anomalies instead of by software processes.
- The measures should suggest a norm. Defect measures, for example, should have a lower value when best, with zero being desirable.
- The measures should suggest an improvement strategy. Complexity measures, for example, should imply a reduction of complexity.
- They should be a natural result of the process.
- The measures should be simple.
- They should be both predictable and traceable. Measures are of most value when they are projected ahead of time and then compared with the actual experience. Project personnel can then see better how to change their behavior to improve the result.

Software measures may be classified as:
- Objective/Subjective: Distinguishes measures that count things from those involving human judgement.
- Absolute/Relative: Absolute measures are invariant to the addition of new terms (such as the size of a program). Relative measures change (such as the mean of test scores). Objective measures tend to be absolute, while subjective measures tend to be relative.
- Explicit/Derived: Explicit measures are often taken directly, while derived measures are computed from explicit measures or from other derived measures. Programmer-months expended on a project is an explicit measure, while productivity per month in LOC is a derived measure.
- Dynamic/Static: Dynamic measures have a time dimension, as with errors found per month. Static measures remain invariant, as with total defects found during development.
- Predictive/Explanatory: Predictive measures can be obtained in advance, while explanatory measures are produced after the fact.
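A tiny worked example may make the explicit/derived distinction concrete; the figures below are invented, and only the arithmetic matters.

```python
# Illustrative only: explicit measures are taken directly, derived
# measures are computed from them.

new_loc = 12_000           # explicit: counted from the delivered code
programmer_months = 20     # explicit: taken from the effort records

# Derived: productivity per programmer-month in LOC.
productivity = new_loc / programmer_months
print(f"{productivity:.0f} LOC per programmer-month")  # 600

# A derived measure inherits the error of every explicit measure it is
# computed from, which is one more reason to validate raw data early.
```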

The following is from Shari Pfleeger's article "Lessons Learned in Building a Corporate Metrics Program," IEEE Software, May 1993:
1. Begin with the process. Developers must understand the need for the metrics; otherwise, they may not provide accurate data or use the results of the analysis.
2. Keep the metrics close to the developers. Do not form a separate metrics group.
3. Start with people who need help; let them do the advertising for you.
4. Automate as much as possible.
5. Keep things simple and easy to understand.
6. Capture whatever you can without burdening developers.
7. If the developers don't want to, don't make them.
8. Using some metrics is better than using no metrics.
9. Use different metrics when needed.
10. Criticize the process and the product, not the people.

SOFTWARE SIZE MEASURES

The assumption is that the effort required is directly related to program size. Unfortunately, there is no universal measure of program size because program size is not a simple subject. Do we count new, deleted, reused, etc. lines? What about higher-level languages versus assembly language? What about comment statements?

LOC possibilities. Some alternative ways of counting LOC are:
- Executable lines
- Executable lines plus data definitions
- Executable lines, data definitions, and comments
- Executable lines, data definitions, comments, and JCL
- Physical lines on a screen
- Logical delimiters, such as semicolons
- Only new lines
- New and changed lines
- New, changed, and reused lines
- All delivered lines plus temporary scaffold code
- All delivered lines, temporary scaffolding, and support code
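To see how much these choices matter, the sketch below applies several of the counting rules to the same source text. It assumes a C-like language where "//" starts a comment and statements end in semicolons; that is an illustrative simplification, since a production counter would need a real parser.

```python
# Minimal LOC counter: different rules give different sizes for the
# same program. Comment and delimiter conventions are assumed.

def count_lines(source: str) -> dict:
    physical = executable = comment = blank = 0
    for line in source.splitlines():
        physical += 1
        stripped = line.strip()
        if not stripped:
            blank += 1
        elif stripped.startswith("//"):
            comment += 1
        else:
            executable += 1
    # Logical lines approximated by counting statement delimiters.
    logical = source.count(";")
    return {"physical": physical, "executable": executable,
            "comment": comment, "blank": blank, "logical": logical}

sample = ("int x = 0;\n"
          "// add the first two integers\n"
          "for (int i = 0; i < 2; i++)\n"
          "\n"
          "    x += i;\n")
print(count_lines(sample))
# {'physical': 5, 'executable': 3, 'comment': 1, 'blank': 1, 'logical': 4}
```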

ERROR DATA

- Error: A human mistake. Could be typographical, syntactic, semantic, etc.
- Defect: An improper program condition that is the result of an error. Not all errors produce program defects, and not all defects are caused by programmers (bad packaging or handling, for example).
- Bug (fault): A program defect encountered in operation, either under test or in use. Bugs result from defects, but not all defects cause bugs (some are never found).
- Failure: A malfunction of a user's installation. Could result from a bug, incorrect installation, a hardware failure, etc.
- Problem: A user-encountered difficulty. May result from failures, misuse, or misunderstanding. Problems are human events; failures are system events.

CLASSES OF DEFECT MEASURE

Defects can be classified along dimensions of:
- Severity
- Symptoms
- Where found
- When found (unit test, system test, etc.)
- How found (inspection, testing, etc.)
- Where caused (what part of the software)
- When caused (design, etc.)
- How caused (logical, data definition, etc.)
- Where fixed
- When fixed
- How fixed
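One way to make these dimensions operational is to give each defect report one field per dimension, as in the sketch below; the example values are invented, not a standard taxonomy.

```python
# Hypothetical defect record with one field per classification
# dimension listed above.

from dataclasses import dataclass

@dataclass
class DefectRecord:
    severity: int        # e.g. 1 (critical) through 4 (cosmetic)
    symptom: str         # what was observed
    where_found: str     # component or module
    when_found: str      # unit test, system test, ...
    how_found: str       # inspection, testing, ...
    where_caused: str    # part of the software at fault
    when_caused: str     # design, coding, ...
    how_caused: str      # logical, data definition, ...
    where_fixed: str
    when_fixed: str
    how_fixed: str

d = DefectRecord(2, "wrong invoice total", "billing", "system test",
                 "testing", "rounding routine", "coding", "logical",
                 "billing", "system test", "one-line change")
```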

Statistical breakdowns of errors are given below (Tables 15.6-15.8).

ANALYSIS OF INSPECTION RATES

Preparation time is the sum of the preparation times of all individuals involved in an inspection. Inspection time is the time spent by the entire team in the inspection process. For a five-person inspection group, one inspection hour equals five programmer inspection hours, while one preparation hour equals one programmer preparation hour. In the following data, it appears the author used an average of the individual preparation times rather than a sum (see Figure 15.2).

Errors found per KLOC decline with increasing inspection rate. An upper limit of 300 to 400 LOC per hour for Fortran might be reasonable.
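The hour accounting and the rate limit can be checked with a few lines of arithmetic; the team size and figures below are invented for illustration.

```python
# Worked example of the inspection-hour accounting described above:
# preparation hours are summed per person, while every meeting hour
# costs one hour from each inspector.

team_prep_hours = [1.5, 2.0, 1.0, 1.5, 2.0]  # individual preparation
meeting_hours = 2.0                          # wall-clock meeting time
loc_inspected = 700

prep_total = sum(team_prep_hours)                     # 8.0 programmer-hours
meeting_total = meeting_hours * len(team_prep_hours)  # 10.0 programmer-hours
rate = loc_inspected / meeting_hours                  # 350 LOC per hour

print(prep_total, meeting_total, rate)  # 8.0 10.0 350.0 -> within the limit
```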

Managing Software Quality

Motivation is the key to good work; management must challenge developers. If senior management tolerates poor work, sloppiness will pervade the organization. Meetings must start on time, reports must be accurate, etc. No risk, no heroes/heroines.

Four basic quality principles:
- Without aggressive quality goals, nothing will change.
- If these goals are not numerical, then quality will remain talk.
- Without quality plans, only you are committed to quality.
- Quality plans are just paper unless you track and review them.

Defect Prevention

Defects must be prevented because:
- To meet the escalating needs of society, progressively larger and more complex software will be needed.
- Since these programs will be used in increasingly sensitive applications, software defects will likely become progressively more damaging to society.
- For the foreseeable future, programs will continue to be designed by error-prone humans.
This means that, with present methods, the number and severity of bugs encountered by system users will increase.

The principles of defect prevention:
- Programmers must evaluate their own errors.
- Feedback is an essential part of defect prevention.
- There is no single cure-all that will solve all problems.
- Process improvement must be an integral part of the process.
- Process improvement takes time to learn.

The steps of software defect prevention are:
- Defect reporting
- Cause analysis
- Action plan development
- Performance tracking
- Starting over: do it all again for the most prevalent remaining defects.
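Taken together, the cause-analysis and start-over steps form a Pareto loop: tally reported defects by cause, attack the most prevalent cause, then repeat. A minimal sketch, with invented cause categories:

```python
# Sketch of the cause-analysis step: count defects by cause category
# and attack the most prevalent one first. Data are invented.

from collections import Counter

defect_causes = ["logical", "data definition", "logical", "interface",
                 "logical", "data definition", "documentation"]

for cause, count in Counter(defect_causes).most_common():
    print(f"{cause}: {count}")
# logical: 3 / data definition: 2 / interface: 1 / documentation: 1
# Develop an action plan for 'logical' errors first, track the results,
# then start over on the next most prevalent category.
```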

Error cause categories:
- Technological: Definability of the problem, feasibility of solving it, availability of tools and procedures.
- Organizational: Division of workload, available information, communication, resources.
- Historic: History of the project, of the program, special situations, and external influences.
- Group dynamic: Willingness to cooperate, distribution of roles inside the project group.
- Individual: Experience, talent, and constitution of the individual programmer.
