
On a Sub-process Trail

SEPG Conference March 2010

COPYRIGHT NOTICE Copyright 2008 Polaris Software Lab Limited All rights reserved. These materials are confidential and proprietary to Polaris and no part of these materials should be reproduced, published in any form by any means, electronic or mechanical including photocopy or any information storage or retrieval system nor should the materials be disclosed to third parties without the express written authorization of Polaris Software Lab Limited.

About Polaris - Setting the Context


Corporate Profile
- 20 years of experience in providing Banking, Financial Services & Insurance (BFSI) technology solutions
- Amongst the largest players specializing in BFSI, with over 2,000 projects and 25,000 person-years of experience

Global Delivery Capabilities
- 8 Business Solution Centers, 5 campuses
- 4 global near-shore development centers

Focused on Long-Term Customer Relationships
- Repeat business of over 85%
- Over 10 of the top 15 global banks and 6 of the top 10 global insurance companies are our clients
- CAGR of over 70%

Company timeline:
- 1986: Genesis of Polaris
- 1993: Polaris Software Lab incorporated
- 1999: Listing on major stock exchanges
- 2003: Merger with Orbitech, a Citi company
- 2004-09: Transformation from a pure-play services organization to an IP-focused solutions company - poised for accelerated growth

Quality@Polaris - The Milestones


Milestones, from 1991 to 2009:
- TQM
- Early adopter - CMM Level 3
- Jointly assessed at CMM L3 with Citibank & certified to ISO 9001:1994
- Assessed at CMMi Level 5
- BS7799 certified
- Re-certified to the new version of ISO 9001:2000
- ISO 27001 certified
- ISO 9001 re-certification
- Certified at CMMI-DEV v1.2 Level 5
(Timeline markers: 1991, 1995, 1998, 1999, 2001, 2004, 2007, 2009)

Maturity Profile: from Measuring & Tracking, to Trending, to Predicting

Exploring Phase

DNA of High Maturity Organizations


Sub-processes, along with their attributes and measures, are fundamental to High Maturity organizations.

Identification of the right sub-process


Selection of a sub-process is a choice between a phase (too broad) and an activity (too granular) - Phase? Activity? - striking a balance.

Emerging Phase

Organizational Measurement Framework


The framework cascades objectives from the customer and the Polaris Management Committee down to individuals:
- Business objectives drive the operational measurement framework, the Balanced Scorecard framework and the HR performance management system.
- Balanced Scorecard goals are set for corporate functions, Business Solution Centers, geographies and organizational process improvement.
- The measurement framework then flows these into engagement goals, Engagement Director goals, Account Director goals, project goals and project team member goals.

Project goal-setting flow:
- Inputs: organizational business objectives, organization quality and process objectives, customer and other relevant stakeholder objectives, contract and requirement documents, the proposal, the initiation meeting with the client, and non-functional requirements (NFRs).
- These feed the Measurement Definition Document (GQM) and the policy note.
- Programs, all project teams, Quality, IT and HR establish project goals, recorded in the quality plan and metrics forms.
- If the goals are not realistic, DAR, CAR, Innovation & Resolve are used and the project goals are renegotiated.
- Once realistic, interim (phase-wise) goals are derived and recorded in the quality plan and metrics forms.
- Individual project goals are distributed and aligned to participating projects, yielding the projects' quality goals, overall and interim.

Ref: Understanding CMMI High Maturity Practices (SEI training material)

Sub-process repository - a snapshot


Business drivers (per row): Customer Satisfaction and/or Profitable Growth.

Process: Review - Sub-process: Review Execution - Design review
- Attribute: Review Effectiveness; Measure: error density (EDdr) - errors per Design/Solution Doc unit or per page (plotted per Design/Solution Doc unit); Stratification: type of unit (simple, medium, complex), size in pages; Analysis: C Chart (stratified per unit) or U Chart (varying size)
- Attribute: Review Efficiency; Measure: time for review per Design/Solution Doc unit (plotted per Design/Solution Doc unit); Stratification: type of unit (simple, medium, complex); Analysis: XmR Chart

Process: Review - Sub-process: Review Execution - UTP review
- Attribute: Review Effectiveness; Measure: error density (EDur) - errors per UTP unit or per page (plotted per UTP unit); Stratification: type of unit (simple, medium, complex), size in pages; Analysis: C Chart (stratified per unit) or U Chart (varying size)
- Attribute: Review Efficiency; Measure: time for review per UTP unit (plotted per UTP unit); Stratification: type of unit (simple, medium, complex); Analysis: XmR Chart

Process: Review - Sub-process: Review Execution - Code review
- Attribute: Review Effectiveness; Measure: error density (EDcr) - errors per unit or per SLOC (plotted per unit); Stratification: type of unit (simple, medium, complex), SLOC ranges; Analysis: C Chart (stratified per unit) or U Chart (varying size)
- Attribute: Review Efficiency; Measure: time for review per unit (plotted per unit); Stratification: type of unit (simple, medium, complex); Analysis: XmR Chart

Process: Construction - Sub-process: Coding
- Attribute: Coding Performance; Measure: productivity - actual effort by unit category (plotted by units); Stratification: type of unit (simple, medium, complex), by unit type; Analysis: XmR Chart

Process: Construction - Sub-process: Test Execution (IUT)
- Attribute: Test Effectiveness; Measure: error density (EDiut) - defects per unit or per SLOC (plotted per unit); Stratification: type of unit (simple, medium, complex), SLOC ranges; Analysis: C Chart (stratified per unit) or U Chart (varying size)
- Attribute: Test Efficiency; Measure: time for testing per unit type (plotted per unit); Stratification: type of unit (simple, medium, complex); Analysis: XmR Chart

Process: UAT - Sub-process: Problem Resolution
- Attribute: Problem Resolution Responsiveness; Measures: problem turnaround time (duration in hours by problem) and problem correction effort (effort in hours per problem); Stratification: severity type, module-wise; Analysis: XmR Chart

Process: Ongoing Support - Sub-process: Readiness for SLA Commitment
- Attribute: SLA Responsiveness; Measures: turnaround time (TAT) per day (for critical & high) and per week (for low & medium) - duration in hours by problem; and ratio of issues not meeting SLA to total issues on a daily basis; Stratification: severity type, module-wise, problem type; Analysis: XmR Chart, P Chart

Process: Impact Analysis - Sub-process: Impact Analysis
- Attribute: IA Efficiency; Measure: impact analysis time; Stratification: type of CR (simple, medium, complex); Analysis: XmR Chart
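The analysis technique listed for each measure follows standard SPC chart-selection logic. As an illustrative sketch (not part of the original deck), the choice can be expressed as a small rule: counts over a constant area of opportunity go to a C chart, counts over a varying size (pages, SLOC) to a U chart, proportions to a P chart, and individual continuous measurements such as review time to an XmR chart.

```python
# Illustrative sketch (not from the deck): mapping a measure's data type to the
# control chart named in the sub-process repository.

def select_control_chart(data_type: str, constant_area_of_opportunity: bool = True) -> str:
    """Return the control chart conventionally used for a given measure type.

    data_type: 'count' (e.g. review errors per unit), 'proportion'
               (e.g. ratio of issues missing SLA), or 'continuous'
               (e.g. review time, downtime, turnaround time).
    constant_area_of_opportunity: for counts, True if every unit is the same
               size (C chart), False if the unit size varies, e.g. KLOC (U chart).
    """
    if data_type == "count":
        return "C chart" if constant_area_of_opportunity else "U chart"
    if data_type == "proportion":
        return "P chart"
    if data_type == "continuous":
        return "XmR chart"  # individuals and moving-range chart
    raise ValueError(f"unknown data type: {data_type}")

if __name__ == "__main__":
    print(select_control_chart("count", constant_area_of_opportunity=False))  # U chart
    print(select_control_chart("continuous"))                                 # XmR chart
```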


The Connect - from project goals to statistically managed sub-processes:
- Start from the project quality goals (overall and interim), identify candidate sub-processes, and renegotiate project objectives where needed.
- Identify the critical sub-processes and check whether they are suitable for statistical management; the remaining goals are tracked conventionally.
- For suitable sub-processes, identify the measures, monitoring frequency and analysis techniques (e.g. Pareto, pie, stacked bar), analyse sub-process interactions, and check the relationship with existing models and adjust, drawing on the baseline repository and the model repository.
- Identify the risk of either not having historical data or having an unstable process; check whether CAR, DAR or innovation can help, plan action, and decide on the final list of sub-processes to be managed statistically.
- If the goals are still not met, look for more sub-processes.

Ref: Understanding CMMI High Maturity Practices (SEI training material)


Seeds of High Maturity sown


- Senior SEPG members attended training on understanding CMMi High Maturity practices.
- The entire Quality team was trained on statistical techniques (based on inputs from the CMMi High Maturity Practices training).
- Research activities were initiated on statistical models and techniques.
- Key differences between retroactive vs. proactive and aggregation vs. decomposition approaches were understood.
- An independent team of statistics specialists was set up and given an orientation on software engineering processes (chosen as a better option than training software engineers in statistics).

Maturing Phase


- Organization-wide off-site workshops on sub-processes, statistical analysis and the usage of prediction models.
- All levels of management involved: Project Leads, Project Managers, Project Directors and Business Unit Heads.
- Brainstorming sessions to identify pain areas and critical sub-processes tied to business and customer objectives for each project; the sub-process repository was enriched.
- Redefinition of organizational behaviour and a paradigm shift towards a culture of statistical thinking.
- Revelation: the ability to look beyond the obvious project goals of effort variance, schedule variance and defect density.

Wide Spectrum of Sub processes


Typical sub-processes - Code Review:
- Code Review Effectiveness for medium-complexity units, monitored on a C chart.
- IUT Test Efficiency for simple units, monitored on an XmR chart ("X" individuals and "mR" moving range), with control limits computed from process data (Stage 2, points 1 to 12).
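The C chart here plots the count of review defects found per medium-complexity unit. As a minimal sketch (the defect counts below are illustrative, not the project's data), C-chart limits are the mean count plus or minus three times its square root:

```python
# Minimal C-chart sketch for code review effectiveness (defects per unit).
# The defect counts below are illustrative, not taken from the deck.
import math

defects_per_unit = [4, 6, 3, 5, 7, 2, 5, 4, 6, 3, 5, 4]  # one medium-complexity unit each

c_bar = sum(defects_per_unit) / len(defects_per_unit)     # centre line
ucl = c_bar + 3 * math.sqrt(c_bar)                        # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))              # lower limit floored at zero

out_of_control = [(i, c) for i, c in enumerate(defects_per_unit) if c > ucl or c < lcl]
print(f"CL={c_bar:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  out-of-control points: {out_of_control}")
```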


Typical sub processes


Code Review (U chart): defect density for medium-complexity code units (size in KLOC).
- Number of data points = 17; mean (lambda) = 0.03.
- Number of data points out of control = 0; mean/lambda remains unchanged at 0.03.
- Note: a U chart does not have constant control limits.
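Because each code unit has a different size in KLOC, the defect-density limits vary point by point, which is why the slide notes that a U chart does not have constant control limits. A minimal sketch with illustrative data (not the project's measurements):

```python
# Minimal U-chart sketch for defect density (defects per KLOC); data are illustrative.
import math

defects = [3, 1, 4, 2, 5, 2, 3]                 # defects found in each medium-complexity unit
kloc    = [1.2, 0.8, 1.5, 1.0, 1.8, 0.9, 1.1]   # size of each unit in KLOC

u_bar = sum(defects) / sum(kloc)                # centre line: overall defects per KLOC

for d, n in zip(defects, kloc):
    u = d / n                                   # defect density for this unit
    sigma = math.sqrt(u_bar / n)                # limits depend on unit size, hence not constant
    ucl = u_bar + 3 * sigma
    lcl = max(0.0, u_bar - 3 * sigma)
    flag = "OUT" if (u > ucl or u < lcl) else "ok"
    print(f"u={u:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}  {flag}")
```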

Issue Resolution (P chart)


Ratio of SLA not met for severe-complexity production issues.
- Number of data points = 21; mean = 0.0981.
- Number of data points out of control = 1; mean after removing the out-of-control point = 0.0938.
- Note: a P chart does not have constant control limits.
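The P chart tracks the daily ratio of issues missing their SLA; its limits likewise vary with the number of issues handled each day. A minimal sketch with illustrative counts (not the project's data):

```python
# Minimal P-chart sketch for the ratio of SLA misses to total issues per day.
# Counts are illustrative, not the project's data.
import math

sla_missed   = [2, 1, 3, 0, 2, 4, 1]        # issues that missed SLA each day
total_issues = [20, 18, 25, 15, 22, 24, 19]

p_bar = sum(sla_missed) / sum(total_issues) # centre line: overall miss ratio

for x, n in zip(sla_missed, total_issues):
    p = x / n
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)  # limits depend on daily volume
    ucl = p_bar + 3 * sigma
    lcl = max(0.0, p_bar - 3 * sigma)
    flag = "OUT" if (p > ucl or p < lcl) else "ok"
    print(f"p={p:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}  {flag}")
```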


Out of Box - Sub processes


Emergency Outages Monitoring (XmR chart)

Query Performance Optimization (Confidence Interval)


Variable of interest: query response time in seconds (confidence interval for the mean).
- Sample standard deviation (s) = 0.55031; sample mean = 0.358431; sample size (n) = 51; confidence level = 0.99.
- Standard error of the mean (s/sqrt(n)) = 0.077059; degrees of freedom (n-1) = 50; t value = 2.677793; interval half-width = 0.206348.
- Result: lower limit = 0.152084, upper limit = 0.564779.
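These figures follow the standard t-based confidence interval for a mean. The sketch below reproduces them to rounding (it assumes SciPy is available for the t quantile):

```python
# Reproduces the slide's 99% confidence interval for mean query response time.
import math
from scipy import stats

s, x_bar, n, conf = 0.55031, 0.358431, 51, 0.99

se = s / math.sqrt(n)                        # standard error of the mean -> 0.077059
df = n - 1                                   # degrees of freedom -> 50
t_val = stats.t.ppf(1 - (1 - conf) / 2, df)  # two-sided t quantile -> 2.677793
half_width = t_val * se                      # -> 0.206348

print(f"{conf:.0%} CI: [{x_bar - half_width:.6f}, {x_bar + half_width:.6f}]")
# -> roughly [0.152084, 0.564779], matching the slide's lower and upper limits
```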


Out of Box - Sub processes


System Downtime Tracking (XmR chart): "X" individuals and "mR" moving range charts; control limits computed from process data (Stage 2); current system downtime data.

End of Day (EOD) monitoring (XmR Chart)


Control limits computed from process data (Stage 2, points 1 to 12); EOD duration in minutes.


Support Infrastructure

Tools
- Easy-to-use control chart templates
- Minitab and Crystal Ball software
- Dashboards to monitor weekly progress

Champions
- Quantum Leap Ambassadors
- SPCs at each business unit

Training
- Regular Quantum Leap sessions
- Quantum Leap competency assessment (along with HR)
- Training across all levels: 100% of senior management trained, 100% of Quality and SEPG trained, 900 Project Directors & Project Managers trained across the organization

People
- Dedicated team of statisticians attached to every business unit


Excelling Phase


Improvement in Resource Utilization - Case Study 1

- Customer goal for resource utilization: > 85%; Polaris goal: > 90%.
- Resource utilization dropped to 81.10% due to delayed clarifications from the customer.
- Action plan:
  - The team started monitoring the clarification TAT as a critical sub-process using control charts.
  - Outliers were identified and taken up with the customer.
  - Updates on the aging of issues were provided to the customer on a weekly basis.
- Over a period of time, improvement was seen in TAT, with a shift in the process mean from 2.23 days to 0.85 days (see the sketch below).
- Offshore resource utilization improved to 97%.
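The shift in clarification TAT from a mean of 2.23 days to 0.85 days is the kind of sustained change a run rule on an individuals chart would confirm before control limits are recomputed. A minimal sketch, using a standard "eight successive points below the centre line" test and illustrative TAT values (the deck does not publish the raw data):

```python
# Illustrative sketch: detecting a sustained downward shift in clarification TAT
# (days) on an individuals chart via the "8 in a row below the centre line" run rule.
tat_days = [2.1, 2.5, 2.3, 1.9, 2.4, 2.2,            # before the improvement actions
            0.9, 0.8, 1.0, 0.7, 0.9, 0.8, 0.9, 0.8]  # after (values are made up)

centre_line = 2.23   # historical process mean from the baseline period

def sustained_shift(values, centre, run_length=8):
    """True if `run_length` consecutive points fall below the centre line."""
    run = 0
    for v in values:
        run = run + 1 if v < centre else 0
        if run >= run_length:
            return True
    return False

if sustained_shift(tat_days, centre_line):
    print("Sustained shift detected - recompute control limits for the new process mean")
```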

Emergency Production Outage Analysis - Case Study 2

Project scenario: application downtime/outages lead to financial transactions not being effected, causing loss of business to the end user, so the project wants to reduce outages. Historical data from Jan 2007 to Jan 2008 (13 months) shows 109 hours of downtime due to 56 outages. Fewer than 4 outages per month and an outage duration below 1.5 hours are project-specific goals given by the customer.

Sub-process 1: Emergency Outages Monitoring
- Process: Ongoing Production Support; attribute: outages reported; measure: number of outages per month; stratification: emergency outages.
- Quantitative goal: reduce outages from the current 0-8 range; interim goal: 3 to 4 outages per month.
- Monitoring frequency: monthly; prediction model: outage prediction model.

Sub-process 2: System Downtime Tracking
- Process: Ongoing Production Support; attribute: outages reported; measure: downtime per outage; stratification: emergency outages.
- Quantitative goal: reduce the downtime per outage from 2 hours to 1.5 hours; interim goal: reduce the downtime duration per outage from 2 to 1.5 hours in the next 6 months.
- Monitoring frequency: monthly; prediction model: NA.

Outage Reduction - Actions
- Slot-wise monitoring (an individual is responsible for each slot) to ensure ownership of monitoring the system during that slot.
- A checklist is completed after each slot, filled in with details of the monitoring done during the slot, and mailed to the client as a health-check status.
- The checklist is enhanced as and when new functionality goes into production and needs to be monitored for stability and performance.
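The deck names an outage prediction model for this sub-process without detailing it. As a purely hypothetical sketch, a Poisson assumption on the 13-month baseline (56 outages, roughly 4.3 per month) gives the chance of meeting the interim goal of at most 4 outages in a month (requires SciPy; the actual model used by the project may differ):

```python
# Hypothetical sketch only - the deck does not specify its outage prediction model.
# Assume monthly outage counts are approximately Poisson with the baseline rate.
from scipy import stats

baseline_outages, baseline_months = 56, 13      # Jan 2007 - Jan 2008 history from the slide
rate = baseline_outages / baseline_months        # ~4.31 outages per month

p_meet_goal = stats.poisson.cdf(4, rate)         # P(at most 4 outages in a month)
print(f"Baseline rate: {rate:.2f}/month; P(<=4 outages) ~ {p_meet_goal:.2f}")
```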


Need for a Strong Code Review Process - Case Study 3 (Poka-Yoke path)

Flow: coding -> self-check by the developer using viewpoints -> peer code review using viewpoints -> unit testing -> IUT -> ready for the customer?

The U chart from the earlier slide (defect density for medium-complexity code units; 17 data points, mean/lambda = 0.03, no out-of-control points, non-constant control limits) is used to monitor defect density along this path.

- Assurance of code quality through viewpoints.
- U chart implemented to monitor defect density at the micro level.
- Code review defects reduced by 27%.
- Faster delivery to the customer through elimination of IUT.

Breaking New Ground


Ripple effects of sub-process monitoring:
- EOD monitoring: analysis of the run-time pattern of the Cards EOD programs to check whether the overall run time will meet the desired objective, by predicting the minimum run time and the time available for critical programs to complete.
- Program efficiency: number of threads being executed in a single Java program.
- Application performance monitoring: optimum commit size for achieving an application batch cycle time of less than 15 minutes (see the sketch below).
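The optimum-commit-size finding suggests a simple empirical loop: run the batch at several commit sizes and keep the smallest cycle time under the 15-minute target. A hypothetical sketch follows; the batch job, its interface and the candidate sizes are assumptions for illustration, not details from the deck:

```python
# Hypothetical sketch: empirically choosing a commit size that keeps the batch
# cycle time under 15 minutes. `run_batch` is a stand-in for the real batch job.
import time

def run_batch(commit_size: int) -> None:
    """Placeholder for the application batch run with the given commit size."""
    time.sleep(0.01)  # the real job would process records, committing every `commit_size` rows

TARGET_MINUTES = 15.0
results = {}
for commit_size in (100, 500, 1000, 5000, 10000):   # candidate sizes (assumed)
    start = time.perf_counter()
    run_batch(commit_size)
    results[commit_size] = (time.perf_counter() - start) / 60.0  # elapsed minutes

within_target = {k: v for k, v in results.items() if v < TARGET_MINUTES}
best = min(within_target, key=within_target.get) if within_target else None
print(f"Cycle times (min): {results}; best commit size under target: {best}")
```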


The Complete Life Cycle

The complete life cycle ties everything together:
- Organizational business objectives and customer objectives drive project goals and the critical sub-processes placed under SPC.
- Control limits come from the historical baseline repository; specification limits come from the prediction model.
- Control charts (the U chart for defect density and an XmR chart for daily production issues) surface special causes, confirm stability and the effectiveness of actions (PMR), and show when limits should be recomputed.
- On the production-issues XmR chart, the historical individuals limits were centred at X-bar = 19.91 with UCL = 47.60, and the moving-range chart had MR-bar = 10.41 with UCL = 34.01; a comparison chart of control limits contrasts the historical limits with those recomputed after the improvement.
- Common-cause improvement results in a reduction in both mean and variance: a process performance shift with tangible benefits.
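The individuals-chart limits on the slide follow the standard XmR formulas (UCL = X-bar + 2.66 x MR-bar, moving-range UCL = 3.267 x MR-bar), which reproduce 47.60 and 34.01 from the quoted X-bar of 19.91 and MR-bar of 10.41. The sketch below shows that computation; the daily counts themselves are illustrative, and recomputing limits after an improvement is simply a matter of applying the same helper to the chosen segment of data:

```python
# XmR (individuals & moving range) limit computation, as used for the daily
# production-issue counts. Fed data with mean ~19.91 and mean moving range
# ~10.41, it reproduces the slide's historical limits (UCL 47.60, MR UCL 34.01).

def xmr_limits(values):
    """Return (centre, UCL, LCL) for the individuals chart and (MR-bar, MR-UCL)."""
    x_bar = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = x_bar + 2.66 * mr_bar           # 3-sigma equivalent for individuals
    lcl = max(0.0, x_bar - 2.66 * mr_bar)
    mr_ucl = 3.267 * mr_bar               # upper limit for the moving-range chart
    return (x_bar, ucl, lcl), (mr_bar, mr_ucl)

# Illustrative daily production-issue counts (not the project's data); recomputed
# limits after an improvement would come from xmr_limits() on the later segment.
daily_issues = [22, 15, 30, 18, 25, 12, 28, 20, 17, 24, 14, 26, 19, 21, 16]
individuals, moving_range = xmr_limits(daily_issues)
print("X-bar, UCL, LCL:", individuals, " MR-bar, MR-UCL:", moving_range)
```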

References
- Understanding CMMI High Maturity Practices (SEI training material)
- William A. Florac & Anita D. Carleton, Measuring the Software Process: Statistical Process Control for Software Process Improvement
- Donald J. Wheeler, Understanding Variation: The Key to Managing Chaos
- Mary Beth Chrissis, Mike Konrad & Sandy Shrum, CMMI: Guidelines for Process Integration and Product Improvement


Thank You
rama.sivaraman@polaris.co.in sudha.gopalakrishnan@polaris.co.in Website: www.polaris.co.in
