Defect Containment

Purpose

The more defects are detected (and fixed) before they reach production, i.e. before the software reaches end-users, the easier (and cheaper) the maintenance of that software becomes, and the higher the customers' satisfaction with its quality.

Defect Containment shows the percentage of defects found before the software is delivered to end-users, compared to the total number of defects submitted in the same month. This metric therefore helps assess the overall efficiency of the Quality Assurance process on a project.

How the metric helps

As said above, Defect Containment helps evaluate the efficiency of the testing process on a project. If the Defect Containment value remains non-green for several months in a row, it is a strong signal to review and improve the testing strategy.

Metric:

  • shows the number of bugs logged by the internal team vs. all defects logged in the same month

  • shows defect leakage to PROD, i.e. how many issues were found by end users

  • shows the quality of testing

  • shows how many issues are found during the user acceptance process, or alpha-/beta-testing

  • shows how effective your chosen way of working is at preventing defects from escaping

  • shows how effective the changes made to the QA process (if any) have been

Some ways to prevent production defects:

  • Have dedicated Testing capacity in the project team (manual, automated, or a mix of both);

  • Apply test-driven development principles;

  • Introduce a continuous integration approach;

  • Do not neglect documentation (specifications, test cases);

  • Monitor the ongoing Quality Debt and keep it under control.

How the metric works

Chart overview

The chart shows the percentage of defects found internally (Y-axis) by month (X-axis). Additional info is provided on hover over a column:

  • Defect Containment % in the considered month.

  • Defects NOT DETECTED by the internal team - defects found after delivery by end users, i.e. missed by the project quality assurance team. In other words, this is the Defect Leakage for that month.

  • Defects DETECTED by the internal team - defects found before delivery by the project quality assurance team.

  • Total - the number of defects submitted in the considered month.

The legend shows the last calculated Defect Containment value and the difference from the previous month's value.

Clicking on a column opens a pop-up. It contains information from the defect tracking system about defects not detected by the internal team:

  • Defect ID

  • Type

  • Priority

  • Summary

Top problems the metric identifies

  1. There is no Test Plan and/or Test Strategy

  2. Team imbalance (skills ratio)

  3. Low quality of testing

  4. No quality gates between phases

  5. Huge regression suite which cannot be executed in full due to limited time or resources

  6. The current method/framework of delivering software does not prevent defects from escaping, i.e. the development team writes no unit or integration tests, or Tech Debt / EngX processes are not included in delivery

  7. There is no Definition of Done for features

  8. Limitations in testing, i.e. some complex flows cannot be tested from the user perspective, or some test data cannot be generated

  9. The Sprint goal, Product goal, or business context is not well known by the team

  10. The requirements management process is not mature enough, so improper acceptance criteria (A/C) have been captured, implemented, and tested

  11. The product backlog is shared between several teams

  12. There is no issue review or bug scrubbing meeting to revisit the priority of the issues backlog and confirm that it has not changed and still has no major impact on functionality and business

Calculation 

There are 2 variations of the metric calculation:

  • Defect Containment

  • Defect Containment 3 mo rolling average

Defect Containment

DC = ((Nbugs - Next.bugs) / Nbugs) * 100%

where

Next.bugs - the number of defects submitted by end-users. The criteria for an external defect are defined in Project Settings > Data Sources > Task Tracking System > 'Quality Management' > Criteria for escaped defects.

Nbugs - the number of all defects submitted in the considered month.

RAG thresholds: Red - metric value ≤ 80%; Amber - 80% < metric value ≤ 95%; Green - metric value > 95%.
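The single-month formula and its RAG thresholds can be sketched as follows; the function names are illustrative, not part of the product:

```python
# Minimal sketch of the single-month Defect Containment calculation,
# assuming two inputs: the total number of defects submitted in the month
# and how many of them were reported by end-users (external defects).

def defect_containment(total_bugs, external_bugs):
    """Return Defect Containment % for a month, or None if no defects."""
    if total_bugs == 0:
        return None  # nothing submitted, the metric is undefined
    return (total_bugs - external_bugs) / total_bugs * 100

def rag_status(value):
    """Map a metric value to its RAG color per the documented thresholds."""
    if value <= 80:
        return "Red"
    if value <= 95:
        return "Amber"
    return "Green"

dc = defect_containment(total_bugs=40, external_bugs=3)  # 37 of 40 found internally
print(dc, rag_status(dc))  # 92.5 Amber
```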

Defect Containment 3 mo rolling average

DC r.avg. = ((Nbugs - Next.bugs) / Nbugs) * 100%

where

Next.bugs - the rolling average of the number of external bugs submitted over the last 3 calendar months. The criteria for an external defect are defined in Project Settings > Data Sources > Task Tracking System > 'Quality Management' > Criteria for escaped defects.

Nbugs - the average number of all defects submitted over the last 3 calendar months.

How to calculate a rolling average: https://www.portent.com/blog/analytics/rolling-averages-math-moron.htm

Note: for months 1 and 2 the absolute number of external bugs is used instead.

RAG thresholds: Red - metric value ≤ 80%; Amber - 80% < metric value ≤ 95%; Green - metric value > 95%.
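The rolling-average variant can be sketched as below, assuming two parallel monthly series (oldest first) of total and external defect counts. The note above only specifies absolute counts for the external bugs in months 1 and 2; applying the same rule to the totals is an assumption in this sketch:

```python
# Hedged sketch of the 3-month rolling-average Defect Containment variant.

def rolling_avg(values, i, window=3):
    """Average of up to `window` values ending at index i."""
    chunk = values[max(0, i - window + 1) : i + 1]
    return sum(chunk) / len(chunk)

def dc_rolling(totals, externals):
    """Per-month DC % using 3-month rolling averages of both counts."""
    result = []
    for i in range(len(totals)):
        if i < 2:
            # Months 1 & 2: absolute counts instead of averages (per the note;
            # using the absolute total as well is an assumption).
            n_bugs, n_ext = totals[i], externals[i]
        else:
            n_bugs, n_ext = rolling_avg(totals, i), rolling_avg(externals, i)
        result.append(round((n_bugs - n_ext) / n_bugs * 100, 1) if n_bugs else None)
    return result

print(dc_rolling([20, 40, 60], [2, 10, 3]))  # [90.0, 75.0, 87.5]
```

For month 3 above, Nbugs = (20 + 40 + 60) / 3 = 40 and Next.bugs = (2 + 10 + 3) / 3 = 5, giving (40 - 5) / 40 * 100 = 87.5%.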

NOTE: Defect Containment is sensitive to the rule in the project configuration that determines how production-originated defects are distinguished.

Defect Containment can change its values for previous months in the following cases:

  1. Defect is rejected / closed as Invalid.
    Defect Containment is calculated from the defects created in a month. If a defect is later closed with an invalid status, this affects the Defect Containment calculation for the month in which the defect was created. The criteria for an invalid defect are defined in Project Settings > Data Sources > Task Tracking System > 'Quality Management' > Criteria for invalid defects.

  2. A defect first marked as internal is finally marked 'external'.
    A defect found by the project quality assurance team is eventually claimed as found by an end-user. For example, an end-user reports a problem that the team has already detected; in that case the existing defect is marked 'external' instead of being submitted again.

  3. Defect is deleted from the defect tracking system.
    In some defect tracking systems, JIRA for example, it is not possible to find information about a defect once it has been deleted.

  4. Criteria for escaped defects are changed.
    If the criteria for escaped defects are changed, Defect Containment is recalculated to reflect the change.
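The cases above share one cause: the metric is always recomputed from the current state of the tracker, so reclassifying a defect changes the value for the month it was created in. A minimal sketch, with assumed field names rather than any tracker's real schema:

```python
# Illustrative: recomputing per-month Defect Containment from the current
# defect records. Reclassifying a defect retroactively changes its month.

def monthly_containment(defects):
    """defects: list of dicts with 'month', 'external', 'invalid' keys."""
    months = {}
    for d in defects:
        if d["invalid"]:
            continue  # invalid defects are excluded from both counts
        total, ext = months.get(d["month"], (0, 0))
        months[d["month"]] = (total + 1, ext + int(d["external"]))
    return {m: (t - e) / t * 100 for m, (t, e) in months.items()}

defects = [
    {"month": "2024-01", "external": False, "invalid": False},
    {"month": "2024-01", "external": False, "invalid": False},
]
print(monthly_containment(defects))  # {'2024-01': 100.0}
defects[1]["external"] = True        # later claimed as found by an end-user
print(monthly_containment(defects))  # {'2024-01': 50.0}
```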

PerfQL

WITH timeline AS (
  SELECT generate_series(
    date_trunc('month', now()) - interval '5 month',
    date_trunc('month', now()),
    '1 month'
  ) as month
),
tickets AS (
  SELECT * FROM ticket t WHERE NOT is_invalid_defect(t)
),
all_bugs AS (
  SELECT COUNT(DISTINCT key) as all_bugs, date_trunc('month', created) as month
  FROM tickets t
  WHERE is_defect(t)
  GROUP BY month
),
escaped_bugs AS (
  SELECT COUNT(DISTINCT key) as escaped, date_trunc('month', created) as month
  FROM tickets t
  WHERE is_escaped_defect(t)
  GROUP BY month
)
SELECT
  to_char(t.month, 'YYYY Mon') as month,
  coalesce(
    round(
      (100 * (ab.all_bugs::numeric - coalesce(eb.escaped, 0)::numeric)
        / nullif(ab.all_bugs::numeric, 0)),
      0
    ),
    100
  ) as "Defects not timely detected by internal team"
FROM timeline t
LEFT JOIN all_bugs ab ON t.month = ab.month
LEFT JOIN escaped_bugs eb ON t.month = eb.month;

--------DRILL DOWN---------

SELECT
  CASE WHEN url IS NOT NULL
    THEN '[' || key || '](' || url || ')'
    ELSE key
  END as "Issue Id",
  type as "Type",
  priority as "Priority",
  summary as "Summary"
FROM ticket t
WHERE is_escaped_defect(t)
  AND NOT is_invalid_defect(t)
  AND to_char(date_trunc('month', created), 'YYYY Mon') = clicked_x_value;

Data Source

Data for the metric can be collected from a task tracking system (Jira, TFS/VSTS, Rally, etc.).