DORA Metrics

After years of research, Google's DevOps Research and Assessment (DORA) team identified four key metrics for evaluating a team's performance:

  • Deployment Frequency

  • Lead Time for Changes

  • Mean Time to Recovery

  • Change Failure Rate

Please follow the links above to find out how to create DORA metrics in PERF. 

After that, please come back - below you'll find an example of converting DORA metric values into scores (based on benchmarks) and creating a composite score metric that shows the average of the four key metric scores. This gives you a general idea of the status and trend of DORA metrics for your teams.

The prerequisite is that the aggregated data sets for all four DORA metrics are already available.

We will convert the value of each DORA metric for each team into a score based on RAG thresholds and then take the average: Composite score = (Deployment Frequency score + Lead Time for Changes score + Mean Time to Recovery score + Change Failure Rate score) / 4.
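As a quick illustration of the arithmetic, with hypothetical per-metric scores for one team (the 0/1/2 values come from the RAG conversion described below), the composite score is simply the mean of the four scores:

```python
# Hypothetical RAG scores for one team (0 = Red, 1 = Amber, 2 = Green).
scores = {
    "deployment_frequency": 2,
    "lead_time_for_changes": 1,
    "mean_time_to_recovery": 2,
    "change_failure_rate": 1,
}

# Composite score = mean of the four per-metric scores.
composite = sum(scores.values()) / len(scores)
print(composite)  # 1.5
```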

RAG → Score conversion:

  • RED (Low) = 0

  • AMBER (Medium) = 1

  • GREEN (High) = 2

The RAG thresholds are taken from Google's 2022 State of DevOps Report: https://services.google.com/fh/files/misc/2022_state_of_devops_report.pdf
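The value-to-score conversion can be sketched as a small function. This is an illustrative sketch only: the threshold numbers mirror those used in the SQL on this page, and the units are assumed (lead time and recovery time in days, change failure rate in percent); check them against your own data sets before reuse.

```python
def dora_scores(deployment_frequency, lead_time_for_changes,
                mean_time_to_recovery, change_failure_rate):
    """Map raw DORA metric values to RAG scores (0 = Red, 1 = Amber,
    2 = Green), using the same thresholds as the SQL on this page.
    None (missing data) maps to None."""
    def df(v):   # higher is better
        if v is None: return None
        if v < 1: return 0
        return 1 if v <= 4 else 2
    def ltfc(v):  # lower is better, assumed in days
        if v is None: return None
        if v < 7: return 2
        return 1 if v <= 30 else 0
    def mttr(v):  # lower is better, assumed in days
        if v is None: return None
        if v < 1: return 2
        return 1 if v <= 7 else 0
    def cfr(v):   # lower is better, assumed in percent
        if v is None: return None
        if v < 15: return 2
        return 1 if v <= 30 else 0
    return {
        "dp_score": df(deployment_frequency),
        "ltfc_score": ltfc(lead_time_for_changes),
        "mttr_score": mttr(mean_time_to_recovery),
        "cfr_score": cfr(change_failure_rate),
    }

# A team deploying 5 times per period, with 3-day lead time, half-day
# recovery and a 10% failure rate scores Green on all four metrics:
print(dora_scores(5, 3, 0.5, 10))
```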

Thresholds on the composite metric can be less strict than on the individual metrics.

Here is the aggregated metric's code:

with dora_scores as ( -- Calculating DORA scores
    select
        a.timeline,
        a.unit_name,
        case
            when deployment_frequency < 1 then 0
            when deployment_frequency between 1 and 4 then 1
            when deployment_frequency is null then null
            else 2
        end as dp_score,
        case
            when lead_time_for_changes < 7 then 2
            when lead_time_for_changes between 7 and 30 then 1
            when lead_time_for_changes is null then null
            else 0
        end as ltfc_score,
        case
            when mean_time_to_recovery < 1 then 2
            when mean_time_to_recovery between 1 and 7 then 1
            when mean_time_to_recovery is null then null
            else 0
        end as mttr_score,
        case
            when change_failure_rate < 15 then 2
            when change_failure_rate between 15 and 30 then 1
            when change_failure_rate is null then null
            else 0
        end as cfr_score
    from DORA_Deployment_Frequency a
    join DORA_Lead_Time_for_Changes b
        on a.timeline = b.timeline and a.unit_name = b.unit_name
    join DORA_Mean_Time_to_Recovery c
        on a.timeline = c.timeline and a.unit_name = c.unit_name
    join DORA_Change_Failure_Rate d
        on a.timeline = d.timeline and a.unit_name = d.unit_name
)
select
    timeline,
    round(avg(score), 2) as "Composite DORA Score"
from (
    select timeline, avg(dp_score) as score from dora_scores group by timeline
    union all
    select timeline, avg(ltfc_score) as score from dora_scores group by timeline
    union all
    select timeline, avg(mttr_score) as score from dora_scores group by timeline
    union all
    select timeline, avg(cfr_score) as score from dora_scores group by timeline
) as temp
group by timeline
order by timeline asc
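One way to sanity-check the query's logic before wiring it into PERF is to run a simplified version of it against made-up sample data in SQLite. The table and column names below follow the query above; the data and the `2024-01` period are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Four aggregated data sets, one metric column each (names follow the query above).
tables = {
    "DORA_Deployment_Frequency": "deployment_frequency",
    "DORA_Lead_Time_for_Changes": "lead_time_for_changes",
    "DORA_Mean_Time_to_Recovery": "mean_time_to_recovery",
    "DORA_Change_Failure_Rate": "change_failure_rate",
}
for table, column in tables.items():
    cur.execute(f"create table {table} (timeline text, unit_name text, {column} real)")

# Hypothetical sample: one period, Team A all-Green, Team B all-Red.
sample = {
    "DORA_Deployment_Frequency": [("2024-01", "Team A", 5),   ("2024-01", "Team B", 0.5)],
    "DORA_Lead_Time_for_Changes": [("2024-01", "Team A", 3),  ("2024-01", "Team B", 40)],
    "DORA_Mean_Time_to_Recovery": [("2024-01", "Team A", 0.5), ("2024-01", "Team B", 10)],
    "DORA_Change_Failure_Rate":  [("2024-01", "Team A", 10),  ("2024-01", "Team B", 35)],
}
for table, rows in sample.items():
    cur.executemany(f"insert into {table} values (?, ?, ?)", rows)

# Simplified version of the composite-score query (NULL branches omitted).
query = """
with dora_scores as (
    select a.timeline, a.unit_name,
        case when deployment_frequency < 1 then 0
             when deployment_frequency between 1 and 4 then 1
             else 2 end as dp_score,
        case when lead_time_for_changes < 7 then 2
             when lead_time_for_changes between 7 and 30 then 1
             else 0 end as ltfc_score,
        case when mean_time_to_recovery < 1 then 2
             when mean_time_to_recovery between 1 and 7 then 1
             else 0 end as mttr_score,
        case when change_failure_rate < 15 then 2
             when change_failure_rate between 15 and 30 then 1
             else 0 end as cfr_score
    from DORA_Deployment_Frequency a
    join DORA_Lead_Time_for_Changes b on a.timeline = b.timeline and a.unit_name = b.unit_name
    join DORA_Mean_Time_to_Recovery c on a.timeline = c.timeline and a.unit_name = c.unit_name
    join DORA_Change_Failure_Rate d on a.timeline = d.timeline and a.unit_name = d.unit_name
)
select timeline, round(avg(score), 2) as "Composite DORA Score"
from (
    select timeline, avg(dp_score) as score from dora_scores group by timeline
    union all
    select timeline, avg(ltfc_score) as score from dora_scores group by timeline
    union all
    select timeline, avg(mttr_score) as score from dora_scores group by timeline
    union all
    select timeline, avg(cfr_score) as score from dora_scores group by timeline
) as temp
group by timeline
order by timeline asc
"""
rows = cur.execute(query).fetchall()
print(rows)  # one all-Green team and one all-Red team average out to 1.0
```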

Here is the drill-down code with High/Medium/Low levels for each metric & team:

select
    unit_name as "Team",
    case
        when dp_score < 1 then 'Low'
        when dp_score >= 1 and dp_score < 2 then 'Medium'
        when dp_score >= 2 then 'High'
        else null
    end as "Deployment Frequency",
    case
        when ltfc_score < 1 then 'Low'
        when ltfc_score >= 1 and ltfc_score < 2 then 'Medium'
        when ltfc_score >= 2 then 'High'
        else null
    end as "Lead Time for Changes",
    case
        when mttr_score < 1 then 'Low'
        when mttr_score >= 1 and mttr_score < 2 then 'Medium'
        when mttr_score >= 2 then 'High'
        else null
    end as "Mean Time to Recovery",
    case
        when cfr_score < 1 then 'Low'
        when cfr_score >= 1 and cfr_score < 2 then 'Medium'
        when cfr_score >= 2 then 'High'
        else null
    end as "Change Failure Rate"
from dora_scores
where timeline = clicked_x_value
order by unit_name asc
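The score-to-level mapping the drill-down uses is the same for all four columns, so it can be summarized in one small helper. This is an illustrative sketch of the logic, not part of the PERF query itself:

```python
def level(score):
    """Map a 0/1/2 RAG score to the label shown in the drill-down.
    None (missing data) stays None."""
    if score is None:
        return None
    if score < 1:
        return "Low"
    if score < 2:
        return "Medium"
    return "High"

print([level(s) for s in (0, 1, 2, None)])  # ['Low', 'Medium', 'High', None]
```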

Aggregated custom metrics can be added to the dashboards of upper units in the hierarchy as well as to a team's own dashboard - in the latter case, the metric and the pop-up that appears on click will contain only that unit's data.