0003: Adopting DORA Metrics for Measuring Team Health

STATUS

Request for Comments

CONTEXT

As our organization grows and adopts DevOps practices, we need a standardized set of metrics to measure the health of our software delivery teams. We have adopted Team Health Checks as a qualitative measure of team health, but a quantitative view is also needed. Traditional metrics such as lines of code, bug count, and velocity are inadequate for capturing the true effectiveness of teams, and optimizing for them can be counterproductive. We need a more holistic approach that accounts not just for output, but also for the quality, speed, and stability of our software delivery process.

Considered Options

  • DORA Metrics: Begin collecting the four DevOps Research and Assessment (DORA) metrics, an industry-standard, research-backed framework for measuring software delivery performance.

  • Developing Custom Metrics: We explored developing our own metrics tailored to our specific needs. While this approach could fit our context more closely, it would require significant effort and would lack the industry-wide acceptance and validation of the DORA metrics.

DECISION

We have decided to adopt the DORA (DevOps Research and Assessment) metrics as a standard for measuring the performance and health of our software delivery teams. The DORA metrics are based on extensive research and data analysis, and have been widely adopted by organizations across various industries.

The four DORA metrics are:

  1. Deployment Frequency: The frequency at which an organization successfully releases new software to production. Higher deployment frequencies are better.

  2. Lead Time for Changes: The time it takes for a new code commit to be deployed to production. Lower lead times are better.

  3. Mean Time to Restore (MTTR): The time it takes an organization to recover from a service disruption or incident. Lower MTTR values are better.

  4. Change Failure Rate: The percentage of deployments that result in degraded service or require remediation (e.g., rollbacks, fixes). Lower change failure rates are better.
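The four metrics above reduce to simple calculations over deployment and incident records. The following is an illustrative sketch only, using made-up sample data and hypothetical record shapes; in practice these values would come from our CI/CD and incident-management tooling rather than hard-coded lists.

```python
from datetime import datetime, timedelta

# Hypothetical deployment records: (commit_time, deploy_time, caused_failure)
deployments = [
    (datetime(2024, 1, 1, 9), datetime(2024, 1, 1, 15), False),
    (datetime(2024, 1, 2, 10), datetime(2024, 1, 3, 11), True),
    (datetime(2024, 1, 4, 8), datetime(2024, 1, 4, 12), False),
    (datetime(2024, 1, 5, 9), datetime(2024, 1, 5, 10), False),
]

# Hypothetical incident records: (start_time, resolved_time)
incidents = [
    (datetime(2024, 1, 3, 11), datetime(2024, 1, 3, 14)),
]

period_days = 7  # measurement window

# 1. Deployment Frequency: successful deployments per day over the window
deployment_frequency = len(deployments) / period_days

# 2. Lead Time for Changes: mean commit-to-production time
lead_times = [deploy - commit for commit, deploy, _ in deployments]
mean_lead_time = sum(lead_times, timedelta()) / len(lead_times)

# 3. Mean Time to Restore: mean incident duration
restore_times = [resolved - start for start, resolved in incidents]
mttr = sum(restore_times, timedelta()) / len(restore_times)

# 4. Change Failure Rate: share of deployments causing degraded service
change_failure_rate = sum(1 for _, _, failed in deployments if failed) / len(deployments)

print(f"Deployment frequency: {deployment_frequency:.2f}/day")   # 0.57/day
print(f"Mean lead time: {mean_lead_time}")                       # 9:00:00
print(f"MTTR: {mttr}")                                           # 3:00:00
print(f"Change failure rate: {change_failure_rate:.0%}")         # 25%
```

Note that the aggregation choices (mean vs. median lead time, the length of the measurement window) materially affect the numbers, so they should be fixed organization-wide before teams start comparing results.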

We will use Datadog's DORA metrics analysis tool to both capture and visualize these metrics.

CONSEQUENCES

Benefits

  • Standardization: Adopting a widely accepted and research-backed set of metrics will provide a common language and framework for measuring team performance across the organization.
  • Holistic View: The DORA metrics cover various aspects of software delivery, including speed, quality, and stability, providing a more comprehensive view of team health.
  • Data-Driven Decisions: By collecting and analyzing DORA metric data, we can make data-driven decisions about process improvements, tooling investments, and team staffing.
  • Continuous Improvement: Tracking and monitoring DORA metrics over time will allow teams to identify areas for improvement and measure the effectiveness of their efforts.

Risks

  • Data Collection Challenges: Collecting accurate and consistent data for DORA metrics may require changes to our existing tooling and processes, which can be time-consuming and resource-intensive.
  • Cultural Shift: Adopting new metrics may require a cultural shift within teams, as they adapt to new performance measurement criteria.
  • Interpretation Challenges: Interpreting DORA metric data and deriving actionable insights may require specialized expertise and training.

NOTES

References

Original Author

Approval date

Approved by

Appendix