In “The Mythical Man-Month,” Fred Brooks cites research showing that the best programmers can be up to ten times as productive as their average counterparts: the so-called “10x developers.”
However, that’s not the norm.
What is the norm is that software and operations teams want to improve continually. But measuring that improvement can be difficult.
So, as engineering leaders, we’re always looking for ways to improve our teams’ performance.
Here are three powerful frameworks to measure developer effectiveness.
Each offers unique insights into developer productivity and team performance. Which do you use in your organization? Let us know in the comments.
DORA (DevOps Research and Assessment)
Origin: DORA metrics originated from the DevOps Research and Assessment program, which was started by Dr. Nicole Forsgren, Jez Humble, and Gene Kim. Google acquired DORA in 2018 and has continued to publish annual State of #DevOps reports.
Example Metrics:
↳Deployment Frequency: 3 times per day
↳Lead Time for Changes: 2 days
↳Mean Time to Recover: 30 minutes
↳Change Failure Rate: 5%
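The four DORA metrics above are simple aggregations over deployment records. Here is a minimal sketch of how they could be computed, using entirely hypothetical data (the record shape, field names, and numbers are illustrative assumptions, not a real pipeline):

```python
from datetime import datetime, timedelta

# Hypothetical deployment records:
# (commit_time, deploy_time, caused_failure, minutes_to_recover)
deployments = [
    (datetime(2024, 6, 3, 9),  datetime(2024, 6, 5, 9),  False, 0),
    (datetime(2024, 6, 4, 10), datetime(2024, 6, 6, 10), True,  30),
    (datetime(2024, 6, 5, 11), datetime(2024, 6, 7, 11), False, 0),
    (datetime(2024, 6, 6, 12), datetime(2024, 6, 8, 12), False, 0),
]
days_observed = 7  # length of the observation window

# Deployment Frequency: deploys per day over the window
deployment_frequency = len(deployments) / days_observed

# Lead Time for Changes: average commit-to-deploy duration
lead_times = [deploy - commit for commit, deploy, _, _ in deployments]
avg_lead_time = sum(lead_times, timedelta()) / len(lead_times)

# Change Failure Rate: share of deployments that caused a failure
failures = [d for d in deployments if d[2]]
change_failure_rate = len(failures) / len(deployments)

# Mean Time to Recover: average recovery time across failed deployments
mttr_minutes = sum(d[3] for d in failures) / len(failures) if failures else 0

print(f"Deployment frequency: {deployment_frequency:.2f}/day")
print(f"Avg lead time: {avg_lead_time}, CFR: {change_failure_rate:.0%}, MTTR: {mttr_minutes:.0f} min")
```

In practice these inputs would come from your CI/CD system and incident tracker rather than a hardcoded list; the arithmetic stays the same.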
SPACE (Satisfaction and well-being, Performance, Activity, Communication and collaboration, Efficiency and flow)
Origin: The SPACE framework was introduced in 2021 by researchers from GitHub and Microsoft, including Dr. Nicole Forsgren, Margaret-Anne Storey, and others. It was published in the Communications of the ACM journal.
Example Metrics:
↳Satisfaction: 85% of developers report high job satisfaction
↳Performance: 95% of sprint goals met
↳Activity: Average of 5 pull requests submitted per developer per week
↳Communication & Collaboration: 3 hours spent in cross-functional meetings per week
↳Efficiency & Flow: 4 hours of uninterrupted coding time per day
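SPACE deliberately mixes survey responses with activity data, so a team report is just a roll-up across the five dimensions. A minimal sketch with hypothetical per-developer data (all field names and numbers are illustrative assumptions):

```python
# Hypothetical per-developer data for one sprint
team = [
    {"satisfied": True,  "goals_met": 9,  "goals_total": 10, "prs": 5, "meeting_hours": 3, "focus_hours": 4},
    {"satisfied": True,  "goals_met": 10, "goals_total": 10, "prs": 6, "meeting_hours": 2, "focus_hours": 5},
    {"satisfied": False, "goals_met": 8,  "goals_total": 10, "prs": 4, "meeting_hours": 4, "focus_hours": 3},
]

n = len(team)
space_report = {
    # Satisfaction: share of developers reporting high satisfaction
    "satisfaction": sum(d["satisfied"] for d in team) / n,
    # Performance: share of sprint goals met across the team
    "performance": sum(d["goals_met"] for d in team) / sum(d["goals_total"] for d in team),
    # Activity: average pull requests per developer
    "activity": sum(d["prs"] for d in team) / n,
    # Communication & Collaboration: average cross-functional meeting hours
    "communication": sum(d["meeting_hours"] for d in team) / n,
    # Efficiency & Flow: average uninterrupted coding hours per day
    "efficiency": sum(d["focus_hours"] for d in team) / n,
}
print(space_report)
```

The framework's point is that no single dimension tells the story; a report like this is only useful when you look at all five together.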
DevEx (Developer Experience)
Origin: While the term Developer Experience has been around for a while, the #DevEx framework, built on the three dimensions of feedback loops, cognitive load, and flow state, was formalized in 2023 by Abi Noda, Margaret-Anne Storey, Dr. Nicole Forsgren, and Michaela Greiler in the ACM Queue paper “DevEx: What Actually Drives Productivity.”
Example Metrics:
↳Feedback Loop: Average time from code commit to test results is 10 minutes
↳Cognitive Load: Developers report spending 70% of time on new features vs. 30% on maintenance
↳Flow State: Developers achieve 3 hours of deep work per day
↳Ease of Tool Use: 90% of developers rate internal tools as “easy to use”
↳Onboarding Time: New developers are productive within 2 weeks
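DevEx metrics likewise combine telemetry (CI timings) with self-reported data (time split, tool ratings). A minimal sketch of rolling those samples up, again with purely hypothetical inputs (every list and value here is an illustrative assumption):

```python
from statistics import mean

# Hypothetical telemetry and survey samples
ci_feedback_minutes = [8, 12, 10, 9, 11]          # commit -> test-result times
time_split = {"features": 70, "maintenance": 30}   # self-reported percent of time
deep_work_hours = [3.5, 2.5, 3.0]                  # daily deep-work hours per developer
tool_rated_easy = [True, True, True, False, True,  # "easy to use" survey answers
                   True, True, True, True, False]

devex = {
    # Feedback Loop: average commit-to-test-results time
    "feedback_loop_min": mean(ci_feedback_minutes),
    # Cognitive Load proxy: share of time on new features vs. maintenance
    "feature_work_pct": time_split["features"],
    # Flow State: average daily deep-work hours
    "flow_hours": mean(deep_work_hours),
    # Ease of Tool Use: percent of developers rating tools easy to use
    "tool_ease_pct": 100 * sum(tool_rated_easy) / len(tool_rated_easy),
}
print(devex)
```

Onboarding time is harder to automate and is usually tracked per cohort (date of first merged change minus start date), so it is left out of this sketch.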
At #Pulumi, we are confident that we can help you with your deployment metrics. Find out why at https://hubs.ly/Q02JBvtW0