
While platform engineering is relatively new, it is not entirely novel. According to Puppet's 2024 'State of DevOps Report: The Evolution of Platform Engineering', 43% of organizations report having had a platform team for three to five years already. This means companies active in this field have had time to learn about developer experience, which raises some interesting questions: How do we measure developer experience? How do we know that the platform brings value to the company? What are these metrics? Which ones do I need for my tasks at hand, and how should I approach the question of metrics in platform engineering?

Understanding Core Metrics

My professional experience primarily comes from my role at Jooble, where I serve as the platform lead, and at Paydock, where I am the product owner of the foundation squad working on developer experience. Both roles have shown me the value of platforms through concrete examples. In a competitive world where technology and user needs evolve rapidly, staying relevant requires quick adaptation, and the ability to swiftly develop and release high-quality software is crucial. How can you adapt quickly if you release only once every three months? As the saying goes, we can't change what we can't measure, which underscores the necessity of metrics in platform engineering.

DORA Metrics

DORA metrics are a good starting point if you don't know where to start. DORA stands for DevOps Research and Assessment, and its metrics provide a standard, widely used set for evaluating process performance and maturity. Change lead time measures how long it takes for a change to go from commit to deployment. Deployment frequency measures how often changes are pushed to production. Change failure rate measures how often a deployment introduces a failure that requires immediate intervention. Failed deployment recovery time shows how long it takes to recover from a failed deployment.

From my experience, these metrics are valuable for gaining insights and, as mentioned, are widely used in the industry. The stability metrics (change failure rate and failed deployment recovery time) focus on quality, while the throughput metrics (change lead time and deployment frequency) focus on fast feedback and ease of detecting problems. Platform teams primarily aim to enable fast-flow software delivery, and these metrics are excellent for tracking progress toward that goal because they measure both the speed and the quality of the delivery process.
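To make the four definitions concrete, here is a minimal sketch of computing them from deployment records. The Deployment record and its field names are assumptions for illustration, not a standard schema; in practice this data would come from your CI/CD system or incident tracker.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import median
from typing import Optional

# Hypothetical deployment record; field names are illustrative assumptions.
@dataclass
class Deployment:
    committed_at: datetime            # when the change was committed
    deployed_at: datetime             # when it reached production
    failed: bool = False              # did it require immediate intervention?
    recovered_at: Optional[datetime] = None  # when service was restored

def dora_metrics(deployments: list, period_days: int) -> dict:
    """Compute the four DORA metrics over a set of deployment records."""
    lead_times = [(d.deployed_at - d.committed_at).total_seconds() / 3600
                  for d in deployments]
    failures = [d for d in deployments if d.failed]
    recoveries = [(d.recovered_at - d.deployed_at).total_seconds() / 3600
                  for d in failures if d.recovered_at]
    return {
        # throughput: how fast changes flow to production
        "change_lead_time_hours": median(lead_times),
        "deployments_per_day": len(deployments) / period_days,
        # stability: how often things break and how quickly you recover
        "change_failure_rate": len(failures) / len(deployments),
        "recovery_time_hours": median(recoveries) if recoveries else 0.0,
    }
```

A team could run this weekly over the latest deployment window and chart the trend; the absolute numbers matter less than whether they move in the right direction.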

Is Happiness a Metric?

DORA metrics provide quantitative data, which is essential for understanding how the company is performing. However, it is equally important to engage with engineers and communicate with them regularly. For example, you can run quarterly surveys to gauge developer happiness, track changes over time and see whether the features you are building actually increase satisfaction. This can reveal blind spots and ensure that your product offers genuinely valuable features that engineers can relate to.

It is crucial to remember that the main purpose of having a platform team is to make the lives of engineers better. How can you achieve this if they are not using the features you built? This can also be measured: When you release new functionality on your platform, measure whether engineers are using it and how usage changes over time. Then, when you run a survey to check developer happiness, you can correlate the data. If you see high usage of the newly released features and survey results show increased happiness, it can be a sign that you are heading in the right direction.
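The correlation step above can be sketched with a plain Pearson coefficient. The adoption and happiness figures below are invented for illustration; the point is the shape of the analysis, not the numbers.

```python
from statistics import mean

# Hypothetical quarterly data: fraction of engineers using a new platform
# feature, and the average happiness score (1-5) from the matching survey.
feature_adoption = [0.10, 0.35, 0.60, 0.80]
happiness_score = [3.1, 3.4, 3.9, 4.2]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sd_x = sum((x - mx) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

r = pearson(feature_adoption, happiness_score)
# A coefficient near +1 suggests adoption and happiness move together;
# correlation alone does not prove the feature caused the improvement.
```

With only a handful of survey cycles the coefficient is noisy, so treat it as a directional signal to discuss with the team rather than proof of impact.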

Additional Metrics

While the metrics described earlier serve as a solid foundation, it is essential to recognize that each product is unique. You may find it beneficial to introduce additional metrics or tailor the suggested ones to align with your specific use case. From my experience, a critical question that facilitates understanding of what metrics to measure is: What is the company’s primary goal? Is the objective to expedite releases, enhance quality, ensure stability or achieve another strategic aim? It is advisable not to answer this question in isolation. Engage your team and stakeholders in a collaborative brainstorming session. This approach not only yields superior outcomes but also ensures the metrics chosen are highly relevant.

Tailoring Metrics for Success

In conclusion, while DORA metrics form a solid foundation, each company must tailor them to its specific needs. To ensure successful platform operations, establish and routinely review metrics that align with organizational goals and stakeholder needs. Track both performance and improvement metrics, and integrate operational measures with team enablement and tool adoption indicators. By understanding and implementing these key metrics, platform engineering teams can significantly boost their efficiency and effectiveness, leading to streamlined operations and, ultimately, more satisfied and loyal customers.

By continuously refining and adapting metrics to fit your company’s unique objectives and engaging with your engineering team, you can foster a collaborative environment that emphasizes both productivity and satisfaction. Remember, the ultimate goal of platform engineering is to enhance the developer experience so that engineers can spend more time delivering valuable features to users. Balancing quantitative DORA metrics with qualitative feedback ensures you are addressing both performance and happiness. This dual approach will drive meaningful improvements, help achieve long-term success for your platform, enable you to stay competitive in the market and create a product users love.
