Metrics is a topic in QA/Test that gets a lot of people worked into a tizzy. And rightfully so. It is an area full of data manipulation, incorrect assessments and magic numbers. So let's add some more fuel to the flames (an apt cliché since I'm sitting on a train that has been halted because the fire marshal has closed the tracks a couple of km ahead).

Sandy Kemsley spent the first half of this week at a Gartner BPM conference and was live-blogging it. In her post about Michael Smith, she brings up his metric hierarchy.

He organizes metrics into three levels: accounting metrics at the highest level, which are often regulated and audited; performance metrics, which are non-regulated but are key performance indicators for that industry; and analytical metrics, which are specific to the company but explain the performance metrics. It's important to differentiate between performance metrics and analytical metrics, and not jump straight down to the fine-grained detail of the latter without considering the industry KPIs.

Since everyone has different metrics they monitor, I'll let you do your own mapping, but I would guess that things like code coverage would be an ‘accounting metric’, number of authentications per second would be a performance one, and number of database hits per transaction probably falls into the analytical bucket.
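
If it helps to see that mapping as data, here is a minimal sketch in Python. The metric names and the level each one lands on are my own guesses for a QA/test context, not anything from Smith's talk, so treat it as illustration rather than prescription.

```python
from enum import Enum


class MetricLevel(Enum):
    ACCOUNTING = "accounting"    # often regulated and audited
    PERFORMANCE = "performance"  # non-regulated industry KPIs
    ANALYTICAL = "analytical"    # company-specific, explains the KPIs


# Hypothetical mapping of QA/test metrics onto the three levels
METRIC_LEVELS = {
    "code_coverage": MetricLevel.ACCOUNTING,
    "authentications_per_second": MetricLevel.PERFORMANCE,
    "db_hits_per_transaction": MetricLevel.ANALYTICAL,
}


def metrics_at(level: MetricLevel) -> list[str]:
    """Return the names of the metrics classified at a given level."""
    return [name for name, lvl in METRIC_LEVELS.items() if lvl is level]


if __name__ == "__main__":
    for level in MetricLevel:
        print(level.value, "->", metrics_at(level))
```

The point of keeping the level explicit next to each metric is that it forces you to ask which bucket a number belongs in before you start reporting it, rather than jumping straight to the fine-grained analytical stuff.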