Emergent Chaos has a useful post that provides an example from education where metrics lead to inappropriate conclusions. The post ends with this advice:
> There’s two important takeaways. First, if you’re reading "scorecards" from somewhere, make sure you understand the nitty gritty details. Second, if you’re designing metrics, consider what perverse incentives and results you may be getting.
His example is a classic "rate of change" versus "coverage" challenge that highlights the law of diminishing marginal returns. I frequently joke with vendors who boast that their "year over year revenue increased by 400%" that going from $1 to $4 is not so exciting; having a baseline to work with helps. (This is also somewhat related to the dearth of new features by rev 3 or 4 of any product: the stuff that finally gets integrated has already been accounted for in previous revs' marketing.)
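To put a number on the joke, here is a minimal sketch, with made-up dollar figures, of why a year-over-year growth figure tells you nothing until you know the baseline it was computed from:

```python
# Minimal sketch (dollar figures are invented): a growth multiple carries no
# information about scale without the baseline it was computed against.
def yoy_multiple(previous: float, current: float) -> float:
    """Current-year revenue as a multiple of the previous year's."""
    return current / previous

# The same "4x" headline can describe wildly different businesses:
print(yoy_multiple(1, 4))                      # 4.0  ($1 -> $4)
print(yoy_multiple(50_000_000, 200_000_000))   # 4.0  ($50M -> $200M)
```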
But there is a broader issue here: selecting the correct metric for the job. Since the gaming of metrics systems is a standard objection to doing metrics at all, what exactly are the security metrics that people don't like? Here are two of mine:
- From privacyrights.org: the TOTAL number of records containing sensitive personal information involved in security breaches. Given the history, this is about as clear a case of gaming as you'll find in security, IMO. And don't forget this.
- Jeff Jones' (et al.) "Days of Risk" – primarily because he uses it as a measure of vendor responsiveness to security issues, while the general public reads it as a measure of the vulnerable state of the systems. That latter reading makes the metric, which only counts the days between disclosure and patch, horrible (a quick sketch of the calculation follows below).
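Since the complaint is about what the number actually measures, here is a minimal sketch of a days-of-risk calculation; the advisory names and dates are invented. The window opens at disclosure and closes at patch, so everything before disclosure simply isn't counted.

```python
from datetime import date

# Hypothetical advisories: (disclosure date, patch date). The dates are
# invented for illustration; days of risk here is simply patch - disclosure.
advisories = {
    "ADV-0001": (date(2007, 1, 10), date(2007, 2, 13)),
    "ADV-0002": (date(2007, 3, 2),  date(2007, 3, 20)),
}

days_of_risk = {
    name: (patched - disclosed).days
    for name, (disclosed, patched) in advisories.items()
}

print(days_of_risk)                                     # {'ADV-0001': 34, 'ADV-0002': 18}
print(sum(days_of_risk.values()) / len(days_of_risk))   # average days of risk: 26.0
```

That framing is fine if you want a vendor-responsiveness number; it tells you very little about how exposed running systems actually are.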
On the enterprise side, I have heard folks say at one time or another that security spending as a percentage of total IT spending is a bad measure, as is the number of firewall drops. I like both of these if they are put into the right context.
So what is your least favorite security metric?