Breach Costs “per record”

It makes a lot of sense to normalize data coming in from many different sources so that the individual values can be compared more easily to each other. This is done at a ratio level of measurement (the other levels are nominal, ordinal, and interval, in that order, with ratio as the highest level).

However, using this scale assumes there is a reason to: that the quantity being measured (in this case, cost) correlates well with the normalizer (the denominator).

It is common when estimating the cost of data breaches to quote a cost "per record". Most recently, the Ponemon Institute released a study asserting a cost of $202 per record for data breaches. But here's the problem: the bulk of breach costs are not variable costs (or at least it isn't clear to me that they are). They appear to be fixed costs.

In the Ponemon example, the variable cost (notification) constitutes $15 of that $202, or about 7.5%. This doesn't strike me as significant enough to keep using per-record ratios as a measurement.
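A quick back-of-the-envelope check on that split, as a minimal sketch using only the two figures quoted above:

```python
# How much of the headline $202/record actually varies with record count?
per_record_headline = 202   # Ponemon figure, $ per record
notification = 15           # the clearly variable component, $ per record

variable_share = notification / per_record_headline
print(f"variable share: {variable_share:.1%}")               # ~7.4% (the "about 7.5%" above)
print(f"apparently fixed share: {1 - variable_share:.1%}")   # ~92.6%
```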

Why does this matter? Because, to make this information useful, people will want to plan around it: counting up the total number of records they hold, then multiplying by $202 to figure out potential losses (from there, they would need to discount based on probability of loss, but that's another post).

It seems unlikely to me that, with a million records instead of ten million, you will lose only 1/10th as much in, say, lost business.
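To illustrate the gap, here is a rough sketch comparing the naive records-times-$202 calculation with a mostly-fixed-cost model. The $202 and $15 figures come from the study discussed above; the assumed average breach size used to back out a fixed incident cost is purely an illustrative assumption of mine, not a number from the study.

```python
# Compare the naive per-record loss estimate with a mostly-fixed-cost model.
# $202 and $15 come from the Ponemon figures above; the assumed average
# breach size used to derive a fixed incident cost is an illustrative guess.

PER_RECORD_HEADLINE = 202         # $ per record (Ponemon headline figure)
VARIABLE_PER_RECORD = 15          # $ per record that clearly scales (notification)
ASSUMED_AVG_BREACH_SIZE = 30_000  # records; assumption, not from the study

# Treat everything except notification as a fixed cost of having an incident.
FIXED_COST = (PER_RECORD_HEADLINE - VARIABLE_PER_RECORD) * ASSUMED_AVG_BREACH_SIZE


def naive_estimate(records: int) -> int:
    """Headline method: multiply the record count by $202."""
    return records * PER_RECORD_HEADLINE


def mostly_fixed_estimate(records: int) -> int:
    """Alternative: fixed incident cost plus $15 per record for notification."""
    return FIXED_COST + records * VARIABLE_PER_RECORD


for records in (1_000_000, 10_000_000):
    print(f"{records:>10,} records: naive ${naive_estimate(records):>13,}, "
          f"mostly fixed ${mostly_fixed_estimate(records):>13,}")
# Under these assumptions the naive estimate overstates the modeled loss by
# roughly an order of magnitude, because it scales the fixed portion with
# record count.
```

The exact split doesn't matter much; any model in which most of the cost does not scale with record count breaks the per-record extrapolation.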

All the studies use per-record figures, and I think this is misleading. Better to use numbers per enterprise, as a percent of revenue, or perhaps even as a percent of all costs (I'm brainstorming here).