Inspired by a rant against information overload, I looked up Sturgeon's Law, which might be paraphrased as "90% of everything is crap". That in turn got me thinking about the Pareto principle (a.k.a. the 80/20 rule: roughly 80% of the effects stem from 20% of the causes). The numbers in both statements are arbitrary and indicative, not literal. The 80% or 90% values are meant to convey "a large proportion" and bear no special significance beyond that. Adding the phrase "of the order of" would not materially affect either statement.
I'm also reminded of Steven Wright's quip that "42.7% of statistics are made up on the spot", while Benjamin Disraeli's "lies, damned lies, and statistics" warns us that numbers can be used to mislead as much as to inform.
So how does this relate to PRAGMATIC security metrics?
It is especially pertinent to the Accuracy and Meaningfulness criteria.
Most metrics can be made more Accurate by taking a greater number of measurements and/or being more careful and precise in the measurement process. The number of readings is statistically relevant when we are sampling from a population: the more samples we take, the more accurately we can estimate the values for the population as a whole. Measurement precision depends on factors such as the quality of the measuring instruments and the care we take to determine and record each value. Taking repeated measurements on the same sample is another way to increase the Accuracy. However, that extra Accuracy comes at a Cost in terms of the time, effort and resources consumed.
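To illustrate that trade-off, here's a minimal Python sketch using an invented population of 10,000 "patch delay" figures (the numbers themselves mean nothing; they are there purely to show how quickly the benefit of extra samples tails off):

```python
import random
import statistics

# Purely illustrative: an invented population of 10,000 "patch delay" values
# in days, used only to show how estimation error shrinks as the sample grows.
random.seed(42)
population = [random.gauss(30, 10) for _ in range(10_000)]
true_mean = statistics.mean(population)

for n in (10, 100, 1_000, 5_000):
    # Repeat the sampling exercise many times and see how far a sample of
    # this size typically strays from the true population mean.
    errors = [abs(statistics.mean(random.sample(population, n)) - true_mean)
              for _ in range(200)]
    print(f"sample size {n:>5}: typical error ~{statistics.mean(errors):.2f} days")
```

The estimation error shrinks roughly with the square root of the sample size, so each further increment of Accuracy costs proportionately more measurement effort.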
Greater Accuracy may increase the validity and precision of the metric, but is that valuable or necessary?
Regarding Meaningfulness, the very fact that we have special terms such as 'rule of thumb' and 'ballpark figure' implies that, despite their imprecision, rough approximations are valuable. They give us a useful starting point, a default frame of reference and a set of assumptions that are in the right ballpark and close enough for government work.
A long, long time ago, I dimly recall being taught arithmetic at school, back in those cold dark days before electronic calculators and computers came to dominate our lives, when digital calculation meant using all ten fingers. We learnt our 'times tables' by rote. We were shown how to do addition, subtraction and division with pencil and paper (yes, remember those?!). We looked up logarithms and sines in printed tables. When calculators came along, difficult calculations became easier and quicker, but so too did simple errors, hence we were taught to estimate the answer before calculating it and to use the estimate to spot gross errors.

It's a skill I still use to this day, because being 'roughly right' often trumps being 'precisely wrong'. To put that another way, there are risks associated with unnecessary precision. At best, being highly accurate is often - though not always - a waste of time and effort; at worst, it lends spurious credibility to figures that don't deserve it. Paradoxically, a conscious decision to use the rounding function, or to reduce the number of significant figures displayed in a column of numbers, can increase the utility and value of a spreadsheet by cutting out unnecessary distractions. Instinctively knowing roughly how much change to expect when buying a newspaper with a $20 note has literally saved me money.
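In a spreadsheet that usually means ROUND() or simply displaying fewer decimal places; as a rough sketch in Python (with made-up figures, purely to show the effect), the same idea looks something like this:

```python
from math import floor, log10

def round_sig(value, figures=2):
    """Round a number to the given count of significant figures."""
    if value == 0:
        return 0.0
    return round(value, figures - 1 - floor(log10(abs(value))))

# Made-up figures: the detail beyond two significant figures is mostly noise
raw = [12_347.61, 983.224, 47.0912, 0.0031825]
print([round_sig(v) for v in raw])   # [12000.0, 980.0, 47.0, 0.0032]
```

Two significant figures is usually plenty for a management report: the trend matters far more than the third decimal place.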
The broader point concerns information in general, of course, not just numbers or security metrics. A brief executive summary gives us just enough of a clue to decide whether to invest our valuable time in reading the entire article. A précis or extract is meant to convey the flavor of the piece, not condense its entirety.
So, to sum up this ramble, don't dismiss imprecise, inaccurate, rough measures out of hand. As the name suggests, "indicators" are there to indicate things, not to define them to the Nth degree. A metric that delivers a rough-and-ready indication of the organization's security status, for instance, and is cheap and easy enough to update every month or so, is probably of more use to management than an annual IT audit that sucks in resources like a black hole and reports things that are history by the time they appear in the oh-so-nicely-bound audit report.