15 February 2016

We don't know, we just don't know UPDATED


Crime-related metrics are troublesome for several reasons.  

Firstly, crime tends to be hidden, out of sight, mostly in the shadows. An unknown number of crimes are never discovered, hence the incidents that are recognized and recorded may not be representative of the entire population (a toy illustration of this follows the sixth point below). Criminals might brag about their exploits to their posse, but they are hardly likely to participate willingly in surveys.

Secondly, criminals can't be trusted, so even if they did complete the forms, we probably shouldn't swallow their responses. Mind you, if the surveys weren't designed scientifically with extreme care over the precise questions, proper selection of the samples, rigorous statistical analysis, honest reporting etc., then all bets are off. 

Thirdly, the police, governments/authorities, the news media, assorted commercial organizations, professions, industry bodies and pressure groups all have vested interests too, meaning that we probably shouldn't believe their surveys and assessments either, at least not uncritically*. Guess what, if an organization's income or power depends to some extent on the size of The Problem, they may, conceivably, allegedly, be tempted to slightly over-emphasize things, perhaps exaggerating, oh just a little, and down-playing or ignoring inconvenient metrics and findings that don't quite align with their world view or objectives. [This one applies to me too as an infosec pro, but recognizing my inherent bias is not the same as counteracting it.]

Fourthly, the metrics vary, for example in how they define or categorize crimes, in what countries or areas they cover, and in the measurement methods employed. Are US homicide numbers directly comparable with murders in, say, the UK? Are they even comparable, period-on-period, within any one jurisdiction? Would deliberately killing someone by running them over 'count' as a car crime, a murder, an accident, a crime of passion, or what?

Fifthly, the effects of crime are also hard to account for, especially if you appreciate that they extend beyond the immediate victims. Society as a whole suffers in all sorts of ways because of crime. These effects and the associated costs are widely distributed. 

Sixthly, and lastly for now, crime is inherently scary, hence crime metrics are scary too, or at least eye-catching. We risk losing our sense of perspective when considering 'facts' such as the skyrocketing rates of gun crime, home invasions, child abductions or whatever in relation to all the normal humdrum risks of everyday life, let alone all those scares about smoking, obesity, stress, heart disease and cancer. The emotional impact of crime metrics, and the way they are portrayed in various media, introduces yet more bias. [By the way, the same consideration applies to security metrics: perhaps we should explore that tangent another day.]
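
To put the first point in concrete terms, here is a tiny, purely illustrative simulation - hypothetical numbers, not real crime data - showing how an unknown discovery rate undermines reported counts: the same underlying reality can yield wildly different 'official' figures, and without knowing what fraction of crimes ever comes to light we can't work backwards to the truth.

import random

# Toy illustration only: hypothetical numbers, not real crime data.
random.seed(1)
true_incidents = 10_000                 # assume this many crimes actually occur

for discovery_rate in (0.9, 0.5, 0.1):  # fraction of crimes that ever come to light
    reported = sum(random.random() < discovery_rate for _ in range(true_incidents))
    print(f"discovery rate {discovery_rate:.0%}: reported incidents = {reported:,}")

# Each run measures the same 10,000 'real' crimes, yet the reported counts
# range from roughly 9,000 down to roughly 1,000 depending on a parameter
# nobody actually knows.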

So, with all that and more in mind, what are we to make of cybercrime? How many cybercrimes are there? How many remain unidentified? To what extent can we trust our information sources? How do we even define, let alone measure, cybercrime? What is The Problem, and how big is it? And does it really matter anyway if the answer is bound to be scary?

Well, yes, it does matter, because all sorts of things are predicated on cybercrime statistics - strategies, policies (public, corporate and personal), risk assessments, investment and spending plans, budgets and so forth. 

The right answer might be: we don't know. Good luck with all those predicates if that's your final answer! Phone a friend? 50/50?

* Update Feb 20th: according to a paper entitled "Cybercrime costs more than you think", "Cybercrime costs the global economy about $450 billion each year", a factoid used (for reasons that are not entirely obvious) to support a call for organizations to plan for incidents. Their sources are not clearly referenced but the paper appears to draw on a glossy report by Allianz, an insurance company with an obvious self-interest in pumping up the threat level. The Allianz report in turn cited studies by the Ponemon Institute and by McAfee with the Center for Strategic and International Studies, three further organizations with axes to grind in this space. To their credit, the 2014 McAfee/CSIS study openly acknowledged the poor quality of the available data - for instance stating: "... we found two divergent estimates for the European Union, one saying losses in the EU totaled only $16 billion, far less than the aggregate for those EU countries where we could find data, and another putting losses for the EU at close to a trillion dollars, more than we could find for the entire world ..." They also noted particular difficulties in estimating the costs of theft of intellectual property, while simultaneously claiming that IP theft is the most significant component of loss. Naturally, such carefully worded caveats buried deep in the guts of the McAfee/CSIS study didn't quite make it through to the Allianz glossy or the sales leaflets that cite it. It's a neat example of how, once you unpick things, you discover that incomplete and unreliable information, coupled with rumours, intuition, guesswork, marketing hyperbole and weasel words, has morphed via factoids, soundbites and headline horrors into 'fact'. Hardly a sound basis for strategic decision-making, or indeed for purchasing commercial goods and services. 
