Showing posts with label SMotQ. Show all posts

12 February 2014

PRAGMATIC Security Metric of the Quarter #7

PRAGMATIC Information Security Metric of the Seventh Quarter


According to the overall PRAGMATIC scores assigned by ACME's managers, the latest metric discussed was the top choice in the three months just past, but it was a close-run thing:

Example metric P R A G M A T I C Score
Information security incident management maturity 90 95 70 80 90 85 90 85 90 86%
Information security ascendancy 97 87 15 94 86 90 99 97 99 85%
Quality of system security 83 88 83 73 90 68 80 82 10 73%
Integrity of the information asset inventory 82 66 83 78 80 43 50 66 70 69%
Proportion of systems security-certified 72 79 73 89 68 32 22 89 88 68%
Number of different controls 71 75 72 75 88 30 50 65 43 63%
Controls consistency 78 83 67 60 71 33 27 31 27 53%
Value of information assets owned by each Information Asset Owner 48 64 78 57 79 38 50 22 26 51%
Number of information security events and incidents 70 60 0 50 72 35 35 70 50 49%
% of business units using proven identification & authentication 69 73 72 32 36 4 56 2 50 44%
Distance between employee and visitor parking 1 0 6 93 2 93 66 45 66 41%
Employee turn vs account churn 30 30 11 36 44 36 62 57 20 36%
Non-financial impacts of information security incidents 60 65 0 20 60 6 30 20 17 31%
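In case it isn't obvious from the table, the Score column is simply the mean of the nine criterion ratings, rounded to a whole percentage. Here's a minimal sketch (the function name is our own, purely for illustration):

```python
# Minimal sketch: the overall PRAGMATIC score is the mean of the nine
# criterion ratings (P R A G M A T I C), rounded to a whole percentage.

def pragmatic_score(ratings):
    """Return the overall PRAGMATIC score for nine 0-100 ratings."""
    if len(ratings) != 9:
        raise ValueError("expected one rating per PRAGMATIC criterion")
    return round(sum(ratings) / 9)

# "Information security incident management maturity" from the table above:
print(pragmatic_score([90, 95, 70, 80, 90, 85, 90, 85, 90]))  # -> 86
```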



"Maturity of the organization's information security incident management activities" seems to us to be an excellent proxy or indicator for the organization's overall approach to information security. The maturity scoring process we have described makes this a valuable metric, not just in terms of the final maturity rating but also the additional information that emerges when comparing current practices against accepted good practices.

Just as interesting are the metrics languishing at the bottom of the league table. For example, "Non-financial impacts of incidents" may appear, at first glance, to hold considerable promise as a security metric but the PRAGMATIC score clearly indicates ACME management's severe misgivings once they explored the metric in more detail.

Instead of simply selecting metrics on the basis of their overall PRAGMATIC scores, management could select high-rating metrics for any one of the individual PRAGMATIC criteria, or any combination thereof - for example, 'information security ascendancy' is rated the most predictive and cost-effective security metric of this little lot.

In researching and developing the PRAGMATIC method for the book, we explored the possibility of weighting the PRAGMATIC ratings in order to place more or less emphasis on the criteria. There may be situations where that is a sensible approach but, in the end, we decided that the overall PRAGMATIC score was the most valuable and straightforward metametric.
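To sketch what weighting would look like in practice (the weights below are purely hypothetical, chosen by us to illustrate the idea, not values from the book):

```python
# Illustrative sketch of a weighted PRAGMATIC score. The weights are our
# own hypothetical choice - e.g. doubling the emphasis on the Predictive
# and Cost criteria - not anything prescribed by the book.

def weighted_pragmatic_score(ratings, weights):
    """Weighted mean of the nine 0-100 criterion ratings, as a rounded %."""
    if len(ratings) != 9 or len(weights) != 9:
        raise ValueError("expected nine ratings and nine weights")
    return round(sum(r * w for r, w in zip(ratings, weights)) / sum(weights))

# "Information security ascendancy" ratings from the table above:
ratings = [97, 87, 15, 94, 86, 90, 99, 97, 99]
equal   = [1] * 9                      # unweighted mean, as used in the book
skewed  = [2, 1, 1, 1, 1, 1, 1, 1, 2]  # hypothetical: double weight on P and C

print(weighted_pragmatic_score(ratings, equal))   # -> 85, as in the table
print(weighted_pragmatic_score(ratings, skewed))  # -> 87
```

Note how the skewed weighting lifts this metric's score, since its Predictive and Cost ratings (97 and 99) are its strongest suits.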

03 November 2013

PRAGMATIC Security Metric of the Quarter #6


The league table for another three months' worth of information security metrics shows a very close race for the top slot:


Metric P R A G M A T I C Score

81 69 89 92 80 99 98 90 98 88%

95 97 70 78 91 89 90 85 90 87%

75 75 90 73 84 76 80 77 93 80%

65 76 91 73 83 77 70 61 78 75%

80 85 40 66 72 75 80 80 80 73%

88 86 88 65 78 60 26 90 70 72%

72 80 10 80 80 80 61 80 79 69%

86 80 51 40 65 39 55 95 60 63%

80 70 72 30 75 50 50 65 65 62%

58 55 82 73 86 47 64 66 17 61%

75 70 66 61 80 50 35 36 50 58%

85 85 67 40 77 40 48 16 40 55%
Psychometrics 40 24 0 79 15 55 10 42 5 30%

[Click any metric to visit the original blog piece that explained the rationale for ACME's scoring.]

Hopefully by now you are starting to make out themes or patterns in the metrics that score highly on the PRAGMATIC scale.

Having so far discussed and scored more than half of the example metrics from the book, plus a bunch more metrics from other sources, there's a fair chance we have covered some of the security metrics that your organization currently uses. How did they do? Do the PRAGMATIC scores and the discussion broadly reflect your experience with those metrics?  

We would be amazed if your metrics rate exactly the same as ACME's but if any of your scores are markedly higher or lower, that itself is interesting (and we'd love to hear why - feel free to comment on the blog or email us directly). The most likely explanation is that you are interpreting and using the metric in a way that suits your organization's particular information security management needs, whereas ACME's situation is different. Alternatively, it could be that you are applying the PRAGMATIC criteria differently to ACME (and us!). To be honest, it doesn't matter much either way: arguably the most important benefit of PRAGMATIC is that it prompts a structured analysis and, hopefully, a rational and fruitful discussion of the pros and cons of various security metrics.

12 July 2013

PRAGMATIC Security Metric of the Quarter #5

Example Information Security Metric of the Fifth Quarter

The PRAGMATIC scores for another three months' worth of information security metrics examples are as follows:

Example metric P R A G M A T I C Score
Information access control maturity 90 95 70 80 90 80 90 85 90 86%
Security policy management maturity 90 95 70 80 88 85 90 82 88 85%
Number of important operations with documented & tested security procedures 95 96 91 85 95 84 62 90 60 84%
Information security budget variance 70 90 85 77 80 77 80 90 95 83%
% of information assets not [correctly] classified 75 75 97 85 90 80 80 80 80 82%
Policy coverage of frameworks such as ISO/IEC 27002 70 75 90 69 85 76 72 65 85 76%
% of policy statements unambiguously linked to control objectives 92 91 64 60 85 65 45 75 75 72%
Rate of change of emergency change requests 64 71 69 73 78 70 70 69 83 72%
Total liability value of untreated/residual risks 88 98 59 33 96 33 77 38 10 59%
Entropy of encrypted content 78 66 23 78 3 93 74 79 34 59%
Embarrassment factor 26 38 20 50 63 72 40 87 87 54%
% of security policies that satisfy documentation standards 66 47 79 45 74 38 44 50 35 53%
Patching policy compliance 66 52 55 77 19 36 11 8 5 37%

Top of the heap are two maturity metrics scoring 85% and 86%, with a further three metrics also scoring in the 80s.

While it is tempting to recommend these and other high-scoring metrics to you, dear reader, please bear in mind that they were scored in the context of a fictional manufacturing company, ACME Enterprises Inc.  The scores reflect the perceptions, prejudices, opinions and needs of ACME's managers, given their current situation.  Things are undoubtedly different for you.  We don't know what's really important to you, your managers and colleagues, about information security.  We have no idea which aspects are of particular concern, right now, nor what might be coming up over the next year or three.  Hence we encourage you to think critically about the way we describe the metrics, and preferably re-score them.

Furthermore, PRAGMATIC scores alone are not necessarily a sound basis on which to select or reject metrics.  It's not that simple, unfortunately, despite what you may think given the way we bang on and on about PRAGMATIC!  The scores are intended to guide the development of an information security measurement system, a well-thought-out suite of metrics plus the associated processes for measuring and using them.  Considering and scoring each security metric in isolation does not build the big picture view necessary to measure information security as a coherent and integral part of the organization's management practices.

The book describes PRAGMATIC scoring as the heart of a comprehensive method, an overall approach to information security metrics.  The method starts by figuring out your metrics audiences and their measurement requirements, building a picture of what you are hoping to achieve.   Knowing why certain security metrics might or might not be appropriate for your organization is arguably even more important than knowing which specific metrics to choose ... but, that said, the act of gathering, contemplating, assessing and scoring possible metrics turns out to be a productive way both to determine and to fulfil the needs.  It's a deliberately pragmatic approach, a structured method that achieves a worthwhile outcome more effectively and efficiently than any other approach, as far as we know anyway.  Perhaps you know different?

11 April 2013

PRAGMATIC Security Metric of the Year, 2013

Having just discussed our fifty-second Security Metric of the Week here on the blog, it's time now to announce our top-rated example security metrics from the past year.  

<Cue drum roll>

The PRAGMATIC Security Metric of the Year, 2013, is ... "Security metametrics"

<Fanfare, riotous applause>

Here are the PRAGMATIC ratings for the winner and seven runners-up, all eight example metrics having scored greater than 80%:

Example metric P R A G M A T I C Score
Security metametrics 96 91 99 92 88 94 89 79 95 91%
Access alert message rate 87 88 94 93 93 94 97 89 79 90%
Business continuity maturity 90 95 70 80 90 85 90 87 90 86%
Asset management maturity 90 95 70 80 90 85 90 85 90 86%
Infosec compliance maturity 90 95 70 80 90 85 90 85 90 86%
Physical security maturity 90 95 70 80 90 85 90 85 90 86%
HR security maturity 90 95 70 80 90 85 90 85 90 86%
Security traceability 85 89 88 90 91 87 65 84 85 85%

Before you rush off to implement these eight metrics back at the ranch, please note that the PRAGMATIC scores were calculated in the context of an imaginary organization, ACME Enterprises Inc.  They reflect ACME's situation, and ACME management's perspectives, understanding, prejudices and measurement objectives.  They are merely worked examples, demonstrating how to apply the PRAGMATIC method in practice.  You may well already have better security metrics in place, and we know there are many other excellent security metrics - not least because there are other high-scoring examples in the book!  In short ...

Y M M V 
Your Metrics May Vary

You have no doubt noticed that five of the top eight are "maturity metrics", and if we include "security metametrics", fully six of the top eight are our own invention ... which probably reveals a bias in the way we scored and ranked the metrics.  These six are our babies and, naturally, we love them to bits, warts and all.  We are blind to their imperfections.  On the other hand, using the PRAGMATIC approach, we have elaborated in some detail on why we believe they are such strong candidates for ACME's information security measurement system.  We've shown our workings, and actively encourage you to review and reconsider these and other candidate metrics in your own contexts.  

It might be nice if we could develop and agree on a comprehensive suite of universally-applicable information security metrics, particularly as we now have a more rational approach than "Trust us, these are great security metrics!"  However, that may be just a pipe-dream since we are all so different.  Is it realistic to presume that the half-dozen information security metrics chosen by, say, a small charity would also feature among the two dozen selected by a large bank, or the four dozen imposed on a government department by some regulatory authority?  We suspect not but, having said that, we would be delighted to reach a consensus on a handful of PRAGMATIC security metrics that have proven themselves invaluable to almost everyone.

OK, that completes the first year of our cook's tour of information security metrics.  In the months ahead, we plan to continue discussing and scoring other example metrics from the book, along with various others that pop into our consciousness from time to time.  If you'd like us to consider and score your favorite information security metric, why not join the security metametrics discussion forum and tell us all about it?  Does yours score above 80%?  What makes it shine?