19 March 2016

How effective are our security policies?

On the ISO27k Forum today, someone asked us (in not so many words) how to determine or prove that the organization's information security policies are effective. Good question!

As a consultant working with lots of organizations over many years, I've noticed that the quality of their information security policies is generally indicative of the maturity and quality of their approach to information security as a whole. In metrics terms, it is a security indicator.

At one extreme, an organization with rotten policies is very unlikely to be much good at other aspects of information security - but what exactly do I mean by 'rotten policies'? I'm thinking of policies that are badly written, stuffed with acronyms, gobbledegook and pompous or overbearing pseudo-legal language, with gaping holes regarding current information risks and security controls, internal inconsistencies, out-of-date content and so on ... but there's more to it than their inherent quality, since policies per se aren't self-contained controls: they need to be used, which in practice involves a bunch of other activities.

At the other extreme, what would constitute excellent security policies? Again, it's not just a matter of how glossy they are. Here are some of the key criteria that I would say are indicative of effective policies:
  • The policies truly reflect management’s intent: management understands, supports and endorses/mandates them, and (for bonus points!) managers overtly comply with and use them personally (they walk-the-talk);
  • They also reflect current information risks and security requirements, compliance obligations, current and emerging issues etc. (cloud, BYOD, IoT and ransomware being four very topical examples);
  • They cover all relevant aspects/topics without significant gaps or overlaps (especially no stark conflicts);
  • They are widely available and read … implying also that they are well-written, professional in appearance, readable and user-friendly;
  • People refer to them frequently (including cross-references from other policies, procedures etc., ideally not just in the information risk and security realm);
  • They are an integral part of security management, operational procedures etc.;
  • They are used in and supported by a wide spectrum of information security-related training and awareness activities;
  • Policy compliance is appropriately enforced and reinforced, and is generally strong;
  • They are proactively maintained as a suite, adapting responsively as things inevitably change;
  • Users (managers, staff, specialists, auditors and other stakeholders) value and appreciate them, speak highly of them etc.
As I'm about to conduct an ISO27k gap analysis for a client, I'll shortly be turning those criteria into a maturity metric of the type shown in appendix H of PRAGMATIC Security Metrics.  The approach involves documenting a range of scoring norms for each relevant criterion, building up a table that serves as both a checklist and a measurement tool (there's a rough code sketch of such a table after the list below). Taking just the first bullet point above, for instance, I would turn it into four scoring norms roughly as follows:
  • 100% point: "The policies truly reflect management’s intent: management fully understands, supports and endorses/mandates them, managers overtly comply with and use them personally, and insist on full compliance";
  • 67% point: "Managers formally mandate the policies but there are precious few signs of their genuine support for them: they occasionally bend or flout the rules and are sometimes reluctant to enforce them";
  • 33% point: "Managers pay lip-service to the policies, sometimes perceiving them to be irrelevant and inapplicable to them personally and occasionally also their business units/departments, with compliance being essentially optional";
  • 0% point: "Managers openly disrespect and ignore the policies. They tolerate and perhaps actively encourage noncompliance with comments along the lines of 'We have a business to run!'"
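To make that concrete, here is a minimal Python sketch of how the norms might be captured so that the table doubles as a checklist and a measurement tool. The norm wordings are paraphrased from the bullets above; the names and data structure are purely illustrative assumptions of mine, not the actual PRAGMATIC template.

# Illustrative only: scoring norms for one criterion, keyed by the
# percentage anchor point; wordings paraphrased from the bullets above.
MANAGEMENT_INTENT_NORMS = {
    100: "Management fully understands, supports and mandates the policies, "
         "overtly complies with them and insists on full compliance.",
    67: "Managers formally mandate the policies but show few genuine signs "
        "of support, occasionally bending or flouting the rules.",
    33: "Managers pay lip-service to the policies; compliance is "
        "essentially optional.",
    0: "Managers openly disrespect and ignore the policies, tolerating or "
       "even encouraging noncompliance.",
}

# The full table would hold one set of norms per criterion - management
# intent, current risks, coverage, availability, usage, integration,
# awareness, enforcement, maintenance, stakeholder value and so on.
SCORING_TABLE = {
    "Management intent": MANAGEMENT_INTENT_NORMS,
    # ... further criteria ...
}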
During the gap analysis, I'll systematically gather and review relevant evidence, assessing the client against the predefined norms row by row to come up with scores based partly on my subjective assessment and partly on the objective facts before me. The row and aggregate scores will be part of my closing presentation and report to management, along with recommendations wherever the scores are patently inadequate (meaning well below 50%) or where there are obvious, cost-effective opportunities for security improvements (low-hanging fruit). What's more, I'll probably leave the client with the scoring table, enabling them to repeat the exercise at some future point, e.g. shortly before their certification audit is due and perhaps annually thereafter, hopefully demonstrating their steady progress towards maturity.
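For the aggregation step, here's an equally minimal sketch under stated assumptions: the 50% threshold comes from the paragraph above, while the unweighted mean and the helper name summarise are hypothetical choices of mine - a real assessment might well weight some criteria more heavily than others.

from statistics import mean

def summarise(row_scores, threshold=50.0):
    """Return the aggregate maturity score plus the criteria scoring below threshold."""
    aggregate = mean(row_scores.values())
    inadequate = sorted(
        (name for name, score in row_scores.items() if score < threshold),
        key=row_scores.get,  # worst first
    )
    return aggregate, inadequate

# Hypothetical row scores assigned during the gap analysis:
scores = {"Management intent": 67, "Coverage": 40, "Maintenance": 25}
aggregate, flags = summarise(scores)
print(f"Aggregate maturity: {aggregate:.0f}%")  # Aggregate maturity: 44%
print("Recommend attention to:", flags)         # ['Maintenance', 'Coverage']

Re-running the same table against fresh evidence before the certification audit, and annually thereafter, turns those one-off scores into a simple maturity trend.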

Regards,
Gary
