I've just been perusing another vendor-sponsored survey report - specifically the 2016 Cybersecurity Confidence Report from Barkly, a security software company.
As is typical of marketing collateral, the 12-page report is strong on graphics but short on hard data. In particular, there is no equivalent of the 'materials and methods' section of a scientific paper, so we don't know how the survey was conducted. They claim to have surveyed 350 IT pros, for instance, but don't say how those respondents were selected. Were they customers and sales prospects, I wonder? Visitors to the Barkly stand at a trade show, perhaps? Random respondents keen to pick up a freebie of some sort for answering a few inane questions? An online poll, maybe?
The survey questions are equally vague. Under the heading "What did we ask them", the report lists:
- Biggest concerns [presumably in relation to cybersecurity, whatever that means];
- Confidence in current solutions, metrics, and employees [which appears to mean confidence in current cybersecurity products, in the return on investment for those products, and in (other?) employees. 'Confidence' is a highly subjective measure. Confidence in comparison to what? On what scale?];
- Number of breaches suffered in 2015 [was 'breach' defined? A third of respondents declined to answer this, and it's unclear why they were even asked];
- Time spent on security [presumably sheer guesswork here];
- Top priorities [in relation to cybersecurity, I guess];
- Biggest downsides to security solutions [aside from the name! The report notes four options here: slows down the system, too expensive, too many updates, or requires too much headcount to manage. There are many more possibilities, but we don't know whether respondents were given free rein, offered a "something else" option, or required to select from or rank (at least?) the four options provided by Barkly - conceivably chosen because they play to the strengths of Barkly's products, judging by the strapline at the end: "At Barkly, we believe security shouldn’t be difficult to use or understand. That’s why we’re building strong endpoint protection that’s fast, affordable, and easy to use"].
Regarding confidence, the report states:
"The majority of the respondents we surveyed struggle to determine the direct effect solutions have on their organization’s security posture, and how that effect translates into measurable return on investment (ROI). The fact that a third of respondents did not have the ability to tell whether their company had been breached in the past year suggests the lack of visibility isn’t confined to ROI. Many companies still don’t have proper insight into what’s happening in their organization from a security perspective. Therefore, they can’t be sure whether the solutions they’re paying for are working or not."
While I'm unsure how they reached that conclusion from the survey, it is an interesting perspective and, of course, a significant challenge for any company trying to sell 'security solutions'. I suspect they might have got better answers from execs and managers than from lower-level IT pros, since the former typically need to justify budgets, investments and other expenditure, while the latter have little say in the matter. The report doesn't say so, however.
Elsewhere the report does attempt to contrast responses from IT pros (two-thirds of respondents, about 230 people) with responses from IT executives and managers (the remaining one-third, about 120) using the awkwardly arranged graphic above. The associated text states:
"When our survey results came in, we quickly noticed a striking difference in attitudes among IT professionals in non-management positions and their counterparts in executive roles. These two groups responded differently to nearly every question we asked, from time spent on security to the most problematic effect of a data breach. Stepping back and looking at the survey as a whole, one particular theme emerged: When it comes to security, executives are much more confident than their IT teams."
Really? Execs are "much more confident"? There is perhaps a small difference between the two sets of bars, but would you call it 'much' or 'striking'? Is it statistically significant, and at what confidence level? Again we're left guessing.
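To illustrate what answering that question would take: with the actual counts in hand, a standard two-proportion z-test would tell us whether a gap like this could plausibly be sampling noise. The numbers below are entirely hypothetical (the report doesn't publish them) - suppose 60% of the ~120 execs and 50% of the ~230 IT pros gave a "confident" response:

```python
from math import sqrt, erf

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions.

    x1/n1 and x2/n2 are the 'confident' counts and group sizes.
    Returns the z statistic and the two-sided p-value.
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value via the standard normal CDF, Phi(z) = (1 + erf(z/sqrt(2)))/2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical split: 72 of 120 execs (60%) vs 115 of 230 IT pros (50%)
z, p = two_proportion_z_test(72, 120, 115, 230)
print(round(z, 2), round(p, 3))  # → 1.78 0.075
```

On these made-up numbers a visually obvious 10-point gap would not even clear the conventional 5% significance threshold - which is exactly why the report's failure to publish the raw counts matters.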
Conclusion
What do you make of the report? Personally, I'm too cynical to take much from it. It leaves far too much unsaid, and what it does say is questionable. Nevertheless, I would not be surprised to see the information being quoted or used out of context - and so the misinformation game continues.
On a more positive note, the survey has provided us with another case study and further examples of what-not-to-do.