In Michael Brunton-Spall's weekly cyber security newsletter, which can be found here, he highlighted a recent Twitter thread by Perry Metzger that garnered a great deal of attention. The thread can be found here, or there's a nicely formatted display of the entire text here.

I recommend you read the thread, but Metzger's statements that "Computer security is warfare. No, really, it's war." and "Security _is_ adversarial. In warfare, you don't survive if you're second rate, you die" sum up his central point: that cyber security is a technical discipline, focused on defending against dedicated and capable foes, where the main determinant of victory, if not the sole one, is whether you are more skilful than your opponent. There's also a strong theme against the value of certifications, though without offering any viable alternative or better methodology for rating ability.

In reply to Metzger's thoughts, Brunton-Spall used his blog to highlight how many security issues are caused by "accident, misconfiguration or error", emphasising that "we aren’t at war with advanced 'cyber-adversaries' whatever that means, but that we are at war with ourselves, with the software that we build", and looking to safety engineering for solutions.

I think they're both wrong.

And right.

Metzger is right that some situations within cyber security require an adversarial mindset, and he makes an excellent point that this aspect of cyber security is often missed. However, this is lost amid his comments about technical excellence being the dominant, if not the sole, reason for success. Similarly, Metzger imagines a world where the enemy consists only of "High quality paid professionals": there's no concept of levels of risk here, no nuance regarding the different types and ranges of ability required for the many different positions within the industry or within criminal organisations, and no appreciation for all those roles where security is just part of the overall job description, system administrators for example.

And as I've argued in my recent set of presentations, the current approach to incident response, based on containment and recovery, misses many of the opportunities present when you're up against an active and sentient adversary, and the effect you can have on them. So the adversarial point appeals to me, but it's lost in everything else.

My "military" experience has all been gained through Xbox Live, and I don't get to wargame as often as I'd like, but I've been fortunate enough to have access to experienced war-fighters and those working in Operational Research, and to have been through simulations of conflict at various levels of abstraction. The ability to win conflicts is not based on a linear scale comparing your technical adeptness against your opponent's; that is an idea of warfare based on Dragon Ball Z, not any kind of real-world conflict. The relevance of communication, of logistics, of morale, and of an overall strategy based on forethought and feedback is crucial to highlight if you're going to bring any kind of military thinking into the industry at all. In my own presentations I argue that we are over-simplifying the requirements of the situation by emphasising individual ability, an approach that suits a game like golf but certainly doesn't suit our complex area of operation, where the ability to work within and against teams is much more relevant.

Brunton-Spall similarly makes a great point: so many cyber security breaches are the result of failure on the part of the defenders, due to incorrect processes or processes not being followed correctly, in such a way that the nature and skill level of the attackers is almost irrelevant. As Brunton-Spall states, he'll be focusing this year on "how we can build systems that are resistant to misconfiguration or accident". While it's not the complete solution, I agree that there is much more the industry can do here, for example looking at a Checklist Manifesto-style approach to executing administrative tasks, inspired by improvements in the world of medicine.

I think examining these analogies and references to other fields is useful, both for what they illuminate and for where the analogy breaks because it has been stretched too far. Some of cyber security is conflict; some of the methods cyber security needs are well established within safety engineering. But simply advocating one approach or the other is too simplistic for our inordinately complex field, and while the appeal to a safety engineering mindset is far more useful than the "OORAH!" of a particularly distorted view of armed conflict, I think some combination of the two is the way forward.

As for the best analogy for computer security as a whole, I think Wendy Nather summarises the situation well, as she often does, in calling crime a "fine analogy". Crime is a vague concept covering a wide scale of events, from petty fraud to mass murder. We should appreciate that cyber security is similarly broad, and learn useful lessons from others where we can, but aggressively toss those lessons aside when they don't apply.