Marlena is an extremely strong security analyst who combines deep technical knowledge with an exceptional ability to communicate her findings and recommendations.
——Peter Lieberwirth, President of Identity Associates
The other day another company got hacked, and the other day my sweetheart made fun of me for wanting to buy not just any old cordless phone but one that used spread spectrum technology to minimize eavesdropping.
What do these things have to do with each other? More than one might think. As a security professional, it strikes me that the problem in security today isn't one of technology but of attitude. This is not a new idea. The difficulty in the industry, however, has been how to get people to change how they think. How do you evoke a frame of mind that encourages "good" evaluation of security issues?
I'm going to suggest a concept — embodied in an easy-to-remember phrase — that can help. But first, back to the discussion about the cordless phone.
"No one is going to listen in," my honey said vigorously. "The range on this thing is only eighty feet."
I told him that I thought the range was a lot longer than that. Besides, eighty feet goes a long way on the densely packed street where we live.
None of this mattered.
"I can't believe you are considering paying nearly twice as much money, when the risk is so low," he said. "I really think your judgment is off on this one."
"O-Rings," I said. "Remember NASA's O-Rings?"
He waved his hand as if shooing away a mosquito. "They knew that was high probability."
***
The thing about most calamitous security break-ins, and disasters like the Challenger, is that they are always "high probability" in hindsight. After all, a disaster wouldn't have happened if it were really low probability. Or would it?
Evaluating consequences is a key part of securing your enterprise. My security catch-phrase is aimed at getting people to remember to do just this. Though it comes from operations research, it very much applies to both corporate life and personal life. It is just the two words, "Minimize Regret".
Minimizing regret simply entails thinking about how you'd feel (and what loss your company would suffer!) if a "low-risk" event actually happened, and then comparing how you'd feel if you took no action and disaster struck with how you'd feel if you spent time, money, and effort protecting yourself and it didn't.
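For readers who like to see the operations-research roots of the phrase, the comparison above can be sketched as a small "minimax regret" calculation. All the dollar figures below are hypothetical, invented purely for illustration:

```python
# A minimal sketch of the "minimize regret" idea from decision theory.
# All costs are hypothetical and for illustration only.

# costs[action][state]: total cost of each action under each future state.
costs = {
    "cheap_phone":  {"no_eavesdrop": 50, "eavesdrop": 100_050},  # phone + breach fallout
    "secure_phone": {"no_eavesdrop": 90, "eavesdrop": 90},       # pricier, but safe either way
}
states = ["no_eavesdrop", "eavesdrop"]

# Regret = how much worse an action turns out than the best action for that state.
best_per_state = {s: min(costs[a][s] for a in costs) for s in states}
regret = {a: {s: costs[a][s] - best_per_state[s] for s in states} for a in costs}

# Minimax regret: choose the action whose worst-case regret is smallest.
worst_regret = {a: max(regret[a].values()) for a in costs}
choice = min(worst_regret, key=worst_regret.get)

print(worst_regret)  # {'cheap_phone': 99960, 'secure_phone': 40}
print(choice)        # secure_phone
```

The point of the exercise isn't the particular numbers; it's that the calculation forces you to write down what the "unlikely" outcome would actually cost you.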
In my case of the cordless phone, I often work at home, speaking to remote colleagues about company-confidential matters. Even though the probability of an eavesdropper might be low, an "incident" would be bad for my company -- and disastrous for my career. After all, I'm a security professional and am supposed to know better. I am certain that I'd be incredibly regretful if an incident occurred because I failed to spend the extra money to get the more secure phone.
As for the O-Rings, we all know what happened.
***
The only problem with "minimizing regret" is oddly a consequence of success: if the actions you take to forestall a low-probability event work well, the event won't happen, and you may be faced with pressure and even ridicule for having "wasted" resources.
Stand firm and just show your colleagues the next news article about an Internet break-in. You won't have long to wait.
In a Financial Times article in 2017, a Microsoft executive "emphasized that upgrading to modern software is the only way for customers to stay safe."
This is apparently a paraphrase. If the Microsoft executive actually said it, well, I beg to differ.
My take:
a) Sure, if the "upgrade" is a patch, it's a very good idea and will likely make you safer. But if the upgrade is new software, that's another matter. New software (e.g., a new OS) is much more likely to have security holes than software that has been beaten on for years (and been patched). Think twice before "upgrading."
b) The "new thing" is always "bleeding edge," not "leading edge," when it comes to security. (There can be exceptions, but they involve laborious secure design and development processes that the Microsofts, Apples, etc., of the world don't follow, and that smaller vendors definitely don't.)
c) Even more importantly, there are many factors involved in being safe, such as your electronic access controls, your physical access controls, and how you train employees.
d) Even if there are vulnerabilities in software, they can often be mitigated (i.e., made not to matter) by appropriate deployment controls (e.g., closing ports).
e) "Security" involves looking at the actual deployment scenario: what else is on the machine hosting the software in question; what services the software connects to; who has access to the machine locally and also remotely.
f) Looking at security reductionistically as opposed to holistically (e.g., "we're at the latest patch, therefore we're safe") is a good way to wind up with some really bad days for your business.
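Points (d) and (e) above boil down to looking at the machine as it is actually deployed. As one small, hedged example of that habit: a sketch that checks which of a few well-known TCP ports are accepting connections on the local host. The port list here is illustrative, not exhaustive, and a real deployment review would go far beyond this:

```python
import socket

# Illustrative list of well-known TCP ports; a real review would be broader.
PORTS_TO_CHECK = {22: "ssh", 23: "telnet", 80: "http", 3389: "rdp"}

def is_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connect to (host, port) succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port, name in PORTS_TO_CHECK.items():
        if is_open("127.0.0.1", port):
            print(f"port {port} ({name}) is listening -- is that intentional?")
```

Every port this turns up is a question to answer, not a verdict: who needs that service, and does it need to be reachable at all?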