ESSAYS


A Wake-Up Call on Risks and Consequences

The other day, another company got hacked. Also the other day, my spouse made fun of me for wanting to buy not just any old cordless phone, but one that used spread-spectrum technology to minimize eavesdropping.

What do these things have to do with each other? More than one might think. As a security professional, it strikes me that the problem in security today isn't one of technology but of attitude. This is not a new idea. The difficulty in the industry, however, has been how to get people to change how they think. How do you evoke a frame of mind that would encourage "good" evaluation of security issues?

I'm going to suggest a concept — embodied in an easy-to-remember phrase — that can help. But first, back to the discussion about the cordless phone.

"No one is going to listen in," my honey said vigorously. "The range on this thing is only eighty feet."

I told him that I thought the range was a lot longer than that. Besides, eighty feet goes a long way on the densely packed street where we live.

None of this mattered.

"I can't believe you are considering paying nearly twice as much money, when the risk is so low," he said. "I really think your judgment is off on this one."

"O-Rings," I said. "Remember NASA's O-Rings?"

He waved his hand as if shooing away a mosquito. "They knew that was high probability."

* * *

The thing about most calamitous security break-ins, and disasters like the Challenger, is that they are always "high probability" in hindsight. I mean, a disaster wouldn't have happened if it were low probability, right? Or would it?

Evaluating consequences is a key part of securing your enterprise. My security catch-phrase is aimed at getting people to remember to do just that. Though it comes from operations research, it applies just as much to corporate life and personal life. It is just two words: "Minimize Regret."

Minimizing regret entails simply thinking about how you'd feel (and what loss your company would suffer!) if a "low-risk" event actually happened. Then compare how you would feel if you took no action and a disaster did happen with how you'd feel if you spent time, money, and effort protecting yourself and nothing happened.
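To make this concrete, here is a minimal sketch, in Python, of the operations-research version of the rule (minimax regret) applied to my phone decision. Every dollar figure here is invented purely for illustration:

    # Minimax regret, sketched with hypothetical numbers.
    # Total cost (in dollars) of each action under each state of the world:
    # the phone's price, plus the cost of an eavesdropping incident if one occurs.
    costs = {
        "cheap phone":  {"quiet street": 50, "eavesdropper": 100_050},
        "secure phone": {"quiet street": 90, "eavesdropper": 90},
    }
    states = ["quiet street", "eavesdropper"]

    # Regret of an action in a state = its cost minus the best (lowest)
    # cost achievable in that state.
    best = {s: min(costs[a][s] for a in costs) for s in states}
    worst_regret = {a: max(costs[a][s] - best[s] for s in states) for a in costs}

    # Minimize regret: pick the action with the smallest worst-case regret.
    print(worst_regret)                             # {'cheap phone': 99960, 'secure phone': 40}
    print(min(worst_regret, key=worst_regret.get))  # secure phone

The exact numbers hardly matter: once the cost of the "unlikely" outcome dwarfs the price difference between the phones, the regret calculation points at the more secure choice.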

In my case of the cordless phone, I often work at home, speaking to remote colleagues about company-confidential matters. Even though the probability of an eavesdropper might be low, an "incident" would be bad for my company, and disastrous for my career. After all, I'm a security professional and am supposed to know better. I am certain that I'd be incredibly regretful if an incident occurred because I failed to spend the extra money to get the more secure phone.

As for the O-Rings, we all know what happened.

* * *

The only problem with "minimizing regret" is, oddly, a consequence of success: if the actions you take to forestall a low-probability event work well, the event won't happen, and you may be faced with pressure and even ridicule for having "wasted" resources.

Stand firm and just show your colleagues the next news article about an Internet break-in. You won't have long to wait.

Thoughts on Software Upgrades

In a Financial Times article in 2017, a Microsoft executive "emphasized that upgrading to modern software is the only way for customers to stay safe."

This is apparently a paraphrase. If the Microsoft executive actually said this, well, it's a lie.

My take:

1a) Patching security holes in your software is a way to be safer, not "safe."

1b) There are a lot of other factors in "security," such as your electronic access controls, your physical access controls, and how you train employees (with respect to phishing, for example). So the claim that "upgrading to modern software" is "the only way to stay safe" is highly incorrect. Even when software has vulnerabilities, they can often be mitigated (i.e., made not to matter) by appropriate deployment controls, such as closing ports (a short sketch at the end of these notes shows one way to verify that a port really is closed). "Security" means looking at the actual deployment scenario: what else is on the machine hosting the software in question; what services the software connects to; who has access to the machines, locally and remotely; and so on. Looking at things reductionistically rather than holistically is a good way to get "pwned."

2a) New software (e.g., a new OS) is "untried" and is more likely to have security holes than software that has been beaten on for years (and patched).

2b) The "new thing" is always "bleeding edge," not "leading edge," when it comes to security. (There could be exceptions, but they involve laborious secure design and development processes that the Microsofts and Apples of the world don't follow, and that smaller vendors definitely don't.)

2c) I personally try never to install the first version of a new OS or application. It's likely to be broken in one way or another. I'm not too keen on the second version either.
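
Finally, the sketch promised in point 1b above. Deployment mitigations only help if they actually hold, and you can check them. Here is a minimal sketch, in Python, of verifying from the outside that the only ports answering on a host are the ones you meant to expose; the address and port list are hypothetical placeholders:

    import socket

    def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # 192.0.2.10 is a reserved documentation address; substitute your own host.
    for port in (22, 80, 443, 3389):
        state = "OPEN" if port_is_open("192.0.2.10", port) else "closed"
        print(f"port {port}: {state}")

Running something like this from outside your own network is a cheap way to catch a "closed" port that quietly got opened.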