There’s been some constructive feedback "out there" on my points about SSL. Practically speaking, it doesn’t matter whether SSL is protecting us or not, since auditors and regulators typically expect it. Therefore, it doesn’t really matter what we (I) think; it’s here to stay anyway.
I do think it is worth the threat modeling exercise, however, because it is still not clear to me that SSL provides added protection. The question at hand is, if SSL were removed would it increase the number of attacks/exploits against a website? (In the completely roundabout way we security folks sometimes work, one commenter "revealed" in his risk assessment that if you don’t encrypt the traffic it is a problem because the traffic is then in the clear).
It is not clear (but not encrypted either) to me that attacks would increase, for two reasons: 1) attackers are likely already as successful as they want to be – their attack appetite is satisfied; and, correspondingly, 2) it is a target-rich environment.
It does seem likely that at least some attackers would consider sniffing traffic, if only to make it harder to detect them (this might be an even bigger reason than any I’ve seen). In many cases, however, I think sniffing creates more of a problem for the attacker. It may be easy, for example, to set up shop at a Starbucks or Holiday Inn Express, but you just don’t know what you are going to get. It’s one thing to be able to randomly sniff traffic and a whole other thing to actually sniff useful, semi-targeted stuff. This assumes that successfully exploiting and maintaining a presence on a device in the cloud is much more difficult than the garden-variety sniff/phish/SQL injection. (Phishing just provides the opportunity with so much more scalability and automation that its randomness is okay.)
My point about the use of SSL by the bad guys is not a very significant one – though without "legitimate" SSL traffic it might be easier to block the bad stuff. In the end, if you can block it all, then it doesn’t matter. Obviously, there are other ways around the good-guy sniffing problem, and consumers aren’t even faced with it.
One of the biggest problems with SSL is that it makes us complacent. Consider this REST vs. SOAP post from Gunnar Peterson along with the corresponding discussion (among a handful of folks). With XML and Web Services, message-based architectures allow for more intelligence and more persistent protection. Overall, there are better solutions out there that incorporate this type of protection.
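To make the "persistent protection" idea concrete, here is a minimal sketch in Python using the third-party cryptography package; the payload and key handling are hypothetical placeholders, and this illustrates the general idea of message-level protection rather than anything specific from Gunnar’s post. The point is that the body is protected before it ever touches the transport, so the protection survives intermediaries, queues, and storage instead of ending when the SSL session does.

```python
# A minimal sketch of message-level protection (as opposed to transport-only SSL),
# using the third-party "cryptography" package. The payload and key handling here
# are hypothetical placeholders for illustration.
from cryptography.fernet import Fernet

# In a real message-based architecture the key would be provisioned out of band
# (e.g., a per-partner key); here we just generate one for the demo.
key = Fernet.generate_key()
f = Fernet(key)

# The message body is encrypted and integrity-protected before it ever hits the
# wire, so the protection persists across intermediaries, queues, and storage,
# long after any SSL connection has been torn down.
plaintext = b"<order><card>4111111111111111</card></order>"
token = f.encrypt(plaintext)

# Only a holder of the key can recover the body, wherever the message travels.
assert f.decrypt(token) == plaintext
```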
At best, SSL is a weak control, yet we are stuck with it. In some respects this is fascinating, given the way other security mechanisms are evaluated – any evasion or bypass and they are deemed unworthy. Relatively speaking, SSL is stable as a control (I believe), and yet the ultimate implication is that it is effectively "bypassed" by all of the other attack methods (go have fun with that one).
“The question at hand is, if SSL were removed would it increase the number of attacks/exploits against a website?”
i suspect it would decrease the number of attacks against a website…
why do people attack websites? what are attackers trying to get? if they’re trying to get confidential information that people enter into websites, then when it suddenly becomes possible to snatch that information right out of the network during transit, there won’t be much need to attack the website anymore…
I can’t believe I’m agreeing with Pete Lindstrom, but here I am.
SSL, as it is implemented on the web, is next to useless.
Don’t throw the baby out with the bathwater, though. SSL, as implemented on corporate wireless networks via EAP-TLS, is good security, authenticating both endpoints to each other as well as users.
And I don’t know how “stuck” we are for those connections where we can influence both endpoints. SSL is easy and ubiquitous on the web, which is part of why it sucks there.
For point-to-point encryption where you can influence both endpoints, I think anything is on the table, even from a compliance perspective.
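For what it’s worth, here is a small sketch of the mutual-authentication idea above, using Python’s standard ssl module for a point-to-point link where you control both endpoints. The certificate file names and port are placeholders, and this is plain TLS with client certificates rather than EAP-TLS itself; it just demonstrates the same both-sides-authenticate property.

```python
# Sketch of mutually authenticated TLS for a point-to-point connection where
# both endpoints are under your control. File names and the port are placeholders.
import socket
import ssl

# Server side: present our own certificate AND require the client to present a
# certificate signed by a CA we trust, so both endpoints authenticate each other.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain(certfile="server.crt", keyfile="server.key")
ctx.load_verify_locations(cafile="clients-ca.crt")
ctx.verify_mode = ssl.CERT_REQUIRED  # reject clients that lack a valid cert

with socket.create_server(("0.0.0.0", 8443)) as sock:
    with ctx.wrap_socket(sock, server_side=True) as tls_sock:
        conn, addr = tls_sock.accept()          # handshake verifies the client
        print("client cert:", conn.getpeercert())  # verified client identity
        conn.close()
```

Contrast this with SSL on the public web, where only the server is authenticated and any browser can connect; that asymmetry is a large part of why the control is so weak there.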
What’s Your Risk Style, Part II
So if we don’t have the quality of data to use an objectivist approach to probability, that leaves two alternatives:
Donn Parker’s no risk approach – where we don’t acknowledge probability, frequency, or risk at all, or
A …