Monday, July 6, 2009

Podcast: Crypto-Gram 15 Feb 2007: You can be secure even though you don't feel secure, and you can feel secure even though you're not really secure.

from the Feb 15, 2007 Crypto-Gram Newsletter
by Bruce Schneier


* In Praise of Security Theater

1-in-375,000 chance of infant abduction
vs.
1-in-415 chance of infant mortality

Yet to prevent infant abduction, all the babies had RFID tags attached to their ankles by a bracelet.

Security is both a reality and a feeling. The reality of security is mathematical, based on the probability of different risks and the effectiveness of different countermeasures.

But security is also a feeling, based on individual psychological reactions to both the risks and the countermeasures. And the two things are different: You can be secure even though you don't feel secure, and you can feel secure even though you're not really secure.
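
To make the "reality" side concrete, here is a minimal back-of-the-envelope sketch in Python using the rough odds quoted above (the figures are the ones cited in these notes, not exact statistics):

```python
# Back-of-the-envelope comparison of the two risks quoted above.
# The odds are the rough figures from the notes, not exact statistics.

p_abduction = 1 / 375_000   # chance a newborn is abducted from the hospital
p_mortality = 1 / 415       # chance of infant mortality

ratio = p_mortality / p_abduction
print(f"Abduction probability: {p_abduction:.8f}")
print(f"Mortality probability: {p_mortality:.8f}")
print(f"Mortality is roughly {ratio:.0f}x more likely than abduction")

# The "reality" of security is this kind of arithmetic: the much rarer risk
# (abduction) gets the visible countermeasure (RFID bracelets) because it
# dominates the *feeling* of security, not the expected harm.
```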

The RFID bracelets are what I've come to call security theater: security primarily designed to make you *feel* more secure.

Like real security, security theater has a cost. It can cost money, time, concentration, freedoms, and so on. It can come at the cost of reducing the things we can do. Most of the time security theater is a bad trade-off, because the costs far outweigh the benefits. But there are instances when a little bit of security theater makes sense.

Too much security theater and our feeling of security becomes greater than the reality, which is also bad. But to write off security theater completely is to ignore the feeling of security.

* Real-ID: Costs and Benefits

Real ID is another lousy security trade-off. It'll cost the United States at least $11 billion, and we won't get much security in return.

* Debating Full Disclosure

Full disclosure -- the practice of making the details of security vulnerabilities public -- is a damned good idea. Public scrutiny is the only reliable way to improve security, while secrecy only makes us less secure.

Unfortunately, secrecy *sounds* like a good idea. Keeping software vulnerabilities secret, the argument goes, keeps them out of the hands of the hackers. The problem, according to this position, is less the vulnerability itself and more the information about the vulnerability.

But that assumes that hackers can't discover vulnerabilities on their own, and that software companies will spend time and money fixing secret vulnerabilities. Both of those assumptions are false.

To understand why the second assumption isn't true, you need to understand the underlying economics. To a software company, vulnerabilities are largely an externality. That is, they affect you -- the user -- much more than they affect it. A smart vendor treats vulnerabilities less as a software problem, and more as a PR problem. So if we, the user community, want software vendors to patch vulnerabilities, we need to make the PR problem more acute.

So a bunch of software companies, and some security researchers, banded together and invented "responsible disclosure." The basic idea was that the threat of publishing the vulnerability is almost as good as actually publishing it. A responsible researcher would quietly give the software vendor a head start on patching its software, before releasing the vulnerability to the public.

This was a good idea -- and these days it's normal procedure -- but one that was possible only because full disclosure was the norm. And it remains a good idea only as long as full disclosure is the threat.
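
As a rough illustration of the responsible-disclosure mechanic described above, here is a hypothetical sketch (not any researcher's or vendor's actual policy, and the 90-day window is an assumption): the researcher reports privately, and publication is triggered either by a shipped patch or by the agreed deadline passing, so the threat of full disclosure stays credible.

```python
from datetime import date, timedelta

# Hypothetical sketch of a responsible-disclosure decision, not an actual
# policy: the vendor gets a private head start, but publication happens
# either when a patch ships or when the deadline expires, keeping the
# threat of full disclosure real.

DISCLOSURE_WINDOW = timedelta(days=90)  # assumed head start for the vendor

def should_publish(reported_on: date, today: date, patch_released: bool) -> bool:
    """Publish once the vendor has patched, or once the window has lapsed."""
    deadline = reported_on + DISCLOSURE_WINDOW
    return patch_released or today >= deadline

# Example: reported in February, still unpatched in late May,
# so the deadline forces publication.
print(should_publish(date(2007, 2, 15), date(2007, 5, 20), patch_released=False))  # True
```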

Secrecy prevents people from accurately assessing their own risk. Secrecy precludes public debate about security, and inhibits security education that leads to improvements. Secrecy doesn't improve security; it stifles it.

* Sending Photos to 911 Operators

Since 1968, the 911 system has evolved smartly with the times. Calls are now automatically recorded. Callers are now automatically located by phone number or cell-phone location. Letting callers send photos to 911 operators is the next logical evolution.

* DRM in Windows Vista

Windows Vista includes an array of "features" that you don't want. These features will make your computer less reliable and less secure. They'll make your computer less stable and run slower. They will cause technical support problems. They may even require you to upgrade some of your peripheral hardware and existing software. And these features won't do anything useful. In fact, they're working against you. They're digital rights management (DRM) features built into Vista at the behest of the entertainment industry.

And you don't get to refuse them.

* A New Secure Hash Standard

The U.S. National Institute of Standards and Technology is having a competition for a new cryptographic hash function.
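
For context, a cryptographic hash function maps arbitrary input to a fixed-size digest. The sketch below uses SHA-256 from Python's standard hashlib (one of the existing NIST-standardized hashes, not the competition's outcome) just to show what such a function does:

```python
import hashlib

# Quick illustration of what a cryptographic hash function does, using
# SHA-256 from Python's standard library as an example of the kind of
# function the competition is meant to replace or supplement.

msg1 = b"Crypto-Gram, February 15, 2007"
msg2 = b"Crypto-Gram, February 15, 2007!"   # one character added

print(hashlib.sha256(msg1).hexdigest())
print(hashlib.sha256(msg2).hexdigest())
# A tiny change in the input yields a completely different fixed-length
# digest, and finding two inputs with the same digest should be infeasible.
```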

Running time: 37:37
PS: This is my cheat sheet for Bruce Schneier's podcast:
http://www.schneier.com/crypto-gram-0702.html
