Friday, June 26, 2009

Podcast: Crypto-Gram 15 June 2006: iPod Sneakiness

from the Jun 15, 2006 Crypto-Gram Newsletter
by Bruce Schneier

* The Value of Privacy

If you aren't doing anything wrong, what do you have to hide?

Some clever answers:
- If I'm not doing anything wrong, then you have no cause to watch me.
- Because the government gets to define what's wrong, and they keep changing the definition.
- Because you might do something wrong with my information.

Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect.

* Hacking Computers Over USB

If an attacker can convince you to plug his USB device into your computer, he can take it over.

iPod Sneakiness: you can innocently ask someone at an Internet cafe if you can plug your iPod into his computer to power it up -- and then steal his passwords and critical files.





length: 21:51m
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0606.html


Podcast: Crypto-Gram 15 May 2006

from the May 15, 2006 Crypto-Gram Newsletter
by Bruce Schneier

* Who Owns Your Computer?

When technology serves its owners, it is liberating. When it is designed to serve others, over the owner's objection, it is oppressive.

You own your computer, of course. You bought it. You paid for it. But how much control do you really have over what happens on your machine? Technically you might have bought the hardware and software, but you have less control over what it's doing behind the scenes.

It used to be that only malicious hackers were trying to own your computers.
Estimates are that somewhere between hundreds of thousands and millions of computers are members of remotely controlled "bot" networks. Owned.

Now, things are not so simple. There are media companies that want to control what you can do with the music and videos they sell you. There are companies that use software as a conduit to collect marketing information, deliver advertising or do whatever it is their real owners require. And there are software companies that are trying to make money by pleasing not only their customers, but other companies they ally themselves with. All these companies want to own your computer.

* Identity-Theft Disclosure Laws

California was the first state to pass a law requiring companies that keep personal data to disclose when that data is lost or stolen.

Identity theft is the fastest-growing area of crime. It's badly named -- your identity is the one thing that cannot be stolen -- and is better thought of as fraud by impersonation. A criminal collects enough personal information about you to be able to impersonate you to banks, credit card companies, brokerage houses, etc. Posing as you, he steals your money, or takes a destructive joyride on your good credit.

* Microsoft's BitLocker

BitLocker Drive Encryption is a new security feature in Windows Vista, designed to work with the Trusted Platform Module (TPM). It encrypts the C drive with a computer-generated key. In its basic mode, an attacker can still access the data on the drive by guessing the user's password, but would not be able to get at the drive by booting the disk up using another operating system, or removing the drive and attaching it to another computer.

There is a recovery key: optional but strongly encouraged. It is automatically generated by BitLocker, and it can be sent to an administrator or printed out and stored in a secure location. An administrator can use group policy settings to mandate this key.

Encryption particulars: The default data encryption algorithm is AES-128-CBC with an additional diffuser. The diffuser is designed to protect against ciphertext-manipulation attacks, and is independently keyed from AES-CBC so that it cannot damage the security you get from AES-CBC. Administrators can select the disk encryption algorithm through group policy. Choices are 128-bit AES-CBC plus the diffuser, 256-bit AES-CBC plus the diffuser, 128-bit AES-CBC, and 256-bit AES-CBC. (My advice: stick with the default.) The key management system uses 256-bit keys wherever possible. The only place where a 128-bit key limit is hard-coded is the recovery key, which is 48 digits (including checksums). It's shorter because it has to be typed in manually; typing in 96 digits will piss off a lot of people -- even if it is only for data recovery.
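
A minimal sketch of AES-CBC encryption in Python, using the third-party "cryptography" package. This only illustrates the default cipher mode mentioned above; it does not include BitLocker's diffuser or its key management, and the key, IV, and padding here are made up for the example.

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)   # a random 128-bit data encryption key (illustrative only)
iv = os.urandom(16)    # initialization vector, one per encrypted unit
plaintext = b"disk sector contents".ljust(32, b"\x00")  # pad to the 16-byte block size

cipher = Cipher(algorithms.AES(key), modes.CBC(iv))
enc = cipher.encryptor()
ciphertext = enc.update(plaintext) + enc.finalize()

dec = cipher.decryptor()
assert dec.update(ciphertext) + dec.finalize() == plaintext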


length: 29:33m
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0605.html


UPDATE 10+1 things to do before I die

0. BASE jump from Parekupa Meru
1. Wing suit
2. Wing suit
3. Wing suit
5. Free fly
6. Swimming/diving with Rhincodon typus, Mola mola, and Manta birostris, and under a school of Sphyrnidae
7. Inca Trail to Machu Picchu
8. See GWAR & Jane's Addiction live in concert
9. 18,000-foot freefall
10. Peeing on Everest (don't eat yellow snow!!!)


Podcast: Crypto-Gram 15 Apr 2006: Security through Begging

from the Apr 15, 2006 Crypto-Gram Newsletter
by Bruce Schneier

* Airport Passenger Screening

It seems like every time someone tests airport security, airport security fails. In tests between November 2001 and February 2002, screeners missed 70 percent of knives, 30 percent of guns, and 60 percent of (fake) bombs. And recently, testers were able to smuggle bomb-making parts through airport security in 21 of 21 attempts. It makes you wonder why we're all putting our laptops in a separate bin and taking off our shoes.

Airport screeners have a difficult job, primarily because the human brain isn't naturally adapted to the task. We're wired for visual pattern matching, and are great at picking out something we know to look for, but we're much less adept at detecting random exceptions in uniform data.

* VOIP Encryption

There are basically 4 ways to eavesdrop on a telephone call:
1) listen in on another phone extension.
2) attach some eavesdropping equipment to the wire with a pair of alligator clips.
3) eavesdrop at the telephone switch.
4) tap the main trunk lines, eavesdrop on the microwave or satellite phone links, etc.

That's basically the entire threat model for traditional phone calls.

Phone calls from your computer are fundamentally different from phone calls from your telephone. Internet telephony's threat model is much closer to the threat model for IP-networked computers than the threat model for telephony.

This is why encryption for VOIP is so important. VOIP calls are vulnerable to a variety of threats that traditional telephone calls are not.

Encryption for IP telephony is important, but it's not a panacea. Basically, it takes care of threats No. 2 through No. 4, but not threat No. 1. Unfortunately, that's the biggest threat: eavesdropping at the end points.

* Security through Begging

Surprising news came out that Japanese nuclear secrets leaked: a contractor was allowed to connect his personal, virus-infested computer to the network at a nuclear power plant. The contractor had a file-sharing app on his laptop as well, and suddenly nuclear secrets were available to plenty of kids just trying to download the latest hit single. It's only taken about nine months for the government to come up with its suggestion on how to prevent future leaks of this nature: begging all Japanese citizens not to use file-sharing systems.

* KittenAuth
CAPTCHAs: those distorted pictures of letters and numbers you sometimes see on web forms. The goal is to authenticate that there's a person sitting in front of the computer.

The idea is that it's hard for computers to identify the characters, but easy for people to do.

KittenAuth works with images. The system shows you nine pictures of cute little animals, and the person authenticates himself by clicking on the three kittens. A computer clicking at random has only a 1 in 84 chance of guessing correctly.
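
The 1-in-84 figure is just the number of ways to pick 3 images out of 9. A quick check, assuming the attacker clicks three distinct images uniformly at random:

from math import comb

guesses = comb(9, 3)      # ways to choose 3 of the 9 pictures
print(guesses)            # 84
print(1 / guesses)        # ~0.0119, i.e. about a 1.2% success rate per attempt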

* New Kind of Door Lock

We know a lot about the vulnerabilities of conventional locks, but we know very little about the security of this system. But don't confuse this lack of knowledge with increased security.


length: 24:21m
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0604.html


Podcast: Crypto-Gram 15 Mar 2006: Wholesale surveillance: it's not "follow that car," it's "follow every car."

from the Mar 15, 2006 Crypto-Gram Newsletter
by Bruce Schneier

* The Future of Privacy

Wholesale surveillance is a whole new world. It's not "follow that car," it's "follow every car." The National Security Agency can eavesdrop on every phone call, looking for patterns of communication or keywords that might indicate a conversation between terrorists.

More and more, we leave a trail of electronic footprints as we go through our daily lives.

Information about us has value. It has value to the police, but it also has value to corporations.

In the dot-com bust, the customer database was often the only salable asset a company had. Companies like Experian and Acxiom are in the business of buying and reselling this sort of data, and their customers are both corporate and government.

* Face Recognition Comes to Bars

The data will be owned by the bars that collect it. They can choose to erase it, or they can choose to sell it to data aggregators like Acxiom.

It's rarely the initial application that's the problem. It's the follow-on applications. It's the function creep. Before you know it, everyone will know that they are identified the moment they walk into a commercial building. We will all lose privacy, and liberty, and freedom as a result.

* Security, Economics, and Lost Conference Badges

Conference badges are an interesting security token. They can be very valuable.

A few years ago, the RSA Conference charged people $100 for a replacement badge, which is far cheaper than a second membership. So the fraud remained.

This year, the RSA Conference solved the problem through economics: "If you lose your badge and/or badge holder, you will be required to purchase a new one for a fee of $1,895.00."

Instead of trying to solve this particular badge fraud problem through security, they simply moved the problem from the conference to the attendee. The badges still have that $1,895 value, but now if it's stolen and used by someone else, it's the attendee who's out the money. As far as the RSA Conference is concerned, the security risk is an externality.

* Data Mining for Terrorists

The basic idea was as audacious as it was repellent: suck up as much data as possible about everyone, sift through it with massive computers, and investigate patterns that might indicate terrorist plots. Americans across the political spectrum denounced the program, and in September 2003, Congress eliminated its funding and closed its offices.

But TIA didn't die. According to "The National Journal," it just changed its name and moved inside the Defense Department.

* Airport Security Failure

At LaGuardia, a man successfully walked through the metal detector, but screeners wanted to check his shoes. He didn't wait, and disappeared into the crowd.

The entire Delta Airlines terminal had to be evacuated, and between 2,500 and 3,000 people had to be rescreened.

Aside from the obvious security failure -- how did this person manage to disappear into the crowd? -- it's painfully obvious that the overall security system did not fail well. Well-designed security systems fail gracefully, without affecting the entire airport terminal.

* Police Department Privilege Escalation

In the computer security world, privilege escalation means using some legitimately granted authority to secure extra authority that was not intended. This is a real-world counterpart. Even though transit police departments are meant to police their vehicles only, the title -- and the ostensible authority that comes along with it -- is useful elsewhere. Someone with criminal intent could easily use this authority to evade scrutiny or commit fraud.

The real problem is that we're too deferential to police power. We don't know the limits of police authority, whether it be an airport policeman or someone with a business card from the "San Gabriel Valley Transit Authority Police Department."

* Credit Card Companies and Agenda

A guy tears up a credit card application, tapes it back together, fills it out with someone else's address and a different phone number, and sends it in. He still gets a credit card.

* Proof that Employees Don't Care About Security

Employees care about security; they just don't understand it. Computer and network security is complicated and confusing, and unless you're technologically inclined, you're just not going to have an intuitive feel for what's appropriate and what's a security risk. Even worse, technology changes quickly, and any security intuition an employee has is likely to be out of date within a short time.


length: 28:49m
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0603.html


Thursday, June 25, 2009

Podcast: Crypto-Gram 15 February 2006: Valentine's Day -- if you have a spouse & a lover, it's a serious problem.

from the Feb 15, 2006 Crypto-Gram Newsletter
by Bruce Schneier

* Risks of Losing Portable Devices

It's now amazingly easy to lose an enormous amount of information.

There are two solutions that make sense:
1) to protect the data. Hard-disk encryption programs like PGP Disk allow you to encrypt individual files, folders, or entire disk partitions (a rough sketch of the idea follows this list).
2) to remotely delete the data if the device is lost.
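
A rough sketch of option 1, using a passphrase-derived key with the Python "cryptography" package. PGP Disk itself works differently; the function name, iteration count, and passphrase below are illustrative assumptions, not its actual design.

import os
import base64
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_passphrase(passphrase: bytes, salt: bytes) -> bytes:
    # Derive a 256-bit key from a passphrase; parameters are illustrative.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=200_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))

salt = os.urandom(16)
f = Fernet(key_from_passphrase(b"correct horse battery staple", salt))
token = f.encrypt(b"contents of a sensitive file")
print(f.decrypt(token))   # original bytes come back; a wrong passphrase raises InvalidToken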

* Multi-Use ID Cards

Truth is, neither a national ID nor a biometric system will ever replace the decks of plastic and paper that crowd our wallets. Because:
1) the uniqueness of the cards provides important security to the issuers.
2) reliability.

But security and reliability are only secondary concerns. If it made smart business sense for companies to piggyback on existing cards, they would find a way around the security concerns. The reason they don't boils down to one word: branding.

* Valentine's Day Security

Valentine's Day is the biggest single 24-hour period for florists, a huge event for greeting-card companies and a boon for candy makers. But it's also a major crisis day for anyone who is having an affair. After all, Valentine's Day is the one holiday when everyone is expected to do something romantic for their spouse or lover - and if someone has both, it's a serious problem.

* Identity Theft in the UK

Serious tax credit fraud in the UK: there is a tax-credit system that allows taxpayers to get a refund for some of their taxes if they meet certain criteria.

Unfortunately, the only details necessary when applying were the applicant's National Insurance number (the UK version of the Social Security number) and mother's maiden name. The refund was then paid directly into any bank account specified on the application form. Anyone who knows anything about security can guess what happened. Estimates are that fifteen million pounds has been stolen by criminal syndicates.


length: 26:39m
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0602.html


Podcast: Crypto-Gram 15 January 2006: The security of pseudo-anonymity inherently depends on how trusted that "trusted third party" is.

from the Jan 15, 2006 Crypto-Gram Newsletter
by Bruce Schneier

* Anonymity and Accountability

Anonymous systems are inherently easier to abuse and harder to secure.

The problem isn't anonymity; it's accountability. If someone isn't accountable, then knowing his name doesn't help.

History is filled with bandits and pirates who amass reputations without anyone knowing their real names.

eBay's feedback system doesn't work because there's a traceable identity behind that anonymous nickname. eBay's feedback system works because each anonymous nickname comes with a record of previous transactions attached, and if someone cheats someone else then everybody knows it.

Historically, accountability has been tied to identity, but there's no reason why it has to be so.

pseudo-anonymity: you hand your identity to a trusted third party that promises to respect your anonymity to a limited degree.

The security of pseudo-anonymity inherently depends on how trusted that "trusted third party" is.

* Cell Phone Companies and Security

There seems to be some evidence that the telco decides whether or not to shut off a suspicious phone after a fraud has been detected based on the customer's ability to pay.

The telco should not be able to charge its customers for telephone calls they did not make. If a customer's phone is cloned, there is no possible way he or she could notify the telco before seeing the fraudulent calls on the bill.

The customer is also completely powerless to affect the telco's anti-cloning measures. Making the customer liable for the fraud ensures that the problem never gets fixed.

* Dutch Botnet

Dutch police arrested three people who created a large botnet and used it to extort money from U.S. companies. Authorities said that the botnet consisted of about 100,000 computers. The actual number was 1.5 million computers.

* Internet Explorer Sucks

The researchers tracked three browsers (MSIE, Firefox, Opera) in 2004 and counted which days they were "known unsafe".

Their definition of "known unsafe": a remotely exploitable security vulnerability had been publicly announced and no patch was yet available.

MSIE was 98% unsafe. There were only 7 days in 2004 without an unpatched publicly disclosed security hole.

Firefox was 15% unsafe. There were 56 days with an unpatched publicly disclosed security hole. 30 of those days were a Mac hole that only affected Mac users. Windows Firefox was 7% unsafe.

Opera was 17% unsafe: 65 days. That number is accidentally a little better than it should be, as two of the unpatched periods happened to overlap.

This underestimates the risk, because it doesn't count vulnerabilities known to the bad guys but not publicly disclosed.
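
The percentages are just unsafe days divided by the number of days in 2004 (a leap year). A quick re-derivation of the figures quoted above:

days_in_2004 = 366
unsafe_days = {"MSIE": 366 - 7, "Firefox": 56, "Opera": 65}
for browser, days in unsafe_days.items():
    print(f"{browser}: {days} unsafe days -> {days / days_in_2004:.1%}")
# MSIE ~98.1%, Firefox ~15.3%, Opera ~17.8% -- close to the rounded figures above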


length: 31:29m
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0601.html


Podcast: Crypto-Gram 15 December 2005: Better to combat terrorism through intelligence.

from the Dec 15, 2005 Crypto-Gram Newsletter
by Bruce Schneier

* Airplane Security

CAPPS will create two different access paths into the airport: high-security and low-security. The intent is to let only good guys take the low-security path and to force bad guys to take the high-security path, but it rarely works out that way. You have to assume that the bad guys will find a way to exploit the low-security path.

Better to combat terrorism through intelligence!

* Australian Minister's Sensible Comments on Airline Security Spark Outcry

Immigration Minister Amanda Vanstone:
"a lot of what we do is to make people feel better as opposed to actually achieve an outcome"

* Sky Marshal Shooting in Miami

1) any time you have an officer making split-second life and death decisions, you're going to have mistakes.
2) I'm not convinced the sky marshals' threat model matches reality.

* Sony's DRM Rootkit: The Real Story

On Oct. 31, Mark Russinovich broke the story in his blog: Sony BMG Music Entertainment distributed a copy-protection scheme with music CDs that secretly installed a rootkit on computers. This software tool is run without your knowledge or consent - if it's loaded on your computer with a CD, a hacker can gain and maintain access to your system and you wouldn't know it.

Sony offered a "fix" that didn't remove the rootkit, just the cloaking.

* CME in Practice

CME is "Common Malware Enumeration," and it's an initiative by US-CERT to give all worms, viruses, and such uniform names.
The problem is that different security vendors use different names for the same thing.

* OpenDocument Format and the Commonwealth of Massachusetts

OpenDocument Format (ODF) is an alternative to Microsoft's document, spreadsheet, and other Office file formats.

Microsoft, with its proprietary Office document format, is spreading rumors that ODF is somehow less secure.

This, from the company that allows Office documents to embed arbitrary Visual Basic programs?

But at least ODF has a clean and open XML format, which allows layered security and the ability to remove scripts as needed. This is much more difficult in the binary Microsoft formats that effectively hide embedded programs.

* Surveillance and Oversight

September 2005, Rotterdam. The police had already identified some of the 250 suspects in a soccer riot from the previous April, but most of the rioters captured on video remained unidentified. In an effort to help, they sent text messages to 17,000 phones known to be in the vicinity of the riots, asking that anyone with information contact the police. The result was more evidence, and more arrests.

* Truckers Watching the Highways

Features I like in security systems: it's dynamic, it's distributed, it relies on trained people paying attention, and it's not focused on a specific threat.

* Twofish Cryptanalysis Rumors

Twofish isn't even remotely broken.

* Totally Secure Classical Communications?

The idea is to secure a communications link, like a phone or computer line, with a pair of resistors -- by adding electronic noise, or by using the natural thermal noise of the resistors.


length: 48:23m
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0512.html


Wednesday, June 24, 2009

Podcast: Crypto-Gram 15 Nov 2005: Because users, not software manufacturers, pay the price, nothing improves.

from the Nov 15, 2005 Crypto-Gram Newsletter
by Bruce Schneier

* The Security of RFID Passports

RFID chips are passive, and broadcast information to any reader that queries the chip: the new passports would reveal your identity without your consent or even your knowledge. Thieves could collect the personal data of people as they walk down a street, criminals could scan passports looking for Westerners to kidnap or rob and terrorists could rig bombs to explode only when four Americans are nearby. The police could use the chips to conduct surveillance on an individual; stores could use the technology to identify customers without their knowledge.

The RFID industry envisions these chips embedded everywhere.

RFID chips can still be uniquely identified by their radio behavior. Specifically, these chips have a unique identification number used for collision avoidance. It is buried deep within the chip, and has nothing to do with the data or application on the chip.

* Liabilities and Software Vulnerabilities

It's the software manufacturers that should be held liable, not the individual programmers. Getting this one right will result in more-secure software for everyone; getting it wrong will simply result in a lot of messy lawsuits.

In a capitalist society, businesses are profit-making ventures, and they make decisions based on both short- and long-term profitability. They try to balance the costs of more-secure software - extra developers, fewer features, longer time to market - against the costs of insecure software: expense to patch, occasional bad press, potential loss of sales.

The end result is that insecure software is common. But because users, not software manufacturers, pay the price, nothing improves. Making software manufacturers liable fixes this externality.

If end users can sue software manufacturers for product defects, then the cost of those defects to the software manufacturers rises. Manufacturers are now paying the true economic cost for poor software, and not just a piece of it. So when they're balancing the cost of making their software secure versus the cost of leaving their software insecure, there are more costs on the latter side. This will provide an incentive for them to make their software more secure.

* Preventing Identity Theft: The Living and the Dead

According to Metacharge, the fastest-growing form of identity theft is not phishing; it is taking the identities of dead people and using them to get credit.

* Banks and Two-Factor Authentication

Two-factor authentication won't stop phishing, because the attackers will simply modify their techniques to get around it.

Nordea bank uses a paper-based single-use password security system.
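
A rough sketch of how a printed sheet of single-use passwords could work; this is an assumption for illustration, not Nordea's actual system. The bank prints random codes, stores only their hashes, and burns each code after its first successful use.

import hashlib
import secrets

def issue_password_sheet(n=50):
    # Generate n single-use codes to print and mail to the customer.
    codes = [secrets.token_hex(4) for _ in range(n)]
    # The bank stores only hashes, each flagged as not-yet-used.
    stored = {hashlib.sha256(c.encode()).hexdigest(): False for c in codes}
    return codes, stored

def check_code(stored, attempt):
    # Accept a code only once; mark it used afterwards.
    h = hashlib.sha256(attempt.encode()).hexdigest()
    if h in stored and not stored[h]:
        stored[h] = True
        return True
    return False

codes, stored = issue_password_sheet()
print(check_code(stored, codes[0]))   # True  -- first use succeeds
print(check_code(stored, codes[0]))   # False -- replay is rejected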

* Sony Secretly Installs Rootkit on Computers

Sony lies about its rootkit: it claims the update removes the cloaking technology component, and that the component is not malicious and does not compromise security.

It does not remove the rootkit.

* The Zotob Worm

Internet epidemics are much like severe weather: they happen randomly, they affect some segments of the population more than others, and your previous preparation determines how effective your defense is.

Zotob was the first major worm outbreak since MyDoom in January 2004. It happened quickly - less than five days after Microsoft published a critical security bulletin

It wasn't much of a big deal, but it got a lot of play in the press because it hit several major news outlets, most notably CNN.


length: 24:19m
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0511.html


Podcast: Crypto-Gram 15 Oct 2005: Security works best when the entity that is in the best position to mitigate the risk is responsible for that risk.

from the October 15, 2005 Crypto-Gram Newsletter
by Bruce Schneier

* Phishing

Financial companies have until now avoided taking on phishers in a serious way, because it's cheaper and simpler to pay the costs of fraud.

Financial institutions make it too easy for a criminal to commit fraudulent transactions, and too difficult for the victims to clear their names.

Security works best when the entity that is in the best position to mitigate the risk is responsible for that risk. Making financial institutions responsible for losses due to phishing and identity theft is the only way to deal with the problem. And not just the direct financial losses -- they need to make it less painful to resolve identity theft issues, enabling people to truly clear their names and credit histories.

* DUI Cases Thrown Out Due to Closed-Source Breathalyzer

People have a right to examine the evidence against them, and to contest the validity of that evidence.

* Jamming Aircraft Navigation Near Nuclear Power Plants

This certainly could help if terrorists want to fly an airplane into a nuclear power plant, but it feels like a movie-plot threat.

* Secure Flight Working Group Report

The TSA was unable to answer issues regarding:
- Minimizing false positives and dealing with them when they occur.
- Misuse of information in the system.
- Inappropriate or illegal access by persons with and without permissions.
- Preventing use of the system and information processed through it for purposes other than airline passenger screening.

* The Doghouse: CryptIt

Most file encryptors use methods that rely on the theory of computational security; that is, the difficulty of key factorisation prevents decryption of the file. But this method may not work forever.

CryptIt is designed to use conventional XOR encryption with keys that are the same size as the file to be encrypted.
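
A minimal sketch of that kind of XOR scheme. The security rests entirely on the key material: it is only a one-time pad if the key is truly random, as long as the file, and never reused. This is a generic illustration, not CryptIt's actual code.

import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR data with a key of the same length; encryption and decryption are identical.
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"launch codes"
key = os.urandom(len(message))      # must be truly random and used exactly once
ciphertext = xor_bytes(message, key)
assert xor_bytes(ciphertext, key) == message
# If the same key encrypts two files, XORing the two ciphertexts cancels the key
# and leaks the XOR of the plaintexts -- the classic failure mode of "XOR encryption".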

* Tax Breaks for Good Security

Congress is talking -- it's just talking, but at least it's talking -- about giving tax breaks to companies with good cybersecurity.


* Judge Roberts, Privacy, and the Future

Advances in genetic mapping continue, and someday it will be easy, cheap, and detailed -- and will be able to be performed without the subject's knowledge. What privacy protections do people have for their genetic map, given that they leave copies of their genome in every dead skin cell they leave behind? What protections do people have against government actions based on this data? Against private actions?

time: 28:05
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0510.html


Podcast: Crypto-Gram 15 Sept 2005: Athletes have to evade any tests that exist today, but they also have to think about how they could evade any tests that might be invented in the future.

Crypto-Gram 15 Sept 2005
from the September 15, 2005 Crypto-Gram Newsletter
by Bruce Schneier


* Movie-Plot Threats

Security is most effective when it doesn't make arbitrary assumptions about the next terrorist act. We need to spend more money on intelligence and investigation: identifying the terrorists themselves, cutting off their funding, and stopping them regardless of what their plans are. We need to spend more money on emergency response: lessening the impact of a terrorist attack.

The problem is that we all got caught up in "movie-plot threats": specific attack scenarios that capture the imagination and then the dollars.

* Katrina and Security

Large-scale terrorist attacks and natural disasters differ in cause, but they're very similar in aftermath.

Money spent on intelligence-gathering makes us safer, regardless of what the next disaster is. Against terrorism, that includes the NSA and the CIA. Against natural disasters, that includes the National Weather Service and the National Earthquake Information Center.

* The Keys to the Sydney Subway

Global secrets are poor security. Two problems:
1. You cannot apply any granularity.
2. They fail badly; if the secret gets out, then the bad guys have a pretty powerful secret.

* New Cryptanalytic Results Against SHA-1

The time complexity of the new attack is 2^63; the previous result was 2^69; brute force is 2^80.
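
To put the exponents in perspective, the new attack is 2^6 = 64 times faster than the previous one, and 2^17 = 131,072 times faster than brute force:

print(2**69 // 2**63)   # 64 -- speedup over the previous attack
print(2**80 // 2**63)   # 131072 -- speedup over brute force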

* Zotob

Microsoft plug-and-play vulnerability

* Airline Security, Trade-offs, and Agenda

All security decisions are trade-offs, and smart security trade-offs are ones where the security you get is worth what you have to give up.
There are differences between perceived risk and actual risk, differences between perceived security and actual security, and differences between perceived cost and actual cost.

* Cameras in the New York City Subways

New York City is spending $212 million on surveillance technology: 1,000 video cameras and 3,000 motion sensors for the city's subways, bridges, and tunnels.

* Lance Armstrong Accused of Doping

The ability of a security mechanism to go back in time is interesting, and similar to police exhuming dead bodies for new forensic analysis, or a new cryptographic technique permitting decades-old encrypted messages to be read.

It also has some serious ramifications for athletes considering using banned substances. Not only do they have to evade any tests that exist today, but they have to at least think about how they could evade any tests that might be invented in the future.

* Peggy Noonan and Movie-Plot Terrorist Threats

This game of "let's imagine" really does stir up emotions, but it's not the way to plan national security policy.

* Trusted Computing Best Practices

The basic idea is that you build a computer from the ground up securely, with a core hardware "root of trust" called a Trusted Platform Module (TPM). Applications can run securely on the computer, can communicate with other applications and their owners securely, and can be sure that no untrusted applications have access to their data or code.



time: 25:54
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0509.html


Podcast: Crypto-Gram 15 Aug 2005: People are the strongest point of the security process.

Crypto-Gram 15 Aug 2005
from the August 15, 2005 Crypto-Gram Newsletter
by Bruce Schneier


* Profiling

Good security has people in charge. People are resilient, people can improvise, people are creative, people can develop on-the-spot solutions, people can detect attackers who cheat and can attempt to maintain security despite the cheating, and people can detect passive failures. People are the strongest point of the security process.

When a security system succeeds in the face of a new or coordinated or devastating attack, it's usually due to the efforts of people.

To profile is to generalize. It's taking characteristics of a population and applying them to an individual. People naturally have an intuition about other people based on different characteristics. Sometimes that intuition is right and sometimes it's wrong.

1. Whenever you design a security system with two ways through - an easy way and a hard way - you invite the attacker to take the easy way.

2. If we are going to increase security against terrorism, the young Arab males living in our country are precisely the people we want on our side.

3. Despite what many people think, terrorism is not confined to young Arab males.


* Cisco and ISS Harass Security Researcher

Full disclosure is good for society. But because it helps the bad guys as well as the good guys, many of us have championed "responsible disclosure" guidelines that give vendors a head start in fixing vulnerabilities before they're announced.

One can foresee a class-action suit against Cisco.

* E-Mail Interception Decision Reversed

The entertainment industry has worked to greatly expand copyright law in cyberspace. They argued that every time a copyrighted work is moved from computer to computer, or CD-ROM to RAM, or server to client, or disk drive to video card, a "copy" is being made...

* Stealing Imaginary Things
Every form of theft and fraud in the real world will eventually be duplicated in cyberspace.

* Turning Cell Phones off in Tunnels

This is meant to prevent a cell phone from being used to trigger a bomb, but a bomb can be triggered even with a kitchen timer. Communication availability is far more important.

* Searching Bags in Subways

Counterterrorism is most effective when it doesn't make arbitrary assumptions about the terrorists.

* Plagiarism and Academia: Personal Experience

Schneier is surprised...if they were going to do this, wouldn't it have been smarter to pick a more obscure author?

* RFID Passport Security Revisited

The new design:

1. The data on the chip is encrypted, and the key is printed on the passport. The officer swipes the passport through an optical reader to get the key, and then the RFID reader uses the key to communicate with the RFID chip. The passport-holder thus controls who has access to the information on the chip (a rough sketch of the idea follows below).

2. A thin radio shield in the cover, protecting the chip when the passport is closed.
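
A rough conceptual sketch of point 1: a reader derives a key from text it has optically scanned off the printed data page, and only with that key can it read what the chip stores. This is not the actual ICAO Basic Access Control protocol; the key derivation, the sample MRZ line, and the use of Fernet are illustrative assumptions.

import base64
import hashlib
from cryptography.fernet import Fernet

def key_from_printed_data(mrz_line: str) -> bytes:
    # Derive a symmetric key from the machine-readable text printed in the passport.
    digest = hashlib.sha256(mrz_line.encode()).digest()
    return base64.urlsafe_b64encode(digest)

printed = "P<UTOERIKSSON<<ANNA<MARIA<<<<<<<<<<<<<<<<<<<"   # sample MRZ-style line
chip_data = Fernet(key_from_printed_data(printed)).encrypt(b"name, photo, passport number")

# A reader that has seen the printed page can decrypt; a drive-by RFID reader
# that never optically read the passport cannot.
print(Fernet(key_from_printed_data(printed)).decrypt(chip_data))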

* Risks of Losing Portable Devices

password protection and encryption

* How to Not Fix the ID Problem

more paperwork in order to get an ID

* Secure Flight

The TSA did not fully disclose to the public its use of personal information, as required by the Privacy Act. The TSA used personal information drawn from commercial sources to test aspects of the Secure Flight program.

It's better to change the Privacy Act statement before violating the old one. Changing it after the fact just looks bad.

* Shoot-to-Kill

The most common type of bomb carried by a person has been the hand grenade.

When a shoot-to-kill policy is known to be in effect, suicide bombers will use the same kind of dead-man's trigger on their bombs: a detonator that is activated when a button is released, rather than when it is pushed. This is a difficult one. Whatever policy you choose, the terrorists will adapt to make that policy the wrong one.


* Visa and Amex Drop CardSystems

The biggest problem with CardSystems' actions wasn't that it had bad computer security practices, but that it had bad business practices.

time: 49:45
PS: this is my cheat sheet of Bruce Schneier's Podcast:
http://www.schneier.com/crypto-gram-0508.html
