Interview with Joshua Holden (Page 2)
N: What are the current areas of research and interest in the field of cryptography?
JH: It's a huge field and people are doing all sorts of things. I subscribe to a couple of email lists on cryptography research, so let me try to give you a sampling of things that have come through in the last couple of days: automatic verification of security protocols, randomized "fingerprinting" for content protection, discrete Hilbert transforms for data hiding, preventing coordinated network attacks via distributed alerts, generic complexity of one-way functions, information hiding in text, images, and audio, faster group operations in elliptic curve cryptography, group theory and pairing-based cryptosystems, ID-based encryption and signatures, RFID authentication protocols, voting schemes, attacks on the elliptic curve digital signature algorithm, "onion routing" anonymity networks.... It's all over the map. Another summary might be taken from the session titles of the last "CRYPTO" conference, which is one of the premier conferences in cryptology: Cryptanalysis, Secure Searching, Theory, Lattices, Random Oracles, Hash Functions, Quantum Cryptography, Encryption, Protocol Analysis, Public-Key Encryption, Multi-Party Computation. Still a lot of stuff.
N: There exist plenty of cryptographic technologies, both proprietary and open source. When a need arises for a cryptographic solution, what criteria do you use to make your choice?
JH: This is not my area of expertise, but I do have some general advice. First, you have to know what you are going to use it for. This seems obvious, but we all know that sometimes people are tempted to buy something that looks cool and then figure out what they are going to use it for later.... There's no perfect technology; they all have strengths and weaknesses. Second, go with the tried-and-true. Since we don't have a good way of proving the security of any really practical cryptographic technology, the best test of something new is just to put it out there and see if anyone can break it. Go with systems and companies that have been around and gotten good reviews. Beware of any company that won't tell you how their system works --- just because their internal testers couldn't break it doesn't mean that it won't be broken immediately when the larger community gets hold of it. "Security through obscurity" is generally not a good idea.
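The "go with the tried-and-true" advice can be sketched in code. The example below is my illustration, not something named in the interview: rather than inventing an authentication scheme, it uses well-reviewed primitives from Python's standard library (HMAC-SHA256 with a constant-time comparison).

```python
import hashlib
import hmac
import secrets

# Illustrative sketch only: use vetted, widely reviewed primitives
# (here, Python's standard-library hmac/hashlib) instead of rolling
# your own message-authentication scheme.
key = secrets.token_bytes(32)  # 256-bit random key

def sign(message: bytes) -> bytes:
    # HMAC-SHA256: a tried-and-true authentication construction
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    # compare_digest is a constant-time comparison, which avoids
    # leaking information through timing side channels
    return hmac.compare_digest(sign(message), tag)

tag = sign(b"wire $100 to Alice")
assert verify(b"wire $100 to Alice", tag)
assert not verify(b"wire $1000 to Alice", tag)
```

The point is the selection criterion, not this particular construction: the primitives above have survived years of public scrutiny, which is exactly the kind of track record Holden recommends looking for.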
N: What is your take on DRM?
JH: DRM is in many ways a legal issue and I'm really not qualified to say much about that. From a technical point of view, no DRM system is perfect but that's not really the point. Users, like attackers, will take the path of least resistance. If music (for instance) with DRM is easy to get, easy to use, and basically lets you do what you want, most people will use it --- that's a win-win for music companies and buyers. Apple's DRM did pretty well with that --- the only thing that was difficult for me to do with iTunes DRM music was get it on my (non-iPod) music player. I could listen to it on multiple computers, burn it to more than one CD, and so on. And it wasn't too expensive. So I used it. And I do think that artists should get paid for what they produce (I play in a rock band myself!) and I'm okay with the music companies getting a cut for what they contribute. (How much that cut should be is a different matter, but not really about DRM!) One thing I am concerned about is aspects of DRM that interfere with fair use of copyrighted material. If I don't have the ability to make a backup copy, or distribute excerpts of written material for educational use, and so on, then DRM is overstepping its proper function, and that's a problem for me. Also, I'm opposed to attempts to criminalize the software that breaks DRM, as opposed to the illegal use of that software. Someone (I'm afraid I've forgotten who) compared that to criminalizing the sale and manufacture of bolt cutters because they can be used by burglars. It's not a complete parallel, and like I said I'm not a lawyer. But my take is that criminalizing the software is an unnecessary overreaction.
N: What are your views on social engineering, which has the potential to jeopardize even the strongest cryptographic technologies? How do you emphasize this aspect as an educator?
JH: Again, this is really a question of understanding the whole system and knowing where the weakest link will be. If you are designing the system, you need to recognize that the weakest link may be your users and try to educate them about good software practices. If you are the user, you need to take some responsibility for educating yourself. Again, we as designers and educators have to remember that users will take the path of least resistance. Some things are easy --- don't give your password to someone who contacts you, as opposed to you contacting them. On the other hand, if your system has a secure way to do something and also an insecure way which is easier (or more familiar), users will do it the insecure way, even if they know better. I've done it myself on occasion. And if the security gets too cumbersome, users won't use the system at all. So you have to balance security with ease-of-use, and also eliminate loopholes when it's practical. Incidentally, from the users' point of view, software security is not that different from a lot of things in life, for instance home security, food safety, and sex. I like to tell students to "practice safe software". Don't interface with people you don't trust. When you share files with someone, you share files with everyone they've ever shared files with. It's really a question of being aware of what the risks are and following common-sense methods to minimize them.