Motivation: There has been very little investigation into the ways in which mathematically secure crypto solutions are misused by the users who deploy them. Frequently the answer is simply to educate the user further. The author argues, however, that common misuses can be prevented from the development end if the people who write crypto software avoid common pitfalls. The paper is a humorous look at the various mistakes the author has come across in his years as a crypto consultant, and at the possible solutions that could prevent those mistakes from ever happening.

Summary and discussion points:

Private Keys: In short, private keys seldom are. Many companies, in an effort to save time and money, simply use the same key on multiple computers or in multiple applications. In addition, developers often keep copies of private keys on their local machines in order to test before deployment. Many crypto implementations also allow exporting private keys in plaintext, which are then emailed, unencrypted, to those who need them. The solution from a development standpoint is simply to make it difficult or impossible to export a private key, especially in plaintext. The distinction between public and private keys should always be clear. Nothing can be done about the cost of private keys from the development end (that is clearly more an economic issue), but generating new keys can be made easier, so that people mind less about getting new ones.

Key Management: If a crypto solution requires extensive key management strategies, users often take the easy way out and simply reuse or share keys, reducing the actual security of the system. The solution is to accept that a slightly less secure strategy which requires little or no key management may in practice be much more secure, since it will at least be misused less.

Timestamps: Secure communications often require timestamps as a part of the exchange.
However, clock drift between machines can be extensive and can lead to problems in the accurate exchange of information. The solution is to use nonces or relative timestamps, since these do not drift from machine to machine.

Using the wrong solution: The example the author uses is a case where someone used RSA for large-scale secure bulk data transmission. If the data being transferred is large and time-sensitive, then using public-key cryptography will slow the process down. For streaming applications, a better way is to use RSA or a similar system to exchange a symmetric session key, and then use a fast symmetric cipher to quickly encrypt, transfer, and decrypt the necessary information. The general solution to this kind of misuse is to restrict the ways in which users can use the various functions, since in many cases they lack an understanding of the drawbacks and advantages of the various systems.

Leaving anything "as an exercise": An example of this is leaving the seed for a random number generator up to the user. In general, users will not understand why the seed is important and will take the easy way out, using something like the string "123456789abc", which certainly lacks ANY randomness. The solution is simply not to leave any hard problems to the user, who will likely have an even harder time solving them correctly.

Make errors noticeable: If an essential part of your algorithm fails, make the failure impossible to miss. Do not assume that any part is infallible, since quite likely someone will find a way to make it fail. The solution is to make failures so obvious that they will be caught even without an explicit check (e.g., on failure have the final output be "000000000", so that no data will be given away).

Restrict the power of users: Don't leave too much flexibility of use to users in general. (This point is similar to "Using the wrong solution".)
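The hybrid approach described under "Using the wrong solution" can be sketched as follows. This is an illustrative stand-in, not production crypto: the toy SHA-256 keystream stands in for a real symmetric cipher such as AES, and the RSA key-wrapping step is only a placeholder.

```python
import os
import hashlib

def keystream_cipher(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode, XORed with the data.
    # Stand-in for a real symmetric cipher such as AES -- illustrative
    # only, NOT for production use. XOR makes it its own inverse.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

def hybrid_encrypt(data: bytes):
    # 1. Generate a fresh random session key (cheap and fast).
    session_key = os.urandom(32)
    # 2. In a real system, only this small session key -- not the bulk
    #    data -- would be encrypted under the recipient's RSA public key.
    #    The wrapping step is a placeholder here.
    wrapped_key = session_key  # placeholder for the RSA-wrapped key
    # 3. Encrypt the bulk data with the fast symmetric cipher.
    ciphertext = keystream_cipher(session_key, data)
    return wrapped_key, ciphertext

msg = b"large, time-sensitive bulk data " * 1000
wrapped, ct = hybrid_encrypt(msg)
# The recipient unwraps the session key with RSA, then decrypts quickly.
assert keystream_cipher(wrapped, ct) == msg
```

The expensive public-key operation touches only 32 bytes regardless of message size, which is the whole point of the hybrid design.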
Usually the average user lacks the understanding necessary to make the correct decisions. Instead, keep calls as general as possible: provide a call that merely says "transmit encrypted data" rather than letting users choose the encryption method explicitly. This point is controversial, since many developers prefer to give a product as much functionality as possible, but in this case leaving functionality out could improve the security of the final product.

Pros:
- Jokes and stories: they make the paper easier to read and give readers a feel for real problems.
- Highlights a problem that has not been examined enough.
- Focuses on designers, since most existing work focuses on educating users, and that has clearly been insufficient.
- Brings HCI to security: sometimes the tradeoff of security vs. usability is worth it, if users would otherwise not use the product correctly and would damage overall actual security anyway.

Cons:
- Jokes: not everyone liked them.
- The paper is somewhat haphazard in its treatment of topics.
- Contributes little or nothing new as research to CS.
- Assumes that the general public is stupid.
- Does not present a comprehensive solution, just various points to keep in mind.
- Suggests limiting functionality, when some believe the answer is still better education of users, and that users should be responsible as adults for their own mistakes, not limited like children.
- It is impossible to really make an "idiot-proof" system, so will small measures like those this paper suggests really make enough of a difference?
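As a concrete instance of the "as an exercise" pitfall discussed above: the fix is for the library to draw its entropy from the operating system itself, so the user never sees (and so cannot botch) the seeding step. A minimal sketch, assuming Python's standard library; the function name generate_key is hypothetical.

```python
import secrets

# Bad: a seed chosen by the user -- predictable and from a tiny keyspace.
user_seed = "123456789abc"

def generate_key(nbytes: int = 32) -> bytes:
    # Good: secrets draws from the OS cryptographic RNG, so the hard
    # problem of seeding is never left to the user at all.
    return secrets.token_bytes(nbytes)

key1 = generate_key()
key2 = generate_key()
assert key1 != key2       # fresh entropy on every call
assert len(key1) == 32
```

The design point is that the safe behavior is the only behavior the API offers: there is no seed parameter for the user to fill in badly.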