Teaching Usable Privacy and Security: A guide for instructors

Lorrie Cranor, Jason Hong, and Michael Reiter

Last updated 6 January 2007



This guide provides suggested curricula, readings, assignments, and other materials for courses on usable privacy and security (UPS). While this guide is geared towards a full-semester interdisciplinary graduate course, we also offer suggestions for using these materials in undergraduate courses and tutorials, and as modules in other courses that will devote only a small amount of time to usable privacy and security.

Motivation for usable privacy and security courses

There is growing recognition that technology alone will not provide all of the solutions to security and privacy problems. Human factors play a significant role in these areas, and it is important for security, privacy, and trustworthy computing experts to have an understanding of how people will interact with the systems they develop. Likewise, human-computer interaction experts who work in the secure systems area need some understanding of computer security and privacy. In 2003 the Computing Research Association challenged the information security and assurance research community to "give end-users security controls they can understand and privacy they can control for the dynamic, pervasive computing environments of the future." Meeting this challenge will require an interdisciplinary approach involving researchers educated in usability and human factors as well as trustworthy computing.

Types of courses

Usable privacy and security is an interdisciplinary topic, and thus lends itself to a variety of different types of courses. The curriculum will vary depending on the amount of time available and the background of the participants. It is important to decide up front whether the course will have prerequisites, and if not, how much background will be taught.

Background of participants

A usable privacy and security course for students who have a background in computer security but not in human-computer interaction should include substantial instruction on human-computer interaction methods. A mini course in human-computer interaction methods may need to be integrated into this course, especially if students are going to be conducting user studies as part of course assignments. A review of computer security topics most relevant to the class should probably be included, but instructors can assume students are coming to the course with a general understanding of computer security concepts. Since most computer security courses have only limited instruction on privacy, if there is going to be a privacy component to the course, some instruction on privacy will also be needed. This kind of course should include some case studies and research papers on the design and evaluation of usable privacy and security, but there probably will not be a lot of time for discussing research papers.

A usable privacy and security course for students who have a background in human-computer interaction but not in computer security should introduce students to a variety of computer security concepts. The course should discuss threat analysis and introduce students to concepts such as least privilege. It should introduce students to commonly-used security tools and components of those tools, for example symmetric and public-key cryptography, password authentication, and firewalls. This course should also introduce students to privacy issues. Only a review of human-computer interaction methods will be needed. This kind of course should include some case studies and research papers on the design and evaluation of usable privacy and security, but there probably will not be a lot of time for discussing research papers.

A usable privacy and security course offered to students with a variety of backgrounds should provide an overview of computer security, privacy, and human-computer interaction methods. Teaching such a course can be challenging because material that will be brand new for some students will be well known to others. On the other hand, such a course presents an opportunity for having students teach each other, especially if most of the students are graduate students. When we taught this course at CMU in 2006, we spent only a small amount of time teaching basic concepts of security, privacy, and human-computer interaction, and instead delved right into case studies, research papers, and projects. We divided the students into interdisciplinary teams and relied on the students to teach each other.

A usable privacy and security course that requires prerequisite courses in both computer security and human-computer interaction can begin with only a brief review of these areas and then concentrate on the intersection of these areas. Such a course would likely focus mostly on case studies and research papers, and would lend itself to a seminar style course.

A usable privacy and security course for undergraduate students, regardless of background, will likely include more lectures and associated homework assignments on basic concepts than a course for graduate students. A graduate course, especially one for PhD students, can be taught in more of a seminar style with readings from research papers and student presentations. A course for masters students might be taught like a PhD course if the masters students are in a research-oriented program, or more like an undergraduate course if the masters students expect to focus on practical skills they will be able to apply immediately in their careers.

Length of course

There is ample material on usable privacy and security to warrant a full semester course. A course of this length provides sufficient time for students to complete a major project. If a shorter course is desired, most of the material from a full-semester course could be covered in a half-semester or "mini" course without including a project. A project course could be offered as a second mini course for interested students.

Lectures on usable privacy and security could be included as modules in other courses. A computer security course might include one or a few lectures on human-computer interaction, while a human-computer interaction course might include one or a few lectures on computer security and privacy. In both cases, the use of well-chosen case studies can help students understand the relevance of the material to the rest of the course. Besides teaching key concepts in such a module, a goal should be to give students an appreciation of what is involved in usable security design and evaluation as well as the benefits of interdisciplinary collaboration.

Tutorials and short professional development courses on usable privacy and security can be approached in a similar manner as UPS modules in other courses. The length of the tutorial and background of the audience will influence the types of concepts and skills that are taught.


We would like to thank Microsoft Research for funding the development of this instructor's guide. We would also like to thank the students who took our course and provided us with lecture notes, activity ideas, and feedback. Finally, we would like to thank Simson Garfinkel and Charles Frank for reviewing a draft of this guide and providing valuable feedback.

Example Courses

Carnegie Mellon University: 5-899/17-500 Usable Privacy and Security

Instructors: Lorrie Cranor, Jason Hong, and Michael Reiter
Course web site | Syllabus

We developed a full semester course on usable privacy and security for the Carnegie Mellon School of Computer Science beginning in the Spring of 2006. This course was designed to introduce students to a variety of usability and user interface problems related to privacy and security and to give them experience in designing and conducting studies aimed at helping to evaluate usability issues in security and privacy systems. Topics covered included: human-computer interaction methods for design and evaluation, secure interaction design, trust and semantic attacks, design for privacy, making privacy visible, web browser privacy and security, authentication and alternatives to text passwords, PKIs and usability, usable secure communications systems, and usable tools for security administration. Students analyzed usability issues related to a number of deployed security and privacy systems. In addition, students conducted group projects involving usability analyses and pilot user studies.

The course was designed to be suitable both for students interested in trustworthy computing who would like to learn more about usability, and for students interested in usability who would like to learn more about trustworthy computing. The course was open to all Carnegie Mellon graduate students with sufficient technical background. Juniors and seniors were given the opportunity to enroll with permission of one of the instructors. This course satisfied elective requirements for Carnegie Mellon computer science students at all levels, as well as for some students in other Carnegie Mellon colleges. We specifically targeted the course for students in the following degree programs: Masters in Human-Computer Interaction; Master of Science in Information Security Policy and Management; Master of Science in Information Security Technology and Management; Master of Software Engineering; PhD in Computation, Organizations and Society; PhD in Human-Computer Interaction; and PhD in Computer Science.

We ended up with 15 students enrolled in the course when we offered it in Spring 2006. These included one undergraduate (a senior in electrical and computer engineering), one non-degree student, seven masters students, and six PhD students. All of the targeted degree programs were represented except software engineering.

The course was developed and taught by an interdisciplinary team of three Carnegie Mellon faculty members. All three faculty members have experience and expertise in the usable privacy and security area. In addition, each conducts research with a focus on a different area: Lorrie Cranor focuses on privacy, Mike Reiter focuses on security, and Jason Hong focuses on human-computer interaction. Having three faculty members from three different disciplines involved was a great way to teach an interdisciplinary course; however, a course like this could be taught by a single faculty member. In that case, it would be especially helpful to include some guest lecturers.

Our approach in teaching the course was to first provide students with a background in the methodologies used in the relevant fields, and then to expose them to current research in this area as well as examples of secure system design with varying levels of usability. Introductory lectures were given by each of the three faculty members at the beginning of the semester and at the beginning of each of the three major sections of the course (usability, privacy, and security). Students were assigned to prepare lectures for most of the remaining lectures, including a set of discussion questions and a class "activity." The student lectures were consistently well done and served as learning experiences for both the students who gave them and the rest of the class. We were especially pleased with some of the activities the students came up with. We have included some of these activities in the Course Materials section below.

Students were given a substantial reading assignment each week that included chapters from the course text and research papers. Students were required to submit a short summary (3-8 sentences) and a "highlight" for each chapter or article in the reading assignment. The highlight could be something the student found particularly interesting or noteworthy, a question for class discussion, a point of disagreement, etc. We required students to bring their summaries and highlights with them to class in order to receive credit for doing them (in this way these assignments doubled as a way of taking attendance and penalizing students for missing class). The summaries and highlights were graded for completeness. Students were permitted to drop their lowest two grades.

During the fourth week of the semester students were asked to propose group project topics and pitch them to their classmates during one of the class meetings (we also offered some suggested topics). Students were then instructed to assemble themselves into project groups of 3 or 4 students with diverse backgrounds (we provided a set of rules to ensure that students in each degree program would be spread across the groups). The students worked in their groups for the rest of the semester on their projects. They gave progress report presentations during the 11th week of the semester and final presentations during the final exam period. Students were asked to pick a project that would involve a pilot user study to evaluate the design of an existing or proposed privacy- or security-related system or gain insight into users' attitudes or mental models related to some aspect of security or privacy. Although students were permitted to conduct their pilot study on their classmates, all four groups decided to go through the IRB approval process and recruit outside volunteers for their studies so that they could produce publishable research results. All four groups ended up presenting their results as posters at the 2006 Symposium On Usable Privacy and Security (SOUPS). The students told us that they got a lot out of doing the project. The interdisciplinary teams worked very well. We saw the HCI students teaching the security students how to conduct user studies, and we saw the security students teaching the HCI students about security topics. If we had not had such a good mix of students in the class the faculty would have had to spend more time with the project groups or devoted more lecture time to the details of how to conduct user studies and on security topics related to the projects.

Because this was a new course, we conducted three evaluations of the course throughout the semester, in addition to our university's standard faculty course evaluations. The feedback was extremely favorable on each of the four evaluations. Our approach to providing only a small number of introductory lectures seemed to be successful. Students reported that the introductory lectures in the areas outside their primary area of expertise were very informative, while the lectures in their area of expertise were less informative, but still interesting. The students really liked the guest lecture we had from a PhD student who talked about a published paper he had written in the usable privacy and security area. Besides talking about his results, he gave the students a look behind the scenes at how he conducted his study. Most students said they liked the seminar style approach to the course. However, some students indicated that they would have preferred to have more faculty lectures and fewer student lectures. In future versions of this course we will likely include additional guest lectures and perhaps some additional faculty lectures as well.

Harvard University: CS279r Topics in User Interfaces: Security and Privacy Usability

Instructors: Rachna Dhamija and Stuart Shieber
Course web site | Syllabus

This course covered content very similar to the CMU course. It included many of the same reading assignments as the CMU course and a similar term project. However, the instructors took a somewhat different approach to covering the course material, which is well-suited for a course that will have students who mostly do not have backgrounds in HCI. This course included take-home and in-class assignments to guide students through the interface design and usability study that is part of their semester-long group project. Students were assigned to perform a task analysis, develop design sketches, create and evaluate a low-fidelity prototype, create and evaluate a high-fidelity prototype, conduct cognitive walkthroughs or heuristic evaluations, conduct pilot usability tests, and finally conduct formal usability tests. Each week one step in this process was partially completed in class and students provided suggestions to the other groups.

Similar to the CMU course, this course included weekly reading assignments, for which students were expected to write short summaries and commentaries. For this class the commentaries were submitted to an online discussion board so that students could read each others' comments and add further comments.

Some additional topics covered in this course that were not explicitly covered in the CMU course include human interactive proofs, mobile and ubiquitous computing, and digital rights management.

This course had about a dozen students in it when it was taught in Spring 2006.

Harvard University Extension School: CSCI E-170: Security, Privacy and Usability

Instructor: Simson Garfinkel
Course web site | Syllabus

This course placed more emphasis on teaching security and privacy concepts and less emphasis on HCI methods than either the CMU or Harvard courses. It was similar to a traditional security course with 50% devoted to security, 25% devoted to privacy, and 25% devoted to usability. It included more traditional computer security readings and assignments, in addition to readings from the Security and Usability book. Students were required to conduct a literature review for a mid-term project and a pilot study for a final project. Projects were conducted in small groups.

This course was offered both on campus and remotely (via real-time video feed) in Fall 2005. About 30 students took it on campus and 30 students took it remotely. The course was also offered in Fall 2004 at the Harvard Extension School and in Summer 2004 at Northeastern University.

Course Materials

We have broken down our course materials into 10 segments, similar to the approach we took when teaching our course at CMU. We spent between one and four lectures on each of these segments (for a total of 24 lectures). These segments can be covered in almost any order, although it would make sense to include the overview and introductory segments near the beginning of a course.

For each segment we provide a brief overview of the topics covered, suggested readings, discussion questions, activities, and pointers to slides and lecture notes. The discussion questions and activities might be used in class or as part of a homework assignment. In our class most of the activities were done in class with the students divided into small groups. The students were split into different groups for each class meeting. Some of the activities we have included here would be better done as homework assignments, ideally with a follow-up class discussion. Most of the slide sets and lecture notes have been provided by our students. We provide them unedited, as examples of ways to present material relevant to each segment. Many of the discussion questions and activities were also suggested by our students.

Our course used the book Security and Usability: Designing Secure Systems that People Can Use, edited by Lorrie Cranor and Simson Garfinkel. We have recommended readings from this book in our list of suggested readings. The Symposium On Usable Privacy and Security and the HCISec Bibliography are also good sources of readings. For our course we typically assigned 3 or 4 readings each week and also recommended a number of optional readings.

Usable Privacy and Security: Overview and Motivation

This segment is designed to provide an overview of the course and interest students in the topic. We presented this in two lectures: one on the first day of class and one after we had presented our three introductory lectures. We have also given stand-alone 1-hour lectures on this topic at conferences and in professional development courses. In this segment students should be exposed to a range of problems in the usable privacy and security area. They should see and discuss some tangible examples of both unusable and usable design related to security or privacy.

Key concepts:


Suggested readings

Discussion questions

  1. What are the advantages and disadvantages of each approach to usable security (invisible security, visible security, and user training)? For each approach, come up with an example of a successful use and an example of an unsuccessful use.
  2. Once users are educated and trained about how to act in a secure manner, why do they continue to perform in a manner that is not secure?


The activities for this segment are designed to get students thinking about usable privacy and security problems and how to solve them. The two example activities presented here are related to authentication, but activities might be developed around any of the topics that will be discussed in this course.

Mobile device password scenario

Your target users are young professionals who are constantly on the go. These users have a variety of devices with them at all times, such as laptop, cell phone, PDA, and possibly others. Each of these devices requires a password for use. The users will use their devices in a variety of public settings and will need easy access to their data at a moment's notice. However, the consequences of someone else gaining access to their data are severe. The dynamic environment poses a threat to the entry of passwords into these devices, but hardware limitations preclude the use of biometrics. Discuss how to address security needs with passwords in this context.

Questions to Keep in Mind: Should each device have its own password, or should there be one password? Should users change their passwords on a regular basis? Should passwords be randomly assigned or chosen by users? What precautions should be taken for password entry in public spaces?

Hospital workstation password scenario

Your target users work in a hospital. Confidentiality of patient data cannot be compromised. Different employees have different levels of clearance within the one system that controls all of the patient records. There are a limited number of public workstations that are highly trafficked throughout the day. Current practice at the hospital is that one worker logs in and often many people with different levels of clearance work under that same account, even though they are not authorized to do so. Often, the workstation remains logged in between users, so an unauthorized user could gain access to patient records. In addition, passwords change on a monthly basis, so it is more convenient for the workers to just use the account that has already been logged in than to try to recall their constantly changing passwords. Management insists that the passwords must change frequently to reduce the risk of a hacker viewing the confidential data. Discuss how to address security needs with passwords or other forms of authentication in this context.

Questions to Keep in Mind: Should users change their passwords on a regular basis? Should passwords be randomly assigned or chosen by users? What precautions should be taken for password entry in public spaces? Are there any alternatives to password authentication that might work in this setting? What can be done to change the workers' attitudes about security?

Introduction to HCI Methods and User Studies

This part of the course provides an overview of human-computer interaction, including how to gather information about existing work practices through field studies, how to get quick feedback from end-users through low-fidelity paper prototypes, how to evaluate existing systems through user studies, and understanding mental models.

Key concepts:


Suggested readings

Discussion questions

  1. Everyone has a horror story; what's the worst system you've ever had to interact with? How did it make you feel?
  2. Harder question: what are examples of good user interfaces, and why? It's easy to find and criticize bad examples, much harder to draw out the factors that contribute to good user interfaces.
  3. What experiences have you had with respect to developing usable systems?
  4. What are technical and non-technical factors leading to hard to use systems?
  5. What is security software? Is this a category that makes sense? What software is not security software? Is Microsoft Word security software? It can digitally sign a document; it can scan for keywords and compromise your privacy. Is Firefox security software?

Why Johnny Can't Encrypt

  1. Do you agree with Whitten and Tygar's statements about the importance of usability when designing secure systems?
  2. Whitten and Tygar list five principles that make designing usable security very difficult. Which principles do you think are the hardest for designers to deal with?
  3. The question Whitten and Tygar ask when defining the goals of their test on PGP 5.0 might seem leading or loaded. Do you feel it was appropriate?
  4. Whitten and Tygar do both a Cognitive Walkthrough and a user test; what do you think are the advantages of this approach? What are the strengths and weaknesses of both techniques?
  5. What are the biggest problems found in the Cognitive Walkthrough? Can you suggest some solutions?
  6. When considering the user test, what do you like about the way it was performed and what would you have done differently?
  7. Do the results of the user test surprise you? What are some of the major problems in PGP 5.0 that contributed to the poor usability?
  8. Whitten and Tygar stress the differences between normal system usability and designing for secure system usability. Do you agree with their conclusions?

Johnny 2

  1. Garfinkel and Miller state that a key reason for the usability failure of PGP 5.0 was the underlying third-party certification model. Do you agree?
  2. What do you think about Key Continuity Management vs. third-party key certification? Discuss both the security and usability issues.
  3. Compare the setup of the user test in the original Johnny to the new test in Johnny 2.
  4. How do the results in Johnny 2 compare to those of the original Johnny test?
  5. Do you have any suggestions for improvement of the CoPilot interface?
  6. Discuss the changes from the original Johnny to Johnny 2. Where do you think we have improved and where are we still lacking with regard to usability and security?


Human subjects training

Students should complete your university's human subjects training. For example, CMU requires people doing human subjects research to take the National Cancer Institute's online training course.

Observations of people using technology

Observe people in a public place using a computerized system. For example, you might observe people using a public transit ticket machine, a parking garage pay station, a grocery store self-checkout machine, or an airport self-check-in kiosk. Stay long enough to observe both experienced and inexperienced users using the system. What kinds of problems did people have using the system? What aspects of the system appeared to be easy to learn? What aspects of the system appeared to be difficult to learn? What aspects of the system seemed to frustrate experienced users? How might the design of the system be improved?

Quick user study design

In a small group, examine a piece of technology and design a simple user study to test the usability of a particular aspect of that technology. The user study should involve having a user perform a task that will take no more than a few minutes.

Trade one member of your group with another group. The person from the other group will be your test participant. Conduct a pilot study with your test participant.

This is an activity that can be conducted during class in as little as 30 minutes to give students a taste of what it is like to design and conduct a user study -- divide the students into groups and have them spend 10 minutes on the first part and 10 minutes on the second part. Use the last 10 minutes for a discussion. In our class a cell phone, a portable CD player, and a calculator were tested.

Good and bad design examples

Look for examples of good and bad user interface design related to privacy and security. Offer suggestions for improving the bad designs. Keep a design scrap book throughout the semester or bring examples in for "show and tell" each week. (See http://www.baddesigns.com/ for bad design examples that are not necessarily related to privacy or security.)

Introduction to privacy

This segment provides introductory material on privacy. While most people know privacy when they see it, few have given much thought to exactly what privacy is.

Key concepts:


Suggested readings

Discussion questions

  1. What does privacy mean to you?
  2. How could web site privacy policies be made more useful?
  3. What are the privacy risks associated with release of search engine log files? How could these files be sanitized or obfuscated to minimize or eliminate the privacy risks?


Technology and privacy risks

Pick a technology that causes privacy concerns.

Privacy policies

Pick a particular industry or type of web site and use Privacy Finder to find three different companies from that industry, at least one of which has a P3P policy.

Examine the privacy policies for sites you found. For each policy, provide the URL and list major strengths and weaknesses related to both substance and presentation.

Use the privacy report feature in Privacy Finder. Compare the substance and presentation of the Privacy Finder privacy report with the corresponding privacy policy posted by the web site.

(Pop) cultural references to privacy

Find a reference to privacy in art, literature, advertising, or pop culture (TV, movie, cartoon, etc.). Explain the reference and what it says about privacy. If possible, provide a URL or a copy of the privacy reference that you can share with the class.

Introduction to security

This segment provides a brief introduction to concepts of computer security that are likely to be useful to the course participants. Since computer security is a very broad topic, this introduction is not intended to be comprehensive. Rather, the introduction focuses on topics for which background knowledge is likely to be useful when conducting user studies of security mechanisms.

The key concepts discussed in this introduction are:


Suggested readings

Discussion questions

  1. Is spam a computer security issue? How about employees surfing web sites unrelated to the business of their companies? What are other examples of activities for which it is a matter of perspective whether "security mechanisms" are needed to defeat them?
  2. Do you agree with the provided definitions of "trust", "trustworthiness" and "usability"? Is trust something to strive for or something to avoid?
  3. Which of the design strategies for improving the usability of security mechanisms is most effective for improving security, i.e., making it "disappear" or presenting security decisions using better metaphors? Are there other alternatives? Which will win in the marketplace?
  4. Who bears the cost of a security mishap?
  5. Are backups a security issue?
  6. Security is almost always cast in terms of privacy. Does this distort usability?


  1. Think of your computer activities over a typical day. Try to enumerate all of the computer components (both hardware and software) that you trust during these activities. For what do you trust each one?
  2. Spend some time trying to configure a computer security mechanism, e.g., configure a firewall; security-enable a wireless access point and configure computers to have access to it; or set up an encrypting file system. What do you think of the configuration process? How might you improve it?

Secure interaction design

The main goal of secure interaction design is to design a computer system to protect the interests of its legitimate users. Some examples might include protection from viruses, spyware, phishing, and accidental leaks of personal information.

Key concepts:


Suggested readings

Discussion questions

  1. What are some factors contributing to poor secure interaction design?
  2. How do you know if you have created a successful design?
  3. Are there common metaphors that can be shared across multiple domains for secure interaction design? For example, many web sites use lock images to denote security.
  4. What kinds of feedback are useful for letting people know things are okay?
  5. One significant issue with secure interaction design is spoofing, in which scammers create fake user interfaces that look just like the legitimate ones in order to trick people. What are ways to prevent or avoid spoofing?


Firefox security and privacy extensions

Download and review a number of Firefox extensions related to security and privacy (you can find them at https://addons.mozilla.org/extensions/). For each extension ask: Who is this extension designed for? What are their goals? Does the extension meet those goals? Does the interface accurately portray what the tool really does? Could a member of the intended audience use this extension?

Analyze the secure interaction design of a UI

Do a task analysis, cognitive walkthrough, and/or user study on a user interface that deals with security or privacy. Examples might include downloading software, opening up an attachment, logging into a web site, getting a certificate, being notified of a virus, etc. What are the strengths of the design? Weaknesses? What would you change to make it better?

Trust and semantic attacks

This section looks at a class of security attacks that target end-users rather than computer systems themselves. Semantic attacks try to trick people into giving up information. The most common example of this is phishing attacks, in which scammers impersonate legitimate businesses to gather passwords and financial information.

Key concepts:


Suggested readings

Discussion questions

  1. How do you define trust?
  2. Should software always respect a user's decision, even if the user appears to be doing something dangerous (for example, going to a web site that appears to be a phishing site)? Should software security interventions allow users to override them?
  3. How can you figure out whether a web site is trustworthy? How can you figure out whether it is safe to download and install a piece of software? What simple advice can you give to average users on this topic?
  4. Not all trust decisions are made by end-users. For example, system administrators might choose settings that apply to everyone in an administrative domain. This is useful because it is economically feasible (assuming there are many users) and reduces the burden on individuals. The overall environment in which people work includes software such as the operating system and anti-virus checkers, internal organizations such as sysadmins, and external organizations such as ISPs, web sites, and anti-phishing groups, each of which factors, to some extent, into how trust decisions are made. Are there other ways of offloading trust decisions from end-users? Are there other kinds of software or organizations that can make, or provide advice on, trust decisions on behalf of end-users?
  5. Social engineering is a common way for hackers to gain access to sensitive passwords. What techniques are used to accomplish this? Why do they work?
  6. Three general approaches to usable privacy and security are to (1) make things invisible, (2) educate and train users, and (3) provide better user interface metaphors and awareness of what's going on. Which of these three approaches currently works best for trust decisions? Which is the weakest? Most importantly, what is the right combination of these that will be most effective?


Risks and approaches to mitigating them

Analyze the following situations. What risks are you exposed to? How could they be mitigated?

  1. You receive a high-quality printed flyer in the mail advertising a new dating web site. You want to give it a try. It costs $20 per month. The first month is free, but you are required to provide checking account information so that payments are automatically deducted after the first month. You also need to fill out an extensive background and personality profile, including contact information.
  2. While browsing a new web site you see an ad for an online electronics store. You have wanted to get a large HDTV for many months now. They are running a sale on their store brand. You aren't familiar with that brand, but it is a little cheaper than any of the name brands you have been keeping an eye on. And shipping is free for your first purchase. It has all the features you wanted. They accept all major credit cards.
  3. You have been feeling pretty sick for several days now. One of your friends insists you have mono and should see the doctor. Your other friend says that is ridiculous and that it is probably just a cold. Since you just started a new job and your health insurance hasn't kicked in, it costs $80 to visit the doctor. You use Google to find an online healthcare website. You check the symptoms for mono on the site and you find your symptoms are consistent. (How would the situation change if your symptoms were consistent except for one?)
  4. You see a newspaper ad for a new online bank. You haven't heard of it before, but it has really good interest rates and it says it is FDIC insured.
  5. You receive an email advertising a free antivirus program. You were running a different one, but you let the subscription expire so you need a new one. You haven't heard of this other one before, but it is free.

Anti-phishing toolbars

Download and try an anti-phishing toolbar.
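As a point of comparison while reviewing a toolbar, note that many anti-phishing toolbars rely in part on simple URL heuristics (alongside blacklists and page analysis). The sketch below illustrates a few well-known heuristics; the specific weights and thresholds are invented for illustration, not taken from any real toolbar:

```python
from urllib.parse import urlparse
import re

def phishy_score(url):
    """Score a URL with a few classic phishing heuristics.
    Higher scores are more suspicious. Weights are illustrative only."""
    score = 0
    parsed = urlparse(url)
    host = parsed.hostname or ""
    # Raw IP address instead of a domain name
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
        score += 3
    # '@' in the URL can hide the real destination in some clients
    if "@" in url.split("//", 1)[-1]:
        score += 3
    # Unusually deep subdomain nesting (e.g., paypal.com.evil.example)
    if host.count(".") >= 4:
        score += 2
    # Many hyphens often indicate lookalike domains
    if host.count("-") >= 3:
        score += 1
    # Sensitive keywords in a non-HTTPS URL
    if parsed.scheme != "https" and any(
            w in url.lower() for w in ("login", "verify", "account")):
        score += 2
    return score

print(phishy_score("http://192.0.2.7/login"))    # raw IP + non-HTTPS login page
print(phishy_score("https://www.example.com/"))  # no heuristic fires
```

While reviewing a real toolbar, consider which of these heuristics it appears to use, and how easily an attacker who knows the heuristics could evade them.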

Anti-phishing on Web Sites

Show what different web sites are doing to fight off phishing attacks. Examples might include message centers, two-factor authentication, secrets between the end-user and web site (such as Bank of America's SiteKey), cards that have secret numbers that you can scratch off, honeypots, etc. Analyze anti-phishing solutions for failure modes and usability. Who is the user and who is the customer? What is the purpose of the anti-phishing system?

Design for privacy and visualizing privacy

This segment looks at how to design privacy tools and how to design everyday software with privacy in mind. It addresses the challenges involved in communicating with users about privacy as well as the challenges associated with privacy configuration interfaces. We spent four lectures on this topic, in addition to an introduction to privacy lecture earlier in the course. If your course has not included an introduction to privacy segment, some of that material might be combined with the material in this segment. If your course has included an introduction to privacy segment, some of that material might be reviewed here or revisited in more depth. Topics to cover include:


Suggested readings

Discussion questions

  1. What are the differences between designing a privacy tool and designing for privacy in everyday software?
  2. What symbols are associated with privacy? What symbols might you use to indicate whether or not a user's privacy is being protected?
  3. How can people be made more aware of potential long-term privacy consequences of their actions online?
  4. What questions would you ask someone to ascertain the general level of privacy they expect? Do you think their answers could be used to predict their browser cookie settings? whether or not they use a grocery store card? whether they use anonymity tools?


Privacy Pictionary

Distribute words and phrases related to privacy to members of the class (or let them come up with their own) and ask them to draw a visual representation of them on the blackboard. The rest of the class tries to guess what the drawing represents.


Think about a piece of software with which you are familiar that has the ability to capture or disclose personal information (for example, a web browser, instant messaging client, video chat client, email client, etc.). Evaluate this software against the five pitfalls discussed in Five Pitfalls in the Design for Privacy. Propose ways of avoiding any of the pitfalls into which this software falls.

Alternatively, for each of the five pitfalls, come up with an example of a piece of software that falls into that pitfall. Propose a way to avoid this pitfall.

Privacy tools

Download a privacy-related software application (or use one you have already purchased), or use an online privacy service.

This could be done with specific types of privacy tools such as P3P user agents, cookie managers, anonymity tools, etc. Students could be asked to review more than one tool and compare them.

Privacy configuration

Examine the configuration interface for an online communication client (for example, instant messaging software) that allows users to control who has access to their online presence information. Come up with several examples of access control rules that typical users might wish to implement. Can the interface be used to implement these rules? Do you think most users will be able to figure out how to use this interface to implement these rules? Do you think most users will take the time to configure this interface? How could the configuration interface be improved? How could you perform a user study to test this interface?
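To make the exercise concrete, the sketch below shows the kind of access control rules such a configuration interface must let users express. The rule format is hypothetical, not taken from any real client; it assumes viewers are tagged with group names and that the first matching rule wins:

```python
# Hypothetical presence-sharing rules: first matching rule wins, default deny.
rules = [
    ("allow", {"family", "close_friends"}),  # groups that always see my presence
    ("deny",  {"coworkers"}),                # hide presence from work contacts
]

def can_see_presence(viewer_groups, rules, default=False):
    """Return True if the viewer's groups hit an 'allow' rule before
    any 'deny' rule; fall back to the default if no rule matches."""
    for action, groups in rules:
        if viewer_groups & groups:
            return action == "allow"
    return default

print(can_see_presence({"family"}, rules))     # True
print(can_see_presence({"coworkers"}, rules))  # False
print(can_see_presence({"strangers"}, rules))  # False (default deny)
```

Even this toy model surfaces usability questions: do users understand rule ordering, group membership, and what happens in the default case? Compare these against what the real interface you examine actually exposes.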

Phone home

Many software products contain "phone home" features, for example, for performing software updates or monitoring usage patterns. In some cases software phones home quite frequently, for example, to update phishing blacklists or check for fresh image files. Users may be concerned that the software company is using these features to track or profile them. Thus it is important that the software is up front about the fact that it is phoning home. Furthermore, some users may wish to disable such features or be prompted every time before they phone home (due to privacy or other concerns), whereas other users are happy to have them operate automatically. Discuss the various approaches you have seen different software manufacturers take to addressing this problem. What do you like/dislike about them? How should phone home features be designed so that they facilitate informed consent? Describe an example user interface design and general principles that might be applied to specific cases. What sort of user studies should be performed to test this user interface design?

Web browser privacy and security

Web browsers are arguably the most used class of software today. However, web browsers do many things that users may not necessarily understand. Some examples include sending cookies to web sites, making secure connections, and asking people whether to accept a certificate. This section looks at user interface metaphors for web browsers.

Key concepts:


Suggested readings

Discussion questions

  1. What should we teach users about the SSL lock icon in web browsers?
  2. There's a tension between exposing more of what a web browser is doing and keeping it simple. What kinds of things that are not exposed now should be exposed, and vice versa? Also, are there ways of achieving both awareness and simplicity?
  3. Social approaches to end-user security and privacy management suggest sharing information on how people configure their systems. What are the pros and cons of such approaches?
  4. Related to trust and semantic attacks, web browsers offer few hints about whether a web site can be trusted or not. What kinds of user interfaces and systems support can be offered to help people make better decisions in this regard?
  5. One problem many people have is distinguishing between the browser chrome and the content in a web page. This lack of awareness can be used in phishing attacks. What are ways of addressing this problem?



Find all of the security-critical settings in a web browser. Are they all in the same place? Should they be? What is a security-critical setting?

Cookie manager

Review several cookie managers (browser extensions or built into web browsers) and design the user interface for a new cookie manager. What successful features from other cookie managers have you borrowed? How has your design improved on other cookie managers? Who are the intended users of your cookie manager?

Better User Interfaces

Take some aspect of the web browser and design a better user interface for it. This might include a better metaphor for secure connections, more awareness and control of cookies, a better way of managing unknown certificates, or a better way of identifying trusted / untrusted sites. Consider other constraints too, including existing knowledge of users, where people are likely to look (assuming a visual interface), screen real-estate, feasibility of implementation, and feasibility of adoption.


Abstractly, authentication in a computer system refers to determining the principal who could have originated some communication. User authentication is a common example of this, where the principal is some human user. But more generally any component of a computer system can originate some communication, and authentication refers to confirming a claim of which component originated it.

This segment introduces various types of authentication in computer systems. Authenticating messages in a distributed system is often done via digital signatures, and determining the public key to authenticate messages from some named person is typically done with the help of a so-called "public key infrastructure" or PKI. This segment introduces public key infrastructures. It also introduces various methods for user authentication, including via text passwords, biometrics, and graphical passwords.

Key concepts included:


Suggested readings

Discussion questions

  1. What makes a good password?
  2. How do you make a password easy to remember but hard to guess?
  3. What are the security benefits of requiring users to frequently change their passwords? What problems are caused by this requirement?
  4. How can passwords that are used infrequently be remembered?
  5. Text passwords or passphrases are sometimes used as encryption keys. Can biometrics be used this way? How about graphical passwords? In both cases, why or why not?
  6. Ever since the invention of public key cryptography, public key infrastructures have been "tomorrow's" up-and-coming technology. What makes PKIs so difficult to deploy and use? For those that have been deployed (e.g., in support of SSL/TLS), why did they succeed?
  7. Suppose you wanted to send encrypted email with your classmates. Would you need a PKI to do so? Why or why not?
  8. Is there anything wrong with using "password" as a password?
  9. Why do so many websites require usernames and passwords -- is it to protect the users, or is it to enhance advertising revenue?
  10. What do you think about the "bugmenot" service?
  11. Is it possible to have authorization without identification?
  12. What would an anonymous authentication system look like?
  13. Can cookies be an identification system? An authorization system? An authentication system?

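Discussion questions 1 and 2 above can be grounded in simple arithmetic: a password drawn uniformly at random from a character set has a search space of charset_size ** length, so length and character variety jointly determine guessing difficulty. A back-of-the-envelope sketch (the numbers are illustrative; human-chosen passwords have far less entropy than these uniform-random bounds):

```python
import math

def search_space_bits(charset_size, length):
    """Bits of entropy for a password drawn uniformly at random
    from a character set of the given size."""
    return length * math.log2(charset_size)

# 8 lowercase letters vs. 8 characters from the full printable ASCII set
print(round(search_space_bits(26, 8), 1))   # ~37.6 bits
print(round(search_space_bits(94, 8), 1))   # ~52.4 bits
# A longer all-lowercase passphrase can beat a short complex password
print(round(search_space_bits(26, 16), 1))  # ~75.2 bits
```

Note that doubling the length doubles the bits of entropy, while enlarging the character set only adds log2 of the ratio per character; this is one argument for memorable passphrases over short "complex" passwords.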

Password quiz

Start with a score of zero. For each of the statements below, if the statement applies to you, add or subtract the number of points indicated.

  1. You have any password more than 8 characters: +3
  2. You have any password less than 6 characters: -4
  3. You have never written a password down: +6
  4. You have written a password on a sticky and placed it next to a computer: -6
  5. You have written down a password and put it somewhere else: -5
  6. Any regularly used password contains a word that could be found in a dictionary or a name: -3
  7. You have any password that contains special characters: +3
  8. The password you use the most does not have any numbers: -4
  9. You have a password that contains capital letters: +4
  10. You have told anyone the password that you use the most: -4
  11. You use the same password for multiple accounts: -4
  12. You use a variant of the same password for multiple accounts: -2
  13. You have forgotten a password and it has not been reissued: -2
  14. You link multiple passwords with a common element (decreases memorability): -5
  15. You have had a password for over a year without changing it: -4
  16. You have ever used the word 'password' for your password: -4
  17. You use your username (forwards or backwards) as your password: -5
  18. Your password contains 3 or more unique letters: +3
  19. You have ever been required to change your password and you change it to the identical password or one you have had in the past: -5
  20. You have a password that repeats the same word (e.g. fredfred): -4
  21. You have a password that has a word in a foreign language: -3
  22. You spell a word backwards in one of your passwords: -3
  23. You have ever had a password that is a keyboard sequence like '1234' or 'qwer': -5
  24. In your password you replace characters based on graphic or phonetic similarity (e.g., l -> 1 or o -> 0): -3
  25. Your password contains pet names, license plate numbers, telephone numbers, identification numbers, the brand of your automobile, the name of the street you live on, etc.: -4
  26. Your password ends in an exclamation point: -3

This quiz was developed based on common password advice. Do you disagree with any of this advice? What are the violations that are most common among your classmates? What are some things you could do to create better passwords that are still memorable? Are there other bad password habits that are not addressed by this quiz?

Get certified

Suppose your instructor uses the Thunderbird mail client, including (only) the certification authorities shipped with that client. Send to your instructor an email that is signed in a way that his/her client can authenticate you as the sender.

Testing biometrics

Many laptops today come with built-in fingerprint readers. Conduct a user study to determine how secure these readers are. Can you ever fool them?

Tools for security administration

This segment introduces the role and duties of a typical security administrator, and how the tools available to him/her are often inadequate. The segment also describes how visualization tools can aid in system diagnosis, but also how administrators who come to depend on them can be misled by an intelligent attacker who exploits weaknesses in the visualization interface.

Key ideas include:


Suggested readings

Discussion questions

  1. Aside from better visualization tools, what can you think of to make the security administrator's job easier?
  2. What asymmetries between the attacker's and defender's roles benefit the attacker? Which ones benefit the defenders? Can you think of ways to compensate for those factors that favor the attackers?


Better visualization

Design a visualization interface for describing some security-relevant observation that you believe is immune to attacks by an adversary who knows the interface is being used. What is the key principle you employed to achieve this?

The base-rate fallacy

Consider a computer installation that produces about 1,000,000 audit records per day. For this small installation, let's presume that only about one or two of these records are indicative of some form of attack. Now suppose that this installation installs an intrusion detection system that is 99% accurate, i.e., on records indicating an attack it raises an alarm 99% of the time, and on records devoid of attacks it raises an alarm only 1% of the time. Are the administrators likely to be happy with their new intrusion detection system? Explain.
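The exercise above is a classic instance of the base-rate fallacy, and working through the numbers makes the answer clear. The sketch below assumes exactly 2 attack records per day (the exercise says "one or two") and applies Bayes' rule:

```python
records_per_day = 1_000_000
attacks_per_day = 2        # assumed from "one or two" in the exercise
detection_rate = 0.99      # P(alarm | attack record)
false_alarm_rate = 0.01    # P(alarm | benign record)

true_alarms = attacks_per_day * detection_rate
false_alarms = (records_per_day - attacks_per_day) * false_alarm_rate

# P(attack | alarm) by Bayes' rule: true alarms over all alarms
p_attack_given_alarm = true_alarms / (true_alarms + false_alarms)
print(f"Alarms per day: {true_alarms + false_alarms:.0f}")
print(f"P(attack | alarm) = {p_attack_given_alarm:.5f}")
```

Because benign records outnumber attack records by roughly 500,000 to 1, a "99% accurate" system generates about 10,000 alarms a day, of which only about 2 are real: fewer than one alarm in 5,000 indicates an attack. Administrators drowning in false positives are likely to start ignoring the system, which is the usability point of the exercise.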