This page describes possible class projects. These will be research projects that you will work on in small groups. Please indicate your preferences for which project you will work on by Thursday, February 5th.
User study of password managers
As users are forced to cope with ever more numerous and complex passwords, they often rely on password managers to remember those passwords for them. However, password managers can be difficult to use. In this project, researchers will review previous work on password managers and, based on that review, develop a set of criteria for evaluating both the usability and the security of password managers. With these criteria, the researchers will conduct an analysis of the leading available password managers. This work can lead to recommendations for improving existing products and to ideas for features in future products.
Views of privacy on social media
Designers and companies have various conceptions of privacy-related terms and concepts (e.g., "privacy", "information privacy", "location information", "health information") that inform the creation of website privacy settings. However, general users' conceptions of these terms may differ from the designers' and companies' understandings. For example, Prof. Cranor's Privacy Illustrated project demonstrates that people hold a wide range of understandings of the term "privacy."
When user understanding doesn't match companies' intentions, there can be a number of negative outcomes. For example, users may use the available settings inappropriately, choose not to use them at all, or lose trust in the available options when they produce unexpected outcomes. One area where this dynamic is apparent is on social networking sites like Facebook, Twitter, Instagram, and LinkedIn. These sites offer a range of privacy settings and access control tools. However, it is not clear that users of these sites understand the concepts expressed in the settings consistently, or in the way the designers intend. For example, Facebook users often express distrust in how privacy settings will behave.
In this project, students will pick two or more social networking sites and survey the common concepts expressed in their access control and privacy settings (e.g., "privacy," "sharing information," "using information for ads"). They will then conduct a study (either survey- or interview-based) to examine how users of the sites understand a selection of these terms, focusing on whether understanding varies by site, is consistent across users, and matches the designers' intentions. The students can then suggest improvements (e.g., clearer settings, better education) that could make the sites more usable.
Notice and choice in the Internet of Things
The Internet of Things (IoT) is becoming a reality as more and more smart-home and wearable devices arrive on the market. These devices include wearable health monitors (e.g., Fitbit, Up by Jawbone), smart-home devices (e.g., Belkin WeMo, SmartThings), in-car sensors and cameras, security systems, and other sensor-based devices. Many of these devices collect detailed information about users, yet have only limited means of informing users about their data practices, i.e., what data is collected, for which purposes, and with which entities it is shared.
This week's FTC report came down strongly in favor of "notice and choice" as the privacy regime that the FTC staff believes should govern the IoT. However, several characteristics of IoT devices make it difficult to provide notice and choice in the manner typical of other domains, including the lack of a screen, constant data collection, and the collection of data from individuals other than the device owner.
The goal of this project is to prototype different notice and choice methods for IoT devices and evaluate their effectiveness in an experimental study. Approaches may be taken from the existing literature on privacy in ubiquitous computing and the Internet of Things, or include entirely novel methods designed by the project team.
Usability evaluation of an open-source password meter
Based on the CMU password group's research results over the past few years, we are currently developing an open-source password meter that we plan to release in the next year and distribute widely. Whereas traditional password-strength meters take into account only the length of the password and the character classes used, our open-source meter will support far more sophisticated client-side and server-side password evaluation features. When we release this meter, we hope to have a data-backed evaluation of the effectiveness of each feature, so that we can provide system administrators with sensible default settings and insight into the impact of each configuration choice they make.
Your group will conduct this usability evaluation, which we will release later this year along with the open-source password meter. The Ur et al. paper linked below is a good starting point for the type of usability evaluation you might conduct, but you may want to augment that analysis with lab-based methods. Because we are developing the meter in parallel with this evaluation, your group will be the first to test the usability of a number of our ideas.
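For contrast, the "traditional" heuristic described above, scoring only by length and the number of character classes used, can be sketched in a few lines. This is an illustration, not our meter's actual logic; the weights and thresholds here are made up:

```python
import string

def naive_strength(password):
    """Traditional heuristic: score by length plus the number of
    character classes (lower, upper, digits, symbols) that appear.
    Weights and cutoffs below are illustrative, not from any real meter."""
    classes = [string.ascii_lowercase, string.ascii_uppercase,
               string.digits, string.punctuation]
    n_classes = sum(any(c in cls for c in password) for cls in classes)
    score = len(password) + 2 * n_classes
    if score < 12:
        return "weak"
    elif score < 18:
        return "medium"
    return "strong"

print(naive_strength("password"))     # length 8, 1 class
print(naive_strength("Password1!"))   # length 10, 4 classes
```

The second example highlights the weakness of this approach: the highly guessable pattern "Password1!" satisfies all four character classes and scores well, which is exactly why more sophisticated evaluation features are worth testing.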
Testing the usability of two-factor authentication
Two-factor authentication is a way to help users protect their online accounts. Instead of logging in with only a password, users enter a password and are then sent a code on their mobile device, which they must also enter to authenticate. This protects their accounts because an attacker would need both the password and the mobile device. However, it also introduces usability problems, such as the extra step of entering the code and the need to have the mobile device at hand. This project will examine the usability implications of two-factor authentication. Does it make users feel more secure? Does it annoy them? Why do users adopt, or decline to adopt, this technology?
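As background on where the codes come from: besides SMS delivery, many second-factor systems use time-based one-time passwords (TOTP, RFC 6238), in which the server and the user's device each derive a short code from a shared secret and the current time. A minimal sketch (the secret below is the RFC's published test key, not a real credential):

```python
import hashlib
import hmac
import struct
import time

def totp(secret, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password.

    secret: shared key bytes; at: Unix time (defaults to now);
    digits: code length; step: time window in seconds."""
    counter = int(time.time() if at is None else at) // step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: key "12345678901234567890", time 59, 8 digits
print(totp(b"12345678901234567890", at=59, digits=8))  # prints "94287082"
```

Because the code depends only on the shared secret and the 30-second time window, the phone can compute it offline, which sidesteps SMS delivery but adds its own usability questions (clock skew, device enrollment) that a study could probe.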
Testing the Usability of the Apple Password Assistant
Apple has introduced a Password Assistant, found on a Mac running its latest operating system, Mavericks, as follows: System Preferences --> Users and Groups --> Change Password ... --> click the key button. This tool lets the user easily create a random password, which can be configured for length and several other parameters. While past work has shown that system-assigned passwords can be difficult for users, it is possible that Apple's latest offering provides enough configuration choices to be easier to use. In this project, researchers would have users try this feature, or a similar mock-up, and evaluate both what users thought of the system and the security and construction of the resulting passwords.
Can Johnny Encrypt, 15 years later?
Fifteen years ago, we learned that Johnny couldn't encrypt. Is that still true today? Researchers in this project would study a leading free encryption technology: GNU Privacy Guard, an e-mail encryption software suite. GNU Privacy Guard is a free and open-source implementation of the OpenPGP standard for e-mail encryption, but usability problems may be limiting its wider adoption. The aim of this usability study will be to identify the usability challenges of the software, make suggestions for improvements, and ideally implement improvements to the interface.
Choose Your Own Authentication
Choose Your Own Authentication (CYOA) is a novel authentication architecture that enables users to choose a scheme that best suits their preferences, abilities, and usage context. While we have a working prototype of CYOA, several open questions remain about users' behavior when they are given a choice of multiple authentication schemes, such as: Do users actually choose schemes that better suit their preferences and abilities (e.g., selecting graphical passwords if they have better visual memory)? Do users choose more secure schemes (and/or passwords) for more valuable accounts (such as a bank account versus a discussion forum)?
Understanding decision making about online tracking
As users browse the web, dozens of analytics and advertising companies track their browsing through third-party cookies and other, more subtle, techniques. From this tracking, these companies derive inferences about users to better target advertising, yet many users find the practice creepy, even if useful.
A number of privacy tools, such as Ghostery, DNTMe, PrivacyBadger, and others, provide visibility into this tracking and empower users to block HTTP requests to these third-party companies. One might expect a privacy-interested user to block everything, but not all users do. We are interested in when users choose not to block a particular company, and why (e.g., "blocking the company prevented the page from working," "I like this company," "I prefer to receive targeted advertising").
In this project, you will instrument an open-source privacy tool (e.g., PrivacyBadger) to track study participants' decision making (with their consent, of course!) and periodically survey them about why they made particular decisions.
Fingerprints and Identity Verification
Some cryptographic software systems use public-key fingerprints to allow users to perform manual key authentication. Typically, the fingerprint is a long string of letters and numbers that users must compare to make sure the fingerprint they have been sent matches one they received through another channel. There are a variety of approaches to presenting fingerprints so that users can confirm them accurately: sometimes as hex codes, sometimes as alphanumeric strings, sometimes as image matches, and sometimes as word matches. There has been some discussion of creative ways to do this, but not much actual research. In this project, students will conduct a user study to test the usability and accuracy of several variations on fingerprint verification.
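To make the comparison task concrete, here is a minimal sketch of two of the renderings mentioned above, grouped hex and word matching, applied to a SHA-256 digest of the key bytes. The four-word list is a toy stand-in; real word-based schemes (e.g., the PGP word list) use hundreds of words so that each byte maps to a distinct word:

```python
import hashlib

def hex_fingerprint(key_bytes, group=4):
    """Render a key fingerprint as space-separated hex groups,
    the style many tools display for manual comparison."""
    digest = hashlib.sha256(key_bytes).hexdigest().upper()
    return " ".join(digest[i:i + group] for i in range(0, len(digest), group))

# Toy word list for illustration only; a real scheme needs ~256 words
# so that distinct byte values map to distinct words.
WORDS = ["apple", "brick", "cloud", "daisy"]

def word_fingerprint(key_bytes, n_words=6):
    """Render the first n_words bytes of the digest as words,
    so users compare familiar words instead of hex digits."""
    digest = hashlib.sha256(key_bytes).digest()
    return " ".join(WORDS[b % len(WORDS)] for b in digest[:n_words])

key = b"example public key bytes"
print(hex_fingerprint(key))
print(word_fingerprint(key))
```

A study could present pairs of renderings like these, some matching and some with a few characters or words swapped, and measure how often participants catch the mismatch under each format.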