CUPS is affiliated with Carnegie Mellon CyLab. Our research is funded by grants from the National Science Foundation, the Army Research Lab, Microsoft, and Google. Wombat Security Technologies, Inc. is commercializing some of the technologies we developed.
CUPS students come from several CMU PhD programs, including Computation, Organizations and Society; Engineering and Public Policy; Human-Computer Interaction; Computer Science; Electrical and Computer Engineering; and Public Policy and Management. Prospective students should apply directly to these programs and also express their interest in the CUPS doctoral training program.
News and Events
Our privacy engineering master's program has graduated its first class. Alumni have accepted jobs at Google, Oracle, Adobe, and eBay. Applications for fall 2015 are due in January.
We are participating in the ARL Collaborative Research Alliance on cybersecurity.
Videos from our June 27, 2014 Workshop on the Future of Privacy Notice and Choice are now available.
Check out the July 2014 edition of our lab newsletter, The Saucer.
The Symposium on Usable Privacy and Security (SOUPS 2015) is July 22-24, 2015 in Ottawa, Canada.
Director: Lorrie Cranor
Current members: Alessandro Acquisti, Yuvraj Agarwal, Hazim Almuhimedi, Rebecca Hunt Balebako, Lujo Bauer, Sekhar Bhagvatula, Justin Cranshaw, Cristian Bravo-Lillo, Nicolas Christin, Julie Downs, Adam Durity, Alain Forget, David Gordon, Jim Graves, Hanan Hibshi, Eiji Hayashi, Candice Hoke, Mandy Holbrook, Jason Hong, Phillip Huh, Peter Klemperer, Saranga Komanduri, Darya Kurilova, Pedro Leon, Shing-hon Lau, Bin Liu, Abby Marsh, Billy Melicher, Alessandro Oltramari, Emmanuel Owusu, Norman Sadeh, Ashwini Rao, Marios Savvides, Florian Schaub, Sean Segreti, Rich Shay, Stephen Siena, Manya Sleeper, Josh Tan, Yuan Tian, Tiffany Todd, Blase Ur, Timothy Vidas, Tatiana Vlahovic, Jason Wiese, Shomir Wilson.
Alumni and former lab members: Idris Adjerid, Fahd Arshad, Joanna Bresee, Luc Cesca, Paul Hankes Drielsma, Serge Egelman, Ian Fette, Naoko Hayashida, Cynthia Kuo, Eduardo A. Cuervo Laffaye, Matthew Geiger, Iulia Ion, Patrick Kelley, Braden Kowitz, Ponnurangam Kumaraguru, Janne Lindqvist, Chris Long, Ryan Mahon, Michelle Mazurek, Aleecia McDonald, Marty McGuire, Jonathan Mugan, Elaine Newton, Greg Norcie, Sven Dietrich, Robert Reeder, Bryan Pendleton, Sasha Romanosky, Steve Sheng, Fred Stutzman, Eran Toch, Janice Tsai, Kami Vaniea, Kai Wang, Yang Wang
Current Projects and Selected Publications
Privacy decision making
L. Cranor, A. Durity, A. Marsh, and B. Ur. Parents' and Teens' Perspectives on Privacy in a Technology-Filled World. SOUPS 2014.
Rebecca Balebako, Abigail Marsh, Jialiu Lin, Jason Hong, Lorrie Faith Cranor. The Privacy and Security Behaviors of Smartphone App Developers. Workshop on Usable Security (USEC 2014). San Diego, CA, February 23, 2014.
Rebecca Balebako, Rich Shay, and Lorrie Faith Cranor. Is Your Inseam a Biometric? A Case Study on the Role of Usability Studies in Developing Public Policy. Workshop on Usable Security (USEC 2014). San Diego, CA, February 23, 2014.
Lujo Bauer, Lorrie Faith Cranor, Saranga Komanduri, Michelle L. Mazurek, Michael K. Reiter, Manya Sleeper, Blase Ur. The Post Anachronism: The Temporal Dimension of Facebook Privacy. Workshop on Privacy in the Electronic Society. Berlin, Germany. November 2013.
R. Balebako, R. Shay, and L.F. Cranor. Is Your Inseam a Biometric? Evaluating the Understandability of Mobile Privacy Notice Categories. CMU CyLab Technical Report CMU-CyLab-13-011.
P.G. Leon, B. Ur, Y. Wang, M. Sleeper, R. Balebako, R. Shay, L. Bauer, M. Christodorescu, L.F. Cranor. What Matters to Users? Factors that Affect Users' Willingness to Share Information with Online Advertisers. In Proceedings of the Ninth Symposium On Usable Privacy and Security (SOUPS ’13), Newcastle, United Kingdom, 2013.
R. Balebako, J. Jung, W. Lu, L.F. Cranor, and C. Nguyen. "Little Brothers Watching You": Raising Awareness of Data Leaks on Smartphones. In Proceedings of the Ninth Symposium On Usable Privacy and Security (SOUPS ’13), Newcastle, United Kingdom, 2013.
L.F. Cranor, K. Idouchi, P.G. Leon, M. Sleeper, B. Ur. Are They Actually Any Different? Comparing Thousands of Financial Institutions’ Privacy Practices. WEIS 2013.
User controllable security and privacy
Managing security and privacy policies is known to be a difficult problem. It is important that new user interfaces be developed to effectively and efficiently support lay users in understanding and managing security and privacy policies - their own as well as those implemented by systems and individuals with whom they interact. Solutions in this area have traditionally taken a relatively narrow view of the problem by limiting the expressiveness of policy languages or the number of options available in templates, restricting some decisions to specific roles within the enterprise, etc. As systems grow more pervasive and more complex, and as demands for increasing flexibility and delegation continue to grow, it is imperative to take a more fundamental view that weaves together issues of security, privacy and usability to systematically evaluate key tradeoffs between expressiveness, tolerance for errors, burden on users and overall user acceptance; and develop novel mechanisms and technologies that help mitigate these tradeoffs, maximizing accuracy and trustworthiness while minimizing the time and effort required by end users. The objective of this project is to develop new interfaces that combine user-centered design principles with dialog, explanation and learning technologies to assist users in specifying and refining policies. One new policy authoring interface we have developed is a visualization technique for displaying policies in a two-dimensional "expandable grid". (See also the User controllable security and privacy project page, the Expandable grids project, Grey project, and Locaccino.)
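The expandable-grid idea above can be illustrated with a minimal sketch: render an access-control policy as a two-dimensional matrix of principals versus resources, so the effective permission for every pair is visible at a glance. This is only a toy illustration, not the actual Expandable Grids implementation; the principals, resources, and rules below are hypothetical.

```python
# Toy illustration of a policy rendered as a 2D grid (principals x resources).
# Not the real Expandable Grids tool -- all names and rules are hypothetical.

principals = ["alice", "bob", "guest"]
resources = ["photos", "taxes", "music"]

# Explicit policy rules; any (principal, resource) pair not listed is denied.
policy = {
    ("alice", "photos"): "read/write",
    ("alice", "taxes"): "read/write",
    ("bob", "photos"): "read",
    ("bob", "music"): "read/write",
    ("guest", "music"): "read",
}

def effective(principal, resource):
    """Return the effective permission, defaulting to deny."""
    return policy.get((principal, resource), "deny")

def render_grid():
    """Render the policy as a grid: one row per principal, one column per resource."""
    width = 12
    header = "".ljust(width) + "".join(r.ljust(width) for r in resources)
    rows = [header]
    for p in principals:
        cells = "".join(effective(p, r).ljust(width) for r in resources)
        rows.append(p.ljust(width) + cells)
    return "\n".join(rows)

print(render_grid())
```

The point of the grid layout is that gaps in a policy (the implicit "deny" cells) become visible alongside the explicit grants, rather than being hidden in a rule list.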
K. Vaniea, L. Bauer, L.F. Cranor, and M.K. Reiter. Studying access control usability in the lab: Lessons learned from four studies. In LASER 2012–Learning from Authoritative Security Experiment Results, July 2012.
K. Vaniea, L. Bauer, L.F. Cranor, and M.K. Reiter. Out of sight, out of mind: Effects of displaying access-control information near the item it controls. In Proceedings of the Tenth Annual Conference on Privacy, Security and Trust, July 2012.
Usable Cyber Trust Indicators
When systems rely on a "human in the loop" to carry out a security-critical function, cyber trust indicators are often employed to communicate when and how to perform that function. Cyber trust indicators typically serve as warnings or status indicators that communicate information, remind users of information previously communicated, and influence user behavior. They include a variety of security- and privacy-related symbols in the operating system status bar or browser chrome, pop-up alerts, security control panels, or symbols embedded in web content. However, a growing body of literature has found the effectiveness of many of these indicators to be rather disappointing. It is becoming increasingly apparent that humans are a major cause of computer security failures and that security warnings and other cyber trust indicators are doing little to prevent humans from making security errors. In some cases, it may be possible to redesign systems to minimize the need for humans to perform security-critical functions, thus reducing or eliminating the need for security warnings. However, in many cases it may be too expensive or difficult to automate security-critical tasks, and systems may need to rely on human judgment. In these cases, it is important to situate security indicators both spatially and temporally to maximize their effectiveness, and to design them to communicate clearly to users. The goal of this research is to systematically study the effectiveness of cyber trust indicators and develop approaches to making these indicators most effective and usable. We are currently focusing on security warning dialogs. See also our work on privacy indicators on our privacy decision making research page.
C. Bravo-Lillo, L. Cranor, S. Komanduri, S. Schechter, M. Sleeper. Harder to Ignore? Revisiting Pop-Up Fatigue and Approaches to Prevent It. SOUPS 2014.
C. Bravo-Lillo. Improving Computer Security Dialogs: An Exploration of Attention and Habituation. PhD Thesis, Engineering & Public Policy Department, Carnegie Mellon University, Pittsburgh, PA, May 2014.
C. Bravo-Lillo, L.F. Cranor, J. Downs, S. Komanduri, R.W. Reeder, S. Schechter, and M. Sleeper. Your Attention Please: Designing security-decision UIs to make genuine risks harder to ignore. In Proceedings of the Ninth Symposium On Usable Privacy and Security (SOUPS ’13), Newcastle, United Kingdom, 2013.
L. Bauer, C. Bravo-Lillo, L. Cranor, and E. Fragkaki. Warning Design Guidelines. CMU-CyLab-13-002. February 5, 2013.
C. Bravo-Lillo, L. Cranor, J. Downs, S. Komanduri, S. Schechter, and M. Sleeper, Operating system framed in case of mistaken identity: Measuring the success of web-based spoofing attacks on OS password-entry dialogs, in Proceedings of the 19th ACM Conference on Computer and Communications Security, ACM, 18 October 2012.
C. Bravo-Lillo, L.F. Cranor, J.S. Downs, S. Komanduri. Bridging the Gap in Computer Security Warnings: A Mental Model Approach. IEEE Security & Privacy, 2011: 18-26.
Usable security for digital home storage
We are exploring architecture, mechanisms, and interfaces for making access control usable by laypeople faced with increasing reliance on data created, stored, and accessed via home and personal consumer electronics. Digital content is becoming common in the home, as new content is created in digital form and people digitize existing content (e.g., photographs and personal records). Interesting and fun new devices make creating digital content easier and interacting with it much more flexible than ever before. The transition to digital homes is exciting, but brings many challenges. Perhaps the biggest challenge is dealing with access control. Users want to be able to access their content easily from any of their devices, including shared devices (e.g., the family DVR), and yet they also want to be able to restrict access to certain data among household members and visitors. They also want to be able to share data (e.g., photographs) selectively with friends and family outside their home. Unfortunately, studies repeatedly show that computer users have trouble specifying access-control policies. Worse, we are now injecting the need to do so into an environment with users who are much less technically experienced and notoriously impatient with complex interfaces. Without a holistic, usable approach to access control management, adoption of new technology in the home will be slowed and there will be no effective data security once the transition inevitably occurs. This project builds on the Perspective data management system developed by CMU's Parallel Data Lab.
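One approach explored in this project, in the spirit of the "Tag, You Can See It!" study below, replaces per-photo access-control lists with rules written over the tags users already apply to their photos. The sketch below is a hypothetical illustration of that idea, not the study's actual system; all file names, tags, and rules are made up.

```python
# Hypothetical sketch of tag-based access control for home photo sharing:
# users tag photos, and sharing rules are written over tags rather than
# over individual files. Not the actual system from the cited study.

photos = {
    "beach.jpg": {"family", "vacation"},
    "passport_scan.jpg": {"private"},
    "party.jpg": {"friends"},
}

# Each viewer may see photos carrying at least one of their allowed tags;
# unknown viewers get no tags and therefore see nothing.
rules = {
    "grandma": {"family"},
    "coworker": {"vacation"},
}

def visible_photos(viewer):
    """Return the photos whose tag sets intersect the viewer's allowed tags."""
    allowed = rules.get(viewer, set())
    return {name for name, tags in photos.items() if tags & allowed}
```

The appeal of this design is that the policy grows with the photo collection: tagging a new photo "family" automatically shares it with everyone whose rule includes that tag, with no per-file decision required.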
P. Klemperer, Y. Liang, M. Mazurek, M. Sleeper, B. Ur, L. Bauer, L.F. Cranor, N. Gupta, and M. Reiter. Tag, You Can See It! Using Tags for Access Control in Photo Sharing. CHI 2012.
M. Mazurek, J.P. Arsenault, J. Bresee, N. Gupta, I. Ion, C. Johns, D. Lee, Y. Liang, J. Olsen, B. Salmon, R. Shay, K. Vaniea, L. Bauer, L.F. Cranor, G.R. Ganger, and M.K. Reiter. Access Control for Home Data Sharing: Attitudes, Needs and Practices. CHI 2010.
Passwords
To combat both the inherent and user-induced weaknesses of text-based passwords, administrators and organizations typically institute a series of rules – a password policy – to which users must adhere when choosing a password. There is consensus in the literature that a properly written password policy can provide an organization with increased security. There is, however, less accord in describing just what such a well-written policy would be, or even how to determine whether a given policy is effective. Although it is easy to calculate the theoretical password space that corresponds to a particular password policy, it is difficult to determine the practical password space. Users may, for example, react to a policy rule requiring them to include numbers in passwords by overwhelmingly picking the same number, or by always using the number in the same location in their passwords. There is little published empirical research that studies the strategies used by actual users under various password policies. In addition, some password policies, while resulting in stronger passwords, may make those passwords difficult to remember or type. This may cause users to engage in a variety of behaviors that might compromise the security of passwords, such as writing them down, reusing passwords across different accounts, or sharing passwords with friends. Other undesirable side effects of particular password policies may include frequently forgotten passwords. In fact, the harm caused by users following an onerously restrictive password policy may be greater than the harm prevented by that policy. In this project, we seek to advance understanding of the factors that make creating and following appropriate password policies difficult, collect empirical data on password entropy and memorability under various password policies, and propose password policy guidelines to simultaneously maximize security and usability of passwords.
We also explore the security and usability of some new types of passwords.
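The gap between theoretical and practical password space described above can be made concrete with a short calculation. The theoretical space for length-L passwords over an alphabet of size A is simply A**L; the practical space collapses when users respond to a rule predictably. The user-behavior model below (appending a single digit to a lowercase word) is an illustrative assumption, not a measured result.

```python
# Illustrative calculation of theoretical vs. practical password space.
# The "predictable" model below is an assumption for illustration only.
import math

def theoretical_space(alphabet_size, length):
    """Number of distinct strings of the given length over the alphabet."""
    return alphabet_size ** length

def bits(n):
    """Express a search-space size in bits (log base 2)."""
    return math.log2(n)

# An 8-character password over the 95 printable ASCII characters:
full = theoretical_space(95, 8)

# If a "must contain a digit" rule leads most users to append one digit
# to a 7-letter lowercase word, the effective space is far smaller:
predictable = theoretical_space(26, 7) * 10

print(f"theoretical: 2^{bits(full):.1f}")   # roughly 2^52.6
print(f"predictable: 2^{bits(predictable):.1f}")  # roughly 2^36.2
```

The sixteen-bit gap in this toy example is the kind of difference between a policy's paper strength and its strength against guessing attacks that our empirical studies aim to measure.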
Richard Shay, Saranga Komanduri, Adam L. Durity, Philip (Seyoung) Huh, Michelle L. Mazurek, Sean M. Segreti, Blase Ur, Lujo Bauer, Nicolas Christin, and Lorrie Faith Cranor. Can long passwords be secure and usable? In CHI 2014: Conference on Human Factors in Computing Systems, April 2014. ACM. [Video teaser]
M.L. Mazurek, S. Komanduri, T. Vidas, L. Bauer, N. Christin, L.F. Cranor, P.G. Kelley, R. Shay, and B. Ur. Measuring Password Guessability for an Entire University. ACM CCS 2013.
J. Blocki, S. Komanduri, A. Procaccia, and O. Sheffet. 2013. Optimizing password composition policies. In Proceedings of the fourteenth ACM conference on Electronic commerce (EC '13). ACM, New York, NY, USA, 105-122.
P.G. Kelley, S. Komanduri, M.L. Mazurek, R. Shay, T. Vidas, L. Bauer, N. Christin and L.F. Cranor. The impact of length and mathematical operators on the usability and security of system-assigned one-time PINs. USEC 2013.
B. Ur, P.G. Kelley, S. Komanduri, J. Lee, M. Maass, M.L. Mazurek, T. Passaro, R. Shay, T. Vidas, L. Bauer, N. Christin, L.F. Cranor, S. Egelman, and J. Lopez. Helping Users Create Better Passwords. ;login: Vol 37, No. 6, December 2012.
B. Ur, P.G. Kelley, S. Komanduri, J. Lee, M. Maass, M. Mazurek, T. Passaro, R. Shay, T. Vidas, L. Bauer, N. Christin, and L.F. Cranor. How does your password measure up? The effect of strength meters on password creation. USENIX Security 2012.
R. Shay, P.G. Kelley, S. Komanduri, M. Mazurek, B. Ur, T. Vidas, L. Bauer, N. Christin, L.F. Cranor. Correct horse battery staple: Exploring the usability of system-assigned passphrases. SOUPS 2012.
Patrick Gage Kelley, Saranga Komanduri, Michelle L. Mazurek, Rich Shay, Tim Vidas, Lujo Bauer, Nicolas Christin, Lorrie Faith Cranor, Julio Lopez. Guess again (and again and again): Measuring password strength by simulating password-cracking algorithms. 2012 IEEE Symposium on Security and Privacy (Oakland) [CyLab Technical Report cmu-cylab-11-008, August 21, 2011.]
Looking for some of our work that you can't find under "current projects"? Check here for our past projects.
Supporting trust decisions
When Internet users are asked to make "trust" decisions they often make the wrong decision. Implicit trust decisions include decisions about whether or not to open an email attachment or provide information in response to an email that claims to have been sent by a trusted entity. Explicit trust decisions are decisions made in response to specific trust- or security-related prompts such as pop-up boxes that ask the user whether to trust an expired certificate, execute downloaded software, or allow macros to execute. Attackers are able to take advantage of most users' poor trust decision-making skills through a class of attacks known as "semantic attacks." It is not always possible for systems to make accurate trust decisions on a user's behalf, especially when those decisions require knowledge of contextual information. The goal of this research is not to make trust decisions for users, but rather to develop approaches to support users when they make trust decisions. Our research began with a mental models study aimed at understanding and modeling how people make trust decisions in the online context and ultimately resulted in the development of anti-phishing training tools and filtering software. The tools developed by this project are being commercialized by Wombat Security. For our publications, see the Supporting trust decisions project page.
Usable anonymity tools
A variety of tools have been developed to provide anonymity for various types of online interactions. Most of the work in this area has focused on improving the anonymity properties of these tools, and little has been done to improve their usability. We have been working on developing more usable interfaces for Tor.
FoxTor download and FAQ
Other Selected Publications
J. Wiese, A.J. Brush, T. Scott Saponas. Phoneprioception: enabling mobile phones to infer where they are kept. CHI 2013.
T. Vidas, E. Owusu, S. Wang, C. Zeng, and L. Cranor. QRishing: The Susceptibility of Smartphone Users to QR Code Phishing Attacks, USEC 2013 [originally published as CyLab Technical Report CMU-CyLab-12-022, November 2012].
M. Sleeper, D. Sharma, and L. Cranor. I Know Where You Live: Analyzing Privacy Protection in Public Databases. cmu-cylab-11-015, October 2011. [Extended version of paper presented at WPES 2011]
H. Hibshi, T. Vidas, and L. Cranor. Usability of Forensics Tools: A User Study. IT Security Incident Management and IT Forensics (IMF), 10-12, May 2011.
Janne Lindqvist, Justin Cranshaw, Jason Wiese, Jason Hong, and John Zimmerman. I'm the Mayor of My House: Examining Why People Use foursquare - a Social-Driven Location Sharing Application. In CHI 2011: Conference on Human Factors in Computing Systems, May 2011.
Timothy Vidas, Nicolas Christin, Lorrie Cranor. Curbing Android Permission Creep. Web 2.0 Security & Privacy 2011. Oakland, CA, May 26, 2011.
J. Downs, M. Holbrook, S. Sheng, and L. Cranor. Are Your Participants Gaming the System? Screening Mechanical Turk Workers. CHI 2010.
Sarah Spiekermann and Lorrie Faith Cranor. Engineering Privacy. IEEE Transactions on Software Engineering, Vol. 35, No. 1, January/February 2009, pp. 67-82.
Ahren Studer, Christina Johns, Jaanus Kase, Kyle O'Meara, Lorrie Cranor. A Survey to Guide Group Key Protocol Development. Annual Computer Security Applications Conference (ACSAC) 2008, December 8-12, 2008, Anaheim, CA.
A. McDonald and L. Cranor. How Technology Drives Vehicular Privacy. I/S: A Journal of Law and Policy for the Information Society Volume 2, Issue 3 (2006).
X. Sheng and L. Cranor. An Evaluation of the Effectiveness of US Financial Privacy Legislation Through the Analysis of Privacy Policies. I/S: A Journal of Law and Policy for the Information Society, Volume 2, Number 3, Fall 2006, pp. 943-979.
L. Cranor. 'I Didn't Buy it for Myself': Privacy and Ecommerce Personalization. Proceedings of the 2nd ACM Workshop on Privacy in the Electronic Society, October 30, 2003, Washington, DC.
L. Cranor, J. Hong, and M. Reiter. Teaching Usable Privacy and Security: A guide for instructors. 2007.
S. Egelman and L. Cranor. The Real ID Act: Fixing Identity Documents with Duct Tape. I/S: A Journal of Law and Policy for the Information Society, Volume 2, Number 1, Winter 2006, pp. 149-183.
M. Geiger and L. Cranor, Counter-Forensic Privacy Tools: A Forensic Evaluation. ISRI Technical Report. CMU-ISRI-05-119, 2005.
Romanosky, S., Acquisti, A., Hong, J., Cranor, L. F., and Friedman, B. 2006. Privacy patterns for online interactions. In Proceedings of the 2006 Conference on Pattern Languages of Programs (Portland, Oregon, October 21 - 23, 2006). PLoP '06. ACM, New York, NY, 1-9.
Join our cups-friends mailing list for announcements about our papers and events, and for discussions about usable privacy and security.
Security and Usability: Designing Secure Systems that People Can Use, edited by Lorrie Cranor and Simson Garfinkel, is now available.
The HCISec Bibliography contains a good list of CUPS-related publications.
Usable Security Blog from UC Berkeley
Usability, Psychology, and Security workshop
Vizsec - a research and development community interested in applying information visualization techniques to the problems of computer security
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or any of our other funders.