Usable Cyber Trust Indicators

When systems rely on a "human in the loop" to carry out a security-critical function, cyber trust indicators are often employed to communicate when and how to perform that function. Cyber trust indicators typically serve as warnings or status indicators that communicate information, remind users of information previously communicated, and influence user behavior. They include a variety of security- and privacy-related symbols in the operating system status bar or browser chrome, pop-up alerts, security control panels, and symbols embedded in web content. However, a growing body of literature has found the effectiveness of many of these indicators to be disappointing. It is becoming increasingly apparent that humans are a major cause of computer security failures and that security warnings and other cyber trust indicators do little to prevent humans from making security errors.

In some cases it may be possible to redesign systems to minimize the need for humans to perform security-critical functions, reducing or eliminating the need for security warnings. In many cases, however, automating security-critical tasks may be too expensive or difficult, and systems may need to rely on human judgment. In these cases, it is important to situate security indicators both spatially and temporally to maximize their effectiveness, and to design them to communicate clearly to users.

The goal of this research is to systematically study the effectiveness of cyber trust indicators and to develop approaches for making these indicators as effective and usable as possible. We are currently focusing on security warning dialogs. See also our work on privacy indicators on our privacy decision making research page.

C. Bravo-Lillo, L. Cranor, S. Komanduri, S. Schechter, and M. Sleeper. Harder to Ignore? Revisiting Pop-Up Fatigue and Approaches to Prevent It. SOUPS 2014.
C. Bravo-Lillo. Improving Computer Security Dialogs: An Exploration of Attention and Habituation. PhD Thesis, Engineering & Public Policy Department, Carnegie Mellon University, Pittsburgh, PA, May 2014.
C. Bravo-Lillo, L. F. Cranor, J. Downs, S. Komanduri, R. W. Reeder, S. Schechter, and M. Sleeper. Your Attention Please: Designing Security-Decision UIs to Make Genuine Risks Harder to Ignore. In Proceedings of the Ninth Symposium On Usable Privacy and Security (SOUPS '13), Newcastle, United Kingdom, 2013.
L. Bauer, C. Bravo-Lillo, L. Cranor, and E. Fragkaki. Warning Design Guidelines. CMU-CyLab-13-002, February 5, 2013.
C. Bravo-Lillo, L. Cranor, J. Downs, S. Komanduri, S. Schechter, and M. Sleeper. Operating System Framed in Case of Mistaken Identity: Measuring the Success of Web-Based Spoofing Attacks on OS Password-Entry Dialogs. In Proceedings of the 19th ACM Conference on Computer and Communications Security, ACM, 18 October 2012.
C. Bravo-Lillo, L. F. Cranor, J. S. Downs, and S. Komanduri. Bridging the Gap in Computer Security Warnings: A Mental Model Approach. IEEE Security & Privacy, 2011, pp. 18-26.
C. Bravo-Lillo, L. F. Cranor, J. S. Downs, S. Komanduri, and M. Sleeper. Improving Computer Security Dialogs. In Proceedings of the 13th IFIP TC13 Conference on Human-Computer Interaction (INTERACT 2011), 2011, pp. 18-35.
J. Sunshine, S. Egelman, H. Almuhimedi, N. Atri, and L. Cranor. Crying Wolf: An Empirical Study of SSL Warning Effectiveness. USENIX Security 2009.
L. Cranor. A Framework for Reasoning About the Human in the Loop. Usability, Psychology and Security 2008.
S. Egelman. Trust Me: Design Patterns for Constructing Trustworthy Trust Indicators. PhD Thesis, Computation, Organizations and Society, Carnegie Mellon University, Pittsburgh, PA, CMU-ISR-09-110, April 2009.
S. Egelman, L. Cranor, and J. Hong. You've Been Warned: An Empirical Study of the Effectiveness of Web Browser Phishing Warnings. CHI 2008.
L. Cranor. What Do They "Indicate?": Evaluating Security and Privacy Indicators. interactions, May/June 2006, pp. 45-57.
B. Kowitz and L. Cranor. Peripheral Privacy Notifications for Wireless Networks. In Proceedings of the 2005 Workshop on Privacy in the Electronic Society, 7 November 2005, Alexandria, VA.