Last updated 6 January 2007
This guide provides suggested curricula, readings, assignments, and other materials for courses on usable privacy and security (UPS). While this guide is geared towards a full-semester interdisciplinary graduate course, we also offer suggestions for using these materials in undergraduate courses and tutorials, and as modules in other courses that will devote only a small amount of time to usable privacy and security.
There is growing recognition that technology alone will not provide all of the solutions to security and privacy problems. Human factors play a significant role in these areas, and it is important for security, privacy, and trustworthy computing experts to have an understanding of how people will interact with the systems they develop. Likewise, human-computer interaction experts who work in the secure systems area need some understanding of computer security and privacy. In 2003 the Computing Research Association challenged the information security and assurance research community to "give end-users security controls they can understand and privacy they can control for the dynamic, pervasive computing environments of the future." Meeting this challenge will require an interdisciplinary approach involving researchers educated in usability and human factors as well as trustworthy computing.
Usable privacy and security is an interdisciplinary topic, and thus lends itself to a variety of different types of courses. The curriculum will vary depending on the amount of time available and the background of the participants. It is important to decide up front whether the course will have prerequisites, and if not, how much background will be taught.
A usable privacy and security course for students who have a background in computer security but not in human-computer interaction should include substantial instruction on human-computer interaction methods. A mini course in human-computer interaction methods may need to be integrated into this course, especially if students are going to be conducting user studies as part of course assignments. A review of computer security topics most relevant to the class should probably be included, but instructors can assume students are coming to the course with a general understanding of computer security concepts. Since most computer security courses have only limited instruction on privacy, if there is going to be a privacy component to the course, some instruction on privacy will also be needed. This kind of course should include some case studies and research papers on the design and evaluation of usable privacy and security, but there probably will not be a lot of time for discussing research papers.
A usable privacy and security course for students who have a background in human-computer interaction but not in computer security should introduce students to a variety of computer security concepts. The course should discuss threat analysis and introduce students to concepts such as least privilege. It should introduce students to commonly-used security tools and components of those tools, for example symmetric and public-key cryptography, password authentication, and firewalls. This course should also introduce students to privacy issues. Only a review of human-computer interaction methods will be needed. This kind of course should include some case studies and research papers on the design and evaluation of usable privacy and security, but there probably will not be a lot of time for discussing research papers.
A usable privacy and security course offered to students with a variety of backgrounds should provide an overview of computer security, privacy, and human-computer interaction methods. Teaching such a course can be challenging because material that will be brand new for some students will be well-known by others. On the other hand, such a course presents an opportunity for having students teach each other, especially if most of the students are graduate students. When we taught this course at CMU in 2006 we spent only a small amount of time teaching basic concepts of security, privacy, and human-computer interaction, and instead delved right into case studies, research papers, and projects. We divided the students into interdisciplinary teams and relied on the students to teach each other.
A usable privacy and security course that requires prerequisite courses in both computer security and human-computer interaction can begin with only a brief review of these areas and then concentrate on their intersection. Such a course would likely focus mostly on case studies and research papers, and would lend itself to a seminar-style course.
A usable privacy and security course for undergraduate students, regardless of background, will likely include more lectures and associated homework assignments on basic concepts than a course for graduate students. A graduate course, especially one for PhD students, can be taught in more of a seminar style with readings from research papers and student presentations. A course for masters students might be taught like a PhD course if the masters students are in a research-oriented program, or more like an undergraduate course if the masters students expect to focus on practical skills they will be able to apply immediately in their careers.
There is ample material on usable privacy and security to warrant a full semester course. A course of this length provides sufficient time for students to complete a major project. If a shorter course is desired, most of the material from a full-semester course could be covered in a half-semester or "mini" course without including a project. A project course could be offered as a second mini course for interested students.
Lectures on usable privacy and security could be included as modules in other courses. A computer security course might include one or a few lectures on human-computer interaction, while a human-computer interaction course might include one or a few lectures on computer security and privacy. In both cases, the use of well-chosen case studies can help students understand the relevance of the material to the rest of the course. Besides teaching key concepts in such a module, a goal should be to give students an appreciation of what is involved in usable security design and evaluation as well as the benefits of interdisciplinary collaboration.
Tutorials and short professional development courses on usable privacy and security can be approached in a similar manner as UPS modules in other courses. The length of the tutorial and background of the audience will influence the types of concepts and skills that are taught.
We would like to thank Microsoft Research for funding the development of this instructor's guide. We would also like to thank the students who took our course and provided us with lecture notes, activity ideas, and feedback. Finally, we would like to thank Simson Garfinkel and Charles Frank for reviewing a draft of this guide and providing valuable feedback.
Instructors: Lorrie Cranor, Jason Hong, and Michael Reiter
Course web site | Syllabus
We developed a full semester course on usable privacy and security for the Carnegie Mellon School of Computer Science beginning in the Spring of 2006. This course was designed to introduce students to a variety of usability and user interface problems related to privacy and security and to give them experience in designing and conducting studies aimed at helping to evaluate usability issues in security and privacy systems. Topics covered included: human-computer interaction methods for design and evaluation, secure interaction design, trust and semantic attacks, design for privacy, making privacy visible, web browser privacy and security, authentication and alternatives to text passwords, PKIs and usability, usable secure communications systems, and usable tools for security administration. Students analyzed usability issues related to a number of deployed security and privacy systems. In addition, students conducted group projects involving usability analyses and pilot user studies.
The course was designed to be suitable both for students interested in trustworthy computing who would like to learn more about usability, as well as for students interested in usability who would like to learn more about trustworthy computing. The course was open to all Carnegie Mellon graduate students with sufficient technical background. Juniors and seniors were given the opportunity to enroll with permission of one of the instructors. This course satisfied elective requirements for Carnegie Mellon computer science students at all levels, as well as for some students in other Carnegie Mellon colleges. We specifically targeted the course for students in the following degree programs: Masters in Human-Computer Interaction; Master of Science in Information Security Policy and Management; Master of Science in Information Security Technology and Management; Master of Software Engineering; PhD in Computation, Organizations and Society; PhD in Human-Computer Interaction; and PhD in Computer Science.
We ended up with 15 students enrolled in the course when we offered it in Spring 2006. These included one undergraduate (a senior in electrical and computer engineering), a non-degree student, 7 masters students, and 6 PhD students. All of the targeted degree programs were represented except software engineering.
The course was developed and taught by an interdisciplinary team of three Carnegie Mellon faculty members. All three faculty members have experience and expertise in the usable privacy and security area. In addition, each conducts research with a focus on a different area: Lorrie Cranor focuses on privacy, Mike Reiter focuses on security, and Jason Hong focuses on human-computer interaction. Having three faculty members from three different disciplines involved was a great way to teach an interdisciplinary course; however, a course like this could be taught by a single faculty member. In that case, it would be especially helpful to include some guest lecturers.
Our approach in teaching the course was to first provide students with a background in the methodologies used in the relevant fields, and then to expose them to current research in this area as well as examples of secure system design with varying levels of usability. Introductory lectures were given by each of the three faculty members at the beginning of the semester and at the beginning of each of the three major sections of the course (usability, privacy, and security). Students were assigned to prepare lectures for most of the remaining lectures, including a set of discussion questions and a class "activity." The student lectures were consistently well done and served as learning experiences for both the students who gave them and the rest of the class. We were especially pleased with some of the activities the students came up with. We have included some of these activities in the Course Materials section below.
Students were given a substantial reading assignment each week that included chapters from the course text and research papers. Students were required to submit a short summary (3-8 sentences) and a "highlight" for each chapter or article in the reading assignment. The highlight could be something the student found particularly interesting or noteworthy, a question for class discussion, a point of disagreement, etc. We required students to bring their summaries and highlights with them to class in order to receive credit for doing them (in this way these assignments doubled as a way of taking attendance and penalizing students for missing class). The summaries and highlights were graded for completeness. Students were permitted to drop their lowest two grades.
During the fourth week of the semester students were asked to propose group project topics and pitch them to their classmates during one of the class meetings (we also offered some suggested topics). Students were then instructed to assemble themselves into project groups of 3 or 4 students with diverse backgrounds (we provided a set of rules to ensure that students in each degree program would be spread across the groups). The students worked in their groups for the rest of the semester on their projects. They gave progress report presentations during the 11th week of the semester and final presentations during the final exam period. Students were asked to pick a project that would involve a pilot user study to evaluate the design of an existing or proposed privacy- or security-related system or gain insight into users' attitudes or mental models related to some aspect of security or privacy. Although students were permitted to conduct their pilot study on their classmates, all four groups decided to go through the IRB approval process and recruit outside volunteers for their studies so that they could produce publishable research results. All four groups ended up presenting their results as posters at the 2006 Symposium On Usable Privacy and Security (SOUPS). The students told us that they got a lot out of doing the project. The interdisciplinary teams worked very well. We saw the HCI students teaching the security students how to conduct user studies, and we saw the security students teaching the HCI students about security topics. If we had not had such a good mix of students in the class the faculty would have had to spend more time with the project groups or devoted more lecture time to the details of how to conduct user studies and on security topics related to the projects.
Because this was a new course, we conducted three evaluations of the course throughout the semester, in addition to our university's standard faculty course evaluations. The feedback was extremely favorable on each of the four evaluations. Our approach to providing only a small number of introductory lectures seemed to be successful. Students reported that the introductory lectures in the areas outside their primary area of expertise were very informative, while the lectures in their area of expertise were less informative, but still interesting. The students really liked the guest lecture we had from a PhD student who talked about a published paper he had written in the usable privacy and security area. Besides talking about his results, he gave the students a look behind the scenes at how he conducted his study. Most students said they liked the seminar style approach to the course. However, some students indicated that they would have preferred to have more faculty lectures and fewer student lectures. In future versions of this course we will likely include additional guest lectures and perhaps some additional faculty lectures as well.
Instructors: Dhamija and Stuart
Course web site | Syllabus
This course covered content very similar to the CMU course. It included many of the same reading assignments as the CMU course and a similar term project. However, the instructors took a somewhat different approach to covering the course material, which is well-suited for a course in which most students do not have backgrounds in HCI. This course included take-home and in-class assignments to guide students through the interface design and usability study that is part of their semester-long group project. Students were assigned to perform a task analysis, develop design sketches, create and evaluate a low-fidelity prototype, create and evaluate a high-fidelity prototype, conduct cognitive walkthroughs or heuristic evaluations, conduct pilot usability tests, and finally conduct formal usability tests. Each week one step in this process was partially completed in class and students provided suggestions to the other groups.
Similar to the CMU course, this course included weekly reading assignments, for which students were expected to write short summaries and commentaries. For this class the commentaries were submitted to an online discussion board so that students could read each other's comments and add further comments.
Some additional topics covered in this course that were not explicitly covered in the CMU course include human interactive proofs, mobile and ubiquitous computing, and digital rights management.
This course had about a dozen students in it when it was taught in Spring 2006.
Course web site | Syllabus
This course placed more emphasis on teaching security and privacy concepts and less emphasis on HCI methods than either the CMU or Harvard courses. It was similar to a traditional security course with 50% devoted to security, 25% devoted to privacy, and 25% devoted to usability. It included more traditional computer security readings and assignments, in addition to readings from the Security and Usability book. Students were required to conduct a literature review for a mid-term project and a pilot study for a final project. Projects were conducted in small groups.
This course was offered both on campus and remotely (via real-time video feed) in Fall 2005. About 30 students took it on campus and 30 students took it remotely. The course was also offered in Fall 2004 at the Harvard Extension School and in Summer 2004 at Northeastern University.
We have broken down our course materials into 10 segments, similar to the approach we took when teaching our course at CMU. We spent between one and four lectures on each of these segments (for a total of 24 lectures). These segments can be covered in almost any order, although it would make sense to include the overview and introductory segments near the beginning of a course.
For each segment we provide a brief overview of the topics covered, suggested readings, discussion questions, activities, and pointers to slides and lecture notes. The discussion questions and activities might be used in class or as part of a homework assignment. In our class most of the activities were done in class with the students divided into small groups. The students were split into different groups for each class meeting. Some of the activities we have included here would be better done as homework assignments, ideally with a follow-up class discussion. Most of the slide sets and lecture notes have been provided by our students. We provide them unedited, as examples of ways to present material relevant to each segment. Many of the discussion questions and activities were also suggested by our students.
Our course used the book Security and Usability: Designing Secure Systems that People Can Use, edited by Lorrie Cranor and Simson Garfinkel. We have recommended readings from this book in our list of suggested readings. The Symposium On Usable Privacy and Security and the HCISec Bibliography are also good sources of readings. For our course we typically assigned 3 or 4 readings each week and also recommended a number of optional readings.
This segment is designed to provide an overview of the course and interest students in the topic. We presented this in two lectures: one on the first day of class and one after we had presented our three introductory lectures. We have also given stand-alone 1-hour lectures on this topic at conferences and in professional development courses. In this segment students should be exposed to a range of problems in the usable privacy and security area. They should see and discuss some tangible examples of both unusable and usable design related to security or privacy.
The activities for this segment are designed to get students thinking about usable privacy and security problems and how to solve them. The two example activities presented here are related to authentication, but activities might be developed around any of the topics that will be discussed in this course.
Mobile device password scenario
Your target users are young professionals who are constantly on the go. These users have a variety of devices with them at all times, such as laptop, cell phone, PDA, and possibly others. Each of these devices requires a password for use. The users will use their devices in a variety of public settings and will need easy access to their data at a moment's notice. However, the consequences of someone else gaining access to their data are severe. The dynamic environment poses a threat to the entry of passwords into these devices, but hardware limitations preclude the use of biometrics. Discuss how to address security needs with passwords in this context.
Questions to Keep in Mind: Should each device have its own password or should there be one password? Should users change their passwords on a regular basis? Should passwords be randomly assigned or chosen by users? What precautions should be taken for password entry in public spaces?
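One way to frame the "one password or many" question is to note that the two options are not mutually exclusive: a user can memorize a single secret while each device receives its own derived password. Below is a minimal Python sketch of this idea using PBKDF2 from the standard library; the function name, iteration count, and output length are illustrative choices, not a vetted design.

```python
import base64
import hashlib

def derive_device_password(master_secret, device_name, length=12):
    """Derive a distinct password for each device from one memorized secret.

    The device name acts as the salt, so "laptop" and "phone" yield
    unrelated passwords even though the user remembers only one secret.
    """
    key = hashlib.pbkdf2_hmac(
        "sha256",
        master_secret.encode("utf-8"),
        device_name.encode("utf-8"),  # per-device salt
        100_000,                      # iteration count (illustrative)
    )
    # Encode to printable characters and truncate to the desired length.
    return base64.b64encode(key).decode("ascii")[:length]

# The user memorizes one secret; each device gets its own password.
laptop = derive_device_password("correct horse battery", "laptop")
phone = derive_device_password("correct horse battery", "phone")
```

Students might discuss what this gains (device compromise does not reveal the master secret or the other passwords) and what it leaves unsolved (the derived passwords still must be entered in public settings).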
Hospital workstation password scenario
Your target users work in a hospital. Confidentiality of patient data cannot be compromised. Different employees have different levels of clearance within the one system that controls all of the patient records. There are a limited number of public workstations that are highly trafficked throughout the day. Current practice at the hospital is that one worker logs in and often many people with different levels of clearance work under that same account, even though they are not authorized to do so. Often, the workstation remains logged in between users, so an unauthorized user could gain access to patient records. In addition, passwords change on a monthly basis, so it is more convenient for the workers to just use the account that has already been logged in than to try to recall their constantly changing passwords. Management insists that the passwords must change frequently to reduce the risk of a hacker viewing the confidential data. Discuss how to address security needs with passwords or other forms of authentication in this context.
Questions to Keep in Mind: Should users change their passwords on a regular basis? Should passwords be randomly assigned or chosen by users? What precautions should be taken for password entry in public spaces? Are there any alternatives to password authentication that might work in this setting? What can be done to change the workers' attitudes about security?
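One commonly discussed mitigation for the logged-in-workstation problem is an inactivity auto-lock. A minimal sketch of the bookkeeping involved appears below; the class, timeout value, and method names are illustrative and not drawn from any real hospital system.

```python
import time

class WorkstationSession:
    """Sketch of an inactivity auto-lock: the session locks itself if no
    input is seen within `timeout` seconds, so an account is not left
    open for the next (possibly unauthorized) user."""

    def __init__(self, user, timeout=120.0):
        self.user = user
        self.timeout = timeout
        self.last_activity = time.monotonic()
        self.locked = False

    def record_activity(self):
        """Called on every keystroke or mouse event while unlocked."""
        if not self.locked:
            self.last_activity = time.monotonic()

    def check(self, now=None):
        """Return True if the session is (now) locked."""
        if now is None:
            now = time.monotonic()
        if now - self.last_activity >= self.timeout:
            self.locked = True
        return self.locked
```

A good discussion point is that the timeout value is itself a usability trade-off: too short and workers will be re-entering passwords constantly (encouraging workarounds), too long and the window for unauthorized access remains.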
This part of the course provides an overview of human-computer interaction, including how to gather information about existing work practices through field studies, how to get quick feedback from end-users through low-fidelity paper prototypes, how to evaluate existing systems through user studies, and understanding mental models.
Why Johnny Can't Encrypt
Human subjects training
Students should complete your university's human subjects training. For example, CMU requires people doing human subjects research to take the National Cancer Institute's online training course.
Observations of people using technology
Observe people in a public place using a computerized system. For example, you might observe people using a public transit ticket machine, a parking garage pay station, a grocery store self-checkout machine, or an airport self-check-in kiosk. Stay long enough to observe both experienced and inexperienced users using the system. What kinds of problems did people have using the system? What aspects of the system appeared to be easy to learn? What aspects of the system appeared to be difficult to learn? What aspects of the system seemed to frustrate experienced users? How might the design of the system be improved?
Quick user study design
In a small group, examine a piece of technology and design a simple user study to test the usability of a particular aspect of that technology. The user study should involve having a user perform a task that will take no more than a few minutes.
Trade one member of your group with another group. The person from the other group will be your test participant. Conduct a pilot study with your test participant.
This is an activity that can be conducted during class in as little as 30 minutes to give students a taste of what it is like to design and conduct a user study -- divide the students into groups and have them spend 10 minutes on the first part and 10 minutes on the second part. Use the last 10 minutes for a discussion. In our class a cell phone, a portable CD player, and a calculator were tested.
Good and bad design examples
Look for examples of good and bad user interface design related to privacy and security. Offer suggestions for improving the bad designs. Keep a design scrap book throughout the semester or bring examples in for "show and tell" each week. (See http://www.baddesigns.com/ for bad design examples that are not necessarily related to privacy or security.)
This segment provides introductory material on privacy. While most people know privacy when they see it, few have given much thought to exactly what privacy is.
Technology and privacy risks
Pick a technology that causes privacy concerns.
Pick a particular industry or type of web site and use Privacy Finder to find three different companies from that industry, at least one of which has a P3P policy.
Examine the privacy policies for sites you found. For each policy, provide the URL and list major strengths and weaknesses related to both substance and presentation.
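For students curious how a tool like Privacy Finder locates a site's P3P policy: under P3P 1.0, a site publishes a policy reference file, conventionally at the well-known location /w3c/p3p.xml, which points to the full policy. A minimal Python sketch of parsing such a file follows; the sample document is fabricated for illustration.

```python
import xml.etree.ElementTree as ET

# A sample P3P policy reference file, of the kind normally served from
# the well-known location http://example.com/w3c/p3p.xml.
POLICY_REF = """\
<META xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY-REFERENCES>
    <POLICY-REF about="/w3c/privacy.xml#policy">
      <INCLUDE>/*</INCLUDE>
    </POLICY-REF>
  </POLICY-REFERENCES>
</META>
"""

P3P_NS = "{http://www.w3.org/2002/01/P3Pv1}"

def policy_locations(policy_ref_xml):
    """Return the policy URIs declared in a P3P policy reference file."""
    root = ET.fromstring(policy_ref_xml)
    return [ref.get("about") for ref in root.iter(P3P_NS + "POLICY-REF")]
```

Seeing the machine-readable layer can help students articulate why presentation matters: the P3P file is precise but unreadable to end users, which is exactly the gap tools like Privacy Finder try to bridge.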
(Pop) cultural references to privacy
Find a reference to privacy in art, literature, advertising, or pop-culture (tv, movie, cartoon, etc.). Explain the reference and what it says about privacy. If possible, provide a URL or a copy of the privacy reference that you can share with the class.
This segment provides a brief introduction to concepts of computer security that are likely to be useful to the course participants. Since computer security is a very broad topic, this introduction is not intended to be comprehensive. Rather, the introduction focuses on topics for which background knowledge is likely to be useful when conducting user studies of security mechanisms.
The key concepts discussed in this introduction are:
The main goal of secure interaction design is to build computer systems that protect the interests of their legitimate users. Some examples might include protection from viruses, spyware, phishing, and accidental leaks of personal information.
Firefox security and privacy extensions
Download and review a number of Firefox extensions related to security and privacy (you can find them at https://addons.mozilla.org/extensions/). For each extension ask: Who is this extension designed for? What are their goals? Does the extension meet those goals? Does the interface accurately portray what the tool really does? Could a member of the intended audience use this extension?
Analyze the secure interaction design of a UI
Do a task analysis, cognitive walkthrough, and/or user study on a user interface that deals with security or privacy. Examples might include downloading software, opening up an attachment, logging into a web site, getting a certificate, being notified of a virus, etc. What are the strengths of the design? Weaknesses? What would you change to make it better?
This section looks at a class of security attacks that target end-users rather than computer systems themselves. Semantic attacks try to trick people into giving up information. The most common example of this is phishing attacks, in which scammers impersonate legitimate businesses to gather passwords and financial information.
Risks and approaches to mitigating them
Analyze the following situations. What risks are you exposed to? How could they be mitigated?
You receive a high-quality printed flyer in the mail advertising a new dating web site. You want to give it a try. It costs $20 per month. The first month is free, but you are required to provide checking account information so that payments are automatically deducted after the first month. You also need to fill out an extensive background and personality profile, including contact information.
While browsing a new web site you see an ad for an online electronics store. You have wanted to get a large HDTV for many months now. They are running a sale on their store brand. You aren't familiar with that brand, but it is a little cheaper than any of the name brands you have been keeping an eye on. And shipping is free for your first purchase. It has all the features you wanted. They accept all major credit cards.
You have been feeling pretty sick for several days now. One of your friends insists you have mono and should see the doctor. Your other friend says that is ridiculous and that it is probably just a cold. Since you just started a new job and your health insurance hasn't kicked in, it costs $80 to visit the doctor. You use Google to find an online healthcare website. You check the symptoms for mono on the site and you find your symptoms are consistent. (How would the situation change if your symptoms were consistent except for one?)
You see a newspaper ad for a new online bank. You haven't heard of it before, but it has really good interest rates and it says it is FDIC insured.
You receive an email advertising a free antivirus program. You were running a different one, but you let the subscription expire so you need a new one. You haven't heard of this other one before, but it is free.
Download and try an anti-phishing toolbar.
Anti-phishing on Web Sites
Show what different web sites are doing to fight off phishing attacks. Examples might include message centers, two-factor authentication, secrets between the end-user and web site (such as Bank of America's SiteKey), cards that have secret numbers that you can scratch off, honeypots, etc. Analyze anti-phishing solutions for failure modes and usability. Who is the user and who is the customer? What is the purpose of the anti-phishing system?
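To give students a concrete sense of the kinds of signals anti-phishing tools examine, here is a minimal URL heuristic checker in Python. The rules are illustrative only; real anti-phishing toolbars combine many more signals, including blacklists, page analysis, and reputation data.

```python
import re
from urllib.parse import urlparse

def suspicious_url(url):
    """Return a list of heuristic warnings for a URL (illustrative rules)."""
    warnings = []
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if "@" in parsed.netloc:
        # http://paypal.com@evil.com/ actually goes to evil.com
        warnings.append("userinfo before host (classic obfuscation)")
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
        warnings.append("raw IP address instead of a domain name")
    if host.count(".") >= 4:
        warnings.append("unusually deep subdomain nesting")
    if parsed.scheme != "https":
        warnings.append("connection is not encrypted")
    return warnings
```

An instructive follow-up exercise is to have students find legitimate URLs that trip these rules and phishing URLs that evade them, which quickly demonstrates why heuristics alone produce both false positives and false negatives.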
This segment looks at how to design privacy tools and how to design everyday software with privacy in mind. It addresses the challenges involved in communicating with users about privacy as well as the challenges associated with privacy configuration interfaces. We spent four lectures on this topic, in addition to an introduction to privacy lecture earlier in the course. If your course has not included an introduction to privacy segment, some of that material might be combined with the material in this segment. If your course has included an introduction to privacy segment, some of that material might be reviewed here or revisited in more depth. Topics to cover include:
Distribute words and phrases related to privacy to members of the class (or let them come up with their own) and ask them to draw a visual representation of them on the black board. The rest of the class tries to guess what the drawing represents.
Think about a piece of software with which you are familiar that has the ability to capture or disclose personal information (for example, a web browser, instant messaging client, video chat client, email client, etc.). Evaluate this software against the five pitfalls discussed in Five Pitfalls in the Design for Privacy. Propose ways of avoiding any of the pitfalls into which this software falls.
Alternatively, for each of the five pitfalls, come up with an example of a piece of software that falls into that pitfall. Propose a way to avoid this pitfall.
Download a privacy-related software application, use one you have already purchased, or use an online privacy service.
This could be done with specific types of privacy tools such as P3P user agents, cookie managers, anonymity tools, etc. Students could be asked to review more than one tool and compare them.
Examine the configuration interface for an online communication client (for example, instant messaging software) that allows users to control who has access to their online presence information. Come up with several examples of access control rules that typical users might wish to implement. Can the interface be used to implement these rules? Do you think most users will be able to figure out how to use this interface to implement these rules? Do you think most users will take the time to configure this interface? How could the configuration interface be improved? How could you perform a user study to test this interface?
Many software products contain "phone home" features, for example, for performing software updates or monitoring usage patterns. In some cases software phones home quite frequently, for example, to update phishing blacklists or check for fresh image files. Users may be concerned that the software company is using these features to track or profile them. Thus it is important that the software is up front about the fact that it is phoning home. Furthermore, some users may wish to disable such features or be prompted every time before the software phones home (due to privacy or other concerns), whereas other users are happy to have them operate automatically. Discuss the various approaches you have seen different software manufacturers take to addressing this problem. What do you like/dislike about them? How should phone home features be designed so that they facilitate informed consent? Describe an example user interface design and general principles that might be applied to specific cases. What sort of user studies should be performed to test this user interface design?
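The disable/prompt/automatic choice described above can be sketched as a small decision function. The preference names and the callback are hypothetical, introduced only to illustrate consent-gated behavior.

```python
# Illustrative sketch of a consent-gated "phone home" decision.
# Preference values ("always", "ask", "never") are assumptions,
# not taken from any real product.

def should_phone_home(preference, ask_user=None):
    """Decide whether an automatic network check may proceed.

    preference: "always" | "ask" | "never"
    ask_user:   callback that prompts the user and returns True/False;
                only consulted when preference == "ask".
    """
    if preference == "always":
        return True
    if preference == "never":
        return False
    if preference == "ask":
        # Fail closed: without a way to ask, do not phone home.
        return bool(ask_user()) if ask_user else False
    raise ValueError(f"unknown preference: {preference!r}")

print(should_phone_home("always"))                      # True
print(should_phone_home("never"))                       # False
print(should_phone_home("ask", ask_user=lambda: True))  # True
```

Note the fail-closed default when no prompt is possible; students can debate whether that or fail-open better matches user expectations for, say, security-critical blacklist updates.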
Web browsers are arguably the most widely used class of software today. However, web browsers do many things that users may not necessarily understand. Some examples include sending cookies to web sites, making secure connections, and asking people whether to accept a certificate. This section looks at user interface metaphors for web browsers.
Find all of the security-critical settings in a web browser. Are they all in the same place? Should they be? What is a security-critical setting?
Review several cookie managers (browser extensions or built into web browsers) and design the user interface for a new cookie manager. What successful features from other cookie managers have you borrowed? How has your design improved on other cookie managers? Who are the intended users of your cookie manager?
Better User Interfaces
Take some aspect of the web browser and design a better user interface for it. This might include a better metaphor for secure connections, more awareness and control of cookies, a better way of managing unknown certificates, or a better way of identifying trusted / untrusted sites. Consider other constraints too, including existing knowledge of users, where people are likely to look (assuming a visual interface), screen real-estate, feasibility of implementation, and feasibility of adoption.
Abstractly, authentication in a computer system refers to determining the principal who could have originated some communication. User authentication is a common example of this, where the principal is some human user. But more generally any component of a computer system can originate some communication, and authentication refers to confirming a claim of which component originated it.
This segment introduces various types of authentication in computer systems. Authenticating messages in a distributed system is often done via digital signatures, and determining the public key to authenticate messages from some named person is typically done with the help of a so-called "public key infrastructure" or PKI. This segment introduces public key infrastructures. It also introduces various methods for user authentication, including via text passwords, biometrics, and graphical passwords.
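For the text-password method mentioned above, one key implementation idea is that systems should store a salted, iterated hash rather than the password itself. The following is a minimal sketch using Python's standard library; the function names are illustrative, and a production system would use a vetted library (and ideally a memory-hard function).

```python
import hashlib
import hmac
import os

# Minimal sketch of salted, iterated password hashing for text-password
# authentication. Names are illustrative; this is not a drop-in design.

ITERATIONS = 100_000  # slows down offline guessing attacks

def enroll(password: str):
    """Return (salt, digest) to store; the plaintext is never kept."""
    salt = os.urandom(16)  # random per-user salt defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)  # constant-time comparison

salt, stored = enroll("correct horse battery staple")
print(verify("correct horse battery staple", salt, stored))  # True
print(verify("Tr0ub4dor&3", salt, stored))                   # False
```

The salt and iteration count are the two levers students should be able to explain: the salt prevents shared precomputation across accounts, and the iterations raise the cost of each offline guess.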
Key concepts include:
Start with a score of zero. For each of the statements below, if the statement applies to you, add or subtract the number of points indicated.
This quiz was developed based on common password advice. Do you disagree with any of this advice? What are the violations that are most common among your classmates? What are some things you could do to create better passwords that are still memorable? Are there other bad password habits that are not addressed by this quiz?
Suppose your instructor uses the Thunderbird mail client, including (only) the certification authorities shipped with that client. Send to your instructor an email that is signed in a way that his/her client can authenticate you as the sender.
Many laptops today come with built-in fingerprint readers. Conduct a user study to determine how secure these readers are. Can you ever fool them?
This segment introduces the role and duties of a typical security administrator, and how the tools available to him/her are often inadequate. The segment also describes how visualization tools can aid in system diagnosis, but also how administrators who come to depend on them can be misled by an intelligent attacker who exploits weaknesses in the visualization interface.
Key ideas include:
Design a visualization interface for describing some security-relevant observation that you believe is immune to attacks by an adversary who knows that interface is being used. What is the key principle you employed to achieve this?
The base-rate fallacy
Consider a computer installation that produces about 1,000,000 audit records per day. For this small installation, let's presume that only about one or two of these records are indicative of some form of attack. Now suppose that this installation installs an intrusion detection system that is 99% accurate, i.e., on records indicating an attack it raises an alarm 99% of the time, and on records devoid of attacks it raises an alarm only 1% of the time. Are the administrators likely to be happy with their new intrusion detection system? Explain.
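A quick back-of-the-envelope calculation illuminates the exercise. Assuming two attack records per day (the problem says "one or two"), the following sketch computes the daily alarm volume and the probability that a given alarm corresponds to a real attack:

```python
# Worked numbers for the base-rate exercise above.
# Assumption: exactly 2 of the 1,000,000 daily records are attacks.

records_per_day = 1_000_000
attacks_per_day = 2
tpr = 0.99   # alarm rate on attack records (true positives)
fpr = 0.01   # alarm rate on benign records (false positives)

true_alarms = attacks_per_day * tpr                       # about 2
false_alarms = (records_per_day - attacks_per_day) * fpr  # about 10,000

p_attack_given_alarm = true_alarms / (true_alarms + false_alarms)

print(f"alarms per day: {true_alarms + false_alarms:,.0f}")
print(f"P(attack | alarm) = {p_attack_given_alarm:.5f}")  # roughly 0.0002
```

Despite the "99% accurate" detector, fewer than 1 in 5,000 alarms reflects a real attack, because benign records outnumber attacks by roughly half a million to one; the administrators will be drowning in false alarms.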