
03-08-10
Architecture Is Policy: The Legal and Social Impact of Technical Design Decisions
Today several members of the Electronic Frontier Foundation (EFF) Board visited CMU and participated in a panel discussion on the effect technical design decisions can have on our society. Below is the abstract for the panel and a summary of the discussions, points, and comments.
Watch the entire panel on YouTube.
Abstract:
Technology design can maximize or decimate our basic rights to free speech, privacy, property ownership, and creative thought. Board members of the Electronic Frontier Foundation (EFF) discuss some good and bad design decisions through the years and the societal impact of those decisions.
The panel opened with comments from each of the panelists.
Cindy Cohn
EFF Legal Director, Moderator
Cindy pointed to the discussions EFF is having with Google about privacy issues concerning Google Books. Public libraries have dealt with privacy issues for a long time; specifically, they are concerned with the government's ability to get information on what content people read. Libraries take measures to completely remove information about which books are being checked out and by whom. New technologies like Google Books need to consider privacy issues now, while the system is being built, not only after problems come up.
Dave Farber
Distinguished Career Professor of Computer Science and Public Policy,
School of Computer Science, Carnegie Mellon University
The design of a system greatly constrains the type of security and privacy that can be built on it. However, security and privacy requirements can also greatly affect the design of a system if they are considered early enough in the design process. An example of this is the Multics system. The designers wanted an environment that protected privacy and security, but no hardware could support it. They worked out the security and privacy issues before writing the code, and the technology eventually came together to produce an environment that could support it. Planning how to put security and privacy into a system before coding it yields a system capable of supporting them, instead of leaving you to work with a system that is inherently insecure.
In the early days of the internet many decisions were made without regard to policy or their impact on privacy and security. What would have happened if the creators of the internet had considered these policy issues?
Ed Felten
Professor of Computer Science and Public Affairs and Director, Center for Information Technology Policy, Princeton University
Ed talked about SSL certificates as an example of a technology where the developers were thinking about security from the beginning but still messed up. How do you know the webpage you are looking at really comes from who you think it does? The answer is SSL certificates. When your web browser contacts a secure (https) website, the website gives your browser a certificate. How does your web browser know that the certificate is valid? It verifies the certificate using a Certificate Authority's (CA) public key. How do you know the CA's key is valid? You would need a single Certificate Authority that everyone trusts to verify the entire transaction. Unfortunately there is no such Certificate Authority. Instead your web browser ships with a default list of 7200 trusted CAs that it believes to be legitimate. Any of these CAs can sign a certificate and your browser will simply trust it. Moreover, any one of these 7200 can delegate its "god-like abilities" to any other authority it chooses. For example, whoever runs the Chinese government CA can give anyone the ability to mount a man-in-the-middle attack on any secure communication. There is no doubt that people are doing creative things with this today. But fixing it is not easy.
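To make that trust model concrete, here is a minimal sketch (not Ed's own example) of how a TLS client decides to trust a server, using Python's standard ssl module; the hostname is illustrative. The key point is that the handshake succeeds if the certificate chains up to any root in the client's long built-in list, and no single global authority is ever consulted.

    import socket
    import ssl

    def fetch_verified_cert(hostname: str, port: int = 443) -> dict:
        # Load the platform's default bundle of trusted root CAs --
        # the analogue of a browser's long built-in CA list.
        context = ssl.create_default_context()

        with socket.create_connection((hostname, port)) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                # The handshake succeeds if the presented certificate chains
                # up to ANY one of those trusted roots; there is no single
                # authority that verifies the whole transaction.
                return tls.getpeercert()

    cert = fetch_verified_cert("www.example.com")  # illustrative hostname
    print("Subject:", cert["subject"])
    print("Issuer: ", cert["issuer"])  # whichever trusted CA happened to sign it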
Lorrie Cranor
Associate Professor of Computer Science and Engineering and Public Policy, and Director of the CyLab Usable Privacy and Security Laboratory (CUPS), Carnegie Mellon University
Lorrie started out with a discussion of P3P, which she helped develop. About 10 years ago the goal was to define a standard way for companies to communicate their privacy policies using an XML-based format. One concept was that companies would post their privacy policies in P3P and browsers would interpret them and automatically negotiate terms on the user's behalf. People also thought P3P would let browsers display privacy policies to end users in a standard way. The only major browser to implement P3P was Internet Explorer (IE), and it used P3P only as a cookie-blocking mechanism. Basically, if your site set third-party cookies and had no P3P policy, or a bad one, IE would block those cookies. So all the companies that delivered third-party advertisements quickly adopted "good" privacy policies as defined by Microsoft. As a result Microsoft set the standard for "good" privacy policies, for third-party advertisers, across the internet.
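As an illustration of the mechanism Lorrie described, here is a small sketch of a server sending a P3P "compact policy" alongside a cookie, using Python's standard http.server module. IE compared these compact-policy tokens against the user's privacy setting when deciding whether to keep a third-party cookie; the specific tokens below are illustrative only, not a recommended policy.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class P3PHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            # Compact policy: a token summary of the site's full P3P (XML) policy.
            # IE read these tokens when deciding whether to accept the cookie
            # below in a third-party context. Tokens here are for illustration.
            self.send_header("P3P", 'CP="NOI DSP COR NID CUR OUR NOR"')
            self.send_header("Set-Cookie", "ad_id=12345; Path=/")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"ok")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), P3PHandler).serve_forever()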
The CUPS Lab at CMU is trying to better understand the effects privacy policies can have on consumers' behavior. To this end, we have built a site called PrivacyFinder, which lets people see Google or Yahoo! search results annotated with privacy policy information. Our studies have shown that people will purchase from websites that charge a little more if those sites have good privacy policies. Depending on how you set up the architecture of a system, you can affect how people react to it.
John Buckman
EFF Board Chair, Serial Entrepreneur
John chose to talk about three different technologies he has worked on that had interesting security and privacy issues.
Lyris was an email discussion server. Developers wanted to support the use of Lyris by a group of Chinese dissidents. To support this population the developers created an anonymous message feature, which was made available to all Lyris users. The Chinese dissidents didn't use the feature, but schools did. Teachers used it so that students could talk to each other without fear of retribution, and it allowed students in creative writing classes to freely discuss controversial writing.
Bookmooch is a book-swapping website. Privacy was an issue when the site was first designed. All the books you want to read and all you have given away are visible to everyone, but accounts are anonymous. The site uses the difference between privacy and anonymity to give people a strong trust network while still separating their online actions from their identity. The biggest user demographic is mothers, who appreciate both the detailed online records and the privacy, through anonymity, that the site gives them.
Magnatune is an internet record label. What should the rights be of the people who come and listen to the music? There is normally a contract when you open a CD or listen to music online, but if you violate the contract there are really no repercussions. Magnatune takes the position that if you have chosen to spend money on songs, you are probably an honest person. So there is a Creative Commons license on the music. Subscribers are asked not to share the music they purchase on BitTorrent. If subscribers want to share Magnatune music with friends, that is a good thing; Magnatune would rather you shared the music locally and got people interested in the site. This policy seems to be working: none of the paid albums show up on BitTorrent.
Open for Questions
Question: How do we fix the issue with Certificate Authorities? (Directed at Ed)
Ed pointed out that people need to think about institutions and trust. You need to give someone the divine authority to approve Certificate Authorities. You also need more things done in the open instead of behind closed doors: if a Certificate Authority chooses to delegate all of its rights to another authority, that delegation should be forced into the open. You need to think about how the technology works together with organizations, and about how the public can meaningfully understand the technology.
Lorrie added that even if you can address the underlying trust-model issues, most user interfaces that surface certificate problems are very poor. Even if we made them very trustworthy, users just swat them away. What the user mostly cares about is whether anything has changed. For example, if I am going to eBay and suddenly eBay is being verified by someone else, that is a problem.
Ed added that the problem is that designers' default solution is to ask the user or add another user preference. Neither is a real solution.
Question: Music used to work on an economy of scarcity. Now you can easily distribute music for nearly no cost. This is true for many electronic resources.
John pointed out that by suing the pirate economy, the RIAA has made the user interfaces for file-sharing technologies quite bad. Meanwhile Last.fm and Pandora have nice user experiences. An obvious way to combat music piracy in the era of electronic distribution is to come up with things that encourage legitimate usage. An older example is mobile phone bills: for a while, mobile phone bills would arrive scarily high, causing people to stop using mobile phones. The mobile phone industry fixed that and is now doing better. Music labels were responding by suing offenders; now websites like Hulu are doing well.
Cindy added that an alternative way to think about the problem is as a licensing problem. The traditional licensing mechanism used by music distributors is to control all the copies of your product. Another solution is voluntary collective licensing. Many models already exist, and we should consider those existing models before we create new ones. Controlling all copies is not the only way.
Question: A current issue is that there are platforms, such as the iPhone, where anyone can write apps. What kind of privacy and security risks could this bring about?
John pointed out that the landscape for privacy and security on these platforms varies widely. Apple tracks everything, while other platforms have different privacy and security controls.
Dave added that the answer depends on what these apps run on and what the platform enables or doesn't enable. The problem is an issue of how constrained an app is. Apple has this issue with apps that do things like read the number of the phone they are running on. If the underpinnings were built better you could assume that apps were well behaved, but unfortunately this is not the case.
Ed added that mobile devices raise additional issues: they are (1) always on, (2) easily lost or stolen, and (3) aware of where they are. All of these add extra privacy and security concerns.
Question: Multics was designed with security from the beginning and is no longer used. The Internet was not designed with security in mind and everyone uses it. Can we take a lesson from this?
Dave pointed out that Multics was successful when and where it was used. It was a success, but the architecture changed. The Internet is insecure and a success, but we are sitting on the edge of a cliff. What would it take to crash DNS? It is worrisome that we build more and more of our infrastructure on this thing and no one is sure how to secure it.
Question: What are other strategies for having social interaction where anonymity may not be an option?
John commented that he actually likes Facebook's model. To see anything you need an account, and to see information on anyone you need that person to friend you.
Cindy pointed out that Facebook started out with a trust-based model. It was closed to a single university, and in order to see information you had to be friended by someone. Many of EFF's concerns with Facebook are that it is shifting away from a trust-based model to a more Twitter-like open model.
Question: Are you doing anything with education? Are you working on any privacy and security curriculum?
Cindy started out by commenting that several of the board members are university professors, but creating educational curricula isn't something EFF is working on much; EFF isn't well positioned to do that. Instead EFF is more interested in getting students involved, because today's students are going to be the ones building these sorts of tools.
Question: I haven’t heard about de-anonymization yet today. Anonymity is good but what if the data becomes no longer anonymous? This would be scary in places like Bookmooch that are based on anonymity.
Cindy pointed out that EFF has been critical of Google over this. EFF has a site called Panopticlick which shows visitors their "browser fingerprint," or how unique their browser configuration is. In a closed community there are different concerns. In some communities there isn't an issue because people just don't want names immediately attached to their online behavior. De-anonymization breaks a lot of things.
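As a rough sketch of the idea behind a browser fingerprint (a simplification, not how Panopticlick actually works): individually innocuous attributes reported by a browser can, in combination, be unique enough to identify a visitor without any cookie. The attribute values below are made up.

    import hashlib

    def browser_fingerprint(attributes: dict) -> str:
        # Combine the attributes in a fixed order and hash them; the result
        # is stable across visits even though no single value is identifying.
        canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

    visitor = {
        "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
        "screen": "1920x1080x24",
        "timezone": "UTC-5",
        "fonts": "Arial,Courier New,Times New Roman",
        "plugins": "Flash 10.0,Java 1.6",
    }

    print(browser_fingerprint(visitor))  # no cookies, yet likely unique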
Lorrie commented on how important it is to be realistic when claiming things are anonymous. For example, Bookmooch users may have trouble identifying each other, but someone with the whole database might be able to do more.
John pointed out that Bookmooch is very transparent; this is obvious from the first setup. John struggles with this on Magnatune. At one point someone crawled the site and came up with a figure for how much each artist was making from music sales. This made musicians very unhappy.
Dave spent a year in Washington DC working with policy makers. According to him, policy makers have minimal technological knowledge. But despite how little they know about technology, these policy makers are going to make the laws that decide the future of technology policy, and the policy they establish affects us continuously. Technologically skilled people would do themselves and the nation a service by spending a couple of years in Washington DC educating them.
Question: The earlier question about education was about certification. The real issue is a lack of educational materials, and EFF does have some leverage there. The real problem is educating everyone else, not geeks like us. I would pick post-secondary education as the primary target.
Cindy pointed out that EFF did develop a curriculum for copyright law. California created a requirement that K-12 students get an education in copyright. Music and other organizations with an interest in protecting their copyrighted works immediately created curricula that were somewhat biased and ignored important topics such as fair use. EFF created a counter-curriculum. EFF isn't quite sure how much the curriculum is used, but there is a network of teachers across the nation that seems to use it.
One of the most frequent complaints from kids is that they call and say "my mom read my Facebook page," and we tell them, "that is because you friended her."