The Symposium on Accessible Privacy and Security (SOAPS '08)

Call for Participation

July 23, 2008
Pittsburgh, PA

Part of the 2008 Symposium on Usable Privacy and Security (SOUPS), July 23-25, 2008

Scope and Focus

Verification images, toolbars, CAPTCHAs, and other web-based mechanisms are increasingly being used to protect both users and service providers from fraudulent access to information services. These tools may play a valuable role in web privacy and security, but they pose problems for users with disabilities that may prevent them from interpreting images or other visually-presented information. Alternative approaches including audio may provide partial workarounds, but they suffer from problems of their own, both in terms of security and usability.

As these tools become more widely used, unresolved accessibility concerns may prevent individuals with various disabilities from safely accessing financial, commercial, and educational resources on the web. Techniques that are accessible, secure, and private will be needed to avoid disenfranchisement of this large group of users.

This workshop on accessible privacy and security will bring researchers and commercial practitioners interested in usable privacy and security together with accessibility researchers and experts in the accessibility needs of particular populations. Discussions and presentations will cover a range of topics at the intersection of these areas, including, but not limited to:

  • Human-interactive proofs (HIPs), such as CAPTCHAs, for website authentication
  • Mechanisms for verifying site identity
  • Anti-phishing tools
  • Anti-virus tools
  • Privacy policies
  • Concerns regarding situated use, such as the privacy implications of audio feedback in public places
  • Universally usable designs that meet the needs of all users
  • Strategies for achieving both strong security and accessibility
  • Needs of specific sub-populations such as deaf-blind users
  • The impact of evolving web technologies including AJAX and Rich Internet Applications on accessible privacy and security

SOAPS '08 invites participation from researchers, practitioners, and accessibility advocates from all perspectives on accessibility and security. Participants will have the opportunity to discuss system requirements from various perspectives, present and discuss research prototypes, and develop an agenda for both future research and technology transfer.


Prospective participants should submit a short position paper along with a cover letter describing their research interests, experience, and background in relevant areas. Position papers should be up to 4 pages in length, using the SOUPS proceedings templates for LaTeX or MS Word. All submissions must be in accessible PDF format and should not be anonymized.

Submit your paper using the electronic submissions page for the SOUPS 2008 conference. After a successful submission, a confirmation web page will be displayed and a confirmation email will be sent to the corresponding author. Please make sure you receive that confirmation email when you submit, and follow the directions in that email if any follow-up is required.

Important Dates

  • Position papers deadline: April 24
  • Notification of acceptance: May 11
  • Camera-ready final versions of the papers due: June 6

Workshop Organizers

Harry Hochheiser, Towson University

Jinjuan Feng, Towson University

Jonathan Lazar, Towson University

Anne Taylor, National Federation of the Blind

Mark Riccobono, National Federation of the Blind

Preliminary Program

9:00-9:10 AM Introduction and Logistics - Harry Hochheiser, Jonathan Lazar, and Jinjuan Feng, Workshop Organizers

9:10-10:10 AM Opening Session: The Challenges

10:10-10:30 AM Break

10:30 AM-12:00 PM Human-Interactive Proofs (HIPs) and Passwords

12:00-1:00 PM Lunch

1:00-2:30 PM More HIPs, Cognitive Issues, and Standards

2:30-2:45 PM Break

2:45-3:15 PM Next Steps

  • Discussion: next steps and other challenges - what's next, and where do we go from here?
  • Wrap-up and concluding remarks

SOUPS is sponsored by Carnegie Mellon CyLab.