Resources: UPA 2005 Idea Markets
Can we make the world a safer place? Usability for security and safety
Activator: Steve Fadden, Landmark College
Participants discussed issues related to the safety and security of the products and systems that they design, and the role that usability practitioners play in ensuring safety and security. The topics of interest included the role that practitioners play, the scope of their responsibilities, challenges that they face, skills and techniques that they can offer, and methods of communication available for them to get their message across. Participants agreed that regardless of the specific roles played and methods used, usability practitioners can and should have an influence over safety and security, and we should continue to explore methods and activities for improving both.
The Design of Security and Safety
The idea that was explored in this Idea Market activity was the relationship between usability and security/safety. Numerous recent events highlight the importance of safety and security, from problems related to securing computer systems and databases (e.g. identity theft, compromised credit and medical information, distributed viruses) to issues associated with physical systems and protected spaces (e.g. break-ins, thefts, control of remote systems, and direct acts of sabotage and terrorism).
As our global society continues to turn to technologies and systems to improve safety and security, usability practitioners may be positioned to play an important role in ensuring that systems and tools are truly making us more secure and safe. Systems and technologies for safety and security are all designed. Some systems are designed at the product level, such as a video camera used for remote monitoring or a software application that has password protection. Other systems are designed at the opposite end of the spectrum, such as an entire system of tools and training procedures to ensure that people use products and follow instructions in order to operate in a coordinated and effective manner.
Given our growing reliance on technology and complex systems, the following questions were posed for discussion:
- Roles and Responsibilities
- Do usability practitioners have a role in ensuring the safety and security of systems? If so, what is the scope of that role?
- What are our responsibilities?
- What specific challenges to safety and security should we try to address?
- Which challenges represent our top priorities?
- How can we best communicate information about safety and security issues to our organizations? To our professions? To the public at large?
- Skills and Techniques
- What skills and techniques can usability practitioners use to improve the safety and security of people, places, and systems?
- How can we apply the principles of user-centered design to enhance security and safety? Which principles are most applicable?
- How can usability be modified into principles of unusability to make it more difficult for threatening individuals or organizations to gain access to, or make use of, systems and locations?
Participants discussed each of these questions in turn, routinely returning to the primary question to clarify, elaborate, or constrain the scope of what was meant by the terms safety and security. Some participants constrained the scope to the intended purpose of the product or system being developed, without going beyond it. Other participants articulated a more expansive view, stating that designers of products need to understand the whole range of uses and abuses that may occur. Highlights from the conversations and debates that took place during the Idea Market are summarized below.
Roles and Responsibilities: We can make a difference
All participants indicated that usability practitioners should have some responsibility for identifying, addressing, and raising safety and security concerns with a product or system that is being designed. However, the scope of this responsibility was a point of considerable debate. Many participants expressed a desire to constrain the scope of the problems that should be addressed to include only those safety or security problems that would arise from the user interface, and not necessarily from the functionality of the product, or the willful abuse or misuse of the product.
Participants also believed that their responsibilities with regard to security and safety should include functions that are already performed as part of their role, such as assessing user needs, developing interfaces and prototypes, and evaluating performance. However, usability practitioners may have to change the nature of the questions they ask, and alter the ways in which products are investigated, in order to have a greater influence on safety and security. For example, designing and testing for safety and security might require an emphasis on different user profiles than are typically investigated, such as potential criminals and abusers of systems who are not part of the typical user profile or customer base.
Communication Challenges: We should find our voice and make ourselves heard, within reason
Although challenges and communication were presented as separate topics, participants generally saw communication as a key challenge that must be addressed in order for usability groups to play an active role in addressing security and safety.
A major challenge that practitioners need to face is their ability to be heard and accepted within the organization. Usability people often have a hard enough time getting their organizations to listen to them about usability problems, let alone other issues. So, some participants expressed concern that they would never be in a position where they could influence issues of security and safety. There were also concerns about the structure of the organization and the role that usability people typically play within that structure. In some cases, usability people may not be represented at a high enough level in the organization to have their concerns heard. In the worst cases, usability people who try to deal with security and safety may be seen as over-stepping their bounds and treading on the turf generally reserved for dedicated security, safety, or compliance teams. Because of these concerns, participants emphasized the importance of working within their organizations to identify people and processes that they can leverage to help them communicate.
If usability practitioners are to be responsible for safety and security issues, they need to do so in a manner that will most effectively utilize their limited resources. Most participants stated that safety and security issues must be addressed through established communication channels and delivery mechanisms, such as highlighting problems in usability reports, and surfacing concerns with project teams and management. However, a small number of people argued that issues of security and safety were too important to be restricted to standard usability channels. These people believed that if someone identifies a security or safety concern, and the team or organization does not do anything about it, then it is important for the message to be disseminated through other channels, such as the media or public interest forums.
Design Challenges: We understand how people interact with systems
A central tenet for usability and user-centered design is the importance of understanding people’s goals and ensuring that they can accomplish their goals in a manner that is efficient, effective, and satisfying. Because much of what we do involves studying people and how they behave, many participants expressed that we already know many of the design challenges we must address for safety and security. The top two issues that were raised were designing for trust and designing for minimal effort.
To design for trust, the usability practitioner first needs to understand which factors users require in order to have faith and confidence in a system. Then, the practitioner needs to ensure that the system meets these requirements in the targeted usage environment. These factors can include following established conventions for security (e.g. an icon to indicate that a connection is secure, a login or authentication process to verify identity) and conveying a sense of professionalism in an interface (e.g. incorporating formal interface elements that users associate with professional products). Participants felt that a sense of trust was necessary for a user to engage in behaviors that have some element of risk – such as transmitting sensitive personal information or managing a household through a remote interface. However, providing a sense of trust presents a potential tradeoff because users may be led to over-trust a system or engage in risky behaviors based on the perceived level of security and safety implied by the system’s design.
Another important consideration identified by participants is designing systems for minimal effort. Reducing the amount of effort required is important because users and operators may defeat safety and security measures when adhering to such measures requires too much energy, too much time, or exceeds the normal limits of what people can reasonably do. For example, systems that require multiple complex passwords are often defeated when users write their passwords down and store them near the work area (such as on a note taped to the monitor or stored in a desk). Interfaces or tools that require users to inspect or evaluate settings, or to engage in several safety-related subtasks, can be defeated by users who don’t feel they have the time necessary to engage in such activities.
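The password example above illustrates a tradeoff that can be made roughly quantitative: a long but memorable passphrase can match the strength of a short, complex password that users are tempted to write down. This is an illustrative sketch, not material from the session; the character-pool size and Diceware-style word-list size are assumptions.

```python
import math

def entropy_bits(length: int, pool_size: int) -> float:
    """Approximate entropy of a randomly chosen secret:
    length * log2(pool_size)."""
    return length * math.log2(pool_size)

# An 8-character password drawn from ~94 printable ASCII characters:
complex_pw = entropy_bits(8, 94)

# A 4-word passphrase drawn from a 7,776-word Diceware-style list:
passphrase = entropy_bits(4, 7776)

print(round(complex_pw, 1))   # ~52.4 bits
print(round(passphrase, 1))   # ~51.7 bits
```

The two secrets are comparably hard to guess, but the passphrase imposes far less memory burden, so users are less likely to defeat the measure by writing it on a note taped to the monitor.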
Due to the psychological, behavioral, social, and cultural aspects involved in understanding what constitutes trust and effort for a user or operator, participants agreed that usability practitioners are well-suited for evaluating these aspects in the context of safety and security.
Skills and Techniques: We have a lot to offer
Although usability professionals may not have the latest knowledge and understanding of security issues and safety requirements, we often have the best understanding of how our tools and systems will be used by end-users and system operators. When new systems are developed, or significant modifications are introduced to existing tools, we are often the best-equipped to study how changes to systems and processes can introduce vulnerabilities or jeopardize safety before the system is fully developed.
While many techniques and methods were discussed in the context of identifying and evaluating security and safety, two major themes emerged. The first theme was that many of the techniques that we use in our day-to-day activities can be used to help us understand how systems may be used or abused in a manner that can result in reduced security or decreased safety. The second theme was that usability practitioners can apply their knowledge of current design principles and usability guidelines to develop new principles – principles of unusability – that can facilitate the exploration of ways in which systems can be designed to thwart abuse or access by unauthorized users.
Usability professionals use a number of user-centered design and evaluation techniques that can facilitate the identification of security and safety issues and the development of solutions for them. Participants discussed a number of the techniques they currently employ for user-centered design and usability, and considered how these techniques could be adapted to improve security and safety. These techniques include the following:
- Developing user profiles of likely abusers or unauthorized personnel who may try to gain access to a system, or use a tool for malicious purposes;
- Conducting user needs assessments to evaluate usability and usage requirements in the context of overall security and safety goals;
- Analyzing tasks to identify specific steps in the process that can be compromised, and environments in which outsiders and potential saboteurs can interact with the system;
- Performing naturalistic observations to understand how users may defeat security and safety mechanisms, and to identify patterns of behavior that may be exploited by unauthorized or malicious individuals;
- Incorporating safety and security features into interface and tool prototypes for early evaluation;
- Evaluating the interactions between personnel, systems, and information flows in order to identify likely bottlenecks in performance and opportunities for error;
- Reviewing interfaces and design elements to evaluate the amount of trust a user may have in a system;
- Auditing tool and application interface screens to ensure consistent and appropriate use of security-related terms and safety conventions; and
- Testing tools and systems to determine the amount of effort required for users to comply with safety and security procedures.
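The last technique in this list, measuring the effort required to comply with a procedure, can be estimated before testing with real users. A hedged sketch using Keystroke-Level Model operator times (the commonly cited averages from Card, Moran, and Newell); the two login procedures compared are hypothetical examples, not procedures discussed at the session:

```python
# Keystroke-Level Model operator times, in seconds
# (commonly cited average values).
KLM = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # mouse button press
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def estimate_seconds(ops: str) -> float:
    """Sum KLM operator times for a sequence such as 'MKKKKKKKKK'."""
    return sum(KLM[op] for op in ops)

# Hypothetical baseline: mentally prepare, type an 8-character
# password, press Enter (9 keystrokes total).
password_only = estimate_seconds("M" + "K" * 9)

# Hypothetical stricter policy: the same login plus moving to the
# mouse, clicking a second-factor prompt, and confirming mentally.
with_second_factor = password_only + estimate_seconds("HPBM")
```

Comparing such estimates across candidate procedures gives an early, cheap signal of when a security measure demands enough extra effort that users may start looking for workarounds.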
Some participants expressed that the skills and techniques employed by usability practitioners may enable them to study and evaluate security and safety issues (at least those issues related to a product or system’s use or abuse) more effectively than personnel who are assigned to work on issues of safety and security. This is because team members responsible for safety and security often focus on standards, legal requirements, and technology features more than they focus on how a product is actually used. By studying how a product or system is used in the context of its intended environment, usability personnel are able to see risks and security problems first-hand.
The second theme of this discussion – principles of unusability – received substantial attention and input from nearly all participants. Several participants suggested that we develop principles based on known design heuristics and usability best practices, and some of these ideas are presented below. It is important to note that the ideas presented here represent starting ideas for brainstorming and exploration, not necessarily “principles” that should be employed by a designer or usability practitioner.
- (In)Consistency: Can areas of the interface employ inconsistent designs or procedures to make it less likely that people will engage in abusive behaviors?
- (No) Context: Can context be removed to decrease the chance that an intruder will understand how to defeat security?
- (In)Flexibility: Can the system force users to work in a specific way that prevents them from taking unsafe or insecure shortcuts?
- (No) Feedback: Can sensitive or secure areas of a system be developed to prevent malicious individuals from receiving feedback about their actions?
- (In)Efficiency: Can time- and effort-saving conventions such as auto-filling forms or system shortcuts be removed to increase the effort required for a malicious individual to use a system?
- (No) Error-forgiveness: Can sensitive elements of the interface be made to be error-intolerant to force certain behaviors or catch intruders early?
- (Un)Predictability: Can the system be designed in a manner that does not predict the actions and intentions of the user? Can the system predict patterns and behaviors that likely represent abuse or unsafe use?
- (No) Assistance: Can the system be designed to prevent unauthorized users from receiving help?
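The "(No) Feedback" idea above has a well-known counterpart in login design: respond identically whether the username or the password is wrong, so a probing attacker learns nothing about which accounts exist. A minimal sketch; the credential store, password hashing scheme, and messages are invented for illustration (a real system would use a salted, slow hash such as bcrypt):

```python
import hashlib
import hmac

# Toy credential store: username -> SHA-256 hex digest of password.
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}

def login(username: str, password: str) -> str:
    """Return the same generic message whether the username is
    unknown or the password is wrong."""
    # Use a dummy digest for unknown users so the comparison
    # still runs and the timing does not reveal account existence.
    stored = USERS.get(username, "0" * 64)
    supplied = hashlib.sha256(password.encode()).hexdigest()
    # hmac.compare_digest avoids early-exit timing differences.
    if username in USERS and hmac.compare_digest(stored, supplied):
        return "Welcome"
    return "Invalid username or password"
```

Here the interface deliberately withholds the diagnostic feedback ("no such user" versus "wrong password") that good usability practice would normally provide to a legitimate, struggling user, which is exactly the tension these unusability questions are meant to surface.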
While participants expressed enthusiasm over the potential solutions that could arise from exploring questions such as those presented above, several people emphasized that these questions should not overshadow the importance of taking a step back to consider design and implementation decisions in the context of how a product or system will be used. One participant stated that we need to “go back to basics” to identify when usually-sound decisions may not make sense in a safety or security context. For example, features which promote efficiency and decrease effort may not be appropriate for certain environments. If a computer application is to be designed for use in a public place (such as an internet browser that is available at a café or a public information kiosk), then features such as auto-filled passwords and browser histories may not be sensible. Another participant stated that it is important to pay attention to how a product is used, regardless of how much time and thought went into its design. If personal devices are designed to be used in public spaces, then usability practitioners should be mindful of the risks that may be present as the user operates the product. For example, pedestrians who use personal digital assistants or mobile phones may be more susceptible to safety risks and security compromises such as failing to hear the sound of oncoming traffic, or being so engrossed in the interface that their wallet is stolen.
Although some participants disagreed about the scope of responsibility that we should be accountable for, all agreed that usability professionals have much to contribute to the evaluation of security and safety issues. Our skills and techniques, combined with our roles as specialists who must integrate numerous demands from diverse product representatives (e.g. Marketing, Sales, Development, Testing), place us in a unique position to understand both the users of our products, and the ways in which our products may be used and abused.