
Resources: UPA 2005 Idea Markets

[Usability] Consumer Report: Can we make usability performance data more compelling and understandable?

Activators: Emil Georgiev, Kevin Lee

The Activators' Initial Questions

  • How do you quantify usability attributes? What usability attributes are you able to quantify?
  • How do you report out the quantified data?
  • What are some ways in which you can consistently quantify usability attributes?
  • What are your organization’s practices for reporting usability data (both qualitative and quantitative)?
  • What efforts have you seen in your organization’s approach to reporting results?
  • How do you justify usability initiatives in your organization? (or do you?)
  • How do you track or quantify specific usability attributes (learnability, efficiency, low errors, memorability, satisfaction)?
  • What other usability or product attributes should be tracked and measured?

Summary of Results

Justifying usability within organizations, as well as in the marketplace, has been challenging. One reason is that quantifying usability attributes and presenting the data through metrics or ratings has been considered a novel but impractical idea. The intent of this discussion was to explore ways to make usability performance data more compelling and understandable for non-UCD business partners as well as consumers, and to consider whether a consumer-reports-style rating system may be a good solution. The following are some of the key takeaways from the ideas developed during the session.

  • Data presentation format is critical and has to be carefully tailored to the audience
  • Adopting a “consumer report” data format for presenting usability data can help make the data more compelling when presented both internally and externally (setting expectations)
  • Efficiency and user satisfaction are easy to measure. Learnability is also very important, but it has to be defined as a second-level metric, measured not directly but as a function of efficiency over time for a novice user (see the sketch after this list)
  • Simplify usability data presentation. Voice it in terms of the “pain” users go through
  • Focus on data you can reliably get
  • Agree on appropriate metrics in advance
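
One way to make the learnability takeaway concrete is to treat it as a derived metric: track efficiency (e.g., time on task) for the same novice user over repeated trials and report the relative improvement. The sketch below is illustrative only; the function name, the trial times, and the single-user simplification are assumptions, not data or methods from the session.

    # Learnability as a second-level metric: relative gain in efficiency
    # (task time) across repeated trials by a novice user.
    # All numbers are made up for illustration.

    def learnability_index(task_times):
        """Relative improvement in task time from the first to the last trial.
        Returns a value in [0, 1]; higher means the product is learned faster."""
        first, last = task_times[0], task_times[-1]
        if first <= 0:
            raise ValueError("task times must be positive")
        return max(0.0, (first - last) / first)

    # Five sessions of the same task by one novice user (seconds per trial).
    trials = [310, 240, 195, 170, 165]
    print(f"Learnability index: {learnability_index(trials):.2f}")  # prints 0.47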

Collected Comments

The comments of the participants collected during the Idea Market session are generally presented without editing. Some additional wording has been added to compensate for the abbreviated form in which some of these comments were captured.

  • We generally use three levels of reporting the data:
    • A one-page bulletin listing a high-level summary and key customer concerns
    • A quick-results list containing the top ten findings
    • Prototypes and design audits
  • Data presentation is critical
  • Try to simplify data presentation (charts and graphs)
  • Typical usability data include:
    • Time on task
    • Task completion
  • Learnability is an important type of data but hard to measure
  • Learnability can be derived from conducting multiple trials over time starting with a novice user
  • A differential treatment should be adopted when reporting learnability for trained vs. untrained users
  • Learnability data reporting will need multiple reference points
  • Call center metrics can be used to describe learnability
  • Reports are usually based on study type
  • Only high confidence data should be reported
  • In your reporting use the terminology of users and consumers
    • Voice it in terms of the “pain” they are feeling
    • Keep it soundbite-sized
  • Develop a standard format (like a consumer report) so people know what to expect (see the rating sketch after this list)
  • Ethical considerations will become important if usability data is developed and published “consumer reports” style – how to avoid the appearance of impropriety, as if endorsing a product
  • Focus on the usability data you can reliably get
  • A big challenge is how to link product performance improvements to usability changes as a result of UCD
  • For ROI purposes – work with management and quality to agree on UCD metrics for a given project
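
As an illustration of the “consumer report” style format mentioned above, raw usability metrics can be mapped onto a simple five-point rating so that non-UCD partners and consumers can compare products at a glance. The metric names, thresholds, and symbols below are hypothetical assumptions for the sketch, not a standard proposed at the session.

    # Map raw usability metrics onto a consumer-report-style 1-5 rating.
    # Metric names and thresholds are illustrative assumptions.

    RATING_BANDS = {
        # metric: (lower bound, rating) pairs, checked from best to worst
        "task_completion_rate": [(0.95, 5), (0.85, 4), (0.70, 3), (0.50, 2), (0.0, 1)],
        "satisfaction_score":   [(4.5, 5), (4.0, 4), (3.0, 3), (2.0, 2), (0.0, 1)],
    }

    def rate(metric, value):
        """Return the first rating whose lower bound the value meets."""
        for lower_bound, rating in RATING_BANDS[metric]:
            if value >= lower_bound:
                return rating
        return 1

    # One product's scores from a hypothetical study.
    product = {"task_completion_rate": 0.88, "satisfaction_score": 3.6}
    for metric, value in product.items():
        stars = "*" * rate(metric, value) + "-" * (5 - rate(metric, value))
        print(f"{metric:22s} {value:>5}  {stars}")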
