The UPA Voice

April 2008


Competition on World Usability Day 2007: Are you the world's best expert reviewer?

By Shazeeye Kirmani, Shanmugam Rajasekaran, Deepa Bachu, Muthukumar and Amit Pande

Shazeeye Kirmani, Senior Usability Engineer at Infosys Technologies Ltd., passionately innovates and implements compelling user experiences through modeling, research and testing.

Shanmugam Rajasekaran, Head of User Experience at Infosys Technologies Ltd., focuses on large-scale deployment of user experience services and metrics-driven usability processes across industries.

Deepa Bachu, Group Manager at INTUIT Technologies, uses Customer Driven Innovation (CDI) to create new offerings and improve upon existing ones, starting with identifying customer needs and then building solutions that meet those customer needs.

Muthukumar works on the Experience Design team at Sun Microsystems, Bangalore, and has 11 years of professional experience designing, leading and managing user experience for web and desktop products.

Amit Pande, Senior User Experience Manager at Oracle and President of UPA Bangalore, holds an MS in Engineering from the University of Minnesota, Twin Cities and a Bachelor's degree from the National Institute of Technology, India.

A worldwide expert review competition, sponsored by Intuit and Infosys Technologies Ltd., was hosted by the Usability Professionals' Association, Bangalore on World Usability Day 2007. The purpose of the competition was to expose everyone to a simple yet powerful usability technique: heuristic evaluation. It also served to gather data toward defining heuristic expertise standards at a global level. This matters because heuristic evaluation is a popular and valuable technique, used by 76% of the usability community (UPA, 2005) and showing a cost-to-benefit ratio of 1:48 (Nielsen, 1994). Defining such standards helps ensure that evaluations meet a consistent level of quality.

The one-hour competition, open over a two-week period, welcomed not just expert reviewers but anyone who could identify issues with a website. Contestants evaluated a healthcare application: they were asked to find issues across five scenarios and to submit entries in a prescribed format. The scenarios included finding symptoms and conditions associated with a cough, editing symptoms, finding articles related to a cough, and emailing articles.

Twenty contestants participated in the competition. They were recruited through various channels: individual emails, posts on blogs and social networking sites, mailing lists such as human-computer interaction groups, and listing the competition as a World Usability Day event. Their demographics are shown in Table 1.

Table 1: Demographic data of the contestants

Parameter                                        | Average     | Range
Age                                              | 28.4 years  | 22-34 years
Time spent as a heuristic evaluator              | 23.7 months | 0-120 months
Time spent as a usability practitioner           | 30.7 months | 0-144 months
Time spent as a domain expert (i.e., healthcare) | 4.2 months  | 0-24 months
Confidence of winning (self-rating on a scale of 5, where 5 = absolutely win and 1 = never win) | 4.1 | 1-5
Gender                                           | 6 females and 14 males
Location                                         | 6 states and 2 continents

The competition was judged using a methodology documented in the Journal of Usability Studies (Kirmani and Rajasekaran, 2007). Briefly, the methodology assigns weights of 5, 3 and 1 to showstoppers, major issues and irritants respectively. Showstoppers are catastrophic issues that prevent users from accomplishing their goals. Major issues cause users to waste time or significantly increase learning effort. Irritants are cosmetic issues that violate minor usability guidelines. A Heuristic Evaluation Quality Score (HEQS) is computed for each evaluator by multiplying each severity weight by the number of issues found in that category and summing the results. For example, if Evaluator A identified 2 showstoppers, 10 major issues and 20 irritants, his HEQS = 2*5 + 10*3 + 20*1 = 60.
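The scoring described above can be sketched in a few lines of Python (the function name and dictionary are illustrative, not part of the published methodology):

```python
# Severity weights from the HEQS methodology (Kirmani and Rajasekaran, 2007)
WEIGHTS = {"showstopper": 5, "major": 3, "irritant": 1}

def heqs(showstoppers, majors, irritants):
    """Heuristic Evaluation Quality Score: severity-weighted sum of issue counts."""
    return (WEIGHTS["showstopper"] * showstoppers
            + WEIGHTS["major"] * majors
            + WEIGHTS["irritant"] * irritants)

# Evaluator A from the text: 2 showstoppers, 10 major issues, 20 irritants
print(heqs(2, 10, 20))  # 60
```

Applying the same formula to the winner's counts (6 showstoppers, 22 major issues, 5 irritants) reproduces the reported score of 101.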

Overall, the group performed very well at identifying major issues but less well at identifying showstoppers and irritants (Figure 1). The winner identified 6 showstoppers, 22 major issues and 5 irritants, for an HEQS of 101. On average, contestants found 2 showstoppers, 12 major issues and 2 irritants.

Figure 1. Number of issues identified based on severity

Figure 2. HEQS

The Usability Professionals' Association, Bangalore hopes to host many more such competitions to help define and improve heuristic expertise standards. Based on this study, average expertise can be defined as an evaluator with 2 years of heuristic evaluation experience identifying 8% of the issues, weighted by severity, in one hour.


Kirmani, S. and Rajasekaran, S. (2007). Heuristic Evaluation Quality Score (HEQS): a measure of heuristic evaluation skills. Journal of Usability Studies, Volume 2, Issue 2, pp 61-75.

UPA. (2005). UPA 2005 Member and Salary Survey. Usability Professionals' Association, Bloomingdale, IL.

Nielsen, J. (1994). How to Conduct a Heuristic Evaluation. Retrieved from http://www.useit.com/papers/heuristic/heurstic_evaluation.html.


Usability Professionals' Association
promoting usability concepts and techniques worldwide
Phone + 1.630.980.4997         office@upassoc.org

Contact the Voice