Half Day Tutorial Details


The Psychological Basis of UI Design Rules

An introduction to cognitive psychology for UX and Usability practitioners

Applying UI design rules effectively requires determining their applicability (and precedence) in specific situations. It also requires balancing the trade-offs that arise when design rules appear to contradict each other. By understanding the psychology underlying the design rules, designers and evaluators enhance their ability to interpret and apply them. Explaining that psychology is the focus of this tutorial. It is intended for UX and Usability practitioners who did not take Cognitive Psychology as part of their university education.


Half Day Tutorial by Jeff Johnson (UI Wizards, Inc.)
Category:
Track: General
Schedule: To Be Determined

About the Half Day Tutorial

Introduction: User Interface Design Rules: Where do they come from? Presents familiar sets of UI design rules, discusses their similarities, and explains that they are not simply invented out of thin air and that applying them well requires understanding their basis in human perception and cognition. Sets of UI design rules discussed include:
- Nielsen and Molich (1993)
- Stone et al. (2005)
- Shneiderman & Plaisant (2005)

Topic 1: We perceive what we expect. Explains – and demonstrates – that perception is biased by:
- Past: experience
- Present: context
- Future: goals

Topic 2: Our vision is optimized to see structure. Presents the Gestalt principles of visual perception, with visual demonstrations and examples of their application to UI design. The Gestalt principles are:
- Proximity
- Similarity
- Continuity
- Closure
- Symmetry
- Common fate
- Figure/ground
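Proximity and similarity are the two Gestalt principles a designer controls most directly through layout. A hypothetical TypeScript sketch (the spacing values, names, and colors below are ours, not part of the tutorial): related controls get tighter spacing than the gap between groups, and controls of the same kind share one visual treatment.

    // Proximity: items inside a group sit closer together than the groups
    // themselves, so the eye parses them as belonging together.
    const spacing = {
      withinGroup: 8,    // px between related fields (e.g. street / city / zip)
      betweenGroups: 32, // px between unrelated groups (e.g. address vs. payment)
    };

    // Similarity: controls that do the same kind of thing share one style,
    // so they read as a family even when they are not adjacent.
    const primaryActionStyle = { background: "#0057b8", color: "#ffffff" };

    function applyGroupSpacing(field: HTMLElement, lastInGroup: boolean): void {
      field.style.marginBottom =
        `${lastInGroup ? spacing.betweenGroups : spacing.withinGroup}px`;
    }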

Topic 3: We seek, use, and impose structure. Shows that we use structure when it is present in the world, and we create it even when it is absent.
- Structured info is easier to scan
- Value of visual hierarchy
- We impose structure on everything
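One concrete way to exploit the first point is to add visual structure to long data strings. A minimal sketch (grouping into blocks of four digits is our illustrative assumption; real formats vary by data type and locale):

    function chunkDigits(raw: string, groupSize = 4): string {
      // "4916123456781234" -> "4916 1234 5678 1234": the same information,
      // but the added structure makes it far easier to scan and verify.
      const digits = raw.replace(/\D/g, "");
      const groups: string[] = [];
      for (let i = 0; i < digits.length; i += groupSize) {
        groups.push(digits.slice(i, i + groupSize));
      }
      return groups.join(" ");
    }

    console.log(chunkDigits("4916123456781234")); // "4916 1234 5678 1234"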

Topic 4: Reading is feature & pattern recognition. Explains how our visual system processes symbols to allow us to read. Demonstrates that:
- Learning to read involves training our visual system to recognize patterns
- Novice reading is context-driven, top-down, controlled; skilled reading is feature-driven, bottom-up, automatic
- Whatever disrupts feature & pattern recognition disrupts reading

Topic 5: Our color vision is limited. Demonstrates the strengths and limitations of our color vision, with examples of how that affects the optimal design of GUIs. Points:
- Our vision is optimized to detect contrasts (changes and edges), not absolute brightness
- We have trouble discriminating: pale colors, small color patches, separated patches
- Some people have color blindness
- Displays vary in color presentation
- Viewing conditions affect color perception
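Because we detect contrast rather than absolute brightness, a practical safeguard is to compute the contrast ratio between foreground and background colors. A minimal sketch using the WCAG 2.x relative-luminance and contrast-ratio formulas (WCAG is our reference for the arithmetic; the tutorial does not prescribe a specific metric):

    // Relative luminance of an sRGB color, per WCAG 2.x.
    function relativeLuminance(r: number, g: number, b: number): number {
      const lin = (c: number): number => {
        const s = c / 255;
        return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
      };
      return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
    }

    // Contrast ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
    function contrastRatio(
      fg: [number, number, number],
      bg: [number, number, number]
    ): number {
      const l1 = relativeLuminance(...fg);
      const l2 = relativeLuminance(...bg);
      const [hi, lo] = l1 >= l2 ? [l1, l2] : [l2, l1];
      return (hi + 0.05) / (lo + 0.05);
    }

    // Pale gray text on white: a combination many viewers cannot discriminate reliably.
    console.log(contrastRatio([200, 200, 200], [255, 255, 255]).toFixed(2)); // ~1.68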

Topic 6: Our peripheral vision is poor. Explains how our vision differs between the central (foveal) and peripheral areas of our visual field.
- Static items in subtle colors presented in periphery often will not be noticed
- Motion in periphery is noticed
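A common design consequence: if something outside the user's focus of attention must be noticed, brief motion is more reliable than a static colored label. A hypothetical sketch (the timing and function name are ours; motion like this should be used sparingly):

    // Briefly pulse an off-center status message. A static color change in the
    // periphery would likely go unseen; the motion is what attracts the eye.
    function pulse(element: HTMLElement, times = 3, intervalMs = 300): void {
      let toggles = 0;
      const timer = setInterval(() => {
        element.style.visibility =
          element.style.visibility === "hidden" ? "visible" : "hidden";
        toggles++;
        if (toggles >= times * 2) {
          clearInterval(timer);
          element.style.visibility = "visible"; // always end in the visible state
        }
      }, intervalMs);
    }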

Topic 7: Our attention is limited; our memory is imperfect. Discusses human short- and long-term memory, demonstrates their strengths and limits, and presents examples of computer UIs that do and do not take their limits into account.
- Short-term memory: features and limitations
- Long-term memory: features and limitations

Topic 8: Human thought-cycle: goal, execute, evaluate. Explains the normal cycle of human behavior, and how it dictates the optimal design of interactive systems.
- We run through this cycle repeatedly at many levels simultaneously, e.g., keystroke level, task level, long-term life goals level
- Thought-cycle interacts with short-term memory

Topic 9: Recognition is easy; recall is hard. Explains that human recognition is amazingly fast – so fast it must be accomplished via massively parallel processing – and that recall is slow and error-prone.
- Recognition is amazingly fast
- Recall is slow and unreliable
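In UI terms, this favors presenting choices users can recognize over asking them to recall and type exact values. A minimal sketch of a filter-as-you-type picker (the names and data are illustrative):

    function matchingOptions(options: string[], typedSoFar: string): string[] {
      // Show candidates the user can recognize, rather than requiring exact
      // recall of a country name, command, or file path.
      const needle = typedSoFar.trim().toLowerCase();
      return needle === ""
        ? options
        : options.filter(option => option.toLowerCase().includes(needle));
    }

    console.log(matchingOptions(["Norway", "Netherlands", "New Zealand"], "ne"));
    // -> ["Netherlands", "New Zealand"]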

Topic 10: We think mostly about our tasks, not our tools. Explains that computer users devote very little mental effort to thinking about their tools (e.g., computer software or websites) and instead think mainly about what they are trying to accomplish.
- Our mental resources are focused on our goals
- This makes us very literal in following “scent” toward our goal

Topic 11: Learning from experience and performing learned actions are usually easy; problem-solving is usually hard. Explains the difference between generalizing from experience, performing well-learned routines, solving novel problems, and calculation: the first two we do naturally and (usually) easily; the last two are usually difficult and often require domain-specific training. Argues that interactive systems that require users to diagnose problems or perform complex calculations will have a very limited user base.
- Our brains evolved to learn from experience and perform routine actions. They evolved a cortex for solving novel problems, but this capability is evolutionarily new and limited in capacity. The brain did not evolve to perform calculations; that is a purely artificial invention.
- We invented computers to enhance our ability to do calculation and problem solving
- Don't make users perform diagnostic tests
- Don't make users calculate things the system could calculate
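A small sketch of the last point (the budgeting scenario is our own example, not from the tutorial): let the system do the arithmetic so the user only has to recognize whether the result looks acceptable.

    function remainingBudget(budget: number, expenses: number[]): number {
      // The system performs the summation and subtraction; the user never
      // has to calculate how much is left.
      const spent = expenses.reduce((sum, expense) => sum + expense, 0);
      return budget - spent;
    }

    console.log(remainingBudget(500, [120, 89.5, 42])); // 248.5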

Topic 12: Many factors affect learning. Explains and demonstrates that we learn faster when:
- Operation is task-focused, simple, and consistent
- Vocabulary is task-focused, familiar, and consistent
- Risk is low

Topic 13: We have real-time requirements. Discusses time-constants of human perception, cognition, and action that affect how people perceive user interfaces. For human-computer interaction, the most important time-constants are:
- 0.1 sec: perception of cause-effect, perceptual-motor feedback, visual fusion
- 1.0 sec: average conversation gap, visual-motor (unexpected event) reaction time
- 10 sec: unit task, unbroken attention to a task, one step of a complex task
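These constants translate directly into interaction-design budgets. A hypothetical sketch (the thresholds come from the constants above; the function names are ours): acknowledge input within roughly 0.1 sec so cause and effect feel connected, and if an operation runs past roughly 1 sec, show a busy indicator so the pause reads as the system working rather than a broken conversation.

    async function runWithFeedback<T>(
      work: () => Promise<T>,
      showBusyIndicator: () => void,
      hideBusyIndicator: () => void
    ): Promise<T> {
      // Immediate acknowledgement (e.g. a pressed-button state) should already
      // have happened within ~0.1 sec of the user's action.
      const busyTimer = setTimeout(showBusyIndicator, 1000); // ~1.0 sec budget
      try {
        return await work();
      } finally {
        clearTimeout(busyTimer);
        hideBusyIndicator();
      }
    }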