The Magazine of the Usability Professionals' Association

Multi-screen Games and Beyond: New Dimensions in User Interaction

By Chris Allen

At the Electronic Entertainment Expo (E3) in June 2011, Nintendo introduced its next-generation game console, the Wii U. It’s nothing less than a radically new approach to the user experience of games: two screens work together wirelessly, allowing different views for different players, and much more. Not only is this new paradigm a major shift in the player experience, it’s also a huge change in the way game designers must think.

Smartphone or Tablet as Another Screen
Nintendo isn’t the only kid on the block who’s playing around with multi-screen gaming.

Using a smartphone as one of the screens means that people have game controllers with them at all times. This opens up interesting scenarios in which people can use their phones to interact with digital signs at locations outside of the home. Imagine, for example, multi-screen gaming with the televisions in bars and cafes. Wii sports games, trivia, Texas hold ’em poker, and a whole slew of other game designs make perfect sense in these settings. For advertisers, being able to engage directly with consumers in such a compelling way is a dream that’s possible today.

Game Designs
Now let’s look in some detail at new game designs enabled through handheld touch devices working with a larger screen.

Multiplayer, each with a different view: Take a typical card game, for example, and imagine each player’s cards showing up on his or her own phone, each hand different from the next, while all the players focus on a host screen for the table and the shared parts of the game.
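At bottom, that design is a host that knows the full deck, a private view for each player’s phone, and a shared view for the big screen. Here is a minimal TypeScript sketch of that split, assuming nothing about any particular console or vendor API; the player names, types, and dealing logic are invented purely for illustration, and the wireless transport is left out.

    // Host-side sketch: the host holds the full deck, each phone receives only
    // its own hand, and the shared view is what gets drawn on the big screen.
    type Card = { rank: string; suit: string };

    interface PrivateView { playerId: string; hand: Card[] }   // pushed only to that player's phone
    interface SharedView { tableCards: Card[]; turn: string }  // rendered on the host screen

    function buildDeck(): Card[] {
      const suits = ["spades", "hearts", "diamonds", "clubs"];
      const ranks = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"];
      return suits.flatMap(suit => ranks.map(rank => ({ rank, suit })));
    }

    // Deal a hand to each connected player and keep the rest for the table.
    function deal(playerIds: string[], handSize: number) {
      const deck = buildDeck().sort(() => Math.random() - 0.5); // quick-and-dirty shuffle for the sketch
      const privateViews: PrivateView[] = playerIds.map(playerId => ({
        playerId,
        hand: deck.splice(0, handSize),
      }));
      const sharedView: SharedView = { tableCards: [], turn: playerIds[0] };
      return { privateViews, sharedView };
    }

    // Each PrivateView would travel over the wireless link to the matching phone
    // (WebSocket, Bluetooth, or whatever the platform provides), while SharedView drives the TV.
    const { privateViews, sharedView } = deal(["alice", "bob"], 5);
    console.log(privateViews, sharedView);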

Multiplayer with different roles: Nintendo demonstrated this scenario with some of the Wii U demos at E3. The player using the touch-screen controller acts as a boss, viewing the action on his personal screen, while the other players use standard Wiimotes to control characters running around in a split-screen view on the larger screen.

Extra screen as an auxiliary view: One of the screens can act as a separate view into important aspects of the game. Let’s say the user has an iPhone as the controller and an iPad acts as the map view, while the TV screen shows the main action. The player can focus on the main screen but use the iPad as a sort of dashboard, with all the important information available at a quick glance. This setup would be a fantastic advantage for those playing RPGs (role-playing games) in the style of World of Warcraft.
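The underlying idea is that each connected screen registers a role and receives only the slice of game state that its role needs. A rough TypeScript sketch, with made-up role names and state fields:

    // Hypothetical roles: the TV, the iPad dashboard/map, and the iPhone controller.
    type Role = "main" | "map" | "controller";

    interface GameState {
      playerPosition: { x: number; y: number };
      worldMap: string[][];   // coarse tile grid for the dashboard view
      health: number;
      questLog: string[];
    }

    // The host computes a different view of the same state for each screen, so the
    // devices stay in sync without every one of them receiving (or rendering) everything.
    function viewFor(role: Role, state: GameState) {
      switch (role) {
        case "main":
          return { playerPosition: state.playerPosition };
        case "map":
          return { worldMap: state.worldMap, questLog: state.questLog, health: state.health };
        case "controller":
          return { health: state.health };
      }
    }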

Controller screen as a view into another screen: If you use the controller’s camera—found on virtually every smartphone and on the new Wii U controller—to augment the view on the large screen, you are essentially looking through the device (an iPhone 4, for example) into the other screen, with the device’s display altering what you see along the way.

Imagine a tank game in which you control a tank within a 3D environment. The goal of the game is to shoot enemy tanks as they move about and to avoid being hit by their fire. You hold the phone in portrait mode and drive your tank with a mini virtual joystick on the lower right side of the screen. To aim, you hold the phone up and look through it like a scope: moving the phone in any direction moves your tank’s turret (and your view) with it. The iPhone’s camera is activated so that whatever is going on around it is displayed on its touch screen, and because the camera is pointed at the host screen, you can see through your phone to the action being shown on the screen in front of you. This setup opens up many possible scenarios.
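Mechanically, the phone only has to stream its orientation to whatever machine drives the host screen. The browser-flavored TypeScript sketch below shows the idea; the ws:// address, the message format, and the angle mapping are all assumptions for illustration, not the actual Wii U or Brass Monkey plumbing.

    // Phone-side sketch: turn the device's orientation sensor into turret commands.
    const socket = new WebSocket("ws://game-host.local:8080"); // hypothetical host machine

    window.addEventListener("deviceorientation", (event: DeviceOrientationEvent) => {
      // alpha = compass heading (0-360 degrees), beta = front/back tilt (-180 to 180).
      // Map heading to turret yaw and tilt to turret pitch, clamped to the game's limits.
      const yaw = event.alpha ?? 0;
      const pitch = Math.max(-30, Math.min(30, event.beta ?? 0));

      if (socket.readyState === WebSocket.OPEN) {
        socket.send(JSON.stringify({ type: "turret", yaw, pitch }));
      }
    });

    // The mini virtual joystick would send similar messages, e.g. { type: "drive", x, y },
    // from the touch handlers on the lower part of the screen.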

There are countless other ways that screens can be put together to create compelling game experiences. My goal is to inspire you to think of some scenarios yourself and then go out and make them a reality.

Beyond Games
Games are certainly an obvious application of multi-screen experiences, but how will this concept affect our lives beyond games? Where else can you imagine combining multiple screens to create rich user engagement? Here’s a quick list:

The classroom: Real-time collaboration applications in classrooms are a great use of the multi-screen experience. Imagine a professor giving a lecture in which he periodically gives students a problem to solve. The professor has an application running on a large screen that all the students can see; it shows the question being asked. The students have a companion app running on a variety of devices that works with the teacher’s application, so the professor’s question shows up on their screens, prompting them for an answer. Perhaps the question requires a diagram in the answer region. Along with the other students, you draw what you think the answer should be. Some students may be using the latest iPad, drawing with a finger, and others may be running the application on a laptop, using its track pad to draw in the answer region.

When students are done, they click a button within the application that instantly submits their results to the professor’s program. The professor now has the results back from the students and can choose one to share: selecting a student from a list in his application displays that student’s diagram on the large screen where everyone can view it. At this point, the professor can discuss how the student did, perhaps making corrections to the diagram by drawing on top of it with his laptop’s mouse.
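Stripped of the user interface, that flow is just submit, list, and display. A tiny TypeScript sketch of the pieces the students’ app and the professor’s app would share; the names and the stroke format are invented for illustration:

    // A freehand answer is a list of strokes; each stroke is a list of (x, y) points.
    type StudentAnswer = { studentId: string; strokes: [number, number][][] };

    const submissions = new Map<string, StudentAnswer>();

    // Each student's app calls this when the Submit button is pressed.
    function submitAnswer(answer: StudentAnswer): void {
      submissions.set(answer.studentId, answer);
    }

    // The professor's app lists who has submitted...
    function listSubmitters(): string[] {
      return [...submissions.keys()];
    }

    // ...and pulls one student's diagram onto the shared screen for discussion.
    function showOnSharedScreen(studentId: string): StudentAnswer | undefined {
      return submissions.get(studentId); // the large-screen app would render these strokes
    }

    // Example flow: two students submit, then the professor picks one to display.
    submitAnswer({ studentId: "s1", strokes: [[[0, 0], [10, 10]]] });
    submitAnswer({ studentId: "s2", strokes: [[[5, 5], [5, 20], [20, 20]]] });
    console.log(listSubmitters());         // ["s1", "s2"]
    console.log(showOnSharedScreen("s1")); // the diagram the big screen would draw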

Medical settings: Hospital and patient-care settings also pose interesting possibilities for combining portable and fixed screens. One consideration is that large touch screens aren’t exactly the most sanitary devices; people leave germ-laden smudges all over them. Perhaps there is another way to interact with these screens.

Imagine a doctor-patient visit. The doctor pulls up the patient’s medical records on a mobile device—let’s say an Android tablet. The application alerts the doctor that the patient’s MRI results are in, and she would like to review them with the patient. Mounted on the wall of the examination room is a large LCD screen hooked up to a computer running a compatible medical application. When the doctor’s tablet is detected and the two endpoints are connected, the doctor can pull the patient’s MRI image onto the larger screen and go over it with her patient. She manipulates the image—pinch to zoom in, two fingers to rotate, and all the touch gestures we’ve become used to—on the tablet rather than on the LCD screen.
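All the wall display really needs from the tablet is a stream of gesture deltas to apply to the image it is already showing. Here is a sketch of what those messages and the receiving side might look like; the message names and fields are assumptions, not any real medical system’s API:

    // Messages the tablet would send as the doctor performs touch gestures.
    type GestureMessage =
      | { type: "pinch"; scaleDelta: number }   // zoom the MRI image in or out
      | { type: "rotate"; angleDelta: number }  // two-finger rotation, in degrees
      | { type: "pan"; dx: number; dy: number };

    interface ImageTransform { scale: number; rotation: number; x: number; y: number }

    // Runs on the computer driving the wall screen: the image is redrawn with the
    // updated transform after every message that arrives from the tablet.
    function applyGesture(t: ImageTransform, msg: GestureMessage): ImageTransform {
      switch (msg.type) {
        case "pinch":
          return { ...t, scale: t.scale * msg.scaleDelta };
        case "rotate":
          return { ...t, rotation: t.rotation + msg.angleDelta };
        case "pan":
          return { ...t, x: t.x + msg.dx, y: t.y + msg.dy };
      }
    }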

Museums, amusement parks, and other interactive experiences: Museums and other situations with interactive displays, both digital and physical, are another target for this type of technology. People love to interact with museum displays, and the more interactive the installation, the more use it usually gets. The problem is that all that use takes its toll. The display controllers often break, and maintenance of the installations can be a tough job. If we start letting people use the devices in their pockets, however, we put the maintenance responsibility back on the user.

Computers that drive the experiences of physical installations, like those in museums and theme parks, can allow for interaction with mobile screens. Imagine a museum installation about prehistoric humans: mannequins that move, controlled by a computer out of the visitors’ sight. Typically, a museum lets visitors control the experience via physical buttons on the display case. Instead, imagine that visitors can use their own mobile phones to trigger the interactions.

Another installation could be a series of smaller fixed screens with which the user could interact. The possibilities for public installations are just as unlimited as the possibilities for games.

The Future
User experiences will involve interacting with screens everywhere. Every screen, from the one you carry around in your pocket to televisions, digital kiosks, and Jumbotrons at the ballpark, will work together to create the next generation of experiences. Games will also undergo a major revolution as all of these screens become connected. It’s happening today, and it’s all very exciting.

Chris Allen is an international speaker, software inventor, and entrepreneur based in Boston, MA. He currently serves as president and CTO of Brass Monkey, a company that turns smartphones into game controllers.


User Experience Magazine is by and about usability professionals, featuring significant and unique articles dealing with the broad field of usability and the user experience.
http://www.usabilityprofessionals.org/upa_publications/user_experience/

This article was originally printed in User Experience Magazine, Volume 10, Issue 4, 2011.
http://www.usabilityprofessionals.org/upa_publications/past_issues/2011-4.html.

© Usability Professionals' Association
Contact UPA at http://www.usabilityprofessionals.org/about_upa/contact_upa.html