
Multi-screen Games and Beyond: New Dimensions in User Interaction

At the Electronic Entertainment Expo (E3) in June 2011, Nintendo introduced its next-generation game console, the Wii U. It’s nothing less than a radically new approach to the user experience of games: two screens work together wirelessly, allowing different views for different players, and much more. Not only is this new paradigm a major shift in the player experience, it’s also a huge change in the way game designers must think.

Smartphone or Tablet as Another Screen

Nintendo isn’t the only kid on the block who’s playing around with multi-screen gaming:

  • Apple is also aggressively moving into this space. They recently introduced the AirPlay SDK for games running on the iPad 2, which can now send video wirelessly to the Apple TV 2, creating an experience similar to Nintendo’s solution.
  • OnLive, the current leader in cloud-based games, has created an iPad app that lets players use that device in conjunction with their service. Using OnLive in conjunction with AirPlay is a strong possibility, opening up some interesting game-playing scenarios.
  • Adobe has been showing plenty of game examples using Android devices as controllers.
  • Qualcomm’s open source AllJoyn project also allows developers to create multi-screen experiences using Android devices.
  • Our company, Brass Monkey, created a two-screen game experience for the Star Wars Trench Run in July 2010; we enabled an iPhone version of the game to become a controller for people playing the game on the LucasFilm website, www.starwars.com.

Using a smartphone as one of the screens means that people have game controllers with them at all times. This allows for interesting scenarios in which people can use their phones to interact with digital signs at locations outside of the home. Imagine, for example, multi-screen gaming with the televisions in bars and cafes. Wii-style sports games, trivia, Texas hold ’em poker, and a whole slew of other game designs make perfect sense in these settings. For advertisers, being able to engage directly with consumers in such a compelling way is a dream that’s possible today.

Game Designs

Now let’s look in some detail at new game designs enabled through handheld touch devices working with a larger screen.

Multiplayer, each with a different view: Take a typical card game, for example, and imagine showing each player’s cards on his or her own phone, hidden from the other players, while all the players focus on a host screen that shows the table and the shared state of the game.
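To make this concrete, here is a minimal sketch of that view routing in TypeScript. The Connection interface stands in for whatever transport the platform provides (WebSockets, a vendor SDK, and so on), and every message shape is an assumption made for illustration; the essential point is only that the game state has a private partition per player and a public partition for the host screen.

// A minimal sketch of view routing for a multi-screen card game.
// Connection stands in for the platform's transport; all message
// shapes here are illustrative assumptions.
interface Connection { send(data: string): void; }

type Card = { rank: string; suit: string };

const players = new Map<string, Connection>(); // one phone per player id
let hostScreen: Connection | undefined;        // the shared table screen

// Each player's hand goes only to that player's own phone...
function dealHands(hands: Map<string, Card[]>): void {
  for (const [playerId, hand] of hands) {
    players.get(playerId)?.send(JSON.stringify({ type: "private-hand", hand }));
  }
}

// ...while shared state (discard pile, whose turn it is) goes only
// to the host screen that everyone is looking at.
function updateTable(shared: { discard: Card[]; turn: string }): void {
  hostScreen?.send(JSON.stringify({ type: "table-state", ...shared }));
}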

Multiplayer with different roles: Nintendo demonstrated this scenario with some of the Wii U demos at E3. The player using the touch-screen controller acts as a boss, viewing the action on his personal screen, while the other players use standard Wiimotes to control characters running around on a split-screen setup on the larger screen.

Extra screen as an auxiliary view: One of the screens can act as a separate view into important aspects of the game. Let’s say the user has an iPhone as the controller and an iPad as the map view, while the TV screen shows the main action. The player can stay focused on the main screen but use the iPad as a sort of dashboard, keeping all the important information available at a quick glance. This setup would be a fantastic advantage for those playing RPGs (role-playing games) in the style of World of Warcraft.
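One way to realize such an auxiliary view, sketched below in TypeScript: the host pushes compact, low-rate state snapshots, and the tablet redraws its map whenever one arrives. The snapshot fields and the drawMap stub are hypothetical, not any real game’s API.

// The host-side game loop pushes a compact snapshot a few times per
// second; the tablet redraws its map on each arrival.
type MapSnapshot = {
  player: { x: number; y: number };
  enemies: { x: number; y: number }[];
  health: number;
};

interface Connection { send(data: string): void; }

// Host (TV) side: the map does not need the action view's full frame
// rate, so a low-rate snapshot keeps the wireless traffic light.
function publishSnapshot(tablet: Connection, s: MapSnapshot): void {
  tablet.send(JSON.stringify({ type: "map-snapshot", ...s }));
}

// Tablet side: redraw the dashboard on each snapshot.
function onMessage(raw: string): void {
  const msg = JSON.parse(raw);
  if (msg.type === "map-snapshot") drawMap(msg as MapSnapshot);
}

function drawMap(s: MapSnapshot): void {
  // App-specific rendering would go here; stubbed for the sketch.
  console.log(`player at (${s.player.x}, ${s.player.y}), health ${s.health}`);
}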

Controller screen as a view into another screen: If you use the controller’s camera (found on virtually all smartphones and on the new Wii U controller) to augment the view on the large screen, you are essentially seeing through the device (an iPhone 4, for example) into the other screen; the device’s display changes what you see.

Imagine a tank game in which you control a tank within a 3D environment. The goal of the game is to shoot enemy tanks as they move about, while avoiding their fire. You hold the phone in portrait mode, controlling your tank’s movements with a mini virtual joystick on the lower-right side of the screen. You hold the phone up, looking through it as a scope: moving the phone in any direction moves your tank’s turret, and your view, along with it. The iPhone’s camera is activated so that the world around it appears on its touch screen, and because the camera is pointed at the host screen, you can see through your phone to the action being shown on the screen in front of you. (A sketch of this control scheme follows the list below.) This setup allows for many possible scenarios:

  • The device’s camera display can show content not viewable on the host screen with the unaided eye, for example, simulated X-ray vision, otherwise invisible enemies, or night vision.
  • The device’s display can show a reticle used for lining up a shot and indicating when an enemy is locked on for firing (as described above).
  • The display could simulate real 3D based on head-tilt position, similar to the example at http://www.youtube.com/watch?v=Jd3-eiid-Uw.
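Here is the promised sketch of the scope control, assuming a browser-based phone client written in TypeScript: DeviceOrientationEvent supplies the phone’s attitude, which maps onto the turret’s aim and travels to the host. The endpoint URL and the aim/fire message shapes are hypothetical.

// Orientation-driven scope control on the phone. The host address
// and the message shapes are hypothetical.
const socket = new WebSocket("ws://game-host.local:8080"); // hypothetical host

window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  // alpha is the compass heading (0-360); beta is front-back tilt.
  const yaw = e.alpha ?? 0;   // turret rotation follows the phone's heading
  const pitch = e.beta ?? 0;  // barrel elevation follows the phone's tilt
  socket.send(JSON.stringify({ type: "aim", yaw, pitch }));
});

// Fire when the player taps the reticle overlaid on the camera view.
document.getElementById("reticle")?.addEventListener("touchstart", () => {
  socket.send(JSON.stringify({ type: "fire" }));
});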

There are countless other ways that screens can be put together to create compelling game experiences. My goal is to inspire you to think of some scenarios yourself and go out and make them a reality.

Beyond Games

Games are certainly an obvious application of multi-screen experiences, but how will this concept affect our lives beyond games? Where else can you imagine combining multiple screens to create rich user engagement? Here’s a quick list:

The classroom: Real-time collaboration applications in classrooms are a great use of the multi-screen experience. Imagine a professor giving a lecture during which he periodically poses a problem for the students to solve. The professor has an application running on a large screen that all the students can see; it shows the question he is asking. The students run a companion app on a variety of devices that works with the professor’s application, so the question appears on each student’s screen, prompting for an answer. Perhaps the question requires drawing a diagram in an answer region. Each student then draws what he or she thinks the answer should be. Some students may be using the latest iPad, drawing with a finger; others may be running the application on a laptop, drawing in the answer region with its trackpad.

When students are done, they click a button within the application that instantly submits their results to the professor’s program. Once the results are in, the professor can choose one student to share an answer by selecting that student from a list in his application, which displays the student’s diagram on the large screen for the whole class to view. At this point, the professor can discuss how the student did, perhaps making corrections to the diagram by drawing on top of it with his laptop’s mouse.
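A rough sketch of this exchange follows. The message types, and the decision to ship drawings as PNG data URLs, are illustrative assumptions rather than any real classroom product’s API.

// Classroom submit-and-share flow. Connection stands in for the
// transport; all message shapes are assumptions.
interface Connection { send(data: string): void; }

// Student side: wired to the Submit button.
function submitAnswer(server: Connection, studentId: string,
                      canvas: HTMLCanvasElement): void {
  server.send(JSON.stringify({
    type: "answer",
    studentId,
    image: canvas.toDataURL("image/png"), // the student's drawing
  }));
}

// Professor side: collect answers as they arrive...
const answers = new Map<string, string>();
function onAnswer(msg: { studentId: string; image: string }): void {
  answers.set(msg.studentId, msg.image);
}

// ...and push a selected student's drawing to the shared screen.
function shareAnswer(studentId: string, bigScreen: Connection): void {
  const image = answers.get(studentId);
  if (image) bigScreen.send(JSON.stringify({ type: "show-answer", image }));
}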

Medical settings: Hospital and patient-care settings also pose interesting possibilities for portable and fixed screens working together. For one thing, large touch screens aren’t exactly the most sanitary devices; every user leaves germ-laden smudges on them. Perhaps there is another way to interact with these screens.

Imagine a doctor-patient visit. The doctor pulls up the patient’s medical records on a mobile device, let’s say an Android tablet. The application alerts the doctor that the patient’s MRI results are in, and she would like to review them with the patient. Mounted on the wall of the examination room is a large LCD screen hooked up to a computer running a compatible medical application. Once the doctor’s tablet is detected and the two endpoints are connected, the doctor can pull the patient’s MRI image onto the larger screen to go over it with her patient. She can manipulate the image using touch gestures on the tablet rather than on the LCD screen: pinch to zoom in, two fingers to rotate, and all the other gestures we’ve become used to.
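One plausible way to wire this up appears below: the tablet interprets the gestures locally and forwards only the resulting transform to the wall screen, which applies it to the image. Every name and message shape in the sketch is illustrative.

// Gesture forwarding for the MRI review. Connection stands in for
// the transport; the transform message is an assumption.
interface Connection { send(data: string): void; }

type ImageTransform = { scale: number; rotation: number };

const current: ImageTransform = { scale: 1, rotation: 0 };

// Tablet side: called by the gesture recognizer on each update.
function onGesture(deltaScale: number, deltaRotation: number,
                   wallScreen: Connection): void {
  current.scale *= deltaScale;       // pinch accumulates zoom
  current.rotation += deltaRotation; // two-finger twist accumulates rotation
  wallScreen.send(JSON.stringify({ type: "transform", ...current }));
}

// Wall-screen side: only a few numbers cross the wire, never the
// full MRI image, so the interaction stays responsive.
function applyTransform(img: HTMLImageElement, t: ImageTransform): void {
  img.style.transform = `rotate(${t.rotation}rad) scale(${t.scale})`;
}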

Museums, amusement parks, and other interactive experiences: Museums and other situations with interactive displays, both digital and physical, are another target for this type of technology. People love to interact with museum displays, and the more interactive the installation, the more use it usually gets. The problem is that all that use takes its toll. The display controllers often break, and maintenance of the installations can be a tough job. If we start letting people use the devices in their pockets, however, we put the maintenance responsibility back on the user.

Computers that drive the experiences of physical installations, like those in museums and theme parks, can allow for interaction with mobile screens. Imagine a museum installation about prehistoric humans that includes moving mannequins controlled by a computer out of the visitors’ sight. Typically, a museum would let visitors control the experience via physical buttons on the display case. Instead, imagine that visitors can now use their mobile phones to trigger the interactions.

Another installation could be a series of smaller fixed screens with which the user could interact. The possibilities for public installations are just as unlimited as the possibilities for games.

The Future

User experiences will involve us interacting with screens everywhere. Every screen, from the one you carry around in your pocket to televisions, digital kiosks, and Jumbotrons at the ballpark, will work together for the next generation of experiences. Games will also undergo a major revolution because of all these connected screens. It’s happening today, and it’s all very exciting.
