Wednesday, November 19, 2014

How Wearable Technology Inspires Game Development

In this article, Josephine Tsay considers the gameplay possibilities when the player's own biology is used as an input and how the blending of physical and virtual can lead to a truly personalized gameplay experience.

New devices change not only how we play, but how we re-imagine the player experience. What is interesting about the intersection of wearable technology and health games is that it removes the hardware controller as an input barrier and puts the agent, the player, truly at the center of the experience. With wearable devices, the player experience is no longer limited to pressing plastic buttons: the player's own biology becomes one input, or many.

Consider, for example, using one’s own arm as an input controller [1]. Imagine a player touching their arm to engage in gameplay. What analogies and metaphors can we, as game designers, extend from that interaction? The neural impulses from pressing one’s own skin can trigger a far more visceral response than tapping on hardware. A new player empathy map [2] subsequently emerges, along with the potential for new game scenarios. What types of horror games could push this analogy? How does it inform other genres? What is the potential to teach gameplay through this type of input from the start, the way even the menu screen of Megaman [3] teaches the shooting mechanic from the get-go?

What wearable technology brings to games is this direct cause-and-effect feedback between virtual and physical environments. In gaming, “wearable technology” often evokes next-generation head-mounted displays of the Oculus Rift/Google Glass variety. Yet various health-related devices show potential for unique immersive experiences and gameplay that use the body as an input device, even when the device was never intended for games. The LUMOback posture sensor [4] is best known as a posture-improvement device. In its app, however, a stick figure shifts in real time with your body movements. The magic moment is a combination of how you’re moving your body, how the avatar on your phone responds to it, and, depending on the settings, the very physical sensation of the belt vibrating against your lower back. The angle at which one positions one’s body can now be part of the game design consideration set. This particular device was not designed for games, but interactions like this can serve as inspiration, at the very least, for innovative gameplay.
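To make that feedback loop concrete, here is a minimal sketch of the kind of game tick the LUMOback interaction suggests: a body angle comes in, the on-screen avatar mirrors it, and a haptic buzz fires past a slouch threshold. None of these names come from any real sensor SDK; `read_angle`, `set_avatar_lean`, `vibrate`, and the threshold value are all hypothetical stand-ins for whatever API an actual device exposes.

```python
SLOUCH_THRESHOLD_DEG = 15.0  # hypothetical lean angle beyond which the belt buzzes

class PostureGameLoop:
    """One sensor-to-feedback loop: body in, avatar + haptics out."""

    def __init__(self, read_angle, set_avatar_lean, vibrate):
        # read_angle: callable returning the player's lean angle in degrees
        # set_avatar_lean / vibrate: callables driving the avatar and the haptics
        self.read_angle = read_angle
        self.set_avatar_lean = set_avatar_lean
        self.vibrate = vibrate

    def tick(self):
        """One frame: sample the body, update the virtual avatar,
        and respond physically if the player is slouching."""
        angle = self.read_angle()
        self.set_avatar_lean(angle)            # virtual response
        if abs(angle) > SLOUCH_THRESHOLD_DEG:
            self.vibrate()                     # physical response
        return angle

if __name__ == "__main__":
    # Simulated usage: the "sensor" sweeps into a slouch and back.
    readings = iter([2.0, 8.0, 17.0, 21.0, 9.0])
    events = []
    loop = PostureGameLoop(
        read_angle=lambda: next(readings),
        set_avatar_lean=lambda a: events.append(("avatar", a)),
        vibrate=lambda: events.append(("buzz",)),
    )
    for _ in range(5):
        loop.tick()
    # Two frames exceeded the threshold, so the belt buzzed twice.
    print(sum(1 for e in events if e == ("buzz",)))  # → 2
```

The point of the sketch is the symmetry: every frame produces both a virtual response and, conditionally, a physical one, which is the "magic moment" the paragraph describes.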

The intersection between wearable technology and health games is an interesting one, mainly because it blends physical and virtual worlds in a way that goes beyond “just for fun.” Phobious “uses your smartphone as a Virtual Reality device to expose you to those situations that you fear, slowly and gradually.” Thync “creates wearable consumer products that use neurosignaling to shift your state of mind.” Such developments open the door for games that use biofeedback to alter levels, as Nevermind strives to achieve with its “haunting gameplay experience,” in which “a biofeedback sensor will monitor how scared or stressed you become moment-to-moment.” By using biofeedback and neurosignals, the player experience can be personalized in a way that is specific to the individual player. Layer that with player types, and a game can really feel as if it were crafted especially for you.
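As an illustrative sketch only (Nevermind's actual biofeedback pipeline is not described here, so the sensor values, mapping, and smoothing constant below are invented), one way such a system could work is to map a stress proxy like heart rate to a 0–1 intensity scalar and ease the scares off as the player gets more stressed, smoothing the signal so the game shifts gradually rather than on every heartbeat:

```python
def stress_to_intensity(heart_rate, resting=60.0, max_hr=180.0):
    """Map a heart-rate reading onto a 0..1 stress scalar (clamped)."""
    t = (heart_rate - resting) / (max_hr - resting)
    return min(1.0, max(0.0, t))

class AdaptiveScare:
    """Moment-to-moment difficulty: a calmer player gets a scarier game,
    an overwhelmed player gets a reprieve."""

    def __init__(self, smoothing=0.8):
        self.smoothing = smoothing  # higher = slower, gentler transitions
        self.intensity = 0.0        # current 0..1 "horror intensity"

    def update(self, heart_rate):
        target = 1.0 - stress_to_intensity(heart_rate)
        # Exponential smoothing so intensity drifts toward the target
        # instead of jumping with every reading.
        self.intensity = (self.smoothing * self.intensity
                          + (1.0 - self.smoothing) * target)
        return self.intensity

scare = AdaptiveScare()
for hr in [62, 65, 90, 130, 150]:
    level = scare.update(hr)
print(round(level, 3))  # → 0.391
```

Note how the final intensity ends up moderate: the simulated player's rising heart rate pulls the target down faster than the smoothed value can climb, which is exactly the self-regulating loop biofeedback games aim for.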

Nevermind screenshot
The next few years will be really exciting as wearable technology continues to disrupt and push the potential of games. As wearable controllers go beyond watches and gloves to jackets and arms, from the screen to the screen-less, the player becomes the center of the game experience in a way that continues to stretch the imagination and propels the industry forward.


[1] “Skinput turns your arm into a touchscreen”, Lisa Zyga, http://phys.org/news186681149.html

[2] Empathy Mapping, Stanford Design School, https://dschool.stanford.edu/groups/k12/wiki/3d994/Empathy_Map.html

[3] “Egoraptor Discusses Megaman’s Game Design”, Tuts+, http://code.tutsplus.com/articles/weekend-lecture-egoraptor-discusses-megamans-game-design--active-10557

[4] LUMOback Kickstarter page, https://www.kickstarter.com/projects/lumoback/lumoback-the-smart-posture-sensor

Josephine Tsay studied at Carnegie Mellon University’s Entertainment Technology Center and U.C. Berkeley’s College of Environmental Design. Her work spans story, games, wearable tech, educational tech, and mobile user experience. She worked at Google for several years and is currently exploring the intersection of psychology and games.
