Video games are designed to deliver engaging player experiences that span nearly the full range of human emotion. Sculpting this blend of emotions is the job of the game designer. However, evaluating how close the final game is to the designer's original intent can be difficult.
If we were evaluating a website, this would be where usability testing comes in. A set of tasks would be drawn up and the users’ ability to complete these tasks would be evaluated using appropriate methods. For games, however, what are the most appropriate methods for assessing the player experience?
At Vertical Slice we’re interested in understanding four key aspects of players and gameplay:
Behavior: What did the player do?
Rationale: Why did they behave as they did?
Perception: What do they think happened?
Experience: How did the gameplay make them feel?
By understanding the relationship between players’ behavior and experience, we begin to gain better insights into understanding the player experience. But collecting this information using traditional methods can lead to problems.
What’s Wrong Today?
First, the usual methods of determining the players’ experience (such as questionnaires or interviews) are sampling methods, meaning that the players respond with what is true at a specific moment in time. If they fill out questionnaires during the game, then we are interrupting and modifying their experience of the game. If we wait until the end of play, then they may have forgotten what the real experience was like. Wouldn’t it be better if we could capture their experience continuously?
Second, if we ask players to self-report in interviews or questionnaires, we rely on their awareness, recall, and cognitive filtering abilities to function before a response emerges—probably a tainted one. Wouldn’t it be useful to capture their experience unconsciously?
So how do we collect this information automatically and unconsciously from players?
One approach is facial coding, based on Paul Ekman's proposal that six basic facial expressions are common to all of us. These basic expressions are automatic; they occur whether we like it or not. Coding them doesn't require interrupting the player (continuous), and we don't have to ask for their opinion (unconscious).
In video games, however, these basic emotions are not usually expressed. In playtests it’s common for players to remain quite expressionless, making facial coding very difficult, not to mention time-consuming. Self-report methods are also far from ideal. In the vast majority of cases, players can’t accurately remember their gameplay experience, even after short game sessions.
Immediately following gameplay sessions, we often ask players to draw a graph of their experience. In almost all cases, the players draw a line graph which contains one peak or trough—what psychologists call the serial position effect. Broadly speaking, people tend to remember events at the start, the end, and perhaps one in the middle. Figure 1 shows a player experience diagram drawn after just a twenty-minute gameplay session.
There are three reasons for the lack of rich detail in these player experience diagrams:
Awareness – Players have to be aware that a visual, auditory, or cognitive event occurred.
Recall – Players are poor at recalling the subtlety of gameplay and only remember key events.
Cognitive filtering – Players’ true experiences may be tainted by factors such as trying to please the moderator, excitement at being in a game studio, or playing a pre-release game.
Although self-report methods are not reliable for understanding the actual player experience, player experience diagrams do help us address the perception question: what players think happened, and what they may tell their friends about.
If facial coding, questionnaires, and interviews are not particularly reliable, wouldn’t it be useful if we had an automatic way of capturing the player experience?
Biometrics is the practice of using sensors attached to a subject's body to monitor physiological data. Typical sensors measure arousal (excitement), valence (mood), respiration, heart rate, brain waves, and facial muscle movement.
During playtests, we can view the game and which buttons players are pushing, their faces or bodies, and the players’ biometrics, all in real time. This allows us to observe what the player is doing (behavior) and see the associated experiential reaction at any second. When we add real-time eye tracking into the mix (the green dot in Figure 2), we gain further insights.
The use of biometrics does not directly identify the emotion that someone experiences. When we see a peak in the signal of the galvanic skin response sensor which we use to measure arousal (excitement), we simply make a note of the timestamp and then replay the event to the player in post-interview. Using biometrics to drive our post-interviews results in more meaningful insights into players’ motivations and expectations.
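The peak-flagging step described above can be sketched in code. The following is a minimal illustration, not the tool the article's authors used: it compares each galvanic skin response sample against a short moving-average baseline and records the timestamps of sharp rises, which would then be replayed to the player in post-interview. The sample trace, window size, and threshold are all hypothetical.

```python
# Minimal sketch: flag arousal peaks in a galvanic skin response (GSR) trace
# so their timestamps can be replayed to the player in post-interview.
# Sample values, window size, and threshold are illustrative assumptions.

def find_arousal_peaks(samples, window=5, rise_threshold=0.5):
    """Return timestamps where the GSR signal rises sharply above
    its recent moving-average baseline."""
    peaks = []
    for i in range(window, len(samples)):
        t, value = samples[i]
        baseline = sum(v for _, v in samples[i - window:i]) / window
        if value - baseline > rise_threshold:
            peaks.append(t)
    return peaks

# Hypothetical trace: (seconds into session, skin conductance in microsiemens)
trace = [(t, 2.0) for t in range(0, 10)]               # calm baseline
trace += [(10, 2.1), (11, 2.4), (12, 3.2), (13, 3.5)]  # sharp rise: a big jump
trace += [(t, 2.2) for t in range(14, 20)]             # settling back down

print(find_arousal_peaks(trace))  # → [12, 13]
```

In a real session the flagged timestamps would be cross-referenced with the gameplay recording, so the moderator can jump straight to the moments that provoked a reaction.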
Case Study: Pure
Black Rock Studio’s Pure offers off-road quad-bike action. The game was designed as an adrenaline racer. The spectacular jumps, in particular, were intended to cause peaks of excitement. To evaluate whether Pure was being experienced as the designer had intended, we employed a biometric approach to identify the exact moments in the game that caused arousal (excitement or frustration). We examined the first three levels of Pure using a player with no previous experience of the game.
On Track 1, the player was experiencing arousal moments before the game began. In post-interview, he said that he was anticipating good things from the game because the graphics were good and he could see many other racers on the track.
During the initial moments of gameplay, his arousal levels began to decrease. When we analyzed his behavior, he was merely racing with competitors; there was no opportunity to do tricks and jumps. When he reached the first large jump, though, there was a large peak in his arousal levels. Was this due to the novelty of the first time, or does the player continually enjoy the jumps? This sort of information would be very difficult to determine using traditional user research approaches.
What was surprising, however, was that for a racing game, the racing itself did not lead to enjoyment for this player. Rather, it was the tricks and jumps. Other players may have had a different experience.
As the countdown began to the race on Track 2, we could identify player arousal, in this case the expectation of racing. The player had enjoyed the first level and was looking forward to this one, but the fun didn't materialize. The level was designed as a quick race, so there were no opportunities for jumps and tricks, the gameplay elements that had led to fun in the first level.
At the start of Track 3, we were concerned that we wouldn’t see any peak before the race; that he’d lost interest. He did peak, however. In post-interview he said he could tell from the graphics that new experiences were going to be offered, and he was intrigued.
In the case of Pure, biometrics helped us to identify precise moments in which the player experienced arousal. Sometimes this arousal was good, such as when jumping off a cliff, and sometimes it was bad, such as frustration at confusing gameplay elements.
This biometric approach is incredibly useful for identifying what Jesse Schell refers to as “psychographics.” Whereas demographics identify external factors such as age, gender, and games played, psychographics identify individual motivations. Using biometrics, we identify precisely which game mechanics players enjoy (collecting objects, fighting, or exploring) without ever having to ask them during gameplay. Of course this would all be verified in post-interview, but before that stage we already have a reasonable profile of who this player is.
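The profiling idea described above can be illustrated with a small sketch: attribute each arousal peak to the logged game mechanic it coincides with, and count the results. The event names, timings, and tolerance window here are hypothetical, not data from the Pure study.

```python
# Minimal sketch: turn timestamped arousal peaks into a rough psychographic
# profile by counting which logged game mechanic each peak coincides with.
# Event names, timings, and the tolerance window are hypothetical.

from collections import Counter

def psychographic_profile(peak_times, event_log, tolerance=2):
    """Count arousal peaks per mechanic; a peak is attributed to an event
    if it occurs within `tolerance` seconds of that event."""
    profile = Counter()
    for peak in peak_times:
        for event_time, mechanic in event_log:
            if abs(peak - event_time) <= tolerance:
                profile[mechanic] += 1
                break
    return profile

# Hypothetical session data: (seconds into session, mechanic)
events = [(12, "jump"), (30, "race start"), (45, "jump"), (60, "trick")]
peaks = [11, 13, 46, 61]

print(psychographic_profile(peaks, events))  # → Counter({'jump': 3, 'trick': 1})
```

A profile like `{'jump': 3, 'trick': 1}` suggests a player motivated by jumps rather than racing, which would then be checked against the post-interview, as the article describes.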
On our player database we also have notes on individual psychographics. It’s a useful tool for player recruitment.
Biometrics are also particularly useful for improving user research reports. Using the correct user research methods and analyzing data thoroughly matters little if the final results are not communicated well, are not believable, or are not acted on.
What we use is called biometric storyboards. All of our methods are aimed at capturing and understanding the actual experience of the player interacting with our client’s game. If a game is a narrative journey unfolding in time, our way of communicating the experience is to visualize the player’s complete journey.
Based on feedback from clients, we are now on our third iteration of biometric storyboards. The first version (Figure 3) was divided up by time, but time is not meaningful for some games. Beats, or thematic areas, may be more representative, as in Figure 4. The current version (Figure 5) makes the diagram easier to read and matches behavior (the text along the bottom) to the associated player experience.
Clients have used biometric storyboards to enable discussion. They are easily understandable and use neutral language so that programmers, designers, artists, and producers can all quickly pinpoint areas of the game that are working, and those that need refining.
Comparing the biometric storyboard version of gameplay with what the player actually remembers (player-experience graphs) shows how much more detail can be captured using the storyboard approach. It is as close as we have come to representing the player’s true gameplay experience.
Although we use biometrics to identify gameplay issues, understand player motivations, and report the player experience, they are not a perfect solution. We still ask the player to identify which emotion they were experiencing, and they could still lie. Now, however, we have another data point for forming an opinion. Biometrics help me as a user researcher to reduce the amount of uncertainty in explaining an issue, and they add confidence to the findings in our client reports.
Analyzing the player’s gameplay behavior, seeing which joypad buttons they pushed, noting their various biometric sensor measurements, seeing where their eyes looked, and capturing their facial expressions—all in real time—gives me a reasonable set of data points to make an informed decision on the player experience. I may also confirm this with the players through traditional interviews and questionnaires.
Using different biometric sensors, particularly EMG (facial muscles) and EEG (brain waves), could be appropriate for understanding other digital experiences, for example, gaining insight into how users experience websites. Indeed, the techniques have been used for some time by the movie industry.
Understanding a player’s interaction with a game takes us on a user research journey covering behavior, rationale, perception, and emotion. Video games are highly complex, but still not as complex as the player. These research methods embrace the essence of video game play: the unconscious and continuous experience. Using these approaches gets us closer to what we’re really trying to achieve: making games better.
Retrieved from http://uxpamagazine.org/biometrics_player_experience/