Chapter · January 2015
DOI: 10.1007/978-3-319-15985-0


A Heuristic Framework for Evaluating User
Experience in Games

Christina Hochleitner†, Wolfgang Hochleitner‡, Cornelia Graf†, Manfred Tscheligi†¥

† Center for Usability Research and Engineering, Vienna, Austria
‡ Dept. of Digital Media, University of Applied Sciences Upper Austria, Hagenberg, Austria
¥ ICT&S Center, University of Salzburg, Salzburg, Austria

Abstract This book chapter describes an approach to evaluating user experience in video
games by using heuristics. We provide a short overview of video games and explain the
concept of user-centred design for games. Furthermore, we describe the history of heuristics
for video games and the role of user experience in games in general. Based on our previous
work and experience we propose a revised framework consisting of two sets of heuristics
(game play/game story, virtual interface) to detect the most critical issues in games. To
assess its applicability for measuring user experience factors we compare the results of
expert evaluations of six current games with the user experience-based ratings of various
game reviews. Our findings indicate a correlation between the extent to which our framework
is satisfied and the game’s average rating.

1 Introduction

The computer games industry has increased remarkably in importance over
recent years (ESA 2011). The number of units sold climbs steadily and video
games have changed from being a product for a small minority to a widely used
and accepted medium. The expanding game market also opens the door for a se-
ries of research-related activities. Especially the term user experience (UX) has
become increasingly important. Researchers and human-computer interaction
(HCI) experts want to find out how computer gamers experience the game situa-
tion (cf. Clarke and Duimering 2006) to create more compelling and immersive
game environments, and the industry is interested in finding ways to measure UX
and to interpret the collected data (e.g. to acquire new target groups). The evalua-
tion of the user’s experience and the closely connected user-centred development
of video games have been addressed in numerous publications (cf. Marsh et al.
2005, Bostan and Marsh 2010). Several researchers have designed methods to
evaluate video games by adopting techniques from the usability field such as usa-
bility tests and heuristic evaluations. In recent years, sets of heuristics for the
evaluation of video games have been proposed, all treating overlapping subject ar-
eas but diverse in detail and quality of description (cf. Federoff 2002; Desurvire et
al. 2004; Korhonen and Koivisto 2006; Schaffer 2007; Pinelle et al. 2008a, 2008b;
Bernhaupt et al. 2007, 2008; Desurvire and Wiberg 2009; Livingston et al. 2010).
An approach similar to the one used in this chapter was presented by Febretti
and Garzotto (2009). They conducted a study evaluating
engagement, usability and playability, all UX factors, but they did not focus on the
overall UX of a game, which is the main aspect of our work.
As part of our research we are interested in reliable and cost-efficient methods
to measure and predict the UX of games. An area where this approach is expected
to be widely beneficial is the sector of so-called indie games – games that try to be
innovative and provide new experiences for the player on the one hand and that
are developed by smaller companies with little budget for expensive user and play
testing on the other hand (Gril 2008). In recent years we have investigated
several different sets of heuristics covering various aspects and oriented towards
different goals. Based on our experience in heuristic evaluations and games we have
categorized 49 heuristics into twelve categories that, from our point of view,
contain the most important factors for both the system’s usability and the
perceived UX. To validate our approach to heuristics and demonstrate their connection to
UX, we have conducted an evaluation of six games and related the obtained re-
sults to common game review reports.
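The comparison between heuristic-evaluation results and review ratings can be sketched as a rank correlation. The sketch below is our own illustration under assumptions: the scores are hypothetical placeholders, not the study's data, and the helper names (`rank`, `spearman`) are ours.

```python
# Sketch: relating the degree to which a game satisfies the heuristic
# framework to its average review rating via a rank correlation.
# All scores below are hypothetical placeholders, not the study's data.

def rank(values):
    """Assign 1-based ranks to a list of values; ties get averaged ranks."""
    indexed = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(indexed):
        j = i
        while j + 1 < len(indexed) and values[indexed[j + 1]] == values[indexed[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of sorted positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[indexed[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation, computed as Pearson on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical fraction of the 49 heuristics satisfied per game vs. the
# game's average review score (six games, as in the study design).
framework_satisfaction = [0.92, 0.78, 0.85, 0.60, 0.70, 0.88]
review_rating = [9.1, 7.5, 8.4, 6.2, 7.0, 8.8]
print(round(spearman(framework_satisfaction, review_rating), 3))
```

A high positive coefficient would mirror the chapter's finding that games satisfying more of the framework also tend to receive better reviews.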

1.1 Overview

Only a few approaches currently link the results of heuristic evaluation
methods to UX. Especially in the field of computer games, where the experience
is the leading factor, different aspects can be evaluated using heuristics. Therefore,
we put the main focus of this chapter on the assessability of a game’s UX through
the use of heuristics. We provide an overview of previously available heuristics
and introduce categorized heuristics based on available literature, as well as our
personal experience in the field of video games. To evaluate the applicability of
our heuristics to UX ratings, we conduct heuristic evaluations of several games
and compare the resulting data to UX-based game reviews. Finally we critically
assess our method and offer improvements and future perspectives. We deliver a
complete and updated framework usable for evaluating the usability and UX of
games and provide evidence for the connection between the developed heuristics and
the game’s UX.

2 Video Games and Game Genres

Before discussing heuristics for video games, we want to establish a clear understanding
of the term video game. Esposito provides an interesting definition (Esposito
2005):
A videogame is a game which we play thanks to an audiovisual apparatus and which can
be based on a story.

Esposito’s definition contains four important elements that classify a video
game: game, play, audiovisual apparatus, and story. These elements are derived
from literature such as (Huizinga 1950; Caillois 1961; Zimmermann 2004).
We second this definition and want to point out the need to clearly distinguish
games from productivity applications as done in (Pagulayan et al. 2003). Finally,
to avoid misunderstandings about the term itself we consider video games as an
umbrella term for all electronic games, independent of their platform (computer,
console, arcade, etc.). Still there is need to put games into certain categories to be
able to unite titles of similar type.
There are many different distinctions available, some more common than others.
Wolf defined a set of 41 genres (Wolf 2001), which is sometimes too specific
(e.g. when defining Diagnostic Cartridges as a genre). Ye proposes to adapt the
genre term and certain genre conventions from movies to games, but does not give
a clear genre definition himself (Ye 2004). A common and well established genre
definition has been created by the NPD group and is mentioned amongst others in
(Pagulayan et al. 2003) and used by (ESA 2011) for their market statistics. This
classification contains eleven well known and well established (super-) genres
such as role-playing game (RPG), action or shooters and abstains from introducing
fine-grained subcategories. We propose the use of these genres in order to be able
to classify games in accordance with the market/industry later on and focus mainly
on computer based (PC) video games for the subject of this chapter.

3 User-centred Design in Games

User-centred design is a design philosophy that describes a prototype-driven
software development process in which the user is integrated into the design and
development process. The approach consists of several stages that are executed
iteratively: requirements analysis, user analysis, prototyping and evaluation. User-
centred design is specified in EN ISO 9241-210 – Human Centred Design Pro-
cesses for Interactive Systems (ISO 9241-210 2010). This approach is also used
for game design as described in (Fullerton et al. 2004). It contains three distinct
development phases: conceptualization, prototyping and playtesting. The first
phase typically involves the complete planning such as identification of goals,
challenges, rules, controls, mechanics, skill levels, rewards, story and the like
(Pagulayan et al. 2003). These specifications are done by game designers and are
put on record in game design documents.
The second phase – prototyping – is used to quickly generate playable content.
This version of the game is in no way final but can be used efficiently for play
testing, thus giving players an opportunity to play the game, test its game mechanics
and provide feedback on their UX (Fullerton et al. 2004). Measurable attributes
are for example the overall quality (commonly denoted as fun), the ease of use or
the balancing of challenge and pace (Pagulayan et al. 2003).
To gather results for these variables a range of usability methods can be applied
during playtesting. Pagulayan et al. propose structured usability tests (cf. Dumas
and Redish 1999) and rapid iterative testing and evaluation (RITE, cf. Medlock et
al. 2002) as two applicable methods. They also propose additional evaluation
methods such as prototyping, empirical guideline documents or heuristics (Pagu-
layan et al. 2003). We believe that especially heuristics can be a fast and cost-
efficient, but still effective and accurate evaluation method for UX in games.
Therefore we will present our own set of heuristics in section 6 and verify them
by conducting an expert evaluation. Before that we will give a short introduction
to heuristic evaluation as an expert-based usability approach.

3.1 Heuristic Evaluation

Heuristic evaluation is one of the so-called expert-based usability inspection
methods (Nielsen and Mack 1994). It is an efficient analytical and low-cost usabil-
ity method to be applied repeatedly during a development process, starting at the
very beginning of a project design cycle (Nielsen and Mack 1994). In general,
heuristics can be considered rules of thumb that describe the requirements users
have of a particular system. The formulation of heuristics is more universal than
that of usability guidelines (Koeffel 2007). The heuristics should provide
enough information to enable the evaluator to judge all possible problems of a sys-
tem (Sarodnick and Brau 2006). During a traditional user-interface evaluation
three to five experts (in the field of the application, usability or both) inspect a sys-
tem according to recognized and established usability principles (i.e. the heuris-
tics). The number of detected usability issues increases significantly with the first
three evaluators, and most problems are expected to be discovered when employing
three to five experts (Nielsen and Mack 1994). Heuristics allow for an evaluation
of systems in a very early stage of the design process (e.g. paper mock-ups).
Although numerous heuristics are available for the evaluation of video games (see
the following section), no particular work on how to evaluate UX through the
application of heuristics has been introduced.
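The three-to-five-experts rule is usually explained by the problem-discovery model of Nielsen and Landauer, in which the expected share of problems found by i evaluators is 1 − (1 − λ)^i. The sketch below uses λ = 0.31, the commonly cited average single-evaluator detection rate; both the formula's attribution and that value are background knowledge, not figures from this chapter.

```python
# Sketch of the Nielsen-Landauer problem-discovery model: the expected
# share of usability problems found by i independent evaluators is
# 1 - (1 - lam)**i. lam = 0.31 is the often-cited average detection rate
# of a single evaluator (an assumption here, not a figure from this chapter).

def share_found(evaluators, lam=0.31):
    """Expected fraction of problems found by the given number of evaluators."""
    return 1 - (1 - lam) ** evaluators

# Returns diminish quickly: most problems are found by three to five experts.
for i in range(1, 7):
    print(i, round(share_found(i), 2))
```

With these assumptions, five evaluators already uncover over 80 % of the problems, which is why adding further experts yields little benefit.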

4. History of Heuristics for Video Games

In the following a brief overview of the history of heuristics for video games will
be presented, starting with Malone who was the first one to introduce the idea of
using heuristics to evaluate games (Malone 1980, 1982). His heuristics mainly
focused on educational games, which lacked the graphical, acoustic and
computational possibilities that current video games offer. Malone categorized his
heuristics into challenge, fantasy and curiosity.
Although Malone introduced his heuristics as early as 1980, the method
was only adopted by a wider audience with Jakob Nielsen’s ten heuristics,
introduced in 1994 (Nielsen 1994). Since then these ten heuristics have been the
most referenced set of heuristics and are frequently used for different kinds of
applications. Originally they were developed for traditional interfaces; nevertheless
they are also (to a certain extent) applicable to several other areas such as video games.
Federoff assessed the applicability of these heuristics to this area (Federoff 2002).
She discovered their usefulness and developed a set of 40 heuristics that was par-
tially based on Nielsen’s heuristics. For a better overview and easier assignment of
single problems to heuristics she categorized them into game interface, game me-
chanics and game play. We think that the heuristics published by Federoff some-
times do not cover the entire extent of facets offered by video games, especially
when considering the capabilities of state of the art video games. Furthermore they
appear to concentrate on role playing games and are therefore not applicable to all
possible game genres.
In 2004 Desurvire et al. released a new set of verified heuristics, called HEP
(heuristic evaluation of playability), which were based on the heuristics introduced
by Federoff (Desurvire et al. 2004). In contrast to Federoff’s approach, these heu-
ristics were categorized into the four sections game story, game play, game me-
chanics and game usability. Through further evaluations these heuristics have
proven to be effective. The categorisation of heuristics for video games into game
play, game story, game mechanics and game usability has been taken into account
when formulating our framework. Still we think that the heuristics by Desurvire et
al. do not consider the important impact of challenge onto the user’s experience.
The evaluation of mobile games has also been of interest to researchers. In 2006
Nokia released a framework for the evaluation of the playability of mobile games
(Korhonen and Koivisto 2006). Their framework is split into modules containing
heuristics for game play, game usability and mobility. The modules do not have to
be evaluated at the same time and the modules concerning game play and game
usability should be able to be applied to other kinds of games, not only mobile
games.
In April 2007 Schaffer released a white paper introducing a new version of
heuristics for video games (Schaffer 2007). In his opinion, the heuris-
tics introduced so far were too vague, difficult to realize, more suitable for post-
mortem reviews and not applicable during the design process. He provides a set of
detailed heuristics with graphical examples for each heuristic, which eases the
evaluation significantly, especially when it is not conducted by an expert in the
field of computer games.
Pinelle et al. introduced a set of heuristics based on game reviews in 2008
(Pinelle et al. 2008a). For their work five researchers reviewed 108 game reviews
from the GameSpot website (http://www.GameSpot.com) and categorized the issues found into twelve different
problem categories. They subsequently generated ten final heuristics out of these
categories. According to Pinelle et al. this approach offers the possibility to evalu-
ate a game’s usability without reviewing unnecessary technical issues and issues
related to entertainment. In a follow-up work Pinelle et al. extended their research
by taking genres into account. They grouped the analysed games into six major
genres and assigned the problems found to their twelve categories. Through this
they were able to determine that the frequency with which certain problems occur
depends on the genre (Pinelle et al. 2008b).
A further development and refinement of the HEP heuristics (Desurvire et al.
2004) was presented in the form of the Heuristics of Playability (PLAY) (Desurvire
and Wiberg 2009). The set consists of three categories, namely game play, cool-
ness/entertainment/humor/emotional immersion, and usability & game mechanics.
The PLAY heuristics were tested by fifty-four gamers and overall were found
useful. Based on the findings of Pinelle et al., Livingston et al. created
a heuristic evaluation technique called “Critic Proofing” in 2010. Through the
application of a genre rating they were able to create prioritized severity ratings in
order to help developers focus on the most important issues for their genre first
(Livingston et al. 2010).
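Livingston et al. give their own concrete formula; the sketch below only illustrates the general idea of weighting a base severity by how often an issue's problem category occurs in the game's genre. The category frequencies and the combination rule here are our assumptions for illustration, not their published coefficients.

```python
# Illustration of the "Critic Proofing" idea: prioritize issues by combining
# a base severity with how frequently the issue's problem category occurs in
# the game's genre. The frequencies and the combination rule below are
# assumptions for illustration, not Livingston et al.'s published values.

# Hypothetical relative frequency of problem categories per genre (0..1).
GENRE_CATEGORY_FREQ = {
    "shooter": {"camera": 0.10, "controls": 0.30, "feedback": 0.20},
    "rpg":     {"camera": 0.25, "controls": 0.15, "feedback": 0.30},
}

def prioritized_severity(base_severity, genre, category):
    """Scale a 1-5 base severity by the genre-specific category frequency."""
    weight = GENRE_CATEGORY_FREQ[genre].get(category, 0.0)
    return base_severity * (1 + weight)

# Two issues with equal base severity are re-ordered by genre relevance.
issues = [("controls", 3), ("camera", 3)]
for category, severity in sorted(
        issues, key=lambda it: -prioritized_severity(it[1], "shooter", it[0])):
    print(category, prioritized_severity(severity, "shooter", category))
```

The point of the technique is exactly this re-ordering: a controls problem in a shooter outranks a camera problem of the same raw severity, so developers see the genre-critical issues first.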
Korhonen et al. (Korhonen et al. 2009) compared the HEP heuristics (Desurvire
et al. 2004) with the playability heuristics for Mobile Games (Korhonen and Koi-
visto 2006) and assessed their strengths and weaknesses. Overall the game evalua-
tors liked the heuristic evaluation method but their results also indicated that the
heuristics needed further improvements before they could be widely adopted. In
2011 Korhonen (Korhonen 2011) conducted a study where 36 novice evaluators
evaluated a mobile game using two different sets of heuristics; the first one based
on Pinelle et al. (Pinelle et al. 2008a), the second based on Korhonen and Koivisto
(Korhonen and Koivisto 2006). The evaluators also analysed and described the
problems found. The study results showed that the heuristic set had to cover the
main aspects of playability to guarantee its usefulness.

5 User Experience of Games

Within recent years UX has become a well-established concept within the HCI
community. According to (Hassenzahl and Tractinsky 2006) this is the
counter-reaction to the more dominant task- and work-related usability
paradigm. Still, this is not a completely new concept. The American philosopher and
psychologist John Dewey described experiences to be “not mere feelings; they are
characteristics of situations themselves, which include natural events, human af-
fairs, feelings, etc.” as early as 1934 (Dewey 1934).
Nevertheless, a clear definition and founded understanding of this term has
long been missing (Law et al. 2008). According to Law et al. the main problem is
that UX treats non-utilitarian aspects of interactions between humans and ma-
chines. This means that UX mainly focuses on affect and sensation – two very
subjective impressions. It encompasses areas from traditional usability to beauty,
hedonic, affective or experiential aspects of technology use (Forlizzi and Bat-
tarbee 2004). Hassenzahl, one of the leading researchers in the field of
UX, defines it as “a momentary, primarily evaluative feeling (good-bad) while in-
teracting with a product or service” (Hassenzahl 2008). Therefore UX means design-
ing for joy and fun instead of designing for the absence of pain (Hassenzahl and
Tractinsky 2006). Thus the community has recently undertaken measures to better
understand the meaning of UX and to find a unified definition through different
conferences, workshops (Law et al. 2008; Roto and Kaasinen 2008; Roto et al.
2011), forums and the like. Especially the MAUSE COST Action 294
(http://www.cost294.org/) has aimed at finding a definition and measurement of
UX. Law et al. performed a survey
among 275 participants from the fields of industry and research to gather the
community’s understanding of the term UX. The study showed that UX was seen
as something dynamic, context-dependent and subjective (Law et al. 2009).
In 2010 ISO defined UX as “a person's perceptions and responses that result
from the use or anticipated use of a product, system or service” (ISO 9241-210
2010). Law sees this definition as a starting point for discussion but also states that
it is too imprecise and abstract. Finding a final definition might be an impossible
task (Law 2011).

5.1 Measuring User Experience in Games

According to the literature, UX in games can be measured using the following
qualitative and quantitative methods (Federoff 2002; Desurvire et al. 2004; Sweet-
ser and Wyeth 2005; Hazlett 2006; Koivisto and Korhonen 2006; Mandryk and
Atkins 2007, Nacke 2010): psycho-physiological measurements, physiological
measurements, expert evaluation (heuristics etc.), subjective, self-reported
measures and usability tests. Fierley and Engl discussed how common UX re-
search methods such as thinking aloud can be used in the context of gaming. Fur-
thermore the authors presented approaches for adapting common methods for
measuring UX in games (Fierley and Engl 2010). Similar to this, Bernhaupt and
Linard presented how UX evaluation should be adapted to the area of multi-modal
interaction in games (Bernhaupt and Linard 2010). A collection of UX factors as
well as evaluation methods is provided in (Bernhaupt 2010).
Integral factors of UX are the state of flow and immersion, which define the level of
enjoyment and fun (IJsselsteijn et al. 2007). The measurement of the state of flow
through different methods is one of the major topics of UX in games, and flow is by
many seen as the optimal experience when playing games (cf. Sweetser and Wyeth
2005). According to Hassenzahl the concept of flow is very close to the idea of
UX and describes flow as “a positive experience caused by an optimal balance of
challenges and skills in a goal-oriented environment” (Hassenzahl 2008). The
concept of flow was first introduced in (Csikszentmihalyi 1975) and further re-
fined to fit to video games and player enjoyment in (Cowley et al. 2008; Sweetser
and Wyeth 2005). Whereas Cowley et al. introduce a framework to map flow to
the game play, Sweetser and Wyeth try to integrate heuristics into a model to help
design and evaluate enjoyment in games. They found that there is a certain
overlap between the heuristics investigated and the concept of flow.
Another concept that is tightly linked to UX is immersion. One definition of
immersion and its stages was proposed in (Brown and Cairns 2004). Through
semi-structured interviews with seven gamers they were able to distinguish three
stages of immersion: engagement, engrossment and total immersion. Engagement
is the first stage. According to Brown and Cairns the players
have to be interested in the game to reach this state. When the user continues to
play a game after the stage of engagement she will reach engrossment. When en-
grossed in a game, the player’s emotions are directly affected by the game. Total
immersion is the most immersed a user can get. She will be completely involved
in the game and experience absolute presence, where only the game and the emo-
tions produced by the game matter. In a follow-up work Cheng and Cairns have
further investigated the different stages of immersion (Cheng and Cairns 2005).
They tested a game with changing graphics and behaviour on 14 different users.
Through this experiment Cheng and Cairns found that when a user is immersed
in a game, she will overlook usability issues and not even notice changes
in the game’s behaviour.
Our work is influenced by the approach described in (Sweetser and Wyeth
2005) of integrating commonly known heuristics into the eight elements of flow as
proposed by Csikszentmihalyi. Nevertheless, we will not try to measure UX through
the factor flow. Instead we will provide a set of heuristics that is independent of
the flow approach and will target usability and UX of the evaluated games. A de-
tailed overview of this process will be given in section 6.

6 Overview and Review of Existing Video Game Heuristics and
their Impact on User Experience

As introduced in section 3 and further discussed in section 4, heuristics can be a
valuable method in video game design. In this section we want to present a modu-
lar framework which is based on our previous work (Koeffel 2007; Koeffel et al.
2009) and current literature. The framework consists of the sections game
play/game story and virtual interface. The section game play/game story contains
heuristics regarding these very topics, while the section virtual interface presents
heuristics concerning the displayed virtual interface that the player interacts with.
Both sets of heuristics are designed to be applicable to video games as shown in
Table 1.

6.1 Video Game Heuristics

In our previous work (Koeffel et al. 2009) we collected a literature-based set of 29
heuristics derived from research by (Nielsen and Molich 1990; Federoff 2002; De-
survire et al. 2004; Sweetser and Wyeth 2005; Koivisto and Korhonen 2006;
Röcker and Haar 2006; Schaffer 2007; Pinelle et al. 2008a).
The major part of these heuristics was also part of the approach introduced in
(Sweetser and Wyeth 2005). It was their main goal to establish a method to meas-
ure the state of flow that a game offers to the player. Moreover they put usability
on a level with UX, which has proven to be a different concept (see section 5).
Furthermore they only applied their heuristics to the area of real time strategy
games, whereas we sought to generate a set of heuristics that was applicable to
multiple game genres.
Our research showed that in the literature many heuristics focused either on us-
ability (e.g. Pinelle et al. 2008a) or on playability, fun and enjoyment (e.g. Sweet-
ser and Wyeth 2005) – factors closely connected to UX. We therefore want to cre-
ate a more holistic set of heuristics that does not solely concentrate on either UX
or usability. Moreover we want to focus on all aspects offered by video games,
especially as occurring problems have an impact on the UX, and the quality of a
game can hardly be determined by usability alone.
We previously established a connection between a game’s perceived UX and
the use of the previous iteration of our heuristics (Koeffel et al. 2009). Neverthe-
less, through further research and the application of these heuristics we encoun-
tered inconsistencies within them. One of the more frequent issues was their limited
discriminatory power. When categorizing issues it was at times difficult to assign
them to only one heuristic. Since some heuristics were kept very general, their
quintessence would apply to multiple issues, e.g. “The player should be able to
identify game elements […]” and “The menu should be intuitive and the meanings
obvious […]” could both apply to a badly readable help text. In other words, in
some cases up to five heuristics would be assigned to one issue. Thus, we saw the
need for heuristics that are easier to distinguish, more autonomous and moreover,
easy to apply.
We also intended to address the inclusion of different genres into our new set
of heuristics. The twelve problem categories established in (Pinelle et al. 2008a)
and the large number of evaluated games have been used in (Pinelle et al. 2008b)
to generate genre-dependent profiles. This data was used by Livingston et al. to
derive a formula for a genre-weighted severity rating (Livingston et al. 2010).
While we support the process of generating a genre coefficient that is taken into
account when determining the severity of issues, it is limited to the data compiled
by Pinelle et al. and therefore their twelve problem categories. Although we see
the inclusion of genre-specific aspects as a valuable part in a heuristic evaluation,
we chose not to adhere to Pinelle et al.’s categories and rather define our own
(thereby knowingly forfeiting the possibility to resort to their genre-data) since we
found their categories not exhaustive.
Table 1 contains the final 49 heuristics concerning game play/game story and
virtual interface. They have been grouped into twelve categories for easier appli-
cation. When an issue is found the reviewer can at first choose the appropriate cat-
egory and proceed to assign a heuristic from the remaining subset.
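The two-step assignment described above (first pick one of the twelve categories, then a heuristic within it) can be represented directly. The category names below follow Table 1; the issue-logging helper and its field names are our illustrative additions, not part of the framework itself.

```python
# Sketch of the two-step assignment: an evaluator first picks one of the
# twelve categories, then a heuristic within it. Category names follow
# Table 1; the logging helper itself is an illustrative addition.

FRAMEWORK = {
    "game play/game story": [
        "Goals", "Motivation", "Challenge", "Learning",
        "Control", "Consistency", "Game Story",
    ],
    "virtual interface": [
        "Feedback", "Visual Appearance", "Interaction",
        "Customization", "Menu and Interface Elements (HUD)",
    ],
}

def log_issue(issues, section, category, heuristic_no, description, severity):
    """Record an issue under a valid section/category of the framework."""
    if category not in FRAMEWORK[section]:
        raise ValueError(f"unknown category {category!r} in section {section!r}")
    issues.append({
        "section": section, "category": category,
        "heuristic": heuristic_no, "description": description,
        "severity": severity,  # e.g. 1 (cosmetic) .. 5 (catastrophic)
    })

issues = []
log_issue(issues, "virtual interface", "Feedback", "8.3",
          "Hit feedback arrives noticeably late", severity=4)
print(len(issues), issues[0]["category"])
```

Restricting each issue to exactly one category and one heuristic is what gives the revised framework the discriminatory power the previous iteration lacked.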

No. Heuristic
Game play / Game story

O1 Goals
1.1 Overall goal: The player is presented with clear goals (e.g. overriding goals) early enough
or is able to create her own goals, and is able to understand and identify them.
1.2 Short-time goals: There can be multiple goals on each level (short-term and long-term
goals), so that there are more strategies to win. Furthermore the player knows how to
reach the goal without getting stuck.
O2 Motivation
2.1 The player is receiving meaningful rewards. The acquisition of skills (personal and in-
game skills) can also be a reward.
2.2 The game does not stagnate and the player feels the progress.
2.3 The game and the outcome are perceived as being fair.
2.4 The game itself is replayable and the player enjoys playing it.
2.5 The game play does not require the player to fulfil boring tasks.
2.6 Challenges are positive game experiences and encourage the user to continue playing.
2.7 The first-time experience is encouraging.
O3 Challenge
3.1 The game is paced to apply pressure to the player but does not frustrate her.
3.2 Challenge, strategy and pace are in balance.
3.3 The artificial intelligence is reasonable, visible to the player, consistent with the play-
er’s expectations and yet unpredictable.
3.4 There are variable difficulty levels for a greater challenge.
3.5 The challenge of the game is adapted to the acquired skills. The difficulty level varies
so the player experiences greater challenges as she develops mastery.
3.6 Challenging tasks are not required to be completed more than once (e.g. when dying
after completing a hard task).
3.7 The game is easy to learn, but hard to master.
O4 Learning
4.1 The player is given space to make mistakes, but the failure conditions must be under-
standable.
4.2 The learning curve is shortened. The user’s expectations are met and the player has
enough information to get immediately started (or at least after reading the instruction
once).
4.3 General help displaying the game’s fundamentals exists and is a meaningful addition to
the game and provides useful assistance before and during the game.
4.4 Tutorials and adjustable levels are able to involve the player quickly (learning) and
provided upon request throughout the entire game.
O5 Control
5.1 The player feels that she is in control. That includes the control over the character as
well as the impact onto the game world. It is clear what’s happening in the game.
5.2 The player can impact the game world and make changes.
5.3 The player is able to skip non-playable and repeating content if not required by the
game play.
5.4 The game mechanics feel natural and have correct weight and momentum. Furthermore
they are appropriate for the situation the player is facing.
5.5 The player is able to save the game in different states (applies to non-arcade-like
games) and is able to easily turn the game off and on.
5.6 The player is able to respond to threats and opportunities.
O6 Consistency
6.1 Changes the player makes to the game world are persistent and noticeable.
6.2 The game is consistent and responds to the user’s action in a predictable manner. This
includes consistency between the game elements and the overarching settings as well
as the story.
O7 Game Story
7.1 The meaningful game story supports the game play and is discovered as part of the
game play.
7.2 The story suspends disbelief and is perceived as a single vision, i.e. the story is planned
through to the end.
7.3 The game emotionally transports the player into a level of personal involvement (e.g.
scare, threat, thrill, reward, punishment).
Virtual Interface

O8 Feedback
8.1 The acoustic and visual effects arouse interest and provide meaningful feedback at the
right time.
8.2 Feedback creates a challenging and exciting interaction and involves the player by cre-
ating emotions.
8.3 The feedback is given immediately to the player’s action.
8.4 The player is able to identify game elements such as avatars, enemies, obstacles, power-ups,
threats or opportunities (orthogonal unit differentiation).
8.5 The player knows where she is on the mini-map if there is one and does not have to
memorize the level design.
8.6 The player does not have to memorize resources like bullets, life, score, points and
ammunition.
O9 Visual Appearance
9.1 In-game objects stand out (contrast, texture, colour, brightness), even for players
with poor eyesight or colour blindness, and cannot easily be misinterpreted.
9.2 Furthermore, the objects look like what they are meant for (affordance).
O10 Interaction
10.1 Input methods are easy to manage and have an appropriate level of sensitivity and re-
sponsiveness.
10.2 Alternative methods of interaction are available and intuitive. When existing interaction
methods are employed, they adhere to standards.
10.3 The first player action is obvious and results in immediate positive feedback.
O11 Customization
11.1 The game allows for an appropriate level of customization concerning different aspects
(e.g. audio and video settings).
11.2 The input methods allow customization concerning the mappings. The customization is
persistent.
O12 Menu and Interface Elements (HUD)
12.1 The interface is consistent in control, colour, typography and dialog design (e.g. large
blocks of text are avoided, no abbreviations) and as non-intrusive as possible.
12.2 The menu is intuitive and the meanings are obvious and perceived as a part of the
game.
12.3 The visual representation (i.e. the view) allows the user to have a clear, unobstructed
view of the area and of all visual information that is tied to the location.
12.4 Relevant information is displayed and the critical information stands out. Irrelevant in-
formation is left out. The user is provided enough information to recognize her status
and to make proper decisions.
12.5 If standard interface elements are used (buttons, scroll bars, pop-up menus), they
adhere to common game interface design guidelines.
Table 1. Revised Heuristics concerning game play/game story and virtual interface.
As mentioned before, these heuristics are based on our previous 29 heuristics
(Koeffel et al. 2009). While some heuristics remained the same, we split or re-
phrased those that had proven to be too unspecific or ambiguous. The introduced
categories can further help in choosing the right heuristic.
The categories were chosen to cover the most important aspects of video games
in terms of game play/game story and virtual interface. The goals cover the
game’s overall objective as well as short term goals to be completed by the player.
Motivation contains important aspects to cause the player to continue playing such
as rewards, fairness or the avoidance of completing mundane tasks. The challenge
created by the game addresses issues connected to pacing, difficulty and player
skills. As the game’s acceptance is also connected to the player’s learning curve,
learning contains aspects related to help, tutorials and error conditions. Possibili-
ties to influence the game world as well as being able to perform desired actions at
any given time (e.g. saving or quitting) are part of the control category. Making
changes that last and predictable responses by the game are covered by consisten-
cy. Emotional involvement and narratives are part of the game story.
The virtual interface needs to provide meaningful and timely feedback. This
concerns acoustic and visual feedback, the possibility to identify game elements
and the player’s location within the game. The player must be able to assess the visual
appearance of in-game objects and infer their purpose. The
quality of the game’s input methods is covered by the category interaction. The
possibility to adapt the game to the player’s needs and desires is part of the cus-
tomization. Finally, menu and interface elements addresses all components pro-
vided by the heads-up display (HUD) and the game’s menu.
Our assumption is that a game that is enjoyable to play must, to a large extent, be
free of usability issues that keep the player from enjoying it. Especially the
heuristics targeting game play/game story seem appropriate not only for classical
usability issues (missing feedback, etc.) but also for issues connected to the enjoyment
and fun of a game (challenge, fairness, etc.).
In order to be able to estimate the UX through heuristics, we have set up a
methodology to prove this concept (see following section). Our approach states
that the overall UX of video games can be determined by conducting an expert-
based evaluation of the game in question, using the heuristics shown above. The
more heuristics are met, the higher the overall UX; the more heuristics point to
flaws in the game, the worse the UX.

6.2 Heuristic Approach to User Experience

In the previous sections we linked heuristic evaluations to UX and used a similar
approach for the newly developed heuristics. This was done, on the one hand, to
evaluate the usefulness of the newly developed set of heuristics and on the other
hand, to prove the applicability of these heuristics to the measurement of UX. We
decided to focus our research on the field of indie games, since a low-cost evalua-
tion method seems appropriate and could be a valuable tool for a part of the game
industry that is not supported by big companies and/or large budgets. Larsen
states that common game reviews are largely based on the subjective evaluation
of a game’s UX from the reviewer’s point of view (Larsen 2008). Game reviewers
have thus been unwittingly evaluating the UX of games for nearly two decades.
Following this idea, we chose to evaluate a number of computer games using
our 49 heuristics and compare the results to common game reviews. Therefore, we
are able to compare the heuristics – primarily designed to detect usability issues –
with UX oriented game reviews. In order to be able to make a quantitative state-
ment we tried to establish a connection between the number of problems found
through the heuristic evaluation and the numerical rating obtained from several
different game reviews. The process of our evaluation was designed as a heuristic
evaluation for video games. To obtain meaningful results, three evaluators con-
ducted the study. All three of them were experienced in the fields of computer
games and usability, with two being usability experts with gaming experience and
the third one vice versa. Two female researchers and one male researcher were selected. Since
gaming habits and preferences could influence the outcome, one evaluator can be
considered a so-called core gamer who frequently plays games of different genres.
The second evaluator was rather a representative of the casual gaming scene
with experience in different genres (among them also core games), while the experience
and preferences of the third evaluator were situated somewhere between
those two extremes. For the evaluation we decided to choose games from different
genres in the field of indie games in order to avoid a bias towards one genre, as
observed in some of the work we analysed (cf. Federoff 2002; Sweetser and Wyeth
2005). Furthermore, the chosen games had to be recent releases in order to reflect
current technical possibilities. Therefore, the following games were selected:
 Adventures: Machinarium (Amanita Design)3, Gemini Rue (Wadjet Eye
Games)4, The Tiny Bang Story (Colibri Games)5
 Casual: Rhythm Zone (Sonic Boom Games)6, Fortix 2 (Nemesys Games)7
 Action: Drug Wars (Paleo Entertainment)8
The games were chosen due to the broad range of their Metacritic.com9 ratings.
The best game was rated with 85 %, the worst with 35 %. Even though three
games are classified as adventure games, they show considerable differences in
their core game mechanics. Furthermore, the evaluators had no prior experience
with the games, in order to avoid bias.
We defined our evaluation protocol in the following way: each evaluator obtained
a list of the heuristics and an evaluation report for recording the usability
issues found. Prior to the evaluation, the evaluators met and previewed the
heuristics in order to become familiar with them and to avoid misapprehensions. All
three evaluators evaluated every single game by playing it for exactly 30 minutes.
Issues found while playing were noted in the evaluation report. For the assessment

3 http://machinarium.net/demo/
4 http://wadjeteyegames.com/gemini-rue.html
5 http://www.colibrigames.com/
6 http://www.sonicboomgames.com/
7 http://www.fortix2.com/
8 http://www.paleoent.com/
9 http://www.metacritic.com/
of the games, two different ratings were applied: a Nielsen severity scale and a
point-scale ranking (to enable a better comparison to the game-review site).
First, the researchers reviewed each game after playing it, using the heuristics to
rank the issues found according to Nielsen and Mack’s severity scale (Nielsen and
Mack 1994), which led to the total number of usability issues found per game, as
displayed in Table 2. The severity scale is defined as follows:


0. Not a usability problem at all
1. Cosmetic problem only: it does not have a profound impact on the game
2. Minor problem: it has a slight impact on the game and influences the experience a little
3. Major problem: it has a severe impact on the game and negatively influences the user experience
4. Usability catastrophe: this problem has to be fixed in order to allow for a decent user experience
Second, the evaluators assigned a score from 0 to 4 (0 being worst, 4 being
best) to every single heuristic to determine how well the game fulfilled it.
The problems and their severities, recorded according to the above-mentioned
scale, helped to determine which heuristics were satisfied least. The achieved
score was then converted into a percentage indicating the degree to which the
game complied with the heuristics (100 % would mean maximum points for each heuristic).
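This conversion to a percentage is straightforward; the sketch below illustrates it with hypothetical scores (the function name and the sample scores are our own, not taken from the study):

```python
def compliance_percentage(scores, max_per_heuristic=4):
    """Convert per-heuristic scores (0-4) into the percentage of the
    maximum achievable points; 100 % means every heuristic is fully met."""
    if not scores:
        raise ValueError("no scores given")
    return 100.0 * sum(scores) / (len(scores) * max_per_heuristic)

# Hypothetical example: 49 heuristics, each scored 0-4 by one evaluator.
scores = [3] * 30 + [2] * 15 + [4] * 4  # 49 scores in total
print(round(compliance_percentage(scores), 2))  # degree of compliance in %
```

With 49 heuristics the maximum is 49 × 4 = 196 points, so a game scoring 136 points would comply with the heuristics to roughly 69 %.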

7 Results

To compare the results of the expert-based heuristic evaluation, we chose to select
at least 10 game reviews (on average 20) for each game to avoid a bias towards
single reviewers and therefore guarantee a more objective rating. Metacritic.com
fulfils exactly these requirements by accumulating scores from different reviewing
sites and calculating a weighted average. Its score ranges from 0 to 100 and
can therefore be read as a percentage rating, which is very common among reviewing
sites. The results of our study can be seen in Table 2.
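Metacritic.com’s exact per-outlet weighting is proprietary, but the aggregation principle is a plain weighted mean. The sketch below is purely illustrative, with made-up scores and weights:

```python
def weighted_average(scores, weights):
    """Weighted mean of review scores. Metacritic.com's actual per-outlet
    weights are not public; the values used here are purely illustrative."""
    if not scores or len(scores) != len(weights):
        raise ValueError("scores and weights must be non-empty and equal in length")
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Three hypothetical reviews (0-100), the first from a higher-weighted outlet.
print(weighted_average([85, 70, 90], [2.0, 1.0, 1.0]))  # 82.5
```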

Rank Ranking according to found issues Ranking according to points Metacritic.com ranking
1 Machinarium (27) Fortix 2 (75.09 %) Machinarium (85 %)
2 Tiny Bang Story (27) Machinarium (65.07 %) Gemini Rue (82 %)
3 Fortix 2 (33) Tiny Bang Story (65.02 %) Fortix 2 (73 %)
4 Gemini Rue (34) Gemini Rue (60.37 %) Tiny Bang Story (65 %)
5 Rhythm Zone (35) Rhythm Zone (50.87 %) Rhythm Zone (44 %)
6 Drug Wars (49) Drug Wars (44.39 %) Drug Wars (35 %)

Table 2: The results of the evaluation ranked according to points obtained, issues found
and compared to the results of Metacritic.com.

In order to assess the linear relationship between the results of our evaluation and
the Metacritic.com scores, we calculated the Pearson product-moment correlation
coefficient (PPMCC, r) after ensuring a normal distribution of our calculated data
(Kolmogorov–Smirnov test). The resulting coefficients, as well as the respective
coefficient of determination (R²) are shown in Table 3.

Correlation r R²
Gameplay point score × Metacritic.com rating 0.793 63 %
Virtual interface point score × Metacritic.com rating 0.647 42 %
Overall point score × Metacritic.com rating 0.782 62 %
Number of issues found × Metacritic.com rating -0.733 54 %

Table 3: r and R² show medium to high relationships between gameplay, virtual interface
and average point score and the Metacritic.com rating as well as the number of issues
found and the Metacritic.com rating.
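The correlation analysis can be approximated from the published figures. The sketch below pairs the overall point scores and Metacritic.com ratings from Table 2 and computes r and R² from first principles; since the published coefficients were computed on the evaluators’ unrounded raw data, small deviations from Table 3 are to be expected.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient (PPMCC) of two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Overall heuristic point scores (%) and Metacritic.com ratings (Table 2), ordered:
# Machinarium, Gemini Rue, Fortix 2, Tiny Bang Story, Rhythm Zone, Drug Wars.
points     = [65.07, 60.37, 75.09, 65.02, 50.87, 44.39]
metacritic = [85.0, 82.0, 73.0, 65.0, 44.0, 35.0]

r = pearson_r(points, metacritic)
print(round(r, 3), round(r * r, 3))  # r lands close to the 0.782 reported in Table 3
```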

The PPMCC denotes the strength of a linear relationship (-1 ≤ r ≤ 1). A positive
value means a positive linear relationship and vice versa. In the case of the
Metacritic.com score and our heuristic point score a high linear relationship exists.
This means that a higher conformance to the heuristics results in a higher Meta-
critic.com score. The same applies to sub-parts of our heuristic framework (game
play/game story and virtual interface). A high but negative linear relationship ex-
ists for the number of issues found and the Metacritic.com score: the fewer issues
are found, the higher the resulting Metacritic.com score.
8 Discussion and Future Work

In a nutshell, the results presented in the previous section demonstrate that the
UX measured through heuristic evaluations is reflected by the Metacritic.com rat-
ing. This tendency shows the connection between heuristic evaluations and UX. In
relation to the results from Metacritic.com we can state that the more usability is-
sues are found during a heuristic evaluation, the worse the UX is. The fact that the
ranking according to points does not match the Metacritic.com ranking as closely
can be explained by our heuristics’ focus on usability issues, which might not be
detected during a game review or might not be weighted as heavily there.
Nevertheless, to further prove this concept, more extensive evaluations are needed,
involving a higher number of games from several genres other than the ones tested
so far. This could also lead to a genre-specific analysis in the manner of
Pinelle et al. (2008b). An additional outcome of such tests could be a definitive
number of heuristics that have to be fulfilled in order to grant an optimized UX.
We do, however, acknowledge that we use a quantitative score from the reviews
and not the qualitative data represented by the actual content of the reviews. Such a
score cannot represent the written review in its entirety and is therefore less accurate.
Still, using the review score allows us to draw the conclusion that the less a
game adheres to the heuristics, the worse its UX is.
Our experience gathered as part of the evaluations has indicated that an evalua-
tion period of 30 minutes, although helpful in assessing the first impression and
critical issues of a game, does not suffice to judge all aspects of the game in ques-
tion. Thus, overarching goals or the quality of the game story throughout the game
cannot be completely assessed. Therefore, in our future work, it will be necessary
to prolong the playing time or even to complete the games.

9 Summary

The present chapter has introduced an approach to evaluating the overall UX of video
games using heuristic evaluation. The topic of UX has significantly gained in
importance in the HCI community and research. The experience a user perceives
when playing a computer game has been one of the central issues of many recent
publications. Although it is a subjective impression, researchers seek to evaluate
it objectively and to describe it properly (cf. Phillips 2006).
Therefore we have analysed and reviewed the most common heuristics for vid-
eo games and built a framework upon our findings. This framework consists of 49
heuristics, categorized into the parts game play/game story and virtual interface.
We used our framework to conduct an expert-based heuristic evaluation of six
different video games to determine weaknesses and problems. We then attempted
to prove that heuristics can be used to measure the level of UX by comparing the
results of our study with accumulated reviews from several different gaming sites.
Since these reviews focus on explaining how the UX of a game was perceived by
the author, we see them as a legitimate description of a game’s UX. Our results indi-
cate a direct correlation between the outcome of the heuristic evaluation and the
level of UX as indicated by an average rating score (Metacritic.com).

10 References

Bernhaupt R (2010) Evaluating User Experience in Games: Concepts and Methods (1st ed.).
Springer Publishing Company, Incorporated.
Bernhaupt R, Ijsselsteijn W, Mueller F, Tscheligi M, Wixon D (2008) Evaluating user experi-
ences in games. In: CHI '08 Extended Abstracts on Human Factors in Computing Systems
(Florence, Italy, April 05-10, 2008). CHI '08. ACM, New York, NY, 3905-3908
Bernhaupt R, Eckschlager M, Tscheligi M (2007) Methods for Evaluating Games – How to
Measure Usability and User Experience in Games? In: Proceedings of the international Con-
ference on Advances in Computer Entertainment Technology (Salzburg, Austria, June 13-15,
2007). ACE '07, Volume 203. ACM, New York, NY, 309-310
Bernhaupt R, Linard N (2010) UX Evaluation for Multimodal Interaction in Games, Workshop
during EICS 2010, online. http://research.edm.uhasselt.be/~craymaekers/deng-ve/
Bostan B, Marsh T (2010) The 'interactive' of interactive storytelling: customizing the gaming
experience. In Proceedings of the 9th international conference on Entertainment computing
(ICEC'10), Hyun Seung Yang, Rainer Malaka, Junichi Hoshino, and Jung Hyun Han (Eds.).
Springer-Verlag, Berlin, Heidelberg, 472-475
Brown E, Cairns P (2004) A grounded investigation of game immersion. In: CHI '04 Extended
Abstracts on Human Factors in Computing Systems (Vienna, Austria, April 24-29, 2004).
CHI '04. ACM, New York, NY, 1297-1300
Cheng K, Cairns PA (2005) Behaviour, realism and immersion in games. In: CHI '05 Extended
Abstracts on Human Factors in Computing Systems (Portland, OR, USA, April 02-07, 2005).
CHI '05. ACM, New York, NY, 1272-1275
Clarke D, Duimering PR (2006) How Computer Gamers Experience the Game Situation: A Be-
havioral Study. Computers in Entertainment (CIE), Volume 4, Issue 3 (July 2006)
Cowley B, Charles D, Black M, Hickey R (2008) Toward an understanding of flow in video-
games. Computers in Entertainment (CIE) Volume 6, Issue 2 (July 2008), 1-27
Csikszentmihalyi M (1975) Beyond Boredom and Anxiety. Jossey-Bass, San Francisco, CA
Desurvire H, Caplan M, Toth JA (2004) Using heuristics to evaluate the playability of games. In:
CHI '04 Extended Abstracts on Human Factors in Computing Systems (Vienna, Austria,
April 24-29, 2004). CHI '04. ACM, New York, NY, 1509-1512
Desurvire H, Wiberg C (2009) Game Usability Heuristics (PLAY) for Evaluating and Designing
Better Games: The Next Iteration. In Proceedings of the 3d International Conference on
Online Communities and Social Computing: (OCSC '09), A. Ant Ozok and Panayiotis
Zaphiris (Eds.). Springer-Verlag, Berlin, Heidelberg, 557-566
Dewey J (1934) Art as Experience. Minton, Balch, New York, NY
Dumas J, Redish J (1999) A Practical Guide To Usability Testing. Intellect Books, Exeter, UK
ESA (2011) Essential Facts about the Computer and Videogame Industry, 2011 Sales, Demo-
graphic and Usage Data. Entertainment Software Association,
http://www.theesa.com/facts/pdfs/ESA_EF_2011.pdf. Accessed 27 April 2012
Esposito N (2005) A Short and Simple Definition of What a Videogame Is. In: Proceedings of
DiGRA 2005 Conference: Changing: Views – Worlds In Play (Vancouver, British Columbia,
Canada, June 16-20, 2005) DiGRA’05. University of Vancouver, BC
Febretti A, Garzotto F (2009) Usability, playability, and long-term engagement in computer
games. In CHI EA '09. ACM, New York, NY, USA, 4063-4068
Federoff MA (2002) Heuristics and usability guidelines for the creation and evaluation of fun in
videogames. Master’s thesis, Department of Telecommunications, Indiana University
Fierley R, Engl S (2010) User experience methods and games: lessons learned. BCS '10. British
Computer Society, Swinton, UK, 204-210
Forlizzi J, Battarbee K (2004) Understanding experience in interactive systems. In: Proceedings
of the 5th Conference on Designing interactive Systems: Processes, Practices, Methods, and
Techniques (Cambridge, MA, USA, August 01-04, 2004). DIS '04. ACM, New York, NY.
261-268
Fullerton T, Swain C, Hoffman S (2004) Game Design Workshop: Designing, Prototyping, and
Playtesting Games, CMP Books, San Francisco, CA
Gril J (2008) The State of Indie Gaming, Gamasutra,
http://www.gamasutra.com/view/feature/3640/the_state_of_indie_gaming.php, Accessed 27
April 2012
Hassenzahl M (2008) User Experience (UX): Towards an experiential perspective on product
quality, Keynote IHM, http://www.uni-landau.de/hassenzahl/pdfs/hassenzahl-ihm08.pdf. Ac-
cessed 07 December 2008
Hassenzahl M, Tractinsky N (2006) User Experience – a research agenda. Behavior & Infor-
mation Technology, Volume 25, No 2, March-April 2006, 91-97
Hazlett RL (2006) Measuring emotional valence during interactive experiences: boys at video-
game play. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Sys-
tems (Montréal, Québec, Canada, April 22-27, 2006). CHI '06. ACM, New York, NY, 1023-
1026
Huizinga J (1950) Homo Ludens: A Study of the Play-Element in Culture. Beacon Press, Boston,
MA
IJsselsteijn WA, Kort YAW de, Poels K, Jurgelionis A, Belotti F (2007) Characterising and
Measuring User Experiences. In: Proceedings of the international Conference on Advances in
Computer Entertainment Technology (Salzburg, Austria, June 13-15, 2007). ACE '07, Vol-
ume 203. ACM, New York, NY
ISO 9241-210 (2010) Ergonomics of human-system interaction – Part 210: Human-centred de-
sign for interactive systems. International Organization for Standardization, Geneva, Switzer-
land.
http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=52075
Accessed April 27 2012
Koeffel C (2007) Heuristics for tabletop games. Master’s thesis, Department of Digital Media,
Upper Austria University of Applied Sciences Hagenberg
Koeffel C, Hochleitner W, Leitner J, Haller M, Geven A, Tscheligi M (2009) Using Heuristics to
Evaluate the Overall User Experience of Video Games and Advanced Interaction Games. In:
Evaluating User Experience in Games, Bernhaupt R, Springer, 233–256
Korhonen H, Koivisto E.M.I. (2006) Playability heuristics for mobile games. In MobileHCI '06.
ACM, New York, NY, USA, 9-16
Korhonen H, Paavilainen J, Saarenpää H (2009) Expert review method in game evaluations:
comparison of two playability heuristic sets. In MindTrek '09. ACM, New York, NY, USA,
74-81.
Korhonen H (2011) The explanatory power of playability heuristics. In ACE '11, Teresa Romão,
Nuno Correia, Masahiko Inami, Hirokasu Kato, Rui Prada, Tsutomu Terada, Eduardo Dias,
and Teresa Chambel (Eds.). ACM, New York, NY, USA, Article 40, 8 pages
Larsen JM (2008) Evaluating User Experience – how game reviewers do it. In: Evaluating User
Experiences in Games, Workshop at the 2008 Conference on Human Factors in Computing
Systems (Florence, Italy, April 05-10, 2008). CHI '08
Law E (2011) The measurability and predictability of user experience. In Proceedings of the 3rd
ACM SIGCHI symposium on Engineering interactive computing systems (EICS '11). ACM,
New York, NY, USA, 1-10.
Law E, Roto V, Hassenzahl M, Vermeeren AP, Kort J (2009) Understanding, scoping and defin-
ing user experience: a survey approach. In Proceedings of the 27th international conference
on Human factors in computing systems (CHI '09). ACM, New York, NY, USA, 719-728
Law E, Roto V, Vermeeren AP, Kort J, Hassenzahl M (2008) Towards a shared definition of us-
er experience. In: CHI '08 Extended Abstracts on Human Factors in Computing Systems
(Florence, Italy, April 05-10, 2008). CHI '08. ACM, New York, NY, 2395-2398
Livingston IJ, Mandryk RL, Stanley KG (2010) Critic-proofing: how using critic reviews and
game genres can refine heuristic evaluations. In: Proceedings of the International Academic
Conference on the Future of Game Design and Technology (Futureplay '10). ACM, New
York, NY, USA, 48-55
Malone TW (1980) What makes things fun to learn? heuristics for designing instructional com-
puter games. In: SIGSMALL ’80: Proceedings of the 3rd ACM SIGSMALL symposium and
the first SIGPC symposium on Small systems (Palo Alto, CA, September 18-19, 1980) ACM,
New York, NY, USA, 162-169
Malone TW (1982) Heuristics for designing enjoyable user interfaces: Lessons from computer
games. In: Proceedings of the 1982 Conference on Human Factors in Computing Systems
(Gaithersburg, Maryland, United States, March 15-17, 1982). ACM, New York, NY, 63-68
Mandryk RL, Atkins MS (2007) A fuzzy physiological approach for continuously modeling
emotion during interaction with play technologies. Int J Hum-Comput Stud, No 65, Issue 4
(April 2007), 329-347
Marsh T, Yang K, Shahabi C, Wong WL, Nocera L, Carriazo E, Varma A, Yoon H, Kyriakakis
C (2005) Automating the detection of breaks in continuous user experience with computer
games. In: CHI '05 Extended Abstracts on Human Factors in Computing Systems (Portland,
OR, USA, April 02-07, 2005). CHI '05. ACM, New York, NY, 1629-1632
Medlock MC, Wixon D, Terrano M, Romero R, Fulton B (2002) Using the RITE Method to im-
prove products: a definition and a case study. Usability Professionals Assoc., Orlando FL
Nacke LE, Drachen A, Goebel S (2010) Methods for Evaluating Gameplay Experience in a
Serious Gaming Context. International Journal of Computer Science in Sport, vol. 9 (2) (In-
ternational Association of Computer Science in Sports)
Nielsen J (1994) Usability Engineering. Morgan Kaufmann, San Francisco, CA
Nielsen J, Mack RL (1994) Usability Inspection Methods. John Wiley & Sons, New York, NY
Nielsen J, Molich R (1990) Heuristic evaluation of user interfaces. In: Proceedings of the
SIGCHI Conference on Human Factors in Computing Systems: Empowering People (Seattle,
Washington, United States, April 01-05, 1990). CHI '90. ACM, New York, NY, 249-256
Pagulayan RJ, Keeker K, Wixon D, Romero R, Fuller T (2003) User-centered design in games.
In: Jacko J, Sears A (eds), Handbook for Human-Computer Interaction in Interactive Sys-
tems. Lawrence Erlbaum Associates, Inc, Mahwah, NJ
Phillips B (2006) Talking about games experiences: a view from the trenches. interactions Vol-
ume 13, Issue 5 (September 2006), 22-23
Pinelle D, Wong N, Stach T (2008a) Heuristic evaluation for games: usability principles for vid-
eogame design. In: Proceeding of the Twenty-Sixth Annual SIGCHI Conference on Human
Factors in Computing Systems (Florence, Italy, April 05-10, 2008). CHI '08. ACM, New
York, NY, 1453-1462
Pinelle D, Wong N, Stach T (2008b) Using genres to customize usability evaluations of video
games. In: Proceedings of the 2008 Conference on Future Play: Research, Play, Share (Future
Play '08). ACM, New York, NY, USA, 129-136.
Röcker C, Haar M (2006) Exploring the usability of videogame heuristics for pervasive game
development in smart home environments. In: Proceedings of the Third International Work-
shop on Pervasive Gaming Applications – PerGames 2006 (Dublin, Ireland, May 7-10,
2006), Springer-Verlag, Heidelberg, 199-206
Roto V, Kaasinen E (2008) The second international workshop on mobile internet user experi-
ence. In: Proceedings of the 10th international Conference on Human Computer interaction
with Mobile Devices and Services (Amsterdam, The Netherlands, September 02-05, 2008).
MobileHCI '08. ACM, New York, NY, 571-573
Sarodnick F, Brau H (2006) Methoden der Usability Evaluation, Wissenschaftliche Grundlagen
und praktische Anwendung. Hans Huber Verlag, Bern, Switzerland
Schaffer N (2007) Heuristics for usability in games. Technical report, Rensselaer Polytechnic In-
stitute. http://friendlymedia.sbrl.rpi.edu/heuristics.pdf. Accessed 07 December 2008
Sweetser P, Wyeth P (2005) GameFlow: a model for evaluating player enjoyment in games.
Computers in Entertainment (CIE), Volume 3, Issue 3 (July 2005), 3-3
Wolf MJP (2001) The Medium of the Videogame. University of Texas Press, Austin, TX
Ye Z (2004) Genres as a Tool for Understanding and Analyzing User Experience in Games. In:
CHI '04 Extended Abstracts on Human Factors in Computing Systems (Vienna, Austria,
April 24-29, 2004). CHI '04. ACM, New York, NY, 773-774
Zimmerman E (2004) Narrative, Interactivity, Play, and Games. In: Wardrip-Fruin N, Harrigan P
(eds), First Person, MIT Press, Cambridge, UK
