Applying direct manipulation interfaces to customizing player character behaviour
Marco Gillies1
1 Department of Computer Science, University College London, Malet Place, London WC1E 6BT, UK
m.gillies@cs.ucl.ac.uk
http://www.cs.ucl.ac.uk/staff/m.gillies
Abstract. The ability to customize a player's avatar (their graphical representation) is one of the most popular features of online games and graphical chat environments. However, while customizing appearance is common in most games, creating tools for customizing a character's behaviour is still a difficult problem. We propose a methodology, based on direct manipulation, that allows players to specify the type of behaviour they would like in a given context. This methodology is iterative, with the player performing a number of different customizations in different contexts. Players are also able to continue customizing their character during play, with commands that can have long-term and permanent effects.
1 Introduction
Avatars are a vital part of any online game. The graphical representation of a player is the essential element that presents their persona to the rest of the community, and players can develop a deep bond and association with their avatar. For this reason, creators of online games have dedicated a lot of attention to the appearance and animation of avatars. It has also recently been pointed out [29] that allowing avatars some autonomous behaviour can greatly enhance their realism, for example by giving them complex body language that would be too difficult for a player to control in real time. This autonomous behaviour allows the avatar to produce appropriate responses to the behaviour of other players without the player having to control every movement, for example, looking at another player's avatar when they talk. If a player is truly to form a bond with their avatar then they must be able to customize it to create the persona they want to project; this is one of the most popular features of on-line worlds [6]. Current games largely restrict customization to graphical appearance; however, if an avatar is to present a consistent persona it should also be possible to customize its behaviour to make it consistent with its appearance.
Creating user-friendly tools for customizing characters is a challenging problem. When customizing a character's appearance, the player can see virtually the whole effect of their changes in a single view, perhaps rotating it occasionally. Autonomous behaviour, however, involves responding to different events in the world, and therefore requires the character to act very differently in different contexts. This means that a player cannot judge whether they have created the character they want from a single view, or even a moderately sized sequence of views. What is needed instead is an iterative process of refinement. We propose a methodology based on iterative design: players may design their characters before joining a game by editing their behaviour in a number of different contexts, and can then refine that behaviour while playing, using real-time customization.
Another problem with customizing behaviour is that autonomous behaviour systems are typically controlled by a large number of parameters. The effect of these parameters on behaviour can be complex and, as described above, highly dependent on context, so editing them directly is highly unintuitive for players. To solve this problem we take inspiration from the highly successful direct manipulation paradigm of human-computer interaction. Direct manipulation enables people to interact with software by directly editing the end result rather than the internal parameters that produce it. Our methodology allows players to directly specify the behaviour that the character should produce in a given context, while the software infers appropriate parameters. Specifying behaviour in a single context typically underconstrains the parameter values, so players must edit behaviour in a number of different contexts. Doing so in all possible contexts, however, would be prohibitively time consuming, if possible at all, and certainly cannot be required of people playing games in their leisure time. This leads us back to the need for an iterative methodology that allows players to specify only as much as they feel they need at a given time, while remaining free to refine the behaviour later.
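The inference step described above can be sketched as a constrained fitting problem. The sketch below is an illustrative assumption, not the paper's actual system: it supposes behaviour in a context is a linear function of the internal parameters, and fits those parameters to the player's edited examples by regularized least squares. With only one or two edited contexts the system is underconstrained and the regularizer picks a minimum-norm solution; each further edit in a new context pins the parameters down more tightly, which is exactly the iterative pattern the methodology relies on. The model `F(c) @ p` and all names here are hypothetical.

```python
import numpy as np

def infer_parameters(F_rows, b_targets, n_params, reg=1e-3):
    """Fit behaviour parameters p to player-specified examples.

    Hypothetical linear model: behaviour in context c is F(c) @ p,
    where F(c) maps the parameter vector to observable behaviour in
    that context. Each player edit contributes one (F_row, b_target)
    pair. A small ridge (Tikhonov) term keeps the underconstrained
    case well posed, selecting a minimum-norm solution.
    """
    A = np.vstack(F_rows)             # one row-block per edited context
    b = np.concatenate(b_targets)     # desired behaviour in each context
    # Solve the regularized normal equations (A^T A + reg*I) p = A^T b
    lhs = A.T @ A + reg * np.eye(n_params)
    return np.linalg.solve(lhs, A.T @ b)

# One edit in one context: underconstrained, minimum-norm answer.
p1 = infer_parameters([np.array([[1.0, 1.0]])], [np.array([1.0])], 2)

# A second edit in a different context fully determines the parameters.
p2 = infer_parameters([np.array([[1.0, 1.0]]), np.array([[1.0, -1.0]])],
                      [np.array([1.0]), np.array([3.0])], 2)
```

Here the first call spreads the single constraint evenly across both parameters, while the second call recovers a unique solution; in an interactive tool the same machinery would simply re-run as the player supplies examples in more contexts.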
2 Related Work
This work builds on a long tradition of character animation. The lower level aspects focus on body animation, where there has been much success with techniques that manipulate pre-existing motion data, for example the work of Gleicher [10,20] and of Lee and Shin