Data Driven Game Design


Transcript of Data Driven Game Design

Page 1: Data Driven Game Design

Data-Driven Game Design
Pier Luca Lanzi & Daniele Loiacono

Page 2: Data Driven Game Design
Page 3: Data Driven Game Design
Page 4: Data Driven Game Design

[Diagram: the traditional development timeline, from initiation through pre-production, production, testing, and beta release, with the corresponding milestones: concept/design, alpha version, beta version, gold master/release version.]

Page 5: Data Driven Game Design

http://www.gamesradar.com/best-atari-2600-games-all-time/

Page 6: Data Driven Game Design

http://kotaku.com/5957989/the-atari-2600-et--video-game-is-still-terrible-even-when-its-finished-in-less-than-30-seconds

Page 7: Data Driven Game Design
Page 8: Data Driven Game Design
Page 9: Data Driven Game Design

However, smaller studios…

•  Have shorter development windows

•  Have fewer opportunities for extensive playtesting

•  What about early access?

Page 10: Data Driven Game Design

Mobile Game Development

•  Several companies follow the same development process used for traditional platforms and invest 1-2 years in large projects

•  However, this approach is infeasible for most mobile/indie companies, which cannot sustain such a “long” cycle

•  Success in the mobile market appears not to follow established criteria

•  Long projects are perceived as too risky

•  Recent strategies favor the rapid exploration of new ideas, following up only on the more successful ones
§  Development of 2-3 months (4-6 applications per year)
§  Follow-up only on the most successful ones

Page 11: Data Driven Game Design

A video game generates many kinds of data: sales, in-app purchases, user data, gameplay data, playing habits, modding, social networks.

Page 12: Data Driven Game Design

[Diagram: the development timeline (initiation, beta, release) extended with a data-driven loop: game design, development, release, collect game data, analyze game data.]

Page 13: Data Driven Game Design

[Diagram: the design / develop / analyze-data loop.]

Page 14: Data Driven Game Design

Data Mining Basics

Page 15: Data Driven Game Design

Exploratory Data Analysis

Page 16: Data Driven Game Design

Clustering

Page 17: Data Driven Game Design

Classification

Page 18: Data Driven Game Design

Graph Mining

Page 19: Data Driven Game Design

Examples

Page 20: Data Driven Game Design

Predicting Churn

Page 21: Data Driven Game Design

Predicting Churn in Aion (Dmitry Nozhnin)

•  Can we predict new players' churn on the day they log in for the last time?

•  Churn = inactivity for 7 days

•  Variables considered (a classifier sketch over these features follows the list)
§  Playtime at the current level, at the previous level, and total over the player's lifetime
§  Mobs killed per minute (current/previous/lifetime)
§  Quests completed per minute (current/previous/lifetime)
§  Average playtime per play day
§  Days of play
§  Absenteeism rate (number of days skipped during the seven-day free trial)
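A minimal sketch of how such a churn predictor could be set up with scikit-learn. The file name, column names, and choice of classifier are illustrative assumptions, not the setup actually used for Aion.

```python
# Hypothetical sketch: train a churn classifier on per-player features similar
# to the variables listed above. File and column names are made up for illustration.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("players.csv")              # one row per new player
features = ["playtime_current_level", "playtime_previous_level", "playtime_total",
            "mobs_per_min", "quests_per_min",
            "avg_playtime_per_day", "days_of_play", "absenteeism_rate"]
X = df[features]
y = df["churned"]                            # 1 if inactive for 7 days after the last login

clf = GradientBoostingClassifier()
scores = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
print(f"10-fold AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```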


Page 22: Data Driven Game Design

Player Modeling

Page 23: Data Driven Game Design

Player Modeling in Tomb Raider Underworld (Tobias Mahlmann et al.)

•  Analysis of 1365 players who completed the game (out of more than 25,250 players)

•  Clustering of players with Self-Organizing Maps based on several statistics
§  Causes of death (environment, opponents, and falling)
§  Total deaths
§  Completion time
§  Help on demand

•  Four types of players identified
§  Veterans
§  Solvers
§  Pacifists
§  Runners

[Excerpt from the underlying paper (IEEE CIG 2009):]

… the Gaussian kernel. The learning rate is set to 0.7 but is decreased linearly during training, reaching the value of 0.1 at the end of the 100 training epochs used. The training samples are presented in a randomly permuted order at each epoch of the algorithm.

In order to minimize the effect of the non-deterministic selection of initial weight values we repeat the training 20 times (using dissimilar initial weight vectors) and select the ESOM with the smallest average quantization error (see Section V). The highest-performing ESOM that is examined in the remainder of this paper has a quantization error of 0.038 and a corresponding topographic error of 0.005. As a baseline performance to compare against, the mean quantization and topographic errors for 10 randomly generated ESOMs equal 0.1744 and 0.9983, respectively.

2) ESOM visualization: The training data can be clustered by observation of the best-performing ESOM. The U-matrix depicted in Fig. 5(a) is a visualization of the local distance structure in the data placed onto the two-dimensional map. The average distance value between each neuron's weight vector and the weight vectors of its immediate neighbors corresponds to the height of that neuron in the U-matrix (positioned at the map coordinates of the neuron). Thus, U-matrix values are large in areas where no or few data points reside, creating mountain ranges for cluster boundaries. On the other hand, visualized valleys indicate clusters of data, since small U-matrix values are observed in areas where the data-space distances of neurons are small.

Distance-based map visualizations (e.g., the U-matrix) usually work well for clearly separated clusters; however, problems may occur with overlapping clusters. Density-based SOM visualizations display the density of the data onto the map space via the best-matching neurons. The P-matrix (see Fig. 5(b)) displays the local density measures with Pareto Density Estimation [21]. Neurons with large P-matrix values are located in dense regions of the input vector space and, therefore, areas with high P-matrix values indicate clusters in the data.

3) Player Types: The two map visualizations are complementary and are used for cluster identification within the TRU data. Four main classes (player types) can be identified, as depicted in Fig. 5(a) and Fig. 5(b). The best-matching neurons for all 1365 input vector samples are also represented with small squares of varying color; different colors correspond to different clusters of the best-matching neurons. On the same basis, Table I presents the number of observations (i.e., players who completed TRU) and the percentage of neurons belonging to each of the four clusters. Note that 87 players (6.37% of the sample) were not assigned to any cluster since their best-matching neurons are placed on cluster borders of the ESOM (see Fig. 5).

Fig. 5. Visualization maps of the highest-performing ESOM obtained and the 4 clusters identified: (a) U-matrix, (b) P-matrix. The 1365 best-matching neurons are drawn as squares on top of the maps. The maps illustrated are borderless since the map grid is organized in a toroid shape.

TABLE I: The four playing-behavior clusters identified using ESOM

Class | Observations | Neurons on map space (%)
1     | 122          | 8.68
2     | 270          | 22.12
3     | 641          | 46.18
4     | 245          | 16.56
N/A   | 87           | 6.46

Fig. 6 illustrates the corresponding component planes of the ESOM. A component plane projects the relative distribution of one input data vector component (i.e., input vector dimension) onto the ESOM. In the grayscale illustration of those values, white areas represent relatively small values while dark areas represent relatively large values. By matching the component planes with the U-matrix of Fig. 5(a) we can infer characteristics (i.e., playing-behavior features) for each cluster identified.

Cluster number 1 corresponds to players that die very few times; their deaths are caused mainly by the environment and they complete TRU very fast. These players' HOD (help-on-demand) requests vary from low to average, and they are labeled Veterans as they are the best-performing group of players despite the high number of environment-related deaths. Likewise, cluster number 2 corresponds to players that die quite often, mainly due to falling; it takes them quite a long time to complete the game; and they do not appear to ask for puzzle hints or answers. Players of this cluster are labeled Solvers, because they are adept at solving the puzzles of TRU. Their long completion times and low number of deaths by enemies or environment effects indicate a slow-moving, careful style of play, with the number one cause of death being falling (jumping).

Players of cluster number 3 form the largest group and are labeled Pacifists as they die primarily from active opponents. The total number of their deaths varies a lot …
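For readers who want to try this kind of analysis, the sketch below shows a comparable SOM training and U-matrix computation using the third-party MiniSom library. The map size, the input file, and the restart logic are assumptions for illustration; this is not the exact ESOM setup of the paper (which uses a toroidal emergent SOM).

```python
# Sketch of SOM-based player clustering in the spirit of the excerpt above,
# using the MiniSom library. Map size, data file, and restarts are assumptions.
import numpy as np
from minisom import MiniSom

data = np.loadtxt("tru_players.csv", delimiter=",")   # one row per player, scaled features

best_som, best_qe = None, float("inf")
for seed in range(20):                                 # repeat training to reduce the effect
    som = MiniSom(20, 30, data.shape[1],               # of the random initial weights
                  sigma=2.0, learning_rate=0.7, random_seed=seed)
    som.train_random(data, 100 * len(data))            # roughly 100 epochs of random presentation
    qe = som.quantization_error(data)
    if qe < best_qe:
        best_som, best_qe = som, qe

print(f"best quantization error: {best_qe:.3f}")
u_matrix = best_som.distance_map()                     # U-matrix-like map of neighbor distances
winners = [best_som.winner(x) for x in data]           # best-matching unit for each player
```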


Page 24: Data Driven Game Design

Research

Page 25: Data Driven Game Design

Player Modeling in Quake Live

Page 26: Data Driven Game Design
Page 27: Data Driven Game Design

players' social network: geographical distribution, friendship relations, gaming relations

player modeling: playing preferences, player groups, …

Page 28: Data Driven Game Design

Collecting Data

Page 29: Data Driven Game Design

221,857 users across 211 nations

Page 30: Data Driven Game Design

Some findings

•  Considering all 221,857 users
§  40% of users have at least one friend from their own country
§  22% of users have only friends from their own country
§  Only 5% of users have ever played with their friends

•  In Quake Live there are 1713 clans
§  4% of the players belong to a clan
§  70% of users have clan mates also as friends
§  70% of clans are formed by users all from the same country
§  30% of users have ever played with clan mates
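Statistics like these can be computed directly from the friendship graph. Below is a hedged sketch using networkx; the toy graph and the country node attribute are hypothetical, not the actual Quake Live data.

```python
# Sketch: computing same-country friendship statistics from a friendship graph.
# The toy graph and the "country" node attribute are hypothetical.
import networkx as nx

G = nx.Graph()
G.add_node("alice", country="IT")
G.add_node("bob", country="IT")
G.add_node("carol", country="DE")
G.add_edges_from([("alice", "bob"), ("alice", "carol")])  # friendship relations

def same_country_stats(G):
    has_local = only_local = 0
    for u in G.nodes:
        friends = list(G.neighbors(u))
        if not friends:
            continue
        local = [v for v in friends if G.nodes[v]["country"] == G.nodes[u]["country"]]
        has_local += bool(local)
        only_local += len(local) == len(friends)
    n = G.number_of_nodes()
    return has_local / n, only_local / n

at_least_one, only_same = same_country_stats(G)
print(f"{at_least_one:.0%} with a same-country friend, {only_same:.0%} with only same-country friends")
```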


Page 31: Data Driven Game Design

Clustering of Quake Live data based on (i) preferred match type, (ii) weapon of choice, (iii) hours played, etc.

[Chart: win/loss ratio, kill/death ratio, and kills per minute (values between 0 and 2) for the four clusters: Expert, Strategic, Lone, Novice.]
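A minimal sketch of how such a behavioral clustering could be run with scikit-learn; the feature names, the data file, and the choice of k = 4 are assumptions for illustration, not the method used in the original study.

```python
# Sketch: clustering players by behavioral features (hours played, win/loss,
# kill/death, kills per minute, ...). File, columns, and k=4 are assumptions.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

df = pd.read_csv("quakelive_players.csv")
features = ["hours_played", "win_loss", "kill_death", "kills_per_min"]
X = StandardScaler().fit_transform(df[features])

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
df["cluster"] = kmeans.labels_
print(df.groupby("cluster")[features].mean())   # per-cluster behavioral profile
```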

Page 32: Data Driven Game Design

Expert Players
[Two charts; the largest shares are 53%, 17%, 15% and 78%, 11%, 8%.]

Page 33: Data Driven Game Design

Strategic Players
[Two charts; the largest shares are 76%, 13%, 10% and 75%, 9%, 6%.]

Page 34: Data Driven Game Design

Lone Players
[Two charts; the largest shares are 87%, 5%, 4% and 73%, 14%, 7%.]

Page 35: Data Driven Game Design

Novice Players
[Two charts; the largest shares are 48%, 29%, 12% and 56%, 21%, 10%.]

Page 36: Data Driven Game Design

Rapid Playtest and Analysis

Page 37: Data Driven Game Design

The task: develop a video game for Windows Phone to participate in the 2012 Microsoft Imagine Cup.

The challenges:
§  short development time (four months from start to end)
§  small user base (almost nobody had a Windows Phone)
§  a variety of devices with rather different features
§  secrecy: the app could not be distributed before submission

Page 38: Data Driven Game Design

Our approach:
§  instrument the application code to trace almost everything the users could do
§  perform very short playtesting sessions (1-2 days)
§  apply data mining to the collected data to extract typical user behavior and evaluate the gameplay
§  check user behavior on different platforms

Page 39: Data Driven Game Design

Bad Blood – A Serious Game About Diseases

•  Casual game for Windows Phone developed during the Videogame Design and Programming course at the Politecnico di Milano

•  Bad Blood aims at spreading knowledge about human diseases through a series of games set in the blood vessels, in the respiratory system, and in the brain

•  Five continents, in which players can select a specific region (e.g., West Australia) that also corresponds to a disease and thus to a specific scenario

•  Four game mechanics: attack, tap, survive, and puzzle

Page 40: Data Driven Game Design


Page 41: Data Driven Game Design

Collecting Game Data

•  Our analysis focused on the two game modes with the highest interactivity (attack and tap)

•  The code was instrumented to collect any possible information (raw data) about user behavior every 200 ms (a sketch of the idea follows the list)

•  The raw data were then processed to compute several variables, including
§  length and direction of the swipe gestures
§  center position of the players' cells during collisions
§  number of opponents on every screen
§  the number of hits and misses in every second
§  the positions of the hits and misses
§  …
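The original instrumentation was written inside the Windows Phone app itself; the Python sketch below only illustrates the structure of such a sampler. All names and fields are hypothetical.

```python
# Sketch of the instrumentation idea: sample the game state every 200 ms and
# append one raw record per sample. Names and fields are hypothetical.
import json
import time

SAMPLE_PERIOD = 0.2   # the game loop calls sample() every 200 ms
log = []

def sample(game_state):
    """Record everything the later analysis might need as raw data."""
    log.append({
        "t": time.time(),
        "mode": game_state["mode"],                      # e.g. "attack" or "tap"
        "last_swipe": game_state.get("last_swipe"),      # length and direction
        "opponents_on_screen": len(game_state["opponents"]),
        "hits": game_state["hits"],
        "misses": game_state["misses"],
    })

def flush(path="session.json"):
    """Write the collected session to disk for offline analysis."""
    with open(path, "w") as f:
        json.dump(log, f)
```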


Page 42: Data Driven Game Design

Attack Mode: Trajectory of Users' Swipes

Page 43: Data Driven Game Design

Good Taps & Bad Taps

Page 44: Data Driven Game Design

Our Flawed Gameplay

Page 45: Data Driven Game Design

How We Solved the Issue in Time for Submission

•  We modified the gameplay before the final submission to the competition

•  Each level in attack mode now has a random instant mini-boss fight involving bigger bacteria and viruses

•  The user has to instantly increase the firing rate to destroy the enemy before it can hit the player or disappear at the bottom of the screen

Page 46: Data Driven Game Design

Take-Home Message

•  We would never have made the submission with a more traditional approach to playtesting

•  Instrumenting the code helped us get the best out of the relatively few users we could test our game with

•  The analysis of the collected data helped us
§  improve the touch interface (and the placement of colliders)
§  discover a major design flaw that would have made the game boring

•  We did not win the Microsoft Imagine Cup 2012! But we won “Share Care”, a major national competition for serious games devoted to blood donation, and a special prize for innovation

Page 47: Data Driven Game Design

Mining Design Patterns

Page 48: Data Driven Game Design
Page 49: Data Driven Game Design

Level Design in First-Person Shooters

•  Level design involves
§  map geometry
§  navigation meshes
§  pickup locations
§  key areas (e.g., flags)

•  Level design affects the players' experience in terms of challenge, pace, strategy, team play, etc.

Page 50: Data Driven Game Design

Level Design

•  Relies on the designer's experience

•  A trial-and-error process

•  There are several books and studies, but little consolidated knowledge

•  There are several heuristics and tricks of the trade

•  Abstraction and generalization are not obvious

Page 51: Data Driven Game Design

research questions

Can we collect meaningful data from maps?

Can we apply data mining to learn interesting design patterns?

What do players/modders like?

Can we use these patterns to support the design process?

How do the discovered patterns affect gameplay?

Page 52: Data Driven Game Design

Unreal Tournament III (UT3)

•  “Old school” competitive FPS
§  several maps available
§  parameterized AI
§  easily customizable
§  already used for research

•  Different game modes

•  Several weapon types
§  Flak Cannon (short range)
§  Rocket Launcher (medium range)
§  Shock Rifle (medium range)
§  Sniper Rifle (long range)

Page 53: Data Driven Game Design

Software: extended game server, data filtering backend (Java), custom bots (e.g., fixed weapons, explorer), Gephi for graph manipulation, Weka open-source data mining tool.

Page 54: Data Driven Game Design

Collected Data (1)

•  A graph representation of each map is obtained from its navigation mesh

•  Node features include (a sketch of how they can be computed follows)
§  Type (powerup/ammo/weapon/navigation)
§  Position
§  In/out degree
§  Closeness centrality
§  Betweenness centrality
§  Clustering coefficient
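Most of these node features can be computed with an off-the-shelf graph library once the navigation mesh has been exported as a graph. A hedged networkx sketch follows; the GraphML file and the type attribute are assumptions, not the actual pipeline.

```python
# Sketch: computing the node features listed above from a map graph with networkx.
# The GraphML export and the "type" node attribute are assumptions.
import networkx as nx

G = nx.read_graphml("map_navmesh.graphml")        # directed navigation graph (hypothetical file)

in_degree = dict(G.in_degree())
out_degree = dict(G.out_degree())
closeness = nx.closeness_centrality(G)
betweenness = nx.betweenness_centrality(G)
clustering = nx.clustering(G.to_undirected())     # clustering coefficient on the undirected view

for n, attrs in G.nodes(data=True):
    print(attrs.get("type", "Navigation"),
          in_degree[n], out_degree[n],
          round(closeness[n], 3), round(betweenness[n], 3), round(clustering[n], 3))
```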


Page 55: Data Driven Game Design

Collected Data (2)

•  More statistics are computed with a breadth-first exploration using a bot

•  Edge features include
§  Traveling time
§  Length
§  Special actions/items
§  Jumps
§  Edge morphology (raycast)

Page 56: Data Driven Game Design

Team Deathmatch Maps vs Duel Maps

Node Classification

Weapon-Map Relationship

Page 57: Data Driven Game Design

Team Deathmatch vs Duel Maps

Weapon Pickup Distribution

Page 58: Data Driven Game Design

Team Deathmatch vs Duel Maps (2)

Ammo Pickup Distribution

Page 59: Data Driven Game Design


Node Classification

Authority

Betweenness Centrality

Degree

TABLE III: Prediction of the map's type (Duel or Team Deathmatch) based on the features of a single node. Accuracy of different algorithms applied to classify ammo pickups, weapon pickups, and powerup pickups. The Majority Guess always predicts the most frequent class in the dataset and is reported here as a baseline.

Algorithm                   | Ammo pickups  | Weapon pickups | Powerup pickups
Majority Guess              | 62.90         | 55.03          | 52.70
Naive Bayes                 | 42.32 ± 4.15  | 48.76 ± 10.84  | 51.32 ± 4.99
Decision Trees (J4.8)       | 67.89 ± 5.45  | 56.66 ± 9.37   | 63.77 ± 7.06
Classification Rules (JRip) | 69.75 ± 5.58  | 53.98 ± 11.34  | 59.19 ± 7.88
Logistic Regression         | 69.54 ± 6.31  | 63.92 ± 12.78  | 65.76 ± 7.39

TABLE IV: Prediction of the node's type (navigation node, ammo pickup, weapon pickup, or powerup pickup). Accuracy of different algorithms applied to classify the nodes of all the maps, of Duel maps, and of Team Deathmatch maps. The Majority Guess always predicts the most frequent class in the dataset and is reported here as a baseline.

Algorithm                   | All Maps      | Duel Maps     | Team Deathmatch Maps
Majority Guess              | 70.44         | 69.24         | 71.25
Naive Bayes                 | 46.34 ± 4.26  | 46.34 ± 4.26  | 39.98 ± 3.85
Decision Trees (J4.8)       | 70.18 ± 2.75  | 70.18 ± 2.75  | 73.63 ± 2.04
Classification Rules (JRip) | 71.59 ± 1.71  | 71.59 ± 1.71  | 72.71 ± 1.48
Logistic Regression         | 69.85 ± 1.07  | 69.85 ± 1.07  | 71.61 ± 0.93
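The structure of the comparison above can be reproduced, in spirit, with scikit-learn stand-ins for the Weka algorithms (J4.8 roughly corresponds to a decision tree; JRip has no direct scikit-learn equivalent and is omitted here). The data file and column names below are assumptions, not the original dataset.

```python
# Sketch: cross-validated comparison of classifiers on node features, mirroring
# the structure of the tables above. File and column names are assumptions.
import pandas as pd
from sklearn.dummy import DummyClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("ammo_pickup_nodes.csv")     # one row per ammo pickup node
X = df[["in_degree", "out_degree", "closeness", "betweenness", "clustering"]]
y = df["map_type"]                            # "Duel" or "TeamDeathmatch"

models = {
    "Majority Guess": DummyClassifier(strategy="most_frequent"),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name:20s} {100 * scores.mean():.2f} +/- {100 * scores.std():.2f}")
```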



Page 60: Data Driven Game Design

[Bar chart, “Flak Cannon vs Rocket Launcher”: values between 0 and 20 for the maps BioHazard, Idoma, Liandri, DeckFPS, Diesel, Penetrated, Salvation, Fearless, Hypoxia, RisingSunF, Sanctuary, Sentinel, Subterranee.]

Page 61: Data Driven Game Design

Conclusions

•  This is preliminary work…

•  …but this framework could be used to discover, analyze, and understand interesting patterns in FPS maps

•  Future work includes:
§  further investigation of gameplay dynamics
§  better visualization of game dynamics on the maps (e.g., heatmaps of deaths on the map)
§  using the framework to allow what-if analysis and map annotation

Page 62: Data Driven Game Design

Balancing Gameplay

Page 63: Data Driven Game Design
Page 64: Data Driven Game Design

Balancing Multiplayer First-Person Shooters

•  Providing the “right amount” of challenge is very important, and multiplayer games are more difficult to balance

•  Balance depends on the players' skill, the playing strategies, the game environment, the weapons, etc.

•  How can we evaluate whether an FPS is balanced? It is mainly subjective! However, the distribution of kills/scores among players could be a good proxy

•  For example, in a 2-player match the best player should kill the opponent less than twice as many times as they have been killed (a sketch of such a balance proxy follows)
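A minimal sketch of one possible balance proxy based on the kill distribution, following the rule of thumb above. The actual fitness used in the original work may differ; this is only an illustration.

```python
# Sketch of a kill-distribution balance proxy for a 2-player match.
# 1.0 means perfectly balanced; values drop toward 0 as one player dominates.
def balance(kills_a: int, kills_b: int) -> float:
    if kills_a + kills_b == 0:
        return 1.0                       # no kills yet: treat as balanced
    return min(kills_a, kills_b) / max(kills_a, kills_b)

print(balance(20, 15))   # 0.75 -> kill ratio 1.33, within the "less than twice" rule
print(balance(30, 10))   # 0.33 -> the best player kills 3x more, unbalanced
```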


Page 65: Data Driven Game Design

Balancing multiplayer games is both a design and a matchmaking problem

Page 66: Data Driven Game Design

design vs. matchmaking

Page 67: Data Driven Game Design

research questions

How does the map affect match balancing?

Can we automatically design maps to improve balancing?

Page 68: Data Driven Game Design

f=0.92 f=0.96

Page 69: Data Driven Game Design
Page 70: Data Driven Game Design

[Heatmaps over BOT1 skill vs BOT2 skill, before and after evolution; legend buckets: >84%, 66%-84%, 50%-66%, 33%-50%, 16%-33%.]

Page 71: Data Driven Game Design

f=0.93 f=0.98

Page 72: Data Driven Game Design

Take-Home Message

•  Good news
§  Maps can be evolved to improve balancing
§  It works (even better) with players using different weapons
§  It can be combined with other balancing approaches

•  Bad news
§  Bots are used to measure balancing (i.e., accuracy bias)
§  It requires modeling of the players

Page 73: Data Driven Game Design

Pacing the Gameplay

Page 74: Data Driven Game Design
Page 75: Data Driven Game Design
Page 76: Data Driven Game Design
Page 77: Data Driven Game Design
Page 78: Data Driven Game Design
Page 79: Data Driven Game Design

Data-Driven Game Design

•  Video games can generate and collect huge amounts of data

•  These data contain potentially useful information that can help improve the design or inspire new designs

•  Advanced data mining methods are required to analyze such data so as to produce models and knowledge that support designers

Page 80: Data Driven Game Design

http://www.polimigamecollective.org
http://www.facebook.com/polimigamecollective

http://www.youtube.com/[email protected]

Page 81: Data Driven Game Design

References

•  Churn
§  http://gamasutra.com/view/feature/170472/predicting_churn_datamining_your_.php?page=1
§  http://www.gamasutra.com/view/feature/176747/predicting_churn_when_do_veterans_.php?print=1

•  Player Modeling
§  http://julian.togelius.com/Mahlmann2010Predicting.pdf