Interactive Floor


  • 8/3/2019 Interactive Floor


Interactive Floor
Design Project Report

Interaction Design, IT-University
Chalmers University of Technology

    Mohammad Ardavan

    Eelke Boezeman

    Amir Chamsaz

    Keyvan Minoukadeh

    Alejandro Valenzuela


1 Introduction
1.1 Background
1.2 Design Goals
2 Architecture
2.1 Hardware
2.2 Software
2.2.1 Tracking module
2.2.2 Interface module
3 Operation
3.1 User types
3.2 Scenarios
3.2.1 Prisoner's Dilemma
3.2.2 Personzilla
3.2.3 Meditation game
3.2.4 Pong game
3.3 Installation
3.4 Licensing
4 Realisation
5 Evaluation
5.1 Costs and Free Software
5.2 Sensitivity to Light
5.3 Speed
5.4 Foot as Pointing Device
5.5 Shadows
5.6 Sound, ambient sound and music
6 Related Work
6.1 Living Surface by Vertigo Systems
6.2 The Famous Grouse Experience by ART+COM
6.3 Audience by Chris O'Shea
7 References
8 Appendices
8.1 Project plan
8.2 Design document


    1 Introduction

The Interactive Floor is a video projection displayed on the floor that people can interact with. As soon as someone walks into the projected area, he becomes part of the digital world that is projected around him.

The Interactive Floor offers low-barrier interaction. People do not need any knowledge or skills to perform the interaction; the only requirement is that they enter and move around in the projection area. As soon as someone enters the area he is no longer a bystander, but a user. The system keeps track of where the user is and uses this information to change the interface that it projects around the user.

The user sees the interface projected around him and can move around to explore. He quickly learns that by moving around, he changes the interface: his actions change the state of the device. In this way the Interactive Floor offers a direct feedback loop that is very similar to a computer: the projected area is the monitor and the user is the mouse. Yet, our aim is not to provide another input method for ordinary computer programs but to present users with an innovative interactive experience, approaching what is known as "enhanced reality".

    1.1 Background

The Interactive Floor is a Master project developed by five students from the Interaction Design program at Chalmers University of Technology, Gothenburg, Sweden. The project is part of the Design Project course, in which students are required to spend 3 months developing a hardware and/or software project. The theme of the 2009 Design Project course was Interacting Interactive Personalities. All student projects were presented during the Expo 09 exhibition at Lindholmen Science Park, Gothenburg, Sweden.

    1.2 Design Goals

The aim of the project is twofold. First, the goal is to develop a device that projects a display on the floor while capturing user input with a webcam. The device should be able to function under different lighting circumstances and environments, with different kinds of users, and should be able to capture multiple users simultaneously. The device is required to work on basic hardware, as the project budget is limited.

The second goal of the project is to research what kinds of scenarios and games are suitable for this kind of interface. By developing different applications that each have a different perspective, task and type of user input, the goal is to get a rough idea of what works and what doesn't. In light of the theme of the Design Project course, the developed applications should contain agents with personalities that can interact with each other.

    2 Architecture

The architecture comprises everything that was used to project the interface and to capture and process the movements of users. It involves both the hardware and the software that runs on it.


    2.1 Hardware

Due to budgetary constraints it was necessary to use standard, basic hardware components. The setup consisted of several components that were all integrated with each other to create a single working device. It consisted of:

Projector: The system uses a projector to display images on the floor; therefore a projector that projects as bright and as wide as possible was required. Most commonly available projectors will work as long as light conditions can be controlled, i.e. kept to a minimum by blocking external light. For this project the Sharp Notevision PG-C30XE projector is used, because of its wide angle (33 mm) and high lumen value (1700 lm).

Webcam: A fully UVC-compliant webcam is required; the recommended frame-rate capabilities of the webcam for adequate responsiveness range from 15 to 30 frames per second in low-light conditions. The webcam must be modified by removing the infrared-blocking filter and placing an infrared-only-pass filter; this will ensure that the system does not pick up the projected images as users. The recommended webcam model is the Creative Live!Cam IM Ultra. There are various instructions on the web for modifying webcams for IR use.

Infrared light beam: Under normal lighting conditions humans do not emit enough infrared light to be detected properly. A self-made infrared LED circuit, containing 64 OSRAM SFH 285-2 infrared LEDs, was used to produce the infrared light beam.

    Circuit diagram of the infra-red LED array.

Mount structure and mirror: Mounting the projector directly above the displayed area often requires an extensive and expensive hardware setup, which must obviously be very safe and functional. Projection from a high altitude is therefore much easier if the projector is placed on a balcony or a high plateau. To project downwards, a mirror is used that reflects the projection down to the floor.

In-house testing of the mounting structure in the Design Lab at the IT-University. The mirror reflects the image downwards. The mounting structure also holds the webcam (left side) and the infrared light beamer (right side).

The support structure for the mirror was also used to mount the webcam and the infrared beam. Both are ideally placed towards the center of the displayed area, so the mirror mount was the best place for them. It also allowed for a relatively compact hardware setup, which is helpful during transportation.

Computer(s): The software was made in such a way that it could run either on one computer or on two computers (one for image tracking and one for interface projection). In the two-computer setup a TCP/IP connection is necessary so the computers can communicate.

The minimum system requirements are:
- 2 GHz processor
- 1 GB RAM
- 1 USB port for the webcam
- Well-supported video card
- VGA port for the projector

The recommended system requirements are:
- Dual-core 2 GHz processor
- 2 GB RAM
- 1 USB port for the webcam
- Graphics accelerator video card
- VGA port for the projector

GNU/Linux or Mac OS X support for the hardware is required.

    2.2 Software

The development of the software framework was divided into two modules: tracking and interface. The tracking module takes care of the input into the system, which consists of user positions. The interface module is responsible for processing this user position data and changing the display accordingly.


The software framework was divided into two parts for flexibility as well as for development reasons:

Parallel development: Many different third-party software frameworks and libraries needed to be tested to find the one that best suited our requirements. This work could be done in parallel with the development of the interface module, without the two interfering with each other.

Different requirements: While user tracking has efficiency as its most important requirement, the user interface had expressiveness and ease of programming as its primary goals. The obvious programming language for user tracking would therefore be C++. Because of the two-module division, the interface module could be implemented in any other programming language.

Different team members: Each team member had a different set of skills and experience with programming languages and frameworks. The division allowed team members to work in a programming language that suited their needs and experience. Because of its simplicity and object orientation, all interfaces were developed in Processing, a framework that sits on top of Java, while the tracking module was developed in C++.

Dividing the software into two modules to allow for parallel development created the need for a protocol that defined the communication between the two modules. The protocol defined what would be communicated and in what format. Both modules were obligated to follow the predefined protocol. This created the advantage of seamless switching between modules. It also created the possibility of simulating tracking data and using it as input for the interface module.
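The report does not specify the wire format of this protocol, so the sketch below is only an illustration of the idea: a minimal line-based message carrying a user's id and position, which either module can encode or decode independently. All names here ("POS", the field order, the coordinate scale) are assumptions, and the original modules were written in C++ and Processing rather than Python.

```python
# Hypothetical sketch of a line-based tracking protocol.
# The actual message format is not given in the report.

def encode_position(user_id: int, x: float, y: float) -> str:
    """Encode one tracked user's position as a single text line."""
    return f"POS {user_id} {x:.3f} {y:.3f}\n"

def decode_position(line: str):
    """Parse a position line back into (user_id, x, y)."""
    tag, user_id, x, y = line.split()
    if tag != "POS":
        raise ValueError(f"unexpected message: {line!r}")
    return int(user_id), float(x), float(y)

# Round-trip check: what one module sends, the other can read back.
message = encode_position(7, 0.25, 0.75)
assert decode_position(message) == (7, 0.25, 0.75)
```

Because the modules only agree on a textual format like this, a simulator can feed hand-written lines to the interface module, which is exactly the kind of seamless swapping and simulated tracking data described above.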

A more monolithic approach to parallel development was considered as well. However, the advantages of working on a single code base did not outweigh the downsides of developing with multiple team members, each with different needs, skills and knowledge.

After trying out many different software platforms, OpenFrameworks was chosen because the OpenFrameworks-based implementation of the Optic Flow algorithm was found to be the best fit for our requirements.

    2.2.1 Tracking module

The Optic Flow algorithm was used for keeping track of objects that change their position and their shape in a gradual manner from one frame to another.

Since the webcam was able to sense a much wider area than the projected image, scaling users' positions and ignoring those outside the projected image's area were also actions performed by the tracking module.
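As a rough sketch of that mapping (the actual code is not in the report; the rectangle representation and the normalised output range are assumptions):

```python
# Hypothetical sketch: map a camera-space position into the projected
# image's coordinate space, discarding positions outside the projection.

def to_projection_coords(cx, cy, proj_rect):
    """proj_rect = (left, top, width, height) of the projected image,
    measured in camera pixels. Returns normalised (x, y) in [0, 1),
    or None when the position lies outside the projected area."""
    left, top, width, height = proj_rect
    if not (left <= cx < left + width and top <= cy < top + height):
        return None  # bystander outside the projection: ignored
    return ((cx - left) / width, (cy - top) / height)

rect = (100, 50, 400, 300)
assert to_projection_coords(300, 200, rect) == (0.5, 0.5)  # inside, centred
assert to_projection_coords(50, 20, rect) is None          # outside: dropped
```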

During testing, it was found that on-the-fly adjustments would be needed to compensate as much as possible for changing light conditions; therefore the tracking module also allowed for adjusting the minimum and maximum blob sizes for different user groups, as well as image filtering to eliminate mild noise. To compensate for the webcam perspective, the option to offset users' positions was also added to the tracking system.
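A minimal sketch of the adjustable blob-size filter follows; the blob representation and the threshold names are our assumptions (the real module was built on OpenFrameworks in C++):

```python
# Hypothetical sketch of the adjustable blob-size filter: blobs that are
# too small (noise) or too large (light artifacts) are discarded, and the
# min/max bounds can be retuned on the fly for different user groups.

def filter_blobs(blobs, min_area, max_area):
    """blobs: list of dicts with an 'area' in pixels; keep plausible ones."""
    return [b for b in blobs if min_area <= b["area"] <= max_area]

blobs = [{"area": 12}, {"area": 900}, {"area": 40000}]
# Bounds tuned for a given user group keep only the person-sized blob.
assert filter_blobs(blobs, 100, 10000) == [{"area": 900}]
```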


    Interface used to change tracking settings and reset tracking background image.

Detected user positions were relayed to applications in two ways:
1. Through the standard output stream (stdout)
2. Through TCP/IP sockets

Initially we relied on the standard output stream but later switched to sockets. The main reason for the switch was to allow us to separate the tracking module and the interface module by giving each its own machine (and therefore more processing power).
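The switch can be sketched as follows; the report gives no code, so this only illustrates the stdout-to-socket move, with a local socket pair standing in for the two-machine TCP/IP connection.

```python
import socket

# Illustrative only: a connected socket pair stands in for the TCP/IP link
# between the tracking machine and the interface machine.
tracker_end, interface_end = socket.socketpair()

# Tracking side: write a position line to the socket instead of stdout.
tracker_end.sendall(b"POS 1 0.400 0.600\n")

# Interface side: read and split the line exactly as it would from stdin.
line = interface_end.recv(64).decode().strip()
tag, user_id, x, y = line.split()
assert (tag, user_id) == ("POS", "1")

tracker_end.close()
interface_end.close()
```

Keeping the message format identical across both transports is what makes the two delivery mechanisms interchangeable from the application's point of view.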

    2.2.2 Interface module

The interface module could be anything that used the communication protocol. In section 3.2 four different scenarios are described which are implementations of the interface module. The relationship between the tracking module and the interface module is one-to-many.

Overview

The following diagram shows how the two modules integrate with each other. The process begins at the tracking module, where image data from the webcam is processed into user position data. The interface module uses this data to update the scenario and send the graphical data to the projector.


    Diagram showing how the two modules integrated with each other.


    Diagram showing how the two modules make use of different software frameworks.

    3 Operation

    3.1 User types

A participant is someone standing on the projected field. Participants are able to interact with the visible agents, as well as among themselves (if there is more than one participant on the field, the system allows participants to interact simultaneously), in the different schemes available.

Participants can enter and exit the projected field at any time, and we expect their interaction with it to last between 3 and 15 minutes, depending on how interested they are. We expect participants to have a very diverse age and occupation range, as the Interactive Floor is pitched as a public exhibition.

    3.2 Scenarios

The system can be used in a number of different ways depending on the application running. The tracking module always provides the same input (position of participants, amount of movement) regardless of the application running. Each application can use this input and interpret it in whatever way is relevant. For example, an application which only responds to the amount of movement on the floor can ignore the position values of participants provided by the input module.

We developed four different games (scenarios) for the interactive floor, each with its own style of interaction. The applications that were developed for the system are outlined below.

3.2.1 Prisoner's Dilemma

Prisoner's Dilemma is a strategy game based on game theory, where two players choose whether to cooperate with or defect against each other without knowing the other player's choice. After each choice, every player receives a score depending on the combined result of both choices. If both choose to cooperate, each player gets 3 points; if one cooperates and the other defects, the defector gains 5 points while the cooperator gets nothing; if both defect, each player receives 1 point. The goal for each player is to outscore the other player while the number of game iterations is unknown.
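The payoff rules above translate directly into a small lookup table. The sketch below is ours rather than the project's code, but the point values match the text:

```python
# Payoff matrix from the rules above: (my points, opponent's points).
# "C" = cooperate, "D" = defect.
PAYOFF = {
    ("C", "C"): (3, 3),  # both cooperate
    ("C", "D"): (0, 5),  # I cooperate, opponent defects
    ("D", "C"): (5, 0),  # I defect, opponent cooperates
    ("D", "D"): (1, 1),  # both defect
}

def score_round(mine, theirs):
    """Score one iteration for (me, opponent)."""
    return PAYOFF[(mine, theirs)]

assert score_round("C", "C") == (3, 3)
assert score_round("D", "C") == (5, 0)
assert score_round("D", "D") == (1, 1)
```

Agent "personalities" (aggressive, retaliating, forgiving, and so on) then amount to different policies for choosing "C" or "D" given the history of such rounds.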

    Still from video of the Prisoners Dilemma. See the whole demo: http://vimeo.com/4184793

There are different strategies for playing the game, and each person has their own style of acting in different situations. Our computer-generated agents, too, incorporate these different behaviors in their personalities. Agents projected on the floor interact with a user as soon as the user steps into the field. Each agent possesses a specific personality, which could be, for instance, aggressive, retaliating, forgiving, etc. These different personalities are, in fact, the strategies through which the agents interact.

As soon as a player walks into the playing field, the system recognizes a moving object and tracks him/her as they move. When the player encounters an agent, they start a game and score based on their moves. In its standard form, the number of iterations in each game is random and thus unknown, in order to reduce the chances of always-aggressive play. During the exhibition, however, three iterations were assigned to each round of the game, and players were able to play for as long as they stayed in the field.

We realized that the game concept was hard for the audience to grasp. If they were not acquainted with the prisoner's dilemma, the idea behind the game would seem fuzzy. Although the system was capable of tracking multiple users, we realized that the images would interfere with each other if multiple users played simultaneously, and thus the game was designed for a single player. One goal in the development of this game was to enable agents to interact amongst each other. This goal was not realized due to the limitation of the projected space as well as time constraints. The implementation of the prisoner's dilemma game was an attempt to explore interactive personalities in the context of user tracking, and although it did not attract high user attention, it was technically a successful implementation.

    3.2.2 Personzilla

Personzilla is a game where participants become Godzilla-like giants in an attempt to protect their egg from evil tanks who will stop at nothing to crack it open until they've been stomped on! The aim of the game is to destroy as many tanks as possible before they can shoot the egg three times.


Left: Still from video. Children playing Personzilla and setting another high score. See the whole video here: http://vimeo.com/4738059. Right: Screenshot of the displayed area. White dots are users, green tanks are tanks ready to kill the egg, and red tanks are destroyed tanks.

Designed in an attempt to explore multiple-user tracking, Personzilla was the most successful scenario in attracting an audience. As multiple players walked through the field, they could interact with the projected tanks and destroy them by stepping onto them.

The simplicity of the game and the active playing style were an advantage: even the small kids trying the game could easily understand it and have fun. Among the different user groups, kids were the most attracted to Personzilla.

The game style allowed for incorporating very flexible handling of user tracking, to minimise as much as possible the interference caused by changing light conditions and other unexpected factors (children hiding behind the installation's curtains, for instance).

Personzilla also employed sound and music for better feedback and a more immersive experience: tanks explode when stepped on, and the happy soundtrack we chose on days 2 and 3 of the exhibition set the mood for enjoyment (on day 1 we had chosen a soundtrack composed of war sounds that, while being excellent, was deemed not entirely appropriate for children).

    3.2.3 Meditation game

The idea behind this game was to create something a little less conventional: a game where people's natural movements and tendency to fidget could be recorded as user scores. In this game the aim is simply to sit on the floor for one minute without moving or fidgeting. The application monitors players as they sit still and records any movement. The person recording the least movement at the end of the 'meditation' session is the winner.

This is one idea we had when thinking about scenarios which our interactive floor could enable but which would not make sense as a conventional computer game (using mouse, keyboard or game controllers).
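The scoring rule can be sketched in a few lines. This is our illustration, and the data layout (per-frame movement amounts per player) is an assumption:

```python
# Hypothetical sketch of the meditation scoring: each player accumulates the
# movement amounts the tracking module reports per frame, and the player
# with the least total movement after the session wins.

def meditation_winner(movement_log):
    """movement_log: dict mapping player id -> list of per-frame movements."""
    totals = {pid: sum(frames) for pid, frames in movement_log.items()}
    return min(totals, key=totals.get)

session = {"alice": [0.0, 0.1, 0.0], "bob": [0.4, 0.3, 0.2]}
assert meditation_winner(session) == "alice"  # fidgeted the least
```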


    One of the better players we've encountered during usability testing

    3.2.4 Pong game

This scenario is an implementation of Pong's gameplay, where players slide their paddles by moving along the sides of the field. Players move their paddles to bounce a ball back and forth. The aim for each player is to earn more points by bouncing the ball in a direction where the opponent fails to return it.

Pong was developed as a two-player game to experiment with assigning user IDs to players and keeping a record of each identity. The implementation of the game faced issues due to the noise caused by lighting conditions, as well as a programming issue that dismissed players if they accidentally walked away from the sides of the field.
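The paddle control described above can be sketched as follows; this is our illustration, and the coordinate names and the clamping behaviour are assumptions (clamping is one simple way to avoid dismissing players who drift past the field edge):

```python
# Hypothetical sketch: a player's tracked position along one side of the
# field drives that player's paddle, clamped so the paddle stays in bounds.

def paddle_top(player_pos, field_start, field_length, paddle_length):
    """Map a position along the field side to the paddle's top coordinate."""
    t = (player_pos - field_start) / field_length   # normalise to [0, 1]
    t = max(0.0, min(1.0, t))                       # clamp stray positions
    return field_start + t * (field_length - paddle_length)

assert paddle_top(50, 0, 100, 20) == 40.0    # mid-field -> centred paddle
assert paddle_top(-30, 0, 100, 20) == 0.0    # walked past the edge: clamped
```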

Still from video. Testing the Pong game. See the demo: http://vimeo.com/4184980

    3.3 Installation

We had quite a few requirements for the interactive floor, and the space it occupied was very important to us. We needed to install the projector and connect the computers running the code. The projector had to be a few meters above the actual floor, so we needed a spot that allowed us to be on the floor and still reach the projector and computers. The only suitable spot we found was by the stairs leading up to the 1st floor.

    The projector ended up on the 1st floor (on top of a table to give it a little more height).


To keep the light out of the projected area we hung curtains, attaching one end to the sides of the stairs. On the first day of the exhibition we improved this by erecting walls around the space to keep even more light out and to prevent people from moving the curtains from outside.

    Our desired location within Lindholmen Science Park

We also found that by placing the white walls face down on the floor (a grey patterned colour) we ended up with a much clearer image from the projector. This change, however, proved problematic for our tracking module. The resulting infrared image showed people blending in with the floor much more than before, reducing the accuracy of the tracking algorithm. We considered using different materials to cover the floor, but with very little time left we decided to go back to projecting on the grey floor.

Two posters were designed to put up on the wall at the entrance of the project in order to explain the work. A brief description of the project was given in the project poster (see below), and the implemented scenarios were explained in a second poster titled Scenarios. Since following the different scenarios was essential to understanding the project, the posters served as mediators for explaining the project as well as its scenarios and gameplay to visitors.


    Project poster

    Scenarios poster


    3.4 Licensing

    Our project was largely made possible by adapting existing Free Software to our needs -both OpenFrameworks and Processing are Free Software frameworks in themselves.

Therefore, we recognise the advantages that come with Free Software and have decided to license it under the terms of the GNU General Public License Version 3.0 or higher, hoping our development efforts will be useful to others as well.

    4 Realisation

The project started with working on several project ideas, which we published in our development blog, located at http://idp.mexinetica.com/blog/ .

Chief among these ideas, which were proposed by brainstorming, we had four concepts:

- A "playing field" which interacts with users, with a very basic projected image for "personality", a character to interact with.
- An augmented refrigerator, with a display showing animated characters that reflect the status of the different items inside it and react to the user.
- A space shooter game where the spaceship is not directly controlled by the user, but by a computer-controlled character displaying emotions and reacting to user input.
- An application which displays an animated character and tracks whether the user is paying attention. When the user fidgets, the animated character becomes angry. The essence of this idea would actually return in the end, as the "meditation" scenario.
Recombining these ideas, the Interactive Floor was the one project concept that everybody could get involved in the way they wanted. Creating the device seemed to be a challenge, but still feasible. After settling on the Interactive Floor we developed two prototypes. The first prototype focused on how the device would look when it worked (see image). The second prototype was one of the many software frameworks we tried, with a basic version of the tracking algorithm, using Miis (Nintendo Wii avatars) instead of real people to demonstrate basic tracking functionality.

Still from video. This prototype used real video footage with the interactive floor digitally rendered into the images.


Still from video of Miis walking around, which was used as input to demonstrate the tracking algorithm prototype.

After the prototype phase we got feedback from groups with experience in similar projects that we would really need an infrared light beam to be able to capture user input properly. Without the infrared light there would be much more noise, or we would have to use natural light, which is much more unstable, and agents projected on the floor would be picked up erroneously as users.

To test the concept and our initial tracking code, we used pre-recorded videos of moving objects (e.g. Nintendo Miis) instead of a live video feed and applied the tracking algorithm to the video. And instead of using a projector straight away, we viewed the results on the monitor.

We knew that the test environment indoors was very different from the space at the exhibition. The ceiling in the design studios was a lot lower, and we weren't able to fix the projector high enough to give us the projection area we were after. We were therefore unable to test with real people inside the design studios. To get around this problem we placed the beamer and its stand on one of the tables in the design studio to project onto the floor, and we used our feet and a rod with reflective material attached to one end to mimic user movements.


    "Feet chasing monster" prototype

To test the scenarios, most of the time the mouse was used to simulate the input, which was useful; yet we ran into a few issues which we had not realised:

- The mouse support was coded in such a way that it never sent a "user entered / user left" notification.
- It reported position events much more frequently than the webcam did.
- Multiuser event generation was awkward and scenario-dependent.

A multiuser event simulator was partially developed, but it was not finished because we preferred to focus our efforts on the tracking code and the scenarios.

The scenarios comprising the interface were developed in parallel with the tracking software; the first test scenario consisted of a "monster" chasing the user's location as reported by the input.

The second scenario to be developed was the Prisoner's Dilemma, which made us realise the following facts:

- A player avatar is needed, because the tracking is not 100% accurate.
- The interface for making choices had to be very explicit and able to detect when its graphics were out of bounds (outside of the projected image).
- The interface should also be able to convey at least some usage instructions by itself.
- Graphics should be simple and with contrasting colours.

Finally, the development of the Personzilla and Meditation scenarios was started in parallel a few weeks before the exhibition, as the tracking system was considered ready, mainly to fully explore the interaction possibilities but also to debug the tracking system.

The main design factor behind Personzilla was multiuser and active interaction; the fact that it should be a game with tanks being crushed was more of a sudden inspiration, combined with the fact that people step on the images. The reasons for creating the meditation game were to experiment with longer tracking of multiple users and also to create something a little less conventional than a typical computer game.

    5 Evaluation

Carrying out this project was a very interesting experience for us and enabled us to explore a number of areas related to human interaction with such systems, the technologies behind these systems, and the issues likely to arise.

    5.1 Costs and Free Software

One of the biggest obstacles we came across when researching these types of systems was the costs involved. On the hardware side, finding a suitable projector was difficult, and the cost of buying a more powerful projector (or one with a wider angle) was much too high.

On the software side, we could not find any existing solutions or any frameworks (free or closed-source) to help developers build applications for these types of systems. One major downside of this is that we had to spend a considerable amount of time building and testing our own framework instead of focusing on applications and user interaction.

We were all keen to stick to free software solutions as much as possible, so we focused on free software frameworks with support for image and video analysis. We did eventually find some code which partly implemented some of the basic tracking we were hoping for and ended up building on top of that.

    5.2 Sensitivity to Light

Another problem we encountered was changes in light, which affected the tracking system. The tracking software works by taking a snapshot of the floor when it starts and comparing it against subsequent frames to produce a difference image. The difference between these frames, in a controlled environment, should only be the presence of people on the floor. When light conditions change, however, the tracking system easily becomes confused because it is suddenly comparing images of the floor with more light than in the original snapshot. The light is sometimes identified as a user; sometimes it affects an actual user's position on the floor.
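The background-subtraction step described above can be sketched as follows. This is our illustration (grayscale values as plain lists; the threshold value is an assumption), and the second check shows exactly the failure mode the paragraph describes: a global light change flips the whole floor to foreground.

```python
# Hypothetical sketch of the difference-image step: compare each frame
# against the startup snapshot and mark pixels that changed enough.

def difference_mask(background, frame, threshold=50):
    """Per-pixel absolute difference against the startup snapshot;
    True marks likely foreground (a person on the floor)."""
    return [[abs(f - b) > threshold for f, b in zip(f_row, b_row)]
            for f_row, b_row in zip(frame, background)]

snapshot = [[10, 10], [10, 10]]       # empty floor at startup
frame    = [[12, 200], [10, 180]]     # someone standing on the right
assert difference_mask(snapshot, frame) == [[False, True], [False, True]]

# A global light change raises every pixel, so the whole floor turns "on":
brighter = [[90, 90], [90, 90]]
assert difference_mask(snapshot, brighter) == [[True, True], [True, True]]
```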

Despite our best efforts to keep light out of the projection area by erecting walls around the space, we still experienced problems with tracking due to changes in light.

    5.3 Speed

Another aspect of the system which was important to us was speed. We had two components running simultaneously: a tracking component which was constantly analysing a live video feed from our webcam, and the display/reaction component which was constantly reading from the tracking component and projecting something in response at a high frame rate. This put a great burden on the computer. We managed to lessen the load by splitting these tasks across separate PCs and letting the components communicate via a wireless connection.

Even after these changes, we experienced an unexpected speed boost on the last day of the exhibition. We're still not completely sure why, but one likely reason could be the IR filters we used in our modified webcam. The filters block out a lot of light and end up reducing the framerate of the video feed unless we can compensate with enough IR light of our own. We suspect one of these filters came loose, causing more light to enter the webcam and resulting in a speed increase.

    5.4 Foot as Pointing Device

Observing people who hadn't been on the interactive floor before, we noticed many using their feet as a kind of pointing/clicking device. Especially in games where objects were projected, people's immediate reaction was to try to stamp on the objects with their feet. This proved to be a problem, as the tracking component could not tell when a user's foot hit the floor and calculated the position based on the user's entire body (not just the foot).

While the tracking component could perceive the users' shapes, transmitting this data to the scenarios proved to be unfeasible because it required a much more complex communication protocol; to deal with this problem, a collision detection query was proposed but unfortunately was not carried out due to time constraints.

    Since it was not possible for the interactive scenarios to accurately detect cases where users stuck their feet out to stamp on objects, we could, and did, make assumptions about users' heights so we could calculate positions around the feet.


    A better approach might have been to integrate shock sensors for footfall recognition (see The Famous Grouse Experience by ART+COM). This would, however, have added some extra complexity to our system.

    5.5 Shadows

    Shadows were not a big problem, but unlike similar systems, our projector was positioned to one side of the projected floor, projecting at a slight angle. This angle meant we had to deal with slightly more extended shadows than in systems which position the projector exactly in the center of the floor. In games which required users to look out for projected, moving objects (such as Personzilla and Prisoner's Dilemma), the shadows were a slight nuisance.

    We discussed two options for reducing shadows. One was to use two projectors projecting from two sides; the other was to move our projector to the center of the floor. The first option was too complicated, as we would have had to configure both projectors to align the projected images exactly. The second option was tricky because we had no easy way of securing the projector to prevent it from falling.

    5.6 Sound, ambient sound and music

    Sound effects were first considered as an obvious complement to the visuals of Personzilla: stepping on a tank should produce some sound. However, we found that they were not just an "obvious complement" but much-needed feedback, as they made the results of the interaction completely unambiguous when people were not familiar with the graphics (no matter how different we tried to make a "live" tank look in comparison to a crushed tank, it was never explicit enough). They also worked around the shadow problems mentioned earlier: users no longer needed to see that the tank was crushed, they knew it from hearing the sound effects.

    On the other hand, having ambient sound or music dramatically changed the users' appreciation of the project. Without ambient sound, they would just wander by, see the images on the floor, perhaps crush one or two tanks, and go somewhere else. With war ambient sounds they would spend at least three minutes testing the project, giving the game enough time to become difficult and more interesting. The war sounds created an atmosphere of war which, though good for our game in general, would probably not be the most appropriate for children, so on the second day of the exhibition we decided to replace it with a happy, instrumental electronic music soundtrack, with the goal of making the game seem less serious and more like a fun contest.

    6 Related Work

    Related work falls into two categories: works with similar goals to ours, and works which used technologies similar to ours but with different goals in mind.


    6.1 Living Surface by Vertigo Systems

    Vertigo Systems, a German company, have a similar product called Living Surface. We do not have very much information on the technical aspects of the system, particularly the software, but we do know the hardware used is similar to ours: a projector, a video camera, and infrared LEDs. We suspect one difference between their system and ours is the use of a wide-angle lens on the projector. Their installation in the Universeum projects onto a large area of the floor despite the projector being a short distance away from the floor.

    6.2 The Famous Grouse Experience by ART+COM


    ART+COM, another German company, have worked on similar interactive floor systems for various clients. This particular work was created in 2002 using 6 PCs and 6 projectors, 6 infrared spotlights and 2 cameras for tracking, and 8 shock sensors for footfall recognition! (They also used a separate PC just to control everything.)

    6.3 Audience by Chris O'Shea

    Chris O'Shea's work, Audience, uses user tracking in a different way. It is an installation made up of a number of mirrors which all point at a particular person and follow him/her around (keeping the mirrors pointed at the user). The idea here is not an interactive floor, but it is related because of the hardware and software used in its implementation.

    Chris uses free software such as openFrameworks and OpenCV to implement the tracking part of the system. We heard about his work through forums discussing the topic of user tracking using free software and contacted him for more information. The information we found on the forums, and looking at projects like Audience, influenced our decision to use openFrameworks and OpenCV to implement the tracking part of our own system.


    7 References

    Living Surface: http://livingsurface.de

    Accessed on: 19 May 2009

    The Famous Grouse Experience: http://www.artcom.de/index.php?lang=en&option=com_acprojects&id=7&Itemid=113&page=6
    Accessed on: 19 May 2009

    Audience: http://www.chrisoshea.org/projects/audience/
    Accessed on: 19 May 2009

    OpenCV: http://opencv.willowgarage.com/wiki/
    Accessed on: 27 May 2009

    OpenFrameworks: http://www.openframeworks.cc/

    Accessed on: 27 May 2009


    8 Appendices

    8.1 Project plan

    Week 5
    Form groups, brainstorm about project ideas
    Present three project proposals

    Week 6
    Decide on project
    Create prototype(s)
    Present final project with prototypes

    Week 7
    Discuss project possibilities, general approach and software and hardware requirements
    Decide on development schedule
    Prepare pitch presentation
    Present pitch

    Week 8, 9, 10
    Develop software for tracking and interfaces / scenarios
    Create mounting prototype
    Test different webcams and projectors
    Test prototype with soft- and hardware in design labs

    Week 11
    Find testing location in Science Park
    Test setup in Science Park

    Week 12, 13, 14
    Develop software for tracking and interfaces / scenarios
    Test setup in Science Park

    Week 15, 16, 17
    Create website
    Develop software for tracking and interfaces / scenarios
    Test setup in Science Park

    Week 18
    Prepare for exhibition
    Make last changes to soft- and hardware

    Week 19
    Exhibition time!


    8.2 Design document


    Interactive Floor

    Design Document 4.0

    Mohammad Ardavan

    Amir Chamsaz

    Eelke Boezeman

    Keyvan Minoukadeh

    Alejandro Valenzuela

    Last modified: 2009-05-27


    Index

    1 Introduction
    1.1 Background
    1.2 Design Goals
    1.3 Related Work
    2 Architecture
    2.1 Overview
    2.2 Software dependencies
    2.3 Components
    2.4 Data
    2.5 Communication
    3 Operation
    3.1 User types
    3.2 Scenarios
    3.3 Installation
    3.4 Licensing
    4 Development
    5 References


    1 Introduction

    1.1 Background

    An Interactive Floor is a real video-projection floor display, where images appear and people can interact with them. Compared to typical interfaces, the Interactive Floor offers the user the capability of really being part of the interaction. All the user needs to do is walk into the projected area and he becomes part of the digital world that is projected around him.

    The setup consists of a beamer projecting images on the floor and a webcam scanning the projected area for user input. The beamer projects interacting personalities on the floor that react to the user moving within the projected field.

    The interactive floor projection is an attractive high-tech application of image projection for meetings and events. A floor projection can be used in almost any place imaginable: the entrance of a venue, the floor of a stand, a dance floor, a table surface. Attention guaranteed; it is a great tool to attract people and get exposure.

    1.2 Design Goals

    Develop a virtual surface by projecting images on the floor.

    Process input from multiple users simultaneously.

    Research limits on efficient projection and tracking with ordinary commercial hardware.

    Present agents that interact with the user as well as among themselves.

    Use these agents to simulate interacting interactive personalities.

    Present various interactive scenarios, such as small strategy games and less goal-oriented schemes.

    Research these interaction schemes to find guidelines for creating suitable interfaces for the Interactive Floor.

    1.3 Related Work

    Vertigo Systems, a German company, has a similar product called Living Surface: http://living-surface.de.

    We do not have very much information on the technical aspects of the system, particularly the software, but we do know the hardware used is similar to the hardware we intend to use: a projector, a video camera, and infrared LEDs. Although we are not sure, we suspect one difference between their system and ours is the use of a wide-angle lens on the projector. Their installation in the Universeum, Göteborg, projects onto a large area of the floor despite the projector being a short distance away from the floor.

    2 Architecture

    2.1 Overview

    The following figure shows an overview of the logical design of the hardware and the software.


    The webcam captures an image, which the Tracking module converts into user tracking data. This data adheres to a predefined protocol. The user tracking data serves as input to the Interface module, which uses this data to project an interface with a beamer.

    The system works as a loop: the image projected by the Output component provides an interface for the user, projected on the floor. If the user decides to interact with this interface, the Input mechanism captures his movement. This results in user tracking data that serves as input to the Output mechanism, upon which the interface can be changed accordingly.

    2.2 Software dependencies

    The following figure shows the software dependencies.

    Figure 1: Hardware and software flow


    The Interface Module and interactive scenarios depend on Processing 1.0.3

    The Tracking Module depends on openFrameworks 0.05 (0.06 in the case of Mac OS X) and its implementation of the Optic Flow algorithm.

    2.3 Components

    The interactive floor is composed of the following hardware and software:

    Input: Webcam, UVC compliant (with an IR filter for the lens)

    Output: Wide-angle Beamer

    Tracking Software: Image processing software (openFrameworks; custom enhancements to the Optic Flow algorithm)

    Interface software: Graphics, Interactive Scenarios (Processing)

    Platform: Ubuntu, Debian GNU/Linux and MacOSX are supported

    2.4 Data

    Input: Image data obtained from webcam.

    OpenCV-compatible internal representation.

    Internal representation after image processing and Optic Flow algorithm.

    Interactive Scenarios' internal states

    2.5 Communication

    The main communication in the system is a cycle between the tracking and interface modules. The tracking module analyses a live video stream of the floor and extracts information such as people's positions, direction of movement and a general level of movement.

    This information is passed to a separate output module which determines the next action of the currently running application. The result of this process is then projected back onto the floor, prompting the user(s) to make his/her next move. The cycle continues until the application comes to an end. Technical details are specified below:


    Figure 2: Software dependencies


    A bitmap buffer in UYVY or JPEG format is received from the camera drivers, which is then processed by OpenCV/openFrameworks.

    Subsequently, OpenCV/openFrameworks employs filtering techniques to detect people present inside the image and the Optic Flow algorithm to keep track of their movement, producing User Event data.

    User event data is passed from the Tracking module to the Interface module in User Tracking Protocol format through TCP sockets. This format is subject to change as the project is developed. Its description follows:

    A string of variable length between 45 and 70 bytes, NULL-terminated, containing the following fields separated by straight pipes:

    - User event type; one digit [0: user joined; 1: user moved; 2: user exited; 3: user mood]
    - User number; one or two digits
    - User's X position; floating point number [0.0, 1.0], relative to virtual environment width
    - User's Y position; floating point number [0.0, 1.0], relative to virtual environment height
    - User's X orientation coordinate; floating point number [0.0, 1.0]
    - User's Y orientation coordinate; floating point number [0.0, 1.0]
    - Measure of a user's activity pattern

    A standard VGA/SVGA/XVGA signal in a compatible resolution and frequency is sent to the beamer.
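    As a concrete illustration, a message in this format could be decoded as follows (a hedged Python sketch; the function name, field names and sample string are ours, not part of the protocol definition):

```python
EVENT_TYPES = {0: "user joined", 1: "user moved", 2: "user exited", 3: "user mood"}

def parse_tracking_message(msg: str) -> dict:
    """Split a NULL-terminated, pipe-separated User Tracking Protocol string
    into its seven fields, converting numeric values as described above."""
    fields = msg.rstrip("\x00").split("|")
    return {
        "event": EVENT_TYPES[int(fields[0])],  # one digit
        "user": int(fields[1]),                # one or two digits
        "x": float(fields[2]),                 # [0.0, 1.0], relative to width
        "y": float(fields[3]),                 # [0.0, 1.0], relative to height
        "orient_x": float(fields[4]),
        "orient_y": float(fields[5]),
        "activity": float(fields[6]),          # measure of the user's activity pattern
    }

# A hypothetical 'user moved' event for user 7:
event = parse_tracking_message("1|7|0.42|0.58|0.10|0.90|0.33\x00")
```

    A flat, pipe-separated string like this kept parsing trivial on the Processing side at the cost of extensibility, which is why the format was marked as subject to change.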

    3 Operation

    3.1 User types

    If the system is deployed in an adequate, light-controlled setting, no calibration is necessary. Therefore only one type of user is recognised:

    Participant:

    A participant is someone standing on the projected field. Participants are able to interact with the visible agents, as well as among themselves, in the different scenarios available. Participants can enter and exit the projected field at any time, and we expect their interaction with it to last between 3 and 15 minutes, depending on how interested they are. We expect participants to have a very diverse age and occupation range, as the interactive floor is pitched as a public exhibition; however, it is usually children who are most interested in it.

    3.2 Scenarios

    The system can be used in a number of different ways depending on the application running. The Tracking module always provides the input in the same format (position of participants, amount of movement) regardless of the application running. Each application can use this input and interpret it in whatever way is relevant to the application. For example, an application which only responds to the amount of movement on the floor can ignore the position values of participants provided by the input module.

    The scenarios developed for the Interactive Floor showcase much of its functionality but are by no means exhaustive:


    Strategy game: Prisoner's Dilemma

    Prisoner's Dilemma is a strategy game where two players each choose whether to cooperate with or defect against the other, without knowing the other player's choice.

    After each choice (iteration), every player receives a score depending on the combined result of both choices. If both choose to cooperate, each player gets 3 points; if one cooperates and the other defects, the defector gains 5 points while the cooperator gets nothing; if both defect, each player receives 1 point. The goal for each player is to outscore the other player while the number of game iterations is unknown.
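    The scoring rule above can be written down directly (an illustrative Python sketch of the payoffs only, not our agent implementation, which was written in Processing):

```python
def score_round(a_cooperates: bool, b_cooperates: bool) -> tuple:
    """Return (points for A, points for B) for one iteration, using the
    payoffs described above: 3/3 for mutual cooperation, 5/0 for defecting
    against a cooperator, 1/1 for mutual defection."""
    if a_cooperates and b_cooperates:
        return (3, 3)
    if a_cooperates and not b_cooperates:
        return (0, 5)
    if not a_cooperates and b_cooperates:
        return (5, 0)
    return (1, 1)

print(score_round(True, True))   # both cooperate → (3, 3)
print(score_round(False, True))  # A defects against a cooperator → (5, 0)
```

    Because the number of iterations is unknown to the players, always defecting is not guaranteed to be the best accumulated strategy, which is what makes the agents' differing personalities interesting.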

    There are different strategies for playing the game, and each person has their own style of acting in different situations. Our computer-generated agents, too, incorporate these different behaviors in their personalities. Agents are projected on the floor and interact amongst each other, and with a user as soon as the user steps into the field. Each agent possesses a specific personality, which could be, for instance, aggressive, retaliating or forgiving. These different personalities are the strategies through which agents interact.

    As soon as a player steps into the playing field, the system recognizes them and tracks them as they move. When the player encounters an agent, they enter a game and score based on their moves. The number of iterations in each game is unknown in order to reduce the chances of always-aggressive play. Each player tries to maximize their score, and as soon as the game finishes, the space around the winner shines and the loser leaves the field.

    Personzilla

    Personzilla is a game where participants become Godzilla-like giants in an attempt to protect their egg from evil tanks who will stop at nothing to crack it open, until they've been stomped on! The aim of the game is to destroy as many tanks as possible before they can shoot the egg three times.

    Its easy-to-understand yet challenging and active nature makes it popular, especially with children.


    Figure 3: Prisoner's dilemma interface


    Meditation Game

    In this game the aim is to simply sit on the floor for one minute without moving or fidgeting. The application monitors players as they sit still and records any movement. The person recording the least movement at the end of the 'meditation' session is the winner.
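    Given per-frame movement readings like those the Tracking module provides, the winner can be determined by a simple accumulation (an illustrative Python sketch; the function and the sample readings are invented, not our Processing implementation):

```python
def meditation_winner(activity_log):
    """Given per-frame (user, movement) readings over the one-minute session,
    sum each user's movement and return the user who moved least."""
    totals = {}
    for user, movement in activity_log:
        totals[user] = totals.get(user, 0.0) + movement
    return min(totals, key=totals.get)

# Hypothetical readings for three participants; user 1 fidgets least.
log = [(1, 0.02), (2, 0.30), (3, 0.05),
       (1, 0.01), (2, 0.25), (3, 0.04)]
winner = meditation_winner(log)
```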

    Pong

    This scenario is an implementation of Pong's gameplay, where players slide their paddles by moving along the sides of the field. Players move their paddles to bounce a ball back and forth. The aim for each player is to earn more points by bouncing the ball in a direction where the opponent fails to catch it.
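    The core of the gameplay, bouncing the ball off walls and a tracked paddle, can be sketched as follows (an illustrative Python sketch; coordinates follow the [0.0, 1.0] convention of the tracking data, and the paddle size is invented):

```python
def step_ball(x, y, vx, vy, left_paddle_y, paddle_half=0.1):
    """Advance the ball one step; bounce off the top/bottom walls and off the
    left paddle when the ball reaches the left edge near the paddle's position
    (which would come from a player's tracked Y coordinate)."""
    x, y = x + vx, y + vy
    if y <= 0.0 or y >= 1.0:  # top/bottom wall: reflect vertically
        vy = -vy
    if x <= 0.0 and abs(y - left_paddle_y) <= paddle_half:
        vx = -vx              # paddle hit: ball bounces back into the field
    return x, y, vx, vy

# Ball heading left at the paddle's height: it should bounce back.
x, y, vx, vy = step_ball(0.01, 0.5, -0.02, 0.0, left_paddle_y=0.5)
```

    If the ball crosses an edge with no paddle nearby, the opponent scores instead of the bounce; that branch is omitted here for brevity.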


    Figure 4: Children playing Personzilla

    Figure 5: User experience of the Meditation scenario


    3.3 Installation

    The system, in its most basic setup, consists of the following hardware:

    A projector pointing at a mirror (positioned at an angle) which reflects the image down onto the floor. This is placed a few meters up from the floor to allow a large enough projection area; a custom support must be built for this purpose.

    A UVC-compliant webcam pointing at the projected area to pick up the movement of people below.

    IR LEDs positioned next to the webcam, pointing in the same direction (at the projected area)

    A dedicated computer connected to the projector (VGA cable) and webcam (USB cable)

    The software employed in the system consists of:

    A GNU/Linux Operating System (such as Debian or Ubuntu) or MacOSX

    The OpenCV library, used for processing images captured from the webcam

    Tracking module

    Interface module (Scenarios)

    Once the hardware has been set up, a ready-to-use live CD or live USB memory stick can be provided to enable usage on a wide range of computers.

    3.4 Licensing

    Our project was largely made possible by adapting existing Free Software to our needs; both openFrameworks and Processing are Free Software frameworks themselves.

    Therefore, we recognise the advantages that come with Free Software and have decided to license our project under the terms of the GNU General Public License, version 3.0 or higher, hoping our development efforts will be useful to others as well.

    4 Development

    Week 5


    Figure 6: Pong game


    Form groups, brainstorm about project ideas

    Present three project proposals

    Week 6

    Decide on project

    Create prototype(s)
    Present final project with prototypes

    Week 7

    Discuss project possibilities, general approach and software and hardware requirements

    Decide on development schedule

    Prepare pitch presentation

    Present pitch

    Week 8, 9, 10

    Develop software for tracking and interfaces / scenarios

    Create mounting prototype
    Test different webcams and projectors

    Test prototype with soft- and hardware in design labs

    Week 11

    Find testing location in Science Park

    Test setup in Science Park

    Week 12, 13, 14

    Develop software for tracking and interfaces / scenarios

    Test setup in Science Park

    Week 15, 16, 17

    Create website

    Develop software for tracking and interfaces / scenarios

    Test setup in Science Park

    Week 18

    Prepare for exhibition

    Make last changes to soft- and hardware

    Week 19
    Exhibition time!

    5 References

    Living Surface: http://livingsurface.de

    Accessed on: 19 May 2009

    The Famous Grouse Experience: http://www.artcom.de/index.php?lang=en&option=com_acprojects&id=7&Itemid=113&page=6
    Accessed on: 19 May 2009

    Audience: http://www.chrisoshea.org/projects/audience/

    Accessed on: 19 May 2009

    OpenCV: http://opencv.willowgarage.com/wiki/

    Accessed on: 27 May 2009

    OpenFrameworks: http://www.openframeworks.cc/

    Accessed on: 27 May 2009