
Tricorder

Manas Mittal, Joseph Paradiso
Responsive Environments, MIT Media Lab

Tricorder
X-Reality Lunch Talk, 22nd Feb, 2008

Clairvoyance
Clairvoyance = Clair (Clear) + Voyant (Seeing)

The ability to gain information about an object, location or physical event through means other than the known human senses.

[Wikipedia]

We all want to be clairvoyant, not just to know magical things, but even simple things about our surroundings. When I walk back from the lab, I want to know if LaVerde's is open. Before I knock on my friend's dorm door, I want to know if he is awake.

It is simple to deploy simple sensors to answer these questions. A vibratab (vibration sensor) on LaVerde's cash counter can provide enough data to tell that a cashier is there, and from that you can guess that the store is probably open. A similar sensor on my friend's desk can tell if he is awake - if he is up, he tends to sit in his chair and work.

So basically, sensor data is cheap, or soon will be cheap, to come by. How can we easily look at all this data?

I am building a mobile interface for browsing this sensor data. Here was the first version - we called it the Tricorder, after Star Trek.

Point and See
Plug data: light, microphone, vibration, PIR
Compass
Radio

[Lifton et al.]

This was the first version.

Josh Lifton, Michael Lapinski, and I built this Tricorder. It pulls data off the Plug Sensor Network, a network of around 30 power strips, each with sensors, a radio, and a microcontroller built in.

This Tricorder had a compass and a wireless radio, talked directly to these sensors, and displayed their data. The visualization showed static nodes that shook or moved as sensors reported vibration, light, etc.

V2
Zoomable User Interface

Selection and Aggregation

Tagging

I am now building the second version, which has a few key features. Since we are hoping that there will be lots of sensors around, I am building a zoomable user interface: you'll see more detail as you zoom into the data.
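To make the zoomable idea concrete, here is a minimal Python sketch of zoom-dependent detail. The zoom thresholds and the node fields ("pos", "name", "readings") are made up for illustration; they are not the actual Tricorder data model.

```python
# Minimal sketch of zoom-dependent detail in a zoomable UI.
# Thresholds and node fields are illustrative assumptions.

def detail_for_node(node, zoom):
    """Return progressively more detail about a sensor node as the user zooms in."""
    if zoom < 0.5:
        # Zoomed far out: only show that a node exists at this location.
        return {"pos": node["pos"]}
    if zoom < 2.0:
        # Mid zoom: add the node's name and which channels it carries.
        return {"pos": node["pos"], "name": node["name"],
                "channels": sorted(node["readings"])}
    # Zoomed in close: show every individual sensor reading.
    return {"pos": node["pos"], "name": node["name"], "readings": node["readings"]}


# Example: the same node rendered at three zoom levels.
node = {"pos": (3, 4), "name": "plug-17", "readings": {"light": 0.6, "vibration": 0.1}}
for z in (0.25, 1.0, 4.0):
    print(z, detail_for_node(node, z))
```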

The system will also allow selection and aggregation of sensor data.

Finally, we want to allow users to come up with higher-level primitives inferred from sensor data. For example, if temperature is > 70 and vibration is present for a sensor node on a desk, a person is probably present.
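To make this concrete, such a primitive could be written as a simple predicate over raw readings. This is only a sketch with an assumed threshold and reading names, not the system's actual rule language.

```python
# Sketch of a user-defined, higher-level primitive inferred from raw sensor
# data. The 70-degree threshold and reading names are illustrative assumptions.

def person_probably_present(readings):
    """True if a desk node's readings suggest someone is sitting there."""
    return readings.get("temperature_f", 0) > 70 and bool(readings.get("vibration"))


# Example: a desk node reporting 72 degrees and recent vibration.
print(person_probably_present({"temperature_f": 72, "vibration": True}))   # True
print(person_probably_present({"temperature_f": 65, "vibration": True}))   # False
```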

Lastly, there is the system itself.

I'll give you a quick selection and aggregation example before talking about the hardware.

Selection & Aggregation
Selection: Location → Data Values

[Figure: map with color-coded legend, Min to Max]

So for selection and aggregation, I have some example interactions.

There are two key divisions here: location of interest and sensor values.

For example, the user could be interested in the aggregate sensor values corresponding to a region. She scribbles on the map, selects the sensor she wants values for, and the system fills the area with color-coded aggregate information.
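As a concrete sketch of that interaction, here is some Python that keeps only the nodes whose position falls inside the scribbled region and averages one sensor channel over them. The node format, the polygon test, and the choice of the mean as the aggregate are all assumptions for illustration, not the actual implementation.

```python
# Sketch of region selection and aggregation: keep the nodes whose position
# falls inside the selected region, then aggregate one sensor channel.
# Node format and aggregation choice are illustrative assumptions.

from statistics import mean


def point_in_polygon(pt, polygon):
    """Ray-casting test for whether pt = (x, y) lies inside the polygon."""
    x, y = pt
    inside = False
    for i in range(len(polygon)):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % len(polygon)]
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside


def aggregate_region(nodes, region, channel):
    """Average the chosen sensor channel over all nodes inside the region."""
    values = [n["readings"][channel] for n in nodes
              if channel in n["readings"] and point_in_polygon(n["pos"], region)]
    return mean(values) if values else None
```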

Selection & Aggregation
Selection: Data Values → Location

E.g., selection changes as the slider is moved.

[Figure: selector slider ranging from Min to Max]

In another example, consider when the user wants to select regions with temperature between 60 and 80 degrees. As the user moves the temperature slider, the selected areas fill up on the corresponding map.
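The reverse direction can be sketched the same way: as the slider moves, re-filter the nodes whose reading falls in the selected range and hand the matching positions to the map for highlighting. The node format and channel name below are again illustrative assumptions.

```python
# Sketch of data-values-to-location selection: filter nodes by a value range
# picked on the slider. Node format and channel names are assumptions.

def nodes_in_range(nodes, channel, low, high):
    """Return positions of nodes whose reading on `channel` lies in [low, high]."""
    return [n["pos"] for n in nodes
            if channel in n["readings"] and low <= n["readings"][channel] <= high]


# Example: highlight nodes with temperature between 60 and 80 degrees.
nodes = [{"pos": (1, 2), "readings": {"temperature_f": 72}},
         {"pos": (5, 7), "readings": {"temperature_f": 55}}]
print(nodes_in_range(nodes, "temperature_f", 60, 80))  # [(1, 2)]
```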

N810
I/O: Touchscreen, Keyboard, Camera, GPS, Memory Slot
Networking: Wi-Fi, Bluetooth, USB 2.0 (Peer Mode)
Software: Linux (Maemo), Python, Java, SWT for UI design, Flash
Processor: 400 MHz

Compass
Radio
TI MSP430

I'll talk about the platform. This is a Nokia N810, Nokia's newest device, and an ideal development platform for most mobile applications. It runs Linux and also has a JRE port. You can do graphics with SWT or with Flash. It also has a camera and does Wi-Fi. Also great is that it now has peer-mode USB, so you can attach other devices and it will behave as a master (except you need to power the devices).

To this, I've added a compass and a radio (TI MSP430).
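One plausible way to read such an add-on board from the N810 is to expose it as a USB-serial port and poll it. The device path, baud rate, and the "name=value" line format below are assumptions made for illustration; the actual wire protocol isn't specified here.

```python
# Hedged sketch: polling the compass/radio add-on over a USB-serial link
# using pyserial. Device path, baud rate, and line format are assumptions.

import serial  # pyserial


def read_readings(port="/dev/ttyUSB0", baud=115200):
    """Read one line from the add-on board and parse it into a dict of floats."""
    with serial.Serial(port, baud, timeout=1) as link:
        line = link.readline().decode("ascii", errors="ignore").strip()
    readings = {}
    for field in line.split(","):
        if "=" in field:
            name, value = field.split("=", 1)
            readings[name.strip()] = float(value)
    return readings
```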

Why Handheld?
Real-time sensor data. The Web is mostly non-real-time.
Hard to extract meaning out of sensor data.

Why handheld? Tags are shared and corrected, and meaningful tags emerge.