Integrating People, Places, and Things into a Desktop Search Engine

By Kyle Rector, Senior, EECS, OSU



Transcript of Integrating People, Places, and Things into a Desktop Search Engine

Page 1: Integrating People, Places, and Things into a Desktop Search Engine

By Kyle Rector, Senior, EECS, OSU

Page 2

Agenda
Background
My Approach
Demonstration
How it works
The Survey
Plans for User Evaluation
Future Plans

Page 3

What is the Issue?
The amount of email, web browsing history, and files on the computer is always increasing.

Solutions:
Filing systems
Desktop search
Web search
Email filtering

However, people can misfile things, and search may not be useful if you don’t know what to query.

Page 4

Related Work
Vannevar Bush’s concept of the memex [1]:

“…a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility.”

Page 5

Related Work
Three publications from EuroPARC have investigated the logging of user activities:

PEPYS [2]: used an active badge system to log location
Video Diary [3]: the two major cues for remembering events were people and objects
Activity-based Information Retrieval [4]: “…systems which aim to support human memory retrieval may require special attention to the user interface; otherwise the cognitive load imposed by interaction can outweigh the reduction in load on the user’s memory.”

Page 6

Related Work
Memory landmarks: events that stick out in one’s mind.

Horvitz et al. [5] designed a Bayesian model to predict important memory landmarks from their study.

Important variables: subject, location, attendees, and whether the meeting is recurrent.

Page 7

Related Work
Episodic Memory [6]: memory can be organized into different episodes.

Ringel et al. [7] also created a timeline display of files, emails, and web history based on user events.

Page 8

Related Work
Stuff I’ve Seen [8]: desktop search which indexes email, files, web, and calendar.

Initial findings from their experiment:
Time and people are important retrieval cues.
48% of queries involved a filter, the most common being file type.
25% of queries involved people.
Sorting by date is a good way for people to find items.

Page 9

Related Work
Phlat [9]: desktop search using contextual cues.

Findings from a long-term study:
47% of queries involved a filter.
People and file type were the most common filters.
17% of queries used only filters.

Phlat had an issue with the aliasing of names, which the RFID Ecosystem would fix.

Page 10

Agenda
Background
My Approach
Demonstration
How it works
The Survey
Plans for User Evaluation
Future Plans

Page 11

My Approach
Google Desktop Gadget interface
Event filters: people, objects, location, and time
File filters: query string, file type
Uses Google Desktop Search
Displays results in a timeline view

My Gadget

Page 12
Page 13

Agenda
Background
My Approach
Demonstration
How it works
The Survey
Plans for User Evaluation
Future Plans

Page 14

System Architecture

[System diagram: User Input, Google Desktop Gadget, RFID Ecosystem Database, Google Desktop Search, Browse Timeline, Results]

Page 15

Step 1: Configure the Database


Page 16

Step 1: Configure the Database
Gadget: communicates with the database to get events.
User: specifies any combination of events they would like to use.
Gadget: is set up to do searches, and has a dropdown list of event choices.
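The slides don't include the gadget's lookup code; as a minimal sketch (assuming a hypothetical `events` table with person, location, object, and start/end time columns, with an in-memory SQLite database standing in for the RFID Ecosystem database), the "any combination of events" lookup might look like:

```python
import sqlite3

# Hypothetical stand-in for the RFID Ecosystem database:
# an events table recording who/what/where and when.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        id INTEGER PRIMARY KEY,
        person TEXT, location TEXT, object TEXT,
        start_time TEXT, end_time TEXT
    )
""")
conn.executemany(
    "INSERT INTO events (person, location, object, start_time, end_time) "
    "VALUES (?, ?, ?, ?, ?)",
    [
        ("Magda", "CSE 403", "projector", "2008-07-14 16:30", "2008-07-14 17:00"),
        ("Magda", "CSE 403", None,        "2008-07-21 16:30", "2008-07-21 17:00"),
        ("Evan",  "CSE 291", "laptop",    "2008-07-15 10:00", "2008-07-15 11:00"),
    ],
)

def get_events(person=None, location=None, obj=None):
    """Return (start, end) times of events matching any combination of filters."""
    clauses, params = [], []
    if person:
        clauses.append("person = ?"); params.append(person)
    if location:
        clauses.append("location = ?"); params.append(location)
    if obj:
        clauses.append("object = ?"); params.append(obj)
    where = " AND ".join(clauses) or "1=1"  # no filters: return all events
    rows = conn.execute(
        f"SELECT start_time, end_time FROM events WHERE {where}", params)
    return rows.fetchall()

print(get_events(person="Magda", location="CSE 403"))
```

The returned (start, end) pairs are what populates the gadget's dropdown of event choices.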

Page 17

Step 2: Filter Your Query


Page 18

Step 2: Filter Your Query

Desktop Search filters:
Event: before, during, or after
File type
Text query

Event filters:
People
Locations
Objects
Date
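A rough sketch of how these two filter groups could be carried together. All names here are illustrative, not the gadget's actual code, and the `filetype:` operator is shown only as an example of a desktop-search filter syntax:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SearchFilters:
    # Event filters (resolved against the RFID Ecosystem database)
    people: list = field(default_factory=list)
    locations: list = field(default_factory=list)
    objects: list = field(default_factory=list)
    date: Optional[str] = None
    # Desktop-search filters
    when: str = "during"          # "before", "during", or "after" the event
    file_type: Optional[str] = None
    text_query: str = ""

    def desktop_query(self) -> str:
        """Render the file filters as a desktop-search query string."""
        parts = [self.text_query] if self.text_query else []
        if self.file_type:
            parts.append(f"filetype:{self.file_type}")
        return " ".join(parts)

f = SearchFilters(people=["Magda"], date="2008-07-14",
                  file_type="ppt", text_query="status report")
print(f.desktop_query())  # status report filetype:ppt
```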

Page 19

Step 2: Filter Your Query
User: specifies the filters in the gadget.
Gadget: communicates with the database to get the possible event times.
User: can choose one or all event times, and can decide whether to search before, during, or after one or all events.
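The before/during/after choice can be sketched as a mapping from an event interval to a search window. The 24-hour margin for "before" and "after" is an assumption; the slides do not say how far from the event the gadget searches:

```python
from datetime import datetime, timedelta

def search_window(event_start, event_end, when, margin_hours=24):
    """Map an event interval to a search time window.

    'during' keeps the event interval itself; 'before' and 'after'
    take a window of `margin_hours` adjacent to it (the margin size
    is an assumption, not taken from the presentation).
    """
    margin = timedelta(hours=margin_hours)
    if when == "during":
        return event_start, event_end
    if when == "before":
        return event_start - margin, event_start
    if when == "after":
        return event_end, event_end + margin
    raise ValueError(f"unknown mode: {when}")

# The meeting example from later in the talk: July 14th, 4:30-5:00 pm.
start = datetime(2008, 7, 14, 16, 30)
end = datetime(2008, 7, 14, 17, 0)
print(search_window(start, end, "before"))
```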

Page 20

Step 3: Search Your Desktop


Page 21

Step 3: Search Your Desktop
Gadget:
Accesses the Google Desktop URL by using the Registry Editor
Parses the Google Desktop HTML to get to the Browse Timeline page
Parses the Browse Timeline HTML to find the correct date of the event
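The real Browse Timeline markup is not shown in the slides; assuming a simplified page where each result link carries its modification time in a `title` attribute (a hypothetical structure, for illustration only), the parsing step might be sketched as:

```python
from html.parser import HTMLParser
from datetime import datetime

# Toy stand-in for a Browse Timeline page. The actual Google Desktop
# markup is more involved; only the shape of the parse is shown here.
PAGE = """
<html><body>
<a href="file:///C:/notes/meeting.doc" title="2008-07-14 16:45">meeting.doc</a>
<a href="file:///C:/notes/budget.xls" title="2008-07-14 09:10">budget.xls</a>
</body></html>
"""

class TimelineParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.entries = []   # (modification time, href) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            if "href" in a and "title" in a:
                when = datetime.strptime(a["title"], "%Y-%m-%d %H:%M")
                self.entries.append((when, a["href"]))

parser = TimelineParser()
parser.feed(PAGE)
print(parser.entries)
```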

Page 22

Step 3: Search Your Desktop
Browse Timeline: a history of file modification times.

Page 23

Step 3: Search Your Desktop
Gadget:
Parses through the Browse Timeline HTML to filter files.
E.g., if you wanted files that you modified when you met with Magda on July 14th from 4:30 - 5:00 pm, then files between those times will be selected.
Displays the selected results in an HTML file saved to the Temp directory.
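A sketch of this filter-and-display step, with hypothetical file entries standing in for the parsed Browse Timeline data:

```python
import tempfile
from datetime import datetime
from pathlib import Path

# Timeline entries as (modification time, path) pairs; in the gadget
# these would come from parsing the Browse Timeline pages.
entries = [
    (datetime(2008, 7, 14, 16, 45), "C:/notes/meeting_notes.doc"),
    (datetime(2008, 7, 14, 9, 10),  "C:/notes/budget.xls"),
    (datetime(2008, 7, 14, 16, 55), "C:/slides/status.ppt"),
]

def select_in_window(entries, start, end):
    """Keep only files modified between the event's start and end times."""
    return [(t, p) for t, p in entries if start <= t <= end]

def write_results_html(selected, directory=None):
    """Write the selected files to a simple HTML results page in Temp."""
    directory = directory or tempfile.gettempdir()
    rows = "\n".join(
        f"<li>{t:%Y-%m-%d %H:%M} &ndash; {p}</li>" for t, p in selected)
    out = Path(directory) / "gadget_results.html"
    out.write_text(f"<html><body><ul>\n{rows}\n</ul></body></html>")
    return out

# The meeting with Magda, July 14th, 4:30-5:00 pm:
window = (datetime(2008, 7, 14, 16, 30), datetime(2008, 7, 14, 17, 0))
selected = select_in_window(entries, *window)
path = write_results_html(selected)
print(len(selected), path)
```

Only the two files modified inside the meeting window end up on the results page; the budget spreadsheet from that morning is filtered out.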

Page 24

Step 4: The Results


Page 25

Step 4: The Results
Example: all file types while meeting with Magda

Page 26

Agenda
Background
My Approach
Demonstration
How it works
The Survey
Plans for User Evaluation
Future Plans

Page 27

The Survey
Before the survey, I had a simple prototype program.

Old GUI

Old Results Page

Page 28

Survey on Mobile Computer Usage within CSE

Page 29

The Survey
Sent the survey to faculty, staff, graduate, and undergraduate students.
9 questions, of which 2 were demographic.
33 people responded to the survey.

Changes made based on the survey:
Object feature
Before, During, or After meeting option

Page 30

Agenda
Background
My Approach
Demonstration
How it works
The Survey
Plans for User Evaluation
Future Plans

Page 31

Plans for User Evaluation
Questions I want to answer:

Do contextual parameters (people, places, things) with relation to work events save time when doing a desktop search?

Do the size and frequency of text queries decrease when doing a desktop search?

Are the Google Desktop Gadget GUI and the results page easy and functional to use?

Page 32

Plans for User Evaluation
Each participant will have six tasks:
Three with Google Desktop
Three with my gadget

Develop user scenarios:
PowerPoint storyboard with pictures and speech
Will only be shown for a limited amount of time

Users complete search tasks:
Participants should remember and use contextual information to make searching easier

Page 33

Plans for User Evaluation
Do contextual parameters (people, places, things) with relation to work events save time when doing a desktop search?
Time how long a participant takes from the end of the story session to successfully completing a task.
Compare Google Desktop Search times to my gadget’s desktop search times.

Page 34

Plans for User Evaluation
Do the size and frequency of text queries decrease when doing a desktop search?
Review what types of filters subjects are using.
Count how many times a subject does not use text in their query.
If they use text, count how many words are in the query.
Can compare results to previous work (Phlat, Stuff I’ve Seen).
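These counts are straightforward to compute from a logged list of (query text, filters) pairs; the log format here is hypothetical, sketched only to show the three measures:

```python
# Hypothetical query log: (text of the query, filters used).
log = [
    ("status report", ["person"]),
    ("", ["person", "location"]),
    ("budget", ["filetype"]),
    ("", ["object"]),
]

# How often a subject searches without any text at all.
no_text = sum(1 for text, _ in log if not text)

# Word counts for the queries that do use text.
word_counts = [len(text.split()) for text, _ in log if text]

# Which filter types subjects rely on.
filter_usage = {}
for _, filters in log:
    for f in filters:
        filter_usage[f] = filter_usage.get(f, 0) + 1

print(no_text, word_counts, filter_usage)
```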

Page 35

Plans for User Evaluation
Are the Google Desktop Gadget GUI and the results page easy and functional to use?
Participants will answer an evaluation survey after the tasks are done.
Subjects will rate features and the output page using a Likert scale.

Page 36

Agenda
Background
My Approach
Demonstration
How it works
The Survey
Plans for User Evaluation
Future Plans

Page 37

Any Questions?

Page 38

Sources
1. Bush, V. As we may think. Atlantic Monthly 176, 101-108 (1945).
2. Newman, W., Eldridge, M., Lamming, M. PEPYS: Generating autobiographies by automatic tracking. ECSCW, Amsterdam, The Netherlands, 175-188 (1991).
3. Eldridge, M., Lamming, M., Flynn, M. Does a video diary help recall? People and Computers VII, Cambridge University Press, Cambridge, 257-269 (1992).
4. Lamming, M., Newman, W. Activity-based information retrieval: technology in support of personal memory.
5. Horvitz, E., Dumais, S., Koch, P. Learning predictive models of memory landmarks. In Proceedings of CogSci 2004: 26th Annual Meeting of the Cognitive Science Society, Chicago, USA, August 2004 (2004).
6. Tulving, E. Elements of episodic memory. Oxford University Press (2004).
7. Ringel, M., Cutrell, E., Dumais, S., Horvitz, E. Milestones in time: the value of landmarks in retrieving information from personal stores. Proceedings of Interact (2003).
8. Dumais, S., Cutrell, E., Cadiz, J., Jancke, G., Sarin, R., Robbins, D. Stuff I’ve seen: a system for personal information retrieval and re-use. SIGIR ’03, July 28 - August 1, 2003, Toronto, Canada (2003).
9. Cutrell, E., Robbins, D., Dumais, S., Sarin, R. Fast, flexible filtering with Phlat: personal search and organization made easy. Proceedings of CHI 2006, April 22-27, 2006, Montreal, Quebec, Canada (2006).