Path finding Framework using HRR


Path finding Framework using HRR
Algorithm and associated equations

Surabhi Gupta ’11
Advisor: Prof. Audrey St. John

Roadmap

Circular Convolution
Associative Memory
Path finding algorithm

Hierarchical environment

Locations are hierarchically clustered

[Diagram: locations a-r are grouped into scale-1 clusters X1-X6; X1, X2, X3 group into Y1 and X4, X5, X6 into Y2; Y1 and Y2 group under the root Z.]

Tree representation

The scale of a location corresponds to its height in the tree structure.

Any node of the tree can be queried directly, without pointer following.

Maximum number of goal searches = height of the tree

Circular Convolution
Holographic Reduced Representations

Circular Convolution (HRR)
Developed by Tony Plate in 1991
Binding (encoding) operation – convolution
Decoding operation – involution followed by convolution

Basic Operations

1) Binding
2) Merge

Binding - encoding

Binding two vectors A and B by circular convolution gives C = A ⊛ B, with C ≁ A and C ≁ B: the bound vector resembles neither of its constituents.

Circular Convolution (⊛): c_j = Σ_{k=0}^{n−1} a_k b_{(j−k) mod n}

Elements are summed along the trans-diagonals (Plate, 1991).
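A minimal NumPy sketch of binding by circular convolution (the 2048-element N(0, 1/2048) vectors follow the "Scales" slide later in the deck; the function and variable names are illustrative, not from the original implementation):

import numpy as np

def circular_convolution(a, b):
    """Bind two vectors: c_j = sum_k a_k * b_{(j-k) mod n}, computed via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

n = 2048
rng = np.random.default_rng(0)
a = rng.normal(0, np.sqrt(1.0 / n), n)   # random location vector
b = rng.normal(0, np.sqrt(1.0 / n), n)

c = circular_convolution(a, b)           # C = A (*) B

# The bound vector is dissimilar to both constituents (C ≁ A, C ≁ B).
cos = lambda x, y: np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
print(cos(c, a), cos(c, b))              # both close to 0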

Involution

Involution is the approximate inverse: the involution A* of a vector A reverses the order of its elements (a*_j = a_{(−j) mod n}).

Decoding

Convolving the involution of one constituent with the binding recovers a noisy copy of the other: A* ⊛ C ≈ B.
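A corresponding sketch of decoding via involution followed by convolution (same illustrative helpers and assumptions as above):

import numpy as np

def circular_convolution(a, b):
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    """Approximate inverse: a*_j = a_{(-j) mod n}, i.e. keep a_0 and reverse the rest."""
    return np.concatenate(([a[0]], a[:0:-1]))

n = 2048
rng = np.random.default_rng(0)
a = rng.normal(0, np.sqrt(1.0 / n), n)
b = rng.normal(0, np.sqrt(1.0 / n), n)
c = circular_convolution(a, b)

# Decoding: involution followed by convolution leaves a noisy copy of b.
b_noisy = circular_convolution(involution(a), c)
cos = lambda x, y: np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
print(cos(b_noisy, b))   # well above chance similarity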

Basic Operations

1) Binding
2) Merge

Merge

Vectors are combined by superposition (addition); similarity between vectors is measured with a normalized dot product.
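A small sketch of merge and the normalized dot product. It assumes merge is plain superposition of vectors, which the slide text does not state explicitly; the helper names are illustrative:

import numpy as np

def merge(*vectors):
    """Superpose vectors and normalize the result."""
    s = np.sum(vectors, axis=0)
    return s / np.linalg.norm(s)

def similarity(x, y):
    """Normalized dot product (cosine similarity)."""
    return np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

n = 2048
rng = np.random.default_rng(1)
a, b, c = (rng.normal(0, np.sqrt(1.0 / n), n) for _ in range(3))

m = merge(a, b)
print(similarity(m, a), similarity(m, b))  # both clearly positive
print(similarity(m, c))                    # near zero: c was not merged in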

Properties

Commutativity: A ⊛ B = B ⊛ A
Distributivity: A ⊛ (B + C) = A ⊛ B + A ⊛ C (shown for sufficiently long vectors)
Associativity: A ⊛ (B ⊛ C) = (A ⊛ B) ⊛ C
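These properties can be checked numerically with the same illustrative circular_convolution helper:

import numpy as np

def circular_convolution(a, b):
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

n = 2048
rng = np.random.default_rng(2)
A, B, C = (rng.normal(0, np.sqrt(1.0 / n), n) for _ in range(3))

cc = circular_convolution
print(np.allclose(cc(A, B), cc(B, A)))                  # commutativity
print(np.allclose(cc(A, B + C), cc(A, B) + cc(A, C)))   # distributivity
print(np.allclose(cc(A, cc(B, C)), cc(cc(A, B), C)))    # associativity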

Associative Memory
Recall and retrieval of locations

Framework

[Diagram repeated: the hierarchical environment with locations a-r, clusters X1-X6, Y1-Y2, and root Z.]

Assumptions

Perfect tree – each leaf has the same depth

Locations within a scale are fully connected, e.g. a, b and c; X4, X5 and X6; etc.

Each constituent has the same contribution to the scale location (no bias).

[Diagram: the tree with locations a and p highlighted.]

Associative Memory

Consists of a list of locations. Given an input location, it returns the most similar location from the list. [Diagram: Input → Memory → Output]

What do we store?

Scales

Locations a-r are each 2048-element vectors with entries drawn from a normal distribution N(0, 1/2048).

Higher scales - Recursive auto-convolution of constituents
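A sketch of the stored vectors and the associative (cleanup) memory described above, under the stated assumptions; the class and variable names are illustrative, not from the original implementation:

import numpy as np

n = 2048
rng = np.random.default_rng(3)

# Scale-0 locations a..r: random 2048-element vectors ~ N(0, 1/2048).
names = [chr(ord('a') + i) for i in range(18)]
locations = {name: rng.normal(0, np.sqrt(1.0 / n), n) for name in names}

class AssociativeMemory:
    """Stores labeled vectors; returns the stored item most similar to a query."""
    def __init__(self, items):
        self.items = dict(items)

    def recall(self, query):
        def sim(x, y):
            return np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
        return max(self.items, key=lambda k: sim(self.items[k], query))

memory = AssociativeMemory(locations)
noisy_a = locations['a'] + 0.5 * rng.normal(0, np.sqrt(1.0 / n), n)
print(memory.recall(noisy_a))   # 'a': the memory cleans up the noisy query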

Constructing scales

[Diagram: constituents a, b, c under scale location X1.]

X1 = a ⊛ a + b ⊛ b + c ⊛ c

Each constituent is bound to itself by auto-convolution and the results are summed to form the scale location X1; higher scales are built recursively in the same way from their constituents.
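A sketch of this construction, assuming (as read from the slide) that a scale vector is the sum of its constituents' auto-convolutions; helper names are illustrative:

import numpy as np

def circular_convolution(a, b):
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def make_scale(constituents):
    """Scale vector = sum of each constituent bound to itself (auto-convolution)."""
    return np.sum([circular_convolution(v, v) for v in constituents], axis=0)

n = 2048
rng = np.random.default_rng(4)
a, b, c = (rng.normal(0, np.sqrt(1.0 / n), n) for _ in range(3))

X1 = make_scale([a, b, c])
# Higher scales are built recursively the same way, e.g. Y1 = make_scale([X1, X2, X3]).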

Across Scale sequences

Between each location and the corresponding locations at the higher scales that contain it, e.g. for location a under X1:

S_a = a + a ⊛ X1 + … (continuing through the higher scales above X1)

Path finding algorithm
Quite different from standard graph search algorithms…

Path finding algorithm

[Flowchart]
Start → is Start == Goal? If so, stop.
Otherwise, go to a higher scale and search for the goal.
If the goal is found at this scale: retrieve the scales corresponding to the goal and move towards the goal, then check again.
If the goal is not found at this scale: go up another scale and repeat the search.
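Read as a loop, the flowchart might be sketched as follows in Python. Every helper here (goal_found_at, next_higher_scale, retrieve_goal_scales, move_towards, the environment object) is a hypothetical placeholder for the memory operations described on the next slides, not the original implementation:

def find_path(start, goal, environment):
    """Climb scales until the goal is located, then move towards it."""
    current = start
    while current != goal:
        scale = current
        # Go to higher scales until the goal is found at some scale.
        while not environment.goal_found_at(scale, goal):
            scale = environment.next_higher_scale(scale)
        # Retrieve the scales corresponding to the goal, then move towards it.
        goal_scales = environment.retrieve_goal_scales(goal, scale)
        current = environment.move_towards(current, goal_scales)
    return current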

Retrieving the next scale
1) If at scale-0, query the AS memory to retrieve the AS (across-scale) sequence; otherwise use the sequence retrieved in a previous step.
2) Query the L memory with the (noisy) next scale decoded from that sequence.
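A hedged sketch of retrieving the next scale, assuming the across-scale sequence has the form shown earlier (location + location ⊛ next scale) and that the L memory is a cleanup memory over scale vectors; all names are illustrative:

import numpy as np

def circular_convolution(a, b):
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    return np.concatenate(([a[0]], a[:0:-1]))

def retrieve_next_scale(current, as_sequence, scale_vectors):
    """Decode the next-higher scale from the across-scale sequence and clean it up."""
    # Decoding: involution of the current location convolved with its AS sequence
    # leaves a noisy copy of the next scale (plus cross terms that act as noise).
    noisy_next = circular_convolution(involution(current), as_sequence)
    # Cleanup: return the stored scale vector most similar to the noisy decoding
    # (this plays the role of querying the L memory).
    sims = [np.dot(noisy_next, s) / (np.linalg.norm(noisy_next) * np.linalg.norm(s))
            for s in scale_vectors]
    return scale_vectors[int(np.argmax(sims))]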


Locating the goal

For example: the current location's scale and goal c; the goal is contained at this scale.

Locating the goal

Goal: p is not contained in X1.

[Diagram: the tree with current location a (under X1) and goal p; p is not under X1.]
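A sketch of the containment check this example relies on, again assuming scales are sums of constituent auto-convolutions (so the goal's auto-convolution is similar to any scale that contains the goal); the threshold and names are illustrative:

import numpy as np

def circular_convolution(a, b):
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def goal_found_at(scale_vector, goal, threshold=0.3):
    """Check whether `goal` is one of the constituents encoded in `scale_vector`."""
    probe = circular_convolution(goal, goal)   # the term the goal would contribute
    sim = np.dot(probe, scale_vector) / (np.linalg.norm(probe) * np.linalg.norm(scale_vector))
    return sim > threshold

n = 2048
rng = np.random.default_rng(5)
a, b, c, p = (rng.normal(0, np.sqrt(1.0 / n), n) for _ in range(4))
X1 = sum(circular_convolution(v, v) for v in (a, b, c))

print(goal_found_at(X1, c))   # True: c is a constituent of X1
print(goal_found_at(X1, p))   # False: p is not in X1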


Goal not found at Y1

[Diagram: the tree; the goal p is not under Y1 either.]

Goal found at Z!

[Diagram: the tree; the goal p is found under the root Z.]


Decoding scales

The same decoding operation (involution followed by convolution) is used to retrieve the scales corresponding to the goal.

Using the retrieved scales, the agent then moves towards the goal.


Moving to the Goal

[Diagram repeated: the hierarchical environment with locations a-r, clusters X1-X6, Y1-Y2, and root Z.]

To work on

Relax the assumption of a perfect tree.

Relax the assumption of a fully connected graph within a scale location.

References

Kanerva, P. (2002). Distributed Representations. Encyclopedia of Cognitive Science.

Plate, T. A. (1991). Holographic reduced representations: Convolution algebra for compositional distributed representations. In J. Mylopoulos & R. Reiter (Eds.), Proceedings of the 12th International Joint Conference on Artificial Intelligence (pp. 30-35). San Mateo, CA: Morgan Kaufmann.