
Post on 12-Jan-2016



Establishing the Equivalence between Recurrent Neural Networks and Turing Machines.

Ritesh Kumar Sinha(02d05005)

Kumar Gaurav Bijay(02005013)

“No computer has ever been designed that is ever aware of what it’s doing; but most of the time, we aren’t either.”

– Marvin Minsky.

Introduction

Plan : establish the equivalence between recurrent neural networks and Turing machines
History of neurons
Definitions
Constructive proof of the equivalence

Approach : Conceptual Understanding

Motivation

Understanding the learning patterns of the human brain – the concept of a neuron

Is the Turing Machine the ultimate computing machine?

How powerful are neural networks – DFA, PDA, Turing Machine, or still higher?

Brain

The most complicated human organ: it senses, perceives, feels, thinks, believes, remembers, utters
The body's information processing centre

Neurons : information processing units

MCP Neuron

McCulloch and Pitts gave a model of a neuron in 1943

But it's only a highly simplified model of the real neuron:
Positive weights (activators)
Negative weights (inhibitors)

Artificial Neural Networks (ANN)

Interconnected units : model neurons

Modifiable weights (model synapses)

Types of ANN

Feed-forward Networks
Signals travel in one direction only
Good at computing static functions

Neuro-Fuzzy Networks
Combine the advantages of both fuzzy reasoning and neural networks
Good at modeling real-life data

Recurrent Networks

Recurrent Neural Networks

Activation Function: f(x) = x if x > 0, and f(x) = 0 otherwise
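In modern terms this activation is the rectifier (ReLU); a one-line sketch in Python:

```python
# The activation used in the slides: identity for positive
# inputs, zero otherwise (the rectifier, "ReLU").

def f(x):
    return x if x > 0 else 0

print([f(v) for v in (-2, 0, 3.5)])  # [0, 0, 3.5]
```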

Turing Machine and Turing Complete Languages

Turing Machine: As powerful as any other computer

Turing Complete Language: a programming language that can compute any function that a Turing Machine can compute.

A language L

Four basic operations:
No operation : V ← V
Increment : V ← V + 1
Decrement : V ← max(0, V − 1)
Conditional branch : IF V != 0 GOTO j

(V is any variable holding a non-negative integer value, and j stands for a line number)

L is Turing Complete
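The four operations are simple enough to interpret directly; a minimal sketch in Python (the tuple encoding and the names `run`, `nop`, `inc`, `dec`, `bnz` are ours, not from the slides):

```python
# A minimal interpreter for the four primitives of L.
# A program is a list of tuples; branch targets are list indices.

def run(program, env):
    pc = 0
    while pc < len(program):
        op, var, *rest = program[pc]
        if op == "nop":            # V <- V
            pass
        elif op == "inc":          # V <- V + 1
            env[var] += 1
        elif op == "dec":          # V <- max(0, V - 1)
            env[var] = max(0, env[var] - 1)
        elif op == "bnz":          # IF V != 0 GOTO j
            if env[var] != 0:
                pc = rest[0]
                continue
        pc += 1
    return env

# Y = X for X = 3 (the deck's first example, encoded as indices)
prog = [("dec", "X"), ("inc", "Y"), ("bnz", "X", 0)]
print(run(prog, {"X": 3, "Y": 0}))  # {'X': 0, 'Y': 3}
```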

Turing Machine can encode Neural Network

Turing machine can compute any computable function, by definition

The activation function, in our case, is a simple non-linear function

Turing machine can therefore simulate our recurrent neural net

Intuitively: can't we just write code to simulate our neural net?

An example

Function that computes Y = X:

L1: X ← X − 1
    Y ← Y + 1
    if X != 0 goto L1

(Note: when X = 0 the loop body still runs once, leaving Y = 1; the next example fixes this case.)

An example

Function that computes Y = X (handling X = 0 correctly):

    if X != 0 goto L1
    X ← X + 1
    if X != 0 goto L2
L1: X ← X − 1
    Y ← Y + 1
    if X != 0 goto L1
L2: Y ← Y
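The branching can be checked by transliterating the program with an explicit program counter (the row indices 0–6 are our numbering, not the slides'):

```python
# Faithful transliteration of the Y = X program that also
# handles X = 0; pc walks the seven rows, 7 means "halted".

def copy(x):
    X, Y, pc = x, 0, 0
    while pc <= 6:
        if pc == 0:
            pc = 3 if X != 0 else 1      # if X != 0 goto L1
        elif pc == 1:
            X += 1; pc = 2               # X <- X + 1
        elif pc == 2:
            pc = 7 if X != 0 else 3      # if X != 0 goto L2
        elif pc == 3:
            X = max(0, X - 1); pc = 4    # L1: X <- X - 1
        elif pc == 4:
            Y += 1; pc = 5               # Y <- Y + 1
        elif pc == 5:
            pc = 3 if X != 0 else 6      # if X != 0 goto L1
        elif pc == 6:
            pc = 7                       # L2: Y <- Y (halt)
    return Y

print(copy(0), copy(3))  # 0 3
```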

L is Turing Complete : Conceptual Understanding

Idea: Don't think of C++, think of 8085 ;)

Subtraction: Y = X1 − X2
Yes: copy X1 into Y, then decrement both Y and X2 until X2 = 0 (the floor at 0 makes the result max(0, X1 − X2))

Multiplication, division:
Yes, think of the various algorithms you studied in the hardware class :)
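The subtraction macro can be written out using only the four primitives; a sketch in Python, with `while` loops standing in for the decrement-and-branch pattern (saturating at 0, as L's decrement does):

```python
# Y = X1 - X2 built from increment / decrement / branch only.

def sub(x1, x2):
    y = 0
    while x1 != 0:               # copy X1 into Y
        x1 = max(0, x1 - 1)
        y += 1
    while x2 != 0:               # remove X2 units from Y
        x2 = max(0, x2 - 1)
        y = max(0, y - 1)        # saturating, so 2 - 5 gives 0
    return y

print(sub(7, 3))  # 4
```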

L is Turing Complete : Conceptual Understanding

If: if X = 0 goto L

Yes:

    if X != 0 goto L2
    Z ← Z + 1   (Z is a dummy variable, so Z != 0 always holds here)
    if Z != 0 goto L
L2: ...

Constructing a Perceptron Network for Language L

For each variable V : an entry node NV

For each program row i : an instruction node Ni

For a conditional branch instruction on row i : two transition nodes Ni’ and Ni’’

Constructions

Variable V :

NV → NV (self-loop)

No Operation:

Ni → Ni+1

Constructions Continued

Increment Operation :

Ni → Ni+1
Ni → NV

Decrement Operation:

Ni → Ni+1
Ni → NV (negative weight)

Constructions Continued

Conditional Branch Operation :

Ni → Ni’
Ni → Ni’’
NV → Ni’’
Ni’’ → Ni+1
Ni’ → Nj
Ni’’ → Nj
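The increment construction (Ni → Ni+1, Ni → NV, plus the variable's self-loop NV → NV) can be checked with one synchronous ReLU update; a minimal sketch assuming every listed edge has weight +1 (the slides omit the weights):

```python
# Synchronous update of a three-node ReLU network realizing
# one increment step. Node order: [N1, N2, NV].

def relu(x):
    return x if x > 0 else 0

# W[i][j] = weight of the edge from node j to node i
W = [
    [0, 0, 0],   # N1 has no incoming edges
    [1, 0, 0],   # N1 -> N2  (control passes to the next instruction)
    [1, 0, 1],   # N1 -> NV  and the self-loop NV -> NV
]

def step(y):
    return [relu(sum(W[i][j] * y[j] for j in range(3))) for i in range(3)]

state = [1, 0, 5]    # N1 active, variable V holds 5
print(step(state))   # [0, 1, 6]: control on N2, V incremented
```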

Definitions

Legal State:
All transition nodes (Ni’ and Ni’’) have output 0
Exactly one instruction node Ni has output 1

Final State : all instruction nodes have output 0.

Properties

If yi (the output of Ni) = 1, then the program counter is on line i

V = output of NV

Changes in the network state are driven by the non-zero nodes.

Proof

•If row i is V ← V

•If row i is V ← V − 1

•For V ← V + 1, the behaviour is similar

Proof Continued

•If row i is : if V != 0 GOTO j

Proof Continued

•Thus a legal state leads to another legal state.

What about other activation functions?

• Feed-forward net with a binary step activation is a DFA

• Recurrent neural network with:
1) sigmoid activation is Turing universal
2) saturated linear activation is Turing universal

Conclusion

A Turing machine can encode a recurrent neural network.

A recurrent neural net can simulate a Turing complete language.

So, Turing machines are Recurrent Neural Networks!

Project

Implementation of a Push-Down Automaton (PDA) using neural networks.

References

[1] McCulloch, W. S., and Pitts, W.: A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics 5, 1943: 115-133.

[2] Hyotyniemi, H.: Turing Machines are Recurrent Neural Networks. In Symposium on Artificial Networks (STeP'96), Vaasa, Finland, Aug. 19-23, 1996, pp. 13-24.

[3] Davis, Martin D., and Weyuker, Elaine J.: Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science. New York: Academic Press, 1983.

References Continued

[4] J. E. Hopcroft and J. D. Ullman, Introduction to Automata Theory, Languages and Computation. Addison-Wesley, 1979.

[5] Arbib, M.: Turing Machines, Finite Automata and Neural Nets. Journal of the ACM (JACM) 8(4), Oct. 1961, pp. 467-475.