
Slide 1

Mathematics of the brain:

A. Overview of the general approach

B. Promising directions of research

Victor Eliashberg

Consulting professor, Stanford University, Department of Electrical Engineering


A. Overview of the general approach

Slide 2


This is our most important “personal computer” – the pinnacle of biological system engineering. What does this computer compute?

[Figure: the central nervous system: frontal, temporal, parietal, and occipital lobes; cerebellum; cervical, thoracic, and lumbar spinal cord; cauda equina; dura mater.]

Our brain still lives in a sea!

Slide 3

12 cranial nerves; ~10^10 neurons in each hemisphere. 31 pairs of spinal nerves (8 cervical, 12 thoracic, 5 lumbar, and 6 sacral/coccygeal pairs); ~10^7 neurons. ~10^11 neurons in total.


Big picture: Cognitive system (Robot, World)

B(t) is a formal representation of B at time t, where t=0 is the beginning of learning. B(0), call it Brain 0 (Brain Zero), is an “unprogrammed” brain.

[Diagram: the human-like robot (D, B) consists of sensorimotor devices D and a computing system B simulating the work of the human nervous system; together with the external world W, the devices D form the external system (W, D).]

Slide 4


Slide 5

Conventional computer versus the brain

[Diagram: a conventional computer (Processor, LTM, RAM) and the brain (Processor, LTM, E-states), each interacting with the rest of the system (W, D). (1) encoded (symbolic) states, as in RAM; (2) analog (dynamic) states, as in E-states.]

There is a big intuitive difference between 1 and 2 that is difficult to express formally.


Time vs. space: computational universality

Type 0: Turing machines (the highest computing power)

Type 1: Context-sensitive grammars

Type 2: Context-free grammars (push-down automata)

Type 3: Finite-state machines

Type 4: Combinatorial machines (the lowest computing power)

Slide 6

NOTE. We can perform, in principle, any mental computation. This means that the human brain is a system of type 0; the limited size of our working memory is of no importance in principle. Importantly, an attempt to represent behavior of a higher type in terms of a system of a lower type leads to a combinatorial explosion in the size of such an inadequate representation!
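To make the combinatorial-explosion remark concrete, here is a minimal Python sketch (an added illustration, not part of the original slides) for the lowest step of the hierarchy: the parity of a bit string is computed by a two-state finite-state machine (type 3) regardless of input length, whereas a combinatorial machine (type 4), having no internal state, must tabulate all 2^n input sequences of length n.

```python
from itertools import product

def parity_fsm(bits):
    """Type-3 solution: one bit of state, independent of the input length."""
    state = 0
    for b in bits:
        state ^= b
    return state

def parity_table(n):
    """Type-4 'solution': an exhaustive lookup table for inputs of length n."""
    return {seq: parity_fsm(seq) for seq in product((0, 1), repeat=n)}

if __name__ == "__main__":
    table = parity_table(4)
    assert table[(1, 0, 1, 1)] == 1          # same answer as the FSM, but ...
    for n in (4, 8, 16, 20):
        print(f"n = {n:2d}: lookup table needs {2 ** n:>9,d} entries")
    # ... the FSM above handles any n with the same two states.
```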


The mystery of human learning

How big is B(0)? How big is B(t)?

B(0) [megabytes?]  --learning-->  B(t), t > 20 years  [terabytes?]

The main part of the brain's software must be created in the course of learning.

Slide 7


General approach to learning

HYPOTHESIS. The results of learning depend only on the SMI-sequence and not on the way this sequence is produced.

Concept of observable behavior

NS – sensory centers; NM – motor centers; NI – internal observable centers (e.g., the centers of emotion).

[Diagram: the Teacher and the external world W interact with the system (B, D) through the observable centers NS, NM, and NI, which carry the signals S, M, and I.]

The SMI-sequence, or the brain's complete experience:

S(1) S(2) ... S(i) ... S(n)
M(1) M(2) ... M(i) ... M(n)
I(1) I(2) ... I(i) ... I(n)

Slide 8


G-states and E-states: the brain as an E-machine

g_{t+1} = fg(x*),                  fg: X* → G        (learning procedure)
e_{t+1} = fe(x_t, e_t, g_t, q_t),  fe: X×E×G×Q → E   (next E-state procedure)
y_t     = fy(x_t, e_t, g_t, q_t),  fy: X×E×G×Q → Y   (interpretation procedure)

where X and Y are the input and output sets, respectively; G is the set of states of the "symbolic" LTM; E is the set of states of the "dynamic" STM and ITM; X* is the set of SMI-sequences (see Slide 8); Q is the set of all other states; t is discrete time.

[Diagram: W, D, and B exchange signals through the centers NS, NM, and NI; the brain's input vector is x = (xS, xM, xI) and its output vector is y = (yS, yM, yI).]

Slide 9
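As a reading aid (an added sketch, not part of the original slides), the Python skeleton below follows the state-transition scheme above; the bodies of fg, fe, and fy are placeholders, since the slide fixes only their signatures.

```python
class EMachine:
    """Skeleton of the E-machine scheme on Slide 9 (placeholder procedures)."""

    def __init__(self, g0, e0, q0):
        self.g = g0          # G-state: "symbolic" long-term memory (LTM)
        self.e = e0          # E-state: "dynamic" short- and intermediate-term memory (STM, ITM)
        self.q = q0          # all other states
        self.history = []    # x*: the SMI-sequence experienced so far

    def step(self, x):
        """One tick of discrete time t: consume input x, produce output y."""
        y = self.fy(x, self.e, self.g, self.q)        # interpretation procedure
        self.e = self.fe(x, self.e, self.g, self.q)   # next E-state procedure
        self.history.append(x)
        self.g = self.fg(tuple(self.history))         # learning procedure, driven by x*
        return y

    # Placeholder procedures: only the signatures come from the slide.
    def fg(self, x_star):          # fg: X* -> G
        return x_star              # e.g., store the whole experience
    def fe(self, x, e, g, q):      # fe: X x E x G x Q -> E
        return e
    def fy(self, x, e, g, q):      # fy: X x E x G x Q -> Y
        return x                   # identity output as a stand-in
```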


People who had eidetic memory: Wolfgang Amadeus Mozart, Henri Poincaré, John von Neumann

Slide 10


Savant Kim Peek with Dustin Hoffman, and savant Stephen Wiltshire

Eidetic memory vs. creativity

Slide 11


Working memory and mental imagery as a simulation of the external system (W, D). The system (W, D) is the “Teacher” for the system AS.

[Diagram: working memory and mental imagery. The motor system AM (motor centers NM) drives the devices D acting on the external world W; the sensory system AS (sensory centers NS) receives the resulting sensory signals S; learned associations M→S and S→M between AM and AS allow the pair to simulate the system (W, D).]

Slide 12


Motor and sensory areas of the neocortex

Slide 13

Working memory, episodic memory, and mental imagery

AS, AM

Motor control


Slide 14

Primary sensory and motor areas, association areas


Association fibers (neural busses)

Slide 15


Basic structure of a primitive E-machine

Slide 16

[Block diagram. Control inputs and control outputs; association inputs and association outputs.
- INPUT LONG-TERM MEMORY (ILTM): DECODING, INPUT LEARNING; receives data inputs to ILTM and computes a similarity function.
- E-STATES (dynamic STM and ITM): MODULATION, NEXT E-STATE PROCEDURE; turns the similarity function into a modulated (biased) similarity function.
- CHOICE: selects a subset of active locations of OLTM.
- OUTPUT LONG-TERM MEMORY (OLTM): ENCODING, OUTPUT LEARNING; receives data inputs to OLTM and produces data outputs from OLTM.]
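The following Python sketch is an added illustration of one retrieval cycle of such a primitive E-machine under simple assumed procedures: dot-product similarity for DECODING, an additive residual-excitation bias for MODULATION, random equally probable choice among the best-matching locations for CHOICE, and readout of the chosen OLTM location. The slides do not prescribe these particular formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

class PrimitiveEMachine:
    """One retrieval cycle of a primitive E-machine (illustrative conventions only)."""

    def __init__(self, iltm, oltm, decay=0.7, boost=1.0):
        self.iltm = np.asarray(iltm, float)   # ILTM: one stored input vector per location
        self.oltm = np.asarray(oltm, float)   # OLTM: the associated output vector per location
        self.e = np.zeros(len(self.iltm))     # E-states: residual excitation per location
        self.decay, self.boost = decay, boost

    def step(self, x):
        sim = self.iltm @ np.asarray(x, float)         # DECODING: similarity function
        biased = sim + self.e                          # MODULATION: biased similarity
        best = np.flatnonzero(biased == biased.max())  # locations tied at the maximum
        k = rng.choice(best)                           # CHOICE: random equally probable choice
        self.e = self.decay * self.e                   # NEXT E-STATE: decay of residual excitation
        self.e[k] += self.boost                        # ... plus a boost of the chosen location
        return self.oltm[k]                            # read out the chosen OLTM location

# Tiny usage example: two stored (input, output) associations.
m = PrimitiveEMachine(iltm=[[1, 0], [0, 1]], oltm=[[10], [20]])
print(m.step([1, 0]))   # retrieves the output associated with the first location
```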


Turing’s machine as a system (Robot, World)

Slide 17
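Since Slide 17 gives only the title, here is a small added sketch of reading a Turing machine as a system (Robot, World): the tape plays the role of the external world W, while the head and the finite control play the role of the robot (D, B). The increment machine below is an invented example.

```python
def run_tm(tape, head, state="carry", blank="_"):
    """Increment a binary number; the head starts on the rightmost bit."""
    world = dict(enumerate(tape))                 # W: the tape, addressed by position
    # The robot's program B: (state, symbol) -> (write, move, next_state)
    program = {
        ("carry", "1"):   ("0", -1, "carry"),
        ("carry", "0"):   ("1",  0, "halt"),
        ("carry", blank): ("1",  0, "halt"),
    }
    while state != "halt":
        symbol = world.get(head, blank)           # D: the robot senses the world ...
        write, move, state = program[(state, symbol)]
        world[head] = write                       # ... and acts on it
        head += move
    lo, hi = min(world), max(world)
    return "".join(world.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

print(run_tm(list("1011"), head=3))   # -> 1100
```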


Mental computations (thinking) as an interaction between motor control and working memory (EROBOT, PEM)

Slide 18


From vectors to symbols: auditory pathways

[Figure: approximate cell counts along the auditory pathway: ~4,000 inner hair cells and ~12,000 outer hair cells; ~30,000 nerve fibers; ~90,000 cells; ~390,000 cells; ~580,000 cells; ~100,000,000 cells.]

Slide 19


Slide 20

Eye movements: oculomotor, trochlear, and abducens nerves


From symbols to vectors: voluntary movements

Slide 21


Internal signals: from emotions to symbols and vice versa

Slide 22


Slide 23

The brain as a complex E-machine

[Diagram: the sensory cortex (modules AS1 ... ASk) and the motor cortex (modules AM1 ... AMm) interact with subcortical systems, the devices D, and the external world W through sensory (S1, ...) and motor (M1, ...) pathways.]


B. Promising directions of research

Slide 24


1. Investigating and simulating computational resources of a single neuron

The metaphor of the brain as an E-machine promotes the hypothesis that a large portion of the brain's hardware computations – especially the MODULATION and NEXT E-STATE procedures – is performed at the level of individual cells.

Slide 25


Typical neuron

The neuron is a very specialized cell. There are several types of neurons, with different shapes and different types of membrane proteins. The biological neuron is a complex functional unit. Where does this complexity come from?

Slide 26


Computational machinery of a cell

[Figure: a cell with its nucleus, membrane, and membrane proteins; scale marks of 18 nm and 3 nm.]

It took evolution much longer to create individual cells than to build systems containing many cells, including the human brain. Different cells differ in their shape and in the types of their membrane proteins.

Slide 27


Protein molecule as a probabilistic molecular machine (PMM)


Slide 28


Ensemble of PMMs (EPMM)

E-states as occupation numbers

Slide 29


EPMM as a statistical mixed-signal computer

Slide 30
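Slides 28-30 are mostly figures, so the Python sketch below is an added illustration under simple assumptions: a protein molecule is modeled as a probabilistic molecular machine (a small Markov chain whose transition probabilities depend on an analog input), and the E-state of an ensemble of identical PMMs is read out as the occupation numbers of its states. The three-state, ion-channel-like transition scheme is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def transition_matrix(u):
    """Illustrative 3-state PMM: transition probabilities depend on an analog input u in [0, 1]."""
    p = np.array([
        [1 - 0.5 * u, 0.5 * u,       0.0          ],   # state 0 -> 1 driven by the input
        [0.1,         0.8 - 0.5 * u, 0.1 + 0.5 * u],   # state 1 -> 2 driven by the input
        [0.05,        0.0,           0.95         ],   # slow recovery from state 2
    ])
    return p / p.sum(axis=1, keepdims=True)            # keep each row a probability distribution

def simulate_epmm(n_molecules=2000, n_steps=30, u=0.6):
    """Monte-Carlo simulation of an EPMM; returns occupation numbers (the E-state) over time."""
    states = np.zeros(n_molecules, dtype=int)           # all molecules start in state 0
    history = []
    for _ in range(n_steps):
        p = transition_matrix(u)
        states = np.array([rng.choice(3, p=p[s]) for s in states])  # independent probabilistic jumps
        history.append(np.bincount(states, minlength=3) / n_molecules)
    return np.array(history)

occupation = simulate_epmm()
print("final occupation numbers of the three states:", occupation[-1])
```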


2. Investigating and simulating neural networks with reciprocal inhibition

The metaphor of the brain as an E-machine promotes the hypothesis that a neural layer with reciprocal inhibition can perform the procedure of random equally probable choice. It is interesting to study different possible implementations of such a procedure in the case of a large number of neurons (a toy implementation is sketched after the WTA.EXE diagram on the next slide).

Slide 31


Simple “3-neuron” associative neural network (WTA.EXE)

Slide 32

[Diagram of the network: DECODING by the input long-term memory (ILTM, addressing by content), RANDOM CHOICE by a layer with reciprocal inhibition, and ENCODING by the output long-term memory (OLTM, retrieval); S21(i,j) and N1(j) label synapses and neurons in the figure.]
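WTA.EXE itself is not reproduced in the transcript; the sketch below is a minimal stand-in for the random-choice behavior hypothesized on Slide 31: a layer in which every unit inhibits all the others until exactly one unit survives, with tiny noise making the choice among equally excited units approximately equally probable. The particular update rule and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def wta_reciprocal_inhibition(excitation, eps=0.2, noise=1e-6, max_steps=1000):
    """Winner-take-all via reciprocal (mutual) inhibition: iterate until one unit remains active."""
    a = np.array(excitation, float) + noise * rng.random(len(excitation))  # noise breaks exact ties
    for _ in range(max_steps):
        if np.count_nonzero(a > 0) <= 1:
            break
        a = np.maximum(0.0, a - eps * (a.sum() - a))   # each unit is inhibited by all the others
    return int(np.argmax(a))                           # index of the winning unit

# Three equally excited neurons: each should win roughly one third of the time.
wins = np.bincount([wta_reciprocal_inhibition([1.0, 1.0, 1.0]) for _ in range(3000)], minlength=3)
print("win counts:", wins)
```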


3. Studying possible neural implementations of input and output LTM in the case of a very large number of neurons.

Slide 33


Example of a possible implementation of a very big ILTM

Slide 34


Example of a possible implementation of a very big OLTM

Slide 35


5. Studying possible neural implementations of associative connections in the case of a very large number of neurons.

NOTE. It is physically impossible to have crossbar connectivity between, say, 10^8 and 10^8 neurons – the number of required synapses is 10^16. The hypothesis of hash-recoding offers a possible solution. Is this hypothesis correct? What other solutions are possible?

Slide 36
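To put numbers on this note, and to show one generic alternative, here is a small added sketch (the slides do not specify the hash-recoding scheme): the first part is just the crossbar arithmetic, and the second part routes associations through a much smaller fixed random projection, a stand-in for the kind of recoding the note asks about.

```python
import numpy as np

# The arithmetic behind the note: a full crossbar is out of the question.
n = 10 ** 8
print(f"full crossbar between {n:.0e} and {n:.0e} neurons: {n * n:.0e} synapses")
k = 10 ** 4
print(f"via an intermediate layer of {k:.0e} units: {2 * n * k:.0e} synapses")

# Tiny functional demo with small numbers: a fixed random projection ("hash")
# into an intermediate layer, plus a learned readout, still associates sparse
# patterns in one population with units in another.
rng = np.random.default_rng(3)
n_in, k_mid, m_out = 1000, 50, 20
hash_proj = rng.standard_normal((k_mid, n_in)) / np.sqrt(n_in)   # fixed, unlearned "hash"
readout = np.zeros((m_out, k_mid))                               # learned associations

def learn(x, target_unit):
    readout[target_unit] += hash_proj @ x          # Hebbian-style update through the hash layer

def recall(x):
    return int(np.argmax(readout @ (hash_proj @ x)))

patterns = [(rng.random(n_in) < 0.05).astype(float) for _ in range(m_out)]
for unit, x in enumerate(patterns):
    learn(x, unit)
correct = sum(recall(x) == unit for unit, x in enumerate(patterns))
print(f"recalled correctly: {correct} / {m_out}")
```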


6. Studying the possibilities of primitive E-machines with different DECODING, MODULATION, CHOICE, ENCODING, NEXT E-STATE, and LEARNING procedures.

NOTE. It is particularly interesting to study the possibilities of primitive E-machines with universal learning algorithms – the algorithms that don’t lose training information.

Slide 37
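As a point of reference for the phrase "algorithms that don't lose training information" (an added illustration, not a definition from the slides), the sketch below contrasts a learner that keeps every training pair with one that compresses experience into a fixed-size weight vector and therefore cannot recover earlier, conflicting examples.

```python
import numpy as np

class AppendOnlyMemory:
    """Keeps every training pair; no training information is ever lost."""
    def __init__(self):
        self.pairs = []
    def learn(self, x, y):
        self.pairs.append((tuple(x), y))
    def recall(self, x):
        return [y for key, y in self.pairs if key == tuple(x)]   # everything ever associated with x

class FixedSizeLearner:
    """Compresses experience into one weight vector; earlier pairs cannot be reconstructed."""
    def __init__(self, dim, lr=0.2):
        self.w = np.zeros(dim)
        self.lr = lr
    def learn(self, x, y):
        x = np.asarray(x, float)
        self.w += self.lr * (y - self.w @ x) * x                 # online least-squares step
    def recall(self, x):
        return float(self.w @ np.asarray(x, float))

a, b = AppendOnlyMemory(), FixedSizeLearner(dim=3)
data = [([1, 0, 0], 5.0), ([0, 1, 0], -2.0), ([1, 0, 0], 7.0)]   # the same input seen with two outputs
for x, y in data:
    a.learn(x, y)
    b.learn(x, y)
print(a.recall([1, 0, 0]))   # [5.0, 7.0]: both experiences are retained
print(b.recall([1, 0, 0]))   # a single blended value: the conflicting history is gone
```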


7. Studying the possibilities of universal learning E-machines (type 0) consisting of two primitive E-machines

NOTE. The program EROBOT.EXE provides a simple example of such a universal learning E-machine.

Slide 38


8. Studying the possibilities of complex E-machines with hierarchical structure of associative memory

NOTE. Eliashberg (1979) contains a simple example of such an E-machine.

Slide 39


9. Using natural language analysis for reverse engineering the basic hardware mechanisms of a hypothetical brain-like E-machine.

The problem of natural language is the most important and interesting problem that seems to be consistent with the metaphor of the brain as an E-machine. The analysis of natural language therefore provides a rich source of reliable psychological facts for developing this metaphor.

NOTE. So far I've been unable to find either psychological or neurobiological facts that would force me to reject this general metaphor. It is interesting to try to find such facts.

Slide 40


10. Understanding the possibilities of dynamically reconfigurable associative software.

This is the most promising and practically unlimited area of research. An E-machine with a given knowledge (G-state) can be dynamically reconfigured into a combinatorial number of different symbolic machines by changing the state of its dynamic STM and ITM (its E-state). It is my belief that understanding the possibilities associated with such context-dependent dynamic reconfiguration of knowledge (software) is a key to understanding the work of the human brain and to creating truly intelligent autonomous robots. A toy illustration of the idea is sketched at the end of this slide.

It is especially promising and challenging to try to understand how increasingly complex dynamically reconfigurable associative software can be created in the course of learning.

NOTE. Interesting as they are for specific applications, traditional ANN models have a limited general level of computing power (not higher than type 3) and cannot address the problem of nontrivial brain software, still less the problem of context-dependent, dynamically reconfigurable software.

Slide 41
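The toy sketch below (an added example, reusing the illustrative conventions of the primitive E-machine sketch after Slide 16) shows one fixed G-state, the stored associations, behaving as two different symbolic machines depending on the E-state that biases retrieval, i.e., on the current mental set.

```python
import numpy as np

# One fixed G-state (the stored associations below) behaves as two different
# "symbolic machines" depending on the E-state that biases retrieval.
keys = {"bank": [1, 0, 0], "note": [0, 1, 0]}
ILTM = np.array([keys["bank"], keys["bank"],        # the same word stored twice ...
                 keys["note"], keys["note"]], float)
OLTM = np.array(["river_bank", "money_bank",        # ... with different associated meanings
                 "music_note", "bank_note"])

def interpret(word, e_state):
    similarity = ILTM @ np.array(keys[word], float)  # DECODING
    biased = similarity + e_state                    # MODULATION by the E-state (mental set)
    return OLTM[int(np.argmax(biased))]              # CHOICE and readout from OLTM

finance_set = np.array([0.0, 0.5, 0.0, 0.5])   # residual excitation left by a "finance" context
other_set   = np.array([0.5, 0.0, 0.5, 0.0])   # residual excitation left by a different context

print(interpret("bank", finance_set), interpret("note", finance_set))  # money_bank bank_note
print(interpret("bank", other_set),   interpret("note", other_set))    # river_bank music_note
```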


BIBLIOGRAPHY (slide 1 of 4)

Chomsky, N. (1956). Three models for the description of language. IRE Transactions on Information Theory, IT-2, 113-124.

Eliashberg, V. (1967). On a class of learning machines. Proceedings of the Conference on Automation in the Pulp & Paper Industry, April 1967, Leningrad, USSR. Proc. of VNIIB, No. 54, 350-398.

Eliashberg, V. (1981). The concept of E-machine: On brain hardware and the algorithms of thinking. Proceedings of the Third Annual Meeting of the Cognitive Science Society, 289-291.

Eliashberg, V. (1979). The concept of E-machine and the problem of context-dependent behavior. Txu 40-320, US Copyright Office.

Ashcroft, F.M. (2004). Ion channels and disease. Academic Press, London

Anderson, J.R. (1976). Language, Memory, and Thought. Hillsdale, New Jersey: Lawrence Erlbaum Associates, Publishers.

Eliashberg, V. (1988). Neuron layer with reciprocal inhibition as a mechanism of random choice. Proc. of the IEEE ICNN-88.

Collins, A.M., & Quillian, M.R. (1972). How to make a language user. In E. Tulving & W. Donaldson (Eds.), Organization of memory. New York: Academic Press.

Baddeley, A.D. (1982). Your memory. A user's guide. Macmillan Publishing Co., Inc.

Deutsch, D. (1985). Quantum theory, the Church-Turing principle and the universal quantum computer. Proc. of the Royal Society of London A400, pp. 97-117.

Slide 42

Eliashberg, V. (1989). Context-sensitive associative memory: "Residual excitation" in neural networks as the mechanism of STM and mental set. Proceedings of IJCNN-89, June 18-22, 1989, Washington, D.C. vol. I, 67-75.


BIBLIOGRAPHY (slide 2 of 4)

Eliashberg, V. (2002). What Is Working Memory and Mental Imagery? A Robot that Learns to Perform Mental Computations. Web publication, www.brain0.com, Palo Alto, California

Eliashberg, V. (1993). A relationship between neural networks and programmable logic arrays. 0-7803-0999-5/93, IEEE, 1333-1337.

Eliashberg, V. (2005). Ensembles of membrane proteins as statistical mixed-signal computers. Proc. of IJCNN-2005, Montreal, Canada, pp. 2173-2178.

Hille, B. (2001). Ion channels of excitable membranes. Sinauer Associates. Sunderland, MA

Hodgkin, A.L., and Huxley, A.F. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. Journal of Physiology, 117, 500-544.

Eliashberg, V. (1990b). Molecular dynamics of short-term memory. Mathematical and Computer modeling in Science and Technology. vol. 14, 295-299.

Eliashberg, V. (1990a). Universal learning neurocomputers. Proceedings of the Fourth Annual Parallel Processing Symposium, California State University, Fullerton, April 4-6, 1990, 181-191.

Grossberg, S. (1982). Studies of mind and brain. Boston: Reidel Press.

Hinton, G.E., and Anderson, J.A. (1981). Parallel models of associative memory. Hillsdale, NJ: Lawrence Erlbaum Associates, Publishers.

Gerstner, W., and Kistler, W.M. (2002). Spiking Neuron Models. Cambridge University Press.

Slide 43

Gold, E.M. (1967). Language identification in the limit. Information and control. 10:447-474.


BIBLIOGRAPHY (slide 3 of 4)

Rosenblatt, F. (1961). Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Spartan Books.

Meynert, T. (1884). Psychiatrie. Wien.

Reichardt, W., and MacGinitie, G. (1962). Zur Theorie der Lateralen Inhibition. Kybernetik, B. 1, Nr. 4.

Minsky, M.L. (1967). Computation: Finite and Infinite Machines. Prentice-Hall, Inc.

McCulloch, W.S., and Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5, 115-133.

Rumelhart, D.E., McClelland, J.L., and the PDP research group (1986). Parallel distributed processing. Cambridge MA: MIT press.

Kandel, E.R., and Spencer, W.A. (1968). Cellular Neurophysiological Approaches in the Study of Learning. Physiological Rev. 48, 65-134.

Hopfield, J.J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences USA, 79, 2554-2558.

Kandel, E.R., Schwartz, J.H., and Jessell, T.M. (2000). Principles of Neural Science. McGraw-Hill.

Kohonen, T. (1984). Self-Organization and Associative Memory, Springer-Verlag.

Pinker, S., and Mehler, J. (Eds.) (1988). Connections and Symbols. The MIT Press.

Slide 44


BIBLIOGRAPHY (slide 4 of 4)

Zadeh, L. A. (1965). Fuzzy sets. Information and Control, 8, pp. 338-353.

Vvedensky, N.E. (1901). Excitation, inhibition, and narcosis. In Complete collection of works. USSR, 1953.

Zopf, G.W., Jr. (1962). Attitude and Context. In "Principles of Self-organization", Pergamon Press, pp. 325- 346.

Turing, A.M. (1936). On computable numbers, with an application to the Entscheidungsproblem. Proc. London Math. Society, ser. 2, 42

Szentagothai, J. (1968). Structuro-functional considerations of the cerebellar neuron network. Proc. of IEEE, vol. 56, 6, pp. 960-966.

Steinbuch, K. (1961). “Die Lernmatrix”, Kybernetik, vol. 1, pp.36-45.

Varju, D. (1965). On the theory of lateral inhibition. Consiglio Nazionale delle Ricerche, Quaderni de "La Ricerca Scientifica", v. 31.

Verveen, A.A., and Derksen, H.E. (1968). Fluctuation phenomena in nerve membrane. Proc. of the IEEE, v. 56, No. 6.

Siegelmann, H.T. (2003). Neural and Super-Turing Computing. Minds and Machines, vol. 13, No. 1, pp. 103-114.

Sima, J., and Orponen, P. (2003). General-Purpose Computation with Neural Networks: A Survey of Complexity-Theoretic Results. Neural Computation, 15, pp. 2727-2778.

Slide 45