
N. Laskaris

Professor John Hopfield
The Howard A. Prior Professor of Molecular Biology, Dept. of Molecular Biology
Computational Neurobiology; Biophysics
Princeton University

The physicist Hopfield showed that models of physical systems could be used to solve computational problems

Such systems could be implemented in hardware by combining standard electronic components.

The practical importance of Hopfield nets is limited by theoretical limitations of the structure, but in some cases they form interesting models.

Usually employed in binary-logic tasks, e.g. pattern completion and association

The concept

In the early 1980s, Hopfield published two scientific papers that attracted much interest.
(1982): Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 79:2554-2558.

(1984): Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences, 81:3088-3092.

This was the starting point of a new era of neural networks, one that continues today.

The dynamics of brain computation


The core question:
How is one to understand the incredible effectiveness of a brain in tasks such as recognizing a particular face in a complex scene?

Like all computers, a brain is a dynamical system that carries out its computations by the change of its 'state' with time. Simple models of the dynamics of neural circuits are described that have collective dynamical properties; these can be exploited in recognizing sensory patterns. Using these collective properties in processing information is effective, in that it exploits the spontaneous properties of nerve cells and circuits to produce robust computation.

J. Hopfield's quest

While the brain is totally unlike modern computers, much of what it does can be described as computation. Associative memory, logic and inference, recognizing an odor or a chess position, parsing the world into objects, and generating appropriate sequences of locomotor muscle commands are all describable as computation.

His research focuses on understanding how the neural circuits of the brain produce such powerful and complex computations.

Olfaction

The simplest problem in olfaction is simply identifying a known odor. However, olfaction allows remote sensing, and much more complex computations involving wind direction and fluctuating mixtures of odors must be described to account for the ability of homing pigeons or slugs to navigate through the use of odors.

Hopfield has been studying how such computations might be performed by the known neural circuitry of the olfactory bulb and prepiriform cortex of mammals, or the analogous circuits of simpler animals.

Dynamical systems

Any computer does its computation by its changes in internal state. In neurobiology, the change of the potentials of neurons (and the changes in the strengths of the synapses) with time is what performs the computations.

Systems of differential equations can represent these aspects of neurobiology. He seeks to understand some aspects of neurobiological computation through studying the behavior of equations modeling the time-evolution of neural activity.

Action potential computation


For much of neurobiology, information is represented by the paradigm of firing rates, i.e. information is represented by the rate of generation of action potential spikes, and the exact timing of these spikes is unimportant.

Action potential computation

Since action potentials last only about a millisecond, the use of action-potential timing seems a potentially powerful means of neural computation.

Action potential computation

There are cases, for example the binaural auditory determination of the location of a sound source, where information is encoded in the timing of action potentials.

Speech
Identifying words in natural speech is a difficult computational task which brains can easily do. Hopfield's group uses this task as a test-bed for thinking about the computational abilities of neural networks and neuromorphic ideas.

Simple (e.g. binary-logic) neurons are coupled in a system with recurrent signal flow

1st Example

A 2-neuron Hopfield network of continuous states, characterized by 2 stable states

[Contour plot]

2nd Example

A 3-neuron Hopfield network of 2^3 = 8 states, characterized by 2 stable states

The behavior of such a dynamical system is fully determined by the synaptic weights

3rd Example

Wij = Wji

With symmetric weights, the dynamics can be thought of as an energy minimization.

Hopfield nets are fully connected, symmetrically weighted networks that extend the ideas of linear associative memories by adding cyclic connections.

Note: no self-feedback (wii = 0)!

Operation of the network


After the teaching stage, in which the weights are defined, the initial state of the network is set (input pattern), and a simple recurrent rule is iterated till convergence to a stable state (output pattern).

Regarding the training of a Hopfield net as a content-addressable memory: the outer-product rule is used for storing patterns.

There are two main modes of operation: synchronous vs. asynchronous updating.

[Diagram: Hebbian learning sets the weights; a probe pattern sets the initial state; the dynamical evolution converges to a stored pattern]
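To make the two stages concrete, here is a minimal NumPy sketch of the outer-product (Hebbian) teaching stage and of asynchronous recall; the function names and parameters are illustrative, not part of the original slides:

```python
import numpy as np

def train_hopfield(patterns):
    """Teaching stage: outer-product (Hebbian) rule, W = sum over patterns of p p^T."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)        # no self-feedback (w_ii = 0)
    return W

def recall(W, probe, max_sweeps=100, seed=0):
    """Operation stage: asynchronous updating until a stable state is reached."""
    rng = np.random.default_rng(seed)
    y = probe.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(y)):
            h = W[i] @ y                         # total input to neuron i
            if h != 0 and np.sign(h) != y[i]:    # zero input -> state unchanged
                y[i] = np.sign(h)
                changed = True
        if not changed:                          # converged to an attractor
            break
    return y
```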

A Simple Example

Step 1. Design a network with memorized patterns (vectors) [1, -1, 1] & [-1, 1, -1].

Step 2. Initialization

There are 8 different states that can be reached by the net, and any of them can be used as its initial state.

[Network diagram: neurons #1: y1, #2: y2, #3: y3]

Step 3. Iterate till convergence (synchronous updating): 3 different examples of the net's flow.

It converges immediately.

Step 3. Iterate till convergence (synchronous updating)

[Schematic diagram of all the dynamical trajectories that lead to the stored patterns]

Or: Step 3. Iterate till convergence (asynchronous updating): each time, select one neuron at random and update its state with the previous rule, with the usual convention that if the total input to that neuron is 0 its state remains unchanged.
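The example can be checked in a few lines; this sketch assumes the synchronous variant and the zero-input convention just stated:

```python
import numpy as np

# Step 1: store the two patterns with the outer-product rule.
patterns = np.array([[1, -1, 1],
                     [-1, 1, -1]])
W = np.zeros((3, 3))
for p in patterns:
    W += np.outer(p, p)
np.fill_diagonal(W, 0)           # W = [[0,-2,2],[-2,0,-2],[2,-2,0]]

# Steps 2-3: pick one of the 8 possible initial states and update all
# neurons at once (synchronous updating) until nothing changes.
y = np.array([1, 1, 1])
for _ in range(10):
    h = W @ y
    y_new = np.where(h > 0, 1, np.where(h < 0, -1, y))   # 0 input -> keep state
    if np.array_equal(y_new, y):
        break                    # stable state reached
    y = y_new

print(y)    # [ 1 -1  1 ] : the net falls into the nearest stored pattern
```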

Explanation of the convergence


There is an energy function associated with each state of the Hopfield network:

$$E([y_1, y_2, \ldots, y_n]^T) = -\frac{1}{2}\sum_{i}\sum_{j} w_{ij}\, y_i y_j$$

where $[y_1, y_2, \ldots, y_n]^T$ is the vector of neuron outputs.

The corresponding dynamical system evolves toward states of lower Energy

States of lowest energy correspond to attractors of the Hopfield-net dynamics:

$$E([y_1, y_2, \ldots, y_n]^T) = -\frac{1}{2}\sum_{i}\sum_{j} w_{ij}\, y_i y_j$$

[Energy-landscape sketch: trajectories run downhill toward an attractor state]
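A quick numerical illustration of this, with the weight matrix of the three-neuron example hard-coded so the snippet is self-contained:

```python
import numpy as np

W = np.array([[0, -2, 2],
              [-2, 0, -2],
              [2, -2, 0]])        # weights of the three-neuron example above

def energy(W, y):
    # E(y) = -1/2 * sum_ij w_ij y_i y_j ; it never increases under async updates.
    return -0.5 * y @ W @ y

print(energy(W, np.array([1, -1, 1])))   # -6.0 : stored pattern, an attractor
print(energy(W, np.array([1, 1, 1])))    #  2.0 : transient state, higher energy
```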

Capacity of the Hopfield memory


In short, while training the net (via the outer-product rule) we are storing patterns by placing different attractors in the state-space of the system. While operating, the net searches for the closest attractor. When this is found, the corresponding pattern of activation is retrieved.

How many patterns can we store in a Hopfield net?

Approximately 0.15 N, where N is the number of neurons.
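The 0.15 N figure can be probed empirically. A rough sketch (the sizes and the single-sweep stability test are illustrative choices): as the load m/N grows past roughly 0.15, stored patterns stop being fixed points:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100                                    # N neurons
for m in (5, 10, 15, 20, 25):              # number of stored random patterns
    pats = rng.choice([-1, 1], size=(m, n))
    W = (pats.T @ pats).astype(float)      # outer-product rule
    np.fill_diagonal(W, 0)
    # count patterns that survive one synchronous sweep unchanged
    stable = sum(np.array_equal(np.where(W @ p > 0, 1, -1), p) for p in pats)
    print(f"m = {m:2d} ({m/n:.2f} N): {stable}/{m} patterns recalled unchanged")
```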

Computer Experimentation

Class-project

A simple Pattern Recognition Example

Stored Patterns (binary images)

Perfect Recall / Image Restoration

Erroneous Recall

Irrelevant results

Note: explain the negatives (if a pattern p is stored, its inverse -p is an attractor too, so recalled images may appear as photographic negatives).

The continuous Hopfield-Net as optimization machinery

[Tank and Hopfield; IEEE Trans. Circuits Syst. 1986; 33:533-541]:

Simple "Neural" Optimization Networks: An A/D Converter, Signal Decision Circuit, and a Linear Programming Circuit

Hopfield modified his network so as to work with continuous activations and, by adopting a dynamical-systems approach, showed that the resulting system is characterized by a Lyapunov function, which he termed the Computational Energy, and which can be used to tailor the network to a given optimization problem.

The system of coupled differential equations describing the operation of the continuous Hopfield net:

$$\frac{du_i}{dt} = -\frac{u_i}{\tau_i} + \sum_{j=1}^{n} T_{ij}\, g_j(u_j) + I_i$$

Change of notation: neuronal outputs $y_i \rightarrow V_i = g_i(u_i)$; weights $W_{ij} \rightarrow T_{ij}$; biases $I_i$.

Activation function: $g(u) = \tfrac{1}{2}\bigl(1 + \tanh(\mathrm{gain}\cdot u)\bigr)$

With $T_{ij} = T_{ji}$ and $T_{ii} = 0$, the computational energy is

$$E = -\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} T_{ij}\, g_i(u_i)\, g_j(u_j) - \sum_{i=1}^{n} I_i\, g_i(u_i)$$

or, written in terms of the outputs $V_i$,

$$E = -\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} T_{ij} V_i V_j - \sum_{i=1}^{n} I_i V_i$$
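As an illustration, the coupled equations can be integrated with a simple forward-Euler scheme; the gain, time constant, and step size below are arbitrary assumptions, not values from the slides:

```python
import numpy as np

def g(u, gain=5.0):
    # Activation from the slides: g(u) = (1 + tanh(gain * u)) / 2.
    return 0.5 * (1.0 + np.tanh(gain * u))

def run_continuous_hopfield(T, I, tau=1.0, dt=0.01, steps=5000, seed=0):
    # Forward-Euler integration of du_i/dt = -u_i/tau + sum_j T_ij g(u_j) + I_i.
    rng = np.random.default_rng(seed)
    u = 0.01 * rng.standard_normal(len(I))   # small random initial state
    for _ in range(steps):
        u += dt * (-u / tau + T @ g(u) + I)
    return g(u)                              # outputs V_i after settling
```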

The Computational Energy

When Hopfield nets are used for function optimization, the objective function F to be minimized is written as an energy function in the form of the computational energy E.

The comparison between E and F leads to the design, i.e. definition of links and biases, of the network that can solve the problem.

The actual advantage of doing this is that the Hopfield net has a direct hardware implementation, enabling even a VLSI integration of the algorithm performing the optimization task.

An example: Clustering (Dominant-Mode)

Given a set of N vectors {X_i}, define the k among them that form the most compact cluster {Z_i}:

$$F(\{u_i\}) = \sum_{i=1}^{N}\sum_{j=1}^{N} u_i u_j\, \lVert X_i - X_j \rVert^2$$

with indicator variables

$$u_i = \begin{cases} 1 & \text{if } X_i \in \{Z_i\} \\ 0 & \text{if } X_i \notin \{Z_i\} \end{cases} \qquad \text{and} \qquad \sum_{i=1}^{N} u_i = k$$
The objective function F can easily be written in the form of the computational energy E.

There's an additional constraint so that exactly k neurons are 'on'.

$$F = -\frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N} T_{ij} V_i V_j - \sum_{i=1}^{N} I_i V_i$$

$$T_{ij} = T_{ji} = \begin{cases} 0 & \text{if } i = j \\ -2\,D(i,j) = -2\,\lVert X_i - X_j \rVert^2 & \text{otherwise} \end{cases} \qquad I_i^{obj} = 0$$

With each pattern X_i we associate a neuron in the Hopfield network (i.e. #neurons = N). The synaptic weights are the pairwise distances (times -2). If a neuron's activation is 1 when the net has converged, the corresponding pattern belongs to the most compact cluster.
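A small self-contained sketch of this design; the helper names are illustrative, and for brevity the minimum of F is found here by brute-force enumeration rather than by running the network dynamics:

```python
import numpy as np
from itertools import combinations

def clustering_weights(X):
    # T_ij = T_ji = -2 * ||X_i - X_j||^2 with T_ii = 0, from matching F to E.
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    T = -2.0 * D2
    np.fill_diagonal(T, 0.0)
    return T

def F(T, u):
    # F({u_i}) = sum_ij u_i u_j ||X_i - X_j||^2 = -1/2 u^T T u  (all I_i = 0).
    return -0.5 * u @ T @ u

X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0]])  # 4 toy vectors
T = clustering_weights(X)
best = min(combinations(range(len(X)), 3),     # constraint: k = 3 neurons 'on'
           key=lambda idx: F(T, np.isin(np.arange(len(X)), idx).astype(float)))
print(best)   # (0, 1, 2): the three nearby vectors form the most compact cluster
```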

A classical example: The Travelling Salesman Problem

The principle

Coding a possible route as a combination of neuron firings, e.g. the tour 5 → 3 → 4 → 1 → 2 → 5, whose length is

|5-3| + |3-4| + |4-1| + |1-2| + |2-5|

where |x-y| denotes the distance between cities x and y.
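To connect this coding to the computational energy, the sketch below (0-indexed cities and helper names of my choosing) evaluates a tour both from the route and from a neuron matrix V, where V[x, i] = 1 means city x is visited at step i; the penalty terms that force V to be a valid permutation are omitted:

```python
import numpy as np

def tour_length(D, route):
    # Closed-tour length: d(route[0],route[1]) + ... + d(route[-1],route[0]).
    return sum(D[route[i], route[(i + 1) % len(route)]] for i in range(len(route)))

def tour_length_from_V(D, V):
    # The same length as a quadratic form in the neuron states V[x, i];
    # this is the cost term folded into the computational energy E.
    n = V.shape[0]
    return sum(D[x, y] * V[x, i] * V[y, (i + 1) % n]
               for x in range(n) for y in range(n) for i in range(n))

pts = np.random.default_rng(1).random((5, 2))
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)   # city distances
route = [4, 2, 3, 0, 1]                 # the slide's 5->3->4->1->2->5, 0-indexed
V = np.zeros((5, 5))
V[route, np.arange(5)] = 1              # V[x, i] = 1 iff city x at step i
assert np.isclose(tour_length(D, route), tour_length_from_V(D, V))
```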

An example from clinical encephalography

The problem / The idea / The solution:

Hopfield Neural Nets for monitoring Evoked Potential Signals
N. Laskaris et al. [Electroenc. Clin. Neuroph. 1997; 104(2)]

The Boltzmann Machine: improving Hopfield nets by simulated annealing and adopting more complex topologies


A Very Last Comment on Brain-Mind-Intelligence-Life-Happiness

How I Became Stupid


by Martin Page

Penguin Books, 2004, 160 pp. ISBN: 0-14-200495-2

In How I Became Stupid, the 25-year-old Antoine concludes that "to think is to suffer", a twist on the familiar assertion of Descartes. For Antoine, intelligence is the source of unhappiness. He embarks on a series of hilarious strategies to make himself stupid and possibly happy.

Animals that Abandon their Brains


Dr. Jun Aruga
Laboratory for Comparative Neurogenesis

A primitive but successful animal:

Oxycomanthus japonicus

There is astonishing diversity in the nervous systems of animals. From the basic, distributed nervous systems of jellyfish and sea anemones, to the centralized neural networks of squid and octopuses, to the complex brain structures at the terminal end of the neural tube in vertebrates, the variation across species is humbling. People may claim that more advanced species like humans are the result of an increasingly centralized nervous system produced through evolution. This claim of advancement through evolution is a common, but misleading, one. It suggests that evolution always moves in one direction: the advancement of species by increasing complexity.

Evolution may selectively enable body structures that are more enhanced and complicated, but it may just as easily enable species that abandon complex adaptations in favour of simplification.

Brains, too, have evolved in the same way. While the brains of some species, including humans, developed to allow them to thrive, others have abandoned their brains because they are no longer necessary.

For example, the ascidian, or sea squirt, which lives in shallow coastal waters and is a staple food in certain regions, has a vertebrate-like neural structure with a neural tube and notochord in its larval stage. As the larva becomes an adult, however, these features disappear until only very basic ganglia remain. In evolutionary terms this animal is a winner, because it develops a very simplified neural system better adapted to a stationary life in seawater. In the long run, however, evolutionary success will be determined by which species survives longer:

humans with their complex brains (and their weapons), or the brainless sea squirts.


Emotional Intelligence

Also called EI or EQ, it describes an ability, capacity, or skill to perceive, assess, and manage the emotions of one's self, of others, and of groups.

Class-project Oral Exams

Oral-Exam Appointments

Date      Time           AEM (student IDs)
31 May    1st hour       794, 845, 893, 899
31 May    2nd hour       915, 920, 932, 949, 1023
5 June    1st-3rd hour   711, 809, 874, 909, 923, 950, 979, 1024, 1227
7 June    1st-3rd hour   627, 887, 946, 960, 962, 980, 995, 1202, 1223

Further Inquiries
