Laskaris
Professor John Hopfield, the Howard A. Prior Professor of Molecular Biology, Dept. of Molecular Biology, Computational Neurobiology & Biophysics
Princeton University
The physicist Hopfield showed that models of physical systems can be used to solve computational problems.
The practical importance of Hopfield nets is limited by theoretical limitations of the structure, but in some cases they form interesting models.
The concept
In the early 1980s, Hopfield published two scientific papers which attracted much interest.
(1982): Neural networks and physical systems with emergent collective computational abilities.
(1984): Neurons with graded response have collective computational properties like those of two-state neurons.
Proceedings of the National Academy of Sciences, 81:3088-3092
This was the starting point of a new era of neural networks, which continues today.
Like all computers, a brain is a dynamical system that carries out its computations by the change of its 'state' with time. Simple models of the dynamics of neural circuits processing information are described in terms of their effective dynamical properties. The computation is collective, in that it exploits the spontaneous properties of nerve cells and circuits. These collective properties can be exploited to produce robust computation in recognizing sensory patterns.
J. Hopfield's quest
While the brain is totally unlike modern computers, much of what it does can be described as computation. Associative memory, logic and inference, recognizing an odor or a chess position, parsing the world into objects, and understanding and generating appropriate sequences of locomotor muscle commands are all describable as computation. His research focuses on how the neural circuits of the brain produce such powerful and complex computations.
Olfaction
The simplest problem in olfaction is simply identifying a known odor. However, olfaction allows remote sensing, and much more complex computations, involving wind direction and fluctuating mixtures of odors, must be described to account for the ability of homing pigeons or slugs to navigate through the use of odors. Hopfield has been studying how such computations might be performed by the known neural circuitry of the olfactory bulb and prepiriform cortex of mammals, or the analogous circuits of simpler animals.
Dynamical systems
Any computer does its computation by its changes in internal state. In neurobiology, the change of the potentials of neurons (and changes in the strengths of the synapses) with time is what performs the computations. Systems of differential equations can represent these aspects of neurobiology. He seeks to understand some aspects of neurobiological computation through studying the behavior of equations modeling the time-evolution of neural activity.
Since action potentials last only about a millisecond, the use of action-potential timing seems a powerful potential means of neural computation.
Speech
Identifying words in natural speech is a difficult computational task which brains can easily do. Hopfield uses this task as a test-bed for thinking about the computational abilities of neural networks and neuromorphic ideas.
Simple (e.g. binary-logic) neurons are coupled in a system with recurrent signal flow.
stable states
Contour-plot
2nd Example
The behavior of such a dynamical system is fully determined by the synaptic weights
3rd Example
Wij = Wji
Hopfield nets are fully connected, symmetrically weighted networks that extend the ideas of linear associative memories by adding cyclic connections.
Note: no self-feedback (Wii = 0)!
Hebbian Learning
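The Hebbian storage prescription can be sketched in a few lines of Python/NumPy (the function name `hebbian_weights` is my own): the weight matrix is the scaled sum of outer products of the stored bipolar patterns, with the diagonal zeroed to enforce the no-self-feedback rule.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian storage: W = (1/N) * sum_p x_p x_p^T for bipolar (+1/-1)
    patterns, with the diagonal zeroed (no self-feedback, W_ii = 0).
    W is symmetric by construction (W_ij = W_ji)."""
    patterns = np.asarray(patterns, dtype=float)
    n = patterns.shape[1]            # N = number of neurons
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

# Store one 3-neuron pattern, as in the small example nets of these slides.
W = hebbian_weights([[1, -1, 1]])
print(W)
```

Symmetry and the zero diagonal follow directly from the construction, which is what guarantees the existence of an energy function for the dynamics below.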
Probe pattern
Dynamical evolution
Step_2. Initialization
With three binary neurons there are 2³ = 8 different states that the net can reach, and any of them can be used as its initial state.
#1: y1 #2: y2 #3: y3
Step_3. Iterate till convergence - Synchronous Updating. Three different examples of the net's flow:
It converges immediately
- Synchronous Updating -
Stored pattern
Or
Step_3. Iterate till convergence - Asynchronous Updating. Each time, select one neuron at random and update its state with the previous rule, with the usual convention that if the total input to that neuron is 0, its state remains unchanged.
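The two update schemes can be sketched as follows (a minimal NumPy illustration; the function names are mine, and the tiny 3-neuron net with one stored pattern mirrors the slide examples):

```python
import numpy as np

rng = np.random.default_rng(0)

def sign0(h, prev):
    # Usual convention: if the total input is exactly 0, keep the previous state.
    return np.where(h > 0, 1, np.where(h < 0, -1, prev))

def update_sync(W, y):
    """Synchronous updating: all neurons recomputed at once."""
    return sign0(W @ y, y)

def update_async(W, y):
    """Asynchronous updating: one randomly chosen neuron at a time."""
    y = y.copy()
    i = rng.integers(len(y))
    h = W[i] @ y
    if h > 0:
        y[i] = 1
    elif h < 0:
        y[i] = -1
    return y

# Store pattern (1, -1, 1); probe with one bit flipped.
x = np.array([1, -1, 1])
W = np.outer(x, x) / 3.0
np.fill_diagonal(W, 0)
y = np.array([1, 1, 1])      # probe pattern
for _ in range(10):          # iterate till convergence
    y = update_sync(W, y)
print(y)                     # recalls the stored pattern
```

Substituting `update_async` for `update_sync` in the loop (with more iterations, since only one neuron changes per step) gives the asynchronous scheme.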
Attractor-state
Storage capacity ≈ 0.15·N patterns (N: # neurons)
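The ≈0.15·N capacity figure can be probed with a quick experiment (a sketch assuming the standard Hebbian storage rule; the helper name is mine): store p random patterns and count how many bits flip when each stored pattern is re-presented for one synchronous pass.

```python
import numpy as np

def stability_errors(N, p, seed=0):
    """Store p random bipolar patterns in an N-neuron Hopfield net and
    count the bits that flip when each stored pattern is re-presented
    (one synchronous pass; ties at field 0 counted as +1 for simplicity).
    Near-zero errors mean the stored patterns are fixed points."""
    rng = np.random.default_rng(seed)
    X = rng.choice([-1, 1], size=(p, N))
    W = X.T @ X / N                   # Hebbian weights
    np.fill_diagonal(W, 0)
    H = X @ W                         # local fields for each stored pattern
    recalled = np.where(H >= 0, 1, -1)
    return int((recalled != X).sum())

# Below the ~0.15*N limit stored patterns are stable; well above it,
# recall degrades noticeably.
low = stability_errors(N=100, p=5)    # load 0.05*N
high = stability_errors(N=100, p=40)  # load 0.40*N
print(low, high)
```

This only tests one-step stability of the stored patterns, which is the usual quick proxy for capacity; basins of attraction shrink well before patterns stop being fixed points.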
Computer Experimentation
Class-project
Erroneous Recall
Irrelevant results
[ Tank and Hopfield ; IEEE Trans. Circuits Syst. 1986; 33: 533-541.]:
Simple "Neural" Optimization Networks: An A/D Converter, Signal Decision Circuit, and a Linear Programming Circuit
showed that the resulting system is characterized by a Lyapunov function, which they termed Computational Energy, and which can be used to tailor the network to a given optimization task.
The system of coupled differential equations describing the operation of the continuous Hopfield net:

du_i/dt = -u_i/τ_i + Σ_{j=1..n} T_ij g_j(u_j) + I_i

with the sigmoidal activation

g(u) = (1/2) (1 + tanh(gain · u))

Notation (continuous vs. discrete net): neuronal outputs Y_i → V_i = g_i(u_i); biases I_i; weights W_ij → T_ij, with T_ij = T_ji and T_ii = 0.

The computational energy:

E = -(1/2) Σ_{i=1..n} Σ_{j=1..n} T_ij g_i(u_i) g_j(u_j) - Σ_{i=1..n} I_i g_i(u_i)

or, written in terms of the outputs V_i:

E = -(1/2) Σ_{i=1..n} Σ_{j=1..n} T_ij V_i V_j - Σ_{i=1..n} I_i V_i
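A numerical sanity check of the continuous dynamics (a sketch only: it assumes the high-gain limit and ignores the leak/integral term of Hopfield's full Lyapunov function, so it only verifies the net decrease of E over the trajectory, not strict monotonicity):

```python
import numpy as np

def g(u, gain=10.0):
    # Sigmoidal activation from the slides: g(u) = (1/2)(1 + tanh(gain*u))
    return 0.5 * (1.0 + np.tanh(gain * u))

def energy(T, I, V):
    # Computational energy E = -1/2 sum_ij T_ij V_i V_j - sum_i I_i V_i
    return -0.5 * V @ T @ V - I @ V

# Tiny 2-neuron net with symmetric weights and zero diagonal.
T = np.array([[0.0, 1.0], [1.0, 0.0]])
I = np.zeros(2)
u = np.array([0.1, -0.2])
tau, dt = 1.0, 0.01

E0 = energy(T, I, g(u))
# Euler-integrate du_i/dt = -u_i/tau + sum_j T_ij g_j(u_j) + I_i
for _ in range(1000):
    u += dt * (-u / tau + T @ g(u) + I)
E1 = energy(T, I, g(u))
print(E0, E1)   # the dynamics descend the computational energy
```

With symmetric T, zero diagonal, and high gain, the trajectory settles into a minimum of E, which is what licenses using the net as an optimizer in the next slides.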
When Hopfield nets are used for function optimization, the objective function F to be minimized is written as an energy function in the form of the computational energy E.
The comparison between E and F leads to the design of the network that can solve the problem, i.e. the definition of its links and biases.
The actual advantage of doing this is that the Hopfield net has a direct hardware implementation, enabling even a VLSI integration of the algorithm performing the optimization task.
An example:
Given a set of N vectors {X_i}, define the k among them that form the most compact cluster {Z_i}:

F({u_i}) = Σ_{i=1..N} Σ_{j=1..N} u_i u_j ||X_i - X_j||²

Clustering
Dominant-Mode

Membership variables: u_i = 1 if X_i ∈ {Z_i}, u_i = 0 if X_i ∉ {Z_i}, with Σ_i u_i = k.

The objective function F can be written easily in the form of the computational energy E:

T_ij = T_ji = -2 D(i,j) = -2 ||X_i - X_j||² for i ≠ j, T_ii = 0, I_i^obj = 0
With each pattern X_i we associate a neuron in the Hopfield network (i.e. #neurons = N). The synaptic weights are the pairwise distances (×(-2)). A pattern belongs to the compact cluster if the activation of its neuron is 1 when the net converges.
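To see that F really is the computational energy of the network with these weights, here is a brute-force sketch (Python/NumPy; the toy data and names are mine, and the best subset is enumerated directly rather than found by the network dynamics):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.2, (4, 2)),     # four points in a tight cluster
               rng.normal(5, 2.0, (4, 2))])    # four points in a loose one
N, k = len(X), 4

D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # D(i,j) = ||X_i - X_j||^2
T = -2.0 * D                                          # T_ij = -2 D(i,j), T_ii = 0

def F(members):
    """Objective F = sum_ij u_i u_j ||X_i - X_j||^2 for a k-subset."""
    u = np.zeros(N)
    u[list(members)] = 1
    return u @ D @ u

best = min(combinations(range(N), k), key=F)
print(best)                                # the indices of the tight cluster

# The objective equals the computational energy E = -1/2 u^T T u (I_i = 0):
u = np.zeros(N)
u[list(best)] = 1
assert np.isclose(F(best), -0.5 * u @ T @ u)
```

The enumeration replaces the network dynamics purely for verification; in the slides' scheme the same minimum is sought by letting the net with weights T_ij relax, with the Σu_i = k constraint handled by the network's operating regime.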
The principle
The solution :
The Boltzmann Machine: improving Hopfield nets by simulated annealing and adopting more complex topologies
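The core ingredient the Boltzmann machine adds is a stochastic unit; a minimal sketch of the standard logistic acceptance rule (illustration only, not code from the slides):

```python
import math
import random

def boltzmann_unit(h, T):
    """Stochastic binary unit: on with probability 1/(1 + exp(-h/T)),
    where h is the total input and T the temperature.
    As T -> 0 this recovers the deterministic Hopfield threshold rule."""
    return 1 if random.random() < 1.0 / (1.0 + math.exp(-h / T)) else 0

random.seed(0)
# At high temperature the unit is near-random; at low temperature it
# follows the sign of its input almost deterministically.
hot = sum(boltzmann_unit(1.0, T=100.0) for _ in range(1000)) / 1000
cold = sum(boltzmann_unit(1.0, T=0.01) for _ in range(1000)) / 1000
print(hot, cold)   # hot ~ 0.5, cold ~ 1.0
```

Annealing means running the net while gradually lowering T, which lets it escape the shallow spurious minima that trap a deterministic Hopfield net.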
"To think is to suffer", a twist on the familiar assertion of Descartes.
Oxycomanthus japonicus
There is astonishing diversity in the nervous systems of animals. From the basic, distributed nervous systems of jellyfish and sea anemones, to the centralized neural networks of squid and octopuses, to the complex brain structures at the terminal end of the neural tube in vertebrates, the variation across species is humbling. People may claim that more advanced species like humans are the result of an increasingly centralized nervous system produced through evolution. This claim of advancement through evolution is a common, but misleading, one. It suggests that evolution always moves in one direction: the advancement of species by increasing complexity.
Evolution may selectively enable body structures that are more enhanced and complicated, but it may just as easily enable species that have abandoned complex adaptations in favour of simplification. Brains, too, have evolved in this way. While the brains of some species, including humans, developed to allow them to thrive, others have abandoned their brains because they are no longer necessary.
For example, the ascidian, or sea squirt, which lives in shallow coastal waters and is a staple food in certain regions, has a vertebrate-like neural structure with a neural tube and notochord in its larval stage. As the larva becomes an adult, however, these features disappear until only very basic ganglia remain. In evolutionary terms this animal is a winner, because it develops a very simplified neural system better adapted to a stationary life in seawater. In the long run, however, evolutionary success will be determined by which species survives longer: humans with their complex brains (and their weapons) or the brainless sea squirts.
Emotional Intelligence, also called EI or EQ, describes an ability, capacity, or skill to perceive, assess, and manage the emotions of one's self, of others, and of groups.
Class-project Oral-Exams
Oral-Exam Appointments

Date    | Time     | AEM
31 May  | 1st hour | 794, 845, 893, 899
5 June  | 2nd hour | 711, 809, 874, 909, 923, 950, 979, 1024, 1227
7 June  | 3rd hour | 627, 887, 946, 960, 962, 980, 995, 1202, 1223
Further Inquiries