The Jets and Sharks model is an example of a competitive network
- bi-directional connections
- connection strengths are fixed, and set by hand
- individual units represent distinct concepts
- network takes many cycles to settle on a final pattern of activity
- models information retrieval from memory
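The settling behavior described above can be sketched with a toy network. This is only an illustration of the competitive dynamics, not the Jets and Sharks model itself: the weights, inputs, and update rule below are hand-picked assumptions.

```python
import numpy as np

# Three units with fixed, hand-set, bi-directional (symmetric) connections.
# Negative weights implement mutual inhibition: units compete.
W = np.array([[ 0.0, -0.2, -0.2],
              [-0.2,  0.0, -0.2],
              [-0.2, -0.2,  0.0]])

external = np.array([0.6, 0.5, 0.1])  # external input favors unit 0
a = np.zeros(3)                       # activations start at rest

# The network takes many cycles to settle on a final pattern of activity.
for _ in range(100):
    net = external + W @ a            # input from outside plus other units
    a = np.clip(a + 0.1 * (net - a), 0.0, 1.0)  # gradual, bounded update

print(a)  # unit 0 settles as the most active: it is "retrieved"
```

After settling, the weakly supported unit is suppressed by inhibition from the two stronger ones, which is the competitive behavior the model relies on for retrieval.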
Pattern associator networks
- one-way feed-forward connections between units
- connection strengths change over time with experience
- concepts are represented by distributed patterns of activity over
many units
- network produces output pattern in one step
- models associative learning by example
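A minimal sketch of one-step, feed-forward recall. The weights here are set by a simple Hebbian outer product (one common way to store a single association; the choice is illustrative), using the steak patterns introduced in the next section:

```python
import numpy as np

x = np.array([+1, -1, -1, +1])   # "image of steak" (input pattern)
t = np.array([-1, -1, +1, +1])   # "smell of steak" (output pattern)

# Store the association in the connection strengths (Hebbian outer product)
W = np.outer(t, x) / len(x)

# One-way, feed-forward: the output pattern is produced in a single step
y = np.sign(W @ x)
print(y)   # recovers the "smell of steak" pattern
```

Note that the concept is carried by the whole pattern of activity over the four output units, not by any single unit.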
Principle of Hebbian learning
When an axon of cell A is near enough to excite a cell B and repeatedly or
persistently takes part in firing it, some growth process or metabolic change
takes place in one or both cells such that A's efficiency, as one of the cells
firing B, is increased.
Donald Hebb, The Organization of Behavior (1949)
Basic idea: strengthen the connection between two units when their activities
agree; weaken it when they disagree.
Unit A | Unit B | Change in strength
-------|--------|-------------------
   +   |   +    |         +
   +   |   -    |         -
   -   |   +    |         -
   -   |   -    |         +
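The table is simply the sign of the product of the two activities: strength-change = learning-rate * activity-A * activity-B. A quick check (the learning rate of 0.1 is arbitrary):

```python
learning_rate = 0.1

# Enumerate all four combinations of unit activities from the table
rows = [(a_A, a_B, learning_rate * a_A * a_B)
        for a_A in (+1, -1) for a_B in (+1, -1)]

for a_A, a_B, change in rows:
    print(f"{a_A:+d} {a_B:+d} -> {'+' if change > 0 else '-'}")
```

Matching activities (+/+ or -/-) give a positive change; mismatched activities give a negative one, reproducing the table.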
Teaching a network by example
Start with a set of pattern associations:
+1 -1 -1 +1 ("image of steak") ----> -1 -1 +1 +1 ("smell of steak")
-1 +1 -1 +1 ("image of rose") ----> -1 +1 +1 -1 ("smell of rose")
1. Initialize connection strengths to zero or small random values
2. Choose an association to be learned
   +1 -1 -1 +1 ("image of steak") ----> -1 -1 +1 +1 ("smell of steak")
3. Present the association to the network as input and target
4. Compute response of network
5. Compare response to target pattern and calculate error ("delta")
6. Update connection strengths based on input activations and delta values
   strength-change = learning-rate * delta-value * input-activation
7. Go back to step 2 until error for all associations is acceptable
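The whole procedure can be sketched as a training loop over the two associations above. The learning rate, epoch limit, and error threshold are illustrative choices:

```python
import numpy as np

patterns = [
    (np.array([+1, -1, -1, +1]), np.array([-1, -1, +1, +1])),  # steak
    (np.array([-1, +1, -1, +1]), np.array([-1, +1, +1, -1])),  # rose
]

W = np.zeros((4, 4))                    # initialize strengths to zero
rate = 0.1

for epoch in range(50):
    total_error = 0.0
    for x, target in patterns:          # choose and present an association
        response = W @ x                # compute response of network
        delta = target - response       # compare to target: error ("delta")
        W += rate * np.outer(delta, x)  # delta * input-activation update
        total_error += np.abs(delta).sum()
    if total_error < 0.01:              # repeat until error is acceptable
        break

for x, target in patterns:
    print(np.sign(W @ x), target)       # trained responses match targets
```

Because these two input patterns happen to be orthogonal, learning one association does not disturb the other, and the error shrinks quickly.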
Properties of pattern associators
- ability to generalize behavior to novel inputs, beyond the original training patterns
- resistance to noise
- graceful degradation
- can learn to behave as if following a rule
- single-layer networks suffer from limitations (example: XOR problem)
- multi-layer networks can overcome these limitations using the
backpropagation learning algorithm
- many different activation functions are possible
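The XOR limitation can be demonstrated directly: no single-layer unit can fit all four XOR cases, so delta-rule training stalls with large errors. This sketch uses a bipolar encoding and a constant bias input (both illustrative choices):

```python
import numpy as np

# Four input patterns; the third component is a constant bias input.
X = np.array([[-1, -1, 1],
              [-1, +1, 1],
              [+1, -1, 1],
              [+1, +1, 1]], dtype=float)
t = np.array([-1, +1, +1, -1], dtype=float)  # XOR of the first two inputs

w = np.zeros(3)
for _ in range(1000):                 # delta-rule training, single layer
    for x, target in zip(X, t):
        delta = target - w @ x
        w += 0.05 * delta * x

print(X @ w)  # responses hover near 0: no weights fit all four targets
```

The responses never approach the +1/-1 targets because XOR is not linearly separable; a hidden layer (trained with backpropagation) is needed to solve it.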