HyperNet
1994-05-21
----------------------------------------------------------------------
HyperNet 0.5 (C) 1994 David Wallace Croft. All rights reserved.
The author may be reached via CompuServe at [76600,102].
HyperNet is a fully-connected artificial neural network that learns,
stores, and plays musical patterns. It requires the DOS operating
system and uses the normal internal speaker of the PC.
----------------------------------------------------------------------
Installation
Create a new sub-directory for hypernet.
C:\> md hypernet
Change your default directory to the new directory.
C:\> cd hypernet
Copy the hypernet files to the new directory.
C:\HYPERNET> copy a:*.* *.*
----------------------------------------------------------------------
Quick Start
Start HyperNet by entering HYPERNET at the DOS prompt.
C:\HYPERNET> hypernet
At the next prompt, press ENTER to accept the default option and a
demonstration of a song being taught to HyperNet will begin.
Option? (0..20) [1]:
To stop the demonstration and return to the menu, press ENTER.
To quit the program HyperNet, select option 0 at the menu prompt.
Option? (0..20) [1]: 0
----------------------------------------------------------------------
Applications
HyperNet learns, stores, recognizes, identifies, predicts, and
composes musical patterns.
Learning and Storage
If music with uncorrelated background noise and errors is played to
the network repetitively, an averaged clean version of the music will
be learned for playback. The network could store several songs if
they were sufficiently different. If, however, the songs share long
stretches of exactly identical choruses, HyperNet may forget which
song it was playing and switch between them.
Recognition and Prediction
If music that is sufficiently correlated with a previously-learned
pattern is played to the network, HyperNet will begin to play its
stored version of the music. The network could also be trained to
play back or silently display a sequence of notes that would
uniquely identify the title of the song.
Composition and Reinforcement
When the menu option "training" is not on and the network is allowed
to play freely, the network will gradually settle into a stable
pattern, or tune. During the settling process, one can store a
good tune by saving the network's weights to disk. One can also
re-initiate the creative process by adding random de-stabilizing
inputs.
Non-Musical Applications
Although HyperNet was designed to learn the patterns of music, many
other temporal and stationary patterns may be learned as well. By
assigning notes (that is, their associated neurons) to particular
events or states, you can hear the patterns. Thus, every system
or function has its own "music."
To learn the time-varying relationships between the inputs and outputs
of a non-musical system (an arbitrary state-machine), one can assign
a musical note (that is, a neuron) to each of the states in the system
to be modeled. For example, if on any given day you decide it is
sunny, cloudy, raining, or snowing, you could attempt to train HyperNet
to sound one of four different notes for each type of weather on a
given day. You would then be able to hear HyperNet play a tune that
changes as the simulated seasons change.
For stationary patterns, such as the input to output relationship of a
function, a musical note could likewise be assigned to each input or
desired output to create a "song" that starts and either quickly
comes to a conclusion or continues to loop. For example, if you
wanted a note to sound every time two separate events occurred
simultaneously that would cause the stock market to rise, you could
assign one neuron, or note, to each of the two events and a third
neuron to represent the rising of the stock market. You would then
train HyperNet to play the third note when and only when the first and
second notes are on at the same time.
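As a minimal illustration of this last example (a Python sketch under assumed conventions, not HyperNet's Ada implementation), a single threshold neuron can be trained with a simple perceptron-style rule to fire when and only when both event neurons are on:

```python
# Sketch: train a third "stock market" neuron to fire only when both
# event neurons fire at once. A simple threshold unit with
# perceptron-style weight updates stands in for HyperNet's own rule.

def fires(w, bias, x):
    """Threshold neuron: on when the weighted input exceeds zero."""
    return 1 if w[0] * x[0] + w[1] * x[1] + bias > 0 else 0

def train_and_neuron(epochs=20, rate=0.1):
    w, bias = [0.0, 0.0], 0.0
    # Desired behavior: fire (1) only when both inputs are on.
    patterns = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    for _ in range(epochs):
        for x, target in patterns:
            error = target - fires(w, bias, x)
            w[0] += rate * error * x[0]
            w[1] += rate * error * x[1]
            bias += rate * error
    return w, bias

w, b = train_and_neuron()
print([fires(w, b, x) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# → [0, 0, 0, 1]
```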
----------------------------------------------------------------------
Future Improvements
The ability to load and save learned patterns to a file.
Optional generation of random inputs to enhance creativity.
An increase in network size to improve learning.
Screen display in the format of musical score.
Optional input from a file, MIDI device, or microphone.
Optional sound card output.
Generation of non-pure tones such as speech by combining frequencies
generated by fast "clicks."
----------------------------------------------------------------------
Experiments
[Needs to be revised to match current software state]
After running the default training program given in "Quick Start"
for about 5 minutes, stop the program by pressing ENTER. Select
options 2, 19, and 13 to turn off learning and training and reset the
network. Select option 1 to begin the experiment. Initially, nothing
will happen as the network has been reset. Press the following keys
in order quickly: 1, 2, 3, 4, and 5. As this is the beginning of the
default training song, the trained network will begin to play and
finish the full song by producing 1234567890. This is known as
"pattern completion".
Turn learning back on by selecting option 2. Turn off the tones
by selecting option 11. Turn off the screen refresh by selecting
option 10 and entering 0. Start the experiment by entering 1.
You will hear a rhythmic pattern of fast clicks. Each click
means that one or more neurons are turning on at that time.
Press the space bar to refresh the screen. Note how the screen
refresh delays the clicking while it is working. Press many of the
number keys in any order and listen to how the clicking changes.
The rhythmic clicking will slowly change over time, generally moving
from an unsteady series of beats to a final stable pattern. Press
ENTER to return to the menu.
Turn the tones back on by selecting option 11. Turn training back
on by selecting option 19. Change the training song by selecting
option 20 and entering "9876543210". Begin the training by selecting
option 1. You will hear a fast sequence of tones as training
progresses. After a minute, press ENTER to return to the menu. Turn
learning and training off. Reset the network by selecting option 13.
Set the refresh rate to 1. Start HyperNet again. Press "1234567890".
Notice that HyperNet settles into some strange pattern. Stop the run
and reset the network again with option 13. Restart the run and press
"9876543210". Notice that HyperNet quickly settles into that pattern
which it became accustomed to during training.
Discretize the real intermediate values of the sine function to 10
levels. Assign each of these values to one of 10 musical notes.
Create a training file with the 10 musical notes firing over time
in a sinusoidal fashion with a frequency of your choice.
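The discretization step can be sketched as follows (Python; the exact layout of a HyperNet training file is not specified here, so the raster format below is an assumption):

```python
# Sketch: discretize one period of a sine wave into 10 levels and emit
# a spike raster with one line per note-neuron, '!' where it fires.
# The file format is assumed, not HyperNet's documented layout.

import math

STEPS, LEVELS = 40, 10

def sine_levels(steps=STEPS, levels=LEVELS):
    """Map sin(t) over one period into integer levels 0..levels-1."""
    out = []
    for t in range(steps):
        value = math.sin(2 * math.pi * t / steps)       # in -1..1
        out.append(min(levels - 1, int((value + 1) / 2 * levels)))
    return out

def raster_lines(level_sequence, levels=LEVELS):
    """One text line per neuron: '!' when its level is active, '.' otherwise."""
    return ["".join("!" if lvl == i else "." for lvl in level_sequence)
            for i in range(levels)]

for line in raster_lines(sine_levels()):
    print(line)
```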
----------------------------------------------------------------------
Future Experiments
If microphone input is available, place the microphone in front
of a radio or television for a few days. Frequently repeated jingles
will gradually be stored and parroted back.
----------------------------------------------------------------------
Neural Network Basics
The following is an introduction to the workings of neurons and
networks of neurons. As explanatory devices, the analogies of a
glass of water and an electronic circuit are interspersed.
A neuron is the basic information processing cell in the brain.
______
----> / \
Inputs ---->| Neuron |----> Output
----> \______/
A neuron has inputs, such as ion currents, and an output, such as
a membrane voltage potential.
A glass can be filled by a current of water. When tipped over, it
will spill forth its contents as "output".
_______ _______
--> / \ Same --> / \
Input -->| State X |--> 0 Input -->| State Y |--> 1
--> \_______/ Again --> \_______/
A neuron has an internal stored state which determines the output it
will generate for a set of inputs at any given time.
This internal stored state of a glass of water would be a measure of
how filled it is at any given time.
Input Current --> *----*----* Output Voltage
|
__|__
_____ Capacitor
|
|
=== Ground
=
The current state of a neuron may be stored as its present membrane
voltage level, which is originally polarized at some resting
potential, or voltage, much as a charge is stored on a capacitor.
Positive input currents tend to raise, or depolarize, the voltage.
One could also think of this as a trickle of water filling a glass
that starts out half full. The "capacitance" would be some measure
of the capacity of the glass to hold water, such as its diameter.
Voltage (-) and "Firing" Output Current (*)
+1| *****_/
| * _/
| *_/
Threshold =>|........_*.........
Voltage | _/ *
| _/ *
0|_*_*.*.*.*.........--> time
_______
| _ |
I --> *----*----|__| |__|---*
| |_______|
__|__ Threshold
C _____
|
|
=== Ground
=
The neuron will "fire" if the membrane voltage, driven by the input
currents, exceeds some threshold value, at which point the voltage
rises rapidly on its own. This can be modeled as a threshold-sensing
device which is "on" only when the voltage on the capacitor exceeds
some positive level.
|------| glass |-------| /\ Tipping
| | | | / \
| | | | /wwwwwww/w
|-----|wwwwww|-----| pivot |www|www| /wwwwwww/ w
| |water | | |www|www| /wwww|ww/ w
| -------- | stand ---|--- \www|/ w
| Front View | Side | View \/| w
= = = = www
For our water-in-a-cup analogy, consider that the cup of water is on
a steep slope or pivot. When it is filled with enough water, it will
tip over and spill all of its contents and then right itself.
_______
| _ | Voltage
I --> *----*-----*----------|__| |__|--* |
| | |_______| +1|
__|__ \ Switch <----/ | _/|
C _____ | | _/ |
| = 0|___/....|....
| === - | |
| = Battery | |
| === + -1| |
| | ---------------> time
*_____*
|
===
=
After a neuron fires, its voltage will drop very suddenly to a value
below its resting potential, or hyperpolarize. In our model, the
threshold-sensing output device also controls a switch to a negative
battery.
In the case of the cup, assume that, once tipped, it is completely
empty instead of half empty as it started.
_______
| _ | Voltage
I --> *----*----*----------*-----|__| |__|--* |
| | | |_______| +1|
__|__ | \ S <---/ | _/|
C _____ \ | | _/ |
| / Slow = 0|___/....|....._
| \ Leak === - | | _/
| / = E | | _/
| \ R === + -1| |/
| | | --------------->
*____*__________* Time
|
===
=
The voltage of a neuron that is not being depolarized by positive
input currents will gradually return to the resting potential, unless
it has exceeded the threshold, whether it must drop from a depolarized
(positive) state or rise from a hyperpolarized (negative) state.
For this purpose, we add a small leakage conductance (large resistor).
This completes the "leaky integrate and fire" neuron model.
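The model built up above can be sketched in a few lines (illustrative Python with assumed constants; HyperNet itself is written in Ada):

```python
# Sketch of the "leaky integrate and fire" neuron: a capacitor voltage
# driven by input current, a firing threshold, a post-spike reset
# below rest (hyperpolarization), and a leak back toward rest.

REST, THRESHOLD, RESET = 0.0, 1.0, -1.0
LEAK, DT = 0.1, 1.0

def step(v, input_current):
    """Advance the membrane voltage one time step; return (new_v, fired)."""
    if v >= THRESHOLD:
        return RESET, True                   # fire, then hyperpolarize
    leak = LEAK * (REST - v)                 # small conductance toward rest
    return v + DT * (input_current + leak), False

v, spikes = REST, []
for t in range(30):
    current = 0.25 if t < 20 else 0.0        # drive for 20 steps, then stop
    v, fired = step(v, current)
    if fired:
        spikes.append(t)
print(spikes)
# → [5, 15]
```

With the drive removed after step 20, the voltage simply leaks back toward rest and no further spikes occur.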
|------|
| |
| | | Large Pool of Water |
|-----|wwwwww|-----| |wwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwww|
| |wwwwww| | |wwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwww|
| ---*---- | |wwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwww|
| tiny | hole | ----*-----------------------------------
= \ = /
\_______________/
small hose
To model the small leakage conductance with our glass, imagine that
there is a tiny hole in the bottom of it that allows a slight leak
of water in and out through an attached small hose. This hose is
also connected to the bottom of a large pool of water whose surface
is level with the mid-point of the glass. When the glass is empty,
the leak will slowly fill the glass up to the half-way mark. When
the glass is nearly full but has not tipped yet, the leak will slowly
drain the glass down to the half-way mark.
The output of one neuron may drive the input to another neuron.
To "excite" a neuron is to drive it with positive input currents
which raise its voltage potential. This driving neuron, when firing,
releases a chemical known as neurotransmitter which creates input
currents to the driven neuron.
If our cup full of water spills, the flow may pour into other cups
below it, possibly causing them to tip over as well.
Current
/ -------->
*----/ ----/\/\/\/\/\/----* Neurotransmitter Switch
| Switch Conductance to Excitatory Voltage Source
|
===
= +
=== Battery
= -
|
|
=== Ground
=
This neurotransmitter is released at a junction between the driving
neuron, which generates an output, and the driven neuron, which
receives inputs, known as a "synapse". Here, we model the
neurotransmitter as a switch which allows current to flow.
___________________ ____________________
/ \ --------- / \
| Pre-Synaptic Neuron |---->| Synapse |----> | Post-Synaptic Neuron |
\-------------------/ --------- \--------------------/
Since the information flows from the driving neuron across the synapse
to the driven neuron, the driving neuron is called the "pre-synaptic"
neuron and the driven neuron is called the "post-synaptic" neuron.
_
/|
/
*---/\/\/X/\/\------* Adjustable Conductance
/
/
The effectiveness of the neurotransmitter in generating a weak or
strong current is partially determined by the conductivity, or
"weight", of the synapse. We can model this as a variable conductance
(or potentiometer).
___________
| | Tipped
|WW|WWWWW|W Glass
---|-------W
| W
= \ W \ @ Adjustable Shutoff Valve
Pipe \ W \|
\ W|
\w
w Controlled Drip
w
|--w--|
| w |
|WW|WW|
|WW|WW|
---|---
|
=
Using the water analogy, we can visualize a pipe running from the
spillway of one cup to the next with the water flow controlled by an
adjustable valve.
OFF _ ON
/ |\
*---/ ---.....X.....------* Disconnected Synapse
S \
A synaptic weight of zero means that the pre-synaptic neuron is
effectively disconnected from the post-synaptic neuron. No current
will flow.
Current
/ <-----
*---/ ---/\/\/\/\----*
| S G
|
=
=== -
= E Hyperpolarizing Inhibition
=== +
|
|
=== Ground
=
Synaptic weights may also be negative, generating negative currents
which lower, or hyperpolarize, the membrane voltage.
"Hyperpolarizing inhibition" makes the neuron less likely to fire.
Assume that our glass has a hole in the bottom which is opened when
the flow from an inhibiting glass presses on a spring-loaded lever or
water wheel. When the glass is drained empty, it is
"hyperpolarized".
______
/ / \
*---/ ---/\/\/\/\---->| Neuron |
| S G \------/
|
| Shunting Inhibition
=== Ground
=
"Shunting" or "silent inhibition" generates currents which drive
the membrane voltage to its resting value, or "ground", whether it
has to lower the voltage from a depolarized positive value or raise
it from a hyperpolarized negative value. Whereas excitatory and
hyperpolarizing weights can be considered as positive and negative
conductivities to a positive potential source, shunting weights
should be considered as conductivities to ground.
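The distinction can be summarized numerically (a Python sketch with assumed source potentials of +1, -1, and 0; not HyperNet's code):

```python
# Sketch of the three current types: excitatory and hyperpolarizing
# weights are conductances to positive and negative potential sources;
# shunting weights are conductances to ground (the resting potential).

E_EXCITATORY = 1.0      # positive voltage source
E_INHIBITORY = -1.0     # negative voltage source
E_REST = 0.0            # ground / resting potential

def input_current(v, g_excite, g_hyper, g_shunt):
    """Total synaptic current into a neuron at membrane voltage v."""
    return (g_excite * (E_EXCITATORY - v)     # drives v up toward +1
            + g_hyper * (E_INHIBITORY - v)    # drives v down toward -1
            + g_shunt * (E_REST - v))         # drives v toward rest

# Shunting inhibition pulls a depolarized neuron down and a
# hyperpolarized neuron up -- always toward rest:
print(input_current(0.5, 0.0, 0.0, 1.0))      # → -0.5
print(input_current(-0.5, 0.0, 0.0, 1.0))     # → 0.5
```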
The glass may have other small holes in its bottom with tubes
running to the large pool which, when opened by flowing water from
another glass, bring the water in the glass back to the level of
the pool.
N---S---N
|\ /| Networks of Neurons (N) and Synapses (S)
| S S |
| \ / |
S N S N--S-->--\
| / \ | ==>N
| S S | N--S-->--/
|/ \|
N---S---N
A collection of neurons connected by synaptic weights is called a
neural network. A neuron in a neural network may have non-zero
weights to none, some, or all of the neurons in the network.
_____________
/ \ Autapse
| --- |
---*----S--->| N |-*-->
---
A neuron may have a synapse to itself known as an "autapse" because it
automatically drives an input to itself after some transmission delay.
Input Hidden Output
Neurons
/\ /\ /\
| | | | | |
-->N---->N---->N-->
Input Sensation \ / \ / Output Behavior
/ \ / \
-->N---->N---->N-->
Some of the neurons in a neural network may be considered as "input"
neurons, such as those neurons receiving sensory inputs, and others as
"output" neurons, such as those generating behavioral output. Neurons
in between are called "hidden" neurons or interneurons.
A "feed-forward" neural network is arranged so that there are no
feedback paths from a neuron to itself.
Information in a feed-forward network generally flows from the input
neurons to the middle or "hidden" neurons to the output neurons.
//==<===>==\\
|| || Three Neurons Fully-Interconnected
N<===>N<===>N Including Self-Connections
/ \ / \ / \
\>/ \>/ \>/
A "recurrent" or "feedback" network has feedback paths.
A "fully-interconnected" network is a recurrent network which has
connections from every neuron to every other neuron going both ways.
The "neuron state" is the current output of a neuron, usually 1/0,
"on"/"off", "firing"/"resting" (and possibly "tired" also).
The "network state" is the current value of all of the neuron states
in the network.
Pattern State of a Network Displayed in Spike Raster Format
(Pattern State "Wave")
Neuron '!' = Firing Spike
0 ! ! ! ' ' = Not Firing
1 ! ! ! ! !
2 ! ! !
------------------------> Time
The "pattern state" is a particular dynamic looping oscillation or
stable repeated sequence of network states over time. Which patterns
will emerge depends on the synaptic conductances, or weights,
connecting the neurons. Pattern states may not emerge if the weights
in the network connections lead to instability.
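The spike raster format shown above is easy to generate from a record of network states; a small Python sketch:

```python
# Sketch: print a record of network states over time in spike raster
# format ('!' = firing spike, ' ' = not firing), one row per neuron.

def raster(history):
    """history[t][i] is 1 if neuron i fired at time step t."""
    neurons = len(history[0])
    lines = []
    for i in range(neurons):
        row = "".join("!" if state[i] else " " for state in history)
        lines.append(f"{i} {row}")
    return "\n".join(lines)

history = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0], [0, 1, 0]]
print(raster(history))
```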
"Learning" is what a synaptic weight does when it modifies its
conductance to achieve a desired output for a given input set.
"Training" is the process of causing the weights to learn.
Training may be "supervised", "unsupervised", or "reinforcement".
Supervised training forces the network to learn to generate a desired
output given an example training input. An example of this would be
to "clamp" the voltages of neurons in a network to desired values and
allow the synaptic weights to learn to generate those voltages on their
own.
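A minimal sketch of clamped training (Python; this uses the generic Hebbian rule, not HyperNet's FISL variant):

```python
# Sketch: supervised training by "clamping". Neuron outputs are forced
# to the desired pattern while weights between co-active neurons are
# strengthened (generic Hebbian rule; rate chosen for exact arithmetic).

def hebbian_clamp(weights, clamped_states, rate=0.25):
    """With outputs clamped to desired values, co-active pairs grow stronger."""
    n = len(clamped_states)
    for i in range(n):
        for j in range(n):
            if i != j and clamped_states[i] and clamped_states[j]:
                weights[i][j] += rate

n = 4
weights = [[0.0] * n for _ in range(n)]
for _ in range(5):                        # repeat the training pattern
    hebbian_clamp(weights, [1, 1, 0, 0])  # clamp neurons 0 and 1 "on"
print(weights[0][1], weights[2][3])       # → 1.25 0.0
```

After training, the strengthened weights let the network regenerate the clamped pattern on its own.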
Unsupervised training allows the network to learn to generate outputs
given training inputs in a manner which seeks stability with the hope
that the outputs will carry some unforeseen information. Here, only
the inputs are given and the outputs are allowed to settle where they
may.
Reinforcement training is similar to supervised training except that
an indication of performance error is given instead of the exact
desired, correct outputs. An example of this would be to simply
indicate "yes" or "no" as the network settles into a desirable or
undesirable pattern.
----------------------------------------------------------------------
HyperNet Operational Specifics
HyperNet uses excitatory and shunting inhibitory weights.
The transmission delay between neurons is limited to a maximum of 100
times the time delta (dt).
HyperNet is sensitive to timing issues, such as screen output delays.
The user performs supervised training on the network by firing neurons
using the number keys at the desired time intervals or by creating a
training file in spike raster format with the filename HyperNet.Mus.
When the "Tones On" parameter is off, firing neurons emit "clicks".
HyperNet uses "autapses," which extends the common meaning of "fully-
interconnected."
HyperNet uses a personal variant of the Hebbian Learning Rule known as
FISL which tends to drive the fully-interconnected network into
stability. This rule can be useful in implementing supervised,
unsupervised, and reinforcement training.
----------------------------------------------------------------------
Ada Programming Language
HyperNet was written in Ada, the "International Software Engineering
Programming Language".
The author may be contacted for access to his personal standard Ada
and Meridian OpenAda for DOS software libraries used to create
HyperNet and other Ada programs.
----------------------------------------------------------------------
Acknowledgments
My thanks to Professor James Boyk of the California Institute of
Technology for his guidance while I was in his class "EE/Mu 107c:
Projects in Music and Science."
Transcribed to HTML on 1997-10-27 by
David Wallace Croft.