
The Multi-Layer Network Model

One way of capturing the complexity of the dendritic mass is to develop a model that simulates the integration that happens in the dendrites as if it were a neuron. This technique is often called the Hidden Layer technique: the dendritic complexity is hidden within the physical neuron, but must be dealt with if we want a valid model. We build this complexity into the network by adding hidden layers that simulate the dendritic complexity of the natural neurons.

A single layer of neurons in a biological system would therefore require a minimum of three layers in the Multi-layer Network model to simulate it. The first layer, or input layer, provides the interface to the network; the hidden layer(s) provide the dendritic complexity; and the output layer provides the neural integration. If we base our neurons on the Hodgkin–Huxley (HH) model, we get the second-order, or spiking, output.
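The three-layer structure just described can be sketched as a minimal feedforward network. The code below is an illustrative sketch only: the layer sizes, the sigmoid activation (a rate-based stand-in for a spiking HH neuron), and the random weight initialisation are all assumptions chosen for the example, not part of the model itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # A simple rate-based activation; an HH-style spiking neuron
    # would replace this with membrane dynamics.
    return 1.0 / (1.0 + np.exp(-x))

class ThreeLayerNetwork:
    """Input layer -> hidden layer (dendritic complexity) -> output layer."""

    def __init__(self, n_in, n_hidden, n_out):
        # Synaptic weights; the sizes are arbitrary for illustration.
        self.w_hidden = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.w_out = rng.normal(0.0, 0.5, (n_hidden, n_out))

    def forward(self, x):
        hidden = sigmoid(x @ self.w_hidden)   # dendritic integration
        return sigmoid(hidden @ self.w_out)   # neural integration at the soma

net = ThreeLayerNetwork(n_in=4, n_hidden=8, n_out=2)
output = net.forward(np.ones(4))
```

The hidden layer here plays the role of the dendritic mass: the input layer never connects directly to the output, so every signal is first re-combined through the hidden weights before being integrated.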

This is all we need to simulate the network connections that make up a neural structure; however, it is important to realize that it is not all we need to adequately simulate a real neuron. One of the most important additions to this structure is some sort of learning rule that sets the weights of the synapses. Although early learning rules were based on the Delta rule, which required an error function to determine how to set the weights, current models tend to have both a pre-synaptic rule and a post-synaptic rule that influence the weights.
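The two kinds of rule mentioned above can be contrasted in a short sketch: a Delta-rule update needs an explicit error signal, while a Hebbian-style update is driven only by pre- and post-synaptic activity. The function names, learning rates, and the specific Hebbian form below are assumptions made for the example.

```python
import numpy as np

def delta_rule_update(w, x, target, lr=0.1):
    """Error-driven update: requires a target so an error can be computed."""
    output = float(x @ w)
    error = target - output
    return w + lr * error * x

def hebbian_update(w, pre, post, lr=0.1):
    """Activity-driven update for synapses onto one post-synaptic neuron:
    a weight grows whenever its pre-synaptic input and the post-synaptic
    neuron are active together (post here is a scalar activity level)."""
    return w + lr * pre * post

# Hypothetical toy example: three synapses onto a single neuron.
w = np.zeros(3)
x = np.array([1.0, 0.0, 1.0])
w = delta_rule_update(w, x, target=1.0)   # moves output toward the target
w = hebbian_update(w, pre=x, post=1.0)    # co-activity strengthens weights
```

Note the difference in what each rule needs to know: the Delta rule presupposes a teacher that supplies the target, whereas the Hebbian form uses only locally available pre- and post-synaptic activity, which is closer to the pre-/post-synaptic rules used in current models.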

Depending on whether you want a biological model of the brain, or whether you want to capture the flexibility of neural networks for some other purpose, neural network theory splits at this point into two separate families of technology: Biological Neural Networks, which simulate neurons, and Artificial Neural Networks, which are an offshoot of Artificial Intelligence.