Artificial neuron: Difference between revisions

From Citizendium
imported>Felipe Ortega Gutiérrez
Revision as of 00:37, 5 July 2007

Artificial neurons are processing units based on the biological neural model. The first artificial neuron model was created by McCulloch and Pitts, and newer, more complex models have appeared since. Because the connectivity among biological neurons is far higher, artificial neurons must be considered only an approximation of the biological model.

Artificial neurons can be organized and connected in order to create artificial neural networks, which process the data carried through the neural connections in different layers. Learning algorithms can also be applied to artificial neural networks in order to modify their behavior.

[Figure: McCulloch-Pitts neuron with 4 inputs.]

Behavior

A neuron may have multiple inputs. Each input has an assigned value called a weight, which represents the strength of the connection between the source and destination neurons, and each input signal value <math>x_i</math> is multiplied by its weight <math>w_i</math>.

The sum of all input values multiplied by their respective weights is called the activation, or weighted sum:

<math>a = \sum_{i=0}^n w_i x_i</math>

After the activation is produced, a function modifies it, producing an output. That function is often called the transfer function, and its purpose is to filter the activation. The output value <math>y</math> can be expressed as:

<math>y = \varphi \left( \sum_{i=0}^n w_i x_i \right)</math>
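The weighted sum and transfer function described above can be sketched in a few lines of Python; the function and variable names here are illustrative, not from the article:

```python
def neuron_output(weights, inputs, transfer):
    """Compute the output of a single artificial neuron."""
    # Activation: the weighted sum a = sum_i w_i * x_i
    a = sum(w * x for w, x in zip(weights, inputs))
    # The transfer function filters the activation into the output y
    return transfer(a)

# Identity transfer function, just to show the pipeline
y = neuron_output([0.5, -0.2, 0.1], [1.0, 2.0, 3.0], lambda a: a)
```

Any transfer function (step, sigmoid, or otherwise) can be passed in, since it only has to map the scalar activation to the output.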

Transfer Functions

Transfer function is the name given to the functions that filter the activation value. These functions can be discrete or continuous.

Step function

The step function (also called a hard limiter) is used to produce binary outputs. In this function, the result is 0 if the activation is less than a value called the threshold, often symbolized with theta (<math>\theta</math>):

<math>y = \left\{ \begin{matrix} 1 & \mbox{if }a \ge \theta \\ 0 & \mbox{if }a < \theta \end{matrix} \right.</math>
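As a sketch of the hard limiter (names illustrative): with unit weights and a threshold of 1.5, a two-input step neuron behaves like a logical AND gate, a classic McCulloch-Pitts example.

```python
def step(a, theta=0.0):
    """Hard-limiter transfer function: 1 if a >= theta, else 0."""
    return 1 if a >= theta else 0

# A two-input neuron with unit weights and threshold 1.5 computes AND:
weights = [1.0, 1.0]
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    a = sum(w * xi for w, xi in zip(weights, x))  # weighted sum
    print(x, step(a, theta=1.5))                  # only [1, 1] fires
```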

Sigmoid

The sigmoid function is used to produce continuous values. It is an S-shaped curve whose output lies between 0 and 1:

<math>\sigma \left( a \right) = \frac{1}{1+\exp\left(\frac{-(a-\theta)}{\rho}\right)}</math>

where <math>a</math> is the activation, <math>\theta</math> is the threshold (which can be zero, simplifying the equation), and <math>\rho</math> is a value which defines the curvature of the sigmoid.
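A minimal Python sketch of this sigmoid; the parameter names `theta` and `rho` mirror the symbols above, and their default values are illustrative:

```python
import math

def sigmoid(a, theta=0.0, rho=1.0):
    """S-shaped transfer function: theta shifts the midpoint,
    rho controls the curvature (smaller rho -> steeper curve)."""
    return 1.0 / (1.0 + math.exp(-(a - theta) / rho))

# At the threshold the output is exactly 0.5; far above it, near 1
print(sigmoid(0.0))   # midpoint
print(sigmoid(10.0))  # saturates toward 1
```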

Impulse pass

Depending on the network model, neurons can pass their impulses forward to their terminals or backwards. The backward pass can be observed in learning algorithms such as backpropagation.
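The article only names backpropagation; as an illustrative sketch under simplifying assumptions (a single sigmoid neuron, squared error, and an arbitrary learning rate, none of which come from the article), a forward pass followed by a backward pass might look like:

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

x = [1.0, 0.5]       # inputs
w = [0.2, -0.4]      # initial weights (arbitrary)
target = 1.0         # desired output
lr = 0.5             # learning rate (arbitrary for this sketch)

for _ in range(100):
    # Forward pass: activation, then sigmoid output
    a = sum(wi * xi for wi, xi in zip(w, x))
    y = sigmoid(a)
    # Backward pass: error gradient dE/da for E = (y - target)^2 / 2
    delta = (y - target) * y * (1.0 - y)
    # Propagate the error back into each weight
    w = [wi - lr * delta * xi for wi, xi in zip(w, x)]
```

After these updates, the neuron's output moves toward the target, which is the essential idea of the backward pass.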

Analogy to Biological Neurons

In biological neurons there is a similar behavior. Inputs are electrical pulses transmitted to the synapses (terminals on the dendrites). These pulses produce a release of neurotransmitters, which may alter the dendritic membrane potential (the post-synaptic potential). The post-synaptic potentials travel toward the axon hillock, where the neuron sums all the post-synaptic potentials received and fires an output along the axon if the total sum exceeds a threshold.