(Artificial neural networks)
-
Artificial neural networks are computational models inspired by biological neural networks, and are used to approximate functions that are generally unknown. In particular, they are inspired by the behaviour of neurons and the electrical signals they convey between input (such as from the eyes or nerve endings in the hand), processing, and output from the brain (such as reacting to light, touch, or heat).
(Backpropagation)
-
Backpropagation is a method used in artificial neural networks to calculate the gradient of the error with respect to the network's weights, which is then used to update those weights. Backpropagation is shorthand for "the backward propagation of errors," since the error is computed at the output and distributed backwards through the network's layers. It is commonly used to train deep neural networks.
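A minimal sketch of the idea in Python, assuming an illustrative two-layer network with sigmoid activations and a squared-error loss (the layer sizes, data, and learning rate are arbitrary choices, not part of the definition above):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative network: 2 inputs -> 3 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)

x = np.array([0.5, -1.0])   # one training input
y = np.array([1.0])         # its target output

# Forward pass: compute activations layer by layer.
h = sigmoid(W1 @ x + b1)
y_hat = sigmoid(W2 @ h + b2)

# Backward pass: the error computed at the output is distributed backwards,
# giving the gradient of the squared error with respect to every weight
# (bias gradients, equal to the deltas, are omitted for brevity).
delta2 = (y_hat - y) * y_hat * (1 - y_hat)     # output-layer error
grad_W2 = np.outer(delta2, h)
delta1 = (W2.T @ delta2) * h * (1 - h)         # hidden-layer error
grad_W1 = np.outer(delta1, x)

# A gradient-descent step then uses these gradients to update the weights.
lr = 0.1
W2 -= lr * grad_W2
W1 -= lr * grad_W1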
-
Tags: Algorithms, Artificial Neural Networks
-
In a feedforward neural network, information moves in only one direction, forward: from the input nodes, data passes through the hidden nodes (if any) to the output nodes. There are no cycles or loops in the network. Feedforward networks can be constructed from different types of units, e.g. binary McCulloch-Pitts neurons, the simplest example being the perceptron. Continuous neurons, frequently with sigmoidal activation, are used in the context of backpropagation of error.
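A minimal sketch of the one-directional flow in Python, showing both a binary threshold unit and a continuous sigmoidal unit; the layer sizes and weights below are made-up illustrative values:

import numpy as np

def step(z):        # binary McCulloch-Pitts-style threshold unit
    return (z >= 0).astype(float)

def sigmoid(z):     # continuous unit, as used with backpropagation
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(x, layers, activation):
    # Data passes through each (W, b) layer in order; no cycles or loops.
    a = x
    for W, b in layers:
        a = activation(W @ a + b)
    return a

# Illustrative 2 -> 2 -> 1 network.
layers = [
    (np.array([[1.0, -1.0], [0.5, 0.5]]), np.array([0.0, -0.5])),
    (np.array([[1.0, 1.0]]), np.array([-0.5])),
]

x = np.array([1.0, 0.0])
print(feedforward(x, layers, step))     # binary units
print(feedforward(x, layers, sigmoid))  # sigmoidal units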
-
Invented in 1957 by Frank Rosenblatt at the Cornell Aeronautical Laboratory, a perceptron is the simplest neural network possible: a computational model of a single neuron. A perceptron consists of one or more inputs, a processor, and a single output. It follows the "feed-forward" model, meaning inputs are sent into the neuron, processed, and turned into an output.
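A minimal perceptron sketch in Python: the "processor" is a weighted sum of the inputs plus a bias, and the single output is 1 or 0 depending on the sign of that sum. The weights below are hand-picked illustrative values (chosen so the unit computes logical AND), not part of the definition above:

def perceptron(inputs, weights, bias):
    total = sum(w * x for w, x in zip(weights, inputs))   # processing step
    return 1 if total + bias >= 0 else 0                  # single output

weights, bias = [1.0, 1.0], -1.5
for a in (0, 1):
    for b in (0, 1):
        print((a, b), "->", perceptron([a, b], weights, bias))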
-
Tags: Classification Algorithms, Machine Learning, Artificial Neural Networks
(Stochastic gradient descent)
-
Stochastic gradient descent (often shortened to SGD), also known as incremental gradient descent, is a stochastic approximation of the gradient descent optimization method for minimizing an objective function that is written as a sum of differentiable functions. In other words, SGD tries to find minima or maxima by iteration.
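A minimal SGD sketch in Python, assuming a least-squares objective that is a sum of per-example differentiable terms; the synthetic data, learning rate, and iteration count are illustrative choices:

import numpy as np

# Objective: F(w) = sum_i (w . x_i - y_i)^2, minimised one random term at a time.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -3.0])
y = X @ true_w + 0.1 * rng.normal(size=100)

w = np.zeros(2)
lr = 0.05
for _ in range(1000):
    i = rng.integers(len(X))                 # pick one summand at random
    grad_i = 2 * (X[i] @ w - y[i]) * X[i]    # gradient of that single term
    w -= lr * grad_i                         # step against the gradient

print(w)   # should end up close to true_w

Because each step uses the gradient of only one term rather than the full sum, the updates are cheap but noisy, which is what makes the method a stochastic approximation of ordinary gradient descent.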
-
http://ufldl.stanford.edu/tutorial/supervised/OptimizationStochasticGradientDescent/
-
Tags: Algorithms, Computational Statistics