What is a Neural Network?

The fields of artificial intelligence (AI), machine learning, and deep learning use neural networks to recognize patterns and solve problems in a way loosely modeled on the human brain. Neural networks, also called artificial neural networks (ANNs) or simulated neural networks (SNNs), are inspired by biological neural networks but are distinct from them.

An ANN is formed from layers of nodes: an input layer, one or more hidden layers, and an output layer. Each node connects to nodes in the next layer and has an associated weight and threshold. A node is activated, and passes its data to the next layer, only when its output exceeds the specified threshold value.
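
As a rough illustration, the layered structure described above can be sketched in Python. The layer sizes, weight matrices, and bias values below are hypothetical placeholders, not taken from any particular network.

```python
import numpy as np

# Hypothetical architecture: 3 input nodes, one hidden layer of 4 nodes, 2 output nodes.
layer_sizes = [3, 4, 2]

# Each pair of adjacent layers gets a weight matrix; each non-input node gets a bias value.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

for i, (W, b) in enumerate(zip(weights, biases), start=1):
    print(f"layer {i}: weight matrix {W.shape}, {b.size} biases")
```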
 

What is Forward Propagation in Neural Networks?

Forward propagation is the process of feeding input data through the network, in a forward direction, to generate an output. Each hidden layer accepts the data, processes it according to its activation function, and passes the result to the next layer. Data flows strictly forward so that it never moves in a cycle, which would prevent the network from producing an output.

During forward propagation, pre-activation and activation take place at each hidden-layer and output-layer node of the neural network. The pre-activation step computes the weighted sum of the node's inputs plus a bias term. The activation function is then applied to that weighted sum, introducing the non-linearity that lets the network model complex relationships.
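
A minimal sketch of such a forward pass, assuming a sigmoid activation and the hypothetical weights and biases from the earlier sketch, might look like this; the function and variable names are illustrative, not part of any specific library.

```python
import numpy as np

def sigmoid(z):
    # Squashes the weighted sum into (0, 1), introducing non-linearity.
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    """Propagate input x through each layer: pre-activation, then activation."""
    a = x
    for W, b in zip(weights, biases):
        z = W @ a + b    # pre-activation: weighted sum of inputs plus bias
        a = sigmoid(z)   # activation: non-linear output passed to the next layer
    return a

# Example with the hypothetical 3-4-2 network sketched above:
# output = forward(np.array([0.5, -1.0, 2.0]), weights, biases)
```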
 

How Does Forward Propagation Relate to Backpropagation?

In order to be trained, a neural network relies on both forward and backward propagation. Backpropagation is used in machine learning and data mining to improve prediction accuracy by propagating derivatives of the error backward through the network. Backward propagation moves from right (output layer) to left (input layer), whereas forward propagation moves data from left (input layer) to right (output layer).

A neural network can be understood as a collection of connected input/output nodes. The accuracy of the network is expressed by a loss function, or error rate. Backpropagation calculates the slope (gradient) of that loss function with respect to the weights in the network, which indicates how each weight should be adjusted to reduce the error.
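
To make the relationship concrete, here is a hedged sketch of one training step for a single sigmoid layer with a squared-error loss. The gradient formulas follow the standard chain rule; all names and values are illustrative assumptions, not a reference implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, y_true, W, b, lr=0.1):
    """One forward pass followed by one backward pass for a single sigmoid layer."""
    # Forward propagation: left (input) to right (output).
    z = W @ x + b
    y_pred = sigmoid(z)
    loss = 0.5 * np.sum((y_pred - y_true) ** 2)  # squared-error loss

    # Backward propagation: right (output) to left (input), via the chain rule.
    dL_dy = y_pred - y_true          # slope of the loss w.r.t. the output
    dy_dz = y_pred * (1.0 - y_pred)  # derivative of the sigmoid
    delta = dL_dy * dy_dz            # error signal at the output layer
    dL_dW = np.outer(delta, x)       # slope of the loss w.r.t. each weight
    dL_db = delta                    # slope of the loss w.r.t. each bias

    # Adjust weights and biases against the gradient to reduce the error.
    W -= lr * dL_dW
    b -= lr * dL_db
    return loss

# Hypothetical usage: a layer with 3 inputs and 2 outputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((2, 3))
b = np.zeros(2)
loss = train_step(np.array([0.5, -1.0, 2.0]), np.array([1.0, 0.0]), W, b)
```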