
Backpropagation – a method for neural networks to learn

Great! We have come a long way, from looking at the biological neuron, to the types of neurons, to determining accuracy, and to correcting the learning of a single neuron. Only one question remains: how can a whole network of neurons learn together?

Backpropagation is an incredibly smart approach to making gradient descent happen throughout the network, across all of its layers. Backpropagation leverages the chain rule from calculus to make it possible to transfer information back and forth through the network.

In principle, the information from the input parameters and weights is propagated forward through the network to make a guess at the expected output; the overall inaccuracy (the cost) is then backpropagated through the layers of the network so that the weights can be adjusted and the output can be guessed again.
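To make this concrete, here is a minimal sketch of one such cycle in plain NumPy: a forward pass through a tiny one-hidden-layer network, followed by a backward pass that applies the chain rule layer by layer. The layer sizes, the sigmoid activation, the mean squared error cost, and the learning rate are all illustrative assumptions, not a specific architecture from this chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy batch: 10 samples, 3 input features, 1 target output (assumed sizes).
X = rng.normal(size=(10, 3))
y = rng.normal(size=(10, 1))

# Weights for the input->hidden and hidden->output layers.
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 1))
learning_rate = 0.1  # illustrative value

# Forward pass: propagate inputs through the network to guess the output.
h = sigmoid(X @ W1)                 # hidden-layer activations
y_hat = h @ W2                      # predicted output
cost = np.mean((y_hat - y) ** 2)    # overall inaccuracy (mean squared error)

# Backward pass: apply the chain rule from the cost back to each weight.
d_y_hat = 2 * (y_hat - y) / len(X)  # dCost/dy_hat
dW2 = h.T @ d_y_hat                 # gradient for the output-layer weights
d_h = d_y_hat @ W2.T                # error propagated back to the hidden layer
d_z = d_h * h * (1 - h)             # through the sigmoid derivative
dW1 = X.T @ d_z                     # gradient for the hidden-layer weights

# Gradient descent update: adjust the weights, then the output is guessed again.
W1 -= learning_rate * dW1
W2 -= learning_rate * dW2
```

Running this block repeatedly drives the cost down on the toy data; each pass through it is exactly one of the learning cycles described next.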

This single cycle of learning is called a training step or iteration. Each iteration is performed on a batch of the input training samples, and the number of samples in a batch is called the batch size. When all of the input samples have passed through a training step, one epoch is complete.

For example, let's say there are 100 training samples and in every iteration, or training step, 10 samples are used by the network to learn. Then we can say that the batch size is 10 and it will take 10 iterations to complete a single epoch. Provided each batch contains unique samples, so that every sample is used by the network exactly once, these 10 iterations make up a single epoch.
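The following short sketch spells out that arithmetic as a training loop, using the same numbers as the example above; the `train_step` call is a hypothetical stand-in for one forward and backward pass like the one sketched earlier:

```python
num_samples = 100
batch_size = 10
iterations_per_epoch = num_samples // batch_size  # 10 iterations per epoch

for epoch in range(3):                       # 3 epochs, purely for illustration
    for i in range(iterations_per_epoch):    # 10 training steps per epoch
        start = i * batch_size
        batch_indices = range(start, start + batch_size)  # non-overlapping batch
        # train_step(batch_indices)  # hypothetical: one iteration of learning
    # After 10 non-overlapping batches, every sample has been seen
    # exactly once: that is one epoch.
    print(f"epoch {epoch + 1}: {iterations_per_epoch} iterations completed")
```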

This back-and-forth propagation of the predicted output and the cost through the network is how the network learns.

We will revisit the training step, epoch, learning rate, cross entropy, batch size, and more during our hands-on sections.
