- Applied Deep Learning and Computer Vision for Self-Driving Cars
- Sumit Ranjan;Dr. S. Senthamilarasu
- 445 words
- 2021-04-09 23:13:03
The cost function of neural networks
We will now explore how we can evaluate the performance of a neural network by using the cost function. We will use it to measure how far we are from the expected value. We are going to use the following notation and variables:
Variable Y to represent the true value
Variable a to represent the neuron prediction
In terms of weights and biases, the formula is as follows:

z = wx + b, a = σ(z)

We pass z, which is the input (x) times the weight (w) plus the bias (b), into the activation function σ to get the prediction a.
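The single-neuron computation above can be sketched in a few lines. This is a minimal illustration assuming a sigmoid activation; the input, weight, and bias values are hypothetical:

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes z into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical values for a single neuron
x = np.array([0.5, -1.2])   # inputs
w = np.array([0.8, 0.3])    # weights
b = 0.1                     # bias

z = np.dot(w, x) + b        # z = w.x + b
a = sigmoid(z)              # neuron prediction a = sigma(z)
```

The prediction a is what we will compare against the true value y in the cost functions that follow.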
There are many types of cost functions, but we are just going to discuss two of them:
The quadratic cost function
The cross-entropy function
The first cost function we are going to discuss is the quadratic cost function, which is represented by the following formula:

C = (1/2n) Σ (y − a)²
In the preceding formula, the raw difference between the actual value (y) and the predicted value (a) can be negative (whenever a is greater than y), and we cannot use a negative value as a cost. So we square the difference, which guarantees the cost is a positive value. Unfortunately, when we use the quadratic cost function with a sigmoid-type activation, its gradient includes the derivative of the activation function; when the neuron saturates, that derivative is close to zero, which slows down learning.
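A minimal sketch of the quadratic cost, assuming the standard form C = Σ(y − a)²/(2n); the label and prediction values are hypothetical:

```python
import numpy as np

def quadratic_cost(y, a):
    # C = (1/2n) * sum((y - a)^2): squaring keeps every
    # per-sample error non-negative before averaging
    return np.sum((y - a) ** 2) / (2 * len(y))

y = np.array([1.0, 0.0, 1.0])   # true values
a = np.array([0.9, 0.2, 0.6])   # hypothetical neuron predictions

c = quadratic_cost(y, a)
```

Note that the third sample, where y − a is largest, dominates the cost because the differences are squared.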
Instead, we are going to use the cross-entropy cost function, which can be defined as follows:

C = −(1/n) Σ [y ln(a) + (1 − y) ln(1 − a)]
This cost function allows faster learning because the larger the difference between y and a, the larger the gradient, and so the faster the neuron learns. This means that if the network's predictions are far from the actual values at the beginning of training, the cross-entropy cost produces large updates, so the neurons learn fastest exactly when they are most wrong.
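The behavior described above can be checked numerically. This sketch assumes the binary cross-entropy form C = −(1/n) Σ [y ln(a) + (1 − y) ln(1 − a)] and hypothetical predictions:

```python
import numpy as np

def cross_entropy_cost(y, a):
    # C = -(1/n) * sum(y*ln(a) + (1-y)*ln(1-a))
    return -np.mean(y * np.log(a) + (1 - y) * np.log(1 - a))

y = np.array([1.0])                          # true value
near = cross_entropy_cost(y, np.array([0.9]))  # prediction close to y
far = cross_entropy_cost(y, np.array([0.1]))   # prediction far from y
```

A prediction far from the true value yields a much larger cost (and gradient) than a near-correct one, which is why the wrong-by-a-lot neurons receive the strongest corrections.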
We now have two of the key components of how neural networks learn from features: neurons with their activation functions, and cost functions. But we are still missing a key step: the actual learning process. We need to figure out how to use the neurons and their measurement of error (the cost function) to correct our predictions, that is, to make the network learn. Up until now, we have looked at neurons and perceptrons, linked them together into a neural net, and established that cost functions are essentially measurements of error. Next, we are going to correct the errors between the actual and predicted values using gradient descent and backpropagation.
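As a preview of that learning step, here is a minimal gradient-descent sketch on a single weight, assuming a quadratic cost C(w) = (y − wx)²/2 for one linear neuron with no activation or bias; all values are hypothetical:

```python
# One training example (hypothetical)
x, y = 2.0, 4.0
w = 0.0          # initial weight
lr = 0.1         # learning rate

for _ in range(50):
    a = w * x                # prediction
    grad = -(y - a) * x      # dC/dw for C = (y - w*x)^2 / 2
    w -= lr * grad           # gradient-descent update

# w converges toward y/x = 2.0, driving the cost toward zero
```

Each update moves w a small step in the direction that reduces the cost; backpropagation generalizes this idea to every weight and bias in a multi-layer network.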