- Deep Learning Quick Reference
- Mike Bernico
Bias and variance errors in deep learning
You may be familiar with the so-called bias/variance trade-off in typical predictive models. In case you're not, we'll provide a quick reminder here. With traditional predictive models, there is usually some compromise between the error that comes from bias and the error that comes from variance. So let's see what these two errors are:
- Bias error: Bias error is the error that is introduced by the model. For example, if you attempted to model a non-linear function with a linear model, your model would be under-specified and the bias error would be high.
- Variance error: Variance error is the error that is introduced by randomness in the training data. When we fit our training distribution so well that our model no longer generalizes, we have overfit, or introduced variance error. Both failure modes are illustrated in the short sketch that follows this list.
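To make the two failure modes concrete, here is a minimal scikit-learn sketch of the classic picture: a straight line underfitting a noisy sinusoid (high bias) and an over-flexible polynomial chasing the noise (high variance). The synthetic data, the degree-15 polynomial, and the 50/50 split are illustrative assumptions, not code from the book.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.2, size=40)  # noisy non-linear target
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

# High bias: a straight line is under-specified for a sinusoidal target.
high_bias = LinearRegression().fit(X_train, y_train)

# High variance: a degree-15 polynomial has enough freedom to chase the noise
# in the 20 training points, so it stops generalizing.
high_variance = make_pipeline(PolynomialFeatures(degree=15), LinearRegression())
high_variance.fit(X_train, y_train)

for name, model in [("linear (high bias)", high_bias),
                    ("degree-15 polynomial (high variance)", high_variance)]:
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: train MSE = {train_mse:.3f}, test MSE = {test_mse:.3f}")
```

The linear model shows a similar, high error on both splits (bias), while the polynomial shows a very low training error and a much worse test error (variance).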
In most machine learning applications, we seek to find some compromise that minimizes bias error, while introducing as little variance error as possible. I say most because one of the great things about deep neural networks is that, for the most part, bias and variance can be manipulated independently of one another. However, to do so, we will need to be very careful with how we structure our training data.
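As a rough illustration of treating bias and variance as separate knobs in a deep network, the following Keras sketch exposes network width as a capacity control (to drive down bias) and a dropout rate as a regularization control (to drive down variance). The layer sizes, dropout placement, and single-input regression setup are assumptions made for this sketch, not code from this book.

```python
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Sequential

def build_model(hidden_units=64, dropout_rate=0.0):
    """Width/depth mainly targets bias error; dropout mainly targets variance error."""
    model = Sequential([
        Input(shape=(1,)),                  # assumed single-feature regression input
        Dense(hidden_units, activation="relu"),
        Dropout(dropout_rate),              # regularization knob: variance
        Dense(hidden_units, activation="relu"),
        Dropout(dropout_rate),              # regularization knob: variance
        Dense(1),                           # single regression output
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Turn the capacity knob up to fight bias, then turn the dropout knob up
# only if the extra capacity introduces variance you can see on validation data.
model = build_model(hidden_units=128, dropout_rate=0.2)
model.summary()
```

In practice you would grow the network until the training error (a proxy for bias) is acceptable, and only then add dropout or other regularization if the validation error (a proxy for variance) lags behind, which is why carefully structured train/validation/test data matters so much.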