
Bias and variance errors in deep learning

You may be familiar with the so-called bias/variance trade-off in typical predictive models. In case you're not, here's a quick reminder. With traditional predictive models, there is usually a compromise to strike between error that comes from bias and error that comes from variance. So let's see what these two errors are:

  • Bias error: Bias error is the error introduced by the model itself. For example, if you attempted to model a non-linear function with a linear model, your model would be underspecified and the bias error would be high.
  • Variance error: Variance error is the error introduced by randomness in the training data. When we fit the training data so well that our model no longer generalizes, we have overfit and introduced variance error. Both errors are illustrated in the sketch after this list.

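To make the two error types concrete, here is a minimal sketch. It is not from this book; it assumes NumPy and scikit-learn are available, and the dataset, polynomial degrees, and noise level are made up for illustration. A degree-1 (linear) model underfits a noisy sine wave, showing bias error, while a degree-15 polynomial chases the noise in the training sample, showing variance error:

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(0)

# 30 noisy training points sampled from a sine wave (illustrative data)
X_train = np.sort(rng.uniform(0, 1, 30)).reshape(-1, 1)
y_train = np.sin(2 * np.pi * X_train).ravel() + rng.normal(0, 0.2, 30)

# A dense, noise-free test grid from the same underlying function
X_test = np.linspace(0, 1, 200).reshape(-1, 1)
y_test = np.sin(2 * np.pi * X_test).ravel()

for degree in (1, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")

The linear fit typically shows high error on both the training and test data (bias error), whereas the degree-15 fit typically shows near-zero training error but a much larger test error (variance error).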
In most machine learning applications, we seek a compromise that minimizes bias error while introducing as little variance error as possible. I say most because one of the great things about deep neural networks is that, for the most part, bias and variance can be manipulated independently of one another. To do so, however, we will need to be very careful about how we structure our training data.
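One common way to structure the data for this purpose, sketched below under the assumption of a generic NumPy dataset (the array names and split sizes are illustrative, not from this book), is to hold out separate validation and test sets. Training error then approximates bias error, and the gap between training and validation error approximates variance error:

import numpy as np
from sklearn.model_selection import train_test_split

# Placeholder data purely for illustration
features = np.random.rand(10000, 20)
labels = np.random.randint(0, 2, 10000)

# Hold out a test set first, then carve a validation set out of what remains
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.1, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(
    X_train, y_train, test_size=0.1, random_state=42)

# After training a network:
#   high training error          -> bias problem (add capacity, train longer)
#   large val - train error gap  -> variance problem (more data, regularize)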
