
Regularization

We now have a fair understanding of what overfitting means in machine learning modeling. To reiterate, overfitting occurs when the model learns the noise that has crept into the data, that is, when it tries to fit patterns that arise purely by random chance. When this happens, the model's ability to generalize is put in jeopardy: it performs poorly on unseen data, and its accuracy takes a nosedive.

Can we combat this kind of phenomenon? The answer is yes. Regularization comes to the rescue. Let's figure out what it can offer and how it works.

Regularization is a technique that prevents the model from becoming overly complex, thereby avoiding overfitting.

Let's take a look at the following linear regression equation:

Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \dots + \beta_p X_p + \epsilon

The loss function for this is the residual sum of squares (RSS):

RSS = \sum_{i=1}^{n} \left( y_i - \beta_0 - \sum_{j=1}^{p} \beta_j x_{ij} \right)^2
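To make the loss concrete, here is a minimal sketch that computes the RSS for a small made-up dataset; the feature matrix, targets, and coefficient values are arbitrary toy numbers chosen only for illustration:

```python
import numpy as np

# Toy data: three observations with two predictors each (made-up values).
X = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [3.0, 1.5]])
y = np.array([5.0, 4.0, 8.0])    # observed targets

beta_0 = 0.5                     # intercept (illustrative value)
beta = np.array([1.8, 0.9])      # slope coefficients (illustrative values)

y_hat = beta_0 + X @ beta        # model predictions
rss = np.sum((y - y_hat) ** 2)   # residual sum of squares, the loss above
print(rss)
```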
Minimizing this loss function is what adjusts the coefficients and retrieves the optimal estimates. When there is noise in the training data, however, the estimated coefficients adapt to that noise, fail to generalize well, and the model overfits. Regularization addresses this by shrinking these estimates, or coefficients, toward 0, as the sketch below illustrates.
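As a rough illustration of this shrinkage effect, the following sketch compares ordinary least squares with ridge (L2) regularization on synthetic data; the data-generating coefficients, noise level, and the penalty strength alpha=10.0 are all assumed values picked for the example, not recommendations:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# Synthetic regression data with a few informative and a few irrelevant predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
true_coef = np.array([3.0, -2.0, 0.0, 0.0, 1.5])
y = X @ true_coef + rng.normal(scale=2.0, size=50)   # noisy targets

ols = LinearRegression().fit(X, y)      # unregularized fit
ridge = Ridge(alpha=10.0).fit(X, y)     # L2-regularized fit

# The ridge coefficients are pulled toward zero relative to the OLS fit.
print("OLS coefficients:  ", np.round(ols.coef_, 2))
print("Ridge coefficients:", np.round(ridge.coef_, 2))
```

Increasing alpha strengthens the penalty and pushes the coefficients further toward 0, trading a little bias for lower variance on unseen data.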

We will now cover two types of regularization; other types will be covered in later chapters.
