- Deep Learning Quick Reference
- Mike Bernico
The cost function
We need our classifier to predict the probability of seizure, which is class 1. This means that our output will be constrained to [0, 1], as it would be in a traditional logistic regression model. Our cost function, in this case, will be binary cross-entropy, which is also known as log loss. If you've worked with classifiers before, this math is likely familiar; however, as a refresher, I'll include it here.
The complete formula for log loss looks like this:

$$C(y, \hat{y}) = -\big(y \log(\hat{y}) + (1 - y)\log(1 - \hat{y})\big)$$
This can probably be seen more simply as a set of two functions, one for the case $y = 1$ and one for $y = 0$:

$$C = -\log(\hat{y}) \quad \text{when } y = 1$$

$$C = -\log(1 - \hat{y}) \quad \text{when } y = 0$$
The log function is used here because it yields a monotonic function (one that is always increasing or decreasing) that we can easily differentiate. As with all cost functions, we will adjust our network parameters to minimize the cost of the network.
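The two-case form above can be sketched directly in plain Python. This is a minimal illustration, not the implementation a framework such as Keras uses internally; the function name and the `eps` clipping value are my own choices, added to avoid taking the log of zero:

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average log loss over a batch of labels (0 or 1) and predicted probabilities."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        # Clip predictions away from exactly 0 or 1 so log() stays finite.
        p = min(max(p, eps), 1 - eps)
        # -log(p) when y == 1, -log(1 - p) when y == 0.
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# A confident correct prediction costs little; an unsure one costs more.
low_loss = binary_cross_entropy([1, 0], [0.9, 0.1])
high_loss = binary_cross_entropy([1, 0], [0.5, 0.5])
```

Note how the cost grows without bound as a prediction approaches the wrong extreme, which is exactly the pressure that drives the network's parameters toward better probability estimates.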