- Deep Learning Quick Reference
- Mike Bernico
The cost function
We need our classifier to predict the probability of seizure, which is class 1. This means that our output will be constrained to [0, 1], as it would be in a traditional logistic regression model. Our cost function, in this case, will be binary cross-entropy, which is also known as log loss. If you've worked with classifiers before, this math is likely familiar to you; however, as a refresher, I'll include it here.
The complete formula for log loss looks like this:

$$\text{cost} = -\frac{1}{m}\sum_{i=1}^{m}\left[\, y_i \log(\hat{y}_i) + (1 - y_i)\log(1 - \hat{y}_i)\,\right]$$

This can probably be seen more simply as a set of two functions, one for the case $y_i = 0$ and one for $y_i = 1$:

$$\text{cost} = -\log(\hat{y}_i) \quad \text{when } y_i = 1$$

$$\text{cost} = -\log(1 - \hat{y}_i) \quad \text{when } y_i = 0$$
The log function is used here because it yields a monotonic function (one that is always increasing or decreasing) that we can easily differentiate. As with all cost functions, we will adjust our network parameters to minimize the cost of the network.
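To make the formula concrete, here is a minimal NumPy sketch of log loss. The function name `binary_cross_entropy` and the clipping epsilon are my own choices for illustration, not from the book; the clipping simply keeps `log()` finite when a prediction is exactly 0 or 1.

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average log loss over m samples.

    y_true: array of 0/1 labels; y_pred: predicted probabilities in [0, 1].
    Predictions are clipped away from 0 and 1 so log() stays finite.
    """
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

y_true = np.array([1, 0, 1, 0])
y_pred = np.array([0.9, 0.1, 0.4, 0.6])
print(binary_cross_entropy(y_true, y_pred))  # small cost for confident, correct predictions

# Per-sample view: only one of the two log terms is active for each label.
print(-np.log(0.9))      # cost when y = 1 and the model predicts 0.9
print(-np.log(1 - 0.1))  # cost when y = 0 and the model predicts 0.1
```

In practice you rarely implement this by hand: in Keras, for example, the same cost is selected by passing `loss='binary_crossentropy'` to `model.compile()`.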