- Reinforcement Learning with TensorFlow
- Sayon Dutta
- 2021-08-27 18:51:52
How to choose the right activation function
The choice of activation function depends on the objective of the problem and the properties required of the network. Some guidelines are as follows:
- Sigmoid functions work well for shallow networks and binary classifiers; in deeper networks they can lead to vanishing gradients.
- ReLU is the most widely used activation; try Leaky ReLU to avoid dead neurons. Start with ReLU, and move to another activation function only if ReLU doesn't give good results.
- Use softmax in the output layer for multi-class classification.
- Avoid using ReLU in the output layer.
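The activations behind these guidelines can be illustrated with a short, standalone NumPy sketch (the book itself uses TensorFlow; these hand-rolled versions are for illustration only, and the `alpha` slope for Leaky ReLU is a commonly used default, not a value from the text):

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1); suits binary classifiers,
    # but saturates and can vanish gradients in deep networks.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # max(0, x); the usual starting choice for hidden layers.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small negative slope keeps a gradient flowing for x < 0,
    # which avoids "dead" neurons that ReLU can produce.
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Normalizes logits into a probability distribution,
    # as used in the output layer for multi-class classification.
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

logits = np.array([-2.0, 0.0, 3.0])
print(relu(logits))        # negatives clamped to zero
print(leaky_relu(logits))  # negatives scaled by alpha instead
print(softmax(logits))     # entries sum to 1
```

Note how `leaky_relu` differs from `relu` only for negative inputs, which is exactly the property that prevents dead neurons.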