Deep Learning with Keras
Antonio Gulli, Sujit Pal
Activation functions
Sigmoid and ReLU are generally called activation functions in neural network jargon. In the Testing different optimizers in Keras section, we will see that the gradual changes typical of the sigmoid and ReLU functions are the basic building blocks for developing a learning algorithm that adapts little by little, progressively reducing the mistakes made by our nets. Given the input vector (x1, x2, ..., xm), the weight vector (w1, w2, ..., wm), the bias b, and the summation Σ, the neuron's output is obtained by applying the activation function σ to the weighted sum of its inputs:

output = σ(w1x1 + w2x2 + ... + wmxm + b)

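To make this concrete, here is a minimal NumPy sketch of that computation; the particular input, weight, and bias values are illustrative, not taken from the book:

```python
import numpy as np

def sigmoid(z):
    # Squashes z smoothly into (0, 1): sigma(z) = 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Rectified linear unit: passes positive values, zeroes out the rest
    return np.maximum(0.0, z)

# A single artificial neuron computing sigma(w . x + b)
x = np.array([0.5, -1.0, 2.0])   # input vector (x1, ..., xm) -- example values
w = np.array([0.1, 0.4, -0.3])   # weight vector (w1, ..., wm) -- example values
b = 0.2                          # bias

z = np.dot(w, x) + b             # the summation step
print("sigmoid:", sigmoid(z))    # gradual, bounded response
print("relu:   ", relu(z))       # piecewise-linear response
```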
Keras supports a number of activation functions, and a full list is available at https://keras.io/activations/.
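As a quick sketch of how these are used in practice, an activation can be passed by name to a layer's `activation` argument; the layer sizes and input shape below are arbitrary choices for illustration:

```python
from keras.models import Sequential
from keras.layers import Dense

# A toy network: ReLU in the hidden layer, sigmoid on the output.
model = Sequential()
model.add(Dense(64, input_shape=(10,), activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.summary()
```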