R Deep Learning Cookbook
Dr. PKS Prakash, Achyutuni Sri Krishna Rao
How to do it...
This section covers the types of activation functions used in multilayer perceptrons. Activation is one of the critical components of an ANN, as it defines the output of a node for a given input. There are many different activation functions used while building a neural network (a short sketch evaluating each of them follows this list):
- Sigmoid: The sigmoid activation function is a continuous function, also known as the logistic function, of the form 1/(1+exp(-x)). Its gradient approaches zero for large positive or negative inputs, so it tends to saturate and shrink the backpropagated error terms during training. In TensorFlow, the sigmoid activation function is defined as tf.nn.sigmoid.
- ReLU: The rectified linear unit (ReLU) is one of the most widely used activation functions in neural networks for capturing non-linearity; it is continuous but not smooth. The ReLU function is defined as max(0,x). In TensorFlow, the ReLU activation function is defined as tf.nn.relu.
- ReLU6: It caps the ReLU function at 6 and is defined as min(max(0,x), 6), so the activation never grows beyond 6 (and, as with ReLU, never falls below 0). The function is defined in TensorFlow as tf.nn.relu6.
- tanh: The hyperbolic tangent is another smooth function used as an activation in neural networks; it is bounded between -1 and 1 and is implemented as tf.nn.tanh.
- softplus: Softplus is a smooth approximation of ReLU, so its derivative exists everywhere; it is defined as log(exp(x)+1). In TensorFlow, softplus is defined as tf.nn.softplus.
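The sketch below is a minimal illustration, not code from the recipe itself: it evaluates each of the five functions on the same input vector, assuming the tensorflow R package with TensorFlow 2.x eager execution, where the tf.nn.* functions listed above are reached as tf$nn$sigmoid, tf$nn$relu, and so on.

```r
# Minimal sketch (assumes the 'tensorflow' R package and TF 2.x eager mode);
# it compares the five activation functions named above on the same inputs.
library(tensorflow)

# Inputs spanning negative, zero, and large positive values
x <- tf$constant(c(-8, -2, -0.5, 0, 0.5, 2, 8))

activations <- list(
  sigmoid  = tf$nn$sigmoid,   # 1 / (1 + exp(-x)), output in (0, 1)
  relu     = tf$nn$relu,      # max(0, x)
  relu6    = tf$nn$relu6,     # min(max(0, x), 6)
  tanh     = tf$nn$tanh,      # bounded between -1 and 1
  softplus = tf$nn$softplus   # log(exp(x) + 1), smooth approximation of ReLU
)

for (name in names(activations)) {
  y <- activations[[name]](x)                 # apply the activation elementwise
  cat(name, ":", round(as.array(y), 3), "\n") # convert the tensor back to R values
}
```

In a multilayer perceptron, these functions are applied elementwise to a layer's pre-activation; for example, a ReLU hidden layer can be computed as tf$nn$relu(tf$matmul(x, W) + b).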