- R Deep Learning Cookbook
- Dr. PKS Prakash, Achyutuni Sri Krishna Rao
How to do it...
This section covers the types of activation functions used in multilayer perceptrons. Activation is one of the critical components of an ANN, as it defines the output of a node for a given input. Many different activation functions are used while building a neural network (a minimal base-R sketch of those listed here follows the list):
- Sigmoid: The sigmoid activation function is a continuous function, also known as the logistic function, with the form 1/(1+exp(-x)). The sigmoid saturates for inputs of large magnitude, which tends to zero out the backpropagated gradient terms during training. In TensorFlow, the sigmoid activation function is defined using the tf.nn.sigmoid function.
- ReLU: Rectified linear unit (ReLU) is one of the most famous continuous, but not smooth, activation functions used in neural networks to capture non-linearity. The ReLU function is defined as max(0,x). In TensorFlow, the ReLU activation function is defined as tf.nn.relu.
- ReLU6: It caps the ReLU function at 6 and is defined as min(max(0,x), 6), so the output stays bounded and never grows very large. The function is defined in TensorFlow as tf.nn.relu6.
- tanh: The hyperbolic tangent is another smooth function used as an activation function in neural networks; it is bounded in [-1, 1] and implemented as tf.nn.tanh.
- softplus: It is a smooth approximation of ReLU, so its derivative exists everywhere, and is defined as log(exp(x)+1). In TensorFlow, softplus is defined as tf.nn.softplus.
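The following is a minimal base-R sketch of the definitions above, written directly from the formulas in the list rather than calling the TensorFlow ops; the helper names (sigmoid, relu, relu6, softplus) are illustrative and not part of any package.

```r
# Activation functions implemented directly from their mathematical definitions.
sigmoid  <- function(x) 1 / (1 + exp(-x))      # logistic: 1/(1+exp(-x))
relu     <- function(x) pmax(0, x)             # max(0, x)
relu6    <- function(x) pmin(pmax(0, x), 6)    # min(max(0, x), 6)
softplus <- function(x) log(exp(x) + 1)        # log(exp(x)+1), smooth version of ReLU
# tanh() is available in base R and is already bounded in [-1, 1].

# Evaluate each activation on a small grid of inputs to compare their shapes.
x <- seq(-8, 8, by = 2)
round(data.frame(x        = x,
                 sigmoid  = sigmoid(x),
                 relu     = relu(x),
                 relu6    = relu6(x),
                 tanh     = tanh(x),
                 softplus = softplus(x)), 3)
```

In the TensorFlow-backed recipes, the same operations come from the tf.nn module (for example, tf.nn.relu), which also supplies the gradients needed during backpropagation.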