- Deep Learning with PyTorch
- Vishnu Subramanian
PyTorch non-linear activations
PyTorch already has most of the common non-linear activation functions implemented for us, and each of them can be used like any other layer. Let's look at a quick example of how to use the ReLU function in PyTorch:
import torch
from torch.autograd import Variable
from torch.nn import ReLU

sample_data = Variable(torch.Tensor([[1, 2, -1, -1]]))  # two positive and two negative values
myRelu = ReLU()
myRelu(sample_data)
Output:
Variable containing:
1 2 0 0
[torch.FloatTensor of size 1x4]
In the preceding example, we take a tensor with two positive and two negative values and apply ReLU to it, which thresholds the negative numbers to 0 and retains the positive numbers as they are.
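Because ReLU is implemented as a layer, it can be dropped into a container such as nn.Sequential alongside weighted layers. The following is a minimal sketch; the layer sizes here are illustrative and not taken from the book:

import torch.nn as nn

# ReLU slots into the container just like the Linear layers around it.
net = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 1),
)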
Now that we have covered most of the details required for building a network architecture, let's build a deep learning architecture that can be used to solve real-world problems. In the previous chapter, we used a simple approach so that we could focus only on how a deep learning algorithm works. We will not be using that style anymore; rather, we will build the architecture the way it is meant to be built in PyTorch.
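To make that concrete, here is a minimal sketch of the idiomatic PyTorch style: subclass nn.Module, declare the layers in __init__, and compose them in forward. The class name MyFirstNetwork and the layer sizes are hypothetical placeholders, not the book's own architecture:

import torch.nn as nn

class MyFirstNetwork(nn.Module):
    def __init__(self, input_size=10, hidden_size=5, output_size=2):
        # hypothetical sizes, chosen only for illustration
        super(MyFirstNetwork, self).__init__()
        self.layer1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()  # the activation is itself a layer
        self.layer2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        out = self.layer1(x)
        out = self.relu(out)  # threshold negative values to zero
        return self.layer2(out)

Declaring the layers in __init__ registers their parameters with the module, so utilities such as model.parameters() and optimizers can find them automatically.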