- Deep Learning with PyTorch
- Vishnu Subramanian
PyTorch non-linear activations
PyTorch has most of the common non-linear activation functions implemented for us already, and they can be used like any other layer. Let's see a quick example of how to use the ReLU function in PyTorch:
import torch
from torch.autograd import Variable
from torch.nn import ReLU

sample_data = Variable(torch.Tensor([[1, 2, -1, -1]]))
myRelu = ReLU()
myRelu(sample_data)
Output:
Variable containing:
1 2 0 0
[torch.FloatTensor of size 1x4]
In the preceding example, we take a tensor with two positive values and two negative values and apply a ReLU on it, which thresholds the negative numbers to 0 and retains the positive numbers as they are.
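As a side note, the `Variable` wrapper in the preceding snippet is a legacy of older PyTorch releases; in current versions tensors track gradients directly, and the same thresholding can be expressed with the functional API. A minimal sketch:

```python
import torch
import torch.nn.functional as F

# Same two positive and two negative values as above
sample_data = torch.tensor([[1., 2., -1., -1.]])

# F.relu clamps negatives to 0 and keeps positives unchanged
out = F.relu(sample_data)
print(out)  # tensor([[1., 2., 0., 0.]])
```

Both `nn.ReLU()` (a layer object) and `F.relu` (a plain function) compute the same operation; the layer form is convenient when composing modules, the functional form when writing a `forward` method by hand.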
Now that we have covered most of the details required for building a network architecture, let's build a deep learning architecture that can be used to solve real-world problems. In the previous chapter, we used a simple approach so that we could focus only on how a deep learning algorithm works. We will not be using that style to build our architecture anymore; rather, we will build the architecture the way it is supposed to be built in PyTorch.
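To make the idea concrete before the full architecture, here is a minimal sketch of the idiomatic PyTorch style: layers are declared in `__init__` of an `nn.Module` subclass and wired together in `forward`. The layer sizes here are hypothetical, chosen only for illustration:

```python
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    """A tiny feed-forward network with one hidden layer (sizes are illustrative)."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)   # input features -> hidden units
        self.relu = nn.ReLU()        # the activation layer used above
        self.fc2 = nn.Linear(8, 2)   # hidden units -> output features

    def forward(self, x):
        # Compose the declared layers into the forward pass
        return self.fc2(self.relu(self.fc1(x)))

net = SimpleNet()
out = net(torch.randn(1, 4))
print(out.shape)  # torch.Size([1, 2])
```

Declaring layers as attributes lets PyTorch register their parameters automatically, so `net.parameters()` can later be handed straight to an optimizer.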