PyTorch non-linear activations

PyTorch has most of the common non-linear activation functions already implemented, and they can be used like any other layer. Let's see a quick example of how to use the ReLU function in PyTorch:

import torch
from torch.autograd import Variable
from torch.nn import ReLU

sample_data = Variable(torch.Tensor([[1,2,-1,-1]]))
myRelu = ReLU()
myRelu(sample_data)

Output:

Variable containing:
1 2 0 0
[torch.FloatTensor of size 1x4]

In the preceding example, we take a tensor containing two positive and two negative values and apply ReLU to it, which thresholds the negative numbers to 0 and retains the positive numbers as they are.
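The same operation is also available in functional form through torch.nn.functional, which is convenient when you do not need a layer object. As a quick sketch, assuming the same sample_data as above:

import torch.nn.functional as F

# F.relu applies the same thresholding without creating a layer object
F.relu(sample_data)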

Now that we have covered most of the details required for building a network architecture, let's build a deep learning architecture that can be used to solve real-world problems. In the previous chapter, we used a simple approach so that we could focus only on how a deep learning algorithm works. We will not use that style anymore; instead, we will build the architecture the way it is meant to be built in PyTorch.
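As a preview of that style, here is a minimal sketch of the idiomatic approach, defining a network by subclassing nn.Module; the class name MyFirstNetwork and the layer sizes are illustrative placeholders, not fixed by the book:

import torch.nn as nn

class MyFirstNetwork(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(MyFirstNetwork, self).__init__()
        # Layers are declared once in __init__
        self.layer1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.layer2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # forward defines how data flows through the declared layers
        out = self.relu(self.layer1(x))
        return self.layer2(out)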
