- Deep Learning with PyTorch
- Vishnu Subramanian
Creating learnable parameters
In our neural network example, we have two learnable parameters, w and b, and two fixed parameters, x and y. We created the variables x and y in our get_data function. Learnable parameters are created with random initialization and have the requires_grad parameter set to True, unlike x and y, where it is set to False. There are different practices for initializing learnable parameters, which we will explore in the coming chapters. Let's take a look at our get_weights function:
import torch
from torch.autograd import Variable  # Variable wraps a tensor for autograd (PyTorch < 0.4)

def get_weights():
    w = Variable(torch.randn(1), requires_grad=True)
    b = Variable(torch.randn(1), requires_grad=True)
    return w, b
Most of the preceding code is self-explanatory; torch.randn creates a tensor of the given shape, filled with values sampled from a standard normal distribution.
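As a side note, since PyTorch 0.4 the Variable wrapper has been merged into Tensor, so the same learnable parameters can be created directly on tensors. A minimal sketch of the equivalent modern form, assuming PyTorch is installed:

```python
import torch

def get_weights():
    # Learnable parameters: randomly initialized, tracked by autograd
    w = torch.randn(1, requires_grad=True)
    b = torch.randn(1, requires_grad=True)
    return w, b

w, b = get_weights()
# w and b participate in gradient computation; fixed inputs like x and y
# would be created without requires_grad (it defaults to False).
```

The behavior is the same as the Variable-based version in the book; only the wrapper is gone.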