- Deep Learning with PyTorch
- Vishnu Subramanian
Data preparation
PyTorch provides two kinds of data abstractions called tensors and variables. Tensors are similar to numpy arrays, and they can also be used on GPUs, which provides increased performance; they offer easy methods for switching between GPUs and CPUs. For certain operations we notice a boost in performance, and machine learning algorithms can understand different forms of data only when they are represented as tensors of numbers. Tensors are like Python arrays and can change in size. For example, an image can be represented as a three-dimensional array (height, width, channels (RGB)). It is common in deep learning to use tensors of up to five dimensions. Some of the commonly used tensor shapes, illustrated in the sketch after this list, are as follows:
- Scalar (0-D tensors)
- Vector (1-D tensors)
- Matrix (2-D tensors)
- 3-D tensors
- Slicing tensors
- 4-D tensors
- 5-D tensors
- Tensors on GPU
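The following is a minimal sketch, not taken from the book, that illustrates the tensor shapes listed above and the CPU/GPU switching mentioned earlier; the shapes (224x224 RGB images, a batch of 64, clips of 16 frames) are arbitrary values chosen for illustration.

```python
import torch
import numpy as np

scalar = torch.tensor(3.14)              # 0-D tensor (scalar)
vector = torch.tensor([1.0, 2.0, 3.0])   # 1-D tensor (vector)
matrix = torch.tensor(np.eye(3))         # 2-D tensor (matrix), built from a numpy array

# 3-D tensor: a single RGB image of height 224, width 224, and 3 channels
image = torch.rand(224, 224, 3)

# Slicing works as in numpy: the red channel of the top-left 100x100 patch
red_patch = image[:100, :100, 0]

# 4-D tensor: a batch of 64 such images; 5-D tensor: 8 video clips of 16 frames each
batch = torch.rand(64, 224, 224, 3)
video_batch = torch.rand(8, 16, 224, 224, 3)

# Move a tensor to the GPU if one is available, and back to the CPU
if torch.cuda.is_available():
    image_gpu = image.cuda()   # or image.to("cuda")
    image_cpu = image_gpu.cpu()

print(scalar.size(), vector.size(), matrix.size(), image.size(), red_patch.size())
```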