- Deep Learning with PyTorch
- Vishnu Subramanian
Data preparation
PyTorch provides two kinds of data abstractions: tensors and variables. Tensors are similar to NumPy arrays, but they can also be used on GPUs, which provides increased performance, and they offer easy methods for switching between GPUs and CPUs. For certain operations, we can notice a boost in performance. Machine learning algorithms can understand different forms of data only when they are represented as tensors of numbers. Tensors are like Python arrays and can change in size. For example, an image can be represented as a three-dimensional array (height, width, channels (RGB)). It is common in deep learning to use tensors of up to five dimensions. Some of the commonly used tensors are as follows:
- Scalar (0-D tensors)
- Vector (1-D tensors)
- Matrix (2-D tensors)
- 3-D tensors
- Slicing tensors
- 4-D tensors
- 5-D tensors
- Tensors on GPU
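The tensor kinds listed above can be sketched in a few lines of PyTorch. This is a minimal illustration, assuming `torch` is installed; the shapes chosen (e.g. a 224×224 RGB image) are examples, not requirements:

```python
import torch

# Scalar (0-D tensor): a single number
scalar = torch.tensor(3.14)

# Vector (1-D tensor): a sequence of numbers
vector = torch.tensor([1.0, 2.0, 3.0])

# Matrix (2-D tensor): rows and columns
matrix = torch.ones(2, 3)

# 3-D tensor: e.g. one RGB image as (height, width, channels)
image = torch.rand(224, 224, 3)

# Slicing works as with NumPy arrays: first row of the matrix
first_row = matrix[0]

# Move a tensor to the GPU when one is available, and back to the CPU
if torch.cuda.is_available():
    image = image.cuda()
    image = image.cpu()

print(scalar.dim(), vector.dim(), matrix.dim(), image.dim())
```

A 4-D tensor would typically hold a batch of images (batch, height, width, channels), and a 5-D tensor a batch of videos (batch, frames, height, width, channels).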