PyTorch Packages

PyTorch is an optimized tensor library for deep learning on CPUs and GPUs. It ships with a rich set of packages that cover the building blocks of deep learning, such as optimization, data conversion, and loss calculation. Let's take a brief look at each of these packages.

1. torch — The torch package defines data structures for multi-dimensional tensors and the mathematical operations over them.
2. torch.Tensor — A multi-dimensional matrix containing elements of a single data type.
3. Tensor attributes (illustrated in the first sketch after this list):
   a) torch.dtype — An object that represents the data type of a torch.Tensor.
   b) torch.device — An object that represents the device on which a torch.Tensor is or will be allocated.
   c) torch.layout — An object that represents the memory layout of a torch.Tensor.
4. Type info — The numerical properties of a torch.dtype can be accessed through either torch.iinfo or torch.finfo.
   a) torch.finfo — An object that represents the numerical properties of a floating-point torch.dtype.
   b) torch.iinfo — An object that represents the numerical properties of an integer torch.dtype.
5. torch.sparse — Torch supports sparse tensors in COO (coordinate) format, which efficiently stores and processes tensors in which the majority of elements are zero (see the sparse example after this list).
6. torch.cuda — Torch supports CUDA tensor types, which implement the same functions as CPU tensors but use GPUs for computation.
7. torch.Storage — A torch.Storage is a contiguous, one-dimensional array of a single data type.
8. torch.nn — Provides the classes and modules needed to implement and train neural networks.
9. torch.nn.functional — Provides functional (stateless) counterparts to the classes in torch.nn.
10. torch.optim — Implements various optimization algorithms.
11. torch.autograd — Provides classes and functions implementing automatic differentiation of arbitrary scalar-valued functions (packages 8-11 appear together in the training-loop sketch after this list).
12. torch.distributed — Supports distributed training through three backends (Gloo, NCCL, and MPI), each with different capabilities.
13. torch.distributions — Allows us to construct stochastic computation graphs and stochastic gradient estimators for optimization.
14. torch.hub — A pre-trained model repository designed to facilitate research reproducibility (see the torch.hub example after this list).
15. torch.multiprocessing — A wrapper around the native multiprocessing module.
16. torch.utils.bottleneck — A tool that can be used as an initial step for debugging bottlenecks in a program.
17. torch.utils.checkpoint — Implements checkpointing, which trades compute for memory by recomputing intermediate activations during the backward pass instead of storing them.
18. torch.utils.cpp_extension — Used to build extensions written in C++, CUDA, and other languages.
19. torch.utils.data — Provides the Dataset and DataLoader classes used to build input pipelines (see the data-loading example after this list).
20. torch.utils.dlpack — Used to convert tensors to and from the DLPack format so they can be shared with other frameworks.
21. torch.onnx — The ONNX exporter is a trace-based exporter: it operates by executing the model once and exporting the operators that were actually run during that run (see the export example after this list).
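To make entries 1-6 concrete, here is a minimal sketch that creates a tensor, inspects its dtype, device, and layout attributes, optionally moves it to a CUDA device, and queries numerical limits through torch.finfo and torch.iinfo. The tensor values are arbitrary and chosen only for illustration.

import torch

# Create a tensor with an explicit data type; every element shares this dtype.
x = torch.tensor([[1.0, 2.0], [3.0, 4.0]], dtype=torch.float32)

print(x.dtype)   # torch.float32
print(x.device)  # cpu
print(x.layout)  # torch.strided (the default dense layout)

# torch.cuda: move the tensor to the GPU when one is available.
if torch.cuda.is_available():
    x_gpu = x.to(torch.device("cuda"))
    print(x_gpu.device)  # cuda:0

# torch.finfo / torch.iinfo expose the numerical limits of a dtype.
print(torch.finfo(torch.float32).eps)  # smallest step representable around 1.0
print(torch.iinfo(torch.int32).max)    # largest value an int32 can hold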
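The sparse COO format from entry 5 stores only the coordinates and values of the non-zero elements. Below is a small sketch; the indices and values are made up purely for illustration.

import torch

# Only the coordinates (indices) and values of the non-zero entries are stored.
indices = torch.tensor([[0, 1, 2],    # row indices of the non-zero values
                        [2, 0, 1]])   # column indices of the non-zero values
values = torch.tensor([3.0, 4.0, 5.0])
sparse = torch.sparse_coo_tensor(indices, values, size=(3, 3))

print(sparse)
print(sparse.to_dense())  # materialize the full 3x3 matrix for comparison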
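Entries 8-11 (torch.nn, torch.nn.functional, torch.optim, and torch.autograd) usually work together in a training loop. The sketch below uses a tiny made-up network and random data purely for illustration; it is not a recipe for any particular task.

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

# A tiny fully connected network built from torch.nn modules.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        # torch.nn.functional provides the stateless counterparts (relu, etc.).
        return self.fc2(F.relu(self.fc1(x)))

model = TinyNet()
optimizer = optim.SGD(model.parameters(), lr=0.1)  # torch.optim
criterion = nn.CrossEntropyLoss()

inputs = torch.randn(16, 4)            # toy batch: 16 samples, 4 features
targets = torch.randint(0, 2, (16,))   # toy labels for 2 classes

for _ in range(5):
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()    # torch.autograd computes the gradients here
    optimizer.step()   # the optimizer updates the parameters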
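For entry 14, the snippet below follows the usage shown in the official torch.hub documentation: it downloads a pre-trained ResNet-18 from the pytorch/vision repository, so it needs network access and an installed torchvision.

import torch

# Download a pre-trained ResNet-18 from the pytorch/vision hub repository.
model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet18', pretrained=True)
model.eval()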
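Entry 19 revolves around the Dataset and DataLoader classes from torch.utils.data. The sketch below defines a hypothetical in-memory dataset (the class name and random data are invented for illustration) and iterates over it in batches.

import torch
from torch.utils.data import Dataset, DataLoader

# A hypothetical in-memory dataset backed by random tensors.
class RandomDataset(Dataset):
    def __init__(self, n_samples=100, n_features=4):
        self.data = torch.randn(n_samples, n_features)
        self.labels = torch.randint(0, 2, (n_samples,))

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return self.data[idx], self.labels[idx]

loader = DataLoader(RandomDataset(), batch_size=16, shuffle=True)

for batch_data, batch_labels in loader:
    print(batch_data.shape, batch_labels.shape)  # torch.Size([16, 4]) torch.Size([16])
    break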
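Finally, for entry 21, a trace-based export only needs the model and a dummy input to run it on. The tiny model and output file name below are arbitrary choices.

import torch
import torch.nn as nn

# An arbitrary small model; torch.onnx traces it by running the dummy input through it.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

dummy_input = torch.randn(1, 4)
torch.onnx.export(model, dummy_input, "tiny_model.onnx")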

