PyTorch is a powerful library primarily known for building neural networks. However, PyTorch also provides an efficient, NumPy-like interface for handling multi-dimensional arrays (tensors) and performing numerical computations. In this lab, we will explore that interface.

Note: This tutorial does not cover gradients or backpropagation. We will use PyTorch purely as a numerical library.

WARNING: Do not use GPT or other LLMs for this lab; otherwise, you will learn nothing. At this initial stage, you need to become familiar with PyTorch syntax and style and learn how to debug on your own. Only then will you be able to use GPT and other LLMs effectively.

Setup

Environment Check: Ensure you have a recent version of PyTorch installed (e.g., 2.0+).

!pip install torch==2.1.1

If you are using a virtual environment or a Colab notebook, make sure it is activated. PyTorch 2.x has been the de facto standard for the past two years: academic work typically has loose version requirements, while industrial projects often pin stricter ones.

import torch

# Check PyTorch version
print("PyTorch version:", torch.__version__)

PyTorch Tensors vs. NumPy Arrays: conceptually they are the same thing, a multi-dimensional array with a shape and a dtype. PyTorch tensors additionally support GPU execution and automatic differentiation, neither of which we will use in this lab.
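
To make the relationship concrete, here is a small sketch of converting between the two. Note that torch.from_numpy shares memory with the source array, so in-place changes to one are visible in the other (the variable names here are just illustrative):

import numpy as np

np_arr = np.array([1.0, 2.0, 3.0])
t = torch.from_numpy(np_arr)   # tensor that shares memory with np_arr
back = t.numpy()               # convert back to a NumPy array

np_arr[0] = 10.0
print(t)                       # the change is visible in the tensor as well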

Creating Tensors

Let’s explore multiple ways to create tensors in PyTorch.

From Python Lists

# 1D tensor
t1 = torch.tensor([1, 2, 3])
print("t1:", t1)
print("t1 shape:", t1.shape)

# 2D tensor
t2 = torch.tensor([[1, 2], [3, 4]])
print("\\nt2:\\n", t2)
print("t2 shape:", t2.shape)

Using Built-in Methods

zeros_tensor = torch.zeros((2, 3))
ones_tensor = torch.ones((2, 3))
randn_tensor = torch.randn((2, 3))
range_tensor = torch.arange(0, 10, 2)

print("zeros_tensor:\\n", zeros_tensor)
print("ones_tensor:\\n", ones_tensor)
print("randn_tensor:\\n", randn_tensor)
print("range_tensor:", range_tensor)

Exercise:

  1. Convert a 3D list into a PyTorch tensor. Confirm its shape.