Computing gradients with PyTorch

Task: compute the gradient of a simple function using PyTorch.
import torch
from torch import tensor
import matplotlib.pyplot as plt
%matplotlib inline
We now define a function of two variables:
def square(x):
    return x * x

def double(x):
    return 2 * x

def f(x1, x2):
    return double(x1) + square(x2) + 5.0
We evaluate it at a few values:
f(0.0, 0.0)   # 5.0
f(0.1, 0.0)   # 5.2
f(0.0, 0.1)   # ≈ 5.01
Compute the gradient of f with respect to x1, when x1 = 1.0 and x2 = 1.0.
Steps:
x1 = torch.tensor(1.0, requires_grad=True)
x2 = ...
result = f(x1, x2)
Call backward on the result:
result.backward()
The gradient is now stored in x1.grad.
x1.grad
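For reference, the steps above can be assembled into one self-contained, runnable cell (filling in x2 = 1.0, as the task specifies — one possible way to do it, not the only one):

```python
import torch

def square(x):
    return x * x

def double(x):
    return 2 * x

def f(x1, x2):
    return double(x1) + square(x2) + 5.0

# requires_grad=True marks x1 as a leaf tensor whose gradient we want
x1 = torch.tensor(1.0, requires_grad=True)
x2 = torch.tensor(1.0)  # no gradient needed for x2 in this task

result = f(x1, x2)
result.backward()  # autograd fills in x1.grad

print(x1.grad)  # tensor(2.), since f depends on x1 only through 2 * x1
```

Note that x1.grad is itself a tensor, and it holds the partial derivative of the result with respect to x1 evaluated at the current value of x1.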
Compute the gradient of f with respect to x2, when x1 = 1.0 and x2 = 1.0.
x1 = torch.tensor(1.0, requires_grad=True)
x2 = torch.tensor(1.0, requires_grad=True)
# your code here
Repeat both tasks above for several other values of x1 and x2. Also look at the definition of f and recall what you learned about derivatives in Calculus. Based on that:
Compute the gradients by hand, using only basic arithmetic operations like + or *; don't use any autograd functionality (like .backward()).
x1_grad = ...
Make sure that you understand why this is different from the value of x1.grad.
x2_grad = ...
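Try filling in the blanks yourself first. As a sketch of where this leads: since f(x1, x2) = 2*x1 + x2**2 + 5, calculus gives ∂f/∂x1 = 2 and ∂f/∂x2 = 2*x2, so the hand-computed gradients need no autograd at all and can be cross-checked against .backward() (the values 3.0 and 4.0 below are arbitrary example inputs):

```python
import torch

def square(x):
    return x * x

def double(x):
    return 2 * x

def f(x1, x2):
    return double(x1) + square(x2) + 5.0

x1 = torch.tensor(3.0, requires_grad=True)
x2 = torch.tensor(4.0, requires_grad=True)

# Hand-computed partial derivatives (no autograd):
# f(x1, x2) = 2*x1 + x2**2 + 5, so df/dx1 = 2 and df/dx2 = 2*x2.
# These are plain Python floats, unlike x1.grad / x2.grad, which are tensors.
x1_grad = 2.0
x2_grad = 2 * x2.item()

# Cross-check against autograd
f(x1, x2).backward()
print(x1_grad, x1.grad.item())  # both 2.0
print(x2_grad, x2.grad.item())  # both 8.0
```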