Task: get more practice using the softmax function, and connect it to the sigmoid function.
import torch
from torch import tensor
import matplotlib.pyplot as plt
%matplotlib inline
def softmax(x):
    return torch.softmax(x, dim=0)
Try this example:
x1 = tensor([0.1, 0.2, 0.3])
x2 = tensor([0.1, 0.2, 100])
softmax(x1)
Write code that assigns p = softmax(x1) and then evaluates p.sum(). Before you run it, predict what the output will be.
# your code here
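A minimal reference sketch (one possible answer; p is just the name suggested by the prompt):
p = softmax(x1)
p.sum()  # softmax normalizes its input, so this should print tensor(1.) (up to floating-point rounding)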
Write code that assigns p2 = softmax(x2) and displays the result. Before you run it, predict what it will output.
# your code here
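For comparison, a sketch of what to expect: exp(100) is astronomically larger than exp(0.1) or exp(0.2), so the third entry should absorb essentially all of the probability mass.
p2 = softmax(x2)
p2  # approximately tensor([0., 0., 1.]) -- the large logit dominates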
Evaluate torch.sigmoid(tensor(0.1)). Then write an expression that uses softmax to get the same output. Hint: give softmax a two-element tensor([num1, num2]), where one of the numbers is 0.
# your code here
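One way to see the connection: sigmoid(z) = e^z / (e^z + e^0), which is exactly the first component of softmax over [z, 0]. A sketch (values approximate):
torch.sigmoid(tensor(0.1))     # tensor(0.5250)
softmax(tensor([0.1, 0.]))[0]  # also tensor(0.5250): sigmoid(z) == softmax([z, 0])[0]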
Is softmax(x) a valid probability distribution? Why or why not?
your answer here
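If a numerical check helps before answering, here is a quick sketch using x1 from above:
p = softmax(x1)
(p >= 0).all(), p.sum()  # every entry is nonnegative and the entries sum to 1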
x is called the "logits", and x.softmax(dim=0).log() (or, equivalently, x.log_softmax(dim=0)) is called the "logprobs", short for "log probabilities". Complete the following expressions for x1 (from the example above).
logits = ...
logprobs = ...
probabilities = ...
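A possible completion, assuming x1 serves as the logits (log_softmax is softmax followed by log, computed in one numerically stable step):
logits = x1
logprobs = logits.log_softmax(dim=0)
probabilities = logprobs.exp()
torch.allclose(probabilities, softmax(x1))  # True: exponentiating the logprobs recovers the probabilities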
Comparing softmax(x1) and softmax(x2), why might "softmax" be an appropriate name for this function?
your answer here
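For reference, the two outputs side by side (values approximate):
softmax(x1)  # tensor([0.3006, 0.3322, 0.3672]) -- close logits give a "soft" ranking
softmax(x2)  # essentially tensor([0., 0., 1.]) -- one dominant logit gives a near-one-hot "max"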