You start by using Michael Hamilton’s HMM module to implement a simple HMM.
Do the following exercises based on AIMA’s rainy-umbrella HMM example given in Figure 15.2.
Download the following sample code: lab1.py. Be sure to read through the comments carefully.
Run the code and verify that it produces the values computed in the text for the Forward algorithm. See the comments in the code itself for detailed explanations.
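For reference, here is a minimal from-scratch sketch of the filtering computation on the umbrella model. It does not use Hamilton’s module (whose API may differ); the parameters are the textbook’s, and the printed values should match the text’s ≈0.818 and ≈0.883.

```python
# Minimal forward (filtering) pass for the AIMA umbrella HMM.
# Hidden state: Rain_t in {True, False}; evidence: umbrella sightings.

PRIOR = {True: 0.5, False: 0.5}                   # P(Rain_0)
P_RAIN_GIVEN_RAIN = {True: 0.7, False: 0.3}       # P(Rain_t = true | Rain_{t-1})
P_UMBRELLA_GIVEN_RAIN = {True: 0.9, False: 0.2}   # P(Umbrella_t = true | Rain_t)

def normalize(dist):
    total = sum(dist.values())
    return {state: p / total for state, p in dist.items()}

def forward(belief, umbrella):
    """One filtering step: predict via the transition model, then
    weight by the evidence likelihood and renormalize."""
    predicted = {
        rain: sum(belief[prev] * (P_RAIN_GIVEN_RAIN[prev] if rain
                                  else 1 - P_RAIN_GIVEN_RAIN[prev])
                  for prev in belief)
        for rain in (True, False)
    }
    weighted = {
        rain: (P_UMBRELLA_GIVEN_RAIN[rain] if umbrella
               else 1 - P_UMBRELLA_GIVEN_RAIN[rain]) * p
        for rain, p in predicted.items()
    }
    return normalize(weighted)

belief = PRIOR
for day, umbrella in enumerate([True, True], start=1):
    belief = forward(belief, umbrella)
    print(f"P(Rain_{day} = true | e_1:{day}) = {belief[True]:.3f}")
# Expected output: 0.818 on day 1, then 0.883 on day 2.
```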
Consider the following additional questions:
You now implement a similar example.¹
Use Hamilton’s HMM module to build a Bayesian network matching the diagram shown on the right, and use its implementation of the Forward algorithm to compute the following probabilities:
Assume that P(rain_0) = 0.5. Be sure that you can explain the values you produce (for steps b and c) by deriving them by hand.

[Diagram: HMM for this exercise]
We expect you to be able to carry out the Forward algorithm by hand.
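As a reminder of what the hand derivation looks like, here is the first filtering step for the textbook umbrella model (your exercise’s numbers will differ according to its diagram):

```latex
% Forward recursion (AIMA):
%   P(X_{t+1} \mid e_{1:t+1}) \propto P(e_{t+1} \mid X_{t+1})
%       \sum_{x_t} P(X_{t+1} \mid x_t)\, P(x_t \mid e_{1:t})
\begin{align*}
P(R_1 \mid u_1)
  &= \alpha\, P(u_1 \mid R_1) \sum_{r_0} P(R_1 \mid r_0)\, P(r_0) \\
  &= \alpha\, \langle 0.9,\, 0.2 \rangle
      \langle 0.7(0.5) + 0.3(0.5),\; 0.3(0.5) + 0.7(0.5) \rangle \\
  &= \alpha\, \langle 0.45,\, 0.10 \rangle
   = \langle 0.818,\, 0.182 \rangle .
\end{align*}
```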
The HMM given in the previous exercise is a Bayesian network. Can you compute the specified probabilities using the Enumeration-Ask algorithm? If so, demonstrate it programmatically and explain which algorithm is the better choice. If not, explain why not.
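To see how such a demonstration might be structured, here is a brute-force enumeration sketch over the unrolled two-slice umbrella network from lab1.py. It assumes the textbook parameters; the function names are ours, and it uses neither Hamilton’s module nor the book’s Enumeration-Ask code.

```python
# P(Rain_1 | umbrella_1) by enumerating the joint of the
# unrolled network Rain_0 -> Rain_1 -> Umbrella_1.

def p_rain0(r0):           # prior P(Rain_0)
    return 0.5

def p_rain1(r1, r0):       # transition P(Rain_1 | Rain_0)
    p_true = 0.7 if r0 else 0.3
    return p_true if r1 else 1 - p_true

def p_umbrella(u, r1):     # sensor P(Umbrella_1 | Rain_1)
    p_true = 0.9 if r1 else 0.2
    return p_true if u else 1 - p_true

def enumerate_hidden(r1, u1):
    """Sum the joint over the hidden variable Rain_0."""
    return sum(p_rain0(r0) * p_rain1(r1, r0) * p_umbrella(u1, r1)
               for r0 in (True, False))

unnormalized = {r1: enumerate_hidden(r1, True) for r1 in (True, False)}
total = sum(unnormalized.values())
print(unnormalized[True] / total)  # ~0.818, matching the Forward algorithm
```

Note that this enumerates the full joint, so it is only a sanity check on the two-slice network.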
Be sure that you understand the relationship between Bayesian networks and HMMs before moving on.
Filtering (a.k.a. state estimation) is not the only kind of inference that can profitably be performed on HMMs.
Download the following sample code: lab4.py. Run it and verify that it produces the values computed in the text for the HMM inference algorithms. See the comments in the code itself for detailed explanations.
Consider the following questions:
- Smoothing
- Most likely explanation
We don’t expect you to be able to run these latter algorithms by hand.
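If you want to sanity-check the module’s output, here is a compact from-scratch sketch of both algorithms on the textbook umbrella model (parameters as in the text; none of this uses Hamilton’s API). Smoothing the first of two umbrella days should give roughly 0.883, and the most likely explanation for [u, u, ¬u, u, u] should be rain on every day except the third.

```python
# Smoothing (forward-backward) and most likely explanation (Viterbi)
# for the AIMA umbrella model, using the textbook parameters.

PRIOR = {True: 0.5, False: 0.5}                # P(Rain_0)
T = {(True, True): 0.7, (True, False): 0.3,
     (False, True): 0.3, (False, False): 0.7}  # P(Rain_t | Rain_{t-1})
S = {True: 0.9, False: 0.2}                    # P(umbrella | Rain_t)
STATES = (True, False)

def sensor(rain, umbrella):
    return S[rain] if umbrella else 1 - S[rain]

def normalize(d):
    z = sum(d.values())
    return {k: v / z for k, v in d.items()}

def forward_messages(evidence):
    """Filtered distributions P(Rain_t | e_1:t) for t = 1..n."""
    f, messages = PRIOR, []
    for u in evidence:
        f = normalize({r: sensor(r, u) * sum(f[p] * T[(p, r)] for p in STATES)
                       for r in STATES})
        messages.append(f)
    return messages

def smooth(evidence):
    """Smoothed distributions P(Rain_t | e_1:n) via forward-backward."""
    fs = forward_messages(evidence)
    b = {True: 1.0, False: 1.0}                # backward message, initially 1s
    smoothed = [None] * len(evidence)
    for t in range(len(evidence) - 1, -1, -1):
        smoothed[t] = normalize({r: fs[t][r] * b[r] for r in STATES})
        b = {r: sum(sensor(n, evidence[t]) * T[(r, n)] * b[n] for n in STATES)
             for r in STATES}
    return smoothed

def viterbi(evidence):
    """Most likely state sequence (max-product with backpointers)."""
    predicted = {r: sum(PRIOR[p] * T[(p, r)] for p in STATES) for r in STATES}
    m = {r: sensor(r, evidence[0]) * predicted[r] for r in STATES}
    backpointers = []
    for u in evidence[1:]:
        ptr = {r: max(STATES, key=lambda p: m[p] * T[(p, r)]) for r in STATES}
        m = {r: sensor(r, u) * m[ptr[r]] * T[(ptr[r], r)] for r in STATES}
        backpointers.append(ptr)
    state = max(m, key=m.get)
    path = [state]
    for ptr in reversed(backpointers):
        state = ptr[state]
        path.append(state)
    return list(reversed(path))

print(smooth([True, True])[0][True])             # ~0.883
print(viterbi([True, True, False, True, True]))  # [True, True, False, True, True]
```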
Submit your source code as specified above in Moodle under lab 7. Submit on paper if that’s easier for you.
¹ This example is adapted from Thrun’s happy-grumpy example.