Over Christmas, Allen Newell and I created a thinking machine. — Herbert Simon, informal statement to a graduate class (Problem Solving Research), 1956.

In this lab exercise, you’ll work with the AIMA Python implementations of local search algorithms.

Objective Functions

In local search, we’d like to find the maximum value of a given objective function. Consider the two functions shown here. There’s no particular magic in either of them; each simply generates positive objective values over a single variable x (so the landscape is only two-dimensional) and exhibits interesting global and local maxima:

$$f(x) = \frac{\text{maximum}}{2} - \left| \frac{\text{maximum}}{2} - x \right|$$
$$f(x) = \left| x \sin(x) \right|$$

These plots were created with Matplotlib by graphs.py.
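graphs.py is provided with the lab materials; if you want to reproduce the plots yourself, a minimal stand-in sketch along these lines should work (the domain [0, maximum] and the value maximum = 30 are assumptions, not taken from the lab code):

```python
import numpy as np
import matplotlib.pyplot as plt

maximum = 30.0                      # assumed upper bound on x
x = np.linspace(0, maximum, 500)

# Absolute-value variant: a single global maximum at x = maximum/2.
abs_variant = maximum / 2 - np.abs(maximum / 2 - x)
# Sine variant: several local maxima, growing toward the edge of the domain.
sine_variant = np.abs(x * np.sin(x))

plt.plot(x, abs_variant, label='absolute value variant')
plt.plot(x, sine_variant, label='sine variant')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.legend()
plt.show()
```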

Local Search

Local search algorithms work with complete-state formulations: each candidate is a full solution (here, a single value of x), and the search moves between neighboring candidates rather than building a solution incrementally.
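For concreteness, here is one way the absolute-value problem might be formulated as a complete-state problem using the AIMA Problem class. The class name AbsVariant, the domain bounds, and the fixed delta step are illustrative assumptions, not the lab's actual abs.py:

```python
from search import Problem

class AbsVariant(Problem):
    """f(x) = maximum/2 - |maximum/2 - x|, searched over x in [0, maximum]."""

    def __init__(self, initial, maximum=30.0, delta=1.0):
        super().__init__(initial)
        self.maximum = maximum
        self.delta = delta

    def actions(self, state):
        # Each state has at most two neighbors: one delta-step in each direction.
        return [x for x in (state - self.delta, state + self.delta)
                if 0 <= x <= self.maximum]

    def result(self, state, action):
        return action  # an action is simply the new value of x

    def value(self, state):
        return self.maximum / 2 - abs(self.maximum / 2 - state)
```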

Exercise 2.1

Run abs.py, a module built around the absolute-value variant above, and answer the following questions:

  1. Which of the local search algorithms solves the problem? How well does each algorithm do?
  2. Which algorithm works more quickly?
  3. Does the starting value for x make any difference? Why or why not?
  4. What effect does changing the delta step value have on each algorithm? Why?
  5. What is the purpose of the exp_schedule() method in the simulated annealing function call?
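For reference on question 5: in the AIMA code, exp_schedule() builds the annealing temperature schedule (an exponential decay that drops to 0 after a step limit), which simulated annealing uses to decide how freely it accepts downhill moves. A typical invocation of both algorithms, using the hypothetical AbsVariant sketch above, might look like:

```python
from search import hill_climbing, simulated_annealing, exp_schedule

problem = AbsVariant(initial=15.0, maximum=30.0, delta=1.0)

x_hc = hill_climbing(problem)
x_sa = simulated_annealing(problem, exp_schedule(k=20, lam=0.005, limit=100))

print('hill climbing:       x = {:.2f}, value = {:.2f}'.format(x_hc, problem.value(x_hc)))
print('simulated annealing: x = {:.2f}, value = {:.2f}'.format(x_sa, problem.value(x_sa)))
```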

Save the answers to these questions in a text file, lab.txt, so that you can submit them later.

That objective function wasn’t very interesting. Let’s try a slightly more interesting one, still in two dimensions.

Exercise 2.2

Create a new module (sine.py), similar to the absolute-value variant from Exercise 2.1, that uses the sine function variant specified above as its objective function; a starting-point sketch follows the questions below. Get your new implementation running and then answer the following (similar) questions:

  1. How does each of the algorithms do on this problem space? Why?
  2. Does the starting value make any difference here?
  3. Does modifying the step size (i.e., delta) affect the operation of the two algorithms? Why or why not?
  4. What are the maximum and minimum possible values here, and how do the two algorithms score with respect to them?
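As promised above, here is a minimal starting-point sketch for sine.py, reusing the structure of the hypothetical AbsVariant; the domain [0, 30] and the initial value are assumptions:

```python
# sine.py -- starting-point sketch
import math
from search import Problem, hill_climbing, simulated_annealing, exp_schedule

class SineVariant(Problem):
    """f(x) = |x * sin(x)|, searched over x in [0, maximum]."""

    def __init__(self, initial, maximum=30.0, delta=1.0):
        super().__init__(initial)
        self.maximum = maximum
        self.delta = delta

    def actions(self, state):
        return [x for x in (state - self.delta, state + self.delta)
                if 0 <= x <= self.maximum]

    def result(self, state, action):
        return action

    def value(self, state):
        return abs(state * math.sin(state))

if __name__ == '__main__':
    problem = SineVariant(initial=15.0)
    print('hill climbing:      ', hill_climbing(problem))
    print('simulated annealing:', simulated_annealing(problem, exp_schedule()))
```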

Add the answers to these questions to lab.txt.

Random Restarts and Local Beam Search

Because local search algorithms can get stuck at local maxima (local optima in general), it is useful to run them more than once or to run several searches in parallel.

Exercise 2.3

Add code to your sine implementation (sine.py) that implements random restarts.
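One way to add restarts is a small driver that reruns a search from random starting points and tracks the results. This is a sketch built on the hypothetical SineVariant above; the run count and domain bounds are chosen arbitrarily:

```python
import random
from search import hill_climbing, simulated_annealing, exp_schedule

def run_with_restarts(search, runs=100, lo=0.0, hi=30.0):
    """Run `search` from `runs` uniformly random starting points.

    Returns the best and average objective values seen across all runs.
    """
    values = []
    for _ in range(runs):
        problem = SineVariant(initial=random.uniform(lo, hi))
        values.append(problem.value(search(problem)))
    return max(values), sum(values) / len(values)

print('hill climbing:      ', run_with_restarts(hill_climbing))
print('simulated annealing:',
      run_with_restarts(lambda p: simulated_annealing(p, exp_schedule())))
```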

  1. How does each algorithm do with these restarts? Why?
  2. What are the average values of the runs for each of the algorithms?
  3. If one of the algorithms does better, explain why; if not, explain why not.

Add the answers to these questions to lab.txt.

One key advantage of local search is its reduced memory requirement, which can be leveraged to maintain more than one state in parallel.

Exercise 2.4

Consider applying local beam search, in which the successor states are chosen using one of the two algorithms, to the abs-sine variant, f(x) = |x sin(x)|, from Exercise 2.2.

  1. For which algorithm does beam search make the most sense?
  2. How many solutions could you maintain with reasonable space and time constraints?
  3. How would you modify the code to implement beam search? How is it different from random restarts, if at all?

Add the answers to these questions to lab.txt. You are not required to implement this.

In local beam search, successor states are generally chosen using simple hill-climbing.
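The AIMA module doesn’t ship a local beam search, but a hand-rolled sketch over the same state space might look like the following; the beam width, step count, and domain bounds are all assumptions. Note how the selection step mirrors hill climbing, except that it keeps k states rather than one:

```python
import heapq
import random

def local_beam_search(problem, k=10, steps=100, lo=0.0, hi=30.0):
    """Keep the k best states; repeatedly pool all successors and keep the k best."""
    beam = [random.uniform(lo, hi) for _ in range(k)]
    for _ in range(steps):
        pool = set(beam)
        for state in beam:
            for action in problem.actions(state):
                pool.add(problem.result(state, action))
        # Greedy, hill-climbing-style selection across the pooled successors.
        beam = heapq.nlargest(k, pool, key=problem.value)
    return max(beam, key=problem.value)

best = local_beam_search(SineVariant(initial=0.0))
print('beam search best: x = {:.2f}'.format(best))
```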

Checking in

Submit your code under the directory lab02. We will grade your work according to the following criteria:

See the policies page for lab due-dates and times.