Operations Research II Heuristic Methods
ISE 421
KFUPM
November 21, 2016
Syed N Mujahid
Outline of the Course
1. Introduction to Operations Research
2. Integer Programming
3. Heuristic Methods
4. Traveling Salesperson Problem
5. Dynamic Programming
6. Nonlinear Programming
7. Additional Topics (if time allows)
Chapter Outline
1. Heuristic & Meta-Heuristic
2. Greedy and Random Heuristic (Single Variables, Multiple Variables)
3. Simulated Annealing (ILP Applications)
4. Constraint Programming
5. Conclusion
The focus of this chapter is to study heuristic methods in optimization. By the end of this chapter, we should be able to
- identify the differences among exact, greedy heuristic, and metaheuristic methods;
- design and implement greedy heuristics for optimization problems;
- design and implement simulated annealing for optimization problems.
1. Heuristic & Meta-Heuristic
2. Greedy and Random Heuristic (Single Variables, Multiple Variables)
3. Simulated Annealing (ILP Applications)
4. Constraint Programming
5. Conclusion
Why Heuristics?
- Name some of the methods to solve an Integer Program (IP)?
  Branch & Bound, Cutting Plane.
- What is the main idea involved in solving IPs?
  Relax integrality (which adds infinitely many fractional points), then iteratively solve the resulting LPs while efficiently removing the fractional points.
- In the worst-case scenario, how many LPs do you end up solving in a Cutting Plane (CP) or Branch & Bound (BnB) method?
  There is no upper bound, but the number of LPs will be exponential.
- In the worst-case scenario, how many LPs do you end up solving in a BnB method for binary variables?
  Almost 2^n LPs; the complexity depends upon the LP formulation and the availability of good bounds.
- Getting tight bounds requires the availability of good Integer Feasible Solutions (IFS). How easy is it to find an IFS using the CP or BnB methods?
  Generally, the difficulty of finding a feasible solution is the same as the difficulty of finding an optimal solution.
- So, how do we get a good IFS very fast?
  Use heuristic methods instead of exact methods.
- Where do we use a good IFS?
  Either to get bounds for exact methods, or as a replacement for the optimal solution (pragmatic).
Exact vs Heuristic
Exact Methods:
- They are mostly iterative.
- These methods are invaluable for obtaining optimal solutions to small-scale practical problems.
- You will get an optimal solution.
- The solution time can be huge.
- Standard commercial methods are available to solve the problem.
Heuristic Methods:
- They are mostly iterative.
- These methods are indispensable for obtaining good solutions to a majority of large-scale practical problems.
- You will get a good working solution. No guarantee can be given about the quality of the solution obtained, but a well-designed heuristic method usually provides a solution that is at least nearly optimal (or concludes that no such solutions exist).
- The solution time will be much smaller compared to exact methods.
- Most of the time, a custom solution method is developed.
So, which method should we pick for solving an OR problem? Use both!
Heuristic vs MetaHeuristic
Heuristic Methods:
- Heuristic methods are often based on relatively simple common-sense ideas for how to search for a good solution.
- These ideas need to be carefully designed to fit the specific problem of interest.
- Each method is usually tailored to fit a specific problem type rather than a variety of applications (ad hoc).
- Typically, they are greedy methods, and may guarantee local optimal solutions.
MetaHeuristic Methods:
- They are a special type of heuristic method.
- They are general solution methods that provide structure and strategy for solving any optimization problem.
- They are not ad hoc, i.e., every method has a structure, and the problem formulation needs to be tailored to it.
- They are not greedy methods, and some of them can guarantee an optimal solution (theoretically).
1. Heuristic & Meta-Heuristic
2. Greedy and Random Heuristic (Single Variables, Multiple Variables)
3. Simulated Annealing (ILP Applications)
4. Constraint Programming
5. Conclusion
Single Variables: Heuristic Idea
- Assume the objective is to maximize.
- The idea is to explore the feasible region to find a good feasible solution.
- We start from an initial feasible solution.
- From the initial solution, pick a direction to move.
- Decide the step size and/or define a neighborhood.
- Decide on terminating criteria.
[Figure: feasible region and objective function of a one-variable maximization example. The search starts at x = 2, asks "Which direction to move?" and "What should be the step size?", defines a neighborhood, and moves through x = 3 and x = 4 before stopping at x = 5.]
Single Variables: Heuristic Elements
Current Solution: A solution from which the algorithm will search for a better solution. It typically carries the iteration number, e.g., x^(k), where k is the iteration number.
Neighborhood: A group of solutions adjacent to a given solution, denoted by N(x^(k)).
  Immediate Neighborhood: a group of solutions adjacent and very near to a given solution.
  Extended Neighborhood: a group of solutions adjacent to a given solution that may also lie farther away; the extended neighborhood includes the immediate neighborhood.
Move: A process of selecting one solution from a neighborhood, called the candidate solution.
  Greedy Move: the candidate solution is obtained after searching the entire neighborhood.
  Random Walk: the candidate solution is the first solution from the neighborhood that is better than the current solution; the neighborhood is not searched completely.
Update or Terminate: Replace the current solution with the candidate solution if there is an improvement; that is, the new current solution becomes the candidate solution. If there is no improvement, terminate the algorithm.
Single Variable: Heuristic Algorithm
One typical iteration of a general heuristic algorithm is shown below:
- Build the neighborhood of the current solution.
- Execute a move and identify a candidate solution.
- Update the candidate solution as the current solution for the next iteration.
Specific Heuristic
Precise definitions of neighborhood and move give rise to a specific heuristic. (A generic sketch follows below.)
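As a minimal sketch (not from the slides), the iteration above can be written as a generic local-search loop; `neighborhood` and `move` are hypothetical placeholders that a specific heuristic would define:

def local_search(x0, F, neighborhood, move, max_iter=1000):
    """Generic single-variable heuristic: repeatedly build the
    neighborhood, move, and update until no improvement
    (minimization assumed)."""
    x = x0
    for k in range(max_iter):
        N = neighborhood(x)        # build neighborhood of current solution
        candidate = move(N, F, x)  # greedy or random-walk move
        if candidate is None or F(candidate) >= F(x):
            break                  # no improvement: terminate
        x = candidate              # update current solution
    return x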
Single Variable Heuristics: Discrete Greedy
(Variants covered in this section: Discrete Greedy, Discrete Random Walk, Continuous Greedy, Continuous Random Walk, Mixed & Hybrids.)
Following are the definitions of neighborhood and move:
Neighborhood: Since the approach is greedy, the immediate neighborhood (or unit neighborhood) is considered. If x^(k) is the current solution, then N(x^(k)) = [x^(k) − 1, x^(k) + 1].
Move: The move is greedy, so the entire neighborhood is searched to get the candidate solution (the best solution from the entire neighborhood).
Q. Example. Consider the following optimization problem:
minimize: x^5 − 10x^4 + 35x^3 − 50x^2 + 24x
subject to: 0 ≤ x ≤ 8.
Assume the starting solution is x^(0) = 7.
J Solution. Starting from the given current solution, each iteration builds the immediate neighborhood (immediate since greedy), evaluates the current solution and all its neighbors (all since greedy), and moves to the best neighbor:

k  x^(k)  N(x^(k))  F(x^(k))  F(x^(k) − 1)  F(x^(k) + 1)
0  7      [6, 8]    2520      720           6720
1  6      [5, 7]    720       120           2520
2  5      [4, 6]    120       0             720
3  4      [3, 5]    0         0             120

The algorithm terminates here since there is no further improvement. The best solution from the above algorithm is x = 4.
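As a hedged Python sketch (an assumption of this write-up, not part of the slides), the discrete greedy search above can be reproduced in a few lines:

def F(x):
    # Objective from the example: x^5 - 10x^4 + 35x^3 - 50x^2 + 24x
    return x**5 - 10*x**4 + 35*x**3 - 50*x**2 + 24*x

def discrete_greedy(x, lo=0, hi=8):
    """Greedy move over the unit neighborhood {x-1, x+1}, minimizing F."""
    while True:
        neighbors = [n for n in (x - 1, x + 1) if lo <= n <= hi]
        best = min(neighbors, key=F)
        if F(best) >= F(x):      # no improvement: local optimum reached
            return x
        x = best                 # update current solution

print(discrete_greedy(7))        # reproduces the table: 7 -> 6 -> 5 -> 4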
Single Variable Heuristics: Discrete Random Walk
Following are the definitions of neighborhood and move:
Neighborhood: Since the approach is a random walk, the extended neighborhood is considered. If x^(k) is the current solution, with x^(k) ∈ {0, 1, ..., N}, then N(x^(k)) = [0, ..., x^(k) − 1, x^(k) + 1, ..., N], i.e., every feasible value except the current one.
Move: The move is a random walk, so random solutions from the neighborhood are sampled one at a time. If the sampled neighbor has a better objective function value, it is taken as the candidate solution; otherwise, another random solution from the neighborhood is sampled, and the loop continues (the first better solution terminates the neighbor-search loop).
Q. Example. Consider the following optimization problem:
minimize: x^5 − 10x^4 + 35x^3 − 50x^2 + 24x
subject to: 0 ≤ x ≤ 8.
Assume the starting solution is x^(0) = 5.
J Solution. At each iteration a random neighbor x'^(k) is sampled from the extended neighborhood (extended since random walk) and evaluated one solution at a time; the current solution is updated only when the neighbor improves the objective:

k  x^(k)  N(x^(k))                  F(x^(k))  x'^(k)  F(x'^(k))
0  5      [0, 1, 2, 3, 4, 6, 7, 8]  120       8       6720
1  5      [0, 1, 2, 3, 4, 6, 7, 8]  120       3       0
2  3      [0, 1, 2, 4, 5, 6, 7, 8]  0         7       2520
3  3      [0, 1, 2, 4, 5, 6, 7, 8]  0         8       6720
4  3      [0, 1, 2, 4, 5, 6, 7, 8]  0         7       2520

The algorithm terminates after observing no further improvement over three successive iterations; the terminating criterion is based on user preference. The best solution from the above algorithm is x = 3.
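A hedged Python sketch of this discrete random walk, reusing F from the previous sketch; the three-non-improving-samples stopping rule mirrors the table but is itself a user choice:

import random

def discrete_random_walk(x, F, lo=0, hi=8, patience=3):
    """Random walk over the extended neighborhood {lo..hi} minus {x};
    stop after `patience` consecutive non-improving samples."""
    stall = 0
    while stall < patience:
        neighbor = random.choice([n for n in range(lo, hi + 1) if n != x])
        if F(neighbor) < F(x):   # first improving neighbor is accepted
            x, stall = neighbor, 0
        else:
            stall += 1
    return x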
Single Variable Heuristics: Continuous Greedy
Following are the definitions of neighborhood and move:
Neighborhood: Since the approach is greedy, the immediate neighborhood (or unit neighborhood) is considered. If x^(k) is the current solution, then N(x^(k)) = {x : ||x^(k) − x|| ≤ ε}, where ε > 0 is a design parameter.
Move: The move is greedy, so the entire neighborhood is searched to get the candidate solution (the best solution from the entire neighborhood).
This type of algorithm is NOT common, since searching an entire neighborhood of infinitely many solutions is typically impractical. However, if such a search method is available, such an algorithm can be designed.
Single Variable Heuristics: Continuous Random Walk
Following are the definitions of neighborhood and move:
Neighborhood: Since the approach is a random walk, the extended neighborhood is considered. If x^(k) is the current solution, then N(x^(k)) = {x : x = x^(k) + (R − 0.5)(U − L)}, where R is a uniform random number on (0, 1), and U and L are the upper and lower bounds on the variable. Similarly, another neighborhood based on the normal distribution can be defined: N(x^(k)) = {x : x = x^(k) + ((U − L)/6) N(0, 1)}, where N(0, 1) is a normal random variable with mean 0 and unit variance. Depending on the choice of neighborhood, the move can be expressed as
x^(k+1) = x^(k) + ((U − L)/6) N(0, 1)   or   x^(k+1) = x^(k) + (R − 0.5)(U − L).
Move: The move is a random walk, so random solutions from the neighborhood are sampled one at a time. If the sampled neighbor has a better objective function value, it is taken as the candidate solution; otherwise, another random solution from the neighborhood is sampled, and the loop continues (the first better solution terminates the neighbor-search loop).
This type of algorithm is VERY common for continuous-variable optimization.
Q. Example. Consider the following optimization problem:
minimize: x^5 − 10x^4 + 35x^3 − 50x^2 + 24x
subject to: 0 ≤ x ≤ 8.
Assume the starting solution is x^(0) = 4.
J Solution. At each iteration a random neighbor x'^(k) is sampled from the extended neighborhood and evaluated; the current solution is updated only when the neighbor improves the objective:

k  x^(k)      F(x^(k))  x'^(k)     F(x'^(k))
0  4          0         1.916384   -0.33155
1  1.916384   -0.33155  1.160478   -0.81665
2  1.160478   -0.81665  1          0
3  1.160478   -0.81665  3.666106   -3.6219
4  3.666106   -3.6219   6.424358   1279.983
5  3.666106   -3.6219   3.807432   -3.00396
6  3.666106   -3.6219   2.799156   0.970674

The algorithm terminates after observing no further improvement over three successive iterations; the terminating criterion is based on user preference. The best solution from the above algorithm is x ≈ 3.666.
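A hedged sketch of the continuous random walk using the uniform move x^(k+1) = x^(k) + (R − 0.5)(U − L); clipping to the bounds is an assumption of this sketch (the slides do not specify how bound violations are handled), and F is reused from the earlier sketch:

import random

def continuous_random_walk(x, F, L=0.0, U=8.0, patience=3):
    """Continuous random walk: sample x + (R - 0.5)(U - L), accept
    only improving points; stop after `patience` rejected samples."""
    stall = 0
    while stall < patience:
        candidate = x + (random.random() - 0.5) * (U - L)
        candidate = min(max(candidate, L), U)   # keep within bounds
        if F(candidate) < F(x):
            x, stall = candidate, 0
        else:
            stall += 1
    return x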
Single Variable Heuristics: Mixed & Hybrids
- Based on experience and the problem's complexity, both random walk and greedy ideas can be mixed.
- The neighborhood definition is always one of the key elements of a heuristic. The simple rule to remember is:

Neighborhood Size  Solution Quality  Search Speed
Small              Poor              Fast
Big                Good              Slow

- Furthermore, a continuous variable may initially be searched on coarse discrete intervals; then a fine search for the continuous value can be executed in smaller intervals.
Multiple Variables
- Extension 1: The concepts from single-variable heuristic methods can easily be extended to multiple variables. The idea is to consider one variable at a time: a set of iterations is repeated for one variable (stop as soon as you see an improvement, or stop after a fixed number of iterations), then the next variable is considered.
- Extension 2: The above extension may not be practical in real scenarios, so multiple variables are considered simultaneously. In addition, different strategies are selected based on the variable types:
  - Discrete variables: immediate/extended neighborhood with a greedy-type or random-walk-type move.
  - Continuous variables: we will see these when discussing the NLP chapter.
Discrete Multiple Variables
Solution Structure
The new and important element in designing a heuristic for multiple variables is the solution structure. Typical structures include a vector or a matrix. The definition of the neighborhood is based on the solution structure.
Following are the definitions of neighborhood and move:
Neighborhood: Since the approach is greedy, the immediate neighborhood (or unit neighborhood) is considered. If x^(k) ∈ Z^n is the current solution, then N(x^(k)) = {x^(k) ± I_i, ∀i = 1, ..., n}, where I_i is a column of the identity matrix I ∈ R^(n×n).
For example: if x = [2, 4, 3]^T, then every column of the following matrix is a neighbor:

N(x^(0)) =
[ 1  3  2  2  2  2 ]
[ 4  4  3  5  4  4 ]
[ 3  3  3  3  2  4 ]

Move: The move is greedy, so the entire neighborhood is searched to get the candidate solution (the best solution from the entire neighborhood).
Discrete Multiple Variables: Greedy Heuristic
- Start from an initial solution.
- Set the current solution to the initial solution.
- Begin iterations:
  - Find the local best in the entire immediate neighborhood of the current solution.
  - Set the candidate solution to this local best solution.
  - If the candidate solution is the same as the current solution, then terminate the algorithm.
  - Else, update current solution = local best solution and continue to the next iteration.
Discrete Multiple Variables: Example
Q. Example: Consider the following optimization problem:
minimize: x_1^2 + x_2^2 + x_3^2 − 6x_1 − 14x_2 − 2x_3 + 59
subject to: x_i ∈ Z_+ ∀i = 1, 2, 3.
Assume you want to build a heuristic that starts from the solution [2, 6, 0]^T.
Solution. Iteration 1: k = 0.
Current solution = initial solution = x^(0) = [2, 6, 0]^T. Objective function value of the current solution: F(x^(0)) = 3.
Immediate neighborhood of the current solution (the neighbor with x_3 − 1 is infeasible since x_i ∈ Z_+):

N(x^(0)) =
[ 1  3  2  2  2 ]
[ 6  6  5  7  6 ]
[ 0  0  0  0  1 ]

The candidate solution, i.e., the local best from the neighborhood, is L^(0) = [3, 6, 0]^T with F(L^(0)) = 2. Since the candidate solution is better than the current solution, we update: k = 1, x^(1) = [3, 6, 0]^T.
The entire set of iterations can be summarized as:

k  x^(k)        F(x^(k))  L^(k)        F(L^(k))
0  [2, 6, 0]^T  3         [3, 6, 0]^T  2
1  [3, 6, 0]^T  2         [3, 7, 0]^T  1
2  [3, 7, 0]^T  1         [3, 7, 1]^T  0
3  [3, 7, 1]^T  0         [3, 7, 1]^T  0

Thus the local optimal solution is [3, 7, 1]^T. (A Python sketch of this greedy search follows below.)
Discrete Multiple Variables: Notations
. S: the underlying structure used to depict a solution.
. s: a solution, represented in vector or matrix form.
. I: a method to generate a random initial solution based on the structure.
. G: a real-valued objective function defined over solutions s.
. g: the objective function value, based on G, for a given solution s.
. G: a set of objective function values for a set of solutions.
. N: a neighborhood search method, which gives one or more neighbor solutions.
. N: a set of neighbor solutions.
. (k): a superscript denoting the iteration number.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 17 / 35
Discrete Multiple Variables: Greedy Approach
The flow chart, flattened into steps (edge labels from the chart in brackets):
1. START: s(0) = I(S). Set k = 0, s∗ = s(0). [initialize]
2. Get g(k) = G(s(k)). [evaluate]
3. Get N(k) = N(s(k)). [get neighbors]
4. Get G(k) = G(N(k)). [evaluate]
5. Pick s(k+1) = best(N(k)). [greedy]
6. Is g(k+1) < g(k)? [check: for minimization; for maximization use >]
   YES: update s∗ = s(k+1), k = k + 1, and iterate from step 3.
   NO: STOP; s∗ is local optimal.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 18 / 35
Discrete Multiple Variables: Greedy Approach
+ In the greedy approach’s flow chart, all red boxes (the steps that invoke I, G and N) need problem-specific information.
+ In the greedy approach’s flow chart, all green boxes (the control logic) do NOT need any problem-specific information.
+ Thus, a greedy algorithm can be developed for any problem, if we can extract the following four things from the problem description (see the sketch after this slide):
S: A solution structure in the form of a vector, matrix, etc.
I: A mechanism to generate an initial solution.
G: An objective function to evaluate a solution.
N: A mechanism to search for neighborhood solutions. In the greedy approach, we look for more than one neighbor solution (possibly all solutions from the neighborhood).
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 19 / 35
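This separation can be made literal in code. A minimal sketch of the generic control loop, assuming minimization; the names (greedy_search, I, G, N) mirror the notation above but the code itself is our illustration, not part of the slides:

```python
def greedy_search(I, G, N):
    """Generic greedy loop. Only I (initial solution), G (objective) and
    N (neighborhood) carry problem-specific information."""
    s = I()                                # initialize
    g = G(s)                               # evaluate
    while True:
        neighbors = N(s)                   # get neighbors
        s_next = min(neighbors, key=G)     # greedy pick (use max for maximization)
        g_next = G(s_next)
        if not (g_next < g):               # no strict improvement
            return s, g                    # s is local optimal
        s, g = s_next, g_next              # iterate

# Reusing F and immediate_neighbors from the earlier example sketch:
# greedy_search(lambda: (2, 6, 0), F, immediate_neighbors)   # ((3, 7, 1), 0)
```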
Discrete Multiple Variables: Improving Greedy
+ Initial solution: Does the choice of the initial (starting) solution matter? Yes; different initial solutions may lead to different local optimal solutions.
+ Greedy approach: Is there any issue with the greedy approach? Yes, the hill-climbing phenomenon: the search cannot escape a local optimum.
+ Revisits to earlier solutions: Is revisiting solutions an issue? Yes; it wastes computational time and storage.
+ Local optimal: Should we be happy with the local optimal? Yes, it is a better solution; no, it may not be the best.
Some Research Directions:
Initial solution: maybe start with many random solutions, as in genetic algorithms.
Greedy approach: maybe accept solutions that are not the local best, as in simulated annealing, genetic algorithms, and other meta-heuristics.
Revisits to earlier solutions: maybe keep a list of solutions visited earlier and avoid revisiting them, treating such revisits as taboo, as in tabu search (see the sketch after this slide).
Local optimal: working on the above three ideas may lead us to the global optimal.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 20 / 35
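The "avoid revisits" idea can be sketched with a visited set. A minimal illustration, assuming solutions are hashable (e.g., tuples) and minimization; all names and the random-walk framing are our own:

```python
import random

def random_walk_no_revisit(I, G, N, max_iters=1000):
    """Random walk that records visited solutions and never revisits them
    (the 'taboo' idea above). The visited set costs storage as it grows."""
    s = I()
    s_best, g_best = s, G(s)
    visited = {s}
    for _ in range(max_iters):
        candidates = [n for n in N(s) if n not in visited]
        if not candidates:                 # all neighbors already seen
            break
        s = random.choice(candidates)
        visited.add(s)
        if G(s) < g_best:
            s_best, g_best = s, G(s)
    return s_best, g_best
```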
1 Heuristic & Meta-Heuristic
2 Greedy and Random Heuristic Single Variables Multiple Variables
3 Simulated Annealing ILP Applications
4 Constraint Programming
5 Conclusion
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 21 / 35
Simulated Annealing: Introduction
Annealing: to slowly cool a hot metal (or glass) from a liquid state into a solid state. The aim is to remove internal stresses and toughen the material.
Simulated Annealing: the SA method models the physical process of annealing, and the aim is to minimize the energy (typically derived from the objective function) of the system.
[Animation: a piece of metal cooling from 2500°F through 1500°F and 500°F down to room temperature (74°F), alongside a Temp-vs-Time cooling curve annotated with the steps below.]
+ Start from a high simulated temperature (τ0).
+ At τ0, execute k iterations. In each iteration, look for a neighbor: a) if the neighbor is better (w.r.t. the objective function), take it as the current solution; b) if the neighbor is NOT better, then ACCEPT it with a probability based on τ0.
+ After k iterations, reduce the temperature to τ1.
+ Repeat the iterations until a termination criterion is met.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 22 / 35
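The probability of accepting a worse neighbor shrinks as the temperature drops. A small numerical sketch (the ∆g value and the temperatures are arbitrary illustrations, not from the slides):

```python
import math

def accept_prob(delta_g, T):
    """Probability of accepting a WORSE neighbor (delta_g > 0) at temperature T."""
    return math.exp(-delta_g / T)

# The same worsening move is almost always accepted when hot,
# and almost never accepted when cold.
for T in (1000.0, 100.0, 10.0, 1.0):
    print(f"T = {T:6.1f}   P(accept | dg = 5) = {accept_prob(5.0, T):.4f}")
```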
Simulated Annealing: An Implementation
Motivation:
In the following slides we see a simple structure of the simulated annealing algorithm. The aim is to show pictorially the crux of the annealing approach. Alternate versions of the approach can be implemented under different assumptions.
Assumptions:
The objective of our simulated annealing approach is to find a good solution (note: we cannot even guarantee local optimality). We assume that for a given temperature Tr we investigate L solutions. At any temperature level Tr, the probability of accepting a bad solution is at most Pr = e^(−∆g/Tr), where the function g is related to the objective function. A good simulated annealing algorithm should have a carefully selected cooling curve C (the cooling curve is nothing but a mechanism to reduce the temperature from one state to the next).
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 23 / 35
Simulated Annealing: An Implementation
The flow chart, flattened into steps (edge labels from the chart in brackets):
1. START: s(o) = I(S). Set k = 0, r = 0, s∗ = s(o), Tr = T. [initialize]
2. Get g(o) = G(s(o)). [evaluate]
3. Get s(n) = N(s(o)). [get neighbor]
4. Get g(n) = G(s(n)); set ∆g = g(n) − g(o). [evaluate]
5. Is ∆g < 0? [check for minimization]
   YES: go to step 6.
   NO: draw a random number p; if p ≤ e^(−∆g/Tr), go to step 6; otherwise return to step 3.
6. Update s∗ = best{s∗, s(n)}, s(o) = s(n), g(o) = g(n), k = k + 1.
7. TC? (termination criterion) YES: STOP, return s∗. NO: continue to step 8.
8. Is k ≥ L? YES: cool, Tr+1 = C(Tr), r = r + 1, k = 0. In either case, iterate from step 3.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 24 / 35
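Putting the flow chart into code gives the loop below: a sketch assuming minimization and one random neighbor per trial. All names and default parameter values are illustrative choices, not prescribed by the slides.

```python
import math
import random

def simulated_annealing(I, G, N, T=1.0, L=20, C=lambda t: 0.9 * t,
                        max_trials=10_000):
    """SA sketch: T initial temperature, L epoch length (successes per
    temperature), C cooling curve, max_trials the termination criterion TC."""
    s = I()                                  # initialize
    g = G(s)
    s_best, g_best = s, g
    k = 0                                    # successes at current temperature
    for _ in range(max_trials):              # TC: cap on neighbor searches
        s_n = N(s)                           # get one neighbor
        g_n = G(s_n)
        dg = g_n - g
        # accept improving moves; accept worse ones with probability e^(-dg/T)
        if dg < 0 or random.random() <= math.exp(-dg / T):
            s, g = s_n, g_n
            k += 1
            if g < g_best:                   # track the best solution seen
                s_best, g_best = s, g
            if k >= L:                       # epoch complete: cool down
                T = C(T)
                k = 0
    return s_best, g_best

# Example: a random immediate neighbor of the earlier quadratic problem.
# simulated_annealing(lambda: (2, 6, 0), F,
#                     lambda s: random.choice(immediate_neighbors(s)))
```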
Simulated Annealing: Parameters and Tuning
J Notice that Simulated Annealing requires the same problem-specific information (S, I, G, N).
J Simulated Annealing also needs some parameters, which can at first be selected arbitrarily. However, they must be tuned by performing some experimentation.
J The parameters needed for Simulated Annealing:
H C, the cooling curve:
+ A simple cooling curve: Tr+1 = α Tr, where 0 < α < 1.
+ A practical cooling curve: Tr+1 = Tr / (1 + β Tr), where β > 0.
+ A theoretical cooling curve: Tr+1 = T0 / ln r.
H T, the initial temperature:
+ A simple strategy is to set T very high at the beginning. Refer to the papers in the reference list, which discuss different ideas for selecting the initial temperature.
+ A practical strategy is to set T = 1 and normalize the ∆g function.
H L, the epoch length: L represents the total number of successes (accepted moves) at one temperature; this parameter must be tuned based on experimentation.
J TC, the terminating criterion:
H A limit can be placed on the total number of neighbor searches (or total iterations).
H A limit can be placed on the number of times the best objective function value has remained constant.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 25 / 35
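A quick way to compare the three cooling curves is to iterate them numerically. A sketch with arbitrarily chosen α, β and T0; the theoretical curve is shifted by two to avoid ln 0 and ln 1:

```python
import math

T0, alpha, beta = 100.0, 0.9, 0.01

schedules = {
    "simple":      lambda T, r: alpha * T,             # T_{r+1} = a * T_r
    "practical":   lambda T, r: T / (1 + beta * T),    # T_{r+1} = T_r / (1 + b*T_r)
    "theoretical": lambda T, r: T0 / math.log(r + 2),  # T_{r+1} = T0 / ln r (shifted)
}

for name, C in schedules.items():
    T = T0
    for r in range(50):
        T = C(T, r)
    print(f"{name:11s} T after 50 updates = {T:8.3f}")
```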
Simulated Annealing: Convergence
The SA converges (in probability) to a global minimum solution if and only if the algorithm has the following properties:
H Bounded: the best upper bound on the objective function value is M∗.
H Aperiodic: the algorithm can return to a solution, but at irregular times.
H Success Probability: infinitely many successful trials that escape local minima:
∑_{r=1}^{∞} e^(−M∗/Tr) = ∞.
H Irreducible: it is possible to move from any solution to any other solution.
H Cooling Curve: the cooling temperature should satisfy lim_{r→∞} Tr = 0.
Note: for Tr = M / ln r, SA with the above properties converges if and only if M ≥ M∗.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 26 / 35
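The note follows directly from the success-probability condition by a standard p-series argument:

```latex
T_r = \frac{M}{\ln r}
\;\Rightarrow\;
e^{-M^*/T_r} = e^{-(M^*/M)\ln r} = r^{-M^*/M},
\quad\text{and}\quad
\sum_{r=2}^{\infty} r^{-M^*/M} = \infty
\;\Longleftrightarrow\;
\frac{M^*}{M} \le 1
\;\Longleftrightarrow\;
M \ge M^*.
```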
ILP: Handling Infeasibility
Consider the standard ILP:

minimize : ∑_{j=1}^{n} cj xj
subject to :
∑_{j=1}^{n} aij xj = bi  ∀ i ∈ E
∑_{j=1}^{n} aij xj ≥ bi  ∀ i ∈ G
∑_{j=1}^{n} aij xj ≤ bi  ∀ i ∈ L
lj ≤ xj ≤ uj  ∀ j = 1 . . . n
xj ∈ Z+  ∀ j = 1 . . . n

How to apply SA to the above problem? The problem can be re-written by incorporating the infeasibility measure in the objective:

minimize : ∑_{j=1}^{n} cj xj + ∑_{i∈E} μi |∑_{j=1}^{n} aij xj − bi| + ∑_{i∈G} μi max{0, bi − ∑_{j=1}^{n} aij xj} + ∑_{i∈L} μi max{0, ∑_{j=1}^{n} aij xj − bi}
subject to :
lj ≤ xj ≤ uj  ∀ j = 1 . . . n
xj ∈ Z+  ∀ j = 1 . . . n

A differentiable infeasibility measure is obtained by squaring each penalty term instead:

minimize : ∑_{j=1}^{n} cj xj + ∑_{i∈E} μi (∑_{j=1}^{n} aij xj − bi)^2 + ∑_{i∈G} μi (max{0, bi − ∑_{j=1}^{n} aij xj})^2 + ∑_{i∈L} μi (max{0, ∑_{j=1}^{n} aij xj − bi})^2
subject to :
lj ≤ xj ≤ uj  ∀ j = 1 . . . n
xj ∈ Z+  ∀ j = 1 . . . n

Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 27 / 35
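As an illustration of the first (non-squared) reformulation, a sketch of the penalized objective g; the argument names are our own, and the row-index sets for the ≥ and ≤ constraints are called Gset and Lset to avoid clashing with the G and L of the SA notation:

```python
def penalized_objective(x, c, A, b, E, Gset, Lset, mu):
    """g(x) = c'x plus infeasibility penalties, one mu_i per constraint row."""
    n, m = len(x), len(b)
    row = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
    g = sum(c[j] * x[j] for j in range(n))
    g += sum(mu[i] * abs(row[i] - b[i]) for i in E)        # equality rows
    g += sum(mu[i] * max(0, b[i] - row[i]) for i in Gset)  # >= rows
    g += sum(mu[i] * max(0, row[i] - b[i]) for i in Lset)  # <= rows
    return g
```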
ILP: Example-1
Question: Consider the following ILP:
minimize : ∑_{i=1}^{8} xi
subject to :
xi + xj ≥ 1  ∀ (i, j) ∈ S
xi ∈ {0, 1}  ∀ i = 1, . . . , 8

where S = {(1,2), (1,6), (2,3), (2,4), (2,6), (3,5), (4,5), (4,7), (5,8), (6,7), (7,8)}. Write the model by incorporating the infeasibility measure in the objective function.

Solution: The objective can be re-written as:

g(x) = ∑_{i=1}^{8} xi + ∑_{(i,j)∈S} μi,j max{0, 1 − xi − xj}

where μi,j is some scaling parameter.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 28 / 35
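A sketch of this penalized objective in code; the value mu = 10 is an arbitrary illustrative choice of scaling parameter, and a single common mu is used for all pairs:

```python
S = [(1,2), (1,6), (2,3), (2,4), (2,6), (3,5), (4,5), (4,7), (5,8), (6,7), (7,8)]
mu = 10   # scaling parameter, chosen arbitrarily for illustration

def g(x):
    """x maps vertex 1..8 to 0/1; each uncovered pair (i, j) is penalized."""
    return sum(x.values()) + sum(mu * max(0, 1 - x[i] - x[j]) for i, j in S)

x = {i: 0 for i in range(1, 9)}
print(g(x))                          # 110: all 11 pairs uncovered
x.update({2: 1, 5: 1, 6: 1, 7: 1})   # a feasible cover
print(g(x))                          # 4: no penalty terms remain
```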
1 Heuristic & Meta-Heuristic
2 Greedy and Random Heuristic Single Variables Multiple Variables
3 Simulated Annealing ILP Applications
4 Constraint Programming
5 Conclusion
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 29 / 35
Constraint Programming: Idea
. Do the constraints help in the solution search for ILPs?
. What happens if there are too many constraints in an ILP?
. What happens if there are no constraints in an ILP?
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 30 / 35
Constraint Programming
+ Consider the following discrete variables:
x ∈ {1, 2, . . . , 8}, y ∈ {1, 2, . . . , 10}, z ∈ {1, 2, . . . , 10}
Q. How many combinations of the variables must we try in a complete enumeration search? 8 × 10 × 10 = 800.
+ Assume the following are two of the constraints of the problem: “x ≠ 7” and “y ≠ 2”.
Q. What is the change in the domains of x and y?
x ∈ {1, 2, 3, 4, 5, 6, 8}, y ∈ {1, 3, 4, 5, 6, 7, 8, 9, 10}, z ∈ {1, 2, . . . , 10}
Q. How many combinations must we try now? 7 × 9 × 10 = 630.
+ Assume the following is an additional constraint of the problem: “x − y = 3z”.
Q. Bounds Consistency: What is the change in the bounds of x, y and z?
+ Rewrite the constraint as x = 3z + y, and consider the following cases:
What is the minimum possible value of x on the LHS and on the RHS? 1 on the LHS, and 4 on the RHS.
What is the maximum possible value of x on the LHS and on the RHS? 8 on the LHS, and 40 on the RHS.
The bounds of x can thus be updated to 4 ≤ x ≤ 8, i.e. x ∈ {4, 5, 6, 8}.
+ A similar process can be executed for the other variables, giving:
x ∈ {4, 5, 6, 8}, y ∈ {1, 3, 4, 5}, z ∈ {1, 2}
Q. How many combinations must we try now? 4 × 4 × 2 = 32, which is much less than the initial 800.
+ Can we improve further by an element-wise feasibility check?
+ Consistency Check: consistency means that any infeasible partial assignment is explicitly ruled out by a constraint. For each variable, check whether every element in its domain admits a feasible solution.
+ For example, consider variable x and the constraint x = 3z + y. Check all the cases:
x = 4 has a solution (y = 1, z = 1). x = 5 has NO solution (it would require y = 2, which is excluded). x = 6 has a solution (y = 3, z = 1). x = 8 has a solution (y = 5, z = 1).
Thus, the domain of x can be updated. Treating the other variables similarly gives:
x ∈ {4, 6, 8}, y ∈ {1, 3, 5}, z ∈ {1}
Q. How many combinations must we try now? 3 × 3 × 1 = 9, which is much less than the initial 800.
+ The new domains can then be used in branch and bound, or cutting plane methods.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 31 / 35
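A brute-force sketch of the consistency check on the constraint x − y = 3z (equivalently x = 3z + y), starting from the domains after the unary constraints; the function name is our own:

```python
from itertools import product

# Domains after applying x != 7 and y != 2.
dom = {"x": {1, 2, 3, 4, 5, 6, 8},
       "y": {1, 3, 4, 5, 6, 7, 8, 9, 10},
       "z": set(range(1, 11))}

def filter_consistent(dom):
    """Keep only values that participate in some solution of x = 3z + y."""
    feasible = [(x, y, z) for x, y, z in product(dom["x"], dom["y"], dom["z"])
                if x == 3 * z + y]
    return {"x": {t[0] for t in feasible},
            "y": {t[1] for t in feasible},
            "z": {t[2] for t in feasible}}

print(filter_consistent(dom))
# x -> {4, 6, 8}, y -> {1, 3, 5}, z -> {1}: 9 combinations remain
```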
Constraint Programming
. Constraint Programming (CP) can be used in OR in several different ways.
. Examples: bounds consistency, arc and hyperarc consistency, domain filtering.
. CP can help reduce the branches of a B&B enumeration tree, and can provide valid cuts.
. See Hooker's paper in the references for more on CP.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 32 / 35
We Have Seen...
+ Reasons for using Heuristic Methods.
+ Difference between Heuristic and Exact methods.
+ Difference between Heuristic and Metaheuristic methods.
+ Concept of Immediate and Extended Neighborhood.
+ Concept of Greedy and Random-Walk.
+ Simulated Annealing Approach.
+ Domain Filtering via Constraint Programming
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 33 / 35
References
Chapter 10 from H. Taha, "Operations Research: An Introduction", 9th Edition, 2011.
Chapter 13 from F. S. Hillier and G. J. Lieberman, "Introduction to Operations Research", 9th Edition, 2010.
B. Hajek, "Cooling schedules for optimal annealing." Mathematics of Operations Research 13.2 (1988): 311-329.
D. Bertsimas and J. Tsitsiklis, "Simulated annealing." Statistical Science 8.1 (1993): 10-15.
J. N. Hooker, "Operations research methods in constraint programming." Foundations of Artificial Intelligence 2 (2006): 527-570.
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 34 / 35
Discussion
Syed N Mujahid (KFUPM) Heuristic Methods November 21, 2016 35 / 35
[This sheet must be completed and attached to the last page of your homework]
ISE 421
Operations Research II
Term 161
Homework #3
Student Name ID# Signature
Homework Guidelines
To receive full credit, you should make sure you follow the following guidelines.
Homework Presentation:
• Every main problem should be answered on a different page.
• You should submit the solutions for the first two problems only.
• All pages of your homework should be in chronological order.
• Your name, and the homework number should be clearly indicated.
Modeling Questions:
• Clearly define all the variables in one group. Then clearly define all the parameters in another group. Then display the final model in the standard style (Objective, Constraints, Restriction on Domain).
• For GAMS software questions, you should give the GAMS model and the part of the GAMS output that gives the solution (do not attach dozens of pages of GAMS output). Moreover, the solution of the model should be presented in the text; that is, write the optimal values of the variables and the objective in the paper. Make sure you incorporate scaling of the objective and/or constraints.
• For VBA code, give the VBA code file. In addition to that, the solution of the model should be presented in the text.
ISE-421 HW-3
Problem #1
Consider the following optimization problem:
min : ∑_{i=1}^{6} xi^2 − 4 ∑_{i=1}^{6} xi − 24    (1)
s.t. :
x1 − 2x2 − x3 = 0    (2)
x4 + x5 + x6 = 5    (3)
1 ≤ xi ≤ 9  ∀ i = 1, . . . , 6    (4)
where the variables of the multivariable optimization problem are integers.
(a) Let [9, 4, 1, 1, 3, 1]T be the current solution. Ignore Constraints (2) & (3), and write all the possible feasible immediate neighbors of the current solution.
(b) Let [9, 4, 1, 1, 3, 1]T be the current solution. Ignore Constraints (2) & (3), and write 3 possible feasible extended neighbors of the current solution. Use Random Number Table (available in the course blackboard).
(c) Ignore Constraints (2) & (3), execute one full iteration of the greedy search with the immediate neighborhood. Use starting solution as [9, 4, 1, 1, 3, 1]T .
(d) Ignore Constraints (2) & (3), execute 3 full iterations of the random-walk search with the extended neighborhood. Use starting solution as [9, 4, 1, 1, 3, 1]T . Use Random Number Table (available in the course blackboard).
(e) Ignore Constraints (2) & (3). Let initial temperature be 1000, and starting solution be [9, 4, 1, 1, 3, 1]T . Execute 6 iterations of the simulated annealing. After 3 iterations, reduce the temperature to 200, and continue with the remaining iterations. Use Random Number Table (available in the course blackboard).
(f) Design an objective function to incorporate Constraints (2) & (3) in the greedy or random-walk heuristics.
(g) Consider constraint: x1 − 2x2 − x3 = 0. Use the constraint programming approach to update the bounds of x1, x2 and x3.
(h) Consider constraint: x4 + x5 + x6 = 5. Use the constraint programming approach to update the bounds of the related variables.
(i) Use the information from Part (f), Part (g) and Part (h), and re-write the formulation.
(j) Use the bound information from Part (i), and write all the possible immediate neighbors of the current solution ([9, 4, 1, 1, 3, 1]T ).
Problem #2
Consider the simulated annealing algorithm. The objective type is to minimize the given objective function.
(a) Let T = 20 be the current value of the temperature. Let the objective function value of the current solution be 30. Starting from the current solution, determine the probability of accepting each of the following neighbor as the current solution for the next iteration.
Neighbor   Objective function value
A          29
B          34
C          31
D          24
(b) Using the random numbers from the Random Number Table (available in the course black- board), determine which of the neighbors from Part (a) would have been actually accepted as the current solution for the next iteration.
(c) If the value of temperature is cooled down to T = 2, then calculate the probabilities of Part (a). Assume the objective function value of the current solution is 30.
(d) Compare the probabilities for each neighbor from Part (a) and Part (c). What can you infer? Explain.
(e) Let the objective function value of the current solution be 10. Let the objective function value of a randomly generated neighbor be 400. What should be the minimum value of the temperature, for which the randomly generated neighbor will be accepted for sure as the current solution for the next iteration.
(f) Let the objective function value of the current solution be 10. Let the objective function value of a randomly generated neighbor be 400. What should be the minimum value of the temperature, for which the randomly generated neighbor will be accepted with 50% probability as the current solution for the next iteration.
(g) Let the current temperature be 10. If the cooling schedule is of the following type: Tnew = 0.9Told, then how many temperature updates are required for temperature to cool down to 0.2.
(h) Let the current temperature be 10. If the cooling schedule is of the following type: Tnew = 0.95Told, then how many temperature updates are required for temperature to cool down to 0.2.
(i) What can you infer from Part (g) and Part (h)? Explain.
(j) Let the minimum value of the objective function be −10, and the maximum value be 1000 units. What should be the minimum value of the initial temperature, so that the algorithm can easily explore all of the feasible space.
Problem #3 Practice Only
Consider the cover problem (Example-3) from the Integer Programming Slide.
(a) Solve the cover problem using Cplex solver.
(b) Incorporate the constraint in the objective function. Solve the problem using Greedy Heuristic with Immediate Neighbors (use VBA code). Select reasonable values for the scaling parameters.
(c) Incorporate the constraint in the objective function. Solve the problem using Random-Walk Heuristic (use VBA code). Select reasonable values for the scaling parameters.
(d) Incorporate the constraint in the objective function. Solve the problem using Simulated Annea- ling (use VBA code). Select reasonable values for the scaling parameters. Try different values of the initial temperature, and other parameters to find a good solution.
(e) Use the solution obtained from Part (d) as the initial solution of the greedy heuristic. Execute the greedy search using the VBA code.
(f) Compare the objective function value of all the above parts.