EXTREMAL VALUES OF TOLERANCES IN COMBINATORIAL OPTIMIZATION
Boris Goldengorin
Nizhny Novgorod Branch of Higher School of Economics, Russian Federation
and University of Groningen, The Netherlands
Outline of this Talk
• Elementary introduction
• Sensitivity analysis and tolerances
• Non-trivial tolerances are invariants for the set of optimal solutions
• Why extremal values of tolerances
• Partial enumeration by means of tolerances
• Computational study
• Concluding remarks
Elementary Introduction (upper tolerances for elements in an optimal solution)
Example. We are given an ordered set of numbers, say E = {3, 5, 4, 12, 7, 10}.
Problem 1. Find a smallest single element in E.
Solution 1. It is 3, attained at the first position, (1).
Related question 1. How much are we allowed to increase the optimal value 3 so that the element (1) remains optimal, keeping all other entries unchanged?
It is just the difference between the optimal value in E\{3} (the second-best solution) and the first optimal value 3, i.e., u(1) = 4 – 3 = 1. We call u(1) an upper tolerance of the given optimal solution (1); its value is 1.
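The upper-tolerance computation on this slide can be traced in a few lines; a minimal sketch of the example above (variable names are illustrative):

```python
# Upper tolerance of the optimal element for "find the smallest element in E".
E = [3, 5, 4, 12, 7, 10]

opt_index = min(range(len(E)), key=lambda i: E[i])   # position (1) -> index 0
opt_value = E[opt_index]                              # 3

# Second-best value: the optimum over E without the chosen element.
second_best = min(E[i] for i in range(len(E)) if i != opt_index)  # 4

u = second_best - opt_value   # upper tolerance u(1) = 4 - 3 = 1
print(u)  # 1
```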
Elementary Introduction (Problem 1 cont.) (lower tolerances for elements in an optimal solution)
Example. We are given an ordered set of numbers, say E = {3, 5, 4, 12, 7, 10}.
Problem 1. Find a smallest single element in E.
Solution 1. It is 3, attained at the first position, (1).
Related question 2. How much are we allowed to decrease the optimal value 3 so that the element (1) remains optimal?
By as much as we like, i.e., infinity. We call l(1) = ∞ a lower tolerance of the given optimal solution (1); its value is ∞.
Elementary Introduction (lower tolerances for elements outside of an optimal solution)
Example. We are given an ordered set of numbers, say E = {3, 5, 4, 12, 7, 10}.
Problem 1. Find a smallest single element in E.
Solution 1. It is 3, attained at the first position, (1).
Related question 3. How much are we allowed to decrease an element in E\{3} so that the element (1) remains optimal, keeping all other entries unchanged?
It is just the difference between that element, say 5, and the first optimal value 3, i.e., l(2) = 5 – 3 = 2. We call l(2) a lower tolerance for element (2) w.r.t. the given optimal solution (1). Similarly, the lower tolerances for all other elements in E\{3} are l(3) = 1, l(4) = 9, l(5) = 4, l(6) = 7.
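These lower tolerances can be computed in one pass; a minimal sketch of the example above (1-based positions as on the slide):

```python
# Lower tolerances l(i) for elements outside the optimal solution of
# "find the smallest element in E": how far each non-optimal element can be
# decreased before it displaces the optimum.
E = [3, 5, 4, 12, 7, 10]
opt_value = min(E)  # 3

lower = {i + 1: E[i] - opt_value          # 1-based positions as in the slides
         for i in range(len(E)) if E[i] != opt_value}
print(lower)  # {2: 2, 3: 1, 4: 9, 5: 4, 6: 7}
```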
Elementary Introduction (upper tolerances for elements in an optimal solution)
Example. We are given an ordered set of numbers, say E = {3, 5, 4, 12, 7, 10}.
Problem 2. Find a smallest sum of two elements in E.
Solution 1. It is 7, attained at the optimal solution (1, 3).
Related question 1. How much are we allowed to increase the weight of the element 3 (or 4) so that the optimal solution (1, 3) remains optimal, keeping all other entries unchanged?
To find another (second-best) solution it is enough to replace either element (1) or element (3) by the element of smallest weight in E\{3, 4}, i.e., c(2) = 5. Hence u(1) = 5 – 3 = 2 and u(3) = 5 – 4 = 1 w.r.t. the optimal solution (1, 3).
We call u(1), u(3) upper tolerances for the given optimal solution (1, 3).
Elementary Introduction (lower tolerances for elements outside of an optimal solution)
Example. We are given an ordered set of numbers, say E = {3, 5, 4, 12, 7, 10}.
Problem 2. Find the smallest sum of two elements in E.
Solution 1. It is 7, attained at the optimal solution (1, 3).
Related question 2. How much are we allowed to decrease the weights of elements outside the optimal solution (1, 3) so that (1, 3) remains optimal, keeping all other entries unchanged?
As is easy to see, any element outside (1, 3) may be reduced down to the largest value among the elements of the given optimal solution, i.e., c(3) = 4. Hence the lower tolerances are l(2) = 5 – 4 = 1, l(4) = 12 – 4 = 8, l(5) = 7 – 4 = 3, l(6) = 10 – 4 = 6, and the upper tolerances are u(2) = u(4) = u(5) = u(6) = ∞.
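For this two-element problem both kinds of tolerances follow from two numbers: the cheapest element outside the solution and the most expensive element inside it. A minimal sketch of the example above:

```python
# Upper and lower tolerances for "smallest sum of two elements in E".
# For this simple problem:
#   in-solution:     u(e) = cheapest element outside the solution - c(e)
#   out-of-solution: l(e) = c(e) - most expensive element inside the solution
E = {1: 3, 2: 5, 3: 4, 4: 12, 5: 7, 6: 10}    # 1-based positions

S = sorted(E, key=E.get)[:2]                   # optimal solution: positions [1, 3]
outside = [i for i in E if i not in S]

cheapest_out = min(E[i] for i in outside)      # 5
priciest_in = max(E[i] for i in S)             # 4

upper = {i: cheapest_out - E[i] for i in S}         # {1: 2, 3: 1}
lower = {i: E[i] - priciest_in for i in outside}    # {2: 1, 4: 8, 5: 3, 6: 6}
```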
Elementary Introduction (extremal values for upper and lower tolerances)
Example. We are given an ordered set of numbers, say E = {3, 5, 4, 12, 7, 10}.
Problem 2. Find the smallest sum of two elements in E.
Solution 1. It is 7, attained at the optimal solution (1, 3).
Related questions 3.
- Is the smallest value among all upper tolerances equal to the smallest value among all lower tolerances?
- Is the largest value among all upper tolerances equal to the largest value among all lower tolerances?
As is easy to see, min{u(1), u(3)} = u(3) = 1 and min{l(2), l(4), l(5), l(6)} = l(2) = 1. Hence the smallest upper and lower tolerances are equal! Is such an equality valid for any other combinatorial optimization problem?
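Both extremal-value questions can be checked directly on the tolerance values derived above for Problem 2; a minimal sketch:

```python
# Extremal values of the tolerances computed for Problem 2.
upper = {1: 2, 3: 1}                 # u(1), u(3)
lower = {2: 1, 4: 8, 5: 3, 6: 6}    # l(2), l(4), l(5), l(6)

print(min(upper.values()) == min(lower.values()))  # True: both equal 1
# The largest values, however, need not coincide here:
print(max(upper.values()), max(lower.values()))    # 2 8
```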
Sensitivity analysis
• After an optimal solution to a combinatorial optimization problem has been determined, a natural next step is to apply sensitivity analysis, sometimes also referred to as post-optimality analysis or what-if analysis.
• Sensitivity analysis is also a well-established topic in linear programming and mixed integer programming. The purpose of sensitivity analysis is to determine how the optimality of the given optimal solution depends on the input data.
Why Sensitivity Analysis
• In many cases the data used are inexact or uncertain. In such cases sensitivity analysis is necessary to determine the credibility of the optimal solution and conclusions based on that solution.
• Another reason for performing sensitivity analysis is that sometimes rather significant considerations have not been built into the model due to the difficulty of formulating them. Having solved the simplified model, the decision maker wants to know how well the optimal solution fits in with the other considerations.
Tolerances
• The most interesting topic of sensitivity analysis is the special case when the value of a single element in the optimal solution is subject to change.
• The goal of making such perturbations is to determine the tolerances, defined as the maximum change in a given individual cost (weight, distance, time, etc.) that preserves all other entries of the input and the optimality of the given optimal solution.
Tolerances vs. Algorithms
Today we are going to discuss how to apply tolerances of easily solvable relaxations in order to improve the efficiency of exact and approximation algorithms for both polynomially solvable and NP-hard problems in combinatorial optimization.
Upper Tolerances
• Upper tolerance: the largest increase in the cost of an element such that our current optimal solution remains optimal keeping all other elements unchanged.
• Equivalently, it is the increase in the cost of a cheapest solution after the element is excluded from the solution set.
Lower Tolerances
• A lower tolerance is the largest decrease in the cost of an element such that our current optimal solution remains optimal keeping all other elements unchanged.
Our class of combinatorial optimization problems
• Objective functions are additive linear functions;
• A feasible solution becomes infeasible after deletion (insertion) of a single element from (to) the feasible solution (a non-embedded set of feasible solutions).
• Examples of problems defined on simple weighted graphs with non-negative weights:
- Polynomially solvable: Shortest Path, Minimum Spanning Tree (1-Tree), Assignment (Matching), Max-Flow-Min-Cut;
- NP-hard: Traveling Salesman, Maximum Weighted Independent Set, Linear Ordering Problem, Max-Cut, Capacitated Vehicle Routing.
Libura’s Theorem for the TSP, Discrete Applied Mathematics, 1991
Let f(S*) = min{f(S) : S ∈ D} be the optimal value, attained at an optimal solution S*.
For e ∈ S*, the upper tolerance is u(e) = min{f(S) : S ∈ D−(e)} − f(S*), and for e ∉ S*, u(e) = ∞.
For e ∉ S*, the lower tolerance is l(e) = min{f(S) : S ∈ D+(e)} − f(S*), and for e ∈ S*, l(e) = ∞.
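Libura's definitions can be evaluated by brute force whenever the set D of feasible solutions can be listed explicitly; a sketch (function names are illustrative, and the exhaustive enumeration is only for illustration):

```python
# Brute-force tolerances per Libura's definitions:
#   u(e) = min{f(S) : S in D-(e)} - f(S*)  for e in S*,
#   l(e) = min{f(S) : S in D+(e)} - f(S*)  for e not in S*.
from itertools import combinations

def tolerances(elements, cost, feasible):
    """elements: element ids; cost: dict e -> weight;
    feasible: list of frozensets (the set D of feasible solutions)."""
    f = lambda S: sum(cost[e] for e in S)
    best = min(feasible, key=f)
    f_star = f(best)
    u, l = {}, {}
    for e in elements:
        if e in best:    # D-(e): feasible solutions avoiding e
            D_minus = [S for S in feasible if e not in S]
            u[e] = min(map(f, D_minus), default=float("inf")) - f_star
        else:            # D+(e): feasible solutions containing e
            D_plus = [S for S in feasible if e in S]
            l[e] = min(map(f, D_plus), default=float("inf")) - f_star
    return best, u, l

# Problem 2 from the slides: all 2-element subsets of E are feasible.
E = {1: 3, 2: 5, 3: 4, 4: 12, 5: 7, 6: 10}
D = [frozenset(c) for c in combinations(E, 2)]
S_star, u, l = tolerances(E, E, D)
```

Running this reproduces the tolerances from the elementary introduction: S* = {1, 3}, u = {1: 2, 3: 1}, l = {2: 1, 4: 8, 5: 3, 6: 6}.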
Upper and Lower Basic Solutions
min{f(S) : S ∈ D−(e)} = f(S*−(e)) ⟹ upper basic solutions D*−(e) = {S*−(e) : e ∈ S*}
min{f(S) : S ∈ D+(e)} = f(S*+(e)) ⟹ lower basic solutions D*+(e) = {S*+(e) : e ∉ S*}
(Non-)Trivial Tolerances
• The upper tolerance of an element outside the given optimal solution is either zero or infinity.
• The lower tolerance of an element of the given optimal solution is either zero or infinity.
• The infinite values of upper and lower tolerances mentioned above are called trivial; all other tolerances are called non-trivial.
Non-trivial tolerances are invariants for the set of optimal solutions
Non-trivial tolerances are independent of the chosen optimal solution.
Application of Lower Tolerances
The first application of lower tolerances (called alpha-values) is due to Helsgaun (EJOR 2000) in the modified Lin-Kernighan heuristic for the Symmetric TSP (STSP).
Computing Tolerances for the Assignment, 1-Tree Problems
For the Assignment Problem (AP) and the 1-Tree Problem (1-TP), an additional problem must be solved (Libura, 1991) to compute each single non-trivial tolerance value.
Helsgaun’s observation for the STSP (EJOR 2000)
• An optimal tour normally contains between 70% and 80% of the edges of a minimum 1-tree.
• The 5 alpha-nearest edges (those with the smallest lower tolerances) are used within the Lin-Kernighan heuristic.
How to compute the five smallest alpha-values (lower tolerances)
• Helsgaun has designed an algorithm with quadratic time and linear space complexities.
• Note that Kruskal’s algorithm returns all upper and lower tolerances together with an optimal Minimum Spanning Tree.
• We suggested studying properties of upper and lower tolerances in order to reduce Helsgaun's computational complexity.
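The observation about Kruskal's algorithm can be illustrated directly: one spanning-tree computation plus cycle/cut inspections yields every non-trivial MST tolerance. A brute-force sketch (function names and the O(m·n) path queries are illustrative, not Helsgaun's algorithm):

```python
# MST tolerances from a Kruskal run. Edges are triples (w, a, b) with a < b:
#   u(e), e in T:     cheapest non-tree edge reconnecting T - e, minus w(e);
#   l(f), f not in T: w(f) minus the heaviest tree edge on the cycle of T + f.

def kruskal(n, edges):
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    tree = []
    for w, a, b in sorted(edges):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            tree.append((w, a, b))
    return tree

def mst_tolerances(n, edges):
    tree = kruskal(n, edges)
    tree_set = set(tree)
    adj = {v: [] for v in range(n)}
    for w, a, b in tree:
        adj[a].append((b, w))
        adj[b].append((a, w))

    def path_edges(src, dst):
        # Edges on the unique tree path from src to dst (DFS on the tree).
        stack = [(src, -1, [])]
        while stack:
            v, par, path = stack.pop()
            if v == dst:
                return path
            for nxt, w in adj[v]:
                if nxt != par:
                    stack.append((nxt, v, path + [(w, min(v, nxt), max(v, nxt))]))

    lower = {}
    best_repl = {e: float("inf") for e in tree}
    for f in edges:
        if f in tree_set:
            continue
        w, a, b = f
        cycle = path_edges(a, b)
        lower[f] = w - max(cw for cw, _, _ in cycle)   # lower tolerance of f
        for e in cycle:                                 # f can replace each e
            best_repl[e] = min(best_repl[e], w)
    upper = {e: best_repl[e] - e[0] for e in tree}      # upper tolerance of e
    return tree, upper, lower
```

For example, on the 4-vertex graph with edges (1,0,1), (2,1,2), (3,0,2), (4,2,3), (5,1,3) the tree is {(1,0,1), (2,1,2), (4,2,3)} with upper tolerances 2, 1, 1 and lower tolerances 1 for both non-tree edges.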
Assignment Problem (1)
Find a set of cycles such that each location lies on exactly one cycle and the total length of the arcs in the cycles is as small as possible.
How many elements are in and outside of an optimal solution
• For the following problems the number of elements in an optimal (feasible) solution is linear in the problem size, while the number of elements outside an optimal solution is quadratic:
• TSP, AP, Minimum Spanning Tree, 1-Tree, and Shortest Path problems.
Smallest upper and smallest lower tolerances
• The smallest value u_s of the non-trivial upper tolerances is equal to the smallest value l_s of the non-trivial lower tolerances, i.e., u_s = l_s.
• If u_s > 0, then for any f with f_s < f < f_s + u_s there is no feasible solution of the original COP with value f.
Upper and Lower Basic Solutions Sets for All Optimal Solutions
• G is the set of elements with the largest upper tolerance defined on the intersection of all optimal solutions.
• H is the set of elements with the largest lower tolerance defined on the complement to the intersection of all optimal solutions.
• A is the set of upper basic solutions for all optimal solutions.
• B is the set of all complements to lower basic solutions for all optimal solutions.
Connected Feasible Solutions
The set of feasible solutions is called connected if both conditions a) and b) hold:
a) A ∩ H is non-empty;
b) B ∩ G is non-empty.
Sufficient Conditions for Equality of the Largest Upper and Lower Tolerances (Goldengorin and Sierksma, 2003)
If the set of feasible solutions is connected, then the largest values of the upper and lower tolerances are equal (see, e.g., Goldengorin, Jager and Molitor, Journal of Computer Science, 2006).
Necessary and Sufficient Conditions for Equality of the Largest Upper and Lower Tolerances
• Definition. An upper covering is the union of all upper basic solutions without the chosen optimal solution.
• Main Theorem. The largest values of upper and lower tolerances are equal iff the upper covering equals the complement of the chosen optimal solution in the ground set.
Enumeration Algorithms for the AP
How to design and improve the Hungarian Algorithm based on the non-trivial upper and lower tolerances
Hungarian Algorithm
1. Reduce each row and column by its smallest entry (creating at least one zero in each row and column, i.e., the reduced matrix).
2. Cover all zeros in the reduced matrix by a minimum number k of lines (horizontal and vertical), applying the Konig-Egervary theorem.
3. If k = n, output an optimal AP solution; otherwise reduce all uncovered entries by the smallest uncovered entry (adding it to doubly covered entries) and return to step 2.
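Step 1 can be traced in a few lines on the machines-jobs matrix of the example that follows (the covering step is omitted in this sketch); the intermediate matrices match the tables below:

```python
# Row and column reduction of the Hungarian algorithm on the slide's matrix.
M = [[14, 5, 8, 7],
     [2, 12, 6, 5],
     [7, 8, 3, 9],
     [2, 4, 6, 10]]

# Step 1a: subtract each row's minimum from the row.
reduced = [[x - min(row) for x in row] for row in M]
# -> [[9, 0, 3, 2], [0, 10, 4, 3], [4, 5, 0, 6], [0, 2, 4, 8]]

# Step 1b: subtract each column's minimum from the column.
col_min = [min(col) for col in zip(*reduced)]
reduced = [[x - m for x, m in zip(row, col_min)] for row in reduced]
# -> [[9, 0, 3, 0], [0, 10, 4, 1], [4, 5, 0, 4], [0, 2, 4, 6]]
```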
Cost matrix for Machineco with row minima:

Time (hours)   Job 1  Job 2  Job 3  Job 4 | Row minimum
Machine 1        14      5      8      7  |     5
Machine 2         2     12      6      5  |     2
Machine 3         7      8      3      9  |     3
Machine 4         2      4      6     10  |     2

Cost matrix after row minima are subtracted:

Time (hours)   Job 1  Job 2  Job 3  Job 4
Machine 1         9      0      3      2
Machine 2         0     10      4      3
Machine 3         4      5      0      6
Machine 4         0      2      4      8
Cost matrix after the column minimum is subtracted:

Time (hours)   Job 1  Job 2  Job 3  Job 4
Machine 1         9      0      3      0
Machine 2         0     10      4      1
Machine 3         4      5      0      4
Machine 4         0      2      4      6

After a further reduction step, four lines are required to cover all zeros, and an optimal solution is available:

Time (hours)   Job 1  Job 2  Job 3  Job 4
Machine 1        10      0      3      0
Machine 2         0      9      3      0
Machine 3         5      5      0      4
Machine 4         0      1      3      5
Tolerance Based Algorithm for the AP

Original cost matrix:

Time (hours)   Job 1  Job 2  Job 3  Job 4
Machine 1        14      5      8      7
Machine 2         2     12      6      5
Machine 3         7      8      3      9
Machine 4         2      4      6     10

[Figure: bipartite graph on machines M1-M4 and jobs J1-J4 showing an optimal AP solution S* together with the upper basic solutions S*−(e) and lower basic solutions S*+(e); "good" and "bad" arcs are marked.]
Asymmetric Traveling Salesman Problem (ATSP)
Suppose a set of locations is given, together with the distance between each ordered pair of them.
Asymmetric Traveling Salesman Problem (2)
Objective: find a shortest tour visiting all locations exactly once.
Asymmetric Traveling Salesman Problem (3)
• In the worst case exponentially many solutions have to be searched through.
• The problem is hard to solve to optimality.
• It occurs frequently in practice.
Enumeration of AP solutions w.r.t. an optimal ATSP solution

[Figure: search tree of AP solutions (number of feasible solutions vs. AP solutions); branches I and II partition the subproblems, numbered 1-22.]

b1 = max{u11, u12, u13, u14, u15}
b2 = max{u21, u22}
Bottleneck Tolerances

[Figure: the same search tree; the bottleneck tolerance of each branch is the largest upper tolerance among its arcs.]

b1 = max{u11, u12, u13, u14, u15}
b2 = max{u21, u22}
A Largest Jump

b1 = max{u11, u12, u13, u14, u15}
b2 = max{u21, u22}
Branch and Bound
• A method to solve NP-hard problems (such as the ATSP) is Branch and Bound (BnB).
• BnB: solve an easy relaxation of the problem. If its solution is not feasible for the original problem, partition the problem into subproblems (branching).
• A subproblem can be discarded if its lower bound exceeds the value of the current best solution.
• The subproblems are organized in a search tree.
State-of-the-art BnB algorithms for the ATSP
• Miller & Pekny (1991)
• Carpaneto et al. (1995)
• Fischetti & Toth (1997)
These are Best First Search algorithms: they search through the most promising subproblem first.
Depth First Search (DFS)
• A search strategy in BnB.
• Search through the most recently generated subproblem first.
• Used for very difficult problems because:
1. Only linear memory space is required;
2. A good solution is likely even if terminated prematurely (truncated BnB; see for example Zhang, 2000).
Branching rule
Delete the red arc a (subproblem 1).
[Figure: tour with arc a highlighted.] (Carpaneto, 1980)
Branching rule (2)
Delete the red arc b and include a (subproblem 2).
[Figure: tour with arcs a and b highlighted.]
BnB in action
If the ‘wrong’ arc is removed first, the BnB algorithm searches through subproblems 1 to 5 without finding an optimal solution.

[Figure: search tree with root 0 and subproblems 1-8. Delete arc a first? Or arc b?]
Tolerance Based Branch-and-Bound Algorithms for the ATSP
1. Solve the AP. If an optimal AP solution is a Hamiltonian cycle, output it as an optimal ATSP solution; otherwise compute the bottleneck tolerances (upper and lower), update the lower bounds, fathom the search tree, and branch.
2. Choose an unsolved subproblem and return to step 1.
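The loop above can be sketched with scipy's linear_sum_assignment as the AP solver. This is a deliberately simplified sketch, not the tolerance-based rule of the talk: it branches by excluding, in turn, each arc of the shortest subtour (Carpaneto-style) and uses only the plain AP bound; a large constant BIG stands for excluded arcs, and the diagonal of the input matrix is assumed to be set to BIG already.

```python
# Simplified AP-relaxation branch and bound for the ATSP.
import numpy as np
from scipy.optimize import linear_sum_assignment

BIG = 10**6  # stands in for "arc excluded"; the input diagonal should be BIG

def cycles(succ):
    """Decompose a successor permutation into its cycles."""
    seen, out = set(), []
    for s in range(len(succ)):
        if s not in seen:
            cyc, v = [], s
            while v not in seen:
                seen.add(v)
                cyc.append(v)
                v = succ[v]
            out.append(cyc)
    return out

def atsp_bnb(C):
    n = len(C)
    best_cost, best_tour = float("inf"), None
    stack = [np.array(C, dtype=float)]          # depth-first search
    while stack:
        D = stack.pop()
        rows, cols = linear_sum_assignment(D)   # solve the AP relaxation
        cost = D[rows, cols].sum()
        if cost >= best_cost:
            continue                            # fathom: bound no better
        succ = cols.tolist()
        cyc = min(cycles(succ), key=len)        # shortest subtour
        if len(cyc) == n:
            best_cost, best_tour = cost, succ   # AP solution is a tour
            continue
        # Branch: every tour must omit at least one arc of the subtour.
        for i in cyc:
            D2 = D.copy()
            D2[i, succ[i]] = BIG
            stack.append(D2)
    return best_cost, best_tour
```

On a 4-city instance whose AP optimum splits into two 2-cycles, the branching breaks the subtours and the search returns an optimal Hamiltonian cycle.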
Common Arcs iff Upper Tolerances > 0
• All arcs in an optimal solution with strictly positive upper tolerances are common arcs for all optimal AP solutions.
• If upper tolerances for all arcs in an optimal AP solution are strictly positive, then the optimal solution is unique.
• If the upper tolerances of all arcs in an optimal AP solution are equal to zero, then there are no common arcs for all optimal AP solutions.
Results for ATSPLIB instances
Instance    n    s(0)    t(0)    s(4)    t(4)    s(8)   t(8)
ft53       53   20111    2.31  336586  448.13  19200   6.70
ft70       70   25831    3.85    5861   28.30   4993   1.87
ftv33      34    7065    0.22     748    0.49   1362   0.16
ftv35      36    6945    0.22    3109    2.03   1965   0.27
ftv38      39    6195    0.22    1523    1.54   2091   0.38
ftv44      45     205    0.06     165    0.22    171   0.01
ftv47      48   29025    1.32    3302    6.15   7692   1.54
ftv55      56   92447    4.51   10100   29.56  23034   9.51
ftv64      65   43441    3.13    8319   34.73  15509  11.10
ftv70      71  253873   23.08   50024  205.11  17694   8.68
World TSP Record: VLSI instance XSC6880
• 6.10.02: tour of length 21542. Best of 10 runs of LKH, each with n trials. Total running time 28242 seconds on an AMD Athlon 1900+ processor (1.6 GHz).
• 2.26.03: tour of length 21537. Found by Hung Dinh Nguyen using a parallel hybrid genetic algorithm. The running time was 16033 seconds on a Sun Ultra 80 with four 450 MHz processors (taking the best of five runs).
• 5.24.06: tour of length 21535. Found by Boris Goldengorin, Gerold Jager, Paul Molitor and Dirk Richter by a tolerance-based variant of LKH. The running time was 5327 seconds on an AMD64 4000+ (2.4 GHz).
Summary of Tolerance Based Approach. 1
• Non-trivial tolerances are invariants of the set of optimal solutions for combinatorial optimization problems with an additive objective function and a non-embedded set of feasible solutions.
• Extremal values of upper and lower tolerances are indicators for (non-) stability of optimal solutions.
• We presented a framework for constructing branching rules and lower bounds based on upper and lower (bottleneck) tolerances.
Summary of Tolerance Based Approach. 2
• We implemented this framework successfully for the Asymmetric Traveling Salesman and Asymmetric Capacitated Vehicle Routing Problems, based on their Assignment Problem relaxation.
• More constrained instances have tighter bottleneck bounds and branching rules.
• Improved bounding and a synergy effect between bounding and branching.
Tolerance Based Approach Literature. 1
• B. Goldengorin, G. Jager, P. Molitor. Tolerances applied in combinatorial optimization. Journal of Computer Science 2(9), 716-734, 2006.
• M. Turkensteen, D. Ghosh, B. Goldengorin, G. Sierksma. Tolerance-based Branch and Bound algorithms for the ATSP. European Journal of Operational Research 189, 775-788, 2008.
• D. Ghosh, B. Goldengorin, G. Gutin, G. Jager. Tolerance-based greedy algorithms for the traveling salesman problem. In: Mathematical Programming and Game Theory for Decision Making, S.K. Neogy et al. (eds.), World Scientific, New Jersey, 2008, pp. 47-59.
Tolerance Based Approach Literature. 2
• Gerold Jager. The Theory of Tolerances with Applications to the Traveling Salesman Problem. Habilitationsschrift. University of Kiel, Germany, 2011.
• R. Germs, B. Goldengorin, M. Turkensteen. Lower tolerance-based Branch and Bound algorithms for the ATSP. Computers & Operations Research, in press (available online 8 April 2011), doi:10.1016/j.cor.2011.04.003.
Thank you!
Questions?