
Combinatorial Optimization: An Introduction

PAUL G. SPIRAKIS

Optimization Problems

Definition 1.1

An instance of an optimization problem is a pair (F, c), where F is any set, the domain of feasible points; c is the cost function, a mapping

$c : F \to \mathbb{R}^1$

The problem is to find an $f \in F$ for which

$c(f) \le c(y) \quad \text{for all } y \in F$

Such a point f is called a globally optimal solution to the given instance, or, when no confusion can arise, simply an optimal solution.

Definition 1.2

An optimization problem is a set I of instances of an optimization problem.

Example (Traveling Salesman Problem)

In an instance of the TSP we are given an integer $n > 0$ and the distance between every pair of $n$ cities in the form of an $n \times n$ matrix $[d_{ij}]$, where $d_{ij} \in \mathbb{Z}^+$.

A tour is a closed path that visits every city exactly once. The problem is to find a tour with minimal total length. We can take

𝐹 = {all cyclic permutations 𝜋 on 𝑛 objects}

A cyclic permutation $\pi$ represents a tour if we interpret $\pi(j)$ to be the city visited after city $j$, $j = 1, \ldots, n$. Then the cost $c$ maps $\pi$ to $\sum_{j=1}^{n} d_{j\pi(j)}$.
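As a concrete illustration, here is a minimal Python sketch (not part of the original slides) that evaluates this cost function for a tour given as a cyclic permutation; the distance matrix and the particular tour are assumed toy data.

# Cost of a TSP tour given as a cyclic permutation.
# dist[i][j] is the distance d_ij; pi[j] is the city visited after city j.
# Cities are indexed 0..n-1 here instead of 1..n.

def tour_cost(dist, pi):
    """Return the sum over j of d_{j, pi(j)}."""
    return sum(dist[j][pi[j]] for j in range(len(pi)))

# Example: 4 cities, tour 0 -> 1 -> 3 -> 2 -> 0 encoded as a cyclic permutation.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
pi = [1, 3, 0, 2]                # pi[0] = 1, pi[1] = 3, pi[3] = 2, pi[2] = 0
print(tour_cost(dist, pi))       # 2 + 4 + 9 + 3 = 18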

Example (Minimal Spanning Tree)

As above, we are given an integer $n > 0$ and an $n \times n$ symmetric distance matrix $[d_{ij}]$, where $d_{ij} \in \mathbb{Z}^+$. The problem is to find a spanning tree on $n$ vertices that has minimal total length of its edges. In our definition of an instance of an optimization problem, we choose

𝐹 = {all spanning trees (𝑉, 𝐸) with 𝑉 = {1, 2, … , 𝑛}}

$c : (V, E) \to \sum_{\{i,j\} \in E} d_{ij}$

(By a spanning tree we mean an undirected graph (V, E) that is connected and acyclic.)
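In the same spirit, a minimal Python sketch (not from the slides) of this cost function, with an assumed toy distance matrix; a spanning tree on $n$ vertices is stored simply as its set of $n - 1$ edges.

# Cost of a spanning tree (V, E): the sum of d_ij over the edges {i, j} in E.

def tree_cost(dist, edges):
    return sum(dist[i][j] for (i, j) in edges)

dist = [
    [0, 1, 4],
    [1, 0, 2],
    [4, 2, 0],
]
edges = {(0, 1), (1, 2)}         # spanning tree on V = {0, 1, 2} using edges {0,1} and {1,2}
print(tree_cost(dist, edges))    # 1 + 2 = 3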

Example (Linear Programming)

Let $m, n$ be positive integers, $b \in \mathbb{Z}^m$, $c \in \mathbb{Z}^n$, and $A$ an $m \times n$ matrix with elements $a_{ij} \in \mathbb{Z}$. An instance of LP is defined by

$F = \{x : x \in \mathbb{R}^n,\ Ax = b,\ x \ge 0\}$

$c : x \to c'x$

Stated as such, linear programming is a continuous optimization problem, with, in fact, an uncountable number of feasible points 𝑥 ∈ 𝐹. To see how it can be considered combinatorial in nature, consider the simple instance defined by 𝑚 = 1, 𝑛 = 3 and 𝐴 = (1 1 1), 𝑏 = (2).
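The feasible set of this simple instance is the triangle in $\mathbb{R}^3$ with vertices $(2, 0, 0)$, $(0, 2, 0)$, and $(0, 0, 2)$, and the linear cost $c'x$ attains its minimum at (at least) one of these three vertices; this finite choice is what gives the problem its combinatorial flavour. A minimal Python sketch (not from the slides) of that observation, with an assumed cost vector:

# The LP instance with A = (1 1 1), b = (2): its feasible triangle has three
# vertices, and a linear cost c'x is minimized at (at least) one of them.
# The cost vector below is an assumption for the demonstration.

vertices = [(2, 0, 0), (0, 2, 0), (0, 0, 2)]
c = (3, 1, 2)                    # assumed cost vector

def cost(c, x):
    return sum(ci * xi for ci, xi in zip(c, x))

best = min(vertices, key=lambda v: cost(c, v))
print(best, cost(c, best))       # (0, 2, 0) 2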

Example

Consider an instance of the MST problem with $n = 3$ points. There are three spanning trees of these points, each consisting of two of the three possible edges $e_1, e_2, e_3$. They can also be thought of as points in 3-dimensional space if $x_j = 1$ whenever edge $e_j$ is in the tree considered, and zero otherwise, $j = 1, 2, 3$.

Example (cont.)

These three spanning trees then coincide with the vertices $v_1$, $v_2$, and $v_3$ of the feasible set defined by the constraints:

𝑥1 + 𝑥2 + 𝑥3 = 2

𝑥1 ≥ 0, 𝑥2≥ 0, 𝑥3 ≥ 0

𝑥1 ≤ 1, 𝑥2 ≤ 1, 𝑥3 ≤ 1

Thus this purely combinatorial problem can, in principle, be solved by LP.
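To make this concrete, here is a hedged sketch (not from the slides) that hands the three-edge instance above to scipy.optimize.linprog, assuming SciPy is available; the edge lengths are assumed values. The optimum lands on a 0/1 vertex of the feasible set, i.e. on the incidence vector of a spanning tree.

# The n = 3 MST instance as an LP, assuming SciPy is available.
# x1, x2, x3 indicate which edges are in the tree; edge lengths are assumed.
from scipy.optimize import linprog

d = [3.0, 1.0, 2.0]              # assumed lengths of edges e1, e2, e3
res = linprog(c=d, A_eq=[[1, 1, 1]], b_eq=[2], bounds=[(0, 1)] * 3, method="highs")
print(res.x, res.fun)            # [0. 1. 1.] 3.0 -> the spanning tree {e2, e3}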

Neighborhoods

Definition 1.3

Given an optimization problem with instances $(F, c)$, a neighborhood is a mapping $N : F \to 2^F$ defined for each instance.

Example 1

In the TSP we may define a neighborhood called 2-change by:

𝑁2(𝑓) = {𝑔: 𝑔 ∈ 𝐹 and 𝑔 can be obtained from 𝑓 as follows: remove two edges from the tour; then replace them with two other edges to obtain a tour again}
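A minimal Python sketch of this neighborhood (not from the slides), with a tour stored as a visiting order rather than as a cyclic permutation: removing two non-adjacent tour edges and reconnecting the endpoints amounts to reversing a contiguous segment of the visiting order.

# The 2-change neighborhood N2 of a tour stored as a visiting order of cities 0..n-1.
# Removing edges (tour[i], tour[i+1]) and (tour[j], tour[(j+1) % n]) and reconnecting
# reverses the segment tour[i+1..j].

def two_change_neighbors(tour):
    n = len(tour)
    for i in range(n - 1):
        for j in range(i + 2, n):        # skip adjacent edge pairs: they give back the same tour
            if i == 0 and j == n - 1:
                continue                 # these two edges are also adjacent (they share tour[0])
            yield tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]

for g in two_change_neighbors([0, 1, 2, 3]):
    print(g)                             # [0, 2, 1, 3] and [0, 1, 3, 2]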

Example 2

In the MST, an important neighborhood is defined by:

𝑁(𝑓) = {𝑔 ∶ 𝑔 ∈ 𝐹 and 𝑔 can be obtained from 𝑓 as follows: add an edge 𝑒 to the tree 𝑓, producing a cycle; then delete any edge on the cycle}
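A hedged Python sketch of this exchange neighborhood (not from the slides): a spanning tree is stored as a set of two-element frozensets, the unique cycle created by a new edge {u, v} is the tree path from u to v plus that edge, and the helper names are illustrative assumptions.

# Exchange neighborhood for spanning trees on vertices 0..n-1.
from itertools import combinations

def tree_path(tree, u, v):
    """Edges on the unique u-v path in the tree, found by depth-first search."""
    adj = {}
    for a, b in (tuple(e) for e in tree):
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    stack, parent = [u], {u: None}
    while stack:
        x = stack.pop()
        if x == v:
            break
        for y in adj.get(x, []):
            if y not in parent:
                parent[y] = x
                stack.append(y)
    path, x = [], v
    while parent[x] is not None:         # walk back from v to the root u
        path.append(frozenset({x, parent[x]}))
        x = parent[x]
    return path

def exchange_neighbors(tree, n):
    """Yield every tree obtained by adding one non-tree edge and deleting one cycle edge."""
    for u, v in combinations(range(n), 2):
        e = frozenset({u, v})
        if e in tree:
            continue
        for cycle_edge in tree_path(tree, u, v):
            yield (tree - {cycle_edge}) | {e}

tree = {frozenset({0, 1}), frozenset({1, 2}), frozenset({2, 3})}
for g in exchange_neighbors(tree, 4):
    print(sorted(tuple(sorted(e)) for e in g))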

Example 3

In LP, we can define a neighborhood by:

$N_\epsilon(x) = \{y : Ay = b,\ y \ge 0,\ \text{and } \|y - x\| \le \epsilon\}$

This is simply the set of all feasible points within Euclidean distance 𝜖 of 𝑥, for some 𝜖 > 0.

Local and Global Optima

Definition 1.4

Given an instance (𝐹, 𝑐) of an optimization problem and a neighborhood 𝑁, a feasible solution 𝑓 ∈𝐹 is called locally optimal with respect to 𝑁 (or simply locally optimal whenever 𝑁 is understood by context) if:

$c(f) \le c(g) \quad \text{for all } g \in N(f)$
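Definition 1.4 suggests the basic local search procedure: repeatedly replace the current solution with a cheaper neighbor until none exists. Here is a generic Python sketch (not from the slides); the cost function and neighborhood generator are assumed to be callables such as the TSP helpers sketched earlier.

# Generic local search: the returned f satisfies c(f) <= c(g) for all g in N(f),
# i.e. it is locally optimal with respect to N.  Over a finite feasible set this
# terminates, since the cost strictly decreases at every step.

def local_search(f, cost, neighbors):
    while True:
        better = [g for g in neighbors(f) if cost(g) < cost(f)]
        if not better:
            return f                     # no improving neighbor: f is locally optimal
        f = min(better, key=cost)        # greedy choice; any improving neighbor would do

# Hypothetical usage with the 2-change sketch above and an order-based tour cost:
#   tour = local_search([0, 1, 2, 3], order_tour_cost, two_change_neighbors)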

Definition 1.5

Given an optimization problem with feasible set 𝐹 and a neighborhood 𝑁, if whenever 𝑓 ∈ 𝐹 is locally optimal with respect to 𝑁 it is also globally optimal, we say the neighborhood 𝑁 is exact.

Convex Sets and Functions

Definition 1.6

Given two points $x, y \in \mathbb{R}^n$, a convex combination of them is any point of the form

$z = \lambda x + (1 - \lambda) y, \quad \lambda \in \mathbb{R}^1 \text{ and } 0 \le \lambda \le 1$

If $\lambda \ne 0, 1$, we say $z$ is a strict convex combination of $x$ and $y$.

Definition 1.7

A set $S \subseteq \mathbb{R}^n$ is convex if it contains all convex combinations of pairs of points $x, y \in S$.

Lemma 1.1

The intersection of any number of convex sets is convex.

Definition 1.8

Let $S \subseteq \mathbb{R}^n$ be a convex set. The function

$c : S \to \mathbb{R}^1$

is convex in $S$ if for any two points $x, y \in S$

$c(\lambda x + (1 - \lambda) y) \le \lambda c(x) + (1 - \lambda) c(y), \quad \lambda \in \mathbb{R}^1 \text{ and } 0 \le \lambda \le 1$

If $S = \mathbb{R}^n$, we say simply that $c$ is convex.
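As a quick check of Definition 1.8 against the LP example, here is a short verification (not in the original slides) that the linear cost $c(x) = c'x$ is convex: for any $x, y \in \mathbb{R}^n$ and $0 \le \lambda \le 1$,

$c(\lambda x + (1 - \lambda) y) = c'(\lambda x + (1 - \lambda) y) = \lambda\, c'x + (1 - \lambda)\, c'y = \lambda c(x) + (1 - \lambda) c(y)$

so the defining inequality holds, in fact with equality.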

Lemma 1.2

Let $c(x)$ be a convex function on a convex set $S$. Then for any real number $t$, the set $S_t = \{x : c(x) \le t,\ x \in S\}$ is convex.

Proof

Let $x$ and $y$ be two points in $S_t$. Then the convex combination $\lambda x + (1 - \lambda) y$ is in $S$ and

$c(\lambda x + (1 - \lambda) y) \le \lambda c(x) + (1 - \lambda) c(y)$

$\le \lambda t + (1 - \lambda) t$

$\le t$

which shows that the convex combination $\lambda x + (1 - \lambda) y$ is also in $S_t$.

Convex Programming Problems

Definition 1.10

An instance of an optimization problem $(F, c)$ is a convex programming problem if $c$ is convex and $F \subseteq \mathbb{R}^n$ is defined by

$g_i(x) \ge 0, \quad i = 1, \ldots, m$

where

$g_i : \mathbb{R}^n \to \mathbb{R}^1$

are concave functions.
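As an illustration (an observation not in the slides, but standard), the LP example above fits Definition 1.10: its cost $c(x) = c'x$ is convex, and its feasible set $\{x : Ax = b,\ x \ge 0\}$ can be written with affine, hence concave, constraint functions

$g_i(x) = a_i' x - b_i \ge 0 \quad \text{and} \quad g_{m+i}(x) = b_i - a_i' x \ge 0, \quad i = 1, \ldots, m, \qquad g_{2m+j}(x) = x_j \ge 0, \quad j = 1, \ldots, n$

where $a_i'$ denotes the $i$-th row of $A$; so linear programming is a convex programming problem.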

Theorem 1.1

Consider an instance of an optimization problem $(F, c)$, where $F \subseteq \mathbb{R}^n$ is a convex set and $c$ is a convex function. Then the neighborhood defined by Euclidean distance

$N_\epsilon(x) = \{y : y \in F \text{ and } \|x - y\| \le \epsilon\}$

is exact for every 𝜖 > 0.

Proof

Let 𝑥 be a local optimum with respect to 𝑁𝜖 for any fixed 𝜖 > 0 and let 𝑦 ∈ 𝐹 be any other feasible point, not necessarily in 𝑁𝜖(𝑥). We can always choose a 𝜆 sufficiently close to 1 that the strict convex combination

$z = \lambda x + (1 - \lambda) y, \quad 0 < \lambda < 1$

lies within the neighborhood 𝑁𝜖(𝑥). Evaluating the cost function 𝑐 at this point, we get, by the convexity of 𝑐,

$c(z) = c(\lambda x + (1 - \lambda) y) \le \lambda c(x) + (1 - \lambda) c(y)$

Rearranging, we find that

$c(y) \ge \dfrac{c(z) - \lambda c(x)}{1 - \lambda}$

Proof (cont.)

But since $z \in N_\epsilon(x)$,

$c(z) \ge c(x)$

so

$c(y) \ge \dfrac{c(x) - \lambda c(x)}{1 - \lambda} = c(x)$

Note that we have made no extra assumptions about the function 𝑐; it need not be differentiable, for example.

Lemma 1.3

The feasible set $F$ in a convex programming problem is convex. (Each set $\{x : g_i(x) \ge 0\}$ is convex because $g_i$ is concave, as follows by applying Lemma 1.2 to $-g_i$ with $t = 0$; and $F$ is the intersection of these sets, hence convex by Lemma 1.1.)

Theorem 1.2

In a convex programming problem, every point locally optimal with respect to the Euclidean distance neighborhood $N_\epsilon$ is also globally optimal. (This follows by combining Lemma 1.3 with Theorem 1.1.)