
Job Scheduling

Lecture 19: March 19

Job Scheduling: Unrelated Multiple Machines

There are n jobs; each job j has a processing time p(i,j), the time to finish job j on machine i.

There are m machines available.

Task: schedule the jobs to minimize the completion time of all jobs (the makespan).

It is NP-hard to approximate the optimal makespan within a factor of 1.5.

We’ll design a 2-approximation algorithm for this problem.

Why Unrelated?

For example, different processors have different specialties: some are better at computational jobs, others at displaying images, and so on.


Approach: Linear Programming.

How do we formulate this problem as a linear program?

Linear Programming Relaxation

Variables: x(i,j) indicates whether job j is scheduled on machine i.

minimize T
subject to
  sum over i of x(i,j) = 1               for each job j             (each job is scheduled on one machine)
  sum over j of p(i,j) x(i,j) <= T       for each machine i         (each machine can finish its jobs by time T)
  x(i,j) >= 0                            for each job j, machine i  (relaxation of x(i,j) in {0,1})

How good is this relaxation?

Example

A single job whose processing time is K on every machine.

Optimal solution = K.

Optimal fractional solution = K/m (assign a 1/m fraction of the job to each machine).

So the LP lower bound can be very bad.
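To spell out where K/m comes from (a worked check, not from the slides):

```latex
x_{i,1} = \frac{1}{m} \text{ for every machine } i
\;\Longrightarrow\;
\mathrm{load}(i) = p_{i,1}\, x_{i,1} = \frac{K}{m},
\qquad
T_{\mathrm{LP}} = \frac{K}{m}
\quad\text{while}\quad
\mathrm{OPT} = K .
```

The integrality gap of this LP on such an instance is therefore at least m.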


Problem of the linear program relaxation:

an optimal solution T could be even smaller than the processing time of a job!

How to tackle this problem?

Ideally, we would add the constraint

  x(i,j) = 0 whenever p(i,j) > T,

but this is not a linear constraint.

Idea: enforce this constraint by preprocessing!

Preprocessing

Fix T.

Consider the decision problem (is there a schedule with makespan at most T?) instead of the optimization problem.

Call the resulting linear program LP(T).

Note that different T have different linear programs.

This is known as parametric pruning.

Decision Problem

Fix T. Let S(T) be the set of machine-job pairs (i,j) with p(i,j) <= T.

LP(T):
  sum over i with (i,j) in S(T) of x(i,j) = 1            for each job j
  sum over j with (i,j) in S(T) of p(i,j) x(i,j) <= T    for each machine i
  x(i,j) >= 0                                            for each (i,j) in S(T)

Use binary search to find the minimum T* such that this LP is feasible.

We will use T* as the lower bound on the value of an optimal solution,

clearly T* <= OPT, since LP(OPT) is feasible.
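A minimal sketch of this parametric-pruning search, assuming integer processing times and using scipy's LP solver to test feasibility of LP(T); the function names and the starting upper bound of the binary search are illustrative choices, not part of the lecture:

```python
import numpy as np
from scipy.optimize import linprog

def lp_T_feasible(p, T):
    """p: m x n array of processing times p(i,j); return True if LP(T) is feasible."""
    m, n = p.shape
    # A job that cannot run on any machine within time T makes LP(T) infeasible.
    if any(all(p[i, j] > T for i in range(m)) for j in range(n)):
        return False
    pairs = [(i, j) for i in range(m) for j in range(n) if p[i, j] <= T]
    A_eq = np.zeros((n, len(pairs)))   # each job is fully assigned
    A_ub = np.zeros((m, len(pairs)))   # each machine finishes by time T
    for k, (i, j) in enumerate(pairs):
        A_eq[j, k] = 1.0
        A_ub[i, k] = p[i, j]
    res = linprog(c=np.zeros(len(pairs)),
                  A_ub=A_ub, b_ub=np.full(m, float(T)),
                  A_eq=A_eq, b_eq=np.ones(n),
                  bounds=(0, None), method="highs")
    return res.status == 0             # status 0 means a feasible optimum was found

def smallest_feasible_T(p):
    """Binary search for the smallest integer T* with LP(T*) feasible."""
    lo = 1                              # assumes positive integer processing times
    hi = int(p.min(axis=0).sum())       # assign each job to its fastest machine
    while lo < hi:
        mid = (lo + hi) // 2
        if lp_T_feasible(p, mid):
            hi = mid
        else:
            lo = mid + 1
    return lo
```

Feasibility of LP(T) is monotone in T (enlarging T only enlarges S(T) and loosens the constraints), which is what makes the binary search valid.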


Basic Solution

What can we say about a vertex solution of this LP?

A basic (vertex) solution is the unique solution of r linearly independent tight inequalities, where r is the number of variables x(i,j).

LP(T*) has three types of inequalities: the job constraints, the machine constraints, and the nonnegativity constraints x(i,j) >= 0. A tight nonnegativity constraint corresponds to a variable of zero value.

There are at most n + m inequalities of the first two types, so at least r - (n + m) of the r tight inequalities are nonnegativity constraints, and hence at most n + m variables are nonzero.

Basic Solution

There are at most n+m nonzero variables.

Say a job is integral if it is assigned entirely to one machine;

otherwise a job is fractional.

Each fractional job is assigned to at least two machines.

Let p be the number of integral jobs,

and q be the number of fractional jobs.

p + q = n and p + 2q <= n + m,

so p >= n - m and q <= m.

There are at most m fractional jobs.


Integral Jobs

How to handle integral jobs?

Just follow the optimal fractional solution.

And so we can schedule all the integral jobs in time at most T* <= OPT,

as this schedule (on integral jobs) is just a subset of the fractional solution.

Fractional Jobs

Observation:

Suppose there are m machines and at most m jobs.

If we can assign all jobs to the m machines so that

each machine is assigned at most 1 job,

then the completion time (makespan) of these jobs is at most T* <= OPT, since every pair (i,j) with x(i,j) > 0 lies in S(T*) and so has p(i,j) <= T*.

There are at most m fractional jobs.

If we could find such a “matching”, then we use this matching

to schedule all the fractional jobs in time at most T* <= OPT.

Goal: to design a 2-approximation algorithm for this problem

1) Do preprocessing (parametric pruning) and find the smallest T* so that LP(T*) is feasible.

2) Find a vertex (basic) solution, say x, to LP(T*).

3) Assign all integral jobs to machines as in x.

4) Match the fractional jobs to the machines so that

each machine is assigned at most one job.

Approximation Algorithm

Proof (assuming a matching exists):

Schedule all integral jobs in time at most T*.

Schedule all fractional jobs in time at most T*.

So all jobs finish within time 2T* <= 2·OPT.

Bipartite Matching

Task: Match the fractional jobs to the machines so that each machine is assigned at most one job.

Create a vertex for each fractional job j,

and create a vertex for each machine i,

add an edge between machine i and job j if 0 < x(i,j) < 1.

Now, the problem is to find a matching

so that every job is matched.

[Figure: bipartite graph with job vertices on one side and machine vertices on the other]

Bipartite Matching

Assume the graph is connected.

There are at most n + m nonzero variables, so the graph has

n + m vertices,

at most n + m edges (at most one per nonzero variable),

and therefore at most one cycle.


Leaves must be machines, since each fractional job is adjacent to at least two machines.

Match a leaf machine with its adjacent job, then remove these vertices and repeat.


Eventually only a cycle (or nothing) is left in each component, and an even cycle has a perfect matching: take alternate edges around it.

Bipartite Matching

If the graph is not connected,

we apply the same argument to each connected component.

Prove: (1) each connected component with n' jobs and m' machines has at most n' + m' edges,

(2) each component has a matching that matches all of its jobs.
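A minimal sketch of this matching step in code, assuming (as argued above) that the graph of fractional variables has at most one cycle per component and that every fractional job has degree at least two; the input format and helper names are illustrative, not from the lecture:

```python
from collections import defaultdict

def match_fractional_jobs(frac_pairs):
    """frac_pairs: iterable of (machine, job) pairs with 0 < x(i,j) < 1.
    Returns a dict job -> machine that matches every fractional job."""
    adj = defaultdict(set)
    for i, j in frac_pairs:
        adj[("M", i)].add(("J", j))
        adj[("J", j)].add(("M", i))

    def remove(v):                      # delete vertex v and its edges
        for w in adj.pop(v, set()):
            adj[w].discard(v)

    matching = {}
    # Phase 1: every fractional job has degree >= 2, so all leaves are machines;
    # repeatedly match a leaf machine with its unique adjacent job.
    changed = True
    while changed:
        changed = False
        for v in [u for u in adj if u[0] == "M" and len(adj[u]) == 1]:
            if v in adj and len(adj[v]) == 1:
                job = next(iter(adj[v]))
                matching[job[1]] = v[1]
                remove(job)
                remove(v)
                changed = True
    # Phase 2: what remains is a disjoint union of even cycles;
    # walk each cycle and take alternate (job, machine) edges.
    while True:
        v = next((u for u in adj if u[0] == "J" and adj[u]), None)
        if v is None:
            break
        while v is not None and adj[v]:
            mach = next(iter(adj[v]))   # match job v with one neighbouring machine
            matching[v[1]] = mach[1]
            nxt = [w for w in adj[mach] if w != v]
            remove(v)
            remove(mach)
            v = nxt[0] if nxt else None
    return matching
```

Phase 1 never reduces the degree of a remaining job (only leaf machines and already-matched jobs are removed), which is why the leaf-peeling argument goes through.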

Bad Examples

m machines

m^2 - m + 1 jobs:

1 job of processing time m on all machines,

and the remaining m^2 - m jobs of processing time 1 on all machines.

Optimal solution: the large job on one machine, and m small jobs on each of the remaining m-1 machines; makespan = m.

LP vertex solution: a 1/m fraction of the large job and m-1 small jobs on each machine.

Our rounding procedure assigns the large job entirely to one machine and produces a schedule of makespan 2m-1.
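As a quick check of the numbers (a worked calculation, not on the slides):

```latex
\text{LP load of each machine} = \tfrac{1}{m}\cdot m + (m-1)\cdot 1 = m = T^{*},
\qquad
\text{rounded makespan} = m + (m-1) = 2m-1,
\qquad
\frac{2m-1}{m} = 2 - \frac{1}{m} .
```

So the factor of 2 in the analysis of this rounding procedure is essentially tight.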

General Assignment

There are n jobs; each job j has:
• a processing time p(i,j), the time to finish job j on machine i
• a processing cost c(i,j), the cost to finish job j on machine i

There are m machines available.

Task: schedule the jobs to minimize the total cost of the assignment, while satisfying the time constraint T(i) for each machine i.

Theorem. Let OPT be the optimal cost of an assignment satisfying all time constraints.

There is a polynomial-time algorithm which finds an assignment of cost at most OPT

in which each time constraint is violated by at most a factor of two (machine i finishes by time 2·T(i)).

Linear Programming Relaxation

minimize sum over i,j of c(i,j) x(i,j)
subject to
  sum over i of x(i,j) = 1                 for each job j
  sum over j of p(i,j) x(i,j) <= T(i)      for each machine i
  x(i,j) >= 0                              for each job j, machine i

Pruning: delete every variable x(i,j) with p(i,j) > T(i).

Iterative Relaxation

Iterative General Assignment Algorithm

(Basic solution) Compute a basic optimal solution of the LP. Delete every variable x(i,j) with x(i,j) = 0.

(Assigning a job) If there is a variable with x(i,j) = 1, assign job j to machine i, and set T(i) = T(i) - p(i,j).

(Relaxing a constraint) If there is a machine i with only one remaining job, or there is a machine i with exactly two remaining jobs j1 and j2 and x(i,j1) + x(i,j2) >= 1, remove the time constraint for machine i.

Repeat until every job is assigned (see the sketch below).
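A sketch of this loop in code, assuming a helper solve_basic_lp (a placeholder, not given in the lecture) that returns a basic optimal solution x as a dict (i, j) -> value, minimizing total cost over the variables in allowed, subject to the job constraints for unassigned jobs and the time constraints of the machines not yet relaxed:

```python
def iterative_gap(p, c, T, solve_basic_lp):
    """p, c: dicts (i, j) -> time / cost; T: dict machine -> time budget."""
    jobs = {j for (_, j) in p}
    allowed = {(i, j) for (i, j) in p if p[i, j] <= T[i]}   # pruning
    relaxed, assignment, eps = set(), {}, 1e-9
    while len(assignment) < len(jobs):
        x = solve_basic_lp(allowed, p, c, T, relaxed)
        # Step 1: delete variables of value 0.
        allowed = {e for e in allowed if x.get(e, 0.0) > eps}
        # Step 2: permanently assign jobs with x(i,j) = 1, shrinking the budget.
        for (i, j) in list(allowed):
            if j not in assignment and x[(i, j)] >= 1 - eps:
                assignment[j] = i
                T[i] -= p[i, j]
                allowed = {(a, b) for (a, b) in allowed if b != j}
        # Step 3: relax a machine with one remaining job, or with two
        # remaining jobs whose LP values sum to at least 1.
        for i in {a for (a, _) in allowed} - relaxed:
            edges = [e for e in allowed if e[0] == i]
            if len(edges) == 1 or (len(edges) == 2
                                   and x[edges[0]] + x[edges[1]] >= 1 - eps):
                relaxed.add(i)
    return assignment
```

The lemma below is what guarantees that, as long as no variable has value 0 or 1, step 3 always finds a machine to relax, so the loop makes progress.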

Performance Guarantee

Lemma. Suppose the algorithm terminates.

Then the total cost is at most the LP optimum (hence at most OPT),

and each machine i finishes by time at most 2·T(i).

• Deleting a variable of value 0 does not change anything.

• Assigning a job with x(i,j) = 1 adds exactly its LP cost, and the remaining fractional solution stays feasible for the residual LP.

• Relaxing a machine i with only one remaining job can add at most T(i) to machine i's load, since after pruning every remaining job has p(i,j) <= T(i).

• Relaxing a machine i with two remaining jobs j1 and j2 with x(i,j1) + x(i,j2) >= 1:

in the worst case, both j1 and j2 are later assigned to machine i. Since p(i,j1), p(i,j2) <= T(i) after pruning,

p(i,j1) + p(i,j2) - x(i,j1)p(i,j1) - x(i,j2)p(i,j2) = (1 - x(i,j1))p(i,j1) + (1 - x(i,j2))p(i,j2) <= (2 - x(i,j1) - x(i,j2))·T(i) <= T(i),

so x(i,j1)p(i,j1) + x(i,j2)p(i,j2) + T(i) >= p(i,j1) + p(i,j2), and the constraint is violated by at most T(i).

Counting Argument

Lemma. If there is no variable with value 0 or 1,

then the relaxation step applies.

[Figure: bipartite graph with n job vertices and m machine vertices; edges are the nonzero variables x(i,j)]

• Each job has degree at least 2 (otherwise some job has a single variable, which would then have value 1).

• Each machine has degree at least 2 (otherwise the relaxation step applies).

• So there are at least n + m edges, and thus at least n + m nonzero variables.

• There are n jobs and m machines, so there are n + m constraints, and so in a basic solution there are at most n + m nonzero variables.



• So there are exactly n + m edges, and since every vertex has degree at least 2, the graph is a disjoint union of cycles.

•So each machine has degree exactly 2.

•Each job has total value 1.

• In each cycle there are equally many jobs and machines, and the total value on the cycle equals its number of jobs, so by averaging some machine has total value at least 1.


The relaxation step applies!


Remarks

• There are many more scheduling problems in the literature.

• The iterative relaxation method is very useful.

• Project outline

• Meeting signup

• 5 more lectures to go.