# minimum makespan scheduling

## Presentation Description

This presentation covers the minimum makespan scheduling problem, following Vazirani's *Approximation Algorithms* (Chapter 10).

## Presentation Transcript

### Minimum Makespan Scheduling:

Minimum Makespan Scheduling. Saeed Akhoondian Amiri. Supervisor: Dr. Dara Moazzami.

### Problem definition:

Minimum makespan scheduling: given processing times for n jobs, p_1, p_2, ..., p_n, and an integer m, find an assignment of the jobs to m identical machines that minimizes the completion time, also called the makespan.

### Slide 3:

[Figure: jobs 1–7 waiting to be assigned to the processors.]

### Slide 4:

[Figure: jobs 1–7 assigned to the processors; the makespan is the finishing time of the most loaded processor.]

### Optimum solution bounds:

OPT is bounded below by each of the following two quantities: the average time for which a machine has to run, (Σ_i p_i)/m, and the largest processing time, max_i p_i. Let LB denote the combined lower bound, i.e. LB = max{(Σ_i p_i)/m, max_i p_i}. Then LB ≤ OPT ≤ 2·LB.
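The combined lower bound can be computed directly; a minimal Python sketch (the slides contain no code, and the function name is mine):

```python
def lower_bound(p, m):
    """LB = max(average machine load, largest job) for jobs p on m machines."""
    return max(sum(p) / m, max(p))

# Example: jobs [3, 1, 4, 1, 5] on 2 machines: max(14/2, 5) = 7.0
print(lower_bound([3, 1, 4, 1, 5], 2))
```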

### Algorithm for Minimum makespan scheduling:

Order the jobs arbitrarily. Schedule the jobs on the machines in this order, placing the next job on the machine that has been assigned the least amount of work so far.
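The two steps above can be sketched in Python with a min-heap keyed on machine load (the slides give no code; names are mine):

```python
import heapq

def list_schedule(p, m):
    """List scheduling: place each job, in the given order,
    on the machine with the least load so far. Returns the
    makespan and the machine assigned to each job."""
    heap = [(0, i) for i in range(m)]   # (current load, machine index)
    heapq.heapify(heap)
    assignment = []
    for pj in p:
        load, i = heapq.heappop(heap)   # least-loaded machine
        assignment.append(i)
        heapq.heappush(heap, (load + pj, i))
    makespan = max(load for load, _ in heap)
    return makespan, assignment
```

For example, `list_schedule([2, 3, 4, 5], 2)` yields makespan 8 (loads 2+4 and 3+5), while the optimum is 7 (2+5 and 3+4), illustrating that the greedy order matters.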

### Slide 7:

[Figure: jobs 1–7 packed onto processors with load levels L and 2L marked.] Algorithm 2PACK(L): schedule each job on the slowest machine whose load will not exceed 2L.

### Theorem 10.3, Factor 2-ALG:

Algorithm 10.2 achieves an approximation guarantee of 2 for the minimum makespan problem.

### Factor-2 Proof:

Let M_i be the machine that finishes last in the schedule produced by the algorithm, and let j be the index of the last job scheduled on M_i. When job j starts, M_i has the least load among all machines, so its load at that moment, start_j, satisfies start_j ≤ (1/m)·Σ_i p_i. [Figure: machines M_1, ..., M_m; job j starts on the least-loaded machine.]

### Factor-2 Proof:

We know p_j ≤ OPT, and we know (Σ_i p_i)/m ≤ OPT. So the makespan, start_j + p_j, satisfies start_j + p_j ≤ (Σ_i p_i)/m + p_j ≤ OPT + OPT = 2·OPT.

### Tight Example:

A tight example for this algorithm is provided by a sequence of m² jobs with unit processing time, followed by a single job of length m. The schedule obtained by the algorithm has a makespan of 2m, while OPT = m + 1.
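The tight example is easy to check numerically; a self-contained Python sketch of the greedy rule applied to it (names are mine):

```python
import heapq

def greedy_makespan(jobs, m):
    """Same list-scheduling rule as the algorithm above:
    each job goes to the currently least-loaded machine."""
    heap = [(0, i) for i in range(m)]
    heapq.heapify(heap)
    for p in jobs:
        load, i = heapq.heappop(heap)
        heapq.heappush(heap, (load + p, i))
    return max(load for load, _ in heap)

m = 5
jobs = [1] * (m * m) + [m]        # m^2 unit jobs, then one job of length m
print(greedy_makespan(jobs, m))   # 2m = 10, while OPT = m + 1 = 6
```

The m² unit jobs spread evenly to a load of m per machine, so the final length-m job pushes one machine to 2m; an optimal schedule instead pairs the long job with one unit job and balances the rest at m + 1.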

### From minimum makespan to bin packing:

The minimum makespan problem is closely related to the bin packing problem by the following observation: there exists a schedule with makespan t iff the n objects of sizes p_1, p_2, ..., p_n can be packed into m bins of capacity t each.

### From minimum makespan to bin packing:

Denote the instance consisting of the n objects of sizes p_1, p_2, ..., p_n by I, and let bins(I, t) represent the minimum number of bins of size t required to pack these n objects. Then the minimum makespan is given by min{t : bins(I, t) ≤ m}.

### DP algorithm for restricted bin packing:

We first present a dynamic programming algorithm for the restricted bin packing problem.

### Restricted bin packing:

Let k be the fixed number of distinct object sizes, and assume that bins have capacity 1. Fix an ordering on the object sizes. An instance of the bin packing problem can then be described by a k-tuple (i_1, i_2, ..., i_k) specifying the number of objects of each size. Let BINS(i_1, i_2, ..., i_k) denote the minimum number of bins needed to pack these objects.

### DP algorithm for restricted bin packing:

For a given instance (n_1, n_2, ..., n_k) with Σ n_i = n, first compute Q, the set of all k-tuples (q_1, q_2, ..., q_k) such that BINS(q_1, q_2, ..., q_k) = 1 and 0 ≤ q_i ≤ n_i for 1 ≤ i ≤ k. Clearly, Q contains at most O(n^k) elements.

### DP algorithm for restricted bin packing:

Next, we compute all entries of the k-dimensional table BINS(i_1, ..., i_k) for every (i_1, ..., i_k) ∈ {0, ..., n_1} × {0, ..., n_2} × ... × {0, ..., n_k}. The table is initialized by setting BINS(q) = 1 for every q ∈ Q. Then the remaining entries are computed via the recurrence: BINS(i_1, i_2, ..., i_k) = 1 + min_{q ∈ Q} BINS(i_1 − q_1, i_2 − q_2, ..., i_k − q_k).

### DP algorithm for restricted bin packing:

Computing each entry takes O(n^k) time. Thus, the entire table can be computed in O(n^{2k}) time, thereby determining BINS(n_1, n_2, ..., n_k).

### Solving Bin packing with Error parameter ε:

Let ε be an error parameter, and let t lie in the interval [LB, 2·LB]. We say that an object is small if its size is less than εt; small objects are discarded for now. The remaining objects are rounded down as follows: each p_j in the interval [t·ε·(1+ε)^i, t·ε·(1+ε)^{i+1}) is replaced by p_j' = t·ε·(1+ε)^i, for i ≥ 0. The resulting p_j's can assume at most k = ⌈log_{1+ε}(1/ε)⌉ distinct values.
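The discard-and-round step translates directly; a minimal sketch, assuming the geometric rounding above (names are mine):

```python
import math

def round_down(p, t, eps):
    """Discard objects smaller than eps*t; round each remaining p_j
    down to the nearest t*eps*(1+eps)^i, i >= 0 (geometric rounding)."""
    rounded = []
    for x in p:
        if x < eps * t:
            continue                    # small object, handled later
        i = int(math.log(x / (eps * t)) // math.log(1 + eps))
        rounded.append(eps * t * (1 + eps) ** i)
    return rounded
```

Each rounded value is at most the original, and the original exceeds it by a factor of at most 1+ε, which is exactly what the error analysis later uses.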

### Solving Bin packing with Error parameter ε:

Determine an optimal packing of the rounded objects into bins of size t using the dynamic programming algorithm.

### ALG for Bin packing with Error parameter ε:

Rounding reduces the size of each object by a factor of at most 1+ε. So, if we consider the original sizes of the objects, the packing determined is valid for a bin size of t(1+ε). Keeping this as the bin size, pack the small objects greedily into the leftover space in the bins, opening new bins only if needed. Clearly, any time a new bin is opened, all previous bins must be full to the extent of at least t. Denote by α(I, t, ε) the number of bins used by this algorithm.
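The greedy placement of small objects into leftover space can be sketched as follows (an illustrative fragment, not from the slides; names are mine):

```python
def pack_small(bin_loads, small, t, eps):
    """Greedily place small objects (each < eps*t) into the leftover
    space of bins of capacity t*(1+eps); open new bins only when needed."""
    cap = t * (1 + eps)
    loads = list(bin_loads)
    for x in small:
        for i, load in enumerate(loads):
            if load + x <= cap:
                loads[i] += x
                break
        else:
            loads.append(x)   # no bin has room: open a new bin
    return loads
```

Whenever the `else` branch fires, every existing bin has load above cap − εt = t, which is the fact Lemma 10.5 relies on.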

### Lemma 10.5 and Proof:

Lemma 10.5: α(I, t, ε) ≤ bins(I, t). If the algorithm does not open any new bins for the small objects, the claim holds because we packed the rounded objects optimally, and their sizes are only smaller than the originals. Otherwise, all bins except possibly the last are full to the extent of at least t, so (α(I, t, ε) − 1)·t < Σ_i p_i; since any packing of instance I into bins of size t needs at least Σ_i p_i / t bins, it follows that bins(I, t) ≥ α(I, t, ε). Corollary 10.6: min{t : α(I, t, ε) ≤ m} ≤ OPT.

### Approximated bin packing to makespan:

The binary search is performed on the interval [LB, 2·LB]. The length of the search interval is LB at the start and is halved in each iteration. We continue the search until the interval's length drops to ε·LB; this requires ⌈log₂(1/ε)⌉ iterations.
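The search skeleton looks as follows; here `feasible(t)` is a hypothetical stand-in for the test α(I, t, ε) ≤ m, which in the real algorithm is answered by the rounding-plus-DP routine:

```python
def binary_search_makespan(LB, eps, feasible):
    """Halve [LB, 2*LB] until its length is at most eps*LB.
    feasible(t) stands in for the test alpha(I, t, eps) <= m."""
    lo, hi = LB, 2 * LB
    iters = 0
    while hi - lo > eps * LB:
        t = (lo + hi) / 2
        if feasible(t):
            hi = t
        else:
            lo = t
        iters += 1
    return hi, iters

# toy oracle with a feasibility threshold at 1.3 * LB (purely illustrative)
T, it = binary_search_makespan(10.0, 0.25, lambda t: t >= 13.0)
# it == ceil(log2(1/0.25)) == 2
```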

### Lemma 10.7 and Proof:

Let T be the right endpoint of the interval we terminate with. Lemma 10.7: T ≤ (1+ε)·OPT. Clearly, min{t : α(I, t, ε) ≤ m} must lie in the interval [T − ε·LB, T]. Hence, using Corollary 10.6 and LB ≤ OPT: T ≤ min{t : α(I, t, ε) ≤ m} + ε·LB ≤ OPT + ε·OPT = (1+ε)·OPT.

### Theorem 10.8:

Finally, the output of the core algorithm for t = T gives a schedule whose makespan is at most T·(1+ε). Theorem 10.8: The algorithm produces a valid schedule having makespan at most (1+ε)²·OPT ≤ (1+3ε)·OPT. The running time of the entire algorithm is O(n^{2k}·log₂(1/ε)), where k = ⌈log_{1+ε}(1/ε)⌉.