
International Journal of Foundations of Computer Science

© World Scientific Publishing Company


APPROXIMATION ALGORITHMS FOR FLEXIBLE JOB SHOP
PROBLEMS

KLAUS JANSEN

Universität zu Kiel, kj@informatik.uni-kiel.de
Kiel, Germany
and
MONALDO MASTROLILLI
Istituto Dalle Molle di Studi sull'Intelligenza Artificiale, monaldo@idsia.ch
Manno, Switzerland
and
ROBERTO SOLIS-OBA

University of Western Ontario, solis@brown.csd.uwo.ca
London, Canada
Received (received date)
Revised (revised date)
Communicated by Editor's name

A preliminary version of this paper appeared in the Proceedings of LATIN 2000.
This research was done while the author was at IDSIA Lugano, Switzerland.
This research was done while the author was at MPII Saarbrücken, Germany.
ABSTRACT
The Flexible Job Shop problem is a generalization of the classical job shop scheduling
problem in which for every operation there is a group of machines that can process it. The
problem is to assign operations to machines and to order the operations on the machines
so that the operations can be processed in the smallest amount of time. This models
a wide variety of problems encountered in real manufacturing systems. We present a
linear time approximation scheme for the non-preemptive version of the problem when
the number m of machines and the maximum number μ of operations per job are fixed.
We also study the preemptive version of the problem when m and μ are fixed, and
present a linear time approximation scheme for the problem without migration and a
(2 + ε)-approximation algorithm for the problem with migration.
1. Introduction
The job shop scheduling problem is a classical problem in Operations Research
[17] in which it is desired to process a set J = {J_1, . . . , J_n} of n jobs on a group M = {1, . . . , m} of m machines in the smallest amount of time. Every job J_j consists of a sequence of operations O_{1j}, O_{2j}, . . . , O_{μj} which must be processed in the given order. Every operation O_{ij} has assigned a unique machine m_{ij} ∈ M which must process the operation without interruption during p_{ij} units of time, and a machine can process at most one operation at a time.
In this paper we study a generalization of the job shop scheduling problem called the flexible job shop problem [18, 22], which models a wide variety of problems encountered in real manufacturing systems [4, 22]. In the flexible job shop problem an operation O_{ij} can be processed by any machine from a given group M_{ij} ⊆ M (the groups M_{ij} are not necessarily disjoint). The processing time of operation O_{ij} on machine u ∈ M_{ij} is p^u_{ij}. The goal is to choose for each operation O_{ij} an eligible machine and a starting time so that the maximum completion time C_max over all jobs is minimized. C_max is called the makespan or the length of the schedule.
The flexible job shop problem is more complex than the job shop problem because of the additional need to determine the assignment of operations to machines. Following the three-field α|β|γ notation suggested by Vaessens [22] and based on the one given by Graham et al. [8], we denote our problem as m1m | chain, op ≤ μ | C_max. In the first field, m specifies that the number of machines is a constant, 1 specifies that any operation requires at most one machine to be processed, and the second m gives an upper bound on the number of machines that can process an operation. The second field states the precedence constraints and the maximum number μ of operations per job, while the third field specifies the objective function. The following special cases of the problem are already NP-hard (see [22] for a survey): 2 1 2 | chain, n = 3 | C_max, 3 1 2 | chain, n = 2 | C_max, and 2 1 2 | chain, op ≤ 2 | C_max.
The job shop scheduling problem has been extensively studied. The problem is known to be strongly NP-hard even if each job has at most three operations and there are only two machines [17]. Williamson et al. [23] proved that when the number of machines, jobs, and operations per job are part of the input there does not exist a polynomial time approximation algorithm with worst case bound smaller than 5/4 unless P = NP. On the other hand, the preemptive version of the job shop scheduling problem is NP-complete in the strong sense even when m = 3 and μ = 3 [6].
The practical importance of NP-hard problems necessitates tractable relaxations. By tractable we mean efficient solvability, and polynomial time is a robust theoretical notion of efficiency. A very fruitful approach has been to relax the notion of optimality and settle for near-optimal solutions. A near-optimal solution is one whose objective function value is within some small multiplicative factor of the optimal value. Approximation algorithms are heuristics that in polynomial time provide provably good guarantees on the quality of the solutions they return. This approach was pioneered by the influential paper of Johnson [14] in which he showed the existence of good approximation algorithms for several NP-hard problems. He also remarked that optimization problems that are all indistinguishable in the theory of NP-completeness behave very differently when it comes to approximability (assuming that P ≠ NP). Remarkable work in the last three decades in both the design of approximation algorithms and the proving of inapproximability results has validated Johnson's remarks. The book on approximation algorithms edited by Hochbaum [11] gives a good glimpse of the current knowledge on the subject.
Approximation algorithms for several problems in scheduling have been developed in the last three decades. In fact, it is widely believed that the first optimization problem for which an approximation algorithm was formally designed and analyzed is the makespan minimization of the identical parallel machines scheduling problem; this algorithm was designed by Graham in 1966 [7]. This precedes the development of the theory of NP-completeness.
A Polynomial Time Approximation Scheme (PTAS for short) for an NP-hard optimization problem in minimization (or maximization) form is a polynomial time algorithm which, given any instance of the problem and a value ε > 0 (or 0 < ε < 1), returns a solution whose value is within a factor (1 + ε) (or (1 − ε)) of the optimal solution value for that instance.
Jansen et al. [13] have designed a linear time approximation scheme for the job shop scheduling problem when m and μ are fixed. When m and μ are part of the input, the best known result [5] is an approximation algorithm with worst case bound O([log(mμ) log(min{mμ, p_max}) / log log(mμ)]²), where p_max is the largest processing time among all operations.
The flexible job shop problem is equivalent to the problem of scheduling jobs with chain precedence constraints on unrelated parallel machines. For this latter problem, Shmoys et al. [21] have designed a polynomial-time randomized algorithm that, with high probability, finds a schedule of length at most O((log² n / log log n) C*_max), where C*_max is the optimal makespan.
In this work we study the preemptive and non-preemptive versions of the flexible job shop scheduling problem when the number of machines m and the number μ of operations per job are fixed. We generalize the techniques and results described in [13] for the job shop scheduling problem and design a linear time approximation scheme for the flexible job shop problem. Our algorithm works also for the case when each job J_j has a delivery time q_j. If in some schedule job J_j completes its processing at time C_j, then its delivery completion time is equal to C_j + q_j. The problem now is to find a schedule that minimizes the maximum delivery completion time L_max. If in addition each job has a release time r_j when it becomes available for processing, our algorithm finds in linear time a solution of length no more than (2 + ε) times the length of an optimum schedule.
We also present a linear time approximation scheme for the preemptive version of the flexible job shop problem without migration. No migration means that each operation must be processed by a unique machine. Hence, if an operation is preempted, its processing can only be resumed on the same machine on which it was being processed before the preemption. We also study the preemptive flexible job shop problem with migration in which every operation has both a release time and a delivery time. We describe a (2 + ε)-approximation algorithm for it. Both algorithms produce solutions with only a constant number of preemptions.
The job shop problem with multi-purpose machines is a special instance of the flexible job shop problem in which the processing time of an operation is the same regardless of the machine on which it is processed. We present a linear time approximation scheme for this problem that works for the case when each operation has a release time and a delivery time.
At this point we should remark that even though our algorithms have linear running time, they are mainly of theoretical importance since the constants associated with their running times are very large. Our main contribution is to prove that it is possible to design approximation schemes, or algorithms with approximation ratio 2 + ε, for the above problems. This knowledge, that polynomial time algorithms with good approximation ratios exist, might help in the design of near optimum algorithms with practical running times for these scheduling problems.
The rest of the paper is organized in the following way. In Section 2 we introduce some notation and preliminary results. In Section 3 we describe a polynomial time approximation scheme for the non-preemptive version of the flexible job shop scheduling problem with delivery times. In Section 4 we consider the preemptive version of the problem without migration and, in Section 5, with migration. In Section 6 we consider the multi-purpose machines job shop problem, and we give our conclusions in Section 7.
2. Preliminaries
Consider an instance of the flexible job shop problem with release and delivery times. Let L*_max be the length of an optimum schedule. For every job J_j, let

P_j = Σ_{i=1}^{μ} [ min_{s ∈ M_ij} p^s_{ij} ]     (1)

denote its minimum processing time. Let P = Σ_{J_j ∈ J} P_j. Let r_j be the release time of job J_j and q_j be its delivery time. We define t_j = r_j + P_j + q_j for all jobs J_j and t_max = max_j t_j.

Lemma 1
max{ P/m, t_max } ≤ L*_max ≤ P + t_max.     (2)
Proof. We start by observing that L*_max ≥ P/m, since P/m is the length of an ideal schedule with no idle times, in which every operation is processed on its fastest machine, and in which all release and delivery times are 0. Clearly, L*_max ≥ t_max. To show that L*_max ≤ P + t_max we describe a simple algorithm that finds a schedule of length L ≤ P + t_max. Assign each operation O_{ij} of every job J_j to its fastest machine (or to any of its fastest machines, if there are several of them). Then, schedule the jobs one after another, taking them in non-decreasing order of release times: the first operation of the first job is placed on its fastest machine and starts as soon as possible; when that operation finishes, the second operation of the first job is immediately scheduled on its fastest machine, and so on. Let J_i be the job with the largest delivery completion time L in this schedule, i.e., L = s_i + P_i + q_i, where s_i is the time when job J_i starts its processing. Since jobs are scheduled, as soon as possible, on their fastest machines, and in non-decreasing order of their release times, then s_i ≤ r_i + P − P_i, and the claim follows since r_i + q_i ≤ t_max. □
Let us divide all processing, release, and delivery times by max{ P/m, t_max }; thus, by Lemma 1,

1 ≤ L*_max ≤ m + 1, and t_max ≤ 1.     (3)

We observe that Lemma 1 holds also for the preemptive version of the problem with or without migration.
3. Non-preemptive Problem
In this section we describe our linear time approximation scheme for the flexible job shop problem with delivery times. We assume that all release times are zero. The algorithm works as follows. First we show how to transform an instance of the flexible job shop problem into a variant of the problem without delivery times and in which we have a special non-bottleneck machine (described below). A solution for this problem can be easily translated into a solution for our original problem.
To solve this new problem, we define a set of time intervals and assign operations to the intervals in such a way that operations from the same job that are assigned to different intervals appear in the correct order, and the total length of the intervals is no larger than the length of an optimum schedule. We perform this step by first fixing the position of the operations from a constant number of long jobs, and then using linear programming to determine the position of the remaining operations.
When solving the above linear program we might obtain a (fractional) solution that splits some operations over several intervals. Furthermore, the solution of the linear program requires more than linear time. We deal with these problems by using the potential price-directive decomposition method [9] to find in linear time a near optimal solution for the linear program. We shift around some of the fractional assignments in this solution to reduce the number of operations split across intervals to a constant. We show that these operations are so small that by moving them to the beginning of the schedule, we only slightly increase the length of the solution.
Next we use an algorithm by Sevastianov [20] to find a feasible schedule for the operations within each interval. Sevastianov's algorithm finds for each interval a schedule of length equal to the length of the interval plus μ³ m p_max, where p_max is the largest processing time of any operation in the interval. In order to keep this enlargement small we remove from each interval a subset V of jobs with the largest operations before running Sevastianov's algorithm. Those operations are scheduled at the beginning of the solution, and by choosing carefully the set of long jobs we can show that the total length of the operations in V is very small compared to the overall length of the schedule.
Getting Rid of the Delivery Times. We can use a technique by Hall and Shmoys [10] to transform an instance of the flexible job shop problem into another instance with only a constant number of different delivery times. Any solution for this latter problem can be easily transformed into a solution for the former one, and this transformation increases the length of the solution by at most (ε/2)L*_max. As we show below, having to deal with only a constant number of different delivery times greatly simplifies the problem.
Let q_max be the maximum delivery time and let ε > 0 be a constant value. The idea is to round each delivery time down to the nearest multiple of (ε/2)q_max to get at most 1 + 2/ε distinct delivery times. Next, apply a (1 + ε/2)-approximation algorithm for the flexible job shop problem that can handle 1 + 2/ε distinct delivery times (this algorithm is described below). Finally, add at most (ε/2)q_max to the completion time of each job; this increases the length of the solution by at most (ε/2)q_max. The resulting schedule is feasible for the original instance, so this is a (1 + ε)-approximation algorithm for the original problem. In the remainder of this paper, we shall restrict our attention to instances of the problem in which the delivery times q_1, . . . , q_n can take only ρ ≤ 1 + 2/ε distinct values, which we denote by δ_1 > · · · > δ_ρ.
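A minimal sketch of this rounding step, assuming the delivery times are given as a plain list and ε is the accuracy parameter (the names and data layout are illustrative):

```python
# Sketch of the Hall-Shmoys style rounding of delivery times; every delivery time is
# rounded down to a multiple of (eps/2)*q_max, so roughly 1 + 2/eps distinct values remain.
def round_delivery_times(deliveries, eps):
    q_max = max(deliveries)
    if q_max == 0:
        return list(deliveries), [0.0]
    step = (eps / 2.0) * q_max
    rounded = [step * int(q // step) for q in deliveries]   # round down
    distinct = sorted(set(rounded), reverse=True)           # delta_1 > ... > delta_rho
    # at most about 1 + 2/eps distinct values (one extra unit of slack for rounding)
    assert len(distinct) <= 1 + int(2.0 / eps) + 1
    return rounded, distinct

rounded, distinct = round_delivery_times([10.0, 7.3, 4.9, 0.0], eps=0.5)
print(rounded, distinct)
```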
The delivery time of a job can be interpreted as an additional delivery operation that must be processed on a non-bottleneck machine after the last operation of the job. A non-bottleneck machine is a machine that can process simultaneously any number of operations. Let D = {d_1, . . . , d_ρ} be a set of delivery operations; each operation d_i has processing time δ_i and it must be processed by a non-bottleneck machine. Moreover, every feasible schedule for the jobs J can be transformed into another feasible schedule, in which all delivery operations finish at the same time, without increasing the schedule length: simply shift the delivery operations to the end of the schedule. Because of the above interpretation, for every instance of the flexible job shop problem we can get an equivalent instance without delivery times: just add to the set of machines a non-bottleneck machine, and add to every job a delivery operation of length equal to the delivery time of the job. For the rest of this section we assume that every job has a delivery operation, all delivery operations end at the same time, and there is a non-bottleneck machine to process the delivery operations. Let m + 1 be the non-bottleneck machine.
Relative Schedules. Assume that the jobs are indexed so that P_1 ≥ P_2 ≥ · · · ≥ P_n. Let L ⊆ J be the set formed by the first k jobs, i.e., the k jobs with longest minimum processing time, where k is a constant to be determined later. We call L the set of long jobs. We note that if the number of jobs is smaller than k, then the problem can be solved in constant time by trying all possible schedules for the jobs and choosing the best one. Hence, we assume from now on that n is larger than k. An operation from a long job is called a long operation, regardless of its processing time. Let S = J \ L be the set of short jobs. We create a set J_D of delivery jobs such that every job J_j ∈ J_D consists of a single delivery operation d_j.
Consider any feasible schedule for the jobs in J. This schedule assigns a machine to every operation and it also defines a relative ordering for the starting and finishing times of the operations. A relative schedule R for L ∪ J_D is an assignment of machines to long operations and a relative ordering of the starting and finishing times of the long and delivery operations, such that there is a feasible schedule for J that respects R. This means that for every relative schedule R there is a feasible schedule for J that: (i) assigns the same machines as R to the long operations, (ii) schedules the long operations in the same relative order as R, and (iii) ends all delivery operations at the same time.
Note that since there is a constant number of long and delivery jobs, there is only a constant number of different relative schedules.
Lemma 2 The number of relative schedules for L ∪ J_D is at most

m^{kμ} (2kμ)! (2kμ + 1)^ρ.     (4)

Proof. Fix the position of a long operation. The start and finishing times of this operation divide the time into three intervals. Take a second long operation: there are at most 3² choices for the starting and finishing intervals of the operation. These two operations define five time intervals, and so for the next long operation there are at most 5² possibilities for choosing its starting and ending intervals, and so on. Moreover, for every long operation there are at most m machines that can process it, therefore the number of relative schedules is at most

m^{kμ} (1 · 3² · 5² · · · (2kμ − 1)²) (2kμ + 1)^ρ ≤ m^{kμ} (2kμ)! (2kμ + 1)^ρ.     (5)
□
If we build all relative schedules for L ∪ J_D, one of them must be equal to the relative schedule defined by some optimum solution. Since it is possible to build all relative schedules in constant time, we can run our algorithm for each one of these schedules, thus guaranteeing that it will be executed for a relative schedule R* such that some optimum schedule for J respects R*.
Fix a relative schedule R* as described above. The ordering of the starting and finishing times of the long operations divides the time into intervals that we call snapshots. We can view a relative schedule as a sequence of snapshots N(1), N(2), . . . , N(g), where N(1) is the unbounded snapshot whose right boundary is the starting time of the first operation according to R*, and N(g) is the snapshot with right boundary defined by the finishing time of the delivery operations. Note that the number of snapshots g is

g ≤ 2kμ + 1 + ρ ≤ (2μ + 1 + ρ)k     (6)

for any k ≥ 1.
3.1. Scheduling the Small Jobs
Given a relative schedule R* as described above, to obtain a solution for the flexible job shop problem we need to schedule the small operations within the snapshots defined by R*. We do this in two steps. First we use a linear program LP(R*) to assign small operations to snapshots, and second, we find a feasible schedule for the small operations within every snapshot.
To formulate the linear program we need first to define some variables. For each snapshot N(ℓ) we use a variable t_ℓ to denote its length. For each J_j ∈ S we consider all possible assignments of its operations to snapshots. This is done by defining for each J_j a set of variables x_{j,(i_1,...,i_μ),(s_1,...,s_μ)}, so that when x_{j,(i_1,...,i_μ),(s_1,...,s_μ)} = f, 0 ≤ f ≤ 1, this means that an f fraction of the v-th operation of job J_j is completely scheduled in the i_v-th snapshot on machine s_v, for all v = 1, . . . , μ. Note that these variables only indicate the snapshots in which (fractions of) operations of a given job are assigned, but they might not define a valid schedule for them.
Let γ_j be the snapshot where the delivery operation of job J_j starts. Note that because of the rounding of the delivery times, a delivery job might be associated to several jobs. For every variable x_{j,(i_1,...,i_μ),(s_1,...,s_μ)} we need 1 ≤ i_1 ≤ i_2 ≤ · · · ≤ i_μ < γ_j to ensure that the operations of J_j are scheduled in the proper order. Let
A_j = {(i, s) | i = (i_1, . . . , i_μ), 1 ≤ i_1 ≤ · · · ≤ i_μ < γ_j, s = (s_1, . . . , s_μ), s_v ∈ M_{vj}, and no long operation is scheduled by R* at snapshot i_v on machine s_v, for all v = 1, . . . , μ}.
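For intuition, the following sketch enumerates the tuples (i, s) ∈ A_j for a single small job. The helper name, the representation of the eligible sets M_{vj}, and the set of (snapshot, machine) pairs blocked by long operations are assumptions made for illustration.

```python
from itertools import product

# Illustrative enumeration of A_j for one small job. eligible[v] plays the role of
# M_{v+1,j}; blocked is the set of (snapshot, machine) pairs occupied by long
# operations in R*; gamma_j is the snapshot where the job's delivery operation starts.
def enumerate_A_j(mu, gamma_j, eligible, blocked):
    snapshots = range(1, gamma_j)                 # operations must end before gamma_j
    for i in product(snapshots, repeat=mu):
        if any(i[v] > i[v + 1] for v in range(mu - 1)):
            continue                              # keep 1 <= i_1 <= ... <= i_mu
        for s in product(*eligible):
            if all((i[v], s[v]) not in blocked for v in range(mu)):
                yield i, s

# Example: mu = 2 operations, delivery starts at snapshot 4, two machines.
tuples = list(enumerate_A_j(2, 4, [{1, 2}, {2}], blocked={(1, 2)}))
print(len(tuples), tuples[:3])
```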
The load L_{ℓ,h} on machine h in snapshot N(ℓ) is the total processing time of the operations from small jobs assigned to h during N(ℓ), i.e.,

L_{ℓ,h} = Σ_{J_j ∈ S} Σ_{(i,s) ∈ A_j} Σ_{v=1,...,μ : i_v=ℓ, s_v=h} x_{jis} p^{s_v}_{vj},     (7)

where i_v and s_v are the v-th components of tuples i and s, respectively.
For every long operation O_{ij} let α_{ij} and β_{ij} be the indices of the first and last snapshots where the operation is scheduled. Let p_{ij} be the processing time of long operation O_{ij} according to the machine assignment defined by the relative schedule R*. We are ready to describe the linear program LP(R*) that assigns small operations to snapshots.

Minimize Σ_{ℓ=1}^{g} t_ℓ
s.t. (1) Σ_{ℓ=α_{ij}}^{β_{ij}} t_ℓ = p_{ij},   for all J_j ∈ L, i = 1, . . . , μ,
     (2) Σ_{ℓ=γ_j}^{g} t_ℓ = δ_j,   for all d_j, j = 1, . . . , ρ,
     (3) Σ_{(i,s) ∈ A_j} x_{jis} = 1,   for all J_j ∈ S,
     (4) L_{ℓ,h} ≤ t_ℓ,   for all ℓ = 1, . . . , g, h = 1, . . . , m,
     (5) t_ℓ ≥ 0,   for all ℓ = 1, . . . , g,
     (6) x_{jis} ≥ 0,   for all J_j ∈ S, (i, s) ∈ A_j.
Lemma 3 An optimum solution of LP(R*) has value no larger than the length of an optimum schedule S* that respects the relative schedule R*.
Proof. To prove the lemma we only need to show how to build a feasible solution for LP(R*) that schedules the jobs in exactly the same way as S*. First we assign to each variable t_ℓ a value equal to the length of snapshot N(ℓ) in schedule S*. Then, we assign values to the variables x_{jis} as follows.
1. For each operation O_{uj}, snapshot N(ℓ), ℓ = 1, . . . , g, and machine h = 1, . . . , m, initialize f_{uj}(ℓ, h) to be the fraction of operation O_{uj} that is scheduled in snapshot N(ℓ) and on machine h according to S*. Initialize the variables x_{jis} to 0.
2. For each job J_j and each (i, s) ∈ A_j:
3.   set x_{jis} = f, where f := min{ f_{uj}(i_u, s_u) | u = 1, . . . , μ }, and
4.   set f_{uj}(i_u, s_u) := f_{uj}(i_u, s_u) − f for each u = 1, . . . , μ.
First, note that for any given feasible schedule, Σ_{ℓ=1}^{g} Σ_{h=1}^{m} f_{uj}(ℓ, h) = 1, for each J_j ∈ J and 1 ≤ u ≤ μ. Each time steps (2), (3) and (4) are completed, the same fraction f of every operation of job J_j is assigned to the same snapshot and machine as in S*; furthermore, for at least one operation O_{uj} the new value of f_{uj}(i_u, s_u) will be set to zero. Hence, Σ_{(i,s) ∈ A_j} x_{jis} ≤ 1. To show that at the end of the above procedure min{ f_{uj}(i_u, s_u) | u = 1, . . . , μ } = 0 for all ((i_1, . . . , i_μ), (s_1, . . . , s_μ)) ∈ A_j, suppose the contrary, i.e. that there is a fraction f̄ > 0 of an operation of job J_j that is not assigned to any snapshot and machine, i.e., Σ_{(i,s) ∈ A_j} x_{jis} < 1. But then, the same fraction f̄ of every operation of job J_j is not assigned by this procedure to any snapshot and to any machine, and therefore, there is at least one pair ((i_1, . . . , i_μ), (s_1, . . . , s_μ)) ∈ A_j for which min{ f_{uj}(i_u, s_u) | u = 1, . . . , μ } > 0, which is a contradiction. □
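The assignment of values to the variables x_{jis} in this proof is a greedy decomposition of the per-operation fractions f_{uj}(ℓ, h). A self-contained sketch for a single job, with a hypothetical input format, is given below.

```python
# Greedy decomposition used in the proof of Lemma 3 (sketch for one job J_j).
# frac[u] maps a (snapshot, machine) pair to the fraction of operation u scheduled there.
def decompose_job(frac, A_j):
    x = {}
    for (i, s) in A_j:
        f = min(frac[u].get((i[u], s[u]), 0.0) for u in range(len(i)))
        if f > 0.0:
            x[(i, s)] = f
            for u in range(len(i)):
                frac[u][(i[u], s[u])] -= f
    return x   # the values sum to 1 when A_j contains every pair used by the schedule

# One operation split 0.6/0.4 over two snapshots on machine 1, second operation in snapshot 3:
frac = [{(1, 1): 0.6, (2, 1): 0.4}, {(3, 1): 1.0}]
A_j = [((1, 3), (1, 1)), ((2, 3), (1, 1))]
print(decompose_job(frac, A_j))   # {((1, 3), (1, 1)): 0.6, ((2, 3), (1, 1)): 0.4}
```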
One can solve LP(R*) optimally in polynomial time and get only a constant number of jobs with fractional assignments, since a basic feasible solution of LP(R*) has at most kμ + ρ + n − k + mg variables with positive value. This comes from the fact that a basic feasible solution has at most as many variables with positive value as the number of constraints in the linear program. For LP(R*), there are kμ constraints of type (1), ρ constraints of type (2), n − k constraints of type (3), and mg constraints of type (4). Moreover, by constraints (3) every small job has at least one positive variable associated with it, and so there are at most mg + kμ + ρ jobs with fractional assignments in more than one snapshot.
A drawback of this approach is that solving the linear program might take a very long time. Since we want to get an approximate solution to the flexible job shop problem, it is not necessary to find an optimum solution for the linear program. An approximate solution for LP(R*) would suffice for our purposes. We can use an algorithm by Grigoriadis and Khachiyan [9] to find a (1 + ε̄)-approximate solution for LP(R*) in linear time, for any value ε̄ > 0. Furthermore, by using the rounding technique described in [13] it is possible to find an approximate solution for LP(R*) in which only a small number of jobs receive fractional assignments in more than one snapshot, thus proving the following lemma.
Lemma 4 For any value ε̄ > 0, a (1 + ε̄)-approximate solution for LP(R*) in which the set F of jobs that have fractional assignments in more than one snapshot has size |F| ≤ mg can be computed in linear time.
3.2. Generating a Feasible Schedule
To complete our solution, we need to compute a feasible schedule for the small operations that have been assigned to each snapshot. We first get rid of the operations that have been split across snapshots. Then, we use Sevastianov's algorithm [20] to schedule the operations inside a snapshot. To ensure that the schedule produced by Sevastianov's algorithm almost fits in the snapshots we need to eliminate the largest small operations first. We give the details below.
We assume, without loss of generality, that our approximate solution for LP(R*) has value no larger than m + 1; otherwise the solution obtained by scheduling the jobs sequentially on the machines with the smallest processing times has a better makespan in the worst case. To get a feasible schedule from the solution of the linear program, let us first deal with the jobs F that received fractional assignments. Note that if we schedule the jobs in F at the beginning of our solution and append to this schedule a feasible schedule for the rest of the jobs we will get a valid schedule for the entire set of jobs. So, let us just assign each operation O_{ij} of every job J_j ∈ F to its fastest machine (or to one of its fastest machines, if there are several of them). Then, we schedule the jobs one after another: the first operation of the first job is placed on its fastest machine; when that operation finishes the second operation of the first job is scheduled on its fastest machine, and so on.
For every operation of the remaining small jobs (not in F) consider its processing time according to the machine selected for it by the solution of the linear program. Let V be the set formed by all operations of small jobs with processing time larger than δ = ε/(8μ³mg). Note that

|V| ≤ m(m + 1)/δ = 8μ³m²(m + 1)g/ε.     (8)

We remove from the snapshots all operations in V and place them at the beginning of the schedule as we did for F. We need to get rid of these lengthy operations V in order to be able to produce a near optimum schedule for the rest of the small operations.
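A minimal sketch of this filtering step (the input format, one processing time per small operation as fixed by the linear program, is an assumption for illustration):

```python
# Sketch: split the small operations into the "large" set V (to be moved to the front
# of the schedule) and the rest, using the threshold delta = eps / (8 * mu**3 * m * g).
def split_large_operations(op_times, eps, mu, m, g):
    delta = eps / (8.0 * mu**3 * m * g)
    V = [op for op in op_times if op_times[op] > delta]
    rest = [op for op in op_times if op_times[op] <= delta]
    return delta, V, rest

delta, V, rest = split_large_operations({'O11': 0.02, 'O12': 0.0001}, eps=0.5, mu=2, m=2, g=10)
print(round(delta, 6), V, rest)
```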
Let O(ℓ) be the set of operations from small jobs that remain in snapshot N(ℓ). Let p_max(ℓ) be the maximum processing time among the operations in O(ℓ). Observe that the set of jobs in every snapshot N(ℓ) defines an instance of the job shop problem, since the linear program assigns a unique machine to every operation of those jobs. Hence we can use Sevastianov's algorithm [20] to find in O(n²μ²m²) time a feasible schedule for the operations O(ℓ); this schedule has length at most t̃_ℓ = t_ℓ + μ³ m p_max(ℓ). We must further increase the length of every snapshot N(ℓ) to t̃_ℓ to accommodate the schedule produced by Sevastianov's algorithm. Summing up all these enlargements, we get:

Lemma 5
Σ_{ℓ=1}^{g} μ³ m p_max(ℓ) ≤ μ³ m g δ = ε/8 ≤ (ε/8) L*_max.     (9)
Note that the total length of the snapshots N(α_{ij}), . . . , N(β_{ij}) containing a long operation O_{ij} might be larger than p_{ij}. This creates some idle times on machine m_{ij}. We start operations O_{ij} of long jobs J_j ∈ L at the beginning of the enlarged snapshot N(α_{ij}). The resulting schedule is clearly feasible. Let S(J') = Σ_{J_j ∈ J'} P_j be the total processing time of all jobs in some set J' ⊆ J when the operations of those jobs are assigned to their fastest machines.

Lemma 6 A feasible schedule for the jobs J of length at most (1 + 3ε/8)L*_max + S(F ∪ V) can be found in O(n²) time.
Proof. The small jobs F that received fractional assignments and the jobs in V are scheduled sequentially at the beginning of the schedule on their fastest machines. By Lemmas 4 (choosing ε̄ = ε/4) and 5 the claim follows. □
Now we show that we can choose the number k of long jobs so that S(F ∪ V) ≤ (ε/8)L*_max.

Lemma 7 [12] Let {d_1, d_2, . . . , d_n} be positive values with Σ_{j=1}^{n} d_j ≤ m. Let q be a nonnegative integer, α > 0, and n ≥ (q + 1)^{⌈1/α⌉}. There exists an integer k* ≥ 1 such that d_{k*+1} + · · · + d_{k*+qk*} ≤ αm and k* ≤ (q + 1)^{⌈1/α⌉}.

Let us choose α = ε/(8m) and q = ⌈8μ³m(m + 1)/ε + 1⌉ m(2μ + 1 + ρ). By Lemma 4 and inequalities (6) and (8), |F ∪ V| ≤ mg + 8μ³m²(m + 1)g/ε ≤ qk. By Lemma 7 it is possible to choose a value k ≤ (q + 1)^{⌈1/α⌉} so that the total processing time of the jobs in F ∪ V is at most αm = ε/8 ≤ (ε/8)L*_max. This value of k can clearly be computed in constant time. We select the set L of long jobs as the set consisting of the k jobs with largest processing times P_j = Σ_{i=1}^{μ} [min_{s ∈ M_ij} p^s_{ij}].
Lemma 8
S(F ∪ V) ≤ (ε/8) L*_max.     (10)

Theorem 1 For any fixed m and μ, there is a linear-time approximation scheme for the flexible job shop scheduling problem that computes, for any value ε > 0, a feasible schedule with maximum delivery completion time of value at most (1 + ε)L*_max in O(n) time.
Proof. By Lemmas 6 and 8, the above algorithm finds in O(n²) time a schedule of length at most (1 + ε/2)L*_max. This algorithm can handle 1 + 2/ε distinct delivery times. By the discussion at the beginning of Section 3 it is possible to modify the algorithm so that it handles arbitrary delivery times and it yields a schedule of length at most (1 + ε)L*_max. For every fixed m and μ, all computations can be carried out in O(n) time, with the exception of the algorithm of Sevastianov that runs in O(n²) time. The latter can be sped up to get linear time by merging pairs of small jobs together as described in [13]. □
We note that our algorithm is also a (2 + ε)-approximation algorithm when jobs have release and delivery times. Indeed, by adding release times to jobs in the schedule found by the algorithm, the maximum delivery completion time cannot increase by more than r_max, where r_max is the maximum release time. Hence the length of the schedule is at most (2 + ε) times the optimal value, since r_max ≤ L*_max.
4. Preemptive Problem without Migration
In the preemptive flexible job shop problem without migration the processing of an operation may be interrupted and resumed later on the same machine. We describe our algorithm only for the case when all release times are zero. As in the non-preemptive case we divide the set of jobs J into long jobs L and short jobs S, with the set L having a constant number of jobs. One can associate a relative schedule to each preemptive schedule of L by assigning a machine to each long operation, and by determining a relative order for the starting and finishing times of operations from L ∪ J_D. The only difference with the non-preemptive case is that some infeasible relative schedules for the non-preemptive case are now feasible. Namely, consider two long operations a and b assigned to the same machine according to a relative schedule R. Now, R is feasible even if a starts before b but ends after b.
By looking at every time in the schedule when an operation from L ∪ J_D starts or ends we define a set of time intervals, similar to those defined in the non-preemptive case by the snapshots. For convenience we also call these time intervals snapshots. Since L ∪ J_D has a constant number of operations (and hence there is a constant number of snapshots), we can build all relative schedules for L ∪ J_D in constant time. Thus by trying all relative schedules we guarantee that at least one of them, say R*, schedules the long operations within the same snapshots as in an optimum schedule.
An operation of a long job is scheduled in consecutive snapshots i, i + 1, . . . , i + t, but only a fraction (possibly equal to zero) of the operation need be scheduled in any one of these snapshots. However, and this is crucial for the analysis, in every snapshot there can be at most one operation from any given long job.
Now we define a linear program as in the case of the non-preemptive flexible job shop. For each long operation O_{ij} we define variables y_{ijℓ} for every α_{ij} ≤ ℓ ≤ β_{ij}, where relative schedule R* places operation O_{ij} within the snapshots α_{ij} to β_{ij}. Variable y_{ijℓ} denotes the fraction of operation O_{ij} that is scheduled in snapshot ℓ. Let g be the number of snapshots, t_ℓ be the length of the ℓ-th snapshot, and let O_h denote the set of long operations processed by machine h according to the relative schedule R*. Let L_{ℓ,h} be defined as in Section 3.1 and let L'_{ℓ,h} denote the total processing time of operations from long jobs that are executed by machine h during snapshot ℓ, i.e., L'_{ℓ,h} = Σ_{O_{ij} ∈ O_h : α_{ij} ≤ ℓ ≤ β_{ij}} y_{ijℓ} p^h_{ij}. The linear program to assign operations to snapshots is the following.

Minimize Σ_{ℓ=1}^{g} t_ℓ
s.t. (1) Σ_{ℓ=α_{ij}}^{β_{ij}} y_{ijℓ} = 1,   for all J_j ∈ L, i = 1, . . . , μ,
     (2) Σ_{ℓ=γ_j}^{g} t_ℓ = δ_j,   for all d_j, j = 1, . . . , ρ,
     (3) Σ_{(i,s) ∈ A_j} x_{jis} = 1,   for all J_j ∈ S,
     (4) L_{ℓ,h} + L'_{ℓ,h} ≤ t_ℓ,   for all ℓ = 1, . . . , g, h = 1, . . . , m,
     (5) t_ℓ ≥ 0,   for all ℓ = 1, . . . , g,
     (6) x_{jis} ≥ 0,   for all J_j ∈ S, (i, s) ∈ A_j,
     (7) y_{ijℓ} ≥ 0,   for all 1 ≤ i ≤ μ, J_j ∈ L, α_{ij} ≤ ℓ ≤ β_{ij}.
Lemma 9 An optimum solution of this linear program has value no larger than the length of an optimum schedule S* that respects the relative schedule R*.
Proof. The proof is similar to that of Lemma 3. □
Note that in any solution of this linear program the schedule for the long jobs is always feasible, since there is at most one operation of a given job in any snapshot. We can find an approximate solution for the linear program as described in the previous section. In this approximate solution there are at most mg small jobs that are preempted (Lemma 4). These jobs are placed at the beginning of the schedule, as before.
Let δ = ε/(8μ³mg). As for the non-preemptive case, consider the set V of small jobs containing at least one operation with processing time larger than δ according to the machine assigned to it by the linear program. The cardinality of this set is bounded by m(m + 1)/δ. We remove all jobs in V, so that the processing times of the remaining operations of small jobs are not larger than δ. The jobs in V are placed sequentially at the beginning of the schedule.
Next we find a feasible schedule for the jobs in every snapshot using Sevastianov's algorithm; but since the length of the schedule produced by this algorithm depends on the value of the largest processing time, it might be necessary to split operations from long jobs into small pieces. So we split each long operation O_{ij} into pieces of size at most δ and then apply Sevastianov's algorithm. This yields a feasible schedule for the snapshot because there is at most one operation of each long job in it.
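Splitting a long operation into pieces of length at most δ is straightforward; a small sketch (with an illustrative helper name) follows. Each piece is then handed to Sevastianov's subroutine as a separate operation on the same machine.

```python
import math

# Sketch: split the part of a long operation that falls in one snapshot into pieces
# of length at most delta, so that the largest operation handed to Sevastianov's
# algorithm has size at most delta.
def split_operation(length, delta):
    pieces = math.ceil(length / delta)
    piece_len = length / pieces
    return [piece_len] * pieces

print(split_operation(0.95, 0.3))   # four pieces of length 0.2375
```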
In this schedule the number of preemptions is a constant equal to the number of times that long operations are preempted, and this number is at most

Σ_{J_j ∈ L} Σ_{i=1}^{μ} Σ_{ℓ=α_{ij}}^{β_{ij}} ⌈ y_{ijℓ} p^h_{ij} / δ ⌉ ≤ Σ_{J_j ∈ L} Σ_{i=1}^{μ} Σ_{ℓ=α_{ij}}^{β_{ij}} ( y_{ijℓ} p^h_{ij} / δ + 1 ) ≤ m(m + 1)/δ + kμg,

since in each snapshot there are at most kμ different operations from long jobs, and the sum of the processing times y_{ijℓ} p^h_{ij} cannot be more than m(m + 1) (see Lemma 1). Hence, our solution has O(1) preemptions. Choosing the size of L as we did for the non-preemptive case we ensure that the length of the schedule is at most 1 + ε times the value of an optimum schedule. This algorithm runs in linear time.
Theorem 2 For any fixed m and μ, there is a linear-time approximation scheme for the preemptive flexible job shop scheduling problem without migration that computes, for any fixed ε > 0, a feasible schedule with maximum delivery completion time at most (1 + ε) L*_max.
Note that the algorithm described is also a (2 + ε)-approximation algorithm for the flexible job shop problem with release and delivery times.
5. Preemptive Problem with Migration
In the preemptive flexible job shop problem with migration the processing of an operation may be interrupted and resumed later on any eligible machine. We consider the problem when each job J_j has a release time r_j and a delivery time q_j. In this section we describe a linear time (2 + ε)-approximation algorithm for the problem, when the number of machines and the number of operations per job are fixed.
Finding a Schedule When Release Times are Zero. In the first step we use linear programming to assign operations to machines and to snapshots assuming that all release times are zero. We wish to make this assignment in such a way that the total length of the solution is minimized. We also assume that we have reduced the number of different delivery times to a constant as in the non-preemptive case. Let t denote the value of the objective function of the linear program (defined below). Define snapshots as follows: [0, t − δ_1], [t − δ_1, t − δ_2], . . . , [t − δ_{ρ−1}, t − δ_ρ], where δ_1 > δ_2 > · · · > δ_ρ are the distinct delivery times. For each job J_j we use a set of decision variables x_{jis} ∈ [0, 1] for tuples (i, s) ∈ A_j.
The meaning of these variables is that x_{jis} = 1 if and only if, for each 1 ≤ u ≤ μ, operation O_{uj} is scheduled on machine s_u, and in snapshot [0, t − δ_1] if i_u = 1, or in snapshot [t − δ_{i_u−1}, t − δ_{i_u}] if 1 < i_u ≤ ρ. Let the load L_{ℓ,h} on machine h in time interval ℓ be defined as the total processing time of all operations that are executed by machine h during time interval ℓ. The linear program LP is as follows.

Minimize t
s.t. (1) Σ_{(i,s) ∈ A_j} x_{jis} = 1,   for all J_j ∈ J,
     (2) L_{1,h} ≤ t − δ_1,   for all h = 1, . . . , m,
     (3) L_{ℓ,h} ≤ δ_{ℓ−1} − δ_ℓ,   for all h = 1, . . . , m, ℓ = 2, . . . , ρ,
     (4) x_{jis} ≥ 0,   for all J_j ∈ J, (i, s) ∈ A_j.
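For intuition, the snapshot boundaries are fully determined by the objective value t and the distinct delivery times; a tiny sketch (names are illustrative assumptions):

```python
# Sketch: snapshot intervals [0, t - delta_1], [t - delta_1, t - delta_2], ...,
# [t - delta_{rho-1}, t - delta_rho] determined by t and delta_1 > ... > delta_rho.
def snapshots(t, deltas):
    bounds = [0.0] + [t - d for d in deltas]
    return list(zip(bounds[:-1], bounds[1:]))

print(snapshots(10.0, [4.0, 1.5, 0.0]))   # [(0.0, 6.0), (6.0, 8.5), (8.5, 10.0)]
```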
Lemma 10 An optimum solution of LP has value no larger than the maximum delivery completion time of an optimum schedule for J.
Proof. Consider an optimum schedule S* for J. We only need to show that there is a feasible solution of the linear program that schedules, for every job J_j ∈ J, all operations of J_j in the same positions as S*. The proof is similar to that of Lemma 3. □
We guess the value s of an optimum schedule. Let LP(s, λ) be the linear program with objective function: Minimize λ, subject to constraints (1) and (4) of the above linear program plus the following two:
     (2') L_{1,h} ≤ (s − δ_1)λ,   for all 1 ≤ h ≤ m,
     (3') L_{ℓ,h} ≤ (δ_{ℓ−1} − δ_ℓ)λ,   for all 1 ≤ h ≤ m, ℓ = 2, 3, . . . , ρ.

The linear program LP(s, λ) has the structure required by the algorithm of Grigoriadis and Khachiyan [9]. Using the algorithm in [9] and binary search on the interval [1, 1 + m], we can find in linear time a value s ≤ (1 + ε/8)t*, where t* is the value of an optimum solution of LP, such that LP(s, λ) is feasible for λ = 1 + ε/(8 + ε). For this value of s a solution of LP(s, λ) has L_{1,h} ≤ (s − δ_1)λ and L_{ℓ,h} ≤ (δ_{ℓ−1} − δ_ℓ)λ. Since λ = 1 + ε/(8 + ε), the total length of the enlarged snapshots is at most λ(s − δ_1) + λ Σ_{ℓ=2}^{ρ} (δ_{ℓ−1} − δ_ℓ) = λ(s − δ_ρ) ≤ λs ≤ λ(1 + ε/8)t* = (1 + ε/4)t*.
Lemma 11 A solution for LP(s, λ), with s ≤ (1 + ε/8)t* and λ = 1 + ε/(8 + ε), of value at most (1 + ε/4)t* can be found in linear time.
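The search for s only needs a feasibility test for LP(s, λ); the following generic sketch uses a placeholder oracle and is not the Grigoriadis-Khachiyan solver of [9] itself.

```python
# Generic sketch of the binary search used to locate a suitable value s in [1, 1+m].
# `feasible(s)` stands in for an oracle that (approximately) tests whether LP(s, lambda)
# is feasible; here it is a placeholder assumption, not the actual solver of [9].
def find_s(feasible, m, precision):
    lo, hi = 1.0, 1.0 + m           # by Lemma 1 the optimum lies in this interval
    while hi - lo > precision:
        mid = (lo + hi) / 2.0
        if feasible(mid):
            hi = mid                # a feasible guess can only be shrunk
        else:
            lo = mid
    return hi

# Toy oracle: pretend every s >= 2.3 is feasible.
print(round(find_s(lambda s: s >= 2.3, m=3, precision=1e-3), 3))
```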


Using the rounding procedure of [13] it is possible to modify any feasible solution of LP(s, λ) to get a new feasible solution with at most ρm variables x_{jis} with fractional values. Moreover, we can do this rounding step in linear time.
Lemma 12 Any feasible solution of LP(s, λ) can be transformed in linear time into another feasible solution in which the set of jobs F that receive fractional assignments has size |F| ≤ ρm.
Let P denote the set of jobs from J \ F for which at least one operation, according to the machine assignment computed in the first step, has processing time greater than εs/(4μ³m(1 + ε/8)). Let L = F ∪ P and S = J \ L. We remove from the snapshots the jobs in L and then use Sevastianov's algorithm to find a feasible schedule σ_S for S. The enlargement in the length of the schedule caused by Sevastianov's algorithm is at most μ³ m p_max, where p_max is the maximum processing time of operations in S. Since p_max ≤ εs/(4μ³m(1 + ε/8)), the length of the resulting schedule is at most (1 + ε/4)t* + μ³ m p_max ≤ (1 + ε/4)t* + εs/(4(1 + ε/8)) ≤ (1 + ε/2)t* ≤ (1 + ε/2)L*_max.
Note that by considering release times different from zero this algorithm yields a schedule for S that is at most (2 + ε/2) times the length of an optimal one, since the maximum release time cannot be more than L*_max. The algorithm of Sevastianov takes O(n²) time, but it can be sped up to get linear time by merging pairs of jobs as described in [13].
The cardinality of the set L is bounded by a constant, since |P| < m(1 + ε/4)t* / (εs/(4μ³m(1 + ε/8))) = 4μ³m²(1 + ε/8)(1 + ε/4)t*/(εs) = O(μ³m²/ε²), and so |L| = |F| + |P| = O(μ³m²/ε²).
Scheduling the Jobs in L. Now we ignore the delivery times and consider only the release times. In the following we show how to compute a schedule σ_L that minimizes the makespan for the jobs from L.
As for the delivery times, the release time of a job can be interpreted as an additional operation of the job that has to be processed on a non-bottleneck machine before any of its other operations. Because of this interpretation, we can add to the set O_L of operations from the jobs in L a set R = {O_{01}, O_{02}, . . . , O_{0|L|}} of release operations. The processing time of operation O_{0j} is r_j.
Let us define a snapshot as a subset of operations from O_L ∪ R such that two different operations of the same job do not belong to the same snapshot. A relative order of L is an ordered sequence of snapshots which defines for every operation O_{ij} ∈ O_L ∪ R the first snapshot α_{ij} and the last snapshot β_{ij} in which operation O_{ij} can start and finish, respectively. Let g be the number of snapshots. A relative order is feasible if 1 ≤ α_{ij} ≤ β_{ij} ≤ g for every operation O_{ij}, and for two operations O_{ij} and O_{i+1,j} of the same job J_j ∈ L, we have β_{ij} + 1 = α_{i+1,j}. Furthermore, every release operation starts in the first snapshot, i.e., α_{0j} = 1 for every J_j ∈ L.
We observe that g can be bounded by (μ + 1)|L|, and therefore g = O(μ⁴m²/ε²). Note that a relative order is defined without assigning operations of long jobs to machines.
Since L has a constant number of jobs (and hence there is a constant number of
snapshots), we can consider all relative orders for L. Any operation is scheduled in
consecutive snapshots i, i + 1, . . . , i + t, but only a fraction (possibly equal to zero)
of that operation need be scheduled in any of these snapshots. In every snapshot
there can be at most one operation from any given job of L.
Now we define a linear program. For each operation O_{ij} we define variables x_{ijhℓ} for every α_{ij} ≤ ℓ ≤ β_{ij} and every h ∈ M_{ij}. Variable x_{ijhℓ} denotes the fraction of operation O_{ij} that is scheduled in snapshot ℓ on machine h. Let t_ℓ be the length of snapshot ℓ. The load on machine h in snapshot ℓ is equal to the total processing time of the operations of jobs in L, i.e., L_{ℓ,h} = Σ_{J_j ∈ L} Σ_{i=1,...,μ : α_{ij} ≤ ℓ ≤ β_{ij}, h ∈ M_{ij}} x_{ijhℓ} p^h_{ij}.
The linear program for a given relative order R is the following.

Minimize Σ_{ℓ=1}^{g} t_ℓ
s.t. (1) t_ℓ ≥ 0,   1 ≤ ℓ ≤ g,
     (2) Σ_{h ∈ M_{ij}} Σ_{ℓ=α_{ij}}^{β_{ij}} x_{ijhℓ} = 1,   1 ≤ i ≤ μ, J_j ∈ L,
     (3) L_{ℓ,h} ≤ t_ℓ,   1 ≤ ℓ ≤ g, 1 ≤ h ≤ m,
     (4) Σ_{h ∈ M_{ij}} x_{ijhℓ} p^h_{ij} ≤ t_ℓ,   1 ≤ ℓ ≤ g, J_j ∈ L, α_{ij} ≤ ℓ ≤ β_{ij},
     (5) x_{ijhℓ} ≥ 0,   1 ≤ i ≤ μ, J_j ∈ L, h ∈ M_{ij}, α_{ij} ≤ ℓ ≤ β_{ij},
     (6) Σ_{ℓ=1}^{β_{0j}} t_ℓ = r_j,   J_j ∈ L.
Consider any feasible schedule of L according to a relative order R, let x_{ijhℓ} p^h_{ij} be the total amount of time that machine h works on operation O_{ij} in snapshot ℓ, and let t_ℓ denote the length of snapshot ℓ. It is clear that this assignment of values to t_ℓ and x_{ijhℓ} constitutes a feasible solution of the above linear program. We show now that for any feasible solution of the linear program, there is a feasible schedule with the same values of t_ℓ and x_{ijhℓ}. It is sufficient to note that any feasible solution of the linear program defines in each snapshot an instance of the preemptive open shop scheduling problem. Hence, from results in [6, 16] the claim follows. Furthermore, by using the same procedure described in [16] it is possible to construct a schedule with no more than O(μ⁴m⁴/ε²) preemptions.
For each relative order R we solve the linear program above, and then select the solution with the smallest length. All the algorithms used in this step require time polynomial in the size of the input; therefore the overall time is O(1), since the input size is O(1). Note that if we consider non-zero delivery times then the maximum delivery completion time of the resulting schedule σ_L is at most 2L*_max.
Combining the Schedules. The output schedule is obtained by appending σ_S after σ_L. The length of the solution is at most (2 + ε/2)L*_max, since the maximum completion time of schedule σ_L is not greater than L*_max and the length of the schedule σ_S is at most (1 + ε/2)L*_max.
Theorem 3 For any fixed m and μ, there is a polynomial-time approximation algorithm for the preemptive flexible job shop scheduling problem with migration that computes, for any fixed ε > 0, a feasible schedule with maximum delivery completion time at most (2 + ε) L*_max in O(n) time.
6. Multi-Purpose Machines Job Shop
The job shop scheduling problem with multi-purpose machines [1] is a special case of the flexible job shop problem, in which the processing time p_{ij} of each operation does not depend on the machine on which it is processed. For the non-preemptive version of this problem, our techniques can also handle the case in which each job J_j has a release time r_j and a delivery time q_j. The main difference with the previous algorithm is that we put each job J_j from the set F ∪ V (this set is defined as before) after its corresponding release operation and before its delivery operation. More precisely, the algorithm works as follows.
We use the technique by Hall and Shmoys [10] to obtain a problem instance with only a constant number of delivery and release times. We divide the set of jobs J into long jobs L and short jobs S as before. Then, we compute all the relative schedules of the long jobs; a relative schedule is defined as before, but we also add a constant number of release operations. For each relative schedule we define a linear program as in the non-preemptive case. We solve and round the solution of the linear program to get a solution with a constant number of fractional variables.
The solution of the linear program has a constant number of jobs with fractional assignments (set F), and a constant number of jobs (set V) with at least one operation with large processing time. We now use a very simple rounding procedure to obtain an integral (and possibly infeasible) solution for the linear program. If job J_j has more than one nonzero variable associated with it, we set one of them to one and the others to zero in an arbitrary manner.
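A minimal sketch of this rounding step, assuming the fractional solution is stored per job as a dictionary from assignment tuples to values (an illustrative layout):

```python
# Sketch of the trivial rounding used for the MPM case: for each job keep a single
# assignment (here the one with the largest fraction; an arbitrary choice would also do)
# and set it to 1, all other variables to 0.
def round_assignments(x):
    rounded = {}
    for job, fractions in x.items():
        chosen = max(fractions, key=fractions.get)
        rounded[job] = {t: (1.0 if t == chosen else 0.0) for t in fractions}
    return rounded

x = {'J1': {((1,), (2,)): 0.7, ((2,), (1,)): 0.3}, 'J2': {((1,), (1,)): 1.0}}
print(round_assignments(x))
```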
The final schedule is built as follows. Assign jobs from F ∪ V to machines and snapshots according to the new integral solution. Build a left justified (that means without leaving idle times) feasible schedule for the jobs from F ∪ V. Assign the remaining small jobs according to the linear programming solution, and for each snapshot N(ℓ), ℓ ∈ {1, . . . , g}, schedule the operations assigned to N(ℓ) after the maximum finishing time in N(ℓ) of operations of jobs from F ∪ V (if any), by using Sevastianov's algorithm. Again, from Lemma 7 it is possible to choose the number of long jobs such that the enlargement due to the jobs from F ∪ V is small enough, and by using similar arguments as before, it is possible to show that the described algorithm is a polynomial time approximation scheme which runs in linear time.
Theorem 4 For any fixed m and μ, there is a polynomial-time approximation scheme for the MPM job shop scheduling problem with release and delivery times that computes, for any fixed real number ε > 0, a feasible schedule with maximum delivery completion time of at most (1 + ε) L*_max in O(n) time.
By using similar arguments as before, it is possible to provide a linear time approximation scheme also when preemption without migration is allowed.
Theorem 5 For any fixed m and μ, there is a polynomial-time approximation scheme for the preemptive MPM job shop scheduling problem without migration, with release and delivery times, that computes, for any fixed real number ε > 0, a feasible schedule with maximum delivery completion time of at most (1 + ε) L*_max in O(n) time.
7. Conclusions
In this paper we present a linear time approximation scheme for the non-preemptive flexible job shop scheduling problem when the number of machines and the maximum number of operations per job are constant. Our algorithm can also handle delivery times for the jobs. We also consider the preemptive version of the problem, and we design a polynomial time approximation scheme for the case without migration and a (2 + ε)-approximation algorithm for the case with migration, for any ε > 0. We consider a special case of the flexible job shop scheduling problem in which the processing time of an operation does not depend on the machine that processes it. For this case our algorithm can handle release times for the operations as well.
Even though our algorithms have O(n) running time, where n is the number of jobs, the hidden constant in the big-Oh notation is extremely large. This makes our results important mainly in a theoretical sense, since the running time of the algorithms is too high to be useful in practice. Hence, the main contribution of this work is to prove the existence of polynomial time approximation schemes and (2 + ε)-approximation algorithms for the above problems.
Proving the existence of polynomial time algorithms with good approximation ratios for these complex scheduling problems also opens up the possibility of designing approximation algorithms with practical running times and similar approximation ratios. For many problems, the discovery of inefficient (approximation) algorithms has motivated additional research activity that led to the discovery of efficient (approximation) algorithms [2, 3, 15, 19].
We leave it as an open question to determine whether a polynomial time approximation scheme with low running time exists for the flexible job shop scheduling problem.
Acknowledgements
This research was supported by the Swiss National Science Foundation project 200021-104017/1, Power Aware Computing, and by the Swiss National Science Foundation project 200021-100539/1, Approximation Algorithms for Machine Scheduling Through Theory and Experiments.
References
1. P. Brucker, B. Jurisch, and A. Kramer. Complexity of scheduling problems with multi-purpose machines. Annals of Operations Research, 70:57-73, 1997.
2. M. Charikar, S. Guha, E. Tardos, and D. Shmoys. A constant-factor approximation algorithm for the k-median problem. In Proceedings of the 31st Annual ACM Symposium on Theory of Computing, pages 1-10, 1999.
3. L. R. Ford and D. R. Fulkerson. Maximal flow through a network. Canadian Journal of Mathematics, 8:399-404, 1956.
4. L. Gambardella, M. Mastrolilli, A. Rizzoli, and M. Zaffalon. An optimization methodology for intermodal terminal management. Journal of Intelligent Manufacturing, 12:521-534, 2001.
5. L. Goldberg, M. Paterson, A. Srinivasan, and E. Sweedyk. Better approximation guarantees for job-shop scheduling. SIAM Journal on Discrete Mathematics, 14(1):67-92, 2001.
6. T. Gonzalez and S. Sahni. Flowshop and jobshop schedules: complexity and approximation. Operations Research, 26:36-52, 1978.
7. R. Graham. Bounds for certain multiprocessing anomalies. Bell System Technical Journal (BSTJ), 45:1563-1581, 1966.
8. R. Graham, E. Lawler, J. Lenstra, and A. Rinnooy Kan. Optimization and approximation in deterministic sequencing and scheduling: A survey. In Annals of Discrete Mathematics, volume 5, pages 287-326. North-Holland, 1979.
9. M. D. Grigoriadis and L. G. Khachiyan. Coordination complexity of parallel price-directive decomposition. Mathematics of Operations Research, 21:321-340, 1996.
10. L. Hall and D. Shmoys. Approximation algorithms for constrained scheduling problems. In Proceedings of the 30th IEEE Symposium on Foundations of Computer Science, pages 134-139, 1989.
11. D. Hochbaum, editor. Approximation Algorithms for NP-hard Problems. PWS Publishing Company, Boston, 1995.
12. K. Jansen and L. Porkolab. Linear-time approximation schemes for scheduling malleable parallel tasks. In Proceedings of the 10th Annual ACM-SIAM Symposium on Discrete Algorithms, pages 490-498, 1999.
13. K. Jansen, R. Solis-Oba, and M. Sviridenko. A linear time approximation scheme for the job shop scheduling problem. In Proceedings of APPROX'99, LNCS 1671, pages 177-188, 1999.
14. D. Johnson. Approximate algorithms for combinatorial problems. Journal of Computer and System Sciences, 9:256-278, 1974.
15. M. Klein. A primal method for minimal cost flows with application to the assignment and transportation problems. Management Science, 14:205-220, 1967.
16. E. Lawler and J. Labetoulle. On preemptive scheduling of unrelated parallel processors by linear programming. Journal of the ACM, 25:612-619, 1978.
17. E. Lawler, J. Lenstra, A. Rinnooy Kan, and D. Shmoys. Sequencing and scheduling: Algorithms and complexity. Handbook in Operations Research and Management Science, 4:445-522, 1993.
18. M. Mastrolilli and L. M. Gambardella. Effective neighbourhood functions for the flexible job shop problem. Journal of Scheduling, 3:3-20, 2000.
19. S. Sahni. Approximate algorithms for the 0/1 knapsack problem. Journal of the ACM, 22:115-124, 1975.
20. S. Sevastianov. Bounding algorithms for the routing problem with arbitrary paths and alternative servers. Cybernetics (in Russian), 22:773-780, 1986.
21. D. Shmoys, C. Stein, and J. Wein. Improved approximation algorithms for shop scheduling problems. SIAM Journal on Computing, 23:617-632, 1994.
22. R. Vaessens. Generalized Job Shop Scheduling: Complexity and Local Search. PhD thesis, Eindhoven University of Technology, 1995.
23. D. Williamson, L. Hall, J. Hoogeveen, C. Hurkens, J. Lenstra, S. Sevastianov, and D. Shmoys. Short shop schedules. Operations Research, 45:288-294, 1997.