
International Journal of Operations & Production Management

Measuring complexity as an aid to developing operational strategy


G. Frizelle and E. Woodcock
Article information:
To cite this document:
G. Frizelle and E. Woodcock (1995), "Measuring complexity as an aid to developing operational strategy", International Journal of Operations & Production Management, Vol. 15 Iss 5, pp. 26-39
Permanent link to this document:
http://dx.doi.org/10.1108/01443579510083640

Measuring complexity as an aid to developing operational strategy

G. Frizelle, University of Cambridge, UK, and
E. Woodcock, CSC Manufacturing Practice, Preston, UK

Received February 1994; accepted May 1994

This article describes the work carried out in Cambridge University, CSC
Manufacturing Practice and British Aerospace to develop and use a measure
for manufacturing complexity.
One of the difficulties in developing a coherent operational strategy is in
knowing how the various elements of a manufacturing system interact and
assessing the relative importance of each. The approach taken in this article is
to look at the manufacturing system in terms of how complex it is, and then
measure the contribution each operational source makes to its total complexity.
However, that requires a common measure to be developed, which can
justifiably claim to meet the requirement.
The article therefore starts by giving an outline of the derivation of the
mathematical model used to construct the measure. Then the model is
interpreted to see whether it behaves in a reasonable way and makes useful
predictions about the systems it is measuring. This leads to a discussion of how
the tool might be incorporated into a strategy development exercise. Finally,
three applications in operational environments are described along with the
resulting analysis. The analysis allowed the local management, in each case, to
identify the key areas of weakness, in both the short and medium term. As such
it gave one possible basis for developing an operational strategy for each of the
manufacturing units visited.

An outline of the derivation of the model


The key requirement was first to develop an appropriate measure. However,
that depended on settling on a suitable definition of what constituted
complexity. A possible starting-point was to take an area where that concept
was already well established to see if the ideas could be transferred to the
measurement of manufacturing operations. Algorithmic complexity is such an

The authors would like to thank the ACME directorate, who partially funded the research, and British Aerospace for their support and for allowing the results of two of the trials in their factories to be published; and finally Dowty for allowing a trial in one of their factories to be published.
area. Here complexity is the minimum number of steps required to achieve a desired degree of accuracy in the execution of an algorithm.
It was realized that the measurement of manufacturing variables follows the same logic. For example, statistical quality control requires repetitive sampling where the sample size has been determined by some required level of accuracy. Thus the measurement process itself is an iterative procedure which converges and which is halted at some distance from its goal.
To proceed to a more formal definition, envisage a system S as consisting of N mutually exclusive states s(i), each of which has a finite and measurable probability associated with it. The system is observed at discrete time intervals to see how it evolves. Using a standard result[1] the number of observations needed to achieve a desired level of accuracy, for N − 1 independent variables, is:

n_{\min} = \frac{N - 1}{4\varepsilon^{2}(1 - \alpha)} \qquad (1)

since N − 1 of the states represent the independent variables, and each has to achieve the required level of accuracy ε, with probability α. On this basis the complexity of the system is n (≥ n_min).
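The role of the sample-size bound in equation (1) can be illustrated with a short sketch. The Chebyshev-type constant used below is our assumption (the paper's exact bound follows reference [1]); the point it demonstrates, that the requirement grows multiplicatively rather than additively when subsystems are combined, does not depend on that constant:

```python
import math

def n_min(n_states: int, epsilon: float, alpha: float) -> int:
    """Minimum observations for a system of n_states states, where each of
    the n_states - 1 independent probabilities must be estimated to
    accuracy epsilon with confidence alpha (Chebyshev-type assumption)."""
    return math.ceil((n_states - 1) / (4 * epsilon**2 * (1 - alpha)))

# Combining two 2-state subsystems gives 4 joint states, and the
# observation requirement grows faster than the sum of the separate ones.
single = n_min(2, 0.05, 0.95)    # one 2-state system
combined = n_min(4, 0.05, 0.95)  # two such subsystems combined
```

This non-additivity is exactly why n fails the third requirement for a measure, discussed next.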
However, there is a problem, for if a function is to be used as a measure then it must exhibit the mathematical properties of one. These are that, for f to be a measure on a system S, it must meet the following three requirements[2]:

f(S) = 0 when N = 1 (a system with a single state)

f(S) > 0 when N > 1

f(S_1, \ldots, S_M) = \sum_{m=1}^{M} f(S_m) \qquad (2)

where the S_m represent a combined system consisting of M independent subsystems S_1, …, S_M.
Unfortunately (1) cannot be used as a measure as it fails on the last of these. The problem comes with the addition of a subsystem, since any state in the existing system can be associated with up to all of those within the additional subsystem. Thus n_min becomes, for M subsystems:

n_{\min} = \frac{\prod_{j=1}^{M} N_j - 1}{4\varepsilon^{2}(1 - \alpha)}

where N_j is the number of states in the jth subsystem. Therefore n (≥ n_min), the complexity figure, violates condition three in equation (2).
The approach, therefore, is to develop an expression which meets the third
requirement and can thus be described as a measure for the system. The
starting-point is to look at the measurement process itself. The system will be observed at regular intervals over time. It will be seen to occupy a sequence of states and this sequence is assumed to be stationary and ergodic. The latter property means that the probabilities of states occurring can be approximated by the ratio of the total observed occupancy time of that state to the total time over which the system is observed. If state i has been observed m(i) times consecutively at time intervals Δt, then an estimate of the total occupancy time is given by [m(i) − 1]Δt. If, for simplicity, this is written as n(i)Δt then an observation of the system can be represented by the n-tuple {n(1)Δt, …, n(N)Δt}, where there are n observations of a system consisting of N states. This sequence can occur C ways, where C is given by the multinomial coefficient (since the Δt cancels top and bottom):

C = \frac{n!}{n(1)! \cdots n(N)!}
Using Stirling's approximation this can be written:

C \approx \frac{(2\pi n)^{1/2}\, n^{n}}{(2\pi)^{N/2} \prod_{i=1}^{N} n(i)^{n(i)} \prod_{i=1}^{N} n(i)^{1/2}}

for large values of n. Taking logarithms to give the additive property required for a measure, and using the law of large numbers, gives

\lim_{n \to \infty} \frac{\log_2 C}{n} = -\sum_{i=1}^{N} p(i)\log_2 p(i) = H(S) \qquad (3)

where H(S) is the desired measure and represents the entropy per stage.
We choose base 2 for the logarithms for two reasons: first, it ties in with
information theory. Second, it can be thought of as counting the sources of
uncertainty when applied to real world systems, in the sense of their being
either present or absent.
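The convergence claimed in equation (3) is easy to check numerically. The sketch below is our own construction, using only standard-library functions: it compares (log₂ C)/n with the entropy for a long observation sequence.

```python
import math

def log2_multinomial(counts):
    """log2 of the multinomial coefficient n! / (n(1)! ... n(N)!),
    computed via log-gamma to avoid huge factorials."""
    n = sum(counts)
    lg = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    return lg / math.log(2)

def entropy(probs):
    """Shannon entropy in bits: -sum p log2 p, with 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A system observed n = 10,000 times in 3 states with p = 0.5, 0.3, 0.2.
counts = [5000, 3000, 2000]
n = sum(counts)
per_stage = log2_multinomial(counts) / n
H = entropy([c / n for c in counts])
# per_stage approaches H from below as n grows.
```

For these numbers the two quantities already agree to better than two decimal places, which is the sense in which (3) holds "per stage".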
However, successor states do not have to be independent of predecessor
states. It can be shown that the various states s(i) can be linked to any number r
of preceding states[3], through the existence of conditioning and that sequences
of states can be categorized by the length of their memory r. Moreover, by a
suitable choice of the defining probability space, the sequences can all be shown
to have the Markov property, i.e. each (suitably defined) state only depends on
its predecessor. Such sequences will be referred to as trajectories.
The Markov property therefore means that only two categories need to be considered: sequences with values of r = 1, i.e. having the Markov property (trajectories), and sequences with values of r = 0. These latter represent cases where each observed outcome is independent of every other one, such as the results obtained from rolling a die repeatedly.
Applying the model to a manufacturing process
A more appropriate model for manufacturing allows the system to have a countably infinite number of states, instead of restricting it to N as in (3). Such systems have finite entropy where the state probabilities follow a geometric probability law, which also represents the maximum entropy the system can take.
We consider the manufacturing process to consist of an input of items, a process (for example a work centre) and an output. We let the states of the process be the number of items present at any one time. If it is stationary then the input on average equals the output. If it is capable of processing at a rate of μ but actually processes at a rate λ (λ < μ), then because of the ergodic property, the probability of the process being occupied is λ/μ. Slotting these results into the geometric distribution shows that the process will take its maximum entropy when the probability of there being i items present is given by

p(i) = \left(1 - \frac{\lambda}{\mu}\right)\left(\frac{\lambda}{\mu}\right)^{i} \qquad (4a)

with an entropy of

H(S) = \log_2\left(\frac{\mu}{\mu - \lambda}\right) + \frac{\lambda}{\mu - \lambda}\log_2\left(\frac{\mu}{\lambda}\right) \qquad (4b)

H(S) represents a process S with countably infinite states. Equation (4a) is the well-known formula for a simple queue with Poisson arrivals at a rate λ and exponentially distributed service times with mean 1/μ. The significance of the two equations is, first, that (4a) was derived without any assumptions about the nature of the queue; in fact not even that there is a queue in the accepted sense. Second, the simple queuing model forms an upper bound, in entropy terms, for any queue found in practice.
Moreover, equation (4b) increases as λ approaches μ, in other words as the process nears its capacity. Thus the busiest systems (the bottlenecks) are also the most complex. Most significantly, however, the term 1/(μ − λ) represents the expected time in the process and 1/(μ − λ)² its variance, so that increasing complexity means longer lead times and less reliable systems. This is one of the key reasons why measuring complexity is valuable in assessing operational strategy.
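The relationship between equations (4a) and (4b) can be checked numerically. The sketch below (function names are ours) sums the entropy of the geometric law in (4a) directly and compares it with a closed form in λ and μ (our algebra), showing the entropy climbing as the arrival rate approaches capacity:

```python
import math

def entropy_direct(lam, mu, n_terms=5000):
    """Direct -sum p(i) log2 p(i) over the geometric law of (4a)."""
    rho = lam / mu
    h = 0.0
    for i in range(n_terms):
        p = (1 - rho) * rho**i
        if p > 0:
            h -= p * math.log2(p)
    return h

def entropy_closed(lam, mu):
    """Closed form of the geometric entropy in terms of lam and mu."""
    return (math.log2(mu / (mu - lam))
            + (lam / (mu - lam)) * math.log2(mu / lam))

# The entropy, like the expected time in process 1/(mu - lam), grows
# sharply as utilization rises towards 100 per cent.
h_half = entropy_closed(0.5, 1.0)  # 50 per cent utilization
h_busy = entropy_closed(0.9, 1.0)  # 90 per cent utilization
```

At 50 per cent utilization the entropy is exactly 2 bits; at 90 per cent it has more than doubled, which is the "bottlenecks are the most complex" observation in quantitative form.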
Equations (4a) and (4b) can be generalized to networks of queues simply by adding the complexities from each of the processes. This is made possible because the entropy of two or more independent processes is always greater than the entropy of the same two processes where dependency exists. An analytical version of (4a) for networks, derived from queuing theory, can be found in [4] for example.
In practice, queues are of finite length, even if it does not always seem like that. Therefore, knowing (4b) is an upper bound, we can write:

H(S) = H(S_T) + H(S_{NT}) \qquad (5)

where the suffixes stand for tolerated and non-tolerated. The former are states we tolerate and can be thought of as tolerance limits. Non-tolerated states represent the process running out of tolerance.
As the above analysis has reintroduced the Markov property, we can surmise that it applies to states with r = 1. We shall call these programmable states as they could appear in a programme. We may readily extend (5) to include states with r = 0, as they can be shown to be independent of the r = 1 states. These we call non-programmable states. They are any Bernoulli-type processes, such as breakdowns, rejects and reworks, and by their very independence can be measured for each work centre. Non-programmable states have the effect of reducing the time available for processing useful work, i.e. they can thus create bottlenecks.
We are interested in measurement. Therefore we wish to present (5) in a form that facilitates it. First we can minimize the amount of observation required by recognizing that we are only interested in the process when it strays outside its tolerance bands. We replace H(S_T) by a single term representing the probability (P) of the system being in control. Moreover, we separate the programmable states from the non-programmable states and split the former into activities on the workstation and items waiting in the queue. Equation (5) then takes the ungainly form

H(S) = -\left[P\log_2 P + (1 - P)\log_2(1 - P)\right] - (1 - P)\left[\sum_{M_q}\sum_{N_j^q} p_{ij}^{q}\log_2 p_{ij}^{q} + \sum_{M_m}\sum_{N_j^m} p_{ij}^{m}\log_2 p_{ij}^{m} + \sum_{M_b}\sum_{N_j^b} p_{ij}^{b}\log_2 p_{ij}^{b}\right] \qquad (6)

where the p^q are the probabilities of queues of varying length (>1), the p^m are the probabilities of having a queue of one or zero (the make states of the operation), and the p^b are the probabilities of Bernoulli states, the non-programmable states. The independence of the queues (and of the Bernoulli states) allows each resource to be measured independently, for M resources. Moreover N_j^q + N_j^m + N_j^b = N_j, the number of states at resource j. The presence of the square bracket with the (1 − P) term outside arises because the tolerated and non-tolerated states are not independent.
The formula (6) is called the dynamic complexity. The term dynamic refers to the system rather than to the statistical processes, since they are assumed stationary.
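A sketch of how equation (6) might be evaluated in code follows; the function and argument names are ours, and the grouping of the three inner sums reflects our reading of the formula:

```python
import math

def _h(probs):
    """Entropy contribution -sum p log2 p, with 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def dynamic_complexity(P, queue_probs, make_probs, bernoulli_probs):
    """Sketch of equation (6): a control term in P, plus the inner
    queue/make/Bernoulli sums weighted by (1 - P). Each *_probs argument
    is a list of per-resource state-probability lists."""
    control = _h([P, 1 - P])
    inner = (sum(_h(ps) for ps in queue_probs)
             + sum(_h(ps) for ps in make_probs)
             + sum(_h(ps) for ps in bernoulli_probs))
    return control + (1 - P) * inner

# One resource, out of control half the time, equally likely to be in
# either of two make states when it is: 1 bit of control entropy plus
# half of 1 bit of inner entropy.
h = dynamic_complexity(0.5, [], [[0.5, 0.5]], [])
```

Raising P towards 1 shrinks both the control term and the (1 − P) weight, which is the "better control masks complexity" effect discussed later.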
A useful development of the formula is to consider the value of the make component over a long time. This involves putting P = 0, since there is no control element; the p^b terms to zero, as breakdowns are dynamic issues; and the p^q terms to zero, as only the make states are involved. The final step is to let time tend to infinity. Infinity has usually been taken to be a year! Equation (6) then reduces to:

H = -\sum_{M}\sum_{N_j} p_{ij}\log_2 p_{ij} \qquad (7)

Equation (7) is called the static complexity. It reflects the complexity of the
structure of the operation, and demonstrates that such complexity only has
meaning when considered in terms of the demand placed on it.
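Equation (7) is straightforward to compute once each resource's state probabilities (its products plus idle, derived from the year's loadings) are known. A minimal sketch, with our own names:

```python
import math

def static_complexity(resources):
    """Equation (7): summed -p log2 p across all states of all resources.
    `resources` is a list of per-resource state-probability lists, one
    entry per state (each product crossing the resource, plus idle)."""
    return sum(-sum(p * math.log2(p) for p in states if p > 0)
               for states in resources)

# Three resources, each equally likely to be making its single product
# or idle, contribute 1 epp apiece, so the total equals M = 3.
total = static_complexity([[0.5, 0.5]] * 3)
```

The demand dependence is visible in the probabilities themselves: the same physical structure under a different demand pattern yields different p values and hence a different figure.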
It must be emphasized that the outline given both in the last and in this
section is very sketchy. Any reader who is interested in the full development of
the ideas should see Frizelle[5].

Interpreting the model


Equation (6) can hardly be described as user-friendly. Therefore much time was
spent in ensuring the rigour of its derivation; the average operational manager
can hardly be expected to entertain it unless there are sound analytical reasons
for doing so. A similar amount of time has been spent in its interpretation. This
section will look at two aspects:
(1) How can the measure be interpreted, in operational terms?
(2) What does the measure say about operations?

Interpreting the measure in operational terms


A good starting-place is to look at equation (7). Think of each of the processes
merely as an operation on a resource. Then the resources are either working or
not working, and there are two states, make and idle. It is a well-known fact
that the entropy of finite systems takes its maximum value when all states are
equiprobable. In this simple case the probability p associated with each state is
1/2. Hence each resource returns a value of 1 and the static complexity is equal
to M, the number of resources. This was one of the motivations for choosing a
base of 2 in the first place.
In that sense the complexity measure can be thought of as a count of the number of resources, but assigning each one a weight of −p log₂ p. To arrive at a physical interpretation of this weight, again consider a resource with 1, 2, 3, … parts crossing it. As there is always an idle state, there will be 2, 3, 4, … states per resource. For simplicity, one again assumes equiprobable outcomes, so the weighting factor for the effect of the operations reduces to log₂ N, where there are N states, or N − 1 operations (one for each product).
The weight can be seen as a measure of the obstacle to flow represented by the presence of the operations. Thus, where there are no operations, i.e. the resource (process) is idle, there is no resistance to flow and the weight is zero.
Where there is one operation, then there is one product involved and the resistance to flow is one; the presence of that product will impede flow. Where there are two operations, the resistance to flow will be 1.58. This represents the blocking effect of the first product plus the incremental blocking effect of the second.
What are the grounds for claiming that? The first is intuitive. Adding each
new product will have a decreasing additional impact. After all if there were
1,000 products involved then adding another one would have little effect on the
overall impediment to flow. Following such intuitive reasoning, the marginal
impact of a second product will be to increase the impediment by 50 per cent or
1/2, so that the total impediment will be 1.5 (as opposed to the theoretical value
of 1.58).
However, there is also a sound theoretical reason and that comes from the
well-known fact that:
\log_e x = \int_1^x \frac{dy}{y}
which is the summation of the (infinitesimal) incremental change divided by its
total value up to that point. Hence the weighting is consistent with the notion of
resistance to flow caused by other operations present on the resource. One must
be careful not to say it is the resistance to flow, for entropy is a general concept
which has simply been applied in a particular context.
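On the equiprobable assumption, the weight sequence just discussed (0, 1, 1.58, …) can be generated directly; a small sketch, with a function name of our choosing:

```python
import math

def weight(n_products: int) -> float:
    """Resistance-to-flow weight of a resource carrying n_products
    equiprobable products plus one idle state: log2(n_products + 1)."""
    return math.log2(n_products + 1)

# Zero products: no resistance. One product: 1. Two products: 1.58.
# Each further product adds a diminishing increment; by the time 1,000
# products are present, the 1,001st adds almost nothing.
w1, w2 = weight(1), weight(2)
increment_late = weight(1001) - weight(1000)
```

The diminishing increments are the logarithmic analogue of the ∫dy/y argument above: each addition is weighed against the total already present.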
In information theory, the unit is the bit. However, an alternative unit is
suggested, which gives some sense of what is being measured. At its simplest
the measure counts the number of product processes. Since the weighting gives
the incremental effect of adding more products per process, the unit being
proposed is the equivalent product processes or epp.

What does it say about operations?


The major conclusion must be that complexity has the effect of impeding flow
by building ever bigger obstacles. This has the effect of extending lead times
and making the operation less predictable. Equation (6) says in addition that an
operation is either in or out of control. Control can be thought of as guiding the
system through the obstacles by keeping within predetermined limits. There
are two ways to be out of control with the appearance of either programmable
or non-programmable states.
The former were identified as states that could appear in a programme, but whose appearance is nonetheless unwelcome (non-tolerated). One example might be the plant making a part that was not asked for. A more subtle example is a variation in queue length. What does control mean in the latter case? It has to be a requirement to operate with minimal queues, i.e. queues of one or zero, with even queues of one rarely appearing. Non-programmable states, by their nature, cannot be influenced by direct intervention and hence can never be directly controlled.
Equation (6) has another useful property. Its recursive structure means that each of the terms within the square bracket is capable of being broken out into more detail while retaining the same mathematical structure. Provided that the inner terms form a hierarchy, it can be shown that the measure converges under quite wide conditions with the addition of ever greater levels of detail. This hierarchical structure allows for comparative measures to be taken, as comparable levels of detail can be specified from the outset. Thus different systems can be compared, or the same system can be assessed at different points in time. It is therefore meaningful to talk of one system being more or less complex than another.
There are two other conclusions that can be drawn from equation (6) which
tend to underline its validity as a model for a manufacturing system. The first
is that, if the control term (P ) were applied twice, with everything else staying
the same, then the dynamic complexity would increase. This is equivalent to
adding a second layer to the control bureaucracy. In other words, more
bureaucracy can only be justified if its introduction eliminates more disorder
than its arrival automatically creates.
The second conclusion comes from the observation that adding more terms
within the square bracket, all other things being equal, can only increase the
overall value of the dynamic complexity. This means that control can only be
exercised downwards.

Complexity reduction as a strategic goal for the operation


The structure of (6) and (7) gives rise to the idea that complexity reduction can
be made a strategic goal of the operation. Equation (6) demonstrates that
control and complexity are two sides of the same coin. Simply increasing P will
reduce the overall value of H. However, it does not directly affect the term within
the square bracket. In that sense it masks the inherent complexity of the
process. Therefore enforcing better control is one option but it has the
disadvantage of turning a blind eye to complexity rather than getting rid of it.
Moreover, there is a limit to what can be done since non-programmable states
are also non-tolerated states (or should be since to say otherwise would be to
tolerate breakdowns, rejects, etc.). Put another way, P can never be made equal
to 1 where non-programmable states are present.
An alternative course of action is to attack the terms within the square
bracket. Programmable states are amenable to direct outside intervention;
better scheduling, queue control and the introduction of JIT are practical
examples of tools that are available. On the other hand, non-programmable states can only be attacked indirectly, by pinpointing the causes. For example, a high level of rejects and reworks might suggest the introduction of a total quality programme.
However, equation (7), the static complexity equation, suggests a third route to complexity reduction, in this case through simplification. Fewer processes, a smaller range and fewer components will reduce the static complexity, and impact the dynamic both through a reduction in the number of terms in the inner summation and through a reduction in the number of processes as represented by the outer summation. Simplification reduces the inherent complexity within the structure and represents a step change. Changes to static complexity will very often involve capital expenditure.
The fact that all of the elements are measured on a common basis means that
the measure gives a uniform way of comparing the impact that each has on the
whole. Thus courses of action can be prioritized based on how much complexity
each source introduces, starting with the question of whether complexity
reduction is achieved through better management of the existing operation or
through simplifying the structure. The sums of money are potentially large.
The introduction of a scheduling system or JIT, for example, is costly.
Structural simplification is likely to involve significant capital outlay so that
any decision needs to be taken in the context of an overall strategy.

Measurement
The measure is intended to provide manufacturing management with a
practical tool that will help them focus on key issues. It therefore deliberately
steers away from including too many theoretical models such as the analytical
form of the queuing model in equation (4a). In this way the technique is thought
to be more robust and therefore more widely applicable. However, the
requirements of the proofs for (6) and the structure of (6) itself, particularly its
recursive form, give a useful framework for undertaking a study.
At the start of the study, the area to be examined needs to be clearly
delineated. The sample size and sampling frequency are determined in advance.
The degree of detail will be dictated by the availability of data, and in particular
by the least detailed source. Each term within the square bracket in (6) must
have the same level of resolution.
In carrying out the observations, the assumption is that the system will
exhibit reasonably stationary properties over relatively short periods, such as
two weeks. This seems to have been true in the three operations studied to date.
The work involves observing queues and the states of the resources, e.g.
making, being set up, out of production or idle, at equal intervals over the
predetermined period. Much leg work is required and a study needs at least two
people to carry it out. From such data the probabilities can be estimated in
terms of inferred occupancy times, and the dynamic complexity is calculated.
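The estimation step can be sketched in a few lines: sampling the resource at equal intervals and taking occupancy ratios as probability estimates, as the ergodic assumption permits. Names and the toy observation sequence below are ours:

```python
from collections import Counter

def state_probabilities(observations):
    """Estimate state probabilities as occupancy ratios: the fraction of
    equally spaced observations in which each state was seen. Valid
    under the stationarity and ergodicity assumptions in the text."""
    counts = Counter(observations)
    n = len(observations)
    return {state: c / n for state, c in counts.items()}

# A resource sampled eight times over the study period:
obs = ["make", "make", "idle", "make", "set-up", "make", "idle", "make"]
probs = state_probabilities(obs)
```

In a real study the sample size and frequency would be fixed in advance, as described above, rather than chosen after the fact.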
The static complexity has been derived directly from the company's database. It takes the bills of material, routeings and work centres, along with the demand pattern over the year, to determine the loadings. The period of a year has been chosen to eliminate seasonality. This procedure has the added advantage that data accuracy and completeness can be checked at the same time. The findings from that exercise alone can be salutary for management.
If either the static or the dynamic data set covers periods of really abnormal working, then the data relating to them must be excluded from the analysis. In such conditions the assumptions of stationarity are invalid.
Results
Three trials have been carried out to date. Each one was done at a distinct site in order to cover a spectrum of manufacturing processes. Their goal has been to prove the practicality of the technique, develop the supporting software and, above all, to ensure that the results were meaningful and useful to the management of the factories[6].

Trial 1
The first was carried out in a conventional machining section of a production
line. It consisted of five processes comprising 19 machines or workstations with
a parts list of 352 items. The products were large, precision-engineered
components. The perceived problems were an inability to meet schedule
through lack of capacity and quality problems.
The static complexity was calculated to be 95 equivalent product processes.
This can be seen roughly as follows. With a five-stage process there will be
around three machines per stage. These can be thought of as providing the
equivalent of three parallel lines, and in fact there was a great deal of
interchangeability between machines. On that basis there would be around 117 items per line, whose log₂ is 6.87. Multiplying by 15, the number of independent processes, gives a static complexity of 103 epp, fairly close to the observed figure
of 95. However, it is dangerous to assume that the figure can always be
approximated to in this way. A reasonably close result was obtained here
because of the structure of that particular operation. In other cases it does not
work, as will be seen in the next two examples. The correct procedure must
always be followed.
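The rough estimate can be reproduced arithmetically; this is the approximation only, as a sanity check, not the proper procedure:

```python
import math

parts = 352
parallel_lines = 3   # roughly three interchangeable machines per stage
stages = 5

per_line = parts // parallel_lines           # ~117 items per line
weight = math.log2(per_line)                 # ~6.87 epp per process
estimate = weight * stages * parallel_lines  # 15 independent processes
# estimate ~103 epp, against the properly measured figure of 95 epp.
```

The closeness here depends on the near-parallel structure of this particular operation; as the next two trials show, the shortcut fails where parts do not all cross every process.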
The dynamic complexity overall came out at 115 epp, so the dynamic element
adds very little to the static core. In fact the bar chart in Figure 1 shows how
the elements are made up, with the dynamic contributions shown merely as the
amount added to the static.
The obvious conclusion is to look at simplification. However, that was not
feasible as the majority of the static complexity came from the product range
itself, and reducing that was not an option. That left programmable states as
the next area to consider. Two issues emerged. The first was that there was a
high idle component in the figure, and second that there was a considerable
amount of unplanned subcontracting. This latter revealed itself by high queue
variation at the points where parts were sent out from, and received back into
manufacturing.
[Figure 1. Bar chart showing the composition of the complexity (test 1). Key: static; dynamic programmable (incremental); dynamic non-programmable (incremental)]

The high idle component appeared to be caused by labour shortages. Attacking this was likely to improve due-date performance. Additional costs could be saved by ending subcontracting. The example highlights the fact that high complexity merely gives a starting-place for the analysis; it may not always be the one to attack.

Trial 2
The second trial was carried out in an NC machine shop consisting of 35
processes, 59 machines or workstations processing 350 part numbers. The
operation makes small precision-machined components for high technology
applications.
Here the static complexity, shown in Figure 2, came out at 96.4. At first sight
this is surprising as it is little higher than the preceding case, even though there
are far more machines and processes involved. However, the number of part
numbers is almost identical to Trial 1. Moreover, in contrast with the previous exercise, not every part goes through every process. It shows how misleading rough calculations can be.
This time the dynamic complexity came out much higher than the static at
160 epp, with programmable states being the biggest contributor at 78.4 epp.
These results are shown in Figure 2 (note that the total dynamic complexity is
shown, not the incremental figure given in Figure 1).
The results indicated that operational problems predominated, and here was
the place to start. The high contribution from programmable states suggested
difficulties with volatile mix, batching and/or unstable flow. Further analysis of
the figures revealed the biggest issue to be queue stability, while the inspection
function showed up as the bottleneck.
It was proposed that the immediate solution was to introduce queue control, to calm the turbulence in the flow and to relieve pressure at inspection. The result would be more reliable delivery promises, and a reduction in work-in-progress inventories. In addition, a query was raised as to why so much inspection was needed.

[Figure 2. Bar chart showing the composition of the complexity (test 2). Key: static; dynamic total; dynamic programmable]
Longer term the high static complexity was seen as a problem. There were
opportunities for simplification, which did not exist in Trial 1, because many
machines were under-utilized.
Since completing the study, management has reported that schedule
adherence has jumped from 64 to 94 per cent[7].

Trial 3
This was carried out on a production line consisting of machining and
assembly. There were 74 machines or work centres, involving 57 processes and
126 active part numbers, the whole operation being controlled by a kanban
process. The factory made high technology aerospace defence systems and the
exercise covered the entire operation.
Figure 3 shows the results from the exercise. Static complexity was
calculated to be 75 epp. This is not surprising: far fewer part numbers are
involved, even though the number of machines is high. However, the machines
are more intensively used than in Trial 2, so the static figure does not drop
proportionately.
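The non-proportional drop follows from the entropy arithmetic: the figure depends on how evenly a machine's time is spread across part numbers, not just on how many part numbers there are. A minimal numerical sketch (figures invented for illustration):

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A machine loaded evenly across 4 part numbers carries as much
# entropy as one loaded unevenly across 8: intensity of use matters
# as much as the raw part count.
even_four    = entropy_bits([0.25] * 4)           # exactly 2.0 bits
skewed_eight = entropy_bits([0.65] + [0.05] * 7)  # about 1.92 bits
print(round(even_four, 2), round(skewed_eight, 2))
```

So halving the part count while loading the remaining parts more evenly can leave the static complexity almost unchanged, which is the pattern seen between Trials 2 and 3.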
The dynamic complexity came out at 145 epp, which was a surprise as the
assumption was that the kanban system would make for even flow. Indeed the
manufacturing management prided themselves on their operational control.
Here again the queue element showed up as the largest contributor. The results
are shown in Figure 3.
A more detailed analysis of the figures revealed that the machining area was
indeed well controlled, but not so the assembly. Much of the problem arose
through poor synchronization between the two. Moreover, as with the previous
trial, the bottleneck was at inspection. It was being caused by rework problems
that the plant was experiencing. While the latter conclusion was no surprise to
management, the discovery of the lack of synchronization was. There were
opportunities for inventory savings as a result.
As with Trial 2, there were longer-term opportunities for simplifying the
process. This shows up both in the static complexity and, to a degree, in the
queuing figure. As each process contributes to the overall figure, the more
processes there are, the higher it becomes, even if individual contributions are
relatively small. There is, after all, a well-known drawback of kanban processes:
over long routeings they tend to create rather than diminish work in progress.

Conclusions
In both a theoretical and practical sense, entropy does seem to provide a
measure for the complexity of an operation. Moreover, the structure of the
measure gives a basis for analysis as well as suggesting that the major
operational decision facing manufacturing is to choose between better
management of the process and its simplification. Its strength, in a practical
sense, is its ability to provide a basis of comparison between quantities that
previously could not be compared. As a result it allows alternative courses of
action to be prioritized. The likely cost of any one of these means that a decision
on which to follow must form part of any strategy. Therefore complexity
reduction can form one plank in the development of an operational strategy.
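The choice the measure poses — better management of the process versus its simplification — can be caricatured as a one-line decision rule. The rule below is an invented illustration of the idea, not a procedure the paper prescribes:

```python
def suggest_priority(static_epp, dynamic_epp):
    """Illustrative decision rule only -- not the authors' procedure.
    When dynamic complexity dominates, operational control comes first;
    when static complexity dominates, structural simplification does."""
    if dynamic_epp > static_epp:
        return "manage the process (stabilize flow, queues, scheduling)"
    return "simplify the process (fewer machines, routes, part numbers)"

print(suggest_priority(96.4, 160))  # Trial 2 figures
print(suggest_priority(75, 145))    # Trial 3 figures
```

In both trials the dynamic figure dominated, which is consistent with the operational recommendations (queue control, synchronization) coming before the longer-term simplification opportunities.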
It also appears to give an accurate picture of what happens on the shopfloor.
In all three tests, the analysis was done first and then presented to management.
Even where results came as a surprise to them, there was no disagreement with
the conclusions. As a further safeguard, Trial 3 was conducted in the absence of
one of the researchers. He was then given the results to interpret blind. The
conclusions given above were based on this analysis.

Figure 3. Bar chart showing the composition of the complexity (test 3). Key: static; dynamic total; dynamic queue.
Future work will look in more detail at the opportunities provided by static
complexity to help in the planning and introduction of new products. Two small
exercises have been carried out and gave encouraging results. Also the
extension of the measure for use in other types of system will be explored.
References
1. Parzen, E., Modern Probability Theory and its Applications, John Wiley & Sons, New York,
NY, 1960.
2. Taylor, A., General Theory of Functions and Integration, Blaisdell Publishing Company,
Waltham, MA, 1965.
3. Cox, D. and Miller, H., The Theory of Stochastic Processes, Methuen, London, 1965.
4. Kelly, F., Reversibility and Stochastic Networks, Wiley & Sons, Chichester, 1979.
5. Frizelle, G., An entropic measure for complexity in Jackson networks, Working paper in
manufacturing, No. 95/2, Cambridge University, 1991.
6. Woodcock, E., Case experience complexity results, presentation to Manufacturing
Forum, Churchill College, Cambridge, Manufacturing Engineering Group, University of
Cambridge, 1993.
7. Bowman, I., Complexity could help you make decisions, Manufacturing Systems,
September 1994.