Abstract: An improved scoring tool for research and development (R&D) project evaluation and selection is presented that
ranks project alternatives based on the criteria of relevance, risk,
reasonableness, and return. The scoring algorithm explicitly incorporates tradeoffs among the evaluation criteria and calculates
a relative measure of project value by taking into account the fact
that value is a function of both merit and cost. Implementation of
the selection method in a federal research laboratory is discussed.
A comprehensive overview of the most recent R&D project-selection literature is included.
Index Terms: R&D decision making, R&D management, R&D
project selection, ranking R&D projects, scoring.
I. INTRODUCTION
Authorized licensed use limited to: CENTRE FOR DEVELOPMENT OF ADVANCED COMPUTING. Downloaded on March 16,2010 at 07:02:13 EDT from IEEE Xplore. Restrictions apply.
TABLE I
R&D PROJECT SELECTION LITERATURE SUMMARY TABLE
Fig. 1. Example questionnaire for use by peer review teams for assessing merit of proposed projects during R&D project-selection process.
When tradable responses are added, a proposal with the highest possible rating for return but the lowest possible probability of success produces the same score (5 + 1 = 6) as a proposal with a midrange rating in each (3 + 3 = 6). Contrast this with the result of multiplying those same question responses, which produces the highest score when the responses are both in the middle of the response scale (5 × 1 = 5 versus 3 × 3 = 9).
For criteria that are not tradable and that contribute independently to the quality of a proposed project, it is more
appropriate to multiply the question responses in the algorithm. Unlike adding tradable criteria, multiplying independent
criteria (or independent groupings of tradable criteria) emphasizes the importance of each one individually and minimizes
the potential selection of proposals that are extremely poor
performers in any one independent category. On a scale of
one to five, for example, a low score of one for an important
criterion would result in that criterion contributing nothing to
the proposal's figure of merit.
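The contrast between adding tradable criteria and multiplying independent ones can be sketched as follows. This is an illustrative fragment, not the paper's full algorithm; the two responses (return and probability of success) are each rated on the 1-5 Likert scale used throughout:

```python
# Illustrative sketch: additive vs. multiplicative aggregation of two
# Likert-scale question responses, each rated 1-5.

def additive(return_score: int, p_success: int) -> int:
    """Tradable criteria: a high mark on one offsets a low mark on the other."""
    return return_score + p_success

def multiplicative(return_score: int, p_success: int) -> int:
    """Independent criteria: a very low mark on either drags the product down."""
    return return_score * p_success

# Highest return with lowest probability of success vs. midrange on both:
assert additive(5, 1) == additive(3, 3) == 6        # addition cannot tell them apart
assert multiplicative(5, 1) < multiplicative(3, 3)  # 5 < 9: the product rewards balance
```

The multiplicative form thus penalizes proposals that are extremely weak in any single independent category, which is exactly the behavior described above.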
To weight the criteria so that they reflect the desired
emphasis of the organization for a specific kind of R&D (basic
versus applied versus incremental), tradable question responses
in the same added grouping are each multiplied by a weighting
factor to reflect their relative level of importance. Multiplied question responses are instead weighted by raising them to a power.
Fig. 2. Example questionnaire for use by peer review teams for assessing merit of continuing projects during R&D project-evaluation process.
The following figure of merit, FM, can then be constructed:

FM = [ Π_j ( Σ_k w_jk q_jk )^(v_j) − 1 ] / 4,  with Σ_k w_jk = 1 and Σ_j v_j = 1  (1)

where q_jk is the peer-evaluator Likert-scale response for the kth question in the jth group, w_jk is the kth weight in the jth additive group, and v_j is the weight (exponent) applied to the jth multiplicative group; the two normalizations keep the bracketed product between one and five. In this algorithm, scientific return is
treated as tradable with programmatic/business return, and all
return is considered tradable with probability of success. The
probability of success/return group is then multiplied by both
the relevance response and the reasonableness response. The
quantity one is subtracted to shift the range from zero to four,
and then the entire expression is divided by four to scale it
between zero and one.
The fact that this approach explicitly accounts for tradeoffs
represents an improvement over previous models because it
more readily selects projects that possess the desired set of
characteristics. A meaningful case to examine is the one
where six proposals receive identical question responses of five
fours and a single one, but where those marks are distributed
differently. Would the rankings that result in that case properly
TABLE II
RESULTS OF USING COMBINED ADDITIVE/MULTIPLICATIVE ALGORITHM TO DISCRIMINATE BETWEEN SIX
DIFFERENT R&D PROJECTS WITH DIFFERENTLY DISTRIBUTED BUT IDENTICAL QUESTION RESPONSE LEVELS
for this case, the final rankings are the same for both the figure
of merit and the value index calculations.
We conducted computer experiments to test whether the
combined figure of merit algorithm with tradeoffs produces
statistically different rankings from either the purely multiplicative or the purely additive ones. Systematically varied proposal question responses were combined into figures of merit using a purely additive, a purely multiplicative, and the combined algorithm. Spearman rank correlation coefficients
were obtained for the additive versus multiplicative algorithms
and for the additive versus combined and the multiplicative
versus combined algorithms. Similar steps were taken to obtain
value indices for the three pairs. While all three comparisons
resulted in coefficients that fell well within the range for
rejecting the null hypothesis, the coefficient for the additive
versus multiplicative comparison was 0.96, but the coefficients
for the additive versus combined and the multiplicative versus
combined algorithms were lower, at 0.81.
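The kind of comparison described above can be sketched with a small stdlib-only experiment: score the same (hypothetical) response sets with a purely additive and a purely multiplicative rule, then compute the Spearman rank correlation of the two resulting orderings. The response grid below is an assumption for illustration, not the paper's actual experimental design:

```python
# Compare additive and multiplicative rankings of systematically varied
# proposals via the Spearman rank correlation (stdlib only).
from itertools import product

def rankdata(xs):
    """Average ranks (1-based), with tied values sharing the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend the tie group
        avg = (i + j) / 2 + 1           # mean rank of positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(a, b):
    """Spearman rho = Pearson correlation of the two rank vectors."""
    ra, rb = rankdata(a), rankdata(b)
    n = len(a)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    sa = sum((x - ma) ** 2 for x in ra) ** 0.5
    sb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (sa * sb)

# Systematically varied responses to three questions, each rated 1-5:
proposals = list(product(range(1, 6), repeat=3))
add_scores = [sum(p) for p in proposals]
mult_scores = [p[0] * p[1] * p[2] for p in proposals]
rho = spearman(add_scores, mult_scores)
assert 0.8 < rho <= 1.0  # the two rules rank these proposals very similarly
```

A high coefficient here mirrors the reported 0.96 for additive versus multiplicative: the two pure rules mostly agree, while the combined algorithm's lower coefficients (0.81) indicate it orders proposals measurably differently from either.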
IV. SOFTWARE DECISION SUPPORT TOOL
A software DSS was developed using Microsoft Excel.
The software is a custom application programmed with Excel macros that gives decision makers user-friendly access
to the model. The software acts as a database for the proposal
responses and calculates figures of merit and value index
results. The DSS ranks proposed R&D projects with respect
to both figure of merit and value index and indicates the
funding cutoff point for each, consistent with available amount
of funding. With the DSS, decision makers can carry out
"what-if" scenario analyses by varying weights, algorithms,
and funding levels. The software can also be used to produce
output for presentation.
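The ranking and funding-cutoff logic such a spreadsheet tool might implement can be sketched as follows. The project names, merit scores, and costs are hypothetical, and the value index is modeled here simply as merit per unit cost, following the paper's premise that value is a function of both merit and cost:

```python
# Sketch of DSS-style ranking with a funding cutoff. Value index is
# modeled as figure of merit per unit cost (an illustrative assumption).

def rank_and_cut(projects, budget):
    """projects: list of (name, figure_of_merit, cost) tuples.
    Returns (ranked, funded): the projects ordered by descending value
    index, and the prefix of that ordering that fits within the budget."""
    ranked = sorted(projects, key=lambda p: p[1] / p[2], reverse=True)
    funded, spent = [], 0.0
    for name, merit, cost in ranked:
        if spent + cost > budget:    # funding cutoff point
            break
        funded.append(name)
        spent += cost
    return ranked, funded

projects = [("A", 0.90, 300.0), ("B", 0.60, 100.0), ("C", 0.75, 150.0)]
ranked, funded = rank_and_cut(projects, budget=260.0)
# Value indices: B = 0.0060, C = 0.0050, A = 0.0030, so B and C fit.
assert [p[0] for p in ranked] == ["B", "C", "A"]
assert funded == ["B", "C"]
```

Note that the highest-merit project (A) falls below the cutoff: a cheap proposal of moderate merit can outrank an expensive one of high merit, which is the point of ranking by value rather than by merit alone. Varying `budget` or the weights feeding the merit scores reproduces the what-if analyses described above.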
V. IMPLEMENTATION
This technique of project selection, based on both merit and
value, was developed for the internal R&D projects funding
TABLE III
RELATIVE AND NORMALIZED WEIGHTS USED WITH (1) TO
OBTAIN RELATIVE RANKING OF BASIC SCIENCE R&D
PROJECTS AT A FEDERAL NATIONAL LABORATORY
VI. SUMMARY AND CONCLUSION
Anne DePiante Henriksen received the Ph.D. degree in chemical physics from the University of
Virginia, Charlottesville, and the M.B.A. degree in
operations/management science from the University
of New Mexico, Albuquerque.
She has worked at the Los Alamos National Laboratory in the Technology and Safety Assessment
Division, the Financial Operations Division, and the
Chemical and Laser Sciences Division. She was
a Visiting Assistant Professor with the Anderson
Graduate School of Management at the University
of New Mexico for one year, teaching courses in management of technology.
Currently, she is an Associate Professor of manufacturing and engineering in
the College of Integrated Science and Technology, James Madison University,
Harrisonburg, VA. Her research interests include technology assessment,
object-oriented modeling and simulation, enterprise integration, decision analysis, and environmental remediation technology analysis. Her current research
projects include hierarchical, atoms-to-enterprise modeling and simulation of
the integrated steel industry, and hierarchical modeling and simulation of
physiological systems.
Dr. Henriksen is a member of the International Association for Management
of Technology (IAMOT), the Institute for Operations Research and Management Science (INFORMS), the Society for Computer Simulation, the Decision
Sciences Institute, and the Production and Operations Management Society.