a Department of Accounting, Management Information Systems and Marketing, Clarkson University, Potsdam, NY 13699-5795, USA
b Department of Accounting and Information Systems, School of Business, Indiana University, 1309 E. 10th Street, Bloomington, IN 47405, USA
Abstract
An auditor considers a tremendous amount of data when assessing the risk that the internal control (IC) structure of an entity will fail to prevent or detect significant misstatements in financial statements. The myriad of relationships between IC variables that must be identified, selected, and analyzed often makes assessing control risk a difficult task. While some general procedures and guidelines are provided, audit standards dictate no specifically set procedures and rules for making a preliminary control risk assessment (CRA). Rather, the procedures and rules are left mostly to auditor judgment. This paper considers the appropriateness of applying artificial intelligence (AI) techniques to support this audit judgment task. It details the construction of a prototype expert network: an integration of an expert system (ES) and a neural network (NN). The rules contained in the ES model basic CRA heuristics, thus allowing for efficient use of well-known control variable relationships. The NN provides a way to recognize patterns in the large number of control variable inter-relationships that even experienced auditors cannot express as a logical set of specific rules. The NN was trained using actual case decisions of practicing auditors. © 1997 Elsevier Science B.V.
Keywords: Expert network; Neural network; Expert system; Control risk assessment; Auditing
1. Introduction

Many decision-making tasks do not lend themselves to formulation through the sole use of quantitative models, nor simple intuitive problem solving (Rosenhead, 1992). Building and incorporating qualitative and quantitative reasoning and modeling into a decision-aiding system is a challenge for practitioners and researchers (Gupta, 1994; Silverman, 1995; Liberatore and Stylianou, 1993, 1995). As the number of variables in a problem increases and the specificity and measurability of efficacy relationships diminishes, difficulties are exacerbated. The literature suggests that builders of systems to support complex decision-making draw from and attempt to combine a multitude of paradigms (for example, decision support, neural networks, case-based reasoning, and models using rules or objects) (Silverman, 1995; Beulens and Van Nunen, 1988; Turban and Watkins, 1986; Yoon et al., 1993).

This paper analyses the process of building a prototype intelligence-based system for application to a complex problem within the field of auditing. The prototype system is constructed as a decision-support tool for auditors analyzing the internal control (IC) structure of a business entity in order to derive a preliminary control risk assessment (CRA). It is designed as an expert network that combines two AI paradigms: expert systems (ESs) and neural networks (NNs).

J.T. Davis et al. / European Journal of Operational Research 103 (1997) 350-372

A review of the literature suggests that AI technologies such as ESs and NNs have found only limited use as decision aids in audit practice. Moreover, developers of such systems have generally relied on singular means of knowledge representation and reasoning. Intelligence-based systems in auditing have been primarily employed as traditional knowledge-based systems (O'Leary and Watkins, 1991; Gray et al., 1991; Eining et al., 1994; Smith, 1996), traditional statistical systems (Bell et al., 1990; Eining et al., 1994; Scott and Wallace, 1994), and as stand-alone neural network systems (Bell et al., 1990; Coakley and Brown, 1993).

This paper forms a part of a wider study into the increasing use of AI technologies in audit practice. As for any system, effectiveness and efficiency gains are not automatic or achievable without users accepting the system as a decision-aiding tool. Thus, system design, development and implementation are key factors in promoting acceptance (Gillett et al., 1995). Although design and development include, for example, traditional system aspects such as the design of the user interface, it may be that the characteristics of the underlying system technologies, here ESs and NNs, also have a major impact on user acceptance (Davis, 1994; Eining et al., 1994). More specifically, users such as auditors may be leery of technology that appears to take the decision out of their hands. However, acceptance and use may be enhanced as users gain a better understanding of the characteristics, benefits and limitations of these intelligence-based systems, and the role of the technology as a decision aid as opposed to a decision maker.

Thus, a motivation for the development of the prototype system is to enable future research designed to explore practitioners' acceptance (or rejection) of such systems. However, the objective of the research presented in this paper is narrower in scope. More specifically, this paper focuses on: (1) demonstrating the appropriateness and applicability of an expert network within audit practice; and (2) describing the methodology employed to design and develop the prototype system.

This paper, therefore, contributes to the literature on the practical application of AI-based systems to audit judgment tasks in at least two ways: (1) consideration of the application of hybrid intelligent computing techniques to an area of auditing that is at present making almost no use of them, but where there is developing interest and a potential for benefit to be accrued from their use; and (2) development of an integrated ES/NN prototype system, the CRA Expert Network, constructed for a PC-based environment using tools that lead to a PC-based product appropriate for field experimentation.

The remainder of the paper is organized as follows. Section 2 describes the nature of the audit task, providing the background to the adoption of an expert network approach. Section 2 also describes the principles behind the integration of ESs and NNs and introduces the prototype CRA Expert Network. Section 3 details the design and construction of the prototype system, including a description of an experiment conducted with practicing auditors used to obtain judgment data for training the NN. Section 4 contains brief concluding comments.

2. Overview of domain and system architecture

When a public accounting firm audits a business entity, the potential exists that the auditors may not discover material misstatements in the entity's financial statements. The likelihood of not discovering a significant misstatement is called audit risk. During the audit process, the business entity's internal control (IC) structure is evaluated to determine the nature, timing and extent of audit tests to be performed that will be effective in reducing audit risk. The purpose of the IC structure is to prevent and/or detect erroneous, fraudulent, or missing accounting transactions. Consequently, a complete and proper assessment of this structure is critical to a successful audit.

The attempt to build intelligence-based systems to support audit judgment tasks, such as control risk assessment (CRA), is dependent on the acquisition and representation of knowledge and heuristics gained through audit experience (Deng, 1994). General audit theory and heuristics can be captured and represented using logical constructs. But, obtaining
knowledge from auditors for making small yet important distinctions between situations in the form of specific rules may be nearly impossible. That is, assessing the IC structure potentially requires evaluating hundreds of variables with thousands of possible inter-relationships. It is not done by a serial, step-by-step reasoning process, but rather by recognizing patterns in a given situation and reacting appropriately based on experience (Dreyfus and Dreyfus, 1986; Anderson, 1983). Evaluating these many complex relationships is a difficult task even for the most experienced auditors; articulating them is essentially impossible. However, neural network systems can be used to automate judgment tasks that require this pattern recognition.

Thus, the innate complexities of the audit judgment task suggest that an appropriate approach to system design is to integrate sub-systems of AI techniques, each of which addresses distinct aspects of domain knowledge and reasoning (Lymer and Richards, 1995). Given the characteristics of the CRA task, the prototype system is constructed as an expert network reflecting the integration of an expert system (ES) and neural network (NN). Briefly, the ES incorporates general audit theory and well-known control relationships using a logical set of explicit rules. The NN in the CRA Expert Network is used to recognize and establish patterns among the large number of inter-related variables inherent in the task. When exercised, the ES provides the user interface and evaluates the complexity and basic structure of an entity's internal controls. If the ES determines that the IC structure is sufficiently complex to warrant further analysis, the data collected during interaction with the ES is fed as input into the NN. The NN is stimulated by this input data and provides a preliminary CRA. The NN evaluates approximately two hundred variables with thousands of potential inter-relationships used to make a preliminary CRA. The following section examines the principles behind expert networks such as the CRA Expert Network.

2.1. Expert networks: An integrated approach

An expert network is one form of integrated system designed to address the limitations of using a single representation and reasoning approach (Lymer and Richards, 1995; Frisch and Cohn, 1991). ESs, based on symbolic computation, are best at modeling structured problem domains (or aspects thereof) that conform well to logical constructs. Conversely, NNs, based on numerical computation, can successfully model problems that do not conform well to explicit logical constructs. It is when these situations cross, such as in the CRA application, that the combined capabilities of a blended solution should be considered (Caudill, 1991; Medsker and Bailey, 1992; Medsker, 1994). By combining the deductive reasoning approach of an ES with the inductive approach of a NN, difficult and somewhat unstructured tasks may be performed. Integration takes advantage of the strengths of each type of system, while mitigating the inherent weaknesses of each when used alone. Let us briefly examine the subsystems of an expert network.

2.1.1. Components of expert networks

Cognitive models, focused on representations of knowledge, suggest that human problem solvers organize knowledge through the use of propositions. A proposition, in the form of an if-then rule, is the atomic building block of rule-based expert systems. Domain knowledge, generally derived from a single or multiple subject matter experts, is codified into a set of linearly executed discrete rules (Solso, 1988; Greeno, 1973; Johnson-Laird, 1983; Minsky, 1986). However, significant development barriers arise because experts often cannot articulate knowledge and complex relationships as discrete rules. Only when asked do they produce a justification for the judgments made. Even then, the justification is a generalization that more than likely has a fair number of exceptions (Deng, 1994).

Propositional logic is rarely a sufficient means to represent complex reasoning (Kunz et al., 1987; Jackson, 1990). Rather, ESs are a particularly good approach for closed-system applications that have literal and precise inputs that lead to logical outputs (MacLennan, 1993). However, ESs are relatively inflexible since performance degrades sharply when they are applied to problems outside their original scope (Jackson, 1990; Kunz et al., 1987; Edwards and Connell, 1989, p. 25). Moreover, changes in domain knowledge structure and content often require substantial system modifications to continue or enhance system viability.
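As a concrete illustration of the if-then propositional style just described, a rule base can be sketched as an ordered list of condition/conclusion pairs. The variable names, thresholds, and conclusions below are invented for illustration; they are not the rules of the actual CRA Expert Network or of Infocus.

```python
def assess_control_environment(facts):
    """Apply discrete if-then rules to a dict of IC facts.

    Illustrative only: each rule is a predicate over hypothetical IC
    facts plus a coarse control-risk conclusion. Rules are evaluated
    linearly, mirroring the 'linearly executed discrete rules' of a
    classic rule-based ES.
    """
    rules = [
        # IF management integrity is in doubt THEN risk is high
        (lambda f: not f.get("management_integrity_ok", True),
         "slightly_below_maximum"),
        # IF duties are not segregated THEN risk is at least moderate
        (lambda f: not f.get("duties_segregated", True),
         "moderate"),
        # IF EDP controls are documented AND reconciliations are
        # reviewed THEN risk may be assessed as limited
        (lambda f: f.get("edp_controls_documented", False)
                   and f.get("reconciliations_reviewed", False),
         "limited"),
    ]
    for condition, conclusion in rules:
        if condition(facts):
            return conclusion  # first rule that fires wins
    return "moderate"  # default when no rule fires
```

The brittleness discussed above shows up immediately: any client situation not anticipated by a rule falls through to the default.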
Conversely, NNs can analyze large numbers of inter-related variables to establish patterns and characteristics in situations where precise, discrete rules are not known or discernible (MacLennan, 1993). Rather than depending on explicit knowledge as expressed by a domain expert, NNs model the implicit relationships in exemplar domain data. The continuous nature of the stimulus/response approach allows for efficient modeling of complex tasks. Simply put, a neural network discovers its own numeric, continuous rules as it learns through the examples it is provided. NN systems may be able to perform certain types or parts of audit judgment tasks that are difficult and perhaps inappropriate for the capabilities of other types of intelligent systems.
An artificial NN consists of processing elements linked via weighted uni-directional signal channels called connections to form a distributed, parallel processing architecture (Rumelhart and McClelland, 1986; Hecht-Nielsen, 1990). Each processing element can possess local memory and carry out localized information processing operations. The processing element, or neuron, is the atomic building block of the NN (Fig. 1). NN paradigms differ in how the processing elements are connected and the manner in which the weights are updated (Markham and Ragsdale, 1995; Rumelhart and McClelland, 1986).
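The processing element of Fig. 1, a weighted summation followed by a transfer function, can be written directly. This sketch assumes a sigmoid transfer function (one of the choices shown in Fig. 1) and is not tied to any particular NN package.

```python
import math

def processing_element(inputs, weights, bias):
    """One NN processing element as in Fig. 1: a weighted sum of the
    incoming signals plus a bias term, passed through a sigmoid
    transfer function."""
    net = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-net))  # sigmoid squashes output to (0, 1)
```

With zero net input the element outputs 0.5; larger weighted sums push the output toward 1, smaller toward 0.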
A NN is a statistical modeling technique. However, NNs can make less stringent assumptions concerning independence of variables and the shapes of underlying data distributions than other statistical techniques, e.g., regression or multiple discriminant analysis (Rumelhart and McClelland, 1986; Lippmann, 1987; Lacher et al., 1995). Rich discussions of the statistical aspects of NNs and their relation to more traditional statistical models may be found in Ripley (1993, 1994), Cheng and Titterington (1994) and Sarle (1994).
While NN systems are flexible in terms of fault tolerance to missing or noisy data, they do not have some basic characteristics for flexible precise commonsense reasoning, e.g., symbolic processing or interpretation of internally stored knowledge (Sun, 1994, p. 247). Unlike ESs, NNs have no inherent explanatory function 'module'. This has hindered acceptance in practice, as it is not clear to a non-technical user how the network derived a given conclusion. However, research is being conducted to extract comprehensible symbolic representations from trained NNs (e.g., Craven and Shavlik, 1996).
2.1.2. Relationship of the CRA expert network to existing systems

A number of hybrid ES and NN systems have been described in the literature. Kandel and Langholz (1992), Gallant (1993), Medsker (1995), and Sun and Bookman (1995) have compiled research concerning the integration of symbolic and numeric systems. These compilations include descriptions of expert networks applied to application domains such as natural language processing, biology and medicine, management, and engineering (see particularly Medsker, 1995, pp. 39-56).

Fig. 1. Neural network processing element structure. [Diagram: inputs (outputs from other neurons or processing nodes) plus a bias input X0 feed a summation function and a transfer function, e.g. sigmoid, producing the element's output.]

The relationship between the ES and the NN in the CRA Expert Network is structured similarly to systems reported by Lin and Hendler (1995), MacIntyre et al. (1995), and Bataineh et al. (1996). Lin and Hendler (1995) use a NN to classify ballistic signals and the output is passed on to the ES for further processing and interpretation. Using the same neural network software employed by the CRA Expert Network, MacIntyre et al. (1995) and Bataineh et al. (1996) present expert networks for application within the electric utilities industry. The ESs in these applications also provide processing inputs to the NNs.

The CRA Expert Network is loosely coupled (Medsker, 1994). The ES provides the user interface and conducts preprocessing of data that is fed into the NN. After the NN performs its pattern-matching activities, output is passed back to the ES for display. Advantages of loosely coupling the components of the prototype system are twofold: (1) system development is amenable to commercially available software; and (2) maintenance is reduced due to the simplicity of the data file interface approach. The disadvantage of a great deal of redundancy that usually accompanies loosely coupled systems has been avoided in the CRA Expert Network, perhaps due to the nature of the problem more than any other reason. Before proceeding to an in-depth discussion of the structure of the CRA Expert Network, the following two sections describe the knowledge, and the sources of that knowledge, reflected in the ES and NN components, respectively.

Thus, while expert networks (and other forms of integrated systems) are not an explicitly new technique, we have found no evidence of the technique being applied to audit judgment tasks such as CRA. However, given the characteristics of the task, there is significant scope for consideration of their applicability to this domain. Furthermore, the ability to construct such systems using PC-based tools should facilitate experimentation with them in the field. The following sections detail the design and construction of the CRA Expert Network.

Expert system development

The rules for the ES were primarily derived from structured logic and questions found in Grant Thornton's internal control documentation and evaluation software - the Information and Control Understanding System (Infocus) (Grant Thornton, 1992) - which is used in audit and consulting practice. Infocus does not make any risk assessment itself and is not an AI decision tool. However, it provides general relationships among IC variables to assist the auditor in structuring the preliminary CRA process. The knowledge and logic in Infocus were formalized in a logical rule structure. The interface provided by the ES allows the auditor to document the IC data of a given client. It should be noted, however, that the user interface screens that are provided to an auditor
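The loosely coupled, data-file interface between the ES and NN components described in Section 2.1.2 can be sketched as follows. All names, the JSON file format, and the screening threshold are assumptions made for this sketch, not details of the actual prototype, which was built with commercial PC-based tools.

```python
import json

def es_phase(client_facts):
    """ES front end (illustrative): collect IC cues and decide whether
    the IC structure is complex enough to warrant NN analysis. The
    threshold and field names are invented for this sketch."""
    controls = client_facts.get("controls", [])
    return {"run_nn": len(controls) >= 3, "inputs": controls}

def hand_off(es_output, path):
    """The data-file interface of a loosely coupled expert network:
    the ES writes its preprocessed cues; the NN component reads them."""
    with open(path, "w") as f:
        json.dump(es_output, f)

def nn_phase(path, weights):
    """NN back end (illustrative): read the cue file and return a 0-100
    point-estimate CRA. A single linear scoring element stands in for
    the trained network."""
    with open(path) as f:
        data = json.load(f)
    if not data["run_nn"]:
        return None  # ES judged further NN analysis unnecessary
    score = sum(w * x for w, x in zip(weights, data["inputs"]))
    return max(0.0, min(100.0, score))
```

Because the two components communicate only through the cue file, either side can be replaced (e.g., by a commercial shell or NN package) without modifying the other, which is the maintenance advantage noted above.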
3. Control risk assessment (CRA) expert network
control risk. The IC potential variables/questions, and the possible values that each may be assigned, are presented in Appendix A. The appendix reflects the complexities and relationships of the questions and variables that are potentially considered by an auditor (including the 64 auditors that participated in the experiment) during the CRA task. Appendix A indicates the percentage of the auditors, by case, that deemed each variable/question as relevant.

IC environmental features, EDP environmental features, accounting controls, and segregation of duties operate as inter-dependent parts of the IC structure. While these variables were included in this experiment and addressed by the auditors in various fashions, the complex inter-dependencies between the variables make it virtually impossible for an auditor to explain the structure of their cue set, i.e. the relationships and strengths of relationships between items in the cue set, and the impact of this structure on the CRA. These inter-dependencies serve to illustrate the complexity of the domain and the task facing an auditor when conducting a CRA.

In addition to identifying their respective cue set, each auditor recorded: (1) whether s/he planned to rely on manual controls, programmed controls, and/or segregation of duties, indicating whether more tests of controls as opposed to more substantive tests would be used to reach an acceptable level of audit risk; (2) the existing controls selected as key controls, i.e., controls the auditor intended to rely on and test to determine if the control was operating as it should; and (3) any controls that s/he felt were missing.

Collection of CRA output data. Following analysis of the case, each auditor recorded their preliminary CRA in Infocus. Infocus provides a choice of four risk categories: maximum; slightly below maximum; moderate; and, limited. These risk categories are consistent with the SAS No. 55 Audit Guide. In addition to the categorical response, each auditor recorded their CRA using a point estimate (0 to 100 scale) which provides an indication of how close to the category border their judgment would be based on the numeric scale. The point estimate corresponds to the categorical judgment responses as follows:

Risk Category                  Point Estimate Interval
Limited (LTD)                  0-25
Moderate (MOD)                 26-50
Slightly below maximum (SBM)   51-75
Maximum (MAX)                  76-100

Training and validation of NN

The network training (within sample) and testing (out-of-sample or hold-out) data sets each contained 32 observations. Once trained, using the 32 in-sample observations, the NN model serves as a proxy for the typical knowledge structure used by the experienced senior auditors in the training data set. The 32 observations in the hold-out sample were used to test the resulting model.

The CRA Expert Network employs a feedforward classification network that was trained using the backpropagation learning algorithm (Rumelhart and McClelland, 1986).
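A minimal version of such a feedforward network trained by backpropagation (gradient descent on squared error) is sketched below. The architecture (a handful of hidden units and a single sigmoid output standing in for the scaled point estimate) and the training parameters are made up for illustration; the actual prototype used a commercial NN package with roughly 210 input nodes.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_backprop(samples, n_hidden=3, epochs=2000, lr=0.5, seed=0):
    """Train a one-hidden-layer feedforward network by backpropagation.

    samples: list of (input_vector, target) pairs with target in [0, 1].
    Returns a predict(x) function. Dimensions and learning rate are
    illustrative, not those of the CRA Expert Network.
    """
    rng = random.Random(seed)
    n_in = len(samples[0][0])
    # trailing weight in each row/vector is the bias weight
    w1 = [[rng.uniform(-1.0, 1.0) for _ in range(n_in + 1)]
          for _ in range(n_hidden)]
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(n_hidden + 1)]
    for _ in range(epochs):
        for x, target in samples:
            xb = list(x) + [1.0]                      # append bias input
            h = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in w1]
            hb = h + [1.0]
            y = sigmoid(sum(w * v for w, v in zip(w2, hb)))
            # deltas for squared error; sigmoid derivative is a * (1 - a)
            dy = (y - target) * y * (1.0 - y)
            dh = [dy * w2[j] * h[j] * (1.0 - h[j]) for j in range(n_hidden)]
            for j in range(n_hidden + 1):             # output-layer update
                w2[j] -= lr * dy * hb[j]
            for j in range(n_hidden):                 # hidden-layer update
                for i in range(n_in + 1):
                    w1[j][i] -= lr * dh[j] * xb[i]
    def predict(x):
        xb = list(x) + [1.0]
        hb = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in w1] + [1.0]
        return sigmoid(sum(w * v for w, v in zip(w2, hb)))
    return predict
```

On a toy separable mapping the network's output moves toward the targets after a few thousand passes; the paper's network was trained analogously on the 32 in-sample auditor judgments.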
[Diagram: two panels, 'Network Accuracy - Training Data' and 'Network Accuracy - Testing Data', each plotting desired output, network output, and their difference (0-1 scale) across observations.]
Fig. 2. Network validation.
Fig. 3. Expert network: Knowledge base structure with embedded neural network.
Testing using the hold-out sample resulted in a Pearson correlation coefficient of 0.695, with a category accuracy rate of 78%. The risk category prediction errors for the test sample included: four observations that were one category higher than the assessment by the auditor; and one observation each that was one category lower than the auditor, two categories higher than the auditor, and two categories lower than the auditor. Fig. 2 presents the point estimate network accuracy in relation to both the training and testing samples.

The correlation coefficient is a measure of relationship between paired observations in two data sets - here, the relationship between the auditor's point estimate and the model's point estimate for each observation. Category accuracy rates were derived by comparing the NN point estimate output, based on an auditor's input cue set, to the auditor's chosen risk category. For comparison, error rates for NN models built for financial distress applications (Bell et al., 1990; Tam and Kiang, 1992) were in the range of 10-23%. However, it is important to note that the models in those studies were much simpler than the NN developed in this study. In those studies there were fewer than 10 input variables with two response levels for output. Predicting a four-category response with 210 input nodes is a significantly more difficult problem.

A paired t-test (see Koopmans, 1987, p. 325) was also run on the predicted and actual network output. No statistical difference was found for the training sample (mean difference = -0.0003, one-tail p-value = 0.48) or the testing sample (mean difference = 0.0239, one-tail p-value = 0.24).

In addition, two classification networks were developed - the first employed backpropagation and the second a radial basis function - using only the CRA classification categories and not the auditors' point estimates. The hold-out sample classification accuracy for both these models was approximately 49%. Conversely, as described above, the point estimates for the trained network in the CRA Expert Network were within the classification ranges chosen by the auditors 78% of the time. Clearly, for this data, using a point estimate within CRA category provided superior model precision.
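The evaluation statistics discussed above - mapping point estimates into the four risk categories, category accuracy, and the Pearson correlation between paired point estimates - can be reproduced with a few lines. This is a sketch of the general calculations only, not the authors' exact procedure.

```python
def risk_category(point_estimate):
    """Map a 0-100 point estimate onto the four SAS No. 55-style
    risk categories used in the paper."""
    if point_estimate <= 25:
        return "LTD"
    if point_estimate <= 50:
        return "MOD"
    if point_estimate <= 75:
        return "SBM"
    return "MAX"

def category_accuracy(model_points, auditor_categories):
    """Share of observations whose model point estimate falls inside
    the auditor's chosen risk category."""
    hits = sum(risk_category(p) == c
               for p, c in zip(model_points, auditor_categories))
    return hits / len(model_points)

def pearson_r(xs, ys):
    """Pearson correlation between paired point estimates."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5
```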
[Diagram, panels (a) and (b): flowchart of the ES phases. Phase 1 asks control environment questions and sets the control risk threshold (CRT >= Slightly Below Maximum, or CRT >= Limited, with CRA = current threshold); subsequent phases cover the Computer Processing Overview (CPO) and the Accounting Controls Questionnaire (manual and programmed control questions).]
Appendix A (fragment). Programmed control variables, with the percentage of auditors, by case, deeming each relevant:

Matching to a previously validated document (37%, 40%, 30%)
Matching to a previously validated file (0%, 27%, 52%)
Matching to an authorized list (26%, 47%, 22%)
Prior data matching (0%, 0%, 0%)
Reasonableness edit (0%, 13%, 4%)
Range check edit (0%, 0%, 0%)
Check digit (0%, 0%, 4%)
Monthly review of bank reconciliations (0%, ...)
Other: Driver gets signature on annotated sales invoice. (19%, 0%, 0%)
Other: Cost of goods entry is calculated from sales invoices and calculations reviewed (4%, 0%, 0%)
Other: Credit check on invoice customer (0%, 0%, 0%)
Other: Compares amounts written off to aged trial balance and customer detail (0%, 0%, 0%)

References

AICPA, 1989. American Institute of Certified Public Accountants. Audit guide: Consideration of the internal control structure in a financial statement audit. New York, NY.
...accountants. Management Accounting, 18-22.
Brown, C., O'Leary, D., 1994. Introduction to Artificial Intelligence and Expert Systems. Published Monograph.
Caudill, M., 1991. Expert networks. Byte, 108-116.
Cheng, B., Titterington, D., 1994. Neural networks: A review from a statistical perspective. Statistical Science 9, 2-54.
Coakley, J., Brown, C., 1993. Artificial neural networks applied to ratio analysis in the analytical review process. International Journal of Intelligent Systems in Accounting, Finance and Management, 19-39.
Craven, M., Shavlik, J., 1996. Extracting tree-structured representations of trained networks. In: Advances in Neural Information Processing Systems 8. MIT Press, Cambridge, MA.
Davis, E., 1994. Effects of decision aid type on auditors' going concern evaluation. Audit Judgment Symposium, co-sponsored by Grant Thornton.
Gillett, P., Bamber, E., Mock, T., Trotman, K., 1995. Audit judgment. In: Bell, T., Wright, A. (Eds.), Auditing Practice, Research, and Education: A Productive Collaboration. American Institute of Certified Public Accountants in cooperation with the Auditing Section of the American Accounting Association, New York.
Grant Thornton, 1992. Information and Control Understanding System, version 2.08.
Gray, G., McKee, T., Mock, T., 1991. The future of expert systems and decision support systems on auditing. Advances in Accounting 9, 249-273.
Greeno, J.G., 1973. The structure of memory and the process of solving problems. In: Solso, R.L. (Ed.), Contemporary Issues in Cognitive Psychology: The Loyola Symposium. Wiley, New York, NY.
Gupta, U., 1994. How case-based reasoning solves new problems. Interfaces 24 (6), 110-119.
Hecht-Nielsen, R., 1990. Neurocomputing. Addison-Wesley, Reading, MA.
Hedberg, S., 1995. Where's AI hiding? AI Expert, 17-20.
Hornick, K., Stinchcombe, M., White, M., 1989. Multilayer feedforward networks are universal approximators. Neural Networks 2, 359-366.
Jackson, P., 1990. Introduction to Expert Systems. Addison-Wesley, Wokingham.
Johnson-Laird, P.N., 1983. Mental Models: Toward a Cognitive Science of Language, Inference, and Consciousness. Harvard University Press, Cambridge, MA.
Kandel, A., Langholz, G., 1992. Hybrid Architectures for Intelligent Systems. CRC Press, Boca Raton, FL.
Koopmans, L., 1987. Introduction to Contemporary Statistical Methods. Duxbury Press, Boston, MA.
Kunz, J., Kehler, T., Williams, M., 1987. Applications development using a hybrid AI development system. In: Hawley, R. (Ed.), Artificial Intelligence in Programming Environments. Ellis Horwood, Chichester.
Lacher, R., Coats, P., Shanker, C., Franklin, L., 1995. A neural network for classifying the financial health of a firm. European Journal of Operational Research 85, 53-65.
Liberatore, M., Stylianou, A., 1993. The development manager's advisory system: A knowledge-based DSS tool for project assessment. Decision Sciences 24 (5).
Liberatore, M., Stylianou, A., 1995. Expert support systems for new product development decision making: A modeling framework and applications. Management Science 41 (8), 1296-1316.
Lin, C., Hendler, J., 1995. Examining a hybrid connections/symbolic system for the analysis of ballistic signals. In: Sun, R., Bookman, L. (Eds.), Computational Architectures Integrating Neural and Symbolic Processes. Kluwer Academic Publishers, Norwell, MA, pp. 319-348.
Lippmann, R., 1987. An introduction to computing with neural nets. IEEE ASSP Magazine, 4-22.
Lymer, A., Richards, K., 1995. A ...-based expert system for personal pension planning in the UK. Intelligent Systems in Accounting, Finance and Management 4, 71-88.
MacIntyre, J., Smith, P., Harris, T., 1995. Industrial experience: The use of hybrid systems in the power industry. In: Medsker, L., Hybrid Intelligent Systems. Kluwer Academic Publishers, Norwell, MA, pp. 57-74.
MacLennan, B., 1993. Continuous symbol systems - The logic of connectionism. In: Levine, D.S., Aparicio IV, M. (Eds.), Neural Networks for Knowledge Representation and Inference. Lawrence Erlbaum, Hillsdale, NJ.
Markham, I., Ragsdale, C., 1995. Combining neural networks and statistical predictions to solve the classification problem in discriminant analysis. Decision Sciences 26 (2), 229-242.
Medsker, L., 1994. Design and development of hybrid neural network and expert systems. In: Proceedings of the IEEE International Conference on Neural Networks III, Orlando, FL, pp. 1470-1474.
Medsker, L., 1995. Hybrid Intelligent Systems. Kluwer Academic Publishers, Norwell, MA.
Medsker, L., Bailey, D., 1992. Models and guidelines for integrating expert systems and neural networks. In: Kandel, A., Langholz, G. (Eds.), Hybrid Architectures for Intelligent Systems. CRC Press, Boca Raton, FL, pp. 153-171.
Minsky, M., 1986. The Society of Mind. Simon and Schuster, New York, NY.
NeuralWare: Technical Publications Group, 1992. Neural Computing - A Technology Handbook for Professional II/Plus and NeuralWorks Explorer. Pittsburgh, PA.
O'Leary, D., Watkins, P., 1991. Review of expert systems in auditing. Expert Systems Review for Business and Accounting, 3-22.
Ripley, B., 1993. Statistical aspects of neural networks. In: Barndorff-Nielsen, O., Jensen, J., Kendall, S. (Eds.), Networks and Chaos - Statistical and Probabilistic Aspects. Chapman and Hall, London, pp. 40-123.
Ripley, B., 1994. Neural networks and related methods for classification. Journal of the Royal Statistical Society B 56 (3), 409-456.
Rosenhead, J., 1992. Into the swamp: The analysis of social issues. Journal of the Operational Research Society 43 (4), 293-305.
Rumelhart, D., McClelland, J., 1986. Parallel Distributed Processing: Volumes I and II. The MIT Press, Cambridge, MA, pp. 110-146.
Sarle, W., 1994. Neural networks and statistical models. In: Proceedings of the Nineteenth Annual SAS Users Group International Conference. SAS Institute, Cary, NC, pp. 1538-1550.
Scott, D., Wallace, W., 1994. A second look at an old tool: Analytical procedures. The CPA Journal, 30-35.
Silverman, B., 1995. Knowledge-based systems and the decision sciences. Interfaces 25 (6), 67-82.
Solso, R., 1988. Cognitive Psychology, 2nd edition. Allyn and Bacon, Boston, MA.
Sun, R., 1994. A two-level hybrid architecture for structuring knowledge for commonsense reasoning. In: Sun, R., Bookman, L. (Eds.), Computational Architectures Integrating Neural and Symbolic Processes. Kluwer Academic Publishers, Norwell, MA, pp. 247-282.
Sun, R., Bookman, L., 1995. Computational Architectures Integrating Neural and Symbolic Processes. Kluwer Academic Publishers, Norwell, MA.
Sutton, S., Young, R., McKenzie, P., 1995. An analysis of potential legal liability incurred through audit expert systems. Intelligent Systems in Accounting, Finance and Management 4, 191-204.
Tam, K., Kiang, M., 1992. Managerial applications of neural networks: The case of bank failure predictions. Management Science 38 (7), 926-947.
Turban, E., Watkins, P.R., 1986. Integrating expert systems and decision support systems. MIS Quarterly 10 (2), 121-136.
Yoon, Y., Guimaraes, T., Swales, G., 1993. Integrating artificial neural networks with rule-based expert systems. Decision Support Systems, Special Issue on Artificial Neural Networks.