Você está na página 1de 17

EUROPEAN ​JOURNAL OF

OPERATIONAL RESEARCH ELSEVIER


​ ​European Journal of Operational Research 103 (1997) 350-372

Supporting a complex audit judgment task: An


expert network approach
Jefferson T. Davis a, Anne P. Massey b,,, Ronald E.R. Lovell II a

a Department of Accounting, Management Information Systems and Marketing. Clarkson University, Potsdam, NY 13699-5795, USA b Department of
Accounting and lnfi)rmation Systems, School of Business, Indiana University, 1309E. lOth Street, Bloomington, IN 47405, USA

Abstract

An auditor considers a tremendous amount of data when assessing the risk that the internal control (IC) structure of an entity will fail to
prevent or detect significant misstatements in financial statements. The myriad of relationships between 1C variables that must be identified,
selected, and analyzed often makes assessing control risk a difficult task. While some general procedures and guidelines are provided, audit
standards dictate no specifically set procedures and rules for making a preliminary control risk asscssmcnt (CRA). Rather, the proccdures and rules
are left mostly to auditor judgment. This paper considers the appropriateness of applying artificial intelligence (A1) techniques to support this audit
judgment task. It details the construction of a prototype ​expert network; a​ n integration of an expert system (ES) and a neural network (NN). The
rules contained in the ES model basic CRA heuristics, thus allowing for efficient use of well-known control variable relationships. The NN provides
a way to recognize patterns in the large number of control variable inter-relationships that even experienced auditors cannot express as a logical set
of specific rules. The NN was trained using actual case decisions of practicing auditors. © 1997 Elsevier Science B.V.

Keywords: ​Expert network; Neural network; Expert system; Control risk assessment; Auditing
variables in a problem increase and the speci- ficity and measurability of
efficacy relation- ships diminishes, difficulties are exacerbated. The
1. Introduction ure suggests that builders of systems to support complex decision-making
draw from and at- tempt to combine a multitude of paradigms (for
Many decision-making tasks do not lend them- selves to formulation
le, decision support, neural networks, case- based reasoning, and models
through the sole use of quanti- tative models, nor simple intuitive problemules or objects) (Silverman, 1995; Beulens and Van Nunen, 1988; Turban
solving (Rosenhead, 1992). Building and incorporating quali- tative Watkins,
and 1986; Yoon et al., 1993). This paper analyses the process of
quantitative reasoning and modeling into a decision-aiding system is a challenge
ng a prototype intelligence-based system for application to a com- plex
for practition- ers and researchers (Gupta, 1994; Silverman, 1995; I.iberatore
m within
and the field of auditing. The proto- type system is constructed as a
Stylianou, 1993, 1995). As the num- on-support tool for auditors analyzing the internal control (IC) struc-

"Corresponding author. E-mail: amassey@juliet.ucs.indiana. edu.

0377-2217/97/$17.00 © 1997 Elsevier Science B.V. All rights reserved. ​PII ​S0377-221 7(97)001
25-2
J.T. Davis et al. / European Journal of Operational Research 103 (1997) 350-372 ​351
paradigms ​- ​expert systems (ESs) and neural networks (NNs). A review of
the literature suggests that AI tech- nologies such as ESs and NNs have
ture of a business entity in order to derive a prelimi- nary control risk
assessment (CRA). It is designed as an ​expert network t​ hat combines two AIfound only limited use as decision aids in audit practice. More- over,
developers of such systems have generally relied on singular means them,
of but where there is developing interest and a potential for bene-
knowledge representa- tion and reasoning. Intelligence-based systems be accrued
in from their use; and (2) develop- ment of an integrated ES
auditing have been primarily employed as traditional knowledge-basedNN prototype system, the ​CRA Expert Network, ​constructed for a
systems (O'Leary and Watkins, 1991; Gray et al., 1991; Eining et al.,sed 1994; environment using tools that lead to a PC-based product
Smith, 1996), traditional statistical systems (Bell et al., 1990; Einingpriate
et al.,for field experimentation.
1994; Scott and Wallace, 1994), and as stand-alone neural network systems The remainder of the paper is organized as fol- lows. Section
(Bell et al., 1990; Coakley and Brown, cribes
1993). the nature of the audit task, providing the background to the
This paper forms a part of a wider study into the increasing on
use of
of an expert network approach. Section 2 also describes the
AI technologies in audit practice. As for any system, effectiveness plesand behind the integration of ESs and NNs and introduces the
efficiency gains are not automatic or achievable without users accept- ype ing​CRA Expert Network. ​Section 3 details the design and
the system as a decision aiding tool. Thus, system design, development and
uction of the prototype system, including a description of an ex-
implementation are key factors in promoting acceptance (Gillett ent et al.,
conducted with practicing auditors used to obtain judgment data
1995). Although design and development include, for example, traditionalining the NN. Section 4 contains brief concluding comments.
system aspects such as the design of the user interface, etc., it may be that
the characteristics of the underlying system technologies - here ESs and NNs
- also have a major impact on user acceptance (Davis, 1994; Einingrview et al.,of domain and system architecture
1994). More specifically, users such as auditors may be leery of technology
that appears to take the decision out of their hands. However, acceptance and When a public accounting firm audits a business entity, the
use may be enhanced as users gain a better understanding tial of exists
the that the auditors may not discover material misstatements in
characteristics, benefits and limitations of these intelligence-based systems,
ntity's finan- cial statements. The likelihood of not discovering a
and the role of the tech- nology as a decision aid as opposed to a decision
icant misstatement is called ​audit risk. ​During the audit process, the
maker. ess entity's internal con- trol (IC) structure is evaluated to determine
Thus, a motivation for the development of the prototype ature,
systemtiming and extent of audit tests to be per- formed that will be
is to enable future research de- signed to explore practitioners' acceptance
effective in reducing audit risk. The purpose of the IC structure is to
(or rejcc- tion) of such systems. However, the objective of the research
nt and/or detect erroneous, fraudulent, or missing ac- counting
presented in this paper is narrower in scope. More specifically, thisctions.
paper Consequently, a complete and proper assessment of this
focuses on: (1)demon- strating the appropriateness and applicabilityure ofisancritical to a successful audit.
expert network within audit practice; and (2) describ- ing the methodology
The attempt to build intelligence-based systems to support
employed to design and de- velop the prototype system.
judgment tasks, such as control risk assessment (CRA), is dependent
This paper, therefore, contributes to the literature on the
e acquisition and representation of knowledge and heuristics gained
practical application of AI-based systems to audit judgment tasks in at least
gh audit experience (Deng, 1994). Gen- eral audit theory and heuristics
two ways: (1) consid- eration of the application of hybrid intelligent com-
e captured and represented using logical constructs. But, obtaining
puting techniques to an area of auditing that is at present making almost no
352 ​J.T. Davis et al./ European Journal of Operational Research 103 (1997) 350-372
RA task, the prototype system is constructed as an ​expert network
knowledge from auditors for making small yet im- portant distinctions flecting the integration of an expert system (ES) and neural network (NN).
between situations in the form of specific rules may be nearly impossible.the ES incorporates general audit theory and well-known control
riefly,
lationships using a logical set of explicit rules. The NN in the ​CRA Expert
That is, assessing the IC structure potentially requires evalu- ating hundreds
likely ​is used to recognize and establish patterns among the large number
of variables with thousands of possi- ble inter-relationships. It is notetwork
done by a serial, step-by-step reasoning process, but rather by recognizing
inter-related variables inherent in the task. When exercised, the ES
patterns in a given situation and reacting appropriately based on experience
ovides the user interface and evaluates the complexity and basic structure
(Dreyfus and Dreyfus, 1986; Anderson, 1983). Evaluating these an many
entity's internal controls. If the ES determines that the IC structure is
complex relationships is a difficult task even for the most experiencedufficiently complex to war- rant further analysis, the data collected during
auditors - articulating them is essentially impossible. However, ter- neural
action with the ES is fed as input into the NN. The NN is stimulated
network systems can be used to automate judgment tasks that require y thethisinput data and provides a preliminary CRA. The NN evaluates
pattern recognition. pproximately two hundred variables with thousands of potential
ter-relationships used to make a preliminary CRA. The following section
Thus, the innate complexities of the audit judg- ment task suggest
xamines
that an appropriate approach to system design is to integrate sub-systems of the principles be- hind expert networks such as the C ​ RA Expert
​ ork.
et- w
AI techniques each of which address distinct aspects of domain knowledge
and reasoning (Lymer and Richards, 1995). Given the characteristics of the
,sognitive models, focused on representations of knowledge,
est that human problem solvers or- ganize knowledge through the use
2.1. Expert networks: An integrated approach
opositions. A proposition, in the form of an ​if-then ​rule, is the atomic
ing block of rule-based expert systems. Domain knowledge - generally
An expert network is one form of integrated systems designed to a sin- gle or multiple subject matter experts - is codified into a
ed from
address limitations of using a of linearly executed discrete rules (Solso, 1988; Greeno, 1973;
single representation and reasoning approach (Lymer and Richards, 1995; on-Laird, 1983; Minsky, 1986). However, significant development
Frisch and Cohn, 1991). ESs - based on symbolic computation - are best ersatarise because experts often cannot articulate knowl- edge and
model- ing structured problem domains (or aspects thereof) that conform lex relationships as discrete rules. Only when asked do they produce a
well to logical constructs. Conversely, NNs - based on numerical ication for the judgments made. Even then, the justification is a
computation - can suc- cessfully model problems that do not conform well alization that more than likely has a fair num- ber of exceptions (Deng,
to explicit logical constructs. It is when these situa- tions cross, such as). in
the CRA application, that the combined capabilities of a blended solution Propositional logic is rarely a sufficient means to represent
should be considered (Caudill, 1991; Medsker and Bailey, 1992; Medsker, lex reasoning (Kunz et al., 1987; Jackson, 1990). Rather, ESs are a
1994). By combining the deductive reasoning approach of an ES with ularly the good approach for closed-system applications that have literal
inductive approach of a NN, difficult and somewhat unstruc- tured tasks recise inputs that lead to logical outputs (MacLennan, 1993). However,
may be performed. Integration takes ad- vantage of the strengths of each are relatively inflexible since performance degrades sharply when they
type of system, while mitigating the inherent weaknesses of each when usedpplied to problems outside their original scope (Jackson, 1990; Kunz et
alone. Let us briefly examine the subsys- tems of an expert network. 987; Edwards and Connell, 1989, p. 25). Moreover, changes in domain
ledge structure and content often re-
2.1.1. Components of expert networks
J.T. Davis et al./ European Journal of Operational Research 103 (1997) 350-372 3
​ 53
quire substantial system modifications to continue or enhance system viability.
Conversely, NNs can analyze large numbers of inter-related variables to establish patterns and char- acteristics in situations
where precise, discrete rules are not known or discernible (MacLennan, 1993). Rather than depending on explicit knowledge
as expressed by a domain expert, NNs model the im- plicit relationships in exemplar domain data. The continuous nature of
the stimulus/response approach allows for efficient modeling of complex tasks. Sim- ply put, a neural network discovers its
own numeric, continuous rules as it learns through the examples it is provided. NN systems may be able to perform certain
types or parts of audit judgment tasks that are difficult and perhaps inappropriate for the capabili- ties of other types of
intelligent systems.
An artificial NN consists of processing elements linked via weighted uni-directional signal channels called connections to
form a distributed, parallel processing architecture (Rumelhart and McClelland, 1986; Hecht-Nielsen, 1990). Each
processing ele- ment can possess local memory and carry out local- ized information processing operations. The process- ing
element or neuron is the atomic building block of the NN (Fig. 1). NN paradigms differ in how the processing elements are
connected and the manner in which the weights are updated (Markham and Rags- dale, 1995; Rumelhart and McClelland,
1986).
ANN is a statistical modeling technique. How- ever, NNs can make less stringent assumptions con- ceming independence of
variables and the shapes of underlying data distributions than other statistical techniques, e.g., regression or multiple
discriminant analysis (Rumelhart and McClelland, 1986; Lipp- mann, 1987; Lacher et al., 1995). Rich discussions of the
statistical aspects of NNs and their relation to more traditional statistical models may be found in Ripley (1993, 1994),
Cheng and Titterington (1994) and Sarle (1994).
While NN systems are flexible in terms of fault tolerance to missing or noisy data, they do not have some basic
characteristics for flexible precise com- monsense reasoning, e.g., symbolic processing or interpretation of internally stored
knowledge (Sun, 1994, p. 247). Unlike ESs, NNs have no inherent explanatory function 'module'. This has hindered
acceptance in practice as it is not clear to a non-tech- nical user how the network derived a given conclu- sion. However,
research is being conducted to ex- tract comprehensible symbolic representations from trained NNs (e.g., Craven and
Shavlik, 1996).
2.1.2. Relationship of the CRA expert network to existing systems
A number of hybrid ES and NN systems have been described in the literature. Kandel and Langholz (1992), Gallant (1993),
Medsker (1995), and Sun
Xo= ​(Bias) 1​ ​Neuron or Processing Element Inputs
(Outputs from other neurons or processing nodes)
XI
Summation ] Transfer Function ! Function
e.g. Sigmoid
Fig. I. Neural network processing element structure.
Neuron or XL 1 Processing ​Element
Output
354 ​J.T. Davis et al./ European Journal of Operational Research 103 (1997) 350-372
Medsker, 1994). The ES provides
and Bookman (1995) have compiled research con- cerning the integration ser interface
of and conducts preprocessing of data that is fed into the NN.
symbolic and numeric systems. These compilations include descriptions the NN
of performs its pattern-matching activities, output is passed back to
expert networks applied to application domains such as natural language, ESsignal
for display. Advantages of loosely coupling the components of the
processing, biology and medicine, management, and engineering (see type system are twofold: (1) system development is amenable to
particu-
larly Medsker, 1995, pp. 39-56). mercially available software; t and (2) maintenance is reduced due to the
icity of the data file interface ap- proach. The disadvantage of a great deal
The relationship between the ES and the NN in t​ he CRA Expert
dun- dancy that usually accompanies loosely coupled sys- tem has been
Network i​ s structured similarly to systems reported by Lin and Hendler y(1995),
avoided in the CRA Expert Network, perhaps due the nature of the
Macln- tyre et al. (1995), and Bataineh et al. (1996). Lin and Hendler (1995) use
em more than any other reason. Before proceeding to an in-depth
a NN to classify ballistic signals and the output is passed on to the ES for further
ssion of the structure of the CRA Expert Network, the following two
processing and interpretation. Using the same neural network software employed
ns describe the knowledge - and sources of that knowledge - reflected in
by the C ​ RA Expert Net- work, ​Maclntyre et al. (1995) and Bataineh et al.
S and NN components, respec- tively.
(1996) present expert networks for application within the electric utilities
industry. The ESs in these appli- cations also provide processing inputs to the
Expert system development
NNs.
Thus, while expert networks (and other forms of integrated systems)
The rules for the ES were primarily derived from structured logic
are not an explicitly new tech- nique, we have found no evidence of the
uestions found in Grant Thom- ton's internal control documentation and
technique being applied to audit judgment tasks such as CRA. However, given
ation software - ​Information and Control Understanding System
the characteristics of the task, there is significant scope for consideration of their
applica- bility to this domain. Furthermore, the ability to construct suchcus)systems​(Grant Thornton, 1992), which is used in audit and consulting practice.
us ​
using PC-based tools should facilitate experimentation with them in the field. not make any risk assessment itself and is not an AI decision tool.
does
The following sections detail the design and construction of the ​CRA ver, Expert it provides general relation- ships among IC variables to assist the
or in structuring the preliminary CRA process. The knowl- edge and logic
Network.
focus w ​ ere formalized in a logical rule structure. The interface provided by
S allows the auditor to document the IC data of a given client. It should be
, however, that the user interface screens that are provided to an auditor
3. Control risk assessment (CRA) expert network

As introduced earlier, the prototype ​CRA Expert Network ​was


constructed by integrating an ES and NN. The ES and NN components The ES component of the CRA Expert Network was devel- oped
respectively ad- dress two types of knowledge and logic relevant to auditors: (1) Visual Basic 3.0. This tool provided the means to: (a) develop an
Microsoft
well-known audit relationships - ​ap- proximately 20% of the distinct t-oriented
internal GUI user interface; (b) encode the well-known audit relationships
control structure variables - encoded as l​ ogical rules i​ n the knowledge-base;
ule structure that logically drives data collection; and (c) directly link to a
and (2) more c​ omplex control t;ari- able relationships ​determinedosoft Access 2.0 database. The data collected as an auditor interfaces with
by the
'trained' NN. S component is stored in an Access 2.0 database file. If warranted, an
The ​CRA Expert Network u​ ses a loosely coupled modelI of textanfile of this collected data is fed to the NN as input. The NN,
integrated system in which the CRA process is decomposed into separate ES lated by these input variables, derives a CRA. The NN was created in
and NN components that communicate via data files (Medsker and Bailey, 1992; Professional I1/Plus (NeuralWorks, 1992).
alWorks
J.T. Daois et aL/ European Journal of Operational Research 103 (1997) 350-372 3​ 55
en client. Use of ​Infocus' l​ ogic and question was necessary because ​lnfocus ​was
during a given session are controlled by the logic structure encoded in the ESinte-
- thatgral part of the knowledge acquisition process con- ducted with audit
is, data is neither requested of a user or screens provided that are not relevant to-a whose knowledge was ultimately used to train the NN. In addition,
niors
lnfocus a​ lso provides a real world, accepted and tested basis for the majoritysenior
of the auditors from Grant Thornton (see Davis, 1996). These subjects averaged
ES logic. 5 years of audit experience and bad completed a CRA an average of 37 times.
Added to the knowledge-base - that is not from ​Infocus' ​logic and Each auditor was given one of three client cases. The first represented
questions - are CRA threshold rules and rules translating the preliminary CRA actual
out-small sized client of the firm. The other two were based on example cases
und in
put of the NN to a CRA category. The CRA thresh- old rules are used to control thethe ​SAS No. 55 Audit Guide (​ AICPA, 1989). All three cases involved
erchandising enti- ties. The preliminary CRA was restricted to the sales stream
current assessment of control risk while a session is in progress. For example,
depending on the earliest data collected by the ES component and the thin the revenue cycle of each entity. The financial statement assertion group
current
threshold level setting, the ES may determine that further processingnsisted is not of com- pleteness, existence/occurrence, and valuation - core assertions
atedThis
warranted, i.e., the data collection by the ES is halted and the NN is bypassed. to the IC system goal of preventing and detecting errors.
process is detailed in a following section. The three cases were chosen because the entities had computerized
counting systems and reflected situations in which the auditor could potentially
y on all three types of internal controls - ​manual controls, programmed
3.2. Neural network development
ntrols, a​ nd s​ egregation of
duties. 2 H​ owever, the cases represented different situations in terms of firm size,
1. Collection of CRA process data. U ​ sing ​lnfo- cus, ​each auditor began
complexity of control system, and strength of controls. The first case repre- sented
CRA by selecting variables and/or addressing specific questions from a set of
an actual small-sized client, with low control system complexity, yet a fairly strong
ial judgment variables and questions related to the r​ evenue cycle IC
IC structure. This case was also compatible with a purely substan- tive audit
onment,
approach. 3 The two cases from the audit guide represented larger entities than the EDP environment, ​and a​ ccounting controls. ​Values as- signed to
first case. The first, a closely held company, did not have as strong of an IC ed variables and the responses to chosen questions constituted each auditor's
et'
structure as the second, a larger public corporation that had a very strong IC struc-
ture. While both had computerized accounting sys- tems, the public corporation
had a more complex and well-controlled computer system.
Very broadly, while conducting a preliminary CRA, an auditor
raises and reviews questions regard- ing the ICs (e.g., manual, computerized) of an 2 Segregation of duties refers to the principle that an individual
nsible
entity that are designed to ensure that significant misstate- ments on financial for the conduct of a process cannot also be responsible for the controls
statements will be prevented or detected. However, the questions that an auditorated with that process. For example, if an individual is responsible for
ing
chooses to examine are left primarily to the auditor's judgment. Thus, the purpose checks from customers, they should not be also responsible for recording
payments
of working with the 64 auditors was to determine not only what their CRA was for in the accounting system.
a specific case, but also what set of questions each auditor was using during the 3 Essentially there are two audit approaches. In the substantive
CRA process. ach, the auditor elects to ignore the controls in place and focuses their analysis
ly on the numbers represented in the accounting statements. Conversely, in
ntrol approach the auditor focuses first on an analysis of the IC structure of
3.2.1. Data source for training and testing
ent in an effort to reduce the amount of substantive testing that will f​ ollow.
The data used to train the NN was from an experiment conducted with
356 ​J.T. Davis et al./European Journal of Operational Research 103 (1997) 350-372
omplexity also highlights the role that the NN will play in structuring these
elationships.
issues deemed as relevant to the specific case and, ultimately, assessment of
- ​

control risk. The IC poten- tial variables/questions, and the possible values that In addition to identifying their respective cue set, each auditor
each may be assigned, are presented in Appendix A. The appendix reflects the (1) whether s/he planned to rely on manual controls, programmed
ecorded:
complexities and relation- ships of the questions and variables that are poten-
ontrols, and/or segregation of duties - indicating whether more tests of
tially considered by an auditor (including the 64 auditors that participated
ontrolsinas opposed to more substan- tive tests would be used to reach an
the experiment) during the CRA task. Appendix A indicates the percentage of level of audit risk; (2) the existing controls selected as key controls,
cceptable
the auditors, by case, that deemed each variable/question as relevant. e., controls the auditor intended to rely on and test to determine if the control
operating as it should; and (3) any controls that s/he felt were missing.
IC environmental features, EDP environmental features, accounting
controls, and segregation of du- ties operate as inter-dependent parts of the IC
struc- ture. While these variables were included in this experiment andCollection of CRA output data. ​Following analysis of the case, each
2.1.2.
uditor recorded their preliminary CRA in ​Infocus. lnfocus p​ rovides a choice
addressed by the auditors in various fashions, the complex inter-dependencies
four risk categories: ​maximum; slightly below maximum; moderate; a​ nd,
between the variables makes it virtually impossible for an auditor tofexplain
the structure of their cue set, i.e. the relationships and strengths mited.of​These risk categories are consistent with ​SAS No. 55 Audit Guide. I​ n
relationships be- tween items in the cue set, and the impact of this structure
dditiononto the categorical response, each auditor recorded their CRA using a
the CRA. These inter-dependencies serve to illustrate the complexityoint of estimate
the (0 to 100 scale) which provides an indication of how close to
domain and the task facing an auditor when conducting a CRA. Thishe level of border their judgment would be
category
based on the numeric scale. The point estimate corre- sponds to l.theThe network training (within sample) and testing (out-of-sample or
categorical judgment responses as fol- lows: out) data sets each contained 32 observations. Once trained, using the 32
n sample observations, the NN model serves as a proxy for the typical
ledge structure used by the experienced senior auditors in the training
Risk Category Point Estimate Interval
The 32 observations in the hold-out sample were used to test the resulting
model.
Limited (LTD) 0-25 Moderate (MOD) 26- 50 Slightly
below 51-75
. Training and validation of NN
maximum (SBM)
The CRA Expert Network employs a feedforward classification
Maximum (MAX) 76-100 that was trained using the back- propagation learning algorithm
melhart and Mc-

Each auditor's selected cue set and corresponding derived CRA


point estimate 4 make up one observa- tion. Although there were only three
cases, the ob- servations were distinct as each auditor indicated a different cue
set for making their preliminary CRA. For example, it is possible for an 4 While it may seem tvdd to use a point estimate here, the motivation is rather
It allows for the use of a t-test to compare the NN's CRA (a value between 0 and 1) to that of
auditor to rely on programmed controls and choose a CRA of MOD. ditors, rather than relying solely on classification accuracy. A classification NN does not yield
Conversely, another auditor (examining the same case) could choose to rely y in relation to a t-test - just classification accuracy. Furthermore, this approach allows for the
on manual controls and make the same preliminary CRA. ponse to be mapped to the CRA risk categories that are inherent in thc experimental task.

The 64 observations were used to develop and test the NN


J.T. Davis et al./ European Journal of Operational Research 103 (1997) 350-372 3​ 57
Clelland, 1986; Tam and Kiang, 1992). A feedfor- ward NN using sigmoidal activation functions is mathematically capable
of any continuous function and thus is applicable to a large variety of knowl- edge intensive tasks by distributing knowledge
en- coded into link weights that is learned from data examples (Hornick et al., 1989). In the network the processing element
(refer to Fig. 1) takes inputs from other neurons and sums these inputs, X0... X,, us- ing a summation function, ​Li=
EN_oWI,jXj. The t​ ransfer function 'normalizes' the input summation by vectoring it into a predetermined range. The
​ 1/(1 + e-~'). The backpropagation
activation level of the processing element is deter- mined by an activation function: ​X i =
learning algorithm was used to find the functional relationship between the inputs (judgment cues/variables) and the target
outputs (the auditors preliminary CRAs). A variety of net- work configurations were tested during the design phase.
Networks with a larger number of middle nodes learned the training data sample quite well, but performed poorly on the test
data sample. Networks
with a smaller number of middle nodes were too general and performed poorly overall. The final NN architecture has 210
input nodes (control cues/varia- bles), 30 hidden (middle) layer nodes, and one out- put node for the preliminary CRA using
the auditors point estimates within their chosen CRA category. Delta rule summations were employed with sigmoid transfer
functions on the middle and output nodes, while the standard root mean square error (RMSE) function was used for "all
network layers. Training stopped at 1600 iterations using the early stopping technique to avoid overfitting of the training
sample - the RMSE of the sample began to increase instead of decrease at that point. In addition, the NN was analyzed after
training at 4800 iterations. While ac- curacy - both RMSE and category accuracy - on the training sample was better than at
1600 iterations, accuracy on the hold-out sample and overall accu- racy on the total sample declined.
The final trained network (at 1600 iterations) had a Pearson correlation coefficient of 0.869, and a CRA category accuracy
rate of 72% for the training

"E
0.9 ​0.7
0.5
'~ 0.3
0.1 ​o -0.1

o ​O ​-0.3
Network Accuracy - Training Data ​ iii, iiiii il ​ Observations

¢ Desired Output ​I
.._~.... Network Output I ​! ,t Difference J

,¢ ​o ​O Network
​ Accuracy - Testing Data
1 ​0.8 ~ ​:~:~i~i~ii~
0.2 ;~!~:i~i~!~:.~?~:~i~ :N:~! ~: i~. :~4 i~: %:: ~!~ ~i:!-~i!~!i~::~ ~-: :~ ~ ​......... !~:~:;
0
Obs e ​rvations
r_.4k_. Desired Output [ ~ Netw ork Output
,¢ Difference
Fig. 2. Network validation.
358 ​J.T. Davis et al./European Journal of Operational Research 103 (1997) 350-372

General ~. ​Environment Questions ~ [​ ​.... ​1 ​Processing Compu,er


​ ​Overview L
​ ​~ ~ (CPO) r////~, _1 ]Computer I​ General ​Questions Control

Controls~ | (Gce)

Phase 1 ​ ....​t ounting A ​ Controls ~. ​Phase 2


211 Vmmbles

1 2 3 ​.... ​210 Input Layer


Neural i Phase 3
1 2 .... 30 Middle Layer
, -A .... LJ

I ​1 Output
Fig. 3. Expert network: Knowledge base structure with embedded neural network.
data set. Testing using the hold-out sample resulted in a Pearson correlation coefficient of 0.695, with a category accuracy rate of
78%. 5 The risk category prediction errors for the test sample included: four observations that were one category higher than the
assessment by the auditor: one observation each that were one category lower than the auditor, two cate- gories higher than the
auditor, and two categories
The correlation coefficient is a measure of relationship be- tween paired observations in two data sets - here, the relationship between the auditor's point estimate
and the model's point esti- mate for each observation. Category accuracy rates were derived by comparing the NN point estimate output - based on an auditor's
input cue set - to the auditor's chosen risk category. For comparison, error rates for NN models built for financial distress applications (Bell et al., 1990; Tam and
Kiang, 1992) were in the range of 10-23%. However, it is important to note that the models in those studies were much simpler than the NN developed in this
study. In those studies there were less than 10 input variables with two response levels for output. Predicting a four category response with 210 input nodes is a
significantly more difficult problem.
lower than the auditor. Fig. 2 presents the point estimate network accuracy in relation to both the training and testing samples.
A paired t-test (see Koopmans, 1987, p. 325) was also • run on the predicted and actual network output. No statistical difference
was found for the training sample (mean difference =-0.0003, one-tail p- value = 0.48) or the testing sample (mean difference =
0.0239, one-tail p-value = 0.24).
In addition, two classification networks were de- veloped - the first employed backpropagation and the second a radial basis
function - using only the CRA classification categories and not the auditors' point estimates. The hold-out sample classification
accuracy for both these models was approximately 49%. Conversely, as described above, the point esti- mates for the trained
network in the C
​ RA Expert Network w ​ ere within the classification ranges chosen by the auditors 78% of the time. Clearly, for this
data, using a point estimate within CRA category provided superior model precision.

​ 59 ​
J.T. Davis et al./ European Journal of Operational Research 103 (1997) 350-372 3 (a)
! ​! Ask Control Environment
​ Questions
Yes
Yes
Set CRT >= Slightly Below Maximum
PHASE1 ​No

Set CRT = MAX ​ 1​ --No-- ​ I Set ​ Neural ​Manu,, ​ Network

Cootro,---) ​ Inputs = 0 ​I ​I

yes ​I
Set CRT >= Limited
CRA = Current Threshold

I Yes
Computer Processing ​Overview (CPO)

Set CPO & GCQ---~ ​Neural Network Inputs ​


)​= 0 (​ Fig.
​ 4. (a) Expert network processes flowchart; (b) Expert network flowchart.
360 ​J.T. Davis et al. / European Journal of Operational Research 103 (1997) 350-372

(b)
No

Set GCQ Neural ​Network Inputs = 0 ​ l


Rely on Manual ​Controls (MC)

I ​Yes I​
. ~ ~Accounting Controls ​_bJ Questionnaire: -I Set PC Questions = 0 Ask MC Questions

General Controls ​Questionnaire (GCQ)


​ ​No
. S ​-- Cu ent
PHASE2
General Environment ​Variable Values
PHASE1

-'--! ​Rely ​Manual on


​ t ​Controls ​(MC)

Y~-- i l
Accounting Controls ​Questionnaire: Set MC Questions = 0 Ask PC Questions

Accounting ​Controls Questionnaire: Ask MC & PC Questions


___T..____
PHASE3
Fig. 4 (continued).
3.3. CRA Expert Network Design
The ES and trained NN constitute the ​CRA Expert Network. T ​ he system employs a three phased ap- proach to determine a
preliminary CRA as follows:
​ nvironment phase that encompasses the general environment questions, computer pro-
• Phase I. E
cessing overview, and general control questions (computer-related);
• Phase 2. ​Accounting controls phase; and,
• Phase 3. ​The Neural Network phase.
Fig. 3 illustrates the relationships among these three system phases. Fig. 4 provides a flowchart
J.T. Davis et al. / European Journal of Operational Research 103 (1997) 350-372 3​ 61
cluding these
ticsitsas rules in the ES should improve an auditor's control variable
overview of the main logic contained within the CRA Expert Network and
three phases. on consistency as well as inter-auditor consistency.
Phases 1 and 2 address the logical data collection aspect of the The questions are designed to collect information concerning
ntrol environment and computer environment (if any) in which the
CRA Expert Network by incorporating and presenting to the user the judgment
variables/ questions found in Appendix A. As noted previously, the nting con- trols, addressed in Phase 2, must operate. This infor- mation
data
collected during these phases is stored in a database file, which is fedd by the ES, to make (if possible) a preliminary CRA without proceeding
to the
2 and 3. In addition, the information is used to set a ​control risk
ases (as
NN. The NN (during Phase 3) accepts and utilizes values for the following
old
detailed in Appendix A): (1) the Rev- enue Cycle Environment variables; (2) (CRT). ​ he purpose of the CRT is to prevent a final CRA that is
T
the Planned Reliance variables; (3) responses to the Computer Processing a predetermined level, given that certain conditions exist in the client
Overview (CPO) questions; (4) responses to the General Control Questions nment.
(GCO, and; (5) all the Accounting Controls variables. Values accepted by the Broadly, during Phase 1 (see the flowchart in Fig. 4 and the
NN are: a '1' for a Yes response to a question or the indication that an
ic questions found in Appendix A), the auditor is asked if s/he plans to
Accounting Control variable is a key control; a '-1' for a No response to a control environment. ​If not, i.e., the auditor believes that the
n the
question or that a specific Accounting Control vari- able is missing, and; a '0'
l environment is not reliable or testing the controls is inefficient, the ES
indicating that the ques- tion/variable is not applicable to the CRA. AsheFig.CRA3 to maximum (MAX) and the task is completed. This indicates
indicates, there exist 210 values passed to the NN - each constituting a separate
e auditor will conduct a purely sub- stantive audit approach. Conversely,
node in the input layer of the NN. response indicates that a control audit will be conducted and the auditor
However, it should be emphasized here that not n askedall if s/he plans to ​rely on segregation of duties. ​If so, reducing the
l risk 1threshold (CRT) to slightly below maximum (SBM) is justified; if
questions/variables are applicable to all cases. Data is collected in Phases
and 2 via user interface screens - the presentation of which is controlled ebyCRT
the is set to maximum (MAX). In either case, the auditor is asked if
logic structure encoded in the ES. This logic structure is illustrated in Fig. 4.​rely on manual controls (​ MC). If s/he does, the CRT is reduced to
lan to
d (LTD),
The ES does not continue along a data collection path that is not relevant. For and the system is triggered to obtain information concerning
uring
example, if the answer by the auditor to a particular CPO question is 'No', thenPhase 2. If no reliance is planned, the CRT remains unchanged and
the related GCQ questions are not applicable to that case. The ES C questions
then are omit- ted from Phase 2. The control variables related to
bypasses the related GCQ questions - the user never sees those screens mitted
- andquestions will be input to the NN as '0s' (i.e., not applicable).
assigns '0s' (which are passed to the NN), representing not appli- cable, to The next set of questions concerns whether ​any part of the
those GCQ questions (see Fig. 4(b)). The following sections present specific
nting system is automated. If t​ here is no planned reliance on MC and
details concerning each phase. of the accounting system is automated, the CRA is set to the current CRT
he task is complete. If there is automation, the Computer Processing
3.3.1. Phase 1: Environment phase iew (CPO) will be completed, thus gathering information about the
al
During the environment phase, information neces- sary for the computer
ES environment and applica- tion complexity. The responses to
questions
to apply broad internal control audit heuristics is collected. These heuristics determine thrce things: (1) the level of computer environment
2) the
should be well known to most auditors. However, the possibil- ity exists that level of computer applica-
one or more of these heuristics could be omitted or misapplied. Thus,
362 ​J.T. Davis et aL / European Journal of Operational Research 103 (1997) 350-372
completion.
tion complexity, and; (3) the relevant questions to be included in the General Finally, after completing the General Controls Questionnaire
to ​relythe auditor (usually consulting with the computer audit specialist) will
Controls Questionnaire (GCQ). At this point the auditor may decide not CQ),
on programmed controls. I​ f this is the case, no GCQ questions are asked and another chance to reject reliance on programmed controls. If the
given
all the GCQ inputs to the NN will be set to 0. In the event that manualditoranddecides that computer con- trois are too weak to support reliance on
programmed controls are not relied on, the system sets the CRA to the o-
current
grammed accounting controls, all the GCQ inputs to the NN will be set to
value of the CRT and exits. Conversely, if the auditor is relying solelyThe onsystem then moves on to Phase 2 for the manual accounting controls
manual controls, the system sets the program control NN input variables to 0,
orma- tion, if the auditor has previously indicated reliance on manual
and proceeds to Phase 2 where it asks the manual controls questions.counting
If the controls. Once again, in the event that manual and programmed
auditor elects to rely on program controls, the GCQ questions are presented
controls are not relied on, the system uses the current CRT to com- plete
estimate
the from the NN does provide information to the auditor as to how
CRA task. the NN response is to a risk category border. Whether this information
tely contributes to an auditor's Judgment and/or leads to better system
3.3.2. Phase 2: Accounting controls" phase ance are empirical questions.
The accounting controls phase includes the com- pletion of the
manual accounting control questions, the programmed control questions,rections
or for future development and research
both depend- ing on the decision path taken during Phase 1. If either the
manual controls or the programmed were considered not applicable in Phase 1, The prototype system offers the opportunity for
mentation
the ES sets the corresponding NN input values to 0 and does not present the with practicing auditors, and pro- vides a basis from which
user with any questions related to these controls. If either or both rds are
may be estab- lished for a task in which no specifically set proce-
and rules
considered relevant, the appropriate Accounting Controls Questionnaire is are currently delineated. Future de- velopments will center on
presented to the user. sing current limita- tions in the prototype CRA Expert Network.
The questions for Phase 2 relate to potential accounting controls The first limitation relates the knowledge acquired and
(see Appendix A). The auditor responds to each question by labelingented eachin the system. More specifically, the system is based on a limited
individ- ual potential control as a key control (coded as a '1' for the neural
r of cases and the fact that the auditor subjects were from the same
network); missing control (- l); or not applicable (0). Upon completion,
However,
the a sizable set of cases and additional auditors could be obtained
CRA Expert Network proceeds to Phase 3. audits already com- pleted and/or by providing auditors with additional
3.3.3. Phase 3: Neural network cenarios.
Assuming the auditor chooses to rely on manual accounting A related issue is whether the NN model using point estimates
controls, programmed accounting con- trois, or both, the responses collectedcategories provides a more precise decision-aid than a pure
from Phases 1 and 2 are fed into the NN (Fig. 4). Each question from the ESication is model. For this data, the extra information as to how close to a
mapped to one of the 210 NN input nodes. Given this input, the NN derives rya border the auditors' judgments were, seemed to give the point
prelimi- nary CRA - an evaluation among 0 and 1 - which is presented to te themodel an advan- tage over the pure classification models. Of course,
user by the CRA Expert Network. In addition to presenting the numerical a point estimate within category assumes more judgment precision on
evaluation, the ES transforms the response to the CRA risk cate- gory. rt Forof the auditors. There may be a trade-off between judgment precision
example, if the response was 0.52, the risk category would be slightly below
maximum (SBM), although it is close to the SBM/MOD category border. The
J.T. Davis et al. / European Journal of Operational Research 103 (1997) 350-372 3​ 63
owever, the same control may apply to more than one accounting process.
generalizability between the point estimate within the categories approach he system does not take into consideration these com- pensating controls -
and the purely classification approach. This empirical question should ntrolsbethat are in operation later in the accounting process stream that may
tested in future research particularly with regard to acceptabil- ity by the overall effectiveness of the IC system. Further design efforts
engthen
auditors. ll involve including this assignment of controls to processes and
Second, most auditors are not computer audit specialists. When objectives. Efforts in this vein may take the form of additional
counting
the computer processing overview (CPO) indicates a complex system, les in athe knowledge-base, more neural network variables (and, perhaps
computer audit specialist should be consulted. The computer audit specialisten a different NN architecture), and more than three levels or values that a
assists the auditor in deciding whether the computer controls are adequate rticular
to control variable could be given.
support reliance on accounting programmed controls. Thus, efforts are Currently, research efforts with regard to the CRA Expert
etwork involve two main areas: (1) estab- lishment of experiments with
underway to capture and represent in the system this specialized knowledge.
ditors
Third, the system is currently narrow in scope. In addressing the in the field, and (2) design and development efforts focused on
risk assessment task, only the sales stream within the revenue cycle is hancements to the user interface and the construc- tion of an explanation
odule.
included. How- ever, the sales stream is general to nearly all busi- nesses While the NN pro- vides the capability to model complex
ationships
and important in most all audit situations. In order to increase the usefulness
of the system, a set of integrated systems each designed to deal withand derive a conclusion, it does not provide any justification for
particular parts of all the accounting cycles should be developed. nclusion. A major concem for the auditor is being able to document
udg- ments in order to defend his/her judgment, if neces- sary. For
Additionally, the system does not include assign- ment of
le, if the auditor agrees or disagrees with the suggested CRA, they
accounting control variables to particular accounting processes or control
be able to docu- ment their reasoning. This dilemma is similar to the
objectives. The NN only considers whether a particular control is a key
ma faced by the medical profession for using intelligent systems as
control and should be relied on, is a missing control, or is not applicable.
on aids. However, inter- pretation of how a NN produced a particular
conclu- sion is not a trivial task, given that the very purpose of using a NNtsisinto the decision process; decision-aiding support: consistency of
to model complex relationships that are not well understood. Much research sions; and, increased productivity (Borthick and West, 1987).
in this area is left to be done. In conclusion, the ​CRA Expert Network t​ akes advantage of both
eductive approach of an ES and the inductive approach of a NN to
de a unique decision aid designed to support and facilitate the process
nducting a CRA. The system is tailored to auditors requirements in
4. Conclusion
of data collection and analysis and allows an auditor to take a more
dered and structured approach. The system offers audit firms the
The creation of ​the CRA Expert Network p​ rovides a good tunity to improve upon the accuracy of the CRA by capturing the
example of the potential applicability of integrate d AI technologies to audit
tise of multiple auditors and multiple client experiences. Finally, the
practice and demonstrates the appropriateness of the integrated techniquee toof the system en- courages efficiency in the analysis process as it
the specific audit judgment task of assessing control risk. Potentially, ws a logic-driven data collection path, request- ing the user to only
intelligent systems such as this can provide several benefits to audit firms,
ss questions that are logi- cally required to make a CRA. The
including: preservation, replication, and distri- bution of expertise; newation results to date indicate that an expert network shows promise
364 ​J.T. Davis et al. / European Journal of Operational Research 103 (1997) 350-372

for addressing the large complex judgment processes inherent in ppendix


control A. List of judgment variables/ questions 6
risk assessments. While intelli- gent systems are domain and task specific,
this ap- proach is conceivably transferable to other large complex
1. General environment questions (Phase 1)
judgment tasks.

Revenue cycle environment variables. ​Volume of transactions:


High, Medium, Low (59%, 100%, 100%);
Acknowledgements
CPO question: Is a third party service organi- zation
to process all transactions involving electronic data
The authors wish to thank Dr. George W. Krull, Jr., Nationalssing? (n/a, n/a, n/a)
Director of Professional Development at Grant Thornton, whom was the
s the service auditor reported on the processing of transactions? (n/a,
driving force behind the experiment presented as part of this research. The
n/a) Does the report only cover policies and procedures in place?
experiment took place in conjunction with the firm's national professional
a, n/a, n/a) Does the report test the policies and procedures? (n/a, n/a,
education program. Dr. Kmll also provided valuable comments with
) Do we have a copy of the report? (n/a, n/a, n/a)
regard to the experiment. In addition, we appreciate the direction provided
by Stephen Yates, Partner and National Director of Advanced Audit
Techniques, who pro- vided the cases, instruction and opportunity to 6 Each of the following represent an input node to the Neural Network. There are
utilize Infocus for the study. input nodes, including three each for volume and dollar value (high, medium, low). These
uts can take on values of 1 (Yes or Key Accounting Control), "- I' (No or Missing Accounting
Dollar value of transactions: High, Medium, Low (63%, 93%,
ntrol), and "0' (Not Applicable to the CRA). A 'Yes' response to a particular CPO question
100%); Whether part of the cycle is automated (100%, 93%, 91%); l trigger a corresponding GCQ analysis to determine the status of a GCQ objective. The
Whether management override is a concern (81%, 60%, 61%); Whether centages in the parentheses, following each variable/ques- tion, represent the percentage of
auditors in the experiment that had included and addressed this item in their "cue set'. The
client has an internal audit staff that works to prevent or detect material
er respectively reflects values for the three experimental cases - small client, nonpublic client,
misstatements in the financial statements (89%, 33%, 87%).
public client. An n/a indi- cates that the item was not applicable to the case.
CPO question: Are the client's computers in a dedicated
Planned reliance on type of controls. cal area or facility? (n/a, 80%, ​87%)
Whether reliance is planned on manual controls (required) Whether
objective: Computer system physical secu- rity is adequate.
reliance is planned on programmed con- trois (required) Whether
ntry to the computer and supervisor terminal areas restricted to those
reliance is planned on segregation of duties (required)
ff required for com- puter operations? (n/a, 80% 87%) Are lists of
horized personnel maintained and kept up-to-data? (n/a, 80%, 78%)
A.2. Computer processing overview (CPO) and gen- eral log of all visitors maintained and reviewed? (n/a, 80%, 78%)
controls questionnaire (GCQ) company wide (Phase I)
J.T. Davis et al. / European Journal of Operational Research 103 (1997) 350-372 3​ 65
d for production runs? (n/a, n/a, 96%) Are application
Do procedures exist to control access by non-DP personnel (e.g. grammers prohibited from set- ting up and operating the computer,
engineers, janitors)? (n/a, 80%, 87%) Are DP management and n during program testing? (n/a, n/a 96%), Does management
rove development at key stages: feasibility systems proposals,
security personnel noti- fied when DP staff /eave the client's employ-
ment? (n/a, 80%, 87%) Are employees who work outside normal sys- tems proposals, detailed system design including systems
line
opera- tion hours properly authorized and adequately su- pervised? cifications, parallel running or system acceptance testing? (n/a, n/a,
(n/a, 80%, 87%) Is the computer room physical security adequate %)to
restrict unauthorized access to program and data files? (n/a, 80% 87%)
mputer Operations? Is access to operators' consoles restricted to
m- puter operators and other authorized staff?. (n/a, 93%, 96%) Are
CPO question: Is there a separate EDP depart ment? (n/a, 93%,
mputer operators restricted from performing programming functions
96%)
unning unauthorized jobs? (n/a, 93%, 96%) Are computer
GCQ objective: Separation of EDP duties is ade- quate. rators excluded from access to cash and accounting source
uments? (n/a, 93%, 96%) Is access to production source libraries
Systems Programming? Are systems programmers prohibited from
com- puter operations' personnel restricted? (n/a, n/a, 96%) Are
operat- ing the computer system when production files or application
s or records of computer system activity (jobs and program runs,
programs are resident? (n/a, n/a, 91%) Are systems programmers
uns, abnormal termina- tions of jobs and programs, system console
prohibited from chang- ing production files and programs? (n/a, n/a,
ra- tor commands) maintained and reviewed? (n/a, 93%, 96%) Are
96%) Are responsibilities for various processors or soft- ware products
tem log exceptions investigated and results of the investigation
periodically rotated among mem- bers of the systems programming
umented? (n/a, 93%, 96%) Are system utilities (ZAP, Super-ZAP,
staff?. (n/a, n/a, 70%) Are systems programmers' activities logged?
) in use and appropriately controlled? (n/a, 93%, 83%)
(n/a, n/a, 87%) Are systems programmers' activity logs reviewed by
management? (n/a, n/a, 96%) Are system utilities (ZAP, Super-ZAP)
ut/Output Scheduling? Is the preissuance of files for night and
in use and appropriately controlled? (n/a, n/a, 87%) Is there a written
ekend shifts in accordance with approved job schedules? (n/a, 93%,
computer development strategy? (n/a, n/a, 52%) Are there adequate
%) Is authorization required for unscheduled job re- quests? (n/a,
reporting procedures to moni- tor the progress of computer
%, 87%) Are any scheduling overrides automatically logged and
development? (n/a, n/a, 39%)
ject to review? (n/a, 93%, 91%) Are all terminals closed down at
Application Programming? Is development work carried out by using end of opera- tions? (n/a, 93%, 96%) Is there a secure location for
separate source libraries, separate object libraries, and sep- arate sitive output (e.g. check printing)'? (n/a, 93%, 91%)
development machines? (n/a, n/a, 74%) Are application programmers
rary maintenance? Is formal authorization required to transfer pro-
restricted from ac- cessing program libraries or data files which are
366 J.T. Davis et aL / European Journal of Operational Research 103 (1997) 350-372
Is the transfer of files to the production library carried out by computer operations staff? (n/a, n/a, 78%)
Are all additions to the utility library authorized? (n/a, n/a, 83%)
Do header labels hold reference and control data (i.e. file name, generation numbers, volume ID, time/date stamps on creation, expiration dates, control totals)? (n/a, n/a, 78%)
Does program library software, if any, monitor and record program changes? (n/a, n/a, 91%)

Are the activities of the EDP department (including management) appropriately segregated?
Is there an organizational plan that defines and allocates responsibilities and identifies lines of communication? (n/a, 93%, 70%)
Are the responsibilities of the EDP department, accounting department and other DP users clearly defined? (n/a, 93%, 96%)
Are there appropriate procedures for: rotation of duties/shifts, holiday arrangements, and termination of employment? (n/a, 93%, 91%)
Are source documents handled within data processing only by data preparation and data control staff? (n/a, 93%, 87%)

Is a lack of segregation compensated for by increased management supervision?
Is there independent management review and approval of various data processing functions? (n/a, n/a, n/a)
Are there controls over certain privileged functions, such as requiring that all program library updates are assigned to designated personnel? (n/a, n/a, n/a)

CPO question: Can separate users access the system concurrently? (19%, 93%, 91%)

GCQ objective: Access control over programs and data files is adequate.

Is access control software used to restrict access to the production programs?
Are master lists of authorized personnel maintained, indicating restrictions on access? (n/a, n/a, 91%)
Are access and security features within the operating system utilized to restrict access to and amendment of program files? (n/a, n/a, 91%)
Is terminal activity automatically monitored and rejected access attempts reported? (n/a, n/a, 91%)
Does the access control software identify each terminal via the logical address? (n/a, n/a, 91%)
Does the access control software identify each terminal via the physical address? (n/a, n/a, 91%)

Is access control software used to restrict access to the data files?
Are master lists of authorized personnel maintained, indicating restrictions on access? (19%, 93%, 91%)
Are access and security features within the operating system utilized to restrict access to and amendment of program files? (19%, 93%, 91%)
Is terminal activity automatically monitored and rejected access attempts reported? (19%, 93%, 91%) (a minimal illustration of such monitoring follows this objective)
Is there encryption of all sensitive data files? (15%, 93%, ...)
Does the access control software identify each terminal via the logical address? (11%, 93%, 87%)
Does the access control software identify each terminal via the physical address? (11%, 93%, 87%)
For data base applications, is access to the data possible only through the DBMS? (15%, 93%, 91%)

Are reports generated by such software reviewed by management?
Is the record of jobs actually run, and console logs, reviewed by supervisory personnel? (15%, 93%, 87%)
Is the computer usage summary produced and reviewed? (15%, 93%, 87%)
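The master-list and rejected-access questions above describe a simple monitoring pattern: compare logged terminal activity against a list of authorized users and report exceptions for management review. The sketch below is purely illustrative; the data layout, field names and report format are assumptions, not part of the original questionnaire or of the authors' system.

```python
# Illustrative only: checking terminal access attempts against a master
# authorization list and summarizing the exceptions for review. The field
# names and data layout are hypothetical.

from collections import Counter

# Hypothetical master list: user -> set of terminals the user may sign on from.
master_list = {
    "jdoe": {"T01", "T02"},
    "asmith": {"T03"},
}

# Hypothetical extract of the terminal activity log.
access_log = [
    {"user": "jdoe", "terminal": "T01", "accepted": True},
    {"user": "jdoe", "terminal": "T07", "accepted": False},
    {"user": "ghost", "terminal": "T03", "accepted": False},
]

def exceptions_for_review(log, authorized):
    """Return attempts that were rejected or fall outside the master list."""
    flagged = []
    for entry in log:
        allowed = entry["terminal"] in authorized.get(entry["user"], set())
        if not entry["accepted"] or not allowed:
            flagged.append(entry)
    return flagged

if __name__ == "__main__":
    flagged = exceptions_for_review(access_log, master_list)
    by_user = Counter(e["user"] for e in flagged)
    print(f"{len(flagged)} access exceptions to report for management review")
    for user, count in by_user.items():
        print(f"  {user}: {count} flagged attempt(s)")
```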
A.3. Computer processing overview (CPO) and general controls questionnaire (GCQ) accounting cycle specific (Phase 1)

Note: The next two questions determine whether the environment for the particular accounting cycle is/is not low risk. The next ten questions determine whether the application complexity for this accounting cycle is simple or advanced. (A hypothetical illustration of this kind of screening logic follows.)
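The note above describes a two-stage screening: a small set of answers classifies the cycle's computing environment as low risk or not, and a further set classifies application complexity as simple or advanced. The fragment below is only a hypothetical illustration of how such screening rules over questionnaire responses could be encoded; the thresholds, cut-off and function names are invented and are not the rule base used in the authors' prototype.

```python
# Hypothetical illustration of screening rules over questionnaire answers.
# Thresholds and the >= 3 cut-off are invented for illustration only.

def environment_low_risk(env_answers):
    """env_answers: the two environment screening answers (True = favourable)."""
    return all(env_answers)

def application_complexity(complexity_answers):
    """complexity_answers: the ten complexity answers (True = advanced feature present)."""
    return "advanced" if sum(complexity_answers) >= 3 else "simple"

if __name__ == "__main__":
    print(environment_low_risk([True, True]))         # True -> treated as low risk
    print(application_complexity([True, False] * 5))  # five advanced features -> "advanced"
```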
CPO question: Are transactions for this cycle processed by a third party service organization? (n/a, n/a, n/a)
Has the service auditor reported on the processing of transactions? Does the report only cover policies and procedures in place? Does the report test the policies and procedures? Do we have a copy of the report?

CPO question: Is this application PC-based? (n/a, n/a, n/a)
Is this application running on a standalone PC?

CPO question: Does the client have the source code for the computer programs used in this accounting cycle? (n/a, n/a, 100%)

GCQ objective: Access control over source code is adequate.

Is access to source code controlled?
Is the copying or renaming of sensitive programs or utilities prevented/detected? (n/a, n/a, 91%)
Are utilities, which have been made widely available (e.g. for reporting), restricted to read-only access? (n/a, n/a, 96%)
Does the operating system provide operation logging which records: all use of a compiler and the targeted files, access and amendments to the program libraries, attempts to copy program files to/from the production library? (n/a, n/a, 91%)
If source code is stored off-line on tape or cartridge, is the location physically secure? (n/a, n/a, 96%)
If source code is stored off-line on tape or cartridge, is the individual responsible an individual other than an applications programmer, systems programmer or computer operator? (n/a, n/a, 96%)

Could or does the client make modifications to the source code?
Is there a computer program that reports all changes to each program and program library? (n/a, n/a, 91%)
Are logs of all modifications to the source code maintained? (n/a, n/a, 91%)
Is access to and use of the application software restricted? (n/a, n/a, 87%)

Are source modifications authorized, tested, and documented?
Does testing follow the predefined objectives of the test? (n/a, n/a, 74%)
Does testing use test data files instead of production files for its tests? (n/a, n/a, 91%)
Is a chronological record of all program amendments maintained? (n/a, n/a, 91%)
Is the system documentation updated to reflect amendments? (n/a, n/a, 87%)
Are modifications made by third-party vendors to existing systems adequately recorded and controlled to ensure that all procedures are appropriately updated? (n/a, n/a, 91%)
Is the vendor software tested in the same way as in-house developments? (n/a, n/a, 83%)
Are emergency fixes to production programs fully reported and independently reviewed by management? (n/a, n/a, 96%)
Are the tests of the modified source code properly supervised by management? (n/a, n/a, 91%)
Do the tests of the modified source code use complete computer systems to test the interaction of different programs? (n/a, n/a, 91%)

CPO question: Do multiple applications share the same data bases (files)? (n/a, n/a, n/a)

GCQ objective: Data base application control is adequate.

Is there a data base administrator?
Is the data base administrator responsible for controlling, developing and maintaining the data dictionary? (n/a, n/a, n/a)
Is there an integrated dictionary system?
Is the data base administrator responsible for the development of the data dictionary? (n/a, n/a, n/a)
Are the data base changes requested by the data base administrator approved by some other authority? (n/a, n/a, n/a)
Is the distribution of the data dictionary restricted? (n/a, n/a, n/a)
Does the data base administrator monitor data base usage? (n/a, n/a, n/a)
Are procedures established which allow access to the data base only through the DBMS software and prevent unauthorized access while the data base is not under DBMS software control? (n/a, n/a, n/a)
Does the data base administrator approve and log all changes to the DBMS library? (n/a, n/a, n/a)
Is access to the computer operation area by the data base administrator restricted or supervised? (n/a, n/a, n/a)
Are utility programs under control of the data base administrator? (n/a, n/a, n/a)
Does the data base administrator maintain and review a log of programs run? (n/a, n/a, n/a)

CPO question: Are there real-time updates to files when transactions are entered? (n/a, n/a, n/a)

GCQ objective: Control over real-time updates to files is adequate.

Are control totals from the transaction history file reconciled to the totals from the updated files? (n/a, n/a, n/a)
If memo updating (a method in which the system issues memo transactions to temporarily update a copy of the file and then the actual update is performed in batches overnight) is used, is the updated copy of the file matched against the master file used during actual on-line processing? (n/a, n/a, n/a) (A minimal illustration of this kind of control-total reconciliation appears at the end of this subsection.)

CPO question: Is the data processing function for this accounting cycle decentralized (Distributed Processing)? (n/a, n/a, n/a)

GCQ objective: Control over distributed processing applications is adequate.

Is a software log maintained for all transactions including errors and retransmissions? (n/a, n/a, n/a)
Are data dictionary tools used to document and monitor the responsibilities for various data items or elements among the distributed network? (n/a, n/a, n/a)
Are transaction logs maintained and reviewed for the elements of the distributed data system? (n/a, n/a, n/a)
Are software controls used to prevent update interference on central databases in distributed systems? (n/a, n/a, n/a)

CPO question: Are telecommunications or networks used in this accounting cycle? (n/a, 100%, 100%)

GCQ objective: Control over telecommunications and/or networks is adequate.

Is physical access to the terminals controlled?
Is the physical access of microcomputers or terminals maintained by either equipment locks or control over location? (n/a, 100%, 100%)

Is the use of terminals controlled by passwords?
Are terminals automatically locked out after failed sign-on? (n/a, 100%, 96%)
Are terminals automatically locked out if the terminal is inactive for a specified period of time? (n/a, 100%, 96%)
Are sign-on and sign-off procedures specified and then verified by the computer system? (n/a, 100%, 96%)
Is access to all data files that are in the process of being updated (i.e. 'live' files) controlled by password? (n/a, 100%, 100%)
Do users refrain from using common passwords (first name, spouse's name, birth date, etc.)? (n/a, 100%, 100%)
Do users refrain from displaying passwords externally (e.g. post-it notes) on the terminals? (n/a, 100%, 96%)
Is the use of batch files to log onto the system prohibited? (n/a, 100%, 96%)

Are passwords protected and changed on a regular basis?
Is the display of passwords prevented during log on? (n/a, 100%, 100%)
Upon termination or resignation, are employees immediately denied any computer access? (n/a, 100%, 100%)
Are passwords stored using a one-way encryption algorithm? (n/a, 100%, 96%)
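The real-time-update questions earlier in this subsection (including the memo-updating question) hinge on reconciling control totals computed from the transaction history file with the totals carried on the updated or memo-updated copy. The sketch below is an illustrative reconciliation only; the file layouts, field names and amounts are assumptions, not taken from the paper.

```python
# Illustrative only: recompute per-account movements from a transaction
# history extract and compare them with the control totals carried on the
# updated file. Layouts and amounts are hypothetical.

from decimal import Decimal

transaction_history = [
    {"account": "1200", "amount": Decimal("150.00")},
    {"account": "1200", "amount": Decimal("-25.00")},
    {"account": "4000", "amount": Decimal("310.50")},
]

# Control totals recorded on the updated (or memo-updated) file;
# account 4000 is deliberately out of agreement.
updated_file_totals = {"1200": Decimal("125.00"), "4000": Decimal("300.50")}

def reconcile(history, reported_totals):
    """Return accounts whose recomputed movement differs from the reported total."""
    recomputed = {}
    for txn in history:
        recomputed[txn["account"]] = recomputed.get(txn["account"], Decimal("0")) + txn["amount"]
    differences = {}
    for account in set(recomputed) | set(reported_totals):
        diff = recomputed.get(account, Decimal("0")) - reported_totals.get(account, Decimal("0"))
        if diff != 0:
            differences[account] = diff
    return differences

if __name__ == "__main__":
    print(reconcile(transaction_history, updated_file_totals) or "Control totals agree")
```

In practice the comparison would run against the full master file rather than an in-memory dictionary; the point is simply that each account's recomputed movement should agree with the control total carried on the updated file, and that any difference is reported as an exception.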
Are the system user rights transaction or application specific?
Are system user rights established, monitored and changed by a system or network administrator? (n/a, 100%, 91%)
Is the ability to change system user rights protected by a system administrator or supervisor password? (n/a, 100%, 91%)
Is the system administrator or supervisor password changed frequently? (n/a, 100%, 91%)
Has the system administrator or supervisor password been changed from default password settings? (n/a, 100%, 91%)
Is the confidentiality of the system administrator or supervisor password maintained? (n/a, 100%, 91%)
Is the system administrator or supervisor password encrypted? (n/a, 100%, 91%)

Is the system accessible by modem?
Is data encrypted during transmission and on the files? (n/a, n/a, n/a)
Are the phone numbers changed periodically? (n/a, n/a, n/a)
Are the phone numbers listed on the modem or terminal? (n/a, n/a, n/a)
Are there procedures to prevent line tapping, unlisted numbers and auto connection of dial-up lines? (n/a, n/a, n/a)
Is password and transaction code documentation secured against unauthorized access? (n/a, n/a, n/a)

CPO question: Does the software in this accounting cycle generate transactions or pass information to other cycles? (19%, 93%, 96%)

GCQ objective: Control over computer generated transactions is adequate.

Are the methods used in the program to generate the data and related control record appropriate? (19%, 93%, 96%)
Is there an adequate check over the accuracy of the data generated (e.g. reasonableness check, manual review of data generated, etc.)? (19%, 93%, 96%)
Are the results of the check for accuracy reviewed and approved? (19%, 93%, 96%)
Are controls such as computer sequence checks, computer matching or batch totals used to ensure the completeness of input and/or updates? (19%, 93%, 96%) (A minimal illustration of such a completeness check follows these questions.)
Are exception reports investigated and reconciled? (19%, 93%, 96%)
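The completeness question above refers to computer sequence checks and batch totals. The sketch below illustrates one way such a check could be expressed; the document numbers, amounts and field names are hypothetical and are not taken from the paper.

```python
# Illustrative only: report gaps in a numerical document sequence and any
# difference between the computed and expected batch total.

from decimal import Decimal

batch = {
    "expected_total": Decimal("995.00"),
    "documents": [
        {"number": 1001, "amount": Decimal("500.00")},
        {"number": 1002, "amount": Decimal("250.00")},
        {"number": 1004, "amount": Decimal("245.00")},  # document 1003 is missing
    ],
}

def completeness_exceptions(b):
    """Return missing document numbers and the batch-total difference."""
    numbers = sorted(d["number"] for d in b["documents"])
    present = set(numbers)
    gaps = [n for n in range(numbers[0], numbers[-1] + 1) if n not in present]
    computed = sum(d["amount"] for d in b["documents"])
    difference = computed - b["expected_total"]
    return {"missing_numbers": gaps, "batch_total_difference": difference}

if __name__ == "__main__":
    print(completeness_exceptions(batch))
```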
CPO question: Is there a significant loss of visible audit trail in this accounting cycle? (n/a, n/a, n/a)

GCQ objective: Control over a 'paperless' audit trail is adequate.

For paperless or EDI transactions, are control totals such as batch totals, sequence numbers and 'line item counts' maintained which: track missing transactions, validate completeness, ensure one-time-only receipt of a transaction, match functional acknowledgment reports, review exception reports to ensure corrections? (n/a, n/a, n/a)

CPO question: Have there been any hardware or software malfunctions that resulted in a loss of data in this accounting cycle? (n/a, n/a, n/a)

GCQ objective: Backup control is adequate.

Was data restored by use of backup?
Were there restart facilities which enabled processing to continue from point of interruption? (n/a, n/a, n/a)
Were there procedures which identified the processing stage reached at the time of malfunction? (n/a, n/a, n/a)
Were there controls to ensure that program libraries (most recent versions) and data files (most recent backup or copy) were restored subsequent to the failure or malfunction? (n/a, n/a, n/a)
Was data restored by manual re-entry?
Were controls (batch totals, control totals, etc.) used to ensure that the restoration of data was complete and accurate? (n/a, n/a, n/a)
Was data lost and not restored?
Was the nature of the data insignificant? (n/a, n/a, n/a)
Was the effect upon the financial statements immaterial? (n/a, n/a, n/a)

A.4. Accounting control variables (Phase 2)

Balancing receivable subledger (19%, 20%, 43%)
Balancing run to run control totals (11%, 13%, 57%)
Balancing the G/L with the subledgers (..., 47%, 17%)
Canceling original documents (11%, 27%, 13%)
Checking by computer for duplicate entries (11%, 27%, 30%)
Checking by computer the numerical sequence of a file (7%, 20%, 39%)
Checking for a third party signature (7%, 13%, 9%)
Checking manually the numerical sequence of a document, journal or report (0%, 20%, 17%)
Checking one-to-one (26%, 7%, 9%)
Comparing batch totals (4%, 7%, 43%)
Comparison of budgeted amounts to actual amounts (0%, 7%, 9%)
Comparison of cash receipts listing to deposit slips (0%, 27%, 17%)
Computer generation of transactions (0%, 0%, 0%)
Dual control over cash receipts (11%, 20%, 4%)
Electronic authorization (0%, 7%, 0%)
Independent review of edit reports for data file changes (0%, 20%, 13%)
Interactive dependency edit (0%, 0%, 4%)
Interactive document reconciliation (0%, 0%, 4%)
Interactive edit controls (0%, 7%, 0%)
Interactive existence edit (0%, 0%, 0%)
Interactive feedback edit (0%, 0%, 0%)
Interactive format edit (0%, 0%, 0%)
Interactive key verification (0%, 7%, 0%)
Interactive mathematical accuracy check (7%, 20%, 9%)
Interactive prior data matching (0%, 0%, 0%)
Interactive reasonableness edit (0%, 13%, 4%)
Interactive range check edit (0%, 0%, 0%)
Interactive check digit (0%, 0%, 4%)
Matching to a previously validated document (37%, 40%, 30%)
Matching to a previously validated file (0%, 27%, 52%)
Matching to an authorized list (26%, 47%, 22%)
Monthly review of bank reconciliations (0%, 13%, 4%)
Monthly review of receivable (7%, 13%, 13%)
Non-interactive format edit (0%, 0%, 0%)
Non-interactive mathematical accuracy (0%, 0%, 0%)
Non-interactive edit controls (0%, 0%, 0%)
Non-interactive existence edit (0%, 0%, 0%)
Non-interactive dependency edit (0%, 0%, 0%)
Non-interactive range check edit (0%, 0%, 0%)
Non-interactive reasonableness edit (0%, 0%, 0%)
Non-interactive prior data matching (0%, 0%, 0%)
Non-interactive check digit (0%, 0%, 0%)
Performance of analytical procedures and investigation of unusual items (0%, 13%, 17%)
Periodic reconciliation of books to physical (7%, 13%, 4%)
Periodic revision of budgeted amounts based on updated information (0%, 0%, 4%)
Physical access controls (7%, 13%, 4%)
Physical safeguards (7%, 7%, 4%)
Reconciliation of manual totals to run totals (0%, 7%, 17%)
Reconciliation of master file balance to control account (0%, 34%, 9%)
Reconciliation of master file balance to run totals (7%, 7%, 35%)
Reperformance (0%, 7%, 0%)
Restrictive endorsement of checks received (0%, 7%, 4%)
Reviewing adjustment transactions (0%, 7%, 4%)
Reviewing internal signatures (7%, 20%, 9%)
Reviewing permanent data file exception reports (0%, 0%, 0%)
Reviewing reference file data (0%, 0%, 0%)
Software access controls (0%, 33%, 9%)
Use of a lockbox (7%, 7%, 0%)
Use of prerecorded input (OCR, MICR, OMR) (0%, 0%, 0%)
Verifying mathematical accuracy (7%, 7%, 4%)
Other: Review, compare invoice adjusted for out of stock items with the packing slip (11%, 14%, 0%)
Other: Driver gets signature on annotated sales invoice (19%, 0%, 0%)
Other: Cost of goods entry is calculated from sales invoices and calculations reviewed (4%, 0%, 0%)
Other: Credit check on invoice customer (0%, 0%, 0%)
Other: Compares amounts written off to aged trial balance and customer detail (0%, 0%, 0%)