
A Simple Expert System

B. I. Blum

The Johns Hopkins University/Applied Physics Laboratory

Abstract

Expert systems are one field of artificial intelligence (AI) that has received considerable recent attention. These systems generally are written in Lisp or Prolog -- languages that are interpretive, have flexible data accessing mechanisms, and offer powerful string manipulation tools. This raises two questions: what are expert systems, and can they be implemented using MUMPS? Some answers are presented in this paper.

Introduction to Expert Systems

Expert systems represent a field of artificial intelligence (AI) that has met with some operational success and considerable commercial interest. From a functional viewpoint, the underlying concept of an expert system is that the knowledge of a relatively narrow domain, i.e., the expertise, is structured so that a less experienced person may access this knowledge in solving problems. That is, the expert's knowledge is recorded, and the system draws upon it to provide expert advice.

Not every system that offers expert advice is an expert system. In many situations the expert knowledge can be formalized as an algorithm. In this case, the results are defined as a function of the inputs; once the inputs have been specified, the outputs can be produced with no further interaction. Examples are computations such as finding a maximum or calculating a standard deviation. Most of the early computer applications were concerned with the formalization of this kind of computational knowledge.

With the advent of integrated circuits and large mass memory devices, computers increasingly were used to store information. Information retrieval and database management systems now routinely store and access information in databases of gigabit size. In most cases, the databases are not viewed as storing knowledge. There are exceptions. For example, bibliographic systems may contain knowledge in the form of abstracts. The information retrieval system can display this knowledge, but it cannot use it. The abstract is simply a body of text with some additional terms to guide retrieval; extraction of the knowledge contained in it requires a human reader.

Expert systems, therefore, must do more than simply compute algorithms or retrieve information. They require that (a) the knowledge is organized so that inferences can be constructed from it, and (b) there is a mechanism to infer expert guidance for a given situation. Although there is no clear dividing line between algorithmic computation, information retrieval, and the functions of an expert system, the unique character of an expert system is that it can navigate through the knowledge base to make inferences in many situations. Compare this with the algorithm that can compute only after all the inputs have been entered, or the information retrieval system that can extract only information already stored.

Figure 1 contains a generic diagram for an expert system. The knowledge is stored in the Knowledge Base, and the data that describe the current problem are maintained in the Global Database. The Knowledge Base is static; it can be added to as new knowledge is gained. The Global Database is dynamic; it is deleted once the problem has been solved. The Inference Engine navigates through (or searches) the Knowledge Base to infer facts that are consistent with the state of the Global Database; these new facts then are added to the Global Database.
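This division of responsibilities can be made concrete with a minimal sketch, assuming a Python representation. The names here are hypothetical illustrations; the prototype described later in this paper was generated by TEDIUM and ran in MUMPS.

```python
class KnowledgeBase:
    """Static store of rules: consulted, and occasionally extended,
    but never modified during a consultation."""
    def __init__(self):
        self.rules = []

    def add(self, rule):
        self.rules.append(rule)      # added to as new knowledge is gained


class GlobalDatabase:
    """Dynamic store of facts describing the current problem only."""
    def __init__(self):
        self.facts = set()           # e.g., {"has feathers"}

    def reset(self):
        self.facts.clear()           # deleted once the problem is solved


class InferenceEngine:
    """Searches the Knowledge Base for rules consistent with the
    Global Database and adds the inferred facts to it."""
    def __init__(self, kb, gdb):
        self.kb = kb
        self.gdb = gdb
```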
The structure of the Knowledge Base, the form of the global data, and the mechanism of the Inference Engine are all closely interconnected. Many expert systems are designed for only a specific class of problem; others -- called domain independent, or shells -- may be used in many application areas. In each case, the Inference Engine applies heuristics (i.e., rules of thumb) to guide its processing. The heuristics normally are defined as algorithms, but they do not act as algorithms. The fact that a heuristic is used implies that an exact solution is not always available. Backtracking, nondeterministic judgments, and approximations may be used. Herb Simon, the economist, computer scientist, and Nobel laureate, uses the word "satisfice" to mean that a satisfactory rather than optimum solution is produced.

Another way of looking at an expert system is to consider how it processes data. In a traditional compiled system, the computational algorithm fixes all control paths and data structures prior to execution. All possible paths of use have been predefined. With an expert system, on the other hand, the knowledge and control flow are separate from each other. There is no standard control flow (although the heuristics may suggest one). The control is determined dynamically by the contents of the Global Database.

[Figure 1: An Expert System. Experts and users interact through a natural language user interface with the Inference Engine, which connects the Knowledge Base and the Global Database.]

That is, searching of the Knowledge Base is established by what is currently known in the Global Database and what is to be found out, i.e., the goals.

In addition to the basic control flow mechanism, an expert system requires:

•  Tools to build and maintain the Knowledge Base.

•  A means for explaining the rationale for the advice presented.

•  An interface to facilitate user interaction.

Because most expert systems maintain the knowledge base as symbolic (as opposed to numeric or coded) text, there also is a general goal of providing natural language interfaces. At present, however, natural language processing is a separate research issue, and most systems do little language processing beyond that associated with the interpretation of the Knowledge Base.

A Production Rule System

As previously noted, the structure of the knowledge representation and the organization of the Inference Engine are closely linked. One of the most general organizations for knowledge -- and the easiest to explain -- is the production rule. A production rule is a logical statement containing a predicate composed of antecedents and a set of consequents. If the predicate is true, then each consequent is set true. For example, in

    If P(A1, A2, ..., An)
    then C1, C2, ..., Cn

the Ai are assumed to be facts in the Global Database, and if P evaluates to true, then the Ci are added as facts to the Global Database.

This is better understood by way of example. Table 1 contains a small and very simple production rule Knowledge Base designed to identify an animal.

RULE 1
  On condition [has hair]
  Then         [is mammal]

RULE 2
  On condition [gives milk]
  Then         [is mammal]

RULE 3
  On condition [has feathers]
  Then         [is bird]

RULE 4
  On condition [flies]
               [lays eggs]
  Then         [is bird]

RULE 5
  On condition [is mammal]
               [eats meat]
  Then         [is carnivore]

RULE 6
  On condition [is mammal]
               [has pointed teeth]
               [has claws]
               [has forward pointing eyes]
  Then         [is carnivore]

RULE 7
  On condition [is mammal]
               [has hoofs]
  Then         [is ungulate]

RULE 8
  On condition [is mammal]
               [chews cud]
  Then         [is ungulate]
               [is even-toed]

RULE 9
  On condition [is carnivore]
               [has tawny color]
               [has dark spots]
  Then         [is cheetah]    GOAL

RULE 10
  On condition [is carnivore]
               [has tawny color]
               [has black stripes]
  Then         [is tiger]      GOAL

RULE 11
  On condition [is ungulate]
               [has long legs]
               [has long neck]
               [has tawny color]
               [has dark spots]
  Then         [is giraffe]    GOAL

RULE 12
  On condition [is ungulate]
               [has white color]
               [has black stripes]
  Then         [is zebra]      GOAL

RULE 13
  On condition [is bird]
               [does not fly]
               [has long legs]
               [has long neck]
               [is black and white]
  Then         [is ostrich]    GOAL

RULE 14
  On condition [is bird]
               [does not fly]
               [swims]
               [is black and white]
  Then         [is penguin]    GOAL

RULE 15
  On condition [is bird]
               [is good flier]
  Then         [is albatross]  GOAL

Table 1  A Simple Production Rule Knowledge Base
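One way to see how such a Knowledge Base stays separate from the Inference Engine is to encode Table 1 as plain data. The fragment below is a hypothetical Python encoding, not the storage scheme of the prototype discussed later; only a few rules are shown, and the rest follow the same pattern.

```python
# Each rule: (antecedents, consequents, is_goal).
# Table 1 implies the conjunction of the antecedents as the predicate.
RULES = [
    (["has hair"],                 ["is mammal"],                   False),  # Rule 1
    (["gives milk"],               ["is mammal"],                   False),  # Rule 2
    (["has feathers"],             ["is bird"],                     False),  # Rule 3
    (["is mammal", "eats meat"],   ["is carnivore"],                False),  # Rule 5
    (["is mammal", "chews cud"],   ["is ungulate", "is even-toed"], False),  # Rule 8
    (["is bird", "is good flier"], ["is albatross"],                True),   # Rule 15
    # ... Rules 4, 6, 7, and 9-14 follow the same pattern.
]
```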



All of the rules in this case imply the conjunction of the antecedents as the predicate. Thus, for example, RULE 8 is

    If   (this) is mammal AND
         (it) chews cud
    then (it) is ungulate AND ALSO
         (it) is even-toed

Both antecedents must be true; if they are, the two consequents will be added to the Global Database.

Rule 8 adds to our knowledge, but it does not answer the question we are interested in, namely: What is this animal? Rules 9 through 15 tell us the name of a specific animal. Therefore, they are labeled as goals. Notice that we have taken a very general structure -- the production rule -- and built a Knowledge Base designed to answer certain kinds of questions. This Knowledge Base can identify an animal given some of its characteristics; it cannot, however, tell about the characteristics of an animal given its name. That is, one would expect it to be able to answer:

    What is a carnivore that has a tawny color and black stripes?

But the rules in this example do not provide information to answer:

    What color is a tiger?

because "tiger" does not appear as an antecedent in any of the rules.

There are two basic ways in which an Inference Engine searches through this type of Knowledge Base. The first is called goal directed searching. Assume that we enter the following facts into the Global Database:

    [has feathers]
    [is good flier]

In a goal directed search, the Inference Engine examines each production rule defined as a goal and tests its antecedents. If an antecedent is not known to be true, then the antecedent is made a subgoal, and the engine attempts to show that it is true. If it is true, then the Global Database is updated, and processing resumes after the point where it became a subgoal. If the backward chaining fails, then the current rule is abandoned and the next rule is examined.

In the example, Rule 9 (the first of the goals) is examined, and the first antecedent, [is carnivore], becomes a subgoal (see Figure 2). This will be true if the antecedents of Rules 5 or 6 are true. First we try Rule 5 and make its first antecedent, [is mammal], a subgoal. This will be true if the antecedents of Rules 1 or 2 are true. Trying Rule 1, we find that the fact [has hair] is not in the Global Database and there are no other rules that can determine the truth of [has hair] as a subgoal. Next we try Rule 2, and it also fails. This means that the first subgoal for Rule 5 fails and therefore Rule 5 fails. We try Rule 6, which also fails, and therefore our initial goal, Rule 9, fails. We next try Rule 10.

Notice that the Inference Engine as we have described it has no memory. It will repeat the same processing for [is carnivore] in Rule 10, will fail at [is ungulate] in Rules 11 and 12, and in Rule 13 will backward chain from [is bird] to Rule 3. Now the antecedent, [has feathers], is in the Global Database; thus the rule succeeds and [is bird] is added to the Global Database. Rules 13 and 14 fail at [does not fly], but Rule 15 has both its antecedents in the Global Database and therefore succeeds.

    initial facts: [has feathers]
                   [is good flier]

    Rule 9   [is carnivore]   subgoal [is mammal]: Rule 1 [has hair] fail,
                              Rule 2 [gives milk] fail                fail
    Rule 10  [is carnivore]   subgoal [is mammal] fails (as above)    fail
    Rule 11  [is ungulate]    subgoal [is mammal] fails (as above)    fail
    Rule 12  [is ungulate]    subgoal [is mammal] fails (as above)    fail
    Rule 13  [is bird]        Rule 3 [has feathers] true;
                              [is bird] added as a fact
             [does not fly]                                           fail
    Rule 14  [is bird]        true
             [does not fly]                                           fail
    Rule 15  [is bird]        true
             [is good flier]  true
             Goal: [is albatross]

    Figure 2  Search Flow for a Sample Consultation

In this first example, a goal directed search was used. The Inference Engine started with the rules identified as goals. Whenever an unresolved antecedent was found, it used backward chaining to try to validate the antecedents. There is another way to manage the searching with production rules. It is called data directed search. What happens with this flow is that all rules (not just goal rules) are tested to see if all the antecedents are in the Global Database. If they are, then the consequents are added to the database, and the process begins again. This continues until the consequent of a goal is found to be true.

If the Inference Engine used a data directed search (or forward chaining), then after entry of

    [has feathers]
    [is good flier]

the flow might be as follows. For Rules 1 and 2 the antecedents are not satisfied, and so no action is taken. For Rule 3 the antecedent is true, so the consequent, [is bird], is added to the Global Database. Because the Global Database now has been changed, rules that previously failed may now succeed. Thus testing begins again with Rule 1 and continues until Rule 15, where both antecedents succeed. The consequent is that this is an albatross. Because this is a goal, the processing is complete.

In this example, we have shown how knowledge can be structured in small chunks, i.e., as production rules, and how two different Inference Engines can search through this Knowledge Base to arrive at a goal. Both Inference Engines are domain independent. That is, there is nothing about them that depends upon this being an animal knowledge base. They both are able to add new facts, e.g., [is bird], to the Global Database.
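Both search regimes can be stated compactly. The following Python fragment is a minimal sketch over the hypothetical RULES list shown after Table 1 (illustrative only; the actual prototype was generated by TEDIUM and ran in MUMPS, and this assumes an acyclic rule set like Table 1's):

```python
def forward_chain(rules, facts):
    """Data directed search: fire any rule whose antecedents are all in
    the Global Database (facts), add its consequents, and start over at
    Rule 1; stop when a goal rule fires or nothing new can be added."""
    restart = True
    while restart:
        restart = False
        for antecedents, consequents, is_goal in rules:
            if all(a in facts for a in antecedents):
                new = [c for c in consequents if c not in facts]
                if new:
                    facts.update(new)
                    if is_goal:
                        return new[0]        # e.g., "is albatross"
                    restart = True           # testing begins again with Rule 1
                    break
    return None

def prove(fact, rules, facts):
    """Goal directed search: a fact holds if it is already known, or if
    some rule concluding it has provable antecedents (backward chaining).
    Proven facts are added to the Global Database, but -- like the engine
    described above -- failures are not remembered, so failed subgoals
    are re-derived on every attempt."""
    if fact in facts:
        return True
    for antecedents, consequents, _ in rules:
        if fact in consequents and all(prove(a, rules, facts)
                                       for a in antecedents):
            facts.add(fact)
            return True
    return False

def goal_directed(rules, facts):
    """Try each goal rule in turn, backward chaining on its antecedents."""
    for antecedents, consequents, is_goal in rules:
        if is_goal and all(prove(a, rules, facts) for a in antecedents):
            return consequents[0]
    return None
```

Starting from facts = {"has feathers", "is good flier"}, both functions reach [is albatross], mirroring the traces just described.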
With this example, we can see how expert systems differ from other processing paradigms. One could, for example, write a program in which all the rules were in the form of IF-THEN-ELSE statements. It then could be compiled, and -- for this simple example -- would operate essentially the same way with improved efficiency.

The compiled program, however, would combine the knowledge and the control flow. As new changes to the knowledge are identified, the program would have to be rewritten. In the production rule system, on the other hand, the knowledge is maintained as independent rules that can be added to or modified without changing the Inference Engine.

Thus, the coding as an algorithm improves performance at the expense of flexibility. For static, well understood domains, formalization as algorithms is to be recommended. Next we consider how the expert system differs from a database application. After all, the Knowledge Base used in this example really is only a database. The key difference is that databases can retrieve only information that has been stored. Most DBMSs have tools to select subsets of data or compute the average value for a numeric field. A DBMS could find albatross as the answer to a question with the query terms "bird" and "good flier." But it does not have the ability to recognize that [is bird] is implied by [has feathers]. To do this requires a predefined structuring of the knowledge with an appropriate inference mechanism.

Of course, this simple example does nothing useful. Neither does it deal with any of the difficult issues. In the following section we present several examples taken from a simple expert system which operates on this database. We use these examples to illustrate some of the problems in going from a simple conceptual design to a practical, operational system.

A Simple Production Rule System

As part of a larger project, several prototype rule based expert systems are being implemented. Each is being treated as a prototype, i.e., a partially implemented application intended to gain a better understanding of the problems involved. The goal of the prototype is to gain experience; the prototypes are intended to be discarded. We illustrate how an expert system works by way of one of these prototypes. The code was developed over a two day period using TEDIUM* to generate an operational system that runs in MUMPS. This prototype was integrated with a larger prototype intended to explore ideas in knowledge representation. Both prototypes have been abandoned. In what follows, we illustrate the inference mechanism of the expert system prototype using the knowledge base presented in Table 1. The next section identifies some of the deficiencies of this approach and discusses the direction of the next prototype.

* TEDIUM is a registered trademark of Tedious Enterprises, Inc.

The prototype expert system presented here used both goal directed and data directed searching. First, the user would enter what was known about the animal. Then the system would execute a goal directed search to see if a goal had been reached. If not, it would do a data directed search. If any facts were added to the Global Database, then the goal directed search would be repeated. If it failed, then the most likely goal would be identified and shown to the user. The system would then try to verify the unproven antecedents by a combination of user inputs and searching.
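A rough sketch of that control loop, continuing the hypothetical Python fragments above (again, only an illustration of the strategy, not the TEDIUM-generated MUMPS code):

```python
def consult(rules, facts, ask_user):
    """Alternate goal directed and data directed search; when neither
    resolves a goal, show the most likely goal rule and ask the user
    about its unproven antecedents."""
    while True:
        goal = goal_directed(rules, facts)
        if goal is not None:
            return goal
        before = len(facts)
        forward_chain(rules, facts)          # may add derived facts
        if len(facts) == before:
            break                            # no new facts; stop searching

    # Rank goal rules by how many antecedents already hold.  (As the
    # Figure 5 discussion below shows, so naive a ranking heuristic
    # can fix on key facts inappropriately.)
    _, ants, cons = max((sum(a in facts for a in ants), ants, cons)
                        for ants, cons, is_goal in rules if is_goal)
    for a in ants:
        if a not in facts and ask_user(a + " -- IS THIS TRUE (Y/N)"):
            facts.add(a)
    return cons[0] if all(a in facts for a in ants) else None

# e.g.: consult(RULES, {"has long legs", "has long neck"},
#               lambda q: input(q + " ") == "Y")
```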
Figure 3 illustrates this basic flow. The user enters four facts about the animal. From the data directed search, the Inference Engine determines, by Rule 1, that because the animal has hair it is a mammal. In another iteration, by Rule 8, it adds

    [is ungulate]
    [is even-toed]

to the Global Database. Even with these facts, we are not certain about the goal; therefore the system lists the most likely goal, Rule 11. In this prototype the rule is described with text ("the animal is ..."), but the description could have been composed from the rule itself. Two facts are missing to establish that Rule 11 holds in this case. The user is asked if each fact is true. Because they are, the conclusion is drawn and each substantiating inference is listed.

Some expert systems will print out explanations only on request. They typically have a HOW command (i.e., how did the system reach this conclusion?) and a WHY command (i.e., why does the system need to know this fact?). In this example, the how and why are always listed. With a larger Knowledge Base, this may result in a cluttered dialog that confuses more than it clarifies.

The example in Figure 4 shows how the system manages backward chaining. Three facts are given, Rule 11 (that this is a giraffe) is considered the most logical, and the fact [is ungulate] is accepted as a subgoal. The system asks,

    IS THIS TRUE (Y/N)

ENTER FACT : has dark spots
ENTER FACT : has long neck
ENTER FACT : chews cud
ENTER FACT : has hair
ENTER FACT :
THANK YOU, PROCESSING BEGINS.
WITH THE GIVEN FACTS, THE MOST LIKELY GOAL IS DEFINED BY RULE 11
The animal is a giraffe because it is an ungulate, has a long neck and legs, and
has a tawny color with dark spots.
THIS IS IMPLIED BY THE FOLLOWING FACTS
  [is ungulate]
  [has long neck]
  [has dark spots]
THE FOLLOWING FACTS MUST BE TRUE TO REACH THIS GOAL
  has long legs
IS THIS TRUE (Y/N)
  has tawny color
IS THIS TRUE (Y/N)
The animal has hair and therefore is a mammal.
The animal is a mammal which chews its cud, therefore it is an even-toed ungulate.
The animal is a giraffe because it is an ungulate, has a long neck and legs, and
has a tawny color with dark spots.

Figure 3  Sample Expert Flow



Yes would add it to the Global Database; No indicates that there is no knowledge of its truth. (This is different, of course, from saying that it is not true, i.e., false.) The system asks if it should try to find the truth of this subgoal. When the user responds Yes, it first attempts Rule 8. This requires that both

    [is mammal]
    [chews cud]

be true. When the user indicates that they are true, [is ungulate] is added to the Global Database. Rule 11 is retested and found to be true.

The sample in Figure 5 illustrates how easy it is for the system to respond in an unreasonable way. Here two facts are entered, and we expect the path to lead to a conclusion of giraffe. But the system in this case responds to individual facts first. [has tawny color] suggests a tiger. This is a poor choice because we know tigers do not have long necks. To prove that it is a tiger, the system asks if it is a carnivore. When we reject this fact, it examines other possible animals and fixes on [has long neck]. Now it considers if it is an ostrich, and continues with the subgoal [is bird]. Clearly, a better algorithm for ranking the probable goals would eliminate this particular example. Yet it does point out how expert systems may fix on key facts inappropriately.

Now consider the example in Figure 6. Here a partial fact "has lea" is entered, and a dictionary lookup asks if the user wanted "has feathers." This is a routine error test in a data processing environment. Many AI systems apply more sophisticated testing by using a DWIM (do what I mean) approach. In any case, note that an expert system must maintain a consistent internal vocabulary. Two other facts next are added. The inference is made that this is a bird, and the most logical choice is presented: Rule 14, this is a penguin. The user is asked if it is true that this animal does not fly. The No indicates no knowledge, and so the system searches for a rule to prove [does not fly], i.e., one with [does not fly] as a consequent. Because none exists, it is assumed that [does not fly] is not true, and Rule 14, it is a penguin, also fails. (Note that assuming that a fact is not true does not imply that it is false. It only means that there is no knowledge of its truth.) The next possible rule is Rule 15, it is an albatross. The antecedent [is good flier] is asked, and after the user states that this is true, the conclusion is drawn that this is an albatross.

Limitations of this Simple Expert System

The four examples of the previous section illustrate some of the problems with this type of very simple system. In the first example, the fact was established that the animal was even-toed. This fact indeed was true for a giraffe, but it was never used. Thus the answer had a "left over" fact. In the Figure 5 example, the first two possible goals were not good choices. Tigers do not have long necks; the ostrich [is black and white] and consequently does not have tawny color. The fact about color could be inferred if the Knowledge Base had knowledge of color as well as animals. Finally, in the last example, we said that the albatross swims even though (a) that fact was not in the Knowledge Base and (b) it probably is not accurate. In this case the "left over" fact may have contradicted the advice.

In addition to the problem of dealing with unused facts, there is the question of what to do if all the antecedents have not been established for a goal. In this simple example, if the animal is a carnivore with dark spots, then only Rule 9 would hold, i.e., it is a cheetah. Yet if we know that it is a carnivore with tawny color, then there are two possible choices. Here is an example of uncertainty. We need more information (such as the location of the animal, the distribution of animal populations, other characteristics of the animal, etc.) before we can suggest the most probable solution. If that information is not available, then the fact that it is one of two animals may be useful in and of itself.

ENTER FACT : has long legs
ENTER FACT : has long neck
ENTER FACT : has tawny color
ENTER FACT :
THANK YOU, PROCESSING BEGINS.
WITH THE GIVEN FACTS, THE MOST LIKELY GOAL IS DEFINED BY RULE 11
The animal is a giraffe because it is an ungulate, has a long neck and legs, and
has a tawny color with dark spots.
THIS IS IMPLIED BY THE FOLLOWING FACTS
  [has long legs]
  [has long neck]
  [has tawny color]
THE FOLLOWING FACTS MUST BE TRUE TO REACH THIS GOAL
  is ungulate
IS THIS TRUE (Y/N) N
THE QUESTION IS is ungulate TRUE?
ATTEMPT TO FIND THE TRUTH OF THIS FACT (Y/N) Y
WE SHALL CONSIDER THE SUBGOAL RULE 8
The animal is a mammal which chews its cud, therefore it is an even-toed ungulate.
TEST THIS SUBGOAL (Y/N) Y
  chews cud
IS THIS TRUE (Y/N)
REEVALUATING GOALS BASED UPON NEW FACTS
WITH THE GIVEN FACTS, THE MOST LIKELY GOAL IS DEFINED BY RULE 11
The animal is a giraffe because it is an ungulate, has a long neck and legs, and
has a tawny color with dark spots.

Figure 4  Backward Chaining Example



ENTER FACT : has long neck
ENTER FACT : has tawny color
ENTER FACT :
THANK YOU, PROCESSING BEGINS.
WITH THE GIVEN FACTS, THE MOST LIKELY GOAL IS DEFINED BY RULE 10
We know that this is a tiger because it is a carnivore with a tawny color and
black stripes.
THIS IS IMPLIED BY THE FOLLOWING FACTS
  [has tawny color]
THE FOLLOWING FACTS MUST BE TRUE TO REACH THIS GOAL
  is carnivore
IS THIS TRUE (Y/N) N
THE QUESTION IS is carnivore TRUE?
ATTEMPT TO FIND THE TRUTH OF THIS FACT (Y/N) N
WITH THE GIVEN FACTS, THE MOST LIKELY GOAL IS DEFINED BY RULE 13
The animal is an ostrich because it is a bird which does not fly, it has long
legs and neck, and it is black and white.
THIS IS IMPLIED BY THE FOLLOWING FACTS
  [has long neck]
THE FOLLOWING FACTS MUST BE TRUE TO REACH THIS GOAL
  is bird
IS THIS TRUE (Y/N) N

Figure 5  A Sample of Unreasonable Behavior

ENTER FACT : has lea
THIS IS NOT A KNOWN FACT, (S)CAN THE FACTS OR (R)ENTER
  has feathers
(A)CCEPT OR (N)EXT
ENTER FACT : swims
ENTER FACT : is black and white
ENTER FACT :
THANK YOU, PROCESSING BEGINS.
WITH THE GIVEN FACTS, THE MOST LIKELY GOAL IS DEFINED BY RULE 14
The animal is a penguin because it is a bird that does not fly but swims and is
black and white.
THIS IS IMPLIED BY THE FOLLOWING FACTS
  [is bird]
  [swims]
  [is black and white]
THE FOLLOWING FACTS MUST BE TRUE TO REACH THIS GOAL
  does not fly
IS THIS TRUE (Y/N) N
THERE IS NO WAY TO INFER THIS FROM OTHER FACTS, WILL CONSIDER OTHER GOALS
WITH THE GIVEN FACTS, THE MOST LIKELY GOAL IS DEFINED BY RULE 15
The animal is an albatross because it is a bird which is a good flier.
THIS IS IMPLIED BY THE FOLLOWING FACTS
  [is bird]
THE FOLLOWING FACTS MUST BE TRUE TO REACH THIS GOAL
  is good flier
IS THIS TRUE (Y/N) Y
The animal has feathers and therefore is a bird.
The animal is an albatross because it is a bird which is a good flier.

Figure 6  A Sample with Unresolved Facts



There also is another kind of uncertainty: that which is implicit in the rule. For example, there might be a rule that if it flies, then it is most likely a bird. Facts in this case would not just be true or unknown; they would be known with some certainty factor. Then the logic for selecting the most likely goal would not be based on only finding the antecedents; it would require manipulations of both facts and their certainty factors to establish the most likely goals. If the certainties were established with some precision, then one could apply Bayesian statistics; in general, however, expert systems use subjective certainty factors based on a 5 or 10 point scale.
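For illustration, the combining functions used by MYCIN-style shells (discussed below in connection with EMYCIN) can be sketched in a few lines of Python. The figures here are hypothetical, and the prototype described above implements none of this:

```python
def conclusion_cf(rule_cf, antecedent_cfs):
    """A conclusion inherits the rule's own certainty scaled by the
    weakest antecedent, as in MYCIN-style shells."""
    return rule_cf * min(antecedent_cfs)

def combine_cf(cf1, cf2):
    """Two independent positive pieces of evidence for the same fact
    reinforce each other without ever exceeding certainty 1.0."""
    return cf1 + cf2 * (1.0 - cf1)

# E.g., two rules support [is bird] with certainties 0.6 and 0.5:
# combine_cf(0.6, 0.5) -> 0.8, stronger than either rule alone.
```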
Finally, note that this simple example did not consider the "obvious" facts that, if it is a bird, it is not a mammal, and if [does not fly] is true, then [is good flier] is false. By using production rules written in this form, some of the knowledge is fragmented and obvious associations are lost. One could overcome these linguistic problems by identifying variables such as fly with the valid values of "does not" and "is good". Then the antecedents in the rules might look like:

    If "fly" equals "does not".

In this particular case, the result is awkward but understandable. A neater, but more difficult, solution would be to use natural language processing to translate

    The animal does not fly
    It doesn't fly
    It's a nonflier

into the above expression.

Finally, we should point out that the user interface in this prototype was designed to display the problem solving activity from the perspective of the inference engine. It told the user what it was evaluating and how it was searching. Although the input function did not allow the user to enter invalid information, the interface was still unnatural. To restructure a system so that it may maintain a more reasonable dialog, some expert system shells implement an "ask first" property that forces the system to ask questions at appropriate points during the consultation.

The obvious question is, how does one overcome these difficulties? In our situation we intend to implement our next prototype expert system shell based upon the model established by EMYCIN. This is the Stanford developed "Essential MYCIN" domain-independent framework for constructing rule-based consultants. It is MYCIN stripped of its domain knowledge. The Texas Instruments Personal Consultant (PC) is a commercial implementation of this model.

Clearly, a full explanation of EMYCIN cannot be presented here. It is rule based, but the antecedents contain expressions (e.g., animal color = "tawny"), and they may be associated using boolean algebra. The shell supports certainty factors (CF) plus the grouping of rules in contexts (e.g., patient, infection, therapy). There are many tools for string manipulation so that internal representations may be transformed into user friendly text. For example, the Lisp S-expression

    (SAME CNTXT SITE BLOOD)

can be translated automatically into

    The site of the culture is blood.

Development of this next prototype is underway. We have learned enough from our mistakes with the first prototypes to feel confident that we can produce an acceptable EMYCIN-like shell in a matter of weeks. Based upon some preliminary analysis, it appears that much of the work in developing EMYCIN involves user interaction and data entry/validation. These are functions ideally suited to TEDIUM (and MUMPS), and therefore we are confident that they can be implemented quickly. We believe that the result will be a prototype upon which we can build the expert systems that we need: one that supports database management, easy interfaces with external systems, and the ability to insert procedural code. Although Lisp clearly is superior for the problem of implementing an inference engine, we feel that TEDIUM (and MUMPS) have some advantages for some of the other functions. This prototype will test that assertion with relatively large sets of rules.

Conclusions

This paper is intended to introduce the reader to some of the concepts associated with expert systems. A very simple expert system and knowledge base was used as an illustration. Everything was implemented in MUMPS, and most response times were satisfactory. Because the implementation tools were designed for databases and the knowledge base was stored as a database, why was the resultant "expert system" not just another database application? That is a reasonable question.

If one wished to implement the knowledge recorded in Table 1 as a relational database, one might restructure it by identifying attributes and values as shown in Table 2. Using standard database tools, one then could ask the questions:

    What bird flies?

    What bird has long legs, a long neck, is black and white, and does not fly?

The results would be the same as if the expert system were used. One also could ask,

    What animals have long necks?

and get a list the same as would be reported by the expert system. However, if one asked,

    What animal has hair, chews cud, and is white with black stripes?

then the database approach could not satisfy the query. Of course, the database could be expanded with the additional attributes has hair (yes or no) and chews cud (yes or no). One could write a program to add entries with (yes, yes) for every ungulate, (yes, no) for every other mammal, and (no, no) for the remaining animals. But it should be obvious that this becomes a very clumsy way to solve a problem. Nevertheless, for small, well understood problem domains that can be described fully with a database, the database paradigm is generally more effective than the expert system.

    Animal      Class       Color    Pattern           Fly     Swim
    cheetah     carnivore   tawny    dark
    tiger       carnivore   tawny    black
    giraffe     ungulate    tawny    dark
    zebra       ungulate    white    black
    ostrich     bird                 black and white   not
    penguin     bird                 black and white   not     yes
    albatross   bird                                   good

Table 2  Animal Knowledge Base as a Data Base (Goals Only)
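The contrast can be made concrete with a sketch of Table 2 as a relation, again in hypothetical Python standing in for a relational query language:

```python
# Table 2 as records; None marks an attribute with no stored value.
ANIMALS = [
    {"animal": "ostrich",   "class": "bird", "pattern": "black and white",
     "fly": "not",  "swim": None},
    {"animal": "penguin",   "class": "bird", "pattern": "black and white",
     "fly": "not",  "swim": "yes"},
    {"animal": "albatross", "class": "bird", "pattern": None,
     "fly": "good", "swim": None},
    # ... remaining goal animals as in Table 2.
]

# "What bird does not fly and swims?" -- answerable by selection alone:
print([r["animal"] for r in ANIMALS
       if r["class"] == "bird" and r["fly"] == "not" and r["swim"] == "yes"])
# -> ['penguin']

# "What animal has hair and chews cud?" cannot be asked: the relation
# has no such attributes, and selection cannot infer [is mammal] or
# [is ungulate] from them the way the production rules can.
```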

