

Future Directions

Toward Intelligent Decision Support Systems: An Artificially Intelligent Statistician

By: William F. Remus
Jeffrey E. Kottemann
Department of Decision Sciences
University of Hawaii
2404 Maile Way
Honolulu, Hawaii

Abstract

There are three important considerations in DSS development. (1) Decision making involves both primary and secondary processes, where secondary processes concern selecting appropriate decision making tools, approaches, and information. (2) In making decisions, humans are subject to numerous cognitive limitations. (3) In order for end users to develop DSSs, sophisticated, problem-oriented DSS generators must replace technically demanding DSS tools. These three considerations can be effectively addressed by including expert system components in DSSs. An expert DSS for statistical analysis is proposed and used as an illustration. Decision making scenarios are used to indicate the potential of such a system. In particular, it appears that an expert DSS can provide support for both primary and secondary decision making and help ameliorate human cognitive limitations.

Keywords: Expert systems, artificial intelligence, decision support systems, cognitive biases

ACM Categories: H.4.2, I.2.1

Introduction

Designing decision support systems (DSS) requires more than just knowledge about hardware and software design. To design the DSS well, the designer must understand decision makers and their limitations. The designer must also know how decision makers determine what methods to use to reach a decision. Only with all of the above can a DSS be constructed which is useful in making a particular decision.

In this article, we describe a hypothetical intelligent DSS for statistical decision making. But first we address the motivations for such a DSS. In the following section, we review the literature on human decision making and point out the limitations that decision makers have in dealing with data and actually making decisions. The literature reported in that section deals with not only decision making but deciding how to decide. The former are primary decisions; the latter have been termed secondary decisions [81], metadecisions [50], and predecisions [55]. The research suggests that both primary and secondary decisions may be biased [70]. An intelligent DSS for statistical analysis must both help choose appropriate statistical tools (a secondary decision) and also provide output that will not mislead the user when making a primary decision. The use of artificial intelligence techniques may make active expert support for both kinds of decisions possible.

Managerial Decision Making Biases

Making a good decision starts with having or gathering the right information upon which to base a decision. In most cases the necessary information is obtained through the manager's visual and auditory senses. The processing of information through these senses is fairly well understood and is briefly described in Sidebar 1. The problem with the processing that occurs in the sensory input channels is that these processes can distort information.

Consider the impact of the processing described in Sidebar 1. First, humans do not

MIS Quarterly/December 1986 403


Sidebar 1. Processing Visual Information

Visual processing can be viewed as the following sequence of actions [37]:

(1) The light from an object strikes the retina.
(2) The light is changed into neural coding.
(3) The image of the object and the object's location are separated and processed independently.
(4) The image is reduced to geometric features (e.g., arcs, lines, corners, edges) and Fourier representations by simple feature extractors.
(5) The small geometric features are reconstructed into a global rendering of the object.
(6) The global rendering of the object is referred to associative memory for identification.
(7) The identity of the object is found using a "fuzzy key" to access memory.
(8) The object's identity allows associative memory to look up the relationships the object has to other objects.

At this point, the information is ready to be passed on for higher levels of processing.

deal directly with holistic images; they deal with images as coded by the feature extractors (that is, images composed of linked lines, arcs, corners, edges, and Fourier signatures). The feature extractors fit lines to any image. Therefore, it is not surprising that managers can see patterns (i.e., linked line segments in some plausible order) in random plots of data points (the brain reconstructs a pattern from the pieces found by the feature extractors). Table 1 summarizes other human input errors that bias interpretations of data.

From Table 1, one gets the impression that humans are prone to numerous errors which they could easily avoid. Yet even training often fails to eliminate these errors [24]. The reason that these errors persist is that most of the errors result from the neurophysiological limitations of the human brain. And training can't do much to overcome such limitations.

The brain's wiring affects not only the input processing but also the decisions made. Note the many human decision making biases shown in Table 2. Again, one gets the impression that these obvious biases should be easy to correct through proper training. Unfortunately, these errors persist because they are also a function of the brain's organization. To illustrate this relationship, consider the neural underpinning of the task of grasping an object as described in Sidebar 2.

The way in which the future is forecast parallels that in Sidebar 2. Some point is anchored on and then adjusted to try to reach a goal. Human decision makers underadjust just as the conservative fractionation processes underadjust. The motor system can correct using dynamic feedback -- but decision makers don't have dynamic feedback on their decisions. Feedback is so potent that it may have more effect on performance than other biases [38].

Another source of decision making behavior results more from the balance between cognitive hardware (i.e., the right and left hemispheres of the brain) than from its limitations. This theory (called cognitive style) suggests

Sidebar 2. Grasping an Object

Consider the way in which the brain processes a command to grasp an object [37]:

(1) The task is fractionated and passed to several parallel processors. The fractionation is conservative (that is, the sum of the smaller tasks is less than the overall task) because overshooting the goal can injure the hand but undershooting will not.
(2) Each parallel processor repeats the conservative fractionation and passes that task on to sub-processors and onward to muscle groups.
(3) The impact of the conservative fractionation is movement toward the goal (but not totally reaching the goal).
(4) Dynamic feedback allows the fractionation of the remaining distance until the hand is correctly positioned.


Table 1. Biases Associated With Presenting Data to a Decision Maker

Irrelevant Information
Irrelevant information can influence decision makers and may reduce the quality of their decisions [18, 65].

Data Presentation Biases
(a) The type of information -- Data acquired through human interaction has more impact on the decision maker than just the data itself [10].

(b) The format of the data display -- The display format of the data crucially affects the decision made. Generally speaking, summarized data presentations (e.g., statistics, tables, graphs) are preferred to raw data [14]. The choice of whether to use graphical summaries or tabular summaries depends on the level of environmental stability [58].

(c) Logical data displays -- When a set of alternatives is presented which seems to capture all the possibilities, then the decision maker may not detect other alternatives that may exist [25].

(d) Order effects -- The order in which data are presented can affect data retention [66, 52]. In particular, the first piece of data may assume undue importance (primacy effect) or the last piece of data may become overvalued (recency effect).

(e) Context -- When decision makers assess the variability in a series of data points, their assessment will be affected by the absolute (i.e., mean) value of the data points and the sequence in which they are presented [42].

Selective Perception
Decision makers selectively perceive and remember only certain portions of the data [20].

(a) People filter data in ways that reflect their experience. When data from a problem is presented to decision makers, they will particularly notice the data that relates to areas in which they have expertise [16, 20]. This leads to differing and often limited problem formulations.

(b) People's expectations can bias perceptions. When decision makers are reviewing data which is contrary to their expectations, they may remember incongruent pieces of data inaccurately [13, 45].

(c) People seek information consistent with their own views. If decision makers bring to a problem some prejudices about the problem, they will seek data confirming their prejudices [5, 12, 78].

(d) People downplay data which contradicts their views. If the decision makers have an expectation about a problem or if they think they have arrived at the solution, they will downplay or ignore conflicting evidence [2, 64]. Their expectations will often persist even in light of continued conflicting evidence [78].

Frequency
(a) Ease of recall -- The ease with which certain data points can be recalled will affect not only the use of this data but the decision maker's perception of the likelihood of similar events occurring [40, 44, 72, 73, 74].
(b) Base rate error -- Often the decision maker determines the likelihood of two events by comparing the number of times each of the two events has occurred. However, the base rate is the crucial measure and is often ignored [10, 34, 35]. This is a particular problem when the decision makers have concrete experience with one problem but only statistical information about another. They will generally be biased toward thinking the concrete problem to be more troublesome. Decision makers particularly overestimate the frequency of catastrophic events [44].


(c) Frequency to imply strength of relationship -- The more pairs of co-varying data points that decision makers have, the stronger they think the relationship between variables is [22, 33, 77].
(d) Illusory correlation -- Decision makers may erroneously believe certain variables to be correlated. From plots of data they can often "recognize" patterns, even when the points displayed have no true correlation [67, 73, 77].

Table 2. Biases in Information Processing

Heuristics
As Ackoff [1] pointed out, the decision maker's major problem is often not the lack of information upon which to base a decision but too much information. Generally the decision maker reduces the problem to one involving only 3 or 4 crucial factors. The decision is made using a heuristic based on those factors. Unfortunately, these heuristics often have built-in biases [74]. The following are some of the problems with heuristics.

(a) Structuring problems based on experiences. Decision makers may try to find the best fit between the new problems they have and old problems they have had. When a match is found, the decision maker then alters the old decision slightly to reflect the new circumstances [34, 35, 73, 74].

(b) Rules of thumb. If the decision maker has prior experience in solving a problem, he may again use the same rule of thumb since it proved satisfactory last time [39].

(c) Anchoring and adjustment. Prediction is often made by selecting an "anchoring" value (e.g., the mean) for a set of data and then adjusting the value for the circumstances in the present case. Generally, the adjustments are insufficient [66, 74].

(d) Inconsistency in the use of the heuristic. Bowman [11] theorized that the major economic consequences of decisions did not result from a manager's use of poor heuristics but from the manager's inconsistent use of the heuristics. Numerous studies have shown that heuristics based on regression outperform the decision maker himself [29, 51, 59], and the performance difference, as theorized, has been shown to result from inconsistency [57].

Misunderstanding of the Statistical Properties of Data
(a) Mistaking random variation as a persisting change. Data available to a decision maker often have statistical properties; that is, they have a mean value around which the data points are randomly distributed. When the next observation is a high or low value data point, the decision maker may believe an upward or downward trend is occurring. In fact, it is just a random variation and not a persisting change. Often decision makers infer causes for these random variations [41, 67, 72, 73, 77].

(b) Inferring from small samples. The characteristics of small samples are often believed to be representative of the population from which they are drawn. Thus, in data inference too much weight is given to small sample results [72, 74].

(c) Gambler's fallacy. In a probabilistic process where each event is independent of the ones preceding it (i.e., it is random), decision makers may erroneously make inferences about future events based on the occurrence of past events [34, 41].

(d) Ignoring uncertainty. When a decision maker is faced with several sources of uncertainty simultaneously, he may simplify the problem by ignoring or discounting some of the uncertainty. The decision maker may then choose the most likely alternative for his decision [28].


Limited Search Strategies
Decision makers have to decide when to stop gathering data and begin the analysis. They often use truncated search strategies which prematurely exclude potentially relevant information [61]. The use of truncated search strategies increases as task complexity increases [54].

Conservatism in Decision Making
When new information arrives, the decision maker will tend to revise his probability estimates in the direction prescribed by Bayes' theorem, but usually the revision is too small [52]. This conservatism increases with message informativeness [66] and is subject to the primacy effect.

Inability to Extrapolate Growth Processes
When exponential growth is occurring, the decision maker may underestimate the outcomes of the growth process [71, 75]. This underestimation is not improved by presenting more data points [76].

Note: Major reviews of cognitive bias research are given in [6, 21, 56, 60, 65].

that decision makers can be classified in terms of their right hemisphere orientation (intuitive, sensory, feeling) versus their left hemisphere orientation (analytic and systematic). The ideal manager could function in either mode as required by the situation. In many cases the ideal would be a balance between the hemispheres. Although the cognitive style concept has been expanded to incorporate dimensions other than hemispheric lateralization, the concept still suggests that non-optimal decision making can be traced to neurophysiological factors other than brain wiring limitations.

The neural underpinning of cognitive style is not clear. It may be that the balance between left and right is hard-wired, or it may be under conscious control. But in any case, the theory suggests the relative balance will affect decisions actually made. The inconsistencies in cognitive style research, however, confound application of the theory [27, 32].

At this point, it is important to also question the paradigms, methods, and results of cognitive bias research:

1. Are biases an artifact of the way in which information is presented to decision makers? Several recent studies indicate that cognitive bias can be ameliorated by presenting information in a form that is congruent with the type of cognitive processing required by a task and with the manner in which humans seem to store and process information [15, 31, 48, 56].

2. Are the normative models chosen to serve as the baseline in judging decisions non-optimal truly appropriate? Wright and Murphy [83] have shown that, while decision makers may not judge correlations well when compared to judgements dictated by normative models such as Pearson's r, they may judge correlations in noisy data (i.e., with outliers) similarly to measurements using robust estimates of correlation (i.e., weighted local linear least squares).

3. Are biases indeed rational in the context of organizational cultures? James March and colleagues [23, 46] have eloquently argued that biased, seemingly irrational decision maker behavior is sensible, even rational, given the nature of organizational life.

While the above three questions may have a fundamental impact on behavioral decision making research, they do not particularly confound the exploration of intelligent DSS -- in this case, of an artificially intelligent statistician (AIS). First, it is a primary goal of an AIS to present information in a way congruent with human cognitive processes. And further, there is a fundamental difference between constraining the information made available to a decision maker, as is done in a laboratory study, and making various information alternatives available as an AIS would do. Second, the number of statistical tools available to


Sidebar 3. What Is an Expert System?

An expert system incorporates the knowledge of expert human decision makers and attempts to induce or deduce new knowledge using the system's a priori knowledge, inferred knowledge, and real world data. One key in expert system research, then, is to devise a scheme for representing knowledge that is appropriate for eliciting and encoding human knowledge and that is also expedient for use in automated induction and deduction.

The production rule is one popular approach among the many proposed knowledge representation schemes. In its simplest form, a production rule is "IF condition(s) THEN conclusion(s)." A group of production rules are related in that the conclusion(s) of one rule shares variables with the condition(s) of another rule -- for example: (1) IF runny nose THEN cold virus, (2) IF cold virus THEN bed rest. Given that rules so overlap, the set of rules, termed a knowledge base (KB), can be treated as a tree. Taken as a whole, a knowledge base is a contingency model. Given the KB, an inference engine traverses the logical rule tree, asking a user for values of the conditions or conclusions when needed to choose a path through the tree.

An expert system may try to prove a hypothesis and move backwards in the tree -- e.g., "I will try to prove you need bed rest." It may also try to surmise something devoid of an initial hypothesis -- e.g., "Do you have a runny nose?" It has proven effective for expert systems to interweave the goal-driven (as in the former example) and data-driven (as in the latter example) approaches (also referred to as backward and forward chaining). These "dual-driven" expert systems better emulate expert decision makers who have initial hunches, ask directed questions, and try to develop new hunches. In essence, an inference engine is a contingency model analyzer. If the inference engine keeps track of where it has gone in the rule tree, then it can backtrack to explain its train of thought.

If the rules in the KB are logically treated as data, then it is possible to devise an inference engine that can analyze an arbitrary set of rules. Such an expert system generator is, in principle, no different than database-oriented application generators. An application generator allows users to define data models and then supports database manipulation based upon the definitions. An expert system generator allows users to define IF ... THEN ... rules and then supports contingency analysis based upon the rule definitions.
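The goal-driven (backward chaining) search described above can be sketched in a few lines using the sidebar's runny-nose rules. This is a minimal illustration, not the mechanism of any actual expert system shell; the data structures and function name are our own assumptions.

```python
# A minimal backward-chaining sketch over "IF conditions THEN conclusion"
# production rules, following Sidebar 3's runny-nose example.

RULES = [
    ({"runny nose"}, "cold virus"),   # (1) IF runny nose THEN cold virus
    ({"cold virus"}, "bed rest"),     # (2) IF cold virus THEN bed rest
]

def prove(goal, facts, rules=RULES):
    """Goal-driven search: a goal holds if it is a known fact, or if some
    rule concludes it and all of that rule's conditions can be proven."""
    if goal in facts:
        return True
    return any(
        conclusion == goal and all(prove(c, facts, rules) for c in conditions)
        for conditions, conclusion in rules
    )
```

Given the fact "runny nose", prove("bed rest", {"runny nose"}) chains backward through both rules and succeeds; a data-driven engine would instead run the rules forward from the known facts toward new conclusions.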

Although experimental expert systems are typically implemented in LISP or PROLOG (which are well suited for prototype development), an expert system can be implemented in any general purpose programming language. Indeed, for efficiency reasons, commercial systems are often implemented in languages such as Fortran or C.

Business applications include expert systems to support production plant managers, loan officers, budget and portfolio managers, information resource managers, auditors and collection agents, as well as many day-to-day tasks such as scheduling meetings. The Proceedings of the International Joint Conference on Artificial Intelligence is perhaps the best source for information on recent AI research and applications (albeit there is much to wade through in these two-volume sets, which are published in odd-numbered years). Winston [82] gives a readable, if a bit outdated, treatment of artificial intelligence.


decision makers is large and many of these tools are not well understood. An AIS will simply help a decision maker choose appropriate tools while also helping to avoid the most common data manipulation and interpretation errors. Third, the AIS we outline does not intend to address the use of information in an organizational, cultural sense. In short, development of an AIS can draw upon the stable aspects of the cognitive bias literature and statistical theory. It embodies the basic philosophy of DSS by offering support and consultation in statistical analysis.

Moving to an Intelligent DSS for Statistical Analysis

There are numerous good statistical packages available (e.g., SAS, SPSS, TSP). In the current state-of-the-art, the decision maker (1) provides the data, (2) chooses an appropriate statistical technique and gives the analysis commands, and (3) interprets the results. Steps (2) and (3) are repeated until the decision maker is satisfied that he understands the data insofar as is possible using available tools. In the analysis the human statistician makes judgements on what tests to use and/or how to properly conduct those tests. Rules for these judgements are not difficult to formalize (see Andrews et al. [3]).

Techniques in artificial intelligence, particularly expert systems, are readily applicable here. (Sidebar 3 is a brief overview of expert systems.) This section illustrates the incorporation of expert systems techniques in a statistical DSS using decision making scenarios that contrast unintelligent and intelligent support. Following the cases, the next section gives a general design for a proposed AIS DSS, sketches a hypothetical session using AIS, and outlines stumbling blocks to commercial implementation.

An AIS will provide direct support for each of the three steps given above. Steps 1 and 2 involve secondary decisions, that is, decisions of which data and analysis tools are appropriate. Step 3 relates to the primary decision in which the results of the analysis tools are interpreted and used to determine a real-world

action. Following are four scenarios in which only passive support is offered by the DSS. After that, the four scenarios are revisited to illustrate the potentials of an AIS DSS. Finally, a fifth scenario is used to discuss treatments of ill-structured problems.

Case 1 -- In assessing and predicting sales performance, the DSS presents the decision maker with a graph of time series data. However, the decision maker may anchor on past data and adjust to predict the future, may see patterns in data which are due to randomness, and may see more variability in the data than really exists.

Case 2 -- The decision maker wants to compare the performance of salespeople in Chicago and Boston. He has the DSS create histograms of these data from both locations; he concludes that the salespeople in Chicago are performing better.

Case 3 -- The manager wants to determine what is causing the sales problems in Boston. To do this, he has the DSS extract data on many predictive factors from the database and display it. After mentally sorting through these factors, he concludes that it "all boils down to the sales commission structure in use in Boston."

Case 4 -- The decision maker asks the DSS to predict sales using any of 10 environmental factors. The resulting regression has 3 independent variables; one of the independent variables is the log of the number of personal computers sold nationwide.

In each of the four cases, the DSS has done correctly what was requested, but the decision maker could be misled by the results presented. In case 1, the presentation form is technically correct but the decision maker could misinterpret the resulting display. In case 2, the DSS histograms were correct but no support was given for comparing the two factors. In case 3, the DSS gives technically correct support in displaying the factors but does not extend help to modeling the sales problem. This leads the manager to use a simplistic heuristic. In case 4, the technically correct stepwise regression has been performed but no support is provided to the decision maker to understand the logic of the


process. In all four cases, the DSS can be expanded to provide more intelligent support. The additional requirements are as follows:

Case 1 -- In a graphics display of time series data, a good design will make the default display start at zero on the Y axis. This helps avoid the distortion in the "perceived data variability" (see Table 1) resulting if the Y axis were shortened to zoom in on the data. The trend line projection tools should be automatically invoked to get a better prediction than the conservative "anchor and adjust" heuristic. Automatic testing for statistical significance of the trend and any seasonality should also be displayed. This might reduce the decision maker's tendency to see patterns in random data.

Implementing the display defaults can be easily done with an AIS or even a "dumb" system like SPSS. (Interestingly, SPSS defaults to the bias-creating zoom display.) Invoking trend line analysis presents more of a problem since not every graphic display of data is appropriate for trend analysis (even if seasonality is built in). Normally, decision makers invoke trend line analysis if the data "appears right" for projecting a trend. In the AIS, a heuristic can be implemented to invoke the trend line. With trend lines, or any pattern in time series data, the AIS must automatically calculate, present, and interpret statistical tests of the significance of the fit. This can help decision makers to avoid overevaluating any patterns which are not statistically significant.
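One way such an automatic significance check might work is sketched below: fit a least-squares trend line and report the t statistic for its slope, flagging a trend only when the statistic clears a cutoff. The function name is ours, and the cutoff of 2.0 (roughly the 5% two-sided critical value for moderate sample sizes) is an illustrative stand-in; a production AIS would use exact critical values and also test seasonality.

```python
import math

def trend_significance(y, cutoff=2.0):
    """Fit y = a + b*t by least squares over t = 0..n-1 and return
    (slope, t_statistic, significant). Assumes the series is noisy
    (a perfect line would give a zero standard error)."""
    n = len(y)
    mt, my = (n - 1) / 2, sum(y) / n
    sxx = sum((t - mt) ** 2 for t in range(n))
    sxy = sum((t - mt) * (v - my) for t, v in enumerate(y))
    b = sxy / sxx                      # trend slope
    a = my - b * mt                    # intercept
    sse = sum((v - (a + b * t)) ** 2 for t, v in enumerate(y))
    se_b = math.sqrt(sse / (n - 2) / sxx)   # standard error of the slope
    t_stat = b / se_b
    return b, t_stat, abs(t_stat) > cutoff
```

A strongly trending series passes the test; a flat, noisy series fails it, which is exactly the message the AIS would attach to the display to keep the user from "seeing" a trend in randomness.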

Case 2 -- In this case the decision maker does not have an AIS to automatically test statistically the differences between the two cities. Thus, he makes an intuitive decision. If the decision maker had a natural language interface to an AIS, many of the tasks would become simplified. Consider the query: "Compare the sales commission earned in Boston and Chicago." The data would be retrieved and displayed, but additionally, statistical comparisons would be automatically made that avoid the decision maker's biases. Using an AIS, the proper test would be chosen, variances pooled, outliers handled, etc. These choices are not all determinate. The choice between tests on the mean and median may be subjective, as is the choice of when to pool

or not to pool. Handling outllers Is also Judge-mental. Such heuristics can be built Into anAIS. As a result, the manager will receive astatement such as "There is greater than 95%chance that the median commissions differbetween Boston and Chicago" (and relateddata) without having called for the test (andwithout having knowledge of the heuristics).Since a natural language interface will allowmore general inquiries than non-natural lan-guage AIS’s or "dumb" packages like SPSS,SAS, and IFPS/MLR, a natural language AIScan be more aggressive in choosing how toanswer the general inquiry.
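The test-selection heuristics described above might be sketched as a small rule base. Everything here is hypothetical -- the thresholds, the function names, and the commission figures -- but it shows how "choose the test, decide whether to pool" can be reduced to explicit, auditable rules rather than a statistician's tacit judgment.

```python
import statistics

def skewness(xs):
    """Sample skewness; a crude symmetry check for the selection rule."""
    m = statistics.mean(xs)
    s = statistics.stdev(xs)
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

def choose_comparison(a, b, skew_limit=1.0, var_ratio_limit=4.0):
    """Return the name of the two-sample test an AIS-style rule base
    might select; the limits are illustrative, not authoritative."""
    # Rule 1: heavy skew in either sample -> compare medians (rank-based)
    if abs(skewness(a)) > skew_limit or abs(skewness(b)) > skew_limit:
        return "rank test on medians"
    # Rule 2: variances close enough -> classical pooled t-test
    ratio = statistics.variance(a) / statistics.variance(b)
    if 1 / var_ratio_limit <= ratio <= var_ratio_limit:
        return "t-test on means, pooled variances"
    # Rule 3: otherwise keep the variances separate
    return "t-test on means, separate variances"

boston  = [410, 395, 420, 405, 415, 400]   # hypothetical commissions
chicago = [380, 385, 375, 390, 382, 379]
print(choose_comparison(boston, chicago))
```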

Case 3 -- In this situation the decision maker is dealing with multivariate data (in a sequential univariate way). Thus he resorts to a simplistic heuristic. What is needed is a multivariate model to support this decision. When a historical database is available, regression can be used to develop a heuristic policy for this problem. Heuristics of this sort consistently outperform the decision makers themselves [29, 51, 59]. At minimum, these heuristics will reduce erratic decision making. (Note, however, that this may not always be desirable [30].)
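The bootstrapping idea can be shown in a few lines: regress the manager's own past decisions on the cue he responded to, then apply the fitted rule consistently. The history below and the single-cue policy are invented for illustration; a real system would use many cues and the safeguards discussed next.

```python
def fit_decision_policy(cues, decisions):
    """Ordinary least squares on the manager's own past decisions --
    the 'bootstrapping' idea: the fitted rule applies his average
    policy consistently, without his random deviations."""
    n = len(cues)
    cx = sum(cues) / n
    cy = sum(decisions) / n
    b = (sum((x - cx) * (y - cy) for x, y in zip(cues, decisions))
         / sum((x - cx) ** 2 for x in cues))
    a = cy - b * cx
    return lambda cue: a + b * cue

# Hypothetical history: forecast demand (cue) vs. production decision.
history_cues      = [100, 120, 90, 110, 130]
history_decisions = [105, 124, 97, 112, 133]  # erratic around a linear policy
policy = fit_decision_policy(history_cues, history_decisions)
print(round(policy(115), 1))
```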

The keys to successful use of the regression bootstrapping model in an AIS are (1) embedding a user-transparent regression, (2) keeping the details of its operation out of the user's way, (3) automatically determining when its use might be appropriate, and (4) promoting its use. Of particular note here is the need to competently perform the regression without involving the user in the details. This calls for an expert system to automatically take care of outliers, perform transformations, test for heteroscedasticity, and perform other related tasks. Many of the choices would be based on expert heuristics embedded in the AIS. Also, the system for case 3 must have the features described next in case 4.

Case 4 -- Here the AIS may develop a forecasting model based on decision maker input, axiomatic regression procedures, and expert heuristics for regression. Also, however, the decision maker must be able to understand the logic behind the AIS's choice of the three-variable model as the "best" model. The ubiquitous feature of expert systems, backtracking, is used to show the logic of model development. In case 4, the decision maker would want to know how the three variables were chosen and why the log transformation was necessary. In case 3, the logic of modeling multivariate heuristics using regression would also be provided.

410 MIS Quarterly/December 1986

With an AIS-like system, it is also possible for a decision maker to interrupt the logic sequence and "what if" a step in the procedure. If applied to case 4, the decision maker could get an expert's consultation about how to proceed with the analysis via a step-by-step logic and calculations summary as the model is developed. The decision maker could also inspect the logic (and results) of several different models (i.e., regression and discriminant analysis); this would be equivalent to consulting several experts. Capabilities such as these are currently available in many expert systems [63].

In some cases (e.g., case 1), the AIS must compensate for the biases of the human decision maker in performing primary decision tasks. In other cases (e.g., cases 2 and 3), the system must support secondary decision tasks and augment the analytical skills of the manager. Given the sophistication of these models, the system (as in case 4) must also backtrack and "forward track" to show the logic of its analysis and also allow managerial override. In short, the AIS must provide consultation for both the primary and secondary aspects of decision making. While an AIS has no answer to a truly "unstructured" problem, in many problems a structure can be developed given adequate data.

Case 5 -- A manager states that "There is something wrong at our Chicago plant." If the DSS provides expert consultation (beyond that described in the prior section), it might be able to help identify the problem and causes. For example, the system might respond by asking if it was a financial, sales, manufacturing, etc., problem. If the manager replied that it was a financial problem, the system might ask what the manager thought the cause might be. If the causal relationship is testable, the system will determine and perform the appropriate test. The capabilities needed are (1) a natural language interface, (2) the ability to develop testable hypotheses, (3) the ability to test the hypotheses using real data, (4) access to meta-data, i.e., information on, and the relationships among, data items and processes, and (5) the ability to backtrack on the system's logic.

In MYCIN, an expert system for diagnosing infectious diseases, the doctor is coached on what tests would be appropriate to make a clear diagnosis [63]. Similarly, an expert business system could coach and perform testing of managerial hypotheses about problems [49]. For example, analysis of two similar plants (one with and one without the problem) can suggest a cause of the problem. This is an important capability, since selecting relevant variables can be more important than procedural sophistication in making good decisions [38]. To do this well, the database must be augmented with expert information about the processes happening in each plant and data beyond that normally required for data processing. This requires information which conceptualizes organizational tasks and task structure in addition to transactional data. Often the former is more crucial in decision making and better handled with expert systems [38].

In Blum's RX system [8], testable hypotheses are derived in a different way. In this system for performing medical research, the system asks the users what they wish to predict (for example, the causes of heart attacks). A subprogram of Blum's RX system finds the highest correlations between the criterion variable (heart attacks) and other potential predictor variables; the Spearman non-parametric correlation coefficient is used for this. The results of this step are filtered through a path analysis program to suggest a causal model.
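A minimal sketch of that screening step, in the spirit of RX: rank candidate predictors by the absolute Spearman correlation with the criterion. The plant data and variable names below are invented, and a real system would follow this pass with the path-analysis filtering described above.

```python
def _ranks(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the two rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def screen_predictors(criterion, candidates):
    """RX-style first pass: sort candidate variables by |rho|."""
    scored = {name: spearman(criterion, xs) for name, xs in candidates.items()}
    return sorted(scored.items(), key=lambda kv: -abs(kv[1]))

# Hypothetical plant data: which variable tracks the defect rate?
defects = [3, 7, 2, 9, 5]
print(screen_predictors(defects, {
    "overtime_hours": [10, 30, 5, 40, 20],   # monotone with defects
    "ambient_temp":   [70, 66, 73, 69, 71],
}))
```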

The Design and Operation of an AIS

The design and operation of an AIS for managerial decision making given in this section draws heavily upon the RX system for medical research [8]. RX supports medical research in much the same way as we are suggesting the AIS support managerial decision making and problem solving. RX uses statistical and medical knowledge in an attempt to discover causal relationships, such as "cholesterol causes heart attacks."

The basic functions that the AIS must perform are:

1. Parse the query,
2. Determine data that is appropriate to answering the query,
3. Determine correspondingly appropriate statistical technique(s),
4. Determine how to handle issues such as outliers, multicollinearity, and transformations,
5. Produce output to the decision maker which minimizes possible biasing effects.

Note that steps (2) through (4) support secondary decisions; only (5) provides direct primary decision support.
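The five functions might be wired together as sketched below. Every component here is a toy stand-in of our own devising -- the parser handles a single query shape, and the "technique" and "adjustment" stages are single hard-coded rules -- but the pipeline mirrors steps (1) through (5).

```python
def parse(query):
    """1. Toy parser for queries of the form 'compare <measure> in <A> and <B>'."""
    w = query.lower().replace(".", "").split()
    return w[1], [w[-3], w[-1]]

def select_data(measure, groups, database):
    """2. Pull the arrays the query refers to."""
    return {g: database[measure][g] for g in groups}

def select_technique(arrays):
    """3. Stand-in rule: two independent numeric samples -> two-sample t-test."""
    return "two-sample t-test"

def adjust(arrays):
    """4. Stand-in adjustment: drop the single most extreme value per group."""
    out = {}
    for g, xs in arrays.items():
        m = sum(xs) / len(xs)
        out[g] = sorted(xs, key=lambda x: abs(x - m))[:-1]
    return out

def present(technique, arrays):
    """5. Bias-minimizing output: a plain statement, not a wall of numbers."""
    means = {g: round(sum(xs) / len(xs), 1) for g, xs in arrays.items()}
    return f"{technique} on adjusted group means {means}"

def answer_query(query, database):
    measure, groups = parse(query)                    # 1. parse the query
    arrays = select_data(measure, groups, database)   # 2. appropriate data
    technique = select_technique(arrays)              # 3. appropriate technique
    arrays = adjust(arrays)                           # 4. outliers, transforms
    return present(technique, arrays)                 # 5. debiased presentation

db = {"commissions": {"boston": [410, 395, 420, 600],
                      "chicago": [380, 385, 375, 390]}}
print(answer_query("Compare commissions in boston and chicago", db))
```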

The basic structure of an AIS is given in Figure 1. Note that its structure mirrors the DSS framework proposed by Sprague [69], with the addition of the inference engine and knowledge base (KB) components. The inference engine functions as the executive module of the AIS and is shown as the hub of the system. The inference engine receives a request (in parsed form) from the user-interface component. It references the knowledge base and database/data dictionary to choose an appropriate statistical technique, which is then invoked. The actual references to the various components would be interleaved, as evidenced by the following example.

[Figure 1. Simple Block Diagram of an Artificially Intelligent Statistician -- the AIS Inference Engine and Executive at the hub, linked to the Natural Language Interface; the Knowledge Base of Extra-Statistical Rules and Organization-Specific Rules; the Database and Dictionary; and the Statistical Analysis Model Base and Dictionary.]

1. The decision maker says "Compare the sales commissions in Boston and Chicago."

2. The natural language interface parses the request to:
(a) get Chicago data on sales commissions.
(b) get Boston data on sales commissions.
(c) statistically test two samples.

3. The inference engine references the KB to find a list of tests which compare two independent samples and a set of rules to choose between them. [The rules will require analyses of both arrays of data to determine whether the arrays are nominal, cardinal, or ordinal.]

4. The two data arrays must be called up and passed to statistical subroutines for analysis.

5. The subroutines return to the inference engine lists of statistical arguments [to determine whether the data is nominal, ordinal, or cardinal].

6. The AIS inference engine uses those statistical arguments and the step 3 rules to make a decision as to whether the data is nominal, ordinal, or cardinal. Note that the inclusion of this metadata information in the data dictionary will simplify steps 4 through 6.

7. Knowing the data type, the inference engine now finds a short list of tests and adjustment procedures.

8. Given the data type, the next AIS rule causes the data arrays to be sent to an appropriate module in the model base to handle any outliers (e.g., for cardinal data this heuristic subroutine might be based on Winsorizing). The subroutine returns the adjusted data arrays.

9. Given the data type and the handling of outliers, the AIS rule calls for the adjusted data to be passed to an appropriate statistical subroutine [e.g., t-test]. There are two sets of t-tests depending on whether the data is paired or not. If not paired, then there is a choice to be made between t-tests with pooled and non-pooled variances. The latter may be established by an F-test on variances within the subroutine. The subroutine returns a select list of statistical measures for the comparison.

10. The AIS inference engine uses rules to understand and interpret the results of the test based on the select list of statistical measures. [This may require reference to t tables, which can be stored in the data, knowledge, or model base.]

11. The AIS must pass arguments to the natural language interface for encoding in English. The select list of statistical measures may then also be stored in the database or knowledge base for future reference.

The AIS as illustrated here can be implemented using existing software components -- DBMS and data dictionaries, statistical software such as SPSS, and generalized inference engines (expert system generators) such as EMYCIN. "Patching" such diverse components together, however, proves clumsy and results in execution inefficiencies. For example, in the system initially developed by Blum [8], the data retrieved from the DBMS had to be reformatted before being passed to SPSS; the output from SPSS was reformatted into a form suitable for pattern matching with rules in the knowledge base. The overall system was several batch-oriented tasks rather than a real-time DSS. A more complete list of difficulties in implementing an integrated, industrial-strength AIS includes the following:

1. A robust parser must be developed. This requires the definition of a formal syntax for business problem solving. Natural language parsers for database queries transform a natural query into a formal syntax such as SEQUEL. A formal syntax for managerial problem solving must be orders of magnitude more robust than such query formalisms.

2. The data storage representation must be equivalent in the various AIS components such that information can be passed between the components without the need for storage format conversion.

3. The AIS inference engine must be able to access large amounts of historical data and metadata. This requires a time-oriented database with powerful, extensible access functions. The functional data model [62] holds promise here. Further, the functional data model attempts to address (2) by treating data and metadata equivalently [4].

4. Analogous to (3), an AIS that attempts to support a broad range of managerial decision making requires the construction and maintenance of a substantive knowledge base. Issues here include development of encyclopedic KBs [43] which must allow for change over time, storage of possibly conflicting expert knowledge, and concurrent access by many users [36]. In short, there is a need for knowledge base management systems and knowledge base administrators.

5. The computational complexity of such a large-scale system is tremendous and requires efficient implementation and hardware.

Despite the monumental effort implied by these difficulties, a number of organizations have committed to undertake such large-scale projects -- see, for example, Lenat et al. [43] and Fox [26]. Also, research in model management promises to provide frameworks for such systems [7, 9, 17].

Summary

Perhaps the most popular framework for DSS is that outlined by Sprague [69]; it is an architectural framework. The framework embodied by the AIS illustrated in this paper extends Sprague's architectural view while also adding a complementary decision process view. The architectural extension is conceptually simple yet fundamental and involves incorporating an "executable" knowledge-based DSS component [9]. The addition of this component leads to a vertical view of the decision making process. Decision making implies the existence of secondary processes which involve "deciding how to decide." The knowledge-based component, then, is used to support both primary and secondary decision making.

In an AIS, extra-statistical knowledge is used to guide the selection of decision making tools -- in this case, the secondary task of choosing among data and statistical analysis techniques. Knowledge of human cognitive properties is used to display information in a form effective for primary decision making. This basic, leveled approach is applicable to a wide range of decision making situations. See, for example, the discussion of a DSS for selecting general decision making strategies [80].

The distinction between DSS generators and DSS tools affords a useful summary of systems like the AIS. A DSS generator is a user-friendly system with which end users can develop specific DSSs. The use of DSS tools for data analysis, such as statistical analysis packages and database management systems, requires substantive technical training [79]. The inclusion of an expert component, typified by the AIS, integrates assorted DSS tools into a fairly robust DSS generator. Further, since the AIS contains primary and secondary decision making knowledge, it should be able to generate specific DSS models from little more than users' problem statements.

Despite the promise of large-scale, knowledge-based DSS, a number of major challenges remain. These include development of robust, integrated user interface, data management, model management, and knowledge management subsystems. Moreover, the fundamental challenge in developing any knowledge-based system remains the effective identification and encoding of appropriate knowledge.

References

[1] Ackoff, R.L. "Management Misinformation Systems," Management Science, Volume 14, Number 4, December 1967, pp. B147-B156.

[2] Anderson, N.H. and Jacobson, A. "Effect of Stimulus Inconsistency and Discounting Instructions in Personality Impression Formation," Journal of Personality and Social Psychology, Volume 2, Number 4, April 1965, pp. 531-539.

[3] Andrews, F.M., Klem, L., Davidson, T.N., O'Malley, P.M. and Rodgers, W.L. A Guide to Selecting Statistical Techniques for Analyzing Social Science Data, Ann Arbor, Michigan, 1976.

[4] Atkinson, M.P. and Kulkarni, K.G. "Experimenting with the Functional Data Model," in Databases -- Role and Structure, P.M. Stocker, P.M.D. Gray and M.P. Atkinson (eds.), Cambridge University Press, New York, New York, 1984, pp. 311-338.

[5] Batson, C.D. "Rational Processing or Rationalization?: The Effect of Disconfirming Information on Stated Religious Belief," Journal of Personality and Social Psychology, Volume 32, Number 1, January 1975, pp. 176-184.

[6] Benbasat, I. and Taylor, R. "Behavioral Aspects of Information Processing for the Design of Management Information Systems," IEEE Transactions on Systems, Man, and Cybernetics, Volume SMC-12, Number 4, July/August 1982, pp. 439-450.

[7] Blanning, R.W. "Conversing with Management Information Systems in Natural Language," Communications of the ACM, Volume 27, Number 3, March 1984, pp. 201-207.


[8] Blum, R.L. Discovery and Representation of Causal Relationships from a Large Time-Oriented Clinical Database: The RX Project, Springer-Verlag Lecture Notes in Medical Informatics, Berlin, Germany, 1982.

[9] Bonczek, R.H., Holsapple, C.W. and Whinston, A.B. "The Evolving Roles of Models in Decision Support Systems," Decision Sciences, Volume 11, Number 2, April 1980, pp. 337-356.

[10] Borgida, E. and Nisbett, R.E. "The Differential Impact of Abstract vs. Concrete Information on Decisions," Journal of Applied Social Psychology, Volume 7, Number 3, 1977, pp. 258-271.

[11] Bowman, E.H. "Consistency and Optimality in Management Decision Making," Management Science, Volume 9, Number 2, January 1963, pp. 310-321.

[12] Bruner, J.S., Goodnow, J.J. and Austin, G.A. A Study of Thinking, John Wiley & Sons, New York, New York, 1956.

[13] Bruner, J.S. and Postman, L.J. "On the Perception of Incongruity," Journal of Personality, Volume 18, Number 2, December 1949, pp. 206-223.

[14] Chervany, N.L. and Dickson, G.W. "An Experimental Evaluation of Information Overload in a Production Environment," Management Science, Volume 20, Number 10, June 1974, pp. 1335-1344.

[15] Christensen-Szalanski, J.J. and Beach, L.R. "Experience and the Base-Rate Fallacy," Organizational Behavior and Human Performance, Volume 29, Number 2, April 1982, pp. 270-278.

[16] Dearborn, D.C. and Simon, H.A. "Selective Perception: A Note on the Departmental Identification of Executives," Sociometry, Volume 21, Number 2, 1958, pp. 140-144.

[17] Dolk, D. and Konsynski, B.R. "Knowledge Representation for Model Management Systems," IEEE Transactions on Software Engineering, Volume 10, Number 6, November 1984, pp. 619-628.

[18] Ebert, R.J. "Environmental Structure and Programmed Decision Effectiveness," Management Science, Volume 19, Number 4, December 1972, pp. 435-445.

[19] Edgell, S.E. and Hennessey, J.E. "Irrelevant Information and Utilization of Event Base Rates in Nonmetric Multiple Cue Probability Learning," Organizational Behavior and Human Performance, Volume 26, Number 1, August 1980, pp. 1-6.

[20] Egeth, H. "Selective Attention," Psychological Bulletin, Volume 67, Number 1, January 1967, pp. 41-57.

[21] Einhorn, H. and Hogarth, R. "Behavioral Decision Theory: Processes of Judgment and Choice," Annual Review of Psychology, Volume 32, 1981, pp. 53-88.

[22] Estes, W.K. "The Cognitive Side of Probability Learning," Psychological Review, Volume 83, Number 1, January 1976, pp. 37-64.

[23] Feldman, M.S. and March, J.G. "Information in Organizations as Signal and Symbol," Administrative Science Quarterly, Volume 26, Number 2, June 1981, pp. 171-186.

[24] Fischhoff, B. "Debiasing," in Judgment Under Uncertainty: Heuristics and Biases, D. Kahneman, P. Slovic and A. Tversky (eds.), Cambridge University Press, New York, New York, 1981.

[25] Fischhoff, B., Slovic, P. and Lichtenstein, S. "Fault Trees: Sensitivity of Estimated Failure Probabilities to Problem Presentation," Journal of Experimental Psychology: Human Perception and Performance, Volume 4, Number 2, 1978, pp. 330-344.

[26] Fox, M.S. "The Intelligent Management System: An Overview," Working Paper, Intelligent Systems Laboratory, The Robotics Institute, Carnegie-Mellon University, August 11, 1981.

[27] Ganster, D.C. "Individual Differences and Task Design: A Laboratory Experiment," Organizational Behavior and Human Performance, Volume 26, Number 1, August 1980, pp. 131-148.

[28] Gettys, C.F., Kelly, C.W., III and Peterson, C.R. "The Best Guess Hypothesis in Multistage Inference," Organizational Behavior and Human Performance, Volume 10, Number 3, 1973, pp. 364-373.

[29] Goldberg, L.R. "Man Versus Model of Man: A Rationale, Plus Some Evidence for a Method of Improving on Clinical Inferences," Psychological Bulletin, Volume 73, Number 6, June 1970, pp. 422-432.

[30] Hogarth, R.M. and Makridakis, S. "The Value of Decision Making in a Complex Environment: An Experimental Approach," Management Science, Volume 27, Number 1, January 1981, pp. 93-107.

[31] Howell, W.C. and Kerkar, S.P. "A Test of Task Influences in Uncertainty Measurement," Organizational Behavior and Human Performance, Volume 30, Number 3, December 1982, pp. 365-390.

[32] Huber, G.P. "Cognitive Style as a Basis for MIS and DSS Designs: Much Ado About Nothing?" Management Science, Volume 29, Number 5, May 1983, pp. 567-579.

[33] Jenkins, H.M. and Ward, W.C. "Judgment of Contingency Between Responses and Outcomes," Psychological Monographs: General and Applied, Volume 79, Number 1, 1965, Whole No. 594, pp. 1-17.

[34] Kahneman, D. and Tversky, A. "Subjective Probability: A Judgment of Representativeness," Cognitive Psychology, Volume 3, Number 3, July 1972, pp. 430-454.

[35] Kahneman, D. and Tversky, A. "On the Psychology of Prediction," Psychological Review, Volume 80, Number 4, July 1973, pp. 237-251.

[36] Kehler, T.P., Friedland, P., Pople, H., Roboh, R. and Rosenberg, S. "Industrial Strength Knowledge Bases: Issues and Experiences," Proceedings of the Eighth International Joint Conference on Artificial Intelligence, Karlsruhe, West Germany, August 8-12, 1983, pp. 108-109.

[37] Kent, G. The Brains of Men and Machines, Byte Books, New York, New York, 1982.

[38] Kleinmuntz, D. "Cognitive Heuristics and Feedback in a Dynamic Decision Environment," Working Paper 83/84-4-23, University of Texas at Austin, Austin, Texas, 1984.

[39] Knafl, K. and Burkett, G. "Professional Socialization in a Surgical Specialty: Acquiring Medical Judgment," Social Science and Medicine, Volume 9, Number 7, July 1975, pp. 397-404.

[40] Kunreuther, H. "Limited Knowledge and Insurance Protection," Public Policy, Volume 24, Number 2, Spring 1976, pp. 227-261.

[41] Langer, E.J. "The Psychology of Chance," Journal for the Theory of Social Behavior, Volume 7, Number 2, October 1977, pp. 185-207.

[42] Lathrop, R.G. "Perceived Variability," Journal of Experimental Psychology, Volume 73, Number 4, April 1967, pp. 498-502.

[43] Lenat, D.B., Borning, A., McDonald, D., Taylor, C. and Weyer, S. "Knowsphere: Building Expert Systems with Encyclopedic Knowledge," Proceedings of the Eighth International Joint Conference on Artificial Intelligence, Karlsruhe, West Germany, August 8-12, 1983, pp. 167-169.

[44] Lichtenstein, S.C., Slovic, P., Fischhoff, B., Layman, M. and Combs, B. "Judged Frequency of Lethal Events," Journal of Experimental Psychology: Human Learning and Memory, Volume 4, Number 6, November 1978, pp. 551-578.

[45] Makridakis, S. and Hibon, M. "Accuracy of Forecasting: An Empirical Investigation," Journal of the Royal Statistical Society A, Volume 142, Part II, 1979, pp. 97-145.

[46] March, J.G. and Shapira, Z. "Behavioral Decision Theory and Organizational Decision Theory," in Decision Making: An Interdisciplinary Inquiry, G.R. Ungson and D.N. Braunstein (eds.), Kent, Boston, Massachusetts, 1982, pp. 92-115.

[47] Mason, R.O. and Moskowitz, H. "Conservatism in Information Processing: Implications for Management Information Systems," Decision Sciences, Volume 3, Number 4, October 1972, pp. 35-55.

[48] McIntyre, S.H. and Ryans, A.B. "Task Effects on Decision Quality in Traveling Salesperson Problems," Organizational Behavior and Human Performance, Volume 32, Number 3, December 1983, pp. 344-369.

[49] Michaelsen, R. and Michie, D. "Expert Systems in Business," DATAMATION, Volume 29, Number 11, November 1983, pp. 240-246.

[50] Mintzberg, H., Raisinghani, D. and Theoret, A. "The Structure of 'Unstructured' Decision Processes," Administrative Science Quarterly, Volume 21, Number 2, June 1976, pp. 246-275.

[51] Moskowitz, M. and Miller, J. "Information and Decision Systems for Production Planning," Management Science, Volume 22, Number 3, November 1975, pp. 359-370.

[52] Moskowitz, M., Schaefer, R.E. and Borcherding, K. "Irrationality of Managerial Judgments: Implications for Information Systems," Omega, Volume 4, Number 2, June 1976, pp. 125-140.

[53] Norman, D.A. "Stages and Levels in Human-Machine Interaction," International Journal of Man-Machine Studies, Volume 21, Number 4, October 1984, pp. 365-375.

[54] Payne, J.W. "Task Complexity and Contingent Processing in Decision Making: An Information Search and Protocol Analysis," Organizational Behavior and Human Performance, Volume 16, Number 2, August 1976, pp. 366-387.

[55] Pitz, G.F. "Decision Making and Cognition," in Decision Making and Change in Human Affairs, H. Jungermann and G. De Zeeuw (eds.), D. Reidel Publishing, Dordrecht, Holland, 1975, pp. 403-424.

[56] Pitz, G.F. and Sachs, N.J. "Judgment and Decision: Theory and Application," Annual Review of Psychology, Volume 35, 1984, pp. 139-163.

[57] Remus, W.E. "Bias and Variance in Bowman's Managerial Coefficient Theory," Omega, Volume 5, Number 3, September 1977, pp. 349-351.

[58] Remus, W.E. "An Empirical Investigation of the Impact of Graphical and Tabular Data Presentations on Decision Making," Management Science, Volume 30, Number 5, May 1984, pp. 533-542.

[59] Remus, W.E., Carter, P. and Jenicke, L. "Regression Models of Decision Rules in Unstable Environments," Journal of Business Research, Volume 7, Number 2, 1979, pp. 187-196.

[60] Sage, A.P. "Behavioral and Organizational Considerations in the Design of Information Systems and Processes for Planning and Decision Support," IEEE Transactions on Systems, Man, and Cybernetics, Volume SMC-11, Number 9, September 1981, pp. 640-678.

[61] Shaklee, H. and Fischhoff, B. "Strategies of Information Search in Causal Analysis," Decision Research Report 79-1, University of Oregon, Eugene, Oregon, 1979.

[62] Shipman, D.W. "The Functional Data Model and the Data Language DAPLEX," ACM Transactions on Database Systems, Volume 6, Number 1, March 1981, pp. 140-173.

[63] Shortliffe, E. "Medical Consultation Systems: Designing for Doctors," in Designing for Human-Computer Interaction, M. Sime and M. Coombs (eds.), Academic Press, New York, New York, 1983, pp. 209-238.

[64] Slovic, P. "Value as a Determiner of Subjective Probability," IEEE Transactions on Human Factors in Electronics, Volume HFE-7, Number 1, 1966, pp. 22-28.

[65] Slovic, P., Fischhoff, B. and Lichtenstein, S. "Behavioral Decision Theory," Annual Review of Psychology, Volume 28, 1977, pp. 363-396.

[66] Slovic, P. and Lichtenstein, S. "Comparison of Bayesian and Regression Approaches to the Study of Information Processing in Judgment," Organizational Behavior and Human Performance, Volume 6, Number 6, November 1971, pp. 649-744.

[67] Smedslund, J. "The Concept of Correlation in Adults," Scandinavian Journal of Psychology, Volume 4, Number 3, 1963, pp. 165-173.

[68] Sprague, R.H. and Carlson, E.D. Building Effective Decision Support Systems, Prentice-Hall, Englewood Cliffs, New Jersey, 1982.

[69] Sprague, R.H. "A Framework for the Development of Decision Support Systems," MIS Quarterly, Volume 4, Number 4, December 1980, pp. 1-26.

[70] Stabell, C.B. "Integrative Complexity of Information Environment Perception and Information Use," Organizational Behavior and Human Performance, Volume 22, Number 1, August 1978, pp. 116-142.

[71] Timmers, H. and Wagenaar, W.A. "Inverse Statistics and Misperception of Exponential Growth," Perception and Psychophysics, Volume 21, Number 6, June 1977, pp. 558-562.

[72] Tversky, A. and Kahneman, D. "The Belief in the 'Law of Small Numbers'," Psychological Bulletin, Volume 76, Number 2, August 1971, pp. 105-110.

[73] Tversky, A. "Availability: A Heuristic for Judging Frequency and Probability," Cognitive Psychology, Volume 5, Number 2, September 1973, pp. 207-232.

[74] Tversky, A. "Judgment Under Uncertainty: Heuristics and Biases," Science, Volume 185, September 27, 1974, pp. 1124-1131.


[75] Wagenaar, W.A. and Sagaria, S.D. "Misperception of Exponential Growth," Perception and Psychophysics, Volume 18, Number 6, December 1975, pp. 416-422.

[76] Wagenaar, W.A. and Timmers, H. "Extrapolation of Exponential Time Series is Not Enhanced by Having More Data Points," Perception and Psychophysics, Volume 24, Number 2, August 1978, pp. 182-184.

[77] Ward, W.C. and Jenkins, H.M. "The Display of Information and the Judgment of Contingency," Canadian Journal of Psychology, Volume 19, Number 3, September 1965, pp. 231-241.

[78] Wason, P.C. and Johnson-Laird, P.N. Psychology of Reasoning: Structure and Content, Harvard University Press, Boston, Massachusetts, 1972.

[79] Mann, R.I. and Watson, H.J. "A Contingency Model for User Involvement in DSS Development," MIS Quarterly, Volume 8, Number 1, March 1984, pp. 27-38.

[80] Wedley, W.C. and Field, R.H.G. "A Predecision Support System," Academy of Management Review, Volume 9, Number 4, October 1984, pp. 696-703.

[81] White, D.J. "The Nature of Decision Theory," in Theories of Decision in Practice, D.J. White and K.C. Bowen (eds.), Hodder and Stoughton, London, England, 1975, pp. 3-16.

[82] Winston, P.H. Artificial Intelligence, Addison-Wesley, Reading, Massachusetts, 1977.

[83] Wright, J.C. and Murphy, G.L. "The Utility of Theories in Intuitive Statistics: The Robustness of Theory-based Judgments," Journal of Experimental Psychology: General, Volume 113, Number 2, June 1984, pp. 301-322.

About the Authors

William E. Remus is Professor of Decision Sciences at the University of Hawaii. During the last decade he has published over two dozen articles in journals such as Management Science, the International Journal of Management Science (OMEGA), and the Journal of Business Research. He has been funded by the National Science Foundation for research into behavioral decision making and was a Fulbright scholar at the National University of Malaysia in 1980. His current areas of research include man-machine interfaces and applications of artificial intelligence.

Jeffrey E. Kottemann is Assistant Professor of Decision Sciences at the University of Hawaii. He received his Ph.D. in Management Information Systems and Quantitative Methods from the University of Arizona in 1984. He has published articles in the Journal of MIS and in Information Systems. His current research interests include information system development environments, construction of software to integrate data, document, and knowledge base management, and empirical research into the impacts of DSS on decision making behavior.

418 MIS Quarterly/December 1986