
Automatic Formulation of the Auditor’s Opinion with AREX: With an Application to Egypt

Mohamed A. Wahdan Menoufia University, Egypt - Maastricht School of Management, The Netherlands ([email protected])

Pieter Spronck

IKAT, Faculty of Liberal Arts and Sciences, Universiteit Maastricht, The Netherlands

Hamdi F. Ali Maastricht School of Management, The Netherlands

Eddy Vaassen

AIM, Faculty of Economics, Universiteit Maastricht, The Netherlands

H. Jaap van den Herik IKAT, Faculty of Liberal Arts and Sciences, Universiteit Maastricht, The Netherlands

Abstract

This paper reports on research into knowledge-based systems (KBS) in auditing. It focuses on constructing, implementing, and validating a KBS that aims at helping auditors formulate their opinions on financial statements. To formulate their opinions, auditors use a “personal-judgement” approach, which is heavily dependent on their experience and expertise. The approach has four drawbacks: (1) it is ineffective, (2) it may lead to different decisions, (3) it suffers from personal bias, and (4) it may even generate misleading judgements. Therefore, the challenging questions are: (1) To what extent is it possible to automate the formulation of the auditor’s opinion? and (2) To what extent is a KBS effective, efficient, and acceptable as a tool to formulate the auditor’s opinion? This paper focuses on constructing, implementing, and validating a KBS, called “Auditor’s Report EXpert” (AREX), to formulate the auditor’s opinion on financial statements. The knowledge used by AREX is acquired from the literature, and from practicing and academic auditors through questionnaires and in-depth interviews. The results of the validation and evaluation process (using test cases and actual auditing cases) indicate that AREX performs the auditor’s opinion task adequately. We conclude that AREX is successful and promising.

1. Introduction

This paper focuses on constructing, implementing, and validating a knowledge-based system (KBS) that aims at helping auditors formulate an opinion on financial statements to be expressed in the auditor’s report. Corporations are required by law to produce annual financial statements, which are accompanied by the auditor’s report. The report presents an independent auditor’s opinion on the fairness of the financial statements. To formulate their opinions, auditors use a “personal-judgement” approach, which is heavily dependent on their experience and expertise. This approach may be (1) ineffective and may lead to (2) different decisions, (3) personal bias, and (4) even misleading judgements (cf. O'Leary, 2003). These four drawbacks of a human auditor are a source of doubt and hesitation for corporations and users of financial statements. The paper investigates (1) to what extent it is possible to automate the formulation of an auditor’s report with a KBS called the “Auditor’s Report EXpert” (AREX), and (2) to what extent a KBS is effective, efficient, and acceptable as a tool to formulate the auditor’s opinion. Such a KBS would be useful in supporting auditors in their tasks, especially in environments where there is a lack of experience or expertise in formulating the auditor’s opinion. Designing the system is a difficult task, because of the high complexity of the audit environment and because of the personal-judgement approach used by auditors. A KBS that is able to formulate the auditor’s opinion and does so adequately will reduce the inconsistencies of personal judgements (cf. Brown and Murphy, 1990; Flory, 1991; McDuffie et al., 1993; O’Leary, 2003). Hence, a KBS for the auditor’s opinion task may be considered a considerable help to the members of the International Federation of Accountants (IFAC). It may expedite and harmonize auditors’ opinions, thus making those opinions more reliable. Additionally, a KBS for such a task could also be used as an internal training tool at auditing firms to build up the experience of junior auditors (Young, 1994; Changchit, 2003).


AREX is targeted in particular at the accounting practice in Egypt, which, as a developing country, lacks sufficient auditors experienced in formulating the auditor’s opinion. Use of a KBS will increase the likelihood that the Egyptian auditors’ opinions on financial statements comply with the International Standards on Auditing (ISA) (IFAC, 2006). AREX has encoded all the knowledge associated with the auditor’s opinion on financial statements. It supplies auditors with relevant information in three phases of the auditing process, namely (1) basic information gathering, (2) audit procedures selection, and (3) formulation of appropriate opinions. AREX helps assess the control risk, the preliminary materiality, and the planned detection risk (Wahdan et al., 2005b). To implement AREX, knowledge was acquired from published academic materials, periodicals, and the ISA. Knowledge was also elicited from practicing and academic auditors through questionnaires and in-depth interviews, using the Knowledge Acquisition and Design Systems (KADS) methodology (cf. Schreiber et al., 1993; Post, Wielinga, and Schreiber, 1997). AREX is implemented using the Knowledge Representation Objects Language (KROL) (Shaalan et al., 1998). After implementation, the knowledge base was validated by experienced auditors. The auditors were selected based on at least one of the following three criteria: (i) the number of years of experience (at least 10 years), (ii) the level of education, and/or (iii) work at international auditing firms. A pilot study was carried out to test the clarity and validity of the questions in all questionnaire lists (available from the first author). Finally, AREX’s performance was validated and evaluated using test cases and actual auditing cases.

The outline of this paper is as follows. Section 2 discusses background information and related work. Section 3 describes the conceptual model of AREX. Section 4 deals with the acquisition of the AREX knowledge. Section 5 presents the AREX implementation. Section 6 presents the validation and evaluation methodology of AREX. Section 7 provides our main conclusions and points at future work.

2. Background Information

This section presents background information on the auditor’s report (2.1), the complexity of the audit environment (2.2), the ineffectiveness of the personal-judgement approach (2.3), the question whether formulating the auditor’s opinion is a suitable task for a KBS (2.4), and KBSs in auditing (2.5).

2.1 The Auditor’s Report

The auditor’s report is the final stage of the audit process. It is used to communicate the auditor’s findings to the auditee. A director of a company is mainly interested in presenting the results of the company’s operations as favourably as possible. This interest may conflict with the objective of preparing accounts that present a fair view. The auditor’s opinion may lend credibility to the financial statements by validating the techniques and procedures used to report the company’s results. To achieve this credibility, it is generally accepted that an independent auditor’s opinion should confirm that the financial statements fairly present the financial position, the results of operations, and the cash flows of the company (Guy et al., 2003; Arens et al., 2005). The auditor is responsible for checking the compliance with the accounting principles and attesting that the financial statements are fairly presented (Whittington and Pany, 2003; PCAOB, 2004; Hayes et al., 2005).
Only experienced auditors will lend their credibility to financial statements.

2.2 The complexity of the Audit Environment

An auditor should comply with a set of auditing standards that may differ from one country to another, which complicates the audit environment, in particular when auditing multinational firms (Needles and Pomeranz, 1985). Although most countries follow the International Standards on Auditing (ISA), each country publishes its own standards that are in compliance with the ISA. We may therefore expect some differences in auditing standards among IFAC members (cf. Ernst & Young, 2005; PriceWaterhouseCoopers, 2004, 2005, 2006). Moreover, legislators frequently change the predefined auditees’ situations and the auditing standards, which makes the audit environment even more detailed and complex. Therefore, the interference of legislators adds another factor that may complicate the audit environment. As an example, we mention that the US Congress passed the Sarbanes-Oxley Act in 2002. Section 404 of that Act requires the auditor to report on the effectiveness of the internal controls over financial reporting when auditing the financial statements (PCAOB, 2004). The auditor’s opinion on financial statements is influenced by three factors: (1) the auditor’s professional characteristics (e.g., independence, scope of responsibility, competitiveness, and expertise); (2) the characteristics of the audit environment (e.g., ability to collect evidence, effectiveness of the internal controls, and the auditor’s compliance with applicable auditing standards and other regulations); and (3) the auditee’s characteristics (e.g., going-concern ability, disclosure of the accounting principles, the auditee’s compliance with accounting principles, and fairness of the representation of the financial statements). Each of these factors requires an analysis and assessment before auditors are able to formulate their professional opinions.

2.3 The ineffectiveness of the personal-judgement approach

Auditors depend on their personal judgements during the course of the audit. This may lead to different auditors reaching different decisions, depending, among others, on their experience and expertise (Libby, 1995; O’Leary, 2003). Given the same audit situation, one auditor’s opinion may differ from another’s, even to the extent that one gives an unqualified opinion and the other gives a qualified opinion subject to some established uncertainties (i.e., in the case of going-concern uncertainties) (cf. Libby, 1995; O’Leary, 2003). In summary, the approach may lead to difficulties in estimating audit risks. Consequently, it may lead to inconsistencies and to difficulties in determining to what extent the audit procedures for obtaining evidence are sufficient. For example, auditors may differ in their opinions (1) on compliance with the accounting principles, (2) on the materiality level (Arens et al., 2005), and (3) on the consistency in applying the accounting principles (Wheeler, 1990; Pany and Ray, 2001). In this paper, we address all three issues.

2.4 Is the task suitable?

To determine whether formulating the auditor’s opinion is a suitable task for a KBS, we need to ascertain that the task is (a) not trivial, (b) useful, and (c) appropriate to be developed with KBS technology (cf. Abdolmohammadi and Kazaz, 1995; Karan et al., 1995). In answer to (a), we can say that the task is far from trivial: it is detailed and complex, as discussed in 2.2. In answer to (b), we can say that the automatic formulation of the auditor’s opinion is quite useful (as discussed in Section 1) and leads to many advantages, in particular in Egypt. Because the task is relatively unstructured and complex, higher-level expertise and judgement are crucial to perform it well, so senior staff are required. Such staff usually work under time pressure and, as a consequence, the need for a KBS to perform this task increases. In answer to (c), we can say that the task of formulating the auditor’s opinion is well defined by the ISA, so the knowledge and expertise required to perform the task are available. Therefore, we can build a KBS to formulate the auditor's opinion on financial statements.

2.5 KBSs in Auditing

Previous KBSs in auditing did not deal with the audit process as a whole. Instead, they dealt with limited decisions within an audit (Abdolmohammadi and Kazaz, 1995; Wahdan, 2006). The main restriction of these systems was that the knowledge bases reflected only the expertise of a single practitioner. Therefore, the ability to generalise the systems’ conclusions was restricted (Changchit, Holsapple, and Viator, 2001). Furthermore, most KBSs developed in auditing did not reflect any actual decision making in audit firms. These systems performed well on test cases, but their performance declined on actual audit cases (Murphy and Yetmar, 1996; Hornik and Ruf, 1997; Collier et al., 1999; Lenard et al., 2001; Lenard, 2003; McDuffie and Smith, 2005). Moreover, previous studies ignored the role of users in developing a knowledge base and building an explanation facility (Akoka and Comyn-Wattiau, 1996; Akoka and Comyn-Wattiau, 1997; Mak et al., 1997; Bayraktar, 1998).
Indeed, most KBSs developed in auditing did not provide an explanation facility, which shows why a question was asked and how a particular conclusion was reached (Comyn-Wattiau and Akoka, 1997; Lenard, Madey, and Alam, 1998; Changchit et al., 2001). So far, a KBS for formulating the auditor’s opinion on financial statements has received little attention in the literature. Much attention was given to the acquisition (from the literature) of the knowledge required for this task. To the best of our knowledge, previous research has failed to deal adequately with the irregularities, inconsistencies, and complexities of the task of formulating the auditor’s opinion (Wahdan et al., 2005b). Up to now, no single KBS has been developed which executes this task in practice, as we discovered during a survey among many local and international audit firms in Egypt and The Netherlands (the preliminary survey was carried out in 2003 and is available from the first author).

3. A Conceptual Model of AREX

The conceptual model of AREX focuses on the final stage of the auditing process, which consists of four tasks, namely (1) accumulating final audit evidence, (2) reviewing subsequent events that have happened after the year-end, (3) evaluating the auditor’s findings, and (4) issuing the auditor’s report (Arens et al., 2005). Before these tasks can be started, the model should (1) test the completeness of the prior auditing stages and (2) collect the results of these stages. To achieve this, the overall conceptual model of AREX consists of eight models, as illustrated in Figure 1 (Wahdan et al., 2005a).


The output of the model of examining controls (1) represents the input of the preliminary materiality model (2) and the model of assessing planned detection risk (3). The output of these two models (2,3) represents the input of the auditing-standards model (4). The output of the auditing-standards model (4), the accounting-principles model (5), the model of fairness of representation (6), and the going-concern model (7) together form the input of the auditor’s opinion model (8), which formulates the auditor’s opinion on financial statements.
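To make this dataflow concrete, the following minimal Python sketch wires eight placeholder functions together in the order just described. The function names, inputs, and the simple decision rules are illustrative assumptions for exposition only; AREX itself encodes this logic as KROL rules (see Section 5).

# Illustrative sketch of the dataflow between the eight AREX models described above.
# All names and decision rules here are simplified assumptions, not AREX's KROL code.

def examine_controls(facts):                         # model 1: assess control risk
    return "low" if facts.get("controls_effective") else "high"

def preliminary_materiality(facts):                  # model 2: preliminary materiality judgement
    return 0.05 * facts.get("materiality_base", 0.0)

def planned_detection_risk(control_risk):            # model 3: planned detection risk
    return "high" if control_risk == "low" else "low"

def auditing_standards_ok(materiality, detection_risk, facts):   # model 4
    return facts.get("sufficient_evidence", False)

def accounting_principles_ok(facts):                 # model 5
    return facts.get("principles_complied", False)

def fairly_presented(facts):                         # model 6
    return facts.get("fair_presentation", False)

def going_concern_ok(facts):                         # model 7
    return not facts.get("going_concern_uncertainty", False)

def auditors_opinion(facts):                         # model 8: combine the upstream outputs
    control_risk = examine_controls(facts)
    materiality = preliminary_materiality(facts)
    detection_risk = planned_detection_risk(control_risk)
    if not auditing_standards_ok(materiality, detection_risk, facts):
        return "disclaimer of opinion (non-compliance with auditing standards)"
    if not accounting_principles_ok(facts) or not fairly_presented(facts):
        return "qualified or adverse opinion"
    if not going_concern_ok(facts):
        return "unqualified opinion with explanation (going-concern uncertainties)"
    return "unqualified opinion (clean opinion)"

if __name__ == "__main__":
    case = {"controls_effective": True, "materiality_base": 1_000_000,
            "sufficient_evidence": True, "principles_complied": True,
            "fair_presentation": True, "going_concern_uncertainty": False}
    print(auditors_opinion(case))        # -> unqualified opinion (clean opinion)

In AREX the combination step distinguishes fourteen opinion variants (see the legal values of the advice attribute in Figure 3); the four-way split above is only meant to show how the outputs of models (1)-(7) feed model (8).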

Figure 1: The conceptual model of AREX.

The eight models of AREX are briefly discussed below.
(1) The model of examining controls provides an assessment of the control risk, which contributes to selecting the audit scope.
(2) The preliminary materiality model provides the preliminary judgement about materiality, which contributes to determining the amount of planned evidence.
(3) The model of assessing planned detection risk provides an assessment of planned detection risk and an audit scope of substantive tests.
(4) The auditing-standards model checks whether the auditor collects appropriate audit evidence and whether the audit complies with applicable auditing standards.
(5) The accounting-principles model tests whether financial statements are prepared in accordance with the applied accounting policies.
(6) The model of fairness of representation tests whether financial statements are fairly presented.
(7) The going-concern model evaluates whether the company has the ability to remain in business and whether management plans are effective to resolve the going-concern uncertainties.
(8) The auditor’s opinion model generates the proper auditor’s report on the financial statements after collecting the outputs from all above models.

In summary, the overall conceptual model provides the required knowledge to formulate the auditor’s opinion. The knowledge is acquired from the literature and from highly experienced auditors in Egypt.

4. Knowledge Acquisition

After the AREX knowledge was collected through an extensive search of the literature, which included textbooks, periodicals, publications, firms’ manuals, and the ISA concerning the formulation of the auditor’s opinion on financial statements, the specialised knowledge from practice was elicited from a set of interviews with a sample of 32 auditors in audit firms in Egypt. To profit most from the interviews, questionnaires were sent to the auditors beforehand. The questionnaires were divided into eight parts, each covering one model. In the last part, which covers the auditor’s opinion model, the questionnaires contained fifteen auditing situations that needed to be handled by the auditors as test cases. The whole process of knowledge acquisition was structured according to the KADS methodology (cf. Wielinga et al., 1992; Schreiber et al., 1993; Post et al., 1997), using the models specified in the previous section. The acquired knowledge was validated by letting auditors review the results of the knowledge acquisition process. Disagreements between auditors were first given to a small sample of the auditors for resolution. If they could not reach consensus, the chief expert made the final decision (Wahdan et al., 2005b).


KROL was used to represent the AREX knowledge base (Shaalan et al., 1998). KROL encompasses multi-paradigm knowledge representations, such as first-order predicate logic, objects, and rules. The combination of object and rule processing provides a firm basis for handling complex problems. To represent the AREX knowledge, we used objects, concepts, properties, prompts, values, and value sources.

5. AREX Implementation

The main challenge for any KBS modelling approach is how to model expertise. KADS is a method for the structured development of a KBS, which aims at providing software engineering support for the knowledge-engineering process (cf. Wielinga et al., 1992; Schreiber et al., 1993). The KADS expertise model distinguishes three types of knowledge, namely (1) domain knowledge, (2) inference knowledge, and (3) task knowledge (Wahdan et al., 2005a). The AREX implementation of these three types of knowledge is discussed in subsections 5.1, 5.2, and 5.3, respectively. Subsection 5.4 discusses the AREX user interface, and subsection 5.5 presents the explanation facility.

5.1 Domain knowledge

Domain knowledge consists of knowledge of a specific system; in this case, it is the knowledge required for creating an auditor’s report. Domain knowledge is represented in the form of rules, facts, objects, hierarchies, properties, and relations. The AREX domain knowledge is stored in a concept hierarchy consisting of objects with their relations. Table 1 lists the number of concepts, properties, questions, and rules within each model and within the whole system. Figure 2 depicts the AREX concept hierarchy. Figure 3 illustrates the implementation of the concept of the auditor’s opinion from this hierarchy.

Figure 2: AREX concept hierarchy. FS: Financial Statements.


Table 1: AREX models, concepts, properties, questions, and rules.

Models of AREX                  Concepts   Properties   Questions   Rules
1. Examining controls               8          73           64        415
2. Preliminary materiality          3          21           17         35
3. Detection Risk                   6          27           14         39
4. Auditing Standards               4          20           17         66
5. Accounting Standards             5          13           12         42
6. Fairness of Representation       5          23           18         23
7. Going-concern                    4          26           20         41
8. Auditor's opinion                3          29           22         37
Total (AREX)                       38         232          185        698

auditor_report :: {
    concept_description('Determine the auditor opinion type') &
    attributes([
        b_departure_necessary([]),
        d_disclosure_Financial_statements([]),
        c_significant_uncertainties([]),
        advice([])
    ]) &
    type(b_departure_necessary/1, nominal) &
    prompt(b_departure_necessary/1,
        'Does the auditor agree that accounting principles departure is necessary?', []) &
    legal(b_departure_necessary/1, [yes, no]) &
    necessary(b_departure_necessary/1) &
    type(d_disclosure_Financial_statements/1, nominal) &
    prompt(d_disclosure_Financial_statements/1,
        'Do financial statements disclose about going concern uncertainties?', []) &
    legal(d_disclosure_Financial_statements/1, [yes, no]) &
    necessary(d_disclosure_Financial_statements/1) &
    type(c_significant_uncertainties/1, nominal) &
    prompt(c_significant_uncertainties/1, 'Are there significant uncertainties?', []) &
    legal(c_significant_uncertainties/1, [yes, no]) &
    necessary(c_significant_uncertainties/1) &
    type(advice/1, nominal) &
    source_of_value(advice/1, [derived(auditor_opinion_financial_statements)]) &
    legal(advice/1, [
        'adverse opinion_going concern uncertainties',
        'adverse opinion_non_compliance with accounting principles',
        'adverse opinion_unfair representation',
        'disclaimer of opinion_non_compliance with auditing standards',
        'disclaimer of opinion_significant uncertainties',
        'qualified opinion_except for non_compliance with auditing standards and collecting evidence',
        'qualified opinion_except for non_compliance with accounting principles',
        'qualified opinion_except for going concern problem',
        'qualified opinion_except for unfair representation',
        'unqualified opinion with explanation_information about necessary departure',
        'unqualified opinion with explanation_information about going concern uncertainties',
        'unqualified opinion with explanation for misstatements that are less than 5%',
        'unqualified opinion with explanation for misstatements more than 10% that are adjusted by client',
        'unqualified opinion_clean opinion'
    ]) &
    target(advice/1, 'The auditor opinion on financial statements should be: ') &
    super(domain_class)
}.

Figure 3: An implementation of the concept (auditor’s report) using KROL.

5.2 Inference Knowledge

Inference knowledge is knowledge used in the reasoning process. In AREX, inference knowledge is stored in the form of rules. AREX generates the auditor’s opinion by applying user-supplied facts to the encoded rules.
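As a language-neutral illustration of this idea, the sketch below applies a small, hypothetical rule table to a fact base and returns the first matching conclusion. The conclusion strings reuse the legal values of the advice attribute shown in Figure 3, but the rule conditions themselves are invented for illustration and are not AREX’s actual rules.

# Hypothetical illustration of rule-based inference: user-supplied facts are matched
# against encoded rules; the first rule whose condition holds yields the advice.
RULES = [
    (lambda f: f["significant_uncertainties"] == "yes"
               and f["disclosure_financial_statements"] == "no",
     "disclaimer of opinion_significant uncertainties"),
    (lambda f: f["significant_uncertainties"] == "yes"
               and f["disclosure_financial_statements"] == "yes",
     "unqualified opinion with explanation_information about going concern uncertainties"),
    (lambda f: f["departure_necessary"] == "yes",
     "unqualified opinion with explanation_information about necessary departure"),
]

def derive_advice(facts, default="unqualified opinion_clean opinion"):
    """Apply the encoded rules to the user-supplied facts; the first match wins."""
    for condition, conclusion in RULES:
        if condition(facts):
            return conclusion
    return default

facts = {"departure_necessary": "no",
         "disclosure_financial_statements": "yes",
         "significant_uncertainties": "yes"}
print(derive_advice(facts))
# -> unqualified opinion with explanation_information about going concern uncertainties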


5.3 Task Knowledge

Task knowledge is knowledge related to the goal of the task and the activities that contribute to achieving the goal (such as decomposition and control). In AREX, eight tasks are distinguished, which correspond to the eight different models displayed in Figure 1. The eight tasks are defined by their input, output, goals, control, and features. When AREX is used, the eight tasks are used to structure the information that the user must supply.
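A minimal sketch of how such task knowledge might be structured is given below; the field values are placeholders chosen for illustration and are not AREX’s actual task definitions.

# Hypothetical structure for task knowledge: each task names its inputs, output, and goal;
# the fixed ordering of the list acts as the control knowledge.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    inputs: list      # information the user or earlier tasks must supply
    output: str       # result passed on to later tasks
    goal: str         # what the task is meant to establish

TASKS = [
    Task("examining controls", ["internal control facts"], "control risk",
         "assess the control risk"),
    Task("preliminary materiality", ["materiality base"], "preliminary materiality",
         "form the preliminary judgement about materiality"),
    # ... the remaining six tasks mirror the models of Figure 1
]

for task in TASKS:    # control: the tasks are worked through in this fixed order
    print(f"{task.name}: needs {task.inputs}, produces '{task.output}' ({task.goal})")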

5.4 User Interface

Users can supply AREX with information in two different ways. The first way (the sequential questions screen) is to let AREX query the user for the information needed: AREX asks the user questions one by one, and each question must be answered before the user can continue. The second way (the sheet screen) is to let AREX present the required information in the form of a sheet, which covers one relation in a model. The user may choose in which order to assign values to properties, and may obtain information on how the system works, why properties are needed, and how intermediate conclusions are derived. The sheet screen is illustrated in Figure 4 (Wahdan et al., 2005c).

5.5 Explanation facility

The most widely used types of explanation are WHY the system asks a particular question and HOW the system has reached a particular conclusion (CLAES, 1995; Dhaliwal and Tung, 2000). Figure 4 illustrates the WHY explanation as used in AREX. It is used during the interaction with the user: when the user is, for instance, requested to enter the value of an attribute, he or she may ask the system “why do you ask me this question”. In AREX, the user then clicks on the WHY icon, as shown in Figure 4, to obtain the answer to this question. The HOW explanation is important when the system reaches a conclusion and the user would like a justification for that conclusion. The HOW explanation consists of the names of the inference steps used in the current session, the names of the relations used in each inference step, and the input and output of each relation used. The HOW explanation of AREX is displayed on a screen containing the values of the output and the input attributes of the selected relation, as shown in the sheet screen of Figure 4 (print icon) and in Table 2. In addition, the help screen of AREX provides further explanation to the user regarding how AREX works and how it justifies its conclusions.

Table 2: The value of the input and the output attributes.

A model of examining controls (Investigation of control procedures)

Concept             Property                                                  Value
control_procedures  a_effective_supervision_controls_internal_auditors       yes
control_procedures  b_separation_custody_assets_accounting                   yes
control_procedures  c_separation_authorisation_custody_assets                yes
control_procedures  d_separation_operational_responsibility_record_keeping   yes
control_procedures  e_separation_IT_users_duties                             yes
control_procedures  f_duties_rotated                                         yes
control_procedures  g_pre_sequential_numbering_documents                     yes
control_procedures  h_timely_preparation_documents                           yes
control_procedures  i_proper_records_keeping                                 yes
control_procedures  j_controls_data_recording                                yes

Investigation of control procedures: 'supports the low level of control risk'
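To make the WHY/HOW mechanism of subsection 5.5 more tangible, here is a small hypothetical sketch in which a rationale attached to each prompt answers WHY, and a recorded inference trace answers HOW. The class and the example values (taken from Table 2) are illustrative and do not reproduce AREX’s KROL explanation facility.

# Hypothetical WHY/HOW explanation facility: WHY = rationale stored with each prompt,
# HOW = trace of the inference steps (relation, inputs, output) used in the session.
class ExplainableSession:
    def __init__(self):
        self.trace = []

    def ask(self, prompt, rationale, answer):
        # WHY explanation: show the stored rationale for asking this question
        print(f"Q: {prompt}\n   WHY: {rationale}")
        return answer

    def infer(self, relation, inputs, output):
        # record the step so it can be replayed for the HOW explanation
        self.trace.append((relation, inputs, output))
        return output

    def how(self):
        # HOW explanation: replay each inference step with its inputs and output
        for relation, inputs, output in self.trace:
            print(f"HOW: {relation}: {inputs} -> {output}")

session = ExplainableSession()
answer = session.ask("Are duties rotated?",
                     "Rotation of duties is one of the control procedures that supports "
                     "a low assessment of control risk.",
                     "yes")
session.infer("investigation_of_control_procedures",
              {"f_duties_rotated": answer},
              "supports the low level of control risk")
session.how()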


Figure 4: Sheet screen and WHY and HOW explanations.

6. Validation and evaluation methodology of AREX

The validation and evaluation methodology of AREX includes testing the effectiveness, efficiency, and acceptance of AREX in performing the task of formulating the auditor’s opinion. The preliminary validation phase (6.1) uses test cases to check whether the AREX prototype is working correctly. The AREX prototype is validated to ensure that it asks the right questions in the expected order and that there are no unnecessary or unexpected questions. The field-tests validation phase (6.2) tests the reliability of AREX: AREX is presented to a sample of auditors who use it on actual auditing cases. A comparison between AREX’s opinions and the auditors’ opinions is made.


The auditors’ feedback is included as far as possible to improve AREX. The two phases are aimed at measuring AREX’s accuracy. The accuracy of AREX’s advice compared to that of the auditors is a major criterion for evaluating AREX (cf. Ram & Ram, 1996; Borenstein, 1998). The auditors’ evaluation phase (6.3) tests the practicability of AREX: the auditors are asked to evaluate the effectiveness, efficiency, and acceptance of AREX in practice. These phases ensure, to a certain extent, the reliability of AREX’s advice. The challenging question still is: how is the validation and evaluation process precisely executed? Figure 5 illustrates the methodology of the validation and evaluation process of AREX.

Figure 5: Validation and evaluation process of AREX.

6.1 Preliminary validation

A preliminary validation of AREX was carried out in Egypt; it was performed in two stages. In the first stage, questionnaire No. 1 (available from the first author) was submitted to 32 auditors. It consisted of fifteen auditing cases that needed to be handled by the auditors as test cases. The auditors were asked to choose one type from five types of auditor’s opinions and to formulate their opinions for each of the cases. The test cases were handled by AREX too. The opinions generated by the AREX prototype were compared to the auditors’ opinions. The outcome of the comparison indicated that in two cases AREX arrived at opinions different from those arrived at by the majority of the auditors. The differences concerned an assessment of the importance of the work of another (additional) auditor. It should be noted that the Egyptian auditors do not apply the particular standard that takes into account another (additional) auditor’s work. In Egypt, two auditors review the company’s accounts and issue one report, which is signed by both of them. In the remaining thirteen cases, it was noticed that in some cases some auditors made different decisions (the percentage of disagreements was 23%). We discussed the reasons for the different opinions with the auditors during the interviews. After discussion, we arrived at the conclusion that the AREX prototype performed the task of formulating the auditor’s opinion more accurately and consistently than the auditors; as a consequence, the auditors revised their opinions in accordance with AREX’s opinions. Therefore, we may conclude that the reports generated by AREX are of the same quality as those formulated by experienced auditors (Wahdan et al., 2005b). In the second stage, three auditors applied the AREX prototype to three of their own hypothetical cases. The results indicated that AREX performed the task of formulating the auditor’s opinion in a manner identical to their own formulation. They considered AREX to be a useful tool for formulating the auditor’s opinion on financial statements in practice. They expressed that AREX may provide a solid foundation for the audit process through its definition of concepts, properties, and relations. AREX may help train new auditors by providing them with structured knowledge for formulating the auditor’s opinion. AREX’s performance was considered acceptable and its task knowledge relevant.

6.2 Field-tests validation

The field-tests validation of AREX using real auditing clients was carried out as described below.

Participants in the validation process
A judgemental sample of 26 highly experienced auditors (8 of whom participated in the knowledge-elicitation phase and the preliminary validation phase) from various audit firms in Egypt participated in the field-tests validation of AREX. The auditors were selected based on at least one of the following three criteria: (i) having at least ten years of experience, (ii) their level of education (at least a bachelor’s degree with a high level of computer skills and English proficiency), and/or (iii) their work at international audit firms in Egypt. Six of the auditors were accounting and auditing university professors from the six largest universities in Egypt. The remaining twenty auditors were practitioners; twelve of them were partners or vice partners in audit firms, and eight of them were managers or senior managers. Two practitioners had doctoral degrees, ten were CPAs or had master’s degrees, and eight had bachelor’s degrees. The participants represented fourteen audit firms: three “Big 4” firms (KPMG, PricewaterhouseCoopers, and Ernst & Young), three local firms with a foreign partner, and eight local audit firms.

Validation of AREX
We used a face validation, i.e., submitting AREX to a judgemental sample of experienced auditors in order to elicit their comments on how accurately AREX performs the task of formulating the auditor’s opinion. This procedure was carried out during the validation process of AREX and of its models. The validation of AREX was done by having each auditor select one or more auditing cases from his files and compare his results with the AREX results. This was done for 48 cases of different auditing clients. The field-tests validation was carried out in two stages, as follows. The first stage was to check the correctness of the implementation. The auditors used AREX on six actual auditing cases. During these six trial cases, syntactic errors were discovered in the AREX code. Hence, AREX’s opinions and the auditors’ opinions differed. We discussed these differences with the auditors during the interviews. The auditors suggested that adding some properties to several of the models and concepts would help. The errors were corrected and the auditors’ feedback was included as far as possible to improve AREX. Consequently, some rules in the knowledge base were changed, added, or deleted. Moreover, the knowledge represented was refined and the user interface was enhanced to satisfy the auditors’ requirements. In the second stage, the auditors used AREX on the remaining 42 auditing cases. In 41 of these cases, AREX’s opinions complied with the auditors’ opinions, as shown in Table 3. There was one disagreement between AREX’s opinion and the auditor’s opinion: the auditor’s opinion was a qualified opinion, except for some established uncertainties, in a case with multiple uncertainties, whereas AREX’s opinion was a disclaimer of opinion, as required by ISA 570. We discussed the case with the auditor and other auditors, but they had different opinions, depending on the materiality of the multiple uncertainties. In another case, AREX’s opinion was initially different from the auditor’s opinion: AREX recommended that the auditor’s opinion on the client’s financial statements should be an adverse opinion, while the auditor’s opinion was a qualified opinion. We discussed the auditing case in detail with the auditor, who admitted that the client should have received an adverse opinion. However, the auditor had formulated a qualified opinion in order to retain the client. Therefore, we did not consider this case a disagreement. From the validation results, we may conclude that AREX performs the task of formulating the auditor’s opinion as can be expected from a skilful auditor. The accuracy of AREX’s opinions in the 42 cases is 98% (41/42), as shown in Table 3.

Table 3: The results of the comparison using actual cases.

Number of auditors   Number of cases audited by each auditor   Total of cases audited   Number of agreements   Number of disagreements
3                    3                                         9                        8                      1
10                   2                                         20                       20                     0
13                   1                                         13                       13                     0
26 (total)                                                     42                       41                     1
%                                                              100%                     98%                    2%


6.3 Auditors’ evaluation

We surveyed the auditors, with their permission, to gather their attitudes toward AREX. After the auditors had used AREX in processing the actual auditing cases, they were asked to answer the questions formulated in questionnaire No. 2 (available from the first author) using five-point Likert scales (strongly agree = 5 to strongly disagree = 1; and very good = 5 to very poor = 1). A pilot study was carried out to test the clarity and validity of the questions in questionnaire No. 2. The data include the auditors’ evaluation of the effectiveness, the efficiency, the acceptance, and of AREX and its models.

First: Effectiveness

In our research, effectiveness deals with the impact of AREX on decision quality and increased accuracy (Baldwin-Morgan, 1993). Accuracy of decision making is examined as a measure of the system’s effectiveness (Changchit et al., 2001). KBSs may encourage consistent performance of auditing tasks (Baldwin-Morgan, 1993). The effectiveness of a KBS is measured, among others, (1) by the presence of an explanation facility, which is the system’s ability to provide guidelines to explain questions and conclusions, and (2) by its potential usefulness, which is the system’s ability to satisfy an auditor’s requirements (cf. Pan et al., 2005). If AREX increases the frequency of correct decisions, it may improve the performance of the task of formulating the auditor’s opinion (cf. Boritz & Wensley, 1996; Ram & Ram, 1996). In order to examine the effectiveness of AREX, we compared the auditors’ opinions with and without the use of AREX (as discussed in Section 4). The fifteen attributes in Table 4 are evaluated to measure the effectiveness of AREX as (1) an auditing tool and (2) a training tool.

(1) AREX as an auditing tool
The evaluation of AREX as an auditing tool covers (1a) the soundness of its logic and (1b) its potential usefulness.

(1a) Soundness of logic
The auditors strongly agree that AREX’s logic is sound (Question 3, weighted mean = 4.54) and reflects professional competence (Question 4, 4.54). Since the AREX knowledge was elicited from 32 auditors, we were curious about the results of the evaluation of AREX’s knowledge and expertise. As hoped for, the auditors strongly agree that AREX’s knowledge and expertise are very good (Question 15, 4.50) and that AREX provides relevant information.

(1b) Potential usefulness
Most auditors clearly agree on the usefulness of AREX in formulating the auditor’s opinion in practice (Question 1, 4.42). In addition, a clear agreement exists that AREX performs the task of formulating the auditor’s opinion in the same manner as the auditors would (Question 5, 4.35). The respondents strongly agree that AREX helps auditors to formulate their opinions according to the International Standards on Auditing (ISA) (Question 6, 4.50), provides guidelines for the auditors so as to support the required procedures to formulate opinions on financial statements (Question 7, 4.54), and provides the auditors with the appropriate auditor’s opinion type (Question 9, 4.38). They evaluated AREX’s competence to perform the task as very good (Question 11, 4.54). In addition, the auditors evaluated AREX’s reliability as good. The reliability includes accuracy (Question 12, 4.27), completeness (Question 13, 4.19), and relevancy to the task of formulating the auditor’s opinion (Question 14, 4.42).
(2) AREX as a training tool
The evaluation of AREX as a training tool covers (2a) its educational and training capabilities and (2b) its explanation facility.

(2a) Educational and training capabilities
A KBS should have the ability to advise inexperienced auditors and to help auditors learn more about their decision-making strategies (cf. Baldwin-Morgan & Stone, 1995). In addition, a KBS may enable auditors to access the best expertise in the audit firm. Auditors could gain new insights into the audit decision process and could increase their abilities to handle complex tasks, depending on the preservation and distribution of expertise (Pei & Steinbart, 1994). The auditors strongly agree that AREX is useful as a training device for new auditors (Question 2, 4.88). In addition, the auditors agree that AREX helps auditors to understand in a better way how they formulate their opinions on financial statements (Question 8, 4.35).


Table 4: Auditors’ evaluation of the effectiveness of AREX.

Questions (Strongly agree = 5 ... Strongly disagree = 1)              Mean (n=26)   Minimum   Std. dev.   Std. err.
1. AREX is useful in practice                                             4.42        3.00       0.58        0.11
2. AREX is useful as a training device for new auditors                   4.88        4.00       0.33        0.06
3. AREX's logic is sound                                                  4.54        3.00       0.58        0.11
4. AREX's logic reflects professional competence                          4.54        3.00       0.58        0.11
5. AREX approached the task of formulating the auditor's opinion
   in the same manner I would                                             4.35        3.00       0.69        0.14
6. AREX helps auditors formulate their opinions on financial
   statements according to ISA                                            4.50        4.00       0.51        0.10
7. AREX provides guidelines for auditors as to the required
   procedures to formulate their opinions                                 4.54        4.00       0.51        0.10
8. AREX helps auditors to understand in a better way how they
   formulate their opinion on financial statements                        4.35        3.00       0.75        0.15
9. AREX provides the auditors with the appropriate auditor's
   opinion type                                                           4.38        3.00       0.63        0.12

Questions (Very good = 5 ... Very poor = 1)
10. The explanation facility                                              4.50        3.00       0.58        0.11
11. AREX's competence                                                     4.54        3.00       0.65        0.13
12. AREX's accuracy                                                       4.27        3.00       0.53        0.10
13. AREX's completeness                                                   4.19        3.00       0.63        0.12
14. AREX's relevancy                                                      4.42        3.00       0.70        0.14
15. AREX's knowledge and expertise                                        4.50        3.00       0.58        0.11

(2b) Explanation facility
AREX’s reasoning capabilities are considered significant factors in the evaluation process (CLAES, 1995; Ram & Ram, 1996). The auditors strongly agree that AREX’s explanation facility is very good (Question 10, 4.50). AREX has WHY and HOW explanations.

Second: Efficiency

Efficiency may be measured by the time required to perform a task or by the number of organisational levels involved in the task (cf. Baldwin-Morgan and Stone, 1995; Changchit et al., 2001). Using KBSs in the audit process may reduce the time required (1) to make the auditors’ judgements, (2) to authorise proposed actions, and (3) to create agreement on the auditor’s decision. AREX’s ability to improve efficiency by saving resources when performing the task of formulating the auditor’s opinion is considered a central criterion in the validation and evaluation process. For example, if AREX reduces the time required to perform the task of formulating the auditor’s opinion, it may be helpful when the audit is time-constrained. We believe that AREX will improve the efficiency of audits by reducing the time needed for performing the task of formulating the auditor’s opinion. Furthermore, by providing advice and giving explanations, AREX helps in training novices in a shorter time, and it saves the experienced auditor’s time. Moreover, it may help increase the efficiency of the future audit process, which will still be performed by human beings. The efficiency of AREX includes the time required for performing the task of formulating the auditor’s opinion. The auditors clearly agree that AREX decreases the time needed for formulating the auditor’s opinion on financial statements (Table 5, Question 1, 4.19). Therefore, we may conclude that the use of AREX improves personal productivity.

Third: Acceptance

The auditors’ acceptance of a KBS is influenced by the auditors’ confidence in the system’s recommendations and the ease of using the system (Boritz & Wensley, 1996; Changchit et al., 2001). Most validation methods apply to performance criteria; in particular, the performance of a KBS must reach an acceptable level for the system to be useful in supporting decision making in auditing (O’Keefe & Preece, 1996). Table 5 illustrates the auditors’ evaluation of the efficiency and acceptance of AREX. The acceptance of AREX includes (a) ease of understanding and (b) advice acceptance.

(a) Ease of understanding
The auditors agree that the logic of AREX is easy to follow (Question 2, 4.46). We remark that following the reasoning is an easier task than fully understanding the reasoning; in the latter case, some background knowledge is required. This explains the difference between Question 2 and Question 6 in Table 5. The auditors ranked the ease of understanding of AREX’s logic as good (Question 6, 4.42).

(b) Advice acceptance
The auditors agree that the advice of AREX can be trusted (Question 3, 4.27) and is professionally accepted (Question 4, 4.35). The phrasing of the questions of AREX is evaluated as good (Question 5, 4.31). Moreover, most auditors strongly agree that AREX is very good as an overall support tool for the auditing tasks (Question 7, 4.50). Hence, acceptance hints at a user’s confidence in AREX’s conclusions.

Table 5: Auditors’ evaluation of the efficiency and acceptance of AREX.

Questions (Strongly agree = 5 ... Strongly disagree = 1)              Mean (n=26)   Minimum   Std. dev.   Std. err.
1. AREX decreases the time needed for the task of formulating
   the auditor's opinion on financial statements                          4.19        3.00       0.69        0.14
2. It is easy to follow the logic of AREX                                 4.46        3.00       0.65        0.13
3. AREX's advice can be trusted                                           4.27        3.00       0.67        0.13
4. AREX's advice is professionally accepted                               4.35        3.00       0.80        0.16

Questions (Very good = 5 ... Very poor = 1)
5. Phrasing of questions in AREX                                          4.31        3.00       0.68        0.13
6. The ease of understanding AREX's logic                                 4.42        3.00       0.64        0.13
7. AREX is an overall support tool for the auditing tasks                 4.50        3.00       0.58        0.11
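The dispersion columns in Tables 4-6 are consistent with the usual standard error of the mean, std. err. = std. dev. / sqrt(n) with n = 26. The short check below (an illustration, not the authors' analysis script) reproduces the reported standard errors for a few Table 5 rows.

# Check that the reported standard errors follow std. err. = std. dev. / sqrt(n) with n = 26.
import math

n = 26
rows = {                       # question: (std. dev., reported std. err.) from Table 5
    "Q1 time saving":     (0.69, 0.14),
    "Q2 easy to follow":  (0.65, 0.13),
    "Q7 overall support": (0.58, 0.11),
}
for name, (sd, reported) in rows.items():
    print(f"{name}: {sd / math.sqrt(n):.2f} (reported {reported:.2f})")
# -> 0.14, 0.13, 0.11: the recomputed values match the reported ones after rounding.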

Fourth: AREX and its models

Evaluation of the AREX models is called “subsystem validation”, which amounts to examining the AREX assumptions and critical procedures. The evaluation focuses on the AREX details and determines which models of AREX need to be improved. Each of the eight models of AREX, discussed in Section 3, was validated and evaluated separately (cf. Back, 1993, 1994). In Table 6, the auditors evaluated the performance of four models as very good. These models were the auditor’s opinion model (Question 8, 4.58), the model of examining controls (Question 1, 4.50), the preliminary materiality model (Question 3, 4.50), and the model of fairness of representation (Question 6, 4.50).


Table 6: Auditors’ evaluation of the AREX models.

Questions (Very good = 5 ... Very poor = 1)            Mean (n=26)   Minimum   Std. dev.   Std. err.
1. The model of examining controls                         4.50        3.00       0.65        0.13
2. The model of assessing planned detection risk           4.31        3.00       0.73        0.14
3. The preliminary materiality model                       4.50        3.00       0.58        0.11
4. The auditing-standards model                            4.31        3.00       0.74        0.14
5. The accounting-principles model                         4.46        3.00       0.65        0.13
6. The model of fairness of representation                 4.50        3.00       0.58        0.11
7. The going-concern model                                 4.46        3.00       0.65        0.15
8. The auditor's opinion model                             4.58        4.00       0.50        0.10

In addition, the auditors evaluated the performance of the other four models as good. These models were the accounting-principles model (Question 5, 4.46), the going-concern model (Question 7, 4.46), the model of assessing planned detection risk (Question 2, 4.31), and the auditing-standards model (Question 4, 4.31). Therefore, we may conclude that the performance of the AREX models is high. Moreover, the auditors’ overall evaluation of AREX as a whole is good (the mean response averaged across all 30 attributes is 4.44).

7. Conclusion and future research

The paper described the construction, implementation, and validation of AREX, the Auditor’s Report EXpert. It addressed two questions: (1) To what extent is it possible to automate the formulation of the auditor’s opinion? and (2) To what extent is a KBS effective, efficient, and acceptable as a tool to formulate the auditor’s opinion? AREX comprises a conceptual model of the auditor’s opinion, which is divided into eight models. The eight models are used to structure the information that the user must incorporate in AREX. The domain knowledge was acquired from practicing and academic auditors, and from the literature. It is represented in the form of concepts, properties, relations, and rules. The inference knowledge includes eight types of inferences corresponding to the AREX models. Endowed with this knowledge, AREX generates the proper auditor’s opinion by applying user-supplied facts to the encoded rules. The task knowledge (goals, decomposition, and control) structures the activities related to the formulation of the auditor's opinion. Based on the preliminary validation of AREX, we may conclude that the results of the comparison between AREX’s opinions and the auditors’ opinions indicate that AREX nearly always arrived at the same opinions as the human auditors. Based on the field-tests validation of AREX, we may conclude that AREX performs the task of formulating the auditor’s opinion well and is a useful tool for formulating the auditor’s opinion in practice. The results of the auditors’ evaluations indicate that AREX is effective and efficient in performing the task of formulating the auditor’s opinion. Moreover, AREX was considered acceptable for performing the task of formulating the auditor’s opinion. Therefore, we may conclude that (1) the task of creating the auditor’s report can be performed by a KBS, and (2) AREX is suitable and acceptable for formulating the auditor’s opinion. Returning to the four drawbacks involved in the personal-judgement approach, we may conclude that AREX is able to overcome the drawbacks of ineffectiveness and different decisions. To draw definite conclusions about AREX’s ability to deal with the drawbacks of personal bias and misleading judgements, future accounting and auditing research should concentrate on giving insights into the factors that contribute to providing objective assessments. That is an important issue in the construction of a KBS. In the future, we may investigate how the models of adequate auditor behaviour (cf. Libby, 1995; O’Leary, 2003; Arnold et al., 2004) deal with this problem of obtaining an objective assessment. Furthermore, future research will deal with the auditors’ requirements and recommendations, discovered during the interviews, to improve AREX.


References

Abdolmohammadi, M.J. and Bazaz, M. (1995). Identification of Tasks for Expert Systems Development in Auditing, in Vasarhelyi, M.A., Artificial Intelligence in Accounting and Auditing: Using Expert Systems, Markus Wiener Publishers, Princeton.

Akoka, J. and Comyn-Wattiau, I. (1996). A Knowledge-Based System for Auditing Computer and Management Information Systems, Expert Systems with Applications, Vol. 11, No. 3, pp. 361-375.

Akoka, J. and Comyn-Wattiau, I. (1997). An Expert System for Financial and Accounting Information System Auditing, Computer Audit Update, (October), pp. 8-19.

Arens, A.A., Elder, R.J., Beasley, M.S., and Jenkins, G.J. (2005). Auditing and Assurance Services: An Integrated Approach, PEARSON: Prentice Hall, New Jersey.

Arnold, V., Collier, P., Leech, S., and Sutton, S. (2004). Impact of Intelligent Decision Aids on Expert and Novice Decision-Makers’ Judgments, Accounting and Finance, Vol. 44, pp. 1-26.

Back, B. (1993-1994). Validating an Expert System for Financial Statement Planning. Journal of Management Information Systems, 10, 157-177.

Baldwin-Morgan, A.A. & Stone, M.F. (1995). A Matrix Model of Expert Systems Impacts. Expert Systems with Applications, 9, 599-608.

Baldwin-Morgan, A.A. (1993). The Impact of Expert System Audit Tools on Auditing Firms in the Year 2001: A Delphi Investigation. Journal of International Systems. 7, 16-34.

Bayraktar, D. (1998). A Knowledge-Based Expert System Approach for the Auditing Process of Some Elements in the Quality Assurance System, International Journal of Products Economics, Vol. 56-57, pp. 37-46.

Borenstein, D. (1998). Towards a Practical Method to Validate Decision Support Systems. Decision Support Systems, 23, 227-239.

Boritz, J. E. & Wensley, A.K. (1996). CAPEX: A Knowledge-Based System for Substantive Audit Planning, Markus Wiener Publishers, Princeton.

Brown, C.E and Murphy, D.S. (1990). The Use of Auditing Expert Systems in Public Accounting, Journal of Information Systems, Vol. 4, Iss. 3, (Fall), pp. 63-73.

Changchit, C. (2003). The Construction of an Internet-Based Intelligent System for Internal Control Evaluation, Expert Systems with Applications, Vol. 25, pp. 449-460.

Changchit, C., Holsapple, C.W., and Viator, R.E. (2001). Transferring Auditors’ Internal Control Evaluation Knowledge to Management, Expert Systems with Applications, Vol. 20, pp. 275-291.

CLAES (1995). Expert Systems for Improved Crop Management: Methodology for the Engineering of Expert Systems, Agriculture Research Center, (December), TR-88-024-41.

Collier, P.A., Leech, S.A., and Clark, N. (1999). A Validation Expert System for Decision Making in Corporate Recovery, International Journal on Intelligent Systems in Accounting, Finance & Management, Vol. 8, pp. 75-88.

Comyn-Wattiau, I. and Akoka, J. (1997). Expert Systems for Computer and Management Information Systems Auditing, Computer Audit Update, (September), pp.17-27.

Dhaliwal, J.S. and Tung, L.L. (2000). Using Group Support Systems for Developing a Knowledge-Based Explanation Facility, International Journal of Information Management, Vol. 20, pp. 131-149.

Ernst & Young (2005). Comparison of IFRS with Dutch Law and Regulations, Ernst & Young Accountants, Rotterdam, The Netherlands, Available at http://www.ey.com.

Flory, S.M. (1991). Here’s an Expert System-Based Support Tool for Making Accounting Decisions, The Practitioner and the Computer, The CPA Journal Online, (October).

Guy, D.M., Carmichael, D.R., and Lach, L.A. (2003). Practitioner’s Guide to GAAS, John Wiley and Sons, Inc., Canada.

Hayes, R., Dassen, R., Schilder, A. and Wallage, P. (2005). Principles of Auditing: An Introduction to International Standards on Auditing. FT Prentice Hall, Pearson Education, Amsterdam.

Hornik, S. and Ruf, B.M. (1997). Expert Systems Usage and Knowledge Acquisition: An Empirical Assessment of Analogical Reasoning in the Evaluation of Internal Controls, Journal of Information Systems, Vol. 1, No. 2, pp. 57-74.

IFAC (2006). Handbook of International Auditing, Assurance and Ethics Pronouncements, International Federation of Accountants.

Karan, V., Murthy, U.S., and Vinze, A.S. (1995). Assessing the Suitability of Judgmental Auditing Tasks for Expert Systems Development: An Empirical Approach, Expert Systems with Applications, Vol. 9, No. 4, pp. 441-455.


Lenard, M.J. (2003). Knowledge Acquisition and Memory Effects Involving an Expert System Designed as a Learning Tool for Internal Control Assessment, Decision Sciences Journal of Innovative Education, Vol. 1, No. 1, pp. 23-38.

Lenard, M.J., Alam, P., Booth, D., and Madey, G. (2001). Decision-Making Capabilities of a Hybrid System Applied to the Auditor’s Going-Concern Assessment, International Journal on Intelligent Systems in Accounting, Finance & Management, Vol. 10, pp. 1-24.

Lenard, M.J., Madey, G.R., and Alam, P. (1998). The Design and Validation of a Hybrid Information System for the Auditor’s Going Concern Decision, Journal of Management Information Systems, Vol. 14, Iss. 4, (Spring), pp. 219-237.

Libby, R. (1995). The Role of Knowledge and Memory in Audit Judgment, in Ashton, R.H. and Ashton, A.H. (Eds.), Judgment and Decision-Making Research in Accounting and Auditing, Cambridge University Press, UK.

Mak, B., Schmitt, H., and Lyytinen, K. (1997). User Participation in Knowledge Update of Expert Systems, Information & Management, Vol. 32, pp. 55-63.

McDuffie, R.S. and Smith, L.M. (2005). The Impact of an Audit Reporting Expert System on Learning Performance: A Teaching Note. Forthcoming in Accounting Education, AAA Annual Meeting, (November).

McDuffie, R.S., Oden, D.H., and Smith, L.M. (1993). The Potential For Use of Expert Systems in Oil and Gas Tax Accounting, Oil and Gas Tax Quarterly, Vol. 41, Iss. 3, (March), pp. 389-397.

Murphy, D. S. and Yetmar, S.A. (1996). Auditor Evidence Evaluation: Expert Systems as Credible Sources. Behaviour & Information Technology, Vol. 15, No. 1, pp. 14-23.

Needles, B.E. and Pomeranz, F. (1985). Comparative International Auditing Standards, International Accounting Section of the American Accounting Association, Amsterdam.

O’Keefe, R.M. & Preece, A.D. (1996). The Development, Validation and Implementation of Knowledge-Based Systems. European Journal of Operational Research, 92, 458-473.

O’Leary, D.E. (2003). Auditor Environmental Assessments, International Journal of Accounting Information Systems, Vol. 4, pp. 275-294.

Pan, N.F., Hadipriono, F.C., and Whitlatch, E. (2005). A Fuzzy Reasoning Knowledge-Based System for Assessing Rain Impact in Highway Construction Scheduling: Part 2. Development and Validation of the System, Journal of Intelligent & Fuzzy Systems, Vol. 16, pp. 169-179.

Pany, K.J. and Ray, W.O. (2001). Research Implications of the Auditing Standards Board’s Current Agenda, American Accounting Association, Accounting Horizons, Vol. 15, No. 4, (December), pp. 401-411.

PCAOB (2004). An Audit of Internal Control over Financial Reporting Performed in Conjunction with an Audit of Financial Statements, Washington, D.C., PCAOB Release 2004-001, PCAOB Rulemaking, Docket Matter No.008.

Pei, B.K.W. and Steinbart, P.J. (1994). The Effects of Judgment Strategy and Prompting on Using Rule-Based Expert Systems for Knowledge Transfer, Journal of Information Systems, Vol. 8, No. 1, pp. 21-43.

Post, W., Wielinga, B.J., and Schreiber, G. (1997). Organizational Modelling in CommonKADS: The Emergency Medical Service, IEEE Expert Intelligent Systems and their Applications, Vol. 12, Iss. 6, (Nov/Dec), pp. 46-53.

PriceWaterhouseCoopers (2004). Similarities and Differences: A Comparison of IFRS, US GAAP and Dutch GAAP, (October), Available at http://www.pwc.com.

PriceWaterhouseCoopers (2005). Similarities and Differences: A Comparison of IFRS, US GAAP and Belgian GAAP, (October), Available at http://www.pwc.com.

PriceWaterhouseCoopers (2006). Similarities and Differences: A Comparison of IFRS and US GAAP, (February), Available at http://www.pwc.com.

Ram, Sundaresan and Ram, Sudha (1996). Validation of Expert Systems for Innovation Management: Issues, Methodology, and Empirical Assessment, Journal of Product Innovation Management, Vol. 13, No. 1, pp. 53-68.

Schreiber, G., Wielinga, B.J., and Breuker, J.A. (1993). KADS: A Principled Approach to Knowledge-Based System Development, Academic Press, Harcourt Brace Jovanovich, Publishers, London.

Shaalan, K., Rafea, M., and Rafea, A. (1998). KROL: A Knowledge Representation Object Language on Top of Prolog, Expert Systems with Applications, Vol. 15, pp. 33-46.

Wahdan, M.A., Spronck, P., Ali, H. F., Vaassen, E., and Herik, H.J. van den (2005a). AREX: A Knowledge-Based System for Auditor’s Reports, Partners’ conference 2005, Maastricht School of Management, The Netherlands, pp. 297-308.


Wahdan, M.A., Spronck, P., Ali, H.F., Vaassen, E., and Herik, H.J. van den (2005b). A Knowledge-Based System for Auditor’s Reports, Proceedings of the 14th IMDA Congress, 2005, Granada, Spain, pp. 111-117.

Wahdan, M.A., Spronck, P., Ali, H.F., Vaassen, E., and Herik, H.J. van den (2005c). When Will a Computer Write the Auditor’s Report?, Systemic Practice and Action Research, Vol. 18, No. 6, (December), pp. 569-580.

Wahdan, M.A. (2006). Automatic Formulation of the Auditor’s Opinion, Ph.D. Thesis, Maastricht University, Maastricht, The Netherlands.

Wheeler, S. (1990). Auditing: a Comparison of Various Materiality Rules of Thumb, The CPA Journal Online, (February).

Whittington, O.R. and Pany, K. (2003). Principles of Auditing and Other Assurance Services, McGraw-Hill, Boston.

Wielinga, B.J., Schreiber, G., and Breuker, J.A. (1992). KADS: A Modelling Approach to Knowledge Engineering. Knowledge Acquisition, Vol. 4, pp. 5-53.

Young, R. (1994). Discussion of the Effects of Explanation Type and User Involvement on Learning from and Satisfaction with Expert Systems, Journal of Information Systems, Vol. 8, Iss. 1, pp. 18-21.
