
    An Expert System for Raising Pigs

    Tim Menzies

Artificial Intelligence Lab, University of New South Wales, Australia
School of Computer Science and Engineering, University of New South Wales, P.O. Box 1, Kensington, NSW, Australia, [email protected].

    John Black, Joel Fleming

DSL Systems Centre, CSIRO (Commonwealth Scientific and Industrial Research Organisation) Division of Animal Production, Australia.

    Murray Dean

    Zwiggan Consulting, Australia

    1. INTRODUCTION

An intelligent back-end to a DOS-based mathematical modelling package written in ARITY PROLOG is described. The model:

simulates the growth and reproduction of pigs;

identifies factors that limit optimal performance of the pig;

identifies management strategies that maximise enterprise profit.

    The expert system:

presents an abstracted description of the output of the model in a form that a non-mathematician can understand;

suggests dietary, housing, genotype, or resource input changes that can improve the profitability of the herd.

Verification studies have demonstrated that the expert system can significantly out-perform human experts interpreting the output of the model (performance measured in dollars per square metre per day). In a typical case, the improvement is of the order of 10%. The system is in routine use in America, Holland, Belgium, France, Spain, and Australia.

    2. THE DOMAIN

Raising pigs is big business. In Australia alone, 300 kilotonnes of pig meat worth some $500 million is produced annually. The Australian pig herd represents one-third of one percent of the international herd.

In response to commercial interest in pig production, CSIRO's Division of Animal Production designed AUSPIG, an MS-DOS tool for studying the performance of pigs growing in pens in a piggery. The pig farmer enters information concerning the pigs' diet, genotype, and housing, as well as certain data concerning the local buyers of pigs. AUSPIG's grower pig and breeder pig models then simulate the growth of the pig and report its economic and biological characteristics (e.g. live weight, backfat thickness, amino acid utilisation, profit at sale) at various times during its life [Black 87a].

As well as being useful for studying the pigs' performance, AUSPIG can be used for experimenting with different ways of improving the profit of a piggery. Once a growing regime has been developed to optimise the growth of one pig, AUSPIG's PIGMAX optimisation model can be used to study the effects of this regime on the profitability of the entire herd. Typically, when analysing a piggery, an initial simulation run is made to ensure that the performance of the pigs under their current environment is predicted accurately. If inefficiencies in the pigs' diet or environment are detected, a set of potentially beneficial alterations is identified. These alterations are tested by a second run of the AUSPIG and/or the PIGMAX model. The results of the first and second runs are then compared to see if the tested alterations were successful or if they could be further improved. This process of:

analysis => modification => test

could repeat any number of times. Experience has shown, however, that there is little utility in repeating the process more than about five times.

Using AUSPIG, it is possible to identify simple changes to the pigs' diet, feeding regimes, or environment that can significantly increase the performance and profitability of the herd. For example:

In one spectacularly successful study of a piggery, CSIRO scientists used AUSPIG to develop a growing regime that increased a piggery's net return from $40,992 to $202,321 (i.e. 496 percent) [Black 88].

AUSPIG demonstrated that increasing the flow of air over a pig in a hot shed (e.g. 32 degrees Celsius) from 0.2 metres per second to 0.6 metres per second can double the animal's growth rate [Black 87b]. Such an increase in air flow would be barely perceptible to a human being.

The potential of the AUSPIG system is enormous. The current system has an international market. Some modern piggeries have installed computer-controlled liquid feeding systems for their pigs. Such piggeries can alter a pig's diet every day of the pig's life. With such systems, the challenge is to decide how to best capitalise on this control, and systems like AUSPIG are essential. Further, the techniques developed for AUSPIG could be generalised to produce an AUS-COW, an AUS-FARM, and an AUS-SHEEP system.

A problem identified with the prototype versions of AUSPIG was that it required an expert to fully utilise the output of the mathematical models. Early versions of AUSPIG produced dozens of screen reports which the end-user browsed to find ways to optimise the pig growth. Each report was a 20 by 40 array of numbers filling many screens. Trials showed that most users examined few of these reports. In order to make the system more acceptable to the average pig farmer, and to increase its market potential, it required an intelligent post-processor to analyse the output. Ideally, the non-expert user should be able to by-pass the screens of reports and access a single screen that identifies any biological inefficiencies and recommends changed management strategies. The implemented post-processor was called PigE (see footnote 1).

Footnote 1: The running gag of the project was the name PigE. AUSPIG could be called a Management Information System, so the expert system could have been called MIS's PigE. The file transfer utility KERMIT could have been used to move files from developer to developer. And so on.

    3. WHY PROLOG?

The choice of PROLOG as the expert system implementation language was made after a consideration of the required delivery environment.

CSIRO's primary concern was to keep the cost of the AUSPIG package to a minimum.

AUSPIG already contained third-party software which could only be distributed under licence. CSIRO did not want to use a commercial expert system shell that would attract more licensing fees and so inflate the cost of the package.

When the expert system was first considered, AUSPIG was targeted at an IBM-AT or AT-clone running MS-DOS (see footnote 2). An AT configured for AUSPIG leaves 420K of RAM free for other processes. The expert system would need to be able to execute within this memory constraint.

Footnote 2: This has now changed. AUSPIG runs on a 386 DOS box with 2MB of RAM.

CSIRO wanted to allow for the extension of any software written during the feasibility study. CSIRO's AUSPIG development team includes programmers and it was hoped that this programming team could continue the development of the expert system after the feasibility study.

The early version of the system was developed using a prototyping methodology. Initially, we were not sure how to reason about the system. The developers wanted a high-level symbolic-processing language that could be easily modified to handle a variety of inferencing techniques.

After a survey of the available tools, PROLOG was selected as the high-level symbolic-processing language. At the time of the initial prototype, it was known that certain DOS-based PROLOGs supported the generation of stand-alone executables (i.e. no runtime licence fees) as well as memory management tools for DOS. A major factor in the decision was the available programming skills. Menzies and Dean both had extensive PROLOG programming experience.

    4. DEVELOPMENT

At first, a rapid prototyping methodology was adopted. The two PROLOG developers (Menzies & Dean) would work on the examples prepared by the experts (Black & Fleming). Rules were written, initially on paper, and a notation was developed that reflected the native idiom of the domain experts (a standard application language development approach [Bustany 88]). In between the sessions, the experts would work on further examples or attempt to write rules. The PROLOG developers would work on an interpreter for the notation. Whenever the PROLOG developers recognized that the domain experts were having to contort the expression of the logic because of limitations with the interpreter, the interpreter was altered to allow for a more natural expression of the logic.

At the end of the prototype period, the domain experts could write rules and the rules could execute. Verification studies (see section 6) were most encouraging. Testing revealed that the system consistently out-performed the human experts (see footnote 3). This "over-performance" of the system is remarkable. We conjecture that a human expert must look at many numbers from the model before making a recommendation. The amount of information that has to be processed is large and it appears to exceed the capacity of human beings. An expert system has no such limitations to its short-term memory. The expert system can therefore successfully study more factors than a human being. Hence the superior performance of the expert system.

Funding was obtained for a continuation of the project. The utility of continuing with PROLOG was reviewed. It was decided:

1. To stay with PROLOG while the specification was under development.

2. To move away from PROLOG if the run-times became unacceptably slow. This has not happened. The PROLOG reasoning executes at least as fast as the mathematical models, so the user doesn't perceive the expert system as being the slowest part of the system (see footnote 4). Also, during the reasoning, interim conclusions are displayed to the user. Hence, the user is not left with a blank screen during the reasoning.

3. To move away from the in-house PROLOG used in the prototype to ARITY PROLOG. With ARITY, we gained:

Memory management under DOS. Our PROLOG programs were no longer bound to the 640K DOS limit.

Screen management tools. These screen tools proved to be less-than-ideal (see section 7.2.2).

An interface was developed between ARITY and PASCAL to simplify data transfer and allow for the seamless embedding of the expert system into the AUSPIG package.

Footnote 3: The developers were somewhat alarmed at this finding. Would the experts take offence at the system beating them at their own game? Fortunately, the experts' pride in their rule-writing ability outweighed any other considerations.

Footnote 4: To the user, data entry is the most tedious and slowest part of using the system.

Using the experience gained during the prototype, it was possible to estimate the times required to build the remaining system. A project plan was developed for the next year's work. The rule set was extended from 72 rules to 486 rules and is currently stable.

The system was developed and fielded using PROLOG. Occasionally, some thought was given to re-coding the rules in PASCAL. However, other enhancements were considered to be of higher priority than a rewrite of a sub-system that was performing adequately. Currently, AUSPIG/PigE is in use in Australia, Holland, Belgium, France, Spain, and America.

The domain experts report that maintaining the system is not a major problem. Experts can access reports generated by the inference engine on rules that never fire or rules that fire too much. Hence, they have some automated support for detecting inappropriate logic. The 486 rules divide up into 9 knowledge bases (KB). Each KB consists of several rule groups of between 5 and 15 rules. Within a group, assertions can be made that other rules require. Between rule groups, there exists a coarse-grain level of communication (e.g. some areas of management are not altered until others are resolved). However, in general, rules remain modular chunks of knowledge and can be edited freely without unwanted side-effects to the rest of the system (see footnote 5).

Footnote 5: This is in marked contrast with the rest of the AUSPIG system. The maintainers of the mathematical models are somewhat envious of the rule system. No portion of the models can be changed without extensively affecting the rest of the model.

    5. SHELL DESIGN

The domain is data-driven, so a forward-chaining rule interpreter was written to process the experts' rules. As a technique for allowing the succinct expression of generic queries and meta-level logic, PigE's rules support variables. Rule conditions can be instantiated in numerous ways and the associated rule actions are applied for each different instantiation of the condition. All the source code for the project was written especially for CSIRO and CSIRO's programmers were trained in the maintenance of this code. Wherever possible, procedural attachments were used to implement commonly performed tasks. Such procedural attachments could be called from the RHS or LHS of a rule.
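The paper does not list any of these procedural attachments; presumably they are ordinary Prolog predicates that rule conditions and actions can call. As an illustration only, an attachment in the spirit of the observe action used in Figure One might look like the following (the operator declaration and the predicate body are guesses, not the real PigE code):

    :- op(900, fy, observe).

    % Hypothetical attachment: print a recommendation when called from a rule action.
    observe Message :-
        write(Message), nl.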


At start up, PigE converts the AUSPIG report files into Prolog facts. For example, if three lines in the first grower report were:

    32 11.0 462 425 4.2 1.75 1.71 18 28 38

    39 14.5 535 465 7.3 1.89 1.78 17 27 37

    45 18.0 606 498 9.4 1.98 1.84 16 27 37

    then PigE would load in the Prolog facts:

    g1(32,11.0,462,425,4.2,1.75,1.71,18,28,38).

    g1(39,14.5,535,465,7.3,1.89,1.78,17,27,37).

    g1(45,18.0,606,498,9.4,1.98,1.84,16,27,37).
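The conversion code is not shown in the paper. The following is a minimal sketch of how such report lines could become g1/10 facts, written in modern SWI-Prolog syntax rather than ARITY PROLOG; the predicate names and the assumption that report fields are whitespace-separated numbers are inventions for illustration only.

    :- use_module(library(readutil)).   % read_line_to_string/2
    :- dynamic g1/10.

    % Assert one g1/10 fact per line of a grower report file.
    load_g1(File) :-
        setup_call_cleanup(open(File, read, In), load_lines(In), close(In)).

    load_lines(In) :-
        read_line_to_string(In, Line),
        (   Line == end_of_file
        ->  true
        ;   split_string(Line, " \t", " \t", Fields),
            exclude(==(""), Fields, NonEmpty),
            maplist(number_string, Numbers, NonEmpty),
            Fact =.. [g1 | Numbers],        % e.g. g1(32, 11.0, 462, ...)
            assertz(Fact),
            load_lines(In)
        ).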

PigE refers to these files via a set of database selectors defined using a data dictionary. The data dictionary for g1 gives each column in this report file a succinct label. For example, the data dictionary for report g1, shown below, defines "live weight (kg)" as lwt.

    dd(g1, [
        want age         : "age (days)",
        want lwt         : "live weight (kg)",
        lwt_gain         : "live weight gain (g/day)",
        want lwt_av_gain : "average daily gain (g/day)",
        want p2          : "backfat thickness (mm)",
        want fcr         : "feed conversion ratio: fcr",
        want fcr_av      : "average fcr",
        want lct         : "lct (degrees C)",
        want ect         : "ect (degrees C)",
        uct              : "uct (degrees C)"
    ]).

Selectors are automatically generated by the shell for each want-ed column. The use of selectors makes calls to the database easier and insulates programs from database changes. If programs talk to the database via the selectors, then queries to the database can be made without having to remember how many parameters each clause has and without a lot of tedious typing. Also, if the database structure is changed, or more selectors are required, then the data dictionary is edited and the selector generation program is called again. Any programs that access this database via the selectors would not need changing.
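The generated selector code is also not shown in the paper. One plausible shape for it, assuming the generator emits one fact per want-ed column recording the report name, its arity and the column's argument position, is sketched below; every name and position here is illustrative only.

    % Hypothetical generated facts: selector(Label, Report, Arity, ArgPosition).
    selector(lwt, g1, 10, 2).   % "live weight (kg)" is argument 2 of g1/10
    selector(lct, g1, 10, 8).   % "lct (degrees C)"  is argument 8
    selector(ect, g1, 10, 9).   % "ect (degrees C)"  is argument 9

    % Fetch a labelled value together with the report row it came from.
    column_value(Label, Value, Row) :-
        selector(Label, Report, Arity, Pos),
        functor(Row, Report, Arity),    % build an open g1/10 row template
        call(Row),                      % enumerate loaded rows on backtracking
        arg(Pos, Row, Value).

Under these assumptions, a query such as column_value(lct, LCT, Row) keeps working after the report layout changes, provided the data dictionary is edited and the selectors are regenerated.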

To alter the rule set, CSIRO's experts edited an ASCII file storing special Prolog facts. These facts are entered using pseudo-English Prolog operators. For example, if browsing this ASCII file, the domain expert could read the rule shown in Figure One.

Line 1 of Figure One is the rule label. The rule label has a unique identification number and a brief synopsis of the function of the rule. Lines 2 to 7 are the rule condition. The keyword if indicates that this rule condition can only be satisfied once. If this keyword were replaced by forall, then PigE would attempt to find all the different instantiations of the variables that satisfy this rule condition.

    rule # 6000 - use temperature option ?
    if   climate(off) and                 % 2
         lower_piggery_temp := L and      % 3
         upper_piggery_temp := U and      % 4
         lct is (LCT, ROW) and            % 5
         ect is (ECT, ROW) and            % 6
         (L < LCT or U > ECT)             % 7
    then
         observe
         "When next you run the model, turn on the temperature option."  % 8
         and zap pig is ok.               % 9

Figure 1. A sample rule.

The rule action would be applied for each different instantiation. Variables set in the condition can be accessed by the rule action. Rule conditions and actions consist of executable Prolog code separated by the keywords and, or and not.
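The rule interpreter itself is not listed in the paper. Assuming, purely for illustration, that each rule is stored as a rule(Quantifier, Condition, Action) term whose condition and action are executable Prolog goals, the two keywords might be dispatched roughly as follows:

    % Hypothetical dispatch on the rule quantifier.
    apply_rule(rule(if, Condition, Action)) :-       % fire the action at most once
        (   call(Condition)
        ->  call(Action)
        ;   true
        ).
    apply_rule(rule(forall, Condition, Action)) :-   % fire once per instantiation
        forall(call(Condition), call(Action)).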

Rule number 6000 tests to see if the user should have run the AUSPIG model with the climate response option turned on. If this option is on, the model studies the pig's growth hour by hour instead of day by day. This increases execution time of the system considerably, but is useful for investigating pigs under hot or cold conditions. After testing that the climate option is off, lines 3 to 6 gather the information required to test if the pig was under stress. Lines 3 and 4 ask the user for the extremes of temperature in the piggery. The procedure X := Y uses a question fact to generate a question to the user; e.g.:

    question(
        lower_piggery_temp,
        "How cold does your piggery get ? ",
        0 to 50
    ).

The user is prompted with the text shown here and is pestered until they provide a numeric value in the range 0 to 50.
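The implementation of := is not given in the paper. The sketch below is one way it might behave, assuming answers are cached so that each question is asked only once; the operator priorities, the answered/2 cache and the use of read/1 are all assumptions.

    :- op(700, xfx, :=).
    :- op(500, xfx, to).
    :- dynamic answered/2.

    Name := Value :-
        answered(Name, Value), !.                 % reuse an earlier answer
    Name := Value :-
        question(Name, Prompt, Low to High),
        repeat,                                   % "pester" until the answer is in range
        write(Prompt),
        read(Answer),
        number(Answer), Answer >= Low, Answer =< High,
        !,
        Value = Answer,
        assertz(answered(Name, Value)).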

The actual logic of the rule is in line 7. The temperature option should be turned on if the lower piggery temperature is less than the pig's lower critical temperature (the point at which the pig is forced to increase its energy expenditure by shivering) or if the upper piggery temperature is greater than the pig's evaporative critical temperature (the point at which the pig starts panting).

The procedure zap X (line 9 of rule # 6000) removes an assertion made at start up. Assertions are made by a ++ X rule action. These assertions can be tested for by a call to { X }. At start up, ++ pig is ok is asserted. Certain rules test for not { pig is ok }. If this is true, i.e. a rule like rule # 6000 has fired, then certain remedies are explored. PigE tests its rules in a top-down manner. This allows for a prioritisation of possible remedies. The rules are ordered in such a way that important problems are resolved first, regardless of their consequences on less important problems.
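The definitions of these pseudo-English operators are not in the paper either. One plausible reading, with the operator priorities and the known/1 store invented for illustration:

    :- op(900, fy, ++).
    :- op(900, fy, zap).
    :- dynamic known/1.              % hypothetical store of rule assertions

    ++ X  :- assertz(known(X)).      % make an assertion,  e.g.  ++ pig is ok
    zap X :- retract(known(X)).      % remove it,          e.g.  zap pig is ok
    { X } :- known(X).               % test it,            e.g.  { pig is ok }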

    6. SYSTEM VALIDATION

The primary motivation for attempting an expert system was to make the system more accessible to the average user. Hence, an assessment criterion was chosen that reflected the perspective of such a user.

PigE is assessed according to the speed at which it reaches a conclusion and how much its recommendations can improve the profit per pig. Speed is defined in terms of the number of runs of the model required to eliminate biological inefficiencies. Profit is defined as dollars per pig per day (if the model's housing option is off) or as dollars per square metre per day (if the housing option is on). By graphing profit vs runs for both PigE and a CSIRO expert, a simple visual impression can be gained of the level of expertise reached by the program (see Figure Two). The expert system out-performed the human expert by 6.5 percent and increased the profit per pig by 227 percent (using the example in [Black 88]).

Figure 2. Comparison of human expertise vs PigE. [Plot not reproduced: profit plotted against AUSPIG run number, with PigE reaching a 227% increase and the human expert a 192% increase.]

When these figures are extended to cover the profit of the entire piggery, the human expert's analysis increased the piggery's net profit from $40,992 to $202,321. The expert system increased the piggery's net profit to $273,300; i.e. it outperformed the human expert by 34 percent and made an extra $70,979.

Note that in all the above cases, the expert system out-performed the human expert.

    7. LESSONS FROM THE SYSTEM

    7.1 Heuristic optimisation

Traditionally, with many biological problems, linear programs are used to optimise inputs to a system, as occurs in least-cost diet formulation packages. However, with complex simulation models like AUSPIG, the values of variables that must be satisfied during the optimisation procedure vary widely depending on the conditions of the simulation. This makes a linear programming approach unsatisfactory. Other approaches, such as selected variations in a few main variables, with and without "hill climbing" techniques, have been used, but these are computationally extremely time-consuming and inefficient. The heuristic approach outlined in this paper provides a far superior method of optimising these complex systems than those previously adopted, because it uses information indicating which factors are limiting biological efficiency, together with the rules of "experts", to select and remedy these limitations in order of importance.

    7.2 Arity Prolog

This project gave us some insights into the strengths and weaknesses of ARITY PROLOG. In order to retain a balanced perspective, the reader should view ARITY's weaknesses (listed below) in the broader context: using ARITY, we rapidly built an extensive meta-interpreter and fielded a DOS-based PROLOG product internationally.

    7.2.1 Strengths

PigE made extensive use of meta-level programming. ARITY preserved the semantics of our code in both interpreted and compiled mode.

We had some difficulties implementing the interface from ARITY to PASCAL. These problems proved to be with the PASCAL, not with ARITY. Once we changed PASCALs to one that had a rational memory structure, the ARITY/PASCAL interface went very smoothly.

ARITY's memory management was quite transparent to our PROLOG developers. After some initial parameter tuning, we could ignore memory management altogether.


The ARITY compiler sped up our execution times by a factor of five. This was an unexpected surprise since the rule interpreter used non-compiled rules (see below).

    7.2.2 Weaknesses

The screen-management tools are overly complicated. The specification of each screen required the assertion of numerous facts. One of our design goals was that the CSIRO programmers could maintain the code. We realised this goal, except with the ARITY screen drivers. In the end, the CSIRO programmers replaced the ARITY system with their own home-grown interface toolkit. We were surprised that this home-grown system was easier to use and more sophisticated than the screen package distributed with ARITY.

    The ARITY compiler had a few limitations:

Debugging compiled code is not fast. The ARITY compiler produces one internal database file (called a .idb file) for the whole application. If a module is re-compiled, its old .obj file is replaced but the .idb file is merely appended to. In practice, this means that edit-compile-link-test requires a re-compilation of all modules in order to completely reset the .idb file; i.e. no incremental compilation.

The conclusions of our rules could be very long and they exceeded the maximum fact size that the ARITY compiler could handle. Hence, we had to distribute our knowledge base in an interpreted form.

The declarations required at the top of each compiled file were involved and awkward to maintain.

7.3 Explanation

The standard view of expert systems is that explicit knowledge representations (such as rule-based systems) make an explanation of the reasoning simple to implement. According to this view, expert systems should have an explanation facility since this makes the reasoning more like a human's (humans can explain their reasoning) and it makes the system more acceptable to the user. Humans will not accept, it is argued, the recommendations of a black box and need to have a justification for the conclusions.

This has not been the PigE experience. A full explanation of PigE's conclusions requires an inspection of many screens. Users will happily accept the expert system's recommendations in preference to viewing these screens and will never elect to view the screens. So, expert systems may not need an explanation facility so much as a facility to give the users the feeling that they might be able to check the conclusions should they wish to. The paradox we found is that once the users know such a facility exists, they may never use it.

7.4 Knowledge Acquisition

Knowledge acquisition is described as the bottleneck in building expert systems. In the case of PigE, the use of the application language approach simplified knowledge acquisition. Our domain experts (Black and Fleming) wrote the 486 rules in the knowledge base with extensive initial support, some support in the post-prototype period, and minimal support for the on-going maintenance.

It should be noted that our experts were both experienced programmers and had written tens of thousands of lines of FORTRAN each for the rest of the package. Perhaps this prior programming experience made it easier for them to construct rules. Also, our experts were not "computer-scared". They were using the rules to augment a package they had built entirely themselves and, hence, were very motivated. Still, the ease of knowledge acquisition was surprising.

    7.5 Knowledge Maintenance

Knowledge maintenance is a subject discussed extensively in the expert systems literature. It has not proved to be a problem with PigE. The acid test for maintenance is when the development crew changes, and this has not happened in the history of the current project. However, the division of the rules into knowledge bases and rule groups simplifies the maintenance problem, as does the application language approach. The use of domain-specific procedural attachments that mimic the natural language of the domain experts significantly simplifies maintenance.

    8. FUTURE DIRECTIONS

Following on from the success of AUSPIG, CSIRO is now planning a version of the system for beef cattle. The expert system's development will continue. As the pig system is refined, the expert system assumes a higher and higher prominence. Originally developed by mathematical modelling experts, AUSPIG is an expert's tool more than a tool for the computer novice. The expert system component, originally developed as an intelligent back-end to the mathematical model, is being assessed as a tool for assisting the computer novice for all their interaction with the system:


Human users could be assessed on a continuum of expertise and the environment could adapt appropriately.

Much of the tedium of data entry could be removed if an expert system could guess at many of the parameters and offer these guesses to the user for them to confirm or override. A frame representation of the screens is being considered. Generic screens could be modelled as classes and the actual screens could be instances, each entry field being one slot. Default values for the slots could be inherited from the class structure and the user could override the slot values as appropriate.
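None of this frame machinery exists yet; as an illustration only (the predicate names, screen names and values below are invented), the scheme being considered could be prototyped in Prolog as:

    % Class-level slot defaults for a generic data-entry screen.
    slot_default(grower_screen, shed_temp, 20).
    slot_default(grower_screen, air_flow,  0.2).

    % An actual screen is an instance of a generic screen; user entries override defaults.
    instance_of(screen_42, grower_screen).
    slot_value(screen_42, shed_temp, 32).

    % A slot's value is the user's entry if present, otherwise the inherited default.
    slot(Instance, Slot, Value) :-
        (   slot_value(Instance, Slot, Value)
        ->  true
        ;   instance_of(Instance, Class),
            slot_default(Class, Slot, Value)
        ).

With these facts, slot(screen_42, air_flow, V) would return the inherited default 0.2, while slot(screen_42, shed_temp, V) returns the user's entry of 32.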

    9. CONCLUSION

The logic programming language PROLOG was chosen for business reasons as the implementation tool of choice for a DOS-based farm management expert system. The use of a compiled PROLOG allowed for the rapid development of a royalty-free, high-level, domain-specific shell. Using PROLOG, we constructed an application language system in which the knowledge representations were customised to reflect the natural idiom of the domain experts. Domain experts have found no trouble in understanding, developing, and maintaining this representation, which has been embedded into an existing application and fielded around the world. The choice of PROLOG as the expert system tool has been reviewed and no pressing reason has been found to move from this language. The system has been a complete success, partially due to the adaptability and extendability of PROLOG.

    10. REFERENCES

[Black 87a] Black J.L., Fleming J.F. & Davies G.T. A Computer Modelling Package for the Nutritional Management of Pigs, in Australian Poultry and Food Convention, 1987, pp. 273-278.

[Black 87b] Black J.L. & Davies G.T. Predicting the Effects of Climate on the Performance of Growing Pigs, in Proceedings of NSW Dept. of Agriculture Pig Advisory Office Conference, 1987, pp. 1-13.

[Black 88] Black J.L. & Barron A. Application of the AUSPIG Computer Package for the Management of Pigs, in Proceedings of the Australian Farm Management Society, 15, 1988, pp. 5-1 - 5-10.

[Bustany 88] Bustany A. & Skingle B. Knowledge-based Development via Application Languages, in Proceedings of the Fourth Australian Conference on Applications of Expert Systems, May 11-13, 1988, pp. 277-302.
