
Ellen Taylor-Powell, Sara Steele, Mohammad Douglah

February 1996

Program Development and Evaluation

Planning a Program Evaluation

G3658-1

Acknowledgements

For their timely and thoughtful review of this publication, the authors wish to thank Phil Holman, Stephen Kluskens, Greg Lamb, Joan LeFebvre, Dave Sprehn, Nancy Stoutenborough, Jack Trzebiatowski and Kathi Vos.


TABLE OF CONTENTS

Introduction
Focusing the evaluation
   What are you going to evaluate?
   What is the purpose of the evaluation?
   Who will use the evaluation? How will they use it?
   What questions will the evaluation seek to answer?
   What information do you need to answer the questions?
      Indicators
      Kinds of information: numerical and narrative
   When is the evaluation needed?
   What resources do you need: time, money, people?
Collecting the information
   What sources of information will you use?
   What data collection method(s) will you use?
   What collection procedures will you use?
Using the information
   How will the data be analyzed?
   How will the information be interpreted, and by whom?
   How will the evaluation be communicated and shared?
Managing the evaluation
   Implementing the plan: timeline and responsibilities
   Budget
   Finalizing the plan
References
Appendices

Introduction

There is no blueprint or recipe for conducting a good evaluation. Because the term evaluation is subject to different interpretations, a program can be evaluated in a variety of ways. Many Extension professionals evaluate their programs informally on an ongoing basis through casual feedback and observation. The resulting information is often very useful and may be all that is needed to keep a program relevant and operating efficiently. You can enhance the value of information garnered from an evaluation, however, if you devote sufficient forethought and planning to the evaluation process. As the term is used in this guide, program evaluation refers to the thoughtful process of focusing on questions and topics of concern, collecting appropriate information, and then analyzing and interpreting the information for a specific use and purpose.

This guide is designed to help you plan a program evaluation. It is organized into four major sections:

■ Focusing the evaluation

■ Collecting the information

■ Using the information

■ Managing the evaluation

Each section presents a series of questions and considerations for you to adapt to your own needs and situation.

You'll discover that evaluation is more than just collecting information. It involves serious reflection on questions such as:

■ What is the purpose of the evaluation?

■ What do I want to know?

■ What do I intend to do with the information?

Answers to these questions are crucial if your evaluation is to produce useful information. This guide will help you think about and answer these and other questions as you plan a program evaluation.

Focusing the evaluation

What are you going to evaluate?

Define what you intend to evaluate. This may or may not be easy depending upon how clearly defined your program is. For example, you might wish to evaluate a financial management program. Think about the program's purpose and content. Do you want to examine the whole program or just a particular component of it? Briefly describe what you want to evaluate: purpose, expected outcomes, intended beneficiaries and activities. What is the planned link between Extension's inputs and the desired outcomes?

The evaluation effort should fit the programming effort. In some cases, you may only be interested in finding out how people responded to your teaching style, or how satisfied they were with a particular event. In other cases, you may want to document behavioral changes or impacts, which require a more comprehensive evaluation and level of effort. The point is to tailor your evaluation to fit the program. Don't expect to measure impact from a single workshop or behavioral changes from a limited media effort.

Remember, not all extension work needs to be formally evaluated. Formal evaluations require time, money and resources. Sometimes programs lack sufficient substance to warrant a formal evaluation, or it may be too costly to collect the evidence needed to demonstrate impact. It may be possible that no one is interested in the findings.

We also need to consider our clientele. People get tired of filling out end-of-session forms or answering surveys. Be selective and considerate; think about what is needed and what will be used.


What is the purpose of the evaluation?

The fundamental purpose of evaluation is to create greater understanding. Within Extension, program evaluations are conducted largely to improve educational efforts and to address accountability. In action, these purposes translate into more specific reasons for conducting an evaluation.

What is the purpose(s) of the evaluation you propose? For example, will it:

■ Help others (taxpayers, administrators, participants, colleagues, committee members) understand the program and its results?

■ Improve the program?

■ Improve your teaching?

■ Measure whether the Extension program made a difference in people's lives?

■ Determine if the program is worth the cost?

■ Answer questions posed by funders and influential members of the community?

■ Meet rank and tenure requirements?

■ Meet administrative requirements?

■ Other?

It is important to clearly articulate the evaluation's purpose. Otherwise, it will lack direction and the resulting information will not be as valuable as it could be.

Who will use the evaluation? How will they use it?

Have you ever completed an evaluation and asked, "What shall I do with it?" Or tallied the results but never really used the findings? If we want to collect relevant data and make the best use of limited resources, we must think about how we'll use our evaluation right from the start.

Sometimes, we conduct an evaluation only for our own use. Usually, however, there are others who have requested or could use the resulting information. Any of the groups listed below might be interested in the evaluation results of an Extension program.

■ People affected in some way by the program (either directly or indirectly), such as program participants, nonparticipants, critics

■ County board members, elected officials

■ Community leaders

■ Colleagues, volunteers, collaborators, supporters

■ Extension administrators

■ Media

■ Tenure committees

■ Grantors

■ Agencies, firms, interest groups

Identify potential users of the information. Find out what they want to know, and how they will use the information (see table 1). If you don't know, ask. This will help you to clarify the purpose(s) of the evaluation, build commitment for it and fine-tune the particular questions the evaluation will address.


Involving others

As with program planning, involving intended users of the information in an evaluation leads to greater commitment to the evaluation process, helps ensure that relevant questions are asked, and increases the chances that findings are listened to and used.

User input may be included throughout the entire evaluation process or just at specific stages, such as when you set the evaluation's focus, determine the information needs, or collect and interpret data. In recent years, considerable emphasis has been placed on involving stakeholders as partners in the evaluation process to ensure that the information collected is relevant and that there is a commitment to use it.

Often, however, when we include users, it is as "helpers" or "data collectors," while we remain in control of the evaluation. Alternative approaches in the field of evaluation aim to change the center of control. Participatory approaches and empowerment evaluation enable people to conduct and use their own evaluations.

An appropriately constituted advisory group or a co-sponsor can be a strong asset. These parties can serve as advocates for the evaluation, see that tasks are completed and help make resources available. As a result, more people respond, the findings receive more attention, and the results are disseminated more widely.


Table 1. Who wants to know what? How will the information be used?

Who might use the evaluation* | What do they want to know? | How will they use the results?

You
Wants to know: Is the program meeting clientele needs? Is my teaching effective?
Will use the results: To make decisions about modifying the program; to influence decisions about tenure or merit.

County board
Wants to know: Who does the program serve? Is the program cost-effective?
Will use the results: To make decisions about budget allocations.

Professional review committee
Wants to know: Are you an effective educator?
Will use the results: To make rank and promotion decisions.

Extension administration
Wants to know: Has the program achieved its expected outcomes? How effective are the extension faculty?
Will use the results: To justify extension programs and ensure financial support; to decide about staff training and achievements.

Clientele
Wants to know: Is the extension program meeting their needs?
Will use the results: To determine whether to participate in other extension programs.

*Examples of broad user categories are listed here. Be as specific as possible when you identify potential users and their interests.

What questions will the evaluation seek to answer?

Make a list of the questions and topics that you and the individuals or groups you have listed want to address. As you do so, review the program's elements. Sometimes programs change as they are implemented; sometimes not all the intended activities are carried out. Defining appropriate questions to be answered by an evaluation depends upon your knowledge of the program.

The following table lists some typical questions raised in Extension circles.


Table 2. Questions raised about extension programs

About outcomes/impacts

■ What do people do differently as a result of the program?

■ Who benefits and how?

■ Are participants satisfied with what they gain from the program?

■ Are the program's accomplishments worth the resources invested?

■ What do people learn, gain, accomplish?

■ What are the social, economic, environmental impacts (positive and negative) on people, communities, the environment?

■ What are the strengths and weaknesses of the program?

■ Which activities contribute most? Least?

■ What, if any, are unintended secondary or negative effects?

■ How well does the program respond to the initiating need?

■ How efficiently are clientele and agency resources being used?

About program implementation

■ What does the program consist of: activities, events?

■ What delivery methods are used?

■ Who actually carries out the program and how well do they do so?

■ Who participates in which activities? Does everyone have equal access?

■ What is Extension's role? What are the contributions of others?

■ What resources and inputs are invested?

■ How many volunteers are involved and what roles do they play?

■ Are the financial and staff resources adequate?

About program context

■ How well does the program fit in the local setting? With educational needs and learning styles of target audiences?

■ What in the socio-economic-political environment inhibits or contributes to program success?

■ What in the setting are givens and what can be changed?

■ Who else works on similar concerns? Is there duplication?

■ Who are cooperators and competitors?

About program need

■ What needs are appropriately addressed through Extension education?

■ What are the characteristics of the target population?

■ What assets in the local context and among target groups can be built upon?

■ What are current practices?

■ What changes do people see as possible or important?

■ Is a pilot effort appropriate?


As you think about the questions you want answered, it may help to review the Bennett hierarchy (see table 3). First used in the 1970s, it continues to be updated and used in Extension for planning and evaluation. The three lowest levels concern program implementation while the four upper levels deal with program results. The logic of the hierarchy is that in Extension programs we expend:

1. resources to conduct

2. activities intended to obtain

3. participation among targeted audiences.

4. Participants' reactions to program activities affect their

5. learning: knowledge, opinions, skills and aspirations. Through learning, people take

6. action which helps achieve

7. impact: social, economic, environmental change.

Tips

■ Evidence of outcomes is stronger as you go up the hierarchy.

■ Difficulty and cost of obtaining evidence increases as you go up the hierarchy.

■ Evaluations are strengthened by showing evidence at several levels of the hierarchy.

■ Information from the lower levels helps to explain results at the upper levels, which are longer-term.

Admittedly, the Bennett hierarchy is a simplified representation of programs and does not indicate the role that larger social, economic and political environments play in extension program delivery. But using this hierarchy can help to describe a program's logic and expected links from inputs to end results. It can be useful in deciding what evidence to use and when. For example, a program may show evidence of accomplishments at the first five levels long before practices are changed, actions are taken or long-term community improvements are made.

Table 3. Bennett's hierarchy of evidence for extension program evaluation


7. Impact: Social, economic, environmental conditions intended as end results, impacts or benefits of programs; public and private benefits.

6. Actions: Patterns of behavior and procedures, such as decisions taken, recommendations adopted, practices implemented, actions taken, technologies used, policies enacted.

5. Learning: Knowledge (awareness, understanding, mental abilities); opinions (outlooks, perspectives, viewpoints); skills (verbal or physical abilities); aspirations (ambitions, hopes).

4. Reactions: Degree of interest; feelings toward the program; positive or negative interest in topics addressed, acceptance of activity leaders, and attraction to educational methods of program activities.

3. Participation: Number of people reached; characteristics/diversity of people; frequency and intensity of contact/participation.

2. Activities: Events, educational methods used; subject matter taught; media work, promotional activities.

1. Resources: Staff and volunteer time; salaries; resources used: equipment, travel.

Source: Bennett and Rockwell, 1995. Targeting Outcomes of Programs (TOP); slightly modified.

Many Extension professionals are concerned with outcomes and with documenting evidence that their programs make a difference in people's lives. But understanding outcomes requires more than just documenting end results. When that happens, we are left with what has been called the "black box" approach to evaluation: we record outcomes but we don't know what led to them. For example, using a pre-test and post-test demonstrates that something happened (we hope), but not how or why, or the role that Extension played. What were the activities? What contribution did Extension make? What factors in the socio-economic context or implementation process influenced the outcomes? Identify those parts that need to be explored relative to your program and situation. At the minimum, documenting Extension's role and resource investments is critical to most evaluations.
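To make the pre-test/post-test point concrete, here is a minimal sketch of how paired scores might be summarized. It is an illustration only, not part of the original guide; the scores are invented, and a real evaluation would load them from the collected instruments.

```python
# Minimal sketch with invented scores: summarizing paired pre/post results.
from statistics import mean, stdev

pre_scores = [12, 15, 9, 14, 11, 16, 10, 13]    # knowledge test before the program
post_scores = [15, 18, 11, 17, 15, 19, 12, 16]  # same participants, afterwards

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]

print(f"Mean gain: {mean(gains):.1f} points (SD {stdev(gains):.1f})")
print(f"Improved: {sum(g > 0 for g in gains)} of {len(gains)} participants")
```

Even a clear mean gain here only shows that something happened; it says nothing about why scores rose or what role Extension played, which is exactly the black-box limitation described above.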

Extension plans programs to have certain positive outcomes. However, unanticipated events may occur that result in positive, negative or neutral outcomes. For example, a program to develop a recreational center for youth may result in an increase in street litter and noise; or, an economic development program may result in new investors coming to town who displace local businesses. Or, there may be unexpected positive benefits which are as impressive or more impressive than the planned outcomes. Think about what some other effects of your program might be. Create an evaluation that will stay tuned to unexpected results.

Clarifying the evaluation question(s)

As you think about the questions that your evaluation will answer, it may be necessary to break a larger question into its component parts. This will help you fully answer the broader question and begin to identify the information you need to collect. Consider the following examples:

Main question: Who benefits from the program?

Sub-questions: Who actually participates in the program? At what level of involvement?

Who else gains from the program? What do they gain?

How do program participants compare to the county population in general?

Who may be negatively affected? How?

Main question: Is the program duplicating other efforts?

Sub-questions: Of what does the program consist?

What other similar programs exist, and of what do they consist?

How are aspects of these programs alike? Dissimilar? Complementary?

What is our particular expertise/niche?

Main question: Did people learn the importance of X?

Sub-questions: Did people know anything about X before attending the program?

Was the environment conducive to learning?

Do any other programs or agencies promote the importance of X?


It will probably be necessary to prioritize the evaluation questions. Try to distinguish between what you need to know and what might be merely nice to know. Focus on the key questions that are most important. When prioritizing, consider time, resources and the availability of needed assistance compared to the value of the information that might be collected. As needed, bring stakeholders together and negotiate a practical list. Above all, keep the evaluation manageable. It is better to stay focused and answer a few questions well.

Define key terms

You may need to define certain terms. For example, "impact": what does it mean? In terms of what? On whom? You may define impact from your perspective, that is, what you wanted to have happen (look back at your desired outcomes). But also check in terms of what the participants themselves see as the impact. Our definitions and program participant definitions are not necessarily the same.

What information do you need to answer the questions?

Once you've identified the key questions, you (and those you are involving in the evaluation) can begin the creative task of figuring out how to answer those questions. Not all pertinent information can be specified in advance, but much can and should be.

Indicators

An indicator expresses that which you wish to know or see. It answers the question, "How will I know it?" It is the indication or observable evidence of accomplishments, changes made, or progress achieved. Indicators are the measurements that answer your evaluation questions. Some examples are provided in table 4.

As we know, leadership is a complex phenomenon that can be displayed in many ways. Ask yourself (and others): what does leadership mean? How would I recognize it if I saw it? If our evaluation seeks to document the impact our program has on developing leadership skills, we first must identify those actions which indicate that a person is demonstrating improved leadership. If our evaluation purpose is to determine the impact a youth development program has on developing capable youth, we first must define what we mean by capable youth and list the characteristics that identify them. These are the indicators you should use to measure the outcome.

Often, a number of indicators are needed to express an outcome more accurately. You will need to clearly define indicators that are appropriate to your program or use those developed and tested elsewhere. In ideal practice, indicators are written during program planning. An example showing indicators for different levels of an extension program is found in Appendix A.


Case example. Program staff defined and evaluated the outcome of a bilingual nutrition education program as knowledge gained (nutrition knowledge gained, proficiency in language). An evaluation showed little, if any, gain in knowledge.

Upon further probing, it was found that the participants were very satisfied with the program. For them, it had been very successful because, at its conclusion, they were able to shop with greater confidence and ease, saving time. The staff's definitions of outcomes missed some important benefits as perceived by the participants.

Remember, a program outcome may mean different things to different people. Therefore, the expression of that outcome may be different. For example, volunteers and youth who experience violence in their daily lives are likely to characterize capable differently than a rural 4-H club member. Hmong or Native Americans may designate different attributes for leadership than participants of Hispanic or European origin. Wherever possible, try to understand the meaning of the program and its outcomes from the participants' perspectives. Include those meanings as the indicators for measuring success.

Also, remember that when you collaborate with other agencies, they may use different indicators. The Department of Natural Resources, for example, might measure water quality in terms of bio-physical characteristics, the local Economic Development Group may measure healthy community in terms of number of jobs per capita, and the Feed Producer's Association may measure agricultural profitability as cost/return ratios.


Table 4. Examples of indicators

Evaluation question | How will I know it? (the indicators)

Evaluation question: Has the expected change in leadership capabilities occurred?
Indicators:
• Ability to negotiate when the group disagrees
• Improved listening skills
• Ability to maintain balance between process and task activities
• Anything else?

Evaluation question: Is water quality improving?
Indicators:
• Less pollution as a result of improved nutrient management
• Balanced species composition in lakes
• Anything else?

Evaluation question: Are young people learning to communicate effectively?
Indicators:
• Increased confidence in expressing ideas clearly
• Improved verbal and non-verbal communication skills
• Improved listening skills
• Anything else?

Evaluation question: Was the collaboration successful?
Indicators:
• Actions taken as a result of the collaboration
• Membership from each segment of the community affected
• Roles, rights and responsibilities clearly delineated
• Communication is open and frequent
• Anything else?

Kinds of information: numerical and narrative

Numerical and narrative data both serve a useful purpose in evaluation. The choice for using one or both types of information depends upon what you want to know about the program and who will receive the information. Ultimately, the value of the information depends upon how it is viewed by your audience. Most evaluations include a mix of numerical and narrative information. The exact mix depends upon the preferences and backgrounds of the people to whom you will communicate.

Think about what type of data is most likely to be understood and viewed as credible by those who will receive the evaluation.

■ Will your evaluation audience be impressed with numbers and statistics?

■ Will your evaluation audience be impressed with human interest stories and examples of real situations?

■ Will a combination of numbers and narrative information be valuable?

When is the evaluation needed?

Deadlines

When the information is needed and what you can manage to do within the timeline will influence the scope of your evaluation. You might decide to save some questions or concerns for another study, or to discard others as unnecessary or inconsequential. Try to develop a realistic timeline for completing the evaluation. Keep the components of your plan manageable so they can be handled well within those time limits.

Usable moments

You may not need evaluation information to meet a specific deadline, but there might be times when having such information would serve a valuable purpose. A few examples of such "usable moments" include important committee meetings, testimony in front of funders, pre-budget hearings, etc.

What resources do you need: time, money, people?

The resources you have available may influence your evaluation plan more than any other single factor. Even if you expect to integrate evaluation into the program or if the evaluation will be conducted by volunteers or the participants themselves, you will need to allocate time for planning. Balancing your expectations (and those of others) with what is realistic and manageable is a challenge. You'll need to consider:

■ Time. Whose time and how much of it is available to work on evaluation? What priority will evaluation have in your overall workload? Involving volunteers or participants is a way to spread the workload, but it may require time for preparation or training.

■ Money. Some activities require financing. For example, what dollar resources are available to print questionnaires, pay for postage, reimburse participants, analyze the data?

■ Expertise. Sometimes we need outside expertise to help with certain parts of an evaluation. Constructing the instrument or analyzing the data may require such help. Or, there may be others with a lot of experience and knowledge related to the program from whom we could learn. Sometimes the involvement of an "outsider" increases the evaluation's credibility.


Collecting the information

Go back and review the questions you wish to ask and the information needed to answer them. Now is the time to think about how you will collect the information.

What sources of information will you use?

Existing information

Remember, you don't always have to go out and collect new data. Explore possible sources of existing information such as other agency records, previous evaluation reports, local service and business reports, WISPOP, the Census Bureau, etc. Printed material about the program (newspaper articles, annual reports and updates, participant logs and records) may be a valuable source of information.

People

The most common source of information is the program's participants and beneficiaries themselves. However, a range of potential "human" sources exist, including nonparticipants, proponents and critics; key informants (school principals, court judges, parents of participants, volunteer leaders and other individuals who are likely to know something about the programs and their effects); program staff and collaborators; legislators; funders; and policymakers.

Be sure that the people you identify can actually provide the information you are seeking.

Observations

An underused, but powerful, source of information is the direct observation of program events, activities and results.

Pictorial records

Another powerful source of information is any pictorial record that shows program activities and effects documented in photos, charts, videotapes, and maps.

What data collection method(s) will you use?

Method

When you think about the type of method to use for collecting the information, consider:

■ Which method is most likely to secure the information?

■ Which method is most appropriate given the values, understandings and capabilities of those who are being asked to provide the information?

■ Which method is least disruptive to your program and to your clientele? Asking questions can be intrusive, time-consuming and/or anxiety-provoking.

■ Which method can you afford and handle well?

The best way to collect data often depends upon an understanding of the social, cultural and political environment.

■ Some participants may not feel comfortable responding over the telephone or in a written format. You will need cultural sensitivity to link an appropriate data collection technique with diverse respondents.

■ If, in the past, only 20% responded to your mail survey, you will need to decide whether that is an adequate representation. Can you get more useful information from another method?

In Extension, we've tended to rely on surveys, tests and end-of-session questionnaires. Currently, focus group interviews are gaining popularity. Altogether there are a variety of techniques from which to choose (see table 5). Select the method which suits your purpose; don't let the method determine your approach. Be creative and experiment with various techniques.


Table 5. Common types of data collection methods used in evaluations of extension programs

Survey
Interview
Test
Observation
Group techniques
Case study
Photograph, videotape, slides
Document review and analysis
Portfolio review
Testimonials
Expert or peer review
Simulated problem or situation
Journal, log, diary
Unobtrusive measures

Multiple data collection methods. Using two or more methods provides a more thorough account and cross-validates your findings. This may not be necessary if the evaluation's purpose is narrow, you have few resources or you expect limited use of your findings.

Instrumentation

In most evaluations, there is some sort of form or device for compiling information, such as a recording sheet, a questionnaire, a video or audio tape. Think about the method you have chosen and decide what is needed to record the information. If a questionnaire or recording sheet is used, check to ensure that it

■ will secure the information you want;

■ will be understood by the respondent and the recorder;

■ will be simple and easy to follow;

■ will be culturally sensitive.

Conduct a pilot test of the questionnaire or recording form with people similar to your proposed respondents or recorders. Avoid the temptation to only use office colleagues to pre-test evaluation instruments and forms. They may understand what you intended when respondents will not. Pilot the instrument and then discuss with the pilot respondents any uncertainties they might have had. This provides a chance to eliminate potential problems.

What collection procedures will you use?

When will the data be collected?

■ Before and after the program. Examples: pre- and post-measures.

■ At one time. Examples: a single survey; an end-of-program assessment; a group debate.

■ At various times during the course of the program. Examples: a panel survey; a mix of methods implemented at various times; at three and six months.

■ Continuously throughout the program. Example: logs kept by youth throughout a weekend camping experience.

■ Over time. Example: a longitudinal survey that documents practices over several years.

Will a sample be used?

Whether to sample or not depends upon the purpose of the evaluation, the size of the population, and the method used. For example, it may be possible to administer a post-test to all 100 participants in a workshop to ascertain their level of learning and satisfaction. But if you want interview data from these same participants, you may not be able to interview all 100. And if you want to generalize to the whole group, you will need a probability sample. Sampling also depends upon the number of people in your program. If your program is a focused effort working with 20 at-risk teenagers, you will probably want to include all 20 in your evaluation.



We tend to think of sampling in terms of people. One can also take a sample of documents, program sites or locations. For example, rather than collecting information from all the county 4-H clubs, you may wish to focus your resources and take a random sample of clubs. Or, you may want to stratify your sample by age, club activity, or location. To do so will require particular attention to your sample size and selection. In other cases, you may wish to learn about select groups without needing to generalize to all the groups. Then, a nonprobability sample is appropriate.
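As an illustration of these sampling choices, the sketch below draws both a simple random sample and a sample stratified by location, using only Python's standard library. The club names and strata are hypothetical, not from the guide.

```python
# Minimal sketch with a hypothetical sampling frame of county 4-H clubs.
import random

random.seed(42)  # fixed seed so the draw can be reproduced

clubs = [(f"Club {i:02d}", "rural" if i % 2 else "urban") for i in range(1, 21)]

# Simple random sample: every club has the same chance of selection.
simple = random.sample(clubs, k=5)

# Stratified sample: draw separately within each location stratum so that
# both rural and urban clubs are represented.
strata = {}
for name, location in clubs:
    strata.setdefault(location, []).append(name)
stratified = {loc: random.sample(names, k=3) for loc, names in strata.items()}

print("Simple random sample:", [name for name, _ in simple])
print("Stratified sample:", stratified)
```

Either of these is a probability sample, the kind needed to generalize to all the clubs; a convenience (nonprobability) sample would be fine for learning about select groups only.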

Consider what kind and size of sample will be most credible to the people you want to pay attention to the findings.

Note: Some professional evaluators argue that it is better to sample and use several data collection techniques in an evaluation than to expend all your resources on collecting data from the entire population using a single instrument. Also, political concerns may need to be considered. For example, political officials or legislators may only see the evaluation as credible if it includes their district or a large number of respondents.

Who will collect the data?

You may be the only one collecting information, but more and more frequently, others are also involved in evaluation, particularly in data collection. Training or support may be needed to help them do their job.

What is the schedule for data collection?

■ When will the information be available?

■ When can the information be conveniently collected? When will it be least disruptive?

■ Where will the information collection take place?

■ When will data collection start and end?

Consider your respondents. Convenient times and places are likely to differ depending upon whether your respondents are farmers, business owners, men, women, single parents, school teachers, or some other group. Likewise, there may be culturally appropriate meeting times and locations.

A sample worksheet in Appendix B covers the "information" aspects of planning an evaluation.

Using the information

Evaluation involves more than just collecting information. The information must be organized and presented in a way that permits people to understand it.

How will the data be analyzed?

Organizing, tabulating and analyzing your data to permit meaningful interpretation takes time and effort; often, more time and effort than you'd expect. Factor this in when you design your evaluation. If resources are limited, you may want to structure your evaluation to be smaller, rather than larger.

The aim of data analysis is to synthesize information to make sense out of it. Different techniques are appropriate depending upon whether you have qualitative (narrative, natural language) or quantitative (numerical) data.

Consider such questions as:

■ How will responses be organized/tabulated? By hand? By computer?

■ Do you need separate tabulations from different locations or groups?

■ What, if any, statistical techniques will be used?

■ How will narrative data be analyzed?

■ Who will organize and analyze the information?
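As one minimal way of answering those tabulation questions by computer, the sketch below uses the pandas library (an assumption; the guide does not prescribe any software) to tally hypothetical responses separately by location.

```python
# Minimal sketch with invented responses: separate tabulations by location.
import pandas as pd

responses = pd.DataFrame({
    "location": ["County A", "County A", "County B", "County B", "County B"],
    "satisfied": ["yes", "no", "yes", "yes", "no"],
})

# Raw counts of satisfied/unsatisfied responses within each location.
print(pd.crosstab(responses["location"], responses["satisfied"]))

# Row percentages are often easier to interpret than raw counts.
print(pd.crosstab(responses["location"], responses["satisfied"],
                  normalize="index").round(2))
```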


How will the information be interpreted, and by whom?

Interpretation is the process of attaching meaning to the analyzed data. Too often we analyze data but fail to take the next step: putting the results in context and drawing conclusions. For example, what does it mean that 45% of the respondents reported having their wells tested? Is this greater or less than last year? Is this number high or low for X county? What does it mean in terms of health and safety? What, if anything, should be done next?

Numbers do not speak for themselves. They need to be interpreted based on careful and fair judgements. Similarly, narrative statements need interpretation.
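Continuing the well-testing example above with invented numbers, a small calculation can support (but never replace) that interpretive judgement; the counts below and the comparison year are hypothetical.

```python
# Minimal sketch with invented counts: is this year's 45% a real change
# from last year's 38%, or within the range of chance variation?
from math import sqrt

tested_now, n_now = 90, 200    # 45% of 200 respondents this year
tested_then, n_then = 68, 180  # about 38% of 180 respondents last year

p_now, p_then = tested_now / n_now, tested_then / n_then
p_pool = (tested_now + tested_then) / (n_now + n_then)

# Two-proportion z-statistic; |z| above roughly 2 suggests the change is
# unlikely to be chance alone.
z = (p_now - p_then) / sqrt(p_pool * (1 - p_pool) * (1 / n_now + 1 / n_then))
print(f"{p_now:.0%} vs {p_then:.0%}, z = {z:.2f}")
```

Here z is about 1.4, so the apparent rise could plausibly be chance; even a larger z would still leave the contextual questions (what it means for health and safety, what to do next) to human judgement.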

Who should be involved in interpreting the results of the data analysis?

The same information can be interpreted in various ways. As the program director, you may have your own perspective. Others will look at the data through different eyes. Greater understanding usually results when we involve others or take time to hear how different people interpret the same information. One way to do that is through meetings with small groups to discuss the data. Think about including program participants when discussing the meaning of the information.

What is the base for interpreting the data?

Consider how you will make sense of the results. To what will the information be compared: findings from other evaluations? Baseline data? Initiating evidence of need? Predefined standards of expected performance ("what should be")?

Who sets the basis for comparison?

■ Expert/professional judgements

■ Program personnel judgements

■ Participants' judgements

■ Existing research; experience from similar programs

■ Your own personal judgement

What are the conclusions and recommendations?

Summarize the three to five main points that you feel are most important from the evaluation: the points you really want to remember and have other people remember. As appropriate, provide recommendations that you feel follow from these findings.

What did we learn? What will we do differently?

If we agree that the underlying purpose of any evaluation is to promote understanding and learning about extension programs, then the ultimate result is to articulate what we learned: about the program, about our professional competencies, about the process of the evaluation. What will we do as a result of these insights? Often, it is useful to lay out an action plan. When conducting the evaluation in collaboration with others, such as a community group, a farmers' association or a 4-H club, developing an action plan helps ensure the results are used.

How will the evaluation be communicated and shared?

To whom?

Look back at who was identified early on as a key user. Target key decision makers with appropriate and hard-hitting information. Share with your colleagues who may need to conduct a similar evaluation. Is there anyone else who might, or should be, interested in the evaluation results?

Remember to communicate your findings to the respondents who participated in your evaluation. Not only is this courteous, but it helps to ensure their cooperation in future work.

How?

You have expended time and resources in conducting your evaluation. Now you need to maximize your investment in the project. Think about other ways you might get some mileage from your effort. Remember that citing a finding or two in informal conversations may have more influence than a formal report.


Communication methods you use will depend upon your audience. A variety of possibilities exist, such as:

■ A written report

■ Short summary statements

■ Film or videotape

■ Pictures, photo essays, wall charts, bulletin boards, displays

■ Slide-tape presentations

■ Graphs and visuals

■ Media releases

■ Internet postings

Invite your audiences to suggest ways they'd like to receive the information; for example, dates when it would be most valuable; useful formats; effective displays or graphs; other recommendations that would maximize its use.

Managing the evaluation

Implementing the plan: timeline and responsibilities

There are various ways to lay out responsibilities and a timeline for managing an evaluation. Construct your own or see the examples in Appendices C and D. You may wish to post the chart in Appendix C in an obvious location where all involved can see it.

Budget

If necessary, establish a budget to cover costs such as printing or duplicating (data collection instrument, reports), communications (postage, telephone calls), incentives or rewards to respondents, providing a meal or other reimbursement for group participants, making or editing a videotape, travel and per diem costs, data processing, or consultants' fees.

Finalizing the plan

■ Assess the feasibility of carrying out your plan.

■ Do you foresee any barriers or obstacles?

■ Refine and revise as necessary.

References

Bennett, C. and K. Rockwell. 1995. Targeting Outcomes of Programs (TOP). USDA, draft document.

Brinkerhoff, R.O., D. Brethower, T. Hluchyj and J. Nowakowski. 1983. Program Evaluation: A Practitioner's Guide for Trainers and Educators: Sourcebook, Casebook. Boston: Kluwer-Nijhoff.

Clouthier, D., B. Lilley, D. Phillips, B. Weber and D. Sanderson. 1987. A Guide to Program Evaluation and Reporting. University of Maine Cooperative Extension Service.

Froke, Barbara. 1980. The Answers to Program Evaluation: A Workbook. Adapted by Spiegel and Leeds, August 1992. Ohio Cooperative Extension Service, Ohio State University.

Herman, Joan, Lynn Morris and Carol Fitz-Gibbon. 1987. Evaluator's Handbook. Newbury Park, CA: Sage Publications.

Patton, M.Q. 1982. Practical Evaluation. Beverly Hills, CA: Sage Publications.

Rockwell, S. Kay. 1993. Program Evaluation in Adult Education and Training. Student manual for a satellite program consisting of 15 modules. University of Nebraska-Lincoln and Agriculture Satellite Corporation.

Stecher, B.M. and W.A. Davis. 1987. How to Focus an Evaluation. Newbury Park, CA: Sage Publications.

Worthen, B.R. and James R. Sanders. 1987. Educational Evaluation: Alternative Approaches and Practical Guidelines. New York: Longman.


Appendix A

Example showing indicators for different levels in an extension program.

Program level | Expected achievements | Indicators

7. Impact
Expected achievements: Town makes development decision based on good planning techniques.
Indicators: Need for zoning clarified. Town adopts comprehensive plan. Effective moratorium on condos. Citizen satisfaction.

6. Actions
Expected achievements: Individuals use skills or knowledge; board works better with other town departments; new relationships are formed.
Indicators: Planning board creates and proposes comprehensive plan, functions as a cohesive unit, schedules sessions with decision makers; new working relationships evolve.

5. Learning
Expected achievements: Acquire sufficient knowledge about planning and skills with group process. Develop positive attitudes about planning. Choose to act.
Indicators: Community planning techniques learned. Group functioning understood. Process noted informally among members and Extension staff.

4. Reactions
Expected achievements: Group members maintain level of interest and accept leadership responsibilities. Extension staff role is appropriate.
Indicators: Progress is made, deadlines kept. Board takes initiative in planning process. Group is satisfied with progress and Extension's role.

3. Participation
Expected achievements: Appropriate people are involved as members and as technical resources.
Indicators: Broad-based representation; each member accepts part of the work; appropriate resource people (technical and key community leaders) take part.

2. Activities
Expected achievements: Needs assessment; facilitate prioritization process; four meetings.
Indicators: Needs assessment completed. Problems defined and written. Objectives and priorities set.

1. Inputs
Expected achievements: Volunteer/citizen participation; 100 hours Extension time; specialist input.
Indicators: Public or official support and sanctions; agreement (contract) made between group and Extension staff; group membership established; contract formed if necessary.

Appendix B

Sample worksheet for summarizing an evaluation plan. The blank worksheet has the following columns:

Evaluation questions | Information required | Information source | Method for collecting information

(continued)

Information collection arrangements (by whom, conditions, when) | Analysis procedures | Interpretation procedures and criteria | Reporting of information (to whom, how, when)

Sample worksheet developed by Rockwell, 1993, Module 8.3, based on a worksheet for summarizing an evaluation plan from Worthen & Sanders, 1987 (p. 244).

Appendix C

Evaluation schedule and input requirements. The blank chart has the following columns:

Activity/Task | Person responsible | Dates (start, finish) | Inputs needed to accomplish task

Appendix D

The Gantt chart

A Gantt chart is a simple display that includes proportionate, chronologically scaled timeframes for each evaluation task. The chart provides an overview of the entire evaluation process and illustrates when evaluation activities will begin and how long each will continue.

The vertical axis lists the tasks to be completed. The horizontal axis shows a time scale. A horizontal line is drawn for each task to show how long it will take. Milestones (important interim deadlines) are keyed with symbols.

Figure 1. Example of a Gantt chart. [An 18-week schedule with one bar per task: 1. Develop instrument; 2. Test instrument & revise; 3. Select sample; 4. Prepare for mailing; 5. Data collection; 6. Data analysis; 7. Draft report; 8. Final report. A milestone symbol marks each point when a product is available.]

Uses

1. Communicates the evaluation plan to a nontechnical audience; therefore it is useful in proposals or reports.

2. Helps in time management, since it forces one to examine the length of time each project task will require, to contemplate the overlap between tasks, and to establish a realistic timeframe for the entire project.

A Gantt chart reflects the evaluator's planning. Although it is easy to prepare, it is useful only when all evaluation steps are accounted for within realistic time frames. One must allow sufficient time for each step in the evaluation process, starting from focusing the evaluation through the final report.

Source: Rockwell, 1993, Module 8.9; based on Worthen and Sanders, 1987, pp. 256-257.

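A chart like Figure 1 can be produced in a few lines. The sketch below uses matplotlib (an assumption; the guide predates such tools and specifies none), with week spans read loosely off the figure.

```python
# Minimal sketch: drawing a Gantt chart like Figure 1 with matplotlib.
import matplotlib.pyplot as plt

# (task, start week, duration in weeks), loosely following Figure 1.
tasks = [
    ("Develop instrument", 1, 3),
    ("Test instrument & revise", 3, 2),
    ("Select sample", 4, 2),
    ("Prepare for mailing", 5, 2),
    ("Data collection", 6, 6),
    ("Data analysis", 11, 3),
    ("Draft report", 13, 3),
    ("Final report", 16, 3),
]

fig, ax = plt.subplots(figsize=(8, 3))
for row, (name, start, weeks) in enumerate(tasks):
    ax.barh(row, weeks, left=start, height=0.5)  # one bar per task
ax.set_yticks(range(len(tasks)))
ax.set_yticklabels([name for name, _, _ in tasks])
ax.invert_yaxis()  # first task at the top, as in the original chart
ax.set_xlabel("Time (weeks)")
plt.tight_layout()
plt.savefig("gantt.png")
```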

Authors: Ellen Taylor-Powell is a program development and evaluation specialist for Cooperative Extension, University of Wisconsin-Extension. Sara Steele is a professor of continuing and vocational education at the University of Wisconsin-Madison.

An EEO/Affirmative Action employer, University of Wisconsin-Extension provides equal opportunities in employment and programming, including Title IX and ADA requirements. Requests for reasonable accommodation for disabilities or limitations should be made prior to the date of the program or activity for which they are needed. Publications are available in alternative formats upon request. Please make such requests as early as possible by contacting your county Extension office so proper arrangements can be made.

© 1996 by the Board of Regents of the University of Wisconsin System doing business as the division of Cooperative Extension of the University of Wisconsin-Extension. Send inquiries about copyright permission to: Director, Cooperative Extension Publications, 201 Hiram Smith Hall, 1545 Observatory Dr., Madison, WI 53706.

This publication is available from: Cooperative Extension Publications, Room 170, 630 W. Mifflin Street, Madison, WI 53703. Phone: (608) 262-3346.

G3658-1 Program Development and Evaluation, Planning a Program Evaluation

Ellen Taylor-Powell, Sara Steele, Mohammad Douglah

February 1996

Program Development and Evaluation

Planning a Program Evaluation: Worksheet

G3658-1W

Focusing an evaluation

1. What are you going to evaluate?

2. What is the purpose of the evaluation?

3. Who will use the evaluation? How will they use it?

Who/users | How will they use the information?

How may others be involved in the evaluation? __________________________________________

4. What questions will the evaluation seek to answer?

5. What information do you need to answer the questions?

What I wish to know | Indicators (how will I know it?)

6. When is the evaluation needed? ____________________________________________________________

7. What resources do you need?

a. Time available to work on evaluation: _____________________________________________

_______________________________________________________________________________

b. Money: _______________________________________________________________________

_______________________________________________________________________________

c. People (professional, paraprofessional, volunteers, participants): _____________________

_______________________________________________________________________________


Collecting the information

8. What sources of information will you use?

Existing information: ____________________________________________________________

People: _________________________________________________________________________

Observations: ___________________________________________________________________

Pictorial records: ________________________________________________________________

9. What data collection method(s) will you use?

■■ Survey
■■ Interview
■■ Observation
■■ Group techniques
■■ Case study
■■ Tests
■■ Photos, videos
■■ Document review
■■ Testimonials
■■ Expert panel
■■ Simulated problems or situations
■■ Journal, log, diary
■■ Unobtrusive measures
■■ Other (list) ___________________________

Instrumentation: What is needed to record the information?

10. What data collection procedures will be used?

When will you collect data for each method you've chosen?

Method | Before program | During program | Immediately after | Later

Will a sample be used?________________________________________________________________

No ■■

Yes ■■ If yes, describe the procedure you will use. ___________________________________

_________________________________________________________________________________

_________________________________________________________________________________

Who will collect the data? _____________________________________________________________


Using the information

11. How will the data be analyzed?

Data analysis methods: ________________________________________________________________

Who is responsible: ___________________________________________________________________

12. How will the information be interpreted, and by whom?

Who will do the summary?_____________________________________________________________

13. How will the evaluation be communicated and shared?

To whom | When/where/how to present

Managing the evaluation

14. Implementation plan: timeline and responsibilities

Management chart_________________________________________________________________

Budget ___________________________________________________________________________


University of Wisconsin-Extension ■ Cooperative Extension, 630 W. Mifflin Street, Room 170, Madison, WI 53703. Phone: 608/262-2169; fax: 608/262-9166.

February 1996