
How Do We Know It’s Working?

Creating Evaluations for Technology Projects and Evaluations (Part I)

Contact Information

jsun@sun-associates.com
978-251-1600 ext. 204

www.edtechevaluation.com (this presentation will be linked to that site, on the Tools page)

Where Do We Stand?

Who’s working on an actual project? Current? Anticipated?

Your expectations for today

Workshop Goals

To review the key elements of effective program evaluation as applied to technology evaluations

To consider evaluation in the context of your actual projects

Why Evaluate?

To fulfill program requirements
NCLB, and hence Title IID, carry evaluation requirements

To realize your investment in technology
What sort of "difference" has all of this technology made?

Basis in NCLB

“The application shall include:…

A description of the process and accountability measures that the applicant will use to evaluate the extent to which activities funded under this subpart are effective in integrating technology into curricula and instruction, increasing the ability of teachers to teach, and enabling students to meet challenging State academic content and student academic achievement standards.”

NCLB Act, Title II, Part D, Section 2414(11)

One consistent thread in NCLB is evaluation and assessment: how can you document that this "intervention" is making a difference?

All funded work must be based in reflection and data-driven decision-making

Naturally, this translates to local district proposals

A Framework for Review

From Designing Professional Development for Teachers of Science and Mathematics, Loucks-Horsley, Hewson, Love, and Stiles. Corwin Press Inc., 1998

Evaluation

Helps clarify project goals, processes, and products

Must be tied to indicators of success written for your project's goals

Not a "test" or checklist of completed activities

Qualitatively, are you achieving your goals?

What adjustments can be made to your project to realize greater success?

The Basic Process

Evaluation Questions: tied to original project goals

Performance Rubrics: allow for authentic, qualitative, and holistic evaluation

Data Collection: tied to indicators in the rubrics

Scoring and Reporting: the role of this committee (the evaluation committee)
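As a rough illustration only (not part of the workshop materials), here is a minimal sketch in Python of how these four pieces can hang together: an evaluation question tied to a project goal, a descriptive rubric, the data sources behind its indicators, and a simple holistic roll-up of committee scores. All names, fields, and the four-level scale are assumptions made for the example.

```python
# A minimal sketch (illustrative assumptions only): one evaluation question
# tied to a project goal, judged against a descriptive rubric, with the
# committee's scores rolled up for reporting. The 1-4 scale is hypothetical.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class EvaluationQuestion:
    goal: str                # the original project goal this question is tied to
    question: str            # the evaluation question itself
    rubric: dict             # score level -> descriptive indicator of performance
    data_sources: list       # where evidence for the indicators will come from
    committee_scores: list = field(default_factory=list)

    def holistic_score(self):
        """Average of the evaluation committee's rubric scores."""
        return mean(self.committee_scores)

q = EvaluationQuestion(
    goal="Improve student achievement through authentic science learning",
    question=("Have technology-enhanced science units improved student "
              "mastery of the skills of inquiry?"),
    rubric={
        1: "Units exist but show little evidence of student inquiry",
        2: "Some units engage students in guided inquiry",
        3: "Most units engage students in open-ended, technology-supported inquiry",
        4: "Inquiry is pervasive and student work shows clear growth",
    },
    data_sources=["classroom observations", "student work products", "teacher interviews"],
    committee_scores=[3, 2, 3, 3],
)

print(round(q.holistic_score(), 2))  # 2.75 on the hypothetical 4-point scale
```

The point is only the linkage the slide describes: questions come from goals, rubrics describe performance, and data collection and scoring hang off the rubric's indicators.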

Creating a District-wide Technology Evaluation

Generate leadership support
Determine scope of the evaluation effort
Formulate evaluation questions
Appoint committee
Review questions
Develop indicator rubrics
Data collection
Data analysis
Scoring the rubrics
Recommendations
Dissemination of report findings
Initiating the next review cycle

Orient and train the in-district evaluation committee

Stage 1: Committee orientation, evaluation framing, and training
Stage 2: Data collection and analysis
Stage 3: Findings, recommendations, and reporting

Who Evaluates?

Committee of stakeholders (p. 12)
Outside facilitator?
Data collection specialists?
Task checklist

Other issues:
Honesty
Perspective
Time-intensive

Evaluation Starts with Goals

Evaluation should be rooted in your goals for how you are going to use or integrate that technology, that is, in a plan that:

Is more than an infrastructure plan
Focuses on technology's impact on teachers and students
Has clear goals and objectives for what you want to see happen

Evaluation Logic Map

Project Sample

Your Project?

Using the Evaluation Logic Map, map your:
Project purpose/vision
Goals
Objectives
Actions

Goals Lead to Questions

What do you want to see happen? These are your goals.
Rephrase goals into questions.

Achieving these goals requires a process that can be measured through a formative evaluation

We Start with Goals…

To improve student achievement through students' participation in authentic and meaningful science learning experiences.

To provide advanced science and technology learning opportunities to all students regardless of learning styles or abilities.

To produce high quality science and technology curriculum in which the integration of technology provides “added value” to teaching and learning activities.

To increase students' knowledge of the Connecticut River's history and geology, and to gain an understanding of its past, present, and possible future environmental issues.

…and move to questions

Has the project developed technology-enhanced science learning experiences that have been instrumental in improving student mastery of the Skills of Inquiry, understanding of the history/geology/ecology of the Connecticut River, and of the 5-8 science curriculum in general?

Has the project offered teacher professional development that has resulted in improved teacher understanding of universal design principles and technology integration strategies?

…And Then to Indicators

What is it that you want to measure?

Whether the projects have enhanced learning

The relationship between the units and:
The selected curriculum
The process by which they were developed

Increases in teacher technology skills (in relation to particular standards)

Whether the professional development model met its design expectations:
Collaborative and sustainable
Involves multiple subjects and administrators

Indicators should reflect your project's unique goals and aspirations, rooted in the proposed work

Indicators must be indicative of your unique environment...what constitutes success for you might not for someone else

Indicators need to be highly descriptive and can include both qualitative and quantitative measures

Try a Sample Indicator

Going back to the Logic Map, try to develop a few indicators for your sample project:
Keep it simple
Qualitative and quantitative
Will you be able to see the indicator?
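If it helps to see the "keep it simple, qualitative and quantitative, can you see it" test in concrete form, here is a hypothetical sketch of a single indicator record; the fields and sample wording are invented for illustration, not drawn from any actual project.

```python
# Hypothetical shape for one indicator: a descriptive statement plus the
# qualitative evidence and quantitative measure that would let a committee
# actually "see" it. All field names and sample text are invented.
from dataclasses import dataclass

@dataclass
class Indicator:
    statement: str              # highly descriptive statement of success
    qualitative_evidence: str   # what you would observe or hear if it is true
    quantitative_measure: str   # what you could count or survey
    data_source: str            # where the evidence will come from

sample = Indicator(
    statement="Teachers use the technology-enhanced units to engage students in inquiry",
    qualitative_evidence=("Observed lessons show students posing and testing "
                          "their own questions with the project tools"),
    quantitative_measure="Share of science teachers using at least one unit per semester",
    data_source="classroom observations and the teacher survey",
)

# Quick self-check before adopting an indicator: is it actually observable?
observable = bool(sample.qualitative_evidence or sample.quantitative_measure)
print(observable)  # True
```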

To Summarize...

Start with your proposal or technology plan

From your goals, develop indicators and a performance rubric

Coming in Part II

Data Collection
Reporting

How Do We Know It’s Working?

Creating Evaluations for Technology Projects and Evaluations (Part II)

Creating a District-wide Technology Evaluation

(Review cycle and committee stages as outlined in Part I: from generating leadership support through initiating the next review cycle.)

A Basic Process

Evaluation Questions: must be tied to original planning goals

Performance Rubrics: allow for authentic, qualitative, and holistic evaluation

Data Collection: tied to indicators in the rubrics

Scoring and Reporting

Measures?

Classroom observation, interviews, and work-product review: what are teachers doing on a day-to-day basis to address student needs?

Focus groups and surveys: measuring teacher satisfaction

Triangulation with data from administrators and staff: do other groups confirm that teachers are being served?

Data Collection

Review existing data:
Current technology plan
Curriculum
District/school improvement plans

www.sun-associates.com/eval/sample

Create a checklist for data collection

Surveys

Creating good surveys:
Length
Differentiation (teachers, staff, parents, community, etc.)
Quantitative data
Attitudinal data
Timing/response rates (getting returns!)

www.sun-associates.com/eval/samples/samplesurv.html
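Because getting returns is called out as a survey issue, one practical habit is to track response rates by respondent group as surveys come back. The sketch below uses made-up counts purely for illustration.

```python
# Track survey response rates by respondent group; all counts here are
# made-up illustrative numbers, not data from any real survey.
distributed = {"teachers": 120, "staff": 40, "parents": 600, "community": 200}
returned = {"teachers": 85, "staff": 31, "parents": 210, "community": 42}

for group, sent in distributed.items():
    rate = returned[group] / sent
    # The 50% flag threshold is arbitrary; adjust to your own expectations.
    note = "" if rate >= 0.5 else "  <- low return: send a reminder or try another channel"
    print(f"{group:10s} {returned[group]:3d}/{sent:<3d} = {rate:6.1%}{note}")
```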

Surveys

Online survey tools: Profiler, LoTi, Zoomerang

Survey Issues

Online surveys produce high response rates

Easy to report and analyze data
Potential for abuse
Depends on access to connectivity

Focus Groups/Interviews

Teachers
Parents
Students
Administrators
Other stakeholders

Classroom Observations

Using an observation template
Using outside observers

Other Data Elements?

Artifact analysis: a rubric for analyzing teacher and student work?

Solicitation of teacher/parent/student stories: a way to gather truly qualitative data. What does the community say about the use and impact of technology?

Dissemination

Compile the report

Determine how to share the report:
School committee presentation
Press releases
Community meetings

Conclusion

Build evaluation into your technology planning effort

Remember, not all evaluation is quantitative

You cannot evaluate what you are not looking for, so it's important to develop expectations of what constitutes good technology integration

More Information

jsun@sun-associates.com
978-251-1600 ext. 204

www.sun-associates.com/evaluation
www.edtechevaluation.com

This presentation is linked to that page
