
Evaluating the DoD Presidential Technology Initiative:

Innovative Methods to Measure Student Outcomes

Davina C. D. Klein & Christina Glaubke, CRESST/UCLA

Louise Yarnall, Center for Technology in Learning, SRI International

Harold F. O’Neil, Jr., CRESST/USC

Paper presented as part of the symposium “Quantitative and Qualitative Strategies for Evaluating Technology Use in Classrooms”

AERA New Orleans—April 2000


PTI Program Background

In 1995, President Clinton set four goals for technology in education:

computer access for students and teachers

connectivity to the Internet for classrooms

courseware to support quality curriculum

competent teachers trained in technology

The Department of Defense Education Activity (DoDEA) responded with the Presidential Technology Initiative (PTI)


PTI Participants

PTI program implemented at 11 selected DoDEA school testbed sites across the world

Selected testbed sites were required to have:

Minimum hardware and connectivity configurations

Technology implementation plans

School-wide support (e.g., staff)


PTI Project Goals

“To develop and implement effective strategies for curriculum and technology integration”

Local site objectives included:

Evaluation and alignment of courseware

Development of technology integration plans

Integration of software and PTI courseware tools into the DoDEA curriculum


Evaluation Steps

Step 1: Identify program goals

Specific expectations

Our focus was on PTI students:

General achievement measures

Student attitudinal measures

Content-specific performance measures

Technology-specific performance measures


Examining Technology Outcomes

Classroom Outcomes: Integration of technology and curriculum; new instructional practices

System Outcomes: Computers, connectivity, courseware; professional development; support for innovative teaching

Teacher Outcomes: Skilled teachers

Student Outcomes: Increased performance; better attitudes


Evaluation Steps (cont.)

Step 2: Describe how the program plans to achieve its goals

Theory of Action for the PTI program


Achieving Technology Goals

[Diagram: Theory of Action linking system outcomes, teacher outcomes, classroom outcomes, and student outcomes]


Evaluation Steps (cont.)

Step 3: Measure intended outcomes

Students’ attitudes toward technology

Students’ content-specific knowledge (focus on courseware tools)

Students’ Web fluency

Student-perceived classroom practices


Measurement Instrumentation

Common measures: General impact of PTI on all students

Technology Questionnaire

On-line Web Expertise Assessment (WEA)

Student interviews

Courseware-specific measures: Detailed, courseware-by-courseware examination of tool impact on students

Content-specific performance-based assessment

PTI courseware usability studies


Technology Questionnaire

Purpose: Measure students’ attitudes toward technology and perceptions of classroom practices

36-item paper-and-pencil survey

Students rated statements on scale of 1 (“I really don’t agree”) to 5 (“I really agree”)

“I feel comfortable using computers”

“In class we use computers to solve problems or answer questions”
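As one illustration of how 1-to-5 ratings like these are commonly turned into scale scores, the minimal Python sketch below averages a student's item ratings after reverse-coding negatively worded items. The item names, the choice of reverse-coded items, and the scoring rule are illustrative assumptions, not the TQ's actual scoring key.

```python
# Illustrative sketch: turning 1-5 ratings into a scale score by averaging,
# with negatively worded items reverse-coded (6 - rating on a 1-5 scale).
# Item names and the reverse-coded set are assumptions, not the actual TQ key.

REVERSE_CODED = {"computer_schoolwork_waste_of_time"}

def scale_score(ratings: dict) -> float:
    """ratings: item name -> 1-5 rating for one student."""
    adjusted = [
        6 - value if item in REVERSE_CODED else value
        for item, value in ratings.items()
    ]
    return sum(adjusted) / len(adjusted)

student = {
    "comfortable_using_computers": 4,
    "fun_figuring_out_computers": 5,
    "computer_schoolwork_waste_of_time": 2,  # reverse-coded to 4
}
print(round(scale_score(student), 2))  # (4 + 5 + 4) / 3 = 4.33
```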


Web Expertise Assessment

Purpose: Examine effects of Web usage in the classroom

Student training, then a 20-minute assessment session

Presented students with authentic search tasks

Asked students to navigate and search for relevant information in a closed Web-based environment, then bookmark relevant findings

All measures logged and coded
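A minimal sketch of how a logged WEA session might be coded into the kinds of summary measures reported later (steps, searches, bookmarks, revisits, bookmark efficiency). The log format, event names, and the efficiency definition below are assumptions for illustration; the actual WEA logging and scoring rubric are not reproduced here.

```python
# Hypothetical sketch of coding a logged WEA session into summary measures.
# Event names, the log format, and the efficiency definition are assumptions.

from collections import Counter

# Each event: (action, target, relevance score 0-3 for bookmarks, else None)
sample_log = [
    ("search", "free education inaugural address", None),
    ("visit", "results_page_1", None),
    ("visit", "page_jefferson", None),
    ("bookmark", "page_jefferson", 3),
    ("visit", "page_lincoln", None),
    ("bookmark", "page_lincoln", 1),
    ("visit", "page_jefferson", None),  # revisit
]

def code_session(log):
    bookmarks = [(target, score) for action, target, score in log if action == "bookmark"]
    searches = [target for action, target, _ in log if action == "search"]
    visits = [target for action, target, _ in log if action == "visit"]

    revisits = sum(count - 1 for count in Counter(visits).values())
    relevant = sum(1 for _, score in bookmarks if score >= 2)
    # Assumed definition: share of bookmarked pages judged relevant to the task.
    efficiency = relevant / len(bookmarks) if bookmarks else 0.0

    return {
        "total_steps": len(log),
        "num_searches": len(searches),
        "num_bookmarks": len(bookmarks),
        "revisits": revisits,
        "bookmark_efficiency": round(efficiency, 2),
    }

print(code_session(sample_log))
```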


WEA Search Task

Imagine you are learning about the U.S. presidents in your history class. Your teacher has asked you to write a report about what presidents said during their speeches when first elected to office. She has asked you to find out which presidents spoke of the importance of an educational system available to all without charge.

Use WEA to find this information for your report.

Find as many useful pages as you can.

Bookmark pages by clicking on the Add Bookmark button near the top of your screen.

You may bookmark as many useful pages as you think necessary.

[Six slides here contained no recoverable text]

WWW Background Questionnaire

Purpose: Evaluate students’ background knowledge regarding the World Wide Web

7-item paper-and-pencil survey

Students rated statements on scale of 1 (“I really don’t agree”) to 5 (“I really agree”)

“The information on the World Wide Web is not very useful”


Student Interviews

Purpose: To obtain further information about students’ attitudes toward technology and their perceptions of classroom practices

Brief 5- to 10-minute interviews

Three students interviewed per class

Qualitative data supplements quantitative findings


Evaluation Participants

6 schools at 2 DoDEA sites:

3 elementary schools

2 middle schools

1 middle/high school

21 classrooms

181 students participated in both pre- and posttest sessions


Pre-Post Comparisons

Data aggregated to the PTI program intervention level, the classroom (see the aggregation sketch below)

N = 14 classrooms

4 of 21 classrooms not included because they completed a modified questionnaire due to the young age of the students

3 additional classrooms dropped because of lack of overlap between pretest and posttest samples
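Since the data were aggregated to the classroom level before pre-post comparisons, here is a minimal pandas sketch of that step. The column names and toy rows are placeholders, not the evaluation dataset.

```python
# Minimal sketch of aggregating student-level scale scores to the classroom
# level, the unit of the PTI intervention. Column names and rows are
# placeholders, not the actual evaluation data.

import pandas as pd

students = pd.DataFrame({
    "classroom":     ["A", "A", "A", "B", "B", "C"],
    "attitude_pre":  [4.1, 3.8, 4.0, 4.4, 4.0, 3.6],
    "attitude_post": [4.0, 3.9, 4.1, 4.3, 4.2, 3.5],
})

# One row per classroom: the mean of its students' pre and post scale scores.
classroom_means = students.groupby("classroom")[["attitude_pre", "attitude_post"]].mean()
print(classroom_means)
```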


Results: Technology Questionnaire

140 students completed TQs

Two scales created:

Attitudes toward technology: 19 items, α = .92

Student perceptions of classroom practices: 8 items, α = .79
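The reliability values reported for these scales appear to be internal-consistency coefficients (Cronbach's α; the symbol was dropped in transcription). The sketch below shows the standard α computation on a toy item matrix; the data are invented for illustration only.

```python
# Minimal sketch of computing Cronbach's alpha for a scale:
# alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
# The toy item matrix is illustrative, not the TQ data.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = respondents, columns = items on the scale."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Toy data: 6 students answering a 4-item, 1-5 attitude scale.
toy_items = np.array([
    [4, 5, 4, 5],
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 4, 4],
    [3, 3, 4, 3],
])
print(round(cronbach_alpha(toy_items), 2))
```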


Attitudes Toward Technology

In general, positive attitudes held (pre/post; 1-5 scale):

Students agreed it is fun figuring out how things work on a computer (4.1/3.9)

Students agreed/strongly agreed they feel comfortable using computers (4.3/4.4)

Students disagreed/strongly disagreed that schoolwork on a computer is a waste of time (1.5/1.6)

Students agreed/strongly agreed it would be helpful to learn how to use WWW (4.4/4.4)


Reported Classroom Practices

In general, limited computer use reported

High use (pre/post; 1-5 scale):

Used presentations, essays, portfolios (4.1/3.7)

Typed reports on computer after writing (3.8/3.6)

Worked in small groups (3.6/3.4)


Classroom Practices (cont.)

Moderate use (pre/post; 1-5 scale):

Computers used for different assignments (3.3/3.5)

Computers used to explore things (3.2/3.2)

Low use (pre/post; 1-5 scale):

Computers used to practice basics (3.0/2.9)

Computers used to solve problems (3.2/3.0)

Many computer programs used (3.0/2.9)


TQ Pre-Post Comparisons

No significant differences found from fall to spring in:

Attitudes toward technology (t(13) = -0.92, p = .37)

Reported classroom practices (t(13) = -1.3, p = .21)
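The t(13) values are consistent with paired (pre-post) t-tests over the 14 classroom-level means. A minimal sketch of that comparison is below; the classroom means are made-up placeholders, not the study data.

```python
# Minimal sketch of a paired pre-post comparison over 14 classroom means
# (hence df = 13). The values below are placeholders, not the study data.

import numpy as np
from scipy import stats

pre_means  = np.array([4.1, 3.9, 4.3, 3.7, 4.0, 3.8, 4.2, 3.6, 4.0, 3.9, 4.1, 3.8, 3.7, 4.2])
post_means = np.array([4.0, 4.0, 4.2, 3.8, 4.1, 3.7, 4.3, 3.7, 3.9, 4.0, 4.0, 3.9, 3.8, 4.1])

t_stat, p_value = stats.ttest_rel(pre_means, post_means)
print(f"t({len(pre_means) - 1}) = {t_stat:.2f}, p = {p_value:.2f}")
```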


Results: WEA

142 students completed WEA

Four scales created:

Students’ background Web knowledge: 4 items, α = .77

Students’ finding ability: 3 items, α = .88

Students’ searching expertise: 3 items, α = .68

Students’ navigational strategies: 2 items, α = .72


Background Web Knowledge

In general, students familiar with Web

Students were neutral/agreed that information on WWW is accurate (3.6/3.3)

Students disagreed/strongly disagreed that information on WWW is not useful (1.6/1.8)

Students disagreed that there is not a lot of detailed or in-depth information on WWW (2.0/2.1)

Students agreed/strongly agreed that WWW is helpful in finding information (4.5/4.3)


Finding Ability

In general, students able to find info

Average bookmark peripherally relevant to task (2.2/2.0 on 0-3 scale)

Quality of bookmark response set was good (2.2/2.0 on 0-3 scale)

About one third of pages bookmarked appropriately (efficiency of .31/.32)


Searching Expertise

In general, students had difficulty searching (consistent with literature)

Quality of keyword search set was rather poor (1.6/1.6 on 0-3 scale)

Number of good searches low (3.0/1.7)

Students redirected searches, browsing search output before selection (2.2/2.2)


Navigational Strategies

In general, students navigated well

Students revisited over half the information pages visited, orienting themselves in the Web space (7/6)

Students completed more steps, a sign of better searching (86/113)

[Use of back missing]


WEA Pre-Post Comparisons

No significant differences found from fall to spring in:

Students’ Web knowledge (t(13) = 0.61, p = .55)

Students’ finding ability (t(13) = 0.43, p = .68)

Students’ searching expertise (t(13) = 0.54, p = .60)

Students’ navigational strategies (t(13) = 0.15, p = .88)


Evaluation Steps (cont.)

Step 4: Review implementation of plans

If antecedents don’t occur, expected outcomes won’t occur

With technology, pay close attention to:

Hardware/software

Measures of use or exposure

Technology integration


PTI Implementation

Only 9 evaluation teachers planned to use courseware

Of these 9, only 5 used courseware

Courseware usage for these 5 was sparse

Teacher training/support was an issue

Student-reported classroom technology integration was weak


Evaluation Steps (cont.)

Step 5: Evaluate progress toward goals

No progress yet...

Not surprising that we found no student effects of the PTI program, as teacher- and classroom-level effects were not evident


Conclusions

Need to find sensitive, innovative measures to reveal the best use of technology to instruct, assess, and evaluate

The WEA and TQ are sample approaches

Our general approach involves:

Defining where benefits are expected based on the particular high-technology environment

Creating or finding innovative measures that will be sensitive to changes within a given area

Ensuring that expectations required “below” or before goal levels are being met


For More Information

Visit our Web site at: http://www.cse.ucla.edu/CRESST/pages/aera00.htm

Available:

Overheads of this presentation

Full paper

And much, much more...