Can computer-marked final assessment improve retention? Jon Rosewell, Dept of Communications and Systems, Faculty of Maths, Computing and Technology, The Open University, UK. ALT-C, 11-13th Sept 2012


DESCRIPTION

Distance learning modules (particularly low-cost introductory and enrichment modules) may show poor retention compared to traditional campus courses. The perceived difficulty of exams and end-of-module assessments (EMA) appears to deter some students from submitting. In contrast, interactive computer-marked assignments (iCMA) are typically attempted by most students. Can retention therefore be improved by changing the format of part of the final assessment to an iCMA?

Robotics and the meaning of life is a 10-point, 10-week general-interest Open University module. The assessment comprised a mid-module iCMA and a final written EMA. The iCMA (a Moodle quiz) provided detailed feedback only after the submission deadline. The EMA included short-answer questions, a programming question and an essay. The EMA was script-marked and feedback was limited to an overall score and performance profile provided well after the end of the course. The intervention simply replaced the script-marked short-answer questions by a second iCMA covering the same content with similar questions. The programming and essay questions were retained unchanged as a written, script-marked EMA. The hypothesis to be tested was that retention would increase: students would be more likely to submit the final iCMA, their confidence would increase, and they would be motivated to submit the written EMA.

Quantitative data were gathered for patterns of submission, course completion and pass rates for two presentations (124 and 220 students); data were also available for thirteen previous presentations (1814 students). Structured interviews were carried out to probe student preferences, confidence and engagement. More students submitted the iCMA (86%) than the EMA (81%). Although they had the same deadline, 91% of students submitted the iCMA before the EMA. They submitted the iCMA well in advance of the deadline (median 4 days 15 hrs) but kept the EMA open as long as possible (median 18 hrs before deadline; 11% submitted in the final hour). These patterns strongly suggest that students were more confident with the iCMA than the EMA. Completion rates were the highest recorded: 88% and 89% compared to 79% for pre-intervention presentations. Overall pass rates were also improved (83% and 85% cf. 76%). This can be ascribed to improved submission rates alone: the pass rate and mean scores among those who submitted were unchanged, giving confidence that the assessment difficulty was unaltered.

Student interviews suggested that students did attempt the final iCMA before the EMA and had greater confidence in obtaining a good mark for the iCMA than the EMA. Students valued the mix of assessment methods and felt it produced a robust result; although some expressed concern over the correctness of computer marking, they appreciated the detailed feedback it provided. This intervention suggests that a change of assessment format can improve student engagement and pass rates without compromising rigour.
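The claim that the improvement can be ascribed to submission rates alone rests on a simple decomposition: since students who do not submit cannot pass, the overall pass rate is the submission rate multiplied by the pass rate among submitters. A minimal sketch in Python follows; the component figures are hypothetical, chosen only to be broadly consistent with the reported overall rates, and are not taken from the study.

    # Illustrative decomposition: overall pass rate = submission rate * pass rate
    # among submitters. The component values below are hypothetical.

    def overall_pass_rate(submission_rate: float, pass_rate_among_submitters: float) -> float:
        """Overall pass rate implied by a submission rate and submitters' pass rate."""
        return submission_rate * pass_rate_among_submitters

    pass_among_submitters = 0.94   # assumed unchanged by the intervention
    before = overall_pass_rate(0.81, pass_among_submitters)   # lower submission rate
    after = overall_pass_rate(0.90, pass_among_submitters)    # higher submission rate
    print(f"before: {before:.0%}, after: {after:.0%}")        # before: 76%, after: 85%

An unchanged pass rate among submitters combined with a higher submission rate is therefore enough to explain the rise in the overall figure.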

TRANSCRIPT

Page 1: Can computer-marked final assessment improve retention?

Can computer-marked final assessment improve retention?

Jon Rosewell, Dept of Communications and Systems, Faculty of Maths, Computing and Technology, The Open University, UK

ALT-C, 11-13th Sept 2012

Page 2: Can computer-marked final assessment improve retention?

Robotics and the meaning of life: a practical guide to things that think (T184)
• Open University ‘taster’ distance learning course
• 10-point, 10-week, 100 hours
• 2003 – 2011, 2387 students
• Components:

– Website
– OU-RobotLab practical software with LEGO kit or simulator
– Isaac Asimov: I, Robot
– James May on DVD
– Moderated, peer-support forums

Page 3: Can computer-marked final assessment improve retention?

Challenge of retention
• ‘Taster’ distance learning modules, not core to degree
• Mid-course CMA (Moodle quiz) & final EMA (script-marked)
• Students enjoyed and completed course …but daunted by EMA
• Students who submitted nearly all passed – too late!
• Not submitting EMA → not completing module → poor retention statistic → setting up students to fail

Retention = module completion (not pass)
Progression = continue to study another OU module

Page 4: Can computer-marked final assessment improve retention?

Can a final CMA improve retention?
• iCMA encourages engagement and motivation
• Submission of CMA generally high (T184: 87%, n=2158)
• So add final CMA (CME) and reduce EMA
• iCMA / iCME are Moodle quizzes
  – Open at start of course; single submission by traditional cut-off date
  – Questions can be revisited
  – Question-level feedback immediately after cut-off date
  – Even without adaptive feedback, learner can judge their own performance
• Builds confidence
• Lowers perceived barrier to pass

Page 5: Can computer-marked final assessment improve retention?

Assessment strategy

Original:
• 10% – mid-course iCMA
• 90% – final written EMA
  – Part I – short answer questions
  – Part II – programming & essay questions

New:
• 10% – mid-course iCMA
• 30% – final iCME
• 60% – final written EMA
  – programming & essay questions
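To make the change in weighting concrete, the following minimal sketch (illustrative only) shows how component scores combine into an overall module score under the two strategies. The weights come from the slide; the example student's scores and the 40% pass threshold mentioned in the comments are assumptions, not data from the module.

    # Illustrative sketch only: combine component scores using the weights from the
    # slide. The example scores are hypothetical; a 40% pass threshold is an
    # assumption and is not stated on the slide.

    def overall_score(scores: dict[str, float], weights: dict[str, float]) -> float:
        """Weighted overall module score (components keyed by assessment name)."""
        return sum(weights[name] * scores[name] for name in weights)

    original_weights = {"iCMA": 0.10, "EMA": 0.90}
    new_weights = {"iCMA": 0.10, "iCME": 0.30, "EMA": 0.60}

    # Hypothetical student: strong on the computer-marked parts, weaker on the EMA.
    scores = {"iCMA": 80.0, "iCME": 75.0, "EMA": 45.0}

    print(overall_score(scores, original_weights))  # 48.5
    print(overall_score(scores, new_weights))       # 57.5

Under the new weights a student who does well on the computer-marked components needs a comparatively lower EMA mark, which is the "lowers barrier to pass" point made on Page 35.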

Page 6: Can computer-marked final assessment improve retention?

Hypothesis:
• Students will attempt iCME/quiz first, before written EMA
  → They will thus complete the module
• This will improve their confidence
• They will then be motivated to attempt EMA
  → They will thus pass the module

Page 7: Can computer-marked final assessment improve retention?

Did more submit iCME?

                     EMA not submitted   EMA submitted   Total
CME not submitted           44                  1          45  (13%)
CME submitted               19                281         300  (87%)
Total                       63 (18%)          282 (82%)   345

T184 2011E & 2011J presentations
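A minimal sketch of how the cross-tabulation above can be summarised in code; the per-cell counts are taken from the slide, while the field names and structure are illustrative.

    # Illustrative only: reproduce the marginal submission rates from the 2x2 counts
    # reported on the slide (44, 1, 19, 281). Field names are hypothetical.
    from collections import Counter

    # (cme_submitted, ema_submitted) -> number of students
    counts = Counter({(False, False): 44, (False, True): 1,
                      (True, False): 19, (True, True): 281})

    total = sum(counts.values())                                    # 345
    cme = sum(n for (cme_sub, _), n in counts.items() if cme_sub)   # 300
    ema = sum(n for (_, ema_sub), n in counts.items() if ema_sub)   # 282

    print(f"CME submitted: {cme}/{total} = {cme/total:.0%}")   # 87%
    print(f"EMA submitted: {ema}/{total} = {ema/total:.0%}")   # 82%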

Page 8: Can computer-marked final assessment improve retention?

Did students submit iCME first?

CME clearly first              214    76%
EMA clearly first                9     3%
CME & EMA together (<1 hr)      58    21%
  – CME slightly earlier        43    15%
  – EMA slightly earlier        15     5%
Total                          281   100%

T184 2011E & 2011J presentations
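The classification above could be derived from submission timestamps along the following lines. This is a sketch with hypothetical field names and example data; the one-hour window for "together" is taken from the slide, the rest of the logic is assumed.

    # Sketch only: classify which final assessment was submitted first, using the
    # <1 hr window from the slide. Timestamps and the example are hypothetical.
    from datetime import datetime, timedelta

    def classify(cme_time: datetime, ema_time: datetime) -> str:
        """Return the submission-order category for one student."""
        gap = ema_time - cme_time          # positive if the CME was submitted earlier
        if abs(gap) <= timedelta(hours=1):
            return "CME & EMA together (<1 hr)"
        return "CME clearly first" if gap > timedelta(0) else "EMA clearly first"

    # Example: CME submitted four days before the EMA -> "CME clearly first"
    print(classify(datetime(2011, 12, 5, 10, 0), datetime(2011, 12, 9, 11, 30)))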

Page 9: Can computer-marked final assessment improve retention?

Leaving it to the last minute: submitting EMA

[Chart: cumulative count of EMA submissions by date, October–December 2011, T184 2011J]

Median: 18 hrs before deadline
60% in final 24 hrs
11% in final hour
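These deadline-relative statistics could be computed from submission timestamps roughly as follows. This is a sketch: the deadline and timestamps are invented for illustration and are not the module's data.

    # Sketch only: summarise how close to the deadline submissions arrive.
    # The deadline and the submission times below are invented for illustration.
    from datetime import datetime, timedelta
    from statistics import median

    deadline = datetime(2011, 12, 9, 12, 0)   # assumed cut-off date/time
    submissions = [deadline - timedelta(hours=h) for h in (0.5, 3, 18, 20, 30, 111)]

    hours_before = [(deadline - t).total_seconds() / 3600 for t in submissions]
    print(f"median: {median(hours_before):.0f} hrs before deadline")
    print(f"final 24 hrs: {sum(h <= 24 for h in hours_before) / len(hours_before):.0%}")
    print(f"final hour:   {sum(h <= 1 for h in hours_before) / len(hours_before):.0%}")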

Page 10: Can computer-marked final assessment improve retention?

Not leaving it to the last minute: submitting iCME

[Chart: cumulative counts of iCME attempts started and completed by date, October–December 2011, T184 2011J]

Median: 4 days 15 hrs before deadline

Page 11: Can computer-marked final assessment improve retention?

T184 completion rates

[Chart: completion rate (60%–100%) by presentation year, 2004–2011, May and Oct presentations plotted separately; the introduction of the CME in 2011 is marked]

Page 12: Can computer-marked final assessment improve retention?

T184 completion rates

[Same chart as Page 11: completion rate by presentation year, 2004–2011, with the introduction of the CME marked]

Page 13: Can computer-marked final assessment improve retention?

T184 completion rates

[Same chart as Page 11: completion rate by presentation year, 2004–2011, with the introduction of the CME marked]

Page 14: Can computer-marked final assessment improve retention?

T184 pass rates

[Chart: pass rate (60%–100%) by presentation year, 2004–2011, May and Oct presentations plotted separately; the introduction of the CME is marked]

Page 15: Can computer-marked final assessment improve retention?

Interviews with students

I thought the balance of assessment was about the best I’d come across. Interesting computer-based tests … plus the programming elements … an interesting [EMA] with a choice of two essays. … The assessment contributed to the enjoyment of the whole course.

By the time I’d finished the assessment I’d felt I’d learnt more. … I also felt fairly confident that I’d got enough marks on the computer based test.

Page 16: Can computer-marked final assessment improve retention?

Interviews with students

[Early on] get an achievement and get a bit of success, … it was really easy that first CMA. … So an early success and then you know at the end you’ve got some short answers in the CMA that you think pick up your score a bit, and then you’ve got the EMA where you can do even better. I think it’s spot on.

Page 17: Can computer-marked final assessment improve retention?

Interviews with students

… as long as they feel it’s rigorous … it didn’t have that kind of ITV quiz thing, you know ‘what’s the capital of England’ or something. … There was a combination with ‘you’ve done this bit now, and now you need to tell us about it, now you need to describe it, now you need to do some programming and show us your work’. It felt like there was real rigour to it so it didn’t feel like ‘this is a quick and dirty kind of interactive quiz’.

Page 18: Can computer-marked final assessment improve retention?

Interviews with students

Student: I think it was excellent. … The only thing I don’t like is having to wait till September for my result!

Me: So that’s another reason for introducing that computer marked exam … it gives students feedback immediately rather than a long delay.

Student: Yes, so you know where you’ve gone wrong. That’s really good, and it cheers you up as well, not an idiot after all!

Page 19: Can computer-marked final assessment improve retention?

Interviews with students

“I enjoyed all of it really. I thought the course materials were brilliantly laid out, I thought it was very clear, concise. It gave you enough about the subject to make you interested and wanted you to do more.”

“As I say it was a pleasure doing it. I hope I’ve passed, but I just think it was a joy doing it, it was really fun, you learnt something, it really complemented what I learnt last year and it’s just made me keen to do other stuff, or try and find out more about it.”

Page 20: Can computer-marked final assessment improve retention?

Thanks for listening…

Jon Rosewell, [email protected]

Page 21: Can computer-marked final assessment improve retention?

The end … of T184
The birth … of TM129 Block 3

Pages 22–29: (no text content in transcript)
Page 30: Can computer-marked final assessment improve retention?

A tale of several short modules

Based on Relevant Knowledge: 10-point, 10-week, level one modules with mid-course CMA and final EMA

• A module with poor completion
• A module that satisfies student needs
• A module that is a paragon of good practice

Retention = module completion (not pass)
Progression = continue to study another OU module

Page 31: Can computer-marked final assessment improve retention?

A module with poor completion
• The mid-module CMA is low-stakes and easy to submit
  – ~80% of students submit mid-course CMA
• The final EMA is a project, submitted via the eTMA system
  – Only 50% of students submit the final EMA
• Many new students do not continue to study
• Very few students subsequently complete a qualification

Page 32: Can computer-marked final assessment improve retention?

A ‘one-off’ module
• Most new students want this module, not a qualification
• Most new students won’t register for a further module
• Many students therefore see no need to stress over EMA
  – Only 50% of students submit EMA
• Most students complete the course material
  – ~80% complete CMA
  – Engaged on forums

Page 33: Can computer-marked final assessment improve retention?

Can you guess what it is yet?
• A module with poor completion
  – T180 Living with the net
• A module that satisfies student needs
  – T180 Living with the net

Page 34: Can computer-marked final assessment improve retention?

Does ‘one-off’ → poor retention?
• T180 Living with the Net (2004-2006) as an example
• Some studying as ‘one-off’
• Students enjoyed and completed course but daunted by EMA
• Students who submitted nearly all passed – too late!
• Not submitting EMA → not completing module → poor retention statistic (51%) → HEFCE penalty
• Setting up students to fail?

An ‘M-world’ module

Page 35: Can computer-marked final assessment improve retention?

Can an iCME improve retention?
• Moodle quiz
  – Same format as mid-course iCMA
  – Open at start of course; single submission by traditional cut-off date
  – Questions can be revisited
  – Question-level feedback immediately after cut-off date (cf OU anodyne performance profile 3 months later…)
• Even without adaptive feedback, learner can judge their own performance
  – Builds confidence
  – Lowers barrier to pass: relatively low score on EMA programming and essay questions sufficient to pass