
Emirates Journal for Engineering Research, 12 (1), 11-26 (2007) (Regular Paper)


A CRITICAL ANALYSIS OF THE USE OF ELECTRONIC VOTING SYSTEMS: ASK THE AUDIENCE

A. Felce

The University of Wolverhampton, School of Engineering and the Built Environment Wulfruna Street, WOLVERHAMPTON, UK WV1 1SB, email: [email protected]

(Received November 2006 and accepted March 2007)

Pedagogic theory has identified that a student learns best by being interactive and that timely and effective feedback is essential for deep learning. Widening Participation agendas within UK higher education have resulted in large cohorts of students from diverse educational backgrounds. Practical solutions need to be found to achieve interactivity whilst providing interesting learning opportunities for students in the 21st century. This action research was undertaken on the introduction of an Electronic Voting System (EVS) in a numeric module taught in the School of Engineering and the Built Environment at the University of Wolverhampton. Student performance in summatively assessed coursework is compared with performance in the EVS class tests, together with a qualitative analysis of the students' perceptions of the use and effectiveness of the EVS. Results show limited correlation between student assignment grade and EVS performance, but a relationship does exist between perceived usefulness, EVS grade and coursework grade. Qualitative feedback at mid-module and end-of-module was positive and matched previously reported findings that approximately 70% of users were in favour of the EVS. Students consider that the EVS is a useful and effective tool to provide an opportunity for interactivity and timely feedback to large cohorts of students.

Keywords: Assessment, interactivity, large classes.

1. INTRODUCTION

It has long been recognised that students learn best by doing a task themselves: "One must learn by doing the thing; though you think you know it, you have no certainty until you try" (Sophocles[1]). Snee[2] quoted an old Chinese proverb: "I hear, I forget. I see, I remember. I do, I understand".

More recently, Biggs[3] identified the importance of action learning, quoting the work of Tyler, "Learning takes place through the active behaviour of the student: it is what he does that he learns, not what the teacher does", and of Shuell, "what the student does is ... more important".

One could argue that the traditional tutorial, or a workshop, is a method of encouraging student engagement with an activity and of learning through doing a worked example of a particular aspect of a module. However, how can a teacher ensure that all students are engaging with the material, that they are analysing it sufficiently well to calculate the correct answer, that timely and appropriate feedback is given, that student interest is maintained, and that students are not discouraged from engaging with the workshop in case they calculate an incorrect answer?

This paper reports on the introduction of an electronic voting system (EVS) in a level 3 numeric module and evaluates its use and effectiveness for the students. This is achieved by identifying the challenges of achieving interactivity when working with large classes and potential solutions to those challenges through the use of electronic voting systems. The paper also describes the introduction of an EVS to a module and evaluates its effectiveness through (a) student assignment grades compared to performance in in-class EVS tests and (b) student feedback on the use of the EVS.

2. CHALLENGES TO ACHIEVING INTERACTIVITY WITH LARGER CLASSES

The importance of what the student does, rather than what the teacher does, in achieving deep learning cannot be overestimated[2-7]. In addition, students need to know that what they are doing is correct or, where it is not, what they can do to improve their performance. The growth in the number of students in higher education has created new challenges for lecturers to include opportunities for students to 'learn by doing' and to provide feedback on the students' performance.

This section provides an overview of pedagogic theory in relation to the need for interactivity in learning, the usefulness of formative assessment and feedback in achieving deep learning, issues relating to group size and the potential use of a Technology Supported Learning (TSL) solution to overcome these challenges.

2.1 Interaction

Biggs[8] states that "deeper learning is related to more active engagement", and Draper[9] considers that one of the key underlying issues for interactivity is the amount of time learners spend thinking, as opposed to waiting, listening or taking dictation.

Lack of interaction by students is a source of frustration for academics[10]. Cowley[11] considers that the way to succeed with a class is to use an 'engaging lesson': something different, unusual, or something with 'props'. Gibbs[12] states that structure is needed, "devised by teacher, not haphazard". Technology provides a useful platform to encourage interactive learning[13-15].

Draper[9] recommends that lectures should be made interactive to improve learning outcomes, and that the benefits are three-fold:
- For the learners: deepening understanding, lengthening retention and providing targeted feedback
- For the teacher: getting feedback to improve what they do
- True interaction is achieved: teacher and learner change their actions because of the other's actions

Draper et al[16] state that the electronic voting system provides a good fit between a particular learning situation and a specific technical solution.

2.2 Formative Assessment and Feedback

Deep learning can be encouraged through regular formative assessment and effective feedback[17-19]. When this is undertaken as part of teaching it has the most significant impact, particularly when students are involved in assessing their own work and are given frequent opportunities to reflect on goals, strategies and outcomes[20,21].

Feedback needs to be timely[22,23], "quick and dirty rather than detailed, pretty and late"[24], and technology, used in formative testing, can support student learning[25].

Draper et al[16] identify that pedagogic uses of electronic voting systems include:
- Formative feedback on learning within a class
- Formative feedback to the teacher on the teaching.

2.3 Group Size

In large classes there is a possibility of lack of interaction, audience passivity, and of only surface learning taking place[16,20,26]. In addition, the more diverse the group, the greater the range of learning styles, current knowledge, needs and understanding, and solutions to this diversity need to be introduced[4,26-29].

The significant growth in student numbers in the HE sector has led to larger class sizes and new challenges for lecturers in achieving student engagement and interaction, and thus opportunities for deep learning.

2.4 Electronic Voting Systems (EVS)

Draper[30] describes an electronic voting system as "a show of hands, but with privacy for individuals, more accurate and automatic than counting".

Juwah et al[31] state that "it allows a whole class to contribute to an anonymous vote... with immediate feedback to students on their learning. It … delivers high quality information to students about their learning." The benefits of EVS can therefore be noted as: involving everyone, embarrassing no-one, reaching a new generation, capturing attention, obtaining feedback instantly, tracking student comprehension, measuring progress and communicating weaknesses[32]. EVS equipment is also portable, and setting up and packing up is quick[33].

EVS rely on the use of MCQs, which have two parts: the stem (the question or statement) and the options (the possible answers)[25,34]. MCQs allow the lecturer to test student understanding and provide feedback to both the student and the lecturer[30,35,36].
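To make the two-part structure concrete, the sketch below models an MCQ as data plus an instant right/wrong check, mirroring the feedback cycle an EVS provides after each vote. It is illustrative only: the class, field names and sample question are assumptions for this paper, not the CPS internal format.

```python
# Illustrative sketch of the stem/options MCQ structure described above.
# The names and the sample question are hypothetical, not the CPS format.
from dataclasses import dataclass


@dataclass
class MCQ:
    stem: str           # the question or statement
    options: list[str]  # two to five possible answers, including distracters
    correct: int        # index of the correct option


q = MCQ(
    stem="Net profit is calculated as:",
    options=[
        "Sales - purchases",
        "Gross profit - overheads",
        "Sales - overheads",
        "Gross profit + overheads",
    ],
    correct=1,
)


def check(question: MCQ, response: int) -> bool:
    """Instant right/wrong feedback, as the EVS gives after each vote."""
    return response == question.correct


print(check(q, 1))  # True
```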

This section has identified the need for, and importance of, interactivity in student learning, the need for feedback to achieve deep learning, and the challenges of achieving both in large classes. EVS have been identified as a potential solution: they provide interactivity and timely, appropriate feedback, and they work in a large group situation.

3. RESEARCH METHODOLOGY

The EVS used in this research is the Classroom Performance System (CPS), supplied through e-Instruction, Texas, USA. It is used in ten countries worldwide[37] and is described as a wireless question response system[38] in which students respond to a series of multiple-choice questions using a hand-held pad. An overview of the application of the CPS for in-class MCQ tests is given in Appendix A.

The module used in the study is a numeric module that covers a range of issues likely to be encountered by junior/middle managers in the early years of their careers. 'Traditional' lectures and tutorials had been used in previous iterations, i.e. a one-hour lecture and two hours of tutorial examples in which students were asked to complete numeric exercises, with model answer schemes given in class and solutions and methodology shown on the whiteboard.


The CPS was used to support the students in calculating the examples through selective MCQs: students would calculate part of the solution and select an answer, and the lecturer would show the correct method of calculation before moving on to the next question in the set. Students would see the correct answer and the number of responses to each answer/distracter, and so could see where errors had been made and how to correct them as the calculation progressed. Copies of the question sets (with answers) were made available in the VLE following each session.

The group used for the study was a diverse cohort of level 3 students in the final year of undergraduate programmes. There were 59 students registered on the module; 55 attended lectures; 3 were resitting the module; and six did not submit coursework. Attendance at lectures was variable, with few students achieving 100% attendance.

The responses to CPS question sets were compared to the student coursework results. The question sets do not always directly relate to the coursework questions but the closest ‘match’ was used for the analysis.

Students were asked to complete qualitative evaluations of their use of CPS through mid-module and end-of-module evaluation questionnaires. Comments are included in Appendix C. Sixteen students completed mid-module evaluations and 35 the end-of-module evaluations. Both evaluations can be considered representative of the student group and the non-response bias is not a concern[39].

In the final week of the semester students completed an evaluation using the CPS (Appendix D). The responses were analysed both qualitatively and quantitatively. For the purpose of the quantitative evaluation the question sets identified the ‘most positive’ answer as the ‘correct’ answer and the students were advised of this before responses were given.

3.1 Results and Analysis

The Classroom Performance System (CPS) results were exported to Excel and compared to the student coursework grades (Appendix B). CPS results for students who did not submit coursework were omitted. For the purposes of this research the summaries were sorted on Grade Point Average (GPA; maximum grade 16, minimum pass 5) and the average 'scores' for GPA and CPS were identified for the bottom, middle and top ten ranges (Appendix B). The CPS feedback (Appendix D) was included in this summary, where 'positive' comments were counted as 'correct' answers for the purpose of this analysis.
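As a rough illustration of this pipeline, the sketch below reproduces the analysis in Python rather than Excel: it merges exported CPS scores with coursework GPAs, drops non-submitting students, sorts by GPA and averages the bottom, middle and top ten. The file and column names are assumptions for illustration, not the actual CPS export format.

```python
# A minimal sketch (not the author's actual workbook) of the band-average
# analysis described above. File and column names are assumed.
import pandas as pd

cps = pd.read_csv("cps_gradebook_export.csv")  # student_id, q1..q5 (% correct)
gpa = pd.read_csv("coursework_gpa.csv")        # student_id, gpa (max 16, pass 5)

# Merging drops students with no coursework grade, as in the paper.
df = cps.merge(gpa, on="student_id")
df = df.sort_values("gpa").reset_index(drop=True)

mid = len(df) // 2
bands = {
    "Bottom ten": df.iloc[:10],
    "Middle ten": df.iloc[mid - 5 : mid + 5],
    "Top ten": df.iloc[-10:],
}
for name, band in bands.items():
    means = band[["gpa", "q1", "q2", "q3", "q4", "q5"]].mean().round(1)
    print(name, means.to_dict())
```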

Effect of CPS on student performance in coursework
The results show that students in the 'bottom' set of coursework grades tended to achieve the lowest CPS scores. Students in the 'middle' set of coursework grades achieved the highest CPS scores, which is contrary to what would be expected, i.e. students achieving the highest CPS scores should achieve the highest assignment grades. Q5 is the only question set where the average assignment grades and CPS scores are as would be expected, i.e. the highest CPS score corresponds to the highest assignment grade. Possible reasons for these unexpected results are:
- The use of averages in the analysis
- Non-comparability of CPS question sets and coursework content
- Students 'guessing' the correct CPS answers

Student evaluation through CPS feedback
The summary results (Appendix B) show that students with the lowest assignment grades had the lowest average CPS score and gave the least positive feedback; those with the highest assignment grades and a high average CPS score gave the most positive feedback. These results suggest a correlation between perceived usefulness and 'liking' of CPS and coursework achievement.

Student evaluation through qualitative questionnaires
Student responses show that the use of CPS in the module was widely perceived as positive, as it provided the opportunity for students to engage with the material without fear of being seen to make a mistake. Typical comments included:

Positives:
- Exercises were very useful
- Electronic interaction, everyone can get involved
- Electronic resource was good for feedback and making sure you understood the question
- Good interaction/very interactive/interactive, not just 3 hours of boring lecture/interactive learning
- Using the electronic zappers allowed me to see if I'm doing the work right and know where you're going wrong
- Helps create a competitive atmosphere
- Good to know if class/individuals are following and understanding
- Everyone is involved/keeps all students focused for the entire lecture

In the end-of-module evaluation the CPS was consistently identified in the "best thing about the module".

Negatives: Few negative criticisms were made. Those that were included: can take a while for people to answer; zappers take too much time/too slow; and questions too easy. In the end-of-module evaluation "worst thing", the CPS was rarely mentioned: sometimes slow pace waiting for zapper answers.

Areas for improvement: In the end-of-module evaluation, "areas for improvement" relating to CPS were: time limits on interactive exercises.


Student evaluation using CPS question set
Analysis of the student responses to the CPS questions for the end-of-module evaluation (Appendix D) was generally positive and showed that students perceived they had benefited from the use of CPS. In summary:
- 73% of students found the zappers quite useful or very useful, whilst 35% found that they were not useful. This compares well with the findings of Draper et al[16]: when asked if they considered EVS an advantage or not, 70% were in favour, 20% indifferent and 10% against its use.
- 16% found the questions too easy or a bit easy; 58% considered that they were just right and 26% thought the questions were a bit difficult.
- 84% thought that CPS had been useful; 11% stated that it was of no use.
- 42% had used CPS to identify areas of weakness.
- 87% stated that there should be a limit on response time; the largest group (39%) stated the limit should be two minutes.
- 82% considered that CPS use was appropriate for this module and 87% liked it being used, either a little or a lot.
- 47% would like to see CPS used in other modules, with a further 29% stating that it would depend on the module.

Other evaluation
The student cohort included one student who has a hearing impairment; this student's evaluation is included within the analysis above. Additional comments were sought from the Communication Support Unit, which provides signers and note-takers for students with hearing impairments, and who gave the following feedback: "I think the CPS system is a useful tool for Deaf students as it is visual and allows equal participation. A Deaf student's English would have to be of a fairly high standard to access the questions but for HE students this should not be an issue. They vary the tasks within a lecture and allow the lecturer to know instantly whether the class has understood a topic or not without drawing attention to individual students."

For the lecturer, who delivered the module for the first time in this iteration, the most telling student comment was: "I'm retaking this module and I feel this time I understand and will be able to do the assignment and exam this year. It has been a better experience for me than last year's."

4. CONCLUSIONS

The aim of this paper was to report on the introduction of an electronic voting system in a level 3 numeric module and to evaluate its use and effectiveness for the students. This was achieved through:
1. A review of the literature on the challenges of achieving interactivity when working with large groups. It was identified that a student learns best by being active, but that it is easy for an individual to be inactive in a group environment, and that a student is more likely to succeed in an engaging lesson[11]. Furthermore, a student needs timely and appropriate feedback to enable an improved performance.
2. The identification of an electronic voting system (EVS) as a potential solution, and a description of the method by which the Classroom Performance System (CPS) can be used to generate data. The CPS is easy to learn, and its intuitive software allows a variety of uses in the classroom environment as well as a range of report options, including the ability to export into other software applications, such as Excel, for further analysis of the data.
3. Action research on a level 3 numeric module, evaluated through a comparison of student assignment grades with the results of CPS question sets and through consideration of student perceptions of its use. The analysis showed limited correlation between assignment grades and CPS results, but further, more detailed analysis should be undertaken before final conclusions can be drawn. There was a correlation between student perceptions of the usefulness of CPS and their assignment grade; possible reasons for this correlation have not been identified as part of this research. Feedback from students was predominantly positive, with 73% finding it to be quite, or very, useful and 47% stating that they would like to see it used in other modules. An additional, unanticipated, outcome was the potential benefit of CPS for students with hearing impairment, as the CPS is "visual and allows equal participation".

EVS ensures that students are active and focused; students' anonymity is valued, as is feedback, which is timely, and students can self-direct any follow-up identified. For teachers it saves on marking and can show quickly where there are any problems[30].

In summary, the EVS was shown to be a useful and effective tool in providing an opportunity for interactivity and timely feedback to students in a large cohort. It has not been shown to affect student performance in coursework, but it is perceived by students to be a useful addition to a lecturer's toolkit.

5. RECOMMENDATIONS

The research conducted as part of this study was on one numeric module. It did not show a correlation between the CPS results and assignment performance. Further analysis of the comparison should be undertaken to fully exclude a relationship.

The evaluation undertaken in this research can be described as 'Level 1'[35], i.e. Reaction: a measurement of learners' feelings and opinions involving methods of instruction, learning materials and facilities. Further work can be undertaken to evaluate the use of EVS at the three higher levels of evaluation identified, to measure the learning that may have taken place.

The EVS used in this research provides a range of detailed reports, analysis of which is beyond the scope of this report. The range of reports should be reviewed to identify the additional information and support that these can provide to the lecturer and to individual students.

REFERENCES
1. Sophocles (495-406 BC), quoted in O'Donohue, J. (Ed.) 2005. Lecture notes: CELT, University of Wolverhampton.

2. Snee, R.D. 1993. What's missing in statistical education? The American Statistician, 47, 149-154.

3. Biggs, J. 2003. Teaching for Quality Learning at University: what the student does. 2nd ed., Buckingham: SRHE and Open University Press.

4. Fry, H., et al. 2003. A Handbook for Teaching and Learning in Higher Education. 2nd ed., London: RoutledgeFalmer.

5. Gibbs, G. 1988. Learning by doing: a guide to teaching and learning methods. Oxford: FEU.

6. Marton, F. and Säljö, R. 1997. Approaches to Learning, in Marton, F., et al. (Eds.), The Experience of Learning. 2nd ed., Edinburgh: Scottish Academic Press.

7. Race, P. 2001. The lecturers' toolkit. 2nd ed., London: Kogan Page.

8. Biggs, J. 1988. Approaches to learning and to essay writing, in Schmeck, R. (Ed.), Learning Strategies and Learning Styles. London: Plenum Press.

9. Draper, S. 2005b. Interactive Lectures [online] [accessed 17th May 2006] Available: http://www.psy.gla.ac.uk/~steve/ilig/il.html

10. Murphy, D., et al. 2001. Online Learning and Teaching with Technology: Case Studies, Experience and Practice. London: Kogan Page.

11. Cowley, S. 2003. Getting the Buggers to Behave 2. New York: Continuum.

12. Gibbs, G., et al. 1996. Class Size and Student Performance: 1984-94. Studies in Higher Education, 21(3) 261-273.

13. Salmon, G. 2002. E-tivities: The Key to Active Online Learning. London: Kogan Page.

14. Salmon, G. 2003. E-moderating: The Key to Teaching and Learning Online. 2nd ed., London: Routledge Falmer.

15. Simpson, O. 2000. Supporting Students in Open and Distance Learning. London: Kogan Page.

16. Draper, S., et al. 2002. Electronically enhanced classroom interaction. Australian Journal of Educational Technology, 18(1) 13-23. [online] [accessed 17th May 2006] Available: http://www.psy.gla.ac.uk/~steve/ilig/handsets.html

17. Dunn, L., et al. 2004. The student assessment handbook: new directions in traditional & online assessment. London: Routledge Falmer.

18. Taras, M. 2006. Do unto others or not: equity in feedback for undergraduates. Assessment & Evaluation in Higher Education, 31(3) 365-377.

19. Yorke, M. 2003. Formative assessment in higher education: Moves towards theory and the enhancement of pedagogic practice. Higher Education, 45(4) 477-501.

20. Pickford, R. and Clothier, H. 2003. Ask the Audience: A Simple Teaching Method to Improve the Learning Experience in Large Lectures. LTSN Centre for Information and Computer Sciences.

21. Mcdonald, B. and Boud, D 2003. The impact of self-assessment on achievement: the effects of self-assessment training on performance in external examinations. Assessment in Education, 10(2) 209-220.

22. Ramsden, P. 2003. Learning to Teach in Higher Education. 2nd ed., London: RoutledgeFalmer.

23. Chickering, A.W. and Ehrmann, S.C. (1996). Implementing the seven principles: Technology as lever. [online]. [accessed 30th November 2005] Available: http://www.tltgroup.org/programs/seven.html

24. Gibbs, G. and Jenkins, A. (eds) 1992. Teaching Large Classes in Higher Education. London: Kogan Page.

25. Maier, P. and Warren, A. 2000. Integr@ting Technology in Learning and Teaching. London: Kogan Page.

26. Biggs, J.B. and Moore, P.J. 1993. The process of learning. 3rd ed., Sydney: Prentice Hall.

27. Honey, P. and Mumford, A. 1992. The manual of learning styles. 3rd ed., Maidenhead: P. Honey.

28. FEDA 1995. Learning Styles. London: FEDA.
29. Smith, J. 2002. Learning styles: Fashion Fad or Lever for Change? The Application of Learning Style Theory to Inclusive Curriculum Delivery. Innovations in Education and Teaching International, 39(1) 63-70.

30. Draper, S. 2005a. Using EVS for interactive lectures [online] [accessed 17th May 2006] Available: http://www.psy.gla.ac.uk/~steve/ilig/handsetintro.html

31. Juwah, C., et al. 2004. Enhancing student learning through effective formative feedback. York: The Higher Education Academy (Generic Centre).

32. Pearson Assessments, 2006. Classroom Performance System (CPS) Student Response Pads [online] [accessed: 17th May 2006] Available from: http://www.pearsonncs.com

33. Nicol, D.J. and Boyle, J.T. 2003. Peer Instruction versus Class-wide Discussion in Large Classes: a Comparison of Two Interaction Methods in the Wired Classroom. Studies in Higher Education, 28(4).

34. Jolliffe, A., et al. 2001. The online learning handbook: developing and using web-based learning. London: Kogan Page.

35. Laurillard, D. 2002. Rethinking university teaching: a framework for the effective use of educational technology. 2nd ed., New York and London: Routledge.

36. Steadman, M. 1998. Using classroom assessment to change both learning and teaching. New Directions for Teaching and Learning, 75, 23-35.

37. Anon, (no date) Classroom Performance System [online] [accessed: 17th May 2006] Available: http://www.mhhe.com/au/cps/whosusingcps.shtml

38. Anon, 2005. E-Learning - Classroom Performance System [online] [accessed: 17th May 2006] Available from: http://www.mcgraw-hill.com.au/highered/elearn/classroomperformacesys.jsp

39. Gamliel, E. and Davidovitz, L. 2005. Online versus traditional teaching evaluation: mode can matter. Assessment & Evaluation in Higher Education, 30(6) 581-592.


APPENDIX A: Use of the EVS

The software was used to create a 'Class' (Figure A.1). In week 1 students were asked to select a response pad and to provide their name and student number. The students' details were input to the CPS for use in the module but have been omitted from this report to ensure anonymity.

Figure A.1 Class set up in CPS

A group was created for the module and a series of lessons written for each topic covered on the module (Figure A.2).

Figure A.2 Lessons created in CPS

CPS offers a variety of MCQ options, from two answers (T/F; Y/N) up to five possible answers (Figure A.3). The question sets created for the module used the 'No Graphics' option with between two and five possible answers.


Figure A.3 Choices of MCQ within CPS

The creation of questions was simple and intuitive; once the stem and possible answers were identified for each question, the information was typed into the question window (Figure A.4) and then saved, and the window closed or the next question created.

Figure A.4 CPS Question Window

In running a question set in class, the lecturer has to 'engage' the lesson and select the 'include in gradebook' option (Figure A.5).


Figure A.5 CPS Session set up and recording

CPS is automatically minimised until the first question is 'started'; this function allowed the lecture and tutorial exercises to be introduced and then to move straight on to the questions without having to pause to set up the software (Figure A.6).

Figure A.6 CPS minimised on desktop

At the appropriate stage in the tutorial exercise each question in the set is 'launched' and students are given the opportunity to respond. The grid shows the students who have responded (reverse video) and the total number of responses. An optional 'timer' is also included, preset to one minute (Figure A.7).


Figure A.7 CPS question set during response time

Once all students have responded, or the time set has run out, the question 'ends' and a summary of the responses given is shown (Figure A.8). Students can see the correct answer (indicated by the tick) and will know whether they were correct without other students being able to identify their answers.

Figure A.8 CPS question response summary

As each question was completed, and after the correct answer was given, the lecturer showed the class how to calculate the result and possible ways in which errors had been made. At the end of each question set the results were automatically saved into the CPS Gradebook (Figure A.9), from where they were exported into Excel for further analysis (Appendix B).


Figure A.9 CPS Gradebook

A variety of alternative reports are available in CPS (Figure A.10); the usefulness of these is outside the scope of this research.

Figure A.10 Range of reporting options in CPS


APPENDIX B

Comparison of top/middle/bottom sets of CPS and GPA results

|            | Q1 GPA | Q1 CPS | Q2 GPA | Q2 CPS | Q3 GPA | Q3 CPS | Q4 GPA | Q4 CPS | Q5 GPA | Q5 CPS | 'Positive' feedback (avg CPS) | Avg GPA | Avg CPS |
| Bottom ten |   7    |   54   |   1    |   64   |   4    |   60   |   1    |   56   |   4    |   26   |              38               |    5    |   62    |
| Middle ten |  11    |   69   |   8    |   58   |  10    |   76   |   6    |   63   |   9    |   66   |              46               |    9    |   76    |
| Top ten    |  14    |   67   |  14    |   56   |  13    |   65   |  13    |   62   |  14    |   76   |              70               |   12    |   75    |

GPA: Grade Point Average from coursework (maximum = 16, minimum pass = 5). CPS: given as a percentage. Each question-set column pair gives the band's average GPA and average CPS score; the final pair gives the averages across all sets.


APPENDIX C

Mid-Module Evaluation

Criteria (Best/Improve), respondent number and verbatim comment:
Best 1: the use of teaching aid to encourage the interaction of students - better understanding
Best 2: clear powerpoint presentations; interactive aspect
Best 3: use of zapper; objective questions during class sessions
Best 4: I have enjoyed the style of teaching and the interactive computer learning
Best 5: interactive nature using zappers; good lesson plans showing effort on tutor's part
Best 6: use of interaction through computer system; workshop method of working; weekly assessment through voting system
Best 7: the presentation and the handouts and using the system of the zapper
Best 8: good motivation; good practicals and tutorials in class; clear presentation
Best 9: lecturing is very clear and brief, it is very good for student
Best 11: use of new technology in tutorials
Best 12: the class exercises
Best 16: remote control assisted learning experience
Best 17: interaction with remote control facility
Best 18: the use of electronic remotes keeping all students focused for the entire lecture
Best 22: tutorials, interactive questions (remote control)
Best 23: interactive learning, computer based
Best 24: implementation of cps system, obtain major involvement of students; patient lecturer
Best 26: the notes were complete and compact
Best 30: interaction in lectures
Best 31: the interactive methods used; good use of work showing model answers from tutorials, in depth lecture notes
Best 32: its interactiveness
Best 33: questions and answers with remote
Best 34: variety; model answers to tutorials; participation; progress checking through zappers system
Best 35: practicals using the computer to ask questions; well presented and prepared lectures
Improve 1: consider slow learning students; although majority of them are fast learners
Improve 4: none
Improve 6: none
Improve 19: concentrate on the students rather than the use of new IT equipment
Improve 21: none
Improve 25: nil
Improve 26: n/a
Improve 27: time limits on interactive exercises
Improve 28: lecture too long because of interactive exercises; they were good but took too long to answer questions meant a lot of waiting time
Improve 30: n/a
Improve 31: build on lecture notes, take more time on tutorials, continue to use interactive methods
Improve 34: all questions timed (zapper questions)


End-of-Module Evaluation

Question Respondent Response

1 MODULE CONTENT

1.1 What parts of this module have you found useful and why?

1.1 1 Basic accounting skills may be useful in future

1.1 2 Projection screen work. Not just handouts and endless talking by lecturer

1.1 3 the assessment part has been useful

1.1 4 exercises were very useful, learning methods - remote control

1.1 5 zapper workout - everyone is listening, tutorials

1.1 6 tutorials and examples are always good

1.1 7 visual aids, group interaction

1.1 9 electronic interaction, everyone can get involved. Lecture notes, easy to read and understand

1.1 13 good interaction

1.1 14 tutorial/questions

1.1 16 using the electronic zappers as allowed me to see if I'm doing the work right

1.2 What parts do you think were a waste of your time and why?

1.2 5 zappers - takes too much time

1.2 6 The questions on the module. The technique was very slow and questions to easy. Set up zapper and receiver tool a lot of time and answer the questions

1.2 16 None

2 MODULE MATERIALS

2.1 What materials did you find the most helpful and why?

2.1 1 zappers - helps to understand questions better and know where your going wrong

2.1 2 Projection screen + e-instruction: both very good at providing interaction with students and lecturer

2.1 3 handouts and tutorials

2.1 4 work examples and handouts

2.1 7 interactive voting system

2.1 8 all handouts and particularly the model answers

2.1 12 handouts + electronic answering

2.1 13 handouts

2.1 14 module notes and tutorial

2.1 15 CPS

2.1 16 all

3 MODULE STRUCTURE

3.1 What was good about the way this module was structured and why?

3.1 1 good visual aids and worksheets, clear illustration of student development

3.1 7 very interactive and information displayed clearly

3.1 12 it was interesting and well-taught

3.1 13 good pace of learning

3.1 14 electronic questions

3.1 16 clear to understand

Page 14: A CRITICAL ANALYSIS OF THE USE OF ELECTRONIC VOTING ... · A Critical Analysis of the use of Electronic Voting Systems: Ask the Audience Emirates Journal for Engineering Research,

A. Felce

24 Emirates Journal for Engineering Research, Vol. 12, No.1, 2007

3.2 What was bad about the way this module was structured and why?

3.2 2 Nothing

3.2 16 None

4 MODULE TEACHING

4.1 What do you like about the way this module is taught and why?

4.1 2 interactive, not just 3 hours of boring lecture

4.1 8 good breakdown of theory and worked examples

4.1 9 teaching methods, the lecture gave clear instructions and was thorough through learning

4.1 10 it was interactive but the worksheet were a little too much

4.1 12 it was good interaction and not heavy going. Electronic resource was good for feedback and making sure you understood the question

4.1 13 style of teaching

4.1 14 use of technology CPS

4.1 16 very good

4.4 The way this module is taught is different from other modules i.e. the use of CPS (zappers). What do you think are the pros and cons of this type of activity and why?

4.4 1 pros - good way to learn cons - takes a long time; sometimes doesn't work

4.4 2 good - enables students to get involved in the lectures without having to speak. Many students are too shy, therefore zappers are good

4.4 7 helps create a competitive atmosphere

4.4 8 good to know if class/individuals are following and understanding however, can lose some people during the waiting time (chatting)

4.4 9 pros - everyone is involved; cons - time can take a while for people to answer

4.4 10 there is definitely more good to it than bad

4.4 12 actually I thought it was very good with no cons

4.4 13 puts the emphasis on the student to pay attention (pro)

4.4 14 all positive

4.4 16 pros - understanding of weather I was doing it correct

5 Is there anything else you'd like to tell me?

5 12 I have enjoyed this lecture. It has not been bland or boring

5 15 Thank you

5 16 I'm retaking this module and I feel this time I understand and will be able to do assignment and exam this year as been a better experience form me to last years


APPENDIX D: CPS end of module evaluation

Response Report
