
AN INSTITUTION-WIDE PROJECT USING AN ELECTRONIC VOTING SYSTEM FOR ASSESSMENT: THE STORY SO FAR....

K. Robins, E. Gormley-Fleming

University of Hertfordshire (UNITED KINGDOM)
[email protected], [email protected]

This paper considers an institutional perspective on the roll-out of an Electronic Voting System (EVS) across a number of schools, with the overarching view that it will enhance the assessment experience for both students and academics. Our initial findings, that the use of handsets benefitted both students and academics, are not dissimilar to those of Draper & Brown (2004). The paper will also demonstrate the successes and challenges of such a large-scale project.

Three drivers define the need for this project:
• The critical role assessment and feedback play in supporting learning, developing students’ self-regulation and ultimately enhancing student progression and success.
• Students nationally and locally identifying assessment and feedback as the least satisfactory aspect of their university experience.
• Assessment and feedback are often subject to time pressures, and technology-enhanced solutions can be both educationally effective and resource efficient.

Across the institution, eleven schools have volunteered to be involved in the project and over 7000 EVS handsets have been issued to students. This institution-wide approach to using technology to increase and enhance assessment and feedback practice has been carried out at three levels: individual student, module and programme.

The presentation aims to share:
• How the schools were supported across the institution
• How EVS links to good principles of assessment
• Some smart things to do with EVS
• Some of the challenges of technology

The next phase will be to introduce a ‘Student Dashboard’ that will collect performance data from a variety of sources and create automated and regular reports of student engagement with our Managed Learning Environment (MLE). Ultimately, the project is focused on improving student support, student learning and student engagement in their learning experience.

References:
• Draper, S. & Brown, M. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20, 81-94.

Please note, this project has recently received funding from the JISC.

Keywords: EVS, PRS, assessment, technology.


1 INTRODUCTION

This paper gives an overview of the roll-out of Electronic Voting System (EVS) handsets in teaching across the University. It is hoped that the handsets will provide benefits in learning, teaching and assessment for students. This work forms part of a larger, JISC-funded project, Integrating Technology Enhanced Assessment Methods (ITEAM). The project team all work in the Learning and Teaching Institute (LTI), and the team comprises a Project Director, a Project Manager and several LTI teachers.

EVS has been used by enthusiasts in their individual modules for a number of years, to good effect. The project aims to pull this work together by offering a central approach. The project started in 2010/11 by developing important relationships with Heads of Academic Schools. The team also worked with a number of stakeholders, such as technical support, the students’ union, disability services and the supplier, Reivo, to ensure the smooth roll-out of EVS across the University.

Our initial findings, that the use of handsets benefitted both students and academics, are not dissimilar to those of Draper & Brown (2004) [1]. These benefits include encouraging students to engage with the module, encouraging learning, supporting personalised learning and providing timely feedback. From an academic’s perspective, EVS can be used to provide formative feedback, correct misunderstandings, provide just-in-time teaching, add variety, and improve attendance and performance.

The team decided to use the TurningPoint Electronic Voting System, on the basis of its ease of use, its ready integration with PowerPoint, and the fact that two schools had already developed experience in using this product. It has also been tested and works with our in-house managed learning environment (MLE), StudyNet.

Initially, schools across the University were offered free handsets if they became part of this project. The team worked with eight partner schools, and approximately 3785 EVS handsets were purchased and deployed to schools. Each school was free to use EVS in whatever way it thought appropriate within its subject discipline. Two types of handset were purchased: the LCD RF handset, which supports standard multiple-choice questions, and the XR handset, which also accepts alphanumeric answers and self-paced assessments (homework mode). A breakdown of the distribution is shown below.

Table 1: Distribution of handsets across the University (2010/11)

Academic School                        Handsets (2010/11)   Handset Type
Psychology                             500                  LCD RF
Computer Science                       320                  XR
Humanities                             575                  XR
Business                               1000                 LCD RF
Education                              260                  LCD RF
Law                                    450                  LCD RF
Life Science                           420                  LCD RF
Physics, Astronomy and Mathematics     320                  LCD RF

In 2011/12, two further schools joined the project, ‘Engineering and Technology’ and ‘Nursing and Midwifery and Social Work’. All schools that needed additional handsets from September 2011 were required to buy their own, albeit through an agreed supplier. There are now over 7500 handsets available across the University.


Throughout this project, we have been conscious of ensuring that the use of EVS fits with our Assessment for Learning Principles. These principles were developed by a team led by Dr Mark Russell (Russell et al, 2010) [2], and are synthesised (mainly) from the work of Gibbs and Simpson (2004) [3], Nicol (2007) [4], the NUS and the Weston Manor Group (2007) [5].

2 HOW THE SCHOOLS WERE SUPPORTED ACROSS THE INSTITUTION

Each school was required to nominate a school lead. The project director communicated with the school leads on a regular basis to provide status updates and to give advice, training and information for staff and students. At the start of the project, each school lead was required to develop a project plan outlining the following:

• The scope of the project
• Aims, objectives and targets
• Key milestones and dates
• Risks to success (preferably with plans to mitigate those risks)
• Any training / staff development needs
• Requirements from the project team for their project to succeed

2.1 Technical Support

2.1.1 Information Hertfordshire (IH)

IH provides technical support across the University and has fully engaged with this project. Within this unit, the Learning and Teaching Development Unit (LTDU) has developed a system which tags handsets to individual students and stores the results in StudyNet, the managed learning environment (MLE) for the University. The system is extremely efficient, as it scans the student ID card and the EVS bar code in a matter of a few seconds per student. Academics are able to access this data by module and produce a participation list for TurningPoint, allowing them to keep track of individual students’ performance data.
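To make the tagging step concrete, the sketch below shows one plausible way of pairing a scanned student ID with a handset bar code and exporting a per-module participant list. The class, field names and CSV layout are illustrative assumptions only; the real LTDU system stores its records in StudyNet and feeds participation lists to TurningPoint.

```python
import csv
from dataclasses import dataclass

@dataclass
class HandsetRegistration:
    student_id: str   # scanned from the student ID card
    handset_id: str   # scanned from the EVS handset bar code
    module_code: str  # module the student is enrolled on

def export_participant_list(registrations, module_code, path):
    """Write a simple participant list for one module.

    The column layout here is illustrative; it is not necessarily the
    format TurningPoint expects for imported participant lists.
    """
    rows = [r for r in registrations if r.module_code == module_code]
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Student ID", "Device ID"])
        for r in rows:
            writer.writerow([r.student_id, r.handset_id])

# Example usage with made-up identifiers
registrations = [
    HandsetRegistration("u1234567", "ABC123", "4LMS0023"),
    HandsetRegistration("u7654321", "DEF456", "4LMS0023"),
]
export_participant_list(registrations, "4LMS0023", "participants_4LMS0023.csv")
```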

The technical support team have installed the TurningPoint software on, and attached receivers to, all classroom computers across the University. The helpdesk has recently taken on the role of helping academics and students with technical issues related to the use of EVS.

2.1.2 Technology Mentors

Within the University, we also have a number of Student Technology Mentors. These technology mentors have been trained in how to use EVS and are available to support academics as and when required.

2.1.3 Supplier support

Reivo, the supplier, has been very helpful, providing training to support academics in using TurningPoint. They have also supported the technical team with hardware issues.


2.2 Pedagogic Support

Pedagogic support has been provided by the project team through the development of online resources and links to useful articles on the LTI Knowledge Exchange. These online resources include documents on:

• EVS and the pedagogy, e.g. Smart things to do with EVS, Inclusive Practice
• Useful tips for teachers, e.g. FAQs, downloading a participant list, inclusivity for students
• How to use TurningPoint, e.g. downloading the TurningPoint software, changing channels on the EVS handset.

2.2.1 A one-day school symposium event

This event ran for schools that had been participating in the EVS project. The symposium included introductory and more specialised “hands-on” EVS training sessions run by UH and Reivo staff. Schools were also encouraged to share their experiences and achievements in using EVS within their school, to raise staff awareness of the potential of EVS and thus extend its use.

2.2.2 Introduction to TurningPoint workshop

This workshop aims to introduce staff to TurningPoint, explain how and why EVS is used in teaching, discuss the hardware and software and how EVS works with StudyNet (MLE), and give academics experience in developing some simple questions.

2.2.3 Effective and Efficient Assessment workshop

This workshop aimed to help academics reflect on their current assessment and feedback practices in the light of best practice in this area, and to identify how technology may enhance the assessment process.

2.2.4 Using EVS for Assessment and Feedback workshop

This workshop aimed to provide an overview of how EVS can be used for assessment and feedback, and to explore the different ways in which EVS can be used in teaching. It also included examples of how EVS links with the Assessment for Learning Principles.

3 HOW EVS LINKS TO ASSESSMENT FOR LEARNING PRINCIPLES

The team is keen to ensure that EVS is used from a sound pedagogic perspective, as it is important that academics think carefully about how and why they are using the system. To this end, the team has identified and is promoting the ways in which the use of EVS links to our Assessment for Learning Principles, together with the benefits of, and considerations in, using EVS in this way. The principles developed by Russell et al (2010) [2] are listed below:

1. Engages students with the assessment criteria
2. Supports personalised learning
3. Ensures feedback leads to improvement
4. Focuses on student development
5. Stimulates dialogue
6. Considers student and staff effort


3.1 Examples of the use of EVS in teaching

3.1.1 Drop quizzes during the semester – Assessment Principles 1 & 2

Students are advised that a number of drop quizzes will run in the lecture over the semester. These quizzes are set as summative assessments that count towards the overall module mark. Students are not told when a quiz will take place; it could happen at any time during the lecture, but is most likely to be at the start or the end. The aim is to encourage students to work on the module both during and outside lecture time, for the entire semester. Even if the answers are not provided instantly, the results can be given to students in a timely manner after the lecture. Benefits include:

• Students receive instant feedback that is individual and personal.
• Academics can identify the level of understanding of a given topic, both for the class as a whole and for individuals.
• Extra support can be targeted at students who need it.

Considerations include:
• Allow adequate time for the students to answer the questions.
• Feedback must be provided to the students in a timely manner.

3.1.2 Contingent teaching – use student answers to questions to dictate the way the lecture goes forward – Assessment Principles 2 & 3

Conditional branching allows the lecturer to control the order of slides in the presentation based on the responses received from the audience. For example, a question covering a specific subject area can be asked to assess whether the participants understand the subject. If most of the participants respond correctly, the lecturer can skip ahead to the next section of material (a minimal sketch of this decision logic is given at the end of this subsection). Benefits include:

• Academics do not need to cover material that students already understand.
• Academics can track the students who did not understand and provide support for them.

Considerations include:
• Ensure students are not voting randomly, as this will mislead the lecturer and, in turn, distort the content that follows.
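The branching decision itself reduces to a simple threshold test. The sketch below is a minimal illustration of that logic, not TurningPoint's actual conditional-branching feature; the 0.8 threshold is an arbitrary example value.

```python
def next_step(responses, correct_answer, threshold=0.8):
    """Decide whether to skip ahead or revisit the material.

    responses: answers collected from the handsets for one question.
    threshold: proportion of correct answers needed to move on
               (0.8 is an illustrative choice, not a recommendation).
    """
    if not responses:
        return "revisit"  # nobody voted, so treat the topic as not understood
    correct = sum(1 for r in responses if r == correct_answer)
    return "skip_ahead" if correct / len(responses) >= threshold else "revisit"

print(next_step(["B", "B", "A", "B", "B"], correct_answer="B"))  # skip_ahead
print(next_step(["A", "C", "A", "B"], correct_answer="B"))       # revisit
```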

3.1.3 To track individual or class progress – Assessment Principles 3 & 6

When students have answered either formative or summative EVS questions in a lecture or tutorial, it is possible to track the progress of an individual student or of the entire class. The electronic voting handsets can be set to Anonymous (not linked to a handset), Automatic (provides the handset number) or associated with a class list downloaded from StudyNet, which includes the student name and handset number. When tutors are ready to generate reports, they simply launch the Reports Wizard from the TurningPoint toolbar and can generate a report in Excel or Word (a simple aggregation of this kind is sketched at the end of this subsection). Benefits include:

• Academics can identify how much the class understands and which questions were found difficult.
• Support can be offered to individuals who are not performing well or not engaging with the EVS questions.
• Academics can provide additional support, or just-in-time teaching, if necessary.

Considerations include:
• Interactive teaching does not guarantee an active learner, but it will encourage learning.
• Use of EVS should not distract from the content.
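The reporting step can be thought of as a simple aggregation over named responses. The sketch below is not the Reports Wizard itself; it merely illustrates how per-question facility and per-student scores could be derived from handset data once responses are linked to a class list.

```python
from collections import defaultdict

def summarise(responses, answer_key):
    """responses: {(student, question): answer}; answer_key: {question: correct answer}."""
    per_question = defaultdict(lambda: [0, 0])  # question -> [correct, answered]
    per_student = defaultdict(lambda: [0, 0])   # student  -> [correct, answered]
    for (student, question), answer in responses.items():
        is_correct = answer == answer_key[question]
        per_question[question][0] += is_correct
        per_question[question][1] += 1
        per_student[student][0] += is_correct
        per_student[student][1] += 1
    facility = {q: c / n for q, (c, n) in per_question.items()}
    scores = {s: c / n for s, (c, n) in per_student.items()}
    return facility, scores

facility, scores = summarise(
    {("u123", "Q1"): "A", ("u123", "Q2"): "C", ("u456", "Q1"): "B", ("u456", "Q2"): "C"},
    {"Q1": "A", "Q2": "C"},
)
print(facility)  # which questions the class found difficult
print(scores)    # which students may need extra support
```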

3.1.4 Peer assessment – Assessment Principles 1 & 4

EVS handsets can be used for students to assess their peers on an assessment task, either formatively or summatively. The task is likely to be something visual and may include presentations, a short piece of work, debates or a poster. Students use their handsets to vote on their peers’ work. Academics must set the grading/marking method that will be used beforehand and clearly articulate it to students, e.g. A – Excellent, B – Good (one way of aggregating such votes is sketched at the end of this subsection). Benefits include:

• Students engage with the marking criteria and have a voice.

Considerations include:
• Clear guidance is required.
• It does not capture qualitative feedback from students.
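One way of turning the letter votes into a single peer mark is sketched below. The letter scale follows the example above (A – Excellent, B – Good); the numeric mapping and the use of a median are illustrative choices, not the project's prescribed method.

```python
import statistics

# Illustrative mapping from the letter scale to points; an assumption, not a UH scheme.
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1}

def peer_grade(votes):
    """Return the median letter grade for one piece of work from a list of peer votes."""
    points = [GRADE_POINTS[v] for v in votes if v in GRADE_POINTS]
    if not points:
        return None
    median = statistics.median(points)
    # Map the median back to the nearest letter for reporting
    return min(GRADE_POINTS, key=lambda g: abs(GRADE_POINTS[g] - median))

print(peer_grade(["A", "B", "B", "A", "C"]))  # -> "B"
```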

3.2 Lecture plans and EVS

As a means of introducing EVS and highlighting the potential of technology enhanced teaching, a series of lesson plans was critiqued to identify where and when students were being actively engaged in their learning. It became apparent that student engagement was not always obvious and that there were periods of ‘lecturing’ to the students. It is known that a variety of student activities will engage students and that attention levels are high during the first ten minutes but drop as the lecture continues if students are not actively involved (Bligh, 2002) [6]. The level of engagement is critical to the overall success of their learning. Simpson and Oliver (2006) [7] identified how interactive lectures that engaged students led to a greatly improved student learning experience.

The use of EVS as a technology enhanced teaching resource was then mapped onto lesson plans so that opportunities for student engagement could be considered alongside traditional methods of teaching (see Figure 1 below). Having regular and timely assessment demands would increase student attention levels and potentially enhance their learning experience. The use of EVS in the lectures was directly linked to the document created by the team, “101 things to do with EVS”, which also embedded the Assessment for Learning Principles.

Figure 1


4 RESOURCE EFFICIENCY

Assessments should be designed to support student learning, but there is little empirical evidence to date about the time it takes an academic to create an assessment task. This time should be seen as intrinsic to the whole assessment process, as well-designed assessment tasks should be fair, equitable and transparent, and should also measure whether learning outcomes have been achieved.

While it could be suggested that all assessment practices should be of a ‘Rolls Royce’ standard (see Figure 2 below), this may not always be realistic or feasible. It may be better to aim for assessments that are efficient and effective for the relevant discipline.

Figure 2 - (Hornby, 2003) [8]

As part of this project, resource calculators were created which capture all aspects of the assessment process, see Figure 3 below (a simple arithmetic sketch of such a calculator follows the figure). The initial thinking behind the resource calculators was to identify whether using EVS would be a more efficient means of managing summative assessment. It is appreciated that EVS is not a panacea for the ills of assessment when student numbers are large. However, it is important to think about how to manage large numbers of students effectively and efficiently through the entire assessment process.

Figure 3 – Typical activities for an essay with resources (time) required
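At its core, a resource calculator of this kind is simple arithmetic: a fixed design/set-up cost, a per-student marking and feedback cost, and a fixed administration cost. The sketch below uses made-up illustrative timings, not the project's measured figures, to show how the totals scale with group size.

```python
def total_assessment_minutes(setup_minutes, per_student_minutes, admin_minutes, n_students):
    """Total staff time for one assessment: fixed set-up, per-student marking, fixed admin."""
    return setup_minutes + per_student_minutes * n_students + admin_minutes

# Illustrative (made-up) timings only, not the project's resource-calculator values:
for n in (20, 60, 120):
    essay = total_assessment_minutes(setup_minutes=120, per_student_minutes=30,
                                     admin_minutes=60, n_students=n)
    evs = total_assessment_minutes(setup_minutes=240, per_student_minutes=0.5,
                                   admin_minutes=30, n_students=n)
    print(f"{n:>3} students: essay {essay:.0f} min, EVS {evs:.0f} min")
```

Under assumptions like these, the fixed cost of writing good EVS questions dominates for small groups, while the per-student marking cost makes traditional assessments grow much faster as group size increases.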


Using EVS for summative assessment with larger student groups, compared to other assessment types, can offer an enhanced experience not only for the student, in terms of prompt feedback, but also for the academic, in terms of marking and administrative processes. Figure 4 below compares the total assessment process time, in minutes, for four traditional assessment types and EVS across different group sizes.

Figure 4 – Assessment process time by size of group

[Bar chart comparing total assessment process time (0–300 minutes) for Essay, OSCE, Group Presentation, Exam and EVS, for groups of 20, 60 and 120 students.]

5 SOME OF THE CHALLENGES

During the academic year a number of issues have been raised and in most cases resolved.

5.1 Technological Issues

5.1.1 Channel Conflict

The most common issue reported, although it presented itself in different ways, was caused by the same channel being used in all classrooms, so that student responses were not received in the correct place. All receivers have 82 channels, but each receiver automatically defaults to channel 41. The receivers work to 200 ft, with longer-range receivers (400 ft) available for larger venues. Consequently, when a student submits an answer using an EVS handset, the response can be picked up by the closest receiver, which may be in a nearby classroom. The receiving computer does not even need to be running the TurningPoint software; it only needs to be switched on.

To overcome this, the technical team is undertaking a channel-mapping review over the summer months to maximise classroom performance. Each classroom will display signage showing the room’s channel number, together with instructions for changing channels.
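One way to picture the channel-mapping exercise is as assigning each room a channel that none of its near neighbours uses. The sketch below uses a greedy assignment over assumed adjacency data; it is an illustration of the idea, not the technical team's actual method.

```python
CHANNELS = range(1, 83)  # TurningPoint receivers offer 82 channels

def assign_channels(neighbours):
    """neighbours: {room: set of rooms close enough for receivers to interfere}."""
    assignment = {}
    for room in sorted(neighbours):
        used_nearby = {assignment[n] for n in neighbours[room] if n in assignment}
        assignment[room] = next(c for c in CHANNELS if c not in used_nearby)
    return assignment

# Hypothetical rooms and adjacencies for illustration
rooms = {
    "A101": {"A102", "B101"},
    "A102": {"A101"},
    "B101": {"A101"},
}
print(assign_channels(rooms))  # e.g. {'A101': 1, 'A102': 2, 'B101': 2}
```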

5.1.2 Receivers not working in the classroom

It is believed that the reason for this is that the receiver has effectively gone to ‘sleep’. The technical team have experienced this problem before and have now addressed it.

5.2 Academic Issues

5.2.1 Losing results from a session

If academics use the classroom computer and forget to save their results to a memory stick, they will lose the student responses to EVS questions when the computer is switched off. The team is currently investigating ways of automatically saving all EVS session data to an appropriate place (e.g. at the end of each day, or when the PC is shut down).
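The kind of housekeeping being considered could be as simple as a scheduled script that copies any session files left on the classroom PC to a network share. The paths, file extension and schedule below are assumptions for illustration only; they do not describe an agreed UH solution.

```python
import shutil
from pathlib import Path

def backup_sessions(local_dir="C:/TurningPoint/Sessions", share="//fileserver/evs-backup"):
    """Copy any saved session files from the local folder to a network share."""
    destination = Path(share)
    destination.mkdir(parents=True, exist_ok=True)
    for session_file in Path(local_dir).glob("*.tpzx"):  # extension is an assumption
        shutil.copy2(session_file, destination / session_file.name)

# e.g. run from a scheduled task at the end of each day, or at PC shutdown:
# backup_sessions()
```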


5.2.2 Using EVS to test students

EVS has been used to give students one-off tests. Whilst this might be an efficient way to test, it is a high-risk method if the equipment fails to work or if individual students are unable to submit their answers. EVS has been used to carry out a number of tests over the last two years and has worked successfully on all but one occasion, which failed because the academic set the data collection to anonymous. To support academics in using EVS for tests, guidance has been provided on timing, inclusivity, question/test design, and running and saving the test results.

5.2.3 Students not able to vote due to being timed out too early

One of the more unusual problems was that the countdown timer in TurningPoint was not visible (white text against a white background), so students could not see that voting would close after 30 seconds and were timed out.

Overall, managing the technology has been an added strain on academics and they need more support. The team are addressing this through a range of workshops, promoting the online help on the knowledge exchange and providing drop-in sessions.

5.3 Student Issues

From a student perspective, one issue has been students forgetting or losing their handsets. To this end, a student-facing document has been written and is issued to students when they are first provided with an EVS handset. If a handset is lost, the student can replace it by contacting the supplier directly. However, the team is looking into schools holding spare handsets that students can buy directly from their school, because buying in bulk reduces the cost of a handset and it can be replaced immediately.

Owing to some technology issues, such as channel interference, and the inexperience of some academics in using TurningPoint, some students lack confidence that the handset is working properly. They feel vulnerable that their answer may not have been captured by the receiver, despite the handset itself providing confirmation. It is hoped that once the channels in the classrooms have been remapped, the system will work effectively.

6 THE WAY FORWARD

Recently, the team has been visiting school leads to support, encourage and enhance the use of EVS in teaching, by offering to co-run an EVS workshop. This enables each school to share and demonstrate its own EVS successes, which will hopefully act as a catalyst to get others on board. Most schools have welcomed these visits and have requested support in a number of ways.

6.1 Writing Good MCQs

Most schools would like some training on writing good multiple-choice questions, in particular on creating questions at the appropriate level and linking them with Bloom’s taxonomy (Bloom, 1956) [9].

6.2 Supporting school leads

It is felt that dissemination in the faculties could be improved by sharing good practice between the school leads. For example, one school has used a buddy system to encourage staff to use EVS in the classroom. Another school has mandated that all staff will use EVS in their modules and has provided extensive training support for its academics. Individual schools have also been offered tailored workshops that relate to their own needs.


6.3 Case studies

EVS has been used in a wide variety of ways, and schools have requested that we share this knowledge. One of our concerns was that EVS might be used for its own sake (‘EVS fatigue’) rather than to improve student understanding and performance; sharing these examples should help to reduce this. It is planned that we will write up these examples as case studies and make them available to everyone.

6.4 Student dashboard

The next phase will be to introduce a ‘Student Dashboard’ that will collect performance data from a variety of sources, including EVS data, and create automated and regular reports of student engagement with our Managed Learning Environment (MLE).

References

[1] Draper, S. & Brown, M. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20, 81-94.

[2] Russell, M. ESCAPE Project. http://jiscdesignstudio.pbworks.com/w/page/12458419/ESCAPE%20Project [Accessed 13th May 2012]

[3] Gibbs, G. & Simpson, C. (2004). Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education, 1(8).

[4] Nicol, D. (2007) Principles of good assessment and feedback: Theory and practice, from the REAP International Online Conference on Assessment Design for Learner Responsibility, 29th-31st May, 2007.

[5] Weston Manor Group (2008) Assessment standards: A Manifesto for Change. http://www.brookes.ac.uk/aske/Manifesto/

[6] Bligh, D. (2002) What is the use of lectures? Exeter: Intellect.

[7] Simpson, V. & Oliver, M. (2006) Using electronic voting systems in lectures. www.ucl.ac.uk/learningtechnology/examples/ElectronicVotingSystems.pdf [Accessed 13th May 2012]

[8] Hornby, W. (2003) Case Studies on Streamlining Assessment. Centre for the Enhancement of Learning and Teaching, The Robert Gordon University, Aberdeen. www.rgu.ac.uk/celt/learning/page.cfm?pge=7347#cases

[9] Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H. & Krathwohl, D. R. (1956). Taxonomy of educational objectives: the classification of educational goals; Handbook I: Cognitive Domain. New York: Longmans, Green.
