
Lawry Price, Roehampton University


SERA CONFERENCE 2012 - 21st/23rd November

Ayr Campus, University of the West of Scotland

“Student evaluations – making them work”

Summary

Any university’s success and reputation depends to a very large extent on its ability to deliver a quality student experience. This paper reports the main findings from a university’s first standardised, institution-wide internal undergraduate module evaluation survey. The survey was developed to build up a comprehensive picture of students’ satisfaction with their undergraduate module experience. This first report related to autumn 2011 modules only, while subsequent spring and yearlong modules were analysed separately at the conclusion of the academic year 2011/12.

This report therefore focuses on all the university-wide questions that used the five-point rating scale, for autumn modules only. It also provides an overview of the process followed to conduct the survey and makes some suggestions for improvements to subsequent surveys.

Overall the results from 4488 responses (a 60% response rate) relating to 218 modules were highly positive. Mean ratings¹ ranged between 4.1 and 4.4 for the seven main question sections (table 1 below) and between 3.7 and 4.6 for the individual questions. Students were particularly positive about the teaching, the supervision of their work and the academic support on offer during module delivery, with an overall mean rating for these areas of 4.4 out of 5. Students were less satisfied with assessment and feedback and with learning resources (both with a mean score of 4.1). The question ‘I have completed all the suggested reading’ was by far the lowest rated question in the survey, with only 46% of students agreeing with this statement.
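As a rough sense of scale (a calculation of ours; the report does not state how many questionnaires were distributed), the quoted figures imply a distribution of about

\[ \frac{4488}{0.60} \approx 7480 \]

questionnaires across the 218 surveyed modules.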

Table 1: Mean ratings for the seven main scaled question sections


Question section                             Mean rating
Me and my module                             4.1
Learning resources relating to the module    4.1
Assessment and feedback                      4.1
Overall satisfaction with the module         4.2
Module organisation and management           4.3
Quality of teaching and supervision          4.4
Academic support during the module           4.4

(Institutional mean ratings on a five-point scale)

____________________________________
¹ The mean scores referred to throughout this report relate to un-weighted means. Un-weighted scores are the mean ratings of modules as a whole and so do not take account of module size.
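To make the footnote’s distinction concrete (the notation here is ours, not the report’s): if module \( m \) has mean rating \( \bar{r}_m \) from \( n_m \) respondents, the two kinds of institutional mean over \( M \) modules are

\[ \bar{R}_{\text{unweighted}} = \frac{1}{M}\sum_{m=1}^{M}\bar{r}_m, \qquad \bar{R}_{\text{weighted}} = \frac{\sum_{m=1}^{M} n_m\,\bar{r}_m}{\sum_{m=1}^{M} n_m}. \]

The report quotes the first form throughout, so a small module counts the same as a large one in the institutional figures.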


The process of surveying the students worked generally well, given that this was its first implementation, but areas where improvements could be made were identified and proposals made for subsequent surveys. Survey data was made available to a range of stakeholders, including Heads of Department, module convenors, programme leaders and individual lecturers, at the appropriate level of disaggregation for self-evaluation purposes.

Organisation of the Survey

The survey software chosen for hosting the module evaluation survey was Evasys, a system maintained by Electric Paper, a company specialising in student-related surveys. Thirty-two questions were asked, covering the following areas:

1. Me and my module

2. Quality of teaching and supervision

3. Assessment and feedback

4. Academic support during the module

5. The way the module was organised and managed

6. Learning resources relating to the module

7. Department-specific questions

8. Overall satisfaction with the module

9. A free-text section asking students what was good about the module and what could be improved

Most questions gave students the opportunity to agree or disagree with statements on a scale from “Strongly agree” to “Strongly disagree”. There was the additional opportunity for individual departmental questions, as well as a section for students to comment more generally. In order to avoid the low response rates typically associated with online questionnaires, it was decided that the survey would take place on paper during class time. A summary of the process for conducting the survey is shown at appendix 1. A review of the overall process took place in June 2012, involving key stakeholders, with the aim of improving the efficiency of the process for 2012/13.
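As an illustration of how headline figures of this kind are typically derived (a minimal sketch of ours, not the Evasys implementation; the mapping of scale labels to scores of 1-5, the wording of the middle label, and the definition of “agreeing” as a rating of 4 or 5 are all assumptions not stated in the report):

```python
# Minimal sketch of deriving headline statistics from raw Likert responses.
# Assumptions (ours, not the report's): labels map to scores 1-5, and
# "agreeing" means a rating of 4 or 5.

SCORES = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,  # assumed wording for the midpoint
    "Agree": 4,
    "Strongly agree": 5,
}

def question_stats(responses):
    """Return (mean rating, % agreeing) for one question's responses."""
    ratings = [SCORES[r] for r in responses]
    mean_rating = sum(ratings) / len(ratings)
    pct_agree = 100 * sum(1 for x in ratings if x >= 4) / len(ratings)
    return round(mean_rating, 1), round(pct_agree, 1)

def unweighted_mean(module_means):
    """Un-weighted institutional mean: the average of per-module mean
    ratings, ignoring module size (as in the report's footnote)."""
    return round(sum(module_means) / len(module_means), 1)

# Made-up example: one question on one module, then three module means.
sample = ["Agree", "Strongly agree", "Neither agree nor disagree", "Agree"]
print(question_stats(sample))            # (4.0, 75.0)
print(unweighted_mean([4.4, 4.1, 4.6]))  # 4.4
```

On this convention, each percentage of students ‘agreeing’ quoted in the report would be the share of ratings at 4 or above; the report itself does not define the agreement threshold, so this remains our assumption.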

The university’s Planning Department, in liaison with its Academic Office, identified the following issues emerging from this first institution-wide survey:

• Awareness among academic members of staff about the process could have been greater.
• Concerns were expressed by some academic staff about what the survey data would be used for and how secure the data was.
• Gaps in module data (for example, a lack of up-to-date lecturer information) meant that preparing the data was time-consuming.
• Academic departments would benefit from clearer communication on what reports are provided, to whom, at what level of detail and when.

Key proposals emerging from this review were:


• A review of the overall process was needed (this took place in June 2012).
• A timeline of activities and a broad outline of the process for 2012/13 would be sent out over the summer to all department contacts. By establishing time frames, informing departments of the basic requirements and reiterating the benefits of the process, an increased sense of ownership of the project could be generated.
• A series of road shows would be offered to academic departments, with the aim of raising awareness of the process as well as giving opportunities for questions. The university’s Planning Department would consult the established review group on what departments would like covered, and then design the sessions accordingly in conjunction with staff from the Academic Office and the university’s Learning and Teaching Enhancement Unit.
• The privacy policies would be highlighted at the road shows, and the privacy policy document (appendix 2) would be distributed before the start of the process.
• A review of how to maximise the accuracy of the module data would take place as part of the overall project review.
• A timeline of who receives which reports, and at what level of detail, would be discussed at the project review and subsequently communicated to departments.

Institutional Summary

Overall the results from 4488 responses relating to 218 modules were highly positive. Mean ratings ranged between 3.7 and 4.6 for individual questions. Students were particularly positive about teaching and learning and about academic support. ‘There were sufficient opportunities to participate and ask questions’ was the highest rated question, with a mean rating of 4.6 and 90% of respondents agreeing with this statement.

The question ‘I have completed all the suggested reading’ was the lowest rated question of the survey, with only 46% of students agreeing. Only 58% of respondents agreed that ‘the library resources relating to this module have been good enough for my needs’, and 57.5% agreed with the statement ‘I have submitted a lot of my coursework online’. ‘I understood the marking criteria before completing the assessment’ and ‘I understood how to improve my work from the feedback I received’ both received mean ratings of 4.0, with 67.5% and 64.7% of respondents respectively agreeing with these statements.

Key findings by Question & Department

The following summarises headline responses to individual questions by questionnaire theme.

Me and my module – the lowest mean rated question of the survey was ‘I have completed all the suggested reading’, with all but one department having this as their lowest mean score. Students responded positively across all departments to attending the sessions and tutorials associated with the module, with 83% agreeing with this statement.


Overall satisfaction with the module – responses on overall satisfaction were favourable across all departments. ‘Overall I am satisfied with the quality of the module’ had the highest mean departmental ratings in this section, ranging between 4.0 and 4.5. A module’s applicability to the workplace was the lowest rated question, with mean values between 3.8 and 4.0.

Quality of teaching and supervision – scores were highly positive in this section. Students in all departments rated the question on ‘sufficient opportunities to participate and ask questions’ the highest, with departmental mean ratings ranging between 4.3 and 4.7. Levels of satisfaction were high for all questions in this section, with only one mean rating falling below 4.0.

Assessment and feedback – this section contained some of the lowest scores of the survey. The areas of most concern for students were ‘I understood the marking criteria before completing the assessment’, with departmental mean scores ranging from 3.6 to 4.2, and ‘I understood how to improve my work from the feedback I received’, with departmental mean scores ranging from 3.5 to 4.2. The responses on online submission of coursework needed careful interpretation, as some coursework is submitted on paper as well as online.

Academic support during the module – students responded very positively in this section, expressing high levels of agreement with being able to discuss matters with lecturers. Mean scores ranged between 3.9 and 4.5.

The way the module was organised and managed – students expressed high levels of satisfaction with module organisation and management. Scores for the two questions in this section appear closely related, with mean scores ranging from 3.9 to 4.6.

Learning resources relating to the module – this section had the second lowest mean score after ‘I have completed all the suggested reading’. It was suggested that there would be value in investigating whether the two responses were interrelated at module level, and library staff responded by arranging to use the Evasys system to follow up these findings. Mean ratings ranged more widely in this section, indicating wider differences in satisfaction between departments. Mean ratings for ‘the library resources relating to this module have been good enough for my needs’ ranged from 3.2 to 4.1, and for ‘the Moodle site has been good enough for my needs’ from 3.6 to 4.3.

The remaining two question sections were driven by individual departments and specifically designed to elicit key messages related to the student experience. The department-specific questions produced highly positive results and provided key insights and feedback from students which, when linked to the free-text section (asking students what was good about the module and what could be improved), proved informative for review, forward planning and prompting potential changes to module content and delivery, with further comments on resources to support learning included.

Separate evaluations and summary reports were produced for Heads of Department and those responsible for Learning and Teaching. These focused on summaries of the module evaluations and response rates, and gave indicators of how individual departments compared with the university results as a whole. It was re-emphasised here that this was first and foremost an “evaluating the module” process, and certainly not an evaluation of the individual lecturer/module leader overseeing delivery of the particular module.

Review and changes for follow-on processes

A full review followed the completion of this first round of autumn module evaluations. Some key changes to operational matters were put in place for the spring period, together with a confirmed commitment that the same module evaluation template would be used, to maintain consistency across the academic year in question.

Achieving even better response rates, and therefore more detailed and accurate data from the exercise, was a key objective. To this end, a further concerted awareness-raising campaign to reiterate and communicate the purpose, value and worth of the activity was put in place for students and staff alike; “buy-in” was deemed crucial to ongoing success for all stakeholders in the process. There was recognition too that the very careful planning invested in the project over a long lead-in period (including a contained pilot exercise preceding full university-wide implementation) had played its part in the initial success. The decision to go with what was in place for the autumn module evaluation (in order to meet the timescale originally planned) was a key one, made while acknowledging potential shortfalls where specific data was lacking.

The outcome was that individual departments did indeed benefit from receiving data-rich feedback on a scale neither available nor experienced before, and module leaders and designers were placed in a position to utilise this accrued information fully, both for review purposes and for future planning. Ultimately the drivers behind student satisfaction were made more pronounced, open and transparent, which was the key aim of the project in the first place. Rather than waiting for the results of the NSS (National Student Survey) and other related barometers to gauge student responses to their experiences, the university was now able to monitor these across the particular academic year in question. The beginnings of a culture of module evaluation had therefore been established, on which to build further.


Appendix 1

UNDERGRADUATE MODULE EVALUATION PROCESS FLOW CHART

1. LTEU and Planning design/agree the questionnaire.
2. Planning extract module data from SRS.
3. Academic Office arrange printing of the standard questionnaires.
4. Academic Office provide details of the lecturers teaching modules.
5. Timetabling provide additional details on modules.
6. Planning reconcile module, lecturer and timetabling information and send it to programme convenors to fill in gaps.
7. Planning upload modular information into Evasys and generate coversheets for the Academic Office.
8. Academic Office arrange printing of questionnaires and cover sheets (a different one per module).
9. Academic Office make packs for each module: collate questionnaires and coversheets and put them in A4 envelopes.
10. Academic Office distribute packs to departmental offices.
11. Departmental offices distribute questionnaires to lecturers.
12. Students complete the questionnaire.
13. Lecturers return completed questionnaires to departmental offices.
14. Departmental offices return completed questionnaires to the Academic Office.
15. Academic Office compile completed questionnaires and cover sheets and send them for scanning.
16. Electric Paper scan coversheets/forms into ScanStation, including checks.
17. Electric Paper upload responses into Evasys.
18. Planning send individual reports out through the software to module lecturers.
19. Planning send summary reports to Heads of Department and LTAG chairs.


Appendix 2

PRIVACY POLICY – MODULE EVALUATION DATA COLLECTION & USE (ACADEMIC STAFF)


• All students will be invited to complete an anonymous module evaluation form to provide feedback on the learning experience in each module.
• Programme reps will invite students to contribute more informal feedback, which can be fed back to the programme team at a programme board meeting.
• Academic staff may invite students to share their views on a programme (for example through focus groups).

Student feedback enables you to:

• reflect on teaching and assessment strategies in the light of comments
• evaluate the extent to which changes you have made are successful in enhancing learning
• gain a richer picture of the student experience of a module
• ensure that student expectations of a module are accurate
• notify students of changes made in areas which have been highlighted as important to them
• collect evidence of teaching quality at your discretion (to take to an appraisal meeting, for example, or to use as evidence for promotion).

Disclosure of your Information

Surveys are anonymous. They are completed either online or in class. Staff teaching each module are provided with overview data in graphical form, and Heads of Department are provided with departmental statistics.

Feedback to programme reps may be less anonymous, and clearly programme reps themselves are not anonymous; however, any reports to programme boards will make points in general terms so as to protect individual students and staff.

Data retention

Module evaluation data will be retained securely for three years.

Changes to our Privacy Policy

Any changes we may make to our privacy policy in the future will be posted on the University website.

Contact: Director of Learning and Teaching Enhancement

Questions, comments and requests regarding this privacy policy are welcomed and should be addressed to [email protected]