Reporting Module Evaluation Outcomes: An Evolving Approach
TRANSCRIPT
Page 1
Reporting Module Evaluation Outcomes
An Evolving Approach
Dr Tim Linsey, Head of Academic Systems & Evaluation
Directorate for Student Achievement, Kingston University
London, [email protected]
Bluenotes Global, August 2019
Page 2

Overview
• Introduction & Context
• Delivery of MEQs
• Reporting, Analysis & Evaluation of Reporting Approaches
• Data Warehouse
• University Dashboards
• University Annual Monitoring and Enhancement process
• Issues and Developments
Page 3

Introduction

Academic Systems and Evaluation:
• TEL Systems
• Attendance Monitoring
• Learning Analytics
• Student Voice
  • MEQs
  • Kingston Student Survey
  • National Student Survey
  • Postgraduate Survey
Page 4

Key considerations
• Creating a culture of continuous improvement supported by analytics
• A consistent approach embedded in Quality Assurance and Evaluation processes
• Closing the feedback loop and valuing the student voice
• Effectively engaging students throughout the curriculum
• MEQs are a key part of the student feedback approach
Page 5

Reintroduction of MEQs
• Decision taken in January 2017 to reintroduce MEQs
• MEQ Working Group
• 10 quantitative + 2 qualitative questions
• 2018/19 – all online surveys
• MEQ environment: Blue from Explorance
• Lead indicator for the Teaching Quality KPI
Page 6
VLE Integration
My Module Evaluations
Page 7
Processes & Timing
• Consistent approach across all modules
• Automated scheduling and publishing of MEQs
• Scheduled to allow time for in-class discussion
• MEQs run all year, but there are two main survey windows (16 days)
• Reports automatically published into the VLE within a few hours of an MEQ completing
• Response rates (chart)
Page 8

Orchestrated approach:
– Briefing guide and PowerPoint for all module leaders
– Set of agreed statements to be conveyed to students
– Student-created video introducing MEQs
– Staff asked to find a slot in class
– Staff requested to leave class for 15 mins
– Use of course representatives
Page 9
2018/19 (to March)
• 832 MEQ reports generated (exceeding the minimum threshold of 4 responses)
• 76% of student responses contained qualitative feedback
• 38% of students completed one or more MEQs
• 47% completed via mobile devices
Page 10

Module Reports

Staff and student reports are similar, except that the student version excludes comments and comparisons (Department and Faculty averages).
Page 11
Open-comment prompts: "Best things" and "Improve"
Page 12

Further Reports
– Department, Faculty and University aggregate reports
– Summary reports for each Faculty
– Modules with zero responses or that did not meet the threshold
– Custom reports
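The zero-response and below-threshold reports amount to a simple filter over per-module response counts. A minimal sketch, using the minimum threshold of 4 responses mentioned for the 2018/19 figures; the module codes and counts here are hypothetical:

```python
# Minimum-response threshold (4, per the 2018/19 reporting figures);
# the module codes and response counts below are hypothetical examples.
MIN_RESPONSES = 4
response_counts = {"MOD1001": 12, "MOD2002": 0, "MOD3003": 3, "MOD4004": 7}

no_responses = sorted(m for m, n in response_counts.items() if n == 0)
below_threshold = sorted(m for m, n in response_counts.items() if 0 < n < MIN_RESPONSES)

print("Zero responses:", no_responses)        # modules with no returns at all
print("Below threshold:", below_threshold)    # responded, but too few to report
```

Separating the two lists mirrors the reporting need: zero-response modules may indicate a delivery problem, while below-threshold modules simply cannot be reported without risking anonymity.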
Page 13

Summary Report for all Modules, 2016/17

Summary table ranking all modules by their mean overall score. Colour coded at ≥ 4.5 and ≤ 3.5.
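The colour-coding rule on this slide (≥ 4.5 and ≤ 3.5) can be sketched as a banding function. The treatment of the middle band and the module means below are assumptions for illustration only:

```python
def colour_band(mean_score, high=4.5, low=3.5):
    """Band a module's mean overall score as in the 2016/17 summary report.
    Thresholds are from the slide; the middle band label is an assumption."""
    if mean_score >= high:
        return "green"   # high satisfaction
    if mean_score <= low:
        return "red"     # flagged for attention
    return "amber"       # between the two thresholds (assumed)

# Hypothetical module means for illustration
for module, score in [("MOD1001", 4.7), ("MOD2002", 4.0), ("MOD3003", 3.2)]:
    print(module, colour_band(score))
```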
Page 14

Summary Report for all Modules, 2017/18
– Colour coding was problematic
– Staff suggestion to rank by standard deviation from the overall university mean
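Ranking by distance from the university mean in standard-deviation units is essentially a z-score ranking. A minimal sketch, assuming the standard deviation is taken across module means (the slide does not say whether it is computed over modules or over individual responses); module codes and scores are hypothetical:

```python
from statistics import mean, stdev

# Hypothetical module mean scores (the real report uses MEQ overall scores)
module_means = {"MOD1001": 4.6, "MOD2002": 4.1, "MOD3003": 3.4, "MOD4004": 4.3}

uni_mean = mean(module_means.values())
uni_sd = stdev(module_means.values())  # assumption: SD over module means

# Rank modules by how many standard deviations they sit from the university mean
ranked = sorted(
    ((m, (s - uni_mean) / uni_sd) for m, s in module_means.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for module, z in ranked:
    print(f"{module}: {z:+.2f} SD")
```

Unlike fixed colour thresholds, this ranking adapts to the year's overall distribution, which may be why staff suggested it.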
Page 15
Additionally
– Comparison of 2016/17 vs 2017/18
Page 16

Summary Report for all Modules, 2018/19

Reviewed our approach to consider issues raised in the literature:
• Comparisons between modules of different types, levels, sizes, functions, or disciplines
• Averaging ordinal-scale data
• Bias
• Internal consistency

(e.g. Boring, 2017; Clayson, 2018; Hornstein, 2017; Wagner et al., 2016)
Page 17

Education Committee Summary Report
– Sorted by Faculty, Level, Response rate
– Includes statistical significance
Page 18
Statistical confidence
Methodology: Dillman, D., Smyth, J., & Christian, L. (2014) Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons.
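Statistical confidence for a survey proportion with a known, small population (a module cohort) is typically computed with a finite-population correction, the standard approach consistent with Dillman et al. A sketch under that assumption; the response and enrolment numbers are hypothetical:

```python
import math

def margin_of_error(n_responses, n_enrolled, p=0.5, z=1.96):
    """Margin of error for a proportion at ~95% confidence, with a
    finite-population correction (appropriate for small module cohorts).
    p=0.5 gives the most conservative (widest) margin."""
    if n_responses == 0 or n_enrolled <= 1:
        return None
    fpc = (n_enrolled - n_responses) / (n_enrolled - 1)
    return z * math.sqrt(p * (1 - p) / n_responses * fpc)

# e.g. 30 responses from a hypothetical module of 80 students
moe = margin_of_error(30, 80)
print(f"±{moe:.1%}")
```

The correction matters here: ignoring it overstates the uncertainty for modules where most of the cohort has responded.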
Page 19
Ranking by % Agree
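Ranking by % agree sidesteps the ordinal-averaging concern raised earlier. A minimal sketch, assuming "agree" means the top two points of a 1–5 Likert scale (the slide does not define it); the response data are hypothetical:

```python
# Assumption: "% agree" is the share of 4s and 5s on a 1-5 Likert scale.
def pct_agree(responses):
    return sum(1 for r in responses if r >= 4) / len(responses)

# Hypothetical response sets per module
modules = {
    "MOD1001": [5, 4, 4, 3, 5, 2, 4],
    "MOD2002": [3, 2, 4, 3, 1, 5],
}

# Rank modules by % agree, highest first
for name, resp in sorted(modules.items(), key=lambda kv: pct_agree(kv[1]), reverse=True):
    print(f"{name}: {pct_agree(resp):.0%} agree")
```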
Page 20
Frequency Distributions
– Request that staff also review the frequency distribution of their responses
– Is the distribution bimodal, and if so, why?

(Example histogram: mean = 2.9)
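The point of the slide's example is that a mean of 2.9 can hide a polarised cohort. A sketch with hypothetical responses constructed so the mean matches the slide's figure:

```python
from collections import Counter
from statistics import mean

# Hypothetical Likert responses chosen so the mean matches the slide (2.9)
responses = [1, 1, 1, 1, 2, 4, 4, 5, 5, 5]

freq = Counter(responses)
print("mean =", mean(responses))  # a mid-scale mean, looks unremarkable

# Text histogram of the frequency distribution
for score in range(1, 6):
    print(score, "#" * freq[score])
```

The histogram peaks at 1 and 5: two groups with opposite experiences of the module, which the mean alone cannot reveal. That is the case for reviewing distributions, not just averages.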
Page 21

Aggregating Questions to Themes
– Teaching
– Assessment
– Academic Support
– Organisation
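Aggregating questions to themes is a mapping from questions to groups, averaged per group. A sketch in which the question-to-theme mapping and the question means are hypothetical (the real grouping is defined by the university's 10-question set):

```python
from statistics import mean

# Hypothetical mapping of MEQ questions to the four themes
themes = {
    "Teaching": ["Q1", "Q2", "Q3"],
    "Assessment": ["Q4", "Q5"],
    "Academic Support": ["Q6", "Q7"],
    "Organisation": ["Q8", "Q9", "Q10"],
}

# Hypothetical per-question mean scores for one module
question_means = {"Q1": 4.2, "Q2": 4.5, "Q3": 4.0, "Q4": 3.8, "Q5": 3.6,
                  "Q6": 4.1, "Q7": 4.3, "Q8": 3.9, "Q9": 4.0, "Q10": 4.4}

# Average the question means within each theme
theme_means = {t: round(mean(question_means[q] for q in qs), 2)
               for t, qs in themes.items()}
for theme, score in theme_means.items():
    print(f"{theme}: {score:.2f}")
```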
Page 22

We noted:
– Care needs to be taken with aggregated data and the inferences drawn from it
– An individual MEQ report is informative for a module team that knows the local context, but care is needed unless trends and other metrics are also considered holistically
Page 23

Data Warehouse
– Raw data passed to the KU Data Warehouse
– Tableau dashboards (Strategic Planning and Data Insight Department)
  • Dashboards accessible by all staff, including the top 5 and bottom 5 modules for each level
  • Data aggregated, with the ability to drill down to module level
– Annual Monitoring and Enhancement Process
Page 24

University Dashboards

(diagram of linked dashboards)
• Course Metrics
• Module Metrics
• MEQs
• Progression
• Attainment Gap
• NSS
• KSS
• BME VA
• KPIs
• Employability
• Learning Analytics
Page 25
MEQs
Page 26
Drilling up and down
Page 27
Department View
Page 28

Annual Monitoring and Enhancement Process
Module Enhancement Plans / Course Enhancement Plans
– Online
– Pre-populated with MEQ and other metrics
– Module leaders and course leaders review the performance of the module/course, citing evidence and the pre-populated metrics
Page 29
MEPs
(extract)
Page 30
Issues & Developments
– When should the MEQ be distributed? Focus group feedback
– Automation – administration & analysis
– Text analytics
– 47% of students completing MEQs via mobile devices
– Analysis across different student voice surveys
– Staff being named in qualitative feedback & issues of etiquette
– Students concerned about anonymity
– Response rates – followed up with modules with high response rates
– Feedback to students
– Demographic analysis
Page 31

Collaborative
• Directorate for Student Achievement
• Academic Systems & Evaluation Team
• Strategic Planning and Data Insight (inc. all dashboard development)
• Academic Registry
• Information & Technology Services
• Faculties via the MEQ Working Group / Student Surveys Group
• Student Course Representatives
• Explorance
Page 32
References
• Boring, A. (2017) Gender biases in student evaluations of teaching. Journal of Public Economics, 145, 27–41.
• Clayson, D. (2018) Student evaluation of teaching and matters of reliability. Assessment & Evaluation in Higher Education, 43(4), 666–681.
• Dillman, D.A., Smyth, J.D., & Christian, L.M. (2014) Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons.
• Hornstein, H. (2017) Student evaluations of teaching are an inadequate assessment tool for evaluating faculty performance. Cogent Education, 4, 1–8.
• Wagner, N., Rieger, M., & Voorvelt, K. (2016) Gender, ethnicity and teaching evaluations: Evidence from mixed teaching teams. Economics of Education Review, 54, 79–94.