New approaches to evaluating impact
TRANSCRIPT
Rachel Bury, Academic Liaison Manager – Quality, Communication and [email protected], @rachelriding
Helen Jamieson, Customer Services [email protected], @jamiesonhelena
Evaluating the Impact of Academic Skills Support @ Edge Hill University
Edge Hill University – University of the Year!
• 13,500 FTE
• University staff – 1,298
• University status – 2006
• Large provider of Teacher Education and Nursing
Learning Services:
• Libraries
• Learning Technology
• SpLD support
• Media & ICT Support
• Academic Skills
• Research Support
Presentation aims:
• Drivers for assessing impact of academic skills
• Use of AMOSSHE value and impact tool kit for group sessions
• Use of impact survey for 1-2-1 support
• Introduction of a peer review framework
Drivers:
• Customer Service Excellence holders (10 years) – the ‘so what!?’ factor
• What difference does it make to the student experience? Is it adding value?
• Do we do more of the same? Something different?
• Measuring quality of academic skills support – outcomes versus outputs
• Learning from best practice – time for reflection
My project brief: Evaluate the effectiveness of our Steps to Success programme
• Supplementary academic skills/information literacy workshops – academic writing, Harvard Referencing, literature searching…
• Intensive on staff time / low-to-medium take-up
• Series of recommendations
Using a value and impact toolkit (AMOSSHE, 2011), the project looked at three strands:
• Evaluating satisfaction (whether customers are happy/satisfied with the experience)
• Evaluating impact (whether a change has taken place as a result of an intervention)
• Evaluating value for money (using the 3 E’s – economy, efficiency and effectiveness)
Evaluating satisfaction – feedback forms after the session
• Small number of questions
• Room for free-text comments
Evaluating impact: has a change taken place? Changes may be:
• Affective: attitudes, perceptions, levels of confidence
• Behavioural: people doing things differently e.g. doing something more or less often, asking different types of questions, being more critical or more independent
• Knowledge-based: e.g. knowing about key sources of relevant information
• Competence-based: people doing things more effectively e.g. improved search techniques, finding more appropriate information
Impact survey
• Sent 4-6 weeks after the intervention
• Looking for changes in practice – affective, behavioural, knowledge-based, competence-based
• Self-reported – surrogate impact indicators – need to triangulate
• Follow-up – semi-structured interviews
Value for money: Cost/benefit analysis using the 3 E’s model
1. Economy: for example, how much has the programme cost to run per person/per session?
£55.77 per session (counting only sessions with at least one attendee) and £15.58 per head
Total cost of delivery of attended sessions – £545.30
Preparation time for all sessions – £344.40
Administration – £729.60
Promotion/publicity – £500.00
Total – £2,119.30
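A minimal sketch of this economy calculation in Python (for illustration only – the session and attendee counts are assumptions inferred from the quoted unit costs, not figures stated in the presentation):

    # Economy strand of the 3 E's: unit costs for the Steps to Success programme.
    # Cost figures come from the breakdown above; the counts below are assumptions.
    delivery = 545.30        # delivery of attended sessions
    preparation = 344.40     # preparation time for all sessions
    administration = 729.60  # administration
    promotion = 500.00       # promotion/publicity

    total = delivery + preparation + administration + promotion  # £2,119.30

    attended_sessions = 38   # assumption: implied by £2,119.30 / £55.77
    total_attendees = 136    # assumption: implied by £2,119.30 / £15.58

    print(f"Cost per attended session: £{total / attended_sessions:.2f}")  # ≈ £55.77
    print(f"Cost per head: £{total / total_attendees:.2f}")                # ≈ £15.58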
2. Efficiency: for example, how many students are seen per session, and how can this be improved?
3. Effectiveness: is the programme delivering the intended outcomes?
Have outcomes been clearly articulated at the outset? Difficult to measure if not…
• So what about one-to-one support?
• 2013/2014 – 1,343 sessions delivered – Academic Skills Support
• 2013/2014 – 1,950 sessions delivered – specialist SpLD support
• Whole teams of staff deliver sessions – demand increasing
• No formal way of gathering feedback, assessing quality or measuring impact
• Impact survey devised based on the principles of the AMOSSHE toolkit
• How to gather it – feedback straight after the session is likely to be too superficial
• Now the boring part – administration!
• 2 rounds of surveys completed in 2014/15 – average 30% return rate
• Feedback has been very valuable
Question 4: Could you tell us one change you have made since accessing support?
‘Clearer writing’
‘I changed the structure of my assignments’
‘My grades went up from 65–72%’
‘Being able to keep up and more confidence in my work’
‘I have been able to have an understanding of how to reflect on my work’
• So what have we done with it? Two key outputs:
• More reflection and real evidence!
• Feedback used in team meetings – looking at some of the responses in more detail
• Used information to develop our offer in terms of skills support
• Bringing the quality and impact framework full circle
• Learning Services Peer Support Framework
• 4 teams involved – staff who deliver one-to-one & group sessions
• Staff development and support
• Boring bits again – the administration and paperwork
• Peer support scheme – the challenges
• Don’t call it peer observation – staff anxiety and fears
• 10 months on – has everyone engaged?
• Next year?
• Try to improve survey return rates and look at other approaches, such as UX (user experience) methods
• Encourage more staff to engage in peer support and sharing in meetings
Thank you for listening
Any questions?