Analytics4Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions...


A4A (Analytics4Action)

Results so far (2014-2015 academic year)

Pass rates:
- On 7 modules the pass rates increased
- On 4 modules the pass rates decreased
- No comparable data for 7 modules

Z-scores:
- On 8 modules the z-scores increased
- On 2 modules the z-scores decreased
- No impact on z-scores for one module
- No comparable data for 7 modules

Satisfaction:
- On 7 modules the overall satisfaction increased
- On 4 modules the overall satisfaction decreased
- No comparable data for 7 modules

While proving causality is always very difficult, the increases in pass rates, z-scores and overall satisfaction can serve as a proxy for engaged and well-supported module team chairs who are willing and able to take action.
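For readers less familiar with the measure, a z-score expresses a module's outcome relative to a benchmark in units of standard deviation. The slides do not state which benchmark the OU uses, so the formula below is an illustrative sketch only: p_module stands for the module's pass rate, and mu and sigma for the mean and standard deviation of pass rates across some comparator group of modules.

    z = \frac{p_{\text{module}} - \mu_{\text{comparator}}}{\sigma_{\text{comparator}}}

Read this way, a rise in z means the module improved relative to its comparators, which is a different claim from a rise in the raw pass rate.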


A224 Inside music: an example

Reviewing the tutor-marked assignment (TMA) data revealed a possible issue between TMAs 03 and 04.

The data suggested a significant drop in students submitting TMA04 (11% on the 2014J presentation)


The data also suggested retention was an issue between TMA03 and TMA05, with 12% formal withdrawal in this period.
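As an illustration of the kind of check that surfaces such a drop, the Python sketch below flags any assignment whose submission rate falls sharply relative to the previous TMA. The counts, column names and the 5-percentage-point threshold are hypothetical, not the OU's actual data or tooling.

    import pandas as pd

    # Hypothetical submission counts per tutor-marked assignment (TMA).
    tmas = pd.DataFrame({
        "tma": ["TMA01", "TMA02", "TMA03", "TMA04", "TMA05"],
        "submitted": [470, 450, 430, 355, 340],
        "registered": [500, 495, 490, 485, 460],
    })

    tmas["rate"] = tmas["submitted"] / tmas["registered"]
    tmas["change"] = tmas["rate"].diff()

    # Flag any TMA whose submission rate dropped by more than 5 percentage points.
    print(tmas.loc[tmas["change"] < -0.05, ["tma", "rate", "change"]])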

A224 Inside music: an example


Investigation of the data showed low engagement with structured content activities 17.22-17.26.

Proposed Action: Rework the content in these activities.
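A sketch of how such low-engagement activities might be identified from VLE clickstream data follows; the file name, column names and 30% threshold are assumptions for illustration rather than the actual A4A tooling.

    import pandas as pd

    # Hypothetical VLE clickstream export: one row per click,
    # with student_id and activity_id columns.
    clicks = pd.read_csv("vle_clicks.csv")
    registered_students = 485  # illustrative headcount

    # Share of registered students who accessed each activity at least once.
    reach = (
        clicks.groupby("activity_id")["student_id"]
        .nunique()
        .div(registered_students)
        .sort_values()
    )

    # Activities reached by fewer than 30% of students are candidates for rework.
    print(reach[reach < 0.30])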

A224 Inside music: an example


Data suggests that roughly a quarter of students registered on A224 15J are studying 120 credits.

Further investigation is required to track their success.

Proposed Action: To better advise these students at registration.
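A short illustration of the query behind this observation, and of the follow-up the slide calls for, is sketched below: it computes the share of A224 15J registrants studying 120 credits and compares their pass rate with that of lighter-load students. The file and column names are hypothetical.

    import pandas as pd

    # Hypothetical registration export for A224 15J:
    # one row per student, with total_credits and passed (0/1) columns.
    reg = pd.read_csv("a224_15j_registrations.csv")

    # Share of registrants carrying a full 120-credit load.
    share_full_load = (reg["total_credits"] >= 120).mean()
    print(f"studying 120 credits or more: {share_full_load:.0%}")

    # Further investigation: do full-load students pass at a different rate?
    print(reg.groupby(reg["total_credits"] >= 120)["passed"].mean())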


L192 Bon départ: beginners' French

The Problem

In previous presentations, over 500 students registered at the start.

Steady attrition meant that the two previous presentations had completion rates of 67% and 69%, and pass rates of 63% and 65%.

30% of students had previous educational qualifications (PEQs) lower than A level.

High concurrency with other modules (30%)

Based on feedback, the feeling was that students new to OU study weren't adequately prepared for what it is to study 30/60 credits.

The Solution

The module website opens 3 weeks before the start. In those 3 weeks students were offered induction sessions on:
- Materials
- Tutorials
- Support
- Drop-in session for ad-hoc queries

Rationale: to set the right expectations in students' minds about what it is to study 30/60 credits.

Targeted at students new to the OU


L192 is a compulsory course in the Certificate in French (C33). L192 is an optional course in:
- BA/BSc (Honours) European Studies (B10)
- BA (Honours) Humanities (B03)
- BA (Honours) Modern Language Studies (B30)
- Certificate of Higher Education in Humanities (C98)


The Outcomes

Withdrawals before module start



Withdrawals as at 03/03/2016

Assignment Submission Rates



Average Assessment Scores

Overall, there were significant correlations between attendance at the induction sessions and subsequent retention and performance.

Number of modules in concurrent study: the group that did not attend the induction session registered on more modules (in terms of both modules and credits) than those who did; it may be a lack of time that caused them not to engage with the induction session.

Proportion of new/continuing students: the group that did not attend the induction session had a significantly higher proportion of continuing students. On the whole, it looks like new students were the ones who benefited most.
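The kind of analysis behind these statements could look like the sketch below, which relates induction attendance to later assessment scores and to concurrent study load. The data file, column names and the particular tests are illustrative assumptions, not the analysis actually reported here.

    import pandas as pd
    from scipy.stats import pointbiserialr, mannwhitneyu

    # Hypothetical per-student export: attended_induction (0/1),
    # avg_score and concurrent_credits columns.
    students = pd.read_csv("l192_students.csv")

    # Correlation between induction attendance and average assessment score.
    r, p = pointbiserialr(students["attended_induction"], students["avg_score"])
    print(f"attendance vs average score: r = {r:.2f}, p = {p:.3f}")

    # Did non-attenders carry more concurrent credits than attenders?
    attended = students.loc[students["attended_induction"] == 1, "concurrent_credits"]
    skipped = students.loc[students["attended_induction"] == 0, "concurrent_credits"]
    print(mannwhitneyu(skipped, attended, alternative="greater"))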


Evaluating the use of the A4A Framework: the Technology Acceptance Model (TAM1)

Explains why a user accepts or rejects a technology.

Perceived usefulness and perceived ease of use influence intentions to use and actual behaviour.

Identify what factors explain future intentions to use the innovation and actual usage behaviour.

The Technology Acceptance Model, version 1 (Davis, Bagozzi & Warshaw, 1989).


The technology acceptance model (TAM) is an information systems theory that models how users come to accept and use a technology. The model suggests that when users are presented with a new technology, a number of factors influence their decision about how and when they will use it, notably:
- Perceived usefulness (PU): defined by Fred Davis as "the degree to which a person believes that using a particular system would enhance his or her job performance".
- Perceived ease of use (PEOU): Davis defined this as "the degree to which a person believes that using a particular system would be free from effort" (Davis, 1989).
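As a minimal sketch of how TAM1 constructs are typically operationalised (not the evaluation reported in these slides), Likert items can be averaged into PU and PEOU scales and intention to use regressed on them. The survey file and item names below are assumptions.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical 5-point Likert responses: items pu1-pu3, peou1-peou3,
    # plus an overall intention-to-use rating.
    responses = pd.read_csv("tam_survey.csv")

    # Average the items into the two TAM1 constructs.
    responses["PU"] = responses[["pu1", "pu2", "pu3"]].mean(axis=1)
    responses["PEOU"] = responses[["peou1", "peou2", "peou3"]].mean(axis=1)

    # TAM1 posits that PU and PEOU shape intention to use.
    model = smf.ols("intention ~ PU + PEOU", data=responses).fit()
    print(model.summary())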


Feedback from Data Source Briefing Workshops (based on the Technology Acceptance Model, TAM1)

Perceived usefulness (PU):
- Using the data tools will improve the delivery of the module.
- Using the data tools will increase my productivity.
- Using the data tools will enhance the effectiveness of the teaching on the module.

Perceived ease of use (PEOU):
- Learning to operate the data tools is easy for me.
- I find it easy to get the data tools to do what I want them to do.
- I find the data tools easy to use.

Perceived training requirement:
- I expect most staff will need formal training on the data tools.

Satisfaction with the workshop:
- The instructors were enthusiastic in the data briefing.
- The instructors provided clear instructions on what to do.
- Overall, I am satisfied with the workshop.


Feedback from Data Support Meetings (based on the Technology Acceptance Model, TAM1)

Perceived usefulness (PU):
- Using the data tools from the support meeting will enhance the effectiveness of the teaching on the module.
- Using the data tools from the support meeting will improve the delivery of my module.
- Using the data tools from the support meeting will increase my productivity.

Perceived ease of use (PEOU):
- I find it easy to get the data tools used in the support meetings to do what I want them to do.
- I find the tools used in the support meeting easy to use.
- Learning to operate the data tools used in the support meeting is easy for me.

Perceived training requirement:
- Based upon my experience with the data tools used in the support meeting, I expect that most staff will need formal training to use these tools.

Satisfaction with the support meeting:
- The facilitators helped me identify an issue, or an action, that could be taken on my module.
- The facilitators provided a clear interpretation of my module's data.
- The facilitators were enthusiastic in the support meeting.
- Overall, I am satisfied with the support meeting.


Are there any questions?

For further details please contact:
Avinash Boroowa: [email protected] | @nashman11178
Bart Rienties: [email protected] | @DrBartRienties
Dr Christothea Herodotou: [email protected]


