Using Learning Analytics to Assess Innovation & Improve Student Achievement
TRANSCRIPT
![Page 1: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/1.jpg)
Using Learning Analytics to Assess Innovation & Improve Student Achievement
John Whitmer, [email protected] | @johncwhitmer
UK Learning Analytics Network Event (JISC) March 5, 2015
http://bit.ly/jwhitmer-jisc
![Page 2: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/2.jpg)
Quick bio
15 years managing academic technology at public higher ed institutions (R1, 4-year, CC’s)
• Always multi-campus projects, innovative uses of academic technologies
• Most recently: California State University, Chancellor’s Office, Academic Technology Services
Doctorate in Education from UC Davis (2013) with Learning Analytics study on Hybrid, Large Enrollment course
Active academic research practice (San Diego State Learning Analytics, MOOC Research Initiative, Udacity SJSU Study…)
![Page 3: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/3.jpg)
Meta-questions driving my research
1. How can we provide students with immediate, real-time feedback? (esp. identifying students at risk of failing a course)
2. How can we design effective interventions for these students?
3. How can we assess innovations (or status quo deployments) of academic technologies?
4. Do these findings apply equally to students ‘at promise’ due to their background (e.g. race, class, family education, geography)?
![Page 4: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/4.jpg)
Outline
1. Defining & Positioning Learning Analytics
2. A Few Empirical Research Findings
   • Understanding Contradictory Outcomes in a Redesigned Hybrid Course (Chico State)
   • Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU)
3. How we’re Applying this Research @ Blackboard
4. Discussion
4
![Page 5: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/5.jpg)
Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist.
5
![Page 6: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/6.jpg)
200MB of data emissions annually
Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist.
6
![Page 7: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/7.jpg)
Logged into course within 24 hours
Interacts frequently in discussion boards
Failed first exam
Hasn’t taken college-level math
No declared major
7
![Page 8: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/8.jpg)
What is learning analytics?
Learning and Knowledge Analytics Conference, 2011
“...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
![Page 9: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/9.jpg)
Strong interest by faculty & students
From Eden Dahlstrom, D. Christopher Brooks, and Jacqueline Bichsel. The Current Ecosystem of Learning Management Systems in Higher Education: Student, Faculty, and IT Perspectives. Research report. Louisville, CO: ECAR, September 2014. Available from http://www.educause.edu/ecar.
![Page 10: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/10.jpg)
Source: Educause and AIR, 2012 (2012), http://goo.gl/337mA
![Page 11: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/11.jpg)
2. A Few Empirical Research Findings
![Page 12: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/12.jpg)
Study 1: Understanding Contradictory Outcomes in a Redesigned Hybrid Course (Chico State)
![Page 13: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/13.jpg)
Course redesigned for hybrid delivery in year-long program
Enrollment: 373 students (54% increase in the largest section)
Highest LMS usage on the entire campus, Fall 2010 (>250k hits)
Bimodal outcomes:
• 10% increase in SLO mastery
• 7% & 11% increase in DWF rates
Why? Can’t tell with aggregated reporting data
Study Overview
54 F’s
![Page 14: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/14.jpg)
Grades Significantly Related to Access
Course: “Introduction to Religious Studies” CSU Chico, Fall 2013 (n=373)
| Variable | % Variance |
|---|---|
| Total hits | 23% |
| Assessment activity hits | 22% |
| Content activity hits | 17% |
| Engagement activity hits | 16% |
| Administrative activity hits | 12% |
| Mean (all significant variables) | 18% |
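The “% variance” figures on this slide are r² values from univariate fits of final grade on each activity count. A minimal sketch of that computation, using made-up hit counts and grades (illustrative only, not the Chico State dataset):

```python
# Hypothetical data: total LMS hits and final course points for 8 students
# (illustrative numbers only -- not the actual study data).
hits = [120, 340, 90, 410, 260, 50, 300, 180]
grades = [72, 88, 65, 93, 80, 58, 85, 74]

def variance_explained(x, y):
    """r^2 of a univariate linear fit = squared Pearson correlation,
    i.e. the '% variance' figures reported on the slide."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return (cov / (sx * sy)) ** 2

print(f"variance explained: {variance_explained(hits, grades):.0%}")
```

Running this per predictor (total hits, assessment hits, etc.) yields a table like the one above.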
![Page 15: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/15.jpg)
LMS Activity a Better Predictor than Demographic/Educational Variables
| Variable | % Variance |
|---|---|
| HS GPA | 9% |
| URM and Pell-Eligibility interaction | 7% |
| Under-Represented Minority | 4% |
| Enrollment status | 3% |
| URM and Gender interaction | 2% |
| Pell eligible | 2% |
| First in family to attend college | 1% |
| Mean (all significant variables) | 4% |

Not statistically significant: Gender, Major-College
![Page 16: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/16.jpg)
At-risk students: “Over-working gap”
![Page 17: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/17.jpg)
Activities by Pell and grade
[Stacked bar chart: LMS hits (0K–35K) by activity type (Admin, Assess, Engage, Content), grouped by grade (A, B+, C, C-) and Pell eligibility]
![Page 18: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/18.jpg)
Activities by Pell and grade
[Same chart, annotated: Pell-eligible students show extra effort in content-related activities]
![Page 19: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/19.jpg)
Study 2: Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU)
![Page 20: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/20.jpg)
20
Study Overview
• President-level initiative
• Goal: identify effective interventions driven by Learning Analytics “triggers”
• Multiple “triggers” (e.g., LMS access, Grade, Online Homework/Quiz, Clicker use)
• At scale & over time: conducted for 3 terms, 5 unique courses, 3,529 students
• “Gold standard” experimental design (control / treatment)
![Page 21: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/21.jpg)
21
Focus on High Need Courses
| Course | Non-Repeatable Grades | Repeatable Grades |
|---|---|---|
| ANTH 101 | 84% | 16% |
| COMPE 270 | 62% | 38% |
| ECON 101 | 69% | 31% |
| PSY 101 | 78.5% | 21.5% |
| STAT 119 | 70.8% | 29.2% |
![Page 22: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/22.jpg)
1. Identify courses and recruit instructors
2. Prior to course start, review syllabus and schedule meaningful “triggers” for each course (e.g. attendance, graded items, Blackboard use, etc.)
3. Run reports in Blackboard and Online Homework/Quiz software to identify students with low activity or performance (~weekly)
4. Send “flagged” students in the experimental group a notification/intervention
5. Aggregate data, add demographic data, and analyze
Study Protocol
22
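Step 3 of the protocol, the weekly report that flags low-activity or low-performing students, can be sketched as a simple threshold check. Everything here (trigger names, thresholds, data layout) is a hypothetical illustration, not the actual SDSU reporting pipeline:

```python
# Illustrative sketch of the weekly "trigger" report (step 3).
# Trigger names and thresholds are hypothetical.
WEEKLY_TRIGGERS = {
    "lms_logins": 2,      # flag if fewer than 2 LMS logins this week
    "quiz_score": 0.60,   # flag if below 60% on the weekly online quiz
}

students = [
    {"id": "s1", "lms_logins": 5, "quiz_score": 0.80},
    {"id": "s2", "lms_logins": 1, "quiz_score": 0.75},
    {"id": "s3", "lms_logins": 4, "quiz_score": 0.40},
]

def flag_students(students, triggers):
    """Return (student_id, trigger_name) for every trigger a student trips."""
    flags = []
    for s in students:
        for name, threshold in triggers.items():
            if s[name] < threshold:
                flags.append((s["id"], name))
    return flags

for student_id, trigger in flag_students(students, WEEKLY_TRIGGERS):
    # Step 4 of the protocol: notify flagged students in the experimental group.
    print(f"{student_id}: flagged on {trigger} -> send intervention")
```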
![Page 23: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/23.jpg)
Key Questions
1. Are triggers accurate predictors of course grade?
2. Do interventions (based on triggers) improve student grades?
3. Do these relationships vary based on student background characteristics?
![Page 24: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/24.jpg)
Frequency of interventions (Spring 2014)
# Students Receiving >0 Interventions: PSY: 177 (84%) STAT: 165 (70%)
[Bar chart: % of students (0–35%) vs. number of interventions received (0 to >10), PSY vs. STAT]
![Page 25: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/25.jpg)
Frequency of interventions (Spring 2015)
[Bar chart: % of students (0–45%) vs. triggers activated per student (0–14), Spring 2015, by section: Anthro1, Anthro3, Comp Engr, Econ4, Psych1, Psych2, Stats3, Stats4]
25
![Page 26: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/26.jpg)
Interventions
Spring 2014
Fall 2014
Spring 2015
26
![Page 27: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/27.jpg)
27
A Typical Intervention: “Concerned Friend” tone
… data that I've gathered over the years via clickers indicates that students who attend every face-to-face class meeting reduce their chances of getting a D or an F in the class from almost 30% down to approximately 8%.
So, please take my friendly advice and attend class and participate in our classroom activities via your clicker. You'll be happy you did! Let me know if you have any questions.
Good luck,Dr. Laumakis
![Page 28: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/28.jpg)
Poll question
Did triggers predict achievement? At what significance level? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at the .05 level
C. 20%, significant at the .01 level
D. 30%, significant at the .001 level
E. 50%+, significant at the .0001 level
28
![Page 29: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/29.jpg)
Poll question
Did triggers predict achievement? At what significance level? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at the .05 level
C. 20%, significant at the .01 level
D. 30%, significant at the .001 level
**E. 50%+, significant at the .0001 level (Spring 2014, Fall 2014)**
29
![Page 30: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/30.jpg)
Learning analytics triggers vs. final course points
Spring 2014: 4 sections, 2 courses, 882 students
Two panels (Statistics, Psychology): p<0.0001; r²=0.4828 and p<0.0001; r²=0.6558
![Page 31: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/31.jpg)
Fall 2014 results: Almost identical
5 Sections, 3 Courses, N=1,220 students
p<0.00001; r2=0.4836
![Page 32: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/32.jpg)
Spring 2015 Results (tentative): lower relationship
8 Sections, 5 Courses, N=1,390 students
p<0.00001; r2=0.28
32
![Page 33: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/33.jpg)
33
![Page 34: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/34.jpg)
Explained by differences between courses (Spring 2015 Results by Course)
34
![Page 35: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/35.jpg)
So did the interventions make a difference in learning outcomes?
![Page 36: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/36.jpg)
Experimental Participation vs. Repeatable Grade (Spring 2014)

| Group | Passing Grade | Repeatable Grade |
|---|---|---|
| Control (STAT) | 79% | 21% |
| Exp. (STAT) | 79% | 21% |
| Exp. (PSY) | 89% | 11% |
| Control (PSY) | 82% | 18% |
36
![Page 37: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/37.jpg)
Experimental Participation vs. Repeatable Grade (Pell-Eligible) (n=168, Spring 2014, PSY 101)

| Group | Passing Grade | Repeatable Grade |
|---|---|---|
| No Interventions (n=87) | 77% | 23% |
| Interventions (n=81) | 91% | 9% |
24 additional Pell-eligible students would have passed the class if the intervention had been applied to all participating students.
37
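The “24 additional students” estimate follows from applying the treatment-group pass rate to the whole Pell-eligible cohort instead of the untreated rate. A quick arithmetic check using the percentages on this slide:

```python
# Quick check of the "24 additional students" estimate on this slide.
n_total = 168          # Pell-eligible students in PSY 101, Spring 2014
pass_no_interv = 0.77  # pass rate without interventions
pass_interv = 0.91     # pass rate with interventions

# Extra passes if the higher (intervention) pass rate applied to everyone.
extra_passes = round((pass_interv - pass_no_interv) * n_total)
print(extra_passes)  # prints 24
```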
![Page 38: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/38.jpg)
Fall 2014 / Spring 2015 Intervention Results:
No Significant Difference Between Experimental/Control Groups.
38
![Page 39: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/39.jpg)
One Explanation: Low Reach
39
Fall 2014 (n = 1,220)
| Course | # Triggers | Message Open Rate | Clickthrough Rate |
|---|---|---|---|
| Econ1 | 8 | 76% | 36% |
| Psych1 | 6 | 70% | 29% |
| Psych2 | 7 | 69% | 35% |
| Stat3 | 9 | 62% | 25% |
| Stat4 | 8 | 65% | 27% |
| Grand Total | 38 | 68% | 30% |
Spring 2015 (n = 1,138)
| Course | # Triggers | Message Open Rate | Clickthrough Rate |
|---|---|---|---|
| Anthro-In Person | 17 | 57% | 10% |
| Anthro-Online | 7 | 71% | 35% |
| Comp Engineering | 15 | 52% | 14% |
| Econ | 15 | 44% | 13% |
| Psych1 | 17 | 60% | 13% |
| Psych2 | 17 | 63% | 13% |
| Stat3 | 21 | 64% | 9% |
| Stat4 | 20 | 55% | 5% |
| Grand Total | 129 | 58% | 12% |
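“Reach” compounds: a student acts on a message only if it is both opened and clicked through. Assuming (an assumption, since the slide doesn't say) that the clickthrough rate is measured among openers, effective reach is roughly the product of the two rates, illustrated with the grand-total figures above:

```python
# Effective reach = open rate x clickthrough rate, under the assumption
# that clickthrough is measured among students who opened the message.
terms = {
    "Fall 2014":   {"open": 0.68, "click": 0.30},
    "Spring 2015": {"open": 0.58, "click": 0.12},
}

for term, rates in terms.items():
    reach = rates["open"] * rates["click"]
    print(f"{term}: ~{reach:.0%} of flagged students reached")
```

By this rough estimate, reach fell from about a fifth of flagged students in Fall 2014 to well under a tenth in Spring 2015, consistent with the “low reach” explanation.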
![Page 40: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/40.jpg)
Proposed Next Steps
Add interventions that move “beyond informing” students to address underlying study skills and behaviors
Supplemental Instruction <http://www.umkc.edu/asm/si/>
Adaptive Release within online courses (content, activities)
40
![Page 41: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/41.jpg)
1. Data from academic technology use predicts student achievement; diverse sources provide better predictions.
2. Tech use > demographic data to predict course success; adding demographic data provides nuanced understandings and identifies trends not otherwise visible.
3. Academic technology use is not a “cause” in itself, but reveals underlying study habits and behaviors (e.g. effort, time on task, massed vs. distributed activity).
4. Predictions are necessary, but not sufficient, to change academic outcomes. Research into interventions is promising.
5. We’re at an early stage in Learning Analytics; expect quantum leaps in the near future.
41
Conclusions and Implications
![Page 42: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/42.jpg)
3. How we’re Applying this Research @ Blackboard
42
![Page 43: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/43.jpg)
Blackboard’s “Platform Analytics” Project
A new effort to enhance our analytics offerings across our academic technology applications that includes:
• Improved instrumentation for learning activity within applications
• Applied findings from analysis (incl. inferential statistics and data mining)
• Integrated analytics into user experiences (incl. student and faculty)
• Aggregated usage data across cloud applications (anonymized, rolled-up, privacy-compliant)
![Page 44: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/44.jpg)
![Page 45: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/45.jpg)
![Page 46: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/46.jpg)
Blackboard Analytics for Learn
![Page 47: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/47.jpg)
4. Wrap-Up and Discussion
![Page 48: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/48.jpg)
Factors affecting growth of learning analytics
[2×2 map; axes: Enabler ↔ Constraint, Widespread ↔ Rare]
• New education models
• Resources ($$$, talent)
• Data governance (privacy, security, ownership)
• Clear goals and linked actions
• Data valued in academic decisions
• Tools/systems for data co-mingling and analysis
• Academic technology adoption
• Low data quality (fidelity with meaningful learning)
• Difficulty of data preparation
• “Not invented here” syndrome
![Page 49: Using Learning Analytics to Assess Innovation & Improve Student Achievement](https://reader030.vdocument.in/reader030/viewer/2022032502/55ba553cbb61ebb5538b46aa/html5/thumbnails/49.jpg)
Call to action [with amendments] (from a May 2012 keynote presentation @ San Diego State U)
You’re not behind the curve; this is a rapidly emerging area that we can (and should) lead... [together with interested partners]
Metrics reporting is the foundation for analytics [don’t under- or over-estimate its importance]
Start with what you have! Don’t wait for student characteristics and detailed database information; LMS data can provide significant insights
If there are any ed tech software folks in the audience, please help us with better reporting! [we’re working on it and feel your pain!]