
Crowd-Sourcing Innovative Practices: Assessing Integrative Learning at Large Research Institutions

Mo Noonan Bischof, Assistant Vice Provost, [email protected]

Amy Goodburn, Associate Vice Chancellor, [email protected]

Nancy Mitchell, Director, Undergraduate Education, [email protected]

LEAP Integrative Learning

Synthesis and advanced accomplishment across general and specialized studies, demonstrated through the application of knowledge, skills, and responsibilities to new settings and complex problems.

Challenge: Assessing Integrative Learning

• Can/Should the same assessment tools be used for assessing within a course, a unit, and/or institution?

• Does it apply to integrating knowledge and skills within a discipline, among disciplines, or both?

• How can we align quality improvement across levels while respecting disciplinary purposes and values?

UW-Madison Learning Community

42,820 students: 29,118 undergraduates, 9,183 graduate students, 2,774 professional students, 1,745 non-degree students

21,615 employees, including 2,177 faculty, 1,635 instructional academic staff, 1,261 research academic staff, 5,291 graduate assistants

Annually: 7,400 new undergraduates, 29,500 enrolled undergraduates, 6,500 Bachelor's degree graduates

[Chart: Annual degrees awarded, by band: 1-49, 50-99, 100-199, 200-299, more than 300]

13 academic schools/colleges, with distributed responsibility and governance

~500 academic programs at all levels; 134 Bachelor's-level degree programs

[Diagram: Multiple program-level learning goals and assessments linked to institutional-level learning goals and assessments, which include WI-X and ELOs]

Why pilot the AAC&U VALUE Rubrics?

• Identified gap: institutional-level assessment using a direct-measure approach
• Evaluates student learning across programs
• Aligns with AAC&U Essential Learning Outcomes
• Aligns with the VSA/College Portrait demonstration project
• First pilot project in summer 2012; second pilot in 2013
• Main goal: bring faculty across disciplines together to evaluate student work

AAC&U VALUE Rubric Project

Components: Scorers, Artifacts, Rubrics

• Cohort of 25 faculty
• Cross-disciplinary representation
• Focus on faculty engagement
• AAC&U VALUE written communication rubric
• "Value-added" approach to compare first-year students and students near graduation

Written Communication VALUE Rubric

Selected written communication for ease of identifying artifacts across disciplines/programs

Dimensions:
• Context and Purpose for Writing
• Content Development
• Genre and Disciplinary Conventions
• Sources and Evidence
• Control of Syntax and Mechanics

Artifacts: “Value-added” Approach

• Goal: collect 350 artifacts at each level, first-year (FYR) and nearly graduating (NGR)

• Identified 52 courses with high numbers of FYR and NGR students that seemed likely to have a suitable writing assignment

• 22 courses (41 instructors) had a suitable assignment and agreed to participate

• Invited 2,450 students to submit artifacts
• Collected 451 submissions

Scorers: Faculty Engagement

• 1.5-day workshop in June 2013

• Set ground rules

• 3 structured rounds intended to get faculty familiar with the rubric and to “test” scorer agreement

• Asked faculty to think beyond their field/discipline

• Each scorer rated about 40 artifacts

• Discussion revealed challenges with the 4-point scale and with defining "mastery"


Table 1. Overall Results for All Artifact Scores

Rubric Dimension   Student Group        # of Artifacts   Mean   Std Dev   Zmw Score
Context            Nearly Graduating    213              2.95   0.95      3.05*
Context            First Year           237              2.77
Content            Nearly Graduating    213              2.79   0.96      4.68*
Content            First Year           237              2.48
Genre              Nearly Graduating    211              2.69   0.88      2.65*
Genre              First Year           235              2.50
Sources            Nearly Graduating    190              2.61   0.99      1.54
Sources            First Year           225              2.50
Syntax             Nearly Graduating    213              2.82   0.84      2.16*
Syntax             First Year           237              2.69

*Zmw score is from the Mann-Whitney U test. Zmw scores >1.96 indicate that the two groups differ significantly at p = 0.05.
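For readers unfamiliar with the statistic behind the asterisks, the following is a minimal Python sketch (using SciPy) of a Mann-Whitney U comparison between two groups of ordinal rubric scores. The score arrays are invented for illustration; they are not the project's data, and the project's exact procedure for computing Zmw is not described in the slides.

```python
# Illustrative only: a Mann-Whitney U comparison of the kind summarized by the
# Zmw scores in Table 1. The arrays below are made-up example scores (1-4 scale),
# not the UW-Madison artifact data.
import numpy as np
from scipy.stats import mannwhitneyu, norm

nearly_graduating = np.array([3, 4, 2, 3, 3, 4, 2, 3, 4, 3])
first_year = np.array([2, 3, 2, 3, 2, 1, 3, 2, 3, 2])

# Two-sided Mann-Whitney U test; appropriate for ordinal rubric scores
u_stat, p_value = mannwhitneyu(nearly_graduating, first_year,
                               alternative="two-sided")

# Convert the two-sided p-value to a z-equivalent, comparable to the
# "Zmw > 1.96 implies p < 0.05" rule quoted in the table footnote
z_equivalent = norm.isf(p_value / 2)

print(f"U = {u_stat:.1f}, p = {p_value:.3f}, z = {z_equivalent:.2f}")
```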

Distribution of Combined Scores - Written Communication Rubric (percent of scores)

Score   First-Year Students   Nearly Graduating Students
1       3.8                   2.6
2       27.3                  22.6
3       51.0                  44.4
4       17.5                  30.3

(For nearly graduating students, scores of 3 and 4 together account for 44.4% + 30.3% = 74.7%, the proficient-or-better figure cited in the Summary Findings below.)

Summary Findings

• The percent of nearly graduating students judged proficient or better (a score of 3 or 4 on the 4-point scale) on each dimension was fairly high, ranging from 64% to 83%; across all dimensions it was 74.7%.

• Levels of significant difference between first-year and nearly graduating students were weak.

• Inter-scorer reliability was problematic (the "mastery" issue): overall, 67% of scorer pairs showed either weak agreement or systematic disagreement.
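The slides do not say which agreement statistic was used for scorer pairs. As one plausible way to quantify pairwise agreement on a 4-point ordinal scale, the sketch below uses quadratic-weighted Cohen's kappa from scikit-learn; the scores are invented for illustration.

```python
# Illustrative only: one common way to check agreement between a pair of
# scorers on a 1-4 rubric scale. The project reports weak agreement or
# systematic disagreement for 67% of scorer pairs but does not name its
# statistic; quadratic-weighted kappa is a typical choice for ordinal scores.
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores from two faculty scorers on the same ten artifacts
scorer_a = [3, 2, 4, 3, 2, 3, 1, 4, 3, 2]
scorer_b = [3, 3, 3, 2, 2, 4, 2, 4, 3, 3]

kappa = cohen_kappa_score(scorer_a, scorer_b, weights="quadratic")
print(f"Quadratic-weighted kappa: {kappa:.2f}")  # higher values = stronger agreement
```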

What did we learn?

• Importance of assignment (artifact) development
• Adapt the rubric to the program mix and/or campus culture (language, learning outcomes)
• Engagement of faculty produced high-quality discussions (ground rules/calibration)
• Next steps: continue to engage faculty at the program and disciplinary levels

Contact Information
Mo Noonan Bischof, Assistant Vice Provost, University of Wisconsin-Madison, [email protected]
More about our project: http://apir.wisc.edu/valuerubricproject.htm

University of Nebraska-Lincoln
Research One, Big Ten Conference, Land-Grant

24,000 students; 8 independent colleges

Achievement-Centered Education (ACE)

• 10 Student Learning Outcomes (30 credits)
• 600 courses across 67 departments
• Transferable across 8 colleges
• Requires assessment of collected student work

UNL Assessment Context
• Review of each ACE course on a 5-year cycle
• Biennial review of all undergraduate degree programs
• 50 disciplinary program accreditations
• 10-year North Central/HLC accreditation

ACE 10: Generate a creative or scholarly product that requires broad knowledge, appropriate technical proficiency, information collection, synthesis, interpretation, presentation, and reflection.

HLC Quality Initiative: ACE 10 Project

25 faculty across colleges meet monthly to:
• Explore methods and tools for assessing work
• Develop a community to share ideas
• Connect ACE 10 and degree program assessment
• Develop a process for creating assessment reports
• Create a team of assessment "ambassadors"

Discussing Assessment Practices

A Common Rubric: disciplinary vs. institutional goals

Inquiry Project Results
• Abandoned the idea of piloting a common rubric
• Revised the syllabus to focus on processes, not tools
• Developed a poster session for public sharing
• Streamlined ACE and program review processes
• Creating a process for 5-year ACE program review

Group Discussion

• How do you address differences across disciplinary norms and cultures?

• How can program/disciplinary assessments inform institutional assessment, and vice versa?

• What strategies can you use to develop shared goals and understanding?

• What are some effective practices for supporting and sustaining faculty and staff engagement?