
A WINDOW ON ASSESSMENT
CALIFORNIA STATE UNIVERSITY, SACRAMENTO | OFFICE OF ACADEMIC AFFAIRS

Top Five Reasons Faculty Should Learn to Love Program Assessment
By Judi Kusnick, Ph.D., Faculty Consultant, Office of Academic Program Assessment; Professor of Geology

At the Office of Academic Program Assessment (OAPA), we get a unique view of assessment efforts around the campus and the impact that successful program assessment can have on a department or program. But we also believe that individual faculty members should embrace program assessment for lots of reasons. So here, in the grand tradition of Top Five lists, is the countdown of why faculty members should learn to love program assessment:

5. Losing that sinking feeling. I’m sure other instructors have experienced that unpleasant sensation in the pit of your stomach when you are grading an exam or an assignment and realize that the students just didn’t get it. For me, that moment came several years ago when, after we had spent two weeks deciphering the geologic history of the Appalachian Mountains, almost none of the students were able to tell me that history on their final exam.

As a program, my department has had that collective sinking feeling when reviewing some of our assessment data. For example, we discovered a few years ago that very few of our graduating seniors were able to draw accurate geologic cross-sections, a fundamental skill of a working geologist. What on earth had happened in our instructional sequence that a majority of our students were not operating at the level we assumed they were?

We ultimately uncovered the cause of both our cross-section problem and my Appalachian problem: the students did not have enough opportunities to practice using their geologic problem-solving skills. I had assumed that students had a working knowledge of rocks that in fact they did not have; our field instructors assumed that students were experienced in drawing cross-sections, when in fact we had provided very few opportunities across the curriculum for students to practice this skill and receive feedback on their work. Once the students were given more practice in the needed skills, both the Appalachian problem and the cross-section problem vanished, taking with them that sinking feeling of failure.

Fall 2015 | Issue 16

Office of Academic Program Assessment (OAPA)
Dr. Amy Liu, Director
Location: South-End Library 67
Phone: 916-278-2497 | Email: [email protected]
Hours: Monday - Friday
Contact us anytime for updates or support!
Visit us at: csus.edu/programassessment

DR. JUDI KUSNICK

FACULTY AND OAPA CONSULTANTS VIEW A PRESENTATION ABOUT THE PARADIGMATIC SHIFT FROM TEACHING TO LEARNING AND FROM INDIRECT TO DIRECT ASSESSMENT.

4. Having more fun teaching. We all know curmudgeonly faculty who rant on about the failings of students today. Perhaps a few folks just like to complain, but I think most of us really want to teach students who are prepared and ready to learn. Well-prepared students are simply more fun to teach than poorly prepared students. Students who have mastered the content of their prerequisite classes can tackle more interesting problems and have more interesting discussions in class. Students who have mastered the fundamentals of writing can bring those skills to bear on more engaging tasks.

Program assessment is one of the best tools to improve the ability of students to tackle more challenging work. By measuring student learning, finding the holes in their knowledge and skills, and working backwards through curriculum maps to improve curriculum and instruction to plug those holes, we create better-prepared students. It’s just way more fun to tell my husband stories of the amazing things the students did today than to whine about how the students can’t write (or recognize rocks, or whatever other hole in their knowledge popped up today).

3. Being part of a well-functioning team. Too many faculty members do most of their jobs in solitude. We rarely get to collaborate with our colleagues around teaching and learning. Instead, our notion of what it means to work with our colleagues is often just our committee work, which (honestly) can be less than rewarding.

My closest friends and colleagues on campus are the people I do meaningful work with. It is tremendously satisfying to work with other like-minded folks on a common vision, especially when you can experience success in those efforts. For me, most of this work occurs outside my department – in my research life with my research group, in my work with the Center for Teaching and Learning, and here at the Office of Academic Program Assessment. But program assessment has provided opportunities for that same kind of satisfying collaboration within my department. In setting goals for students, deciding on measurement tools and analyzing data, I get to engage in an intellectual consideration of teaching and learning with folks in the neighboring offices. We are an unusually close-knit and collegial department, and working on improving our program helps to strengthen those bonds. Being part of a team, not just a department, makes my job so much more satisfying.

2. Controlling your own destiny. Student success in higher education is an issue that is here to stay. It’s not unreasonable for the folks who fund us, the legislature and the people they represent, to ask what they are getting for their money. The same situation faced the K-12 world over a decade ago, and the result was a high-stakes accountability system of testing students that was imposed from the outside. Wouldn’t we rather retain control of how we measure student success and how we take steps to improve success rates? Program assessment offers us exactly that mechanism. Programs can decide what they value in student learning, how to measure it, and what steps to take to improve learning.

On an individual level, I have never understood faculty members who complain about student performance but also resist program assessment. The way I look at it, you can be a victim of how things are, or you can take control and work on making things how you want them to be. Program assessment gives you a tool for controlling your teaching world.

1. And most importantly: students matter. Program assessment is ultimately about students: about improving student learning, about giving students the best educational experience that we can. We are all here because of the students. Cynical observers sometimes say K-12 schools are institutions for children run for the benefit of adults. I truly hope that we all see the university as existing for the benefit of the students, not for the faculty and staff. I don’t mean that the faculty should not be satisfied in their jobs, or that faculty needs are not important. But the students should always be the center of the institution.

Program assessment serves the students by putting their learning at the middle of our efforts, and serves the institution by helping to make us the best we can be at serving our student population: not some mythical student body we might imagine, but the students in front of us. We are privileged to teach a pretty amazing population of students with an incredibly diverse set of backgrounds, life histories, and abilities. Program assessment helps us tune our curriculum and instruction to best align with our students’ needs, and give them the education that they deserve.



Program Assessment FLC: A Great Tool for Faculty Teaching & Research Development
By Amy Liu, Professor of Sociology & Director, OAPA; Lynn Tashiro, Professor of Physics & Director, Center for Teaching & Learning; Chia-Jung Chung, Professor of Education & Faculty Consultant, OAPA

The Program Assessment Faculty Learning Community (FLC) on Critical Thinking and Global Perspectives enabled faculty to not only acquire assessment knowledge, but also practice programmatic assessment skills in a supportive environment. This FLC collaboratively engaged faculty from Nursing, Library, Criminal Justice, Foreign Languages, and Education over a sustained period of time (Feb. 2015 to Dec. 2015) to develop and refine skills in assessment and research. These skills enable faculty to make evidence-based changes in courses and programs, and to engage in the scholarship of teaching. The FLC began with creation and consensus building around two measurable program outcomes: Critical Thinking and Global Perspectives. The goal of this FLC was to produce exemplars of the specific tools (curriculum maps, signature assignments, rubrics, etc.) and data (student work) needed for meaningful programmatic assessment, and to develop faculty leadership in the disciplines to propagate this practice.

This FLC was a great success. It navigated some administrative boundaries and political territories to create a trusting community of faculty who were positively engaged in the work of assessment. Program assessment projects were presented and well received. Some faculty members have been invited to present their work at state and national conferences. Exit survey evaluations from the FLC participants were also very positive and are summarized below:

“One idea that I want to remember from this activity is:”

• Concrete data presentation is really helpful to have a deeper understanding
• I will need to revise my data analysis after all students’ reflections are graded
• Assessment is difficult, but there are ways to gather data and evaluate student learning and these are getting better
• The importance of beginning assignment design with well-written learning outcomes
• Including student samples in final report

“One way that I can use what I learned from this activity is:”

• I will select “best” representative data from the collected data
• Introduce more curricular changes on cultural competence. It is lacking in the nursing curriculum.
• Recruitment of new faculty to CTL activities and FLCs
• To implement assessment as a tool for curricular improvement
• To explore the results of assessment using different rubrics
• Using assessment to improve student learning

“One thing I would like to learn more about is:”

• How to analyze the data
• How to select the best sample (benchmark sample)
• The use of FLC in publication
• What is happening on a larger scale in other departments than those represented here
• Methods for assessment of quantitative data
• More effective ways to conduct assessment

Comments:

• Thank you for all the encouragement
• Thank you for organizing this FLC. It was very informative
• Very interesting FLC
• I learned so much!
• Thank you, Amy

In summary, the program assessment faculty learning community was and is a great tool for faculty teaching and research development.

DR. AMY LIU

DR. LYNN TASHIRO

OAPA DIRECTOR, DR. AMY LIU (STANDING), GIVES A PRESENTATION TO FACULTY REGARDING THE DEVELOPMENT AND UPDATE OF SIMPLE AND CLEAR ASSESSMENT PLANS.


SharePoint Technology and Program Assessment
By Chia-Jung Chung, Ph.D., Faculty Consultant, Office of Academic Program Assessment; Professor of Education

This is my second year working as a faculty consultant in the Office of Academic Program Assessment (OAPA), where I enjoy all my consultant duties and technology-related projects. Within the office, annual assessment reports are read and discussed by teams of faculty. In my academic position, I have been teaching my students how to apply technology to enrich their instruction and to make their own and their students’ lives easier. Because OAPA collects many documents and a huge volume of data, we have applied technology to make assessment easier for everyone on campus. This long-awaited improvement is finally coming to fruition!

Annual assessment has become an integral part of the program review process, which is a critical component of the University’s WASC accreditation. The most pressing need has been to develop a data management, collection, and storage system that is easy for our faculty to use to store annual assessment data. It is also important for the public and accreditation agencies to be able to understand our assessment results, not just for the year the data are collected, but also for tracking multiyear trends and progress in student learning.

Our journey at OAPA to seek better technological solutions for assessment never stops. We had searched for the last few years but never found a good solution until now. In spring 2015, we first heard of two Microsoft programs, InfoPath and SharePoint, but had no idea what they were or how to use them. We began to experiment with both in late spring and through the summer in order to explore and discover the benefits and utility of this software for assessment of student learning on campus.

Throughout the summer, we made considerable progress toward a redesigned assessment template using InfoPath and SharePoint. With about twenty 2014-2015 assessment reports from the College of Education as a pilot, we tested the capabilities and user interface of the InfoPath template, uploaded reports, filled them out in SharePoint, and exported database results to Microsoft Excel for further testing. Fig. 1 shows the changes in the assessment process from the prior academic year to the present (2015-16).

Using these two already available resources, we have created a high-quality electronic data collection system that allows easy information gathering, storage, and analysis. It will enhance the ability of academic programs and the university to collect and summarize program and university learning outcomes and their assessment efforts, processes, and results. In the long run, this will increase assessment continuity, avoid confusion, and reduce workload for the faculty and staff in the departments and colleges. It will also allow academic and other university units to use historical and cross-sectional data to promote and improve our programs, and help the general public, policy makers, WASC, and other accreditation agencies understand student learning on our campus.
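As a concrete illustration of the kind of multiyear trend analysis such an Excel export enables, the short sketch below loads a few hypothetical rows into Python’s pandas library and counts how many reports assessed each outcome per year. The column names and values here are invented for the example only; they are not the actual OAPA template fields.

```python
import pandas as pd

# Hypothetical rows resembling an Excel export of annual assessment
# reports; column names are illustrative, not the real template fields.
reports = pd.DataFrame(
    {
        "college": ["Education", "Education", "Nursing"],
        "year": ["2014-15", "2015-16", "2015-16"],
        "outcome_assessed": [
            "Critical Thinking",
            "Critical Thinking",
            "Global Perspectives",
        ],
    }
)

# Multiyear trend: number of reports assessing each outcome per year.
trend = (
    reports.groupby(["year", "outcome_assessed"])
    .size()
    .unstack(fill_value=0)
)
print(trend)
```

With a real export, `pd.read_excel` would replace the hand-built DataFrame, and the same grouping would produce the cross-year summary.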

The OAPA SharePoint site was officially launched in October 2015 and a training workshop was also offered twice in the same month. The newly developed SharePoint site has made the assessment process much simpler and clearer!

This is not the end of our journey. We will continue our effort to make sure this technology solution addresses our assessment needs in the future.


DR. CHIA-JUNG CHUNG

AN ASSORTMENT OF SHAREPOINT SOFTWARE BROCHURES, FEEDBACK REPORTS, AND UNIVERSITY AND COLLEGE STATISTICAL ASSESSMENT SUMMARIES.


Activities and Workshops at the Office of Academic Program Assessment (OAPA)

During the year, the Office of Academic Program Assessment consultants meet every week to discuss changes and review assessment reports in an effort to develop more meaningful, simple, and clear support for faculty. At OAPA workshops, faculty are invited to learn how to better assess their students’ learning as well as how to use some of our new assessment software. Here at OAPA, we produce many materials that help faculty to 1) improve the quality of assessment here on campus; 2) learn from our descriptive statistics concerning assessment; 3) use our suggestions for further improvement; and 4) write the annual assessment report. Under the guidance of our assessment consultants Dr. Amy Liu, Dr. Chia-Jung Chung, Dr. Judi Kusnick, and Dr. Elizabeth Strasser, the Office of Academic Program Assessment strives to significantly enhance student learning and program assessment on this campus.

Interim Assistant Vice President of Academic Programs and Educational Effectiveness, Dr. Don Taylor (standing), explains the assessment guidelines to faculty.

“Under the guidance of our assessment consultants, the Office of Academic Program Assessment will significantly enhance student learning and program assessment on this campus.”


OAPA DIRECTOR, DR. AMY LIU, CONSULTANT, DR. CHIA-JUNG CHUNG, AND GRADUATE ASSISTANTS (CHRISTIAN AND PAUL SCHOENMANN) MEET WITH NEW OAPA CONSULTANT, DR. MICHAEL WRIGHT.

A Student’s Journey and Commitment to Assessment at OAPA
By Christian Ian Schoenmann, Sociology Graduate Assistant

It has been only one year since I began my position as a graduate student assistant in the Office of Academic Program Assessment (OAPA), and I have witnessed and been part of great development. Personally, I have learned a great deal not only about academic assessment, but also about the methodology of high-quality needs-assessment techniques, much of which relates closely to my discipline of sociology. Under the guidance of director and professor Dr. Amy Liu, with the combined effort of our assessment consultants, Drs. Elizabeth Strasser, Judi Kusnick, and Chia-Jung Chung, and alongside my brother and fellow assistant Paul Schoenmann, OAPA has refined many aspects of its data collection, management, and analysis; its means of assisting faculty and administration; and, most importantly, its support for the quality of student learning and success here at Sacramento State University. In keeping with the university’s mission to conduct assessment in this digital era and to provide opportunities for professional development, our office has implemented new technology to meet this need and has shifted its focus from reactive feedback to proactive assistance and training.

Over the summer, I worked with OAPA consultant Dr. Chung and director Dr. Liu to help transition our office and its collection and reporting processes into the digital era. As a team, we worked to develop and pilot new software that allows faculty to enter, store, update, and share program assessment components and reports online through Microsoft’s convenient and private browser-based application, SharePoint. This is a much-welcomed and streamlined system, as our office will operate the assessment section of the browser-based application. We will collect information using uploaded and completed assessment templates, and then generate, from what I have witnessed in just this summer, extremely useful and high-quality data that can inform the improvement and strategic planning not only of programs, but of the entire university as a whole. I am honored to say that I have played a pivotal role in this crucial development and remain integral to the maintenance and operation of this project.

As an assistant to both the office and the research it conducts, I have worked extensively on the development of new assessment templates; aided in the creation of reliable assessment questions; collected, analyzed, and reported on qualitative and quantitative answers from faculty measuring student learning, using the Statistical Package for the Social Sciences (SPSS); designed and maintained internet-based assessment and website materials; and helped keep OAPA’s operations running smoothly. Through the dedication of our director, consultants, and assistants, we have evolved as a team into much more than an office for collecting and reporting information; we have progressed in strategy, understanding, and technology in ways that are more proactive and useful to our most important and valued partner, the faculty. This is all the result of our determination to enhance and aid student learning, student success, and our university’s exceptional professors. Additionally, Drs. Liu, Chung, Strasser, and Kusnick draw so many faculty to our workshops and meetings that it is a delight to see the proactive progress right before my eyes.

I am pleased to be a part of this development and of the rich work our office produces. From where I stand, starting my second year at OAPA, I can see only great progress in the quality of student learning and in the assessment of academic programs on this campus. I enjoy the work I do and love to see how my effort in meeting the needs of the office, and my own personal deadlines and goals, has influenced the office and what we produce in positive ways. I look forward to another year in the realm of assessment, education, and improvement, and to the continued leadership and lessons of our director, Dr. Amy Liu, and consultants, Drs. Strasser, Kusnick, and Chung, as well as our new consultant, Dr. Michael Wright.

CHRISTIAN IAN SCHOENMANN


Resources at OAPA

The following articles, books, and websites will help you improve program review and assessment:

1. 2014-2015 Annual Assessment Reports and their Feedback http://www.csus.edu/programassessment/annual-assessment/2014-15Assessment.html

2. 2013-2014 Annual Assessment Reports and their Feedback http://www.csus.edu/programassessment/annual-assessment/2013-14Assessment.html

3. 2012-2013 Annual Assessment Reports and their Feedback http://www.csus.edu/programassessment/annual-assessment/2012-13Assessment.html

4. 2011-2012 Annual Assessment Reports and their Feedback http://www.csus.edu/programassessment/annual-assessment/AssessmentArchive.html 

5. AAC&U’s 16 VALUE Rubrics http://www.aacu.org/value/rubrics/index_p.cfm?CFID=41012296&CFTOKEN=24714954

6. Degree Qualification Profile (DQP) http://www.luminafoundation.org/publications/The_Degree_Qualifications_Profile.pdf

7. Ewell, P. (2013). The Lumina Degree Qualifications Profile (DQP): Implications for Assessment. National Institute for Learning Outcomes Assessment. http://www.learningoutcomeassessment.org/documents/EwellDQPop1.pdf

8. WASC 2013 Handbook of Accreditation http://www.wascsenior.org/annoucements/2013-handbook-accreditation-finalized
