Minutes of the Tenth AMCOA Meeting, May 1st, 2012

Prepared by Kerry McNally

Host Campus: Framingham State University

I. Attendance

The tenth AMCOA meeting was hosted by Framingham State University (FSU) from 10:00 a.m. to 1:00 p.m. on May 1st, 2012. Representatives from 23 institutions attended the meeting (see list in Appendix A), and Peggy Maki, Consultant under the Davis Educational Foundation Grant awarded to the Department of Higher Education, opened and chaired the meeting. Peggy thanked Ellen Zimmerman and FSU for hosting the meeting.

II. Welcome: Vice President for Academic Affairs Dr. Linda Vaden-Goad, Framingham State University (FSU)

Vice President Linda Vaden-Goad welcomed AMCOA Team members to FSU and this month’s meeting, featuring an hour-and-a-half panel discussion on Quantitative Reasoning. She pointed to FSU’s new community garden, which was visible from the meeting location, and compared it with educational learning communities. “We’re ‘assessing’ the garden’s impact on the community. Similarly, educational assessment is important, particularly how we offer learning to the community. It is important to look at the effectiveness of what we do. Sometimes we don’t know the impact of what we do for years. It is essential to be doing outcome assessment as we go along.” Dr. Vaden-Goad thanked the AMCOA Team for coming to FSU and for all it is doing in the area of improving assessment methodologies.

Peggy Maki then welcomed the AMCOA Team to the final meeting of the 2011-2012 Academic Year. She told the Team that she would be sharing responsibilities next year with the co-chairs because she will be reducing her time on the project. “It has been my honor,” she said, “to work with you on this project. It has been a highlight of my professional life.”

III. Introduction of Presenters, “Perspectives on and Experiences with Assessing Quantitative Reasoning”: Peggy Maki

Peggy Maki said that it is important to hear from the people who are involved in Quantitative Reasoning (QR) and learn about their experiences. She then introduced the panelists:

Mark Pawlak, Director, Academic Support Department: Academic Support Programs, University of Massachusetts Boston

Ellen Wentland, Associate Dean of Academic & Institutional Effectiveness, Northern Essex Community College

John Donnellan, Professor of Business Administration, Holyoke Community College

Richard Eells, Professor, Department of Math, Science & Technology, Roxbury Community College

Alex Asare, Professor, Department of Math, Science & Technology, Roxbury Community College

Christopher Cratsley, Director of Assessment, Fitchburg State University

Dawne Spangler, Director of the Center for Teaching, Learning and Assessment, North Shore Community College

A. Mark Pawlak (MP), Director, Academic Support Department: Academic Support Programs at UMass Boston, opened the panel discussion. At the outset he told the group that, because of time constraints, he would distribute his PowerPoint presentation at a later date rather than showing it at the meeting. A copy of Mark’s PowerPoint presentation is attached as Appendix B.

He has worked for 14 years in QR assessment for an entry-level, college-level course. Initially, people were failing College Algebra and Calculus, mostly students from the Liberal Arts majors. He and the University wanted to give them a more authentic experience, so he developed a QR course that incorporated statistics, numeracy, probability, etc., drawn from newspaper articles and authentic problems. The students used technology in a robust fashion. Students also learned why it is valuable to calculate percentages and why a little Algebra can be useful.

Regarding the VALUE rubrics, UMass Boston is not there yet, but it is incorporating some of them. Its QR program currently uses Blackboard, which the Math faculty are accustomed to. Over the years UMB has migrated to a portfolio, consisting of entries from a common final exam and a student questionnaire. Faculty are allowed to add questions to the final exam. UMB does a holistic evaluation of the exam to see if faculty are achieving agreed-upon assessment goals.

AMCOA Members asked the following questions:

What percentage of students is taking the course? We teach 11-12 sections in the summer, and about 500 students take it per semester. At UMB, QR is required, and this course fulfills the requirement. The General Education QR committee oversees this requirement. When new courses are approved, they follow a set of guidelines for approval. The outcomes are understood by everyone teaching the course. Mark Pawlak designs the outcomes and then gives them to faculty for final review.

Do you follow up on the students later? We don't do that formally, but we do get anecdotal information. We have gotten input from faculty in economics and social science statistics courses. We have also learned that passing the placement test is not good enough to go right into Statistics.

Is the final exam the same each year? No. It changes every year.

Do you use Accuplacer for the first-time freshmen? Yes, but we also use in-house placement tests based on the Mathematical Association of America.

B. Ellen Wentland (EW), Associate Dean of Academic & Institutional Effectiveness, spoke on the QR assessment initiative at Northern Essex Community College.

Several years ago, the college set six assessment goals. Last year, the College established an interdisciplinary global awareness and QR learning assessment for writing. We created an authentic-type scenario that was fictional. In writing, it is difficult to assess products when the assignments vary so much. So, we developed a fictional assignment that would be uniform: six countries needed to buy oil. We incorporated political philosophies and relationships to the U.S. and asked students what they would do to get oil from these countries. The solution requires the ability to interpret charts, tables, and graphs, and to add and subtract. These are the constants that you need, plus some basic math. Faculty had to give up 50 minutes of their class time to implement this scenario, which was considered a moderate-level assessment. The faculty agreed, and students gave their permission. There were detailed instructions on the vignette to students, because motivation is lower if the work is not graded. We used students with 49-59 credit hours, and we accepted classes with at least 3 willing students, but hoped to get 5 or 6 students. We collected the products, and raters who taught math, government, and political science gathered for norming sessions. The last page of the instructions to students contains the scoring rubric (developed before we got into VALUE rubrics). We also looked at the VALUE rubric to see if the interpretations were similar.

In general, the faculty who rated were disappointed with students' performance on global awareness, even though the vignette did not demand much in the way of QR. We wondered whether students' performance resulted from their lack of motivation. Nevertheless, although the scenario didn't work, it stimulated discussions on campus about the nature of assignments that prompt students to demonstrate QR. If the VALUE QR rubric were used at the College, we could become more intentional about aligning our assignments with its criteria. In actual courses, faculty would be intentional about their QR assignments.

AMCOA team members asked the following questions or made the following comments:

Tom Curley (TC) said that Berkshire Community College has a variant assignment – an out-of-class writing assignment that includes using graphs and doing real research.

EW: NECC considered that, but questioned “what is more like real life?” On the job, you are asked what you think about a situation, and you must respond while at work. That is real life, so we thought using an in-class project was more authentic.

TC: If you are evaluating depth of knowledge, an out-of-class assignment would be a more comprehensive exam. You could have two options, one in class and one outside of class. You could also make it count for 10% of the class grade to increase motivation.

Elise Martin (EM): Do you use the same methodology (a vignette) outside of the class, or do you use products from the course?

EW: We get courses to define an intensive-level assessment assignment. We don't have QR covered yet. The question is: where is that going to appear? Pharmacology has it down. We are following the students' choices to see how it can fit in other courses.

C. Next to speak was John Donnellan, Professor of Business Administration at Holyoke Community College (HCC).

John is a member of HCC’s Gen Ed Assessment Committee, which is charged with looking at QR as a core competency in various courses. Initially, a focus group was formed, and then the College looked at assessment in student work samples, first by asking students questions in focus groups (28 students in 3 groups). We bribed them to participate by offering them lunch, and it worked. We got 4 pages of results from the focus groups. Predictably, students feared math and QR. Students said that some teachers made QR easier to understand than others. Another common response was that students wanted to know why they should take Calculus if they would never use it. However, they seemed to understand that QR was an important skill to have in daily living. After our focus groups, we thought through our commitment to assess QR in the following ways:

We wanted to create an assignment.

We wanted to establish criteria, but not grade student work.

We assumed that QR is not just in the math department; it could happen across courses.

We looked at a broad range of courses – chemistry, math, statistics, psychology statistics, economics and music – to assess QR. We didn't assess music, because we didn't know how.

Judy Turcotte knew a community college in Brooklyn, NY, that had long experience in assessment using the AAC&U rubric in QR and had normalized and standardized it. At HCC the feeling was: why re-invent the wheel? We had moderate success using the AAC&U/Brooklyn-normalized rubric.

Some of the drawbacks of using their rubric?

o There was no “Does Not Apply” option.

o Standards were too high, but better than anything we could come up with on our own.

We used it in a sample assignment in Economics to try to see how it could be integrated into an assignment. The assignment asked about a society's food print – what people eat and where food comes from. The mileage calculation for getting the food was flawed, because it measured from finished product to people rather than from raw materials to people, but it got students to look at data.

The Outcomes:

o Calculation is where the students did best.

o A score of 2.5 was acceptable, which was closest with calculations.

When students were told how to calculate and come up with answers, the score was higher. It was acceptable with communications. The last page involved frequencies, but we were most interested in the means.

AMCOA members asked the following questions:

Charlotte Mandell (CM), UMass Lowell: Could an assignment be designed to get at the outcomes more effectively? Yes; for example, a vignette like the one Ellen Wentland described.

Peggy Maki: Were you trying to get at the reasoning behind QR, not just rate calculations? Yes, we found that the Chemistry group did not do so well with this. There is an emphasis on rote memory in that discipline.

D. Then Richard Eells (RE), Professor, Department of Math, Science & Technology, Roxbury Community College (RCC), spoke.

Professor Eells said that he is not an assessment professional, but he has been assessing QR for many years. RCC uses common sense mathematics. Fifteen years ago, there was a first-level course in College Algebra and a watered-down Statistics course, which was like QR. No inference was covered, so it was not considered Statistics at UMass Boston, but it required writing, thinking and analysis, so we had a QR course all along. However, we had a problem: the statistics course that we had was more relevant to the social sciences, so more people were taking the Algebra course, even though it was irrelevant to most of them.

Finally, RCC gave in to UMass Boston and started a QR course this year. It includes Fermi Problems – back-of-the-envelope-type questions using rough estimates. There is extensive use of Excel, and students feel empowered by that. The outcome: the book we use is a web resource, and the course is taught in the computer lab. How do you write a test or quiz to gauge learning? One colleague gives open-book tests, so the tests must be much harder, and students spend a great deal of the test time looking through the book to come up with answers.
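(A Fermi problem of this kind asks for an order-of-magnitude estimate built from rough, explicitly stated assumptions. As a hypothetical illustration, not an example drawn from the RCC course: to estimate how much drinking water one person consumes in a year, a student might reason that 2 liters per day × 365 days per year = 730 liters, i.e., roughly 1,000 liters per person per year.)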

I’ve learned that Google is a great calculator. Some students just type in a quantitative question and get an answer from Google without really knowing the underpinnings of the mathematical process. I have taught charts and graphs, like a pie chart, and students will Google the answer. I have not developed a rubric, which is a two-dimensional tool; facetiously, he said that he wants to develop one in three dimensions – a rubric cubed.

At RCC the college math courses use number systems, set theory, Venn diagrams, and/or, etc. They do not rely on Algebra, because students find it too abstract. There is now a 50% pass rate in the new QR course, and, he said, “We are trying to write better QR questions that are not Googleable.” He concluded by saying that we should be asking ourselves what life skills we want our students to take away and still know in ten years.

E. Alex Asare, Professor of Math, Roxbury Community College. He teaches the second section of the QR course.

Students find it challenging and difficult to learn to reason quantitatively. A lot have dropped out of the course. However, the students have grown accustomed to using Excel, and they are doing well with that. RCC has a number of full-time Professors. We haven't divided into groups, at least not formally yet. There is a Carnegie study that indicates that students do not have enough experience with technology, so we would like to deal with that.

In response to a question raised about how successful group work can be in solving QR-based problems, a faculty member visiting our meeting stated that he uses problems generated from the university database and then pairs students to solve them. Students are asked questions about affirmative action and diversity, with groups opposing each other, arguing from different perspectives. They select the data to inform their arguments, and they do a PowerPoint presentation to back up the data. We also use high school problem scenarios, and then ask students to solve the problems and write an essay on how they solved those problems.

EW: It is similar to a debating society that takes opposing views and backs them up.

F. Christopher Cratsley (CC), Director of Assessment, Fitchburg State University

We looked at problem-solving QR and then asked: How can we assess problem solving? Answering that question led to two rubrics at Fitchburg State University:

1) Analyzing data. Students collect and analyze data and develop conclusions.

2) Making Calculations in a meaningful way. We'd like to get to applications. We hope to impart a facility with equations: using them to make predictions, rising to a threshold, then expanding the analysis and also generalizing. The ideal is an ability to use multiple representations, moving from symbolic equations to verbal descriptions. Also, students should be able to make a calculation that others can use for different purposes.

AMCOA team members made the following comments:

EW: So, you want students to understand the logic behind the equation.

Yves Salomon-Fernandez (YSF): It is also important to know the limitations of the equation, how it can be knocked down. It is important to know the types of analyses that can be done. You may forget the specific formula, but you will remember the process of how to get the answer.

PM: The goal is integrated learning versus siloed learning.

TC: The goal is empowering community college students as opposed to making them compliant.

CC: I am very involved in articulation of the K-12 Program and PARCC. The twelfth-grade math level is very aspirational, calculating residuals from linear data to make inferences. There is a standard about model construction: a model to plug different data into, reasoning and thinking about what students are doing with their QR. They are constructing math models to be applied to other questions.

G. Dawne Spangler, Director of the Center for Teaching, Learning and Assessment, North Shore Community College

Dawne expressed her concerns about QR as follows: How do we define QR? Does it mean taking a college Algebra course? We’ve come away from that, but not far enough. We can't get K-12 to do this unless the colleges value it. It must come from the colleges, and then the lower grades will comply.

The QR courses don't transfer, so even if students need them, they don't take them, because they do not transfer to UMass. We need to look at “holes” in understanding and fill in those holes. Where do you assess QR? If you go to “sterilized” math classes, you won't get good outcomes. Math must be taught across the disciplines.

You have to decide where QR resides, so that it trickles down to K-12. Who owns QR? Dawne has taught math for elementary teachers. Math is a study of structure. We need to get to quantitative understanding. According to Grant Wiggins, students who are given problems without contexts typically fail, while students who are given context-based problems do well. Our assessments will disappoint us if we don't know how to put students in contexts for solving them.

Concluding remarks:

PM: We need to continue having these discussions. Thank you to the panelists for speaking and sharing your campuses’ experiences with QR assessment.

IV. Planning for AY 2012-2013

A. Overall Plans for Next Year: Pat Crosson

During the past year we have made great strides: we have formed the AMCOA Team; held meetings and conferences with all the schools in the Massachusetts public higher education system; shared best practices on assessment methodologies; and become a LEAP state in February. The Massachusetts initiative will be rooted in the WGSLOA goal of doing assessment better than a single test, basing it on campus work and taking us to a statewide reporting system. There have been a few modifications from the WGSLOA to LEAP state, but they are minor. For example, we dropped the composite working model and focused on a basic statewide assessment plan.

Regardless of Davis funding, we will continue this work next year. We will name people to task forces to represent campuses in doing this work; three campuses have done so already. The task force will provide overall guidance for the work. Other groups doing thinking and planning are two smaller teams: 1) the Massachusetts Team and 2) the State Partner Team, which hopes to find another set of states with similar goals to work with. A lot of states are interested in partnering with Massachusetts. There will be co-chairs for these task forces.

Pat stated that the AMCOA Team will go forward. We would like to see this kind of collaboration continue across campuses. Campuses have said that they still support AMCOA, so the work can continue.

AMCOA team members raised the following questions:

Will being on the task force exclude any of us from being on one of the other two teams, other groups, or committees? No. Pat said that she hoped groups could straddle each other. There will be two teams peopled with individuals who are willing to engage the challenging questions.

When will we know about the second Davis Grant? Pat stated that we should hear sometime by the end of May.

How many state partners might there be? Counting state partners, there will be 14 states attending the conference in Boulder, Colorado. They will talk about partnering with Massachusetts at that time. There is initial interest, and we will know more about it at the end of May. The state partner work will follow that.

Pat distributed a grid that describes how AMCOA will go forward in the future. Phase I created a collaboration that has been very successful, and Phase II continues the good assessment work on campuses, sharing ideas and collaborating.

The following comments were made about AMCOA’s work this year:

EW: It worked well. We shared ideas and worked together.

PM: We still have the challenge of how to bring word back to campuses. People in AMCOA and attendees learned a lot this year.

PC: We have an ongoing obligation to expand the involvement on the campuses. AMCOA needs to bring word back to the schools.

PM: Next year we will be learning from each other. We will be moving to working meetings and conferences: conferences that focus on the three outcomes we eventually will report on statewide, including curriculum design, assignment design, and scoring of student work. The more we engage people in these activities, the more comfortable they will become. Peggy asked that people send her examples of exemplary assignments for QR, writing, and CT so that we can use these at our conferences to help others develop effective assignments. We also need models of student work that reflect students' levels of achievement in responding to assignments.

EM: I would like to brainstorm on what essential outcomes a work sample exemplifies in its depth and breadth.

B. Days, Dates, and Times for Next Year's Four AMCOA Meetings (two in the fall and two in the spring), Based on Results of Survey Monkey and Plans for Next Year

After some discussion, the Team agreed that the dates for next year's four AMCOA Meetings will be:

Thursday, September 20, 2012

Wednesday, November 7, 2012

Monday, March 11, 2013

Friday, May 3, 2013

C. Days, Dates, and Times of Next Year's Three Statewide Conferences (one in the fall and two in the spring), Based on Results of Survey Monkey and Plans for Next Year

The Team agreed that the dates for next year's three AMCOA conferences will be:

Friday, October 19, 2012

Thursday, February 28, 2013 (now changed to March 1 because of a scheduling conflict)

Wednesday, April 24, 2013

V. Summary of Results

A. Questionnaire about Topics/Foci for Assessment that AMCOA Team Members Have Identified to Support Our Campuses: Bonnie Orcutt

Bonnie reported that there is interest in having an AMCOA Support Team with on-call help teams. What would that commitment entail? There would have to be a limit on the number of calls. Also, there would have to be an application process demonstrating why institutions would bring you onto their campuses to advise them. We would also have to consider the institutional impact. Only four responses were sent in. To encourage more volunteers, Bonnie pointed out that it doesn't have to be a permanent commitment. Also, some leads are looking for teammates to work with them. Pat Crosson responded that she assumed there probably wouldn't be a huge demand from campuses, yet there should be a lead person from AMCOA who understands the nature of the need and matches campus needs with the available talent.

Peggy stated that there will be several committees established under AMCOA next year, such as a committee to determine how NSSE/CCSSE might be used to contribute data about students' learning of QR, CT and writing; a committee to work with a DHE staff person to develop a repository focused on assessment; and a committee to establish criteria for a new round of assessment experiments focused on identifying or designing a web-based reporting system for student outcomes.

B. March Scoring of Critical Thinking (CT): Peggy Maki

The scoring results of each group in our round of Critical Thinking (CT) are posted on Yammer. There were issues about assignments in every single group. “Does the assignment really ask students to demonstrate critical thinking?” was a common group question. What are representative assignments that ask students to exemplify writing and CT? We need to gather examples of them to use in our work and in our conferences.

Appendix A: Institutions Represented at the AMCOA May 1st Meeting:

Berkshire Community College

Bunker Hill Community College

Cape Cod Community College

Fitchburg State University

Framingham State University

Greenfield Community College

Holyoke Community College

Massachusetts Bay Community College

Massachusetts Maritime Academy

Massasoit Community College

Middlesex Community College

Mount Wachusett Community College

North Shore Community College

Northern Essex Community College

Roxbury Community College

Salem State University

Springfield Technical Community College

University of Massachusetts Boston

University of Massachusetts Dartmouth

University of Massachusetts Lowell

University of Massachusetts President’s Office

Westfield State University

Worcester State University

Appendix B: Mark Pawlak's PowerPoint Presentation on Quantitative Reasoning Efforts at UMass Boston

[The slide presentation is embedded as an object in the original document.]