
Focusing Sessions

A valuable stakeholder engagement and evaluation design tool
Veronica S. Smith
National Science Foundation Research Traineeship Annual Meeting, September 2019

1. Purpose

Donaldson’s theory-driven evaluation science

• Contributes to team learning and program improvement
• Increases stakeholder engagement
• Reduces evaluation anxiety
• Hypothesis driven

Donaldson, S. (2017). Program Theory-Driven Evaluation Science: Strategies and Applications.

Focusing session goals
• Create common understanding
• Make program theory explicit and testable
• Frame evaluation questions and identify key measures
• Provide framework for data collection and analysis

2. Are you ready?

3. Be gameful!

Theory of Action

4. Focus on learning together

Theory of Change

Community of Learning

Where people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning to see the whole (reality) together.

-Peter Senge

What are the 3 attributes you would like to contribute to our learning community?

Veronica Smith:
1. Diversity, equity, inclusion best practices
2. Facilitation skills
3. Data science

How do we want to conduct the evaluation as co-creators of knowledge?

What kind of learning community do we want to create?

Norms
• Experience Discomfort
• Take Risk
• Stay Engaged
• Listen for Understanding
• Speak Your Truth
• Expect/Accept Non-Closure
• No Fixing


5. Focus on change together

Program Logic

Input → Activity → Output → Short-term Outcome → Intermediate Outcome → Long-term Outcome

Process evaluation: ACTION
Impact evaluation: CHANGE

Focus on CHANGE

Increase, Decrease, Deepen, Broaden, Mediate, Moderate, Diversify, Compress, Temper, Accelerate, Decelerate

6. Map together

Need an impact map to move beyond hand-waving

Impact Map (figure): YOUR NRT PROGRAM leads to Desired Changes A-E, which lead to anchor changes along a timeline of 1-2 months, 2-4 years, and 5+ years. Example anchor changes at 5+ years: increased STEM diversity, equity & inclusion; increased trainee knowledge & skill; increased interdisciplinarity & convergence.

Insert the DIRECT maps


7. Focus on knowledge needs together

What if you could get a sound answer to a question about your program that would help improve outcomes?

What are the measures you need to track progress against goals?

Example questions
• How confident are trainees about their career prospects and competitiveness in the clean energy industry?
• How effective are the NRT courses at preparing students for the capstone project?
• How capable are students at applying and communicating about data science across domains?


“Concrete development of questions and group discussion.”

“It was great to learn the goals and general direction of the organization.”

“Healthy discussion and reframing of comments.”

“Enjoyed being able to share as a stakeholder.”

“The collaborative process allowed everyone to participate.”

“I felt like my opinion as a student and my contributions were really valued.”

“Having input into the program’s focus.”


Questions?
veronicasmith@data2insight.com

Measurement Invariance Across Time

Cheryl Schwab, PhD, Internal Evaluator

Overview of the DS421 Program

Purpose

Test the assumption of measurement invariance when assessing change in trainee ability (latent variable) using the same instrument at several time points.

Assumption: Responses across time points for the same person are conditionally independent given the latent variable (i.e., any correlation between responses at different time points is attributable to trainee ability alone).
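A standard way to write this local independence assumption (a sketch of the usual textbook formulation, not the presenter's exact notation) for responses of person i at T time points, given ability theta_i, is:

$$
P\left(X_{i1}=x_1,\dots,X_{iT}=x_T \mid \theta_i\right) \;=\; \prod_{t=1}^{T} P\left(X_{it}=x_t \mid \theta_i\right)
$$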

Item Response Theory: The underlying function, called an item characteristic curve (ICC), specifies that as the level of a trait increases, the probability of a correct response to an item increases.

(see Hambleton, Swaminathan & Rogers, 1991)
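As an illustration of an item characteristic curve, here is a minimal Python sketch of the Rasch (one-parameter logistic) model, one common IRT form consistent with the rating-scale and partial-credit models used later; the difficulty and ability values are placeholders, not estimates from this study.

```python
import numpy as np

def rasch_icc(theta, difficulty):
    """Probability of a correct/endorsed response under the Rasch model.

    As trait level (theta) increases relative to item difficulty,
    the probability of a correct response increases (the ICC).
    """
    return 1.0 / (1.0 + np.exp(-(theta - difficulty)))

# Hypothetical values for illustration only (not the DS421 estimates).
abilities = np.linspace(-3, 3, 7)      # trait levels in logits
for b in (-1.0, 0.0, 1.0):             # an easy, medium, and hard item
    probs = rasch_icc(abilities, b)
    print(f"difficulty={b:+.1f}:", np.round(probs, 2))
```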

Outcome Survey
Rate your ability to do the following in a collaboration (i.e., project, research, work). I am a(n)… novice, advanced beginner, proficient, expert.
• Contribute environmental science disciplinary knowledge.
• Conduct collaborative research.
• Write for audiences inside primary discipline.
• Present ideas to audiences outside primary discipline.

ICC

Differential Item Functioning (DIF)

Test whether the difficulty of an item differs across groups or across time.
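One simple way to screen for DIF of this kind, assuming separate difficulty estimates (with standard errors) per group or time point, is a Wald-style comparison; the numbers below are placeholders, not results from this study.

```python
from math import sqrt
from scipy.stats import norm

def dif_z_test(b1, se1, b2, se2):
    """Wald-style z test for a difference in an item's difficulty
    between two groups or time points."""
    z = (b1 - b2) / sqrt(se1**2 + se2**2)
    p = 2 * norm.sf(abs(z))  # two-sided p-value
    return z, p

# Placeholder difficulty estimates (logits) and standard errors.
z, p = dif_z_test(b1=0.40, se1=0.15, b2=-0.10, se2=0.18)
print(f"z = {z:.2f}, p = {p:.3f}")
```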

Data: Collected between November 2015 and May 2018
• Administrations: 4 times across the two-year program
• Cohorts = 3
• Departments = 11
• Cases n = 78

Modeling Items

MODEL                                     DEVIANCE   #PARAMETERS
Rating Scale Model (item + step)          4314       33
Partial Credit Model (item + item*step)   4230       90
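Because the Rating Scale Model is a constrained (nested) version of the Partial Credit Model, the deviance drop can be referred to a chi-square distribution with degrees of freedom equal to the difference in parameter counts. A minimal sketch of that comparison, using the values in the table above and assuming scipy is available:

```python
from scipy.stats import chi2

# Deviances and parameter counts from the model comparison above.
dev_rsm, k_rsm = 4314, 33   # Rating Scale Model: item + step
dev_pcm, k_pcm = 4230, 90   # Partial Credit Model: item + item*step

lr = dev_rsm - dev_pcm      # likelihood-ratio statistic (deviance drop)
df = k_pcm - k_rsm          # extra parameters in the less constrained model
p = chi2.sf(lr, df)         # upper-tail p-value
print(f"LR = {lr}, df = {df}, p = {p:.4f}")
```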

Map of Latent Distribution and Thresholds

Latent Regression

REGRESSION COEFFICIENTS (standard errors in parentheses)
CONSTANT   -0.226 (0.236)
c2          0.007 (0.383)
c3          0.414 (0.395)

Variance 2.002; Unconditional Variance 2.123 (~3%)

Time

MODEL                DEVIANCE   #PARAMETERS
item + step          4319       35
item + step + time   4310       38*

* Significant change in deviance at the 0.05 level

Main Effect Time

TIME   ESTIMATE   ERROR
1       0.523     0.108
2       0.225     0.165
3      -0.483     0.175
4      -0.265*

* Estimate constrained
Chi-square test of parameter equality = 32.77, df = 3, Sig Level = 0.000

Map of Latent Distribution and Parameters

Results
• Distribution of item difficulties shifts as the students move through the program.

Next Steps
• Modeling differences between cases
• Case dependence
• Including more data per case

Faculty and Staff

David Ackerly (Co-PI), Integrative Biology (L&S)
Max Auffhammer (PI and Director), Agricultural and Resource Economics (CNR)
Maggi Kelly (Co-PI), Environmental Science, Policy and Management (CNR)
Philip Stark (Co-PI), Statistics (L&S)
David Anthoff, Energy and Resources Group (ERG)
David Culler, Electrical Engineering and Computer Sciences (CE)
Kristina Hill, Landscape Architecture and Environmental Planning (CED)
Solomon Hsiang, Goldman School of Public Policy (GSPP)
Laurel Larsen, Geography (L&S)
Heather Constable, Campus Coordinator (BiGCB Berkeley Initiative on Global Change Biology, DS421 NSF Research Traineeship: Data Science for the 21st Century, and UC Berkeley Field Stations)

A Collaborative Approach to Program Evaluation for an NRT

Valerie Decker, Assistant Director, Center for Evaluation and Assessment

University of Iowa

Sustainable Water Development Program

Poster Session 2, Number 211

waterhawks.uiowa.edu

Program Evaluation Overview

• Systematic data collection
• Informing program improvements
• Determining the success of a program
• Driven by the information needs of stakeholders

The Program Evaluation Standards

Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2010). The program evaluation standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Sage.

Dimensions of quality:
• Utility
• Feasibility
• Propriety
• Accuracy
• Evaluation Accountability

www.jcsee.org

Outcome of a Collaborative Approach

Strong foundational plan with opportunities to adapt built in

Necessary Components

• Evaluators as collaborators
• Budget for evaluation

Questions

Assessing Interdisciplinary Competency in the Disaster Resilience and Risk Management (DRRM) Graduate Program using Concept Maps
Marie C. Paretti, PhD
Education Director, DRRM, Virginia Tech
Professor, Engineering Education
mparetti@vt.edu

Image courtesy of NASA Goddard Space Flight Center

This material is based upon work supported by the National Science Foundation under Grant Number 1735139. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

DRRM Program: Purpose

▪ Promote collaboration through a transdisciplinary educational experience

▪ Train business analysts, educators, engineers, scientists, and urban planners to better prepare for, respond to, and recover from natural disasters

Concept Maps

● Measures domain understanding

● Captures depth (levels of hierarchy) & interconnections (number of links)

● Supports multiple scoring methods

○ Quantitative (see the sketch below)
○ Holistic
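For the quantitative route, depth and interconnection counts can be computed directly from a map represented as a directed graph. A minimal sketch using networkx, with a made-up toy map rather than an actual student map:

```python
import networkx as nx

# Toy concept map: edges point from more general to more specific concepts.
edges = [
    ("disaster resilience", "hazards"),
    ("disaster resilience", "recovery"),
    ("hazards", "flooding"),
    ("hazards", "earthquakes"),
    ("recovery", "infrastructure"),
    ("flooding", "infrastructure"),  # a cross-link between branches
]
cmap = nx.DiGraph(edges)

num_concepts = cmap.number_of_nodes()   # breadth proxy (number of concepts)
num_links = cmap.number_of_edges()      # interconnection proxy (number of links)
root = "disaster resilience"
depth = max(nx.shortest_path_length(cmap, root).values())  # hierarchy levels

print(f"concepts={num_concepts}, links={num_links}, depth={depth}")
```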

Concept Map Scoring: Holistic Criteria

Complexity
  Construct: Depth of demonstrated knowledge
  Measurement: Number of hierarchies included in map
Density
  Construct: Breadth of demonstrated knowledge
  Measurement: Number of concepts included in map
Organizational Strategy
  Construct: Sophistication of organization
  Measurement: Organizational strategy

[1] M. Besterfield-Sacre et al., "Scoring Concept Maps: An Integrated Rubric for Assessing Engineering Education," Journal of Engineering Education, vol. 93, no. 2, pp. 105-115, Apr. 2004.
[2] J. E. Sims-Knight et al., "Using concept maps to assess design process knowledge," in Proc. 34th Annual Frontiers in Education Conference (FIE 2004), 2004.

Holistic Scoring: DRRM Scholars* Only

[Figure: Holistic scores for DRRM scholars A-H and J, plotted by Density (Low/Medium/High) versus Complexity (Low/Medium/High) at Aug 2018 and Dec 2018.]

KEY:
Density: Number and range of topics and subtopics
Complexity: Connections across topics
Organizational Strategy: Topical, Structured Topical, Functional Topical, Functional

*Scholars are those students participating in the full doctoral program

Strengths of Concept Maps as Assessment Tool in this Context

1. Provides a representation of student learning
2. Supports analysis of depth of thinking and interactions among concepts

[Figure: Student G pre-map and post-map]

DRRM concepts span, bridge, and intersect multiple disciplinary boundaries.

Challenges of Concept Maps as Assessment Tool in this Context

1. Interdisciplinarity creates a high level of complexity.

2. A 2D concept map may be insufficient to map knowledge.

3. Quantitative scoring is complicated by dimensional limits, scope, and "between" spaces.


Suggestions for using Concept Maps as Assessment Tool in Highly Interdisciplinary Contexts

1. Written explanations can complement visual representations and expand our ability to assess learning.

2. Global (holistic) scoring approaches may more readily capture nuances in the concept maps than quantitative (# of nodes, # of connections, # of levels) methods.

Questions?

Icon made by Roundicons and Freepik from www.flaticon.com

@DRRM_VT
Jessica Deters, jdeters@vt.edu

@jess_deets

Dr. Marie C. Paretti, mparetti@vt.edu

@mparetti2

DRRM: https://www.drrm.fralin.vt.edu


Gili Marbach-Ad
College of Computer, Mathematical, and Natural Sciences, University of Maryland

Using Formative Evaluation Practices to Promote Continuous Improvement in two NRT Projects

Mission:

To enhance graduate students' skills in the following areas:

• Interdisciplinary Research

• Communication

• Teamwork/Leadership

• Mentorship

• Cohort 1: Spr2017 - Spr2019 (N=11)

• Cohort 2: Fall2017 - Fall2019 (N=12)

• Cohort 3: Fall2018 - Fall2020 (N=15)

• Cohort 4: Fall2019 - Fall2021 (N=9)

• Cohort 5: Fall2020 - Fall2022

Poster #212 Session 2

Mission: A program that trains students to effectively work and communicate across disciplines to address food/energy/water nexus issues

Cohort 1: Spring 2019
Poster #110, Session 1

STEM Training at the Nexus of Energy, WAter Reuse and FooD Systems

COMBINE Change Model


Framework for Assessment: Measuring Five Levels of Program Evaluation

• Participation
• Satisfaction
• Learning
• Application
• Impact

Kirkpatrick, D. L. (1994). Evaluating Training Programs: The Four Levels. San Francisco: Berrett-Koehler.
Guskey, T. R. (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin.
Colbeck, C. L. (2003). Measures of success: An evaluator's perspective. Paper presented at the CIRTL Forum, Madison, WI.
Connolly, M. R., & Millar, S. B. (2006). Using workshops to improve instruction in STEM courses. Metropolitan Universities, 17(4), 53-65.
Marbach-Ad, G., Egan, L., & Thompson, K. (2015). A Discipline-Based Teaching and Learning Center: A Model for Professional Development. Springer.

Mixed Model Research Design
COMBINE: Cohort 1 (Jan 2017 - Jan 2019)

[Timeline figure, Jan 2017 to Dec/Jan 2019, Cohort I (N = 13 - 1 = 12). Courses: Data Practicum (n=13); Lit. Survey Seminar; Research & Progress Seminar (n=12) with Network Analysis (n=21); Peer to Peer (n=5). Data collection: pre-survey, post-surveys, post-program survey, interviews with instructors (n=2), students (n=4), and advisors (n=3), focus group with students (~15), and follow-up interviews with students (n=3).]

Data Collection:
• Observations
• Artifacts
• Surveys
• Interviews

Mixed Model Research Design
COMBINE: Cohort 2 (May 2017 - May 2019)

[Timeline figure, Jan 2017 to May 2020, Cohorts I and II. Cohort II courses: Lit. Survey Seminar & Data Practicum; Network Analysis & Research & Progress Seminar; Peer to Peer. Data collection: pre- and post-surveys, post-program surveys, interviews with instructors (n=2), students (n=4), and advisors (n=3), focus group with students (~15), and follow-up interviews with students (n=3-4).]

Mixed Model Research Design
COMBINE: Cohort 3 (May 2018 - May 2020)

[Timeline figure, Jan 2017 to May 2020, Cohorts I-III. Cohort III courses: Lit. Survey Seminar & Data Practicum; Network Analysis & Research & Progress Seminar; Annual Symposium; Peer to Peer. Data collection: pre-surveys (+ control group), post-surveys, program surveys, post-program survey (+ control group), interviews with instructors (n=2), students (n=4), and advisors (n=3), focus group with students (~15), and follow-up interviews with students (n=3-4).]

Mixed Model Research Design
UMD Global STEWARDS

COMBINE Components: Feedback from First Cohort
For the COMBINE components that you participated in, check those that helped you with each of the following objectives (check all that apply).

Skill areas: Interdisciplinary Research, Communication, Career Preparation

Objectives (column order):
(1) Learn about important research results in network science
(2) Learn about the theory and methods of network analysis
(3) Develop an appreciation and understanding of the questions and methods of other fields
(4) Apply methods of network analysis to biological data
(5) Develop skills for oral presentations
(6) Learn to build effective visualizations
(7) Develop written communication skills
(8) Communicate to a diverse audience
(9) Build teamwork skills
(10) Gain experience mentoring others
(11) Develop leadership skills
(12) Gain exposure to diverse career options

Responses per objective (columns 1-12):
COMBINE Courses
  Data Practicum with Communication Training (PHYS798N):  6 2 10 5 10 9 8 10 7 2 3 3
  Computational and Mathematical Analysis of Networks (CMSC828O):  5 11 5 10 5 6 3 6 9 2 4 1
Seminar Courses
  Network Science Literature Survey Seminar:  8 4 5 1 8 0 2 5 5 1 2 3
  Network Biology Research-in-Progress Seminar:  6 4 9 2 9 3 2 7 2 0 0 5
Elective Courses
  Discipline-bridging 3-4-credit elective course (chosen from list of options):  5 4 6 3 2 5 5 3 3 0 1 3
  Discipline-bridging elective seminar (chosen from list of options):  3 2 4 2 1 1 2 1 2 0 0 3
Other Activities
  Peer-to-peer tutorial:  2 1 3 2 7 1 2 6 5 5 2 2
  COMBINE Annual Symposium:  7 4 6 1 6 3 2 4 1 0 1 7
  Career Development Workshop:  2 1 3 0 3 0 0 1 1 0 0 4
  COMBINE Outreach:  1 0 3 2 3 1 0 5 4 2 3 0
  Outside research project/Internship:  0 0 1 0 1 0 0 0 0 0 0 0
  Other COMBINE-related element:  1 0 1 1 2 0 0 2 1 2 2 1

Feedback from First Cohort
• Not enough opportunities for fellows to connect with each other outside of their scheduled courses or the required COMBINE activities.
✓ Leadership team began hosting a series of "coffee talks" wherein fellows were invited to socialize and collaborate with each other in a more social, non-academic setting.

COMBINE: Logic Model

PARTICIPANTS
• PI, co-PIs, Research educator, Program manager
• COMBINE research faculty (mentors/advisers) (~25)
• COMBINE trainees (~50)
• COMBINE administration: Student committee or Executive committee
• COMBINE evaluation: Internal evaluator, External evaluator, Advisory board

ACTIVITIES
• Weekly seminars: COMBINE faculty research talks, literature presentations/discussion, COMBINE trainee research talks
• CMSC828O: Network Analysis tools
• Advanced coursework / Discipline-bridging coursework: minimum 2 courses (4 credits) in complementary areas
• PHYS798N: Intensive data project and communication
• Peer-to-peer tutorials
• Internships: Smithsonian, NIH, NIST, UMD School of Medicine, Genentech, NVIDIA, Epic, Dexcom...
• Outreach/Mentoring: K-12 student education, public engagement, undergrad mentoring, peer mentoring
• Annual Career Workshop
• Annual COMBINE Symposium: trainee oral and poster presentations

SHORT-TERM OUTCOMES: PhD graduates with
• knowledge-base and skills to develop, propose, and execute research at the intersection of P/M, CS, and BIO*
• ability to communicate research and findings to diverse audiences (including researchers outside their immediate field and the general public)
• teamwork skills for interdisciplinary research
• knowledge of opportunities and effective approaches for outreach and mentoring
• knowledge of career options in academia, industry, and national labs
• professional networks in academia, industry, and national labs

LONG-TERM OUTCOMES
• Increased number of interdisciplinary researchers prepared to address problems at the intersection of P/M, CS, and BIO* (in academia, industry, and national labs)
• Expansion of interdisciplinary graduate programs based on advocacy for COMBINE's effectiveness
• Creation of a repository for evaluated and established courses for interdisciplinary research and communication

*The three COMBINE disciplines: P/M: Physical/Mathematical Sciences; CS: Computational Sciences; BIO: Biological Sciences

Evaluation categories: P: Participation, S: Satisfaction, L: Learning, A: Application, I: Impact
Evaluation methods done: Survey, Student Interviews, Instructor Interviews, Focus Group, Class Artifacts

PROPOSED EVALUATION
• Surveying of students (P, S, L). To investigate: baseline characteristics; educational history; strength and confidence in skills incorporated in the COMBINE program; evaluation of program components.
• Student Interviews (S, L, I, A). To investigate: more in-depth classroom experience and individual growth; effects of COMBINE not evidenced in the classroom, such as conference attendance or publications.
• Instructor Interviews (P, L, I). To investigate: effectiveness of course components; potential changes to the course or program.
• Focus Groups (P, S, L). To investigate: effectiveness of course components; potential changes to the course or program.
• Class Artifacts (P, L). To investigate: frequency and characteristics of course components; student participation and response to course.

Acknowledgements

• COMBINE and UMD Global STEWARDS Fellows, their advisors, and the leadership team

• The Evaluation Teams:
  • Jack Marr
  • Frances Turner
  • Megan Gerdes
  • Hannoori Jeong

Questions

• COMBINE: Poster #212, Session 2
• UMD Global STEWARDS: Poster #110, Session 1

Evaluation of graduate program courses: going beyond the typical end of course survey

2019 NRT Annual Meeting

9.26.2019 • Andrew Grillo-Hill, Josh Valcarcel

Evaluators for RIPTIDES program at San Francisco State University

Poster # 124 (session 1)


Context:



Challenge
• Y1 evaluation focused on newly developed courses
• Useful & actionable feedback for the program
• Some issues related to the existing courses
  • Not enough information
  • Disagreed with university course evaluation
• For Y2, look at ALL the core courses
  • Student focus groups
  • Student surveys
  • Limited observations
  • Internal memo of findings
  • Faculty interviews


Methodology:


Key Findings:

• Improve interdisciplinarity
• Expectations didn't always match
• More coordination across the courses
• More statistics training
• Less than ideal meeting day and time


Program Action

• Setting clearer expectations
• Writing course now specific for program students
• One course changed day & time
• One course redesigned to be a statistics course
• Faculty planning meeting for team-taught course


Questions?
agrillo@wested.org
