
GENERAL EDUCATION ASSESSMENT (GEA) TASK FORCE

END OF YEAR REPORT AND RECOMMENDATIONS

PRESENTED TO THE GENERAL EDUCATION SUB-COMMITTEE OF THE COLLEGE CURRICULUM COMMITTEE

APRIL 29, 2008


TABLE OF CONTENTS

GEA Task Force Members 2008 ................................................... 3

GEA Task Force Charge and Goals ............................................. 4

Definitions: Student Learning Outcomes

Communication .................................................................. 5

Information Literacy ........................................................... 6

Critical Thinking ................................................................. 7

Scientific and Quantitative Reasoning ............................. 8

Global Sociocultural Responsibility ................................. 9

Communications Rubrics:

Reception .......................................................................... 11

Oral Content ..................................................................... 15

Oral Delivery ..................................................................... 17

Written............................................................................... 19

GEA Task Force Action Plan for 2008-2009 .............................. 21

GEA Task Force To Do List ........................................................ 23

GEA Task Force Recommendations ......................................... 24

GEA Task Force Members 2009 ................................................. 28

GEA Pilot – MAPP Test Abbreviated Form ............................... 29


GENERAL EDUCATION ASSESSMENT TASK FORCE

2008

FACULTY

Tessie Bond, GER Committee Chair, KC

David Bowen, Faculty Senate President, Humanities, SC

Sheri Brown, Librarian, DTC

Youlanda Henry, Communications, NC

Patty Lee, GER Committee Co-Chair, NC

Matt Mitchell, Mathematics, DTC

Lourdes Norman, Science, KC

Joel Rappoport, Mathematics, SC

Wayne Singletary, Workforce, KC

John Wall, Social and Behavioral Sciences, SC

ADMINISTRATION

Maggie Cabral-Maly, Kent Campus President

Lynne Crosby, Director of Program Development, Liberal Arts and Sciences

Julie Giuliani, Executive Dean of the Virtual College

Mike Reynolds, Associate Dean, Mathematics and Natural Sciences, KC

Jim Simpson, AVP for Workforce Development and Adult Education

Charles Smires, Dean of Liberal Arts, SC

Karen Stearns, Research Analyst

Jennifer Stoetzer, Task Force Assistant

Nancy Yurko, AVP for Liberal Arts and Sciences

CO-CHAIRS: Tessie Bond, Faculty Chair

Nancy Yurko, Administrative Chair

GEA Task Force, rev 5/12008 Page 3


GENERAL EDUCATION ASSESSMENT TASK FORCE

CHARGE AND GOALS

The Florida Department of Education has recommended that post-secondary institutions collect information regarding the efficacy of their respective general education programs. In addition, the Southern Association of Colleges and Schools (SACS) requires that member institutions assess their general education program. The methodology used to collect this data has not been dictated by the Florida Department of Education or the state legislature. Thus, the Learning Outcomes Assessment (LOA) Task Force, renamed the General Education Assessment (GEA) Task Force, was assembled by the General Education Requirements (GER) Subcommittee of the College Curriculum Committee in order to achieve the following:

GOAL

This task force will facilitate the development of a college-wide, faculty-driven, general education outcomes assessment process to improve what we value as educators and as a college body, namely the mentoring and education of our students so that they can achieve their personal and professional goals. The task force suggests that the final assessment program should be multifaceted in its design to reflect the rich diversity of our faculty and students. Furthermore, the data collected from this assessment endeavor will be used to enhance FCCJ's general education outcomes. The GEA Task Force's recommendations will be presented to the faculty for feedback, input and development, and to the GER Sub-Committee for further discussion and approval prior to final submission to the College Curriculum Committee.

TASKS

The Task Force will serve as:

1. A conduit for information concerning current best practices in general education assessment.

2. A facilitator for the collection and implementation of faculty-driven ideas regarding strategies for a college-wide assessment plan.

3. A support for the preliminary collection and analysis of assessment data. In particular, the task force will sponsor faculty discussions to cooperatively develop and test-pilot a general education assessment strategy that targets one of the five specific general education learning outcomes that have been identified by the State.

4. A partner with the Center for the Advancement of Teaching and Learning in providing faculty with professional development opportunities related to learning outcomes assessment.

The five general education learning outcomes identified by the Statewide Student Learning Outcomes Task Force are: communication skills, scientific and quantitative reasoning, critical thinking, information literacy, and global socio-cultural responsibility. Since each of the five general education skills is founded in the ability to effectively communicate, the members of the FCCJ General Education Assessment Task Force unanimously agreed that this general education outcome would be a logical place to begin.

GEA Task Force, rev 5/12008 Page 4


GENERAL EDUCATION ASSESSMENT TASK FORCE

COMMUNICATION LEARNING OUTCOME

DEFINITION

Effective communication is defined as an individual's ability to choose the appropriate means for obtaining, generating and using information and language to interact successfully in the world. To be an effective communicator one must possess the ability to:

receive, comprehend, synthesize and integrate information through reading, listening, and observation, and

transmit and exchange such information through the appropriate means of expression including writing and speaking.

GEA Task Force, rev 5/12008 Page 5


GENERAL EDUCATION ASSESSMENT TASK FORCE

INFORMATION LITERACY LEARNING OUTCOME

DEFINITION

Information literacy is defined as an individual’s ability to find, retrieve, analyze and use information. Being information literate requires the ability to:

identify the need for information,

select the most appropriate information retrieval system,

acquire pertinent information,

evaluate the information obtained,

manipulate information in a usable form, and

communicate the information appropriately.

Modified from the definition provided by the Association of College and Research Libraries.

GEA Task Force, rev 5/12008 Page 6


GENERAL EDUCATION ASSESSMENT TASK FORCE

CRITICAL THINKING LEARNING OUTCOME

DEFINITION

Critical thinking is defined as an individual's ability to apply logic, effective reasoning skills, sound judgment and reflection in order to solve problems and to clarify the individual's objective understanding of the world.

An effective critical thinker is able to:

analyze and classify information,

compare and contrast,

establish appropriate hypotheses,

recognize assumptions and biases,

apply deductive and inductive reasoning to problem solve, and

synthesize, evaluate and reflect upon the information gathered.

GEA Task Force, rev 5/12008 Page 7


GENERAL EDUCATION ASSESSMENT TASK FORCE

SCIENTIFIC AND QUANTITATIVE REASONING

DEFINITION

Scientific Reasoning is the interpretation of measurable, observable, or empirical information through inference, analogy and induction to direct the formulation of hypotheses and conclusions.

Specifically, scientific reasoning includes the ability to:

identify a scientific problem,

recognize and generate hypotheses,

identify relevant experimental variables,

make logical deductions using empirical or observed evidence,

distinguish between causal and correlational relationships,

distinguish between scientific and non-scientific arguments,

weigh and assess the quality of scientific information,

derive generalizations from data, and

use generalizations to make predictions.

Quantitative reasoning is the ability to understand and communicate mathematical information. Specifically, quantitative reasoning includes the ability to:

interpret and make inferences from information presented in formulas, tables, graphs and charts,

represent mathematical information symbolically, visually, numerically and verbally,

use arithmetical, algebraic, geometric and statistical methods to solve problems,

recognize appropriate and inappropriate applications of mathematical and statistical models,

estimate and check/consider the reasonableness of numerical results,

use appropriate technology in the evaluation, analysis and synthesis of information in problem-solving situations, and

evaluate information, make logical deductions and arrive at reasonable conclusions.

Modified from guidelines proposed by the Mathematical Association of America.

GEA Task Force, rev 5/12008 Page 8


GENERAL EDUCATION ASSESSMENT TASK FORCE

GLOBAL SOCIOCULTURAL RESPONSIBILITY

DEFINITION

Global Sociocultural Responsibility is defined as the way of recognizing and responding to different ethnic and cultural groups by analyzing from multiple perspectives the means by which each group serves a global community that contributes to local, national, and environmental events and concerns of humanity.

An individual who is globally and socio-culturally informed and responsible has the ability to:

comprehend the historical, political, social, economic and cultural influences on the development of societies,

demonstrate an understanding of diversity within economic, social, cultural, civil and political infrastructure and its underlying value systems,

acknowledge the wide range of differences among individuals and the diversity among socioeconomic and global communities,

recognize the effect of historical and current perceptions, assumptions, and beliefs on individuals and groups,

recognize the value of contributing to the welfare of the community, and

understand the impact of individuals and groups on the local and global community.

Some outcomes were adapted from Central Florida Community College and Seminole Community College.

GEA Task Force, rev 5/12008 Page 9



FLORIDA COMMUNITY COLLEGE AT JACKSONVILLE
COMMUNICATION - RECEPTION (GEA Rubric Group Leaders - Lourdes Norman & Matt Mitchell)
INDICATORS OF EFFECTIVE RECEPTION OF COMMUNICATION
LEVELS OF ACHIEVEMENT: Level 1 (INITIATE), Level 2 (EMERGING), Level 3 (PROFICIENT), Level 4 (SKILLED)

LISTENING COMPREHENSION: The ability to interpret, evaluate, and react to something that is heard.

Level 1 (INITIATE): The student is unaware of basic information about what was communicated. He or she has not received the information needed. The student either demonstrates no response or responds completely off topic. The student may not be able to answer or may refuse to answer questions about what was communicated, and may not be able to appropriately contribute to a discussion.

Level 2 (EMERGING): The student knows something important was communicated, but does not know how to respond to it. The student may ask some basic questions about the topic discussed. The student inaccurately repeats what was communicated. The student may respond to what was communicated, but demonstrates no comprehension of the communication.

Level 3 (PROFICIENT): The student can repeat some literal parts of what was communicated but not all because the student does not understand the entire meaning of what was communicated. In the classroom, for example, the student knows a written or oral response about a certain topic is being requested, but does not understand what response should be made. The student can vaguely summarize main ideas of what was communicated. They are generally aware of the main messages but they are unable to completely comprehend the meaning of the conveyed message. The student may explain the literal meaning of the communicated information, but they demonstrate limited awareness as to the scope of what was communicated.

Level 4 (SKILLED): The student can accurately summarize the main ideas and implications of what was communicated. The student can accurately apply or demonstrate the intent of the written or oral message. Or, if the student is still unsure as to how to apply or demonstrate that intent, the student can ask probing questions or restate what was communicated in order to clarify his or her understanding, indicating a readiness to respond appropriately.

Prepared by kstearns, revised 4/23/2008 (1 of 3)

GEA Task Force, rev 5/12008 Page 11


INDICATORS OF EFFECTIVE RECEPTION OF COMMUNICATION
LEVELS OF ACHIEVEMENT: Level 1 (INITIATE), Level 2 (EMERGING), Level 3 (PROFICIENT), Level 4 (SKILLED)

READING COMPREHENSION: The ability to interpret, evaluate, and react to something that is read.

Main Idea, Details and Patterns of Organization:
Level 1 (INITIATE): Has difficulty determining and describing the main idea of a passage and neglects and/or inaccurately identifies the most important details that relate to the topic.
Level 2 (EMERGING): Can determine and describe the main idea of a passage and can identify a limited number of important details that relate to the topic.
Level 3 (PROFICIENT): Can correctly determine and describe the main idea of a passage and can identify some of the most important details that relate to the topic.
Level 4 (SKILLED): Can correctly determine, describe and analyze the main idea of a passage and can identify a majority of the most important details that relate to the topic.

Elements of Narrative:
Level 1 (INITIATE): Has difficulty recognizing and explaining many of the elements of a written work such as plot, setting, character, point of view, tone and theme. May be unfamiliar with the terms.
Level 2 (EMERGING): Can recognize and explain a limited number of the elements of a written work such as plot, setting, character, point of view, tone and theme.
Level 3 (PROFICIENT): Can recognize and explain many of the elements of a written work such as plot, setting, character, point of view, tone and theme.
Level 4 (SKILLED): Can recognize, explain and analyze most of the elements of a written work such as plot, setting, character, point of view, tone and theme.

Comparing and Contrasting Text Elements:
Level 1 (INITIATE): Is unable to determine how elements within a text are similar and/or different.
Level 2 (EMERGING): Is able to determine how one or two elements within a text are similar and/or different.
Level 3 (PROFICIENT): Is usually able to determine how elements within a text are similar and/or different.
Level 4 (SKILLED): Can consistently determine how elements within a text are similar and/or different.

Recognizing Cause and Effect Relationships:
Level 1 (INITIATE): Can identify a limited number of the things that happen within a text. May give inaccurate explanations of how/why these things have happened.
Level 2 (EMERGING): Can identify some of the things that happen within a text. Can give explanations for how/why some of the things have happened.
Level 3 (PROFICIENT): Can identify some of the things that happen within a text and can explain how/why these things have happened.
Level 4 (SKILLED): Can identify most of the things that happen within a text and can explain how/why these things have happened.

Gathering, Analyzing, and Evaluating Information from Different Sources:
Level 1 (INITIATE): Considers and analyzes the information from one or two sources. Unable to determine how the various sources are interrelated. Has difficulty evaluating the quality of the source.
Level 2 (EMERGING): Can analyze the information from a limited number of sources. May have difficulty determining how the various sources are interrelated. Has difficulty evaluating the quality of the source.
Level 3 (PROFICIENT): Can consider and analyze the information read from a number of sources and is generally able to determine how the various sources are interrelated. A level 3 reader may not be able to judge the relative quality of what was read.
Level 4 (SKILLED): Can consider and analyze the information read from a number of sources and is consistently able to determine how the various sources are interrelated. The reader will be able to judge the relative quality of what was read.

Synthesizing Information and Drawing Conclusions:
Level 1 (INITIATE): The reader is unable to evaluate information presented within the context of a text, or multiple texts. Has difficulty evaluating the information and making predictions.
Level 2 (EMERGING): Can evaluate a limited amount of information presented within the context of a text, or multiple texts. Can use the information to make predictions, which may not be accurate.
Level 3 (PROFICIENT): The reader is generally (although not consistently) able to evaluate information presented within the context of a text, or multiple texts, and then evaluate the information in order to make reasonable predictions.
Level 4 (SKILLED): Is consistently able to evaluate information presented within the context of a text, or multiple texts, and then evaluate the information in order to make reasonable predictions.

Prepared by kstearns, revised 4/23/2008 (2 of 3)

GEA Task Force, rev 5/12008 Page 12


INDICATORS OF EFFECTIVE RECEPTION OF COMMUNICATION
LEVELS OF ACHIEVEMENT: Level 1 (INITIATE), Level 2 (EMERGING), Level 3 (PROFICIENT), Level 4 (SKILLED)

VISUAL, SPATIAL AND GRAPHICAL COMPREHENSION: The ability to interpret, evaluate, and react to something that is observed and/or experienced.

Level 1 (INITIATE): The student is able to explore and identify objects in the received data; however, they either do not interpret or misinterpret the data conveyed by the visual object or the observed scenario. The student is unable to make connections to other information. The student cannot assess/judge the quality of the information conveyed.

Level 2 (EMERGING): The student is able to hypothesize about the received data. The student can answer basic questions about the data/information conveyed by the visual representation or observed scenario. The student may make some simple connections to other information. The student can make some, perhaps inaccurate, predictions/generalizations based on information from an observable representation (i.e., a graph, image, scenario, etc.). The student may be able to assess the quality of the information presented in the graph.

Level 3 (PROFICIENT): The student is able to manipulate the visually received data. They can answer most/all questions about the data/information conveyed by the observable data. They can use the information to make some connections with other information. The student can make some accurate generalizations and/or predictions based on the data. They can consistently make accurate assessments of the quality of the data.

Level 4 (SKILLED): Students are able to produce data similar to the received data. They show a thorough understanding of the information in the visually represented material (i.e., graph, image or observed process/scenario). They can consistently make connections with other information. The student can extend ideas presented and can consistently make accurate generalizations, predictions, and judgments based on the data. They consistently make judgments about the quality of the data.

Prepared by kstearns, revised 4/23/2008 (3 of 3)

GEA Task Force, rev 5/12008 Page 13



FLORIDA COMMUNITY COLLEGE AT JACKSONVILLE
COMMUNICATION - ORAL CONTENT (GEA Rubric Group Leaders - Joel Rappoport & Wayne Singletary)
INDICATORS OF EFFECTIVE ORAL COMMUNICATION - CONTENT
LEVELS OF ACHIEVEMENT: Level 1 (INITIATE), Level 2 (EMERGING), Level 3 (PROFICIENT), Level 4 (SKILLED)

INTRODUCTION: Connects with and engages the audience; establishes thesis; establishes credibility.

Level 1 (INITIATE): No attention-grabbing strategy is employed. Relevance of the presentation's topic and purpose is not apparent. Thesis statement is missing or unclear. Purpose and main idea of the presentation are not conveyed. Credibility is not established by the speaker.

Level 2 (EMERGING): Attention-grabbing device is used but does not adequately capture the audience's interest. Relevance of the presentation's topic and purpose is implied but not explicitly expressed. Thesis is implied but not clearly stated. Main points of the thesis are not previewed. Credibility is implied but not established by the speaker.

Level 3 (PROFICIENT): Effective strategy is used to captivate the audience's attention. Relevance of the presentation's topic and purpose is established. Thesis is explicitly stated. Topic is identified, and the main points of the presentation are previewed. Credibility is established by the speaker through the attention-grabbing device and thesis statement.

Level 4 (SKILLED): Creative, thought-provoking attention-grabbing device is used to introduce the topic. Relevance of the presentation's topic and purpose is skillfully established. Insightful, well-thought-out thesis is clearly and effectively stated. Topic is well-identified, and the main points of the presentation are thoroughly previewed. Credibility is well-established by the speaker through the attention-grabbing device and thesis statement.

ORGANIZATION: Logical sequence; flow; transitions; coherence.

Level 1 (INITIATE): Presentation is vague in structure. Presentation is hard to follow because of lack of transitions and illogical sequence.

Level 2 (EMERGING): Presentation structure is present but deviates from topic. Transitions and flow of ideas are sporadic and somewhat displaced.

Level 3 (PROFICIENT): Presentation has a consistent structure but occasionally loses focus of statements. An adequate sequence is followed.

Level 4 (SKILLED): Presentation has a consistent structure with a strong sense of purpose, while keeping audience interest and creating an exceptionally effective connection of ideas.

Prepared by kstearns, 4/10/2008 (1 of 2)

GEA Task Force, rev 5/12008 Page 15


INDICATORS OF EFFECTIVE ORAL COMMUNICATION - CONTENT
LEVELS OF ACHIEVEMENT: Level 1 (INITIATE), Level 2 (EMERGING), Level 3 (PROFICIENT), Level 4 (SKILLED)

SUBJECT KNOWLEDGE: Topic development (depth); supporting evidence; language appropriate to audience and subject; accuracy; relevance.

Level 1 (INITIATE): Presentation does not adequately develop the topic. Presentation lacks depth because of insufficient support and/or irrelevant details. The central idea may be unclear or unstated. Grammar and/or word choice are severely deficient and show little recognition of language appropriateness.

Level 2 (EMERGING): Presentation states a purpose but gives little evidence or incorrect evidence in support of that purpose. Isolated errors in grammar and/or word choice reduce clarity and credibility. Support is offered but inadequate for the purpose. Use of sources may hinder the presentation's effect or be incorrectly acknowledged.

Level 3 (PROFICIENT): Presentation clearly states the purpose. Content is accurate and relevant. Presentation is free of serious errors in grammar and/or word usage. Valid support is given for each assertion. Source information adds to the presentation and is correctly acknowledged.

Level 4 (SKILLED): Presentation not only states the purpose with clarity but also engages the listener with interesting and/or provocative information. Presentation is free of grammar errors; the word choice provides clarity and increases interest. The content and style are consistently appropriate and targeted to the audience. Details strongly support each assertion, and the content is thorough. Use of sources increases the presentation's effectiveness, adds to the overall presentation, and is correctly acknowledged.

CONCLUSION: Summary; profound ending.

Level 1 (INITIATE): Presentation summary is not identifiable. Reference to the main idea is not apparent. Concluding device is not used. Presentation ends abruptly.

Level 2 (EMERGING): Presentation summary is unclear. Main idea is referenced. Concluding device is made but is not impactful.

Level 3 (PROFICIENT): Presentation summary is employed, and reference to the main idea is clearly stated. Concluding device adds to the presentation.

Level 4 (SKILLED): Presentation summary and main idea are creatively and clearly employed. Concluding device is timed perfectly and makes the end of the presentation impactful.

Prepared by kstearns, 4/10/2008 (2 of 2)

GEA Task Force, rev 5/12008 Page 16


FLORIDA COMMUNITY COLLEGE AT JACKSONVILLE
COMMUNICATION - ORAL - DELIVERY (GEA Rubric Group Leader - John Wall)
INDICATORS OF EFFECTIVE ORAL COMMUNICATION - DELIVERY
LEVELS OF ACHIEVEMENT: Level 1 (INITIATE), Level 2 (EMERGING), Level 3 (PROFICIENT), Level 4 (SKILLED)

VOCAL

Fluency in delivery. Level 1: Presentation has many "uh/um" fillers and distracting pauses throughout. Level 2: Presentation has several "uh/um" fillers and stammering. Level 3: Most of presentation flows smoothly, with just a few "uh/um" fillers and hesitations. Level 4: Presentation flows smoothly throughout, with few or no disfluencies.

Volume. Level 1: Poor projection, making presentation difficult to hear from any audience position. Level 2: Inadequate projection, making presentation audible for those close to speaker but inaudible to those in back. Level 3: Voice projection is adequate overall, though some parts of presentation are difficult for distant parts of audience to hear clearly. Level 4: Projects voice clearly to entire audience.

Enunciation. Level 1: Indistinct articulation throughout presentation. Level 2: Many words indistinct. Level 3: Most words clearly articulated. Level 4: Clear and distinct enunciation throughout.

Rate of delivery. Level 1: Delivery rate (either too fast or too slow) is a primary distraction. Level 2: Delivery rate (either too fast or too slow) is distracting at times. Level 3: Delivery rate does not distract from presentation. Level 4: Delivery rate is comfortable for audience and adapted as needed for effective presentation.

Inflection. Level 1: Monotone. Level 2: Mechanical with unnatural-sounding inflections. Level 3: Some expression. Level 4: Conversational.

Pronunciation. Level 1: Significant mispronunciations distract or confuse audience. Level 2: A few mispronunciations that are distracting, but do not disrupt content. Level 3: Correct for most words. Level 4: Correct standard English.

NON-VERBAL

Eye contact. Level 1: Focused on notes. Level 2: Focused on teacher. Level 3: Glances at audience. Level 4: Equal and sustained.

Facial expression. Level 1: Deadpan, expressionless. Level 2: Some expression present, but comes across as forced. Level 3: Expression is appropriate and does not distract from presentation. Level 4: Expression enhances presentation.

Posture. Level 1: Unstable and visually distracting. Level 2: Ongoing indicators of discomfort (e.g., withdrawn posture, nervous foot movement). Level 3: Stable overall with minor distractions. Level 4: Stable, confident.

Gestures. Level 1: Few or none. Level 2: A few, mechanical. Level 3: Some expressive gestures. Level 4: Many enhancing gestures.

Proxemics. Level 1: No creative use of space. Level 2: Stays at podium. Level 3: Controls space. Level 4: Creative use of space.

Attire. Level 1: Inappropriate, to the point of distracting audience. Level 2: Mismatched to context of presentation (e.g., too casual, revealing). Level 3: Adequate and not distracting. Level 4: Well matched to context.

AUDIENCE ADAPTATION. Level 1: Presentation is insensitive to audience's prior knowledge or attitudes. Level 2: Some attempt to tailor presentation to audience, with limited effectiveness. Level 3: Presentation shows clear adaptation to audience. Level 4: Speaker uses audience cues during the presentation to conceive and deliver the speech to maximum effect.

Prepared by kstearns, 4/10/2008 (1 of 1)

GEA Task Force, rev 5/12008 Page 17



FLORIDA COMMUNITY COLLEGE AT JACKSONVILLE
COMMUNICATION - WRITTEN (GEA Rubric Group Leaders - David Bowen & Youlanda Henry)
INDICATORS OF EFFECTIVE WRITTEN COMMUNICATION
LEVELS OF ACHIEVEMENT: Level 1 (INITIATE), Level 2 (EMERGING), Level 3 (PROFICIENT), Level 4 (SKILLED)

CONTENT

Appropriate topic development with a clear and coherent focus:
Level 1 (INITIATE): The writing is illogical or the author's point of view is not discernible.
Level 2 (EMERGING): The writing relates to the thesis; there is evidence of some critical thinking.
Level 3 (PROFICIENT): The point of the essay is clear to the reader; demonstrates competent critical thinking.
Level 4 (SKILLED): The point of the writing is clear and perceptive; demonstrates outstanding critical thinking.

Support/evidence is credible, adequate, relevant, logical:
Level 1 (INITIATE): The writing lacks a development of ideas, reasons, or support.
Level 2 (EMERGING): The essay lacks clarity; lacks adequate support or full development of ideas; content or development of ideas may be inconsistent.
Level 3 (PROFICIENT): The writing has adequate support and development of ideas.
Level 4 (SKILLED): The writing is well-developed with sound reasoning and details.

Originality of ideas:
Level 1 (INITIATE): The writing is off-topic or deviates from the assignment in an unacceptable way.
Level 2 (EMERGING): The writing lacks originality; tends toward general statements and lacks a progression of ideas.
Level 3 (PROFICIENT): The writing demonstrates a logical progression of ideas.
Level 4 (SKILLED): The writing is well-crafted and contains a variety of examples that are original and fresh.

ORGANIZATION

Appropriate structure:
Level 1 (INITIATE): No clear objective (thesis) is established.
Level 2 (EMERGING): A thesis is apparent but not clarified.
Level 3 (PROFICIENT): A clear thesis is established.
Level 4 (SKILLED): A precise and matured thesis is established.

Coherence in the progression of ideas in support of the thesis:
Level 1 (INITIATE): Any paragraphing (if provided) is jumbled: ideas and supporting examples are lacking or underdeveloped and blended with little regard to providing a fluid, logical progression. No discernible formal conclusion is provided. Transitions are abrupt. The writing lacks a connective tether that links its parts. Rambling and incoherence define the writing.
Level 2 (EMERGING): Supportive ideas and examples, though provided, are housed in an imbalanced or disjointed manner. The conclusion is underdeveloped or marked by redundancies. A connective tether does exist but links inconsistent parts. An uneven order defines the writing.
Level 3 (PROFICIENT): Ideas and relevant examples are housed within orderly, developed paragraphs. Ideas and examples are sequenced in a logical progression. Adequate transitions are utilized. The writing is brought to completion with a comprehensive conclusion. A connective tether links balanced parts.
Level 4 (SKILLED): Ideas and examples are fluidly arranged within developed paragraphs. Transitions are seamless. The conclusion satisfyingly completes the momentum that precedes it. The writing forms an organic whole. Sound reasoning defines the writing.

Prepared by kstearns, 4/10/2008 (1 of 2)

GEA Task Force, rev 5/12008 Page 19


INDICATORS OF EFFECTIVE WRITTEN COMMUNICATION
LEVELS OF ACHIEVEMENT: Level 1 (INITIATE), Level 2 (EMERGING), Level 3 (PROFICIENT), Level 4 (SKILLED)

LANGUAGE

Appropriate tone and word choice, clarity:
Level 1 (INITIATE): Tone is inappropriate for the specific audience and purpose. Level 2 (EMERGING): Tone shows some awareness of audience and purpose, though some choices may be inappropriate for the writing task. Level 3 (PROFICIENT): Tone is appropriate for the purpose and audience. Level 4 (SKILLED): Tone is well-suited to the purpose and audience.
Level 1 (INITIATE): Language choices are limited, inadequate, or inaccurate. Level 2 (EMERGING): Language choices are adequate; occasional errors in diction or usage may interfere with meaning. Level 3 (PROFICIENT): Language choices are appropriate and accurate. Level 4 (SKILLED): Language choices are precise and purposeful, and in some instances, fresh and inventive.

Style and craftsmanship:
Level 1 (INITIATE): Syntactical errors or ambiguous wording may lead to confusion. Level 2 (EMERGING): Occasional errors in syntax or problems with wording may lead to ambiguity or interfere with readability. Level 3 (PROFICIENT): Sentences are generally clearly worded, though there may be minor problems with syntax or omissions; these minor errors do not interfere with readability. Level 4 (SKILLED): Sentences are clearly worded and unambiguous.
Level 1 (INITIATE): Sentence structure is simplistic or disjointed. Level 2 (EMERGING): Sentence structure lacks variety and tends to be mechanical. Level 3 (PROFICIENT): Sentence structure is varied. Level 4 (SKILLED): Sentence structure is both varied and sophisticated.

PURPOSE / AUDIENCE AWARENESS

Level 1 (INITIATE): Lacks awareness of purpose and audience. Level 2 (EMERGING): Little awareness of audience. Level 3 (PROFICIENT): Most of the essay relates to the purpose and audience. Level 4 (SKILLED): Effective and skilled awareness of the audience and assignment purpose.

Content and language is suited to the task (descriptors from lowest to highest level): Content and language is repetitious and reader loses interest. Reader does not learn much about the topic. Writer may wander but the reader still learns about the topic. Entire essay relates to the topic and to the audience.

Sensitivity. Level 1 (INITIATE): Little or no audience connection. Level 2 (EMERGING): Little or no purpose is established. Level 3 (PROFICIENT): Much of the essay is well organized in its presentation. Level 4 (SKILLED): Strong organization and purpose.

Connectivity. Level 1 (INITIATE): Rambles from one idea to next. Level 2 (EMERGING): Strays in purpose and organization. Level 3 (PROFICIENT): Much of the essay is consistent in structure and purpose. Level 4 (SKILLED): Essay is appropriate and clear and the audience can learn much about the topic.

CONVENTIONS

Grammar. Level 1 (INITIATE): Numerous errors can be found in sentence structure, grammar, and mechanics. Level 2 (EMERGING): Some grammatical problems are evident. Level 3 (PROFICIENT): Errors in grammar and mechanics are rare and insignificant. Level 4 (SKILLED): Errors in grammar and mechanics are nonexistent or insignificant.

Spelling. Level 1 (INITIATE): Spelling and/or word choice errors may be abundant. Level 2 (EMERGING): Spelling and word choice errors are evident, but not abundant. Level 3 (PROFICIENT): Spelling and word choice errors are not evident. Level 4 (SKILLED): Spelling and word choice are correct and appropriate to college-level writing.

Punctuation.

Form - Appropriate procedure.

Prepared by kstearns, 4/10/2008 (2 of 2)

GEA Task Force, rev 5/12008 Page 20


GENERAL EDUCATION ASSESSMENT ACTION PLAN

2008-2009

DATE: Friday 9/26/08
TIME: 8:00 – 12:00 Noon
ROOM: DEERWOOD
GOAL: To inform the college community on the college wide general education assessment plan to include faculty's selection of the appropriate tool(s) to assess the communication skills learning outcome. To provide a report detailing whether the MAPP data provides the appropriate information on which to make curricular decisions. To provide an opportunity for interdisciplinary groups of faculty to design and develop a maximum of three pilots that would focus on assessing the general education communication skills student learning outcomes using the chosen assessment tools.
EXPECTED OUTCOME: The attendees will be more knowledgeable of the FCCJ general education assessment plan and of tool(s) to be used to assess the communication skills learning outcome. To analyze and evaluate the MAPP data in relation to the assessment program's desired outcomes. Creation of three pilots for implementation in the Spring of 2009 that would use the chosen assessment tools to assess the general education communication skills student learning outcome.

DATE: Thursday 10/23/08
TIME: 1:00-4:30 p.m.
ROOM: DEERWOOD
GOAL: To provide an opportunity for inter-disciplinary groups of faculty to develop rubrics to assess information literacy skills and critical thinking skills.
EXPECTED OUTCOME: Create rubrics to assess student achievement of the general education information literacy skills and critical thinking skills learning outcomes.

DATE: Friday 11/21/08
TIME: 8:00 – 12:00 Noon
ROOM: DEERWOOD
GOAL: To select the appropriate tool(s) to assess the general education information literacy skills and critical thinking skills learning outcomes. To provide an opportunity for inter-disciplinary groups of faculty to develop rubrics to assess scientific and quantitative reasoning skills and global socio-cultural responsibility.
EXPECTED OUTCOME: Choose appropriate tool(s) to assess information literacy skills and critical thinking skills using the agreed upon rubrics as a guide. Create rubrics to assess student achievement of the scientific and quantitative reasoning skills and global socio-cultural responsibility general education learning outcomes.

DATE: Thursday 1/29/09
TIME: 1:00-4:30 p.m.
ROOM: DEERWOOD
GOAL: To select the appropriate tool(s) to assess the general education scientific and quantitative reasoning skills and global socio-cultural responsibility learning outcomes.
EXPECTED OUTCOME: Choose appropriate tool(s) to assess the general education scientific and quantitative reasoning skills and global socio-cultural responsibility learning outcomes using the agreed upon rubrics as a guide.

GEA Task Force, rev 5/12008 Page 21


GENERAL EDUCATION ASSESSMENT ACTION PLAN

2008-2009

DATE: Friday 2/27/09
TIME: 8:00 – 12:00 Noon
ROOM: DEERWOOD

DATE: Thursday 3/26/09
TIME: 1:00 – 4:00 p.m.
ROOM: DEERWOOD

DATE: Friday 4/24/09
TIME: 8:00 – 12:00 Noon
ROOM: DEERWOOD

GEA Task Force, rev 5/12008 Page 22


GENERAL EDUCATION ASSESSMENT (GEA) TASK FORCE*

TO DO LIST

1. Determine the method of assessing the communication skills student learning outcome.

2. Develop and implement a maximum of three (3) pilots for assessing the general education communication skills learning outcome.

3. Develop Rubrics for Information Literacy.

4. Determine the method of assessing the Information Literacy learning outcome.

5. Determine the date to begin assessing Information Literacy.

6. Develop Rubrics for Critical Thinking Skills.

7. Determine the method of assessing the Critical Thinking Skills learning outcome.

8. Determine the date to begin assessing Critical Thinking Skills.

9. Develop Rubrics for Scientific and Quantitative Reasoning.

10. Determine the method of assessing the Scientific and Quantitative Reasoning learning outcome.

11. Determine the date to begin assessing the Scientific and Quantitative Reasoning learning outcome.

12. Develop Rubrics for Global Socio-Cultural Responsibility.

13. Determine the method of assessing the Global Socio-Cultural Responsibility learning outcome.

14. Determine the date to begin assessing Global Socio-Cultural Responsibility.

15. Finalize the 2008-2009 time line for completing tasks.

16. Design the FCCJ General Education Student Learning Outcomes Assessment Program.

17. Finalize the schedule for implementation of the assessment program.

18. Identify process design and implementation procedures related to the collection, storage, retrieval, and assessment of artifacts and reporting outcomes to the appropriate agencies.

* With the help of the faculty and academic administrators

GEA Task Force, rev 5/12008 Page 23


RECOMMENDATIONS

I. TASK FORCE

A. Composition

1. That the GEA Task Force be composed at a minimum as follows:

Faculty Senate President or designee – liaison to Senate and the Center for the Advancement of Teaching and Learning

One faculty leader for each of the five liberal arts areas

Two at-large liberal arts faculty

Two workforce faculty

One librarian

One Campus President

AVP for Liberal Arts and Sciences

AVP for Workforce

Two liberal arts academic deans

One workforce dean

Executive Dean of the Virtual College

Director of Program Development, Liberal Arts and Sciences

One Research Analyst

An assistant to the Task Force

Director of Instructional Research

B. Meetings

It is recommended that the GEA Task Force:

1. Establish its meeting schedule for the entire academic year 2008-09 at its first meeting in the fall of 2008.

2. Meet twice per month September through November and once in December; twice a month January through April; and schedule additional meetings as necessary.

3. Present to the faculty and academic administrators an end-of-year summary report in April 2009.

4. Meet with the GER Sub-Committee at the end of the fall term to present a mid-year report and at the end of the spring term to present a summary report for the year and seek input and recommendations for inclusion in the end-of-year report to the Executive Vice President.

GEA Task Force, rev 5/12008 Page 24


5. Meet with the Executive Vice President at the end of the spring term and submit an end-of-year report including any recommendations.

6. Submit to the College Curriculum Committee the end-of-year final report including any recommendations.

7. At its meeting in the fall of 2008, review the tasks to be accomplished in 2009, finalize the time line, and identify specific responsibilities to be assigned to its members.

II. Academic Assemblies

A. Composition

1. Faculty

2. Academic Administrators

3. Others as appropriate

B. Meetings

1. One per month September through November.

2. One per month January through April.

C. Process

1. Send out meeting notices to all faculty, all adjuncts, IAC and other academic administrators as appropriate of college wide academic assemblies.

2. Create multi-disciplinary groups.

3. Use group numbers and color coding when creating multi-disciplinary

groups.

4. Provide to the groups the necessary handouts and written directions to

maximize time utilization.

5. Allow for group summary reports by working group leaders rather than

task force members.

6. Collect, categorize, summarize, and electronically report the assembly

results to attendees, and other appropriate groups as necessary.

7. Use Blackboard format to refine and finalize rubrics as necessary.

8. Distribute electronically mid- and end-of-year summary reports to all

faculty and academic administrators.

9. Create multi-media illustrations as appropriate.

10. Keep and maintain the task force’s web site

11. Provide for open forums as necessary and video record.


III. Task Force Data Support

A. That the AVP for Liberal Arts and Sciences, in collaboration with the Task Force faculty chair, appoint a task force sub-committee to crosswalk MAPP categories and proficiency levels with FCCJ's learning outcomes definitions and rubric indicators.

B. That the AVP for Liberal Arts and Sciences, in collaboration with the Task Force faculty chair, appoint an on-going committee to evaluate the currency and validity of existing rubrics and recommend updates and revisions to the Task Force.

C. That the AVP for Liberal Arts and Sciences, in collaboration with the Task Force faculty chair, appoint a sub-committee to seek, collect, analyze, and evaluate systems available to gather, store, retrieve, and report assessment data in the manner recommended by the Task Force, as well as the technology needed to successfully implement an FCCJ general education student learning outcomes program.

D. That the AVP for Liberal Arts and Sciences, in collaboration with the Task Force faculty chair, appoint a sub-committee to design the process for determining the types of artifacts to be assessed, the population to be assessed, the point at which artifacts will be assessed (e.g., work from students who have completed 30 credits? more than 45 credits?), who will be conducting the assessment, how the data will be grouped (e.g., qualitatively, quantitatively?), the reporting format, to whom the data will be reported (including the State and SACS), and how the data will be incorporated into the curriculum review process to revise and improve general education learning outcomes.

E. That the task force determine how general education learning outcomes assessment pilots will be solicited from faculty and the criteria for selecting pilot proposals.

F. That a task force member be appointed as the liaison to faculty groups conducting the Task Force's sponsored general education learning outcomes assessment pilots during the spring term of 2009 and provide pilot progress reports to the task force.

IV. General Education Program and Curriculum Revision Process

A. That the process of reviewing general education courses be conducted when the major General Education Requirements review begins in academic year 2009-2010, in order to align course outcomes with FCCJ's five general education learning outcomes definitions and rubrics and to revise/update the curriculum map as applicable to ensure that the five statewide areas are included in the Learning Outcomes Assessment (LOA) section of the course outlines.

B. That the review and possible revision of rubrics, indicators, and levels of achievement begin in academic year 2010-2011 and every three years thereafter. Revision recommendations will be submitted to the GEA Task Force.

C. That learning outcomes assessment results be incorporated into the three-year course outline review cycle.

V. Concerns Expressed by Faculty

A. Cost of developing a learning outcomes assessment project - will it be funded after the program is designed, or will it be set aside for lack of financial commitment?

B. If testing is chosen as one of the methods, or the sole method, of assessment, there is a concern about the allocation of class time to administer tests or carry out other aspects of the assessment plan.

C. Who will be charged with doing all this work when class size has increased tremendously and faculty are still expected to participate in professional development activities and service to the campus and College? Who will make this decision?

D. If e-portfolios are to be used, who will be responsible for obtaining and processing artifacts (e.g., uploading), students or faculty? Who will make this decision?

E. Are 100% of students to be assessed, or will it be a randomized percentage? Who will make this decision?

F. How many outcomes will be assessed at once? How often will we assess? Will assessment of a given number of outcomes be rotated? Who will make these decisions?

G. How are we going to report results? Are we going to report the results as they stand (e.g., communication skills outcome – 30% at such and such level, 10% at such and such level), or are we going to set a numerical or qualitative standard first and then evaluate whether the standard has been met? How would a standard be set? Who will make these types of decisions? How will this information be used?

H. Will results be presented as aggregated data, or will specific courses and sections be identified? If so, will the results be used for faculty evaluation?

I. Will there be training available to help faculty incorporate learning outcomes expectations into courses by helping faculty design appropriate assignments? Will there be training available to help faculty learn how to properly use rubrics?


GENERAL EDUCATION ASSESSMENT TASK FORCE

2009

FACULTY

Sheri Brown, Librarian, DTC

Youlanda Henry, Communications, NC

Patty Lee, Workforce, GER Committee Co-Chair, NC

Bill Meisel, Mathematics, DTC

Nancy Mullins, Faculty Senate President, SC

Lourdes Norman, Science, KC

Joel Rappoport, Mathematics, SC

Wayne Singletary, Workforce, KC

Andrea Thaxton, Humanities, KC

John Wall, Social and Behavioral Sciences, SC

ADMINISTRATION

Maggie Cabral-Maly, Kent Campus President

Lynne Crosby, Director of Program Development, Liberal Arts and Sciences

Julie Giuliani, Executive Dean of the Virtual College

Mike Reynolds, Associate Dean, Mathematics and Natural Sciences, KC

Jim Simpson, AVP for Workforce Development and Adult Education

Charles Smires, Dean of Liberal Arts, SC

Karen Stearns, Research Analyst

Jennifer Stoetzer, Task Force Assistant

Nancy Yurko, AVP for Liberal Arts and Sciences

CO-CHAIRS: Youlanda Henry, Faculty Chair

Nancy Yurko, Administrative Chair

NOTE: Other faculty members and/or academic administrators may be added if needed.


Prepared by L. Crosby

Florida Community College at Jacksonville

General Education Assessment Pilot

MAPP Test Abbreviated Form

Spring 2008

Revised April 28, 2008

Contents:

Overview and Purpose

Overview of Reports

Available Data

Student Sampling and Use of FCCJ's Pilot Data

Use of MAPP Tests

Lessons Learned

Appendices

Overview and Purpose

In Spring 2008, FCCJ administered the Measure of Academic Proficiency and Progress (MAPP) Test paper and pencil Abbreviated Form as a pilot assessment activity. This activity was sponsored by the Center for the Advancement of Teaching and Learning as an educational research project and by the FCCJ General Education Assessment Task Force as a pilot for the General Education Assessment Project.

The purpose of this pilot was to get a sense of the paper and pencil test administration and testing experience, as well as to see what types of data and results the institution could use for the General Education Assessment project.

The Measure of Academic Proficiency and Progress (MAPP) test is a measure of college-level reading, mathematics, writing, and critical thinking in the context of the humanities, social sciences, and natural sciences. The MAPP test is designed for colleges and universities to assess their general education outcomes, so they may improve the quality of instruction and learning. It focuses on the academic skills developed through general education courses, rather than on the knowledge acquired about the subjects taught in these courses (ETS).

"[T]he reading questions and the critical thinking questions represent all three of the academic contexts - humanities, social sciences, and natural sciences - but the number of questions from each academic context in each subform can differ" (ETS, 2007, p. 7). See Appendix 1 for the definitions of the skill areas tested by the MAPP. Please note that FCCJ elected not to administer the essay option for the test, which is a computer-based writing assessment scored by the ETS 'e-rater' computer program. For the mathematics questions, students were advised that they could choose to bring and use a four-function, scientific, or non-graphing calculator.

The Abbreviated form provides too small a sample of each student's performance to permit the reporting of individual scores (except for total scores). A student who takes the Abbreviated form is actually taking only one-third of the test, and the individual scores are not a reliable indication of the scores the student would have received on the full standard test.

Overview of Reports

The MAPP Test Abbreviated Form is not intended to provide information about individual students. It is intended to provide information about groups of at least 50 students.

ETS provides criterion-referenced and norm-referenced reports.

Summary of Scaled Scores for a group of at least 50 students, which can be compared to other groups within the institution or with a large combined group of students from several other institutions using the Comparative Data Guide. The Summary of Scaled Scores shows the mean scores for groups; however, this information is not as useful for the General Education Assessment project.

Summary of Proficiency Classifications for a group of at least 50 students (see Appendix 2), which can be compared to other groups within the institution or with a large combined group of students from several other institutions using the Comparative Data Guide. "Proficiency level includes the percentage of students [in a group] classified as proficient, marginal, and not proficient" (ETS, 2007, p. 13). Proficiency levels include a weighting of the questions by ETS. Different subscores may not be comparable to each other (i.e., do not compare students' or groups' reading scores to writing scores). However, the cut scores for each skill area in the Abbreviated form should be fairly comparable to the cut scores for the same skill area in the Standard Form.

Available Data

Raw data for cohorts can be downloaded into MS Excel for additional analysis by the institution. However, when using the Abbreviated form, student identifiers such as names or student IDs cannot be downloaded. "[I]t is not possible to compare data from the Abbreviated form administrations to other academic indicators for individual students, such as class grades, GPA, or other test scores" (ETS, 2007, p. 14). The nature of the data does allow us to compare scores and proficiency classifications with academic indicators such as GPA or credit hours earned, without the ability to link back to individual students, instructors, or reference numbers. The demographics in the data are based on students' self-reporting during the test administration and are not validated or linked to actual student records in Orion.
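To illustrate the kind of group-level analysis this raw data could support, the sketch below tabulates proficiency classifications by self-reported credit-hour bands. It is only a sketch under assumptions: the file name and column names (mapp_raw_data.xlsx, CreditHours, ReadingProficiency) are hypothetical placeholders and do not reflect the actual layout of the ETS download.

    # Hypothetical sketch: summarize proficiency classifications by self-reported
    # credit-hour band from a raw cohort export. File and column names are assumed,
    # not the actual ETS field names.
    import pandas as pd

    df = pd.read_excel("mapp_raw_data.xlsx")  # raw cohort data saved from the download

    # Band the self-reported credit hours into rough progress groups.
    df["CreditBand"] = pd.cut(
        df["CreditHours"],
        bins=[0, 15, 30, 45, 200],
        labels=["0-15", "16-30", "31-45", "46+"],
        include_lowest=True,
    )

    # Percentage of each band classified as proficient, marginal, or not proficient
    # on one skill area (reading, in this sketch).
    summary = (
        pd.crosstab(df["CreditBand"], df["ReadingProficiency"], normalize="index")
        .mul(100)
        .round(1)
    )
    print(summary)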

FCCJ included additional demographic questions related to program of study and general education requirements. This information will be downloaded to create additional reports and analyze results. However, the raw data from the additional demographic questions may be available only in scaled scores and not in proficiency classifications. Further clarification is needed from ETS.

Student Sampling and Use of FCCJ's Pilot Data

FCCJ did not use a random sample or spaced sample for this pilot. As mentioned previously, the purpose was to get a sense of the paper and pencil test administration and testing experience, as well as to see what types of data and results the institution could use for the General Education Assessment project. For example, no distance learning classes were included in the pilot. However, ETS does offer a web-based testing option for the MAPP in both a proctored and unproctored administration.

The FCCJ sample included two groups of students: Group 1 included students enrolled in a course that students tend to take in their first or second semester of the associate degree (specifically ENC 1101 English Composition I), and Group 2 included students enrolled in courses that FCCJ students tend to take in their last semester of college-level coursework in the associate degree. Please note that there may be students enrolled in Group 2 courses who are not actually at the end of their associate degree program. The demographic data include self-reported responses about the number of credit hours a student has earned and the amount of the general education program the student has completed. (Faculty who were going to administer the test were asked to announce the test administration at least one day prior to the test date and request that students bring this demographic information with them.) These data will help in analyzing the results in more useful demographic groups than what may result from a comparison and analysis of Group 1 and Group 2. However, some students may not have completed any of their general education categories at the time of the test administration, and each program may have different credit hour requirements to satisfy general education.

Furthermore, the specific class sections of interest for the study were identified initially from a transcript review and student record analysis. Then full-time and part-time faculty members teaching these sections were invited to participate. Ultimately, the sections in which the test was administered were those of faculty volunteers who were teaching Session 1 (A16), A12, and B8 classes in a total of 40 'face-to-face' class sections across multiple campuses and centers - North, Nassau, Downtown, South, Deerwood, and Kent.

The test was administered in regularly scheduled class periods of the volunteer faculty members' classes. Students were informed that the test results would not be linked to them as individual students or become a part of their academic records. Faculty were surveyed after the test administration (see Appendix 3) to inquire about methods of motivating students to take the test, whether a test proctor would have been helpful in their classroom, and suggestions for test administration processes and instructions if the College decided to administer the MAPP test again. Faculty used a variety of means to motivate students to participate in the MAPP test, such as extra credit points, class participation points, and the ability to help shape the assessment project.

In this pilot, 457 students were enrolled in a total of twenty-one Group 1 class sections, and 360 (79%) of those students actually participated in the MAPP test administration; 494 students were enrolled in a total of nineteen Group 2 class sections, and 364 (74%) of those students actually participated in the MAPP test administration. Some students were absent the day of the test or had dropped/withdrawn from the class by the time the test was administered. In a few cases, a student was enrolled in two class sections where the test was administered and was instructed to take the test only once. In all, 723 students, or 76% of the total enrolled student population of the volunteer faculty class sections, participated in the MAPP test administration.
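As a quick arithmetic check on the figures above, the short sketch below recomputes each participation rate from the enrollment and participant counts given in the text.

    # Recompute the participation rates reported above; counts are taken
    # directly from the narrative (the overall figures are 951 enrolled and
    # 723 participants across all volunteer sections).
    groups = [
        ("Group 1", 457, 360),
        ("Group 2", 494, 364),
        ("Overall", 951, 723),
    ]

    for label, enrolled, tested in groups:
        print(f"{label}: {tested}/{enrolled} = {tested / enrolled:.0%}")
    # Prints approximately 79%, 74%, and 76%, matching the percentages above.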

ETS advises that "it may be adequate to test only a sample of students - but only if the sample is selected in such a way that students taking the test will not differ systematically from those who are not tested…. Students enrolled in particular courses tend to differ systematically from students not enrolled in those courses…. The smaller the sample of students from a subgroup, the less likely that the statistics will generalize to the entire subgroup…. The greater the proportion of the students in your sample who do not take the test, the more your sample is likely to differ from the population" (ETS, 2007, pp. 20-21).

"The reliability of scores for small groups would be appropriate for evaluating curriculum but not appropriate for teacher evaluation or for group-to-group comparisons, due to influence of very high or very low scores of individual students and group differences" (ETS, n.d.). Therefore, use of some type of random sample would allow us to make inferences about the larger college credit student population at the institution.
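A minimal sketch of how such a random sample might be drawn is shown below, assuming a flat list of eligible class sections; the section identifiers, sample size, and seed are hypothetical, and the actual design (simple random, stratified by campus or program, etc.) would still need to be decided by the Task Force.

    # Hypothetical sketch: draw a simple random sample of class sections for a
    # future administration. Section IDs, sample size, and seed are placeholders.
    import random

    # Eligible sections would come from the transcript and student-record review
    # described earlier; these identifiers are invented for illustration only.
    eligible_sections = [f"SEC-{n:03d}" for n in range(1, 121)]

    random.seed(2009)  # fixed seed so the draw can be reproduced and documented
    sampled_sections = sorted(random.sample(eligible_sections, k=40))
    print(sampled_sections)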

If a random sample is used and one group is to be compared with another, ETS advises that subscores and proficiency classifications be examined as well as total scores, to determine whether significant differences in group performance might be due to curricular choices as well as student achievement.
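One common way to examine whether two groups' proficiency classifications differ by more than chance is a chi-square test on the classification counts; the sketch below is offered only as an illustration of that general approach (the counts are invented), not as the method ETS prescribes.

    # Hypothetical sketch: compare two groups' proficiency classification counts
    # (proficient, marginal, not proficient) with a chi-square test of independence.
    # The counts are invented; real counts would come from each group's ETS
    # Summary of Proficiency Classifications.
    from scipy.stats import chi2_contingency

    counts = [
        [45, 120, 195],  # Group 1: proficient, marginal, not proficient
        [80, 130, 154],  # Group 2: proficient, marginal, not proficient
    ]

    chi2, p_value, dof, expected = chi2_contingency(counts)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
    # A small p-value suggests the classification profiles differ, but it cannot
    # say whether curriculum or the mix of students accounts for the difference.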

It is important to note that we do not know if some or all of the students completed the test with their best effort. Some students may not have approached the test with the goal of an optimal performance.

Use of MAPP Tests

ETS indicates that colleges and universities typically use MAPP tests for the following reasons:

Standard Form

Measure growth in specific types of skills reflected in subscores or proficiency classifications

Trend indicator of scaled scores (different students over time, but same overall groups)

Comparison with other institutions of scaled scores and proficiency classifications of groups (number of credit hours, and institution type/Carnegie)

Counseling tool for assisting in placement into courses

Recruitment aid – to identify students who are likely to benefit from the institution's instructional programs

Abbreviated Form

Measure growth or change in overall test score only

Trend indicator of scaled scores (different students over time, but same overall groups)

Comparison with other institutions of scaled scores and proficiency classifications of groups (number of credit hours, and institution type/Carnegie).

Lessons Learned

Abbreviated vs. Standard Test Forms – the Standard form is two hours long, compared to the 40-minute Abbreviated form. Selecting the Abbreviated form allowed the test to be administered in 50-minute class sessions but did not provide as thorough reports due to the nature of the test. (Apparently the Standard form can be administered in two sittings; this option needs to be investigated further.) If the College were to continue to use the Abbreviated form and add the writing essay option, the test would have to be administered outside of class or in class sessions longer than 50 minutes.

ETS Demographics vs. FCCJ Demographics (GER and POS) –

ETS includes communication courses in the Humanities General Education;

ETS groups AA and AS together; and

ETS includes many other names of program groups that would confuse our students and fail to provide us with useful data.

Additional Questions – to overcome the issues mentioned in the previous bullets, FCCJ developed additional demographic questions and modified the instructions for the ETS demographics. At this time, ETS does not provide reports with proficiency classifications broken out by students' responses to the additional demographic questions; we received only scaled scores. However, it is possible to download this raw data for further analysis.

Administration issues – the coordination and administration of the paper and pencil Abbreviated MAPP test in class sections, as opposed to a testing center, was time consuming. ETS expects high standards of test security, and the test proctoring instructions and scripts must be communicated to, and adhered to by, each campus and the volunteer faculty members. Test inventories were conducted prior to test distribution and upon receipt of completed tests from the volunteer faculty members. Administering the test during class time took time away from instruction. Since the paper and pencil format was selected, distance learning courses were not included in the test administration.

Sampling – it was not helpful to split students into two groups (ENC 1101 and courses that students tend to take at the end of their program); if FCCJ would like to develop a baseline, it is necessary to design a sampling procedure that can be replicated in future years and will yield informative results.

Student motivation – faculty were asked to announce the MAPP test in the class session prior to the day it would be administered. This could, in part, account for the 76% completion rate. About one quarter of the students did not take the test due to absence or due to a change in their enrollment status in the class (i.e., a student may have withdrawn from the class prior to the announcement of the MAPP). In addition, for those students who completed the test, it is unknown how seriously they approached it. Some faculty offered extra credit for completing the test and others did not.

ETS definitions of outcomes vs. FCCJ definitions – the MAPP test was selected for a pilot prior to FCCJ defining its communication skills, critical thinking skills, and quantitative reasoning outcomes. Now that these definitions have been developed, a team of faculty will need to compare the FCCJ definitions to the ETS definitions that were used to design the MAPP test. If the MAPP test does not match the FCCJ definitions, discussion of FCCJ's future use of the MAPP should occur. If, for example, the communication skills outcome definitions match but the quantitative reasoning definitions do not, the MAPP test cannot be broken apart; the test, either Abbreviated or Standard, must be administered in its entirety. The only optional piece is the writing essay, but the test must include the reading, English language, and quantitative portions.

Actionable results – proficiency classification data seems to be much more useful than scaled score data in terms of the General Education Assessment project. Once available data is analyzed and reported, the General Education Assessment Task Force should discuss whether the types of data and results from the ETS MAPP are applicable to our institution's assessment purposes and could be used as the basis for future actions.

References:

Educational Testing Service (ETS). (2007, July 1). MAPP User's Guide. Retrieved April 8, 2008, from http://www.ets.org/Media/Tests/MAPP/pdf/MAPP_Users_Guide.pdf

Educational Testing Service (ETS). (n.d.). Measure of Academic Proficiency program workshop: About scores and reports. Retrieved April 8, 2008, from http://www.starttest.com/4.0.0.1/starttest.aspx?cmd=login&program=mapp&type=institution&target=order&limit=one


Appendix 1:

College-level reading questions measure students’ ability to:

Interpret the meaning of key terms

Recognize the primary purpose of a passage

Recognize explicitly presented information

Make appropriate inferences

Recognize rhetorical devices

College-level writing questions measure students’ ability to:

Recognize the most grammatically correct revision of a clause, sentence, or group of sentences

Organize units of language for coherence and rhetorical effect

Recognize and reword figurative language

Organize elements of writing into larger units of meaning

Critical thinking questions measure students’ ability to:

Distinguish between rhetoric and argumentation in a piece of nonfiction prose

Recognize assumptions

Recognize the best hypothesis to account for information presented

Infer and interpret a relationship between variables

Draw valid conclusions based on information presented

Mathematics questions measure students’ ability to:

Recognize and interpret mathematical terms

Read and interpret tables and graphs

Evaluate formulas

Order and compare large and small numbers

Interpret ratios, proportions, and percentages

Read scientific measuring instruments

Recognize and use equivalent mathematical formulas or expressions

(ETS, 2007, p. 4)


Appendix 2:

Proficiency Measures

Reading/Critical Thinking

To be considered proficient at Level 1, a student should be able to:

recognize factual material explicitly presented in a reading passage

understand the meaning of particular words or phrases in the context of a reading passage

To be considered proficient at Level 2, a student should be able to:

synthesize material from different sections of a passage

recognize valid inferences derived from material in the passage

identify accurate summaries of a passage or of significant sections of the passage

understand and interpret figurative language

discern the main idea, purpose or focus of a passage or a significant portion of the passage

To be considered proficient at Level 3, a student should be able to:

evaluate competing causal explanations

evaluate hypotheses for consistency with known facts

determine the relevance of information for evaluating an argument or conclusion

determine whether an artistic interpretation is supported by evidence contained in a work

recognize the salient features or themes in a work of art

evaluate the appropriateness of procedures for investigating a question of causation

evaluate data for consistency with known facts, hypotheses or methods

recognize flaws and inconsistencies in an argument

Writing

To be considered proficient at Level 1, a student should be able to:

recognize agreement among basic grammatical elements (e.g., nouns, verbs, pronouns and conjunctions)

recognize appropriate transition words

recognize incorrect word choice

order sentences in a paragraph

order elements in an outline

To be considered proficient at Level 2, a student should be able to:


incorporate new material into a passage

recognize agreement among basic grammatical elements (e.g., nouns, verbs, pronouns and conjunctions) when these elements are complicated by intervening words or phrases

combine simple clauses into single, more complex combinations

recast existing sentences into new syntactic combinations

To be considered proficient at Level 3, a student should be able to:

discriminate between appropriate and inappropriate use of parallelism

discriminate between appropriate and inappropriate use of idiomatic language

recognize redundancy

discriminate between correct and incorrect constructions

recognize the most effective revision of a sentence

Mathematics

To be considered proficient at Level 1, a student should be able to:

solve word problems that would most likely be solved by arithmetic and do not involve conversion of units or proportionality. These problems can be multi-step if the steps are repeated rather than embedded.

solve problems involving the informal properties of numbers and operations, often involving the Number Line, including positive and negative numbers, whole numbers and fractions (including conversions of common fractions to percent, such as converting "1/4" to 25%)

solve problems requiring a general understanding of square roots and the squares of numbers

solve a simple equation or substitute numbers into an algebraic expression

find information from a graph. This task may involve finding a specified piece of information in a graph that also contains other information.

To be considered proficient at Level 2, a student should be able to:

solve arithmetic problems with some complications, such as complex wording, maximizing or minimizing, and embedded ratios. These problems include algebra problems that can be solved by arithmetic (the answer choices are numeric).

simplify algebraic expressions, perform basic translations, and draw conclusions from algebraic equations and inequalities. These tasks are more complicated than solving a simple equation, though they may be approached arithmetically by substituting numbers.

interpret a trend represented in a graph, or choose a graph that reflects a trend

solve problems involving sets; problems have numeric answer choices

To be considered proficient at Level 3, a student should be able to:

solve word problems that would be unlikely to be solved by arithmetic; the answer choices are either algebraic expressions or numbers that do not lend themselves to back-solving

solve problems involving difficult arithmetic concepts such as exponents and roots other than squares and square roots, and percent of increase or decrease

generalize about numbers (e.g., identify the values of x for which an expression increases as x increases)

solve problems requiring an understanding of the properties of integers, rational numbers, etc.

interpret a graph in which the trends are to be expressed algebraically or in which one of the following is involved: exponents and roots other than squares and square roots, or percent of increase or decrease

solve problems requiring insight or logical reasoning

(ETS, 2007, pp. 9-11).

Appendix 3:

Faculty Survey

Comments from Faculty Who Administered the Paper and Pencil Abbreviated MAPP Test in their Class Sections

February 2008

1. How did you motivate the students to take the test?

I advised the students that in no way would the results of the test compromise their standing at FCCJ. Rather, this was a test to determine that we, as faculty and administrators, were providing the best possible educational experience for our students. Prior to the test, we did a mock "objective" test and discussed test taking strategies. I also brought candy the day of the test to alleviate their anxieties and to create a relaxed environment.

I bribed them with 15 points of Extra Credit..... I also told them I would look over the answer sheets and if it appeared they had "Christmas treed it", they would not get extra credit. I also told them that it should take most of the allotted time to take the test; if it did not take them most of the allotted time, then I would have my doubts about the level of their effort.

I basically just asked the students to do the best they could for me. I really didn't know specifics about the test so it was hard to give them much insight on what was going on.

I gave them the usual class credit for coming that day to take the test.

I gave them 4 points extra credit just to take it (and then take their scheduled test on their own time in our testing center). They took the MAPP with me, but in order to avoid losing a day, I let them take what was to be a regularly scheduled in-class test in the testing center. I gave them a 3-4 day window to take their regular test to try to spread it out a little for the testing center.


I gave students a free 100 on a quiz for showing up to take the test that day.

I counted the test as an assignment and gave them points.

I told them they were a part of an important change in education in Florida and were a "special" and "chosen" class.

To motivate my students I first told them why we were doing the test. That is, to try and find a tool to help the college assess whether students were learning what we said they should learn. Additionally, I gave each of my students 50 extra credit points. They can generally earn about 250 extra credit points during a term, so the test amounted to about one fifth of the total.

I actually offered my students points just as I would in other written activities or group activities in my class. Usually when the students participate in group activities, they have the ability to earn 10 points. On the evening of the test, they were given 10 group activity points for their participation. Actually, this class was a terrific group of students and were happy to participate. Everyone except for one student participated. The only reason she did not was because she was late for class that evening.

I offered them bonus points as well as told them that they had the opportunity to participate in an important project that would help to determine whether the college would be administering the test to students in the future.

I told them they would receive class credit.

2. From your observation of the students, did they appear motivated to take the test and try to do their best? If you administered the test in more than one section, were there any apparent differences in student interest/motivation?

I was assigned only one section. The students were motivated to do their best and several commented, "We want to do a good job, Dr. Clark, to make you look great." There was a feeling this was a cooperative venture. Although we strictly adhered to the test requirements, there was a generally relaxed feeling among the students and I alleviated any anxiety by stating that if a student felt a question was beyond his or her comprehension, that was expected. I also wrote on the board, "Do your best and you'll exceed the rest."

Yes. I saw the same wrinkled brows I do any classroom exams. Same nervous behavior -- leg bouncing and ceiling gazing.

Most of my students worked very hard on the test. It appeared that I had two students finish the test too quickly.

I only had one section. Most of the students were not motivated and did not want to take the test.

I think so, but I really couldn't tell if they were motivated. I could not observe any difference in the two sections that I administered.

I think the students took the test seriously. Several of my students told me the test was quite difficult.

I felt as if the students took the test seriously.

Yes, they all brought their calculators and answers to the questions about their program of study. Most of them used the entire time permitted.


I do not know if my Macroeconomic (ECO 2013) students did their best. There was some grumbling about having to take a timed test, though most seemed to finish in the time allotted. My Microeconomic (ECO 2013) students seemed much more motivated and did not complain about the time limit. The Microeconomic students are older, more mature and are generally my best students. They also seemed to appreciate more the reason for the test.

The students seemed very motivated for the test, and I think they did their best. We completed the test administration in one sitting. Many of the students were finished before the allotted time expired. Also, when I asked the students how they thought they did, the majority felt they did well. I also asked them if they found the test to be difficult, and the majority responded that they did not find the material too difficult.

From my observations, the students seemed to give the test their best effort. I did not see anyone rushing through it, and I saw a number of students going back to read over sections that they had already been over.

They were not very excited about taking the test at all.

3. Do you think that it would have been helpful to have a proctor assisting you in administering the test?

No. A proctor would have added to an anxiety level and therefore students would not have done their best.

No. The classroom is tight in space. Adding even one more person would have been more of a hindrance than a help.

No, not for 24 students.

It would have been better to have someone familiar with the test to administer it.

It couldn't hurt, but if there is no real motivation for students to put forth maximum effort, I'm not sure how much of an effect that will have.

No – it was easy enough to proctor.

I did fine on my own.

No, I don't think it was necessary.

It would have been helpful to have a proctor. It would have speeded up passing out, collecting and accounting for the exams and answer sheets.

Not really. It was relatively easy to do.

I don't think that it was necessary to have another proctor with me to administer the test.

I don't think a proctor was necessary.


4. If the college were to do this again, do you have suggestions or recommendations for the test administration process and instructions, including the demographic/course history section at the beginning, administering the test in a class period, etc.?

Many of our students have had little experience with test taking skills. I spent a class period before the test teaching these skills and the students, therefore, felt motivated to do their best. Perhaps the college might think about a pre-test skills session. I was surprised that many students have had bad experiences with the FCAT and approached this test with a high degree of uncertainty and fear of failure. The pre-test session helped alleviate these fears.

Do all the sections for that time slot in the Wilson Center or some similar setting. I told them beforehand that it was an SAT style test. They immediately asked if they would be testing in a room other than the classroom.

It would be much better if the answer sheets were already complete for the students; and a couple of the directions on the completion of the answer sheet seemed to not match the answer sheet. If we simplify the start process I think we can complete the whole testing procedure in one hour so that we don't have to lose much class time.

Have someone who is familiar with the test (not faculty) administer the test.

I did not like having to work around losing a class period and then inundating our testing center. I think it should be administered by the assessment centers.

If the college really wants all this information, maybe the students should fill out the answer sheet on a different day from the one on which they are given the test.

No, the instructions were very clear.

Is there a way to do the demographic information automatically based upon the student's id? That part seemed to take a long time. I gave that information to my students two class periods in advance and to receive their jps they had to bring it to the exam. Is there a reason the exam is timed? While my students had enough time, if we would have said they had an hour, there would have been fewer complaints.

I would like to have had the instructions for the test ahead of time so that I would have been more familiar with the administration of the test as well as the content of it so that I would be more prepared. The only confusing element about the beginning of the test was the part where the social security number was supposed to be entered in reverse order, and I believe the last 4 digits were to be entered first.

I don't have any suggestions. I think the process was smooth.

5. Any other comments or suggestions about the test………

Reminds me of all those tests we administered in high school.

The main recommendation I have is that I think my students would have responded perhaps more favorably if a proctor, very familiar with this test, would have explained specifics regarding its purpose. I did my best, but I think it is possible that this factor might make a difference for future MAPP tests.

I do not know how to properly motivate the students. If a test was used by colleges as part of admission to the last two years, then AA students would be more motivated. I would prefer that the test was not given during regular class time (when do you give it?) as I am always short of time, particularly in Macroeconomics. Should this test or one like it be given at the end of the students' program? What are we trying to assess? Do we want to know if students completing our program achieved some level of outcome, or how much they improved while they were here? Is it possible to have five or ten questions on students' final exams in various courses to see how they did? This would motivate them as they are competing for a grade. But the questions would have to be different for various courses. Which sections would you choose? All of them? Another advantage of this approach is that we could tell in which courses they are achieving (or not) the desired outcomes.

No, I was happy to help.
