Caveon Webinar Series: Considerations for Online Assessment Program Design

Posted on 12-Jun-2015

DESCRIPTION

This month's Caveon Webinar Series session focuses on the online education market, but the message shared by two industry veterans will be helpful for all testing programs. In this webinar, we are joined by special guests Dr. Larry Rudner of GMAC and Dr. Mika Hoffman of Excelsior College. These two esteemed testing veterans describe the basics of good, secure test design and provide considerations for designing an assessment program.

Here's what you'll learn:
- Strategies for developing and improving online testing
- Why good test writing is important to overall learning
- Considerations for implementing low- and high-stakes online assessments
- Online assessment strategies geared specifically for the online education market

The speakers present real-world examples with practical considerations for implementing various levels of student assessment. An online assessment checklist will be provided to help you identify priorities for implementing online assessment initiatives.

Featuring presentations by:

Larry Rudner, Ph.D. – Vice President of Research and Development at the Graduate Management Admission Council (GMAC). He has 30 years of experience in test validation, adaptive testing, professional standards, QTI specifications, test security, data forensics, and contract monitoring.

Mika Hoffman, Ph.D. – Executive Director of the Center for Educational Measurement at Excelsior College. She has over 20 years of professional experience in test design, quality control, integration of psychometric analyses, and assessment development and production processes for higher education and government.

Please contact richelle.gruber@caveon.com if you have any questions or problems viewing.

TRANSCRIPT

Considerations for Online Assessment Program Design

Hashtag: #CaveonWbnr

Co-hosted by: ExcelSoft

Considerations For Online Assessment Program Design

Presented by:

Mika Hoffman, Executive Director – Center for Educational Measurement, Excelsior College

&

Lawrence M. Rudner, Vice President and Chief Psychometrician, Research and Development, Graduate Management Admission Council (GMAC)

Contents

Introduction
Understanding Online Assessment Programs
Understanding Test Development & Psychometricians
Q&A

Considerations for Online Assessment Program Design

Understanding Online Assessment Programs

Mika Hoffman

Types of academic assessment
The stakes involved
Validity
Test planning
Proctoring and identity verification
Example – how Excelsior does it

Overview

Diagnostic assessment
- Placement in sections/courses
- Identification of strengths and weaknesses

Formative assessment
- Provides feedback to students
- May shape lesson plans
- Identification of strengths and weaknesses

Summative assessment
- Assesses the outcome of learning

Types of academic assessment

Low stakes
- Quizzes with little impact on grade
- Self-assessments
- Assessments in non-credit courses

Mid stakes
- Tests with substantial impact on grade
- Challenge exams to bypass requirements

High stakes
- Summative assessments determining all or most of grade
- Credit by examination
- Entrance exams (e.g., SAT, GRE, GMAT)

The stakes involved

In academic testing, we can say that a test is valid if it gives us reasonable assurance that a person claiming to know the relevant academic material actually does know it.

Need to establish:
- What is the knowledge?
- Is it relevant to the academic subject?
- Who has the knowledge?

Validity

Deals with what knowledge is being tested and whether it’s relevant to the academic subject and the purpose of the assessment

For tests that are for “all the marbles,” the test plan is the equivalent of a syllabus and learning objectives

Even for quizzes, it’s good to know what the quiz is expected to accomplish

Test Planning

Not just about proctoring

Need to know that the test has been secure throughout development

Security is related to validity: if students get a good score because they saw the material and memorized it ahead of time, what are you testing?

Test Security

Need to verify that the people taking the test are who they say they are

Need to verify that the people taking the test are using their knowledge of the subject, not other aids (references, friends, the Internet)

Need to ensure that the test content is secure

Proctoring and Identity Verification

Excelsior College’s exams are high stakes, “all the marbles” exams: designed to stand alone as the equivalent of a 3-credit course

Test Plans are written by a committee of testing experts and instructors of the subject, drawn from around the country

Practice exams delivered online with username/password verification

Proctoring and identity verification done in person at Pearson VUE testing centers

Example

Considerations for Online Assessment Program Design

Understanding Test Development & Psychometricians

Lawrence M. Rudner

Overview

Building a quality test

Marks of quality

Sources of error

Question for our attendees

Why should we worry about test quality?

Quality

Test takers are entitled to assurance that no examinee enjoys an unfair advantage

Testing organizations have an obligation to provide, or use their best efforts to provide, only valid scores

Organizations have the right to protect their own reputation by assuring the reliability of the information they provide.

Test Development Process

1. Identify desired content

2. Establish test specifications

3. Develop new items

4. Review new items

5. Pilot new items

6. Conduct item analysis

7. Assemble new pool/forms

8. Administer
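To make the "establish test specifications" and "assemble new pool/forms" steps concrete, here is a minimal sketch (not from the webinar) of a test blueprint expressed as data, with a check that an assembled form matches it. The content areas, item counts, and names below are hypothetical illustrations.

from collections import Counter

# Hypothetical blueprint: how many items each content area should contribute to a form.
blueprint = {"algebra": 10, "geometry": 6, "data analysis": 4}

def form_matches_blueprint(form_items, blueprint):
    """Check that an assembled form has the item counts the blueprint calls for.

    form_items is a list of (item_id, content_area) tuples.
    """
    counts = Counter(area for _, area in form_items)
    return all(counts.get(area, 0) == n for area, n in blueprint.items())

# Example form drawn from a (made-up) item pool.
form = (
    [(f"ITM{i:03d}", "algebra") for i in range(10)]
    + [(f"ITM{i:03d}", "geometry") for i in range(10, 16)]
    + [(f"ITM{i:03d}", "data analysis") for i in range(16, 20)]
)
print(form_matches_blueprint(form, blueprint))  # True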

Marks of Quality – Test Reliability
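The reliability figures from this slide are not reproduced in the transcript. As an illustration only, not the presenters' analysis, one common reliability index is Cronbach's alpha (equivalent to KR-20 for right/wrong items); a minimal computation on invented data might look like this:

import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an examinees-by-items score matrix (KR-20 for 0/1 items)."""
    n_items = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)      # variance of each item across examinees
    total_variance = item_scores.sum(axis=1).var(ddof=1)  # variance of examinees' total scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 6 examinees x 4 dichotomous items (1 = right, 0 = wrong)
scores = np.array([
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
])
print(round(cronbach_alpha(scores), 2))  # about 0.53 for this made-up matrix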

More Validity

Validity: the relationship between the test score and an outcome measure

[Chart: success / true mastery plotted against test score, on a 200-800 score scale]
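As a rough sketch of what the chart conveys (the data below are invented, not the webinar's), a simple validity coefficient is the correlation between test scores and a later outcome measure:

import numpy as np

# Hypothetical data: test scores on a 200-800 scale and a later outcome
# (1 = judged successful / a true master, 0 = not).
test_scores = np.array([220, 310, 400, 450, 520, 560, 610, 680, 730, 790])
outcome     = np.array([  0,   0,   0,   1,   0,   1,   1,   1,   1,   1])

# Validity coefficient: correlation between score and outcome.
validity = np.corrcoef(test_scores, outcome)[0, 1]
print(round(validity, 2))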

Item Analysis – Response Options

Item 27, Rit = 0.43

[Chart: percentage choosing each response option, by score group. Option 1: 3%, Option 2 (key): 69%, Option 3: 7%, Option 4: 20%]
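Rit in these charts is the item-total correlation: the correlation between answering the item correctly and the examinee's total score; the charts also break the percentage choosing each option down by score group. A minimal sketch of those two computations on invented responses (the option choices, key, and scores below are hypothetical):

import numpy as np

def item_total_correlation(item_correct, total_scores):
    """Rit: correlation between a 0/1 item score and each examinee's total score."""
    return np.corrcoef(item_correct, total_scores)[0, 1]

def option_percentages(chosen_options, n_options=4):
    """Percentage of examinees selecting each response option (options numbered 1..n)."""
    counts = np.bincount(chosen_options, minlength=n_options + 1)[1:]
    return 100.0 * counts / len(chosen_options)

# Hypothetical data: the option each examinee chose (key = option 2) and their total scores.
chosen = np.array([2, 2, 2, 4, 2, 3, 2, 4, 1, 2])
totals = np.array([38, 35, 33, 22, 31, 20, 29, 18, 15, 27])

correct = (chosen == 2).astype(int)
print("Rit =", round(item_total_correlation(correct, totals), 2))
print("% choosing each option:", option_percentages(chosen))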

Item Analysis – Response Options

Item 22, Rit = -0.26

[Chart: percentage choosing each response option, by score group. Option 1: 5%, Option 2: 5%, Option 3: 59%, Option 4 (key): 30%]

Item Analysis – Response Options

Item 39, Rit = 0.15

[Chart: percentage choosing each response option, by score group. Option 1 (key): 97%, Option 2: 0%, Option 3: 1%, Option 4: 2%]

Item Analysis – Response Options

Item 34, Rit = 0.57

[Chart: percentage choosing each response option, by score group. Option 1 (key): 73%, Option 2: 7%, Option 3: 11%, Option 4: 9%]

Item Analysis – Item Discrimination

Question 19, r = .70

Score   R/W
80      1
70      1
65      0
60      1
50      0
30      0
20      0

[Additional item discrimination charts shown for Question 7 and Question 19]
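Besides the item-score correlation r shown above, item discrimination is often summarized as an upper-lower index: the difference in proportion correct between high and low scorers. A small illustrative sketch; the data and the 27% grouping convention are assumptions, not figures from the webinar:

import numpy as np

def discrimination_index(item_correct, total_scores, group_fraction=0.27):
    """Upper-lower discrimination: proportion correct in the top score group
    minus proportion correct in the bottom score group (27% groups by convention)."""
    n = len(total_scores)
    k = max(1, int(round(group_fraction * n)))
    order = np.argsort(total_scores)          # examinees ordered from lowest to highest score
    low, high = order[:k], order[-k:]
    return item_correct[high].mean() - item_correct[low].mean()

# Hypothetical right/wrong pattern (1/0) and total scores for one item.
correct = np.array([1, 1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0])
totals  = np.array([80, 70, 65, 60, 50, 30, 20, 75, 68, 40, 72, 35])
print(round(discrimination_index(correct, totals), 2))  # 1.0 for this made-up item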

Constructing a quality test

1. Good representation of content

2. Good, proven test questions

3. Enough test questions (see the note after this list)

4. Equivalent alternate forms
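On "enough test questions": one standard psychometric relationship behind that point (a general result, not a formula from the slides) is the Spearman-Brown prophecy formula, which predicts reliability when a test is lengthened by a factor k:

\rho' = \frac{k\rho}{1 + (k - 1)\rho}

For example, doubling the length (k = 2) of a test with reliability 0.70 predicts a reliability of about 0.82, all else being equal.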

Our Gifts To You!

• Internet-based training – 1 hour of training, code good for 2 weeks!

• ExcelSoft will provide you with a 30-minute consultation, at no cost, for assessing your test development and delivery needs

• Looking to implement or make changes to your test delivery platform?

Online Item Writing Training

30-minute needs analysis

URL: training.caveon.net
Code word: online14

http://testing-assessments.excelindia.com/consultation.html

Helpful Resources

LinkedIn Group – “Caveon Test Security” Join Us!

Follow us on Twitter: @Caveon

www.caveon.com/resources/webinars – slides & recordings

Cheating Articles at www.caveon.com/citn

Caveon Security Insights blog – www.caveon.com/blog

CSI Newsletter – Contact us to get on the mailing list!

Thank You! Special Thanks to:

Lawrence M. Rudner, Ph.D., Vice President and Chief Psychometrician, Research and Development, Graduate Management Admission Council – lrudner@gmac.com

Mika Hoffman, Ph.D., Executive Director – Center for Educational Measurement, Excelsior College – mhoffman@excelsior.edu

Please contact: richelle.gruber@caveon.com for feedback or a copy of the slides

Please visit our sessions and our Caveon booth #209 at ATP’s Innovations In Testing – February 3-6, 2013

Steve Addicott presenting at SeaSkyLand Conference in ShenZhen – February 2013

John presenting at the TILSA meeting on February 7th in Atlanta; other presenters include John Olson and Greg Cizek.

Release of TILSA Test Security Guidebook – Visit our booth to discuss it with John Fremer at ATP!

Handbook of Test Security – To be published March 2013

CCSSO Best Practices meeting in June 2013

Upcoming Events

Caveon ATP Sessions

Tell it to the Judge! Winning with Data Forensics Evidence in Court – Steve Addicott – 2/4/13 – 10 am

Data Forensics: Opening the Black Box – John Fremer & Dennis Maynes – 2/4/13 – 2:45 pm

A Synopsis of the Handbook of Test Security – David Foster & John Fremer – 2/4/13 – 5 pm

From Foundations to Futures: Online Proctoring and Authentication (Kryterion session) – David Foster – 2/5/13 – 11 am

Make, Buy, or Borrow: Acquiring SMEs – Nat Foster – 2/5/13 – 1:15 pm

ATP Sessions of our presenters

Free Tools to Nail the Brain Dumps – Lawrence Rudner – Monday 2/4/2013, 4:00 PM – 5:00 PM

The Game’s Afoot: Sleuths Match Wits – Lawrence Rudner & Dennis Maynes – Tuesday 2/5/2013, 11:00 AM – Noon

Online Education – How Can We Make Sure Students are Really Learning? – Jamie Mulkey & Mika Hoffman – Tuesday 2/5/2013, 11:00 AM – Noon

We hope to “See You” at our next sessions!

Caveon’s Lessons Learned from ATP – To be held: Feb 20, 2013

The next webinar in the Online Education series: Designing Assessments for the Online Education Environment – To be held: March 20, 2013
