
Research Related to the Effectiveness of E-Learning and Collaborative Tools

Dr. Curtis J. Bonk, Associate Professor, Indiana University
President, CourseShare.com
http://php.indiana.edu/~cjbonk, [email protected]

Are you ready???

A Vision of E-learning for America’s Workforce, Report of the Commission on Technology and Adult Learning, (2001, June)

“A remarkable 84 percent of two- and four-year colleges in the United States expect to offer distance learning courses in 2002” (only 58% did in 1998) (US Dept of Education report, 2000)

“Web-based training is expected to increase 900 percent between 1999 and 2003.” (ASTD, State of the Industry Report, 2001)

Brains Before and After E-learning

Before / After

...and when using synchronous and asynchronous tools

Tons of Recent Research

Not much of it is any good...

Problems and Solutions

(Bonk, Wisher, & Lee, in review)

1. Tasks overwhelm -> Train and be clear
2. Confused on Web -> Structure time/dates due
3. Too nice due to limited shared history -> Develop roles and controversies
4. Lack justification -> Train to back up claims
5. Hard not to preach -> Students take lead role
6. Too much data -> Use e-mail pals
7. Communities not easy to form -> Embed informal/social activities

Benefits and Implications

(Bonk, Wisher, & Lee, in review)

1. Shy open up online -> Use async conferencing
2. Minimal off-task behavior -> Create social tasks
3. Delayed collaboration richer than real time -> Use async for debates; sync for help, office hours
4. Students can generate lots of info -> Structure generation and force reflection/comment
5. Minimal disruptions -> Foster debates/critique
6. Extensive e-advice -> Find experts or practitioners
7. Excited to publish -> Ask permission

Basic Distance Learning Finding?

• Research since 1928 shows that DL students perform as well as their counterparts in a traditional classroom setting.

Per: Russell, 1999, The No Significant Difference Phenomenon (5th Edition), NCSU, based on 355 research reports.

http://cuda.teleeducation.nb.ca/nosignificantdifference/

Question: Why is there no learning in e-learning???

A. Poor pedagogy?
B. Inferior online tools?
C. Unmotivated students and instructors?
D. Poor research and measurement?
E. Too new?
F. Vendor and administrator visions do not match reality?

Online Learning Research Problems (National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999)

Anecdotal evidence; minimal theory.
Questionable validity of tests.
Lack of control groups.
Hard to compare given different assessment tools and domains.

Online Learning Research Problems (National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999)

Fails to explain why the drop-out rates of distance learners are higher.

Does not relate learning styles to different technologies or focus on interaction of multiple technologies.

Online Learning Research Problems

(Bonk & Wisher, 2000)

• For different purposes or domains: in our study, 13% concern training, 87% education
• Flaws in research designs
  - Only 36% have objective learning measures
  - Only 45% have comparison groups
• When effective, it is difficult to know why
  - Course design?
  - Instructional methods?
  - Technology?

Ten Primary Experiments: Adaptations from Education to Training (Bonk & Wisher, 2000)

1) Variations in Instructor Moderation
2) Online Debating
3) Student Perceptions of e-Learning Environments
4) Development of Online Learning Communities
5) Time Logging
6) Critical Thinking and Problem Solving Applications in Sync/Asynchronous Environments
7) Peer Tutoring and Online Mentoring
8) Student Retention: E-learning and Attrition
9) Conceptual Referencing
10) Online Collaboration

Evaluating Web-Based Instruction: Methods and Findings (41 studies) (Olson & Wisher, in review)

[Figure: Number of studies by year of publication, 1996-2001 (2001 projected)]

Wisher’s Wish List: an effect size of .5 or higher in comparison to traditional classroom instruction. But the reality:

                      Web-Based Instruction   CBI, Kulik [8]   CBI, Liao [18]
Average Effect Size   .31                     .32              .41
Number of Studies     11                      97               46
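To make the effect-size figures above concrete, here is a minimal sketch of computing Cohen's d, the standardized mean difference these averages summarize. The two score lists are made-up placeholder data, not results from any of the studies cited.

```python
# Minimal sketch: Cohen's d between a web-based group and a classroom group.
# The scores below are hypothetical placeholder data for illustration only.
import statistics

def cohens_d(treatment, control):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

web_scores = [78, 82, 75, 88, 91, 70, 84, 79]        # hypothetical exam scores
classroom_scores = [74, 80, 72, 85, 83, 68, 77, 73]  # hypothetical exam scores
print(f"Cohen's d = {cohens_d(web_scores, classroom_scores):.2f}")
```

An effect size of .5 would mean the web-based group's mean sits half a pooled standard deviation above the comparison group's, the threshold on the wish list that the reviewed studies did not reach.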

Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review)

“…there is little consensus as to what variables should be examined and what measures of learning are most appropriate, making comparisons between studies difficult and inconclusive.”

Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review)

What to Measure?
• demographics (age, gender, etc.),
• previous experience,
• course design,
• instructor effectiveness or feedback,
• technical issues,
• levels of participation and collaboration,
• student and instructor interactions,
• student recommendation of course,
• student desire to take additional online courses.

Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review)

Variables Studied:
1. Type of Course: Graduate (18%) vs. undergraduate courses (81%)
2. Level of Web Use: All-online (64%) vs. blended/mixed courses (34%)
3. Content area (e.g., math/engineering (27%), science/medicine (24%), distance ed (15%), social science/educ (12%), business (10%), etc.)

Other data:
a. Attrition data collected (34%)
b. Comparison group (59%)

Different Goals…
Making connections
Appreciating different perspectives
Students as teachers
Greater depth of discussion
Fostering critical thinking online
Interactivity online

Learning Improved (Maki & Maki, 2002, Journal of Experimental Psychology: Applied, 8(2), 85-98)

Intro to Psych: Lecture vs. Online
Web-based course had more advantages as comprehension skill increased
Still, students preferred the face-to-face course over online
Why? More guidance, feedback, & enthusiasm, and fewer deadlines.

Learning Improved… (Maki, Maki, Patterson, & Whittaker, 2000)

Intro to Psych: Lecture vs. Online
Online students consistently earned higher exam scores
Online students learned more, as indicated by higher scores on psychology Graduate Record Exam items during the semester

Learning Improved… (Maki et al., 2000)

Intro to Psych: Lecture vs. Online
Online performed better on midterms.
Web-based course students scored higher since they had weekly activities due.
Lecture students could put off reading until the night before the exam.

Learning Worse (Wang & Newlin, 2000)

Stat Methods: Lecture vs. Online
No differences at midterm
Lecture scored 87 on the final, Web a 72
Course relatively unstructured
Web students encouraged to collaborate
Lecture students could not collaborate
All exams but the final were open book

Learning Worse (Waschull, 2001)

Psych: Lecture vs. Online
No differences at midterm
Self-selected sections: Lecture 86 on the final, Web a 77
Random assignment sections: No differences
Self-selected students more likely to fail the online course
Web course had higher student satisfaction

Learning Improved or Not…

(Hiltz, 1993)

Web may be suited to some and lecture to others…

Students who find Web convenient for them score better.

Ratings of course involvement and ease of access to instructor also important.

Learning Improved or Not…

(Sankaran et al., 2000)

Students with a positive attitude toward Web format learned more in Web course than in lecture course.

Students with positive attitude toward lecture format learned more in lecture format.

Electronic Conferencing: Quantitative Analyses

Usage patterns, # of messages, cases, responses

Length of case, thread, response
Average number of responses
Timing of cases, commenting, responses, etc.
Types of interactions (1:1; 1:many)
Data mining (logins, peak usage, location, session length, paths taken, messages/day/week); see the sketch below
Time-Series Analyses (trends)
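As a concrete illustration of the data-mining bullet above, here is a minimal sketch that tallies messages per day and per participant from a conference log. The CSV layout (timestamp, user_id, message_id columns) and the file name are assumptions for illustration, not the export format of any particular conferencing system.

```python
# Minimal sketch: count messages per day and per participant from a log file.
# The CSV columns and file name are hypothetical illustrations.
import csv
from collections import Counter
from datetime import datetime

def summarize_log(path):
    per_day, per_user = Counter(), Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # expects columns: timestamp, user_id, message_id
            ts = datetime.fromisoformat(row["timestamp"])
            per_day[ts.date()] += 1
            per_user[row["user_id"]] += 1
    return per_day, per_user

if __name__ == "__main__":
    days, users = summarize_log("conference_log.csv")
    print("Messages per day:", dict(days))
    print("Top participants:", users.most_common(5))
```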

Electronic Conferencing: Qualitative Analyses

General: Observation Logs, Reflective interviews, Retrospective Analyses, Focus Groups

Specific: Semantic Trace Analyses, Talk/Dialogue Categories (Content talk, questioning, peer feedback, social acknowledgments, off task)

Emergent: Forms of Learning Assistance, Levels of Questioning, Degree of Perspective Taking, Case Quality, Participant Categories

AC3-DL Course Tools (Orvis, Wisher, Bonk, & Olson)

Asynchronous:
Learning Management System
E-mail

Synchronous: Virtual Tactical Operations Center (VTOC) (7 rooms; 15 people/extension)
Avatar
Audio conference by extension/room (voice over IP)
Text chat windows: global and private
Special tools for collaboration

Overall frequency of interactions across chat categories (6,601 chats): On-Task 55%, Social 30%, Mechanics 15%.

[Figure: Percentage of chats that were On-Task, Social, and Mechanics across Months 1-2, 3-4, and 5-6]

Research on Instructors Online

If teacher-centered, less exploring, engaging, and interacting (Peck and Laycock, 1992)
Informal, exploratory conversation fosters risk-taking & knowledge sharing (Weedman, 1999)
Four key acts of instructors: pedagogical, managerial, technical, social (Ashton, Roberts, & Teles, 1999)
Instructors tend to rely on simple tools (Peffers & Bloom, 1999)
Job varies: planning, interaction, administration, teaching (McIsaac, Blocher, Mahes, & Vrasidas, 1999)

Study of Four Classes (Bonk, Kirkley, Hara, & Dennen, 2001)

Technical: train, early tasks, be flexible, orientation task
Managerial: initial meeting, FAQs, detailed syllabus, calendar, post administrivia, assign e-mail pals, gradebooks, e-mail updates
Pedagogical: peer feedback, debates, PBL, cases, structured controversy, field reflections, portfolios, teams, inquiry
Social: café, humor, interactivity, profiles, foreign guests, digital pics, conversations, guests

Network Conferencing Interactivity (Rafaeli & Sudweeks, 1997)

1. > 50 percent of messages were reactive.
2. Only around 10 percent were truly interactive.
3. Most messages were factual statements or opinions.
4. Many also contained questions or requests.
5. Frequent participators were more reactive than low participators.
6. Interactive messages contained more opinions & humor.
7. More self-disclosure, involvement, & belonging.
8. Attracted to fun, open, frank, helpful, supportive environments.

[Figures: Week 4 interaction patterns: starter-centered interaction vs. scattered interaction (no starter)]

Collaborative Behaviors(Curtis & Lawson, 1997)

Most common were: (1) Planning, (2) Contributing, and (3) Seeking Input.

Other common events were: (4) Initiating activities, (5) Providing feedback, (6) Sharing knowledge

Few students challenge others or attempt to explain or elaborate

Recommend: using debates and modeling appropriate ways to challenge others

Online Collaboration Behaviors by Categories (US and Finland)

Behavior Category       Finland (%)   U.S. (%)   Average (%)
Planning                0.0           0.0        0.0
Contributing            80.8          76.6       78.7
Seeking Input           12.7          21.0       16.8
Reflection/Monitoring   6.1           2.2        4.2
Social Interaction      0.4           0.2        0.3
Total                   100.0         100.0      100.0

Dimensions of Learning Process

(Henri, 1992)

1. Participation (rate, timing, duration of messages)

2. Interactivity (explicit interaction, implicit interaction, & independent comment)

3. Social Events (statements unrelated to content)

4. Cognitive Events (e.g., clarifications, inferencing, judgment, and strategies)

5. Metacognitive Events (e.g., both metacognitive knowledge (person, task, and strategy) as well as metacognitive skill (evaluation, planning, regulation, and self-awareness))

Some Findings (see Hara, Bonk, & Angeli, 2000)

Social (in 26.7% of units coded): social cues decreased as the semester progressed; messages gradually became less formal; social cues became more embedded within statements

Cognitive (in 81.7% of units): more inferences & judgments than elementary clarifications and in-depth clarifications

Metacognitive (in 56% of units): more reflections on experience & self-awareness; some planning, evaluation, & regulation & self-questioning

Cognitive Skills Displayed in Online Conferencing

[Figure: Percent of coded units by cognitive skill]

Surface vs. Deep Posts (Henri, 1992)

Surface Processing:
making judgments without justification,
stating that one shares ideas or opinions already stated,
repeating what has been said,
asking irrelevant questions
i.e., fragmented, narrow, and somewhat trite.

In-depth Processing:
linked facts and ideas,
offered new elements of information,
discussed advantages and disadvantages of a situation,
made judgments that were supported by examples and/or justification.
i.e., more integrated, weighty, and refreshing.

Level of Cognitive Processing: All Posts

Surface 33%, Deep 55%, Both 12%

Critical Thinking (Newman, Johnson, Webb & Cochrane, 1997)

Used Garrison’s five-stage critical thinking model
Critical thinking in both CMC and FTF environments
Depth of critical thinking higher in CMC environments:
more likely to bring in outside information,
link ideas and offer interpretations,
generate important ideas and solutions.

FTF settings were better for generating new ideas and creatively exploring problems.

Unjustified Statements (US)

24. Author: Katherine

Date: Apr. 27 3:12 AM 1998

I agree with you that technology is definitely taking a large part in the classroom and will more so in the future…

25. Author: Jason Date: Apr. 28 1:47 PM 1998

I feel technology will never over take the role of the teacher...I feel however, this is just help us teachers...

26. Author: Daniel Date: Apr. 30 0:11 AM 1998

I believe that the role of the teacher is being changed by computers, but the computer will never totally replace the teacher... I believe that the computers will eventually make teaching easier for us and that most of the children's work will be done on computers. But I believe that there…

Study #3. Fall, 1997

[Chart: Proportions of unsupported, social, justified, and extension statements]

Indicators for the Quality of Students’ Dialogue (Angeli, Valanides, & Bonk, in review)

ID 1. Social acknowledgement/Sharing/Feedback. Examples: “Hello, good to hear from you”; “I agree, good point, great idea”
ID 2. Unsupported statements (advice). Examples: “I think you should try this…”; “This is what I would do…”
ID 3. Questioning for clarification and extending dialogue. Examples: “Could you give us more info?”; “…explain what you mean by…?”
ID 4. Critical thinking, reasoned thinking/judgment. Examples: “I disagree with X, because in class we discussed…”; “I see the following disadvantages to this approach…”

Social Construction of Knowledge (Gunawardena, Lowe, & Anderson, 1997)

Five Stage Model:
1. Share ideas
2. Discovery of idea inconsistencies
3. Negotiate meaning/areas of agreement
4. Test and modify
5. Phrase agreements

In the global debate, dialogue was very task driven and remained at Phase 1: sharing info.

Social Constructivism and Learning Communities Online (SCALCO) Scale (Bonk & Wisher, 2000)

___ 1. The topics discussed online had real world relevance.

___ 2. The online environment encouraged me to question ideas and perspectives.

___ 3. I received useful feedback and mentoring from others.

___ 4. There was a sense of membership in the learning here.

___ 5. Instructors provided useful advice and feedback online.

___ 6. I had some personal control over course activities and discussion.

Evaluation…

Kirkpatrick’s 4 Levels

Reaction, Learning, Behavior, Results

Figure 26. How Respondent Organizations Measure Success of Web-Based Learning According to the Kirkpatrick Model

[Chart: Percent of respondents measuring learner satisfaction; change in knowledge, skill, attitude; job performance; and ROI]

My Evaluation Plan…

Considerations in Evaluation Plan

1. Student
2. Instructor
3. Training
4. Task
5. Tech Tool
6. Course
7. Program
8. University or Organization

1. Measures of Student Success

(Focus groups, interviews, observations, surveys, exams, records)

Positive feedback, recommendations
Increased comprehension, achievement
High retention in program
Completion rates or course attrition
Jobs obtained, internships
Enrollment trends for next semester

1. Student Basic Quantitative

Grades, achievement
Number of posts
Participation
Computer log activity: peak usage, messages/day, time on task or in system
Attitude surveys

1. Student High-End Success

Message complexity, depth, interactivity, questioning
Collaboration skills
Problem finding/solving and critical thinking
Challenging and debating others
Case-based reasoning, critical thinking measures
Portfolios, performances, PBL activities

2. Instructor Success

High student evals; more students signing up
High student completion rates
Utilize Web to share teaching
Course recognized in tenure decisions
Varies online feedback and assistance techniques

3. Training: Outside Support

Training (FacultyTraining.net)
Courses & certificates (JIU, e-education)
Reports, newsletters, & pubs
Aggregators of info (CourseShare, Merlot)
Global forums (FacultyOnline.com; GEN)
Resources, guides/tips, link collections, online journals, library resources

3. Training: Inside Support…

Instructional consulting
Mentoring (strategic planning $)
Small pots of funding
Facilities
Summer and year-round workshops
Office of Distributed Learning
Colloquiums, tech showcases, guest speakers
Newsletters, guides, active learning grants, annual reports, faculty development, brown bags

RIDIC5-ULO3US Model of Technology Use

4. Tasks (RIDIC):
Relevance
Individualization
Depth of Discussion
Interactivity
Collaboration-Control-Choice-Constructivistic-Community

RIDIC5-ULO3US Model of Technology Use

5. Tech Tools (ULOUS):
Utility/Usable
Learner-Centeredness
Opportunities with Outsiders Online
Ultra Friendly
Supportive

6. Course Success

Few technological glitches/bugs
Adequate online support
Increasing enrollment trends
Course quality (interactivity rating)
Monies paid
Accepted by other programs

7. Online Program or Course Budget (i.e., how pay, how large is course, tech fees charged, # of courses, tuition rate, etc.)

Indirect Costs: learner disk space, phone, accreditation, integration with existing technology, library resources, on site orientation & tech training, faculty training, office space

Direct Costs: courseware, instructor, help desk, books, seat time, bandwidth and data communications, server, server back-up, course developers, postage

8. Institutional Success

E-Enrollments from new students, alumni, existing students
Additional grants
Press, publication, partners, attention
Orientations, training, support materials
Faculty attitudes
Acceptable policies (ADA compliant)

Online Student Assessment

Assessment Takes Center Stage in Online Learning

(Dan Carnevale, April 13, 2001, Chronicle of Higher Education)

“One difference between assessment in classrooms and in distance education is that distance-education programs are largely geared toward students who are already in the workforce, which often involves learning by doing.”

Focus of Assessment?

1. Basic Knowledge, Concepts, Ideas
2. Higher-Order Thinking Skills, Problem Solving, Communication, Teamwork
3. Both of the Above!!!
4. Other…

Assessments Possible Online

Portfolios of work
Discussion/forum participation
Online mentoring
Weekly reflections
Tasks attempted or completed, usage, etc.

More Possible Assessments

Quizzes and tests
Peer feedback and responsiveness
Cases and problems
Group work
Web resource explorations & evaluations

Sample Portfolio Scoring Dimensions (10 pts each)
(see: http://php.indiana.edu/~cjbonk/p250syla.htm)

1. Richness
2. Coherence
3. Elaboration
4. Relevancy
5. Timeliness
6. Completeness
7. Persuasiveness
8. Originality

1. Insightful
2. Clear/Logical
3. Original
4. Learning
5. Feedback/Responsive
6. Format
7. Thorough
8. Reflective
9. Overall Holistic

E-Peer Evaluation Form
Peer Evaluation. Name: ____________________
Rate on a scale of 1 (low) to 5 (high):

___ 1. Insight: creative, offers analogies/examples, relationships drawn, useful ideas and connections, fosters growth.

___ 2. Helpful/Positive: prompt feedback, encouraging, informative, makes suggestions & advice, finds, shares info.

___ 3. Valuable Team Member: dependable, links group members, there for group, leader, participator, pushes group.

___ Total Recommended Contribution Pts (out of 15)

Issues to Consider…

1. Bonus points for participation?
2. Peer evaluation of work?
3. Assess improvement?
4. Is it timed? Allow retakes if the connection is lost? How many retakes?
5. Give unlimited time to complete?

Issues to Consider…

6. Cheating? Is it really that student?
7. Authenticity?
8. Negotiating tasks and criteria?
9. How to measure competency?
10. How do you demonstrate learning online?

Increasing Cheating Online
($7-$30/page; http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)

http://www.academictermpapers.com/
http://www.termpapers-on-file.com/
http://www.nocheaters.com/
http://www.cheathouse.com/uk/index.html
http://www.realpapers.com/
http://www.pinkmonkey.com/ (“you’ll never buy Cliffnotes again”)

Reducing Cheating Online

Ask yourself, why are they cheating?
Do they value the assignment?
Are tasks relevant and challenging?
What happens to the task after it is submitted: reused, woven in, posted?
Due at the end of term? Real audience?
Look at pedagogy before calling the plagiarism police!

Reducing Cheating Online

Proctored exams
Vary items in the exam
Make the course too hard to cheat
Try Plagiarism.com ($300)
Use mastery learning for some tasks
Random selection of items from an item pool (see the sketch below)
Use test passwords; rely on IP# screening
Assign collaborative tasks
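The random-selection idea above can be implemented very simply; here is a minimal sketch that draws a per-student set of exam items from an item pool. The pool contents, the seeding-by-student-ID scheme, and the item count are hypothetical choices for illustration.

```python
# Minimal sketch: draw a per-student random selection of exam items from a pool.
# The pool, the seeding scheme, and the item count are hypothetical illustrations.
import random

ITEM_POOL = [f"Question {i}" for i in range(1, 51)]  # placeholder 50-item pool

def build_exam(student_id: str, n_items: int = 10):
    """Seed with the student ID so regrading or review reproduces the same exam."""
    rng = random.Random(student_id)
    return rng.sample(ITEM_POOL, n_items)  # n_items unique questions in random order

print(build_exam("student_042"))
```

Seeding the generator with the student ID keeps each student's exam stable across sessions while still varying the items from student to student.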

Reducing Cheating Online
($7-$30/page; http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)

http://www.plagiarism.org/ (resource)
http://www.turnitin.com/ (software, $100, free 30-day demo/trial)
http://www.canexus.com/ (software; essay verification engine, $19.95)
http://www.plagiserve.com/ (free database of 70,000 student term papers & Cliff Notes)
http://www.academicintegrity.org/ (association)
http://sja.ucdavis.edu/avoid.htm (guide)
http://www.georgetown.edu/honor/plagiarism.html

Turnitin Testimonials

"Many of my students believe that if they do not submit their essays, I will not discover their plagiarism. I will often type a paragraph or two of their work in myself if I suspect plagiarism. Every time, there was a "hit." Many students were successful plagiarists in high school. A service like this is needed to teach them that such practices are no longer acceptable and certainly not ethical!”

New Zealand Universities Consider Lawsuit Against Sites Selling Diplomas in Their Names.

The Web sites, which already offer fake diplomas in the names of hundreds of colleges in the United States and abroad, recently added New Zealand’s Universities of Auckland, Canterbury, and Otago to their lineup. The degrees sell for up to $250 each.

Feb 11, 2002, David Cohen, Chronicle of Higher Education

Online Testing and Survey Tools

Test Selection Criteria (Hezel, 1999)

Easy to configure items and test
Handles symbols
Scheduling of feedback (immediate?)
Easy to pick items for randomizing
Randomize answers within a question
Weighting of answer options

More Test Selection Criteria

Recording of multiple submissions; control # of submissions
Timed tests
Comprehensive statistics
Summarize in portfolio and/or gradebook
Confirmation of test submission

More Test Selection Criteria (Perry & Colon, 2001; see: http://www.indiana.edu/~best/)

Flexible scoring: score first, last, or average submission (see the sketch below)
Flexible reporting: by individual or by item, and cross tabulations
Outputs data for further analysis
Provides item analysis statistics (e.g., test item frequency distributions)
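To show what the flexible-scoring criterion above amounts to in practice, here is a minimal sketch that scores a student's multiple submissions by first, last, or average attempt. The student names, scores, and policy labels are hypothetical.

```python
# Minimal sketch: "flexible scoring" of multiple quiz submissions.
# Students, scores, and policy names are hypothetical illustrations.
from statistics import mean

submissions = {            # student -> scores in the order the attempts were submitted
    "alice": [62, 78, 85],
    "bob": [90],
}

def score(attempts, policy="last"):
    if policy == "first":
        return attempts[0]
    if policy == "last":
        return attempts[-1]
    if policy == "average":
        return mean(attempts)
    raise ValueError(f"unknown scoring policy: {policy}")

for student, attempts in submissions.items():
    print(student, {p: score(attempts, p) for p in ("first", "last", "average")})
```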

Sample Survey Tools

Zoomerang (http://www.zoomerang.com)
IOTA Solutions (http://www.iotasolutions.com)
QuestionMark (http://www.questionmark.com/home.html)
SurveyShare (http://SurveyShare.com; from Courseshare.com)
Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm)
Infopoll (http://www.infopoll.com)

Web-Based Survey Advantages

Faster collection of data
Standardized collection format
Computer-controlled branching and skip sections (see the sketch below)
Easy to answer by clicking
Wider distribution of respondents
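The branching and skip-logic advantage above is easy to picture with a small sketch in which each question names the next question as a function of the answer. The question IDs, wording, and branching rules here are hypothetical examples, not any survey tool's API.

```python
# Minimal sketch: computer-controlled branching and skip logic in a survey.
# Question IDs, wording, and branching rules are hypothetical illustrations.
QUESTIONS = {
    "q1": {"text": "Have you taken an online course before? (y/n)",
           "next": lambda ans: "q2" if ans == "y" else "q3"},  # skip q2 on "n"
    "q2": {"text": "How many online courses have you completed?",
           "next": lambda ans: "q3"},
    "q3": {"text": "Would you recommend online courses to a colleague? (y/n)",
           "next": lambda ans: None},                          # end of survey
}

def run_survey():
    answers, qid = {}, "q1"
    while qid is not None:
        ans = input(QUESTIONS[qid]["text"] + " ").strip().lower()
        answers[qid] = ans
        qid = QUESTIONS[qid]["next"](ans)  # branch based on the answer
    return answers

if __name__ == "__main__":
    print(run_survey())
```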

Any questions?