
A D D E N D U M to

Assessment Guide

February 1, 2017


Program Review and Assessment¹ "At a Glance"
Promoting Loma Linda University Success • Preparing for WSCUC 2020

Program Responsibilities

1. Program Assessment Plan
What: Program Learning Outcomes (PLOs) • Curriculum Map • Assessment Matrix
When: If the program doesn't have one yet, develop one now. If the program has one, review it annually and update it, if needed.
How: See the "LLU Assessment Guide²," pp. 4-11.
Why: To guide, maintain, and strengthen the program's curriculum and assessment. To prepare the program to complete the new annual Inventory of Educational Effectiveness Indicators (IEEI) survey for WSCUC and the federal government.

2. Program Review (PR) Self-study Report
When: Required; due at the end of the LLU program review cycle.
How: See the "LLU Program Review Guide³," pp. 3-6, 12-18.
Why: To evaluate quality, effectiveness, currency, viability, sustainability, alignment with mission, and priorities.

3. PR External Review Team Visit and Report
When: Required; due at the end of the LLU program review cycle.
How: See the "LLU Program Review Guide," pp. 23-29.
Why: To provide an outside perspective that offers a constructive, expert analysis of program quality and recommendations for future planning and improvements.

4. PR Action Plan
When: The full plan is the last step of the PR process during the PR cycle. Annual updates: due end of October.
How: See the "LLU Program Review Guide," pp. 8-9, 22, and the Sample Action Plan in the "LLU Program Review Guide," pp. 30-32.
Why: Derived from the Overview of Proposed Changes in the self-study document, the Action Plan reflects the recommendations in the External Review Report. It is the program's blueprint for planning, implementing, and tracking program development.

5. School Administrative Review Feedback
When: First time ever requested - TBA.
How: In development.
Why: To give programs essential feedback that will help them continue to grow and improve.

6. ILO and Professional ILO Assessment
When: Annual updates: due end of October.
How: See the "LLU Assessment Guide," pp. 17-18.
Why: To ensure that all LLU graduates at every level and discipline have these important skills.

7. Inventory of Educational Effectiveness Indicators (IEEI)
When: Annual updates each spring, with LLU's WSCUC annual report.
How: See the WSCUC IEEI Rubric⁴ in this document.
Why: To demonstrate to WSCUC and the federal government that each of our academic degree programs has an assessment plan and participates in program review.

¹ Assessment is based on ILOs and PLOs and is tracked over time.
² http://home.llu.edu/academics/academic-resources/educational-effectiveness/assessment
³ http://home.llu.edu/academics/academic-resources/educational-effectiveness/program-review
⁴ https://www.wascsenior.org/content/inventory-educational-effectiveness-indicators-ieei

LLU Program Review and Assessment Terms

Assessment: Processes that identify, collect, use, and prepare data that can be used to evaluate students' achievement. Assessment includes Institutional Learning Outcomes, Program Learning Outcomes, Course Learning Outcomes, and so on. Assessments for Learning Outcomes are usually done with rubrics. Course or assignment grades and tests are not considered to be assessments.

LLU Institutional Learning Outcomes (ILOs): (1) Written Communication, (2) Oral Communication, (3) Quantitative Reasoning, (4) Information Literacy, and (5) Critical Thinking. Critical Thinking is the ILO for this academic year (2017).

School Assessment Specialist: Each school assigns at least one Assessment Specialist to coordinate school assessment activities. These individuals have evaluation and measurement experience and receive additional training and support from the Office of Educational Effectiveness.

WSCUC-only: Programs that do not have professional or external accreditation. These programs must follow the full LLU program review process.

WSCUC-plus: Programs that have professional, discipline-specific accreditation. These programs follow their professional accreditors' guidelines for the self-study report, external site review, and action plan. For LLU, they will also complete ILO/Professional ILO assessment.

Program Review and Assessment Resources

All of these resources are listed and linked on the myLLU.edu Faculty Page:

Assessment Management System (AMS): Enter annual (1) ILO and Professional ILO assessment analysis reports and (2) Action Plan updates.

LiveText: Conduct ILO/Professional ILO and PLO assessments in LiveText for instant and detailed analytics as a basis for the program's annual analysis report. These assessments will be kept over the long term to look for strengths, gaps, and trends.

Assessment (Office of Educational Effectiveness - OEE): This site has everything a program needs to know about program review, ILO/Professional ILO assessments, distance and digital education, institutional research, and more.

Canvas: LLU's learning management system (LMS) for all courses: face-to-face, hybrid, and fully online. Although course assignment rubrics and grades can be kept in Canvas, it is not designed to be an assessment management system.

Professional Institutional Learning Outcomes – Alternative Approach⁵

LLU has new freedom and responsibility to define, assess, and document learning in our own way. The following alternative approach to LLU's existing ILOs and assessment tools is one of the first of these freedoms, enabling programs to meet their unique needs:

1. Transform the Institutional Learning Outcomes (ILOs) into Professional Institutional Learning Outcomes (Professional ILOs). Note: The Professional ILOs do not replace the regular Program Learning Outcomes (PLOs) that address the specific curriculum and skills of the program's discipline or profession.

2. Develop definitions for the ILOs that are meaningful for the program's discipline and level.

3. Select or develop a rubric that accurately assesses student learning for the program's new Professional ILO definitions. Programs may continue to use the current LLU/AAC&U VALUE Rubrics or a rubric used within the program's discipline/profession.

4. Alternatively, programs may choose to use clinical versions of the LLU/AAC&U rubrics that are currently being developed by the Learning Outcomes Committee. The clinical version of the Critical Thinking rubric should be available in February.

All programs may choose whether to continue to use the regular ILOs or to implement the Professional ILO alternative assessment process.

⁵ A more complete resource on LLU's Professional ILOs is posted here: http://home.llu.edu/academics/academic-resources/educational-effectiveness/llu-institutional-learning-outcomes


Inventory of Educational Effectiveness Indicators (IEEI) Form

The IEEI requests brief narrative information for each degree program, for general education (if applicable), and for the institution as a whole. The IEEI provides a comprehensive overview of the institution's assessment processes that teams, the Commission, and the institution itself may use to evaluate educational effectiveness.

*The relevant definition of "program" as presented in the glossary of the 2013 Handbook is "a systematic, usually sequential, grouping of courses that forms a considerable part, or all, of the requirements for a degree in a major or professional field."

How can institutions use this exhibit? Institutions will want to be explicit about expectations for student learning and to ensure that every degree program has in place a quality assurance system for assessing, tracking, and improving the learning of its students. This exhibit can assist institutions in determining the extent to which they have assessment systems in place, and what additional components or processes they may need to develop. Institutions may draw upon or reference this document in preparing institutional reports.

Why is WSCUC interested in this information? An institution committed to student achievement and educational effectiveness will have in place a system for collecting and using evidence to set standards of student performance and to improve learning. The indicators asked for in this exhibit reflect how an institution approaches quality assurance and improvement systematically. Institutions submit the IEEI to WSCUC as follows:

• Reaffirmation and Seeking Initial Accreditation: The evaluation team will review the institution's IEEI to help understand how comprehensively and successfully the institution addresses both the quality of its students' learning and the quality of the learning and assessment infrastructure. Teams and institutions are encouraged to treat this exhibit as a developmental document: the institution can indicate what activities it already engages in and what remains to be done.

• Mid-Cycle Review: Institutions submit an update of their IEEI with the Annual Report in the year of the institution's Mid-Cycle Review as a set of indicators related to educational effectiveness and student achievement.

• Interim Reports: Institutions submitting Interim Reports concerned with educational effectiveness submit an updated IEEI with their report when requested by the Commission.

What 2013 Standards are addressed by this exhibit? The indicators listed in this exhibit collectively demonstrate an institution's commitment to quality assurance and improvement of educational results over time (CFRs 4.1, 4.3, and 4.4). Specific standards related to academic quality and effectiveness are addressed by the IEEI as follows:

• Educational objectives are widely recognized throughout the institution, are consistent with stated purposes, and are demonstrably achieved (CFR 1.2)
• All degrees have clearly defined levels of student achievement (CFR 2.2)
• Undergraduate programs ensure the development of core competencies (CFR 2.2.a)
• Graduate programs establish clearly stated objectives (CFR 2.2.b)
• Student learning outcomes and standards of performance are clearly stated at the course, program, and, as appropriate, institutional level (CFR 2.3)
• Learning outcomes and standards of performance are developed by faculty, who take collective responsibility for establishing appropriate standards of performance and demonstrating through assessment the achievement of these standards (CFR 2.4)
• The institution demonstrates that its graduates consistently achieve its stated learning outcomes and established standards of performance (CFR 2.6)
• All programs offered by the institution undergo systematic program review, which includes analyses of student achievement of the program's learning outcomes; retention and graduation rates; and, where appropriate, results of licensing examination and placement, and evidence from external constituencies such as employers and professional organizations (CFR 2.7)

Rev 4/2015


Inventory of Educational Effectiveness Indicators

The form asks the following six questions at the institutional level, for general education (if an undergraduate institution), and for each degree program (listed 1, 2, 3, and so on):

(1) Have formal learning outcomes been developed? (Yes/No)
(2) Where are these learning outcomes published (e.g., catalog, syllabi, other materials)?
(3) Other than GPA, what data/evidence are used to determine that graduates have achieved the stated outcomes for the degree (e.g., capstone course, portfolio review, licensure examination)?
(4) Who interprets the evidence? What is the process?
(5) How are the findings used?
(6) Date of the last program review for this degree program.


PROGRAM LEARNING OUTCOMES RUBRIC
Rubric for Assessing the Quality of Academic Program Learning Outcomes

Criterion 1: Comprehensive List
• Initial: The list of outcomes is problematic: e.g., very incomplete, overly detailed, inappropriate, and disorganized. It may include only discipline-specific learning, ignoring relevant institution-wide learning. The list may confuse learning processes (e.g., doing an internship) with learning outcomes (e.g., application of theory to real-world problems).
• Emerging: The list includes reasonable outcomes but does not specify expectations for the program as a whole. Relevant institution-wide learning outcomes and/or national disciplinary standards may be ignored. Distinctions between expectations for undergraduate and graduate programs may be unclear.
• Developed: The list is a well-organized set of reasonable outcomes that focus on the key knowledge, skills, and values students learn in the program. It includes relevant institution-wide outcomes (e.g., communication or critical thinking skills). Outcomes are appropriate for the level (undergraduate vs. graduate); national disciplinary standards have been considered.
• Highly Developed: The list is reasonable, appropriate, and comprehensive, with clear distinctions between undergraduate and graduate expectations, if applicable. National disciplinary standards have been considered. Faculty has agreed on explicit criteria for assessing students' level of mastery of each outcome.

Criterion 2: Assessable Outcomes
• Initial: Outcome statements do not identify what students can do to demonstrate learning. Statements such as "Students understand scientific method" do not specify how understanding can be demonstrated and assessed.
• Emerging: Most of the outcomes indicate how students can demonstrate their learning.
• Developed: Each outcome describes how students can demonstrate learning, e.g., "Graduates can write reports in APA style" or "Graduates can make original contributions to biological knowledge."
• Highly Developed: Outcomes describe how students can demonstrate their learning. Faculty has agreed on explicit criteria statements, such as rubrics, and has identified examples of student performance at varying levels for each outcome.

Criterion 3: Alignment
• Initial: There is no clear relationship between the outcomes and the curriculum that students experience.
• Emerging: Students appear to be given reasonable opportunities to develop the outcomes in the required curriculum.
• Developed: The curriculum is designed to provide opportunities for students to learn and to develop increasing sophistication with respect to each outcome. This design may be summarized in a curriculum map.
• Highly Developed: Pedagogy, grading, the curriculum, relevant student support services, and co-curriculum are explicitly and intentionally aligned with each outcome. The curriculum map indicates increasing levels of proficiency.

Criterion 4: Assessment Planning
• Initial: There is no formal plan for assessing each outcome.
• Emerging: The program relies on short-term planning, such as selecting which outcome(s) to assess in the current year.
• Developed: The program has a reasonable, multi-year assessment plan that identifies when each outcome will be assessed. The plan may explicitly include analysis and implementation of improvements.
• Highly Developed: The program has a fully articulated, sustainable, multi-year assessment plan that describes when and how each outcome will be assessed and how improvements based on findings will be implemented. The plan is routinely examined and revised, as needed.

Criterion 5: The Student Experience
• Initial: Students know little or nothing about the overall outcomes of the program. Communication of outcomes to students, e.g., in syllabi or the catalog, is spotty or nonexistent.
• Emerging: Students have some knowledge of program outcomes. Communication is occasional and informal, left to individual faculty or advisors.
• Developed: Students have a good grasp of program outcomes. They may use them to guide their own learning. Outcomes are included in most syllabi and are readily available in the catalog, on the web page, and elsewhere.
• Highly Developed: Students are well-acquainted with program outcomes and may participate in the creation and use of rubrics. They are skilled at self-assessing in relation to the outcomes and levels of performance. Program policy calls for inclusion of outcomes in all course syllabi, and they are readily available in other program documents.


Guidelines on Using the Learning Outcomes Rubric

This rubric is intended to help teams assess the extent to which an institution has developed and assessed program learning outcomes and made improvements based on assessment results. For the fullest picture of an institution's accomplishments, reviews of written materials should be augmented with interviews at the time of the visit.

Dimensions of the Rubric:

1. Comprehensive List. The set of program learning outcomes should be a short but comprehensive list of the most important knowledge, skills, and values students learn in the program. Higher levels of sophistication are expected for graduate program outcomes than for undergraduate program outcomes. There is no strict rule concerning the optimum number of outcomes, but quality is more important than quantity. Learning processes (e.g., completing an internship) should not be confused with learning outcomes (what is learned in the internship, such as application of theory to real-world practice).

Questions: Is the list reasonable, appropriate, and well organized? Are relevant institution-wide outcomes, such as information literacy, included? Are distinctions between undergraduate and graduate outcomes clear? Have national disciplinary standards been considered when developing and refining the outcomes? Are explicit criteria – as defined in a rubric, for example – available for each outcome?

2. Assessable Outcomes. Outcome statements specify what students can do to demonstrate their learning. For example, an outcome might state, "Graduates of our program can collaborate effectively to reach a common goal" or "Graduates of our program can design research studies to test theories." These outcomes are assessable because the quality of collaboration in teams and the quality of student-created research designs can be observed. Criteria for assessing student products or behaviors usually are specified in rubrics that indicate varying levels of student performance (i.e., work that does not meet expectations, meets expectations, and exceeds expectations).

Questions: Do the outcomes clarify how students can demonstrate learning? Are there agreed-upon, explicit criteria, such as rubrics, for assessing each outcome? Are there examples of student work representing different levels of mastery for each outcome?

3. Alignment. Students cannot be held responsible for mastering learning outcomes without a curriculum that is designed to develop increasing sophistication with respect to each outcome. This design is often summarized in a curriculum map, a matrix that shows the relationship between courses in the required curriculum and the program's learning outcomes. Pedagogy and grading aligned with outcomes help encourage student growth and provide students feedback on their development.

Questions: Is the curriculum explicitly aligned with the program outcomes? Do faculty select effective pedagogy and use grading to promote learning? Are student support services and the co-curriculum explicitly aligned to reinforce and promote the development of student learning outcomes?

4. Assessment Planning. Programs need not assess every outcome every year, but faculty are expected to have a plan to cycle through the outcomes over a reasonable period of time, such as the timeframe for program review.

Questions: Does the plan clarify when, how, and how often each outcome will be assessed? Will all outcomes be assessed over a reasonable period of time? Is the plan sustainable, in terms of human, fiscal, and other resources? Are assessment plans revised, as needed?

5. The Student Experience. At a minimum, students need to be aware of the learning outcomes of the program(s) in which they are enrolled. Ideally, they could be included as partners in defining and applying the outcomes and the criteria for varying levels of accomplishment.

Questions: Are the outcomes communicated to students consistently and meaningfully? Do students understand what the outcomes mean and how they can further their own learning? Do students use the outcomes and criteria to self-assess? Do they participate in reviews of outcomes, criteria, curriculum design, or related activities?


PROGRAM REVIEW RUBRIC
Rubric for Assessing the Integration of Student Learning Assessment into Program Reviews

Criterion 1: Required Elements of the Self-Study
• Initial: Program faculty may be required to provide a list of program-level student learning outcomes.
• Emerging: Faculty are required to provide the program's student learning outcomes and summarize annual assessment findings.
• Developed: Faculty are required to provide the program's student learning outcomes, annual assessment studies, findings, and resulting changes. They may be required to submit a plan for the next cycle of assessment studies.
• Highly Developed: Faculty are required to evaluate the program's student learning outcomes, annual assessment findings, benchmarking results, subsequent changes, and evidence concerning the impact of these changes. They present a plan for the next cycle of assessment studies.

Criterion 2: Process of Review
• Initial: Internal and external reviewers do not address evidence concerning the quality of student learning in the program other than grades.
• Emerging: Internal and external reviewers address indirect and possibly direct evidence of student learning in the program; they do so at the descriptive level, rather than providing an evaluation.
• Developed: Internal and external reviewers analyze direct and indirect evidence of student learning in the program and offer evaluative feedback and suggestions for improvement. They have sufficient expertise to evaluate program efforts. Departments use the feedback to improve their work.
• Highly Developed: Well-qualified internal and external reviewers evaluate the program's learning outcomes, assessment plan, evidence, benchmarking results, and assessment impact. They give evaluative feedback and suggestions for improvement. The department uses the feedback to improve student learning.

Criterion 3: Planning and Budgeting
• Initial: The campus has not integrated program reviews into planning and budgeting processes.
• Emerging: The campus has attempted to integrate program reviews into planning and budgeting processes, but with limited success.
• Developed: The campus generally integrates program reviews into planning and budgeting processes, but not through a formal process.
• Highly Developed: The campus systematically integrates program reviews into planning and budgeting processes, e.g., through negotiating formal action plans with mutually agreed-upon commitments.

Criterion 4: Annual Feedback on Assessment Efforts
• Initial: No individual or committee on campus provides feedback to departments on the quality of their outcomes, assessment plans, assessment studies, impact, etc.
• Emerging: An individual or committee occasionally provides feedback on the quality of outcomes, assessment plans, assessment studies, etc.
• Developed: A well-qualified individual or committee provides annual feedback on the quality of outcomes, assessment plans, assessment studies, etc. Departments use the feedback to improve their work.
• Highly Developed: A well-qualified individual or committee provides annual feedback on the quality of outcomes, assessment plans, assessment studies, benchmarking results, and assessment impact. Departments effectively use the feedback to improve student learning. Follow-up activities enjoy institutional support.

Criterion 5: The Student Experience
• Initial: Students are unaware of and uninvolved in program review.
• Emerging: Program review may include focus groups or conversations with students to follow up on results of surveys.
• Developed: The internal and external reviewers examine samples of student work, e.g., sample papers, portfolios, and capstone projects. Students may be invited to discuss what they learned and how they learned it.
• Highly Developed: Students are respected partners in the program review process. They may offer poster sessions on their work, demonstrate how they apply rubrics to self-assess, and/or provide their own evaluative feedback.


Guidelines for Using the Program Review Rubric

For the fullest picture of an institution's accomplishments, reviews of written materials should be augmented with interviews at the time of the visit.

Dimensions of the Rubric:

1. Self-Study Requirements. The campus should have explicit requirements for the program's self-study, including an analysis of the program's learning outcomes and a review of the annual assessment studies conducted since the last program review. Faculty preparing the self-study can reflect on the accumulating results and their impact, and plan for the next cycle of assessment studies. As much as possible, programs can benchmark findings against similar programs on other campuses.

Questions: Does the campus require self-studies that include an analysis of the program's learning outcomes, assessment studies, assessment results, benchmarking results, and assessment impact, including the impact of changes made in response to earlier studies? Does the campus require an updated assessment plan for the subsequent years before the next program review?

2. Self-Study Review. Internal reviewers (on-campus individuals) and external reviewers (off-campus individuals, usually disciplinary experts) evaluate the program's learning outcomes, assessment plan, assessment evidence, benchmarking results, and assessment impact; and they provide evaluative feedback and suggestions for improvement.

Questions: Who reviews the self-studies? Do they have the training or expertise to provide effective feedback? Do they routinely evaluate the program's learning outcomes, assessment plan, assessment evidence, benchmarking results, and assessment impact? Do they provide suggestions for improvement? Do departments effectively use this feedback to improve student learning?

3. Planning and Budgeting. Program reviews should not be pro forma exercises; they should be tied to planning and budgeting processes, with expectations that increased support will lead to increased effectiveness, such as improved student learning and retention rates.

Questions: Does the campus systematically integrate program reviews into planning and budgeting processes? Are expectations established for the impact of planned changes?

4. Annual Feedback on Assessment Efforts. Institutions often find considerable variation in the quality of assessment efforts across programs. While program reviews encourage departments to reflect on multi-year assessment results, some programs are likely to require more immediate feedback, usually based on a required annual assessment report. This feedback might be provided by an assessment director or committee, a relevant dean, or others; whoever has this responsibility should have the expertise to provide quality feedback.

Questions: Does an individual or committee have responsibility for providing annual feedback on the assessment process? Does this person or team have the expertise to provide effective feedback? Does this person or team routinely provide feedback on the quality of outcomes, assessment plans, assessment studies, benchmarking results, and assessment impact? Do departments effectively use this feedback to improve student learning?

5. The Student Experience. Students have a unique perspective on a given program of study: they know better than anyone what it means to go through it as a student. Program review can take advantage of that perspective and build it into the review.

Questions: Are students aware of the purpose and value of program review? Are they involved in preparations and the self-study? Do they have an opportunity to interact with internal or external reviewers, demonstrate and interpret their learning, and provide evaluative feedback?


Updated Online Course Reclassification Process
Loma Linda University • Division of Extended Education

In August 2016, the Division of Extended Education (DoEE) developed an Online Course Reclassification Process. It included an initial review at the school level that, when completed, would be sent to the DoEE for final review and reclassification. Because this process placed a burden on the schools, an alternative process has been developed in collaboration with Educational Technology Services (ETS). It is now active.

Below are the steps for online course instructors to initiate the upgrade process for their online courses that were designated as either “not aid eligible” or “correspondence” in the recent Online Course Audit. These online instructors should:

1. Send a request to ETS for a meeting with an instructional designer:
https://llu.co1.qualtrics.com/SE/?SID=SV_eQ0F0y97VUvLZlP

a. A response from ETS will be sent to the instructor within 24 hours.

2. Work with ETS to set up an appointment with an instructional designer who will help the instructor learn what needs to be done and how to move the course to the next classification level. The instructor may need to meet with an instructional designer several times.

3. Complete the "Design and Development Reflection" survey (TBA) when the course has been satisfactorily updated.

a. A second instructional designer will then review the course, largely guided by the reflection survey responses.

b. The instructor and Academic Dean will be notified regarding the results of the course's reclassification.

c. The course's reclassification will be entered into the current course entry system for Banner.

4. Complete the "Implementation Reflection" survey (TBA) upon completion of teaching the redesigned course.

a. An instructional designer will then review/audit the completed course to confirm that it was taught the way it was designed. If it was taught as designed, the updated classification will be retained. If it was not taught as designed, the course will be moved back to the previous classification and will need to go through the reclassification process again.

b. The instructor and Academic Dean will be notified about the results of this final review.

This process will allow online courses to be reclassified before they have been taught, while still providing a final review upon completion of the course the next time it is taught. The final review will validate the instructor's interactions and follow-through on the new course design features and strategies.


Distance Education at Loma Linda University

Excellent online teaching promotes student success. It also helps LLU to meet federal regulations for Title IV student financial aid eligibility.

Active Learning. Active learning is the key to both online and face-to-face student success. Some online instructors have asked why the federal government is focusing on distance education and not on face-to-face education. The federal government is trying to ensure that online students receive a full and complete learning experience equivalent to that of students in face-to-face programs and courses. Thus, what it requires for distance education gives a good indication of what it believes is or should be happening in the f2f classroom.

Required for Distance Education and Hybrid Courses. In brief, here are some of the things that are required at LLU:

• Regular and substantive interactions that include compelling discussions that draw in the students
• Weekly instruction by the instructor
• A weekly requirement for student participation (attendance)
• A clear and complete course syllabus
• Meaningful and timely feedback
• Responses to students' questions within 24 hours during the week
• Weekly instruction that meets LLU's credit hour policy and definition
• Inclusion of the Loma Linda experience through Mission Focused Learning (MFL)

Get to Know the Students. Online students are just as real and important as face-to-face students. Instructors should take the time and effort to get to know them.

When the Instructor Can't Teach: Unplanned. If the instructor unexpectedly must miss a class session due to extenuating circumstances, including illness or accident, they should at least let the students know what has happened as quickly as possible and notify the program director. A qualified substitute with the appropriate degree and expertise that meets accreditation standards should teach the course for the duration of the instructor's absence.

When the Instructor Can't Teach: Planned. If the instructor knows before the course begins that they will be absent for one or more weeks, such as to attend a conference, and does not plan to interact with the students during that time in Canvas, the course must be labeled ahead of time as a Hybrid Correspondence course. However, if the instructor interacts with the students and asynchronously runs the class while attending the conference, etc., the course must be labeled ahead of time as a Hybrid Distance Education course.

Synchronous Courses. If a course is taught via Zoom, it should be noted in the course schedule ahead of time that it is a synchronous online course. Otherwise, students may not be able to attend due to schedule complications.

For more complete information, please review the LLU Distance Education Instructor Guide.
