
Learning Lab/Student Academic Computer Center Department Assessment Report, Spring 2014

Description

Falling under the College’s Division of Educational Support Services (ESS), the Learning Lab/Student Academic Computer Center Department provides various support services to help students achieve their academic goals. By offering a student-centered learning experience as well as access to computer workstations with various software applications required in all disciplines at the College, the Learning Lab/Student Academic Computer Center Department strives to give students opportunities to recognize their potential and become independent learners.

The Learning Lab/Student Academic Computer Center Department has multiple locations on the Main campus and at the Regional Centers. On the Main campus, these locations include three Learning Labs (Humanities, Math, and Science labs) and three Student Academic Computer Center (SACC) labs (two general access labs and one specialty lab for Architecture and Art students). Each Regional Center houses Learning Labs and SACC labs: a Learning Commons area at both the Northeast Regional Center and West Regional Center and separate site locations at the Northwest Regional Center.

Services Available in the Learning Labs

The Learning Lab Department offers free academic support services to current CCP students. Faculty specialists and tutors with expertise in a broad range of subject areas are available to provide students with a personalized learning experience. All of the Learning Lab Department’s services are designed to help students achieve success throughout their college careers.

• Individual tutoring by appointment or first-come, first-served drop-in sessions for a variety of subjects

• Workshops in Reading, Writing, Science, Mathematics, and English as a Second Language

• Master Student Workshops designed to help students develop the skills necessary to achieve academic success (for example, Time Management, Note-Taking, Test-Taking, Learning Styles, etc.)

• Study groups in Biology, Chemistry, Mathematics, and English as a Second Language

• Online Tutoring

• Support services and computer-assisted instruction for students with disabilities

Services Available in the Student Academic Computer Center

The Student Academic Computer Center (SACC) offers students free access to computer workstations to assist them as they conduct research or complete other academic work for their classes. The computers in each SACC lab are equipped with various software applications, email, Internet access, and laser printing capability. At most locations, students must have a valid College ID and be currently enrolled to gain entry into a SACC lab to use the computers. Students should also familiarize themselves with printing limits and other guidelines posted in each lab location.

• Individual workstations equipped with various software applications, such as Windows 7/Microsoft Office 2010, Adobe PageMaker, and Adobe Illustrator

• Instructional aides and student lab helpers who can assist students and troubleshoot

• Free laser printing (daily printing limits are enforced)

• Tutoring for CIS 103 and OA 106

• Adaptive technology and other accommodations for students with disabilities

• Workshops designed to help increase students’ computer literacy skills

Assessment Report

Background

The Learning Lab/SACC Department underwent fundamental changes after the ENGL 098 Lab classes (linked to Level 2 and Level 3 ENGL classes) were discontinued at the end of Spring 2012. Anticipating these changes, the department submitted a vision plan to Dean Bush in Fall 2011 and began developing an assessment plan for existing services (included as Appendix 1). The department also wanted to begin collecting data that would justify departmental decisions to maintain, expand, or initiate services.

This document, which the Learning Lab/SACC Department is preparing for inclusion in the Middle States Self Study, is a comprehensive collection of reports dating back to Fall 2012, comprising data tabulations and analyses that the members of the Learning Lab/SACC Department hope will demonstrate the faculty’s commitment not only to student learning and success but also to the betterment of the department’s services. The reports and analyses also reflect our attempts to follow the assessment plan detailed in Appendix 1; however, the department has recently decided to revise its assessment strategy for the future, as explained and proposed in Appendix 18, “Learning Lab/SACC Universal Assessment Template”.

Highlights include, but are not limited to: correlations indicating that ENGL 101 and MATH 118 tutoring have a positive impact on student performance and success (for example, students who sought math tutoring approximately 10 times within a semester could improve their final course grade by half a grade point); a correlation between attendance in ESL Learning Lab classes and pass rates, with pass rates ten percentage points lower in classes with no lab class; and data tabulations showing steady increases in usage of the SACC labs and Learning Commons areas.

Challenges have included, but are not limited to: identifying viable methods to collect and analyze data that provide direct evidence of the effectiveness of tutoring and other support services for students’ academic success; addressing a generally wary, and sometimes contentious, response from administration about defining and demonstrating “effectiveness”; and navigating obstacles in collecting data through ITS/Bantasks.

Strengths of our department and assessment plan are 1) providing student-centered, supplemental instruction services through tutoring, workshops, study groups, and ESL Lab classes; 2) collaborating with faculty across the disciplines to support their courses and students; 3) being committed to developing new initiatives to expand support services to students (e.g., Online Tutoring, Workshops-to-Go, FYI Course); 4) enabling access to computers and printers in our SACC labs and Learning Commons areas; and 5) having a dedicated and professional staff composed of faculty, tutors, student workers, and administrative assistants.

Recommendations for improvement include obtaining administrative support and increased funding for 1) more Internet access points in the Learning Labs and SACC labs on the Main campus (for example, the Bonnell SACC lab, B2-33, and the Central Learning Lab, B1-28, do not have WiFi access); 2) more computer workstations in all of the SACC labs to meet user demand; 3) facilities improvements in the Learning Labs and the Bonnell SACC lab on the Main campus; and 4) better safety and security systems, equipment, and safety/security personnel response. The department is also committed to developing a department-wide standard assessment method and to devoting more attention to collecting, analyzing, and reporting evidence that shows SACC’s impact on student success.

Table of Appendices

Data for Tutoring and Learning Lab Services

Appendix 1. “Revised Vision Statement with Assessment Proposal” (By the LLab/SACC Department, Submitted February 2012, pp. 5-17)

Appendix 2. “Memo: Data Analysis on ESL Lab Classes” (By Ted Wong, FT Science Specialist, LLab, Submitted March 2012, pp. 18-20)

Appendix 3. “Who is Attending Tutoring?: Assessment of Retention Performance Indicators and Learning Lab Outreach for Math 016, 017, and 118 and English 098 and 101” (By Megan Fuller, FT Science Specialist, LLab, Submitted August 2012, pp. 21-29)

Appendix 4. “Learning Lab and Student Academic Computer Center Survey Fall 2012” (By Megan Fuller and Murray Lowenthal, FT Science Specialist and VL Math Specialist, LLab, Submitted October 2012, pp. 30-33)

Appendix 5. “Saturday Tutoring Attendance” (By Megan Fuller, Submitted October 2012, pp. 34-35)

Appendix 6. “Tutoring Report: English 101 and Math 118, Fall 2011” (By Megan Fuller, Submitted Fall 2012, pp. 36-38)

Appendix 7. “Memo: Full-Time Faculty Positions and Evidence of Effectiveness” (By Megan Fuller, Submitted October 2012, pp. 39-40)

Appendix 8. “Memo: Access to Student-Performance Data” (By Megan Fuller and Ted Wong, Submitted February 2013, pp. 41-42)

Appendix 9. “PBI Fund: Peer Tutoring in Mathematics” (By Lilla Hudoba, FT Math Specialist, LLab, Submitted July 2013, p. 43)

Appendix 10. “Outcomes Assessment for Math Tutoring” (By Megan Fuller, Submitted August 2013, pp. 44-47)

Appendix 11. “Summary Workshops-to-Go, Fall Totals” (By Joan Monroe, FT Learning Disabilities Specialist with Reading/Writing Concentration, LLab, Submitted October 2013, p. 48)

Appendix 12. “Online Tutoring Initiative, Usage Summary Fall 2013” (By Ellen Moscow, VL Reading/Writing Specialist, LLab, Submitted December 2013, p. 49)

Appendix 13. “CCP LLab Online Tutoring Usage, Fall 2013” (By WorldWideWhiteboard via Ellen Moscow, Submitted December 2013, p. 50)

Appendix 14. “Online Tutoring, Usage Report Fall 2013” (By Paul Bonila and Ellen Moscow, Submitted January 2014, pp. 51-52)

Usage Data for SACC Labs

Appendix 15. “SACC and Learning Commons, Saturday Usage, Spring 2012 and Spring 2013” (By Bantasks, Submitted October 2013, p. 53)

Appendix 16. “WERC SACC Usage, Fall 2012 and Spring 2013” (By Bantasks, Submitted June 2013, p. 54)

Appendix 17. “LLab/SACC Student Contacts, Jan 2013-Dec 2013” (By Bantasks, Submitted Dec 2013, p. 55)

Current Assessment Plan for Learning Lab/SACC Department

Appendix 18. “Learning Lab/SACC Universal Assessment Template” (By Ted Wong, Submitted January 2014, pp. 56-59)

APPENDIX 1

Learning Lab/SACC Department Assessment Proposal

Mission

The Learning Lab provides supplemental college-level and developmental content-based instruction across all curricula, effectively assisting students in recognizing their potential, becoming independent learners, and achieving their goals. In support of the College mission, the Learning Lab provides opportunities for student success by offering a student-centered learning experience that is guided by a personalized, structured, and problem-solving approach.

The Student Academic Computer Center provides CCP’s diverse population of students with access to computer workstations with various software applications required in all disciplines at the College, including e-mail, the Web, and laser printing. Encouraging innovation and excellence across the curricula, SACC assists lab users in the use of information technology for the creation, organization, analysis, and presentation of scholarly endeavors.

Changes in LLab/SACC, Priority Areas, and Current Assessment Proposal

The Learning Lab/SACC Department is undergoing fundamental changes in the coming year with the discontinuation of the ENG 098 Lab classes. We view this time as an opportunity to evaluate current services and construct a plan for the future of the Department. Successful existing programs will be enhanced and new initiatives will be designed to ensure optimal student support and improve student outcomes.

A vision plan was submitted to Dean Bush in Fall 2011. The new concepts and programs outlined in this vision were crafted to reflect and support the Departmental Mission statement and to contribute to the realization of the College’s 2008-2012 Strategic Plan. It is the Department’s intention to continue to provide college- and developmental-level tutoring, group work, and supplemental instruction across the curricula. The proposed activities, programs, and technologies are aimed at establishing a more student-centered and learning-centered culture at the College while improving intervention strategies and student support.

The current assessment proposal considers six priority areas from the original vision plan and identifies an assessment strategy for each. These areas are 1) Tutoring, 2) Study Groups, 3) Writing Center, 4) FYI, 5) Online Tutoring, and 6) SACC Initiatives.

Outline of Current Services

In support of the College mission, the Learning Lab-SACC Department provides opportunities for student success by offering a student-centered learning experience that is guided by a personalized, structured and problem-solving approach. The faculty in the Department believes in the importance of one-on-one and small group learning environments and their impact on student success. This belief informed and directed the development of new initiatives and supported the decision to continue and expand several current practices.

Brief Description of Existing Services

The Learning Lab and SACC Department currently offers a large range of services to students. As the faculty works to advance the Departmental vision and offer new methods of student support, it is dedicated to continuing many of the existing services that have proven successful in the past. This list is not a complete description of departmental offerings, but rather a highlight of the most utilized and most impactful practices that the Department will continue to provide to the student body.

SACC Services

• Encourage the appropriate and effective use of educational technology by students in the pursuit of excellence in education

• Offer students access to computer workstations with a variety of software applications required in all disciplines at the College

• Supervise and staff Student Academic Computer Centers (SACC) on all campuses

• Provide workshops covering a variety of computer/IT-related topics intended to help students access information and interact with various software programs

• Interact with faculty to keep informed of new software/technology or upgrades being implemented on campus and acquire appropriate support materials for SACC

Mathematics Services

One-on-one peer tutoring is available at all campuses. Students are allowed one scheduled appointment per week per math course being taken. Students are also allowed to attend 'drop-in' appointments each week if there is a tutor available.

Weekly workshops led by a faculty member are available on the Main Campus during fall, spring and summer semesters. Workshops occur throughout the week and on Saturdays at a variety of times.

Allied Health Test Prep Workshops are scheduled with input from the Counseling Department in conjunction with actual testing dates for the Allied Health Test...

Clinical Calculations Review Sessions are established on an as needed basis, usually initiated by first and second year nursing students.

Study groups are available on a limited basis and are usually arranged due to special circumstances.

Science Services

One-on-one peer tutoring is available at the Main, NE, and NW campuses. Students are allowed one scheduled fifty-minute appointment per week per science course being taken. Students are also allowed to attend 'drop-in' appointments each week if there is a tutor available.

Study groups led by faculty are available on the Main Campus throughout the week. A study group must be set up by students who share the same professor. The students select a mutually convenient time, and the group meets once a week for the semester.

BIO 106 reading workshops are available throughout the semester on the Main Campus. Students can attend workshops to learn how to utilize and navigate their BIO 106 textbook.

Allied Health Workshops, for Dental Hygiene, Respiratory, and Diagnostic Imaging students to assist them in study skills, test-taking strategies, and learning medical terminology, are conducted by a Learning Lab Reading Specialist each semester.

English Services

One-on-one faculty tutoring is available at all campuses for students in English and writing-intensive courses. Students are allowed one scheduled fifty-minute appointment per week per course being taken. Students are also allowed to attend 'drop-in' appointments each week if there is a tutor available.

Writing Center Workshops, for students in college-level English classes and all writing intensive courses, are available on the Main Campus.

English as a Second Language Services

ESL Specialists provide one-on-one tutoring for students enrolled in all levels of ESL (081/091, 082/092, 083/093, 098/099, 071, 072, and 073). ESL Specialists also tutor English 101 students transitioning from the ESL program.

ESL Specialists provide ESL Workshops on a variety of topics including Conversation Practice, Pronunciation, Paragraph and Essay Writing, and Grammar and Punctuation for ESL students enrolled in 072/073, 083/093 and 098/099 courses.

Study groups are also organized for the two upper level ESL reading and writing courses. These groups are organized by contacting the students through their English classes and by outreach to the English instructors.

Specialists consult and collaborate with ESL classroom faculty.

Humanities Services

One-on-one peer tutoring is available at the Main Campus. Students are allowed one scheduled fifty-minute appointment per week per humanities course being taken. Students are also allowed to attend 'drop-in' appointments each week if there is a tutor available.

Learning Disabilities Services

Learning Disabilities Specialists provide one-on-one, weekly appointments for students with documented learning disabilities and related disorders. Students may also attend drop-in appointments each week if specialists are available.

Topics covered during appointments include strategy development in both academic areas and self-advocacy.

Specialists consult and collaborate with classroom faculty and Center on Disability staff.

General Academic Services

Master Student Workshops, taught by Learning Lab faculty, provide an informal setting for students to learn about and apply study skills. Workshops cover a variety of topics: communicating with professors, time management, reading strategies, test taking, note taking, memory strategies, logical fallacies, and learning styles. Workshops are offered at four different times each semester on the Main Campus and have recently been offered at NERC.

Tutoring Priority Area: Assessment Proposal

Tutoring is a staple of the Learning Lab’s existing services. Tutorial support has been identified by the National Center for Developmental Education as one of the factors related to academic success at community colleges [just over half of community colleges have tutorial programs, and one-fourth evaluate their tutoring]. At CCP, students in developmental reading and writing, college-level English courses, college-level humanities and social science courses, mathematics, science, and ESL, as well as students with learning disabilities, can schedule tutoring appointments with specialists: full-time and part-time faculty with at least a Master’s Degree in their specialty areas.

Students may also be scheduled with peer tutors. Many peer tutors are CCP students, but others are enrolled in nearby colleges or universities or are former CCP students. They are selected on the basis of their academic records, recommendations from their instructors, and their expressed interest in assisting their peers. Peer tutoring is available for most introductory courses and many advanced courses, particularly in mathematics and science. Most peer tutoring is individualized, with one tutor assisting one student, though tutoring may also be provided for small groups of students from the same class.

Tutoring Outcomes Assessment

Committee Members: Joan Monroe, Gail Chaskes, Lilla Hudoba, Olympia Mitchell, Diane McManus, Gerald Nwankwo, Tom Hinchcliffe, Betsy Elijah

Goal #1

Evaluate and improve data collection procedures for evening tutoring on Main Campus.

Objectives/Actions:

Train specialists, tutors, receptionists and secretaries in data collection procedures for evening tutoring on the Main Campus

Explore how Banner can be used to aid in data collection and analysis and implement a process/procedure.

Assessment

Compare paper appointment records to Banner appointment records to determine accuracy of data in Banner

Goal #2

Determine the impact of tutoring on student retention at the College.

Objectives/Actions:

Identify course/courses for data collection and analysis

Identify specific criteria for data collection.

Analyze Banner reports

Assessment

Compare student retention rates for those utilizing tutoring to college-wide retention rates using the College’s method of Fall to Fall Retention

Expanding Services

The Department is in the process of expanding some of its most popular and needed services. The programs that have been identified for expansion are detailed below.

Study Group Priority Area: Assessment Proposal

The study group model is currently used with great success in science courses, and with lesser frequency in math courses. The faculty have determined that the model should be implemented for other curricula across the College. The most notable courses to which the study group model will be applied are ENG 098, ESL 083/093 and 098/099, MATH 118, and MATH 017. These courses are particularly important because they represent the first attempt to apply the study group model to developmental courses. Because students and professors must approach the Learning Lab if they are interested in establishing a study group, this model will engender responsibility in students and awareness of Learning Lab services among professors. Other courses that may adopt the study group model are ENG 101 and 102, ESL 081/091 and ESL 082/092, and MATH 016. Study groups offer smaller, more cohesive groups than workshops and meet at a time determined by the availability of the students, the faculty, and space. Providing focused support to students who share the same professor is a good use of faculty time. These courses were chosen for the new initiative because they are considered high-risk courses (otherwise known at the College as Gatekeeper courses) that are important prerequisites to most college-level courses.

Committee Members: Megan Fuller, Jay Howard, Lilla Hudoba, Elizabeth Cuidet, Scarlette Floyd, Judy Reitzes, Anne Francis, Tom Hinchcliffe

Study Group Model Outcomes Assessment

The use of the study group model supports both the departmental and college mission by supporting heightened intellectual curiosity and promoting sharing and learning within a community. The study group program offers important student benefits, as well as departmental advantages. Because of these distinct levels of impact, two major goals were defined for the study group program.

Goal #1

Departmental Level: Study groups provide an opportunity for Learning Lab faculty and academic faculty to communicate and collaborate to aid in students’ learning. The Learning Lab will focus on building the number and quality of Learning Lab-faculty interactions through outreach and promotion of the study group program.

Objectives/Actions

Faculty in each discipline of the Learning Lab offering study groups will implement outreach to faculty and students to inform them of the study group program. The faculty in the discipline will determine the most effective outreach approach for their area.

Assessments

Each discipline will track outreach activities. Faculty interest will be noted with the intention of meeting faculty and student needs as they pertain to the formation of study groups. The frequency of faculty interest will be recorded, and any resulting study group formation will be documented.

Review

Outreach activities that generate faculty interest will be implemented in the future. New outreach initiatives may be piloted as needed to improve Lab-faculty communication and collaboration.

Goal #2

Student level: Study groups offer an opportunity for students to receive faculty-guided tutoring and content review. The group sessions will help to clarify concepts and aid students in gaining proficiency with academic skills.

Objectives/Actions

Study group sessions will be held each week to address topics and areas-of-interest as defined by the students in the group and/or the faculty members (either in the Learning Lab or from the academic course being served by the study group).

Assessments

Indirect: Two surveys will be distributed during the semester to measure students’ perceived benefit of the study group sessions. This self-reporting survey will be offered once in the first half of the semester (before midterms) and once in the second half (after midterms), at a time deemed appropriate by the Learning Lab faculty conducting the study group.

Direct: Twice during the semester students participating in a study group will be asked to submit reflective writing detailing what they have learned from their time in a study group and in what ways they have benefited both in their course and as a student in general. These statements will be collected once before midterms and once after midterms at a time deemed appropriate by the Learning Lab faculty conducting the study group.

Review

The direct and indirect assessments will be collected and analyzed for the overall impact of the study group session on the students’ academic improvement. The results of the assessment will be used to continue the evolution of the study group model. Improvements and/or changes will be made if necessary.

Writing Center Priority Area: Assessment Proposal

Established in Fall 2004, the Writing Center has provided workshops and one-on-one tutoring to Community College of Philadelphia students in college-level writing courses. While much of the service delivery and promotion of the workshops and tutoring sessions directly targets students in English 101 and 102, faculty have determined that these services should be promoted more assertively to faculty across departments, disciplines, and curricula as a resource for their students as well, particularly where essay writing and research papers will be assigned.

Committee Members: John Pinto, John Nace, Olympia Mitchell, Lucia Gbaya-Kanga, Gary Mitchell

Writing Center Outcomes Assessment

Goals

The goal of this assessment plan is to gather information on how well the Writing Center is fulfilling its Mission Statement:

The Writing Center of the Learning Lab at Community College of Philadelphia serves the interests of all registered students in their curricular and/or personal engagement and persistence in the writing process. The Center encompasses both professional and technical resources as a learning opportunity extending the benefits of primary instruction at the College. The Center welcomes students seeking assistance in meeting both content and language arts course requirements and, more broadly, encourages individual achievement in written expression while supporting independent learning for academic success.

Assessment factors

The assessment plan will consider both quantitative and qualitative measures in determining the Writing Center’s effectiveness.

Objectives

To determine if the Writing Center’s workshops and tutorials are meeting students’ expressed needs.

To determine if the Writing Center’s workshops and tutorials are impacting student retention and classroom performance.

To determine if the students are learning from the Writing Center’s workshops and tutorials.

To determine if the Writing Center is effectively reaching out to the College community.

Implementation of the Objectives

Have students complete a brief survey at the end of tutorial sessions to determine if their expressed needs were met.

Have students complete a brief survey at the end of each workshop topic to determine if their expressed needs were met.

Have students complete a “mini-quiz” at the end of each workshop topic to determine what the students have learned.

Have students complete a longer survey at the end of the semester to determine the number and types of contacts they had during the semester. Then, when the semester is completed, the committee will determine the impact these types of contacts had on class performance and retention.

Distribute survey to faculty to determine how well the Writing Center is promoting its services.

Analysis

During Summer Session I, the committee will review the various surveys to determine:

A. How well the Writing Center has met its goals and objectives.

B. How to improve where it has fallen short.

C. What its future direction should be.

It will report its findings at a department meeting.

New Services

The discontinuation of ENG 098 lab classes creates a large developmental student population with no formal method of accessing Learning Lab services. More than 140 hours a week of rostered Lab classes were provided each semester for over 2500 students, with over 18,000 student contacts each semester. This change, in combination with the general trend of low student persistence and retention College-wide, caused the faculty to re-envision the role of the Learning Lab/SACC Department in student support and successful student outcomes. The Department has identified several new initiatives that are aimed at bolstering support for students and equipping them with the intellectual confidence and college savvy necessary to set and achieve goals and make informed decisions regarding their education. The new services and programs being proposed for the future include concepts that are both proactive and intervention-oriented. The faculty believes that these initiatives will optimize student contact while ensuring that students have the best academic support possible.

First Year Investigations Priority Area: Assessment Proposal

First Year Investigations: Philadelphia

The Learning Lab proposes to create, oversee, and provide the bulk of the instruction for a new course, tentatively titled First Year Investigations (FYI). Several Learning Lab faculty members, in collaboration with the Director of Developmental Education and the English Department’s Assistant Chair for Developmental English, are actively developing the course, which we see as a three-credit course that would be available to students who have placed into a still-to-be-determined set of developmental courses.

Ultimately the Lab would like to see the course be required of all developmental students. As a department, the Learning Lab will be responsible for training FYI faculty, assessing and maintaining course consistency and effectiveness, and making long-term course adjustments.

FYI will teach fundamental academic skills—such as critical thinking, information literacy, and communication skills—as well as the resources of the College and the larger community, in an experiential, intellectual context. FYI sections will each focus on a broad intellectual topic connected with Philadelphia. (For example, a section might focus on the epidemiology of obesity in Philadelphia. Another might examine community-based political activism in Philadelphia.) Within each intellectual focus, the course activities will use experiential learning—individual and group research projects, service-learning, creation of multimedia presentations—to teach and reinforce the underlying academic intellectual skills.

The Learning Lab is currently writing the FYI course proposal, and we expect to provide the instruction for most of the FYI sections. Most Lab faculty members, including both full-time and part-time faculty, will teach at least one and at most two sections, as part of their normal teaching duties. The Lab will also be responsible for recruiting and training faculty members from other departments and for ensuring consistency and effectiveness across all sections. The course design and faculty training will emphasize assessment of learning outcomes, both during the course and over the students’ time at the College. We hope to have the course ready to pilot in Spring 2013.

Committee Members: Ted Wong, Michelle Myers, Mary Ann Yannuzzi, Dorian Geisler

FYI Outcomes Assessment

Goals

Improve developmental students' academic skills.

Improve student success outcomes for developmental students.

Raise the profile of the Learning Lab department.

Raise the profile of CCP in the community.

Assessment

Goal # 1: Academic Skills

Sometime near the beginning of each semester, students will be asked to produce a set of three work products. The type of work product will be the same across all sections and from year to year. Currently, we are thinking that the products will be a summary of a newspaper op-ed piece, a research-topic proposal, and a graph summarizing some data. These work products will be produced before much instruction has taken place.

Three times during each semester, a sample of 5-10 work products will be selected randomly from each section. These work products will be produced after specific instruction has taken place. The first sample might consist of op-ed summaries, the second might consist of research-topic proposals, and the third might consist of graphs summarizing data. Each work product selected for the sample will be paired with the work product of the same type that the same student produced near the beginning of the semester.
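As an illustration only, the per-section sampling and pairing described above might be sketched as follows (the data layout and function names are assumptions, not part of the proposal):

```python
import random

def draw_sample(section_products, k_min=5, k_max=10, seed=None):
    """Randomly select 5-10 students' "after" work products from one section
    and pair each with the same student's "before" product of the same type.

    section_products: dict mapping student -> {'before': ..., 'after': ...}
    (hypothetical layout, for illustration)
    """
    rng = random.Random(seed)
    # Sample size is 5-10, capped by the section's enrollment.
    k = min(len(section_products), rng.randint(k_min, k_max))
    students = rng.sample(list(section_products), k)
    # Each sampled "after" product travels with its "before" counterpart,
    # so the evaluating committee always sees matched pairs.
    return [(s, section_products[s]['before'], section_products[s]['after'])
            for s in students]
```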

The before-and-after work-product samples will be evaluated by a small, rotating committee of FYI faculty members according to a common rubric. The rubric will be designed to allow us to estimate a student's level of mastery of a particular academic skill on the basis of the quality of the work product. For the examples cited above, the rubrics would map work quality to mastery of critical reading, research-topic selection and articulation, and data communication skills. The committee will evaluate both the "before" products and the "after" products.

For each sampling time, we would like to see improvement in the work products of 75% of the students sampled.

For each sampling time, we would like to see satisfactory mastery in the "after" products of 75% of the students sampled.

Goal # 2: Student Success Outcomes

We will track students' persistence at CCP and performance, measured as GPA.

We would like to see persistence and performance become equal to those of students who start CCP already college-ready.

Goal # 3: Department Profile

We would like to see a year-to-year increase in the number of faculty members outside of the Learning Lab who are interested in teaching FYI.

We would like to see FYI experiences reflected in CCSSE results.

We would like to see college-wide awareness of FYI among CCP students.

We would like to see student interest in FYI increase from year to year.

Goal # 4: College Profile

We would like to see a year-to-year increase in the number of outside organizations and government agencies working with FYI students and faculty members on community-based learning projects.

We would like to see FYI publicly described as a community-college or developmental-education success story.

Actions

Finish writing course proposal, get course approved, implement pilot.

Look into technology for hosting e-portfolios.

Online Tutoring Priority Area: Assessment Proposal

The Learning Lab would like to explore the expansion of online tutoring, both to alleviate staffing constraints at the Regional Centers and to support distance students and students who find it difficult to make on-campus appointments. Online tutoring can be implemented using a third-party platform that enables tutors to make and track appointments online as well as conduct real-time tutoring sessions. Ted Wong’s earlier piloting of video tutoring suggests that program expansion will require recruiting faculty members and tutors to provide the service, creating online tutoring stations, and publicizing the service to course faculty and to students.

Committee Members: Paul Bonila, Lilla Hudoba, Hank May, Gary Mitchell, Ellen Moscow, Shomari Weedor, Mary Ann Yannuzzi

Goal

Provide state-of-the-art support services to CCP students to increase their persistence and achievement and thus assure their academic success.

Assessment

Quantitative: Gather pre- and post-pilot retention numbers.

Quantitative: Examine pre- and post-pilot academic success rates (using such benchmarks as passing rates in Gateway courses).

Qualitative: Evaluate pre- and post-pilot student satisfaction with Lab services by using surveys and focus groups.

Objectives/Action

Expand Learning Lab services to CCP students.

Provide online tutoring, initially as a pilot, to increase the total number of Learning Lab tutoring hours by at least 10%.

Utilize online tutoring to make Learning Lab services to students more accessible.

Sync support services with current technology by making online tutoring available on mobile devices, on social media, and via cloud services.

Analysis

Present findings to the Department.

Come up with strategies to close any gaps between outcomes and assessment.

Re-assess, re-evaluate, and re-tool as needed.

SACC Initiatives Priority Area: Assessment Proposal

The Student Academic Computer Center provides a variety of facilities and caring support to a diverse population. SACC encourages innovation and excellence in the curriculum through the appropriate application of information technology and by fostering personal and professional development among students. SACC assists lab users in the use of information technology for the creation, organization, analysis and presentation of scholarly endeavors.

Committee Members: Julieta Thomas, Ed Adolphus, Aaron Brown, Hank May, Garvin Poole, Otis Stevens, Shomari Weedor (Michelle Morgan could not make the meetings)

Goals

Ensure lab users have equitable and appropriate access to technology to meet their learning needs

Encourage the appropriate and effective use of educational technology by lab users in the pursuit of excellence in education

Promote effective communication between the Information Technology Department, SACC Faculty and lab users

Assessment

Lab usage reports, requested for each facility at the end of each semester, demonstrate a continued increase in usage and the need for additional equipment and space

Provide and collect waiting lists signed by lab users to demonstrate the need for additional equipment and space

Short evaluation forms, filled out by lab users, addressing whether the help provided adequately meets their needs

Objectives/Actions

To provide service that addresses the needs of our diverse student body and supplies the necessary support warranted

To ensure continued proper maintenance of equipment to enable quality usage for our lab users

To ensure proper communication between the Information Technology Department and SACC faculty regarding software updates, hardware changes, and equipment maintenance so that accurate usage is reported

To provide ongoing training of Student Helpers in updated hardware, software, and interpersonal communication so that they deliver quality service

Conclusion

Assessment has long been an important component in higher education, and with the rise of evidence-based policy practices, educational support services are beginning to attempt to assess the impact and outcomes of their services. Learning Lab/SACC faculty have traditionally relied on qualitative, indirect metrics to assess the perceived impact of their services. These evaluations have always concluded that students who avail themselves of Learning Lab and SACC services find the contact useful. With improved data collection, it seems reasonable to assume that a more quantifiable relationship between services rendered and benefit received could be constructed.

Faculty have invested time and research into determining the most beneficial, efficient, and responsible methods for evaluating the impact of tutoring and workshop services. The picture of student success is a complex collage of characteristics and factors that makes quantifying impact a daunting task. The most time-consuming and technically rigorous part of a quantifiable evaluation of student outcomes is the assurance that a correlation found in the data is truly a causal relationship. Causality can only be claimed if there are robust control and experimental groups of students who are similar in most characteristics but who differ in their use of Learning Lab and SACC services. The time and effort that would be required for such an investigation are not trivial.

The Department is actively considering both indirect and direct assessment methods and is constantly reviewing literature and best practices of similar institutions to ensure it remains at the forefront of current assessment protocols. The question of student outcomes is an important one, not only for Community College of Philadelphia students, but for all higher education students and institutions. The faculty are committed to collaborating with researchers within and outside of the college to create a standardized and manageable assessment process that will measure departmental impact and contribute to the departmental and college mission.

APPENDIX 2

From: Ted Wong
To: Joan Bush
Cc: Michelle Myers, John Pinto, Judy Reitzes

Re: Data analysis on ESL lab classes
Date: March 12, 2012

Dean Bush:

You asked me to look through some of the student attendance and grade data, and to see if we could say anything about the effectiveness of the Learning Lab lab classes attached to ENG 071, 081/091, and 082/092. This memo describes my findings, but here’s the summary:

1. Students’ attendance in the lab classes was positively correlated with frequency of passing the English class. That correlation is sizable and highly significant.

2. Also, courses with no attached lab class have a lower pass rate, compared with courses with a lab class, by ten percentage points.

Overall, I believe that there is good evidence that for students who make use of them, the lab classes do improve their chances of succeeding in the attached English courses.

1. Pass rates are positively correlated with lab-class attendance.

Students who attend more lab classes are more likely to pass the attached English course. The correlation, calculated over all three courses ENG 071, 081/091, and 082/092, is sizable (φ = 0.11) and highly significant (p << 0.001).
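For readers unfamiliar with the statistic, φ is the Pearson correlation applied to 0/1 data. A minimal sketch, assuming both variables are coded as binary indicators (attended a lab class at all, passed the course); this is a simplification, and the memo's actual calculation may treat attendance counts differently:

```python
import math

def phi_coefficient(pairs):
    """Phi coefficient for a list of (attended, passed) pairs, each 0 or 1.
    Equivalent to Pearson's r computed on the 2x2 contingency table."""
    # Tally the four cells of the contingency table.
    n11 = sum(1 for a, p in pairs if a == 1 and p == 1)
    n10 = sum(1 for a, p in pairs if a == 1 and p == 0)
    n01 = sum(1 for a, p in pairs if a == 0 and p == 1)
    n00 = sum(1 for a, p in pairs if a == 0 and p == 0)
    num = n11 * n00 - n10 * n01
    den = math.sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return num / den if den else 0.0  # degenerate tables get 0.0
```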

The relationship between pass rate and lab-class attendance is presented graphically in the three graphs in Figure 1. Each graph describes, for all students who attended a lab class a certain number of times, the percentage of those students who eventually passed the attached English course. The number of students represented by that percentage is represented as the thickness of the line. In each of the three cases, there are many students who attend no lab classes but still pass. For the rest of the students, there is a general trend of greater pass rate with better attendance.

Of course, the correlation alone does not imply that it is the lab classes that are responsible for greater success among students who use them. It is possible, for example, that high attendance and high pass rate both result from a third factor, such as good academic skills. Evidence for lab-class effectiveness as a causal agent must come from examining a homogeneous population of students, and comparing their performance in courses with and without attached lab classes.

Figure 1

2. Pass rates are higher for frequent lab-class users for courses with attached lab classes.

Students who succeed in ENG 071, which has an attached lab class, generally proceed to take ENG 072, which has no attached lab class. Similarly, students who succeed in ENG 082/092 generally proceed to take ENG 083/093, also moving from a course with a lab class to one without one. I examined students who made these transitions to see whether the presence or absence of a lab class was associated with a difference in pass rate.

Figure 2

In order to address the difficulty of teasing out the possible effect of students’ diligence or level of academic skill, I examined only students whose attendance rates in lab classes attached to ENG 071 or ENG 082/092 were in the 75th percentile of all attendance rates for those lab classes. My assumption was that students who had high lab-class attendance in one class would have been likely to make good use of lab classes for the subsequent course, had such lab classes been provided. To test this assumption, I examined the attendance rates for students who took both ENG 081/091 and ENG 082/092. Attendance in the two lab classes was very strongly correlated (φ = 0.46, p << 0.001), and the data are shown in Figure 2.

How did the frequent lab-class users do when no lab class was available? In the graphs in Figure 3, each bar’s height represents these frequent visitors’ pass rates. Bar width represents the number of students represented in the bar. In the transition from ENG 071 to ENG 072, the pass rate dropped ten percentage points, from 92% to 82%. For ENG 081/091 and ENG 082/092, both of which have attached lab classes, the pass rate for frequent lab-class users was similar: 82-83%. When these same students took ENG 083/093, however, the pass rate dropped to 72%, again approximately ten percentage points. Students who tend to use lab classes suffer when the lab classes are not available to them.

Figure 3

Learning Lab/SACC Assessment Report Page 20

Notes

I’m happy to clarify in person anything that my writing has obscured. I’m also happy to do further analyses, as I’m sure these analyses raise many interesting questions. Thanks to Attilio Gatto of Information Support Services for gathering and organizing the data. Larger, interactive versions of the figures are at http://faculty.ccp.edu/faculty/twong/esl1.html


APPENDIX 3

Who is Attending Tutoring? Assessment of Retention Performance Indicators and Learning Lab Outreach for Math 016, 017, 118 and English 098 and 101

Goal A1 of the College Strategic Plan states that the College will enhance quality, innovation, and effectiveness in the delivery of academic, administrative, and student support services. By assessing the effectiveness of both the Retention Performance Indicator system (herein referred to as the “early alert system”) and Learning Lab services, this study aims to enhance the effectiveness of both administrative and support services.  It is only through assessment that we can learn of our successes and shortcomings. It is paramount to have effective and robust support services, because learning must occur outside of the classroom as well as inside.  The College has premier support services faculty and staff who want to deliver the best instruction possible.  Assessing their current effectiveness will inform their future development.

This project investigates the strengths and weaknesses of the early alert system currently in place to support at-risk students.  This study assesses the impact of the current outreach initiative and highlights ways to improve the program.  This work is also an opportunity to assess the impact of Learning Lab services by evaluating the correlation between final course outcome and tutoring.  This study tracks student response to early alert outreach and the grade trajectories of early alert students.  Final course outcomes for Math 016, Math 017, Math 118, Eng 098, and Eng 101 were analyzed for early alert and non-early alert students.  The impact of tutoring on pass rates and withdrawal rates was assessed.  By tracking student attendance at the Learning Labs, final grades, and early alert status, student engagement and student performance were measured.  These results inform us about the benefits of, and the opportunities for improvement in, the early alert system and the tutoring of at-risk students across a wide range of curricula.

Methodology

The objectives of this study originally were to track students in four categories: 1. students who do not receive an early alert and who do not seek help in the Learning Labs, 2. students who do not receive an early alert but seek help in the Learning Labs, 3. students who receive an early alert and do not seek help at the Learning Labs, and 4. students who receive an early alert and seek help at the Learning Labs.  The initial approach was to track 800 students during the Spring '11 and Fall '11 semesters. We would follow 400 students each semester, breaking them into 100 students per category.  We planned to track grades, attendance at the Learning Labs, and results from the PEEK noncognitive student survey.  This survey would highlight academic, personal, and social characteristics of each student. These data would have been analyzed by traditional techniques, including correlation and multivariable analysis, to determine predictors for early alert recipients and predictors for students who will benefit from an early alert.  The study also attempted to determine the noncognitive characteristics of a student who would seek support from the Learning Labs. These data would have been used to better focus Learning Lab recruiting on students who would not be likely to visit the Learning Labs and to enhance services that are proven to be effective.

As the project progressed, several realities changed the research in three major ways.  First, to determine pass rates and withdrawal rates, a sample of 400 students from each cohort was not used; rather, the entire population of the cohorts was used to determine the desired parameters.  Secondly, after a variety of attempts (including outreach through the Counseling Department and the English Department) to distribute the PEEK survey, it became clear that acquiring an adequate number of survey responses for the purpose of the study was infeasible due to an overwhelming lack of student participation.  Thirdly, without this important noncognitive component, the development of a predictive model for student intervention would be difficult and lacking in robustness.

The work generated by this study is nonetheless important and will be used to refine the current practices of the Learning Lab Department with regard to early alert students.  The study investigates the pass rates and withdrawal rates of students in Math 016, Math 017, Math 118, Engl 098, and Engl 101 for the four cohorts described above.  For each semester (Spring 2011 and Fall 2011), the students who received an early alert were issued a letter from the College informing them of their professor’s concern. The text of this letter is shown in Appendix 1. They also received an email from the Learning Lab welcoming them to tutoring. The general text of the email is shown in Appendix 2.  For the purpose of this research it was assumed that a student who sought tutoring and who was issued an early alert sought tutoring because of the early alert notice.  This simplifying assumption is necessary, but not ideal.  In the future it would be insightful to determine whether a student sought tutoring because of the outreach from the college or on his/her own initiative.  This research encompassed all students who received an early alert, regardless of the nature of the alert; i.e., a student who was flagged as being disruptive in class was included in the analysis of the impact of tutoring.  A more refined investigation would delineate between behavioral and academic indicators; however, because the sample sizes of early alerts for certain classes were so small, this work looks at all early alerts to generate a larger sample size.

The withdrawal rates were calculated as the number of students receiving a “W” for the course divided by the total number of students enrolled in the course.  The pass rate was calculated as the number of students receiving a C or better in the course divided by the total number of students who completed the course (who did not withdraw or receive an incomplete).  For pass/fail classes, the pass rates were calculated as the number of passing students divided by the number of students who completed the course.
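The rate definitions above amount to simple ratios; a minimal sketch, assuming letter grades are coded as single characters (an assumption about the data, not the study's actual representation):

```python
def withdrawal_rate(grades):
    """Students receiving a 'W', divided by all enrolled students."""
    return grades.count('W') / len(grades)

def pass_rate(grades):
    """Students earning a C or better, divided by students who completed
    the course (i.e., excluding withdrawals and incompletes)."""
    completed = [g for g in grades if g not in ('W', 'I')]
    passed = [g for g in completed if g in ('A', 'B', 'C')]
    return len(passed) / len(completed)
```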

Any student who attended tutoring at least one time was placed in a “tutored” cohort.  The number of appointments was not used to differentiate students into subgroups; that work is likely important and could be investigated in the future.  Although there are many different types of tutoring available in the Learning Lab, this study looks only at one-on-one tutoring between a student and a tutor. The tutors were students, non-students, or faculty members; no distinction was made in this analysis.
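The four-cohort classification used throughout the results can be summarized in a few lines (the field names are illustrative, not the actual data schema):

```python
def cohort(early_alert: bool, tutoring_visits: int) -> str:
    """Assign a student to one of the four cohorts described in the methodology.
    Per the study's rule, a single tutoring visit counts as 'tutored'."""
    tutored = tutoring_visits >= 1
    if early_alert and tutored:
        return "early alert, tutored"
    if early_alert:
        return "early alert, non-tutored"
    if tutored:
        return "non-early alert, tutored"
    return "non-early alert, non-tutored"
```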

Results and Discussion

The frequency of early alerts issued varied widely across the curricula and semesters.  Table 1 shows the number of students who received early alerts as well as the % of Course Reference Numbers (CRNs) that reported early alerts.  The % CRN value is a measure of how many faculty participate in the program: the higher the % CRN reporting, the more faculty are completing the early alert portion of the 20 and 50% attendance form.


Table 1 The % CRN reporting at least one early alert for each of the courses included in the study, as well as the % of the enrolled students receiving an early alert.

            Fall 2011              Spring 2011
Course      % CRN    % Students    % CRN    % Students
Math 016    43.9     9.2           13.9     1.1
Math 017    23.4     4.1           16.0     2.5
Math 118    29.1     4.5           20.5     3.0
Eng 098     50.0     10.0          48.5     10.9
Eng 101     44.2     8.8           42.6     7.9

Table 1 shows the frequency of use of the early alert system by faculty in the Math and English departments.  English faculty utilize the outreach system more frequently than the Math faculty, which results in a higher percentage of the enrolled students receiving letters and communications from the college regarding their performance.  In the Spring of 2011 fewer than 10 students received an early alert in Math 016.  Clearly more can be done to improve faculty buy-in and usage of the system.

Figures 1 through 10 show the pass rate and withdrawal rates of the four cohorts studied in this project. The pass rates and withdrawal rates varied across curricula and semesters, but the general results indicated that students who attended tutoring (either in the early alert or non-early alert cohort) saw an improvement in either pass rate, withdrawal rate, or both when compared to students who did not seek tutoring.

English 098

Figure 1 A comparison of pass rates and withdrawal rates in English 098 for the four cohorts analyzed in this study.


English 098 courses have the highest participation rate in the early alert program of the courses studied in this work. In both the spring and fall, nearly half of all sections reported a retention performance indicator for at least one student. Of the approximately 300 students who received an early alert, only 9% attended one-on-one tutoring. This represents a sizeable gap between the number of students who need academic support and the number of students receiving it. The outreach that results from an early alert may not be intrusive or engaging enough to prompt a response from the student.

It is not clear from the two semesters analyzed here whether tutoring correlates with a higher pass rate or a decreased withdrawal rate. Intervention strategies used with developmental-level students are as much about course content as they are about college readiness. The low pass rates and high withdrawal rates in the early alert cohorts suggest that these students are underprepared for English 098 and may require a significant amount of academic support. These findings do underscore that faculty are using the early alert system to successfully identify which students are struggling in the course early on. According to this research, both student outreach and student support can be improved.

English 101

The early alert system is underused.  Only 45% of courses, as identified by CRN, used the early alert system to issue a retention performance indicator for at least one student.  Of the nearly 9% of all English 101 students who received an early alert, only 8% attended tutoring.  This amounts to approximately 20 early alert students seeking tutoring.  Not only are faculty underutilizing the system, but the outreach (in the form of a letter/email to the student from both the college and the Learning Lab) is having little effect on encouraging a significant number of students to seek academic support in the form of tutoring.

Even though results between semesters varied, the pass rate of early alert English 101 students who attended tutoring is higher than that of the non-tutored early alert cohort.  This is not proof of causality, but it does suggest that there is a correlation between tutoring and improved pass rate.

Figure 2 A comparison of pass rates and withdrawal rates in English 101 for the four cohorts analyzed in this study.

The finding that tutored cohorts outperform non-tutored cohorts helps inform our outreach strategies. English 101 students benefit measurably from tutoring.  A more structured, thorough outreach and support program can be developed to bolster student performance.

The data also indicate that the early alert students, both tutored and non-tutored, underperform the non-early alert students.  This suggests that faculty are successfully identifying struggling students. Yet under 50% of the CRNs are utilizing the early alert system.  If more faculty indicated a student’s early alert status, more could be done to ensure that the student receives the support that he/she may need.

Math 016

Math 016 is a difficult course for students.  The best performing cohort only slightly exceeds a 50% pass rate.  Low-level remedial courses with potentially weak students have the capacity to benefit the most from the early alert system.  However, as the data show, use of the system is inconsistent; in the Spring of 2011 only 7 students received early alerts in Math 016.  Improved faculty participation is necessary if the system is to have any impact on student performance.

However, when students did attend tutoring there was little to no improvement in their final course outcome.  This finding is of the utmost importance.  The current practices are not effective: neither the early alert system nor the tutoring services are having a measurable impact on student success. Both the Learning Lab and the early alert administrators need to collaborate to design new intervention techniques and approaches.

Figure 3 A comparison of pass rates and withdrawal rates in Math 016 for the four cohorts analyzed in this study.

Math 017

Figure 4 A comparison of pass rates and withdrawal rates in Math 017 for the four cohorts analyzed in this study.


Very poor participation in the program results in only 4% of the students receiving an early alert. Between 16 and 24% of all Math 017 sections report having at least one student who warranted an early alert indicator.  Math 017 is one of the most frequently failed courses at the College.  If support programs are to work, faculty buy-in is paramount.  Of the students who received an early alert, few sought tutoring. There is a gap between indicating that a student needs support and getting the student to engage in support activities.

The outcomes of the Math 017 analysis varied; however, in both semesters the tutored early alert cohort outperformed the non-tutored cohort, by 9.6% in the fall and 6% in the spring.  The results seen in the pass rates of the non-early alert cohort are inconclusive, but in all groups the withdrawal rates are better in the tutored cohort.  Seeking tutoring affects not only the pass rate of the cohort, but also the completion rate.

Math 118

Math 118 shows the same low participation rates as the other math courses studied.  Between 20 and 30% of Math 118 sections indicated that at least one student warranted an early alert indicator.  Of the few students who received an early alert, only 5 to 10, depending on the semester, attended tutoring.  In the Fall of 2011 the tutored early alert cohort showed an improved pass rate with tutoring; in the Spring of 2011, however, the reverse was seen.  In both semesters the withdrawal rate was lower in the tutored cohort.

Figure 5 A comparison of pass rates and withdrawal rates in Math 118 for the four cohorts analyzed in this study.

In the non-early alert cohort, attending tutoring correlates with higher pass rates.  Again, as with most courses analyzed in this study, tutoring shows some benefit, but the participation of faculty in the program and the response of students to the outreach protocol both show opportunities to enhance the program and create innovative approaches to improve engagement of both students and faculty.

Conclusions

Several important conclusions can be drawn from this work.

·         In all courses, faculty who participate in the early alert program are successfully identifying struggling students.  In many courses, students who receive an early alert and then attend tutoring have an improved pass rate compared to the non-tutored cohort.  However, in most courses, the early alert cohorts (both tutored and non-tutored) underperform the non-early alert cohorts.

·         Faculty participation and student engagement both have room to improve.  No course exceeded 50% CRN reporting, indicating that at least half of the sections offered (and often far more) are not being reached by the early alert system.  Once early alert letters/emails are sent, only a fraction of the students who receive one (never more than 20%) seek academic support.  A more refined study should determine which academic early alerts result in a student seeking academic support, but this study is a useful first look at the impact of the outreach.

·         The current practices are often successful, but not at the volume or extent that results in a significant impact.  The Division of Educational Support Services and the Office of the Dean of Students need to partner to provide a more impactful outreach strategy that will serve and support both faculty and students.

Appendix 1 – Sample Letter to Early Alert Student from the College (excerpt)

Required Actions to Improve Attendance and Classroom Performance (20%):

“Unsatisfactory Attendance”: You can be dropped from your classes for poor attendance. You can lose financial aid if you are dropped. Make sure to check your syllabus or speak with your instructor so that you know how many absences you are permitted. If you have conflicts that are difficult to resolve, make an appointment to speak with a counselor at the site where your class is held: Main Campus (W2-2), NERC, NWRC, or WERC.*

“Frequently Late”: In virtually all classes, being frequently late will affect your grade. Therefore, you are expected to meet with a counselor to discuss issues preventing your punctuality and options to ensure timely class arrival. Counselors can be located at the following locations: Main Campus (W2-2), NERC, NWRC, or WERC.*

Learning Lab/SACC Assessment Report Page 28

“Inadequate Class Participation” and/or “Missing Assignments / Unprepared”: Insufficient class participation, missing assignments, and unpreparedness may have a negative impact on your grade. Please accept this email as encouragement to speak to your instructor about this issue. You are also encouraged to meet with a Learning Lab specialist or attend a Learning Lab workshop. Contact one of the offices below:

o Main Campus (make an appointment in person):
  o The Central Learning Lab (B1-28) - Humanities
  o The South Learning Lab (B2-36) - Math and Business
  o The West Learning Lab (W3-26) - Science and Allied Health
o Northwest Regional Center Learning Lab Specialist; stop in or call 215-496-6020
o West Regional Center Counselor; stop in Room 132 or call 267-299-5857
o Northeast Regional Center Learning Lab Specialist; stop in or call 215-972-6236

Students taking online courses may consult the online resources of the Learning Lab through MyCCP.

“Inappropriate Classroom Behavior”: Inappropriate classroom behavior may lead to disciplinary action and have a negative impact on your standing at the College. You are encouraged to speak with your instructor to discuss these behaviors. You are also encouraged to speak with a counselor at the site where your class is held: Main Campus (W2-2), NERC, NWRC, or WERC.* Please review the “Code of Conduct” found on the “Student Information” channel located on the MyCCP “Student” tab. If your course is online, you may review the College’s “Acceptable Use Policy” found on the login page of MyCCP.

“Other (Conference Requested with Student)”: Your instructor has indicated the need to speak directly to you in conference. It is essential you make an appointment with your instructor immediately. Failing to do so may have a negative impact on your grade.

*PLEASE NOTE: If you're taking classes online or at a neighborhood site, you may access the following support services online:

Learning Lab, http://www.ccp.edu/site/academic/learning_lab/
Counseling Department, http://www.ccp.edu/site/current/counseling.php

Appendix 2 – Sample Letter to Early Alert Student from the Learning Lab Dept.

Hi,

My name is Megan Fuller, and I am a tutor here at the Community College of Philadelphia.  If you are receiving this email, then you probably also received a letter from the college explaining that your Biology professor indicated concern about your performance in biology this semester.  These 'student performance reports' are sent out at the beginning of the semester so we can know who needs help and reach out to them, so that's just what I'm doing!

You may already be aware, but there is free tutoring for biology on the main campus and at the regional campuses.  You can make an appointment at the Main Campus and NERC, and you can attend drop-in hours at NWRC.  The labs ask that you make your appointments in person, so all you have to do is walk into the Learning Lab on your campus (or a campus near where you live or work; it doesn't have to be the campus where you have lecture!) and ask to make an appointment.

Tutoring is a wonderful way to make sure that:

1. you are keeping up with lecture material
2. completing your homework successfully
3. studying for your tests in the most effective way
4. getting help on concepts that you find confusing in lecture

Please don't hesitate to respond to this email address with any questions or concerns you may have.  The Learning Labs are here to help you succeed at the College; come in and make an appointment today!

Sincerely,
Megan Fuller

APPENDIX 4

Learning Lab and Student Academic Computing Center Survey Results

A survey was recently sent to the entire student body to assess the needs of the students with regard to our services, schedules, and locations. The results indicate some important trends that may help us make decisions about future staffing and scheduling needs.

The survey was emailed to the students on 10.15.2012, and the results shown here are those collected on 10.19.2012. As more surveys are submitted, the results can be updated. As of the 19th, 251 students had completed the survey. Figure 1 indicates the campuses those students most frequently attend. Any student who selected more than one campus was placed in the cohort labeled "multiple" campuses.


Of the 251 students, 167 indicated that they attend the Main Campus most frequently, 29 indicated they attend multiple campuses, 25 indicated they attend the NERC, 22 indicated they attend the NWRC, and 7 indicated they attend the WRC.

Students were asked which campus, if any, they would be likely to attend on Saturdays now that the Main Campus is closed. Students who attend multiple campuses, or only a Regional campus, indicated they would attend a Regional campus they already frequent. The 167 Main Campus students had varied responses. Figure 2 shows the break-down of the campuses the Main Campus students selected as their preferred Saturday location.


The responses collected by the survey demonstrate that nearly half of the Main Campus students will not travel to a Regional location. This is a substantial student population who will not seek academic support on Saturdays. Interestingly, while only 3% of the original 251 responders indicated they attend the WRC, 20% of the Main Campus students who have been displaced by the Saturday closings chose the WRC as their preferred Saturday location, making it the smallest campus with the highest likely increase in demand. This trend may need to be addressed in future budgetary decisions when considering staffing needs.

Students were asked which services they most frequently use when attending the Learning Lab. At every campus, Learning Lab services far out-ranked the Library and Computer Lab services. This seems intuitive for the Main Campus and the NWRC, where students seek out the Learning Lab only for tutoring services. However, at campuses with a Learning Commons, a more even distribution of service usage was expected. Figure 3 shows the usage data for each campus, along with the responses from students who attend multiple campuses.


The students were also asked how the Learning Lab/SACC department could be improved. The break-down of the frequency of each possible answer is included in Figure 4. Students could choose multiple answers for this question.


At every campus, the most popular responses were increased evening hours and increased tutor availability, followed by increased courses covered by tutoring services. The Main Campus and the NWRC had the highest percentage of students reporting a need for more computers, which supports the department’s current understanding of computer needs among our student population. The NERC and the WRC are well equipped with new computers following the installation of the new Learning Commons.

Additional data were collected regarding the days and times at which services are preferred. The data did not reveal any dramatic trend, but the information is available if needed.


APPENDIX 5

Saturday Tutoring Attendance

Tutoring attendance data for the first six Saturdays of the Fall semester were analyzed for the 2010, 2011, and 2012 academic years. The Learning Lab and SACC department wanted to observe the impact of the relocation of Saturday support services from the Main Campus to the Regional Centers. The first figure details the total number of student contacts recorded for the first six Saturdays of each semester. This semester, the Learning Lab and SACC department has seen more students than at the same point in the previous two Fall semesters.

The Central Lab data for Fall 2010 and Fall 2011 encompasses the Writing Center, humanities, and ESL appointment data. Also, the Fall 2010 data includes ESL lab class data. This data purportedly represents all drop-in, appointment, and workshop attendances for all semesters included in the study.

It is clear that while the West and Northeast campuses offer Saturday services, the bulk of the relocated services falls on the Northwest campus. This could be true for a number of reasons. First, the number of tutors, the variety of subjects, and the total number of tutoring hours may be greater at the NWRC than at the other two regional campuses (this is certainly true relative to the WRC). An investigation of the budget for Saturday services could help normalize this data and correct for differing budget allocations. Furthermore, the NWRC is somewhat more centrally located in Philadelphia than the NERC. Additionally, the NWRC has had Saturday tutoring in the past and may be more familiar to students. Regardless of the factors, the need for strong support services at the NWRC is clear.


Figure 2 shows the tutoring contacts broken down by subject. It is clear that while Science and English contacts more than doubled in frequency, it is the volume of Math tutoring that has driven up attendance at the NWRC.

While it is still unclear how much money is spent at each campus for Saturday services, the overall budget of the Learning Lab has been restricted. That the Learning Lab and SACC department are meeting the same (and in fact, slightly higher) standard of service as in past semesters is impressive. This data, in combination with the survey results showing that 20% of Main Campus students would prefer to travel to the WRC, suggests it would be in the Department’s best interest to move toward sufficient staffing of the NWRC and WRC for Saturday services.


APPENDIX 6

Tutoring Report: English 101 and Math 118, Fall 2011

A study in 2011 showed that students who attended tutoring for Math 118 and English 101 achieved higher pass rates and lower withdrawal rates than their non-tutored counterparts. Of the more than 3,000 students who enrolled in English 101, 10.3% attended tutoring services of some type (at least once during the semester). The tutored cohort had a pass rate of 84.4%, a marked improvement over the non-tutored cohort’s 73.3%.

Not only did the students pass with more regularity, but the tutored cohort also had a dramatically lower withdrawal rate.

These two metrics are approximations of how tutoring can positively impact student performance through both retention and final course outcome.
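One way to gauge whether a pass-rate gap of this size could be due to chance alone is a two-proportion z-test. The sketch below uses the English 101 figures from this appendix; the round 3,000 enrollment is an approximation of the reported "over 3000," and the choice of test is ours, not the report's.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-test for a difference in pass rates."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# English 101, Fall 2011: just over 3,000 enrolled, 10.3% tutored
n_total = 3000                      # approximation of "over 3000"
n_tutored = round(n_total * 0.103)  # tutored cohort
n_other = n_total - n_tutored
z, p = two_prop_z(0.844, n_tutored, 0.733, n_other)
print(f"z = {z:.2f}, p = {p:.1e}")  # z well above 2: unlikely to be chance
```

With cohorts of this size, the 84.4% vs. 73.3% gap is far outside what sampling noise would produce, though this says nothing about selection bias in who seeks tutoring.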


Of the nearly 2,500 students who enrolled in Math 118 in the Fall of 2011, 11.4% attended some type of tutoring service at least once during the semester. The pass rate of the tutored cohort was 67.1%, markedly higher than the non-tutored cohort’s 54.0%. There was a small decrease in the withdrawal rate correlated with tutoring attendance.


To investigate the relationship between the frequency of tutoring appointments and a student’s course outcome, frequency and grades were plotted below. For English 101, cohorts at every frequency of tutoring outperformed the non-tutored cohort. In general, the more tutoring appointments attended, the higher the cohort’s pass rate.

The Math tutoring data showed a similar trend: all frequencies of tutoring resulted in pass rates higher than the non-tutored cohort’s. In Math as well, more tutoring generally corresponded to a higher cohort pass rate.


APPENDIX 7

TO: DR. JUDY GAY

FROM: LEARNING LAB/SACC DEPARTMENT

SUBJECT: FULL-TIME FACULTY POSITIONS

DATE: OCTOBER 31, 2012

CC: JOAN BUSH, LARRY ARRINGTON

Evidence of Effectiveness

Recently the Learning Lab/SACC (LS) Department put forth a request to fill a vacant, full-time Learning Disabilities Specialist position (see Appendix A for the proposal). The department was told, via the Educational Support Services Dean, that Dr. Gay would not approve this permanent, full-time, tenure-track position in the LS Department until the department can show effectiveness through assessment outcomes.

This decision offers an opportunity to engage in thoughtful, educated, and productive discussions regarding the assessment of support services, specifically tutoring and academic skills support. The best practices and current trends in the assessment of tutoring are limited and highly variable in their methodology and outcomes [1, 2, 3]. Moreover, the assessment of faculty tutoring is extremely limited. The most common research approach used to assess the impact of tutoring is to link tutoring attendance to indirect metrics such as retention and persistence. These studies routinely find a positive correlation between tutoring attendance and improved retention and persistence rates [4, and references therein].

Faculty members in the LS Department have made strides to measure the effectiveness of academic skills support services (peer, non-peer, and faculty tutoring, and lab classes) through direct metrics, such as course outcomes. The results of those studies can be found in Appendix B of this memo.

Quantifying the effectiveness of support services is challenging at best, and numerous variables must be accounted for. Such variables include, but are not limited to, selection bias of students seeking support, student preparation, curriculum faculty variability, and use of course outcome (a metric not controlled by the LS faculty) to gauge student knowledge. Some, but not all, of these variables can be controlled, but the quantitative effort of that work is sizeable. The LS faculty are actively considering alternative methods of assessment, including direct and indirect approaches; however, these trials take time and often require curriculum faculty participation, two conditions that have been largely unavailable to the department in measuring outcomes. Despite these impediments, LS faculty are dedicated to finding creative, beneficial, and insightful forms of assessment.


The three studies shown in Appendix B outline many favorable outcomes for tutoring, including one-on-one, group, and lab class formats. The study of the ESL lab classes found that pass rates of ESL classes were positively correlated with lab-class attendance and, perhaps most telling, that more frequent attendance corresponded to higher pass rates. The evaluation of the effectiveness of tutoring in Math 016, 017, and 118 and Engl 098 and 101 found that over a two-semester period all classes studied showed a positive correlation between tutoring attendance and pass rate, or between tutoring attendance and completion rate, and often both. The assessment of the Supplemental Instruction program for Math 118 for Spring 2012 shows that students who attended SI sessions often had higher pass rates than their classmates who did not. The LS faculty are making a good-faith effort to produce evidence of our impact. It is clear that more can be done; however, if full-time positions go unfilled, we fear that we will be compromising our effectiveness while we are trying to measure it.

The department requests a meeting with Dr. Gay, Dean Bush, and Asst. Dean Arrington to help the faculty focus on the path forward toward useful assessment, to gain a better understanding of the specific expectations of the term “effectiveness”, and to establish the necessary steps to ensure that future requests for faculty searches will be met with excitement rather than hesitation.

Thank you for your time and attention to this matter.

References:

1. Arco-Tirado, Jose L.; Fernandez-Martin, Francisco D.; Fernandez-Balboa, Juan-Miguel (2011). The impact of a peer tutoring program on quality standards in higher education. Higher Education: The International Journal of Higher Education and Educational Planning, v62 n6, p773-788.

2. Carter, Edythe; Wetzel, Kathryn (2010). The Mathematics Outreach Center--Saving Dreams. Community College Journal of Research and Practice, v34 n11, p901-903.

3. Cooper, Erik (2010). Tutoring Center Effectiveness: The Effect of Drop-In Tutoring. Journal of College Reading and Learning, v40 n2.

4. Rheinheimer, David C.; Grace-Odeleye, Beverlyn; Francois, Germain E.; Kusorgbor, Cynthia (2010). Tutoring: A Support Strategy for At-Risk Students. Learning Assistance Review, v15 n1, p23-34.


APPENDIX 8

From: Ted Wong and Megan Fuller
To: Joan Bush
Cc: Michelle Myers
Date: Feb. 27, 2013
Re: Access to student-performance data

Joan:

As you know, the Learning Lab makes frequent use of student-performance data from Banner to assess the effectiveness of its services. Occasionally our data requests are rejected by ITS, with no clear explanation. We would like more reliable access to the student-performance data that we need, or at least a clear statement of the criteria by which our data requests are evaluated. In this memo, we explain why we need the data, what happens when we request it, and why no one should fear our requests, and we suggest how a data-access system could be structured.

Why we need the data

The Learning Lab helps students succeed in their courses, and so we need to know how well our students perform in their courses in order to know how well we’re fulfilling our mission. In a typical data-based assessment, we compare course outcomes for students who have made use of one of our services to the outcomes of students who haven’t. As part of our department’s efforts to improve its capacity to assess its services, we are collecting more and more data on who visits the Learning Lab and for what purpose. For any of that information to connect to student outcomes, we need to access particular students’ grades, sometimes in more than one department.

What happens when we request the data

All of our data requests are made through Bantasks. Sometimes we fill out the data request form available through MyCCP, and sometimes we simply describe a request in an email to the IT support account. In the past, Ted has made complicated requests that required the help of a programmer. More recently, however, every request that we have made has been straightforward, amounting to a simple database query.

Occasionally, but (we believe) with increasing frequency, our requests are rejected. When they are rejected, we receive no notification of the rejection, though sometimes we hear that the request has been passed to you, Joan, for approval, or to Institutional Research. (The one time that IR was involved, IR consented to the request and IT still refused to fulfill it.) Informally, we sometimes hear that our requests are rejected for being too large or because the data we request is not our business. There seems to be a notion that only Institutional Research has a legitimate claim to the data, and that all data-based research must come from or be approved by that office. Also, you seem to have been told that there are concerns that making student-performance data available for faculty use opens doors to misuse and to breaches of student privacy.

We have had data requests rejected 3-4 times now.

How we can assure that our requests are harmless

If data access is restricted because of privacy concerns, we can suggest some ways to assure that privacy is protected.

1. No names. When our data requests are fulfilled, they include students’ names, even though we generally don’t need or request names. We ignore students’ names in our analyses and would be perfectly happy if the data did not include them.

2. No J-numbers. We do need our data to be keyed to individual students, and the key that we use is J-number. We do not, however, need the data we receive from IT to include J-numbers. Once IT uses the J-numbers that we supply to generate our data, we don’t need them. Sometimes it is useful for us to have some unique identifier for each student—for example, when an analysis follows student performance from one semester to another. For this purpose, some institutions provide data keyed to unique student identifiers that have no meaning outside of the data. That is, these identifiers match particular lines of data to individual students, but they cannot be used to look up any information about those students except for what’s in the data.

3. No publications. We would be happy to agree not to use any Banner data for publication beyond CCP without your or Dr. Gay’s express permission.

Our suggestion for a data-access system

We would be happy to see all data requests go through John Moore, the Director of Assessment. John understands our needs and can address data-quality concerns of ours that IT cannot. John would also be able to track all requests and data-based research projects and would therefore be in a position to handle issues like data misuse and breaches of privacy. For this to work, however, John would have to be given free access to the Banner data. He should be able to perform queries and to generate data without involving IT. Otherwise, involving him would only add a bureaucratic layer to a process that is cumbersome enough. As an administrator, John will protect the College’s interest in monitoring and controlling the flow of data. He can make sure that the data we receive contains no private information, and that data requests are made for legitimate purposes. John is also a researcher, and we believe he would be willing to work with us collaboratively, not only on making the data available to us, but also on assuring its quality and usefulness.

Thanks!

Ted and Megan


APPENDIX 9

PBI Fund: Peer Tutoring in Mathematics 2012/2013

During the Fall and Spring semesters of 2012-2013 we employed 10 PBI tutors: three in WRC, three in NWRC and four at NERC.

They provided 1,207 tutoring contacts in the mathematics courses 016, 017, 118, 151, 161, 162, 171, 172, and 251. Math tutoring at the Regional Centers was available for the range of math courses offered at those sites. There was no tutoring available for a few upper-level classes, which are only taught at the Main Campus; even there, at most one or two sections of those classes are offered each semester.

Students who attended Math 118 tutoring had a higher passing rate in both Fall and Spring semesters.

Passing rates for Math 118 including all 3 Regional Centers:

                                                 Fall Semester   Spring Semester   Fall and Spring together
Math 118 students who received tutoring          76.25%          64.1%             69.2%
Math 118 students who did not receive tutoring   64%             61.7%             62.8%

There were a total of 2,328 tutoring appointments during these two semesters; 1,207 of them (52%) were provided by PBI tutors. This ratio is lower than expected because the Learning Lab had a VL math specialist (Murray Lowenthal) at NERC in both semesters, and NWRC employed two veteran non-student tutors. Other full-time specialists (Megan Fuller and Lilla Hudoba) also provided tutoring at the regional sites.
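The PBI share quoted above follows directly from the two totals in this appendix:

```python
# Share of all tutoring appointments provided by PBI tutors,
# using the totals quoted in this appendix.
pbi_contacts = 1207
total_appointments = 2328
share = pbi_contacts / total_appointments
print(f"{share:.1%}")  # prints 51.8%, i.e. roughly the 52% quoted
```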


APPENDIX 10

Outcomes Assessment for Math Tutoring

The Learning Lab Department is in the process of establishing assessment techniques for its wide variety of academic support modalities (drop-in, scheduled appointments, workshops, and study groups), as each support method may offer unique benefits to students. Surveys and course outcome data are being collected and analyzed in order to determine how best to assess the impacts of tutoring. Accurately estimating the cognitive and metacognitive value added during a tutoring session (one-on-one or in a group), and assessing how that growth is reflected in course performance and subsequent academic success, is a challenging process.

The Learning Lab Department is interested in both direct and indirect forms of assessment. Both qualitative and quantitative outcomes are of interest to the faculty so we can best decide how to meet the perceived needs of the students while ensuring that tutoring has a positive result on their learning as a whole and their course grades specifically.

This report represents a preliminary attempt to understand the quantitative effect of tutoring and other student characteristics on successful course outcomes; a series of regression analyses were performed on the limited data sets available. This data will ultimately be integrated into the larger assessment plan being developed by the department.

Methodology and Results

Data for this report come from two sources. The first contained data from two consecutive semesters, Fall 2012 and Spring 2013: course grades and the numbers of scheduled, drop-in, and workshop appointments for students who attempted Math 016, 017, or 118. A total of 10,038 students were included, 2,561 of whom attended tutoring, ranging from 1 to 102 total sessions. The mean number of sessions attended was 4.8. This data set allowed for an examination of the impact of tutoring not only on a given semester, but on a subsequent semester as well.

The second dataset contained data from Spring 2011. In addition to grades for Math 016, 017, and 118, it also included student age, zip code, total credits completed, and whether the student was a recipient of financial aid. The median income for each student’s zip code was computed using Census data. There were a total of 3,907 students who attempted the above courses, 610 of whom attended tutoring. Students attended between 1 and 42 total tutoring sessions. The mean number of sessions was 3.9. With this dataset, it was possible to explore the impact of tutoring while statistically controlling for a number of additional factors.

Regressions were used in the analyses below because they allow additional variables to be controlled for and make use of the full range of each variable, rather than constraining the variables to discrete categories.
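As an illustration of this approach, the sketch below fits a multiple regression of grade on tutoring visits and age by ordinary least squares. The solver is generic (normal equations); the toy numbers are invented for demonstration and are not the report's data.

```python
# Sketch of a multiple regression of grade on tutoring visits and age,
# fit by ordinary least squares via the normal equations (X'X)b = X'y.
# The toy data below are invented for illustration only.

def ols(X, y):
    """Return [intercept, b1, b2, ...] for y ~ 1 + X."""
    rows = [[1.0] + list(r) for r in X]         # prepend intercept column
    k = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for i in range(k):                          # Gauss-Jordan elimination
        piv = xtx[i][i]
        xtx[i] = [v / piv for v in xtx[i]]
        xty[i] /= piv
        for m in range(k):
            if m != i:
                f = xtx[m][i]
                xtx[m] = [a - f * b for a, b in zip(xtx[m], xtx[i])]
                xty[m] -= f * xty[i]
    return xty

# grade = 0.5 + 0.3 * visits + 0.05 * age, constructed exactly so the
# fit recovers the coefficients
visits_age = [(0, 19), (2, 22), (5, 30), (1, 20), (8, 35), (3, 27)]
grades = [0.5 + 0.3 * v + 0.05 * a for v, a in visits_age]
b0, b_visits, b_age = ols(visits_age, grades)
print(b0, b_visits, b_age)
```

Holding age in the model is what "controlling for" means here: the visits coefficient is the estimated effect of one more visit among students of the same age.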


Who Attends Tutoring

Students with higher grades, with more hours earned, who were older, and who were from zip codes with lower median incomes were more likely to have attended more math tutoring sessions.

Dependent = Total # of tutoring visits, N = 3612

Model           Standardized Coefficients (Beta)   Sig.
Grade           .038                               .022
Hours Earned    .036                               .034
Age             .198                               .000
Median Income   -.033                              .045


Effects of Tutoring

When controlling for age, median income, and number of credits (each of which was significant), students who attended more math tutoring sessions were more likely to achieve a Pass or a higher grade.


Not only were more tutoring sessions correlated with a higher course grade; students who attended tutoring in the prior semester were also more likely to perform better (a Pass or higher grade), whether or not they received tutoring in the subsequent semester.

Dependent Variable: grade_rcd, N = 5173

Model                    Standardized Coefficients (Beta)   Sig.
Total Appoints           .054                               .000
Tutoring Last Semester   .039                               .005

Types of Tutoring

Taken separately, each type of tutoring contributed significantly to higher grades; perhaps unexpectedly, drop-in tutoring had the highest beta weight (the most impact per session attended).

Conclusions

This preliminary study has several positive outcomes which show that tutoring has a significant positive effect on students’ ability to pass their math classes. The beta coefficient quantifies the GPA points added per tutoring visit attended (to optimize the model fit). So, theoretically, a student who attended 10 tutoring appointments would have added 0.54 GPA points (about half of a letter grade) to their course outcome. Perhaps even more compelling, the effects of tutoring possibly extend beyond the semester in which tutoring was sought and into the subsequent math course, indicating that both cognitive math skills and metacognitive learning skills may be gained through tutoring.
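The extrapolation above is a simple linear projection from the .054 per-visit coefficient reported in this study; reading that coefficient as GPA points per visit, and assuming the effect scales linearly with visits, follows the report's own interpretation:

```python
# Linear projection of course-grade gain from tutoring visits, using the
# 0.054-points-per-visit coefficient reported in this study. Assumes the
# effect scales linearly with the number of visits, as the report does.
beta_per_visit = 0.054

for visits in (1, 5, 10):
    print(f"{visits:>2} visits -> +{beta_per_visit * visits:.2f} GPA points")
# 10 visits -> +0.54 GPA points, about half a letter grade
```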

The results indicate that more tutoring leads to a more positive course outcome; this correlation was anecdotally believed and is now supported quantitatively. Also, the insight that all tutoring modalities significantly improved course outcomes suggests that the Learning Lab Department should work to enhance all types of tutoring offered to students in all levels of math courses at the college. Overall, this is a very positive and encouraging report.

Dependent = Grade, N = 3612

Model           Standardized Coefficients (Beta)   Sig.
Total Visits    .038                               .022
Hours Earned    .123                               .000
Age             .068                               .000
Median Income   .115                               .000

a. Dependent Variable: grade_rcd

APPENDIX 11

Writing Workshops-To-Go

Through our Workshops-To-Go program, the Learning Lab offers Developmental English faculty a choice of various 30-45 minute workshop sessions on specific topics, from study skills to test-taking to summary writing. Reading/Writing faculty specialists schedule a visit and present the workshop to students in class. The Learning Lab began piloting these workshops in the Fall 2013 semester with one faculty member presenting a Summary Workshop-To-Go. For Spring 2014, offerings will be expanded to include a Paraphrasing Workshop.

______________________________________________________________________

Joan Monroe

Fall 2013

Summary Workshops-To-Go - Classroom Presentations

Date      Lab Instructor  Instructor            Room/Time    Topic                  # Students
9/9/13    Monroe          Allene Murphey        BR-70/9:00   Night sky              17
9/9/13    Monroe          Allene Murphey        BR-70/1:20   Night sky              15
9/10/13   Monroe          Kate Brady            S2-12C       Night sky              14
9/16/13   Monroe          Richard Keiser        C3-13        Night sky              15
9/17/13   Monroe          Lyn Buchheit          B2-02        Night sky              19
9/17/13   Yannuzzi        Leslye Friedberg                   Weight Loss / Memory   15
9/18/13   Monroe          Amy Lewis             B2-19        Night sky              17
9/24/13   Dowdell         Theresa Marsh         WERC                                15
9/25/13   Monroe          Chris Reinhardt       S2-12A       Night Sky              17
9/26/13   Monroe          Leslye Friedberg      S2-12A       Night sky              16
10/1/13   Monroe          Larry Pinkett         B2-37        Night sky              16
10/2/13   Monroe          Andrea Ross           C3-11        Night sky              22
10/2/13   Monroe          Andrea Ross           S2-11        Night Sky              19
10/9/13   Monroe          James Landers         BR-73        Night Sky              16
10/11/13  Monroe          Naomi Geschwind       BR-54        Night Sky              16
10/15/13  Monroe          Melanie Morningstar   BR-42        Night Sky              16

TOTAL: 265


APPENDIX 12

Online Tutoring Initiative

Usage Summary Fall 2013

The Online Tutoring Initiative has had a productive and successful semester. All English 101 students were enrolled as the target group. Our numbers are above those of other schools whose programs are considered successful: we have exceeded our target of fifty posts, the standard set by other community college programs that have existed for three years, and we are in only year two of our program. Three faculty members are currently scheduled for limited live and asynchronous tutoring, and one additional faculty member works only on asynchronous postings. At this time, there is no live online tutoring in the evening, and asynchronous tutoring is not available Friday through Sunday evenings.

For next semester, we have several new ideas. We have been selected to present a session at Professional Development Week. This presentation will expand awareness of online tutoring at CCP, and faculty members will be able to actually use the platform. As they become more familiar with online tutoring, they can encourage their students to use it more. We plan to revise our flyer or use an additional flyer to highlight asynchronous tutoring. Student Government will be contacted to further promote this program, and we are also considering advertising in the student newspaper to further spread the word.


APPENDIX 13

Community College of Philadelphia, Learning Lab Online Tutoring Usage

9/1/13 - 12/9/13: Usage by Service

Usage by Service

Usage Mode    User Submissions  User Minutes  Leader Submissions  Leader Minutes  Total Submissions
LiveClass     0                 0:00:00       4                   0:31:04         4
LiveTutorial  29                1:47:03       0                   61:19:56        29
QandA         7                 0:05:34      4                   0:07:29         11
PaperCenter   31                0:00:17       29                  62:38:12        60
Notes         3                 0:00:45       0                   0:00:00         3

Usage by Course/Group

Course ID  Group / Course Name  Total Submissions  Total Minutes
302        LLAB1                1                  0:17:43
776        Tutor                102                65:01:77
1044       CIS 103 - 009        4                  0:31:04


APPENDIX 14

Dr. Michelle Myers
Chair, Learning Lab/SACC

Jan. 21, 2014

Paul Bonila and Ellen Moscow
Facilitators for Online Tutoring via the WorldWideWhiteboard
Learning Lab

Subject: Fall 2013 Usage Report

Dear Dr. Myers,

The WorldWideWhiteboard Usage Report may be summarized in a reader-friendly format as follows:

Usage by Service:

1. Activities recorded in the report are in hours/minutes/seconds.

2. The “Live Class” mode is a one-to-many tutoring session on the whiteboard; this mode was not utilized except for practicing or demonstration purposes.

3. The “Live Tutorial” mode is the one-to-one synchronous tutoring mode; 29 users entered the whiteboard in that mode.

4. In the “Q and A” mode, 7 questions were posted by users and 4 were recorded as answered.

5. The “Paper Center” mode is the one students used to submit their drafts asynchronously; we had 31 submissions, and 29 were reviewed/commented on. This number does not include the many essays students submitted as email attachments using the Canvas emailing tool.

6. In the “Notes” mode, students save their notes for later review; 3 students apparently did.

Usage by Course/Group:

1. “LLAB1” is the course name Paul used to enter the platform for demo/practice purposes. Ellen’s course, LLAB2, somehow did not show up in the report.


2. “Tutor” is the name given by Dean Hauck to the entire English 101 student body that she ‘enrolled’ as the target group; the total number of submissions in the “Service” block equals 107, and the same number is reflected in the “Course/Group” area.

3. The course titled “CIS 103-009” is apparently an anomaly. Students enrolled in the tutoring initiative have the option of forming sub-groups, and some did. We are having the tech folks at LSI check into this. They are also checking into data for the LLAB2 group, but that is not yet available.

Thank you.

_____________________________________________________________________________________

Cc: Joan Bush, ESS Division Dean


APPENDIX 15

SACC and Learning Commons

Saturday Usage

Spring 2012 and Spring 2013

Location   Spring 2012  Spring 2013  Number Increase  Percentage Increase
CBI C3-17  702          721          19               10%
NERC       514          1044         530              49.00%
NWRC       872          999          127              9.00%
West                    295          295

TOTALS     2088         3059         971              68.00%


APPENDIX 16

West Regional Center, SACC Usage

Fall 2012 and Spring 2013

COMP_LAB  TERM         TOTAL_SWIPES  TOTAL_PERSONS
WERC 160  Fall 2012    2859          2449
WERC 160  Spring 2013  5041          4465


APPENDIX 17

Below are the student contacts for LLAB for the period 1/1/13 - 12/6/13:

1  19276  Drop In
2  9723   Attended
3  184    Tutor Absent
4  3233   No show

Below is the SACC count for the period 1/1/13 - 12/6/13: 288350

Thank you,

4ITSupport
B2-38
Data Request Assistance: 215-496-6000 (choose option 1, then option 2) or 215-751-8060


APPENDIX 18

Learning Lab / SACC Universal Assessment Template

Overview

This document describes a universal assessment template, which may be used to assess the effectiveness of most programs and services provided by the Learning Lab, and many provided by the Student Academic Computer Center (SACC). The method combines data from pre- and post-surveys, attendance records, and measures of student performance including grades and enrollment behavior to build a picture of department programs’ effectiveness in promoting academic-success strategies and improving academic performance.

The challenge

One difficulty we have faced arises from the indirect nature of our impact on students’ academic performance. The Learning Lab and SACC serve students not by teaching their course material to them, but by improving their ability to succeed in their courses. We do not participate directly in assigning grades or in other components of students’ academic performance. Traditional performance measures like GPA and retention therefore reflect our impact only indirectly. This indirectness has made assessment of our effectiveness difficult. Variation in the teaching and evaluation styles of students’ course instructors introduces a great deal of statistical noise, through which it is difficult to discern the signal of our impact. (This problem holds regardless of whether we are effective or not.) Furthermore, when we are able to find a compelling relationship between use of one of our services and student performance, we have seldom been able to rule out the effect of selection bias: the students who make use of our services are often highly motivated, diligent students who might have succeeded even without our help. (When we have been able to rule out selection bias, it was only under unusual circumstances, circumstances rare enough that sample sizes, and therefore statistical power, are small.)

A further difficulty arises from the teaching philosophy of the Learning Lab. We believe that direct re-teaching of course material harms, rather than benefits, students. Re-teaching encourages dependence on tutors and discourages the development of the cognitive and academic skills that lead to a lifetime of independent, effective learning. We do re-teach material in a limited way, but Learning Lab faculty members, and SACC instructors as well, spend a good deal of instructional time trying to inculcate in students the skills and habits of mind that will make them excellent students in any course. These skills are not measured directly in students’ course exams, papers, or any of the instruments that inform traditional measures of academic performance. Thus one of the most important elements of Learning Lab and SACC instruction has gone unassessed.

Impact model

This assessment template is built on a model of our impact that includes students’ skills, knowledge, strategies, and habits (SKSHs). In our model (Figure 1), students begin every semester with a set of


SKSHs, the pre-SKSHs. The pre-SKSHs affect a student’s decision whether to attend or make use of a particular service provided by Learning Lab or SACC. Use of that service in turn affects the student’s SKSHs later in the semester, we hope for the better. These late-semester SKSHs, the post-SKSHs, are affected by the service, but also by the pre-SKSHs. Finally, we take academic performance to be determined by the post-SKSHs.

The indirectness of our impact is reflected in how academic performance is one causal step removed from our service: we affect the post-SKSHs, and it is the post-SKSHs that affect performance. Earlier data-based assessments in our department have attempted to measure the indirect effect of the service on performance (the dotted arrow in the figure), but we believe our effectiveness is more authentically reflected in the direct causal arrow from our service to the post-SKSHs.

Quantifying SKSHs

In order to quantify SKSHs, we will use web surveys of random samples of CCP students. We will obtain from ITS the email addresses of all or a large random sample of students, and the student ID numbers associated with those addresses. For the several SKSHs that we wish to quantify in a given semester, we will craft survey questions in the Qualtrics online survey tool. We will then solicit responses to the survey using the Qualtrics mailer, and collect responses in such a way that all responses are associated with the respondent’s email address, and therefore student ID number.

Pre-SKSHs will be quantified in a survey sent near the beginning of each semester. Post-SKSH surveys will be identical to the pre-SKSH surveys, and they will be sent near the end of the semester to each student who completed the pre-SKSH survey.
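The matching step described above, pairing each student's pre- and post-survey responses by student ID, can be sketched as follows. The ID numbers and scores here are hypothetical, not the actual Qualtrics export:

```python
# Hypothetical pre- and post-SKSH survey scores, keyed by student ID.
pre_responses = {1001: 3.2, 1002: 2.8, 1003: 4.0}
post_responses = {1001: 3.9, 1003: 4.1, 1004: 2.5}

# Keep only students who completed BOTH surveys, pairing their scores.
matched = {
    sid: (pre_responses[sid], post_responses[sid])
    for sid in pre_responses.keys() & post_responses.keys()
}
# Students 1002 (no post-survey) and 1004 (no pre-survey) drop out of the
# analysis; only 1001 and 1003 remain with (pre, post) pairs.
```

Because the post-survey is sent only to pre-survey completers, the intersection should normally lose students in one direction only, but taking the intersection guards against stray records in either export.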

Quantifying usage

Some of the survey respondents will make use of Learning Lab and SACC services, and some will not. Some of those who do, will do so many times, and some will do so once or a few times. We will be able to attach usage frequency to all of our services’ users by keeping attendance and usage records of every departmental interaction with every student. These interaction records include which service was utilized, and the student’s ID number. The department has been collecting attendance records, along with student ID numbers, for many years, so the procedures for this step are well established for many of our services.
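The usage-frequency step above amounts to a tally over the attendance log. A minimal sketch, with illustrative IDs and service names:

```python
from collections import Counter

# Hypothetical attendance log: one (student_id, service) record per visit.
visits = [
    (1001, "Math Lab"), (1001, "Math Lab"), (1003, "Writing Lab"),
    (1001, "Writing Lab"), (1003, "Writing Lab"), (1004, "SACC"),
]

# Usage frequency per student per service.
usage = Counter(visits)
print(usage[(1001, "Math Lab")])   # 2

# Total visits per student, regardless of service.
per_student = Counter(sid for sid, _ in visits)
print(per_student[1001])           # 3
```

Either tally can then be joined to the survey data by student ID, giving each respondent a usage count of zero or more for each service under assessment.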

Quantifying performance

There are many ways to define and quantify performance, and it is not important that we choose the relevant ones for this template, which is meant to be general and to cover a wide variety of departmental services. In the past, we have examined many performance measures, including:

• Probability of graduation
• Probability of enrolling in the next course in a sequence
• Probability of passing the course
• Final course grade


• Final course grade, normalized by section average
• Final normalized grade in the next course in a sequence
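As one illustration of the normalized measures, a grade can be expressed as its deviation from the section mean, so that a value of 0.0 means "typical for that section" regardless of an instructor's grading severity. The sections, IDs, and grades below are hypothetical, and subtracting the mean is just one plausible normalization scheme:

```python
from statistics import mean

# Hypothetical final grades on a 4.0 scale, grouped by course section.
sections = {
    "ENGL 101-001": {1001: 3.7, 1002: 2.3, 1003: 3.0},
    "ENGL 101-002": {1004: 4.0, 1005: 2.0},
}

# Normalize each grade by subtracting its section's average.
normalized = {}
for section, grades in sections.items():
    avg = mean(grades.values())
    for sid, grade in grades.items():
        normalized[sid] = round(grade - avg, 2)

print(normalized[1001])  # 0.7  (3.7 against a 3.0 section average)
```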

Quantifying causation flows

Once we have quantitative data reflecting pre-SKSHs, service usage, post-SKSHs, and academic performance, we will be able to estimate the magnitudes of the impacts represented by the arrows in Figure 1. To do so, we will use path analysis, a form of multiple-regression analysis that is useful for quantifying causal relationships and indirect relationships like that between service usage and academic performance (the dotted arrow). Our goal will be to quantify this dependency (performance’s dependency on service usage), as well as the post-SKSHs’ dependency on service usage. In other words, we will be looking to see whether using our services has a positive impact on post-SKSHs and on academic performance.
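In path analysis, the model in Figure 1 reduces to one regression per endogenous variable, and the indirect effect is the product of the direct path coefficients along the route. The sketch below uses simulated data and a plain least-squares fit; the effect sizes (0.4, 0.5, 0.3, 0.6) and variable names are illustrative assumptions, not departmental results:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated standardized data following the impact model: pre-SKSHs drive
# both usage and post-SKSHs; performance depends only on post-SKSHs.
pre = rng.normal(size=n)
usage = 0.4 * pre + rng.normal(size=n)
post = 0.5 * pre + 0.3 * usage + rng.normal(size=n)
perf = 0.6 * post + rng.normal(size=n)

def ols(y, *xs):
    """Least-squares slope coefficients of y on predictors xs (with intercept)."""
    X = np.column_stack([np.ones_like(y), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

# One regression per endogenous variable gives the path coefficients.
a = ols(usage, pre)[0]                 # pre-SKSHs -> usage
b_pre, b_use = ols(post, pre, usage)   # pre-SKSHs -> post, usage -> post
c = ols(perf, post)[0]                 # post-SKSHs -> performance

# The indirect effect of usage on performance (the dotted arrow) is the
# product of the direct paths: usage -> post-SKSHs -> performance.
indirect = b_use * c
```

Regressing post-SKSHs on both pre-SKSHs and usage is what separates the service's contribution from the head start that motivated students bring with them, which is the selection-bias concern raised earlier in this document.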

Other causal flows will be interesting as well. For example, we would like to know how participation in our programs is affected by pre-SKSHs. This information might help us to conceive of new programs or improve the marketing of existing ones.

Note that our analysis will only be possible if some of the students whose SKSHs we measure never make use of the departmental services that we are assessing. Participation of non-users in our SKSH surveys will allow us to compare changes from the pre-survey to the post-survey between users and non-users.
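The user/non-user comparison just described is, at its simplest, a difference-in-differences computation: each group's average pre-to-post change, then the gap between the groups. A toy version with hypothetical scores:

```python
from statistics import mean

# Hypothetical (pre, post) SKSH survey scores for users and non-users.
users = [(2.8, 3.6), (3.1, 3.8), (2.5, 3.0)]
non_users = [(3.0, 3.2), (2.7, 2.8), (3.4, 3.5)]

def avg_gain(group):
    """Average pre-to-post change within a group."""
    return mean(post - pre for pre, post in group)

# A positive difference suggests users improved more than non-users
# over the same semester.
diff_in_diff = round(avg_gain(users) - avg_gain(non_users), 2)
```

This simple contrast is subsumed by the path analysis above, but it is easy to compute and to explain, which makes it a useful first look at each service's data.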


Figure 1: Impact model. Pre-SKSHs directly impact both usage of departmental services and post-SKSHs. Academic performance depends directly only on post-SKSHs, but indirectly on service usage. We would most like to quantify the direct relationship between service usage and post-SKSHs.