2015 Certification & Program Officials Conference
Sessions E1-6: GaPSC/CAEP Approval Process
December 2, 2015
Enjolia Farrington and Nate Thomas, GaPSC Education Specialists


TRANSCRIPT

Page 1:

2015 Certification & Program Officials Conference

Sessions E1-6: GaPSC/CAEP Approval Process
December 2, 2015

Enjolia Farrington and Nate Thomas
GaPSC Education Specialists

Page 2:

Visit our Conference Padlet throughout the day to post your questions. At 3:00 there will be a general session wrap-up by Kelly Henson, and then a GaPSC Panel will answer your questions.

Be in the know! http://padlet.com/julie_beck3/certification_conference

Password: Cert Conf

Collaborate with us!

Page 3:

Presentations

http://www.gapsc.com/Commission/Media/DocsPresentations.aspx

Page 4:

Our Mission
To build the best prepared, most qualified, and most ethical education workforce in the nation.

Page 5:

Topics

1. Warm-Up Activity: KWL
2. Why Program Approval?
3. Overview of GaPSC approval process
4. Overview of CAEP accreditation
5. Program Approval Standards & Evidence
6. GaPSC Updates
7. Annual Reporting

Page 6:

Warm-Up Activity

Page 7:

KWL Chart

1. At your table, discuss what you know about the approval standards and process.

2. Individually, add post-its listing what you want to learn during the session.

3. As you learn something new, add it to the chart throughout the session.

Page 8:

Overview

Page 9:

Program Approval

Through program approval, the GaPSC establishes and enforces standards used to prepare teachers, service personnel, and school leaders, and approves education program providers and programs whose candidates receive state certification.

Page 10:

Big Questions of Program Approval
• Is this Educator Preparation Provider preparing quality candidates who can be effective educators?
• How do we know?

Page 11:

Peer Review System

Site Visit (peer review of education program provider and preparation programs)

Evaluation Review Panel (peer approval recommendation to the Ed Prep Standing Committee if not all standards are met)

Educator Preparation Standing Committee (final approval recommendation)

Professional Standards Commission (final approval decision)

Page 12:

GaPSC Approval Process

Page 13:

Process – GaPSC
• ISA Form completed
• PRS-II created
• Site visit team formed
• Site visit team reviews PRS-II
• State Program Reports sent to Educator Preparation Provider (EPP)
• EPP creates Addendum in PRS-II
• Site visit team reviews new evidence
• Previsit with chair and Education Specialist to finalize site visit
• Site visit team comes on campus
• Report completed and recommendations made to ERP/EPSC/Commission

Page 14:

PRS-II

• Use of PRS-II
  – Upload evidence
  – Provide narrative about evidence
  – No other evidence room
  – When will PRS-II be completed? (typically 8 months before onsite)

Page 15:

CAEP Accreditation Process

Page 16:

Process – CAEP
• ISA Form completed and confirmed by Education Specialist
• Select dates and notify CAEP (12-18 months prior to visit)
• AIMS Template available
• Site visit team formed
• Site visit team reviews Self-Study Report
• Site visit team submits Formative Feedback Report
• EPP submits Addendum in AIMS
• Site visit team reviews new evidence
• Virtual previsit with chair and Education Specialist to finalize site visit
• Site visit team comes on campus
• Report completed within 30 days and EPP submits Rejoinder

Page 17:

Process – CAEP: Selected Improvement Pathway
• Format: structured report

• Addressing the standards: EPPs write directly to the standards with evidence and supporting narrative

• Demonstrating quality assurance: The EPP develops and implements a data-driven Selected Improvement Plan that focuses on improvement with respect to a selected Standard, Standard component, or cross-cutting theme

Page 18:

CAEP Self-Study
• EPP context & Conceptual Framework

• Capacity Tables (regional accreditation, clinical educator qualification, parity)

• Evidence uploaded for each standard

• Questions/prompts specific to the standard about the source of evidence
  – Questions or prompts are specific to the type of evidence and the standard
  – Characterization of the quality of the evidence
  – Discussion of results and their implications
  – Demonstration of quality assurance

• Response to previous NCATE Area(s) for Improvement

• Submission of Selected Improvement Plan and Recruitment Plan

• Evidence of integration of cross-cutting themes of diversity and technology

Page 19:

CAEP Review of Assessments

• Improve the quality of assessments used by EPPs to evaluate and report candidate/completer performance

• EPP-wide assessments/surveys reviewed by CAEP up to three years prior to submitting the Self-Study Report
  – Specific assessments created or modified by the EPP and used across all discipline-specific content areas in the EPP
  – Student teaching observation instruments, exit surveys, teacher work samples, portfolios, etc.

• The intent is to allow EPPs time to:
  – improve their assessments/surveys and scoring guides,
  – provide more precise feedback to candidates,
  – improve the program’s ability to analyze data for evidence leading to continuous improvement, and
  – ensure consistency among evaluators using the same assessment.

Page 20:

CAEP AFIs & Stipulations

• Area for Improvement: Identifies a weakness in the evidence for a component or a standard. A single AFI is usually not of sufficient severity that it leads to an unmet standard. Must be corrected within seven years.

• Stipulation: Deficiency related to one or more components or a CAEP standard. A stipulation is of sufficient severity that a standard is determined to be unmet. For EPPs seeking to continue their accreditation, a stipulation must be corrected within two years to retain accreditation.

Page 21:

Standards & Features of Evidence

Page 22:

Standards

• Use of Program Approval Standards (2016)
  – 5 CAEP Standards
  – 1 Georgia Special Requirements Standard
  – Effective Fall 2016

• Use PRS-II
  – Upload evidence
  – Provide narrative about evidence

Page 23:

Standard 1: Content & Pedagogical Knowledge

Page 24:

Standard 1: Content and Pedagogical Knowledge (diagram)

• Provider Quality Assurance and Continuous Improvement
• Candidates demonstrate understanding of InTASC Standards; diversity (InTASC Standard 2)
• Providers ensure candidate use of research and evidence
• Providers ensure candidates apply content and pedagogical knowledge
• Providers ensure candidates demonstrate skills and commitment that afford access to college- and career-ready standards
• Providers ensure candidates model and apply technology standards

Page 25:

Standard 1: Content and Pedagogical Knowledge

• How do candidates demonstrate an understanding of the 10 InTASC standards at the appropriate progression level(s) in the following categories: the learner and learning; content; instructional practice; and professional responsibility?

• How does the provider ensure that candidates use research and evidence to develop an understanding of the teaching profession?

• How do candidates use research and evidence to measure their P-12 students' progress and their own professional practice?

Page 26:

Standard 1: Content and Pedagogical Knowledge
• How do candidates apply content and pedagogical knowledge as reflected in outcome assessments in response to standards of Specialized Professional Associations (SPA), the National Board for Professional Teaching Standards (NBPTS), states, or other accrediting bodies (e.g., National Association of Schools of Music - NASM)?

• How do candidates demonstrate skills and commitment that afford all P-12 students access to rigorous college- and career-ready standards (e.g., Next Generation Science Standards, National Career Readiness Certificate, Common Core State Standards)?

• How do candidates model and apply technology standards as they design, implement and assess learning experiences to engage students and improve learning; and enrich professional practice?

Page 27:

Standard 1: Features of Potential Evidence

• Data are disaggregated by licensure area
• Evidence is provided directly informing on candidate proficiency for each of the four InTASC categories
• At least two cycles of data are provided
• At least one comparison point is available for analysis

Page 28:

Standard 1: Features of Potential Evidence

• Specific documentation is provided that candidates “use research and evidence”
  – Items on observational instruments
  – Required in unit or lesson plans
  – Part of work sample
  – edTPA

Page 29:

Standard 1: Features of Potential Evidence
• Plan or submitted assessments that include:
  – Assessment or observation proficiencies specific to college- and career-ready teaching are identified
  – Plan or assessments are specific to the subject content area
  – Candidates demonstrate deep content knowledge
  – Candidates have required students to apply knowledge to solve problems and think critically in the subject area
  – Candidates have demonstrated the ability to differentiate instruction for students with at least two different needs (e.g., ELL, urban/rural, disadvantaged, low or high performing)

Page 30:

Standard 1: Features of Potential Evidence
• At least three of the four categories listed below are addressed:
  – Accessing databases, digital media, and tools to improve P-12 learning
  – Knowing why and how to help P-12 students to access and assess quality digital content
  – Ability to design and facilitate digital learning, mentoring, and collaboration, including the use of social networks
  – Candidate use of technology to track, share, and evaluate student learning

Page 31:

Standard 2: Clinical Partnerships & Practice

Page 32:

Standard 2: Clinical Partnerships & Practice (diagram)

• Provider Quality Assurance and Continuous Improvement
• Partners co-construct mutually beneficial P-12 collaborations (establish mutually agreeable expectations for candidate entry, exit, theory and practice, coherence, and shared accountability for candidate outcomes)
• Partners co-select, prepare, evaluate, support, and retain high-quality clinical educators
• Providers work with partners to design clinical experiences of sufficient depth, breadth, diversity, coherence, and duration

Page 33:

Standard 2: Clinical Partnerships and Practice
• How do clinical partners co-construct mutually beneficial P-12 school and community arrangements, including technology-based collaborations, for clinical preparation and share responsibility for continuous improvement of candidate preparation?

• What are the mutually agreeable expectations for candidate entry, preparation, and exit to ensure that theory and practice are linked, maintain coherence across clinical and academic components of preparation, and share accountability for candidate outcomes?

• How do clinical partners co-select, prepare, evaluate, support, and retain high-quality clinical educators, both provider- and school-based, who demonstrate a positive impact on candidates' development and P-12 student learning and development?

Page 34:

Standard 2: Clinical Partnerships and Practice
• What are the multiple indicators and appropriate technology-based applications used to establish, maintain, and refine criteria for selection, professional development, performance evaluation, continuous improvement, and retention of clinical educators in all clinical placement settings?

• How does the provider work with partners to design clinical experiences of sufficient depth, breadth, diversity, coherence, and duration to ensure that candidates demonstrate their developing effectiveness and positive impact on all students' learning and development?

• How are clinical experiences, including technology-enhanced learning opportunities, structured to have multiple performance-based assessments at key points within the program to demonstrate candidates' development of the knowledge, skills, and professional dispositions (as delineated in Standard 1), that are associated with a positive impact on the learning and development of all P-12 students?

Page 35:

Standard 2: Features of Potential Evidence
• Evidence documents that P-12 schools and EPPs have both benefitted from the partnership
• Evidence documents that a collaborative process is in place
  – A shared responsibility model that includes such things as co-construction of instruments and evaluations, curriculum revisions, and key assignments
  – A system is in place to ensure P-12 educators are involved in ongoing decision-making (e.g., exit and entry requirements, etc.)
• Evidence documents that clinical experiences are sequential, progressive, and linked to coursework
• Educators and/or administrators co-construct criteria for selection of clinical educators
  – Are involved in the selection and evaluation of clinical educators
  – Candidates and clinical educators evaluate each other
  – Results of evaluations are shared with clinical educators

Page 36:

Standard 2: Features of Potential Evidence
• Data are collected (survey data specific to clinical educators) and used by EPPs and P-12 educators to refine criteria for selection of clinical educators

• Resources are available on-line to ensure access to all clinical educators

• Clinical educators receive professional development on the use of evaluation instruments, professional disposition evaluation of candidates, specific goals/objectives of the clinical experience, and providing feedback.

Page 37:

Standard 2: Features of Potential Evidence
• Evidence documents that all candidates have active clinical experiences in diverse settings
• Attributes (depth, breadth, diversity, coherence, and duration) are linked to student outcomes and candidate/completer performance
  – Impact is assessed by candidates in more than one clinical experience
  – Candidates use both formative and summative assessments
  – Candidates are assessed on the ability to use data to measure impact on student learning and development and to guide instructional decision-making
  – Candidates have purposefully assessed impact on student learning using two comparison points

Page 38:

Standard 2: Features of Potential Evidence
• Evidence documents that candidates have used technology to enhance instruction and assessment
  – Use of technology is by both candidates and students
  – Specific criteria for appropriate use of technology are identified
• Clinical experiences are assessed using performance-based criteria
  – Candidates are assessed throughout the program, with data supporting increasing levels of candidate competency
  – Evidence documents a sequence of clinical experiences that are focused, purposeful, and varied, with specific goals for each experience

Page 39:

Standard 3: Candidate Quality, Recruitment, and Selectivity

Page 40:

Standard 3: Candidate Quality, Recruitment, and Selectivity (diagram)

• Provider Quality Assurance and Continuous Improvement
• Provider presents plans and goals to recruit high-quality candidates (diverse backgrounds, populations, hard-to-staff schools and shortage areas)
• Provider sets admission requirements
• Establish and monitor dispositions beyond academic ability
• Create criteria for program progression and monitor candidate advancement
• Document high standard for content knowledge and effective impact on P-12 learning
• Understand Code of Ethics, Professional Standards of Practice, and relevant laws

Page 41:

Standard 3: Candidate Quality, Recruitment, and Selectivity

• Presents plans and goals to recruit and support completion of high-quality candidates from a broad range of backgrounds and diverse populations to accomplish their mission.

• How does the provider ensure that the admitted pool of candidates reflects the diversity of America's P-12 students?

• How does the provider address community, state, national, regional, or local needs for hard-to-staff schools and shortage fields (currently STEM, English-language learning, and students with disabilities)?

• What are the admission requirements?

Page 42:

Standard 3: Candidate Quality, Recruitment, and Selectivity

• How does the provider gather data to monitor applicants and the selected pool of candidates?

• Provide an analysis of the evidence that ensures that the average grade point average of its accepted cohort of candidates meets or exceeds the minimum of 3.0, and the group average performance on nationally normed ability/achievement assessments is in the top 50 percent from 2016-2018;

• How does the provider establish and monitor attributes and dispositions beyond academic ability that candidates must demonstrate at admissions and during the program?

• How does the provider select criteria, describe the measures used and evidence of the reliability and validity of those measures, and report data that show how the academic and non-academic factors predict candidate performance in the program and effective teaching?


Page 44:

Standard 3: Candidate Quality, Recruitment, and Selectivity

• What are the criteria for program progression and how does the provider monitor candidates' advancement from admissions through completion?

• Analyze the evidence to indicate candidates' development of content knowledge, pedagogical content knowledge, pedagogical skills, and the integration of technology in all of these domains.

• How does the provider document that the candidate has reached a high standard for content knowledge in the fields where certification is sought and can teach effectively with positive impacts on P-12 student learning and development?

• How does the provider document that the candidate understands the expectations of the profession, including codes of ethics, professional standards of practice, and relevant laws and policies, before recommending for licensure?

Page 45:

Standard 3: Features of Potential Evidence
• Documentation of existence of a recruitment plan, based on the EPP mission, with targets for 5 to 7 years out

• Data on admitted and enrolled candidates are disaggregated by race/ethnicity and gender

• Evidence that results are recorded and monitored, and results are used in planning preparation for shifting cohorts including modifications to recruitment strategies

• Informed knowledge of employment opportunities in schools/districts/regions where completers are likely to be placed is documented

• STEM and ELL opportunities are explicitly addressed in the EPP analysis of shortage area employment needs, along with employment needs in hard to staff schools


Page 47:

Standard 3: Features of Potential Evidence
• EPP documents that the average score of each cohort of admitted candidates meets a minimum GPA of 3.0 and performance on a nationally normed test of academic achievement/ability in the top 50 percent

• OR similar average cohort performance using a state normed test of academic achievement/ability in the top 50 percent [GACE scores]

• OR EPP has a “reliable, valid model” in which they use admissions criteria different from those specified in 3.2 that result in positive correlation with measures of P-12 student learning

Page 48:

Standard 3: Features of Potential Evidence
• EPP establishes non-academic factors to use at admission or during preparation that are research-based

• EPP monitors progress and uses results from the evaluation of the non-academic factors for individual candidate mentoring and program improvement (curriculum and clinical experiences)

• EPP reports data showing how academic and non-academic factors predict candidate performance in the program

Page 49:

Standard 3: Features of Potential Evidence
• Measures provide evidence of developing proficiencies of candidates in critical areas such as:
  o Ability to teach to college- and career-ready standards
  o Content knowledge
  o Dispositions
  o Pedagogical content knowledge
  o Pedagogical skills
  o Integration of use of technology
  o Impact on P-12 student learning

• EPP documents candidates’ understanding of codes of ethics, professional standards and ethics, and relevant laws and policies

Page 50:

Standard 3: Features of Potential Evidence
• Measures provide evidence of developing proficiencies of candidates in critical areas such as:
  o Ability to teach to college- and career-ready standards
  o Content knowledge
  o Dispositions
  o Pedagogical content knowledge
  o Pedagogical skills
  o Integration of use of technology
  o Impact on P-12 student learning

• Evidence of actions taken, such as:
  – Changes in curriculum or clinical experiences
  – Changing admissions criteria
  – Providing mentoring
  – Counseling out

Page 51:

Standard 4: Program Impact

Page 52:

Standard 4: Program Impact (diagram)

• Provider Quality Assurance and Continuous Improvement
• Provider documents that program completers contribute to an expected level of student learning growth
• Provider demonstrates that completers effectively apply professional knowledge, skills, and dispositions
• Provider demonstrates that employers are satisfied with completers' preparation
• Provider demonstrates that program completers perceive their preparation as relevant and that the preparation was effective

Page 53:

Standard 4: Program Impact

• How does the provider document, using multiple measures, that program completers contribute to an expected level of student-learning growth?

• How does the provider demonstrate, through structured and validated observation instruments and/or student surveys, that completers effectively apply the professional knowledge, skills, and dispositions that the preparation experiences were designed to achieve?

• How does the provider demonstrate, using measures that result in valid and reliable data and including employment milestones such as promotion and retention, that employers are satisfied with the completers' preparation for their assigned responsibilities in working with P-12 students?

• How does the provider demonstrate, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective?

Page 54:

Standard 4: Features of Potential Evidence
• Preparation Program Effectiveness Measures (PPEM)
• EPP analyzes, evaluates, and interprets information provided by the State
  – Characteristics and patterns in data
  – At least 20% of completers are represented
  – Explanation of results based on results of teacher placement
  – EPP judges the implications of the data and analyses for the preparation program and considers appropriate modifications

Page 55:

Standard 4: Features of Potential Evidence
• Observation instruments are structured and inclusive of the application of professional knowledge, skills, and dispositions
• Interpretations of performance are made, especially in relation to benchmarks, norms, and cut scores
• EPP-administered surveys
  – Survey return rates are at acceptable levels and inclusive of most licensure areas in the EPP
  – The representativeness of the sample, the characteristics of the respondents, and the survey response rate are reported
  – Disaggregated data specific to high-need schools or licensure areas
  – Data are analyzed, evaluated, and interpreted
  – Conclusions are supported by the data, and comparison points for data are provided

Page 56:

Standard 5: Provider Quality Assurance & Continuous Improvement

Page 57:

Standard 5: Provider Quality Assurance & Continuous Improvement (diagram)

• Provider Quality Assurance and Continuous Improvement
• Quality assurance system monitors candidate progress, completer achievements, and provider operational effectiveness
• Quality assurance system relies on relevant, verifiable, representative, cumulative, and actionable measures
• Provider regularly and systematically assesses performance against its goals, tracks results, tests innovations, and uses results for improvement
• Measures of completer impact are summarized, externally benchmarked, analyzed, shared widely, and acted upon
• Provider assures that appropriate stakeholders are involved in program evaluation and improvement

Page 58:

Standard 5: Provider Quality Assurance & Continuous Improvement (diagram)

Quality Assurance System:
• Monitors candidate progress, completer achievements, and provider operational effectiveness
• Relies on relevant, verifiable, representative, cumulative, and actionable measures
• Provider regularly and systematically assesses performance against its goals, tracks results, tests innovations, and uses results for improvement
• Measures of completer impact are summarized, externally benchmarked, analyzed, shared widely, and acted upon
• Appropriate stakeholders are involved in program evaluation and improvement

Page 59:

Standard 5: Provider Quality Assurance and Continuous Improvement

• Describe how the quality assurance system is comprised of multiple measures that can monitor candidate progress, completer achievements, and provider operational effectiveness.

• Describe how the quality assurance system relies on relevant, verifiable, representative, cumulative and actionable measures, and produces empirical evidence that interpretations of data are valid and consistent.

• How does the provider regularly and systematically assess performance against its goals and relevant standards, track results over time, test innovations and the effects of selection criteria on subsequent progress and completion, and use results to improve program elements?

Page 60:

Standard 5: Provider Quality Assurance and Continuous Improvement

• How are measures of completer impact, including available outcome data on P-12 student growth, summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision-making related to programs, resource allocation, and future direction?

• How does the provider assure that appropriate stakeholders, including alumni, employers, practitioners, school and community partners, and others defined by the provider, are involved in program evaluation, improvement, and identification of models of excellence?

Page 61:

Standard 5: Features of Potential Evidence
• EPP's quality assurance system monitors candidate progress, completer achievements, and EPP operational effectiveness

• Documentation that quality assurance system supports targeted change (e.g., through capacity to disaggregate data by program and/or candidate level, and to respond to inquiries)

• Documentation that the system operations and data are regularly reviewed

Page 62:

Standard 5: Features of Potential Evidence
• Documentation that evidence is:
  – Relevant (related to the standard)
  – Verifiable (accuracy of sample)
  – Representative (typical and free of bias)
  – Cumulative (generally 3 cycles or more)
  – Actionable (in a form to guide program improvement)

• Documentation that interpretations of evidence are consistent (across different sources of data) and valid (e.g., inter-rater reliability)

Page 63:

Standard 5: Features of Potential Evidence
• Documentation that EPP regularly and systematically:
  – Reviews quality assurance system data
  – Poses questions
  – Identifies patterns across the data
• Documentation that EPP addresses apparent areas for improvement and makes appropriate changes in preparation
• EPP documents appropriate tests of the effects of selection criteria (under 3.2) and other program changes:
  – Baseline(s)
  – Intervention description
  – Comparison(s) of results and next steps taken and/or planned

Page 64:

Standard 5: Features of Potential Evidence
• Outcome and impact measures include:
  – Analysis of trends
  – Comparisons with benchmarks
  – Indication of changes made in preparation
  – Considerations for distribution of resources
  – Future directions anticipated

• Evidence that outcome measures and their trends are posted on the EPP website and shared widely in other ways

• Development of a data-driven Improvement Plan

Page 65:

Standard 5: Features of Potential Evidence
• Specific evidence is provided that:
  – Indicates which particular stakeholders are involved (e.g., alumni, employers, practitioners, school and community partners, others defined by the EPP)
  – Illustrates ways stakeholders are involved (e.g., communications, discussions of implications of data, program evaluation, selection and implementation of changes for improvement, decision making)
  – Shows that regular and appropriate stakeholder groups are involved in decision-making, evaluation, and continuous improvement (note: not every stakeholder group would necessarily be appropriate for every decision process)

• EPP identifies at least two examples of use of and input from stakeholders

Page 66:

Standard 6: Georgia Requirements for Educator Preparation Programs

Page 67:

Georgia Standard 6: Georgia Requirements
• Admission Requirements
• Reading Methods (applies only to ECE, MG, Special Ed General Curriculum, Special Ed Adapted Curriculum, and Special Ed General Curriculum/Early Childhood Education)
• Identification and Education of Children with Special Needs
• Georgia P-12 Curriculum, Instruction, and Educator Evaluation
• Professional Ethical Standards and Requirements for Certification and Employment
• Field Experiences Appropriate to the Grade Level and Field of Certification Sought, and Clinical Practice
• Content Coursework Requirements for Service Programs in Curriculum and Instruction, Instructional Technology, and Teacher Leadership

Page 68:

Georgia Standard 6: Features of Potential Evidence
• Admission data
• Course syllabi and/or content matrices for reading methods, special needs, and technology courses
• Content matrices from preparation program reports or folios showing correlation with Georgia-mandated P-12 standards (i.e., Common Core Georgia Performance Standards (CCGPS), Georgia Performance Standards (GPS), and GACE objectives)
• Course syllabi and/or content matrices that include knowledge about and application of professional ethics
• Field and clinical experience tracking chart
• Preparation program reports and/or folios

Page 69:

Annual Reporting

Page 70:

Preparation Approval Annual Report (PAAR)

• EPPs will retrieve their PPEM data via PAAR
• The majority of data in the PAAR will be pre-populated
• EPPs will respond to PPEM data and other data such as:
  – Ethics violations data
  – Data from the administration of the ethics assessment
  – Survey data (completer survey, first-year teacher survey, employer survey)
• 2016 PAAR due after PPEM data is available

Page 71:

CAEP EPP Annual Report

• Due in April
• Only NCATE/TEAC/CAEP-accredited EPPs
• Report outcome measures and progress on correcting AFIs
• Report candidate & completer data
• Feedback will be provided after submission

Page 72:

Onsite Visit

Page 73:

Onsite Logistics to be Confirmed during the Pre-visit
• Lodging, work room, and meals at the hotel
• Work room and interview rooms on site
• Transportation from hotel to campus
• Wi-Fi at both locations; projector, screen, shredder, printer
• Interviews scheduled for Monday

Page 74:

Questions?

Page 75:

Thank you!

Take the survey:

https://www.surveymonkey.com/r/PSCDriveIn15