Results-based Monitoring for IDEA & ESEA Programs
Office of Consolidated Planning & Monitoring
Fall 2014
Results-based Monitoring
Purpose: To monitor and support districts in the implementation of IDEA & ESEA programs that improve outcomes for students, while recognizing continuous improvement is necessary.
Shifting our focus from COMPLIANCE to PERFORMANCE
IDEA Monitoring Shift of Focus
IDEA compliance monitoring has produced 85-95% compliance in the areas of student records, parent notification, etc.
Student outcomes measured by academic achievement, graduation rate and dropout rate have not improved over time.
Focus is changing from strict compliance to student outcomes.
Continuous Improvement Cycle
CPM: Monitoring

To monitor and support districts in the implementation of IDEA & ESEA programs that improve outcomes for students, while recognizing continuous improvement is necessary: results-based monitoring implemented in the 2014-15 school year.

Desired Outcomes
• Streamlining the process for districts
• Improved planning and monitoring around targeted accountability goals
• A cycle of continuous improvement for district improvement

First Steps
• Consolidation of monitoring tools
• Creating and piloting a results-based tool and monitoring process
• Training and additional support for LEAs through CPM and Special Populations
Results-based Monitoring Tool
Process Overview
Why a results-based monitoring tool?
• Shifts focus from compliance to program effectiveness
• Encourages collaborative conversations around district programs
• Provides a better understanding of successes and challenges

How is the results-based monitoring tool organized?
• Based on factors influencing student outcomes
• Combines IDEA & ESEA monitoring items
• Adds an Improvement Plan focusing on suggested strategies to increase student outcomes
• Includes a Compliance Action Plan
Process Overview
Who is on the monitoring team?
• Lead: CPM Regional Consultant
• CPM Regional Consultant
• Staff representing Special Education and other critical subgroups and areas (EL, non-public, etc.)
• Fiscal consultants, when needed
• CPM Nashville staff, when needed
Monitoring Process Overview
Step 1. TDOE identifies LEAs based on risk and notifies the selected LEAs.

Step 2. Data collection
• TDOE gathers assessment and growth data on districts and schools by proficiency level, subject, and subgroup (SWD, ED, etc.)
• TDOE reviews the LEA consolidated application and budget for ESEA and IDEA
• TDOE reviews the LEA strategic plan
• TDOE requests that the LEA upload specific items

Step 3. TDOE selects schools based upon school assessment results and other factors
• At least two schools are selected for a two-hour on-site visit
• Additional sites, such as a preschool, non-public school, or program, may also be selected
Monitoring Process Overview
Step 4. Phone call between TDOE and the LEA to explain the process and clarify expectations
• TDOE wants to see and hear about day-to-day procedures
• LEA and school interviews
• Note: Preparing boxes of information is not necessary
• The agenda is negotiated with the LEA

Step 5. On-site visit of approximately 3.5 days
• Entrance meeting (approximately one hour) with district leadership and other relevant staff to review the district strategic plan, initiatives, best practices, challenges, etc.
• Meetings with the ESEA Director, IDEA Supervisor, and program staff
• School visit schedules
• Time slots (at least two) for TDOE to write comments and the final report
• Review and modification of the final report with the ESEA Director and IDEA Supervisor
• Exit conference with LEA leadership
Monitoring Process Overview
Step 6. Written comments, Improvement Plan, Compliance Action Plan, and LEA letter
• Prior to the exit conference, the results-based monitoring comments are drafted
• The Improvement Plans and Compliance Action Plans are finalized in the exit conference (suggestions from the LEA are incorporated into the final document)
• A letter indicating Closed or Incomplete status is sent to the LEA within two weeks
Monitoring Process Overview
Documentation required for the visit is uploaded electronically.
Many documents are reviewed on-site, but no copies of documentation are expected.

The process relies heavily on interviews with LEA staff and listening to their procedures and challenges within their districts.

School site visits are conducted by meeting with the principal and school leadership, then walking through classrooms and parts of the building.
TDOE staff writes up all comments, improvement plans and compliance action plans while in the district.
An on-site exit conference reviews the completed written monitoring instrument with LEA leadership.
Major Sections of the Monitoring Tool
Quality Leadership
Effective Teachers
Instructional Practices
Climate and Culture
Parent and Community Involvement
Appendices
District Selection: Risk Analysis
District Selection: Risk Analysis
Designed to target TDOE support where it is most needed.

IDEA & ESEA monitoring selection is based on a risk analysis:
• IDEA & ESEA indicators are weighted more heavily for program monitoring
• Fiscal indicators are weighted more heavily for fiscal monitoring

TDOE may add on-site reviews on an as-needed basis.

Informational webinars will be scheduled for LEAs selected for monitoring.
District Selection: Risk Analysis
An annual risk analysis identifies which LEAs are perceived to be at-risk based on various indicators.
The risk analysis includes programmatic, fiscal, administrative, and achievement components of both IDEA and ESEA.

The risk indicators are weighted differently for program monitoring and for fiscal monitoring.
There is often considerable overlap between program and fiscal risk, so many LEAs will receive both program and fiscal monitoring.
Two LEAs, not identified through the risk analysis, are randomly selected for results-based and fiscal monitoring.
District Selection: Risk Analysis Indicators
The following categories are included in the risk analysis:

| Category | Indicators | Fiscal Monitoring Weight | Program Monitoring Weight |
| --- | --- | --- | --- |
| FISCAL | 18 | 40% | 15% |
| PERSONNEL | 3 | 15% | 15% |
| MONITORING & AUDIT | 6 | 15% | 15% |
| REPORTING DEADLINES | 8 | 10% | 10% |
| COMPLAINTS & HEARINGS | 3 | 5% | 5% |
| STUDENT RESULTS | 18 | 15% | 40% |
| ADDITIONAL RISK CONCERNS | 3 | not weighted (actual points added) | not weighted (actual points added) |
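As an illustrative sketch only (not TDOE's actual implementation), the weighting scheme above could be applied as follows. The function name, the normalization by maximum category points, and the 0-100 scale are assumptions for the example; the category weights are taken from the slide.

```python
# Illustrative sketch of the risk-scoring scheme described above.
# Category weights come from the slide; everything else (function name,
# normalization, 0-100 scale) is a hypothetical reading, not TDOE's code.

# Weight of each category toward the fiscal and program risk scores.
WEIGHTS = {
    #  category:              (fiscal, program)
    "FISCAL":                 (0.40, 0.15),
    "PERSONNEL":              (0.15, 0.15),
    "MONITORING & AUDIT":     (0.15, 0.15),
    "REPORTING DEADLINES":    (0.10, 0.10),
    "COMPLAINTS & HEARINGS":  (0.05, 0.05),
    "STUDENT RESULTS":        (0.15, 0.40),
}

def risk_scores(category_points, max_points, additional_risk_points=0):
    """Return (fiscal_risk, program_risk) on a 0-100 scale for one LEA.

    category_points: raw indicator points earned per category
    max_points: maximum possible points per category (used to normalize)
    additional_risk_points: ADDITIONAL RISK CONCERNS are not weighted;
        their actual points are added directly to both scores.
    """
    fiscal = program = 0.0
    for cat, (w_fiscal, w_program) in WEIGHTS.items():
        share = category_points.get(cat, 0) / max_points[cat]
        fiscal += 100 * w_fiscal * share
        program += 100 * w_program * share
    return (fiscal + additional_risk_points,
            program + additional_risk_points)
```

Because both weight columns sum to 100%, an LEA hitting every indicator in every weighted category would score 100 on both scales before any additional-risk points are added.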
District Selection: Risk Analysis Indicators
FISCAL
• Indicator 1: ESEA Title I Allocation > $500,000 {1 point}
• Indicator 2: ESEA Title I Allocation > $1,500,000 {1 point}
• Indicator 3: ESEA Title I Allocation > $5,000,000 {1 point}
• Indicator 4: ESEA Discretionary Grants FY14 {1 point per}
• Indicator 5: IDEA Part B Allocation > $500,000 {1 point}
• Indicator 6: IDEA Part B Allocation > $1,500,000 {1 point}
• Indicator 7: IDEA Part B Allocation > $3,000,000 {1 point}
• Indicator 8: IDEA Preschool Allocation > $75,000 {1 point}
• Indicator 9: IDEA Part B Discretionary Grants FY14 {1 point per}
• Indicator 10: IDEA Preschool Discretionary Grants FY14 {1 point per}
• Indicator 11: Title II-A Funds used for Class Size Reduction FY14 {1 point}
District Selection: Risk Analysis Indicators
FISCAL
• Indicator 12: ESEA Maintenance of Effort (MOE) Non-compliance FY13 {1 point each}
• Indicator 13: IDEA Maintenance of Effort (MOE) Non-compliance FY13 {1 point each}
• Indicator 14: ESEA & IDEA & CTE Potential Drop-dead Funds Left to Draw > $0 FY13 (as of Aug. 1, 2014) {1 point each}
• Indicator 15: Title I Potential Carryover = or > 15% FY14 (as of Aug. 1, 2014) {1 point}
• Indicator 16: IDEA Part B Potential Excess Carryover = or > 50% FY14 (as of Aug. 1, 2014) {1 point per grant}
• Indicator 17: IDEA Preschool Potential Excess Carryover = or > 50% FY14 (as of Aug. 1, 2014) {1 point per grant}
• Indicator 18: Bookkeeper < 3 yrs exp w/ Federal Programs FY14 {1 point}
District Selection: Risk Analysis Indicators
PERSONNEL
• Indicator 19: ESEA Director < 3 yrs exp FY14 {1 point}
• Indicator 20: IDEA Director < 3 yrs exp FY14 {1 point}
• Indicator 21: Director of Schools < 3 yrs exp FY14 (NOT USED for FY14) {1 point}

MONITORING & AUDIT
• Indicator 22: ESEA Monitoring CAP 2013-14 {1 point per item}
• Indicator 23: IDEA Monitoring CAP 2013-14 {1 point per item}
• Indicator 24: Fiscal Monitoring CAP 2013-14 {1 point per item}
• Indicator 25: CAP Items Not Addressed in a Timely Manner for 2012-13 {1 point per item}
• Indicator 26: TDOE Single Audit (A-133) Findings 2013-14 {1 point per item}
• Indicator 27: US Ed Monitoring Findings 2013-14 {1 point per item}
District Selection: Risk Analysis Indicators
REPORTING DEADLINES
• Indicator 28: Missed Deadline ePlan Consolidated Funding App & Projected Budget FY15 {1 point}
• Indicator 29: Missed Deadline ePlan Final Budget FY14 {1 point}
• Indicator 30: Missed Deadline Title I Comparability and/or Not Compliant 2013-14 {1 point}
• Indicator 31: Missed Deadline Final Expenditure Report (NOT USED for FY14) {1 point}
• Indicator 32: Missed Deadline IDEA Dec. 1 Census Counts 2013-14 {1 point}
• Indicator 33: Missed Deadline IDEA EOY Statewide Frequency 2013-14 {1 point}
• Indicator 34: Missed Deadline IDEA SSEER Report (NOT USED for FY14) {1 point}
• Indicator 35: Missed Deadline IDEA Excess Cost Report (NOT USED for FY14) {1 point}
District Selection: Risk Analysis Indicators
COMPLAINTS & HEARINGS
• Indicator 36: ESEA Complaints 2013-14 {1 point per}
• Indicator 37: IDEA Complaints 2013-14 {1 point per}
• Indicator 38: IDEA Due Process Hearings 2013-14 {1 point per}

STUDENT RESULTS
• Indicator 39: LEA In Need of Improvement (INI) 2013-14 {1 point}
• Indicator 40: LEA In Need of Subgroup Improvement (INSI) 2013-14 {1 point}
• Indicator 41: LEA Exemplary 2013-14 {credit 1 point}
• Indicator 42: LEA Reward Schools 2014 {credit 1 point per}
• Indicator 43: LEA Priority Schools 2015 {1 point per}
• Indicator 44: LEA Focus Schools 2015 {1 point per}
• Indicator 45: LEA 3-8 Math All Students % Below Basic is > State % 2013-14 {1 point}
• Indicator 46: LEA 3-8 Reading All Students % Below Basic is > State % 2013-14 {1 point}
District Selection: Risk Analysis Indicators
STUDENT RESULTS
• Indicator 47: LEA Algebra I All Students % Below Basic is > State % 2013-14 {1 point}
• Indicator 48: LEA English II All Students % Below Basic is > State % 2013-14 {1 point}
• Indicator 49: LEA 3-8 Math Students w Disabilities % Below Basic is > State % 2013-14 {1 point}
• Indicator 50: LEA 3-8 Reading Students w Disabilities % Below Basic is > State % 2013-14 {1 point}
• Indicator 51: LEA Algebra I Students w Disabilities % Below Basic is > State % 2013-14 {1 point}
• Indicator 52: LEA English II Students w Disabilities % Below Basic is > State % 2013-14 {1 point}
• Indicator 53: LEA has Suspension Rates > State Rate 2013-14 (NOT USED for FY14) {1 point per subgroup}
District Selection: Risk Analysis Indicators
STUDENT RESULTS
• Indicator 54: LEA has Students w Disabilities Grad Rate < State Rate 2011-12 {1 point per subgroup}
• Indicator 55: LEA IDEA Least Restrictive Environment (LRE) (% SWD in GenEd Setting = or > 80% of Day) 2013-14 {1 point}
• Indicator 56: LEA IDEA Disproportionate Representation of SWD by race/ethnicity 2012-13 {1 point}

ADDITIONAL RISK CONCERNS
• Indicator 57: CPM Office - TDOE Concerns of Additional Risk {5 points}
• Indicator 58: IDEA Program - TDOE Concerns of Additional Risk {5 points}
• Indicator 59: Fiscal - TDOE Concerns of Additional Risk {5 points}
2014-15 IDEA & ESEA Results-based & Joint Fiscal Monitoring
TDOE Consolidated Planning & Monitoring

Districts selected: ASD, Campbell, Davidson, DeKalb, Fayette, Hamilton, Hawkins, Knox, Johnson, Lake, Lauderdale, Monroe, Montgomery, Murfreesboro, Roane, Shelby, Sullivan, Sumner, TSB, Wayne
Pilot Survey Results
Participating LEA Demographics
Six districts and twelve schools volunteered for the monitoring pilot:
| Rural | Student Enrollment | % Economically Disadvantaged | % Students w Disabilities | % Black / African American | % Hispanic / Latino | % White |
| --- | --- | --- | --- | --- | --- | --- |
| Yes | 1,205 | 86.6% | 14.9% | 73.3% | 5.0% | 21.2% |
| Yes | 3,179 | 64.0% | 13.8% | 6.4% | 2.8% | 90.4% |
| No | 3,971 | 43.6% | 13.3% | 7.6% | 1.5% | 90.5% |
| Yes | 7,341 | 67.9% | 17.2% | 1.3% | 2.9% | 95.3% |
| No | 7,439 | 66.3% | 15.6% | 1.6% | 2.1% | 95.8% |
| No | 11,084 | 54.8% | 14.6% | 3.8% | 10.2% | 83.9% |
Relevant Survey Results
FRAUD, WASTE or ABUSE
Citizens and agencies are encouraged to report fraud, waste or abuse in State and Local government.
NOTICE: This agency is a recipient of taxpayer funding. If you observe an agency director or employee engaging in any activity which you consider to be illegal, improper, or wasteful, please call the state Comptroller's toll-free hotline:

1-800-232-5454
Notifications can also be submitted electronically at:
http://www.comptroller.tn.gov/hotline