
Assabet Valley Collaborative – DDM – Teacher Teams Use of Data

District-Determined Measure Example Teacher Teams’ Use of Data to Support Student Learning

Content Area and Grade Range: Administrators / School Leaders, K-12 DDM Summary: This DDM assesses administrators’ indirect impact on student learning by measuring growth in teachers’ use of student data during their common planning meetings to identify the varied needs of their students and to inform their interventions and instruction.

Developed by: James Pignataro, High School Principal (Grafton Public Schools), Timothy Fauth, Middle School Principal (Grafton Public Schools), and Donna Dankner, Elementary Principal (Maynard Public Schools) Date updated: June 2015

Table of Contents

Introduction
Instrument
Administration Protocol
Scoring Guide
Measuring Growth and Setting Parameters
Piloting
Assessment Blueprint


Introduction

Description of the Measure & Rationale

The purpose of this indirect measure is to provide timely feedback to school leaders about their impact on school-wide data use for instructional purposes during teachers’ common planning time meetings. The central measure is a 14-item Team Meeting Data Use Survey to be completed by each common planning team (or, if time is limited, by a designated member of the team) at the close of two meetings per month throughout the school year. The survey requires less than five minutes to complete and solicits team perceptions of the extent to which their team engaged in data analysis to identify students’ strengths and needs and to determine how best to advance students’ learning.

School administrators are charged with providing teachers with common planning time and supports for using this time effectively, especially in relation to data inquiry cycles and use of assessment and other student learning data. This DDM provides administrators with frequent feedback from common planning teams about the types of data they are discussing, the depth of their data use, the conclusions and actions that occur as a result of their data use, as well as the supports the team feels would help to advance their data use.

Intended use of the measure

To show growth over time, administrators are expected to directly observe teams using data and to reflect on their own priorities for collaborative data use in the school, as a means of more fully understanding the feedback provided through this measure. Administrators will need to consider the ways and the extent to which they provide support to teams, through direct feedback and the provision of tools, resources, and professional learning. In addition, administrators will need to consider how their current practices may even undermine teams’ progress. For example, results of this measure may indicate that teams spend their meetings discussing school- or district-assigned tasks or specific matters that teams propose are important to their work, but do not involve data use. In order to demonstrate growth on this measure and improve effective data use across the school’s teams, administrators may need to revisit their own priorities and communications with teams, as well as the allocation of time for teachers to accomplish tasks that are either required of them or deemed important to their work.

Teachers’ teams across a school – and certainly across schools in a district – are often quite varied in their ability to use a range of data effectively to support and advance students’ learning. Therefore, this DDM is designed as a growth measure to support continuous improvement over time, rather than as a target measure where the goal is to maintain a previously achieved performance level. This DDM may be modified to serve as a target measure when school-based teams begin to demonstrate high levels of data use to inform decisions about programs, student interventions, and classroom instruction.

Context & Modifications for Other Contexts

This DDM was designed for use in small- and mid-sized districts, although the tools and processes might be used in larger districts as well. This DDM relies on honest and meaningful survey feedback and a shared commitment to developing effective data use in teams. It is thus recommended that any district adopting this DDM consider the current level of trust between administrators and teachers, as well as between central office and school leaders. Administrators should also consider the frequency, structure, and priorities already established for teachers’ common planning time meetings and how this DDM may need to be modified to align with local priorities. The Team Meeting Data Use Survey should be modified to reflect local data sources, and the administration protocol should be modified to reflect the frequency of local team meetings. For example, the current protocol indicates that teacher teams should complete the survey at two meetings per month; in another district, however, teams may meet more or less frequently.

Description of the Development Process

This DDM was developed during October 2014 – June 2015 under a DDM Leadership Grant (FC-217) awarded to the Assabet Valley Collaborative by the Massachusetts Department of Elementary and Secondary Education (ESE). In partnership with Teachers21, Risk Eraser, and with primary support from the Learning Innovations Program at WestEd (Woburn, MA), the Collaborative convened six building administrators – three principals and three assistant principals – representing schools spanning grades pre-K through 12 from five participating districts. Participants worked in smaller teams of one to three to strengthen and apply their assessment literacy toward the development of several direct and indirect measures of student growth.

Participants grew their expertise over six sessions by engaging in a guided DDM development process framed by a series of questions: (1) Why are we doing this work together? (2) What is most important to measure? (3) How shall we measure what’s most important? (4) How can we strengthen and refine our measure? (5) How can we prepare our measure for broader use? (6) How will we measure growth? Throughout, participants engaged in large group discussion and critique, as well as team collaboration and problem solving.

This measure has not yet been piloted. Districts in and beyond the Assabet Valley Collaborative may decide if they would like to pilot and/or modify the attached tools and processes for use as a DDM in their district. Because this is a newly developed measure, it is important that districts engage administrators in examining results from the first year of implementation. It is also important to identify, over time, any revisions or refinements that may further strengthen the quality of the assessment, scoring tools, administration protocol, and/or growth parameters to suit the circumstances and realities of each district’s local context.

Content Alignment & Rationale

This measure is aligned to the following Core Objective (CO)¹:

Educators regularly analyze a range of student assessment data to identify students’ strengths and needs and to determine interventions and adjustments to instructional practice that will ensure all students progress in their learning.

¹ A Core Objective is a statement that describes core, essential, or high-priority content (knowledge, skills, or abilities) that was identified by those who designed the assessment and was drawn, synthesized, or composed from a larger set of curriculum or professional standards.


This Core Objective (CO) was identified as the basis for this DDM because of the central role that data use now plays in our public schools. Rather than using only standardized academic assessments as the chief means of measuring students’ growth or regression, educators are now expected to use a broader range of data – academic, social/emotional, and behavioral assessment data – to inform instructional, programmatic, and intervention decisions. This DDM measures the extent to which teacher or department teams use data that they already have, or may create, to identify and address students’ learning needs and subsequently make adjustments to instruction and student interventions.

The standards that informed this CO are listed in the Content Chart below and center on four aspects of leaders’ support for teachers’ data use:

1. The extent to which educators use a range of student data at meetings
2. The extent to which educators engage in the analysis of student data at meetings
3. The extent to which educators use data at meetings to identify student strengths and needs
4. The extent to which educators use data at meetings to determine appropriate interventions and adjustments to instructional practice


Content Chart

CO: Educators use a range of student data at meetings
Leadership Standard I-C: Assessment
• I-C-1: Supports educator teams to use a variety of formal and informal methods and assessments, including common interim assessments that are aligned across grade levels and subject areas.
Evidence: Team Meeting Data Use Survey Items #5 and #6
Weight: 25% of the measure

CO: Educators engage in an analysis of student data at meetings
Leadership Standard I-C: Assessment
• I-C-2: Provides planning time and effective support for teams to review assessment data and identify appropriate interventions and adjustments to practice. Monitors educators’ efforts and successes in this area.
Evidence: Team Meeting Data Use Survey Items #4, #7, and #8
Weight: 25% of the measure

CO: Educators use data at meetings to identify students’ strengths and needs
Leadership Standard I-E: Data-Informed Decision Making
• I-E-3: Uses multiple data sources to evaluate educator and school performance. Provides educator teams with disaggregated assessment data and assists faculty in identifying students who need additional support.
Evidence: Team Meeting Data Use Survey Items #9 and #10
Weight: 25% of the measure

CO: Educators use data at meetings to determine appropriate interventions and adjustments to instructional practice
Leadership Standard I-C: Assessment
• I-C-2: Provides planning time and effective support for teams to review assessment data and identify appropriate interventions and adjustments to practice. Monitors educators’ efforts and successes in this area.
Evidence: Team Meeting Data Use Survey Items #9, #10, and #11
Weight: 25% of the measure

Total weight: 100%

Instrument

This measure consists of a Team Meeting Data Use Survey that serves to gather meeting participants’ perceptions about their data use during two common planning meetings per month throughout the school year. The survey requires less than five minutes to complete and consists of 14 items. These include four team meeting context questions – e.g., date, grade level, department subject, and name of the person completing the survey – and ten core questions about team data use, such as whether the team used data during the meeting and, if so, what types of data, followed by several questions about the purpose of the data use and the actions the team decided to take based on its data discussion. This survey tool, designed as a Google Form to facilitate ongoing compilation of results, provides the only data that is formally tracked and evaluated for this DDM. Although teams are asked to reflect on their data use by completing the survey as a group during the final minutes of their meetings, they also have the option, if needed due to lack of time, to ask the designated team leader, department head, or a designated teacher on the team to complete the survey within 24 hours of the meeting on the team’s behalf. Results of the survey are evaluated quarterly using the School-Wide Team Data Use Rubric. When survey results are downloaded as an Excel file, administrators may also disaggregate the data to strengthen their understanding of team-level patterns and progress over time, thereby informing more targeted administrative supports.
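The team-level disaggregation described above can be sketched with a short script. This is a minimal illustration only, not part of the DDM: the field names (`team`, `used_data`) are assumptions about how a district might label columns in its downloaded survey export, and a real workflow would read the rows from the Excel/CSV file exported from the Google Form.

```python
from collections import defaultdict

def disaggregate_by_team(rows):
    """Group survey responses by team and report, for each team, the number
    of surveys completed and the share of meetings where data was discussed.
    `rows` is a list of dicts with assumed keys 'team' and 'used_data'."""
    by_team = defaultdict(list)
    for row in rows:
        by_team[row["team"]].append(row)
    summary = {}
    for team, responses in by_team.items():
        used = sum(1 for r in responses if r["used_data"])
        summary[team] = {
            "responses": len(responses),
            "pct_data_use": round(100 * used / len(responses)),
        }
    return summary

# Made-up example responses from two teams
rows = [
    {"team": "Grade 6", "used_data": True},
    {"team": "Grade 6", "used_data": False},
    {"team": "Grade 7", "used_data": True},
    {"team": "Grade 7", "used_data": True},
]
print(disaggregate_by_team(rows))
```

A summary like this gives the administrator a quick view of which teams are reporting regular data use and which may need more targeted supports.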

NOTE that the school administrator will need to create four copies of the online survey², each titled to represent the given period of the year. At the start of each new period (the beginning of November, February, and April), the administrator sends the new survey link to team leaders with a note to delete the previous link. This starts a new data collection cycle with the response count at zero; otherwise, using the same Google Form will simply accumulate survey responses throughout the year, making it significantly more difficult to complete the School-Wide Data Use Rubric at the end of each period and to compare progress over time.

An optional Teacher Post-Observation Debrief Form is also provided in this DDM. Although not formally scored as part of the DDM, this tool aligns to the questions in the Team Meeting Data Use Survey. It is also designed to assist administrators in aligning their classroom observation debrief discussions with their efforts to support teachers’ effective data use during their team meetings.

Administration Protocol

The Administration Protocol addresses how the measure is intended to be implemented to best support a common conversation about growth, as well as to strengthen the consistency of administration across schools in a district.

When is the measure administered?

This is a repeated measure where teachers’ common planning teams, presumed to meet at least twice per month, are asked to complete an online version of the Team Data Use Survey at the close of two meetings per month (and no later than 24 hours after any given team meeting) from the start of school to the end of May.

The request to use the final five minutes of team meetings for this task is meant to convey respect for teachers’ time and to avoid asking teachers to fit another task into their busy work days. It is recommended that teams establish a predictable routine, such as completing the survey together at the close of the first two meetings of every month. This will help to prevent teams from accidentally completing the form more or less frequently, which could skew results. (See the section on deviations to the protocol, below.)

² If using Google Forms, open the Team Meeting Data Use Survey Google Form and select “Make a Copy” from the “File” drop-down menu.

The administrator analyzes the survey results four times per year: (1) end of October, (2) end of January, (3) end of March, and (4) end of May. Districts that have a different team meeting schedule may need to modify the frequency with which the survey is completed and/or compiled and analyzed. Asking teams to complete the survey at two meetings per month builds in regular team reflection on data use, while quarterly analysis by the administrator provides ongoing information that is likely to prompt important adjustments to various team supports. This approach supports more effective and widespread data use as the school year progresses.

How is the measure administered?

Introducing the Team Data Use Survey to Staff

Prior to the first survey use, the administrator should facilitate collaborative meetings with teachers to discuss the focus of this DDM. In particular, it is important to clearly communicate that the survey is a DDM for the administrator (not the teachers). Teachers will be familiar with DDMs, but may not realize that the administrator also has DDMs and is expected to show growth in leadership skills. Also, the administrator should make clear that the DDM is aimed at school-wide progress with team data use. This may mean the administrator sometimes needs to disaggregate the data to better understand where or how to provide supports to teams most effectively, but the DDM results are interpreted based on school-wide improvement trends. The administrator should work closely with team leaders and department heads to explain and clarify the process, tools, and expectations prior to the initial implementation of the DDM.

During September faculty meetings, the administrator should facilitate a comprehensive initial discussion and/or professional learning on the role of data-based inquiry in identifying student learning strengths and needs and, subsequently, in determining appropriate interventions and instructional adjustments to ensure all students progress in their learning. The administrator may want to engage teachers in a text-based protocol to read and reflect on an article about collaborative data use for student learning, ask teachers to work together to chart how their teamwork might look different with more effective data use, or ask them to generate a list of the additional knowledge, skills, or resources they might need to use data more effectively. In particular, the administrator will need to introduce the priorities for team data use with particular attention to previous school or district expectations for the use of student learning data and the use of common planning time. Teachers may express concerns or frustrations if they receive mixed messages from the district, school, and/or union about the intended use of their common planning time.

During the September faculty meetings, the administrator must also introduce teachers to the Team Data Use Survey and state explicitly that the administrator’s interest is in supporting teams to use data effectively to achieve school-wide improvement. Administrators should point out the following:

• Item #4 asks whether data was discussed. Although not all team meetings will necessarily involve data use, the school is working to have more evidence-based team meeting discussions about how best to advance students’ learning.

• Items #5 and #6 relate to the range of data from which teams may want to draw when investigating student learning needs.

• Items #7 and #8 relate to the varied ways that teams may discuss data, with some discussions achieving greater depth and analysis than others. In particular, teams are encouraged to drill down to the small-group, sub-group, and individual student levels in their data discussions, as this is more likely to inform classroom instructional decisions than analyses that stay at the department, grade, or school level.

• Item #9 relates to the purpose of the team’s data use, with the aim of focusing on students’ strengths, needs, and performance gaps, and identifying supports, interventions, and instructional adjustments that will ensure the progress of all students.

• Items #10 and #11 – both open-response items – ask teams to articulate very specifically what they are learning about students and their instructional supports as a result of their data-based discussions and how they are going to apply what they have learned toward next steps for students.

The administrator should acknowledge that the survey measures only the team’s perceptions in these areas. Administrators should also note their intentions to attend and observe team meetings across the school to better understand the survey feedback, to learn more about how teachers actually work together in their teams, and to be able to identify and facilitate sharing of helpful team practices across the school.

Throughout the year, the administrator must ensure and protect opportunities for regularly scheduled common planning time and provide professional learning opportunities to support data use for instruction and student learning. In between team meetings, continued conversation should occur between the administrator and teachers to support teams’ analysis of student data. This may take place during post-observation debrief discussions between administrators and individual teachers, when administrators attend team meetings to observe team data use in action, or by teacher or team request. Administrators should use information gathered from these interactions to inform professional development planning.

Throughout these interactions, the administrator must engender teachers’ trust that the results of the Team Meeting Data Use Survey will never be used in any punitive way, and that the administrator is trying to use the results to improve his or her own practice to support teams appropriately. Teachers may initially be concerned about what happens if their meetings do not address the stated priorities or may express concerns that other factors, such as school or district agendas, field trips, parent issues, or testing schedules, determine the extent to which they can focus their team time on data use for students and instruction. The administrator should be ready to think with teachers about how they, collectively, might shift current practices, such as the allocation of time and tasks, to achieve progress. The administrator must then follow through on the results of these conversations, both to earn teachers’ trust and to demonstrate growth with this DDM.


Email Communication

After the initial staff meeting discussions, the administrator must send an email to staff that clearly and succinctly reiterates the purpose of the Team Meeting Data Use Survey, as noted above, and requests that teams collectively complete the survey during the final five minutes of at least two team meetings per month. The administrator should also encourage teachers to let him or her know if they have any questions or concerns about this request.

This email provides the link to the Team Data Use Survey, requests that team leaders bookmark the survey site in their Internet browsers for use throughout the year, and includes the following directions:

1. During the last five minutes of two team meetings per month, please engage your full team in discussing and completing this brief 14-item survey. The purpose of the survey is to provide me with information about the extent to which teams across the school are engaging in data-based discussions about instruction and student learning and then determining how best to take action to ensure that all students are progressing in their learning.

2. Periodically, I will analyze all survey responses to identify school-wide patterns and trends over time with the aim of providing appropriate and helpful supports and professional learning opportunities to ensure all teams are getting the most from their team meetings.

3. The aim is to strengthen our ability to use data to inform decisions about instruction and student learning across the school. This means that data use in our common planning time meetings leads to the identification of actions the team will take to strengthen instruction or provide interventions or supports for student learning.

Ongoing Communications

Administrators should maintain the visibility of these common planning team priorities throughout the year by providing feedback and follow-up to the staff about progress, sharing useful tools and strategies from teams across the school, and/or providing professional learning opportunities and new resources as part of ongoing staff meetings or development time. This improvement orientation also reinforces the non-punitive nature of this work and the administrator’s interest in supporting continuous improvement.

How are deviations to protocols addressed?

The administrator should periodically review the results of the Team Meeting Data Use Survey to ensure that all teams have established a reliable routine for completing the survey twice per month. Email reminders and/or one-to-one conversations with team leaders may be necessary to ensure response rates remain high throughout the school year.

If online administration of the Team Data Use Survey is not an option, the administrator will need to arrange for support staff to manage data entry based on return of paper versions of the survey. Districts will need to modify the frequency of survey administration and data analysis of results according to team meeting schedules and expectations in their local context.


It is possible that teams may complete the Team Meeting Data Use Survey more than two times per month, either because they forgot whether they had already completed it twice that month, or because they want to provide more frequent updates to the school administrator. If this occurs, results may be skewed by the over- or under-representation of particular teams. For example, if one team frequently lacks a focus on data use, but completes the survey after every meeting – perhaps four times a month instead of two – this might skew school-wide results toward suggesting that teams are not making progress.

To avoid this situation, the administrator is encouraged to download survey results as an Excel file and sort the data by team to gain a view of whether teams are, in fact, completing the Team Meeting Data Use Survey twice per month. If not, the administrator should connect with particular common planning teams to learn why surveys are not being completed as requested, and to ensure that staff understand that the results are meant to help the administrator improve supports and guidance to teams and are not an accountability measure for teams or an assessment of individuals on the teams.
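The completion check described above can be sketched as a simple count of responses per team per month. This is an illustrative sketch, not part of the DDM; the (team, month) pairs below are made up, and in practice they would come from the team and date fields of the downloaded survey export.

```python
from collections import Counter

def completion_check(responses, expected_per_month=2):
    """Count surveys per (team, month) and flag any team that completed the
    survey more or fewer times than expected in a given month.
    Each response is a (team, 'YYYY-MM') pair; the format is illustrative."""
    counts = Counter(responses)
    flags = {}
    for (team, month), n in counts.items():
        if n != expected_per_month:
            flags[(team, month)] = n
    return flags

# Made-up responses for one month
responses = [
    ("Math Dept", "2015-10"), ("Math Dept", "2015-10"),       # on target
    ("ELA Dept", "2015-10"),                                  # under-completed
    ("Science Dept", "2015-10"), ("Science Dept", "2015-10"),
    ("Science Dept", "2015-10"),                              # over-completed
]
print(completion_check(responses))
```

Teams flagged here are the ones worth a reminder email or a one-to-one conversation with the team leader, as described above.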

To increase teachers’ commitment to these data-use priorities, the administrator may want to involve teachers in further defining these criteria or survey items. It is important, however, to be aware that changes may require revisions to other aspects of the DDM, such as the Core Objective, survey items, and Assessment Blueprint. Teacher commitment and involvement are strongly encouraged, but administrators must be aware of the subsequent changes that would need to be made to the broader DDM and have the knowledge, skills, and/or support to make these changes.

How will special accommodations be provided?

No special accommodations are needed for this indirect measure, particularly since teacher teams are encouraged to collectively complete the Team Meeting Data Use Survey, so support is provided within the team. However, administrators should work with teachers to understand the extent to which data are being used in teams to serve the full range of students in the school. In addition, administrators should provide supports and professional development to help teams understand the kinds of modifications and accommodations that may be necessary to ensure that all students are able to demonstrate the full extent of their learning through classroom assessments, thereby strengthening the quality of the data the teams will use to determine interventions and instructional adjustments.

Scoring Guide

To score the results of the Team Meeting Data Use Survey at the designated times (end of October, end of January, end of March, and end of May), the administrator downloads the survey results for that period (preferably as an Excel sheet) and analyzes results in relation to the criteria and performance levels described on the School-Wide Team Data Use Rubric. When using Google Forms, the percentages described in the rubric can be quickly collected from the provided Summary Data sheet.

Note first that several survey items are included in the survey only to provide background information, such as the date, the grade level or department, and the role of the person completing the form. Responses to these items are not evaluated in the School-Wide Team Data Use Rubric, but do allow the administrator to gauge whether all teams are completing surveys twice per month and whether any follow-up with team leaders is needed to maintain a high response rate. In addition, there are two survey items that are not evaluated, but are included to provide information to help the administrator determine appropriate team meeting supports. These items include:

• Item 12: If you did not analyze data in your team meeting today, which data (if any) might have been helpful to collect and/or review to inform your instructional planning and intervention decisions?

• Item 13: What would help you and your team use data more effectively to inform instructional decisions moving forward?

Also, note that the administrator’s analysis of these quarterly survey results occurs at the school level and does not involve disaggregating the data to the team level. Administrators may pursue team-level analysis to inform their team supports and planning for professional development throughout the year. Scoring the DDM, however, only requires analysis of school-wide patterns and trends over time.

The School-Wide Data Use Rubric is used to evaluate survey results. The administrator reads across one row of the rubric at a time to determine the level of performance demonstrated across the school for the given period of the year. Several rows in the rubric require summary statistics from the Team Meeting Data Use Survey. If using Google Forms, the administrator enters edit mode of the Team Meeting Data Use Survey Google Form and goes to the “Responses” menu option – Google displays the number of recorded responses in parentheses – then selects “Summary” from the “Responses” drop-down menu.

These “Summary” statistics provide the school-wide percentages needed to complete the School-Wide Data Use Rubric.
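For administrators who export the raw survey responses (for example, as a spreadsheet) rather than relying on the Google Forms summary view, the same school-wide percentages can be computed directly. The sketch below is illustrative only: the field name "used_data" is a hypothetical stand-in for however the Item 4 (did the team analyze data today?) column is labeled in an actual export.

```python
# Sketch: computing a school-wide percentage from exported survey
# responses. Each response is modeled as a dict; the "used_data"
# field name is a hypothetical stand-in for the real export column.

def percent_yes(responses, field="used_data"):
    """Return the percentage of responses answering 'Yes' for a field."""
    if not responses:
        return 0.0
    yes = sum(1 for r in responses if r.get(field) == "Yes")
    return 100.0 * yes / len(responses)

# Example: 3 of 4 team meeting surveys reported analyzing data.
sample = [
    {"used_data": "Yes"},
    {"used_data": "Yes"},
    {"used_data": "No"},
    {"used_data": "Yes"},
]
print(percent_yes(sample))  # 75.0
```

The resulting percentage is what the administrator compares against the rubric's bands (for example, "data use occurring in 51-70% of all meetings").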

In addition, the administrator must also carefully review and analyze the school-wide results for the two open-response items to determine the appropriate level of performance on the rubric. If teams' responses are not clear or detailed enough to provide useful information, the administrator should give the team feedback at its next meeting.

To illustrate, Item 10 (“Based on your discussion about data, what did your team learn?”) and Item 11 (“What are the team’s next steps as a result of the data discussion today?”) will likely elicit quite varied narrative responses from team to team and across survey periods. According to the School-Wide Data Use Rubric, responses to Item 10 are evaluated first in relation to the strand Data Use for Identifying Student Needs, to determine the extent to which teams specified learning about student needs or gaps in learning. The administrator then evaluates the responses to Item 10 again in relation to the next row in the rubric, Determination of Interventions and Instructional Adjustments. In this case, the same open responses for Item 10 are coded or highlighted again, this time for the extent to which teams committed to actions related to intervention strategies or instructional adjustments. Note: this is the only case in which a single open-response item is evaluated twice against different criteria.

Although the open-response items are more time-consuming to analyze, these qualitative responses will likely provide more nuanced information about team data use, and will therefore better inform the administrator’s subsequent discussions with teams and allocation of team supports and resources. Each performance level on the rubric is worth a set number of points, ranging from 0 points (Emerging Team Data Use for Student Learning) to 25 points (Extending Team Data Use for Student Learning), with several levels in between. Note that the rubric also includes three levels without descriptors, shown as narrow columns marked with an asterisk. These columns are used when results for a single row cross two contiguous levels on the rubric. For example, evidence for Data Use and Analysis might cross both the Developing and Solidifying performance levels; the administrator then assigns the points for the intermediate level between the two, in this case 12 points.
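The level-to-points scheme, including the asterisked intermediate columns, can be captured as a small lookup. The following is an illustrative sketch; the function and dictionary names are invented for this example and are not part of the DDM materials.

```python
# Points for each labeled level on the School-Wide Data Use Rubric.
LEVEL_POINTS = {"Emerging": 0, "Developing": 8, "Solidifying": 16, "Extending": 25}

# Points for the asterisked intermediate columns between adjacent levels.
INTERMEDIATE_POINTS = {
    ("Emerging", "Developing"): 4,
    ("Developing", "Solidifying"): 12,
    ("Solidifying", "Extending"): 20,
}

def row_points(levels):
    """Score one rubric row. `levels` lists the level(s) the evidence
    supports: a single level scores directly; two contiguous levels
    score at the intermediate column between them."""
    if len(levels) == 1:
        return LEVEL_POINTS[levels[0]]
    if len(levels) == 2:
        key = tuple(sorted(levels, key=lambda lvl: LEVEL_POINTS[lvl]))
        return INTERMEDIATE_POINTS[key]
    raise ValueError("evidence should span one level or two contiguous levels")

# Evidence crossing Developing and Solidifying scores the 12-point column.
print(row_points(["Solidifying", "Developing"]))  # 12
```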

The administrator should maintain a tracking sheet, such as the following, to record results from the School-Wide Data Use Rubric for each period during the year. The administrator may also want to keep more detailed records that track results at the team level, or at more frequent intervals, to inform thinking about team supports, resources, and/or interventions over time, as well as the results of those supports.

SCORING CHART: School-Wide Responses

For each period, record the number of survey responses received (# Responses) and the rubric points earned for each section.

                                                             End of   End of   End of   End of   Points
Rubric Section (Survey Items)                                October  January  March    May      Gained
# Responses:
Range of Data Sources (Items 5 & 6)
Data Use & Analysis (Items 6 & 7)
Identification of Student Strengths & Needs (Items 9 & 10)
Determination of Interventions & Instructional
  Adjustments (Items 9 & 11)
Total Points (from 100 possible pts)


Going down the column for the current time period, the administrator should insert into each white cell the total rubric points for each of the four portions of the School-Wide Data Use Rubric. This tracking will help the administrator see in which areas teams may need the greatest support. An analysis of the open-ended response items (#10 and #11) should provide the administrator with additional information about how best to target supports, resources, and/or professional learning.

In the far-right column, administrators should calculate the gain in points for each rubric section across the four periods shown on the chart, in order to see particular areas of strength and weakness at year’s end. This is done by subtracting the initial points recorded at the end of October from the final points recorded at the end of May. The total points gained during the year are recorded in the lower-right cell of the chart.
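The points-gained arithmetic is simple enough to sketch in code. The example below is illustrative; the section names are shortened stand-ins for the tracking sheet's rows, and the values match the sample completed chart in this section (October totals of 0, 4, 8, and 4; May totals of 16, 12, 20, and 16).

```python
# Sketch: points gained per rubric section = end-of-May points minus
# end-of-October points; the overall gain is May total minus October total.

def points_gained(october, may):
    """Compute the gain for each rubric section and the overall total."""
    gains = {section: may[section] - october[section] for section in october}
    gains["Total"] = sum(may.values()) - sum(october.values())
    return gains

october = {"Range of Data Sources": 0, "Data Use & Analysis": 4,
           "Identification of Strengths & Needs": 8,
           "Determination of Interventions": 4}
may = {"Range of Data Sources": 16, "Data Use & Analysis": 12,
       "Identification of Strengths & Needs": 20,
       "Determination of Interventions": 16}

print(points_gained(october, may)["Total"])  # 48
```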

A sample completed chart for the year might look like this:

SCORING CHART: School-Wide Responses (sample)

                                                             End of   End of   End of   End of   Points
Rubric Section (Survey Items)                                October  January  March    May      Gained
Range of Data Sources (Items 5 & 6)                             0        8        8       16       16
Data Use & Analysis (Items 6 & 7)                               4        8       16       12        8
Identification of Student Strengths & Needs (Items 9 & 10)      8       12       16       20       12
Determination of Interventions & Instructional
  Adjustments (Items 9 & 11)                                    4       12        8       16       12
Total Points (from 100 possible pts)                           16       40       48       64       48

It is recommended that administrators collaborate with role-alike colleagues in the district to calibrate their scoring practices for both the quantitative and qualitative data collected via the Team Meeting Data Use Survey. Calibration strengthens the likelihood that scoring will be conducted consistently and reliably over time and across schools. A sample calibration protocol can be found at the Rhode Island Department of Education website: http://www.ride.ri.gov/Portals/0/Uploads/Documents/Teachers-and-Administrators-Excellent-Educators/Educator-Evaluation/Online-Modules/Calibration_Protocol_for_Scoring_Student_Work.pdf. This protocol can be modified for use with this DDM in a couple of ways. First, administrators can complete a calibration exercise by comparing their quantitative team data to discuss whether they have organized and calculated their results, and applied them to performance-level determinations on the rubric, in the same ways. Second, administrators can code or highlight several common sets of sample open-response data, then determine rubric performance levels for them, to gauge whether they are interpreting these qualitative data in the same ways. Calibration is particularly important for tuning administrators’ interpretation of qualitative data, as is required with Items #10 and #11 in this measure.

Measuring Growth and Setting Parameters

To determine growth in team data use over the course of the year, the administrator subtracts the total school-wide rubric score for the first period of the year (September through the end of October) from the total rubric score for the final period of the year (beginning of April through the end of May). For example, as shown in the sample scoring chart above, if a school-wide rubric score was 16 points at the end of October and 64 points at the end of May, this represents a gain of 48 points.

The following Growth Parameters describe estimated bands for low, moderate, and high growth in school-wide team data use. The parameters are based on systematic estimations of results for a hypothetical school imagined by the developers of this DDM. Returning to the example above, the sample school’s gain of 48 points during the year would be evaluated as moderate growth, according to the Growth Parameters shown below.

Low Growth       Moderate Growth    High Growth
< 30 pt gain     30-50 pt gain      > 50 pt gain
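These parameters translate directly into a threshold check. A minimal sketch, assuming the bands above (the function name is illustrative, not part of the DDM materials):

```python
def classify_growth(gain):
    """Map a school-wide rubric points gain onto the DDM's growth
    parameters: under 30 points is low, 30-50 is moderate, over 50 is high."""
    if gain < 30:
        return "Low Growth"
    if gain <= 50:
        return "Moderate Growth"
    return "High Growth"

# The sample school's gain of 48 points falls in the moderate band.
print(classify_growth(48))  # Moderate Growth
```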

Districts that aim to adopt or modify this DDM should revisit and refine these estimates based on a careful review of data collected during the first years of administration in their own schools, continuing until historical trends can be established to inform and refine these preliminary parameters.

Piloting

This DDM will be piloted with a subset of teacher teams in 2015-16. In addition to piloting the Team Meeting Data Use Survey, administrators will be piloting the administration protocol, assessing the appropriateness of the estimated growth parameters, and gauging the overall school climate for implementing this type of DDM. The developers acknowledge that staff trust is a necessary precondition for the success of this DDM, as is central office support for the priorities described in this measure.

Assessment Blueprint

The assessment blueprint is not a task to be completed as part of the DDM, but an elaboration of the content table included in the introduction. It serves two purposes: (1) it is a roadmap for the assessment development team to ensure balanced coverage of the most important content, and (2) it is a key for other potential users of the assessment, concisely indicating what content the assessment is designed to measure, whether the goal is growth or target attainment, and the difficulty of the items associated with each piece of content. (See pages 12 and 29 of Technical Guide A for more information.)


Team Meeting Data Use Survey
(Designed using Google Forms and utilizing branched questions)


School-Wide Team Data Use Rubric
Date (period): End of October / End of January / End of March / End of May

For each criterion (row), highlight the cell that most closely describes the demonstrated school-wide performance level.

Performance levels and point values:
• Emerging Team Data Use for Student Learning: 0 pts
• Intermediate*: 4 pts
• Developing Team Data Use for Student Learning: 8 pts
• Intermediate*: 12 pts
• Solidifying Team Data Use for Student Learning: 16 pts
• Intermediate*: 20 pts
• Extending Team Data Use for Student Learning: 25 pts

Criterion: Range of Data Use (Items #5, #6)
• Emerging: Overall, teams report consulting a small or limited set of data sources.
• Developing: Overall, teams report consulting some varied types of data sources.
• Solidifying: Overall, teams report consulting clearly varied types of data sources.
• Extending: Overall, teams report consulting clearly varied “academic” data, clearly varied “other” types of data, and additional data sources (in the “other” space).

Criterion: Data Use & Analysis (Items #4, #7, #8)
• Emerging: Item 4: Teams report data use occurring in less than 25% of all meetings reported for this period. Item 7: Teams report SHARING and distributing data for more than 40% of meetings. Item 8: Teams report analyzing data at the small-group or individual level for less than 20% of meetings.
• Developing: Item 4: Teams report data use occurring in 26-50% of all meetings reported for this period. Item 7: Teams report REVIEWING data (sharing and discussing data at a general or introductory level) for more than 40% of meetings, and/or teams report less than 40% of meetings in any response category. Item 8: Teams report analyzing data at the small-group or individual level for 20-33% of meetings.
• Solidifying: Item 4: Teams report data use occurring in 51-70% of all meetings reported for this period. Item 7: Teams report ANALYZING data (sharing and analyzing data by sorting, disaggregating, or comparing data), and/or ANALYZING and CONCLUDING (drawing clear conclusions from the analysis of data), for more than 40% of meetings. Item 8: Teams report analyzing data at the small-group or individual level for 34-50% of meetings.
• Extending: Item 4: Teams report data use occurring in more than 70% of all meetings reported for this period. Item 7: Teams report FOLLOWING UP with data (analyzing data to assess progress based on decisions made at prior meetings) for more than 40% of meetings. Item 8: Teams report analyzing data at the small-group or individual level for 51-100% of meetings.

* Intermediate levels marked with asterisks are for use when evidence within a single row crosses two contiguous performance levels. For example, evidence for Data Use and Analysis might cross both the Developing and Solidifying performance levels, in which case the administrator would select the intermediate performance level, worth 12 points.


Criterion: Data Use for Identifying Student Needs (Items #9, #10)
• Emerging: Item 9: Teams report analyzing data to determine student needs or gaps in learning in 0-25% of meetings. Item 10: The majority of open responses indicate team learning that does not mention student needs or gaps in learning.
• Developing: Item 9: Teams report analyzing data to determine student needs and gaps in learning in 26-40% of meetings. Item 10: The majority of open responses indicate team learning about student needs or gaps in learning, but the responses are general, non-specific, or basic.
• Solidifying: Item 9: Teams report analyzing data to determine student needs and gaps in learning in 41-55% of meetings. Item 10: The majority of open responses indicate team learning about student needs or gaps in learning that is clear and specific.
• Extending: Item 9: Teams report analyzing data to determine student needs and gaps in learning in 56-100% of meetings. Item 10: The majority of open responses indicate team learning about student needs or gaps in learning that is clear and specific and cites evidence.

Criterion: Data Use for Determining Interventions & Instructional Adjustments (Items #9, #10, #11)
• Emerging: Item 9: Teams report analyzing data to determine appropriate student support strategies, interventions, or adjustments to instruction in 0-25% of all meetings reported for this period. Item 10: The majority of open responses indicate team learning that does not mention student interventions or instructional adjustments. Item 11: The majority of open responses do not mention committing to intervention strategies or instructional adjustments.
• Developing: Item 9: Teams report analyzing data to determine appropriate student support strategies, interventions, or adjustments to instruction in 26-40% of all meetings reported for this period. Item 10: The majority of open responses indicate team learning about student interventions or instructional adjustments, but the responses are general, non-specific, or basic. Item 11: The majority of open responses mention committing to generally stated intervention strategies or instructional adjustments.
• Solidifying: Item 9: Teams report analyzing data to determine appropriate student support strategies, interventions, or adjustments to instruction in 41-55% of all meetings reported for this period. Item 10: The majority of open responses indicate team learning about student interventions or instructional adjustments that is clear and specific. Item 11: The majority of open responses indicate clear and specific commitments to intervention strategies or instructional adjustments.
• Extending: Item 9: Teams report analyzing data to determine appropriate student support strategies, interventions, or adjustments to instruction in 56-100% of all meetings reported for this period. Item 10: The majority of open responses indicate team learning about student interventions or instructional adjustments that is clear and specific and cites evidence. Item 11: The majority of open responses indicate specific, measurable, achievable, time-bound commitments to intervention strategies or instructional adjustments.

TOTAL POINTS DEMONSTRATED: ______


Teacher Post-Observation Debrief Form
(Question numbers align to question numbers on the Team Data Use Survey form.)

Teacher:                          Grade / Department:
Evaluator:
Date and Time of Classroom Observation:
Focus of Observed Lesson:
Date and Time of Debrief Discussion:

1. Did you use any particular student data to plan today’s lesson?

2. & 3. Which type(s) of data did you use?

Academic Data (check all that apply):
☐ Content-Specific Assessments (e.g., DIBELS, DRA, math fluency, STAR)
☐ Common Assessments
☐ Student Work
☐ MCAS / PARCC
☐ Unit Assessments
☐ Mid-term/final exams
☐ Formative Assessment (e.g., checks for understanding, ticket to leave, “Do Now,” daily quizzes)
☐ Performance Assessment (demonstration of knowledge and skills)
☐ Capstone Project
☐ Student Portfolio
☐ Other

Other Student Data (check all that apply):
☐ Student Support Team / Building-Based Student Team notes or minutes
☐ Specialist Reports (e.g., School Psychologist reports)
☐ Free & Reduced Lunch data
☐ Teacher Observations (e.g., student engagement, behavior)
☐ Guidance / School Adjustment Counselor notes
☐ Nurse Visits
☐ Absences
☐ Tardies
☐ Discipline Logs
☐ Bathroom Visits
☐ Detentions / Suspensions Reports
☐ Other


2. If you did not analyze data in preparation for today’s lesson, which data might have been helpful to collect and/or review to inform your lesson plans and instructional decisions?

3. If you did work with data in preparation for today’s lesson, did you discuss or review these data with your team or colleagues, or did you review the data individually? How did you actually work with the data? Please describe (e.g., disaggregating, analyzing over time).

4. What were you trying to understand by reviewing these data?

5. What was the purpose of your data use?
   • To determine students’ strengths
   • To determine students’ needs
   • To determine students’ progress
   • To determine appropriate student support strategies or interventions
   • To determine adjustments we want to make to our instructional practice
   • To determine differences or gaps in students’ learning
   • Other?

6. At what level did you focus your data analysis?
   • Individual student level
   • Small group level
   • Class level
   • Grade or department level
   • School level

7. What did you learn as a result of your data analysis? What specific needs, trends, or patterns did you find? How did these inform your lesson design or interventions?

8. Based on your lesson, what data did or will you collect next to determine whether your lesson addressed the needs you identified?

9. What will you do next?

10. What would help you use data more effectively to inform interventions and instructional decisions moving forward? How can I support you?
    Consider: Does your team have the necessary skills to analyze and interpret student data? Does everyone have access to necessary student data? Does your team need support linking data analysis to next steps with instruction?