Technical Assistance Summary Report

FEBRUARY 2019


This document was produced by American Institutes for Research under U.S. Department of Education contract number ED-ESE-15-A-0006/0001. The content of this document does not necessarily reflect the views or policies of the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. government. Authorization to reproduce this document in whole or in part for educational purposes is granted.

1000 Thomas Jefferson Street, NW, Washington, DC
[email protected]


Contents

List of Tables
List of Figures
Acknowledgments
Executive Summary
Introduction
   What Is the State Support Network?
   Overview of Network Technical Assistance Completed to Date
   Evaluation of Network Technical Assistance
   Purpose of This Report
Quality
   Assessing Quality
   Results
Relevance
   Assessing Relevance
   Results
Application and Impact
   Assessing Application and Impact
   Results
Project Objectives
   Assessing Project Objectives
   Results
Partnerships
Case Studies
   CoP: Implementing Needs Assessment
   P2P: Equity Lab Series
Conclusion
Appendix A. List of Technical Assistance Projects (Completed May 2016–May 2018)
Appendix B. State Participation in Network Technical Assistance (May 2016–May 2018)
Appendix C. Overview of Performance Management System
Appendix D. Standard Evaluation Survey
Appendix E. Interview Protocol
Appendix F. List of Network Partners


List of Tables

Table 1. Network Priority Areas
Table 2. Percentage of Participants Who Agreed or Strongly Agreed With Quality Composite Items
Table 3. Average Quality Ratings of Network Technical Assistance
Table 4. Percentage of Participants Who Agreed or Strongly Agreed With Relevance Composite Items
Table 5. Average Relevance Ratings of Network Technical Assistance
Table 6. Percentage of Participants Who Agreed or Strongly Agreed With Application and Impact Item
Table 7. Average Application and Impact Ratings of Network Technical Assistance
Table 8. Percentage of Participants Who Agreed or Strongly Agreed That Objectives Were Met
Table 9. Average Objectives Met Ratings of Network Technical Assistance
Table 10. Percentage of Activities Involving Meaningful Partnership
Table 11. Survey Results From CoP Implementing Needs Assessment
Table 12. Survey Results From P2P Equity Lab Series

List of Figures

Figure 1. State Support Network Logic Model
Figure 2. Participation in Network Technical Assistance Projects by State
Figure 3. Participation in CoP Technical Assistance Projects by State
Figure 4. Participation in P2P Technical Assistance Projects by State
Figure 5. Participation in ITA Technical Assistance Projects by State
Figure C1. Typical Performance Management Cycle


Acknowledgments

Several staff from the Office of Elementary and Secondary Education and the State Support Network (the Network) contributed to the development of this report. Katelyn Lee was the lead contributor from the Network. The following Network staff provided feedback and guidance throughout the development of the report: Sara Wolforth, Alicia Garcia, Bob Stonehill, and Steve Plank. The Network is grateful to our partners at the Office of Elementary and Secondary Education, especially Danielle Smith, Irene Harwarth, Lisa Sadeghi, Kim Light, and Bryan Thurmond for their feedback throughout the report development process, and Christopher Tate for his contributions to the creation of the Network performance management system. The Network would also like to thank the state and district staff who shared feedback on the Network’s technical assistance, which was an invaluable contribution to this report and the Network’s commitment to continuous improvement.


Executive Summary

This report summarizes the technical assistance provided by the State Support Network (the Network) from May 2016 through May 2018 and reviews available data regarding the quality, relevance, application and impact, and partnerships involved in the technical assistance provided during that period. The Network is a four-year technical assistance initiative of the U.S. Department of Education (ED), Office of Elementary and Secondary Education (OESE) designed to support state and district school improvement efforts. The Network provides a range of technical assistance services and resources to states and districts, ranging from customized support for individual states and districts to universal approaches (convenings and publications) aimed at supporting all OESE grantees. This report focuses on the three types of technical assistance that comprised the majority of Network activities during 2016–2017 and 2017–2018: communities of practice (CoPs), peer-to-peer (P2P) exchanges, and individualized technical assistance (ITA).

What is the State Support Network?

The State Support Network (the Network) is a four-year technical assistance initiative of the U.S. Department of Education (ED), Office of Elementary and Secondary Education (OESE) designed to support state and district school improvement efforts. The Network evaluates the quality, relevance, application and impact, project objectives, and partnerships of its technical assistance to support continuous improvement.

Goals of the Network

• Facilitate the building of sustainable learning communities and partnerships within and across state and local educational agencies (SEAs and LEAs).

• Identify and share effective practices to facilitate learning from SEAs, LEAs, and others.

• Help to scale up effective systemic approaches and practices within and across SEAs and LEAs.

• Support SEA and LEA efforts to achieve significant improvements in student outcomes.

State Support Network Technical Assistance

The Network relies on three key types of technical assistance approaches to support state and local educational agencies:

Communities of Practice (CoPs): In a CoP, SEA and/or LEA leaders work together over an extended period of time to learn, share knowledge, and collaborate on a specific focus area through targeted technical assistance, facilitated problem solving, and information sharing supported by an online environment.

Peer-to-Peer (P2P) Exchanges: In a P2P exchange, multiple SEA and/or LEA leaders convene to share their experiences, build on one another’s successes and challenges, hear what other SEAs or LEAs are doing, and partner across states and regions to address shared challenges.

Individualized Technical Assistance (ITA): ITA is differentiated, hands-on support designed to meet the specific needs of individual SEAs and LEAs to help state and district staff establish conditions for improvement and effect organizational change.

State Participation in Network Technical Assistance (May 2016–May 2018)

• 44 states have participated in CoPs.

• 24 states have participated in P2P exchanges.

• 36 states have participated in ITA.

• 41 states have participated in multiple types of Network technical assistance.

The Network completed 8 CoPs, 8 P2Ps, 32 ITAs (27 state-focused projects and 5 district-focused projects), 13 tools and 8 resources, and 3 meetings. In total, 43 states, the District of Columbia, Puerto Rico, and the Bureau of Indian Education participated in Network technical assistance across CoPs, P2Ps, and ITAs.

Results for Quality, Relevance, Application and Impact, Project Objectives, and Partnerships

Key Findings

A review of the data indicates that the Network is:

• Providing consistent quality. Network technical assistance was consistently perceived by participants to be of high quality between 2016 and 2018.

• Differentiating supports to increase relevance. The Network also received high ratings for the relevance of its technical assistance between 2016 and 2018.

• Increasing evidence of application and impact. The Network demonstrated some improvement in the application and impact domain between 2016 and 2018.

• Enhancing the fit of project objectives. The Network has an opportunity for growth in the area of meeting project objectives.

Network Partnerships. The Network engages a broad spectrum of experts and technical assistance providers in technical assistance to build efficiencies, reduce duplication of effort, and maximize impact. As of May 2018, the Network has collaborated with 52 organizations across technical assistance projects.

Ratings Across Task Areas

Percentage of participants who agreed or strongly agreed that Network technical assistance is:

                                                   2016–2017   2017–2018   2016–2018
Of high quality                                    97%         93%         95%
Relevant to participant needs                      100%        89%         90%
Applicable and impactful for participant
priorities and goals                               83%         91%         89%
Meeting project objectives                         87%         83%         83%

Average ratings of Network technical assistance for:

                                                   2016–2017   2017–2018   2016–2018
Quality rating                                     3.63        3.35        3.49
Relevance rating                                   3.71        3.16        3.44
Application and impact rating                      3.35        3.28        3.32
Rating for objectives met                          3.34        3.05        3.19

The Network primarily evaluates the quality, relevance, and impact of each technical assistance project using a standard set of Likert scale survey items. The Network then uses two indicators to assess its performance: (1) the percentage of participants who indicated agree or strongly agree with items for each evaluation domain and (2) the average rating (on a 4-point scale) of all responses to the items for each evaluation domain.


Introduction

What Is the State Support Network?

Established by OESE at ED in May 2016, the Network is a four-year technical assistance initiative designed to support state and district school improvement efforts. The Network brings SEAs and LEAs together to analyze practical challenges and develop strategies for supporting districts and schools. States and districts that participate in Network technical assistance receive coordinated support, from a range of technical assistance providers and subject matter experts, to share successful strategies and understand how to apply evidence-based best practices to address their educational challenges.

From 2016 to 2018, the Network collaborated with states, districts, and technical assistance partners to work toward four Network goals. The first three goals are as follows:

1. Facilitate the building of sustainable learning communities and partnerships within and across SEAs and LEAs.

2. Identify and share effective practices to facilitate learning from SEAs, LEAs, and others.

3. Help to scale up effective systemic approaches and practices within and across SEAs and LEAs.

These three goals are ultimately in service of a fourth, overall goal:

4. Support SEA and LEA efforts to achieve significant improvements in student outcomes.

As depicted in Figure 1, the Network delivers high-quality, relevant, and wide-reaching technical assistance to advance these goals.

Figure 1. State Support Network Logic Model

[Figure 1 depicts the Network logic model: high-quality, relevant, and wide-reaching universal, collective, and individualized technical assistance leading to four outcomes: build and sustain partnerships; share evidence-based practices; scale systemic solutions; and elevate student outcomes.]


The Network provides several types of technical assistance to states and districts, ranging from customized support for individual states and districts to universal approaches (convenings and publications) aimed at supporting all OESE grantees.

• CoPs: In a CoP, leaders from SEAs and LEAs come together to learn, share knowledge, and collaborate on a specific focus area. The CoP supports the needs and preferences of CoP members through targeted technical assistance and facilitated problem solving and information sharing, supported by subject matter experts and facilitators, in an online environment.

• P2P exchanges: In a P2P exchange, leaders from SEAs and LEAs convene to share their experiences and build on one another’s successes and challenges to achieve collective and continuous growth. P2P exchanges leverage opportunities created when several states or districts request similar support, want to hear what other states or districts are doing, or seek to partner across states and regions to address shared challenges.

• ITA: The Network provides customized technical assistance designed to meet the specific needs of states and districts. Through differentiated, hands-on support, ITA offers a partnership to help state and district staff establish conditions for improvement and effect organizational change. ITA services include fast-response consultative support designed to solve a specific policy or implementation challenge as well as ongoing support designed to help states and districts implement key outcomes.

• Tools and products: The Network develops tools and products to share lessons learned from technical assistance efforts with a broader audience. Technical Assistance Tools are designed to support state, district, or school staff in implementing school improvement and to meet requirements of the Elementary and Secondary Education Act (ESEA), as amended in 2015 by the Every Student Succeeds Act (ESSA). Examples of technical assistance tools are self-assessments, reflection guides, and planning templates. Best Practice Resources are designed to share, synthesize, and explain important information on key topics related to school improvement and ESSA implementation.

• State convenings: The Network brings states together at in-person meetings to learn from subject matter experts, share successes, and brainstorm solutions to common challenges. These convenings cover a range of topics (such as developing ESSA state plans or implementing accountability and support systems for English learners) depending on state needs.

Through these technical assistance approaches, the Network offers support to states and school districts around identified needs related to school improvement. OESE and the Network collaboratively determine priority areas for technical assistance based on formal and informal needs-sensing activities.

From September to December 2016, the Network examined the national district and school improvement landscape to identify gaps in supports and resources for SEAs and LEAs. This review was accomplished through an initial collaborative needs assessment, which included a document review, focus groups, and interviews. The process concluded with a Co-interpretation℠ meeting for stakeholders to review and analyze data and come to a consensus on priorities.


After this initial needs assessment, the Network launched a series of ongoing activities to identify current and emerging technical assistance needs. Activities included convening subject matter experts in facilitated discussions, conducting focus groups with SEA staff, interviewing formal technical assistance recipients, and administering online needs-sensing activities that allowed potential technical assistance recipients to identify technical assistance topical areas and activities of interest. In addition, the Network conducts ongoing outreach to organizations and experts to identify areas for coordination and collaboration in carrying out technical assistance.

Table 1 describes the guiding priority areas that emerged through needs sensing for 2016–2017 and 2017–2018.1

1 Year 2 priority areas were refined from Year 1 priority areas.

Table 1. Network Priority Areas

May 2016–May 2017

1. School improvement stock-taking: Technical assistance to take stock of lessons learned from prior systemic school improvement efforts

2. District and school needs assessment: Technical assistance to assess districts’ and schools’ needs and assets to inform improvement strategies

3. Continuous improvement: Technical assistance to build sustainable systems to support continuous improvement and ensure student success

May 2017–May 2018

1. Evidence-based practices for school improvement: Technical assistance to support SEAs and LEAs in assessing school improvement needs to select and implement appropriate evidence-based strategies as part of a cycle of continuous improvement

2. School improvement monitoring and support: Technical assistance to support SEAs and LEAs in working collaboratively on the implementation and monitoring of school improvement requirements

3. Resource allocation: Technical assistance to support SEAs and LEAs in promoting the efficient and effective planning and use of resources for school improvement, and in implementing the ESSA financial transparency requirements


Overview of Network Technical Assistance Completed to Date

As of May 2018,2 the Network completed eight CoPs, eight P2Ps, and 32 ITAs (including 27 state-focused ITA projects and 5 district-focused ITA projects); developed 13 tools and eight resources; and convened three meetings.3 Please see Appendix A for a list of technical assistance projects completed by May 2018. During this two-year period, the Network has worked with 43 states, the District of Columbia, Puerto Rico, and the Bureau of Indian Education.4 As noted, the Network also offers technical assistance to districts, typically in collaboration with their SEAs. Two projects have focused exclusively on districts during the first two years of the Network.5

Figure 2 provides a heat map of state participation in Network technical assistance projects between May 2016 and May 2018. The darker colors on the map (California, Arkansas, Mississippi, Indiana, Ohio, Georgia, New Jersey, and Massachusetts) signify greater state participation in Network technical assistance as measured by the number of Network technical assistance projects. No shading (Wyoming, Kansas, Iowa, Wisconsin, West Virginia, Virginia, and Connecticut) indicates that the state did not participate in Network technical assistance projects during the period covered in this report. The gradient between dark and light illustrates the range and depth of state participation in Network technical assistance across the country. The number of projects that each state participated in is listed in parentheses after the state label.

2 May 2018 marked the mid-point of the Network’s four-year contract with OESE.

3 Two meetings were hosted by the Network and one meeting was co-hosted with another technical assistance organization.

4 See Appendix B for more information about which states and territories have participated in Network technical assistance as well as a list of the projects in which they participated.

5 Projects that focused exclusively on districts between 2016 and 2018 include the District Strategic Planning and Resource Allocation CoP, which supported three districts (Wake County Public School System, North Carolina; Beaverton School District, Oregon; and Bellevue School District, Washington), and an ITA project supporting an Arizona district with school leadership development.


Figure 2. Participation in Network Technical Assistance Projects by State

Note. Figure 2 includes projects that took place between May 2016 and May 2018. Figure 2 depicts the extent to which states (or districts in collaboration with SEAs) have engaged in multiple technical assistance projects. Some types of technical assistance projects are more intensive and have more activities than others (e.g., CoPs compared with ITA). The numbers of activities within the projects are not represented in Figure 2. Projects focused exclusively on districts are also not represented in Figure 2.

[Figure 2 is a U.S. heat map. Total technical assistance projects by state or territory: AK (3), AL (3), AR (9), AZ (4), CA (9), CO (2), CT (N/A), DE (5), FL (3), GA (8), HI (1), IA (N/A), ID (2), IL (4), IN (7), KS (N/A), KY (5), LA (3), MA (7), MD (5), ME (5), MI (5), MN (5), MO (5), MS (7), MT (2), NC (2), ND (2), NE (3), NH (4), NJ (7), NM (4), NV (4), NY (2), OH (7), OK (6), OR (5), PA (1), RI (4), SC (4), SD (6), TN (3), TX (2), UT (2), VA (N/A), VT (2), WA (2), WI (N/A), WV (N/A), WY (N/A); District of Columbia (3), Puerto Rico (2), Bureau of Indian Education (3). Legend: 1–2 total TA activities (13 states/territories), 3–4 (15), 5–6 (10), 7 or more (8), N/A (7).]
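The legend categories in Figure 2 can be reproduced from the per-state counts above. The following is a minimal illustrative sketch, not the Network’s actual tooling (the report does not say how the figure was generated); Python and the abbreviations DC, PR, and BIE are assumptions for demonstration:

```python
# Bin per-state technical assistance project counts into the legend
# categories used in Figure 2 (1-2, 3-4, 5-6, 7 or more, N/A).
from collections import Counter

# Counts transcribed from Figure 2; None means the state did not
# participate. DC, PR, and BIE abbreviate the District of Columbia,
# Puerto Rico, and the Bureau of Indian Education.
projects = {
    "AK": 3, "AL": 3, "AR": 9, "AZ": 4, "CA": 9, "CO": 2, "CT": None,
    "DE": 5, "FL": 3, "GA": 8, "HI": 1, "IA": None, "ID": 2, "IL": 4,
    "IN": 7, "KS": None, "KY": 5, "LA": 3, "MA": 7, "MD": 5, "ME": 5,
    "MI": 5, "MN": 5, "MO": 5, "MS": 7, "MT": 2, "NC": 2, "ND": 2,
    "NE": 3, "NH": 4, "NJ": 7, "NM": 4, "NV": 4, "NY": 2, "OH": 7,
    "OK": 6, "OR": 5, "PA": 1, "RI": 4, "SC": 4, "SD": 6, "TN": 3,
    "TX": 2, "UT": 2, "VA": None, "VT": 2, "WA": 2, "WI": None,
    "WV": None, "WY": None, "DC": 3, "PR": 2, "BIE": 3,
}

def legend_bin(n):
    """Map a project count to its Figure 2 legend category."""
    if n is None:
        return "N/A"
    if n <= 2:
        return "1-2"
    if n <= 4:
        return "3-4"
    if n <= 6:
        return "5-6"
    return "7 or more"

distribution = Counter(legend_bin(n) for n in projects.values())
print(dict(distribution))
```

Running this sketch reproduces the distribution shown in the Figure 2 legend: 13, 15, 10, and 8 states or territories in the four participation bins, plus 7 that did not participate.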

Among states and territories that have participated in Network technical assistance, 41 participated in multiple types of Network technical assistance during the reported period (May 2016–May 2018). Additionally, 44 of the 50 states and 3 territories participated in CoPs, 24 participated in P2Ps, and 36 participated in ITA. Figures 3, 4, and 5 provide heat maps of the states with the highest concentration of CoPs, P2Ps, and ITAs, respectively.


Figure 3. Participation in CoP Technical Assistance Projects by State

Note. Figure 3 includes projects that took place between May 2016 and May 2018. Projects focused exclusively on districts are not represented in Figure 3.

[Figure 3 is a U.S. heat map. CoP projects by state or territory: AK (2), AL (2), AR (5), AZ (2), CA (4), CO (1), CT (N/A), DE (3), FL (2), GA (2), HI (N/A), IA (N/A), ID (1), IL (3), IN (5), KS (N/A), KY (3), LA (2), MA (2), MD (3), ME (2), MI (2), MN (3), MO (1), MS (5), MT (2), NC (1), ND (1), NE (1), NH (3), NJ (3), NM (3), NV (3), NY (1), OH (5), OK (3), OR (3), PA (1), RI (1), SC (4), SD (2), TN (1), TX (N/A), UT (1), VA (N/A), VT (2), WA (1), WI (N/A), WV (N/A), WY (N/A); District of Columbia (1), Puerto Rico (1), Bureau of Indian Education (2). Legend: 1 CoP (14), 2 CoPs (13), 3 CoPs (11), 4 or more CoPs (6), N/A (9).]

Figure 4. Participation in P2P Technical Assistance Projects by State

Note. Figure 4 includes projects that took place between May 2016 and May 2018.

[Figure 4 is a U.S. heat map. P2P projects by state or territory: AK (1), AL (1), AR (1), AZ (1), CA (2), CO (N/A), CT (N/A), DE (1), FL (N/A), GA (3), HI (N/A), IA (N/A), ID (N/A), IL (N/A), IN (N/A), KS (N/A), KY (1), LA (1), MA (1), MD (2), ME (1), MI (1), MN (N/A), MO (3), MS (1), MT (N/A), NC (N/A), ND (1), NE (N/A), NH (N/A), NJ (1), NM (N/A), NV (N/A), NY (N/A), OH (2), OK (1), OR (1), PA (N/A), RI (1), SC (N/A), SD (N/A), TN (N/A), TX (1), UT (N/A), VA (N/A), VT (N/A), WA (N/A), WI (N/A), WV (N/A), WY (N/A); District of Columbia (1), Puerto Rico (N/A), Bureau of Indian Education (N/A). Legend: 1 P2P (18), 2 P2Ps (3), 3 P2Ps (2), N/A (30).]


Evaluation of Network Technical Assistance

The Network places a strong emphasis on routinely evaluating its technical assistance activities to determine the extent to which Network technical assistance has been effective as well as to learn from successes and challenges to continuously improve the quality, relevance, and impact of the services provided. To assess and improve the effectiveness of technical assistance provided, the Network has developed a performance management system (described next) that has structures and processes in place to promote continuous improvement (see Appendix C). The overarching questions for evaluating Network technical assistance include the following:

• To what extent has Network technical assistance exhibited quality and relevance?

• To what extent has Network technical assistance been applied and had an impact?

• To what extent has the Network formed meaningful partnerships with a broad array of states, districts, and partners in the provision of technical assistance (i.e., succeeded in the recruitment and retention of states and districts for participation in technical assistance, the engagement of those states and districts, and the formation of meaningful partnerships with other technical assistance providers)?

Figure 5. Participation in ITA Technical Assistance Projects by State

Note. Figure 5 includes projects that took place between May 2016 and May 2018. Projects focused exclusively on districts are not represented in Figure 5.

[Figure 5 is a U.S. heat map. ITA projects by state or territory: AK (N/A), AL (N/A), AR (3), AZ (1), CA (3), CO (1), CT (N/A), DE (1), FL (1), GA (3), HI (1), IA (N/A), ID (1), IL (1), IN (2), KS (N/A), KY (1), LA (N/A), MA (4), MD (N/A), ME (2), MI (2), MN (2), MO (1), MS (1), MT (N/A), NC (1), ND (N/A), NE (2), NH (1), NJ (3), NM (1), NV (1), NY (1), OH (N/A), OK (2), OR (1), PA (N/A), RI (2), SC (N/A), SD (4), TN (2), TX (1), UT (1), VA (N/A), VT (N/A), WA (1), WI (N/A), WV (N/A), WY (N/A); District of Columbia (1), Puerto Rico (1), Bureau of Indian Education (1). Legend: 1 ITA (22), 2 ITAs (8), 3 ITAs (4), 4 ITAs (2), N/A (17).]


The Network primarily evaluates the quality, relevance, and impact of each technical assistance project using a standard set of survey items (see Appendix D) aligned to these performance management evaluation questions. Each project also has a set of defined objectives that project teams routinely revisit to assess their progress in meeting stated project goals. The extent to which project objectives are met is also assessed by the standard survey questions. Each of the evaluation domains is also assessed biannually (i.e., twice a year) through interviews for a subset of technical assistance projects using a protocol (see Appendix E) aligned to the performance management evaluation questions.

Purpose of This Report

This report summarizes the progress and milestones of the Network’s technical assistance provided from May 2016 through May 2018, and it reviews available data regarding the quality, relevance, application and impact, and partnerships involved in the technical assistance provided during that period. The report focuses on the three types of technical assistance that comprised the majority of Network activities during this timeframe: CoPs, P2P exchanges, and ITA. Because relatively few tools and products and large state convenings were completed between May 2016 and May 2018, these types of technical assistance are not discussed in depth in this report.

The primary data sources for this report are post-event surveys (administered for almost all projects6) and follow-up interviews (conducted for select projects7) with technical assistance participants. These sources provide robust descriptive data regarding participant feedback; however, the data may not be wholly representative of all participant perspectives and should thus be interpreted with caution. In instances with small numbers of respondents and/or low response rates, survey data may not represent all participant viewpoints. Given that survey respondents may be systematically different from nonrespondents, it is also possible that the feedback provided is not representative of the participant groups as a whole for each survey. Interview data should also be interpreted with caution given that interviews were only conducted for a subset of projects. Therefore, feedback provided through interviews may not be representative of the perspectives of the participant groups as a whole.

Using the performance management evaluation questions as a guide, the report includes sections for each of the following: quality, relevance, application and impact, and partnerships. A discussion of project objectives is embedded within the section discussing application and impact. The first three sections begin with a spotlight on relevant data from post-event surveys followed by contextual information from follow-up interviews.8 The Partnerships section includes data on the percentage of technical assistance activities involving collaborations with other technical assistance providers, an essential component of the Network’s structure and approach to technical assistance. This report also has two case studies that offer concrete examples of the technical assistance provided to support states’ school improvement initiatives.

6 Fast-response projects and tools and products are not currently evaluated through post-event surveys.

7 Interviews were conducted with 27 technical assistance participants, representing eight projects across the Network’s three technical assistance approaches, including three CoPs, two P2Ps, and three ITAs. To the extent possible, at least one technical assistance recipient was interviewed for every state and/or district that participated in each technical assistance project selected for interviews. Interviews took place the month after the end of May 2018.

8 Interview data were analyzed using a codebook and NVivo coding software to determine key findings.


Quality

Commitment to high-quality technical assistance is one of the key drivers of Network activities. This section reviews stakeholder feedback on the quality of the Network’s technical assistance, as evidenced by post-event survey responses and feedback from follow-up interviews.

Assessing Quality

To evaluate the quality of technical assistance, the Network considers the extent to which the content of technical assistance is based on research and best practice; the technical assistance is delivered by highly knowledgeable subject matter experts and facilitators; and the technical assistance uses effective pace, organization, communication strategies, and follow-up support.

Technical assistance participants responded to post-event surveys regarding the extent to which they agreed or disagreed (on a 4-point Likert scale of strongly disagree to strongly agree) with the following quality statements:

• The knowledge and skills of the presenters were appropriate for the goals of the technical assistance.

• I am satisfied with the overall quality of the technical assistance experience.

In addition to the Likert scale questions, post-event surveys also include the following open-ended question9 to solicit general feedback about technical assistance projects:

• What aspects of this event are most useful and relevant for your work, and why?

Participants also provided feedback through follow-up interviews, which include questions about each of the three criteria that the Network uses to assess quality (see Appendix E).

Results

Responses to Likert quality items on the surveys were combined and averaged to create a composite score for “quality” that reflects the percentage of respondents who agreed or strongly agreed with the statements in these two survey items. Table 2 shows the percentage of respondents who agreed or strongly agreed with the survey items related to quality for Network technical assistance overall as well as by technical assistance approach for 2016–2017, 2017–2018, and 2016–2018.

9 Although the wording of this open-ended question maps most closely to the “relevance” domain, technical assistance participants often provided comments that mapped to the other domains of “quality” and “application and impact.” Relevant data from this open-ended question are therefore included in the sections on these topics as appropriate.


Table 3 displays the average rating of all Likert responses (strongly disagree, disagree, neither agree nor disagree,10 agree, strongly agree) to the two quality survey items on a 4-point scale.11

10 This response option was only offered for select surveys.

11 Average quality ratings were derived by assigning numeric values to each Likert response (1 = Strongly disagree, 2 = Disagree, 2.5 = Neither agree nor disagree, 3 = Agree, 4 = Strongly agree) and then calculating the average of the responses.

Table 2. Percentage of Participants Who Agreed or Strongly Agreed With Quality Composite Items

Quality Rating   2016–2017       2017–2018       2016–2018
Overall          97% (109/112)   93% (241/258)   95% (350/370)
CoP              n/a (a)         93% (146/157)   93% (146/157)
P2P              95% (63/66)     92% (44/48)     94% (107/114)
ITA              100% (46/46)    96% (51/53)     98% (97/99)

Note. The text in parentheses corresponds to the number of Agree and Strongly agree responses, followed by the total number of responses for survey items.
a Although several CoPs began in 2016–2017, the projects did not conclude until the 2017–2018 timeframe. CoP endpoint survey data were therefore unavailable for 2016–2017 reporting.

Table 3. Average Quality Ratings of Network Technical Assistance

Quality Rating   2016–2017   2017–2018   2016–2018
Overall          3.63        3.35        3.49
CoP              n/a (a)     3.47        3.47
P2P              3.54        3.26        3.40
ITA              3.71        3.31        3.51

a Although several CoPs began in 2016–2017, the projects did not conclude until the 2017–2018 timeframe. CoP endpoint survey data were therefore unavailable for 2016–2017 reporting.

The results from the survey composite score for “quality” indicate that the percentage of participants who agreed or strongly agreed that Network presenters and overall technical assistance are of high quality has consistently remained above 90 percent, both for the Network overall and for each technical assistance approach. Additionally, the Network has consistently received an average quality rating above 3.2 out of 4 points from survey respondents. P2P exchanges and ITA (the task areas that had survey data for both 2016–2017 and 2017–2018) experienced a slight drop in the two survey metrics between 2016–2017 and 2017–2018 (three and four percentage points, respectively, for the “Percent agree and strongly agree” and 0.28 and 0.40 points, respectively, for the average rating).
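Both survey indicators are straightforward to compute from raw responses. The following is a minimal illustrative sketch, not the Network’s actual tooling; the response labels and data structures are assumptions for demonstration, using the value mapping described in footnote 11:

```python
# Illustrative computation of the two Network survey indicators:
# (1) percent of responses that are Agree or Strongly agree, and
# (2) the average rating on the 4-point scale.
LIKERT_VALUES = {
    "Strongly disagree": 1.0,
    "Disagree": 2.0,
    "Neither agree nor disagree": 2.5,  # offered only on select surveys
    "Agree": 3.0,
    "Strongly agree": 4.0,
}

def percent_agree(responses):
    """Indicator 1: share of responses that are Agree or Strongly agree."""
    agree = sum(r in ("Agree", "Strongly agree") for r in responses)
    return 100.0 * agree / len(responses)

def average_rating(responses):
    """Indicator 2: mean of the numeric values assigned to each response."""
    return sum(LIKERT_VALUES[r] for r in responses) / len(responses)

# Hypothetical example: 2 of 3 respondents agree or strongly agree.
sample = ["Strongly agree", "Agree", "Neither agree nor disagree"]
print(f"{percent_agree(sample):.0f}%")   # -> 67%
print(f"{average_rating(sample):.2f}")   # -> 3.17
```

With the hypothetical responses shown, indicator 1 is 67 percent and indicator 2 is 3.17, mirroring how the values reported in Tables 2 and 3 are derived.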

Open-ended responses across surveys for CoPs, P2Ps, and ITAs referenced the quality of the Network’s technical assistance. Six participants provided specific comments to the open-ended survey question about the quality of the information presented in technical assistance, and 16 participants commented on the expertise of the subject matter experts and facilitators. Example responses include the following:

• “The process was efficient, the content was relevant, and the overall achievement was proficient.”

• “Facilitation was clear and direct conversations were honest. We get so busy ‘doing’ the work that [it is helpful] to step back from it and take a broad perspective and then determine what’s missing and strategize next steps.”

• “Assistance from the trainers was great! They thought of different angles for us to consider, which was very helpful.”

Participants in follow-up interviews provided positive feedback regarding the quality of Network technical assistance. A majority of interviewees (16 of 27) indicated that the content of the technical assistance was grounded in research and best practice. Almost all interviewees (22 of 27) reported that presenters were highly knowledgeable about the topics of their respective technical assistance projects. Of the 22 interviewees who commented on the quality of Network presenters, seven commented on presenters’ abilities to translate abstract concepts into tangible examples, either by leveraging their own experiences in state or district contexts, or by involving other experienced practitioners in the discussions. The following quotations are examples of these comments:

• “…provided very good context and reference to the resources they were using. It was very clear, these were content area experts and they relied on tested and proven methods for supporting the states... They were practitioners too. I specifically appreciated [that they] provided real life context to the work we were trying to accomplish.”

• “…if we were discussing an area that wasn’t an expertise of the facilitators, they would bring in staff or others who were experts in a specific area.”

With regard to logistical components of the technical assistance (pace, organization, communication strategies), interviewee comments included reports that projects were well paced, organized, and presented in a clear manner. For example, 17 interviewees indicated that they were satisfied with the pace of the technical assistance, 20 were pleased with the organization, and five were satisfied with the communication strategies. Respondents also offered suggestions for enhancing logistical aspects of future technical assistance projects. For example, some interviewees noted that the timing (either the time of year, time of day, or spacing of sessions) of their technical assistance projects made it challenging for them to fully engage. Example responses include the following:

• “At times I just felt like the burden was on the states. That’s not to say it shouldn’t be, please understand. I think the time frame is what influenced that it felt like we were pulling it along. Had we had more time in between each meeting we would have had more time to work on it.”


• “We’re probably 60-hour employees on a regular basis. It’s setting priorities or either extending your work day. So, it is time intensive. I would say the outcome outweighs the time commitment. However, I just need to acknowledge that it does create additional work for the team.”

• “We get lots of offers for help. The thing is if to help us requires us to spend time doing more work, that generally isn’t helpful. However, from the agency’s perspective, how can you know what every state is doing if the states can’t take the time to explain it? So, I see the value in it. I think part of it was probably dependent on the size of your state and the size of your team. We have a very small team here, so I think it was valuable, but it was it was also difficult and hard. I would’ve preferred time on the call to say, ‘can you summarize this quickly,’ or understanding that was going to be the expectations. Having to complete documents beforehand is difficult.”

Relevance

This section reviews stakeholder feedback on the relevance of the Network’s technical assistance, as evidenced by post-event survey responses and feedback from follow-up interviews.

Assessing Relevance

To evaluate the relevance of its technical assistance, the Network assesses the extent to which technical assistance projects address state and district needs and priorities; have clear potential for direct application to state and district priorities; and appropriately match participants’ knowledge and resources.

Participants responded to post-event surveys regarding the extent to which they disagreed or agreed (on a Likert scale) with the following relevance statements:

• The technical assistance provided meets the specific needs of my project, office, or agency.

• The information and resources provided are appropriate for my level of experience and knowledge.

Post-event surveys also included the following open-ended question to solicit general feedback about technical assistance projects:

• What aspects of this event are most useful and relevant for your work, and why?

Participants also provided feedback through follow-up interviews, which include questions about each of the three criteria that the Network uses to assess relevance (see Appendix E).

Results

The responses to Likert relevance items on the surveys were combined and averaged to create a composite score for “relevance” that reflects the percentage of respondents who agreed or strongly agreed with the statements in these two survey items. Table 4 shows the percentage of respondents who agreed or strongly agreed with the survey items related to relevance for the Network overall as well as by technical assistance approach for 2016–2017, 2017–2018, and 2016–2018.


Table 5 displays the average rating of all Likert responses (strongly disagree, disagree, neither agree nor disagree,12 agree, strongly agree) to the two relevance survey items on a 4-point scale.13

12 This response option was only offered for select surveys.

13 Average relevance ratings were derived by assigning numeric values to each Likert response (1 = Strongly disagree, 2 = Disagree, 2.5 = Neither agree nor disagree, 3 = Agree, 4 = Strongly agree) and then calculating the average of the responses.

Table 4. Percentage of Participants Who Agreed or Strongly Agreed With Relevance Composite Items

Relevance Rating   2016–2017      2017–2018       2016–2018
Overall            100% (14/14)   89% (225/252)   90% (239/266)
CoP                n/a (a)        90% (141/156)   90% (141/156)
P2P                100% (14/14)   86% (37/43)     89% (51/57)
ITA                n/a (b)        89% (47/53)     89% (47/53)

Note. The text in parentheses corresponds to the number of Agree and Strongly agree responses, followed by the total number of responses for survey items.
a Although several CoPs began in 2016–2017, the projects did not conclude until the 2017–2018 timeframe. CoP endpoint survey data were therefore unavailable for 2016–2017 reporting.
b Although endpoint survey data were collected for ITAs during the 2016–2017 timeframe, data were not available for the composite relevance survey items or comparable relevance survey items. ITA endpoint survey data for the relevance domain were therefore unavailable for 2016–2017 reporting.

Table 5. Average Relevance Ratings of Network Technical Assistance

Relevance Rating   2016–2017   2017–2018   2016–2018
Overall            3.71        3.16        3.44
CoP                n/a (a)     3.33        3.33
P2P                3.71        3.06        3.39
ITA                n/a (b)     3.11        3.11

a Although several CoPs began in 2016–2017, the projects did not conclude until the 2017–2018 timeframe. CoP endpoint survey data were therefore unavailable for 2016–2017 reporting.
b Although endpoint survey data were collected for ITAs during the 2016–2017 timeframe, data were not available for the composite relevance survey items or comparable relevance survey items. ITA endpoint survey data for the relevance domain were therefore unavailable for 2016–2017 reporting.


The results from the survey composite score for “relevance” indicate that more than 85 percent of participants agreed or strongly agreed that Network technical assistance met their specific needs and was appropriate for their level of experience and knowledge, both for the Network overall and for each technical assistance approach. Additionally, the average relevance rating was consistently above 3 out of 4 points for technical assistance overall and for all technical assistance approaches. However, P2P exchanges (the single task area that had survey data for both 2016–2017 and 2017–2018) experienced a drop in the two survey metrics between 2016–2017 and 2017–2018 (14 percentage points for the “Percent agree and strongly agree” and 0.65 points for the average rating). Participant interviews (feedback described later in this section) offered a potential explanation for this trend.

Open-ended responses across surveys for CoPs, P2Ps, and ITAs indicated that the technical assistance has direct application to state and district priorities. Responses include the following:

• “The subject presents ‘heavy lift’ for SEAs and LEAs and it requires a huge shift in policy, practice, and requirements for administration. There are still so many unanswered questions that many of us have, but not as a result of the CoP. [The presenter] did a great job with a diverse group who are still trying to understand and get a good grip on the work ahead.”

• “Most useful and relevant were the ability to have immediate feedback and input into the issues that surrounded the implementation of our state’s needs assessment. The challenges faced were real and having access to timely feedback and input was great!”

• “The sessions were overall supportive and useful! The sessions prompted us to consider putting on an equity lab to support LEAs in completing LEA equity plans. We have used the written tools we’ve learned about during the sessions.…The sessions asked participants to be actively engaged throughout (e.g., doing pre-work beforehand, testing out ‘elevator pitches’ during) and this pushed our thinking about the work.”

Follow-up interviews revealed a more nuanced view of technical assistance participants’ perception of the relevance of the Network’s projects, reflecting different perceptions of relevance based on the prior experience and current context in participating states. Some states and districts that self-identified as more experienced viewed the technical assistance to be less relevant to their current level of knowledge and experience with a topic. Other states and districts found the Network’s technical assistance to be well suited to their level of experience and knowledge. The following participant responses reflect these different experiences:

• “I would say we were a little further ahead. Some of the content, for us, was further behind than we had hoped it would be. It didn’t mean we didn’t want to participate in it. Again, it offered us a little reflection on how to address some of our challenges that we were facing and how maybe to go about addressing some of those challenges.…”

• “Because everyone was at a different spot…I love the activities, I love the interaction, but it’s difficult to do an activity related to content when the content is not the same for everyone. That was a little bit challenging…some of the states might have been struggling trying to participate because they didn’t have the level of context yet.”

• “Being the newbie to things, I never felt lost. I felt like [the presenters] were able to start where people were and bring them along from there so that they weren’t talking past us or assuming we had prior knowledge that we didn’t that would have prevented us from engaging in the networking and the learning so I think they did a really good job of that.”


Relatedly, a number of interviewees commented on the timeliness of the support they received from the Network. Some technical assistance participants had a need to quickly learn about a new topic for which they had little to no prior knowledge. Participant comments include the following:

• “I am still very new at this and I won’t know what I’m doing until I am doing it, and going through it. I felt that [the technical assistance was] able to meet me where I was and it was immediately useful and applicable to what I was trying to do.”

• “It was beneficial for us to participate in the Network because we could get beneficial and meaningful feedback on our agenda and some of the materials we were using. We were also able to get ideas from the other states that were participating. So, [it] was a beneficial project for us to participate in.”

Application and Impact

The Network places a strong emphasis on evaluating the application and impact of its work in relation to the objectives and expected outcomes for technical assistance projects. This section reviews stakeholder feedback on the application and impact of the Network’s technical assistance, as evidenced by post-event survey responses and feedback from follow-up interviews.

Assessing Application and Impact

To evaluate the application and impact of its technical assistance, the Network considers the extent to which participants express an intent to use or share information, content, tools, and resources presented in the technical assistance; actually use the information, content, tools, and resources presented in the technical assistance in their own contexts; develop increased capacity for their professional roles, directly or indirectly, due to participation in the technical assistance; and associate the technical assistance with desired changes in other targeted organizational initiatives.

Technical assistance participants responded to post-event surveys regarding the extent to which they disagreed or agreed (on a Likert scale) with the following statement:

• I will share the knowledge and skills I learned in the technical assistance with others.

In addition to the Likert scale question, post-event surveys also include the following open-ended question to solicit general feedback about technical assistance projects:

• What aspects of this event are most useful and relevant for your work, and why?

Participants also provided feedback through follow-up interviews, which include questions about each of the four criteria that the Network uses to assess application and impact (see Appendix E).

Results

Table 6 shows the percentage of respondents who agreed or strongly agreed with the Likert survey item related to application and impact for Network technical assistance overall as well as by technical assistance approach for 2016–2017, 2017–2018, and 2016–2018. Table 7 displays the average rating of all Likert responses (strongly disagree, disagree, neither agree nor disagree,14 agree, strongly agree) to the application and impact survey item on a 4-point scale.15

Table 6. Percentage of Participants Who Agreed or Strongly Agreed With Application and Impact Item

Application and Impact Rating   2016–2017     2017–2018      2016–2018
Overall                         83% (29/35)   91% (112/123)  89% (141/158)
CoP                             n/a^a         91% (72/79)    91% (72/79)
P2P                             85% (11/13)   82% (18/22)    83% (29/35)
ITA                             82% (18/22)   100% (22/22)   91% (40/44)

Note. The text in parentheses corresponds to the number of Agree and Strongly agree responses, followed by the total number of responses for survey items.
a Although several CoPs began in 2016–2017, the projects did not conclude until the 2017–2018 timeframe. CoP endpoint survey data were therefore unavailable for 2016–2017 reporting.

Table 7. Average Application and Impact Ratings of Network Technical Assistance

Application and Impact Rating   2016–2017   2017–2018   2016–2018
Overall                         3.35        3.28        3.32
CoP                             n/a^a       3.43        3.43
P2P                             3.29        3.00        3.15
ITA                             3.41        3.42        3.42

a Although several CoPs began in 2016–2017, the projects did not conclude until the 2017–2018 timeframe. CoP endpoint survey data were therefore unavailable for 2016–2017 reporting.

14 This response option was only offered for select surveys.
15 Average application and impact ratings were derived by assigning numeric values to each Likert response (1 = Strongly disagree, 2 = Disagree, 2.5 = Neither agree nor disagree, 3 = Agree, 4 = Strongly agree) and then averaging those values across all responses.
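To make these two metrics concrete, the following is a minimal sketch of how they can be computed, shown in Python with hypothetical Likert responses (the report does not specify the tooling the Network actually used):

# Minimal sketch of the two survey metrics described above, applied to
# hypothetical Likert responses (not actual Network data).
LIKERT_VALUES = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 2.5,  # offered only on select surveys
    "Agree": 3,
    "Strongly agree": 4,
}

def percent_agree(responses):
    # Share of responses that are Agree or Strongly agree (Table 6 metric).
    agree = sum(1 for r in responses if r in ("Agree", "Strongly agree"))
    return agree / len(responses)

def average_rating(responses):
    # Mean of the numeric values assigned to the responses (Table 7 metric).
    return sum(LIKERT_VALUES[r] for r in responses) / len(responses)

# Hypothetical example: 13 responses, 11 of them Agree or Strongly agree.
sample = ["Strongly agree"] * 5 + ["Agree"] * 6 + ["Disagree"] * 2
print(f"{percent_agree(sample):.0%}")   # 85%
print(f"{average_rating(sample):.2f}")  # 3.23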

The results from the “application and impact” survey item indicate that more than 80 percent of participants agreed or strongly agreed that they would share the knowledge and skills learned in Network technical assistance with others, both for the Network overall and for each technical assistance approach. Additionally, the average application and impact rating was 3 points or higher for technical assistance overall and for all technical assistance approaches. P2P exchanges (a task area that had survey data for both 2016–2017 and 2017–2018) experienced a slight drop in the two survey metrics between 2016–2017 and 2017–2018 (three percentage points for the “Percent agree and strongly agree” metric and 0.29 points for the average rating). Conversely, ITA (the other task area that had survey data for both 2016–2017 and 2017–2018) experienced a substantial increase in the “Percent agree and strongly agree” metric (18 percentage points) as well as a very slight increase in the average rating (0.01 points) between 2016–2017 and 2017–2018.

Several responses to the open-ended question across surveys for CoPs, P2Ps, and ITAs referenced the Network’s application and impact criteria. Responses include the following:

• “I lead strategic planning and work with district leaders to define quantitative measures for our priorities. Given the importance of equity, this will help with recommendations for both priority strategies and measures for rethinking our allocation methodology.”

• “The process helped to identify the largest project areas we are already working on and which parts we want to build upon with ESSA.”

Four comments to the same open-ended survey question described how participants’ capacity increased as a result of participating in technical assistance, as follows:

• “Helped key members organize plan to implement initiative.”

• “The different processes to design and develop the content as well as [to] conceptualize how implementation might work.”

• “Strengthening connections between rural educational leaders. Bigger and stronger networks of support are important to sustaining change in our systems.”

• “The identification of competencies in both my strengths and weaknesses. I will be able to grow and apply areas as needed.”

Follow-up interviews provided additional context for understanding the extent to which Network technical assistance was applied by and had the intended impact for participants. Eleven participants provided specific examples of how they used the information, content, tools, and resources presented in the technical assistance in their own contexts. Respondents offered the following examples:

• “[The technical assistance] did help give us some insight into how our schools are funded and what some of the differences are, and what some of the things we could do to change, to better support our higher needs schools…Actually, we’ve applied [the information] on two fronts, we’ve really applied it on the financial and the non-financial fronts. On the financial front, we’ve actually changed some of our resource allocation models and funding formulas to our schools from both a staffing and M-cost or Material Supplies and Operating costs, and so we’ve actually changed some of our funding models and how we’re funding various schools based on those things.”

• “I think things did change a bit in terms of what we first thought we were doing and what we actually did. In terms of our larger goal…we needed to conduct a formal well-planned needs assessment for [a grant]. With the uncertainty…if that grant would be available, that changed. That shifted our focus a little bit because we may not have had the urgency we had originally with finishing this up earlier instead of later…I think our end goal has only morphed in terms of, we’re not just doing this for a federal grant opportunity, we’re doing this for whatever we really need it for in terms of assessing the statewide…needs.”

Five respondents also described how participation in technical assistance increased their capacity for their professional roles, directly and indirectly. Comments include the following:

• “…it was the affirmation that where we were headed was right. Getting that from our colleagues in other states and folks leading the [project] was helpful. We’ve worked through our guidance for this year; we did hook on to a deeper focus on how we were communicating and tried to streamline that as best we could. That was the benefit.”

• “What I took away, at the time when we were doing the webinars, the content was pertinent, because this was our first year identifying teacher equity gaps under ESSA.… Since the webinar, I have developed and managed the efforts to do [this work].… I would say we are a great example of that peer to peer being a huge capacity build lift because I went to the webinar and walked away and said, ‘We should do it,’ and now we’re doing it.”

Thirteen interviewees also described ways in which Network technical assistance contributed to desired changes in practice, most notably by helping them to reflect on their work and rethink whether their current strategies were the most effective in light of what they learned about the research and best practices for their respective topics.

The following quotation is an example:

• “The document gave us good, concrete ideas and/or direction to move in some places where we knew our monitoring tool was lacking on the secondary side. We wanted to make sure we gave high schools good information about the systems and structures they need to put in place—not just the “what,” but the “what it should look like,” over their overall turnaround journey is essential to our rubric on a continuum. Just focusing on college career readiness is not enough. You need it to be a certain level of implementation to be successful. The…team took recommendations and made changes to the protocol to ask better questions relative to some of the findings of this report for all schools—not just the secondary. That was great. It didn’t just benefit our secondary work, it benefitted all our schools in several ways.”

In a few cases, these changes in practice carried over into other targeted organizational initiatives that were not directly tied to the focus of the technical assistance. Two interviewees who learned about key messaging strategies for one topic reported that this knowledge enhanced their overall capacity to clearly communicate with stakeholders in their states about a variety of different initiatives.

The following quotation is an example:

• “I think we went into the project thinking that the methodology and reporting and accountability part of it was the most important part, but…we talked not only about the mechanics of how the transparency work needed to be done but also about the communication and public relations that would go along with it. So it wasn’t necessarily just about getting the number right but creating some kind of communications infrastructure that enables you to talk about what a number means vs. what it doesn’t mean. Because it would be very easy for someone here who doesn’t work in the finance area in education to jump to all kinds of conclusions about what the transparency numbers are going to look like and what they mean about whether schools are being effective or not effective stewards of tax dollars. So we spent a lot of time talking about how the communications piece is just as important as the technical aspects, potentially even more impactful.”

Project Objectives

Another strategy that the Network uses to assess the application and impact of its technical assistance is the extent to which objectives are met for each individual project. These objectives are designed to address the specific needs of participating states and districts.

Assessing Project Objectives

To evaluate its project objectives, the Network assesses the extent to which each objective for a technical assistance project is met.

Technical assistance participants responded to post-event surveys regarding the extent to which they disagreed or agreed (on a Likert scale) with the following statement:

• Please indicate your agreement with the following statements about the objectives of the [Event]. As a result of the [Event], participants can [Insert objective].

Participants also provided feedback through follow-up interviews, which include questions about each of the objectives for a given Network project (see Appendix E).

Results

Responses to project objective survey questions were combined and averaged to create a composite score for “objectives” that reflects the average percentage of respondents who agreed or strongly agreed that the project objectives were met; a minimal sketch of this calculation appears below. Table 8 shows the percentage of respondents who agreed or strongly agreed that project objectives were met for Network technical assistance overall as well as by technical assistance approach for 2016–2017, 2017–2018, and 2016–2018.
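Consistent with the pooled counts shown in Tables 8, 11, and 12, the composite sums the Agree and Strongly agree responses across all of a project’s objective items before dividing by the total number of responses. The following minimal Python sketch uses hypothetical tallies, not actual Network data:

# Minimal sketch of the composite "objectives met" score, using
# hypothetical (agree_or_strongly_agree, total_responses) tallies,
# one pair per objective item of a single project.
objective_items = [(9, 11), (8, 11), (7, 11), (8, 11)]  # hypothetical

agree = sum(a for a, _ in objective_items)
total = sum(t for _, t in objective_items)
print(f"Objectives met: {agree}/{total} = {agree / total:.0%}")
# Objectives met: 32/44 = 73%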

Table 8. Percentage of Participants Who Agreed or Strongly Agreed That Objectives Were Met

Rating for Objectives Met   2016–2017     2017–2018      2016–2018
Overall                     87% (33/38)   83% (500/601)  83% (533/639)
CoP                         n/a^a         86% (300/348)  86% (300/348)
P2P                         86% (25/29)   80% (99/123)   82% (124/152)
ITA                         89% (8/9)     78% (101/130)  78% (109/139)

Note. The text in parentheses corresponds to the number of Agree and Strongly agree responses, followed by the total number of responses for survey items.
a Although several CoPs began in 2016–2017, the projects did not conclude until the 2017–2018 timeframe. CoP endpoint survey data were therefore unavailable for 2016–2017 reporting.

Table 9 displays the average rating of all Likert responses (strongly disagree, disagree, neither agree nor disagree,16 agree, strongly agree) to the objectives survey item on a 4-point scale.17 P2P exchanges and ITA (the task areas that had survey data for both 2016–2017 and 2017–2018) experienced a drop in both survey metrics between 2016–2017 and 2017–2018 (six and 11 percentage points, respectively, for the “Percent agree and strongly agree” metric, and 0.26 and 0.46 points, respectively, for the average rating).

Table 9. Average Objectives Met Ratings of Network Technical Assistance

Objectives Met Rating   2016–2017   2017–2018   2016–2018
Overall                 3.34        3.05        3.19
CoP                     n/a^a       3.19        3.19
P2P                     3.29        3.03        3.16
ITA                     3.39        2.93        3.16

a Although several CoPs began in 2016–2017, the projects did not conclude until the 2017–2018 timeframe. CoP endpoint survey data were therefore unavailable for 2016–2017 reporting.

16 This response option was only offered for select surveys.
17 Average objectives met ratings were derived by assigning numeric values to each Likert response (1 = Strongly disagree, 2 = Disagree, 2.5 = Neither agree nor disagree, 3 = Agree, 4 = Strongly agree) and then averaging those values across all responses.

The 27 technical assistance participants who took part in follow-up interviews largely agreed that objectives for their projects were met; 15 participants reported that all the objectives for their technical assistance project were met, and six participants indicated that the objectives were partially met. The 12 interviewees across technical assistance approaches who indicated that project objectives were either partially met or not met offered one of two explanations. First, several respondents participating in technical assistance involving multiple states stated that, after beginning a technical assistance project, they were farther along in the work than their peers; therefore, these respondents indicated that the objectives were not as applicable to them for the duration of the technical assistance.

The following quotation is an example:

• “The issue wasn’t something on [the Network] side. For me it has to do with where we’re at. We’ve been doing [this work] for two years already.… We were very satisfied with how the [work was] working. That made what the peer to peer we were trying to do with other states less relevant.… That’s why I felt out of step with everybody else. That’s totally [my state’s] issue. It has nothing to do with what you all provided and put out there. We were just in a different place than would match here.”

Second, respondents cited changes within their own context (for example, leadership, staffing, or political changes) that limited their capacity to engage in the work as anticipated or altered their expectations for the outcomes of the technical assistance. Participant responses include the following:

• “We initially started out reviewing [the guidance and materials], we did review them, [but] the person leading this work was no longer here so we didn’t follow through with implementing them.”

• “The state is in huge flux right now.… That is all coming into play right now which is going to further hinder the first objective. I think we talked about that quite a bit but I don’t know that we got through all of it. Part of it is because our state is in such flux right? Part of that is on us, on our state legislators.”

Partnerships

Part of the Network’s overall design is to engage a broad spectrum of experts and technical assistance providers in Network technical assistance, both to build efficiencies and reduce duplication of effort and to maximize the impact of technical assistance. Partners include official Network partners, such as the lead contractor (American Institutes for Research) and partner organizations (Synergy, Battelle for Kids, and Pivot Learning Partners); contracted subject matter experts; and outside technical assistance providers (including federally funded centers and national organizations that bring external resources to support shared objectives). For evaluation purposes, the Network tracks the percentage of activities (overall and by task) that involve meaningful partnerships with external technical assistance providers.

The Network defines meaningful partnership as the involvement of an external organization in one or more of the following roles within the technical assistance activity:

• Expert presenter or advisor (virtual or in person)
• Quality assurance reviewer
• Planner/designer (or co-planner/co-designer)
• Facilitator (or co-facilitator)

At a minimum, the following characteristics must be present (see the sketch after this list for how these criteria combine):

• Defined roles and responsibilities
• Agreement on intended outcomes of the partnership
• Mutual contribution of resources (i.e., the Network is not compensating the partner organization for its participation)
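Taken together, these criteria form a simple per-activity test. The following Python sketch encodes the definition with hypothetical field names (the Network’s actual tracking instrument is not described in this report) and computes the activity-level percentage used in Table 10:

# Minimal sketch of the meaningful-partnership definition; field names
# are hypothetical, not the Network's actual tracking instrument.
from dataclasses import dataclass

QUALIFYING_ROLES = {
    "expert presenter or advisor",
    "quality assurance reviewer",
    "planner/designer",
    "facilitator",
}

@dataclass
class Partnership:
    roles: set                      # roles the external organization plays
    defined_responsibilities: bool  # roles and responsibilities are defined
    agreed_outcomes: bool           # intended outcomes are agreed upon
    mutual_contribution: bool       # partner is not compensated by the Network

def is_meaningful(p: Partnership) -> bool:
    # One or more qualifying roles plus all three minimum characteristics.
    return (bool(p.roles & QUALIFYING_ROLES)
            and p.defined_responsibilities
            and p.agreed_outcomes
            and p.mutual_contribution)

def percent_meaningful(activities) -> float:
    # Share of activities involving at least one meaningful partnership,
    # computed at the activity level (see footnote 18).
    flagged = sum(1 for partnerships in activities
                  if any(is_meaningful(p) for p in partnerships))
    return flagged / len(activities)

# Hypothetical example: three activities, one with a meaningful partner.
demo = [
    [Partnership({"facilitator"}, True, True, True)],
    [Partnership({"expert presenter or advisor"}, True, False, True)],
    [],
]
print(f"{percent_meaningful(demo):.0%}")  # 33%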

The meaningful partnership criteria emerged from a commitment to leveraging resources and building collaborations across technical assistance providers (including federally funded technical assistance providers). The Network began collecting baseline data for the meaningful partnership performance indicator at the end of the 2016–2017 timeframe. Table 10 shows the percentage of Network technical assistance activities18 that involve meaningful partners for 2016–2017 and 2017–2018.

18 To capture the depth of involvement of technical assistance partners, the meaningful partnership percentages are calculated based on the technical assistance activity (e.g., a CoP learning cycle) as opposed to the overall technical assistance project (the CoP). This differs from other calculations throughout the report, which are typically reported at the project level.

Table 10. Percentage of Activities Involving Meaningful Partnership

Task      2016–2017     2017–2018
Overall   44% (28/64)   30% (42/138)
CoP       33% (3/9)     27% (15/56)
P2P       38% (5/13)    58% (7/12)
ITA       48% (20/42)   29% (20/70)

Note. The values in parentheses correspond to the number of activities involving meaningful partnership followed by the number of activities in the reporting period.

Additionally, the Network has worked with multiple organizations that, while they do not meet the Network’s meaningful partnership definition, have played a critical role in providing subject matter expertise and participating in technical assistance projects. Between May 2016 and May 2018, the Network worked with federal providers including regional educational laboratories, national and regional comprehensive centers,19 and centers funded by the Office of Special Education Programs; national organizations including membership organizations like the Council of Chief State School Officers; and nonprofit organizations with missions that are aligned with Network technical assistance priorities. See Appendix F for a list of Network partners and organizations contributing expertise to Network technical assistance.

Case Studies

This section presents two case studies of projects from two of the Network’s primary technical assistance approaches. The purpose of these case studies is to provide specific examples of how the Network has supported states with school improvement initiatives as well as how states have begun to implement knowledge and skills acquired from Network technical assistance.

CoP: Implementing Needs Assessment

Supporting States in Designing and Implementing Comprehensive Needs Assessment Protocols

The Scenario

The Implementing Needs Assessment CoP supported SEAs in adopting needs assessment protocols that were grounded in effective practice, responsive to individual state context and needs, and aligned to the requirements of ESEA in their states. The four participating states had begun the rollout, pilot, or implementation phase of these needs assessments and had expressed interest in collaborating with peers and subject matter experts around their needs assessment initiatives. These efforts were motivated by Title I, Part A requirements under the ESEA, which stipulate that schools identified for comprehensive support and improvement must conduct a needs assessment to inform the development of a school improvement plan (ESEA §1111(d)(1)(B)(iii)).

Needs assessments are also cited in the broader context of comprehensive schoolwide programs (ESEA §1114(b)(1)(A)). Under Title I, Part A, SEAs may use Title I funds to support comprehensive schoolwide reforms if the program satisfies certain criteria (ESEA §1114(b)(1)(A)). As mentioned above, one precondition for schoolwide reforms is that schools leverage a comprehensive needs assessment protocol. This type of protocol “[helps] local stakeholders and system leaders understand how the pieces of a complex educational system interact...[and] can uncover both strengths and challenges that will inform growth and improvement.”20 Benefits of enacting schoolwide reforms include a greater ability to serve all students (without the need to identify eligible students) and the flexibility to consolidate federal, state, and local funds and enlarge the pool of resources available for students.

19 Please visit https://compcenternetwork.org/ for additional information about the national and regional comprehensive centers.
20 Cuiccio, C., and Husby-Slater, M. 2018. Needs Assessment Guidebook: Supporting the Development of District and School Needs Assessments (Tech.). Washington, DC: State Support Network. https://statesupportnetwork.ed.gov/system/files/needsassessmentguidebook-508_003.pdf.

The Network’s Support

The CoP launched in July 2017 and held seven virtual sessions before concluding in February 2018. The states that participated in this CoP were Arizona, Georgia, Ohio, and South Dakota. Five objectives, developed collaboratively by the Network and OESE, shaped the work of the CoP. Participating states would:

1. Refine and continue to develop their state-specific plans for implementing their comprehensive needs-assessment protocols statewide. Areas of focus included communications, training, tracking and using needs assessment data over time, and using implementation feedback to refine the process.

2. Include multiple SEA and local stakeholders in the needs assessment, with the understanding that participatory decision-making would break down silos, streamline data collection, and move states from a focus on compliance to a focus on continuous improvement.

3. Define flexibilities in the process and provide a forum in which states could troubleshoot implementation issues, thereby ensuring the needs assessment process was responsive and sufficiently tailored to local context.

4. Clearly define the roles of the district staff, SEA coaches, and other technical assistance providers supporting the needs assessment process.

5. Formally capture lessons learned from Year 1 implementation across all levels (school, district, SEA) to improve the process overall.

In addition to these objectives, each virtual session, or learning cycle, was guided by session-specific objectives that were tied to the broader objectives of the CoP. The learning cycles were facilitated by Network staff with expertise in needs assessments. The Network also provided an online portal to facilitate discussion, resource sharing, and interactions among CoP participants in between monthly virtual meetings. All of these supports were informed by the Needs Assessment Modules developed prior to this CoP, which helped ground the progression and facilitation of the CoP.

Results

The Network performance management team conducted a follow-up survey and interviews with the states that participated in this CoP. Three participants responded to the survey and five participants provided feedback through interviews. All four states that participated in the CoP were represented in the feedback provided through interviews.

The follow-up survey provided participant feedback on key indicators mapped to the evaluation domains of quality, relevance, application and impact, and project objectives. Table 11 displays the survey results, condensed into these four domains.

Table 11. Survey Results From CoP Implementing Needs Assessment

Quality: 100% agree or strongly agree (6/6); average rating 4.00
• The knowledge and skills of the presenters were appropriate for the goals of the technical assistance.
• I am satisfied with the overall quality of the technical assistance experience.

Relevance: 100% agree or strongly agree (6/6); average rating 4.00
• The technical assistance provided meets the specific needs of my project, office, or agency.
• The information and resources provided are appropriate for my level of experience and knowledge.

Application and Impact: 100% agree or strongly agree (3/3); average rating 4.00
• I will share the knowledge and skills I learned in the technical assistance with others.

Project Objectives: 100% agree or strongly agree (17/17); average rating 3.42
Please indicate your agreement with the following statements about the objectives of the CoP. As a result of the CoP, participants can:
a. Continue developing and refining our state-specific plan for implementing a comprehensive needs assessment protocol statewide: 100% (3/3); 3.33
b. Engage multiple stakeholders at the SEA level to streamline data collection, and move from compliance toward continuous improvement: 100% (3/3); 3.33
c. Troubleshoot local implementation to ensure the needs assessment process is responsive to local context and achieves desired outcomes: 100% (3/3); 3.67
d. Define essential flexibilities in the implementation process that will permit customization at the local level: 100% (2/2); 3.50
e. Clearly define how the roles of the district, SEA coaches, and other technical assistance providers supporting the needs assessment process have impacted (and/or should impact) implementation: 100% (3/3); 3.33
f. Formally capture lessons learned from Year 1 implementation across all levels (school, district, SEA) to improve the process overall: 100% (3/3); 3.33

Note. The “Percentage agree or strongly agree” values were derived by dividing the number of Agree and Strongly agree responses by the total number of responses for survey items. “Average rating” was calculated by assigning numeric values to each Likert response (1 = Strongly disagree, 2 = Disagree, 2.5 = Neither agree nor disagree, 3 = Agree, 4 = Strongly agree) and then averaging those values.

Survey respondents positively rated the overall quality, relevance, application and impact, and extent to which project objectives were met for this CoP. All survey respondents (100 percent) agreed or strongly agreed with the survey items across the four evaluation domains. The average rating of all Likert responses (strongly disagree, disagree, neither agree nor disagree,21 agree, strongly agree) to the quality, relevance, and application and impact survey items on a 4-point scale was 4.00. The average rating of all Likert responses to the objectives survey item was 3.42.

21 This response option was only offered for select surveys.

Interview data also speak to the CoP’s quality, relevance, application and impact, and project objectives. When asked whether the technical assistance delivered through the CoP met state needs, three interview participants described ways in which the CoP catered to priorities of their states. For example, one interviewee noted the following:

“…[I]t was helpful to see that some of our challenges were not only unique to us. They were also faced by other districts or in other SEAs. So, it was good for us to hear that. It also gave us some priorities for other things that may not have been a priority for us, in terms of how to communicate [the needs assessment] to larger stakeholders that were a priority for other SEAs. So it allowed us to shift some of our thinking to make sure we communicated to a larger audience in terms of stakeholders. That was one of our priorities: [communicating] to a broader audience so that they understand the larger impact of where a comprehensive needs assessment could go for a school if implemented well.”

Interviews also gauged how well the CoP content matched states’ individual circumstances. States often fall along a continuum in their readiness to engage with CoP topics, which can make it challenging to differentiate support. Although all survey takers strongly agreed that the information and resources provided were appropriate for their level of experience and knowledge, interview data revealed more nuanced perspectives. One participant said, “I sometimes felt...intimidated by the level of expertise present in [the virtual meetings] and felt like I was probably way behind everyone else.” Conversely, another interviewee noted, “Some of the content, for us, was further behind than we had hoped it would be.”

With regard to objectives, four of five interviewees indicated that the CoP met its first objective, which was to support states in refining their plan for implementing a needs assessment protocol. Interviewees unanimously agreed that the CoP met its second objective, which was to help states engage multiple stakeholders to streamline data collection and shift towards a continuous improvement mind-set. Three of four interviewed states indicated that the CoP met its third objective, which was to give states an opportunity to troubleshoot implementation issues and ensure their needs assessment process was responsive to local context. Three of four interviewed states also agreed that the CoP helped them define essential flexibilities in the implementation process to permit customization at the local level. Participant evaluations of whether the CoP met its fifth objective to capture lessons learned from Year 1 implementation were mixed; half of the interviewed states reported that the CoP met this objective.

Application of Lessons Learned

The Implementing Needs Assessment CoP contributed to the Network’s substantial, interconnected body of work focused on needs assessment. For example, the experiences of this CoP (as well as the Scaling Needs Assessment CoP that ran concurrently to address the needs of states that were in the preliminary stages of needs assessment development) informed components of the Needs Assessment Guidebook22 that the Network published in June 2018. This resource describes the elements and implementation phases of an effective needs assessment process.

Another resource that grew out of the Implementing Needs Assessment CoP was a series of blogs with embedded audio recordings on the topic of singular statewide needs assessments, featuring interviews with CoP participants from Arizona and Georgia. All states that participated in the Implementing Needs Assessment CoP expressed an interest in understanding other states’ experiences with the singular statewide approach and directly informed the content and format of the blog series. Specifically, audio recordings were incorporated in response to CoP participants’ desire for the information to be presented as stories detailing the development of other states’ singular statewide needs assessments rather than as descriptions of different elements of the process.

The Implementing Needs Assessment CoP also led to the development of Network ITA support around needs assessments, including projects with Arkansas, California, Florida, South Dakota, and the Bureau of Indian Education. As a result of their experiences with the Implementing Needs Assessment CoP, Network staff approached these ITA projects with a deeper understanding of how to support states in developing a vision for their needs assessment efforts and how to structure technical assistance to achieve those outcomes. Additionally, the Network began developing a related CoP focused on root cause analysis, an area cited by Implementing Needs Assessment CoP participants for follow-up support, since many states consider this component to be the last step in their comprehensive needs assessment process.

22 Cuiccio, C., and Husby-Slater, M. 2018. Needs Assessment Guidebook: Supporting the Development of District and School Needs Assessments. Washington, DC: State Support Network. https://statesupportnetwork.ed.gov/system/files/needsassessmentguidebook-508_003.pdf.

P2P: Equity Lab Series

Improving Access to Excellent Educators: Designing and Implementing Equity Labs to Support District Equity Plans

The Scenario

SEAs are charged with developing educator equity plans to improve access to excellent educators and with supporting LEAs in implementing those plans.23 Equity labs are one strategy that SEAs can use to support LEAs in developing local equity plans. The Equitable Access Support Network defined equity labs as “state-led convenings of district leaders and stakeholders designed to give SEA staff the opportunity to share the purpose of state equity plans, collect feedback on state-level strategies, facilitate district-level equity planning and provide districts access to critical friends and a network of colleagues for planning and implementation.”24 At the time this P2P exchange series was proposed, a few states (e.g., Connecticut, Mississippi, Missouri, and Ohio) had hosted equity labs, but several others expressed interest in learning from peers and subject matter experts how to approach the design and implementation of an equity lab.

23 Equitable Access Support Network. 2011. Equity Labs: Engaging LEAs in Ensuring Equitable Access to Excellent Educators. Washington, DC: Author. https://www2.ed.gov/about/offices/list/oese/oss/technicalassistance/easnequitylabsengagingleaswebinar.pdf
24 Ibid.

The Network’s Support

Nine states participated in a series of virtual P2P exchange sessions on the topic of equity labs between October 2017 and January 2018. Six states (Alaska, Arkansas, the District of Columbia, Georgia, Kentucky, and Maryland) expressed an interest in designing and implementing equity labs for the first time. Three states (Mississippi, Missouri, and Ohio) had previously conducted equity labs and were invited to share their experiences and lessons learned. The P2P sessions were hosted by the Network and involved a meaningful partnership with the Center on Great Teachers and Leaders (GTL Center) and the Council of Chief State School Officers (CCSSO). This series built on the efforts of the GTL Center and the Equitable Access Support Network to support educator equity labs by grounding session discussion and activities in Connecting the Dots: A Toolkit for Designing and Leading Equity Labs (The Toolkit),25 developed by the Equitable Access Support Network.

Prior to the launch of the P2P series, the Network and OESE developed a set of objectives for states participating in the P2P exchange. The objectives were that participating states would do the following:

1. Apply the nine steps, guidance, and materials in The Toolkit to the local SEA context.

2. Use the guidance and materials in The Toolkit to:

• Articulate objectives for educator equity labs that align with the existing educator equity plan in the ESSA consolidated plan;
• Identify available resources and partnerships to supplement internal capacity to support educator equity labs and data analysis at the local level;
• Build a work plan with tasks, assignments, and timelines, including opportunities to train presenters and facilitators;
• Design agendas, activities, and materials for an educator equity lab;
• Define intended goals, outcomes, products, measures, and implementation benchmarks;
• Anticipate key logistical considerations;
• Create a communication and engagement strategy; and
• Identify next steps for ongoing improvement of educator equity labs.

3. Identify strategies to address common challenges or barriers to planning educator equity labs.

To meet these objectives, the Network assembled a project team consisting of representation from the GTL Center and CCSSO to serve as subject matter experts on equity labs and local equity planning, and a facilitation team from the Network. Additionally, participating states were placed in one of two cohorts so that the Network could differentiate supports specific to state needs. Both cohorts met as a group for the initial and final sessions of the series, which took place in October 2017 and January 2018, respectively. States participated in cohort-specific meetings for the second and third sessions, which took place in November and December 2017. To maximize the productivity of the virtual sessions, states were asked to respond to reflection questions on The Toolkit prior to each session. Sessions were also guided by a set of discussion questions that were tied to the overall project objectives.

25 Equitable Access Support Network. 2017. Connecting the Dots: A Toolkit for Designing and Leading Equity Labs. Washington, DC: Author. https://www2.ed.gov/about/offices/list/oese/oss/technicalassistance/easnequitylabtoolkit.pdf.

Results

The Network performance management team conducted a follow-up survey at the close of the project and, several months after the project ended, interviews with the states that participated in this P2P series. Eleven participants responded to the survey and 10 participants provided feedback through interviews. For both the survey and the interviews, all but two states that participated in the P2P exchange were represented in the feedback.

The follow-up survey provided participant feedback on key indicators mapped to the evaluation domains of quality, relevance, application and impact, and project objectives. Table 12 displays the survey results, condensed into these four domains.

Table 12. Survey Results From P2P Equity Lab Series

Quality: 82% agree or strongly agree (18/22); average rating 3.14
• The knowledge and skills of the presenters were appropriate for the goals of the technical assistance.
• I am satisfied with the overall quality of the technical assistance experience.

Relevance: 81% agree or strongly agree (17/21); average rating 3.10
• The technical assistance provided meets the specific needs of my project, office, or agency.
• The information and resources provided are appropriate for my level of experience and knowledge.

Application and Impact: 73% agree or strongly agree (8/11); average rating 3.09
• I will share the knowledge and skills I learned in the technical assistance with others.

Project Objectives: 77% agree or strongly agree (66/86); average rating 2.91
Please indicate your agreement with the following statements about the objectives of the P2P. As a result of the P2P, participants can:
a. Articulate goals and objectives for educator equity labs that align with our existing state equity plan under ESSA: 82% (9/11); 3.09
b. Identify available internal and external resources and partnerships to supplement internal capacity to support educator equity lab planning and data analysis at the local level: 73% (8/11); 3.00
c. Identify measures of impact and data collection plans that will support measuring progress toward meeting the stated goals and objectives of local equity labs: 64% (7/11); 2.73
d. Develop and strengthen a stakeholder engagement and communication plan, including drafting and disseminating key messages that describe the importance of the equity lab work: 73% (8/11); 2.91
e. Establish or strengthen a high-level agenda(s) for facilitating local equity labs in our state, including identifying key session topics, session formats or protocols, and the materials required to facilitate each session: 82% (9/11); 3.27
f. Engage in early implementation planning considerations, including work plan development and project management considerations: 80% (8/10); 3.10
g. Strengthen approaches or artifacts related to equity lab planning (e.g., agendas, outreach materials, facilitation protocols) through peer consultancy feedback: 73% (8/11); 3.09
h. Identify additional technical assistance supports available for equity lab design and implementation: 90% (9/10); 3.00

Note. The “Percentage agree or strongly agree” values were derived by dividing the number of Agree and Strongly agree responses by the total number of responses for survey items. “Average rating” was calculated by assigning numeric values to each Likert response (1 = Strongly disagree, 2 = Disagree, 2.5 = Neither agree nor disagree, 3 = Agree, 4 = Strongly agree) and then averaging those values.

Survey results indicate that the percentage of participants who agreed or strongly agreed with the survey items for each evaluation domain was above 72 percent. With regard to objectives, the majority of P2P participants who responded to the survey felt that the project objectives were met. Additionally, the average rating of all Likert responses (strongly disagree, disagree, neither agree nor disagree,26 agree, and strongly agree) for each evaluation domain was above 2.8.

26 This response option was only offered for select surveys.

Interview data provided a similar picture to the survey results. Interviewees offered the most positive feedback in their responses to questions about the quality of the P2P. For example, seven out of 10 interviewees indicated that the content was based on research and best practices, and that the presenters were knowledgeable about the content. Additionally, three out of 10 interviewees observed that the P2P was well paced and organized. As one participant noted, “I thought [the presenters] were very innovative, the way we were working together and sharing ideas was very unique and something new for myself. It was something I could take back and use with some of the other work I was doing with the states. It was a great way to allow us to share effectively without leaving our offices and spend[ing] travel money. We were able to share effectively through the site we were using.” However, other interviewees reported challenges with competing demands and scheduling conflicts that limited their ability to engage in the technical assistance.

Interviewees also offered mixed feedback with regard to the relevance of the P2P to their state needs and priorities. Some interviewees felt that the work was less relevant for them because of their experience with equity lab work prior to the P2P. As one participant commented, “We were fine with the equity labs we were doing. It was only as we did them we realized that we didn’t have the capacity to do [them to] get anywhere significant.… We need to do it much more at scale.… I would hope that at some point there’s a next generation version of [the P2P that enables us to explore this issue].” Other interviewees realized, through their experience with the P2P, that they were not ready to pilot equity labs in their states: “Ultimately, we decided not to run equity labs; it just wasn’t the right time for us. That doesn’t mean the [P2P] didn’t meet us where we were, we just needed to explore that and make that decision.”

Nevertheless, a number of interviewees described ways in which they would apply knowledge and skills gained from the P2P to begin equity lab work or to enhance equity labs already in progress. The following quotes are examples:

• “We felt affirmed with some of the things we decided to do. So, I feel like not only did we meet objectives, but it moved us forward. It helped us feel confident and prepared going forward.”

• “We changed some of our labs due to what we were experiencing. It helped us reflect on what we had already done. The other piece we took away from this was thinking through the lens of evaluating our lab effectiveness.”

Application of Lessons Learned

This P2P exchange series was designed to provide a launch pad for several states to begin developing equity labs as a strategy for engaging LEAs in developing local educator equity plans. For example, four states submitted requests for follow-up support through the GTL Center immediately after the work ended. Additionally, two states offered to provide support to another state at the conclusion of the P2P.

In the months after the formal evaluation was conducted through the follow-up survey and interviews, the Network contacted P2P participants by e-mail to see whether the states had made progress in planning for or implementing equity labs. Through this correspondence, the Network learned that five of nine participating states had conducted equity labs; three of these states had hosted their first lab and two had hosted additional labs. One state that hosted an equity lab for the first time commented: “Thanks so much to you and your team for a very helpful and thought-provoking partnership. Our team’s participation with this group definitely had a positive impact on the work we were able to accomplish over the past year related to equity.” Three states also expressed interest in future support on equity labs. The Network will continue to monitor state needs relative to equity labs, as well as state efforts to enhance existing labs or launch new equity lab initiatives, and will develop technical assistance opportunities accordingly to support states in this work.

Conclusion

From May 2016 through May 2018, which represents the halfway point of the Network’s contract with OESE, the Network completed 72 technical assistance projects: eight CoPs, eight P2Ps, and 32 ITAs; 13 tools and eight resources; and three state convenings. Of these projects, eight CoPs, seven P2Ps, and five ITAs were evaluated through post-event surveys, the feedback from which was a central focus of this report. The report also drew upon interview data from three CoPs, two P2Ps, and three ITAs. Feedback on tools and resources and on large state convenings was not discussed in this report. The results from post-event surveys and follow-up interviews measured the quality, relevance, application and impact, and extent to which Network technical assistance met stated objectives.

Key Findings and Next Steps

A review of the data indicates that the Network is:

• Providing consistent quality. During the 2016–2018 timeframe, Network technical assistance was consistently perceived by participants to be of high quality. Across all task areas and date ranges, more than 90 percent of participants who completed evaluation surveys either agreed or strongly agreed with the criteria used to evaluate quality. Additionally, the Network consistently received an average quality rating above 3.2 out of 4 points from survey respondents. Interview participants noted the Network’s use of evidence-based practices to inform technical assistance as well as the knowledge and skills of presenters who led the work. Considering these evaluation results, the Network will continue to build on this foundation of quality to make the best use of the time that participants dedicate to technical assistance amid competing demands.

• Differentiating supports to increase relevance. The Network also received high ratings for the relevance of its technical assistance between 2016 and 2018. Approximately 90 percent of participants for each task area and date range who completed evaluation surveys either agreed or strongly agreed with the criteria used to evaluate relevance (90 percent for technical assistance overall, 90 percent for CoPs, 89 percent for P2Ps, and 89 percent for ITAs). Additionally, the Network consistently received an average relevance rating above 3 out of 4 points from survey respondents. However, P2P exchanges (the single task area that had survey data for both 2016–2017 and 2017–2018) experienced a drop in the two survey metrics between 2016–2017 and 2017–2018 (14 percentage points for the “Percent agree and strongly agree” metric and 0.65 points for the average rating). Participant interviews offered a potential explanation for this trend. While interviews suggest that technical assistance met the specific interests of most participants, in some instances (especially collective technical assistance), participants expressed a need for more differentiated support. These evaluation results suggest that the Network should continue to hone its strategies for differentiating technical assistance support to appropriately match participant needs and readiness (for example, through the co-creation of project objectives with technical assistance participants).

Page 43: Technical Assistance Summary Report February 2019 · Technical Assistance Summary Report. FEBRUARY 2019. This document was produced by American Institutes for Research under U.S

33

State Support Network—Technical Assistance Summary Report

• Increasing evidence of application and impact. The Network demonstrated some improvement in the application and impact domain between 2016 and 2018. For example, the percentage of participants responding to surveys who either agreed or strongly agreed with the criteria used to evaluate application and impact increased from 83 percent to 91 percent for technical assistance overall during this timeframe. Among the individual task areas, ITA most closely reflected this trend, increasing from 82 percent to 100 percent on the same metric between 2016–2017 and 2017–2018. Participant interviews provided some examples of how knowledge and skills gained from technical assistance translated into changes in practice. However, the average application and impact rating for technical assistance overall and for P2P specifically decreased slightly on the surveys between 2016–2017 and 2017–2018 (0.07 points for technical assistance overall and 0.29 points for P2P). Given these evaluation results, the Network will continue to foster environments within its technical assistance that enable participants to reflect on connections between their work and the technical assistance experiences. The Network will also seek additional opportunities to collect evidence of the long-term impact of its technical assistance in the future.

• Enhancing the fit of project objectives. Evaluation data suggest that the Network has an opportunity for growth in meeting project objectives. Between 2016 and 2018, the percentage of participants responding to surveys who either agreed or strongly agreed that project objectives were met ranged from 78 percent to 86 percent (83 percent for technical assistance overall, 86 percent for CoPs, 82 percent for P2Ps, and 78 percent for ITAs). Average ratings of objectives met during the 2016–2018 timeframe ranged from 3.16 to 3.19 (3.19 for technical assistance overall, 3.19 for CoPs, 3.16 for P2Ps, and 3.16 for ITAs). Feedback from interviews offered some explanations for these ratings. For example, some states and districts that self-identified as more experienced indicated that objectives for technical assistance were less relevant to their current level of knowledge and experience with a topic. Other states and districts experienced changes within their own context that limited their capacity to engage in technical assistance. Considering these evaluation results (and building upon the relevance key findings of this report), the Network will consider differentiating objectives to match participants’ varying levels of readiness and/or adjusting objectives throughout the course of technical assistance to enhance their fit for all participants. The Network will also continue to support project teams in developing high-quality project objectives through resources such as a guidance document developed by the Network in 2017–2018 on writing objectives that are specific, measurable, achievable, relevant, and time-bound (i.e., SMART goals).

Appendix A. List of Technical Assistance Projects (Completed May 2016–May 2018)

Projects completed May 2016–May 2017

CoPs:
• District Strategic Planning and Resource Allocation
• Evidence-Based Practices
• ESSA State Planning
• English Language Proficiency

P2Ps:
• Comprehensive Needs Assessment Series
• Top 10 Strategic Goals
• Rural Professional Learning Network Series

ITAs:
• Maine SIG Support
• California Targeted Schools Work Group
• Maine Taking Stock
• New Jersey Needs Assessment Protocols
• Massachusetts CCSSO Critical Friends
• New Jersey CCSSO Critical Friends
• New York CCSSO Critical Friends
• Colorado CCSSO Critical Friends
• Georgia Root Cause Analysis Follow-up
• Indiana STEM Standards
• Michigan Strategic Partnership
• Massachusetts Equity Lab Planning
• Oklahoma CCSSO Critical Friends
• Rhode Island CCSSO Critical Friends
• Georgia Risk Assessment Tool
• South Dakota District Audit Needs Assessment
• New Hampshire CCSSO Critical Friends
• Nebraska CCSSO Critical Friends
• Arkansas CCSSO Critical Friends
• Idaho CCSSO Critical Friends
• South Dakota CCSSO Critical Friends
• Washington CCSSO Critical Friends

Projects completed May 2017–May 2018

CoPs:
• Implementing Needs Assessment
• Scaling Needs Assessment
• Differentiated Systems of Support for Rural Agencies
• Data Systems

P2Ps:
• ESSA Long Term Goals Series
• Massachusetts/California Evidence-Based Practices
• Scaling Up Educator Equity Labs
• Equity Lab Series
• Georgia/California Comprehensive Literacy Needs Assessment Planning

ITAs:
• Georgia Consolidated District Plan Review
• Needs Assessment 101
• DC Accountability Frameworks for Alternative Schools
• South Dakota Needs Assessment Rollout Support
• Massachusetts High School Turnaround
• Chinle, Arizona District and School Leadership Training
• South Dakota Needs Assessment Development for School Pilot
• El Dorado, California Countywide System of Support
• Results for America State Consultations
• California Literacy Needs Assessment


Appendix B. State Participation in Network Technical Assistance (May 2016–May 2018)

Note: This appendix includes projects that were active during the reporting period but may not have completed between May 2016 and May 2018.

• Alabama: 3 activities (2 CoPs, 1 P2P). CoPs: State Report Card; State Support for Identification and Improvement. P2P: Equity Lab Series.
• Alaska: 3 activities (2 CoPs, 1 P2P). CoPs: Differentiated Systems of Support for Rural Agencies; Implementing Evidence-Based Practices. P2P: Equity Lab Series.
• Arizona: 4 activities (2 CoPs, 1 P2P, 1 ITA). CoPs: English Language Proficiency; Implementing Needs Assessment. P2P: Comprehensive Needs Assessment Series. ITA: English Language Proficiency Consultation Call.
• Arkansas: 9 activities (5 CoPs, 1 P2P, 3 ITAs). CoPs: English Language Proficiency; State Report Card; School Quality and Student Success; Financial Transparency; State Support for Identification and Improvement. P2P: Equity Lab Series. ITAs: English Language Proficiency Consultation Call; CCSSO Critical Friends; AR Comprehensive Needs Assessment Support.
• Bureau of Indian Education: 3 activities (2 CoPs, 1 ITA). CoPs: State Report Card; Data Systems. ITA: Needs Assessment Training.
• California: 9 activities (4 CoPs, 2 P2Ps, 3 ITAs). CoPs: Scaling Needs Assessment; Evidence-Based Practices; English Language Proficiency; Implementing Evidence-Based Practices. P2Ps: MA/CA Evidence-Based Practices; GA/CA Comprehensive Literacy Needs Assessment Planning. ITAs: CA Literacy Needs Assessment; El Dorado, CA Countywide System of Support; CA Targeted Schools Work Group.
• Colorado: 2 activities (1 CoP, 1 ITA). CoP: Differentiated Systems of Support for Rural Agencies. ITA: CCSSO Critical Friends.
• Connecticut: no recorded Network TA activities.
• Delaware: 5 activities (3 CoPs, 1 P2P, 1 ITA). CoPs: English Language Proficiency; Implementing Needs Assessment; School Quality and Student Success. P2P: Comprehensive Needs Assessment Series. ITA: Results for America State Consultations.
• District of Columbia: 3 activities (1 CoP, 1 P2P, 1 ITA). CoP: Scaling Needs Assessment. P2P: Equity Lab Series. ITA: DC Accountability Frameworks for Alternative Schools.
• Florida: 3 activities (2 CoPs, 1 ITA). CoPs: State Report Card; Implementing Evidence-Based Practices. ITA: FL Needs Assessment Process.
• Georgia: 8 activities (2 CoPs, 3 P2Ps, 3 ITAs). CoPs: Implementing Needs Assessment; Implementing Evidence-Based Practices. P2Ps: Comprehensive Needs Assessment Series; Equity Lab Series; GA/CA Comprehensive Literacy Needs Assessment Planning. ITAs: GA Consolidated District Plan Review; GA Root Cause Analysis Follow-up; GA Risk Assessment Tool.
• Hawaii: 1 activity (1 ITA). ITA: ESSA Consultation Call.
• Idaho: 2 activities (1 CoP, 1 ITA). CoP: State Support for Identification and Improvement. ITA: CCSSO Critical Friends.
• Illinois: 4 activities (3 CoPs, 1 ITA). CoPs: English Language Proficiency; Scaling Needs Assessment; State Support for Identification and Improvement. ITA: ESSA Consultation Call.
• Indiana: 7 activities (5 CoPs, 2 ITAs). CoPs: Differentiated Systems of Support for Rural Agencies; ESSA State Planning; Scaling Needs Assessment; State Support for Identification and Improvement; English Language Proficiency. ITAs: IN STEM Standards; Needs Assessment 101.
• Iowa: no recorded Network TA activities.
• Kansas: no recorded Network TA activities.
• Kentucky: 5 activities (3 CoPs, 1 P2P, 1 ITA). CoPs: Differentiated Systems of Support for Rural Agencies; English Language Proficiency; Implementing Evidence-Based Practices. P2P: Equity Lab Series. ITA: English Language Proficiency Consultation Call.
• Louisiana: 3 activities (2 CoPs, 1 P2P). CoPs: Data Systems; English Language Proficiency. P2P: ESSA Long Term Goals Series.
• Maine: 5 activities (2 CoPs, 1 P2P, 2 ITAs). CoPs: State Support for Identification and Improvement; English Language Proficiency. P2P: ESSA Long Term Goals Series. ITAs: ME Taking Stock; ME SIG District Support.
• Maryland: 5 activities (3 CoPs, 2 P2Ps). CoPs: Scaling Needs Assessment; Evidence-Based Practices; State Support for Identification and Improvement. P2Ps: Comprehensive Needs Assessment Series; Equity Lab Series.
• Massachusetts: 7 activities (2 CoPs, 1 P2P, 4 ITAs). CoPs: English Language Proficiency; Evidence-Based Practices. P2P: MA/CA Evidence-Based Practices. ITAs: CCSSO Critical Friends; MA High School Turnaround; Results for America State Consultations; MA Equity Lab Planning.
• Michigan: 5 activities (2 CoPs, 1 P2P, 2 ITAs). CoPs: English Language Proficiency; Scaling Needs Assessment. P2P: Top 10 Strategic Goals. ITAs: English Language Proficiency Consultation Call; MI Strategic Partnership.
• Minnesota: 5 activities (3 CoPs, 2 ITAs). CoPs: Evidence-Based Practices; Scaling Needs Assessment; ESSA State Planning. ITAs: MN ESSA Planning and School Improvement Support; MN Evidence-Based Practices Support.
• Mississippi: 7 activities (5 CoPs, 1 P2P, 1 ITA). CoPs: Differentiated Systems of Support for Rural Agencies; Data Systems; Evidence-Based Practices; State Report Card; Implementing Evidence-Based Practices. P2P: Equity Lab Series. ITA: MS Data Quality Plan.
• Missouri: 5 activities (1 CoP, 3 P2Ps, 1 ITA). CoP: ESSA State Planning. P2Ps: Top 10 Strategic Goals; Equity Lab Series; Scaling Up Educator Equity Labs. ITA: Needs Assessment 101.
• Montana: 2 activities (2 CoPs). CoPs: Differentiated Systems of Support for Rural Agencies; Financial Transparency.
• Nebraska: 3 activities (1 CoP, 2 ITAs). CoP: State Report Card. ITAs: CCSSO Critical Friends; Needs Assessment 101.
• Nevada: 4 activities (3 CoPs, 1 ITA). CoPs: Financial Transparency; State Support for Identification and Improvement; State Report Card. ITA: Results for America State Consultations.
• New Hampshire: 4 activities (3 CoPs, 1 ITA). CoPs: Evidence-Based Practices; State Report Card; Implementing Evidence-Based Practices. ITA: CCSSO Critical Friends.
• New Jersey: 7 activities (3 CoPs, 1 P2P, 3 ITAs). CoPs: English Language Proficiency; Scaling Needs Assessment; Implementing Evidence-Based Practices. P2P: Comprehensive Needs Assessment Series. ITAs: English Language Proficiency Consultation Call; NJ Needs Assessment Protocols; CCSSO Critical Friends.
• New Mexico: 4 activities (3 CoPs, 1 ITA). CoPs: Financial Transparency; State Report Card; English Language Proficiency. ITA: Results for America State Consultations.
• New York: 2 activities (1 CoP, 1 ITA). CoP: State Support for Identification and Improvement. ITA: CCSSO Critical Friends.
• North Carolina: 2 activities (1 CoP, 1 ITA). CoP: English Language Proficiency. ITA: English Language Proficiency Consultation Call.
• North Dakota: 2 activities (1 CoP, 1 P2P). CoP: Financial Transparency. P2P: ESSA Long Term Goals Series.
• Ohio: 7 activities (5 CoPs, 2 P2Ps). CoPs: Implementing Needs Assessment; Differentiated Systems of Support for Rural Agencies; ESSA State Planning; School Quality and Student Success; State Support for Identification and Improvement. P2Ps: Equity Lab Series; Scaling Up Educator Equity Labs.
• Oklahoma: 6 activities (3 CoPs, 1 P2P, 2 ITAs). CoPs: School Quality and Student Success; Financial Transparency; State Report Card. P2P: ESSA Long Term Goals Series. ITAs: CCSSO Critical Friends; Results for America State Consultations.
• Oregon: 5 activities (3 CoPs, 1 P2P, 1 ITA). CoPs: English Language Proficiency; Data Systems; Scaling Needs Assessment. P2P: Comprehensive Needs Assessment Series. ITA: OR Financial Transparency and Resource Allocation.
• Pennsylvania: 1 activity (1 CoP). CoP: Evidence-Based Practices.
• Puerto Rico: 2 activities (1 CoP, 1 ITA). CoP: English Language Proficiency. ITA: English Language Proficiency Consultation Call.
• Rhode Island: 4 activities (1 CoP, 1 P2P, 2 ITAs). CoP: English Language Proficiency. P2P: ESSA Long Term Goals Series. ITAs: CCSSO Critical Friends; Results for America State Consultations.
• South Carolina: 4 activities (4 CoPs). CoPs: Data Systems; ESSA State Planning; Evidence-Based Practices; State Report Card.
• South Dakota: 6 activities (2 CoPs, 4 ITAs). CoPs: Implementing Needs Assessment; State Support for Identification and Improvement. ITAs: SD Needs Assessment Development for School Pilot; SD Needs Assessment Rollout Support; SD District Audit Needs Assessment; CCSSO Critical Friends.
• Tennessee: 3 activities (1 CoP, 2 ITAs). CoP: Differentiated Systems of Support for Rural Agencies. ITAs: English Language Proficiency Consultation Call; Results for America State Consultations.
• Texas: 2 activities (1 P2P, 1 ITA). P2P: Scaling Up Educator Equity Labs. ITA: Needs Assessment 101.
• Utah: 2 activities (1 CoP, 1 ITA). CoP: Implementing Evidence-Based Practices. ITA: Needs Assessment 101.
• Vermont: 2 activities (2 CoPs). CoPs: Evidence-Based Practices; Implementing Evidence-Based Practices.
• Virginia: no recorded Network TA activities.
• Washington: 2 activities (1 CoP, 1 ITA). CoP: School Quality and Student Success. ITA: CCSSO Critical Friends.
• West Virginia: no recorded Network TA activities.
• Wisconsin: no recorded Network TA activities.
• Wyoming: no recorded Network TA activities.

Appendix C. Overview of Performance Management System

The Network developed a performance management system to define and track performance metrics and to coordinate continuous improvement of Network technical assistance. The system includes the following components:

• Work plan and objectives development. Network and OESE task leads and project teams develop strong work plans and clear, measurable objectives for individual projects.

• Objectives tracking and continuous improvement. Network task leads and project teams participate in reflective conversations to assess their progress in meeting project objectives and to promote continuous improvement.

• Evaluation surveys. The Network performance management team administers evaluation surveys for projects and engages in conversations with task leads and project teams around using the results to promote continuous improvement.

• Interviews. The Network performance management team conducts follow-up interviews for select projects to assess the long-term impact of Network projects.

• Task team reflections. Network and OESE task leads engage in quarterly conversations to reflect on the quality and relevance of projects and the extent to which projects are meeting intended outcomes.

• Data reviews. Network and OESE leadership teams review evidence from objectives tracking, evaluation surveys, and other artifacts on a quarterly basis to assess the Network's performance on key indicators.

Figure C1 illustrates how the different activities of the performance management system fit together. Note that interview data collection is intentionally omitted from this graphic because it does not apply to all projects. A simplified, hypothetical illustration of the kind of roll-up used in data reviews follows the figure.

Figure C1. Typical Performance Management Cycle

[Cycle diagram showing five recurring activities: work plan and objectives development; objectives tracking and continuous improvement; evaluation surveys; task team reflections; and data reviews.]
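As a concrete illustration of the data review component, the sketch below shows one way evidence from objectives tracking and evaluation surveys could be summarized per project for a quarterly review. The project names, objective statuses, and ratings are hypothetical; the report does not specify the Network's actual data structures or tooling.

```python
# Hypothetical roll-up for a quarterly data review: for each project,
# summarize objective-tracking status and mean evaluation-survey rating.
# All names and values are invented for illustration only.
from statistics import mean

projects = {
    "Example CoP": {
        "objectives": ["met", "met", "in progress"],
        "survey_ratings": [3.4, 3.1],
    },
    "Example ITA": {
        "objectives": ["met"],
        "survey_ratings": [3.8],
    },
}

for name, data in projects.items():
    met = sum(1 for status in data["objectives"] if status == "met")
    print(f"{name}: {met}/{len(data['objectives'])} objectives met, "
          f"mean survey rating {mean(data['survey_ratings']):.2f}")
```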

Appendix D. Standard Evaluation Survey

U.S. Department of Education Technical Assistance Services Feedback

The U.S. Department of Education and the State Support Network are committed to providing quality technical assistance. Please take a few minutes to provide feedback about your experience to help us improve future technical assistance events and understand how they benefit district and school improvement. The valid OMB control number for this information collection is 1880-0542.

Title:
Location: (if appropriate)
Project Goal(s): (big picture)
Dates of Sessions: (participants check dates that they attended)

1. Please indicate your agreement with the following statements regarding this CoP. (Response scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree)
   a. I am satisfied with the overall quality of this technical assistance experience. (Quality)
   b. The technical assistance provided meets the specific needs of my project, office, or agency. (Relevance)
   c. The knowledge and skills of the presenters were appropriate for the goals of the [event]. (Quality)
   d. The materials and/or resources were of high quality and easily accessible. (Quality)
   e. I will share the knowledge and skills I learned in the [event] with others. (Application)

2. Please indicate your agreement with the following statements about the objectives of the [Event]. (Response scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree; space is provided for comments)
   As a result of the [Event], participants...:
   a. [Objective]
   b. [Objective]
   c. [Objective]
   d. Etc.

Open-Ended Questions:
1. What aspects of this event are most useful and relevant for your work, and why?
2. What suggestions do you have that would make future events more useful?

For CoP surveys only, one or more of the following questions may be asked:
1. As you are aware, there is an online space. It has tabs for discussion questions, resource sharing, and a calendar. Its purpose is to facilitate discussion, resource sharing, and interactions among CoP participants in between (monthly) meetings.
   i. Have you used the online space (yes/no)?
   ii. If yes, for what purposes (e.g., participating in a discussion, resource sharing, accessing webinar materials)?
2. What aspects of this CoP are most useful and relevant for your work and why?
3. How do you and your state team plan to apply the information from this CoP to your work?
4. Are there changes you would suggest for future similar events to make them more useful for participants?

An illustrative roll-up of the domain-tagged items in question 1 appears below.
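Because each scaled item in question 1 is tagged with an evaluation domain (Quality, Relevance, or Application), item-level responses can be rolled up into the domain-level results reported in the body of this report. The sketch below illustrates one such roll-up; the item-to-domain mapping follows the tags printed in the survey, while the response values are hypothetical.

```python
# Illustrative domain roll-up for question 1 of the standard survey.
# Item-to-domain tags come from the survey itself; response values
# are hypothetical.
from collections import defaultdict
from statistics import mean

ITEM_DOMAIN = {
    "1a": "Quality",      # overall quality of the experience
    "1b": "Relevance",    # meets needs of project, office, or agency
    "1c": "Quality",      # presenter knowledge and skills
    "1d": "Quality",      # materials and resources
    "1e": "Application",  # will share knowledge and skills with others
}

response = {"1a": 4, "1b": 3, "1c": 4, "1d": 3, "1e": 4}

by_domain = defaultdict(list)
for item, rating in response.items():
    by_domain[ITEM_DOMAIN[item]].append(rating)

for domain, ratings in sorted(by_domain.items()):
    print(f"{domain}: mean rating {mean(ratings):.2f}")
```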

Appendix E. Interview Protocol

U.S. Department of Education Technical Assistance (TA) Interview Protocol

This interview protocol can be used for either one-on-one interviews or small-group interviews with the following stakeholders, depending on state educational agency (SEA), local educational agency (LEA), and/or TA activity context: (1) TA recipients; (2) senior SEA/LEA staff not directly involved in TA; or (3) partners in providing TA to Network service recipients. Questions are organized to differentiate between prompts for different stakeholder groups.

Text in italics indicates language to be spoken during the interview, while standard text indicates language to guide the interviewer through the conversation. Highlighted text indicates language that will need to be customized by the interviewer prior to the interview.

Context

Hello, [name of interviewee]. My name is [interviewer's name], a member of the evaluation team for the State Support Network (which I'll sometimes refer to as "the Network"). The Network is a technical assistance initiative funded by the U.S. Department of Education's Office of State Support. Thank you for taking time today to share the experiences you and your colleagues have had with the Network. Your participation in this interview is voluntary. The results of the interview will be shared with the State Support Network and the Office of State Support to inform and improve future technical assistance efforts. I anticipate this conversation will last no more than 45 minutes. I will be taking notes as we talk and would also like to record our conversation to ensure accuracy. May I have your permission to record this conversation?

Introduction

(Q1) Confirm interviewee's affiliation with the Network.

TA recipients: First, you were identified as someone who has worked with the State Support Network, but can you confirm that you participated in [State Support Network TA activity], which took place on/around [date(s)]?

Senior SEA staff not directly involved in TA: First, can you confirm that you are familiar with the State Support Network and your state's involvement with some of its activities such as [State Support Network TA activities]?

TA partners: First, you were identified as someone who has worked with the State Support Network, but can you confirm that you contributed to [State Support Network TA activity], which took place on/around [date(s)]?

Notes/actions for interviewer (all interviewee types): If yes to Q1, skip to Q3. If no to Q1, move to Q2.

(Q2) In the event that the interviewee is not familiar with the State Support Network, seek to better understand their experiences working with other providers.

TA recipients: Thank you for clarifying that you are not familiar with the State Support Network through [State Support Network TA activity]. (Is there a point of contact that you recommend I reach out to?) If I may, though, can I ask one or two questions about your experience working with technical assistance providers?

Senior SEA staff not directly involved in TA: Thanks for clarifying that you are not familiar with the State Support Network's activities with your state. (Is there a different project-specific point of contact that you recommend I reach out to?) If I may, though, can I ask one or two questions about your experience working with technical assistance providers?

TA partners: Thanks for clarifying that you are not familiar with the State Support Network through [State Support Network TA activity]. (Is there a point of contact that you recommend I reach out to?) If I may, though, can I ask one or two questions about your experience working with technical assistance providers?

Notes/actions for interviewer (all interviewee types): If yes to Q2, skip to Q8. If no to Q2, politely end the interview, thanking [name of interviewee]. The interviewer should always try to identify a recommended point of contact.

(Q3) Gauge the interviewee's awareness/recall of the TA activity that is the focus of this interview.

TA recipients: In your own words, please describe the technical assistance activity that you participated in. How was the assistance helpful to you? For example, was there a specific activity or resource that impacted your work?

Senior SEA staff not directly involved in TA: In your own words, please describe the ways your state is being served through technical assistance provided by the State Support Network. For example, what assistance did the state receive? What specific activity or resource was especially helpful?

TA partners: With regard to [State Support Network TA activity], how have you interacted with the Network during this project? For example, do any specific activities, tools, and/or resources come to mind?

Notes/actions for interviewer (all interviewee types): The interviewer may wish to refer to a list of activities, tools, and/or resources associated with the TA activity/activities that are the focus of this interview. If the interviewee is grasping for a name, date, or descriptor, the interviewer may offer prompts or suggestions from the list to minimize burden.

Quality

I want to ask a couple of questions about quality. When my colleagues at the State Support Network think about quality, they are thinking about technical assistance with:

• content based on research and best practice
• highly knowledgeable subject matter experts and facilitators
• effective pace, organization, communication strategies, and follow-up support
• objectives met as evidenced by participant feedback and measurable outcomes

(Q4) Gather information about the quality of the technical assistance provided by the Network.

TA recipients: With those aspects of quality in mind, and thinking about [State Support Network TA activities], what is your general sense of the quality of the TA?
a. Did the content of [State Support Network TA activity] seem to draw on research and best practice?
b. Did the subject matter experts and facilitators seem highly knowledgeable about the content?
c. Were the pace, organization, communication strategies, and follow-up support of the TA effective?
d. One of the stated objectives of this particular project or activity was [INSERT SUCCINCT SUMMARY HERE]. In your estimation, was this objective met?
e. Another of the stated objectives of this particular project or activity was [INSERT SUCCINCT SUMMARY HERE]. In your estimation, was this objective met?
f. [Repeat for up to as many as three or four primary objectives.]

Senior SEA staff not directly involved in TA: With those aspects of quality in mind, and thinking about State Support Network TA activities, what is your general sense of the quality of the TA?
a. Was the TA designed such that it could serve the needs and objectives of your state?
b. Was the TA delivered such that it could serve the needs and objectives of your state?

TA partners: With those aspects of quality in mind, and thinking about [State Support Network TA activities], what is your general sense of the quality of the TA?
a. Did the content of [State Support Network TA activity] seem to draw on research and best practice?
b. Did the subject matter experts and facilitators seem highly knowledgeable about the content?
c. Were the pace, organization, communication strategies, and follow-up support of the TA effective?
d. One of the stated objectives of this particular project or activity was [INSERT SUCCINCT SUMMARY HERE]. In your estimation, was this objective met?
e. Another of the stated objectives of this particular project or activity was [INSERT SUCCINCT SUMMARY HERE]. In your estimation, was this objective met?
f. [Repeat for up to as many as three or four primary objectives.]

Notes/actions for interviewer (all interviewee types): If the interviewee offers a "yes/no" response to any question, probe for additional detail or examples as appropriate.

Relevance

Thank you. Next, I want to talk about relevance. When my colleagues at the State Support Network think about relevance, they are thinking about the extent to which:

• technical assistance is addressing state and district needs and priorities
• technical assistance has clear potential for direct application to state and district priorities
• technical assistance appropriately matches participants' knowledge and resources

(Q5) Gather information about the relevance of the TA activity that is the focus of this interview.

TA recipients: With those aspects of relevance in mind, and thinking about [State Support Network TA activities], what is your general sense of the relevance of the TA?
a. How did [State Support Network TA activities] address the needs and priorities of your context?
b. To what extent has [State Support Network TA activity] had direct application to the priorities of your context?
c. How was the content of [State Support Network TA activity] appropriate for your knowledge and resources?

Senior SEA staff not directly involved in TA: With those aspects of relevance in mind, and thinking about State Support Network TA activities, what is your general sense of the relevance of the TA?
a. How did Network TA activities address the needs and priorities of your state and/or district(s)?
b. How were Network TA activities directly applicable to the priorities of your state and/or district(s)?
c. How was the content of State Support Network TA appropriate for meeting the needs of those that participated from your state and/or district(s)?

TA partners: With those aspects of relevance in mind, and thinking about [State Support Network TA activity], what is your general sense of the relevance of the TA?
a. To what extent did [State Support Network TA activity] address the needs and priorities of the state(s) and/or district(s)?
b. How did [State Support Network TA activity] have direct application to the priorities of the state(s) and/or district(s)?
c. Was the content of [State Support Network TA activity] appropriate for participants' knowledge and resources?

Notes/actions for interviewer (all interviewee types): If the interviewee offers a "yes/no" response, probe for additional detail or examples as appropriate. For example: What aspects of the State Support Network's technical assistance have been most useful and relevant to your work, and why?

Application of information, tools, or resources

Next I want to talk about actual use or application of the information, tools, or resources that resulted from [State Support Network TA activity].

(Q6) Gather information about the application of information, tools, or resources from the TA activity that is the focus of this interview.

TA recipients: How have the information, tools, or resources presented in [State Support Network TA activity] been used and applied by individuals in your state or district?
a. One of the stated objectives of this particular project or activity was [INSERT SUCCINCT SUMMARY HERE]. What can you tell me about the extent to which this objective was met and has led to ongoing use or application of the information, tools, or resources featured in the Network TA?
b. [Repeat for up to as many as three or four primary objectives, limiting to TA objectives that are logically about ongoing use, application, or implementation of information, tools, or resources.]

Senior SEA staff not directly involved in TA: How have the information, tools, or resources presented in [State Support Network TA activity/activities] been used and applied by the state and/or district(s) in your context?

TA partners: Based on your knowledge and experience, how have the information, tools, or resources presented in [State Support Network TA activity/activities] been used and applied by the state(s) and/or district(s)?

Notes/actions for interviewer (all interviewee types): If "yes" to Q6, ask: What specific instances or examples come to mind? Are there other instances or examples? If "no" to Q6, ask: Why do you think there has not been use or application in ways that are easy to trace or identify?

Application and extension

A goal of the State Support Network is for its TA to provide opportunities for participating states and districts to move their own work forward, whether in writing draft language for a policy or plan, developing or piloting new tools and systems, or in other ways.

(Q7) Gather information about the application and extension of the TA activity that is the focus of this interview.

TA recipients:
a. How has participation in [State Support Network TA activity] informed state and/or district work in your context? Please cite specific examples.
b. How has participation in [State Support Network TA activity/activities] contributed to your state and/or district(s)'s capacity to meet ESEA goals? Please cite specific examples.
c. As a result of participation in [State Support Network TA activity/activities], how has your state and/or district(s) changed or improved practice? Please cite specific examples.
d. As a result of participating in [State Support Network TA activity/activities], has your state and/or district(s) developed or strengthened partnerships with other states, districts, or TA providers that may support your ongoing work in [reform area]? Please cite specific examples.

Senior SEA staff not directly involved in TA:
a. Have you observed any changes to state and/or district work in your context as a result of participation in [State Support Network TA activity]? If yes, please cite specific examples.
b. Has participation in [State Support Network TA activity/activities] enhanced your state and/or district(s)'s capacity to meet ESEA goals? If so, how?
c. As a result of participation in [State Support Network TA activity/activities], has your state and/or district(s) changed or improved practice? Please cite specific examples.

TA partners:
a. How have the state(s) and/or district(s) changed or improved practice as a result of participation in [State Support Network TA activity/activities]? Please cite specific examples.
b. How has participation in [State Support Network TA activity/activities] influenced your work providing assistance in this area? Please cite specific examples.
c. Has participation in [State Support Network TA activity/activities] resulted in continued engagement with participating state(s) and/or district(s)? If so, how?

Notes/actions for interviewer (all interviewee types): If the interviewee offers a "no" response to any portion of Q7, ask: Do you have thoughts about how the TA could have been structured or operated differently to allow this to happen?

Conclusion

Thank you for sharing your valuable insights about quality, relevance, use, and application of State Support Network activities and materials. Before we conclude, I have a few general questions about other technical assistance that you have received and what your technical assistance needs might be in the near future.

(Q8) Inquire about effective experiences with other technical assistance providers.

TA recipients: When you think of other technical assistance opportunities that your state has participated in, aside from Network TA (federally funded or otherwise), what made that assistance effective/impactful for you? For the state?

Senior SEA staff not directly involved in TA: When you think of other technical assistance organizations aside from the Network (federally funded or otherwise) that have supported your state and its school districts, what made that assistance effective/impactful?

TA partners: When you have collaborated with other technical assistance organizations aside from the Network (federally funded or otherwise), what has characterized an effective partnership?

(Q9) Inquire about challenging experiences with other technical assistance providers.

TA recipients: What types of technical assistance (either through the Network or other organizations) have not been particularly effective, if any, in your opinion? Why? How can the TA provided by the State Support Network be improved to better meet the needs of your state and/or district(s)? Please cite specific recommendations.

Senior SEA staff not directly involved in TA: What types of technical assistance (either through the Network or other organizations) have not been particularly effective, if any, in your opinion? Why? How can the TA provided by the State Support Network be improved to better meet the needs of your state and/or district(s)? Please cite specific recommendations.

TA partners: What types of technical assistance collaborations (either with the Network or other organizations) have not been particularly effective, if any, in your opinion? Why?

(Q10) Inquire about the need for future technical assistance.

TA recipients: What technical assistance do you and/or your state anticipate needing in the next 3 to 6 months? Do you and/or your state have an emerging need?

Senior SEA staff not directly involved in TA: What technical assistance will your state and/or district(s) need in the next 3 to 6 months? Does your state and/or district(s) have an emerging need?

TA partners: What technical assistance do you recommend the State Support Network consider undertaking in the next year?

Notes/actions for interviewer (all interviewee types): If the interviewee does not explain a response, probe for additional detail or examples as appropriate.

Wrap-up

Thank you again for your time. The information that you've shared with us will be very helpful in informing future TA offerings. We look forward to working with you again in the future on other State Support Network opportunities.

Page 65: Technical Assistance Summary Report February 2019 · Technical Assistance Summary Report. FEBRUARY 2019. This document was produced by American Institutes for Research under U.S

F-1

State Support Network—Technical Assistance Summary Report: Appendix F

Appendix F. List of Network PartnersNetwork Partners and Organizations Contributing Expertise to Network Technical Assistance The core Network partners are as follows:

� American Institutes for Research

� Battelle for Kids

� Pivot Learning Partners

� Synergy Enterprises, Inc.

The organizations that contribute subject matter expertise1 are as follows: � AEM Corporation

� American Bar Association

� California State University Northridge

� Center for Assessment

� Chiefs for Change

� Council of Chief State School Officers

� Corbett Education, Inc.

� Education Strategy Group

� Education First

� Education Trust

� Edunomics Lab at Georgetown University

� Education Resource Strategies

� ExcelinEd

1 In addition to the experts from the organizations listed, the Network engaged experts as independent consultants.

� Federal Education Group

� Kansas University

� Migration Policy Institute

� National Implementation Research Network

� The Opportunity Institute

� Public Impact

� RAND Corporation

� Results for America

� Southern Regional Education Board

� University of Arkansas

� WestEd

� WIDA Consortium at the Wisconsin Center for Education Research

The following organizations collaborate in technical assistance partnerships meeting meaningful partnership criteria2:

• Building State Capacity and Productivity Center
• Center on Great Teachers and Leaders
• Center on Innovations in Learning
• Center on School Turnaround
• Center on Standards and Assessment Implementation
• Central Regional Comprehensive Center
• College and Career Readiness and Success Center
• Council of Chief State School Officers
• Eastern Shore of Maryland Education Consortium
• Education Counsel
• ELPA 21
• Great Lakes Comprehensive Center
• HCM Strategists/Learning Heroes
• National Center for Education Statistics/Common Education Data Standards/State Longitudinal Data Systems Teams
• National Center for Systemic Improvement
• National Center on Educational Outcomes
• Northeast Comprehensive Center
• Northwest Rural Innovation and Student Engagement Network
• Partners for Education at Berea College
• Privacy Technical Assistance Center
• Regional Educational Laboratory (REL) Midwest
• REL Southeast
• REL West
• Results for America
• South Central Comprehensive Center
• West Comprehensive Center
• WIDA Consortium at the Wisconsin Center for Education Research

2 Some organizations appear on both the subject matter expert list and the collaborative partnerships list because they partnered with the Network in both capacities.

1000 Thomas Jefferson Street, NW
Washington, DC 20007-3835

[email protected]