REFLECTING ON PROGRESS IN SUPPORTING CONTINUOUS IMPROVEMENT WORK

Amber Humm Patnode | Proving Ground

Volume 6 Issue 4 (2024), pp. 22-27

This is the twelfth and final installment of Improving Improvement, our quarterly series focused on leveraging the power of research-practice partnerships (RPPs) to build schools’, districts’, and states’ capacity to improve. In our previous article, we shared how our partners perceive the amount of time they spend with us on improvement work, along with ideas for shifting those perceptions toward viewing that time as an investment in the bank of a desired future state rather than a withdrawal. In this final installment, we reflect on the progress we’ve made in building our partners’ capacity to engage in rigorous improvement efforts.

    OUR VISION FOR SUCCESS IN SUPPORTING EVIDENCE-BASED CONTINUOUS IMPROVEMENT

      Each summer our team takes a step back and reflects on the progress of each of our cohorts in the prior year as well as the effectiveness of the Proving Ground model overall using data from 2021 to present. This informs what components, processes, or tools we will retain and what modifications we will test with current and future partners so we can be as helpful as possible in supporting the districts and states we work with in tackling fundamental education challenges through continuous improvement. 

      At Proving Ground, our vision for success is this: Our partners will make evidence-based continuous improvement, which includes piloting and evaluating interventions, part of the ordinary course of business in their agencies. (You can read more about this in a previous installment.) We conceptualized monitoring progress towards this vision in three phases:

      • Phase 1: Do our partners execute a high-quality improvement cycle while working with us? How well did they execute each competency, and to what degree was the resulting decision outcome-optimizing?
      • Phase 2: Based on their work with us, how confident are we that they are able to do this without us – can they generalize from the model we worked on together to other problems of practice? Additionally, how confident are we that they are willing to do this without us – have we created the internal demand?
      • Phase 3: Are they doing this after our engagement ends? How well and for how many of their strategic priorities?

      We use a variety of data sources to determine progress in these three phases, including: 

      • Whether each district has completed a high-quality improvement cycle while working with us (using data from 2021 to present)
      • Post-session evaluations completed by our district participants that are plotted in aggregate and across time and cohorts (using data from 2021 to present)
      • Interviews with partners after their work with us ends on what did and did not work for them in the partnership or process, as well as on suggestions for improvement they have for us (using data from 2023)
      • Bi-annual surveys with district partners on their perceptions of the process and model (using data from 2022 and 2024)
      • Self-assessments, administered before partners start their two- to three-year work with us and then again after they conclude this work, and measuring partners’ perceived utilization and skill of nine continuous improvement competencies [1] aligned to the Proving Ground model (using data from 2021 and 2023 from our cohort of eight districts)

      EXAMINING PROGRESS IN INSTITUTIONALIZING CONTINUOUS IMPROVEMENT

      Phase 1: Do our partners execute a high-quality improvement cycle while working with us? 

      Since the 2020-2021 school year, out of the 33 school districts that have partnered with us, only 6% (2 districts) have not been able to complete an improvement cycle. For one of the districts, this was due to significant senior leadership changes within the district; for the other, a lack of accessible data delayed the process, compounded by district leadership staff turnover. An additional 12% (4 districts) were not able to complete their intended improvement cycle within a given partnership year but were able to complete an improvement cycle within the two or three years of partnership. For almost all of these districts, the reason for the delay was leadership changes, either at implementation sites or district-wide. All of the partners that completed an improvement cycle were able to leverage the resulting information and data to guide decision-making about the continued use of implemented strategies.

      Phase 1: How well did our partners execute each competency?

      A large component of the Proving Ground process is direct coaching and support for districts to execute the nine continuous improvement competencies at a high level. Among the districts that completed improvement cycles, all effectively identified a problem of practice based on data, set an improvement goal, determined root causes, and identified and designed a root cause-aligned intervention (competencies 1 through 5). However, based on observation and partner reflection and feedback, districts had varying levels of success in planning for and monitoring implementation, developing and adhering to a pilot plan to generate impact evidence, and using the resulting evidence to make decisions about continued intervention usage (competencies 6 through 9).

      In 2022, one of our biggest reflections from engaging in coaching sessions and district feedback was that we needed to provide more intensive support in these areas. As such, in our most recent cohorts, we guided districts through more detailed implementation planning and piloting processes and check-ins to discuss completion of action items. We added a session focused on implementation monitoring where teams developed an implementation monitoring plan outlining what data they were going to collect to measure implementation, when, how, and by whom it would be collected, and when the team would review the implementation data to determine any implementation challenges and corresponding supports. In the 2022 partner survey, 75% of respondents indicated that they agreed or strongly agreed that the PG process helped them better implement solutions, making this the lowest endorsed survey item. Following the specific changes we made to better support implementation, 94% of respondents agreed or strongly agreed with this item in the 2024 administration — making it our second-highest endorsed item. In addition, we added a new item to the 2024 survey asking partners about PG’s support in designing a pilot to test solutions, and 88% of respondents agreed or strongly agreed that the PG process helped them better design a pilot. 

      Phase 2: How confident are we that our partners are able to do this work without us and how confident are they that they can do the work without us?

      When we first started evaluating progress for this phase, we quickly realized that our partners’ confidence in their ability to do the work without us was a better indicator of continued usage than our predictions. In 2024, 94% of respondents agreed or strongly agreed that the PG process better equipped them to personally engage in continuous improvement efforts in the future, and 82% agreed or strongly agreed that the PG process helped improve the continuous improvement efforts of their overall team. 

      When asked in 2024, 84% of our partners indicated they are likely or very likely to use the PG process/tools to address other challenges, up from 78% in the 2022 administration. In addition, in the 2024 administration of the survey, 80% or more of respondents indicated they would use at least 10 of our tools in the future, up from 4 tools in the 2022 administration.

      Following our 2022 survey administration, we revised or eliminated the tools that received the lowest rating for likelihood of future use. We are in the process of reviewing the tools that were lower rated in both the 2022 and 2024 administrations to determine ways to make them more user-friendly. 

      Phase 3: Are our partners doing this after our engagement ends? How well and for how many of their strategic priorities? 

      We asked our partners if they have used the PG process for other challenges after their engagement with us ended and 51% of respondents in the 2024 survey administration indicated they have, which is an increase from 34% in 2022. Respondents indicated they have used the process to address the following topics: discipline, school improvement efforts, academic improvement, graduation rates, acceleration pathways, equitable access to courses, math challenges, attendance, and curriculum implementation.

      Our first cohort of eight districts completed both the pre- and the post-continuous improvement self-assessment, in which they are asked to rate themselves on how well (quality) and to what extent (frequency) they are engaging in nine continuous improvement competencies. Each district submitted a single self-assessment intended to represent the perspective of the team. In all but one of the districts, the same individual submitted the pre and post ratings. The pre-assessment was submitted before any partnership activities began in 2021, and the post-assessment was submitted at the end of the contract period in 2023.

      In terms of the quality of engaging in the nine competencies, districts’ self-assessments increased between 19 and 66 percentage points from their pre- to post-self-assessments, suggesting significant growth in the two years of engaging with us. The area of greatest growth was in competency 8, using data from the pilot to determine whether to stop, adapt, or scale up an intervention (a 66 percentage point improvement), followed by a 52 percentage point improvement in competency 9, incorporating pilot results in a broader reflection on progress towards strategic priority areas. In terms of the frequency of engaging in the nine competencies, there was a 13-50 percentage point increase from districts’ pre- to post-self-assessments. The area with the greatest reported increase in frequency was competency 1, developing problem statements for all problems of practice aligned to organizational goals.

      We recognize that this is a small sample of districts completing the self-assessment and that self-report carries inherent biases. We look forward to two additional cohorts completing the post self-assessment at the end of this school year to see whether patterns emerge and what we can learn about the effectiveness of our support.

      REFLECTING ON PROGRESS


      We are incredibly proud of the progress we have made and the support offered to our partners over these last four years. Based on our established criteria, we have evidence that our partners are meeting Phase 1 (completing an improvement cycle with us) and Phase 2 (confidence that they can do the work independently) indicators of success. We have some evidence that some districts are meeting Phase 3 (continuing the work post-partnership). While there are definite positives reflected in the data shared, we continue to reflect on opportunities for growth and how we can ensure that districts are not only equipped, but also motivated, to continue engaging in high-quality improvement cycles in the future.

      LOOKING AHEAD


      It has been our great pleasure to share our lessons learned and reflect on our progress since 2020. Please feel free to reach out to us with questions or to continue the conversation. 

      [1] The nine Proving Ground continuous improvement competencies are:

      1. Clearly define the problem and set an improvement goal for it
      2. Identify root causes
      3. Identify a set of potential interventions aligned to the root causes
      4. Prioritize a potential intervention from that set to try
      5. Design the intervention using user-centered design principles
      6. Plan for implementation and progress monitoring
      7. Pilot to generate evidence of impact
      8. Use evidence from the pilot to decide whether to stop, scale or adapt
      9. Reflect on the results in light of your improvement goal

      Amber Humm Patnode is Acting Director of Proving Ground.


      Suggested citation: Humm Patnode, A. (2024). Improving Improvement: Reflecting on Progress in Supporting Continuous Improvement Work. NNERPP Extra, 6(4), 22-27. https://doi.org/10.25613/XNGD-XN73 

      NNERPP | EXTRA is a quarterly magazine produced by the National Network of Education Research-Practice Partnerships  |  nnerpp.rice.edu