SHIFTING OUR PERSPECTIVES OF THE TIME IT TAKES TO ENGAGE IN IMPROVEMENT WORK

Amber Humm Patnode | Proving Ground

Volume 6 Issue 3 (2024), pp. 21-27

This is the eleventh installment of Improving Improvement, our quarterly series focused on leveraging the power of research-practice partnerships (RPPs) to build schools’, districts’, and states’ capacity to improve. In our most recent article, we shared how we support districts to meaningfully connect with students and families to inform improvement work. In this installment, we are revisiting and reflecting on the most frequently reported barrier to improvement work: insufficient time. This is a topic we previously wrote about in a 2022 installment, where we discussed working with partners to identify and let go of inefficient or ineffective work to instead make space for cost-effective, vision-aligned, and impactful efforts. Here, we examine this barrier from a different perspective.

Within the context of a larger survey of our current and past partners focused on better understanding how their partnership with us has improved their capacity to engage in continuous improvement, we are in the process of more closely examining time as a barrier. We began using the survey in 2022, which you can read more about in this article – this data collection is part of our own continuous improvement work to ensure that we are supporting partners in the best possible way. The survey asked partners about the perceived value of the specific tools and activities used in the Proving Ground model as well as the overall value of the partnership in developing their continuous improvement skills. Preliminary data were consistent with prior responses, with over 90% of participants reporting that the Proving Ground process better equipped them to engage in improvement work. Participants named the thoroughness, deliberate focus, and support provided for the steps in the process as strengths, and the time required to engage in each step as the primary limitation or barrier. Upon seeing these data, we decided to collect additional information from four of our recent cohorts (which represent our most time-intensive service delivery model) to dig deeper into the question: How much time are partners actually spending on Proving Ground-related improvement work? This may get us closer to understanding how much time is too much time, and why.

    HOW PARTICIPATING SCHOOL DISTRICT LEADS SPEND THEIR TIME

      We asked a sample of district team leads in the four cohorts about the amount of time their team spent working on the steps and/or activities in the Proving Ground process between individual district and whole-group cohort sessions in the first partnership year (the first year being our most time-intensive year, with whole-group sessions occurring every 4-6 weeks). In addition to the district-reported time spent working between sessions, we calculated the amount of time each cohort spent in the whole-group sessions and the one-on-one district check-ins that occurred between each session. There was some variation between the cohorts in the number of whole-group sessions in the first year (six or seven sessions) as well as in the amount of time spent per whole-group session (six or seven hours of scheduled time). Below is a breakdown of time spent by districts in their first partnership year:

      • 36-49 hours spent in whole-group cohort sessions that took place every 4-6 weeks
      • 7 hours spent in one-on-one check-ins with the PG team between whole-group sessions to answer any questions the team encountered while working independently and to ensure they were ready to continue the work in the upcoming whole-group session
      • 21 hours spent working on PG pilot planning steps between whole-group sessions (the reported median and modal response was 3 hours spent between each session)

      When aggregated, the amount of time district teams spent on Proving Ground-related activities in the first year totals 64-77 hours – less than two full 40-hour work weeks. This represents 3.1-4.3% of available working (contract) time, which is often an underrepresentation of the hours actually worked by district and building administrators. This estimate is based on typical district and building administrator contracts, which are longer than teacher or itinerant staff contracts and best represent those participating on the district team working with Proving Ground.
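      To make the aggregation transparent, here is the arithmetic behind these figures (the contract lengths below are illustrative assumptions on our part; actual administrator contracts vary by district):

      • Total hours: 36 + 7 + 21 = 64 hours on the low end; 49 + 7 + 21 = 77 hours on the high end
      • A longer administrator contract of roughly 260 eight-hour days is about 2,080 hours, and 64 ÷ 2,080 ≈ 3.1%
      • A shorter contract of roughly 224 eight-hour days is about 1,790 hours, and 77 ÷ 1,790 ≈ 4.3%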

      On initial reflection, less than two weeks, or up to 4% of time, spent understanding and planning to address big problems of practice doesn't seem like a big ask. However, we wondered whether a key way to interpret the significance of this amount of time might be to ask, "In comparison to what?" Thus, we went back to some of our district partner leads [1] to verify the data, get their perspectives, and identify other common activities that school or district staff engage in over the course of a year, along with the corresponding amounts of time, as reference points. Here are some activities and associated lengths of time they shared with us:

      • 90+ hours are spent in district Cabinet meetings each year
      • Approximately 80 hours a year are spent preparing presentations for school board meetings
      • 30+ hours are spent in school faculty meetings annually (not including prep time or follow-up activities post-meeting)
      • 10 hours annually are spent on required Human Resources trainings

      The examples cited above largely represent mandatory events focused on conveying or receiving information about work, activities, or events that have been or will be completed, as well as related expectations. One district lead described these as "housekeeping" or transactional activities, as opposed to time spent in innovative and collaborative problem-solving to address adaptive problems that require novel solutions.

      I recently read Right Kind of Wrong: The Science of Failing Well and recognized that the steps in our model have many parallels with Amy Edmondson's concepts of taking "intelligent risks" and, as necessary, learning from "intelligent failure" when attempting to solve adaptive challenges that require novel solutions. According to Edmondson, taking intelligent risks or learning from intelligent failure may take longer than other decision-making processes due to information gathering, processing, and planning, and both are characterized by actions that are:

      • Aimed at achieving a goal
      • Situated in new territory where novel strategies are required
      • Driven by a hypothesis, informed by available information, that the strategy is likely to work
      • No bigger than they have to be to gain insights

      In addition to taking intelligent risks, Edmondson notes that we also have to prevent basic failures (those due to carelessness or ignorance) and, as often as possible, predict and mitigate complex failures (those due to multiple causes that alone would not have resulted in failure). The steps our partners take are aligned with this concept of taking intelligent risks and learning from intelligent failure while attempting to prevent basic failures through, for example, identifying key implementation activities and the resources, professional learning, capacity, and communication protocols necessary to implement well. We also guide our partners in predicting, scanning for, and mitigating complex failure through incorporating user-centered design strategies, engaging in communication planning, and developing implementation monitoring tools. Time spent on intelligent risk-taking and on planning to prevent and mitigate failure, as opposed to equivalent time spent in more transactional activities, seems less burdensome and more strategic, allowing for organizational learning. Going forward, this may be a helpful framing for us to share with our partners.

      UNDERSTANDING PARTICIPATING SCHOOL DISTRICT LEADS’ PERCEPTIONS OF TIME

      In addition to asking the district leads questions to better contextualize and understand time spent in PG activities and answer "in comparison to what?", we also asked about the extent to which they perceived the time spent as "too much" and their ideas about why others may feel it is too much of a time commitment.

      Relative to their perceptions of time, the leads we spoke with indicated that they personally did not feel it was too much, and that the structure and time requirements for PG did not differ from time spent with other RPPs or external consultants. However, they did offer insights as to why others may perceive it to be. Their responses clustered into three main themes: perceptions of alignment, logistics, and frenzy culture.

      Alignment. The perceptions of alignment were interesting in that multiple leads hypothesized that, across teams, participants may not view the improvement work they engage in with PG as directly aligned with work they are already doing, will be doing, or "should" be doing. The majority of districts applied to join a cohort partnering with us, and as part of the application, teams are asked to specify how the partnership will support district priorities, initiatives, and workstreams. This makes us wonder to what extent the perceived alignment referenced in the district's application is being communicated to all team members. If team members perceive that the time spent working on improvement activities with PG is in addition to, rather than in support of, important and/or mandatory work, it is not surprising that it may be viewed as overly burdensome or even a distraction.

      Logistics. Setting aside six hours of a day every four to six weeks is a logistical challenge for many team members. This is often complicated by the fact that, although they are occupied with the session, many participate from their respective offices and appear physically available/present, resulting in multiple disruptions throughout the day. In addition to unanticipated disruptions, there are often building or district-level meetings scheduled concurrently that staff members may feel compelled to attend due to proximity. The district leads shared that team members engage in varying levels of communication with building and district-level colleagues regarding their availability during session times, which creates competition for their time and attention and pressure to select a priority.

      Frenzy Culture. Of particular note, multiple leads identified that the deliberate pacing and steps in the PG improvement process are often at odds with the frenzied pace to which educators have become accustomed. Quotes from different leads exemplified the current reality from both a global perspective and specifically for building principals:

      “As education leaders we have become programmed to be fast and furious. Finding ways to slow down and think through things can be difficult.”

      “Given how much comes at a building principal and how many decisions they have to make in the span of a single day [to perform their job well], the shift to a more slow and deliberate process in the course of a year would be jarring.”

      Both of these quotes highlight that the speed of decision-making to which many education leaders have had to adapt is dramatically different from the PG process, where they are asked to consider multiple perspectives and data sources to formulate hypotheses and select strategies, as well as to extensively plan implementation in a way that will support intelligent risk-taking.

      CREATING MORE TIME: EXISTING PRACTICES TO CREATE EFFICIENCIES

       

      Based on partner feedback over the years at Proving Ground, we have evolved our practices to lessen time burdens and maximize working time for our partners by creating efficiencies where possible. These include:

      • Consulting each district’s academic calendar prior to the start of a cohort to identify the optimal session dates for the entire year and sending calendar invites to hold those dates prior to cohort launch
      • Encouraging district teams (who are already in the same locale) to gather in the same place during whole-group sessions to maximize their collaboration; because we (our Proving Ground staff) meet virtually with district teams due to time demands and cost, this has implications for facilitation and the overall group activities that we design
      • Encouraging teams, to the greatest extent possible, to find a location, offsite if necessary, to meet during whole-group and working sessions where they are least likely to be disrupted and can have focused time together
      • Creating large blocks of time (3-4 hours) during the whole-group sessions where teams have dedicated time to work collaboratively on the next step in their improvement process with a PG facilitator available to provide coaching support as needed
      • Scheduling their one-on-one check-in while they are in the whole-group session to avoid unnecessary email exchanges later
      • Offering the option to reduce the session length by one hour for cohorts as needed (which has the cumulative effect of one less day in the first year), while acknowledging that this will increase the amount of time they will need to spend outside of sessions to complete activities
      • Sending consistent reminders about upcoming sessions, recaps of next steps, and resources for locating and accessing all materials in a predictable and clear format

      CREATING MORE TIME: FUTURE POSSIBILITIES

       

      Beyond the steps we have taken to create efficiencies in our request for and use of partners’ time, we need to do more to address this real and perceived barrier. One wondering we have is how to better frame the work as an investment rather than an expenditure. The problems of practice our partners select to focus on are predominantly aligned with district and/or state priorities, many of which they specifically name in their partnership application. Suggesting that this is intensive and contextually relevant work on which they would spend precious amounts of time regardless of partnership with Proving Ground might be an important lens. In addition, these problems are predominantly adaptive challenges, where there is high uncertainty about the best way to address them, which necessitates experimentation and the process of intelligent risk-taking that Edmondson references. Perhaps, framed through the lens of experimentation and intelligent risk-taking and coupled with continued efforts on our part to reduce unnecessary time burdens, we can support districts in viewing the time they spend engaging in planful improvement efforts as investments in the bank of a desired future state rather than withdrawals.

      LOOKING AHEAD

       

      In the next installment, we will share findings from our most recent partner survey and our reflections on our own improvement work over time and future directions.

      [1] The author would like to express gratitude to the following individuals for their insights and thought partnership in the development of this article: Matthew Berkshire, Chastity Trumpower, and Carrie Conaway.

      Amber Humm Patnode is Acting Director of Proving Ground.

       

      Suggested citation: Humm Patnode, A. (2024). Improving Improvement: Shifting Our Perspectives of The Time it Takes to Engage in Improvement Work. NNERPP Extra, 6(3), 21-27. https://doi.org/10.25613/E3EZ-SD19

      NNERPP | EXTRA is a quarterly magazine produced by the National Network of Education Research-Practice Partnerships  |  nnerpp.rice.edu