ENSURING EVERY STUDENT IS KNOWN: HOW AN RPP’S RAPID RESPONSE STUDY PROVIDED QUICK ANSWERS TO GUIDE PROGRAM IMPLEMENTATION AT SCALE

Abbey Loehr | Metro Nashville Public Schools + Nashville Partnership for Educational Equity Research (PEER) | PRACTICE-SIDE
Marcy Singer-Gabella and Jessica Holter | Vanderbilt University + Nashville Partnership for Educational Equity Research (PEER) | RESEARCH-SIDE

Volume 6 Issue 2 (2024), pp. 2-9

OVERVIEW

The Research Artifact

By Maury Nation, Caroline Christopher, and Megan McCormick

Presentation of findings from the RPP’s rapid response study on the implementation of Metro Nashville Public Schools’ Navigator Program, delivered to district leadership

The RPP: Mission

The Nashville Partnership for Educational Equity Research (PEER) is a collaboration between Metro Nashville Public Schools (MNPS) and Vanderbilt University’s Peabody College of Education and Human Development. Founded in 2021, the partnership pursues equity-driven research to guide public education policy and practice, disrupt educational inequities, and enable all students to thrive.

The RPP: Launching Rapid Response Studies

A central space for this work is PEER’s cross-institutional working groups. These groups of leaders from the district and researchers from the university come together to co-construct research questions, design and conduct studies related to PEER’s research priority areas, and then turn learnings into recommendations for policy and practice.

While the working group structure has supported partners’ deep engagement in and shared ownership of the work, it is not designed to systematically answer questions on a relatively short timeline. To fill this gap, PEER launched a new strand of work in 2023 called “Rapid Response Studies,” which are quick-turnaround studies designed to provide evidence on pressing questions within six months or less. 

Potential rapid response studies are typically identified by district leaders during routine program review and strategic planning conversations. These topics and a proposed timeline are then brought to the PEER Partnership Planning Group (which includes representatives from the district and the university) for vetting and refinement. If the study is determined to be viable, district leadership chooses an MNPS sponsor for the project.

Once the project sponsor and desired deliverable timelines are set, PEER’s co-directors issue a Call for Proposals to the Peabody College community, host an informational session, and request 1-2 page proposals within two weeks. Submitted proposals are reviewed by PEER’s co-directors and a representative from the district’s Research Review Committee. If there are multiple strong proposals (to date, proposing teams have chosen to join forces rather than compete!), the co-directors provide a recommendation to district leadership regarding which should move forward.

The co-directors and PEER’s research operations manager facilitate communication and collaboration within the cross-institutional rapid response teams to finalize study design details and logistics, support data collection, and interpret preliminary findings as they become available. At the conclusion of the study, researchers share their findings and recommendations with relevant district leadership and project teams, who provide feedback and share initial plans for using study findings. MNPS’s PEER co-director informally monitors and supports ongoing use of study findings through their embedded role in the Research Assessment and Evaluation department and involvement in district strategic planning conversations.

WHY THIS WORK

PEER’s first rapid response study focused on the district’s “Navigator Program,” which grew out of a need to connect with students during the pandemic and was first piloted at scale across the district by MNPS in the 2020-21 school year. The Navigator Program aims to connect every student in the district to a caring adult who serves as their mentor, advocate, and advisor – their “navigator.” Navigators can be teachers, counselors, or other school staff members, and every school identifies a lead navigator to support program implementation and serve as a liaison with the district implementation team. The program is designed to strengthen student-adult relationships through students’ use of a computer-based weekly reflection platform and an in-person monthly check-in with their navigator. Navigator has become an important initiative for fully realizing MNPS’s commitment to creating belonging in schools and ensuring every student is known. The strong student-adult relationships built through these regular check-ins and communication points are theorized to be a key strategy for increasing school attendance, decreasing school discipline, and supporting the collaborative referral process that connects students to the supports they need.

In 2023, as PEER first launched the rapid response studies strand of work, the Navigator Program emerged as a priority area for district leaders who wanted to assess program implementation and understand what had been learned from the first few years. Because the Navigator Program began in the 2020-21 school year as a response to student needs during the pandemic, the program evolved over time: implementation, data tracking, and progress monitoring looked different each school year. Additionally, each year there were shifts in the district leadership who oversaw the program. By 2022-23, the district felt an urgent need to better understand implementation and examine impact, and they wanted to start with perspectives from the school-level implementers—the navigators. In response, PEER activated its first rapid response study, in which researchers interviewed current navigators in schools with varying levels of implementation quality, reported their findings back to district leaders, and helped district leaders establish a set of action steps to improve monitoring and supports for implementation in the upcoming school year.

WHAT THE WORK EXAMINES

The objective of this rapid response study was to identify factors influencing the implementation of the Navigator Program and opportunities to improve implementation. The specific research questions emerged through a dialogue among the district research team, the district project sponsor, the PEER co-directors, and the university researchers. Given the initial broad charge of helping the district assess implementation, the researchers generated an outline of potential strands of research they could pursue to examine implementation and presented the options to the district team and PEER co-directors. After collaboratively considering the affordances and constraints of each approach, the district team members decided that conducting interviews with navigators about their experiences with implementation would be most helpful as a first step.

The following primary questions were identified:

  • What are navigators doing to implement the program in their school and with their students? 
  • What are the barriers and supports that are affecting the implementation of the Navigator Program?
  • How do the experiences of navigators differ across high and low implementation schools? 
  • What are navigators’ suggestions or recommendations for improving the Navigator Program and program implementation? 

To carry out the analysis, the research team first needed to select a sample of schools. To do this, they created a weighted composite score for Navigator implementation level based on four indicators: student engagement rate (how often students participated in weekly reflections), teacher feedback rate (how often teachers responded to student entries), navigator contact attempts rate (how often navigators attempted to check in with students), and navigator successful contact rate (how often navigators successfully checked in with students). The team then identified the eligible school sample based on grade tier (elementary, middle, high), implementation level (low or high), and nominations from the district Navigator Program implementation team. From this eligible pool, the final sample included seven schools (three elementary, two middle, and two high). Lead navigators and navigators (both teacher and non-teacher) were recruited from each school.
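A composite of this kind can be sketched in a few lines of code. The sketch below is illustrative only: the four indicators come from the study description, but the weights and the high/low cutoff are hypothetical assumptions, since the article does not report the actual weighting.

```python
# Illustrative sketch of a weighted composite implementation score.
# The four indicators are from the study; the equal weights and the
# 0.5 cutoff are assumptions made for this example.

WEIGHTS = {
    "student_engagement_rate": 0.25,  # weekly reflection participation
    "teacher_feedback_rate": 0.25,    # teacher responses to student entries
    "contact_attempt_rate": 0.25,     # navigator check-in attempts
    "contact_success_rate": 0.25,     # successful navigator check-ins
}

def composite_score(indicators: dict[str, float]) -> float:
    """Weighted sum of the four implementation indicators (each in [0, 1])."""
    return sum(WEIGHTS[name] * indicators[name] for name in WEIGHTS)

def implementation_level(indicators: dict[str, float], cutoff: float = 0.5) -> str:
    """Classify a school as 'high' or 'low' implementation (cutoff is assumed)."""
    return "high" if composite_score(indicators) >= cutoff else "low"

# Hypothetical school-level rates for illustration
school = {
    "student_engagement_rate": 0.80,
    "teacher_feedback_rate": 0.60,
    "contact_attempt_rate": 0.90,
    "contact_success_rate": 0.70,
}
print(implementation_level(school))  # equal weights -> score 0.75 -> "high"
```

In practice the weights would be chosen (and the cutoff set) by the research team, and scores would be computed for every school before stratifying the eligible pool by grade tier and implementation level.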

The research team then conducted a thematic analysis of a series of one-hour interviews, most held virtually, with 12 navigators (nine teacher navigators, one non-teacher navigator, and two lead navigators), paying special attention to responses that differentiated interviews from low and high implementation schools (i.e., “difference makers”).

At the beginning of the study, PEER’s co-directors held daily stand-up meetings with the research team and the district project sponsor to flesh out the study design details and work out logistics, including gaining expedited research approval from the district, submitting an application to the university’s institutional review board, and preparing data to support the study. Once the study was approved, PEER’s co-directors met weekly with the research team and the MNPS project sponsor to check in on research status and help clear obstacles during the research process. Because of the tight timeline and limited capacity, the district’s Research, Assessment, and Evaluation team assisted with some of the interviews.

FINDINGS

Four main themes emerged from the interviews with the navigators: program successes, questions about the program purpose, concerns about time, and variation in how students were engaging with navigators. Key ideas under each theme are summarized below:

Program Successes

  • Strengthening relationships with students
  • Identifying student and family needs and connecting students to supports
  • Use of the monthly navigator check-ins and the computer-based weekly reflection platform as tools for implementation

Questions about the Program Purpose

  • Navigators wanted to know the “why,” as well as the program’s long-term goals and vision
      • Interviews revealed gaps in understanding about the program components (the weekly student reflection and monthly check-ins with navigators were seen as two different initiatives), which surprised the implementation team.

Concerns about Time

  • Competing priorities made it difficult to implement the Navigator Program effectively in the time given (especially at high implementation schools). Dedicating time for program activities within existing schedules helped.

Variation in How Students Engaged with Navigators

  • Some students engaged less with the weekly reflection platform and more with monthly navigator check-ins (and vice versa) for different reasons

Interviews revealed a number of aspects that facilitated program implementation, which the research team called “difference makers.” These included:

  • Effective dissemination of materials and ongoing training for navigators throughout the year
  • Buy-in from school-level administration
  • Having a flow chart of supports and knowing whom to contact and how to connect students to supports, including follow-up (closing the loop)
  • Having a clearly identified “go-to” person for all things Navigator; this was intended to be the lead navigator, but sometimes individuals looked to someone else to answer questions
  • Having a community of support: knowing what happened after students were connected to supports, getting help from fellow staff, and talking to other navigators about their work. For lead navigators, monthly meetings with leads from other schools were a difference maker

The researchers summarized their findings in a PowerPoint deck and presented the results to a group of district leaders involved in decision-making and implementation of the Navigator Program. The group discussed the results and began to identify action steps in response to the findings. Slides from the deck have since been integrated into ongoing program planning and formative evaluation presentations at the district and have contributed to a deeper understanding of both program implementation and what needs to change to improve it. Ultimately, the findings revealed a common challenge faced by large, urban districts – adapting and scaling effective practices across school contexts, grade levels, and the unique learning needs of individual students.

IMPACT AND USE OF THE WORK

Learnings from lower implementation schools inspired a deeper quantitative analysis of what was happening at the school level across implementation levels. This analysis revealed that many schools faced similar challenges, which led to several changes in program practice for the 2023-24 school year:

  • Combining the weekly and monthly check-in platforms into a single platform for ease of use and consolidation of implementation monitoring data
  • Identifying and designing key implementation metrics to monitor progress across multiple indicators of implementation
  • Starting bi-weekly district implementation team meetings with the platform team to review and monitor quantitative implementation data by school and follow up with targeted supports for schools

MNPS is still using learnings from the study to:

  • Strengthen messaging and coherence around the “why” of the program, including redefining its purpose as a Tier 1 support that every student receives through regular instruction
  • Refine expanded supports for all schools (provided at the beginning of the year and throughout the year), such as
      • Protected, individualized support blocks (one-on-ones) for navigators and navigator leads
      • An updated SharePoint with tutorials and handbooks for administrators, teachers, and students (including videos, written communication, and prompt guidance to support student-navigator conversations)
      • Continued monthly navigator leads meetings
      • Weekly communications to navigator leads
  • Refine the protocol for following up with low-implementation schools identified through monitoring of quantitative implementation data
  • Refine scheduling and the determination of who is best positioned to serve as a navigator (e.g., teachers vs. other school staff)

Ultimately, the initial rapid response analysis and findings presentation generated a snowball of new hypotheses to test, which then led to additional analyses that informed the creation of new measures for the upcoming school year. Reviewing findings from this rapid response study and additional analyses has helped build understanding at the district Cabinet level about key initiatives and how to problem solve for improvement.

A second rapid response study was completed in Fall 2023 and a third is currently underway, demonstrating a shared commitment from both the research and practice sides to build capacity for evidence-based decision-making that leads to changes in practice. Thinking nimbly about specific initiatives and getting research findings back quickly has energized the district. Pursuing responsive, just-in-time partnership research has increased opportunities to engage expertise on both sides of the partnership and inspired deeper understanding of district initiatives aimed at creating conditions for students to thrive, in turn building trust and relationships among all members of the partnership.

The process of engaging in rapid response studies has also informed how to improve the efficiency of working groups taking on big ideas over a longer timeframe. For example, identifying more specific problems of practice and creating urgency through research milestones that map onto key decision-making timelines in the district are two learnings reinforced by the rapid response study process.

OPEN QUESTIONS AND NEXT STEPS

The PEER team is currently pondering two main questions:

The first relates to sustainability. The rapid response studies have generated significant goodwill toward PEER – building awareness and a sense of value for the partnership among a wide swath of district stakeholders. However, rapid response studies are essentially evaluation studies, which for many RPPs are a key source of income. While PEER has chosen to provide these for free to the district, the studies have costs: faculty time (which is currently donated), as well as graduate student support and participant incentives (which are paid in real dollars). Currently, these costs are covered by seed funds from the university. How can PEER ensure the sustainability of this work once the initial funding runs out? One possible avenue the partnership is exploring is for the district to create a line item in its operating budget for rapid response studies.

The second question relates to measuring impact. The PEER team is interested in tracking ongoing learning and uptake of research findings from rapid response studies, as well as the ways in which rapid response studies lead to new lines of inquiry. How can PEER systematically track the impact of this strand of work?

PEER is excited to continue pursuing these questions and building out the rapid response studies strand of work to help the district address pressing problems and questions quickly.

This article was written by members of the Nashville Partnership for Educational Equity Research (PEER) team: Abbey Loehr is Co-Director of PEER and Manager of Research-Practice Partnerships at Metro Nashville Public Schools; Marcy Singer-Gabella is Co-Director of PEER and professor in the Department of Teaching and Learning at Vanderbilt University; and Jessica Holter is PEER’s Director of Communications.

Suggested citation: Loehr, A., Singer-Gabella, M., & Holter, J. (2024). Ensuring Every Student is Known: How An RPP’s Rapid Response Study Provided Quick Answers to Guide Program Implementation at Scale. NNERPP Extra, 6(2), 2-9.

NNERPP | EXTRA is a quarterly magazine produced by the National Network of Education Research-Practice Partnerships  |  nnerpp.rice.edu