ATLAS Looking at Data Protocol
Based on the ATLAS Looking at Data protocol, this tool encourages team members to describe what they see in the data, make inferences, and share implications for future work.
Use the Plan-Do-Study-Act cycle to refine new strategies.
In addition to driving overall schoolwide SEL implementation, PDSA improvement cycles also provide a powerful structure for quickly testing specific innovative strategies to support SEL. This kind of rapid prototyping and testing allows teams to frequently and efficiently reflect on whether they’re moving in the right direction so they can act quickly and adjust course when necessary (Bryk et al., 2015).
These PDSA cycles test potential solutions by focusing on three essential questions:
What are we trying to accomplish?
What change or innovation might we introduce and why?
How will we know whether the change is actually an improvement?
Learn more about using each step of the PDSA cycle to address these questions below:
Before deciding what innovation to test, an SEL team begins the PDSA cycle by identifying an aim and a problem of practice.
An aim is a goal that answers the question “What are we trying to accomplish?” It may be directly related to an SEL implementation goal or linked to an important outcome that schoolwide data indicates needs improvement. For example, an aim might be to fully engage families in the schoolwide approach to SEL.
A problem of practice (PoP) is a key issue that the school hopes to improve in order to accomplish its aim. For example: families feel they do not know how to support SEL in their school.
After establishing an aim and determining problems of practice, a root-cause analysis can help school teams break down the problem into parts that can be feasibly addressed. In our example, one root cause might be insufficient communication with families about SEL.
From its work with the Carnegie Foundation, High Tech High’s GSE Center for Research on Equity and Innovation has assembled a library of tools and protocols for continuous improvement. These resources are freely available to use, adapt, and share. Some tools that school teams may use in the Plan phase to better understand their problem include:
During this phase of the PDSA cycle, the team also needs to plan carefully what data will help them assess whether they’ve met their aim and spot trends. For example, it may be helpful to collect descriptive data on families (age of children, number of children in school, language spoken at home, race/ethnicity, etc.) to examine who is more likely to engage in school events and any barriers to engagement.
After identifying data sources, the team determines what success would look like. For example, they may want to increase attendance at family events by 50% throughout the year and increase the number of positive responses from families to 80% on an end-of-year survey.
If the SEL team is looking to assess students’ SEL competencies for continuous improvement, CASEL’s SEL Assessment Guide may be a valuable resource.
During the Plan phase, the team will:
Decide on a problem of practice to address.
For example: Families say they are unaware of SEL activities and how they can support SEL in the school
Identify a continuous improvement aim.
For example: By the end of the school year, 75% of families will report that they have opportunities to participate in SEL events in the school.
Determine the drivers that you need to influence in order to achieve your aim.
For example: Engage families in SEL learning opportunities
Identify current challenges or barriers related to your drivers that may be the target of improvement efforts and change ideas.
For example: The team can use tools like empathy interviews, process maps, and root cause analyses to find where problems exist that can be addressed and improved.
Describe what you’ll change or innovate (i.e., change ideas) to achieve the goal.
For example: Introduce a new communication strategy that reaches families through social media and texting.
Determine data you’ll use to measure change.
For example: Attendance at family SEL events, end-of-year parent survey
Define how you’ll know you achieved your goal (what success would “look like”).
For example: Attendance at SEL family events will increase by 50% in the second semester, and at least 80% of parents will respond on the end-of-year survey that they agree/strongly agree that they were positively engaged in schoolwide SEL.
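Success criteria like the ones above can be translated into a simple, objective check so the team agrees in advance on what "goal met" means. The sketch below uses hypothetical attendance and survey numbers purely for illustration; a real team would substitute the values from its own data.

```python
# Hypothetical numbers for illustration; real values come from the team's data.
first_semester_attendance = 40   # families attending SEL events, semester 1
second_semester_attendance = 66  # families attending SEL events, semester 2
survey_agree = 130               # parents answering agree/strongly agree
survey_total = 150               # parents completing the end-of-year survey

# Compute each measure and compare it to the success threshold set in the Plan phase.
attendance_growth = (second_semester_attendance - first_semester_attendance) / first_semester_attendance
agree_rate = survey_agree / survey_total

met_attendance_goal = attendance_growth >= 0.50  # aim: 50% increase in attendance
met_survey_goal = agree_rate >= 0.80             # aim: 80% agree/strongly agree

print(f"Attendance growth: {attendance_growth:.0%} (goal met: {met_attendance_goal})")
print(f"Positive survey responses: {agree_rate:.0%} (goal met: {met_survey_goal})")
```

Writing the thresholds down this way, before the data comes in, helps the team avoid shifting the goalposts during the Study phase.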
The SEL team implements the innovation and learns from the result. This step involves putting the change into practice, documenting how things are done, what goes well, and what challenges are encountered, as well as collecting data to help determine impacts.
The SEL team may want to start on a smaller scale over a designated period. For example, if a team is testing a new communication strategy for recruiting families to school events, it may start with a handful of staff members implementing the strategy for six weeks, then review data in the next step to determine how well it worked.
One common structure for testing innovations is the 90-day test cycle. These cycles provide a time-limited and structured process for testing small-scale innovations, iteratively improving on them, and determining whether they can be implemented at a larger scale (e.g., schoolwide). The Carnegie Foundation for the Advancement of Teaching developed a helpful, downloadable, and freely available handbook for using 90-day cycles in education settings (see Carnegie 90-day Cycle Handbook).
Barrington School: Engaging Students in Continuous Improvement
An elementary school in Barrington, Ill., focused on improving students’ sense of safety throughout the school. To gauge students’ perceptions of safety, the school gave each student red and green stickers and asked them to place the stickers around the building to indicate where they felt safe (green stickers) and unsafe (red stickers). This visual representation helped the planning team make decisions about where to focus its attention.
After this activity, the school held a schoolwide assembly to discuss the results with students and asked teachers to address any concerns or issues the activity raised in their individual classrooms. To avoid singling out any one teacher, the school looked at the information provided by the students at the school level, rather than the classroom level. They planned to also assess student-teacher relationships at the classroom level to ensure they had a full understanding of students’ perceptions.
During the Do phase, the team will:
Test your chosen change idea by carrying out a small-scale, time-limited trial of your planned action.
For example: The school uses a 90-day PDSA cycle to roll out and begin using a new text messaging communication system for parents.
During the test, observe and document any problems or unexpected events, note things that went particularly well, and collect data that will help you determine the impact of your test.
For example: The team (which may be the entire SEL team or a subcommittee responsible for this project) sets up an online form to log observations and learnings, assigning team members responsibility for completing the form regularly. The number of parents enrolling in the system may also be tracked as an early indicator of rollout success.
The SEL team lead may check in periodically throughout the test period with members of the team responsible for carrying out the change, as well as those responsible for collecting and managing the incoming data that documents the process, to ensure that both aspects are on track.
For example: The SEL lead or project lead may convene the team weekly to review data from log submissions and reflect on the rate of parent enrollment, determining where slight adjustments in process may be needed.
In the Study phase, the SEL team examines the data, comparing what actually happened to what they predicted would happen. Generally, it is helpful to avoid making assumptions and leaning on preconceived notions in this process. The Study phase is not the time for interpretation and meaning-making, but for determining whether goals were met at a basic level and for noticing trends and surprises in the data, which may help with interpretation and meaning-making during the Act phase.
Using a data reflection protocol can be helpful for teams, as it can be difficult to let the data speak for itself and avoid inserting personal interpretations into the conversation. Holding off on interpretation allows a full exploration of the data, drawing on the skills and wide range of perspectives of the entire team. If interpretation occurs too early, valuable discussion and exploration of the data is often cut off, and the conversation shifts too quickly to solutions (sometimes called “solutionitis”). This can be problematic because it results in only a small number of narrowly focused solutions being considered.
Examine data for disparities between student subgroups:
When using student data to inform SEL practices, it is important to examine the impact on subpopulations of students (such as by race, IEP status, gender, free/reduced-price lunch status, or other categories). Disaggregating data in this way can highlight discrepancies, inequity, and misallocation of resources.
For example, disaggregated data can be used to see whether certain subgroups of students feel different levels of engagement in school. Highlighting disproportionate practices and planning meaningful ways to address them is an important step in ensuring schoolwide SEL practices create an equitable, culturally sustaining school environment. Disaggregated data can also be used to advocate for specific policy and practice changes and to make decisions about where to target additional funding.
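Disaggregation itself is a simple grouping-and-counting operation. The sketch below shows one minimal way to compute engagement rates by subgroup, using hypothetical survey rows and a hypothetical `home_language` field chosen only for illustration; a real team would group by whichever categories it collected (race, IEP status, and so on).

```python
from collections import defaultdict

# Hypothetical survey responses; real rows would come from the school's data.
responses = [
    {"home_language": "English", "engaged": True},
    {"home_language": "English", "engaged": True},
    {"home_language": "English", "engaged": False},
    {"home_language": "Spanish", "engaged": True},
    {"home_language": "Spanish", "engaged": False},
    {"home_language": "Spanish", "engaged": False},
]

# Tally total responses and positive responses for each subgroup.
totals = defaultdict(lambda: {"engaged": 0, "n": 0})
for row in responses:
    group = totals[row["home_language"]]
    group["n"] += 1
    group["engaged"] += row["engaged"]  # True counts as 1

# Report the engagement rate per subgroup so disparities are visible.
for language, counts in sorted(totals.items()):
    rate = counts["engaged"] / counts["n"]
    print(f"{language}: {rate:.0%} report positive engagement ({counts['n']} responses)")
```

A gap between subgroup rates in output like this is exactly the kind of disparity the team would flag for discussion in its data reflection meeting.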
Most reflection protocols begin by asking participants to state what they see in the data without making interpretations or trying to make meaning from it (e.g., “I see that 30 percent of families attended and offered input.”). Next, participants often reflect on trends they notice and/or things that surprise them about the data (e.g., “As I look across the data, it seems that families who have younger students in the school have been participating to a higher degree.”). Eventually, most data protocols guide participants to a point of meaning-making and drawing conclusions in a scaffolded and structured manner.
School teams can use the ATLAS Looking at Data protocol or High Tech High’s Digging into Data Protocol to engage in productive dialogue about data.
In addition to using a reflection protocol during the meeting, adequate preparation will help ensure that team data reflection meetings run smoothly and are productive. Prior to the meeting, data should be compiled and summarized in a user-friendly manner (who is responsible for this task is established during the earlier Plan phase). It is also best to establish clear roles and responsibilities for team members during the meeting, including roles such as facilitator, norm keeper, timekeeper, and notetaker. The facilitator’s responsibility is to carefully review the data reflection protocol prior to the meeting and ensure they are comfortable with the process, can move through it smoothly, and have any necessary materials at hand (e.g., sticky notes, computer, notepads, etc.).
During the Study phase, the team will:
Compile data gathered over the 90-day test period and prepare a user-friendly summary of these data to be reflected on during a team meeting.
For example: The SEL team’s data manager develops a report that shows the number of parents enrolling in the text message system by week, a summary of challenges or surprises reported, and the percentage of parents reporting positive engagement in SEL activities over time.
Convene the team for a structured and facilitated data reflection meeting.
For example: The SEL team lead or project lead may pull together the team, including decision-makers and possibly other involved stakeholders (like parents in our example), to help reflect on the data.
In this meeting, discuss trends or surprises team members notice in the data, compare what you find with what you expected to happen, and summarize what you learned from your 90-day test.
For example: Overall, we hoped for 90% enrollment in our text messaging system, but I see we only got to 65%. I also notice that the rate of enrollment in the text messaging system is lower for families for whom English is not the primary language spoken at home.
Washoe County High School’s Data Reflection Process
A Washoe County high school used an approach they call “Here’s What, So What, Now What” to study their climate and culture data. The school did this to minimize staff anxiety levels, which tended to be high when areas needing improvement were identified.
To begin, the staff simply looked at the data and determined what it was telling them. The emphasis during this part of the process was on stating facts while reserving judgment.
Part two, the so what, gave them the opportunity to react and express their thoughts and feelings about the data. This was important because it allowed staff members to process the information and express ideas about underlying causes that might account for the results.
The final step, the now what, gave the staff an opportunity to brainstorm and strategize ways to improve their school’s climate and culture for the future. The participants generated dozens of positive ideas for how they could improve their school’s climate and culture.
The SEL team decides what to do next based on what they learned throughout the process. For example, what was learned through the previous steps of the cycle could inform a targeted strategy for increasing engagement among parents of older children. Alternatively, the team may have evidence that the new communication strategy is ready to be implemented at a larger scale, or that it should be abandoned.
During the Act phase, the team will:
Make meaning of the data, discuss what trends and surprises might mean for their effort, and what they should do next.
For example: Given the discrepancy in enrollment based on home language, it seems adapting our system to send text messages in families’ preferred languages may help us boost enrollment and reach our goal.
Decide whether they should keep, adapt/modify, or abandon their innovation.
For example: The team may decide to survey families and students to determine the language they primarily speak at home, and adapt the text messaging system to deliver messages in families’ primary languages.
Return to the Plan phase, begin a new cycle, and proceed according to what was decided based on learning.
For example: This could mean planning a test to scale up the text communication system; planning to modify the text system to use families’ home languages and testing this modification; or ending the text message system and testing a new idea for engaging families.
It’s recommended that SEL teams undertake multiple PDSA cycles before they conclude whether the innovation has achieved its aim. They should also continue to watch for unintended consequences.
Multiple PDSA cycles provide an opportunity for the SEL team to accumulate practice-based evidence and collective know-how for achieving outcomes. Early PDSA cycles create a base of knowledge and expertise that accelerates learning in later cycles, making widespread improvement more likely (Bryk et al., 2015).
This continuous improvement template is designed to be used by an SEL team to drive the learning process about schoolwide SEL. It follows a Plan-Do-Study-Act cycle to guide learning. The template can be completed during SEL team meetings that focus on continuously improving the approach to SEL.
In order to lead continuous improvement cycles, it is important for the SEL team to be familiar with the process. Some districts may offer training to schools on continuous improvement. If your district does not offer this kind of support, the following resources can help your team learn more: