LearnPlatform by Instructure

Using Evidence to Refine EdTech Practices

How an Indiana Technology Team is Refining EdTech Implementations and Making Informed Budget Decisions


Located in Waterloo in northern Indiana, DeKalb County Central United School District is home to just over 3,000 students. The district aims to “develop socially responsible students who are literate, academically successful, engaged in all aspects of their education and prepared for success in the 21st century.”

Goals

A cohort of technology coaches and directors in DeKalb wanted to clean up and curate the digital tools that prove effective for educators and students in their district. Before working with LearnPlatform, the district gauged the success of products primarily on qualitative feedback from teachers and on student engagement; it lacked the quantitative data needed to support decision making. Beginning with a single math innovation tool, the team asked the question: “Does completing 5 or more lessons per week on Product X lead to higher growth scores on NWEA for our K-5 students during the first semester of the 2019-2020 school year?”

By running a rapid-cycle evaluation (RCE) with LearnPlatform’s RCE engine, IMPACT™, the cohort gathered evidence to help answer this question, evaluating edtech usage in relation to student outcomes.

How the Process Worked

Based on teacher feedback, the team chose to start with a math innovation tool for elementary students. First, the cohort reviewed the provider’s recommended usage to see whether it correlated with growth on a local, relevant assessment (NWEA). It then looked at a variety of grade levels to see if a certain group of students had improved outcomes based on that recommended usage. This multi-phase process built on prior in-house analysis that surfaced key questions; IMPACT was then used to refine the question against data from the chosen tool. The process was collaborative, bringing educators from different groups, including a district-level technology integration coordinator, an innovation coach, and a technology coach, together around a common goal. Ultimately, the team highlighted findings that leadership could use to make more data-driven decisions about edtech tools.
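To make the dosage question concrete, here is a minimal sketch in Python (pandas) of the kind of dosage-versus-growth comparison described above, assuming hypothetical CSV exports of per-student weekly usage and NWEA growth scores. The file names and columns are illustrative only; the district’s actual analysis was performed by LearnPlatform’s IMPACT engine.

```python
import pandas as pd

# Hypothetical exports -- IMPACT performs this analysis internally;
# this sketch only illustrates the dosage-vs-growth comparison.
usage = pd.read_csv("product_x_usage.csv")   # columns: student_id, grade, lessons_per_week
growth = pd.read_csv("nwea_growth.csv")      # columns: student_id, growth_score

df = usage.merge(growth, on="student_id")

# Flag students who met the dosage in the team's research question:
# five or more completed lessons per week.
df["met_dosage"] = df["lessons_per_week"] >= 5

# Compare average NWEA growth for students at vs. below the recommended
# dosage, broken out by grade, mirroring the team's grade-level review.
summary = df.groupby(["grade", "met_dosage"])["growth_score"].agg(["mean", "count"])
print(summary)
```

A simple breakdown like this surfaces grade-level patterns; a full rapid-cycle evaluation also accounts for factors such as baseline achievement, as reflected in the findings below.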

“When we first shared [the IMPACT report] with our administrator and principals and they had the data in their hands, they were so impressed that it suddenly becomes ‘well, what else can we use this for?’”

Amy Neal

District Technology Integration Coordinator

What the Team Discovered

Rapid-cycle evaluations using IMPACT allowed the DeKalb team to see what was working for their students under what conditions and apply that evidence to their ongoing plans. By engaging in a formative decision-making process, education leaders were able to evaluate data for their specific contexts and avoid binary judgments about whether a given product is good or bad.

Brief overview of outcomes:

Kindergarten and grade 3 students showed the most growth as measured by NWEA scores. A core group of grade 3 teachers was identified as using the tool with fidelity.

Grade 2 students were not reaching the recommended dosage.

One of the district’s four elementary sites had lower average utilization of the product than the others, but a relatively similar effect size.

Based on prior achievement, the district’s lowest-performing students showed a strong, positive effect related to use of the program.

Actions Taken

The Outcomes Analysis prompted the team to refine teacher professional development around the tool, both to better meet the needs of educators and students and to ensure the tool is used as intended in the local context. The team also decided to adjust licenses and replace the tool for the grade levels where it was not shown to be as effective, reallocating those funds to an alternative tool.

What's Next

Going forward, the team will:

Continue running Outcomes Analyses at regular intervals.

Identify additional priority edtech products for analysis, as well as a cadence for formative rapid-cycle evaluations during the school year.

Systematically improve the process of collecting educator feedback so the team can draw on multiple data points when evaluating a given tool, balancing both qualitative and quantitative data.
