
Why the Best Edtech Strategy is Often “Less, but Better”


Every year, the average K–12 district accesses over 2,400 unique edtech tools. When technology reaches that level of sprawl, it starts to feel less like innovation and more like noise.

K–12 leaders often ask, “Are teachers using the tools we bought?” But that question misses the heart of what edtech should do. Usage data tells you if a student or staff member logged in, but it can’t tell you if students learned. To connect edtech tools with student outcomes, the metric needs to move from login to evidence.

Rapid-cycle evaluation (RCE) flips the traditional research model on its head: instead of waiting years for the results of a longitudinal study, K–12 leaders receive contextualized results about usage and effectiveness in short, iterative cycles, so they can get past surface-level data and make decisions based on which tools actually work.

To see how this works in practice, let’s look at how two different districts used rapid-cycle evaluation to move from noise to visibility to validation of the effective edtech tools they invest in.


Identifying the “grassroots” tools

Before a Chief Academic Officer (CAO) or other leader can decide what's working, they have to know what's actually being used. That's rarely as simple as it sounds: whether the learner population is large or small, leaders often worry that their context is either too limited or too complicated to evaluate.

In one of the largest urban districts in the country, leaders didn't start with a question of ROI—they started with reality. This district partnered with LearnPlatform by Instructure to examine the use of a core Math and ELA software product purchased at the school level. The district wanted to know whether the licenses purchased by 80 of its campuses were being used with fidelity, which would justify a larger, districtwide contract.

LearnPlatform designed analyses to measure skill proficiency by network zone and to aggregate findings at the district level, including cost analyses. Analyses revealed that over 60% of licenses for this software were not being utilized, resulting in potential cost savings of over $500,000 across the purchasing campuses. 

Because the software hadn't been purchased through the district's central office, leadership lacked visibility into its impact until LearnPlatform designed and ran the analyses. Summary reports for each network zone, along with aggregated district-level reporting, provided the evidence to shift district conversations from anecdotes and surface-level dashboards to a highly contextualized, realistic view of implementation.


The key takeaway: Visibility comes before validation. Rapid-cycle evaluations can be used to gain the visibility your K–12 context needs to know what to validate.


Validating edtech through high-stakes comparisons

When you have visibility into which tools are being used and where, the next natural question is, “What helps students grow?” In a Washington district, leadership used RCE to move past marketing claims and anecdotal feedback. They put five high-use products to the test by layering in evidence. Usage data was part of the equation, and they built on that with student achievement data from state assessments.

By comparing the impact of similar tools, district leaders found that two of these major programs—which appeared successful because they had high login rates—weren’t showing a significant impact on student performance on end-of-course assessments.

With that evidence in hand, the district chose not to renew those contracts.

Sunsetting a tool can be controversial, but when leaders can point to local, context-specific evidence that shows the tool isn’t driving student growth, conversations stay grounded in student outcomes.

Validation of tools also allows you to focus on instructional success. If a tool is tied to growth in three schools but not in others, you don’t need a new tool—you need to learn and implement the strategies used by the successful schools.

Managing over 2,400 tools can drain even large technology teams at K–12 institutions. Validation is a tangible way to whittle the list down to the most effective, proven tools that teachers can use and students benefit from.


The key takeaway: Validation isn’t necessarily about getting rid of tools. Rapid-cycle evaluations help you identify what works and where, so you can apply those success strategies in every classroom.


How evidence layers together in rapid-cycle evaluations

Using primary-source data directly from providers and assessment systems is the path to the kind of clarity those districts experienced. The best recipe for clarity includes these ingredients:

  1. Usage data: who is using the tool, and to what degree
  2. Student information system (SIS) data: which groups of students (grade levels, schools, tiers of support) are seeing positive results from using the tool
  3. Outcome data: whether usage is related to outcomes

When you layer these together with fidelity goals and cost information through rapid-cycle evaluations with LearnPlatform, the data starts to tell a story not only about whether the tool is worth the investment but also about how it’s best used in your classrooms.
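Conceptually, this layering is just a series of joins on a shared student identifier, followed by a fidelity flag and a group comparison. The sketch below is a minimal, hypothetical illustration in Python with pandas; every column name, the 30-minutes-per-week fidelity threshold, and the data values are invented for the example and are not drawn from any LearnPlatform product or real district.

```python
# Hypothetical sketch: layering usage, SIS, and outcome data.
# All column names, thresholds, and values are illustrative assumptions.
import pandas as pd

# 1. Usage data: who is using the tool, and to what degree
usage = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "minutes_per_week": [95, 10, 60, 0],
})

# 2. SIS data: which groups the students belong to
sis = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "school": ["North", "North", "South", "South"],
    "grade": [5, 5, 5, 5],
})

# 3. Outcome data: end-of-year assessment growth
outcomes = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "score_growth": [12.0, 2.0, 9.0, 1.0],
})

# Layer the three sources on the shared student identifier
df = usage.merge(sis, on="student_id").merge(outcomes, on="student_id")

# Flag students who met an example fidelity goal (30+ minutes/week)
df["met_fidelity"] = df["minutes_per_week"] >= 30

# Compare average growth for fidelity vs. non-fidelity groups, by school
summary = df.groupby(["school", "met_fidelity"])["score_growth"].mean()
print(summary)
```

A real evaluation adds cost data and statistical controls on top of this skeleton, but the core move is the same: connect who used the tool, who those students are, and how they performed.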


RCE meets you where you are and gets you where you’re going

District leaders need ways to visualize and validate the 2,400+ edtech tools they oversee.

With rapid-cycle evaluation, you can:

  • identify the “grassroots” tools that teachers already love,
  • eliminate the noise of underperforming, high-cost tools,
  • and scale what works by showcasing the specific implementation strategies that help students grow.

A simpler edtech ecosystem is nice to have, but the right, data-backed tools are what help your teachers help students thrive.

