LearnPlatform by Instructure

Building KIPP Regional Capacity for Evaluating EdTech Effectiveness

Charting a Path for Sustainable Progress and Collaboration

Empowering education leaders with actionable data to inform edtech decision-making is paramount as tech-enabled learning has become a core component of teaching and learning. Recognizing this need, the KIPP Foundation implemented LearnPlatform's EdTech Effectiveness Clearinghouse, piloting it with two KIPP regions.

Working with leaders from KIPP DC and KIPP SoCal, LearnPlatform enabled each region to investigate usage data and, ultimately, evaluate edtech effectiveness. This work aligned with the Foundation's charge to train and develop KIPP school leaders and to provide tools, resources, and training for teaching and learning.


Building a Process for Evaluating Efficacy of EdTech Tools

Like many education organizations across the United States, the KIPP network has a continually growing interest in evaluating the effectiveness of digital learning products. The shift to remote and hybrid learning posed a new set of challenges around product evaluation, making Foundation-level visibility into edtech effectiveness, engagement, and ROI even more critical. The Foundation wanted to build regional capacity for evidence-based decision-making, and LearnPlatform fit those plans perfectly.

The chosen regions were given access to run rapid-cycle evaluations (RCEs) inside LearnPlatform. Rapid-cycle evaluation offers a practical, timely way to generate relevant evidence that informs region-specific decisions.

For example, KIPP DC ran a series of analyses to measure the impact of products on student achievement, each time drilling down to specific curriculum objectives and tying rapid-cycle evaluations of tools back to those goals.

When it first started working with LearnPlatform, the team at KIPP DC discovered just how many tools its small network of schools was using. Leaders in the region decided they needed to reduce the number of tools in their core product portfolio, but they needed data to guide which products to keep. By running rapid-cycle evaluations focused on product efficacy, they were able to cut their product portfolio nearly in half.


KIPP Regional Focus Areas for the School Year

Leaders in both regions agree that this kind of efficacy evaluation is, and should be, ongoing, noting that every RCE raises a new series of questions that lead to even more ways to support teaching and learning. KIPP DC and KIPP SoCal plan to incorporate product cost metrics into future analyses to offer a clearer view of edtech ROI, something they can do easily with LearnPlatform.


KIPP SoCal

The region focused on analyzing product usage data from the past few years and will build on those findings to start asking questions like:

What does intentional use of a specific digital learning resource look like in our classrooms?

What does successful use of a product look like in a specific setting?

What do we want to discover in different student groups?


KIPP DC

The team wants to run a new set of rapid-cycle evaluations for its core product portfolio, comparing findings to analyses run on the same products a few years ago. Leaders want to answer questions like:

Have there been changes in how products are being used? What are they? 

How can we use findings to inform how teachers and students are using products? 

Do we need to replace or add any products?


The Ongoing Nature of Analysis and the Importance of Collaboration

When it comes to efficacy analyses and evaluation of edtech products, KIPP leaders emphasized three things to keep in mind for other K-12 organizations looking to do similar work:


Jump right in – but don’t try to do everything at once.

The group emphasized that you don’t have to have “perfect” information to make a decision that will impact teaching and learning – you just need evidence commensurate with the consequences of the decision you’re making.

To these leaders, a great way to start is to choose a single challenge or ask one key research question about your edtech and how it may be impacting learning. Tying research questions to curriculum, instruction, or strategy goals from the beginning can help prioritize which challenges to tackle at specific points during the year.


Collaborate – you don’t have to do it alone.

The region leaders agreed that the ability to collaborate across KIPP regions and with other K-12 districts is one of the most valuable aspects of this work. Beyond internal teams answering key questions, managing data, and reviewing results, looking at how others approach this work reveals both what they are doing differently and how they are tackling similar challenges.

Collaborating among KIPP regions, as well as with other education organizations, allows leaders to consider questions like:

Are different KIPP regions using different core products? Or using core products differently? Why?

How do different KIPP regions coach their teachers in using specific edtech products? Are some seeing better engagement with a product than others?

How can we compare data from one region to another? How does it impact what we are doing?


The work is – and should be – ongoing.

By definition, rapid-cycle evaluations are formative and iterative, meaning they don’t have a true beginning and end. With changing student demographics, policies and regulations, and teaching practices, there are always new questions to ask and ways to keep improving over time.

From the beginning, leaders in the KIPP regions realized that edtech evaluation is, and should be, ongoing work. It takes time, with leaders focusing on manageable chunks of data and learning at every step rather than on an ultimate ‘endgame’ result.

