EdTech Evaluation in Practice: More Evidence, Better Decisions, Improved Student Outcomes


Edtech solutions can have a significant impact on student outcomes, but understanding whether the tools your district is investing in are producing the academic returns you're hoping for can be a challenge. How can district leaders get answers to tough questions about where edtech budget dollars are going, what's paying off, and which tools are truly moving the needle for students?

Watch the Webinar

Video Transcript
Thank you for attending this special THE Journal session titled "EdTech Evaluation in Practice: More Evidence, Better Decisions, Improved Student Outcomes." Today, we'll be discussing the impact that edtech solutions can have on student outcomes, and learning to understand whether the tools your district is investing in are producing the academic returns you're hoping for. This event is made possible by our sponsor, Instructure. We sincerely thank them for their support. In this session, we'll give you firsthand insights on evidence-based ed tech management, best practices for district-generated research, and actionable tips for engaging with ed tech provider partners in evidence work.

My name is David Nagel, Editor in Chief of THE Journal and editorial director of the Education Group here at 1105 Media. I'm joined today by our very special guests. Chris Bailey is Director of Technology at Edmonds School District in Washington. Greg Schwab is Assistant Superintendent of Secondary Education and Facilities and Operations, also at Edmonds School District. Jonathan Skaris is Instructional Technology Coordinator at Darlington County School District in South Carolina.

Both districts are leveraging rapid cycle evaluation in partnership with LearnPlatform by Instructure to analyze the impact of strategic ed tech investments. And we're joined by Amanda Cadran, who is Program Director, Rapid Cycle Evaluation, at LearnPlatform by Instructure. Now before we get started, I'd like to cover a few housekeeping items. First, if you would like to enlarge the slides, look for the square icon in the top corner of the slide window. You can also view the presentation in full screen mode by clicking the squared arrows icon.

If you would like to submit a question at any time, we encourage you to submit it early, as questions pop into your head. Just look for the question field to the left of the slide window, type in your question there, and hit submit. We will field questions at the end of the presentation.

But do try to get them in early. If you have any technical difficulties during the session, look for the yellow question mark icon below the slide window. Click there for technical assistance. If you would like to download a copy of today's presentation, just look for the resource center just below the question field. Finally, within the next day or two, we will email you a link to the archived version of this session so you can view it again or share it with a colleague.

Now, without further ado, I'd like to hand it over to Amanda Cadran. Excellent. Thank you so much. Really excited to be here today. I think we're going to have a fantastic conversation.

So let's go ahead and get started. As David mentioned, we're here to talk about edtech evaluation in a practical way. So regardless of your role in K-12, I'm hoping that you will hear some things that resonate with the work you do, maybe work that you're hoping to do or that's been discussed among your team members. This is us.

You'll get to know more about us as we go through our discussion today, but I can tell you that the work that Greg, Chris, and Jonathan are doing is really innovative and very meaningful, so I'm excited to be able to help share their stories with you right now. Let's get into just a little bit of context for what we're going to be doing today. If we think about modern, effective edtech, and the environment that would best support it, that all depends on evidence.

So, in addition to the evidence required under ESSA, LearnPlatform helps district leaders generate additional forms of evidence, which you'll hear about today, that are really necessary to gain a comprehensive understanding of what edtech is working, which student groups it's working for, and under what conditions. But having evidence by itself isn't necessarily sufficient, because an effective ecosystem makes it simple to put evidence to use, to actively engage with and utilize the evidence and the results you would see from it. So LearnPlatform enables you to activate that evidence efficiently across your organization. And this can really have, I think, great effects on budgetary, instructional, and operational decisions at every level.

And I think you'll definitely hear that through our conversation today. So in order for districts to ensure that their edtech ecosystems are safe, equitable, and effective, they need evidence to inform decisions, and then the ability to activate that evidence efficiently across their district teams and communities. LearnPlatform combines these needs and centralizes them into a single system. So that's just a little bit about the work that we do, at a high level. But I did wanna also bring this into the equation.

And I'm sure if you've been in education in any capacity over the last few years, this is something that we can relate to. Since we first started tracking edtech usage and engagement in 2018, we have seen the number of edtech tools used in school districts steadily rise. But with the widespread school closures in March 2020, as you can see here, that growth accelerated rapidly, jumping from an average of 950 separate tools per month to over 1,300. Our data and evidence show that this acceleration has staying power, as the number of edtech tools continues to rise. And what that has created is a really big challenge for districts as they try to modernize their learning environments and also improve the way that they make decisions. So while there may not be one right number of edtech tools for any district, it is essential for districts to have some kind of system in place to manage their resources efficiently and understand whether they're getting the right kind of value from the tools they're using, in order to build capacity on their teams so they can continuously improve how they evaluate, select, and, maybe most importantly, implement their edtech.

So, just a little bit of information here, a brief overview of some of the common types of analyses that we run with our district partners through rapid cycle evaluation. Some of these analyses will be more or less applicable to a district team, depending on need, the questions that need to be answered, data availability, and the timing of the work, as you can see here. What I would take from this is that we can be really adaptable to what district teams would like to accomplish around that concept of evaluation and the use of evidence in decision-making. All right.

Well, I'm very excited to get into our discussion today, because I know that's what we are here for. So, Greg, Chris, Jonathan, I have a few questions that are going to help guide our discussion today. And we'll go ahead and get into this first one here. So, the first thing that we were curious about, and you can see the question here, is what prompted your district to even get started with this work, evidence work through RCEs. And so, Greg, I think we're going to kick it off to you to share some thoughts on that.

Thank you, Amanda, and hello everybody. Really, from the standpoint of a district decision-making group, one of our main concerns was making sure that, you know, through COVID, while we were in COVID and during the whole school closure time period and as we came out of it, we implemented several interventions to support students. And so, having a way to evaluate the impact of those interventions was important; we're spending significant resources on those interventions, and we needed a way to quantify that impact. I think we all know the world of education, and we're very quick to implement things, but we never really take the time to evaluate them in meaningful ways. And the rapid cycle evaluation provided us with some quantifiable data in, you know, in real time.

I think that's the other important thing: we can evaluate the effectiveness of our interventions using this evaluation process, and it gives us really actionable data to know whether or not we need to continue them. Thank you. Jonathan, I know that your experience is very much relatable to the Edmonds team, but you all also have maybe a different entry point. Would you like to talk a little bit about that? Yeah. Sure.

Greg said it perfectly. You know, with everything that's happened in the last couple years, our main goal is to slim down our ed tech stack, because we've just seen an explosion, especially with ESSER money and what's been getting funded into our schools. Our goal is to make it more manageable for our teachers and our students. As an instructional technology coordinator, my role is very unique; I kind of work in two departments.

I work in curriculum and instruction, and I also work in technology, and I'm kind of like a bridge between those two departments. And I have two wonderful leaders that I work for. So my goal is to really present them data and facts so that they can make good judgment calls when it comes to decision-making. So with the RCEs, it really is trying to help us close the gap so that we can make sure that we provide the best materials for our schools.

Right? Absolutely. And I know we're going to talk a little bit more about this, but that concept of teamwork and bringing your colleagues into this work is definitely an important one. Let's see here. Greg, I know that one of the things we've talked about is just the significant dollar cost of providing these tools to students. Is that in any way part of your decision, or part of what you're trying to do with this work? Yeah, absolutely.

I think there's this return-on-investment question that we have to be able to answer to our stakeholders: our school communities, our board, and our wider community. As we spend these significant dollars on these different intervention resources, we need to be able to justify it through results. We really do wanna ground it in results: are we getting what we're paying for? And I think that, at the end of the day, from a very practical standpoint, is why these evaluations are really important. Absolutely. And Jonathan, that also makes me think: you have twenty sites in your district.

So one of the things that we've talked about is just your central office trying to help them. Is that a part of this work for all of you? Yeah, you know, we're a very rural area here in South Carolina, and we're spread out, and we wanna make sure that our return on investment is given fairly to every single school, with us having twenty different sites, from elementary schools through high schools, in our area. We wanna make sure that what is provided coming down from the district office is informed for our schools and is, you know, leading them in the best direction that they can go in. Absolutely.

Thank you so much. So I think that sets the stage a little bit for maybe the why. Let's move forward here; we have a few more questions. Now we're going to start really getting into what happened. And so this next question, about what has gone well and what's been challenging about this work, I think is very realistic, because as with any project, there are going to be learning moments.

And so, Chris, I know you and I have worked really closely together. What would you want to share to help answer this question? Yeah. So, we've gone through several cycles of RCE with several vendors, and being able to identify patterns in the usage of tools, as well as how they're helping us with student achievement, has been really valuable. It's been interesting. I guess the challenge is, some vendors really understand the process and want to contribute to it, so that we can get shared value out of the evaluation cycles that we're doing. And for others, it's newer for them.

And so it takes a little bit of work to get the information out of their ed tech tool in a way that can be leveraged by the LearnPlatform team in order to give the district information that is actionable. So, that's been a challenge. That's something that we're continuing to refine in our communication to vendors, to try to ensure that they see the value in the work as well. And ultimately, we just want to make good decisions. I'd also add that it's not always about, do we want to buy this tool or not. Sometimes the information we learn is, this tool is really effective for the five percent of our students that are using it.

How do we address that issue internally to ensure that a tool that we know is effective is actually being used authentically throughout the district? So, we learn things we don't necessarily expect to learn, but they're definitely valuable. Yeah, that's true. There are always going to be those moments that maybe are just slightly different than what you initially went into that evaluation to find. And I will just give a quick shout-out to not only the Edmonds team, but to Jonathan as well: because of your interest in this work and the partnership, we've actually been able to conduct some RCEs on several products that we had not worked with in the past. And that's always very exciting for us, so we really welcome that opportunity to explore.

Jonathan, I know that you are just getting started. What are some things that you have found in this process so far? Yeah. So this was our pilot year here in South Carolina, partnering with you guys. And we did have some hurdles at first, and it was interesting, you know, just like Chris said: when you start talking to some of your vendors, it kinda made them stop and think a little bit, and then they asked some good questions. And the nice thing that came from Learn is that any issues we have, we just kinda send to you guys.

You hopped on a call with us real quick. Sometimes we had the vendor come on a call with you all, and it just worked out. And once we started moving through those hurdles, things got a little easier. And I think the vendors and the people in our school district are starting to see the importance of these conversations, because it shouldn't just come down to, you know, what's the cost of this program per kid. We wanted to see how this will continue to grow and develop over time, because we don't wanna be a school district that has a program for one year and then we're hopping around because we're like, oh, we don't see results.

We know that some things take time to grow. So for us, it's just, you know, building our library up to have it grow and get there, and, you know, being in that first year, it was a little daunting at first. But it was pretty easy once we got a slide deck created with you all. And that helped us out, so that we could just ask the same questions from the same script, and it moved real smooth after the first couple. Awesome.

Thank you. Now, Greg, in your role, there might be a slightly different perspective on this work, which is what I love about engaging with district teams. Is there anything that you would want to add in answering this question? Well, I think it speaks to what you talked about just a bit ago, with the uniqueness of some of the evaluations that have been done. So as an example, I'll give a very practical example from our district. We've been using a tutoring product called Paper with our students in our high schools.

And, you know, anecdotally, we've been hearing about some level of success, but, again, to Chris's point, we weren't seeing consistent usage across our system. But we really wanted to know: is Paper moving the needle on student achievement? Because, again, that's one of those products that's really valuable, but it's not inexpensive. And so we wanted to make sure that we were getting the results we wanted to see out of it. And I think having the opportunity to work with you on that gave us some really important information about the usage of that product, and whether or not it is actually doing what we wanted it to do, which is, at the end of the day, producing better outcomes for our students. Yeah.

Absolutely. That practical aspect is key to all of this, I think. And Jonathan, just to mention, I know that conversation Greg just described is relevant for you as well, right, in your district? Yeah. You know, it was funny: we were meeting back in April to talk about, you know, possibly doing this session for different groups.

And Edmonds, they brought up Paper. It was funny, because Paper came to our school district to talk to some of our leaders. So I got excited knowing that Edmonds has kinda laid down the groundwork to help us out on the other side of the country, so that if we do go with the vendor, we have the ability, through the partnership with Learn, to sit there and say, everything's set up, and we can just kind of roll off what they went through. So those are nice little things that, you know, I think will kind of win people over, knowing that groundwork is being created, frameworks are being created, so that you don't have to start from scratch when you're asking for this information. Yeah.

Absolutely. And seeing that happen across so many districts of different shapes and sizes, at the end of the day, those evaluations are going to produce the same type of results, so we want to make sure everyone has the opportunity to get to that point. So I appreciate that. Okay. So, now, you know, we're thinking about this process.

We've gone through at least one cycle. So, Chris, I was wondering if you could share a little bit about how RCE has started to maybe have some kind of impact on processes or conversations. Yeah. You bet. I think, like anything, change takes time.

And so, we began the conversation with you about RCEs more than a year ago, and it's been in the last six months that they've really begun to resonate across our system with our leaders. So I have facilitated probably half a dozen different conversations with groups, helping them understand what the value is, and then exploring which tools that we're currently leveraging might be worth putting through the evaluation process: how well is it meeting our need? How well is it doing what we purchased it to do, essentially? So, those conversations have been really cool, because they're tangible and actionable; I can point specifically to ways RCE has helped us to change practice. Something else that I think is worth mentioning, going back to what Jonathan shared: I see a lot of value in seeing how companies are doing with respect to the ESSA framework, the tiers of evidence, before we even invest in them, before we even engage with an ed tech product. And so, that's something I'm excited about.

I know LearnPlatform has more coming there, I think, that's going to help us be able to see and understand a vendor's intended evidence before we even make a purchasing decision. And to me, that's gonna take it to another level. So, really excited about that too. Yes.

I love to hear that. And you read my mind exactly. I think being able to have information on both sides is just really helpful. It helps you ask maybe better questions, or more informed questions, and it also helps us become better partners on all sides of the market. Jonathan, I know I've been able to meet with some of your team members.

Do you want to talk a little bit about where you're at right now? Yeah. So, you know, we were able to do a couple RCEs on some of our programs, and I think the best part of this whole thing, and it's kinda like what you're hearing from Edmonds, is that this conversation is happening in your district now, and the ability to link different people from different departments with this data has been eye-opening. And I think that's the big thing: Learn has been almost a catalyst to get some people excited to look at data. And the big thing about that is, if you have a program, sometimes only certain people have the dashboard to the program; not everyone can get in there. So, like, your special ed teacher might not have the dashboard to go in and look at the data. With the RCE, we could share that data with her, and there could be questions coming from the early childhood director, where she could say, this is where we need to focus at this school.

So, it's those conversations that are starting that are helping the people in our office go back into the schools and use that in the PLCs that we're seeing with our teachers and our administrators. And it's just a way that we can kind of connect all this stuff together for us to communicate and really push a firm directive that's coming from our district office about what we want our schools to be using with programs. And one of the things I heard kind of embedded in that answer is helping your colleagues understand not only what these results are and what they might mean for you, but helping them be comfortable with the idea of using data at all, because sometimes that in itself is a challenge. And so if we can help with that process from the first moment, that is definitely something that makes me very happy. Oh, yeah.

And we're excited to keep building a library of it too. You know, with this being our first year, we only have a couple of them, but then you can start comparing your RCEs together and say, how can we build on this over time? So that's what we're starting to see: an excitement in our department when we're talking about this. Thank you. I got a little excited and switched to question four too soon, but absolutely. And it is a cycle; that word is definitely key to what we're doing here.

Thank you so much. So, we have just a couple more questions here. And again, just a reminder: if you have questions, if you're out there listening and you're curious about something, please feel free to share those. As we move forward here, we're gonna take a little bit of a broader view. And I think some of this has been touched on, even in just the work that both districts here are doing. But do you see others doing similar things, or even asking these questions? And then similarly, have you found some ways to possibly share what you're doing with the broader education community? So, Chris, I think we're going to kick it over to you to share some insights there.

Yeah, you bet. So, I've had some conversations with other districts about what we're doing, and each time I bring it up there's a lot of interest, I guess, and a greater understanding about how we're able to pull off what we say we can do. So it's definitely generating excitement. I'm not aware of other districts in my area that have done much with RCEs yet, but I think that's gonna change down the road. Ultimately, I think the interest is in figuring out how they can see their own data to evaluate their own tools. And so hearing it from me is great, but being able to get their own hands on it, I think, is really gonna help them understand how it works.

Absolutely, it makes it real. So I definitely understand that. And Jonathan, you touched on this a little bit a few minutes ago, about your C&I team, curriculum and instruction. What were some of the things that maybe they found helpful, as we think about sharing even within your own district education community? I think the big thing for them is that they were learning how they could ask some simple questions just to bridge with each other.

Even if they just didn't understand something, they could get clarity right from the person that was meeting with, you know, the vendors. A lot of times in instructional technology, we start meeting with the vendors, and then we hand it off to someone and we're there in the background just to kinda handle anything that breaks. But a lot of this is bringing our ELA coordinator and our math coordinator together. We're meeting more. And it's almost one of those things we joke about: we have this long table in our district office, and it's kind of like us sitting at the dinner table, just talking and conversing about what's going on.

And, you know, we have some other districts in our state of South Carolina that have the same problems we do, and I'm hoping that we could start meeting with them and communicating with them and seeing what they're looking at when they pull these RCEs. And I'm excited to take this back to the instructional technology coordinators in my state next year, too, so that I can show them the possibilities of what pulling this data in brings. Yeah. And by that time, you'll have two full cycles of this -- Yep. -- to go on. So you'll see those patterns, which is great.

Greg, from where you're sitting, what would you say in response to this question? Well, I think, similar to Chris, in the circles that I run in with other district leaders, I'm not hearing a lot of conversation about this yet, but there's certainly interest when I share the work we're doing, because I think we're all in that same need to evaluate. I think all of us are facing some significant budget challenges right now, and the dollars we spend are increasingly scrutinized, so we want to be able to defend to our stakeholders, to our school board, to our community, and to others as well, that the money we're spending on these things is producing results. So again, I think it's exciting, and people certainly perk up their ears when I'm talking about the work we're doing, because they want to learn more about it. So I'm excited to continue to share, because it really is important work. Sure. Absolutely.

And also, I know one of the things that we've talked a bit about is just being able to connect some of those results to strategic planning. Every state, of course, does that slightly differently, but this can align in many ways with those kinds of short- and long-term goals that I know you all are preparing. Okay. I think we've got one more question here. Yep. Okay.

So, forward thinking now. I think this has come up a little bit, but I would like to just kind of think through where you see this going next. Greg, I may have just given away part of what you're going to say. Okay. No. Totally. You know, we've launched a new five-year strategic plan with lots of strategies and lots of specific things that we're going to do, and some of those have resources attached to them.

And so we wanna make sure that we can evaluate those components of the work we're doing through the lens of our strategic plan. It gives us another data point that we can point back to, to say, you know, we said we're doing this in our plan. I mean, i-Ready is a great example. So many of our schools are using i-Ready data as part of their school improvement planning processes, and it's a component of our district plan too.

So being able to evaluate the effectiveness of that specific assessment tool is really important, because, again, we wanna make sure that we're holding ourselves accountable as well, and not just saying we're doing it and then, you know, putting it on the shelf and never looking at it again, but really looking at the results and making sure that we're getting what we say we're getting. All right, thank you. And clearly I get very excited about strategic planning, so I appreciate you elaborating on that. Chris, I'd love for you to just follow up.

Being so immersed in this work, what would be something that you would be thinking about as we go into the next school year? Yeah. You bet. I guess, first, I'll dovetail on what Greg said a little bit. We had a procurement that needed to go to the school board earlier this year based on the value, and the board interrupted the meeting to say, hey, so how do we know this is actually gonna work? And it was awesome, because we'll be able to bring that back to them next year and show them how well the product that they had questions about is doing the job. That said, I guess the next thing is time management next year.

I hope and expect that we will be having regular conversations with a variety of instructional leaders in Edmonds about the different tools that they're using and promoting and purchasing, so that we can evaluate together whether they make sense to continue. And if we're gonna continue them, then how are we gonna ensure they're used the way that they should be used? Because, again, it's not always about student achievement; sometimes it's about making sure a tool is being used with fidelity. So, I think the key area for us is having those regular conversations with various teams, tied into our procurement cycles as well. Yeah, absolutely.

And you're right, usage itself can be very, very enlightening when you're trying to get that kind of immediate handle on how things are going. Jonathan, what would you say in answer to this question? Our big thing, especially because this is our pilot year, is we just wanna continue to expand the products that are being evaluated. But I think Greg and Chris said it perfectly: you know, if someone says, how do you know this is good, we wanna be like, well, here's the proof. Here's what we got for you, so that we can just be as transparent as possible, because that's our goal.

We want people to see that we are making informed decisions based off data, and that we're not just sitting there saying, oh, I saw this when I went to a convention and it's the best thing ever, you know. We wanna be able to sit there and say, hey, this is why we evaluate the tools that we use, so that you guys understand that we want to be efficient in our decisions. Right. Absolutely. Well, I just wanna thank all three of you for bringing your perspective to this conversation and being so open and honest about your experiences with it.

I'm just gonna take a couple of minutes right now to wrap things up in terms of what we're sharing today, and then I believe we'll have plenty of time for questions, if there are any out there. So first thing, again, thank you for everything you've shared. And I hope that for everyone out there, whether you're listening live or to the recording, the information that you just heard from Greg, Chris, and Jonathan resonates with you in some way, regardless of your role. One thing I did want to mention as well is that LearnPlatform's RCE work is what I focus on, but we also offer a lot of ways to manage ed tech.

Both Edmonds and Darlington are working in other areas of the platform; Jonathan mentioned the library, so that's a good example. So, we really want to be the hub for district ed tech management, so that districts can gain visibility into high-level directional data on the programs being accessed, as well as communicate safe and approved products to teachers and families, which is so important. Also, things like streamlining request and vetting processes, and then, of course, the in-depth evaluation work that we've been talking about today. So, really, it's holistic management with insights that can help in a lot of different ways. And with that, let's see here.

I have one more piece of information. Just to put a finer point on it: we want to enable districts to generate the evidence necessary to align with ESSA standards, as Chris mentioned a bit and as we talked about at the beginning. That is inclusive of a lot of different pieces of information that can go into these analyses, and you can see the different components of some of that here. We want you to be able to collect and analyze ed tech usage data, but teacher feedback can also be another component of this work.

And ideally, this will be particularly useful to you in informing decisions about ed tech as that funding cliff approaches, which I think we talked a little bit about today. And ed tech providers really do play a crucial role in generating evidence for districts, and that's something that we really value as well. So with that, I think we're going to go to some Q&A. And again, thank you so much for being here today. Thank you very much.

I'd like to remind everyone in the audience that you still have time to get in some questions, so please do. I do have some questions for you already. I'd also like to remind everyone that you will be able to download a copy of today's presentation. Just look for the resource center just below the question field.

And do get those questions in soon if you have them. All right, I'll start with some questions here. What metrics are most important to you when considering student outcomes and conducting rapid cycle evaluations on specific tools? This will be for everyone. I'm happy to start. Go ahead, Chris.

I was gonna say, for me, it really depends on the tool. We have some tools where we wanna know exactly which students are seeing positive learning outcomes based on their usage. And so we'll drill into a tool based on, maybe, the instructional program a student is a part of, certain grades, maybe even certain schools. For others, it's really more about: is the tool being used? And so, the data that we care about, sometimes we're surprised as well, as I mentioned earlier.

So the data we think we care about may depend on the tool, and then the data that actually matters to us, we may not even know until we see the RCE. Yeah. And I think I would add, for us, again, I keep coming back, I hate to beat this horse, but it keeps coming back to the strategic plan. What are the goals we set in our plan, and what are those tools that we're calling out specifically in our plan?

And so making sure that we're evaluating our tools through that lens is really important for us. And I'll just add a little bit there too. One of the other layers to that conversation about metrics is the provider's theory of change, as we call it. So, what would that provider say is a really appropriate metric to show student progress, completion, or use of a program? Because providers are going to capture those things differently, depending on the purpose of that tool. And so part of our planning process is to understand what's available.

And then from there, we determine what's the most appropriate way to look at the engagement of the product with students, or users really. So that's what I would add too. One thing I wanna say, too, is, you know, we use the vendor's recommendation for the number of lessons that kids should be doing for something. And we had a school that was going over that, and we were able to use the RCE to show that it wasn't helping. So for, like, the last two months of school, we don't have that data yet, but for the last months of school, we had them go back to the normal set to see what's going on, because we were able to see in that first evaluation that going over the recommendation wasn't helping them close the gap at all.

And they were only going over by two lessons, so we were able to kinda give that feedback to them. And any addition of instructional time is so important and valuable as well, so that's great feedback. There are sometimes diminishing returns, absolutely. And those are things that you can start to see. I think this next question is, again, a very practical one.

How do you decide which tools to focus rapid cycle evaluation on? This will be for all of you again. I feel like it should be anything that the district is promoting. And that's where we started. In our pilot, when we met with Amanda, we kinda had a conversation about, okay, what are you guys really heavily saying, hey, this is something that we should be using? So we started there. And then in the conversations, especially with our special ed department, they're like, hey, we got these programs that we have out there.

Could you add them into the evaluations to help us make effective decisions? So we started with the ones that we were really, essentially, like, pumping out, saying, hey, this is what we want people using, you know, weekly, and then we started building in the ones from our departments that they were saying would help students. And I'll take that one step further and say that we have some tools in our district that we're mandating be used, i-Ready being one of them. And, you know, that's not without some consternation from the end users.

But, you know, being able to use the data to say, it is having results, and when we implement it with fidelity it produces the results we're looking for, gives us the safe ground to stand on to be able to say, we expect you all to be using this. And the big thing too for us is, you know, we got twenty sites: twelve elementary schools, three middle schools, four high schools, and a mix there. You know, with us breaking stuff down, it helped guide us to make decisions, to see who was using things correctly.

Because we don't want, you know, a school not to be using something and then falling through the cracks. So this helped us kinda sit there and have those conversations with our principals, and it helped our directors feel comfortable bringing that to their attention. I guess, aside from those things that have already been shared, the other thing I would add, as far as factors that we use when deciding which tools to focus on: one is cost. We look at the tools that we're spending the most money on, because that's how we're gonna determine if we need to scale back, or if there's a potential cost savings to maybe scale back on grade levels of the tool, things of that nature.

The other is, there have been a couple of situations where we've felt like we had tools that were potentially redundant, or we thought there might be some overlap in the value they're providing. And so being able to do an RCE on both of them helps us to determine, is the usage there for each of them? And then if it is, which is actually doing an effective job for different groups of students, grade levels, populations, and whatnot. So, as a follow-on to that, I don't know if this is particularly relevant, but is there a number of tools that you aim to evaluate in each cycle? Or what number are you landing on currently? I can maybe just take that from experience, from what we've seen. Typically, especially at the beginning, we're going to probably want to focus on between one and three products, simply for the fact of getting started with that work, and because it is often a new process of, you know, finding that particular data and interpreting the results.

So that's, I think, a manageable amount. And of course, I'll let everybody else kind of chime in on that. But then from that point on, we really would like to know, yeah, what is out there, and also what data is available, because that will sometimes be a determining factor as well. Does anybody wanna follow on to that? Yeah. I think the only thing I would add is that one to three tools makes sense for trying to figure out what the kind of low-hanging fruit is.

But the frequency with which we evaluate each of those tools will vary as well. There are some where there's a logical, cyclical time period where we should be looking at them through the RCE lens several times a year. And for others, that's just not gonna be as valuable, and so it's more likely an end-of-the-year type thing, or some other chronological time frame. This might be a question for Amanda, but is there a typical percentage of these tools that you find are not being used effectively and need to be tweaked? Their usage needs to be tweaked or reevaluated? All right, I mean, it's a great question.

What I would say in response to that is, regardless of the product, the district, or even the implementation context, there are always going to be bright spots of implementation, whether it's a grade level, a school, or a designated student group of interest perhaps, and you can really think of that in many different ways. And then similarly, there are going to be pockets, and I think Jonathan referenced this, where perhaps it is a school site or some other student designation where that implementation is not quite hitting where the district would like it to be. So I tend to think more in terms of there being a story within each one of these evaluations. And our job here, you know, what I do, is to try to provide the ability for district leaders to see what those stories are, so that they can take that information with them and do whatever is appropriate, you know, for that particular setting.

Mhmm. Chris or Jonathan or Greg, did you wanna add anything to that one? Yeah. I mean, I felt like, with the one school that we noticed was going a little over the usage that one time, it kind of gave us insight, where we said, can we, as the instructional team, come there and work with some teachers? So it opened up a door for us to just be supportive. And that's our role in instructional technology. We always say, hey, we're here to help. You just have to invite us into your buildings and we'll be there.

And, you know, just give me a call and I'll be there anytime. And it opened that doorway where we could say, hey, we noticed this; maybe we just need to sit down with people and go through some training again, because, you know, even though we started something in August, and now it's January, sometimes it's good to get a little refresher. So we hopped in those PLCs and helped people out, and it was just a way for us to open up a door, and it was nice. That's a really great point, bringing in the collaborative aspect of this. Moving on to the next question, unless anybody wanted to add anything to that. We're good.

Are rapid cycle evaluation results shared with classroom teachers, and what has been their response to this work and your findings so far? Maybe we'll start with Greg. I think here in Edmonds, we haven't yet gone to that level. I think the rapid cycle evaluation results have mostly stayed at the building leader and district leader level, but I certainly think that's the next place to go with this: to really provide our teachers with that same data. Because, again, they are the end users of these products, and we want them to have the same access to data about their effectiveness. Has that been the case with you also, Jonathan? Yeah. Greg said it perfectly.

You know, we're in year one, so this is our pilot, but the goal is we want teachers to be able to see what's going on. With the library that we have with Learn, too, we want that to be a destination where teachers can see what products we have available, and why we might deny a product also. So we want that to be a place that they feel comfortable going to, because the thing is, nowadays, with social media and conventions, teachers see things all the time, and they wanna bring them to their school, and we welcome that, because that's a teacher growing and learning. And they wanna do what's best for their kids.

But at the end of the day, we want them to understand that we have to follow some boundaries. So we want that library to be set up so that they can look at it and have data that says, oh, this is why we don't use this program and we use this one instead. So we wanna start building that. That's our end goal. We're just now in the pilot phase of building the library up right now.

Alright. We still have a couple questions to go, if anyone in the audience wants to submit any more while we still have time. Next question: how do you incorporate, or plan to incorporate, rapid cycle evaluation in your district ed tech vetting processes? Chris or Greg or Jonathan? Yeah. That's the issue with that one. We don't necessarily even know, when we're evaluating products that we've not used before,

whether they're gonna be able to provide us data that would lend itself well to an RCE. So that's a bit of a tricky one. That being said, I think if I, as a customer, knew that a vendor had a robust partnership with Learn, or had done authentic RCEs in some form that justifies what they're marketing to me, that would be compelling. So I don't know if that's a direct answer to your question, but I don't know upfront, when I'm meeting a vendor for the first time, whether they even have data that would lend itself well to an RCE. So I don't think that necessarily helps me decide whether to pursue this vendor or not.

But I think there are other ways I can get that sort of data. Okay. Is that the case in your district as well, Jonathan? Yeah. And the great thing is, I think our vendors have all been real welcoming to this. We haven't had anyone say no.

You know, when we explain what we're doing, we give them Learn's contact information. I gave out Amanda's email a couple times, and I'm like, if I'm not explaining it the best I can, Amanda can explain it to you. They seem welcoming to it, because at the end of the day, they understand that we are trying to make sure that the cost is effective for our students, and they wanna be transparent with us also. So I think it's been a good thing that we've had. It's built our partnerships up with our vendors a little bit better, because I think they are sitting there saying, wow, they're really taking this very seriously, and I like that.

You know, to me, it's been a good thing. All right. How can rapid cycle evaluation be used to improve adoption and implementation of specific ed tech tools? A very broad question, for any of you. Well, I think, again, using the example of the Paper tutoring product that I mentioned earlier: we needed to make a decision about whether or not we were gonna continue to, you know, provide resources to fund that. So, while we didn't have this kind of data when we implemented it, it certainly helped us to make a decision about whether or not we were going to continue with the product.

All right, Jonathan, anything to add to that? I mean, I think Greg said it perfectly. This is just a good way, in case, like, a school board member might not be familiar with something: we just want to provide information to them so they feel comfortable, because at the end of the day, we've got people that are making decisions, and they wanna know that we're making the right decision. So, we wanna provide them as much information as we can.

So that way, if they see something, they can ask a question off of that, or they might see the conversations that we're putting in the RCEs, and they know, okay, I need to go talk to the director of special ed, because she's been asking some really good questions here, and that can help them when they have to make a decision. Alright. This has been really interesting. Amanda, do you want to get in anything else before we close out? I'll just say I love hearing these conversations myself, and that the process is really designed to be not only iterative but flexible, so what you're hearing here are just different versions of the same process. And so, regardless of what the product is or what the assessment calendar is like, there are a lot of good ways to do this work, and that's probably one of my favorite things about it.

Okay. All right. Well, we will close out the session. I'd like to thank all of our speakers for a very informative discussion on a topic that we haven't covered before. It has been very eye-opening.

I'd like to thank Chris Bailey, Greg Schwab, Jonathan Skaris, and of course Amanda Cadran from LearnPlatform by Instructure. And I'd like to thank our sponsor, Instructure, for their support. I'd also like to remind you in the audience, again, that in the next day or two we will be emailing you a link to an archived version of this session, so that you can review it or share it with a colleague. And this concludes our webcast. Thank you very much for attending. Thank you, everyone.

Hear from leaders at Edmonds School District in Washington and Darlington County School District in South Carolina who are leveraging rapid-cycle evaluation (RCE) in partnership with LearnPlatform by Instructure to analyze the impact of strategic edtech investments. We’ll discuss why they chose to do this work, how they think about measuring outcomes, and how they see this data impacting budgetary, operational and instructional decisions going forward.

Watchers can expect to come away with first-hand insights on evidence-based edtech management, best practices for district-generated research, and actionable tips for engaging with edtech provider partners in evidence work.

Webinar Panelists:

  • Chris Bailey - Director of Technology, Edmonds SD
  • Greg Schwab - Assistant Superintendent of Secondary Education and Facilities and Operations, Edmonds SD
  • Johnathan Skaris - Instructional Technology Coordinator, Darlington County SD
  • Amanda Cadran - Program Director, Rapid Cycle Evaluation, LearnPlatform by Instructure
