Evaluating Your Education Technology for Impact
When it comes time to decide which education technology tools ("edtech") districts should continue to invest in and support, having access to key data and the ability to analyze it into something meaningful is mission critical, especially when faced with rising costs and smaller budgets.
Before we do that, though, just a few points I wanted to mention here. As always, we will be recording today, and within a couple of days, if you are registered for the event, you'll get that recording link. So we'll definitely be sending that out to you. We will have our videos on. We're not gonna be sharing a lot in terms of slides once I get through this introduction section, so we really want it to be a great conversation.
But we definitely have a lot of opportunities for questions. So as you're hearing our panelists speak today, please make sure to use the Q&A feature. I'll be monitoring that. And also we've got other members of our team here: Erin, who leads our customer success team, and others. So please do not hesitate to throw out a thought or a question.
If you connect with what we're discussing, we'd love to hear that as well. And I'm just gonna take a moment before we dive in to share, in case you're not familiar with us: LearnPlatform supports K-12 stakeholders in generating evidence and driving efficiencies with that evidence to help inform decision making, which, of course, can be so many different things. We'll hear a little bit about that today. States and ESAs, right, in addition to districts, can use it to help identify potential savings, and, really, this is all about helping to support strategic decision making and reducing the workload on districts.
And this is a part of our conversation today because the district leaders that we're gonna be talking with are working with us through a state-level initiative, and we'll get into that in just a moment. So let's talk just a little bit, set some context here, and then we're going to jump right into our introductions. Since we first started tracking usage and engagement in EdTech, around 2018, maybe no surprise to all of you that we've seen the number of EdTech tools used in school districts rise steadily. With those widespread school closures in March 2020, you can see that growth accelerated rapidly, jumping from an average of 952 tools used per month all the way up to over 1,300 after mid-March. So we have that delineation there.
And the evidence that we have shows that that acceleration has a lot of staying power, as the number of EdTech tools in use, as you can see, climbed even further and has remained pretty steady through the 2022 school year, so through last year. This ongoing and pretty rapid expansion of EdTech has created a lot of different types of challenges and considerations for districts as they're looking to modernize learning environments and improve how they make different types of decisions, whether that's instructional, budgetary, operational, or a combination of all of those. And there's not necessarily a number that we can say is the right number of EdTech tools for a district to have in use, but what's really essential is for districts to have a system in place to manage whatever those resources are efficiently, and to understand if they're getting value from the tools that they are using, in order to build capacity on their teams to really continually improve in that area. And so with that in mind, what I'd like to do is have our participants and panelists, who we're so grateful to have today, introduce themselves. I will just say, as you see here at the bottom, I am a rapid cycle evaluation program director for LEAs and SEAs.
So, districts, charters, educational agencies at all levels, and really grateful to Jennifer and Shannon for being here today as well as all of you who are attending. Jennifer, would you like to introduce yourself and tell us about your role in your district? Sure. Hello. I'm Jennifer Warford, and I am the curriculum, instruction, and assessment coordinator for Stafford Municipal School District. And we're a very small school district, which means that I get to wear lots of hats.
And so part of my job is just aligning, kind of the district vision with the different campuses, but I also manage a lot of the grants that come through at our different levels. I work with the specialist to create the actual curriculum that goes out to the campuses. The instructional coaches are under me, so that helps us push out implementation. And then even the interventionists are under my department. So, in terms of blended learning, that's a nice piece to have as a component so we can really look at what we're doing and fine tune kind of, the different kinds of implementations that we're working on.
Awesome. Thank you so much. Shannon, will you, give us a quick introduction as well? Anything you'd like to share about your role or the district? Absolutely. Good afternoon, everybody. Pleasure to be here.
My name is Dr. Shannon Trimble. I'm the executive director of innovation and digital learning in Terrell Independent School District. Terrell is about forty-five minutes east of Dallas. We're more of a rural district with urban characteristics, highly diverse, you know, seventy-six percent EcoDis (economically disadvantaged), and we are at our highest enrollment ever, with a little bit over 5,100 students. And so, some of my responsibilities: like Jennifer, I have worked in curriculum and instruction prior to moving into my role in innovation.
I do manage and write a lot of our grants, particularly our blended learning grants. We've been awarded over a million dollars in blended learning grants. And so I support implementation and partner with the curriculum team on that, as well as being over our library program and our technology integration. And so I'm looking forward to talking to you today more about how we've used LearnPlatform. Yeah.
Appreciate that. And, actually, with that, I'm gonna stop the share so we can really make this a true conversation. We are definitely going to get into the specific work that each of you has done around evaluation and understanding effectiveness, but I definitely want to make sure to take just a moment to share a little bit about the context that I have gotten to work with each of you in, which has been really fantastic. So this work that LearnPlatform has done with the Math Innovation Zone districts in Texas is part of what we call the Texas EdTech Effectiveness Clearinghouse, and what it's designed to do is give the Texas Education Agency, or TEA, which you might hear us refer to today, insights into EdTech use and engagement by students across those Math Innovation Zone districts, and then build local capacity, so district-level capacity, for more informed EdTech decision making, as well as just understanding that through the lens of technology use as costs continue to expand. So participation in the current cohort is really designed to enable districts to perform different types of fidelity, cost, and what we call impact analyses, which are rapid cycle evaluations, to better inform budget and EdTech implementation.
This is a process that has, kind of a cyclical nature to it. We'll get into that a little bit today. We also have something called a digital inventory dashboard, that is there to allow districts to collect data on what is being used in their classrooms. So just a little bit of, information on how, I have come to work with Jennifer and Shannon, and I have been doing so since, I'd say, toward the end of last school year, really starting this past summer. And so as part of our conversation today, I think it's really important for our, listeners and those that are joining us to kind of understand, how you all got started.
So, Shannon, I was going to ask you, how did you first hear about this work in general? And then how did you decide which products you wanted to evaluate first as we started working together? You bet, Amanda. So, I mentioned that part of my responsibilities is helping manage our implementation of our blended learning grants. Those are TEA, the Texas Education Agency, grants, and, you know, it's full of acronyms and alphabet soup, so BL, GPM, IZ. As part of being a participant in those grants, we were given access to LearnPlatform. After we were given access, TEA had us go through an extensive training program. And as a result, it helped us understand all the different features that LearnPlatform offered, as well as how we might end up applying those.
And so along the lines of our blended learning programs, we wanted to evaluate the effectiveness of the adaptive learning software that we're using. You know? Because in the end, you know, we wanna make sure we have that return on investment, that we're being fiscally responsible. But the most important thing is the student learning. And so we have access through our TEA grants, and the products that we chose to evaluate first were also aligned with those grants. And so currently, we use Achieve3000 in literacy and ST Math in math, and so that's where we started.
Thank you. And I know I'll kind of pepper our conversation today with some of the work that we've done. Part of our process is really looking to understand where your priorities are and the context that you're using those tools in, whatever they might be. And so what has been really great is getting to know what's happening in your community and what's important to you. Jennifer, I'd love to hear, kind of similarly from you, where you first maybe heard about this rapid cycle evaluation concept, and then how you came to determine where you wanted to get started with that RCE work. Sure.
So I heard about it in a very similar way. We were part of the MIZ cohort. So, when we were introduced to Math Innovation Zones, they said that there was a wonderful opportunity to kind of really dig into what the products are actually doing with our kids. We had been, of course, uploading data and doing all sorts of different things with the different products that we had. But this was an opportunity to kind of dig deeper into the different demographics that were going to be benefiting from the programs and things like that.
And that was really kind of where we needed to go with the data at this point. I mean, we could generally tell whether it was helping, but we just needed to take it a little bit further. So with the products that we were really pushing out to the campuses with fidelity, there were different kinds of usage requirements. Those are the products that we decided to start with, just because we thought that they would give us the most reliable metrics on whether the products were really serving our populations. Yeah.
So I think what I'm hearing from both of you is there are similar entry points. What we're gonna get into now, as we go a little bit further, are some of the more specific questions or considerations you may have had, or things that were on your mind. But I think it's always great to just touch on the fact that that word "cycle" in rapid cycle evaluation is actually really important, because this is a process. And a lot of it has strong connections to the calendar, where we might be in a school year assessment cycle. And so I wanted to just talk a little bit with both of you about the approach, or maybe the mindset, that you were thinking of as you started to really embark on this work, which, of course, involves gathering various data elements, maybe working with members of your team to determine where some of that lives.
So thinking of the process and how that unfolded, I'd love to know your thoughts on that, and if you were surprised about how any of that came together once we really started digging in. And, Shannon, I think maybe I'll start with you for that one. For sure. So, when we gained access to LearnPlatform, I had the pleasure of working with Erin Curran, and both Erin and Amanda have been amazing support throughout the process. Our first steps were looking at the library, you know, the digital tools that our teachers were using.
I think one thing that was really eye opening to me there is, man, they were accessing a lot of different digital tools. And so, knowing what they were accessing, the impact that's going to have on learning, and then trying to prioritize and narrow down, you know, which of those tools are most effective. Then, because of the blended learning, we wanted to, you know, look at our adaptive learning programs. We have a usage goal of sixty minutes a week. So we set those expectations.
Our vendors were in contact with LearnPlatform to share our usage data. And then, once we got to our middle-of-the-year MAP testing, we downloaded the comprehensive data file to share so that we could run those analyses. As all of that was going on, there's another wonderful tool within the platform that allows you to survey teachers on these tools as well. And so, you know, it's one thing to see the correlation in the RCE between usage and learning, but we also wanted to get the input of our teachers on, you know, how they were using it and, you know, what they were seeing from their end. I don't know if I was surprised at the results.
I guess I was maybe a little pleasantly surprised, because they were reinforcing that the tools that we had selected were having a positive correlation with student learning. And so that's allowed us to share that information along with the teacher feedback from their survey. I think their survey had it ranked as a B+, and then to align that by showing that the learning was occurring as well, it really helps support that ongoing buy-in and ownership. You know, a lot of times we have conversations about buy-in, and we develop structures to get that ownership upfront by involving our stakeholders in that. But sometimes that buy-in lags until you see the data showing that these tools are effective.
And so I really think that that has helped our ongoing implementation, by the teachers seeing that positive correlation of usage with student learning. I've got some follow-up thoughts that I think we'll get to in a few minutes, but, just following that, and thank you for that, Jennifer, I'd love to know how your experience connects to what Shannon was just saying, or perhaps if you have kind of a slightly different perspective on how this unfolded for you. There are definitely some similarities. I think one of the things that we were most looking at was right after COVID.
We had all sorts of blended learning projects that people were using. And so we knew that once grants ran out and we really had to buckle down, we were going to need to choose just a couple of those that were really effective. And so I think that that was really our focus going into it. We love the teacher surveys that kind of told us what teachers wanted to use too. But with so many gaps that were left by learning loss and things like that, we really wanted to know what was going to move us in the right direction, and when money started to disappear, where we needed to reallocate that.
And so those were the guiding questions that really kind of led us in a particular direction. And in terms of surprises, kinda like he was saying, I'm not sure that there were big surprises, but it was wonderful to have the data to be able to go back and show the teachers. Because when you're having conversations about what's working and what's not, there's that anecdotal kind of evidence. But unless you have something concrete that you can show them, that, okay, maybe this didn't work, but look at our fidelity in this particular grade level or this particular classroom, or this really helped our ELLs, or now our EBs. These were the kinds of conversations that we were able to have as a result of it.
And so maybe I wasn't surprised, but I think some teachers were probably definitely surprised by what we brought back. Right. Right. And I definitely want to talk about that just a little bit, because I think what you mentioned about that anecdotal piece is very relevant. And, for those working in K-12, we've all been there at different points in time.
Just a little bit of, I think, additional context for where we started. So, Shannon, with your work, we started with data that was available, like you mentioned, at the time we began working together, which was the beginning of year to midyear for last school year, 2021-22. And, initially, we were able to run a couple of different evaluations with usage, which often is where we start. So there's kind of a soft landing entry point a lot of the times, so that you all can just see what this is about.
Right? What do the reports actually give you? And then, as you mentioned, we added on your MAP data both for midyear and the end of the year. We'll talk about this in terms of forward thinking, but now, where we are in February, is a great time for you all and other districts to think about midyear analysis for 2022-23. And so that just shows a little bit of that cycle that I mentioned, because having one set of analyses is definitely powerful. As you mentioned, sometimes confirming what you already knew or suspected can be just as important, if not more so, than maybe uncovering things that you didn't know. All of it is relevant.
But, you know, we definitely love to have a couple different cycles, or kind of points in time, where we can see that information, and then you can start to get a sense of, is this a pattern? Is this something that now we're seeing two times, three? And that can also lead to different types of conversations as well. So definitely appreciate that. For Stafford, I know that initially we had discussed products that were priorities in different ways and determined, for you all, which product would most easily produce the type of data that we need for this work, because, as we know, that's just variable for all different providers, and that was a math software. And then, because of that time of year, again, we looked, for you all, at a full school year's worth of data initially, and then began planning for this next set of outcomes, which we'll talk about here in a moment. So I say this to kind of just highlight that this is really flexible, and wherever we begin that work, we're gonna figure out the best place to move forward.
Whether that's August or, you know, March. It doesn't matter for us. We're gonna really get you where you need to be, and you all have been fantastic in working with us through that. Let's see here. So you've talked a little bit about what you learned, and I actually have some examples too, if we wanna, you know, get into that.
Jennifer, you had mentioned your teachers and your staff. I think that, for us, a huge piece of this work is, of course, getting to the analysis part, and we need some data for that and all those other elements that come together. But, really, we want, at a district level, the focus to be sharing out in some way. Can you talk a little bit about how you did that, or what decisions you made to share your findings and apply those, and how that was for you? Sure. So at the beginning, that first round where we were just looking at usage, it became a conversation about the systems that we had in place, and whether we as a curriculum team needed to go back and revamp what we were pushing out onto the campuses, and whether it fit for everyone.
And so it was just a nice conversation piece on what we were doing, and it just let us step back from that process. And then in terms of the later data that actually incorporated the MAP information to see whether we had student growth and things like that, it really also let us kind of hone in on whether we were making the kind of progress that we needed, and created a context where we could understand who was moving if we did our part to make sure that kids had the opportunity to show that growth. Whether those system changes that we put in place earlier in the year were enough, or whether we really needed to give it more time. Because I think one thing that y'all do a great job of emphasizing is that just because you get a certain grade level that doesn't produce results, you're not really recommending that you throw out the product. It's really supposed to be something that you bring back, and you have conversations about whether or not it's working, whether or not there are different aspects that you need to try and revisit.
And so those were the kinds of conversations that resulted from the RCE at the end of the year. Yeah. I'm really glad that you mentioned that, because I think you're right. And, of course, you know, all of this is very context specific. So both of you went through our patented RCE questionnaire process right when we first got started.
We wanna know how you're intending to use those tools, who has access to them, what grade levels or maybe student groups, and then what you would expect to see happen as a result of the use of that product. Because, as you all know, that can be really different depending on whether something is used in a primary curricular, you know, way, or if it's a supplement. There just might be different expectations that you all have, and that's variable. It might even change from school year to school year as products come in and out. But what you mentioned, that I think is definitely a key part of our conversations for any LEA, is that concept of, we're not really looking in a binary way at whether something is good or not good.
As you said, there are going to be variations within grade levels or other student groups. And, actually, Jennifer, I was wondering, is it okay with you if I pop something on the screen that shows just a little bit of what you have done? That might help illustrate some of that for us. Sure. Go for it. Okay. Cool.
Let me oh, that's my video. That's not the share button. Alright. Let me get this on here for us. And this is just, I think, a quick way to visualize.
So if you're listening and you're wondering, well, what does that actually look like? I think this is kind of a neat way to think about it. So we should be able to see that screen again. What we are looking at here, of course, is this math tool that we had been evaluating and analyzing to see how effective it is for different groups of students, to which degree, and then under which conditions. Here we have first, third, and fifth grade. And on the left side, we can see average product use.
This is looking at number of completed lessons. There's gonna be sometimes multiple ways that we can understand use depending on what the provider is capturing and what the provider might feel is the most important way to understand that. So we can see here that, in this study, third grade had, higher average overall use during the time frame we were looking at in terms of how many lessons they had completed, with first and fifth grade being pretty much even there, right around sixty total completed lessons. So that gives you a sense of, across these grade levels, what was their usage like? On the right side, we're then going to take that a step further with the addition of some kind of outcome, and we're going to see, okay, well, for that additional use that third grade had, it seems that that may have, you know, in some way shown that relationship to be stronger between the use of this math tool, and whatever outcome it is that we might be looking at. So in this case, more use may be preferable.
That's not always true. Sometimes different student groups need different amounts. They may not need as much as others. Here, everything is positive, so, there's definitely, you know, good relationships there across all grade levels. But we could say that, more use may be beneficial.
And so, again, it's not a good-or-bad question. It's maybe we can find out what's happening within those grade levels, and how those teachers are applying the product too within their own instructional time. Jennifer, I don't know if you'd wanna add anything there, but, hopefully, this is helpful for everybody who's watching, just to get a sense of what that looks like when it comes down to sharing those results. No. That is exactly the kind of thing that we were having conversations about.
And so certain grade levels grew a little bit more. You had some grade levels that had a little bit better usage, and then it got even more complex and nuanced once we broke it down into the demographics. And so we really just kind of put our thinking hats on and dove into why different groups would have had different results, and whether there were big pieces of information that we could take away from it. Whether the experience of the teachers, or how they were trained in the middle of the summer, how these kinds of things would have affected some of these results, and that's really just what you have to do with it. Yeah.
And you can see here on the right side some of those other, we call them covariates, but for the most part they are demographic groups, and that's something that we do determine with every district, what we wanna include there, but there's a lot of options for it. And so, Jennifer, I think your point is a really good one. I'm gonna stop the share for a second, but I did wanna mention, and, Jennifer, you know how I feel about your team: Jennifer's team includes a data specialist who has built a pretty robust database, and what that does is it collects and displays the data that we would use for this kind of work. So that includes some demographic data, outcome measures and scores for those, and metrics on usage data for different products or software, and that enables our team to kind of more rapidly run some of these analyses.
But I wanted to share that in the sense that that's a great example, and, Jennifer, we love that you all have that. But our research team is really always available, very ready to review, clean, and combine datasets together. In this case, you know, we actually just received that from your team, Jennifer, I think last Friday, because your winter testing window recently closed. So this work is happening in real time, even as we're talking about it today. Shannon, I wanted to get back to your experience just a little bit, and, in addition to any other pieces that you learned and maybe how you've applied them, I also want to do some forward thinking, a little bit of next steps for you with our RCE work, what you might have planned.
I know we've talked a little bit. Anything that you wanna share there? Sure. Like Jennifer and Stafford, we also approached the results looking at what systemic supports were, you know, effective, and which ones were maybe needed to help support implementation. A spin-off of that, also, as we were looking at the different grade levels and different results, we started talking about professional development, and I know recruiting and retention is a challenge, you know, nationwide. And so what we were able to identify even further on some of those grade levels was that, you know, we've had some turnover.
And so there's that constant need of, you know, providing those ongoing, quality professional learning opportunities. You know, the first time a teacher goes through a training, they're gonna get certain things out of it. And so we wanna continue to offer that training so that, you know, they're able to utilize the programs according, you know, to our expectations and guidelines. So that was really one of the other things that we got out of that, besides examining our systems: you know, our professional learning. As far as the future, like Stafford, you know, we just concluded our middle-of-the-year MAP testing.
So I'll be working with you guys to get all of that, you know, in the system so we can look at that analysis for ongoing opportunities. But, yeah, you know, it was nice to see that first year, but, you know, as we're talking, you wanna, you know, make sure that those results are consistent as we continue moving forward, and, you know, making sure our students and teachers have the tools that they need. And so we'll continue that process. We've also been in talks with the technology department and curriculum on, you know, how we can utilize other elements in LearnPlatform, as far as that library, you know, and teachers submitting, you know, digital tools for review and so forth. And so we're still in the infancy stages there, but we see the potential, because, as I mentioned, they were using a lot of different digital tools, and we wanna make sure they're using the right digital tools.
And so we we see some potential there in, making sure that we're, you know, helping provide that selection of those tools that are gonna be most impactful. Mhmm. Thank you. Yeah. No.
I appreciate that. And I think one of the things that I wanted to just mention again is kind of another layer to how this comes together. Shannon, I know for your studies in particular, some of the things that came of this work were the inclusion of different types of study designs. So our research team tests files for what we call sample imbalance, and, if needed, they're gonna separate grade levels from one another in the way we run the reports so that everything can be run in the most appropriate way. So I'd like to mention that because I think it's important, and we talk about this all the time: I might be sitting here, you know, talking with you about these RCE plans, but surrounding all of this is our research team, who is going to look at every single file we get for this work, clean and combine them, and then let you know if there are things that are missing, or that there may be some questions about, and then really try to, with the context we have, prepare reports for you that are going to be the most meaningful.
So I know you were one of our TA districts in particular where we ended up with a number of different reports, because we wanted those to be kind of the most valuable for you. And it's been great to go through that. I think that what both of you are saying in terms of the PD questions, just understanding implementation, speaks to the knowledge that you have of your communities, because a lot of times our reporting delivery calls are more asking questions. Why do you think this grade level, you know, may be experiencing such success with this program? Or a school site. You know, let's talk a little bit about what might be happening here.
And so having that kind of innate knowledge of what's happening where you are can really help us, because, of course, we don't come with that context. But, hopefully, that gives us a sense of kind of coming together, where you can provide some of that additional insight and then take that back. So we really appreciate everything that you have done. And, you know, I think, if anything else, for those of you that are listening today, we wanna hear if you have any questions, or if you've been thinking about doing this kind of work on your own. I guess just one last question I would have for both of you.
Now that you've heard each other talking about how this work has started and what you have done, Jennifer, I think I might kind of start with you here, but what value do you think there is in kind of hearing from other districts? Because that's something that I know we always think about. It's not always easy to do, but what could you get out of these kinds of conversations that might be able to help you where you are? No, I think that that is incredibly important, and I like that one of the things that came out of a lot of these grants and a lot of the work right after COVID was these communities of learning. Because, like we said, a lot of what we're doing is taking this data back and making critical decisions about systems. And what works in one district might not necessarily work in another district, but it's certainly nice to have that kind of a sounding board, to hear what people have done and what they've tried and things that may have had an impact on the results that they're seeing.
So in education, you know, we don't steal. We borrow. And sometimes our neighbors are the best people to borrow from. Absolutely. I love how you said that.
Shannon, what do you think? Similar thoughts? Anything you'd want to add there? I have a philosophy of beg, borrow, and steal everything. And so when I get the opportunity to learn from others around the state that have utilized LearnPlatform, as well as, you know, participating in blended learning, a lot of the time when we're trying to solve problems in a district, the solutions aren't in the district, and we need to get outside of that district. And it provides us opportunities to collaborate and share things that we've tried that may or may not work depending on various characteristics of the district, but it gets you thinking. It gets you collaborating and sharing with those outside of your organization to improve everybody's organization. So if PLCs are obviously a best practice, it's very nice to be able to collaborate with others in similar roles from around the state and nation to be able to help our students as much as we can.
Absolutely. And for us, because we are fortunate enough to work so closely with districts of literally every shape and size you could imagine, we hear so many similar types of things from districts regardless of where they're located or how big or small those districts might be, in addition to those really specific needs. So bringing that together in a way that feels really practical is definitely an important goal for our team and what we do. I wanted to just take a moment and ask for anyone out there if you have questions. I do have the Q&A open.
We can give it a minute or two, see if anyone would like to hear more from what Jennifer or Shannon shared, or if you just have general questions about the process and how this work comes together, and we're happy to answer those. While you think about that, because I always say it takes me a while to think about my questions when I have them, I am just going to put something up here with some additional resources. Of course, you will get this as part of the recording, but I wanted to just mention a couple of things here before we wrap up for today. Also, I just really want to thank Jennifer and Shannon again for today, but also for the work that you have done over this last almost year now. Hard to believe, but we're coming up on it.
It's really made it possible for us to do a lot of things within, you know, the MS district zone. So thank you for that. Okay. Let's see. We'll get these resources up for you.
John had a question. Oh, this is a great one. Do you make privacy policies from vendors public? So there are a couple of different ways that we can connect through the platform to the Student Data Privacy Consortium or, you know, help you understand what policies or exhibits, for example, might be in place where you are. And that's definitely something, John, that if you wanted to know more about, we could get you to someone that can show you exactly what that looks like, and how that kind of connects to the national database there. Yeah.
Really great question. Let's see. Anything else? Oh, great. Okay. Well, good.
So you are in touch with the right people then, John, for sure. I was saying I know enough about that to be able to start the answer to that one, and we can definitely get everything that you need there. Okay. Any other questions out there? Hopefully, you all found this to be really helpful. It's amazing to me to just get to hear these kinds of conversations and take a step back a little bit from the day to day and the very detail-oriented pieces that are involved in getting this together.
So I know I learned a lot. In terms of the resources that are here, of course, our website is there. We also have that link to the inventory dashboard, which is a really great complementary resource. It does not utilize the same data as we do for rapid-cycle evaluation, but, Shannon, I think you've referenced it, it will kind of give you that higher-level look across your schools and what edtech products are being accessed, which can definitely be very insightful. And then we do have a really great ebook that is available as well.
So maybe we'll give it another minute or so. But again, I just want to thank everybody. This has been really, really wonderful. Thank you for your time, and I'm going to be working with all of you now and also in the future. So this is definitely an ongoing process.
In this webinar, we bring together representatives from two Texas districts to share how they are evaluating strategic edtech implementations to inform instructional, operational, and budget decisions. We will discuss how they are answering questions such as:
- Is an edtech tool being used to the extent we intended when we purchased it?
- Is usage of a particular edtech tool having the desired impact on student learning?
- What recommendations can we provide to our school leaders to help them get more value from a specific edtech tool?
- How can we support better adoption of a particular edtech tool?
Panelists
- Dr. Shannon Trimble, Executive Director of Innovation & Digital Learning, Terrell ISD
- Dr. Jennifer Warford, Curriculum, Instruction & Assessment Coordinator, Stafford MSD