Evidence in Action: How EdTech Partners Are Solving for Policy, Proof, and Impact
Hear from Edmentum, Age of Learning, and Lincoln Learning as they share how they’ve met policy and funding requirements while driving measurable results. Learn how Instructure’s Partner Program and Evidence as a Solution help simplify compliance and amplify impact.
- Real-world challenges and lessons from EdTech peers
- How partners use Instructure’s Evidence solution
- Best practices for proving impact and building trust
Good morning, or good afternoon, depending on where you're joining us in the country. I'm Erin Mote. I'm the CEO of InnovateEDU, and I'll be your moderator today for this exciting conversation we're going to have around evidence and impact. Thanks for joining us on the Friday before what is a busy holiday week.
Please meet my fellow panelists, who are as passionate about interoperability, evidence, and making an impact as I am. Joining me is Michelle Barrett. Michelle, tell us a little bit about yourself. And I also want you to tell us: what is your favorite Thanksgiving or fall food? Get us hungry right around this lunchtime hour for so many. Alright.
Super. So hello, everyone. I'm Michelle Barrett. I'm the senior vice president of research, policy, and impact for Edmentum. It's a real pleasure to be here today to talk about evidence, how Edmentum views the world, and how we can continue to improve our impact on the educators and students that we serve.
In terms of my favorite Thanksgiving dish, I think it counts as a dish, but maybe not. My husband makes a wicked jalapeno gravy, and it is really amazing and delicious. So that would be my favorite. Does it freeze well? Because I would keep that all year long. Absolutely.
That sounds like it could be an all-year-long kind of thing. Well, thanks for sharing a little bit about Edmentum. I know we're gonna hear a little bit more later about the work that you all are doing, both in partnership and at the company. Next joining me is Heejin Bang from Age of Learning. So tell me a little bit about you, and maybe share your favorite fall or Thanksgiving side dish.
Sure. Hi. My name is Heejin Bang. I serve as the vice president of efficacy research and evaluation at Age of Learning. I love our company's mission statement, which is to help children everywhere build a strong foundation for academic success and a lifelong love of learning.
My favorite fall side dish is a Korean dish. It's a rice cake that gets filled with sesame, chestnut, or red beans and is steamed over pine tree leaves. That sounds delicious, and it's bringing me back to the holidays in Koreatown when I lived in New York City. It was one of my favorite things to do. Great.
And certainly last but not least is Charlie Thayer. Charlie, can you introduce yourself? And, again, bring us home with your favorite fall or Thanksgiving side dish. Sure. Thank you, Erin. I'm Charlie Thayer.
I am the chief academic officer at Lincoln Learning Solutions. It's a pleasure to be here with my co-panelists and everyone to chat about this topic. It's something that's very important to us, and we've really been pushing over the years to dive deeper and deeper into evidence at Lincoln Learning. It's part and parcel of what we do, working with schools to provide a variety of different educational solutions in their environments. We work with folks in a lot of different environments, so it's really important to understand the nuances across those.
Favorite side dish. I could go one of two ways: I could go with a side dish, or I could cheat into the desserts. So I think I'm gonna go with the desserts. My wife makes an amazing pecan pie. It is the best.
So she makes a battery of pies, and the pecan pie is at the top. It's usually the one that's gone first from the dessert table. So that's the one I'm going with.
I'm gonna ask you to wade into some dangerous territory here. Are you a pecan-pie-with-ice-cream person, or pecan pie on its own? This is a dangerous question to ask, I know, Charlie.
No, no, all good. I broached the dessert topic. You opened the door, as we said. I went there.
You did. I'm gonna say with whipped cream. That's how we do it at Thanksgiving. Whipped cream is kinda right in the middle. So not without anything, but without the ice cream.
Well, I'm so excited for you to join me today in this conversation. I'll just finish up by introducing myself. I'm Erin Mote. I'm the CEO of InnovateEDU. We are a house of brands, not a branded house.
So you know us from our multi-stakeholder alliances, like Project Unicorn, the EdSafe AI Alliance, the National Partnership for Student Success, and the Pathways Alliance. Through this work, we touch over three thousand organizational partners and two of every three school districts in the United States. And so we are very focused on how we serve folks who are in the field doing the hard work. Today, we're gonna focus on evidence and interoperability. So, Michelle, I'm gonna come back to you first.
Tell me a little bit about Edmentum, and tell me why you care about evidence. Why does Edmentum care about evidence? Absolutely. So Edmentum serves over five million students in more than six thousand public school districts in the United States. We also serve students in over a hundred countries.
We believe very firmly that all students deserve learning acceleration that is tailored to their unique needs, whether they are looking to catch up, stay on track, or chart their own path. We are really interested in changing the direction of students' lives. It matters very deeply to us. We care about evidence because it is a really critical feedback loop: it helps us understand what we are getting right as we work to positively influence trajectories for students, and it also tells us where we can improve. We're creating innovative solutions to solve the really tangled and challenging problems that our district partners are facing, and we're also working with those partners to implement the solutions.
We care about evidence because district administrators don't really have time for trial and error. They need to serve students at scale today with solutions that are helping them achieve their goals. And so evidence is intertwined in everything that we do and in the decisions that we're making, to make sure that we are having the greatest positive impact we can on the students that we serve. And so for folks who interact with your platform, in case they're not familiar: it's student facing, but on the back end there's lots of data and information about how to use that data to act. Is that a good way to put it?
Exactly. So at Edmentum, we offer three solutions for district partners. The first is all about accelerating student achievement. Here we offer assessment; grade-level, standards-aligned digital instruction and practice; personalization; intensive support; and human-centered virtual instruction, all in a coherent system that drives academic growth on the assessments we administer to students as they pursue their academic instruction. The second is our attempt to reimagine the path to graduation.
Here, we expand access for students by offering credit-bearing digital programs that maintain diploma integrity while also flexing to the needs of individual students, families, and their districts. These can address difficult challenges like attendance and declining engagement, and really make sure that students are able to learn the academic skills they need to unlock opportunity. The third solution is expanding college and career readiness. Our solution brings students on the journey from middle school exploration all the way through to employability and postsecondary success. So this solution supports academic skill development to unlock opportunity.
It supports durable skill development for lasting impact; students are able to take those skills and transfer them across multiple potential career pathways. And finally, it supports their technical skill development, which helps with industry certifications and employability. Thanks. Thanks for digging in just one level deeper so folks can be a little bit more familiar.
So coming to you, Charlie: tell us a little bit about Lincoln Learning. What products do you offer, and to whom? And how do you think about evidence in your product offerings and their use? Yeah. For sure. So we're in a very similar space as Michelle and Edmentum. Lincoln Learning is primarily, and I say primarily because I think we're a lot more than that, a K-12 digital curriculum solution.
But what we've really kinda backed into is creating a versatile set of services and capabilities with our offering. So while we may provide traditional courses, those courses are created in such a way that they can be flexibly implemented to snap into the different learning environments that we work with. We work with schools of all different shapes, sizes, and compositions: urban, rural, highly digital, not very digital, and everything in between. The spectrum is broad when it comes to the types of schools that we work with.
So that was really the backbone for us in approaching the work we've been doing for the past several years: to build something where it really doesn't matter what setting you're coming to us from. We can meet you where you are. And that's really what we try to do, and I think we do it really well at Lincoln Learning: we partner with folks. We make sure that your strategic goals as a school are our strategic goals. We really try to lean in and align ourselves with our partner schools to make sure that, together, we're both best set up for success.
And I think part and parcel with that is being mindful of evidence. I think I heard someone else say: you can have a really good idea, you can have a really cool idea. EdTech is a really innovative space. There are tons of cool things.
I love EdTech. I actually fell into EdTech years ago by accident, and I fell in love with it. I can't move away from it; I can never imagine myself not working in this space, because the canvas is so broad for us to work in. But evidence is really paramount in that, because, you know, there are lots of good ideas.
We have tons of good ideas. I work with tons of creative people. I'm surrounded by some of the smartest folks that I've ever encountered, and I'm astounded on a daily basis by some of the things that they come up with. But all those great ideas, in execution, might fall flat.
Right? So I think evidence really brings things full circle. We set out to do a thing. Did it make sense? Did it work? Is it working? Maybe it worked at one point in time but it's no longer working, or maybe it works in this setting and not in that setting. So evidence for us is really an internal validator to make sure that we're on the right path and moving in the right direction. And as we continue down the path, are we making the right turns? Are we making the right moves with the decisions that we're making? It's really important not just in the academic affairs department that I get to work with. We're a close-knit organization, so it matters to our business development department too: making sure they have information they can use as testimony, as support, to say, right, we understand what your profile is, and here are some ways that maybe we can help you succeed or improve in those settings.
So that's, I think, what's most critical about evidence from our perspective: validating. If we took a course and deployed it in a traditional brick-and-mortar classroom, what were the things that were successful there? That comes back in the evidence loop for us to validate those pieces. I love that. I will never forget: I was in a meeting with a bunch of superintendents, and sort of the theme of our meeting was, that's a nice story.
Now show me the data. And the importance of using data to inform decision making, both when you're the superintendent and when you're the classroom teacher. So I really, really appreciate that point, Charlie. Okay. Well, I know Age of Learning because I have young kids, and they are maybe a little bit wild for ABCmouse.
But Age of Learning is bigger than just ABCmouse, maybe not in our household yet. But tell us a little bit about Age of Learning: the scope of products that you all are offering, the types of work that you're doing, and how you're using evidence. And, as Charlie talked about, both to understand what those products are doing and to help your customers, districts, or consumers like me. Yes. So we have products like ABCmouse, which provides a broad early learning environment for children to explore core concepts across all subject areas in a really playful and adaptive way.
And then we have school solutions: My Math Academy, My Reading Academy, and our newest one, My Reading Academy Español. They offer more targeted, mastery-based pathways to early math and early literacy, using embedded assessments to personalize instruction, supporting students when they struggle, and challenging them when they are ready for more advanced content. Together, these programs really help children learn at their own pace and help teachers deliver tailored instruction with actionable insights. And we've been able to show meaningful gains in both math and literacy across diverse learning contexts. But the way that we use evidence, and the reason that it is so important, is that it's incorporated into the entire life cycle of every learning feature that we build into our programs.
At the very beginning, when we're just thinking about what to create, our curriculum team does a deep dive into what evidence tells us right now, what learning outcomes matter, what curricular approaches and interventions have been tried, how outcomes have been measured, what those empirical studies have found. And from day one, we're asking ourselves what approach is most likely to help children learn the best. So efficacy is really the north star guiding everything that we choose to build. And as we move into product design and development, we again use existing evidence of what works to inform the learning experience. So product designers make decisions about features and interactions that are likely to support learning and engagement, and our design research team plays a major role as they sit with kids, families, caregivers, educators in real contexts and watch how they use our products and ask what they're trying to accomplish and capture emerging insights.
And those are insights that get tested and retested before we make final decisions, so that what we ship is designed both to engage children and help them learn, and to give us a realistic way to collect robust evidence of impact. Then, in the validation and iteration phase, where more of the public-facing efficacy and evaluation work is done, we run a portfolio of studies: everything from case studies to pilots and randomized controlled trials. Our goal is really best-in-class research, which means evaluating, continuously improving, and reporting on how our products drive learning gains. And just as important, we study implementation: what works, what could have worked better, for whom, and under what circumstances and contexts. These findings go straight back to the product and curriculum teams to inform our ongoing optimization.
And so we care very deeply about evidence because we're committed to building programs that actually help children learn and stay engaged in learning, and that let us serve more diverse groups of children across different contexts. I love that. What works, for whom, and under what conditions, I think, is something that EdTech has really been pushing towards. And I love hearing you talk about it not just in deployment, but in design. As we look at the many tools that folks are using right now, how do we think about the tools that are purpose built for education and have the evidence base to support that?
I really appreciate those points you brought in. Okay. Great. Let's move to some of the elephants in the room. Listen.
Resources are tightening. Budgets are harder to manage. And frankly, I think when I talk to school district leaders and state chiefs, they're feeling a little uncertain about the funding climate. Maybe because of shifts at the federal level, but also, you know, changing attitudes around bonds, or even issues with folks feeling like this is a worthy investment in their communities and in education in particular. So budgets are tightening.
What shifts are you seeing right now? I'm not asking you yet to pull out that crystal ball; I will later. But what shifts are you seeing in your customers' EdTech requirements and their evaluation processes, both in procurement and in the types of questions you're getting? Are they asking for evidence? Are they thinking about efficacy? Are you seeing mechanisms like outcomes-based contracting come into the frame? I'm gonna start with Charlie. Charlie, tell us what you're seeing as resources constrain and budgets tighten.
Yeah. From a broad standpoint, I think what we've observed over the past few years, and even the past several months, is that things seem to be moving toward more rigorous qualification. We see more requirements, more boxes that we need to ensure we check, even beyond just evidence: interoperability, and, you know, you were talking about procurement. I just got back from a conference yesterday where a big part of the topic was how do we streamline procurement. There were vendors there, there were schools there, everybody in between. But how do we try to make that procurement game a little bit easier for us all to play? I think that's what we're seeing.
It hasn't all happened at once, I think, for us, but as the budgets have gotten tighter, so have the requirements. And in some senses, that's a good thing, because it holds folks to a higher standard. I think it pushes the community forward to make sure that, at the end of the day, when you're thinking about things like accessibility or interoperability, or some of the broad things that face the EdTech space, it's good to a degree. We don't necessarily like that the budgets might be getting tighter at the end of the day, but that's what we're seeing. Right? You really have to fine-tooth-comb every RFP that comes through, every requirement.
We've seen, as you mentioned, outcomes-based agreements as well. So I think it really comes down to where the school is at any given point in time, how aware they are of things that are out there, and then ultimately the budget at the end of the day. From their standpoint, they just wanna be sure that they're getting what they pay for, which is a fair ask. But that's really what we're seeing here at Lincoln Learning in the work that we're doing right now.
Heejin, are you seeing those same things or something different? Similar. I think that many districts are now asking more specific questions about evidence. They're asking questions like: what level of ESSA-aligned evidence does this program have? Has that been independently validated? That's the one big shift that I've seen: from "nice to have" to "show me the evidence." And the second shift that I've seen is toward context-relevant and time-realistic evidence. So educators aren't just asking, does this work somewhere? They're asking: does it work in settings like mine? Title I schools, rural schools, virtual settings. And with students like mine, including emerging multilingual learners and students with individualized education plans.
They also wanna know if the program can deliver impact under tight time constraints in real classrooms, where a teacher is juggling fifteen or twenty children with very different learning needs and might realistically have thirty to sixty minutes a week at most to devote to a supplemental program. So we really leaned into studies that reflect those conditions and into implementation guidance that speaks to that reality. Another major shift that I've seen is the focus on interoperability and teacher workload. Districts want programs that fit into their existing ecosystems, ones that make data more actionable, not more overwhelming. Well, you're never gonna hear me argue against interoperability.
Obviously, at Project Unicorn, we've been working at it for ten years, and we're seeing those same things in the state of the sector and the school system data survey: when folks have a robust procurement process and strong interoperability, frankly, they see fewer cybersecurity incidents and fewer security challenges. So both the CIO and the CAO are happy. The chief academic officer and the chief information officer are feeling better when their environments are more interoperable and their procurement systems are stronger. And then, frankly, they're able to justify those expenditures to their board and ultimately to their communities.
Michelle, what would you add to those comments from what you're seeing at Edmentum? Does evidence matter? I think the answer is yes. At what tier? I think we're evolving, right, on the ESSA tiers of evidence, for folks who aren't deeply familiar. What are you seeing? Are we past case studies, is what I'm hearing? We are absolutely beyond case studies. Without a doubt, we're beyond case studies. And what we see is that ESSA evidence on whatever program you're offering is actually table stakes.
It is a must-have in RFPs. You don't get to pass go without having ESSA evidence. And typically it's not just Tier 4 ESSA evidence, which says you built on a solid research base, that you actually did research-based design of your programs. You have to be able to show that you made gains on the outcomes the program is meant to deliver; we're starting to see requirements that it really needs to be at least at the quasi-experimental design level. We're seeing that at the state level as well: some of the supplemental and intervention curricular approval lists require ESSA evidence.
We're seeing that in Florida. We're seeing that in Texas. Very important. And a number of other states are also taking that lead. But at the district level, interestingly, I would say we're seeing it go even beyond does this work in a district like mine.
We're seeing did this work in my district? So we were working in a large district last year that actually required all vendors offering a particular kind of intervention program to adopt a data sharing agreement with the district. The district would offer the mid-year and end-of-year assessment results to the vendors, with the expectation that each vendor would, within about a two-week period, conduct a quasi-experimental design study on the outcomes achieved that semester in the district and present those findings to the district's chief academic officer. And so we went through that cycle. We did that at both mid-year and end of year for that district. So it went well beyond is this working in a district like mine.
They actually wanted the evidence, fairly rigorous evidence that meets a high level of research methodology, and they wanted to see it for their program. So not just does this work, but does this work for my students in my context, is what I hear you saying. Yeah. I think we're gonna see more folks doing that. I will also just say I don't think it's just tightening budgets.
I think it's also questions from parents, communities, and students about tech use and screen time in classrooms. Is this the most efficacious activity that we can be doing in our schools? And so evidence can be helpful to change implementation practices, but it can also be a bit of a sword and a shield: a way to defend the use of technology and EdTech in schools when it's purpose built and designed for education. Okay. I'm gonna stick with you, Michelle. I wanna talk about something that I think is really important to the vendor community, which is, how do I stand out when there are so many products out there? And, frankly, everybody knows this.
Our education market is quite disjointed here in the United States. You know, it's a very fragmented market. How do you think about leveraging evidence and interoperability as a market differentiator for Edmentum? And can you tell me a little bit about how you've worked to differentiate your product? No product secrets needed, but how have you worked to differentiate, or to tell the story of Edmentum around interoperability and evidence? Yeah. That's a great question: how can you get your program to stand out among the others? And I'm gonna point to a couple of things here with a few examples.
So one of them: I mentioned that we work in an area of really reimagining secondary education and graduation. In that space, you don't tend to see evidence around all of the outcomes. You know, it's not just about academic achievement for students. It is about whether they are showing up to school. It is about whether they are late to class and disengaged.
And so one of the ways that we've thought about this is to think broadly about the outcomes that we are working to achieve, and then figure out ways to measure whether or not our programs are influencing those as well. In that particular space, we are seeing competitors maybe not have that vantage point around the various outcomes they should be looking at. And I expect, as we continue to work on our college and career readiness evidence, that is going to be very much true there as well. In the intervention space, however, where we're accelerating student achievement, I would say most of the vendors are able to talk about growth, and they're able to point to studies showing that students who use their programs have in some way made gains in either math or literacy. So when we were thinking about this, one of the things that we did was really listen to the district administrators about what they needed.
And what they needed was not just another ESSA Tier 2 study somewhere else. They really needed to understand if this program was working for their students in their districts. So what we did at Edmentum: we heard this message that the local context really, really matters. We knew that I do not have the capacity on my research team to turn around a study in two weeks after every assessment that a district is doing. So we took all the rigorous research that we had behind our product Exact Path, and we actually built it into an in-product, on-demand report for district administrators that shows them, in real time, the relationship between student completion of skills in Exact Path and the test scores those students are getting in their district.
For those district administrators, it makes the data very, very transparent. There's nothing in there that's hiding. It's not like we're only gonna show you these really great results in grade three and maybe not tell you about the results in grade two. It's all there, in front of the district administrators to see, in the platform.
It increases the relevance of the evidence for them, because it really is reflecting very directly what is happening in their district. And it makes the data actionable as well, because the reporting gives them the ability to see, for example, who in their district is using it with fidelity and who is not, and what the relationship is between using it with fidelity and the gains students are getting on their assessment scores. So how to differentiate does depend on the space that you're in and what kinds of evidence your competitors are bringing to bear. But it also matters to listen very deeply to the people who are doing this work on the ground in their school districts every day. They have great ideas about what they need, and, frankly, they should be demanding quite a lot of us as EdTech vendors in terms of the evidence that they need.
So I hear you saying you've made it accessible, digestible, and interpretable to the folks who are doing the work. They don't need a PhD in data science. No. I really appreciate that. The other thing I hear you saying, and I wanna call it out because I think innovation moves at the speed of trust, is that you're building trust through transparency about the product.
And so even if you don't love what they're seeing, they trust you to tell them the good, the bad, and the ugly. That trust builds a relationship with Edmentum, and I can imagine that probably helps in customer retention as well. How does Age of Learning think about evidence as a market differentiator? I appreciated your comments earlier about it being, even at the product design level, a way you are differentiating in the market. How are you leveraging evidence and interoperability to differentiate Age of Learning's products? Well, thank you for bringing in the part that I mentioned about how we are integrating evidence into our product design as well.
So one of the things that we do, and this relates a little bit to something that Michelle said: we're very transparent about how we have designed our programs with the learning outcomes in mind. We show what that design process looks like and what outcomes we're aiming for. And it's not just about learning outcomes. We place a great deal of emphasis on making the learning process enjoyable, fun, and confidence-building, because those are all antecedents to achievement. You have to have engagement in order for learning to happen.
And when we talk about evidence, we also try to contextualize those results: how much instructional time teachers actually had, how classrooms were organized, what challenges teachers were navigating in their local context, to help districts see what's realistic and replicable. And, again, echoing something that Michelle said, we have to communicate evidence in ways that teachers and educators and district leaders can assess quickly, through concise summaries and clear methods, where learning happens, what subgroup outcomes are possible, and transparent limitations. And, again, trust building. That openness is important, and we also know that learning can vary across contexts, and we have to be cognizant of that and recognize it.
Going back to differentiation and focusing on student experience, one of the things that I always want to point out is that across studies, we're seeing that children are not just learning, but they are deeply engaged and enjoying the process, and they are becoming competent as learners. They're seeing themselves as learners, and that's one of the most exciting things that we see. And teachers also tell us that students who are hesitant at the start become more willing to try, to persist, and eventually to celebrate their progress. And that kind of engagement isn't just an add-on. It's foundational to early learning.
We're talking about three-, four-, and five-year-olds, and it's supported by the way that our programs personalize practice and challenge students at the right moment. Another thing I would say is that we differentiate through usability and teacher empowerment. We're not just serving the students. We also think of teachers as users of the data and resources that we provide. They get support.
The educator center, the dashboard that we provide, surfaces mastery of the specific skills that students are working on. It pinpoints areas of struggle, recommends small groupings for students working on similar skills, and also targets offline activities that are aligned to exactly what students are working on. And many teachers tell us that it helps them feel more in control of the differentiation, more confident in making instructional decisions, and more able to meet each student where they are. I love that you both talked about personalization. Michelle, you talked about it as getting kids, in one of your products, to come to school because they're excited to be engaged.
And then this idea of, even at the early stage, how we think about sense of belonging and joy of learning as part of personalization. So it's not just what's the right skill and what's the right strand and what's the right intervention, but it's really restoring that joy in learning, and young people seeing themselves as learners all the way through. Okay. Charlie, how are you thinking about interoperability and evidence as a market differentiator at Lincoln? Things you would add to this conversation? Yeah.
I think, from our standpoint, evidence in particular really assists us in telling a bigger story that's maybe even beyond Lincoln itself, and kind of where we fit in this education space. And I think the other thing that it helps us do is continue on the journey toward innovating and finding better solutions, more differentiated solutions that are meeting more learners, as we understand the different preferences and different environments that might be originating inside of our space. So evidence really helps reaffirm that. And then, I love many of the things that I've heard here. It's kind of a cyclical sort of thing, right, where we're able to tell the story about here's what was, but then we're also able to say here's what is, because we found these pieces out. We are taking these actions.
Right? But I think that really helps to reaffirm that trust. I like that piece of it, to ensure that we didn't just take our idea and say, yeah, we're standing firm on our principles regardless of what other folks think. We're always open. We're no good without feedback.
We're no good without evidence to reaffirm the things that we're setting out to do. So it has to be that full life cycle. We have to pull that evidence back in so that we can continually tell that story, because that story has multiple chapters. As time goes forward, and I've seen this even in my time here, the story has evolved year over year as education continues to evolve. And then, in terms of interoperability, I'd just reaffirm this.
Folks, things need to work. Right? There are so many gadgets, so many very neat tools that do so many different things in the ed tech space, and you have all different types of vendors and providers that have all of these great things. On our journey for interoperability, it's really been about giving time back to educators so that their time is best used with their learners, and they're not worried about administrative overhead or clerical tasks. So many things have come across our teachers' desks that they have to interface with on a regular basis. We have really, really tried hard over the past number of years to make those things easier, or even to eliminate those pieces.
So hearing the talk about evidence and putting it in context for folks, we try to do that in a macro way, but we also try to do that in a micro way. Right? So that evidence all aggregates. The feedback all aggregates, and we try to understand it in a big way as well as in a small way at Lincoln. Charlie, you're reminding me of when we first started Project Unicorn ten years ago. Our tagline was giving teachers their Sundays back.
Because we would hear these horror stories of teachers spending whole Sundays rostering kids via Excel spreadsheet or hand-entering emails into platforms. So one, I'm glad that's no longer our tagline and that the market has evolved, but we still have some progress to make on interoperability, to be frank. But you flashed me back to ten years ago in the wayback machine. So listen. I think we've talked a lot about the things that are really positive around efficacy, what's happening in this space, more folks asking for evidence.
I wanna shift and talk a little bit about challenges, if we could go to the next slide. Great. Yep, thank you so much. So, not to hide the ball, in trust and transparency, all of you have partnerships with Instructure. What challenges led you to that partnership with Instructure? And maybe I'll take a little moderator's liberty.
Have you gained value in that partnership? I think you probably have if you're here today. And what has been that value? So let's start with you, Heejin. Yeah. So I think I've alluded to the fact that Age of Learning invests in research. Research evidence is essential to the way we do our work.
We've done everything from user design and usability testing and small-scale design studies to larger-scale implementation studies and quasi-experimental studies, as well as randomized controlled trials and longitudinal analyses. But we've always recognized that internal evidence building is only one piece of the puzzle. Because, as we've talked about, districts are under an enormous amount of pressure to justify investments with solutions that have independent and rigorous evidence behind them. And they need to know that studies were reviewed by neutral experts and interpreted using frameworks that they know and trust. And that's why we've sought ESSA-aligned third-party review: external researchers who review and scrutinize our methods, our sample characteristics, analyses, and findings, and determine whether they meet certain ESSA tiers.
And that helps districts compare solutions consistently and understand the strength of evidence in context. The validation process also strengthens our internal practice. We receive feedback on methodology. We get ideas about different ways to analyze data, and implementation considerations that enhance future studies and also inform product positioning. So the independent evaluation that we're able to get by partnering with Instructure provides both external credibility and internal growth and learning.
Awesome. Just a reminder to our audience: I see some questions starting to come in, and we're gonna turn to those really soon. Please go ahead and put those in the Q&A panel, and we will get to them. Michelle, what would you add about the partnership and the why? So I'm hearing trust, external validity.
What else would you say? Yeah. I would say that we're very similar to what Heejin just described. We have a great research team here at Edmentum. I'm really, really proud of the rigorous research that we can conduct. We can run studies faster and with less expense, really, than a third party can.
We have a technical advisory committee that comes in and helps us look at our methodologies, and our director of efficacy research has just been elected to the board of directors of the Society for Research on Educational Effectiveness. So really, really proud of what we've put together here at Edmentum. And just as Heejin described, we've found our partnership with Instructure is still a very, very important part of the evidence journey, because we do need that third-party independent validation. They have a great rubric that outlines exactly what they're checking for and how it aligns with the ESSA tiers of evidence. And what I'll say is that it is not just a rubber stamp.
So what I would add is that it does push and force and raise questions, and really helps us be better versions of ourselves, by working with Instructure to externally validate our studies. I'll also say, I think the Instructure research team is really incredibly strong, and so they provide us with really valuable surge capacity. I don't know if you all have noticed what state list applications look like, but an RFP can drop, and you've gotta have a study submitted three weeks from then, or what have you. And so the flexibility Instructure has shown in being able to help support some of those surge capacity needs has been really fantastic as well. Talent, capacity, peer support is a little bit of what I hear.
Researchers love researchers. So, having that community. Charlie, anything you would add on the partnership question and the value? Just a couple things. Instructure, the evidence team, has really made it easy for us to start down the road of pursuing evidence.
We've done different things in the past. We've had independent research done. We've had teams internally that pursue that. But the independent validation, from an entity in the space that already has that respect and that trust, I think goes much further for us, in our instance, than much of what we would do on our own. That's kind of where we've seen the value with the partnership with Instructure.
You know, it's not necessarily us on our own saying what we're saying. It's coming through Instructure. We actually have historically partnered with Instructure in a variety of different ways, so it was also convenient for us to extend the partnership through evidence as well. It was just a convenient sort of happenstance for us to utilize the evidence services that Instructure has.
Great. I'm gonna turn to a couple questions in the chat, and then, like I told you earlier, I'm gonna ask you, as our wrap-up question, to grab that crystal ball. Keep it handy as we think about the future. So a couple questions from the Q&A. And, Dennis, I see your hand up. If you wouldn't mind using the Q&A, that would be great.
First, are there other firms or other folks in the research ecosystem doing some of this work? And maybe you can talk a little bit about differentiation there. And then also, this is an important question: where do I find the evidence of how a product is performing? Let's say I want access to that evidence. Where should I go to find it if I'm a district leader, an educator, a superintendent, a state chief? How do I understand who has evidence and who doesn't? So, Michelle, why don't you kick us off there. Who else does this, and where can I find the evidence? And then we'll all just chime in. Yeah.
So I would say that there are other research firms who will conduct third-party studies on your behalf. The way that Instructure has put together their validation services, I would say, is actually fairly unique. I don't see a lot of that happening from other research firms. Instructure is very systematic about it.
Whereas the other research firms are more likely to want to conduct a study for you, they're less likely to look at a study that you've conducted and provide that validation service. And I think this is actually tremendously important, because of the changes that are happening with IES and the What Works Clearinghouse. This is no filter from me, I guess. Right? I'll help you here. It's a disaster. It's a disaster.
There are studies that have been at the What Works Clearinghouse since twenty eighteen waiting to be reviewed. So there's a real gap there. I would say the other place that you can submit for validation would be Evidence for ESSA, which is out of Johns Hopkins University. They also provide some services and a website that is up. They run quite a lot faster than the What Works Clearinghouse review cycle did.
And then I would say Instructure is the other one that really provides this service. Instructure has also partnered, and someone from Instructure could pipe in here, with some of the international review bodies as well, and that has been really supportive in our international evidence story. Yeah. I'll just chime in.
I talk to state chiefs and big superintendents, or even small-district superintendents, who are very frustrated with the slowdown in federal research and with not knowing where to go. And some of them take a significant portion of their budget either to create their own internal research teams or to band together to figure out how they can have some local and state R and D heft. I think we're gonna see that trend grow. I think there's gonna be a demand for evidence. There's a growing demand for evidence from states, districts, communities, parents, and even students.
And so who's gonna help answer that question, as it's been really difficult to get those answers with speed, trust, and validity at the federal level? So, Michelle, I'll say all the hard things too on a Friday, and validate that from the people who are doing the work in the field, who are scrambling to figure this out. So I really appreciate your candor and that trust building you just did right there. Really, really appreciate it. Charlie, Heejin, what would you add to Michelle's comments? I also noticed that one of our attendees mentioned that LXD Research does validation, as does EduEvidence, both of which are true.
For EduEvidence, they offer an international certification model and kind of focus on broad impact dimensions rather than just does it work. I think they actually have what they call five E's, around efficacy, equity, ethics, environment, and I forget what the fifth one is. And then the International Center for EdTech Impact is another one. They work with EdTech firms on usability testing and implementation research, as well as, I think, more recently, outcomes-based contracting. And they emphasize the full evidence cycle from design through validation.
And that's how I got to know them, as we kind of align in that philosophy. Yeah. And we send folks to the ISTE EdTech Index, which has some really simple market signals in terms of badges around, you know, ESSA tiers of evidence, research design, validity, and so on and so forth. So that's another place that folks can go. Charlie, would you add anything else real quick on this? Yeah.
There are definitely other institutions. I think it ultimately comes down to what question you're looking to answer at the end of the day. We're familiar with all the groups that have been mentioned here. Before we undertook our pathway with evidence with Instructure, we vetted all of those different solutions. Ultimately, at the end of the day, Instructure was the best fit for us at the time we were making the decision.
Great. And some real-time hacks in the Q&A, honestly: using quotes to find a product in ERIC and in ResearchGate. ERIC is a federal research repository, and there's ResearchGate as well. So real quick from Dennis. Thanks, Dennis, for converting that hand into a question.
I appreciate you. Dennis is developing an open source software product, maybe slow neg or slo neg (you can correct me later, Dennis), for peer learning. It has rigorous evidence on efficacy.
They're thinking of getting into schools through partners that are already selling something there. Anyone wanna reflect on whether they think that's a viable marketing strategy, to do it through partnership? I mean, I think partnerships can help you scale faster, for one. And partnering with organizations that already sell into districts could be a really smart approach, because many schools and districts have really long procurement cycles, already-established vendor relationships, and trust built with existing suppliers. And partnering lets you capitalize on that a little bit and build on existing distribution and credibility rather than starting cold. So that's, I think, the immediate thing that stands out to me as a benefit.
But I do think there might be a couple of things that you would want to watch out for as well, like partner alignment and value proposition. The partner has to believe that your solution adds value to their offering, and the existing clients have to also see how your solution fits into their workflow. And I think the other thing that comes to mind is around implementation and evidence translation. You mentioned that you have rigorous evidence on efficacy, but I think that you will need to ensure that evidence is communicable in partner and school terms. And implementation matters.
Schools will ask: what happened in the research? Who were the students? What supports were needed? So if you partner, you'll need materials for the partner sales and implementation teams about how to replicate the conditions of your evidence. Those are great tips, and what a wonderful way to mentor an entrepreneur. Thanks so much. Okay.
I told you guys you had to get your crystal ball. I have one on my desk; you can borrow it if you'd like. So we'll just go straight down. Michelle, Heejin, and Charlie, in forty-five seconds or less, tell me about one big initiative that you're thinking about in the future and how you're gonna partner with Instructure to get it done.
Michelle, go first. Yeah. One of our big initiatives is advancing our college and career readiness offering. We just acquired a platform, MajorClarity, a couple of weeks ago. We're working on integrating that through.
We've already been working with Instructure on all the logic models related to our career offering, and we're really excited to be advancing all of the great work that we can do to advance our understanding of how our programs are going to support each other within that career ecosystem. Amazing. Symbiotic support of learning. Great. Heejin.
Well, we're focusing on the student experience, enhancing that experience through AI-informed personalization. We know that time is limited, so we're using embedded AI-driven tools to quickly understand what students are capable of. We reduce the time spent in lengthy placement or pretesting and get them into appropriately challenging content as quickly as possible. And there are other features that we're integrating into the educator center to help make teachers' jobs easier. But, basically, we know that with all of these new capabilities, the way students and educators experience our programs is changing.
And that's exciting, but it also means that older studies are not enough. We are committed to continuing to evaluate each new generation of the product, and that means running new studies, analyzing the new usage patterns, and updating our implementation guidance. So we'll continue to pursue independent review of our research so that districts can interpret the findings through a consistent, external framework. Great. Charlie, we're at time, so you're gonna have to make this fast.
I'll be very quick. We're reevaluating portions of our student experience, and we're hopeful to work with Instructure through that exercise to validate that the redesign provides the right value and engages in the right ways. Thank you so much for joining me today. Here's the bottom line. Evidence matters, and so does interoperability.
Our job is to make it easier for teachers to use technology, not to make it harder. I'm so glad to have examples of three outstanding companies along with Instructure who are doing that in our ecosystem. If you wanna learn more about partnering with Instructure, reach out to their team. Thanks for joining us today. Have a great day.
Bye bye. Bye, everybody. Bye.
Sure. Hi. My name is Heejin Bang. I serve as the vice president of efficacy research and evaluation at Age of Learning. I love our company's mission statement, which is to help children everywhere build a strong foundation for academic success and a lifelong love of learning.
My favorite fall side dish is a Korean dish. It's a rice cake that gets filled with sesame or chestnut or red beans and is steamed over pine needles. That sounds delicious, and it's bringing me back to the holidays in Koreatown when I lived in New York City. It was one of my favorite things to do. Great.
And certainly, last but not least, is Charlie Thayer. Charlie, can you introduce yourself? And, again, bring us home with your favorite fall or Thanksgiving side dish. Sure. Thank you, Erin. I'm Charlie Thayer.
I am the chief academic officer at Lincoln Learning Solutions. It's a pleasure to be here with my co-panelists and everyone to chat about this topic. It's something that's very important to us, and we've really been pushing over the years to dive deeper and deeper into evidence at Lincoln Learning. It's part and parcel with what we do, working with schools to provide a variety of different educational solutions in their environments. We work with folks in a lot of different environments, so it's really important to understand the nuances across those.
Favorite side dish. I could go one of two ways. I could go with a side dish, or I could cheat into the desserts. So I think I'm gonna go with the desserts, and my wife makes an amazing pecan pie. It is the best.
So she makes a battery of pies. The pecan pie is at the top. It's usually the one that's gone first from the dessert table. So that's the one I'm going with. Hey.
I'm gonna ask you to wade into some dangerous territory here. Are you a pecan pie with ice cream or pecan pie on its own? This is a dangerous question, I know, to ask you, Charlie. No. No. No.
No. All good. I broached the dessert topic, so there's no... You opened the door, as we said. I went there.
You did. I'm gonna say with whipped cream. That's how we do it at Thanksgiving. Whipped cream is kind of right in the middle. So not without anything, but without the ice cream.
Well, I'm so excited for you to join me today in this conversation. I'll just finish up by introducing myself. I'm Erin Mote. I'm the CEO of InnovateEDU. We are a house of brands, not a branded house.
So you know us from our multi-stakeholder alliances, like Project Unicorn, the EdSafe AI Alliance, the National Partnership for Student Success, and the Pathways Alliance. Through this work, we touch over three thousand organizational partners and two of every three school districts in the United States. And so we are very focused on how we serve folks who are in the field doing the hard work. And today, we're gonna have that focus be on evidence and interoperability. So, Michelle, I'm gonna come back to you first.
Tell me a little bit about Edmentum, and tell me why you care about evidence. Why does Edmentum care about evidence? Absolutely. So Edmentum serves over five million students in more than six thousand public school districts in the United States. We also serve students in over a hundred countries.
We believe very, very firmly that all students deserve learning acceleration that is tailored to their unique needs, whether they are looking to catch up, stay on track, or chart their own path. We are really interested in changing the direction of students' lives. It matters very, very deeply to us. We care about evidence because it is a really critical feedback loop that helps us understand what we are getting right as we work to influence and impact positive trajectories for students, and it also tells us where we can improve. Because what we're doing is creating innovative solutions to solve these really tangled and challenging problems that our district partners are facing, and we're also working with our district partners to implement those solutions.
We care about evidence because district administrators don't really have time for trial and error. They need to serve students at scale today with solutions that are helping them achieve their goals. And so evidence is intertwined in everything that we do and in the decisions that we're making, to make sure that we are having the greatest positive impact we can on the students that we serve. And so folks who interact with your platform, just in case they're not familiar: it's student facing, but on the back end there's lots of data and information about how to use that data to act. Is that a good... Exactly.
Exactly. So at Edmentum, we really offer three solutions for district partners. The first is all about accelerating student achievement. Here we offer assessment; grade-level, standards-aligned digital instruction and practice; personalization; intensive support; and human-centered, virtual instruction, all in a coherent system that drives academic growth on the assessments we administer to students as they pursue their academic instruction. The second is our attempt to reimagine the path to graduation.
Here, we expand access for students by offering credit-bearing digital programs that maintain diploma integrity while also flexing to the needs of individual students, families, and the districts that they're in. These can address difficult challenges like attendance and declining engagement, and really make sure that students are able to learn the academic skills that they need to unlock opportunity. The third solution that we offer is in expanding college and career readiness. Our solution takes students on the journey from middle school exploration all the way through to employability and postsecondary success. So this solution supports academic skill development to unlock opportunity.
It supports durable skill development for lasting impact. Students are able to take those skills and transfer them across multiple potential career pathways. And then finally, it supports their technical skill development, which helps with industry certifications and employability. Thanks. Thanks for digging in just one level deeper for folks to be a little bit more familiar.
So coming to you, Charlie, tell us a little bit about Lincoln Learning. What products do you offer, and to whom? And how do you think about evidence in your product offerings and their use? Yeah. For sure. So we're in a very similar space as Michelle and Edmentum. Lincoln Learning is primarily, and I say primarily because I think we're a lot more than that, a K-12 digital curriculum solution.
But what we've really backed into is creating a versatile set of services and capabilities with our offering. So while we may provide traditional courses, those courses are created in such a way that they can be flexibly implemented to snap into the different learning environments that we work with. We work with schools of all different shapes and sizes, all different composites: urban, rural, highly digital, not very digital, and everything in between. The spectrum is broad when it comes to the types of schools that we work with.
So that was really the backbone for us as we approached the work that we've been doing for the past several years: to build something where it really doesn't matter what setting you're coming to us from. We can meet you where you are. And that's what we try, and I think do really well, at Lincoln Learning: we partner with folks. We make sure that your strategic goals as a school are our strategic goals. We really try to lean in and align ourselves with our partner schools to make sure that, together, we're both best set up for success.
And I think, you know, part and parcel with that is being mindful of that evidence. I think I heard someone else say, you can have a really good idea. You can have a really cool idea. EdTech is a really innovative space. There are tons of cool things.
I love EdTech. I actually fell into EdTech years ago by accident, and I fell in love with it. I can't move away from it. I can never imagine myself not working in this space, because the canvas is so broad for us to kind of work in. But evidence is really paramount in that because, you know, there are lots of good ideas.
We have tons of good ideas. I work with tons of creative people. I'm surrounded by some of the smartest folks that I've ever encountered. I'm astounded on a daily basis by some of the things that they come up with. But all those great ideas, in execution, might fall flat.
Right? So I think evidence really brings things full circle to make sure, you know, we set out to do a thing. Did it make sense? Did it work? Is it working? Maybe it worked at one point in time, but it's no longer working, or maybe it works in this setting and not in that setting. So I think, you know, evidence for us is really kind of an internal validator to make sure that we're on the right path, we're moving in the right direction. And as we continue down the path, are we making the right turns? Are we making the right moves with some of the decisions that we're making? And it's really important not just in, you know, the academic affairs department that I get to work with, but for our business development department as well. We're a close-knit organization, so we make sure that they have the information that they can go out and, you know, use as testimony, use as support to say, right, we understand what your profile is. Here are some ways that maybe we can help you succeed or improve in those settings.
So that's, I think, really, from our perspective, what's most critical about evidence: validating. You know, if we took a course and we deployed it in a traditional brick-and-mortar classroom, what are the things that were successful there? And that comes back in that evidence loop for us to kinda validate those pieces. I love that. I will never forget, I was in a meeting with a bunch of superintendents, and sort of the theme of our meeting was: that's a nice story.
Now show me the data. And the importance of using data to inform decision making, both when you're the superintendent and when you're the classroom teacher. So I really, really appreciate that point, Charlie. Okay. Well, I know Age of Learning because I have young kids, and they are maybe a little bit wild for ABCmouse.
But Age of Learning is bigger than just ABCmouse, maybe not in our household yet. But tell us a little bit about Age of Learning, the sort of scope of products that you all are offering, the types of work that you're doing, and how you're using evidence. And as Charlie talked about, both to understand what those products are doing, but also how to help your customers and districts, or consumers like me. Yes. So we have products like ABCmouse, which provides a broad early learning environment for children to explore core concepts across all subject areas in a really playful and adaptive way.
And then we have school solutions: My Math Academy, My Reading Academy, and our newest one, My Reading Academy Español. They offer more targeted, mastery-based pathways to early math and early literacy, using embedded assessments to personalize instruction, supporting students when they struggle, and challenging them when they are ready for more advanced content. And together, these programs really help children learn at their own pace and help teachers deliver tailored instruction with actionable insights. And we've been able to show meaningful gains in both math and literacy across diverse learning contexts. But the way that we use evidence, and the reason that it is so important, is that it's incorporated into the entire life cycle of every learning feature that we build into our programs.
At the very beginning, when we're just thinking about what to create, our curriculum team does a deep dive into what evidence tells us right now: what learning outcomes matter, what curricular approaches and interventions have been tried, how outcomes have been measured, what those empirical studies have found. And from day one, we're asking ourselves what approach is most likely to help children learn the best. So efficacy is really the north star guiding everything that we choose to build. And as we move into product design and development, we again use existing evidence of what works to inform the learning experience. So product designers make decisions about features and interactions that are likely to support learning and engagement, and our design research team plays a major role as they sit with kids, families, caregivers, and educators in real contexts, watch how they use our products, ask what they're trying to accomplish, and capture emerging insights.
And those are insights that get tested and retested before we make final decisions, so that what we ship is designed both to engage children and help them learn and to give us a realistic way to collect robust evidence of impact. And then in the validation and iteration phase, where more of the public-facing efficacy and evaluation work is done, we run a portfolio of studies, everything from case studies to pilots and randomized control trials. And our goal is really best-in-class research, which means evaluating and continuously improving and reporting on how our products drive learning gains. And just as important, we study implementation. What worked? What could have worked better? For whom, under what circumstances and contexts? And these findings go straight back to product and curriculum teams to inform our ongoing optimization.
And so we care very deeply about evidence because we're committed to building programs that actually help children learn and stay engaged in learning, and to being able to serve more diverse groups of children across different contexts. I love that. What works for whom and under what conditions, I think, is something that EdTech has really been pushing towards. And I love hearing you talk about it, not just in deployment, but in design. And I think as we look at the lots of tools that folks are using right now, how do we think about those tools that are purpose-built for education and have the evidence base to support that?
I really appreciate those points you brought in. Okay. Great. Let's move to, like, some of the elephants in the room. Listen.
Resources are tightening. Budgets are harder to manage. And frankly, I think when I talk to school district leaders and state chiefs, they're feeling a little uncertain about the funding climate. Maybe because of shifts at the federal level, but also, you know, changing attitudes around bonds, or even issues they're having with folks feeling like this is a worthy investment in their communities, and in education in particular. So budgets are tightening.
What shifts are you seeing right now? So I'm not asking you yet to pull out that crystal ball. I will later. But what shifts are you seeing in your customers' ed tech requirements and their evaluation process, both maybe in procurement, but also the types of questions you're getting? Are they asking for evidence? Are they thinking about efficacy? Are you seeing even mechanisms like outcomes-based contracting come into the frame? I'm gonna start with Charlie. Charlie, tell us what you're seeing as resources constrain, as budgets tighten, and so on. Yeah.
Yeah. From a broad standpoint, I think what we've observed over the past few years, and then even the past several months, is that things seem to be kinda moving toward a more rigorous qualification. We see more requirements, more boxes that we need to ensure we check, even beyond just evidence, as it pertains to interoperability and, you know, seamlessness. You were talking about procurement. I just got back from a conference yesterday where a big part of that topic was how do we streamline procurement. There were vendors there, there were schools there, everybody in between. But how do we try to make that procurement game a little bit easier for us all to play? And that's what I think we're seeing.
It hasn't all happened at once, I think, for us, but definitely, as the budgets have gotten tighter, so have the requirements. And I think in some senses, that's a good thing, because it holds folks to a higher standard. I think it kinda pushes the community forward to make sure that, at the end of the day, when you're thinking about things like accessibility or interoperability or, you know, some of the broad things that face the ed tech space, it's good to a degree. We don't necessarily like that the budgets might be getting tighter at the end of the day, but that's kinda what we're seeing. Right? You really have to go through every RFP that comes through, every requirement, with a fine-tooth comb.
You mentioned outcomes-based agreements. We've seen those as well. So I think it really comes down to where the school is at any given point in time, how aware they are of things that are out there, and then ultimately, the budget at the end of the day. So I think, from their standpoint, they just wanna be sure that they're getting what they pay for at the end of the day, which is a fair ask. But that's really what we're seeing here at Lincoln Learning in the work that we're doing right now.
Heejin, are you seeing those same things or something different? Similar. I think that many districts are now asking more specific questions about evidence. They're asking questions like, you know, at what level does this program have ESSA-aligned evidence? Has that been independently validated? And that's kind of the one big shift that I've seen: from kind of nice-to-have to show me the evidence. And I think the second shift that I've seen is toward context-relevant and time-realistic evidence. So educators aren't just asking, does this work somewhere? They're asking, does it work in settings like mine? Title I schools, rural schools, virtual settings. And with students like mine, including emerging multilingual learners and students with individualized education plans.
They also wanna know if the program can deliver impact under tight time constraints in real classrooms, where a teacher is juggling fifteen or twenty children with very different learning needs and might realistically have thirty to sixty minutes a week at most to devote to a supplemental program. So we really leaned into studies that reflect those conditions and into implementation guidance that speaks to that reality. And then another major shift that I've seen is that focus on interoperability and teacher workload. Districts want programs that fit into their existing ecosystems and ones that make data more actionable, not more overwhelming. Well, you're never gonna hear me argue against interoperability.
Obviously, at Project Unicorn, we've been working at it for ten years, and we're seeing those same things in the state of the sector and the school system data survey: that when folks have a robust procurement process, when they have strong interoperability, frankly, they see fewer cybersecurity incidents, fewer security challenges. So both the CIO is happy and the CAO is happy. The chief academic officer and the chief information officer are feeling better when their environments are more interoperable and their procurement systems are stronger. And then, frankly, they're able to justify those expenditures to their board and ultimately to their communities.
Michelle, what would you add to those sets of comments from what you're seeing at Edmentum? Does evidence matter? I think the answer is yes. At what tier? I think we're evolving, right, on the ESSA tiers of evidence, for folks who aren't deeply familiar. What are you seeing? Are we on case studies is what I'm hearing? We are absolutely beyond case studies. Without a doubt, we're beyond case studies. And what we see is that ESSA evidence on whatever program you're offering is actually table stakes.
It is a must-have in RFPs. You don't get to pass go without having ESSA evidence. And, typically, it's not just tier four of ESSA evidence, which is that you built with a solid research base, that you actually did research-based design of your programs. You actually have to be able to show that you made gains on the outcomes that the program is meant to deliver. We're starting to see requirements that it really needs to be at that quasi-experimental design level at least. We're seeing that at the state level as well. So we're seeing some of the supplemental and intervention curricular approval lists require ESSA evidence.
We're seeing that in Florida. We're seeing that in Texas. Very important. And a number of other states are also taking that lead. But at the district level, interestingly, I would say we're seeing it go even beyond does this work in a district like mine.
We're seeing did this work in my district? So we were working in a large district last year that actually required all vendors who were offering a particular kind of intervention program to adopt a data sharing agreement with the district. The district would offer the mid-year assessment results and the end-of-year assessment results to the vendors, with the expectation that each of the vendors would, within about a two-week period of time, conduct a quasi-experimental design on the outcomes achieved that semester in the district and present those findings to the chief academic officer in the district. And so we went through that cycle. We did that at both the mid-year and the end of year for that district as well. So it went well beyond is this working in a district like mine.
They actually wanted the evidence, and fairly rigorous evidence that meets a high level of research methodology. They wanted to see that for their program. So not just does this work, but does this work for my students in my context is what I hear you say. Yeah. I think we're gonna see folks doing that. I will also just say I don't think it's just tightening budgets.
I think it's also questions from parents, communities, and students about tech use and screen time in classrooms. Is this the most efficacious activity that we can be doing in our schools? And so evidence can be, I think, helpful to change implementation practices, but also a bit of a sword and a shield to talk about protecting the use of technology and ed tech in schools when it's purpose-built and designed for education. Okay. I'm gonna stick with you, Michelle. I wanna talk about something that I think is really important to the vendor community, which is, like, how do I stand out when there are so many products that are out there? And, frankly, everybody knows this.
Our education market is quite disjointed here in the United States. You know, it's a very fragmented market. How do you think about leveraging evidence and interoperability as a market differentiator for Edmentum? And can you tell me a little bit about how you've really worked to differentiate your product? No product secrets needed, but how you've worked to differentiate, or tell the story of, Edmentum around interoperability and evidence. Yeah. That's a great question: how can you get your program to stand out among the others? And I'm gonna point to a couple of things here with a few examples.
So one of them is that I mentioned that we work in an area of really reimagining secondary education and graduation. And in that, what you don't tend to see is evidence necessarily around kind of all of the outcomes. You know, it's not just about academic achievement for students. It is about whether they are showing up to school. It is about whether they are late to class and disengaged.
And so one of the ways that we've thought about this is to think broadly about the outcomes that we are working to achieve and then figure out ways to measure whether or not the programs that we have are influencing those as well. In that particular space, we are seeing competitors maybe not have that vantage point or that viewpoint around the various outcomes that they should be looking at. And I expect, as we continue to work on our college and career readiness evidence, that is going to also be very much true there. When we're working in the intervention space, however, where we're accelerating student achievement, I would say most of the vendors in that space are able to talk about growth, and they're able to point to studies that show that the students that use their programs in some way have made gains in either math or literacy. And so when we were thinking about this, one of the things that we did was really listen to the district administrators about what they needed.
And what they needed was not just another ESSA tier two study somewhere else. They really needed to understand if this program was working for their students in their districts. So what we did at Edmentum is we heard this message that the local context really, really matters. We knew that I do not have the capacity on my research team to turn around a study in two weeks after every assessment that a district is doing. And so what we did was take all the rigorous research that we had behind our product Exact Path and actually build it into an in-product, on-demand report for district administrators that shows them the relationship between student completion of skills in Exact Path and the test scores that those students are getting in their district in real time.
And for those district administrators, it makes the data very, very transparent. There's nothing in there that's hiding. It's not like we're only gonna show you these really great results in grade three and maybe not tell you about the results in grade two. It's all there, right in front of the district administrators to see in the platform.
It increases the relevance of the evidence for them because it really is reflecting very directly to them what is happening in their district. And it makes the data actionable as well, because the reporting gives them the ability to see, in their district, for example, who is using it with fidelity, who is not using it with fidelity, and what is the relationship between using it with fidelity and the gains that students are getting on their assessment scores. So I would say, how to differentiate does depend on the space that you're in and what kinds of evidence your competitors are bringing to bear. But it also matters to listen very deeply to the people who are doing this work on the ground in their school districts every day. They have great ideas about what they need, and, frankly, they should be demanding quite a lot of us as ed tech vendors in terms of the evidence that they need.
So I hear you saying you've made it accessible, digestible, and interpretable to the folks who are doing the work. They don't need to have a PhD in data science. No. I really appreciate that. The other thing I hear you saying, and I wanna call it out because I think innovation moves at the speed of trust, is that you're building trust through transparency about the product.
And so even if they maybe don't love what they're seeing, they trust you to tell them the good, the bad, and the ugly. And that trust builds a relationship, and they have a relationship with Edmentum. So I can imagine that probably helps in customer retention as well. How does Age of Learning think about evidence as a market differentiator? I appreciated your comments earlier about it being, even at the product design level, a way you are differentiating in the market. How are you leveraging evidence and interoperability to differentiate Age of Learning's products? Well, thank you for bringing in the part that I mentioned about how we are integrating evidence into our product design as well.
So one of the things that we do, and this relates a little bit to something that Michelle said, is we're very transparent about how we have designed our programs with the learning outcomes in mind. And we show what that design process looks like and what outcomes we're aiming for. And it's not just about learning outcomes. We place a great deal of emphasis on making that learning process enjoyable, fun, and confidence-building, because those are all antecedents to achievement. You have to have engagement in order for learning to happen.
And when we talk about evidence, we also try to contextualize those results. We talk about how much instructional time teachers actually had, how classrooms were organized, and what challenges teachers were navigating in their local context, to help districts see what's realistic and replicable. And, again, echoing something that Michelle said, we have to communicate evidence in ways that teachers and educators and district leaders can assess quickly, through concise summaries and clear methods: where learning happens, what subgroup outcomes are possible, and transparent limitations. And, again, trust building. That openness is important, and we also know that learning can vary across contexts, and we have to be cognizant of that and recognize that.
Going back to differentiation and focusing on student experience, one of the things that I always want to point out is that across studies, we're seeing that children are not just learning, but they are deeply engaged and enjoying the process, and they are becoming competent as learners. They're seeing themselves as learners, and that's one of the most exciting things that we see. And teachers also tell us that students who are hesitant at the start become more willing to try, to persist, and eventually to celebrate their progress. And that kind of engagement isn't just an add-on. It's foundational to early learning.
We're talking about three-, four-, and five-year-olds, and it's supported by the way that our programs personalize practice and challenge students at the right moment. And another thing that I would say is that we differentiate through usability and teacher empowerment. We're not just serving the students. We also think of teachers as users of the data and resources that we provide. They get support.
The educator center, or the dashboard that we provide, surfaces mastery of the specific skills that students are working on. It pinpoints areas of struggle, recommends small groupings for students working on similar skills, and also targets offline activities that are aligned to exactly what students are working on. And many teachers tell us that it helps them feel more in control of the differentiation, more confident in making instructional decisions, and more able to meet each student where they are. I love that you both talked about personalization. Michelle, you talked about it as, like, getting kids, in one of your products, to come to school because they're excited to be engaged.
And then this idea of, even at the early stage, how we think about sense of belonging and joy of learning as part of personalization. So it's not just what's the right skill and what's the right strand and what's the right intervention, but it's really restoring that joy in learning, and young people seeing themselves as learners all the way through. Okay. Charlie, how are you thinking about interoperability and evidence as a market differentiator at Lincoln Learning? Things you would add to this conversation? Yeah.
I think, from our standpoint, evidence in particular really assists us in telling a bigger story that's, you know, maybe even beyond Lincoln Learning itself, and kinda where we fit in this education space. And I think the other thing that it helps us do is continue on the journey toward innovating and finding better solutions, more differentiated solutions that are meeting more learners, as we understand the different preferences and different environments that might be originating inside of our space. So evidence really helps kinda reaffirm that. And then, you know, I love many of the things that I've heard here about how it's kind of a cyclical sort of thing, right, where we're able to kinda tell the story about here's what was, but then we're also able to say here's what is, because we found these pieces out. We are taking these actions.
Right? But I think that really helps to, you know, reaffirm that trust. I like that piece of it, to ensure that we didn't just take our idea and say, yeah, we're standing firm on what our principles were regardless of what other folks think. We're always open. We're no good without feedback.
We're no good without evidence to reaffirm the things that we're setting out to do. So it has to be that full life cycle. We have to pull that evidence back in so that we can continually tell that story, because that story has multiple chapters. I think as time goes forward, and I've seen that even in my time here, the story has evolved year over year as education continues to evolve. And then I think, you know, in terms of interoperability, just kinda reaffirming:
Folks, things need to work. Right? There are so many gadgets, so many very neat tools that do so many different things in the ed tech space, and you have all different types of vendors and providers that have all of these great things. I think our journey for interoperability has really been about giving time back to educators so that their time is best used with their learners. And they're not worried about administrative overhead or clerical tasks; so many things have kinda come onto our teachers' desks that they have to interface with on a regular basis. We have really, really tried hard, over the past number of years, to make those things easier, or even eliminate those pieces.
So hearing kind of the talk about, you know, evidence and putting it in context for folks: we try to do that in a macro way, but we also try to do that in a micro way. Right? So that evidence all aggregates. The feedback all aggregates, and we try to understand that in a big way as well as in a small way at Lincoln Learning. Charlie, you're reminding me of when we first started Project Unicorn ten years ago. Our tagline was giving teachers their Sundays back.
Because we would hear these horror stories of teachers, like, you know, spending whole Sundays rostering kids via Excel spreadsheet or, like, hand-entering emails into platforms. So one, I'm glad that's no longer our tagline and that the market has evolved, but we still have some progress to make on interoperability, to be frank. But you flashed me back to ten years ago in the wayback machine. So listen. I think we've talked a lot about the things that are really positive around sort of what's happening around efficacy, you know, what's happening in terms of this space, more folks asking for evidence.
I wanna shift and talk a little bit about challenges, if we could go to the next slide. Great. So, yep, thank you so much. So, not to hide the ball on trust and transparency, all of you have partnerships with Instructure. What challenges sort of led you to that partnership with Instructure? And, you know, maybe I'll take a little moderator's liberty.
Have you gained value in that partnership? I think you probably have if you're here today. And what has been that value? So let's start with you, Heejin. Yeah. So I think I've alluded to the fact that Age of Learning invests in research. Research evidence is essential to the way we do our work.
So we've done everything from user design and usability testing and small-scale design studies to, you know, larger-scale implementation studies and quasi-experimental studies, as well as randomized control trials and longitudinal analysis. But we've always recognized that internal evidence building is only one piece of the puzzle. Because, as we've talked about, districts are under an enormous amount of pressure to justify investments with solutions that have independent and rigorous evidence behind them. And they need to know that studies were reviewed by neutral experts and interpreted using frameworks that they know and trust. And that's why we've sought, you know, ESSA-aligned third-party review, having external researchers review and scrutinize our methods, our sample characteristics, analyses, and findings, and determine whether they meet certain ESSA tiers.
And that helps districts compare solutions consistently and understand the strength of evidence in context. And the validation process also strengthens our internal practice. We receive feedback on methodology. We get ideas about different ways to analyze data, and implementation considerations that enhance future studies and also inform product decisions. So the independent evaluation that we're able to get by partnering with Instructure provides both external credibility as well as internal growth and learning.
Awesome. Just a reminder to our audience, I see some questions starting to come in the chat. We're gonna turn to those questions really soon. Please go ahead and put those in the Q&A panel, and we will get to those. Michelle, what would you add about sort of the partnership and the why? So I'm hearing sort of trust, external validity.
What else would you say? Yeah. I mean, I would say that we're very similar to what Heejin just described. We have a great research team here at Edmentum. Really, really proud of the rigorous research that we can conduct. We can run studies faster and with less expense, really, than a third party can.
We have a technical advisory committee that comes in and helps us look at our methodologies, and our director of efficacy research has just been elected to the board of directors for the Society for Research on Educational Effectiveness. So really, really proud of what we've put together here at Edmentum. And just as Heejin described, we found our partnership with Instructure is still a very, very important part of the evidence journey, because we do need that third-party independent validation. They have a great rubric that outlines exactly what they're checking for and how it aligns with the ESSA tiers of evidence. And what I'll say is that it is not just a rubber stamp.
So what I would add is that it does push and force and raise questions, and really helps us be better versions of ourselves, by working with Instructure in order to externally validate our studies. I'll also say, I think the Instructure research team is really incredibly strong, and so they provide us with really valuable surge capacity. I don't know if you all have noticed what state list applications look like, but an RFP can drop, and you've gotta have, you know, a study submitted three weeks from then, or what have you. And so the flexibility that Instructure has shown us in being able to help support some of those surge capacity needs has been really fantastic as well. The talent, capacity, and peer support is a little bit of what I hear.
Researchers love researchers. So, having that community. Charlie, anything you would add to the partnership question and the value? Just a couple things. Instructure, the evidence team, has really made it easy for us to start down the road of pursuing evidence.
We've done different things in the past. We've had independent research done. We've had teams internally that pursue that. But the independent validation, from an entity in the space that already has that respect and that trust, I think goes much further for us in our instance than much of what we would do on our own. That's where we've seen the value with the partnership with Instructure.
You know, it's not necessarily us on our own saying what we're saying. It's coming through Instructure. We actually have historically partnered with Instructure in a variety of different ways, so it was also a convenience for us to extend the partnership through evidence as well. It was just a convenient sort of happenstance for us to utilize the evidence services that Instructure has.
Great. I'm gonna turn to a couple questions in the chat, and then, like I told you earlier, I'm gonna ask you as our wrap-up question to grab that crystal ball, so keep it handy as we think about the future. So, a couple questions from the Q&A. And, Dennis, I see your hand up. If you wouldn't mind using the Q&A, that would be great.
First, are there other firms or other folks in the research ecosystem doing some of this work, and maybe you can talk a little bit about differentiation there. And then also, and this is an important question: where do I find the evidence of how a product is performing? Let's say I want access to that evidence. Where should I go to find it if I'm a district leader, if I'm an educator, if I'm a superintendent, if I'm a state chief? How do I understand who has evidence and who doesn't? So, Michelle, why don't you kick us off there? Who else does this, and where can I find the evidence? And then we'll all chime in.
Yeah. So I would say that there are other research firms who will conduct third-party studies on your behalf. The way that Instructure has put together their validation services, I would say, is actually fairly unique. I don't see a lot of that happening from other research firms. Instructure is very systematic about it.
So whereas the other research firms are more likely to want to conduct a study for you, they're less likely to look at a study that you've conducted and provide that validation service. And I think this is actually tremendously important, because of the changes that are happening with IES and the What Works Clearinghouse. I mean, this is no filter from me, I guess. Right? I'll help you here. It's a disaster. It's a disaster.
There are studies that have been sitting at the What Works Clearinghouse since 2018 waiting to be reviewed. So there's a real gap there. I would say the other kind of place that you can submit for validation would be Evidence for ESSA, which is out of Johns Hopkins University. They also provide some services, and they have a website that is up. They run quite a lot faster than the What Works Clearinghouse review cycle did.
And then I would say Instructure is the other that really provides this service. Instructure is also partnered, and someone from Instructure could pipe in here, with some of the international review bodies as well, and that has been really supportive in our international evidence story. Yeah. I'll just chime in.
I talk to state chiefs and big-district superintendents, or even small-district superintendents, who are very frustrated with the slowdown in federal research and with knowing where to go. And some of them take a significant portion of their budget either to create their own internal research teams or to band together to figure out how they can have some local and state R&D heft. I think we're gonna see that trend grow. I think, you know, there's gonna be a demand for evidence. There's a growing demand for evidence from states, districts, communities, parents, and even students.
And so who's gonna help answer that question, as it's been really difficult to get those answers with speed, trust, and validity at the federal level? So, Michelle, I'll say all the hard things too on a Friday, and validate that from the people who are doing the work in the field, who are scrambling to figure this out. So I really appreciate your candor, and that trust building you just did right there. Really, really appreciate it. Charlie, Heejin, what would you add to Michelle's comments? I also noticed that one of our attendees mentioned that LXD Research does validation, as does EduEvidence, both of which are true.
For EduEvidence, they offer an international certification model and focus on broad impact dimensions rather than just "does it work?" I think they actually have what they call five E's around efficacy, equity, ethics, environment, and I forget what the fifth one is. And then the International Center for EdTech Impact is another one. They work with EdTech firms on usability testing and implementation research, as well as, I think more recently, outcomes-based contracting. And they emphasize the full evidence cycle, from design through validation.
And that's how I got to know them, as we align in that philosophy. Yeah. And I mean, we send folks to the EdTech Index, which has some really simple market signals in terms of badges around, you know, sort of ESSA tiers of evidence, research design, and validity, and so on and so forth. So that's another place that folks can go. Charlie, would you add anything else real quick on this? Yeah.
There are definitely other institutions. I think it ultimately comes down to what question you're looking to answer at the end of the day. So we're familiar with all the groups that have been mentioned here. Before we undertook our evidence pathway with Instructure, you know, we looked at all of those different solutions. Ultimately, at the end of the day, Instructure was the best fit for us at the time we were making the decision.
Great. And some real-time hacks in the chat, honestly, in the Q&A: using those quotes to find a product in ERIC, which is a federal research repository, and in ResearchGate as well. So, real quick from Dennis. Thanks, Dennis, for converting that hand into a question.
I appreciate you. Dennis is developing an open source software product, maybe "slow neg" or "slo neg," you can correct me later, Dennis, for peer learning. It has rigorous evidence on efficacy.
They're thinking of getting into schools through partners that are already selling something there. Anyone wanna reflect on whether that's a legit, viable marketing strategy, doing it through partnership? I mean, I think partnerships can help you scale faster, for one. And partnering with organizations that already sell into districts could be a really smart approach, because many schools and districts have really long procurement cycles, already-established vendor relationships, and trust built with existing suppliers. Partnering lets you capitalize on that a little bit and build on existing distribution and credibility rather than starting cold. So that's, I think, maybe the immediate thing that stands out to me as a benefit.
But I do think there might be a couple of things that you would want to watch out for as well, like partner alignment and value proposition. The partner has to believe that your solution adds value to their offering, and the existing clients have to also see how your solution fits into their workflow. And I think the other thing that comes to mind is around implementation and evidence translation. You mentioned that you have rigorous evidence on efficacy, but I think you will need to ensure that evidence is communicable in partner and school terms. And implementation matters.
Schools will ask, like, what happened in the research? Who were the students? What supports were needed? Yeah. So if you partner, you'll need materials for the partner's sales and implementation teams about how to replicate the conditions of your evidence. Those are great tips, and what a wonderful way to mentor an entrepreneur. Thanks so much. Okay.
I told you guys you had to get your crystal ball. I have one on my desk. You can borrow it if you'd like. So we'll just go straight down: Michelle, Heejin, and Charlie. In forty-five seconds or less, tell me about one big initiative that you're thinking about in the future and how you're gonna partner with Instructure to get it done.
Michelle, go first. Yeah. One of our big initiatives is advancing our college and career readiness offering. We just acquired a platform, MajorClarity, a couple of weeks ago. We're working on integrating that through.
We've already been working with Instructure on all the logic models related to our career offering, and we're really excited to be advancing all of the great work we can do to build our understanding of how our programs are going to support each other within that career ecosystem. Amazing. Symbiotic support of learning. Great. Heejin.
Well, we're focusing on the student experience, enhancing that experience through AI-informed personalization. We know that time is limited, so we're using embedded AI-driven tools to quickly understand what students are capable of. We reduce the time spent in lengthy placement or pretesting and get them into appropriately challenging content as quickly as possible. And there are other features that we're integrating into the educator center to help make teachers' jobs easier. But, basically, with all of these new capabilities, the way students and educators experience our programs is changing.
And that's exciting, but it also means that older studies are not enough. We are committed to continuing to evaluate each new generation of the product, and that means running new studies, analyzing the new usage patterns, and updating our implementation guidance. So we'll continue to pursue independent review of our research so that districts can interpret the findings through a consistent, external framework. Great. Charlie, we're at time, so you're gonna have to make this fast.
I'll be very quick. We're reevaluating portions of our student experience, and we're hopeful to work with Instructure through that exercise to validate that the redesign provides the right value and engages in the right ways. Thank you so much for joining me today. Here's the bottom line: evidence matters, and so does interoperability.
Our job is to make it easier for teachers to use technology, not harder. I'm so glad to have examples of three outstanding companies, along with Instructure, who are doing that in our ecosystem. If you wanna learn more about partnering with Instructure, reach out to their team. Thanks for joining us today. Have a great day.
Bye bye. Bye, everybody. Bye.