IgniteAI in Action

IgniteAI brings intelligent tools into the Canvas experience to help educators focus on what matters most: teaching, insight, and student impact. Join this hands-on session exploring key functionalities seamlessly embedded in Canvas.

Video Transcript
Hello, everyone. We'll just give a few more moments for everyone to jump into the webinar and get settled. Okay. I think people are still making their way in, but in the interest of time, let's get into things. Welcome, everyone, to our second webinar on Ignite AI. Today we're gonna be showing Ignite AI in action.

I know that's what many of you are really keen to see. My name is Farrah King. I am the global growth product marketing manager here in APAC, and I'm excited to be running this session today with Greg Fowler, our APAC director of solution engineering here at Instructure. For those of you who haven't met Greg, he knows our products inside and out, and he works really closely with our institutions across the region to showcase the solutions that truly support teaching and learning. With his deep expertise in Canvas and our broader ecosystem, he helps our universities, schools, and other learning providers reach their academic goals through practical, impactful EdTech use.

So today, Greg and I are gonna be walking you through our new Canvas AI features, and we'll be showing how they can empower educators and streamline your key workflows. As I mentioned, this is our second webinar in our AI series, so we're gonna start off by quickly recapping what was covered at last week's webinar, where Melissa Lobel and I covered the why and the what of AI in education. This will help set the scene for today's session, where we're gonna chat about the how and show you what we have in store with the AI features that we'll be bringing to Canvas.

There is a Q&A function in this webinar as well, so my amazing colleagues will be here to answer questions in there for you. Feel free to pop your questions directly into the Q&A. We've also got some chat going on, so we'd love to see where you're dialing in from in the region. Feel free to pop that in if you'd like, because we always love to see our region get represented.

Now firstly, because I know you're all keen to see things, we'll cover off really quickly what we talked about last week. So we looked at the current landscape of AI in education and specifically the challenges that we know you're facing. Are the tools secure? Are they vetted? Have they been aligned with data privacy? How do we scale awareness and adoption with staff, with faculty, and with students? And, of course, how can AI help solve the problems that we couldn't solve before? We also covered off Instructure's principles for successful deployment. We wanna ensure confidence. We wanna be transparent.

We want you to know that these tools have been vetted and that you can trust them. We also wanna support your adoption, so we wanna make sure these tools are native, integrated capabilities that are seamless and intuitive for your workflows. And, of course, that they're bringing you value. We wanna make sure the AI solves real challenges and avoids that AI-for-AI's-sake approach that some people might be taking at the moment.

And, of course, it's sustainable for you and your institutions in the long term. Ultimately, our goal is that AI is not here to replace educators. It's here to enhance, augment, and expand what's possible in teaching and learning. So now let's get into what you're here for. We're going to cover off a few things.

And so, in terms of what we're doing here at Instructure, I'm gonna go through the work that we're doing, then I'll hand over to Greg for the demo. I know that's what you're all here for, so just bear with me while I set the scene. So today, we're gonna introduce you to Ignite AI. That's our in-context AI solution for education, and you're gonna see it featured throughout the session today. Think of it as a conductor for AI in our ecosystem.

It's designed to bring AI directly into the flow of teaching and learning with no separate logins, processes, or fragmentation. It's just smart, accessible tools directly embedded into the systems that you're already using. And Ignite is powered by Amazon Bedrock, so our customers will benefit from a secure, enterprise-grade AI foundation that ensures transparency, scalability, and the flexibility to adapt to your diverse institutional needs. Ignite AI isn't just going to be another tool. It's Instructure's cohesive AI solution embedded in Canvas and our ecosystem, giving institutions control and confidence and ensuring that students can focus more on learning.

So I'm gonna run through some of the features that we're gonna be going through in a moment just to give you a bit of insight into what we're thinking. So firstly, when it comes to facilitating a better learning experience, we are working on discussion summaries and discussion insights. With discussion summaries, we know it's not easy to keep track of every conversation, particularly when there are large cohorts and lots of discussion going on. So when that conversation is moving so fast and you need a quick recap, we have discussion summaries in Canvas. They analyze the conversation threads with the click of a button, and they highlight the main ideas, areas of confusion, and points that might need more attention.

So it's going to help your educators quickly understand the flow of the discussion, spot any unanswered questions, and of course, help them focus their responses where they're going to have the most impact. This keeps your discussions productive. It ensures your learners feel heard, and it's gonna help your educators respond in a more timely and targeted way. So the result will be more engaged conversations, stronger understanding of your course materials, and less time lost sifting through endless posts. So where summaries help you see where the discussion stands, insights will take that a step further for you by showing you how your learners are engaging and the quality of their contributions.

Currently, educators can track how many times learners are posting or replying in a discussion, but it's actually very difficult to evaluate the quality of those contributions. Manually reviewing every comment to gauge depth, relevance, and engagement can take a lot of time and delay the meaningful intervention that might be needed. So discussion insights analyzes the content of your learners' posts. It assesses their quality and engagement levels, surfaces patterns in participation, highlights standout contributions, and, of course, identifies when your learners might be under-participating or even off track. Your educators will get a clearer, faster picture of how the discussions are driving learning, enabling them to recognize strong contributions, reengage those quiet learners, and, of course, guide conversations for deeper understanding.

This results in richer dialogue, more equitable participation, and, of course, stronger learning outcomes. And when it comes to assessment, we have a few features aimed at improving teacher workflows: rubric generation, auto-grading, and AI quiz question generation. So firstly, when we think about the arc of learning evaluation, it starts with how the assignments are created. And for many of you, this includes leveraging a rubric to guide what your learners need to demonstrate and, of course, how you are assessing those things. I know many people don't enjoy the rubric-writing process.

I'd be interested to know if anyone does. So now we have AI that can handle that draft process for you so that you can focus on the nuance. Having a meaningful rubric is so important to the assessment process, but we know creating one can be so time consuming. So we're currently developing a capability which will leverage Ignite AI to assist in generating those rubrics for you. It will improve the quality and reduce the time spent on the task.

So while AI helps to expedite the process, your educators will, of course, have full control to edit or amend that rubric and ensure that the end result is truly suitable and reflective of your needs. We actually have an early adopter program for this one kicking off this month, and we hope to deliver it into feature preview in Q4 this year. So we're really excited to see how this works for you in your day-to-day. And when it comes to grading, we know how valuable feedback and submission insights are for learners. We also know this can be very time consuming.

So I know how excited many of you will be to get your hands on this one: first-pass grading assistance. We're currently developing a solution to generate draft scores and draft feedback using AI, which is powerful not only through the lens of maximizing your time and reducing those administrative hours, but also, really importantly, because it helps protect submissions from the bias that inevitably creeps in as we mark for long periods or as things fluctuate in our day. Imagine having a starting point every single time that you can review, evaluate, and iterate on without having to start from scratch. We know getting this right is crucial.

So we've been undertaking comprehensive piloting of this functionality with over a hundred users at the moment to ensure that we can continue to refine and evolve this offering. We are aiming to also launch this into an early adopter program next month, so you can stay tuned for more information on that. And finally, when it comes to creating high quality quiz questions, this is also very time consuming. Educators often start from scratch. They're manually drafting questions, aligning them to content, checking for accuracy.

It's slow, and it takes time away from other teaching priorities. So we're solving for this by introducing Ignite AI quiz question generation in New Quizzes. Your educators can automatically create questions based on the course content or materials that they provide. Ignite AI will draft questions that align with your learning objectives and give your educators a strong starting point, again, that they can review, edit, and approve. The result is faster quiz creation, more consistent alignment with your course content, and more time for supporting your teachers and learners.

Next up, we've got evaluation and remediation, and these are two things I'm really excited to talk about: course-wide content accessibility remediation and our Intelligent Insights offering. So when it comes to content accessibility, we know how tedious this can be. Accessibility is a major concern for all of you. And so while we're committed to ensuring that our platform is always accessible, we are also developing tools to make content accessibility remediation easier for you. This tool will be able to scan courses for non-compliant content, like inaccessible PDFs or color contrast issues.

And, ultimately, it will generate those remediations for you, such as adding necessary text tags or even suggesting more appropriate color options. At the moment in Canvas, we do this page by page, so having this at a course level is a huge time saver. We're hoping this will be available to opt in to an early adopter program later this month as well. And the next feature is part of our Intelligent Insights solution.

We know decision makers often have to request custom reports, and that involves waiting, sometimes days or weeks, for particular answers. So with Ask Your Data, admins can query Canvas data in natural language. It generates the SQL and helps create charts and tables with detailed explanations at the account and sub-account level, allowing you to explore engagement and performance trends more easily.

Leaders can then move from being reactive to proactive. They can use these timely insights to shape policy, allocate resources, and, of course, improve outcomes faster. One of our higher education institutions actually used Ask Your Data to help identify and remove unnecessary LTIs. It ended up saving them over $90,000 because they could find free alternatives. They used Ask Your Data to locate specific LTI usage across their courses, and in particular transitions from LTI 1.1 to 1.3, and they even identified hidden instances of LTIs that weren't necessarily showing up in the LTI usage tool.

So the tool enabled them to provide their instructors with a customized list and, of course, allow for more efficient updates and cost savings. Now so far, we've shared with you our new Ignite AI features that will be included in Canvas, and we know there's a continued appetite for more meaningful use of AI in education. There's also understandably a lot of reluctance. Current tools are still siloed. They're overly specific, and they require jumping between multiple systems.

There's no unified or flexible way to get AI-powered help inside Canvas. This is where Ignite Agent comes in. It's a powerful premium AI tool designed to handle more complex tasks and engage with various AI solutions within your Canvas ecosystem using a single prompt. So think of it as a catch-all for AI assistance where a specific prebuilt use case isn't always available. Your educators can ask questions or give it commands related to their courses and learners.

It embodies an agentic workflow, orchestrating tasks by potentially routing different parts of your request to various AI agents within your ecosystem, including institutional or third-party partner agents. So from an educator's perspective, this aims to provide a more unified experience rather than requiring them to interact with multiple disparate AI tools. A key feature of Ignite Agent is its transparency. It's always gonna show its work, outlining the steps that it plans to take and allowing educators to review and approve each action.

This will build confidence and ensure that educators always remain in control. It connects deeply with over seven hundred Canvas APIs to execute complex tasks from a single prompt. For example, it can identify struggling students based on some criteria, create differentiation tags for that group, generate remedial assignments, and even use third-party AI content generators to create targeted learning materials. Due to its complex, compute-intensive nature and its differentiated capabilities, Ignite Agent will be part of an add-on product to Canvas rather than a default feature. When it comes to AI use, activity doesn't equal impact.
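As a rough, hypothetical illustration of the kind of Canvas API work such an agent would orchestrate, the sketch below uses the standard Canvas enrollments endpoint to flag students whose current score falls below a threshold. The base URL, token, course ID, and 70% cutoff are placeholder assumptions, and this is a sketch of the idea, not how Ignite Agent is actually implemented.

```python
import requests

# Illustrative placeholders only -- not real credentials, IDs, or thresholds.
BASE_URL = "https://your-institution.instructure.com"
TOKEN = "CANVAS_API_TOKEN"   # assumed: an access token with read permission
COURSE_ID = 12345            # assumed course ID
THRESHOLD = 70.0             # assumed cutoff for "struggling"

def flag_struggling_students(course_id, threshold):
    """Return (name, current_score) pairs for active students below the threshold.

    Uses the standard Canvas enrollments endpoint, whose student enrollments
    include a grades object with the current score.
    """
    url = f"{BASE_URL}/api/v1/courses/{course_id}/enrollments"
    params = {"type[]": "StudentEnrollment", "state[]": "active", "per_page": 100}
    headers = {"Authorization": f"Bearer {TOKEN}"}
    response = requests.get(url, params=params, headers=headers)
    response.raise_for_status()

    flagged = []
    for enrollment in response.json():
        score = (enrollment.get("grades") or {}).get("current_score")
        if score is not None and score < threshold:
            flagged.append((enrollment["user"]["name"], score))
    return flagged

if __name__ == "__main__":
    for name, score in flag_struggling_students(COURSE_ID, THRESHOLD):
        print(f"{name}: {score}")
```

An agent would chain several such calls (tagging the group, creating assignments) behind the single prompt described above, with the educator approving each step.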

So we're really excited to bring Ignite Agent to you so you can channel your AI use into outcomes that truly matter in the teaching and learning space. Now thank you for bearing with me while I introduced all of those features to you. I'm now gonna hand over to Greg, who's gonna give us a demo of some of these things. Thanks, Farrah. We'll walk through a couple of the features that Farrah has mentioned, whether it's morning or afternoon wherever your time zone is.

We'll start with the Ask Your Data product, a product that we've had in our production environment, available for customers, for over a year now. Ask Your Data is a tool, as Farrah mentioned, which allows us to ask questions of almost all of the data that sits within the Canvas environment. So earlier today I asked, 'Which students have the lowest average assignment scores in the active term?' The tool then processed all of the Canvas data and provided me, on the right-hand side of my screen, a table of data. It has also provided a summary of the results and the methodology it took to create them, including any assumptions that have been made in its interpretation.

If we need to ask a clarifying question, whether it's as simple as 'can you also include the student email address?' or something that takes us in a completely different direction, it will reprocess that query and return a new table of data. One of the really nice elements of Ask Your Data is that it provides us with the complete SQL that it's generated to actually query this data in the back end. So if you've got a data science team that is learning the Canvas data product, this will help them understand the structure and which tables need to be joined together to return certain data, but also help you validate, or if you need to, amend that underlying query and reprocess it. When we do return a table of data, we can also pin it for reuse into a pinboard. So we can develop dashboards for common assessment monitoring or for the health of our Canvas environment, and in a few moments' time I'll show you some of the pinboards that I've created with some saved results looking at the health of my Canvas environment.
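To make the generated query step concrete, here is a minimal, self-contained sketch of the aggregation behind the question asked above: average score per student, lowest first. The sample rows and the table names in the commented SQL are illustrative assumptions, not the actual Canvas schema or the exact SQL Ask Your Data produces.

```python
from collections import defaultdict

# The equivalent SQL would look roughly like this (table and column names are
# illustrative, not the exact Canvas schema):
#   SELECT u.name, AVG(s.score) AS avg_score
#   FROM submissions s JOIN users u ON u.id = s.user_id
#   WHERE s.score IS NOT NULL
#   GROUP BY u.name
#   ORDER BY avg_score ASC;

submissions = [
    {"student": "Alice", "score": 62.0},
    {"student": "Alice", "score": 70.0},
    {"student": "Ben",   "score": 45.0},
    {"student": "Ben",   "score": 55.0},
    {"student": "Chloe", "score": 88.0},
]

totals = defaultdict(lambda: [0.0, 0])   # student -> [sum of scores, count]
for row in submissions:
    totals[row["student"]][0] += row["score"]
    totals[row["student"]][1] += 1

averages = sorted(
    ((name, total / count) for name, (total, count) in totals.items()),
    key=lambda pair: pair[1],
)

for name, avg in averages:   # lowest average assignment scores first
    print(f"{name}: {avg:.1f}")
```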

Things like when users are most frequently logging in and what browsers they're using as well. We can also save these results as charts, whether that be a KPI number or a line or bar graph. So if we go into the chart builder, based on the data that's available, it will give us the various types of charts that we can generate here as well. But as I mentioned earlier, 'show SQL' will show us all of the SQL that was used to generate this data, along with a SQL diagram that helps us understand which tables of data have actually been used to construct the query. In my pinboards, this is where I've saved some frequently used queries that I want to look at, and these load live with a fresh version of the data.

These can also be shared with your other Canvas admins, and of course this can also be used at a sub-account level. So if we're a higher education institution, these can be faculty specific. If we're in a school context, we can have dashboards specific to our younger learners versus our senior school. For all of these reports, we can also go back and see the summary and methodology that were originally created. And if we need to, we can go back to the original chat that we had with Ask Your Data to see how that query was structured, or yet again validate that SQL at any point in time as well.

So moving on from Ask Your Data, Farrah mentioned a few different AI elements which we've incorporated into our discussion tool. The first one that we'll look at is the ability to translate content. Again, this is using one of AWS's translation models. We can translate into a number of different languages, and this takes both the original prompt that I've given to my learners and each reply in the discussion and translates them, in this case into Dutch. So here we can see the translation, and if I keep scrolling down, every single post that has been made throughout the discussion is also translated.
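The session doesn't name the exact AWS service behind this, so purely as an illustration, here is a minimal sketch using Amazon Translate via boto3 to render a post into Dutch; Ignite AI's own pipeline may use a different model or service.

```python
import boto3

def translate_post(text, target_language="nl"):
    """Translate a discussion post; 'nl' is Dutch, as in the demo."""
    client = boto3.client("translate")   # assumes AWS credentials are already configured
    result = client.translate_text(
        Text=text,
        SourceLanguageCode="auto",        # let the service detect the source language
        TargetLanguageCode=target_language,
    )
    return result["TranslatedText"]

if __name__ == "__main__":
    print(translate_post("Welcome to this week's discussion on inventory control."))
```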

A little further up, another existing tool that we've incorporated is discussion summaries. Really benefiting those large courses where we've got hundreds or thousands of replies within the discussion, this AI model looks at all of those replies and has generated me this summary here. If I need to adjust this summary, I can ask a different prompt, like 'can you suggest some focus areas that I need to address within my teaching?', and it might be that I post a subsequent announcement based off the summary that the AI has generated. On a lot of our tools, we also give academics the opportunity to rate these AI-generated summaries, and this will then feed back to our team as well. The other element that we currently have within an early adopter program is discussion insights.

Whereas discussion summaries look at the overall performance of my discussion, clicking on discussion insights loads a new interface that is focused student by student. I recently generated these insights, but this would alert me to the fact that new postings have been made and that I may want to regenerate them. We're then given a relevance score for all ten students who have made a posting within this discussion, and we can filter to the ones that have been marked as least relevant or on target, the least relevant potentially being the replies or postings that I'll need to go and spend some time clarifying with my learners. We're getting an evaluation for each of our students, so Beth here has expressed feelings of being overwhelmed. If I would like to, I can click on this eye icon, which will show the individual posting that Beth has made.
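As a purely hypothetical illustration of the kind of per-student record this view works with, the sketch below pairs each student with a relevance label and an evaluation, then filters to the posts an instructor might review first. The field names and labels are assumptions, not the product's actual data model.

```python
# Hypothetical shape of per-student insight records -- not the product's real schema.
insights = [
    {"student": "Beth",  "relevance": "least relevant",
     "evaluation": "Expressed feeling overwhelmed and added little to the topic."},
    {"student": "Sam",   "relevance": "relevant",
     "evaluation": "Applied the reading to a workplace example."},
    {"student": "Priya", "relevance": "least relevant",
     "evaluation": "Replied off-topic to another student."},
]

def posts_to_follow_up(records):
    """Return the evaluations an instructor might want to address first."""
    return [r for r in records if r["relevance"] == "least relevant"]

for record in posts_to_follow_up(insights):
    print(f'{record["student"]}: {record["evaluation"]}')
```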

Her post reads, 'I don't have a lot to add here, and feeling overwhelmed to everyone's experience.' We then get the AI-generated insight for this particular learner. If we need to, we can click 'see reply in context', and as an academic staff member that allows me to see exactly where I might need to come in, reply, and add some additional context for Beth, helping encourage her to reflect on her experience and maybe post things that might not have originally been relevant to the intent of this discussion. Moving on from our discussion AI integration, we come to a Canvas quiz, and this is a really exciting development, I think: the ability for us to generate quiz questions leveraging an AI model. This can leverage existing course content that you've developed.

We can also incorporate text that we're either inputting directly in here or copying and pasting from existing materials. Finally, we can also upload additional files. So if we've got things that might not be sitting within our Canvas course but are resources that we have access to, we can upload them at time of generation. But for now, I will access all of the pages within my course and select some pages that will be relevant for the generation of these quiz questions. I'm just gonna select three from my list of pages, but you'll notice here it's loaded all of the modules and pages that I can ingest into this model.

Once I've made my selection, I'm going to provide a few different topics for the quiz questions to focus on. So I'm going to use terms like inventory control, management techniques, and manufacturing. The next thing that I can do here is align this through to a deeper level, whether that be learning outcomes, depth of knowledge, or Bloom's taxonomy. In this case, I'm going to choose evaluation on Bloom's taxonomy and choose Australian English, and at this stage we've got three different question types that we can leverage: multiple choice, fill in the blank, or essay style. When we select the number of questions that we want created, this will initially present me with the questions that it is suggesting.
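To summarise the inputs just described, here is a hypothetical sketch of how those generation settings might be gathered into a single request and turned into a prompt. The field names, page titles, and prompt wording are illustrative assumptions, not the actual Ignite AI payload.

```python
# Hypothetical generation settings mirroring the demo -- not the real Ignite AI payload.
generation_request = {
    "source_pages": ["Inventory basics", "Management techniques", "Manufacturing processes"],  # assumed titles
    "focus_topics": ["inventory control", "management techniques", "manufacturing"],
    "alignment": {"framework": "Bloom's taxonomy", "level": "evaluation"},
    "language": "Australian English",
    "question_type": "multiple choice",   # or "fill in the blank", "essay"
    "question_count": 3,
}

def build_prompt(req):
    """Assemble a plain-text generation prompt from the settings above."""
    return (
        f"Write {req['question_count']} {req['question_type']} questions "
        f"in {req['language']} on: {', '.join(req['focus_topics'])}. "
        f"Target the '{req['alignment']['level']}' level of {req['alignment']['framework']}. "
        f"Base the questions only on these course pages: {', '.join(req['source_pages'])}."
    )

print(build_prompt(generation_request))
```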

The next phase from here, before these can be added into my Canvas quiz, is an edit and review phase. So we're never directly adding these questions into a Canvas quiz. There's always that human in the loop, the academic, to validate that these questions are correct. So I can click edit and review, and one by one this will load the entire quiz question for me to adjust if necessary. I can either discard a question altogether or click save and go through to the next question, and once this has been completed, these will be added into my Canvas quiz.

And the thing that you'll notice here is this little star icon. It will be present anytime Ignite AI has been used throughout Canvas, and generally it will be academic facing. Sometimes, depending on the context, students will also be prompted that AI has been leveraged. It doesn't mean that we can't adjust these questions at a later date, so we can come in and edit this question, add it into a bank, and potentially make it available outside the context of this Canvas course as well. But at this stage, I've just added this quiz question directly into this single quiz within my course. Moving on from AI quiz question generation, we'll move on to auto evaluate within SpeedGrader.

So what I have here is a rubric that's previously been created, with descriptions on every single criterion, a student submission on the left-hand side of the screen, and this new auto evaluate button. Clicking on that takes into account the student submission, all of the rubric criteria, and the performance standard within each rating, and it will then suggest, for each criterion, a rating and a comment as well. This will note where AI has been used, and so we can come through and see the rationale that the AI model has used to determine that score. We can adjust the score, and we can adjust the comment that the AI has generated as well.
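As a hypothetical sketch of what the reviewer sees per criterion, the structure below pairs an AI-suggested rating, comment, and rationale with a flag recording whether the teacher has edited it, which is the cue the highlight conveys in the UI. The field names and values are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class CriterionSuggestion:
    """One AI-suggested rating for a rubric criterion (illustrative fields only)."""
    criterion: str
    suggested_rating: str
    suggested_points: float
    rationale: str
    comment: str
    teacher_edited: bool = False   # mirrors the UI cue: edits remove the AI highlight

suggestion = CriterionSuggestion(
    criterion="Use of evidence",
    suggested_rating="Satisfactory",
    suggested_points=15.0,
    rationale="Cites two course readings but does not evaluate them critically.",
    comment="Good sourcing; push further on weighing the evidence.",
)

# The teacher revises the AI comment before saving, so the AI highlight is dropped.
suggestion.comment = "Strong sourcing. Next step: weigh the two readings against each other."
suggestion.teacher_edited = True
print(suggestion)
```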

And so, as I mentioned, anytime AI is used, we're seeing this icon. Clicking on it will provide additional rationale as to why it's chosen that particular categorization. You'll notice that we're also seeing a comment that AI has generated. If I do amend it, the color highlighting that we see around the comment disappears to indicate that I have adjusted the AI-suggested comment. Or if I go back to the original comment, we can see it's now got that purple outline indicating it was the default provided by the AI model.

The same goes if I adjust the rating within a rubric criterion: we no longer get that purple highlight around the rating that has been selected. So scrolling down, if the AI has done, you know, a perfect job, we can just save that rubric assessment. Otherwise, we can adjust as I mentioned and save as we normally would. The very last thing that we'd like to demonstrate this afternoon is within a Canvas assignment. If I scroll down to the bottom of the screen, the rubric that I just leveraged for assessment is available for me to preview, but I can also adjust this and actually recreate a rubric using an AI model.

So I'll delete this rubric, create a new rubric, and give it a title; I'm just going to call it 'AI generated rubric'. We can then adjust all of these settings, but really the main thing we're looking at here, which is new, is the auto-generate criteria option. At the moment, I'm looking at creating five criteria with four ratings per criterion and a point value of twenty per criterion. We can adjust what is being sent through to the model as well. By default, it takes the assessment description, puts that into the model, and then generates criteria based on the information available.
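As a quick illustration of the structure those settings imply (five criteria, four ratings each, twenty points per criterion), here is a minimal Python scaffold. The rating labels reuse the defaults Greg mentions later in the demo, and the even point spread is an assumption; the generated rubric fills in the real criterion names and descriptors.

```python
def rubric_scaffold(num_criteria=5, ratings_per_criterion=4, points_per_criterion=20):
    """Build an empty rubric shell matching the generation settings in the demo."""
    labels = ["Exceptional", "Satisfactory", "Inadequate", "No marks"]  # defaults mentioned later in the demo
    criteria = []
    for i in range(num_criteria):
        ratings = [
            {"label": labels[j] if j < len(labels) else f"Level {j + 1}",
             # assumed: points spread evenly from full marks down to zero
             "points": round(points_per_criterion * (ratings_per_criterion - 1 - j)
                             / (ratings_per_criterion - 1), 1)}
            for j in range(ratings_per_criterion)
        ]
        criteria.append({"description": f"Criterion {i + 1} (to be generated)", "ratings": ratings})
    return {"title": "AI generated rubric",
            "points_possible": num_criteria * points_per_criterion,
            "criteria": criteria}

rubric = rubric_scaffold()
print(rubric["points_possible"])          # 100
print(rubric["criteria"][0]["ratings"])   # points: 20.0, 13.3, 6.7, 0.0
```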

If we need to provide some additional context, such as the learning outcomes, or we want to provide it with a scale to leverage, we could add that in here as well. But I'll just click 'generate criteria' for the moment. When this processes, again, we'll get a visual indication of where AI has been leveraged. We can always add new criteria into this rubric and align it through to learning outcomes as you normally would, so we can have a hybrid rubric that's partially generated via AI and partially edited, or we can use it as is. But again, we'll notice that icon appears at every criterion level.

We can adjust as well, so we don't need to take the exact language or rating names that were created, or if we need to, we can delete them and perhaps ask the model to leverage rating names that don't use exceptional, satisfactory, inadequate, and no marks. There might be something your organization defaults to instead, and we can ask the AI model to take that into account at time of generation. Before we save this, we can preview the rubric and see how it will look when we do grade. And if that is completely correct, at the bottom of the screen we click create rubric, and that will attach through to my assessment and add into my rubric bank so I can reuse it in any course that I'm teaching or for another assignment within this particular Canvas course.

I can see that there are quite a few things that have come through the Q&A. Farrah, any that we should address live here? I think it might be good to talk about what teachers see when they're using AI, in terms of that Ignite logo that you were talking about before, particularly in the rubric generation piece. Sure. So, in terms of generating a rubric, any criteria that are created by AI will be visually represented by the Ignite logo. So if I just delete this and recreate the rubric, we'll run through that in a little more detail.

So, 'AI generated rubric'. We can just click generate criteria, or as I mentioned before, it might be that we want to add a prompt here to change the terminology used in ratings. When we do click generate criteria, we'll actually add something in manually as well, to compare what an AI-generated criterion looks like against something that we're creating ourselves as an academic staff member. It might be that AI is suggesting several different criteria but isn't taking into account something as basic as referencing or spelling and grammar that we want to assess in this rubric. So here we've got vocational placement attendance.

Next to that criterion is the Ignite AI logo, indicating that it was generated by AI. If I want to, I can actually duplicate this criterion, and it might be that we just change it through to spelling and grammar. And imagine here that I've adjusted all of these descriptors and the language. When we click save criterion, that will now appear at the bottom of my rubric. And because that wasn't generated by AI, we don't have the outline around the criterion or the logo next to the criterion descriptor, and we can then adjust those ratings at any point in time, just as we can for any of the AI-generated criteria.

So we're not locking you into the suggested criteria by any means. These are really just suggestions that the academic staff member can review, adjust, or save through to the assessment. Awesome. Thank you, Greg. While I have you, I might put you on the spot a little bit.

I realized one thing we didn't speak about very much is nutrition facts, and I think it's super important to call out that all of our AI use in Canvas will come with something we're calling nutrition facts. So you can see what is being used, which particular LLM, how the data is being used, and some further detail on that. Greg, are you able to show us those nutrition facts? Yeah, I can. So while these will be built into the product in a few months' time, at the moment all of our nutrition facts are held within the community.

So this is talking about things like the AI models that are used behind the scenes to generate content. It runs through whether the model is trained with user data, what's done with the data, how much data is logged during the creation process, any exposure of PII, and also things like where this data is located. All of the AI models that we've talked about today are located in the data centers that we already operate, so we're not bleeding your data into different AWS locations when you use Ignite AI either. These are publicly available on the community; I just googled 'nutrition facts for Ask Your Data' in the background, and it directed me straight through to the Instructure community. We can go back into the knowledge base, and it will list all of the different Ignite AI elements within the platform.
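To picture the shape of one of these nutrition-facts entries, here is a hypothetical record listing the fields Greg walks through; the field names and placeholder values are illustrative only, and the authoritative versions live in the Instructure community.

```python
# Illustrative only -- the published nutrition facts live in the Instructure community.
nutrition_facts = {
    "feature": "Ask Your Data",
    "model_used": "<AI model(s) used behind the scenes>",
    "trained_with_user_data": "<yes/no, listed per feature>",
    "data_logged": "<what is logged during the creation process>",
    "pii_exposure": "<any exposure of personally identifiable information>",
    "data_location": "<data centre region where processing occurs>",
}

for field, value in nutrition_facts.items():
    print(f"{field}: {value}")
```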

Awesome. Thank you. There's still a bunch of Q&A happening, and I think we're slowly making our way through those, so thanks for your patience. But while that's happening, I think we can move back into our slides and have a look at where to next in terms of our roadmap for these functionalities and Ignite AI.

So a number of the things that we've mentioned today are already available in Canvas. The functionality we showed in Intelligent Insights, Ask Your Data, is already available, and you can speak to your CSM or, of course, have a look on our site for more information about that. Discussion summaries are also already available. And another one we didn't cover earlier is Smart Search, which has been improved in terms of the model behind it and really surfaces clearer, more specific search results for students. In terms of partners, we've got lots happening already with Google.

Some of you are already using Gemini as an LTI, and we've also got other things in the pipeline, if you've followed along with some of our announcements. We also have a partnership with Anthropic, whose flagship product is Claude, so we do have an LTI with Claude for Education.

We've got one coming soon with OpenAI as well. In terms of some of the features that we've spoken about that are in development, I shared some timelines as well in terms of when those are coming, and a lot of them will be available in the next month or so for early adopters. Our early adopter programs are a fabulous way to get involved and try out these products before they are finished. And the reason I call that out is because I think it's really important that we ensure we are capturing your use cases and getting feedback from you before we finalize things.

Here in the APAC region, we have a lot of specific use cases, and so I think early adopter programs are an amazing way to inform what we do going forward before these products become generally available. And then, of course, we've shown you some things that are in the works and will be announced at the start of next year. Again, we will be following up with information on how you can stay in touch with all of these things. So if you are keen to stay in the know, we do have a link that will be put in the chat in a moment where you can stay connected as we roll out Ignite AI, because there are a number of features.

We will keep you posted on how those features are progressing, and also on the research and the educator stories from across the globe, because there is so much good work happening in this space. Hopefully, you've had a chance to watch our first webinar, where we called out some of those great use cases. And along with our public-facing roadmap, you can obviously follow along with this feature development. The next thing I'll call out is that we have our quarterly product update sessions specifically for the APAC region, and the next one is actually happening on the seventeenth of September. This is where we show an exclusive look at recent releases and give deeper visibility into our roadmap so that you can see what's coming.

We get resources out to you as well and really help you manage that change and adoption across your institution. That was all for today. I hope you're feeling more enlightened by what we've got coming, and hopefully you're excited by what we've shown that's coming into Canvas soon. As I said, so much is happening in the AI space, and we're really excited about the way we're bringing everything together for you in one place within Canvas. We'll take some time to work through a few more of the Q&As.

If you do have other questions that don't get answered, we can follow those up afterwards as well. But thank you for spending some time with us today. We're very grateful, as always, that you've made the time and made the choice to be with us, and we'll see you all soon. Thanks, everyone.

Presenters: Farrah King and Greg Fowler

IgniteAI functionalities we’ll explore:

  • Content Accessibility Remediation
  • Discussion Summaries and Discussion Insights
  • Rubric Generation
  • Instructure and OpenAI partnership