GoReact
An on-demand webinar showcasing how the AI Assistant in GoReact simplifies feedback, enhances insights, and improves outcomes for both instructors and students
The AI Assistant in GoReact makes it simple to deliver the insights students need to improve while streamlining your workload. See it in action, get tips from other instructors, and discover ways it can help you and your students achieve more.
Matthew Short:
All right. Hello and welcome to our webinar today. We are thrilled that you have joined us, and we hope that you enjoy the presentation and walk away prepared to inspire your students and make a positive impact on their careers. My name is Matthew Short and I'm a member of the GoReact Client Success team. I will be leading today's presentation. For those of you who aren't familiar with GoReact, we are a competency-based video assessment and feedback solution used primarily across campuses in the United States and the United Kingdom. Before we begin, I'll run through a few points of housekeeping. Today's event will last about 45 minutes: roughly 30 minutes of presentation and 10 to 15 minutes for questions and answers. We are recording today's presentation, so if you do need to hop off before we finish, or you want to share today's recording with a colleague, we will email that recording to you afterwards.
We want today's presentation to be as interactive as possible, so throughout the presentation please participate in the polls and send us your questions. To submit questions, please use the Q&A function; we'll answer as many questions as we can towards the end of today's session. You'll also see a chat function. Please use it to introduce yourself and tell us what school you are with, and if you have any links or relevant resources to share with other attendees, please do so in the chat. And if you are experiencing any kind of technical difficulty, please reach out to us and we'll try to assist you as much as we can. Without any further ado, let's get started. As you can see, today's presentation is on navigating the new AI Assistant within GoReact. Now, AI is a big buzzword culturally and socially, and it's being embedded and included in just about every tool out there. So as we get started, I want to float an initial poll question to see whether members of our audience have had a chance to see and/or use the AI Assistant within GoReact thus far. I'll just pause for a moment here to let folks respond to the poll.
And while you all are responding, I'll click over to my next slide so it's ready for us once you've had a chance to respond. Oh, I've got the results here, and it looks like most of our audience has not had a chance to see or use the AI Assistant. So this will be a great webinar for you; I'll run through and demonstrate those AI functionalities directly within our product. But before I dive into the demonstration portion, I like to start with some context. As I was sharing, AI is very prevalent in the marketplace; it's being embedded in a lot of the tools we use on a day-to-day basis. So I think it's helpful context for you all to hear why GoReact is embedding AI within our tool set.
As we incorporate AI into our product, we have a few main objectives, or guardrails, guiding that work. We want to make sure that we're empowering student-led skill development. When students submit videos with the AI Assistant activated, they receive virtually instantaneous feedback and information on their performance in that particular video. They can immediately click into it, review the feedback, engage with the analytics, and review the transcripts that you'll see here momentarily. It empowers students to shoulder a little bit of the responsibility of analyzing their own work and figuring out a game plan: what did I do well in this instance? What could I improve for next time? So it helps share the responsibility for skill development and has students take a little more ownership. We're also hoping this accelerates skill mastery and deeper engagement with video content, particularly for our instructors, coaches, and university supervisors.
With the AI handling a baseline of feedback for students and identifying some of the common behaviors within a respective discipline, it frees those coaches, the individuals with the skills, experience, and mastery within that discipline, to really hone in on the key critical areas that your students and learners need in order to succeed and grow in that field. It allows you to leverage the knowledge you have of the student, their strengths, and where they can improve, so that you can focus on those specific points and give them the valuable feedback they need. We're also hoping that students and educators realize impactful feedback in less time. We want our AI to provide that baseline of feedback and allow you to hone in on the really important pieces, so that all participants save time along the way: you are not having to comment on every single point in the video or point out every minute detail, the AI handles some of that in conjunction with you, and students are able to respond, engage, and share that ownership. So as we go through today, I hope you see elements of these guardrails as I showcase some of the tools and features within our system.
So with that, I want to shift to the live demonstration component. I'm going to switch out of my PowerPoint real quickly and pull up an example video that I have here. Now, the AI features that you're going to see within our system are concentrated in three key areas. I'm just going to hit play on this video; sorry, my Zoom controls are slightly covering my play button. What I want you to notice on the screen at this point, and I'll zoom in a little bit so it's not quite an eye test, is that every video submitted with our AI Assistant activated will have a transcript generated and turned on for that video. You'll see it's one of the tabs here in the central bar that divides the video player from the commentary pieces on the right, just like the comment log and the comment section here.
Each one of these transcript segments is something that you can click on to jump to a specific point in the video. I'm going to pause the video momentarily to demonstrate that. Let's say there is a particular piece of vernacular or verbiage, some word I'm expecting my student to say during this video, that would mark a specific point where a skill is being demonstrated. I'm going to pick something pretty bland and generic because I want to make sure it shows up in the search tool, but you'll see here, if I type "okay" into my search bar, our system is going to highlight every instance where "okay" appears in that transcript. Let's say that's a point I really wanted to focus in on. If I click on that point in the transcript, you'll see the video jumps to that point and immediately begins playing. So as the student who filmed this video, or as the instructor reviewing this transcript, I can search for the specific points in the video that are relevant or germane to the instructional activity I'm conducting, quickly notate and highlight what's going on at that point, and then potentially switch over to my commentary tools to start providing guidance based on what I'm seeing.
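For readers who want to picture the mechanics of that search-and-jump behavior, here is a minimal sketch in Python of searching a timed transcript and mapping matches to playback positions. The segment data shape, field names, and example text below are hypothetical illustrations, not GoReact's actual data model or API.

```python
# Minimal sketch: searching a timed transcript and jumping to matches.
# The Segment structure below is a hypothetical illustration only.
from dataclasses import dataclass

@dataclass
class Segment:
    start_seconds: float   # where this utterance begins in the video
    speaker: str           # e.g. "Teacher James Smith" or "Students"
    text: str              # transcribed words for this utterance

def find_keyword(segments: list[Segment], keyword: str) -> list[Segment]:
    """Return every transcript segment containing the keyword."""
    needle = keyword.lower()
    return [seg for seg in segments if needle in seg.text.lower()]

transcript = [
    Segment(12.4, "Teacher", "Okay, let's get started with today's lesson."),
    Segment(95.0, "Teacher", "Does anyone remember what they were?"),
    Segment(140.2, "Teacher", "Okay, great answer. Let's build on that."),
]

for match in find_keyword(transcript, "okay"):
    # Clicking a match in the UI is analogous to seeking the player here.
    print(f"Jump to {match.start_seconds:.1f}s -> {match.text}")
```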
One other cool feature, actually two other cool features, sorry, I don't mean to undersell it here. Up in the top right corner, there's the ability to download this transcript. We have a few disciplines, particularly counseling, psychiatry, and psychology, where some exercises actually involve a student generating a transcript of their sessions. If you're using GoReact's AI Assistant, you can download that transcript file directly from our system and have an electronic copy; you don't have to manually transcribe it. So those transcripts can be downloaded and are available for you to use for any other instructional purpose. Also, in videos with multiple speakers, for instance a classroom where the teacher is speaking and the students in the classroom are speaking, you can identify specific individuals within that transcript. So if I know the first speaker is the teacher, James Smith in this example, and the other speaker is the students in that class, I can quickly tag those individuals so that as I'm reviewing the transcript, I can see who's saying what at any given point in time. This will also become important in a moment when I start to show some of the analytics. So all videos will have this transcript, and it is available to the student who recorded themselves as well as to any person reviewing that video.
The next cool AI feature that comes along with this package is the analytics tab. Some of you may be familiar with this tab already: even with our Essentials package, you have this feedback graph, which is a visual representation of the commentary feedback that's been provided on the student's video. I can see here are comments, here are markers, and within that I can see whether there are any gaps in the video where I haven't noted any feedback that maybe I want to go back and review. You're also able to see some of the AI markers that I'll show you shortly. Beyond that visual representation of the feedback, what our AI analytics offer is in some of these other reports down here, things like filler words: those durable communication skills, those soft skills, where we want to make sure that when we're communicating, we're communicating effectively and not adding a bunch of words that aren't helpful or useful.
Our system detects some of those particular practices. For filler words, you'll see in this graphical representation that 82 filler words were spoken over the course of this video, representing about 2% of the transcript as a whole. That's helpful context, but how do I compare to the average speaker? You'll also see on some of these analytics that we provide a baseline: it's common for less than 4% of your spoken words to be filler words. So in this instance, as the student, I know I'm doing pretty well, about twice as good as the typical speaker. That gives me helpful context: there's a little room for improvement, but I'm not doing badly in this sense.
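To make the arithmetic behind a figure like "82 filler words, about 2% of the transcript" concrete, here is a small illustrative calculation. The filler-word list and the counting logic below are assumptions for the sake of the example, not GoReact's implementation; only the 82-word count, the ~2% figure, and the ~4% baseline come from the discussion above.

```python
# Illustrative filler-word rate calculation (assumed logic, not GoReact's).
FILLER_WORDS = {"um", "uh", "er", "like", "basically", "actually"}  # example list

def filler_rate(transcript_text: str) -> tuple[int, float]:
    """Count filler words and return (count, share of all words)."""
    words = transcript_text.lower().split()
    fillers = sum(1 for w in words if w.strip(".,!?") in FILLER_WORDS)
    return fillers, (fillers / len(words)) if words else 0.0

# If the transcript were roughly 4,100 words long, 82 fillers would work out
# to about 2%, comfortably under the ~4% baseline mentioned above.
fillers, rate = filler_rate("um okay so like today we will basically review the lesson")
print(f"{fillers} filler words, {rate:.1%} of the transcript")
```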
Now, if it were 8%, 10%, or 20%, maybe that's something I need to hone in and focus on. Next, pauses. This is always fun, particularly within the teacher education field. Sometimes a deliberate pause is intentional; sometimes it's a good thing, because it lets your students digest what you've shared, formulate responses, and then actively participate, and we want to make sure we're giving our students the time they need to do that. So in some disciplines, a deliberate pause may be a good thing and something you want your students to be doing. Other times, if you're just delivering a presentation, it may be a sign that you didn't know the content as well as you should have for this presentation in front of your teacher and all your classmates. So whether these pauses are good or something to work on for the future can be context- and discipline-specific.
And you'll see here it's noting those deliberate pauses and how many of those pauses are lengthy in nature. Then, using the context and knowledge that you have within that discipline or for that activity, you can determine whether this is an area for growth or potentially a strength. Next is pacing: how quickly am I speaking? Am I speaking so fast that nobody understands what I'm saying, or I'm just oversaturating them with information? Am I speaking way too slowly and putting everyone to sleep? You have some analytics to justify or showcase which end of the spectrum you're on, or whether you're in the sweet spot in the middle. You'll see here the number of words per minute that you're speaking, and then, just like with the filler words, we have a baseline you can compare yourself to, to see whether you're around average in terms of your speaking cadence.
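Pacing boils down to simple arithmetic: words spoken divided by minutes of speech. Here is a quick sketch of that calculation; the 120 to 160 words-per-minute "conversational" band used for comparison is a commonly cited rule of thumb, not a GoReact-published threshold, and the example numbers are made up.

```python
# Illustrative pacing (words-per-minute) calculation; thresholds are a rough
# rule of thumb, not GoReact's baseline.
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    return word_count / (duration_seconds / 60.0)

wpm = words_per_minute(word_count=4100, duration_seconds=28 * 60)  # ~28-minute video (example)
if wpm < 120:
    verdict = "slower than a typical conversational pace"
elif wpm > 160:
    verdict = "faster than a typical conversational pace"
else:
    verdict = "within a typical conversational range"
print(f"{wpm:.0f} words per minute: {verdict}")
```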
And last but not least, hedging words: those words that take a little bit of the authority off of what you're saying, like "it's possible," "maybe," "that could be right." We want to minimize the instances where we add words into our speaking patterns or speeches that would undercut the knowledge and expertise we're trying to convey to our audience. So again, it's a great way to work on those durable skills; regardless of what discipline or profession you're going into, these are the kinds of things you can hone in on, grow, and improve when you're trying to be an effective communicator. Now for the really fun part. I'm going to click into the comment log, which you are all probably familiar with if you've used GoReact, and what you're going to notice is that there's a participant in this process named GoReact AI Assistant.
So the transcript that is captured on every one of these videos is reviewed by our AI. The transcript is actually what our AI processes to identify the particular behaviors that we've programmed into our system, which you can choose to enable on specific activities, allowing our system to provide praise to the student in areas where they're doing well. It can also provide feedback on where they can improve for next time. And as you can see here, it's not only saying, hey, you could have done this a little bit better; it's then providing suggestions on how this could have been better. It is not just saying you did great or you did badly. There's context, there's information, there are examples of how you can potentially improve your performance, and in those instances of praise, it notes what you did specifically that is worthy of praise.
I also want to highlight that, just like any other comment, you can click on these and the video will jump to about five seconds before that point in the timeline. Just like any other comment in the system, there is the ability to reply, and I'll come back to that when we show some examples from our actual users in the field using Advance. Not that you should actually be engaging in a conversation with an AI that won't respond back, but we will circle back to that. You can also edit or delete these comments. I emphasize this because, as you've probably experienced with other AI tools and features, AI is not always going to be perfect; there's going to be room for improvement. However, maybe there are parts of a comment that are beneficial, and you can edit it or add additional context or information so that the feedback as a whole is worthy of being left for the student to consider for growth purposes.
If you find there's absolutely nothing redeemable about a specific comment, we're going to ask two things. First, give it a thumbs up or thumbs down; we want you, students and teachers both, to engage with this feedback. Is this good? That's the little thumbs up icon. Is this no good? Give us a thumbs down on that particular comment so that our product team can review it and make sure our feedback is continuously growing, improving, and getting better for you all. Then, once you've given it feedback, just hit delete. It will be removed from the student's comment log, they won't see it, and you should be good to go. We want this AI to be impactful and helpful, so you and your students engaging with it lets us know what's good and what could be improved, and for those comments you feel shouldn't be left on there, you can always delete them.
As we wrap up the demonstration, I just want to quickly note that those AI markers are actually configured within the activity settings. You're probably familiar with this menu when you're using our product, but if you go into the feedback settings, under AI Assistant right here, you'll see a list of behaviors that you can select for our AI to review. These are programmed by us. You'll also note that you can select a default transcript language. So if you are teaching a global language, you can select a different transcript language, or if you just want it to stay in English, you can leave that alone. It will also try to auto-detect the language, so if it hears English first, it's going to produce a transcript in English, but you can tweak or adjust the default transcript language.
And then down here you can also see a variety of disciplines where we have specific markers built into our system. So if I were teaching nursing education, I could select that discipline and find a list of behaviors that have been programmed specifically for it. Some of these may be discipline-agnostic; some of the communication skills we noted earlier might be beneficial in any discipline, so there might be markers worth perusing in other disciplines as well. But this at least allows you to focus on whether there are specific marker sets you want to use in specific areas that might benefit the student in a particular instance. You'll note here that you can select upwards of 10; however, we generally recommend keeping it between three and six, and we'll share some client feedback on why that might be beneficial.
So with that, I want to transition. Now that we've heard about the different features, we want to share some perspective from our actual clients out in the field who have been leveraging GoReact and our AI Assistant. To start, I'm going to play two videos from the same client, Dr. Chris Widdall with SUNY Cortland in the state of New York. The first video is just a level set describing why they're using GoReact within their teacher education program, and then we'll transition to the second video, which is more specifically about the AI feature and some of the benefits they have practically realized by using it. So real quickly, I'm going to pull up the video, and don't worry, you don't have to see what's on the screen; I know it's a little blurry. You're mainly listening for Dr. Widdall's voice here.
Dr. Chris Widdall:
GoReact changed our supervision pathway, flat out and simple. There is a real problem with the high-need teacher shortage in the country, and GoReact enabled us to take supervision that was set in geographical locations, to be here, here, and here with supervisors there who could go to the schools, and it changed that. So now we have supervision in all of the counties in New York state, because it can be done with GoReact. And our students love the fact that they can go back to their home area and be marketable where they want to live, instead of having to do student teaching in Cortland when they live at the top of the Adirondacks and that's where they should be doing their student teaching. So it really has changed how we focus and how we think about teacher supervision now, and I believe it's changing our shortage needs. We are getting calls. Buffalo, for example, is a huge high-need district. They contacted us. I had five student teachers in Buffalo City Schools, and they all lived in the area. The closest supervisor was Rochester; that's a two-hour drive, and that's a lot of money out of the college's pocket too. They did all their supervision with GoReact, and I think three of them got hired just this year. So it's supporting high-need students with multiple view processes.
Matthew Short:
You can hear Dr. Widdall sharing that GoReact is allowing the clinical supervision portion of their teacher education program to occur throughout the state of New York. You're not bound to the locale of the college itself and some of the local school districts; you can have your students working in school districts closer to where they're coming from, and where they potentially want to return after they're done at college. So it allows that supervision program to occur throughout the state, wherever those students live or want to go. I like that context to help inform why they were using GoReact. Now let's switch to the fun part, where she talks about the GoReact AI Assistant and what they've seen from their usage in the fall.
Dr. Chris Widdall:
The AI Assistant said: when a student mentioned the Three Sisters, you immediately asked if anyone remembered what they were; instead, you could have first acknowledged the student's contribution by repeating or paraphrasing the answer. And she makes a mention: I agree with this point, but by stopping and reiterating the student's question, I could have shown better feedback skills. I want to point out that in my class, I have a feedback module that they have to go through to use AI well. Students need to know the instructional strategies AI is trying to help them learn better; if they don't know those instructional strategies, then of course they're going to think the AI is a little off its rocker, because they don't understand what is really being said there. And that is something I saw quite a bit, and that's an educational process, not a GoReact process. But then you'll see here, and I love this, they use thumbs up and thumbs down.
I have my students going through this; I do not have a set number, but they have to show me they're being reflective in their practice. And so she's got a thumbs down on the next one. You'll see, it said: you spent time waiting for students. Basically, the AI is saying you're really slowing down instruction, get things going. Well, she said, no, I disagree with this point; by stopping and utilizing the wait time, I was using good classroom management strategies, because she knew that was the behavior management she needed to use for her students. It's funny when they start arguing with the AI. I'm like, yes. And then I'm just going to escape here, and I'm going to let you see one other student example. I know maybe some of you have not had the privilege to see what it looks like, but here we have a student who is teaching kindergarten, and you'll see that she's using the AI all through her reflective practice. She is actually even telling the AI, yeah, I came up with this idea and this is how I came up with it. I'm like, they have a conversation with the tool. It's quite interesting.
Matthew Short:
And I want to hold up there just to make sure we're saving enough time for questions and answers at the end. But I loved that video showcasing a practical experience with the AI tool: Dr. Widdall sharing that even in those instances where the AI was potentially missing context or not providing feedback that was a hundred percent on the mark, it's still an educational opportunity for the student to demonstrate, hey, I know the content, I know the information that is necessary for me to make the decision I made at this particular point. And she's having her students actually reply to those AI Assistant comments and explain: I see why you may have noted this, but you're probably missing context; this was an English as a second language course, or there's a particular learning need for most of my students that I have to account for to make sure it's an effective lesson.
The opportunity to reply and engage with those comments is actually an opportunity for students to demonstrate that knowledge, and an opportunity for you as an instructor to see that students recognize, even when the AI is telling them something different: no, no, no, I know what I'm doing. I'm the educator, I am the teacher, I know what I'm doing in this instance, and this is why. So even in the instances where the AI missed the mark, students still get to shine and demonstrate their knowledge and expertise. I just love that as an example: the AI can provide salient feedback that a student can respond to, saying, yep, that's right, and I think I'll do this next time, or, hey, you're missing something and here's what you're missing.
So as we start to wrap up here, head towards the end of our presentation, and open up for questions, I also wanted to share some additional feedback that we're hearing from our users. I won't read the quote verbatim, but I do want to highlight this from Dr. Warming at the University of Florida. I use it as a nice highlight to emphasize those guardrails, those guidelines I emphasized at the beginning, for why we're including AI within our tool. You're seeing here saving time and student engagement, the key things we want our AI to provide to you and your students, allowing you to note those particular moments without instructors or mentors or university supervisors having to bear that load themselves. So I love this quote; I think it emphasizes very nicely some of the foci we are gearing our AI towards addressing. Dr. Co with National Louis University here was highlighting some of the features and benefits she's seeing with those analytics reports I showed you on that second tab: those soft skills, those durable skills, noting some of the communication practices, and also emphasizing the transcript. This is a teacher preparation program, so the ability to review the transcript afterwards is providing value to them as they're analyzing students' videos.
Our friend Dr. Widdall again, with a nice suggestion, and I love this context from our users, talks about using AI as an additional lens. We want to get as many perspectives as we can on student work so that students have a range of lenses to pull from. Here, Dr. Widdall is talking with an associate dean, who says, I wouldn't have all the time in my day to know all of the things that this AI is capturing for the student. So it highlights that the AI can share in that load and provide a different perspective beyond yourself. In a similar vein, I also like to highlight this feedback we received from an instructor at National Louis University, talking about how the feedback from the AI is so thorough that it's making them think a little more about making sure their own commentary counts, and also highlighting that this is a different lens, a different perspective, for the students to consider as they review their performance.
Also, as I emphasized before, even though our system will let you select up to 10 markers, sometimes less is more. Dr. Widdall noted a pretty funny instance where she turned on up to 10 markers and the student had 92 comments on their video, which, as my high school chemistry teacher always said whenever we looked like we'd absorbed as much as we could, means oversaturated. We want to avoid students feeling overwhelmed by the commentary, and instead establish particular focal points for lessons or activities by selecting specific markers that align with the goal or objective of those particular lessons.
And as we wrap up, a few other additional suggestions for our AI Assistant tools within GoReact. One is leveraging it as a playground or practice opportunity for students. You can create a GoReact activity with the sole purpose of telling students: record as many videos as you want for this activity and submit them; I'm not going to look at them, but the AI is turned on to provide you feedback that you can work from. This is a great opportunity for students to practice and get as much self-guided improvement or direction as they want without adding additional workload for you. You can focus on those other activities where you have specific learning objectives you're hoping to accomplish, while still providing students the opportunity to get additional practice outside of the activities where you're directly involved.
AI can also provide great examples of feedback for your students. If you have students who are struggling to provide salient or relevant feedback on their peers' videos or peer assignments, maybe pull up an example of AI-generated feedback that shows: it's identifying what the student did, it's noting, hey, this is something that's great and here's why, or hey, there's an opportunity for improvement here and here are some approaches you could take next time to improve in this area. So leveraging our AI to show examples of great feedback that you want students to model in their own activities, or maybe even in their own self-reflection, is a great use case as well. Also, since the AI feedback is offered virtually instantaneously, you can get quicker engagement from students with the AI Assistant in the GoReact Advance package.
We are noting that students are 60% quicker to engage with their videos, because there's something for them to look at right away. It's not dependent on another actor taking action, finding the time, or reaching that point in the day where they can look at those videos; the AI immediately provides them something they can take action on. We also strongly recommend engaging with the feedback within those AI Assistant markers and comments: thumbs up, thumbs down. Again, once you've noted whether it's good or bad, you can keep it, edit it, or delete it, whatever you feel is best for yourself or your students in those particular instances. So please make sure you're engaging with the commentary from our AI Assistant in the comment log.
And with that, let's get to some of the questions. I'm going to pull up the question tab here and try to address as many as I can in the time we have remaining. Let's see, from Casey Johnson: the transcript is at a higher level of membership than we currently have, so is AI also out of reach without upgrading our subscription? So these AI components I've emphasized here do come with our GoReact Advance package at this point. All of those features are a bundle deal, so in order to get any of the components I've shown, you do have to purchase the GoReact Advance package; you can't pull it a la carte at this point. Great question. Next one: is there a way to delay the AI Assistant until after the student watches the video and analyzes it themselves? Unfortunately, there isn't. The AI commentary will be loaded virtually instantaneously. There's a very small processing time, but once it has processed that transcript and noted those behaviors, that feedback is visible to the student and instructor, or if it's a peer activity, peers would be able to see it as well.
Does the video need to be recorded in GoReact, or can it be uploaded? It can be recorded in GoReact and it can also be uploaded, whether from a phone or tablet, and if you have a Zoom recording, that can be submitted as well. The only option where it will not work is the YouTube integration. So if you do have students who submit videos through the YouTube integration, where they record, upload it to YouTube, and then copy and paste the link for the submission, that's the only instance where the AI Assistant will not work, because we're streaming that file from YouTube and our AI is not processing the actual audio file. Any other modality you use to submit a video works with our AI Assistant. Great question. Next question: can I run analytics on the source video to demonstrate how they're delivering the message? I often use that as an example of effective delivery. With the Advance package, there's nothing I would say fits that particular bill. With GoReact, you can always download the video file from the activity itself if there are additional tools or processes you can run on a video file to do further analytics, but at this point, there's nothing beyond the analytics I've shown that you could use for additional analysis.
And next, for tagging multiple speakers: what if the recognition of the speaker is wrong? Can I edit that? At this point, you can tag the individuals that the AI has noted. If there is an instance where the AI incorrectly tags an individual, unfortunately there's not a way to say, actually this was speaker B, or actually this was Matthew Short; it is dependent on the AI parsing that out at this point. Now, we've seen pretty strong success with it tagging or identifying the correct individuals, so I'm not too concerned that it's a larger issue. I suppose if somebody's speaking pattern, cadence, or inflection were to change, if my voice cracked or I started doing character voices or something like that, maybe it would identify me as somebody else, but in most instances it is successfully tagging the individual who's speaking.
Next question: is the AI feature automatically enabled? If you do have our GoReact Advance package, the AI features are readily available now on every GoReact activity: the transcript will always be on, and the analytics will always be on. The only piece of our AI Assistant that you'll have to set up on each activity is selecting which of those markers you want our system to note. So if I go back to my screen here and back into the activity settings: in order for our system to automatically mark and leave commentary on those specific behaviors, you have to have enabled those markers in the AI Assistant activity setting. That is the only thing you have to turn on with the Advance package, and again, you can do this per activity. So maybe in one activity I'm focusing on pacing and engagement, and in others I'm looking at listening patterns, things like that.
So you can pick and choose different markers for different activities. One other thing I want to highlight with our markers: as you go through this, if you identify any common behaviors within your discipline that would be beneficial to have as a marker set or a programmatic marker, at the very bottom of the list there is a form where you can submit a request to our product team to create additional markers. I'm actually happy to note that within the next month or so we are adding additional marker sets, and eight of those markers are a direct result of feedback we are getting from clients who are leveraging this feature within our system. So as we hear common refrains from end users leveraging GoReact Advance, that hey, it would be great if your system were looking for this behavior, submit it with that marker request form, and as we receive those, our product team will work on adding those behaviors into the product as quickly as we can. Again, we want that direct engagement, we want your feedback. Great question.
All right, next question: if we have a rubric for assessing a homily, could you all somehow program or train the AI on those behaviors? That actually fits into what I was just recommending: if there is a behavior or something within your discipline that is not in this list in any way, shape, or form, submitting that create-a-marker request to us is something we can then take and see if it's possible to program or set up within our system. So again, I'm strongly encouraging you to submit that feedback in that particular area.
How likely is it that this tool will be able to use a framework rubric for teaching to conduct a video observation? This is where I would say that a lot of the behaviors we have programmed in here are more generic within their respective disciplines. They won't necessarily be aligned to specific evaluation instruments, teaching instruments, or rubrics that may be relevant in specific states or specific programs. Our AI is honing in on the generic behaviors within a discipline that all of our clients can benefit from. So if there's a very specific evaluation instrument you're using, there may be some behaviors in our list of AI markers that align with some of the standards or attributes contained within those instruments, but we are attempting to keep these more generically applicable.
But as we grow and receive more feedback and requests on this, it may provide opportunities for closer alignment to some of the behaviors we see that are common across a wide variety of evaluation methodologies and instruments. So again, not to be a broken record, but for anything you see that's missing or that you would like to see enhanced or improved, that create-a-marker request is definitely intended for that. Next question: are the AI interactions being fed back to the AI for training? No, not the interactions with the AI, beyond the thumbs up or thumbs down that we are receiving from you all. That engagement goes to our product team, who will then review the feedback that was given, like a thumbs down, and start to see what we're missing, what was lacking in that particular comment. That allows us to refine that behavior so that it gives more accurate or more germane feedback. So the AI itself doesn't necessarily get that feedback directly; it's more that our product team receives it and then works on course correction, adjusting or updating it to make sure it's giving better results going forward. Great question.
Does the AI only capture the auditory components of the lesson? At this point, our AI is completely dependent on the audio, the transcript that is being generated. All of the behaviors, all of the suggestions for improvement, are based on what is included in that transcript, and the large language model we're using then processes that and provides suggestions back to the students. So at this point, there's no visual component to the AI; if I were doing something with my hands or showing an actual physical practice, it would not be picked up in that particular instance.
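Since the AI works purely from the transcript, one way to picture the pipeline is a prompt that pairs the transcript text with the behaviors selected for the activity and asks a language model for timestamped praise and suggestions. The sketch below is strictly a hypothetical illustration of that flow under those assumptions; the prompt wording, function names, example marker names, and call_llm stub are made up and do not reflect GoReact's actual prompts, models, or APIs.

```python
# Hypothetical sketch of a transcript-only feedback pipeline.
# Nothing here reflects GoReact's actual prompts, models, or APIs.

def build_feedback_prompt(transcript: str, selected_markers: list[str]) -> str:
    behaviors = "\n".join(f"- {m}" for m in selected_markers)
    return (
        "You are reviewing a student's recorded lesson using only its transcript.\n"
        f"Look for these behaviors:\n{behaviors}\n\n"
        "For each behavior you observe, return the approximate timestamp, whether it is "
        "praise or an area to improve, and a concrete suggestion for next time.\n\n"
        f"Transcript:\n{transcript}"
    )

def call_llm(prompt: str) -> str:
    # Placeholder for whatever large language model actually processes the prompt.
    raise NotImplementedError

prompt = build_feedback_prompt(
    transcript="[00:01:35] Teacher: Does anyone remember what they were? ...",
    selected_markers=["Wait time", "Acknowledging student contributions", "Pacing"],
)
# feedback = call_llm(prompt)
```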
Can each student have their own set of unique markers, and would the system keep track per student so they don't have to reset each video? In each activity, our system applies the same set of behaviors you have selected to every student submitting a video for that activity. So if I had five students complete this activity, it would note the same behaviors for all five of those students. Now, I could set up different activities with different focal points and instruct my students: hey, if you feel you need improvement on listening skills or engagement, submit a video in this specific activity. That could be a way to tailor student submissions to specific activities that are selectively looking at a specific behavior, a workaround to address that. But to the main question you're asking there: if an activity is set with these six markers, every video submitted is going to be reviewed against those behaviors specifically.
We're getting close to the end, so I'm going to try to quick-fire through the last ones. How do I turn on the AI so students can interact with it? As I noted before, the transcript and the analytics page are always on; unless a video is submitted through the YouTube integration, the transcript and analytics will always show. The only one you have to enable is the AI Assistant, and that is through the activity setting, which, if you missed my click path before, is the little more menu option here at the top of the screen: click edit, go to the feedback settings, and it's right here under AI Assistant. Next question: can you clarify your answer about how to upload the video? If I wanted to submit a video for this particular activity and I get to the recording or upload options, if I use the native recording tool in GoReact, the transcript, the analytics, and the AI commentary marker tool will work with that.
If I upload a video from a separate device, like my iPhone (sorry, it's getting blurred out by my background), a tablet, or another digital camcorder, that will work with our AI Assistant. If you have a Zoom recording you want to upload, click on the button, find the Zoom recording, and upload it, or you can even connect it to your Zoom account. The only one where our AI Assistant will not work is if you use the YouTube integration, because it is streamed through the YouTube API; our system doesn't actually get the file in that instance, which is why our analytics can't pick up that video and process it. So: record, upload, Zoom, all good for our AI system.
All right, let me see if I can fire off the last two. Is it possible to adapt the AI markers to become discipline-specific, for example, AI markers for teaching history? Probably not that specific at this point. Our AI markers are more generic within a discipline, so for teaching a specific subject area, the markers currently aren't that targeted; they're based more on the teaching practices being demonstrated. But if there are more generic behaviors that could be beneficial across subject areas within a particular discipline, that could be an opportunity to submit marker feedback, and maybe that's something we can incorporate in the future. And last: is there a list of current AI markers? That is probably something we can work on sharing as a follow-up. There's not currently a list directly in our system that you could access without having the Advance package turned on. If you do have the Advance package, you could go into the AI Assistant list and see all of the markers available across all the disciplines, so that is one way you could see it, but I'll see if we can follow up with something we can share that describes all the behaviors. I also want to note we are constantly looking to add more, so hopefully that's a list that continues to grow as we move forward.
And with that, I've hit all of the questions in the queue and I've only gone three minutes over, which is actually pretty good for me. So I want to start to wrap up here. Thank you all so much for your time joining us today and making this an interactive presentation with lots of fantastic questions. We're grateful for the time you've carved out of your busy schedules to learn more about what GoReact's AI Assistant can offer, and we're happy to discuss it with you further if you're interested. We hope to see you at future GoReact webinars, and please have a great rest of your week. Take care, all.