An Intro to Using Video Assessment for Analytics (for Non-Data-Wizards)

A webinar featuring Dr. Patty Heydet-Kirsch from National University

Dr. Patty Heydet-Kirsch, Director of Clinical Practice at National University, explains how she captured data in GoReact to inform program design and progress.

Hillary Gamblin:

Hello. Thank you for joining today’s workshop on how to use student teaching videos to gather analytical insight into your programs. My name is Hillary Gamblin. I’m a GoReact employee and the host of The Teacher Education Podcast. And today, I’ll be chatting with Dr. Patty Heydet-Kirsch from National University in California. Patty, do you want to introduce yourself real quick?

Dr. Patty Heydet-Kirsch:

Sure. Hi, everybody. I’m Patty Kirsch, and for the last almost two years, I’ve been a clinical practice director at National University in La Jolla, California.

Hillary Gamblin:

We’re all jealous. That’s for you. That’s for you.

Dr. Patty Heydet-Kirsch:

Don’t let the winter scene fool you.

Hillary Gamblin:

Thank you so much for joining us. For those of you that are new to our GoReact workshops, let me quickly explain how we structured these virtual events so you know how things are going to go.

Hillary Gamblin:

For the first 30 minutes or so, I’m going to discuss with Patty how she’s used her unique skillset in analytics to gather program insight at National University. After that, we’ll do a Q&A; depending on how much time we have, about 10 to 15 minutes. We strive for high audience participation, so please submit a question if you have one. And if you see a question someone else has posted that you’d like answered, there is a handy upvote feature so you can vote for questions to be asked.

Hillary Gamblin:

Also, please don’t forget to use the chat feature. I know in Zoom, you need to click and have it pop out, but this is one of the most exciting places in our workshops. It’s where attendees exchange contact information so they can connect after the workshop. They share resources and ideas; a lot happens there, so please don’t miss out, and make sure you’re part of that chat.

Hillary Gamblin:

Now that we have a basic outline, we’d like to start off today’s workshop with a quick poll to get to know everybody a little bit better. The question is, “How often do you use video assessment software to gather and evaluate data?” The options are “Never,” “I’ve considered it,” “A few times,” and “Regularly.” We’re hoping to get a good idea of the audience we have here and who we’re talking to. I’ll give you a few minutes to do that.

Hillary Gamblin:

Interesting. It’s a pretty even split. That’s fantastic. Well, as a preface for those who are joining us: if you’ve never done it, you’ve just considered it, or maybe done it a few times (it sounds like the majority are in those camps), you should be happy to hear that you don’t need a degree in analytics to get straightforward information about your program. That’s why we gave this workshop the friendly title “for Non-Data-Wizards.”

Hillary Gamblin:

The truth is, just using the tools already at your disposal, like video assessment software, can provide a straightforward way to discover new insights. So, Patty will share exactly how she used GoReact to gather program insights, and by the conclusion of the workshop, we’re hoping that you’ll be able to replicate it or do something similar in your own program. At the very least, we hope that you leave today feeling empowered to start asking questions and gathering some of your own data.

Hillary Gamblin:

So, let’s get started. Patty, what program needs did you initially want to address with the video assessment software your program adopted?

Dr. Patty Heydet-Kirsch:

The story goes that I came on board right in the beginning of COVID so it’ll be two years in April. And when I came on board, a lot of universities were deciding not to host any student teaching or internships in the fall because they did not have a way to monitor their students in the classrooms or in whatever clinical experiences they might be doing.

Dr. Patty Heydet-Kirsch:

And luckily for me, our university was already looking at GoReact, and I quickly jumped on board to see what it could do. And in speaking with some of the people from GoReact, we tried to piece together again in, basically, May, June, and July, how this tool might be able to save us in the fall as other universities around us were just saying, “Nope, for fall, we’re not going to even go there. We can’t figure this out.”

Dr. Patty Heydet-Kirsch:

So, I feel like in taking a really good look at what the capabilities were, in a rush, we put more, I guess, focus on it than you might do with any other tool you ever purchased in your life, or my career, at least, and looked at, “Okay, how can this solve other problems for us?”

Dr. Patty Heydet-Kirsch:

At the same time, we were within a year and a half of our accreditation so we knew we had a lot of different data points that we couldn’t let drop. So, we couldn’t let those experiences in the classrooms or in the clinical settings be something that was not recorded and not part of a database. And that drove our next couple of decisions, to be honest with you.

Dr. Patty Heydet-Kirsch:

It was a crisis mode, to be honest. This was one of those serendipitous occasions where a tool ended up coming to our rescue, and I don’t think even those at GoReact realized how much we needed this at the time that we did.

Hillary Gamblin:

Well, that’s what we like to hear.

Dr. Patty Heydet-Kirsch:

Yeah.

Hillary Gamblin:

To start off: when I hear the words analytics and metrics, it makes my palms sweat because I don’t know how to do it. And so, maybe other people here also feel the same way, but it doesn’t need to be like that.

Hillary Gamblin:

You use a straightforward and effective method to gather data. As we were preparing for this workshop, that was the one thing that struck me. I’m just like, “Oh, that’s what you need to do. That would work.” So, can you share with us how you began gathering data?

Dr. Patty Heydet-Kirsch:

I did… Again, being new to California, it was interesting for me because I had to learn the state standards and, again, what they were looking for.

Dr. Patty Heydet-Kirsch:

I’m using the same GoReact that everybody else is. And if you look at… In one of our systems, it would look like this, where on this side over here, you can see the names are blocked out, but you can see that it’s different people. The M is a student, the P is an instructor.

Dr. Patty Heydet-Kirsch:

I wanted to make sure that we were using it for really good feedback and feedforward, which is how GoReact was packaged for us, whether this can be a feedback tool, this can really help people improve. So, as they were watching a video, we started to coach, “How many times should you annotate? What should you do?” We started working with students and faculty on what that might look like. And then we used it not only as a tool for coaching our students, but then also for coaching some of our adjunct faculty and faculty that work in clinical practice.

Dr. Patty Heydet-Kirsch:

We didn’t add in the tags here. These are our six basic TPEs, teaching performance expectations, here. We didn’t add those in until the second term, knowing that we needed people to kind of get their feet wet a little bit. But the number one “aha” for me was, “Wait a second, there’s a rubric attached here.” And a rubric means that beyond just being between the instructor and their student, and beyond being able to download this data and look at it qualitatively if we want to, or use it for training purposes, with a rubric, there’s a whole other element that opens up. And that element for me was, “That can be a database.”

Dr. Patty Heydet-Kirsch:

If we set it up the right way, in a way that nails all of our standards and is usable, this could then be the database we’re looking for, and we’ll no longer be collecting paper versions or electronic versions from around the state, either in an LMS or in some other way where somebody has to put it all together and aggregate it. If we do it in GoReact, they’ll be able to share with us by term, or whatever increments we want, what that data is. And then I can analyze it and give it back to departments, faculty, and stakeholders as usable data. And that was the goal: when you look at accreditation, sure, you want to nail your standards, but beyond that, accreditation bodies change their rules over time. We all know that. And I want it to be usable. I don’t want to spend time doing something if people aren’t going to look at it and make decisions based on it, or at least see the value of it.

Dr. Patty Heydet-Kirsch:

So, this is what our screen looks like. Any questions on… I’m sure yours looks the same, but we are using all features. You can see at the top here, it shows this person had 35 different annotations. The rubric was scored. You can see the different comments. Any questions before I move to this piece?

Hillary Gamblin:

I think we’ll take questions at the end-

Dr. Patty Heydet-Kirsch:

Okay. Perfect.

Hillary Gamblin:

… [crosstalk 00:08:20] but, I have a question about the rubric, because as you were saying it, the rubric was kind of the “Aha, this is where we can really gather data program-wide.”

Hillary Gamblin:

I’m sure it required some thoughtful consideration about how you’re going to create that rubric. So, what did you consider when you created that master rubric and how did you set it up?

Dr. Patty Heydet-Kirsch:

Coming in and working with five different departments and five different sets of needs, the levels were all over the place. Some were using a three-level rubric, some four, some five. And so, we took a look at two different departments: how can we make this as simple as possible, knowing that, unfortunately, no matter what we choose for the levels, the criteria down the left side will give us 45 standards? And that’s a lot. But I was coming from a state where we were using [inaudible 00:09:12] with well over a hundred, so I felt like, “45? We can do that.”

Dr. Patty Heydet-Kirsch:

But to make it easier, rather than repeat these words, we thought, “What if we did a really good training on what the words integrating, applying, developing, and emerging mean? And what if we actually used the zero level, not observed, to our advantage?” What if we had that? It’s not a score, but we used it as a level. And we trained our faculty that this rubric would not be scored for point value, so that students would not have a percentage on this; it would be a completion. It’s a skills rubric, which was a completely different mindset for some of our faculty. They were used to a grading rubric.

Dr. Patty Heydet-Kirsch:

So, that was a coaching experience for us to do with… If you really want students to improve, it’s okay to score them in a one and a two, and even a non-observed zero, which will make their score lower. But it’s not about the score. It’s about the content of the rubric itself and about the feedback the rubric is giving you. The score is absolutely not something we’re going to look at in these particular observational assignments.

Dr. Patty Heydet-Kirsch:

And the way we set it up was at least two in every clinical practice course, which gets us to between four and eight observations, depending on which program you’re in, throughout clinical practice. And we wanted to then be able to see that growth over time from that first observation to the second, to the third, to the fourth, and on. And in doing that, we really needed a rubric that we all agreed we could coach. So, the words themselves are almost memorized by our faculty at this point. They know what integrating looks like. We’ve had all kinds of workshops. But that part stays constant. And then the part that doesn’t is the TPEs themselves.

Dr. Patty Heydet-Kirsch:

So, we give them a line for TPE 1.1. We give them what it is. And then they score it as integrating, applying, and so on. But we don’t repeat those levels each time, so it keeps it a little bit cleaner. And people tell us, “The first time I did it, it took me a little time to get through it. But the second, third, fourth, it’s so easy now.” Because the TPEs, they know; they teach them. And then the rubric levels have stayed the same now for a year and a half, so that’s something that is a known value.

Dr. Patty Heydet-Kirsch:

And what this helped us do was amazing to me because once they get through it and they are able to score those zeros, we got some interesting things. This is how it looks to them, just a PDF, in case they don’t want to go to GoReact. This is what it looks like when we share it in our foundation pieces.

Dr. Patty Heydet-Kirsch:

So, it looks like this. And we did have to make a little bit of a note here for our special ed department, which had a little different twist, but we wanted to use the same rubric if we could. They share a lot of courses. And each department uses the same model with some different standards at the bottom. And they may have altered some of the words at the top, the leveling language; it might be a little bit different, but we tried to stay with that same model. And I guess, because we were in a rush, or just because I’m working with really nice people, everyone said, “Yup, that looks great. Let’s go for it.” They were just so excited to be able to, again, deliver our courses the way we wanted to in that fall of 2020. That was our goal, not to skip a beat.

Dr. Patty Heydet-Kirsch:

So, here’s the data. And this is the important part for me: once we got the data and we looked at it in aggregate, it was really cool that I could see the overlap. Observation one, in dark blue, is candidate-scored. So, the candidates themselves would go in and score themselves on the same exact video. Observation two is in the gray, and then observation three is the light blue. Along the bottom are all the TPEs listed, and you can clearly see where we have some work to do. And again, we were happy. Everything was at least above a 2.5. But you can see the growth very clearly.

Dr. Patty Heydet-Kirsch:

But I loved looking at where these weak spots are, like this one here. All three groups are saying this is the hard one. So, we looked at what the language looked like, and it turns out that’s hard to see in a video. So, that informed where we could see it, where we can assess this better, and led us to, “Let’s allow the lesson plan to be part of this submission.” So, as they speak to TPE 2.4, we can still judge that and make that part of our plan. Same thing over here, where we see some that are really high. And then some that… Again, this 5.6, 5.7, 5.8 are so low. So, this was actionable data for us. And it was the first time we had an overlay of the three different levels. Again, it’s almost a pre-test/post-test design from the beginning of this course to the second one. When they move into the B course, we can overlap both courses and see how students or candidates assess themselves as well as how the faculty assess them in their work.
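[Editor’s note: the aggregation described above, averaging rubric scores per TPE and observation and flagging criteria that stay low across every observation, can be sketched in a few lines. This is a minimal illustration with made-up scores and hypothetical column layout, not GoReact’s actual export format.]

```python
from collections import defaultdict
from statistics import mean

# Hypothetical export rows: (TPE, observation number, rubric score).
rows = [
    ("TPE 1.1", 1, 3), ("TPE 1.1", 2, 3), ("TPE 1.1", 3, 4),
    ("TPE 2.4", 1, 2), ("TPE 2.4", 2, 2), ("TPE 2.4", 3, 3),
    ("TPE 5.6", 1, 1), ("TPE 5.6", 2, 1), ("TPE 5.6", 3, 2),
]

# Average score per (TPE, observation) to overlay growth across observations.
buckets = defaultdict(list)
for tpe, obs, score in rows:
    buckets[(tpe, obs)].append(score)
averages = {key: mean(scores) for key, scores in buckets.items()}

# Flag TPEs that stay low across every observation: the "weak spots."
tpes = {tpe for tpe, _ in averages}
weak = sorted(t for t in tpes
              if all(averages[(t, o)] < 2.5 for o in (1, 2, 3)))
print(weak)
```

With real exports, the same grouping by TPE and observation produces the pre-test/post-test overlay chart she describes.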

Hillary Gamblin:

That’s so beautiful. I love that.

Dr. Patty Heydet-Kirsch:

I do too.

Hillary Gamblin:

It’s so pretty. I’m sure there’s so much work that goes into that one image.

Dr. Patty Heydet-Kirsch:

There is. But each course got this same type of a glimpse at what their course looked like at this particular point in time. And we did that just for the fall data, and that was fine. And then a spring data catch, a fall data catch. So, we can see over time if the changes we make influence these data points as well. And as all of you know, standards are very specific and they do change. Right now, we are lucky we’re in the same group of standards for the third go of this rubric, because I love trend data, and when standards are always fluctuating, or even changing slightly, it alters what you see.

Hillary Gamblin:

That’s true. That’s true. You mentioned this a little bit, but I was hoping you could dive into it a little bit more: how did you prepare or coach your faculty so they were consistent and comfortable using the rubric, especially as you were in, as you said, crisis mode?

Dr. Patty Heydet-Kirsch:

Yeah, and that was a really hard lift for our faculty. And I want to say that along with the faculty who are helping with the training, the part-time faculty or adjunct faculty who were joining on board were very eager to be part of a solution.

Dr. Patty Heydet-Kirsch:

And I think if you think back to how we all felt that summer of 2020, any little glimmer of hope was something you wanted to attach your wagon to. And I feel like we were very fortunate that we had people that even though this was more work and a lot of training, they were very willing to coach their students virtually. And for many, that was the first time they had done something like this virtually.

Dr. Patty Heydet-Kirsch:

But we also added in a professional learning community as part of their coursework, to make them get together and talk about their own data and their own videos. They could share them, which had not been done prior to this. They had not had a tool that enabled them to share the videos in the same way and to offer each other suggestions.

Dr. Patty Heydet-Kirsch:

And we tried to set it up like if you’re in a grade level meeting K through six or if you’re in a department meeting in secondary ed, this is what you do. This is how you sit with your colleagues and talk about the issues you’re having or how to do a lesson better next time. You look for what tools they might be using in their classroom that you could implement to help some of the students in your classroom. So, they would watch the videos and coach each other, which was really nice to see, and modeling that collegial behavior that you would have once you’re employed.

Dr. Patty Heydet-Kirsch:

So, we tried to really use this in that way as well, and that was also something that we had not necessarily planned ahead for, but the tool was there and we thought, “This is a great opportunity to pull this in.”

Dr. Patty Heydet-Kirsch:

We’ve also got some places for coaching, where we had areas for concern, and, Hillary, the downside was also there. We had places where people did not score at all, just didn’t score.

Hillary Gamblin:

[inaudible 00:16:34].

Dr. Patty Heydet-Kirsch:

Yeah, holes. Places where we didn’t have homogeneous data. That’s my biggest fear, where everyone just scores a four and it’s a straight line across the top and, “Oh, everything’s good. Let’s go back and then we can all retire. Everything’s good.”

Dr. Patty Heydet-Kirsch:

But you can see there’s lots of places here in observation two. Here, there’s a hole. Here, there’s a hole. People, for whatever reason, decided not to score a rubric, and we had to go back and say, “Remember, we want you to score zero.” And the little, tiny bit of analytics I did there is I pulled off the zeros completely before we did any of the aggregating. So, I had a count of how many times rubrics were not scored against the whole. And that told me where the coaching needed to happen. Why aren’t they scoring that? Why are they afraid to score a zero? And it came back to, “Oh, I hate it when it’s worth 180 points and students only score 90.” They freak out. Again, coaching opportunity. Talk to your students about a skills rubric. They should be using skills rubrics in their classrooms. It’s not a grading rubric. It’s about coaching those skills and being honest with students, and letting students be honest with their own performance about what skills they need to work on. It’s not a grade.
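[Editor’s note: the bookkeeping described here, separating true “holes” (unscored criteria) from explicit “not observed” zeros, then pulling the zeros out before aggregating, can be sketched like this. The data is hypothetical.]

```python
# Hypothetical rubric scores for one criterion across submissions:
# None marks a hole the rater skipped entirely; 0 is an explicit
# "not observed" score.
scores = [4, 3, None, 0, 2, None, 0, 3]

unscored = sum(1 for s in scores if s is None)       # holes to coach on
zeros = sum(1 for s in scores if s == 0)             # explicit "not observed"
usable = [s for s in scores if s not in (None, 0)]   # kept for the aggregate
average = sum(usable) / len(usable)

# Count of unscored rubrics against the whole tells you where coaching
# is needed; the average is computed with zeros pulled off.
print(unscored, zeros, average)
```

Keeping the two counts separate is what lets you ask “why aren’t they scoring it?” as a different question from “why is this TPE hard to see?”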

Dr. Patty Heydet-Kirsch:

So, that happened all fall and all spring. And we started to see them more able and more willing to use that zero value, which helps us a lot to see, “Is it true that 5.6, 5.7, and 5.8 will never be seen in a video? And if so, where in the program do we want to assess that?” Because the state wants us to assess it and it needs to be assessed, and needs to be assessed more than once. So, we had some decisions to make, and that was an area for concern that we talked about quite a bit.

Dr. Patty Heydet-Kirsch:

We also had some actionable findings. Again, going back to that percent of zero scores. Observation two for those three TPEs I’ve mentioned is here, and you can see what they are. It’s a hard one to see in a video. It’s definitely hard. But you could see it in a lesson plan, perhaps. And the percent here, in this section of observation two, lots of zero scores. Here, there were fewer. And the reason was we started working with the teams of our university’s support providers who teach the courses: “How could you see this?” And they shared examples. They said, “Oh, well, what I did with my students was, during the PLC, we talked through that.” So, I knew that they had met that 5.8 and then added it into their lesson plan or had documentation of that.

Dr. Patty Heydet-Kirsch:

So, when that supports the video, we’ve nailed it. And we were able to solve that issue. And that was huge for us, because finding solutions together, again, in a virtual world, is difficult. And yet, it turns out in some cases, when you get good people on a call, they have lots of great ideas.

Dr. Patty Heydet-Kirsch:

So, we used even the bad data or I’d say the data that nobody wants to talk about to inform our practice and help us to solve that problem together.

Hillary Gamblin:

Well, it sounds like the bad data was probably something that you needed to talk about the most.

Dr. Patty Heydet-Kirsch:

Absolutely. Absolutely. Absolutely. And I think that’s what people are afraid of, which is why you do see that homogeneous “everyone scored a four.” And they take it personally. “My students don’t score well. Is that a bad reflection on me?” Absolutely not. Again, skills; it’s about the skills. And the students need to work out how to demonstrate that skill. We’re a TPA state. So, they have videos to exit the program and be credentialed. We need every opportunity for them to analyze their own work, and using this tool has really helped us.

Hillary Gamblin:

Fantastic. You said the faculty were all on board, but it did take some time for them to get used to the skills assessment and not feel bad about giving a two or a one, or a zero. How did the students react to that kind of a rubric?

Dr. Patty Heydet-Kirsch:

It’s funny you mentioned that. We had so many opportunities where we were… Again, everyone would probably tell a story of the one student that needed help, but overall, students intuitively can do this work. And even when they went to score themselves, you saw the graph, they scored themselves pretty much the same as the instructor might have. Sometimes they’re a little bit harder on themselves.

Dr. Patty Heydet-Kirsch:

But the student experience is easy. And the scoring of both the rubric and annotating on their own, we don’t get questions about that. Where they sometimes struggle is if they’re not a social media user, so they don’t know how to upload a video. But the coaching, again, from GoReact’s side is awesome for that. And then just getting the word out to our professors: if they ask you this question, please have them go here. And that took a little while, but now that’s something we don’t deal with much anymore either.

Hillary Gamblin:

Oh, no thank-yous yet.

Dr. Patty Heydet-Kirsch:

No, no. That’s the end of my slides. I wanted to go to… That’s it. I don’t have any [crosstalk 00:21:30].

Hillary Gamblin:

I have a few more questions for you if you don’t mind, before we go to the Q&A.

Dr. Patty Heydet-Kirsch:

No, of course.

Hillary Gamblin:

I know that when we were preparing this workshop, one of the things you mentioned that was really surprising, for me and for you, is that while the rubric was the main thing you kept focusing on, feedback also became an important key. Even though you weren’t necessarily collecting the feedback data with that rubric, it was key to making sure that your data was consistent. Can you explain that correlation for people?

Dr. Patty Heydet-Kirsch:

We were trying to get at inter-rater reliability, and that’s a tough one. And everyone struggles with this, especially with a new tool, not knowing exactly where you’re going. But the cool thing here was we were able to extract from the annotations some of our best examples of feedback and feedforward. In other words, feedback about what you did in the video, but feedforward with really good suggestions from those experts we’re all hiring that are doing these assessments of candidates, whether they’re retired principals or your full-time faculty. That feedforward about what you might do next time kind of coaches them: “What does a true reflection look like?” It’s not just looking backwards, it’s planning forward as well. So, we’re coaching that.

Dr. Patty Heydet-Kirsch:

But we did find some that had been using what I call the old, 10-years-ago accreditation language: “Video fails to…” “Student does not demonstrate that day.” And it was all this deficit language, consistently, in the annotations. So, we ran several different sessions where we said, “Please, let’s not use that kind of old accreditation language, where all you’re doing is pointing out the errors. Let’s try to use some more encouraging language.” And we did a couple of workshops where we extracted the good examples and the non-examples, changed them just slightly, and coached, “What does it look like to give feedback and feedforward to a student? And what does it look like to point out the things that could be better without using deficit language?”

Dr. Patty Heydet-Kirsch:

And let’s talk about it: the SEL piece during COVID was absolutely front and center for us. Students were already nervous enough. Many were teaching online and didn’t feel comfortable doing that, and some were in classrooms when they didn’t necessarily want to be. So, to get feedback that was too critical, or couched in language that sounded as if you didn’t do a good job at all, was just so hurtful. So, we coached people directly and then we coached people in groups, and we were able to use the data as the springboard for the coaching sessions that we did. And we created our PowerPoints and some of our sharing with those examples, which was awesome, because no one could see themselves, whether it’s good or bad. We changed them just slightly. But I think it was like, “Oh, I do remember getting that sheet years ago that had all those beginning phrases of ‘Video fails to demonstrate that…'” That’s not what coaching looks like on this tool. We don’t need that.

Hillary Gamblin:

That’s so fascinating, because you looked at the before and after. And so, they’re getting feedback in between there. And if the feedback they’re getting uses negative language, it will affect the score differently than feedback using positive, reinforcing language.

Dr. Patty Heydet-Kirsch:

And even with the rubric itself, we left room at the end of each TPE, which has approximately six different indicators within it, a place to add some language. And that’s where we found people who were really doing a great job at feedforward.

Dr. Patty Heydet-Kirsch:

They were saying, collectively, in this particular performance expectation, “We can see, or I can see, that you are really working towards improving students across the board, or working with all students. And the two students that did not speak during this piece, let’s talk about those strategies at our next PLC.” So, they were giving them an “I can help you with this,” that positive feeling like, “We can do this together.” Which is such a different way than you speak to people when you’re in person, because you’re in a hurry, you’re filling out all those forms. That was the way I used to do it. And I was so busy trying to talk to the teacher, talk to my student, fill out the form. This is a completely different way to work.

Hillary Gamblin:

So, just to review, I’d like to invite you… We’re going to do the Q&A in a little bit, so if you have questions, please start submitting those.

Hillary Gamblin:

But I have a few more questions for you. Big picture, by using GoReact to gather this data, what were some of the most important insights that you gained about your program that you may have not seen otherwise?

Dr. Patty Heydet-Kirsch:

I think in some cases, while we were celebrating the fact that we didn’t have to tell anybody they couldn’t have a placement, whether it was online, mixed mode, or in-person (it was fluctuating in California), the biggest, I guess, warm and fuzzy for me was that when we were hearing students elsewhere couldn’t have a placement and were delaying their career or their credential, we didn’t feel like we were dealing with that problem like other universities might have been.

Dr. Patty Heydet-Kirsch:

But the “aha” for me was that you don’t have to have a tool that just does your assessment needs, and you don’t have to have a separate tool that just responds to student needs or faculty growth needs. And, again, I did not choose this tool. I’m going to give kudos to the team before me that had already chosen this; I walked into this.

Dr. Patty Heydet-Kirsch:

When you find a tool that you can use for all those things, it’s awesome: the value of having the coaching for your faculty; having the student feedback and feedforward, and the ability to practice before they have to submit a CalTPA; and, added to that, the accreditation pieces that I had, all the graphs and data and charts that I could possibly need, which were very clearly showing we are doing a great job with the pre-test/post-test design. My next step is to correlate that with their CalTPA experience, which probably will happen in the spring, and be able to say, “Okay, can we predict? Can we decide what pieces they’re not able to do, and can we track that back into the program?”

Dr. Patty Heydet-Kirsch:

So, in terms of analytics, the data needs to line up, but I want to have the trend available to then triangulate where else we can go. And that’s the fun for me. It’s just a big puzzle.

Hillary Gamblin:

You must be a puzzler.

Dr. Patty Heydet-Kirsch:

I am. I am.

Hillary Gamblin:

My last question for you before we start the Q&A is, for those that want to start gathering data like this, and they’re a little unsure where to start, what should they do? Can you give them some basic pointers on where to start and maybe what to do in the middle and how to finish? Just some basic outlines so they know how to do this themselves.

Dr. Patty Heydet-Kirsch:

The main thing for me is setting up a really good rubric and doing your homework, teasing it out. Maybe even building a rubric, scoring it a few times, and then looking at what the data might look like. And then, the major thing for me was we did an Excel spreadsheet with every program, every course: where a student is scoring it, where a faculty member is scoring it, where is our true pre-test, where is our benchmark in the middle, and then where is our post-test. And that’s behind the scenes.

Dr. Patty Heydet-Kirsch:

But setting that up… I mean, the people teaching the courses don’t know that. But behind the scenes, it was the design of the pre-test/post-test so that the analytics come together. But then creating that rubric: we did not have a choice, state standards are given to us and they’re, like I say, lengthy. But you can’t pick and choose. You have to assess them all. So, let’s build a rubric that’s usable and make it the same exact touch over and over again so that faculty will become so accustomed to it, it won’t seem so cumbersome. I think it’s all about the rubric, if you want data.
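[Editor’s note: the behind-the-scenes spreadsheet described above, mapping every course to who scores each observation and where the pre-test, benchmark, and post-test sit, can be sketched as a simple structure. Course names and assessment points here are hypothetical.]

```python
# Hypothetical map of where each observation sits in the pre/post design.
# In practice this lives in an Excel spreadsheet, one row per course.
design = {
    "Clinical Practice A": [
        {"obs": 1, "scored_by": ["candidate", "faculty"], "role": "pre-test"},
        {"obs": 2, "scored_by": ["faculty"], "role": "benchmark"},
    ],
    "Clinical Practice B": [
        {"obs": 3, "scored_by": ["faculty"], "role": "benchmark"},
        {"obs": 4, "scored_by": ["candidate", "faculty"], "role": "post-test"},
    ],
}

# Sanity check: the design needs exactly one pre-test and one post-test
# so the growth analysis lines up across courses.
roles = [slot["role"] for slots in design.values() for slot in slots]
assert roles.count("pre-test") == 1 and roles.count("post-test") == 1
print(sorted(set(roles)))
```

Writing the design down this explicitly is what makes the later aggregation meaningful: every score that comes out of the rubric already knows which slot in the pre/post sequence it belongs to.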

Hillary Gamblin:

Perfect. The insight that you were able to get just using this one tool is extraordinary. Thank you for answering my questions.

Hillary Gamblin:

I’m sure your ideas and your slides, and the questions that we’ve discussed for those listening have inspired some of their own questions or ideas so we’re going to take the next 10 to 15 minutes to do that Q&A. Again, if you haven’t submitted a question, you still have time. Don’t worry.

Hillary Gamblin:

So, we are going to go through that. Let’s see. Okay. There are two open questions to start with. The first comes from Dr. James. Okay. I’m not going to try pronouncing your last name. I’m sorry, James. “Could captions be added to the videos in order to address language areas?”

Dr. Patty Heydet-Kirsch:

We have not dealt with that. We are just beginning a bilingual authorization. So, yes, you could and that would be awesome, if that’s the way you’re going.

Dr. Patty Heydet-Kirsch:

I know that we had one student that was able to do some of the annotations, talking about what else was going on to explain that when her language was different, teaching in a bilingual setting or working with students in her native language. Is that kind of where you’re going?

Hillary Gamblin:

I don’t know.

Dr. Patty Heydet-Kirsch:

Okay.

Hillary Gamblin:

James, maybe if you feel like that’s answering the right question, maybe put that in the chat so we can see if we’re getting at what you want us to cover here.

Dr. Patty Heydet-Kirsch:

We have not launched that program 100% so I can’t answer what we’re doing to set that up, but I know that it is available. There he is. Okay.

Hillary Gamblin:

Okay. Okay. Fantastic. The next one is from Lisa. She says, “I think you mentioned that candidates are required to pass a pedagogical performance assessment for licensure in your state, which PPA do they need to take and how does your video assessment process prepare candidates for that assessment, or how does it align with performance standards?”

Dr. Patty Heydet-Kirsch:

So, the performance standards that we use, the TPEs in the state, are crosswalked to the CalTPA exam, which is a Pearson exam. It’s one of the derivatives. Most of us have some form of that in some way. So, it’s already aligned for us, which was great, if you assess all TPEs, which was the kicker. So, that was, in building that rubric, “Can you leave some out? Not really.”

Dr. Patty Heydet-Kirsch:

But that’s where I’m saying that once we have the CalTPA data from this group of students, we can crosswalk it back and see, “Did we prepare them with these multiple opportunities to do video, analyze their video feedback, feedforward?” And then when they do their seminar class, they do a completely different level of, “What does a reflection really look like?” But we’ve built in how to reflect into the early clinical practice courses so that as they’re doing their videos or choosing their videos for their CalTPA submission, which are very short, they understand what people are looking for. And they understand what the language looks like, not just what the language says.

Dr. Patty Heydet-Kirsch:

And I think that’s where you have… If your exit involves a video, you have to have your candidates practicing video from the very beginning. And especially now, I think, with a lot of us not being face-to-face yet or ever, the video becomes an important tool. It becomes the only tool to really help your students learn how to work this way.

Hillary Gamblin:

The next question is from Carol and she says, “Can you please talk about the amount and type of training you provided for your coaches?”

Dr. Patty Heydet-Kirsch:

It started out with basic webinars, where we’d walk through and show them. And then we’d share a PDF of the slides we did with almost a cheat sheet and how to do it. I worked with a team of people who put together several different job aids for each department with, specifically, if you’re spatted, “The courses say this. This is where you will go. Click here. Click here.”

Dr. Patty Heydet-Kirsch:

Once we got past that, the training to use the rubric was done in the monthly clinical setting meetings with different groups of people, whether they were the advisors, whether they were the people that are teaching the courses, the faculty directly.

Dr. Patty Heydet-Kirsch:

And from there, after we had… I felt a pretty good rhythm going and most people were doing things effectively, and the first data set came in and it was looking pretty good, from there, we did a survey. And our survey was called the survey of needs. And we listed all different kinds of things that they were asked to do, that were not necessarily in their wheelhouse over the last eight months, and again, I was brand new, so I was saying, “Help me help you. What do we need?”

Dr. Patty Heydet-Kirsch:

And the survey of needs came back. Over 80% of our faculty had responded to it, our adjunct faculty. Hands down, “Technology, I want help. I don’t feel like I’m good enough. I don’t feel like I’m using it the right way. I’m just doing the basics. I’m following the job aid. I’m not going outside that comfort zone.” So, every month, twice a month for about an hour, we did drop-in sessions, technology drop-ins, and that’s been awesome. It’s a PLC, basically, for adjunct faculty. And one will say, “I can’t figure out how to do this in Brightspace.” It’s hardly ever about GoReact. It’s about everything. And some, “Oh, I solved that problem. Let me show you.” And they’ll share their screen and show it. So, we started this collegial group of… Everyone has tech questions. In the middle of all of this, I should mention, we changed from Blackboard to Brightspace to just confuse people, for real. Because that’s what we do in higher ed, right?

Dr. Patty Heydet-Kirsch:

And the survey of needs came back. Over 80% of our faculty had responded to it, our adjunct faculty. Hands down, “Technology, I want help. I don’t feel like I’m good enough. I don’t feel like I’m using it the right way. I’m just doing the basics. I’m following the job aid. I’m not going outside that comfort zone.” So, twice a month for about an hour, we did drop-in sessions, technology drop-ins, and that’s been awesome. It’s a PLC, basically, for adjunct faculty. And one will say, “I can’t figure out how to do this in Brightspace.” It’s hardly ever about GoReact. It’s about everything. And someone else will say, “Oh, I solved that problem. Let me show you.” And they’ll share their screen and show it. So, we started this collegial group of… Everyone has tech questions. In the middle of all of this, I should mention, we changed from Blackboard to Brightspace to just confuse people, for real. Because that’s what we do in higher ed, right?

Dr. Patty Heydet-Kirsch:

And so, there was a little freak-out going on with people, “I just learned this system, and you’re changing it?” And it’s so similar. This is my fourth LMS. But nobody wants to hear that when they’re an adjunct faculty member dealing with, again, the stress of COVID. So, we had just the nicest people that would come to these sessions and say, “I am so confused.” And they’d tell their story, and I could jump in sometimes, but more often than not, it was a colleague who would jump in and say, “I dealt with this last week. I called the helpline. Here’s what they said to do.” And they’d share it.

Dr. Patty Heydet-Kirsch:

So, the training became two-fold. It was training that we knew they needed and then training that we asked them what they needed, and we supported that. We’re trying to call it a wraparound support system. I’m not sure everybody feels that way. But if they attend those drop-ins, it’s a safe space. Ask whatever you want. And they’ll say, “This is my seventh time asking this. I’m still unclear.” No problem. Let’s go back to the beginning and let’s solve it together.

Hillary Gamblin:

It makes sense that you, with your background in analytics, would think to run a survey halfway through and ask, “What do you need?” That’s something that you need to do, but I’m sure a lot of people don’t think about that. Halfway through the program, you need to know, is it working? What are the gaps?

Dr. Patty Heydet-Kirsch:

You do. And it wasn’t evaluative, which was good. We published the data. We shared it with all those stakeholders and showed that, “Here’s what you told us. This is what you need.” We shared the graphs, the charts, but no evaluation, completely anonymous. We just want to know what you guys need. Do you need support this way? Do you need support that way? And hands down in every group, it was technology. So-

Hillary Gamblin:

Interesting.

Dr. Patty Heydet-Kirsch:

… I would highly recommend you do that survey. It’s eye opening.

Hillary Gamblin:

No kidding. So, Kathleen has another question. She says, “Can you talk a little bit more about how you use professional learning communities?”

Dr. Patty Heydet-Kirsch:

Yes. With the students, it varies. Some people right off the bat, when they heard PLC, said, “I need a script. What am I supposed to do? What are my questions?” Some went right to the standards and said, “We’re going to talk about TPE one the first time and let that begin the conversation.” Others assigned the students. And, again, it’s between three and five students in these courses, very small, but lots of them; National is huge. I don’t want to sound like we’re this little, tiny group that can do this so easily. We’re huge, much bigger than the large state university in Florida that I used to work with.

Dr. Patty Heydet-Kirsch:

We would let the students run the PLCs so they would learn how to do a grade level meeting or they would learn what it’s like to plan. And they would use, sometimes, a little survey among their group and say, “What do you guys need this week?” Or “Here’s what I’m thinking. What are you struggling with this week?”

Dr. Patty Heydet-Kirsch:

So, it went from the instructor leading the PLC to the students learning how to run a professional learning community around topics that were interesting or needed for the students themselves. We ask them at two distinct times to do a PLC, and we’re finding out that they’re doing it much more often than that. And they’re setting up the course completely differently because the delivery model is different. These are adjunct faculty that were used to being face-to-face and walking in with a PowerPoint and doing a delivery. And now, that’s not what we’re asking for. We’re asking for GoReact to be that one-on-one and the PLC to be their group environment. Does that help with that question?

Hillary Gamblin:

I think so. Kathleen, if you feel like that answered it, maybe say something in the chat.

Dr. Patty Heydet-Kirsch:

So, the answer is it started out being scripted because people didn’t know what to do with that assignment on their course, and it ended up being a wide variety of very effective, I think, uses of a PLC and very good student feedback.

Hillary Gamblin:

Fantastic. Oh, she says, “Yes-“

Dr. Patty Heydet-Kirsch:

Okay. There she goes.

Hillary Gamblin:

“… that’s helpful.”

Dr. Patty Heydet-Kirsch:

Perfect.

Hillary Gamblin:

Yay.

Dr. Patty Heydet-Kirsch:

Thanks for that.

Hillary Gamblin:

So, I think we’ve answered all the open questions that we had. And so, thank you, everybody, for asking all those questions. This is what makes it such a real-time exceptional learning experience, being able to do this, especially as we can’t be in the same room together all the time anymore.