Sign Language

Exploring TPACK and SAMR Frameworks for Sign Language Instruction

Explore two tech frameworks, SAMR and TPACK, for use in sign language instruction

As the demand for ASL education continues to grow and educational technology rapidly evolves, it is critical to reevaluate and explore the use of modern digital technologies in sign language teaching and learning. Demand for sign language instruction tools continues to grow as online sign language teaching expands.

This presentation introduces two tech frameworks, SAMR and TPACK, for use in sign language instruction. Participants will have the opportunity to reassess their current technology usage and determine whether it is practical, effective, and beneficial. Educators can use these frameworks to provide students with an authentic language learning experience.

Educational Objectives:

  • Evaluate current digital tool practices in sign language education through analysis and comparison.
  • Distinguish and contrast the unique technology requirements of sign language education compared to spoken/written languages.
  • Develop a critical perspective on using digital tools in sign language education, applying the SAMR and TPACK frameworks to reevaluate and explore current digital tool usage.
  • Apply the digital pedagogy frameworks of SAMR and TPACK to create effective teaching strategies using technologies.



Matt Andersen (he/they), Assistant Professor in the Department of American Sign Language at Columbia College Chicago, teaches all levels of American Sign Language and several Deaf Studies related courses. Matt’s interest lies in remote education, and he is currently contributing to bodies of research on various topics relating to sign language education and digital tools. He holds a Master’s degree in Sign Language Education (Gallaudet University, 2019) and a Bachelor of Science in Visual Communication and Advertising/Public Relations (Rochester Institute of Technology, 2008). Matt is also a certified Canvas educator and one of Quality Matters’ Higher Ed Peer Reviewers. Matt is currently serving on the American Sign Language Teachers’ Association board as its Vice President (2023-2027 term).


Matt Andersen:

Hello, I am excited to be here today. I will give a brief introduction about myself; let me scroll through my slides. I am an assistant professor at Columbia College Chicago. I teach ASL classes for students at the college who want to become interpreters. I have a master’s degree in Sign Language Education, and I work at the higher ed level. I am a certified ASLTA member, which means I am qualified to teach ASL courses. I’m also the vice president of the ASL Teachers’ Association. So that is my background. I’m passionate about my work, and I’m glad to be here.

I’d like to raise the standard of education, particularly for ASL. And that’s my goal. That’s kind of what I’m passionate about. I want to raise the bar there and set a standard. So let me give you a little bit of background on some of my research that I’ve done. So my primary role has been in research. I wanted to explore how technology could be integrated into the ASL classroom as far as curriculum development, as far as assessments, and things like that. So there’s a lot of technology floating around out there. And is any of that technology relevant and useful for us in the ASL teaching field? So that’s why I’m so passionate about presenting on this topic, how technology can be used as a framework for our ASL classes.

I’ll talk about two things: TPACK and then SAMR. I will just briefly touch on a lot of topics. You might see a lot of information on the screens; I did that so that you could download the PDF, which goes into more detail, and peruse it at a later date. I will do my best to cover what I can, but we don’t have enough time for everything. So I’ll talk briefly on Part 1 and Part 2 and probably Part 3. Part 4 is some of my research findings, and that is specific to ASL education.

For those of you who are watching right now, I’m not sure if all of you are ASL fluent or ASL teachers. I’m sure we have varying degrees of familiarity with the ASL teaching field, so Parts 4 and 5 go a little bit more in depth. If you’d like more information, please feel free to reach out; I’m happy to have a conversation with you. And then Part 5 is a call to action.

We’ll start with Part 1. So I put an asterisk next to language. And the reason is because, typically, those who are discussing language education, all the publications and the research is talking about spoken language. It’s not typically inclusive of American Sign Language. And so a lot of research that I have found, when I try to implement that in my classroom, it tends to be very ineffective because they have not considered signed languages. They typically include written or spoken language, and that’s the focus in a lot of research. And so that’s why I put an asterisk next to language because it’s language with the understanding that it doesn’t often include signed languages.

Okay. So we’re familiar with technology. Technology dominates our lives. It’s taken over for better or worse. But in our education field, we have seen a lot of benefits to technology, a lot of supports for teachers, a lot of supports for students, and in a lot of different disciplines, in math and science and language learning, so technology has been a huge benefit in the educational field.

You’ll see I have two points. So the white text is kind of general information. This kind of yellow text is my commentary added for ASL just for reference. So if you’re looking at my slides, that’s how you can reference that information.

In language education, they use these types of platforms for language learning: Rosetta Stone, Duolingo, Babbel, et cetera. Those are great because they are powerful tools that incorporate AI to evaluate student progress. They can assess students and then adjust the curriculum to fit where the student is in their learning, which is great. Those platforms have games and activities that can be fun and engaging, that track progress, that set goals where you can gain points and win leaderboard positions, and things like that. So that’s really great for a lot of people.

There’s been a shift: previously we used computers, and now we use mobile apps and things like that. If you look at the App Store, you’ll see thousands of different apps, many for language learning. A lot of them support spoken language recording, so when you speak into a microphone, they can check that it has been said accurately, track comprehension, and things like that. So mobile apps have become very popular for that.

Recently, I’m not sure if it’s still popular, but Duolingo was kind of the leader in that kind of app service. People wanted to maintain a streak on Duolingo, so every day they’d check in to do a little bit of their language learning; it was kind of a daily motivational assessment tool for learning languages, which was a huge benefit.

Now, virtual reality and AR, augmented reality: how do those play into what we’re talking about? VR and AR can give you an artificial space with an authentic feel, where you can engage with your environment using different languages. Where language learning was typically taught in a classroom, AR and VR now allow you to give students access to environments that are artificial and authentic at the same time, where they can learn languages. And those are considered safe and controlled spaces.

And the deaf community values their space greatly. So sometimes we have some new ASL learners that come into the sign language space and it feels like an interference or a disruption, and it causes a lot of deaf people to respond poorly. And so with the way the ASL students are responding to their teachers and others in the deaf community, it becomes really difficult to engage with people in our community. But using this approach of VR doesn’t cause any disruption to anyone’s lives, which is a nice bonus.

Since the pandemic, Zoom is everywhere; we can’t live without Zoom. We use Zoom for video conferencing, which is great because it’s an authentic way to have real-time, real-life conversations where people can gain experience and be assessed. And the bonus is that it can be done anywhere, anytime. On campus, you can schedule an appointment to meet with a tutor or a teacher who may have limited time in person but much more availability on Zoom.

Similar to my comments about the different platforms: there are platforms that require you to download an app, and there are also web-based platforms that offer similar features and resources; they’re just web-based instead of app-based.

And then there are online education communities, possibly through a school, with asynchronous, self-paced courses where people can complete modules as they’d like, for fun or for learning on their own time without credit. This is a nice feature because universities and colleges charge per credit, so online options tend to be cheaper and more accessible to people.

Those are all the different types of technology and tools that are often used in education. So for those of you who currently teach ASL classes, I suggest that you think about some of these platforms and the different technological tools that you could apply and incorporate into your ASL classrooms and courses. What would that look like? What are the benefits? What are the challenges? What are the costs involved? What are the training and startup costs for all of that? As ASL teachers, you can think about the way you typically teach and then apply those things. There are eight different options that I just went through. Maybe take some time to think about how you can adapt some of those into your classroom.

Okay. So I’m going to move on to Part 2, Exploring the TPACK Framework. So I’ll explain what TPACK is first. So TPACK is a framework that helps educators be able to assess and evaluate how technology impacts a teacher’s ability to teach and impacts the student’s learning and their experiences in the learning environment.

Okay. So as far as the framework itself, there are two parts: the technological and the pedagogical. And honestly, this has been around since 2006, but I had never seen any discussion about TPACK. I found it and realized what an important tool it is. I went ahead and did a lot of research, and I realized that this was something we could use. So let me explain a little bit more about what TPACK is. The T is technology and the knowledge related to that. The P is pedagogical and the knowledge related to that. And the A is just “and”; it’s just added in there.

We’ll go ahead and discuss this next slide. I have this diagram for what TPACK looks like and how it’s put together. The T again is for technology: the technological knowledge, what technology we use in the classroom, what technology the teacher needs to be able to teach or to become more efficient at their skills, and how it can support their students’ learning. Those digital tools can include apps and websites like the ones I just showed you. So that’s the T.

As far as the P, pedagogical knowledge, that’s how we teach, what that looks like, what our approach is, and how those students learn as well: how we can help meet their needs and expectations, and how we can work together to create an effective learning environment. There are different strategies we can implement to help that environment really thrive, and how we do assessments; that’s pedagogical knowledge. The C is the content: what is in the lesson plans, what needs to be taught, and what those students need to learn.

We’ll talk about this a little bit. So as far as TPACK, we have these three intersecting and overlapping fields: the technological knowledge, the pedagogical knowledge, and the content knowledge. These all overlap. I was really interested in how we teach this information and how I can support my students in becoming effective at acquiring language and really improving their skills and knowledge, so I did a bit of research. If you know both of these, how to teach and how to use technology, the content will come. So if these are overlapping and we implement this technology, then I can bring in my own teaching style, my own preferences, my own activities. I really wanted technology to support me in my own teaching style; I wanted to be able to implement the technology.

The last overlap is between content and technology. And if I’m able to connect all of those dots, that allows the content to become more accessible. It allows the students to learn complex information with the technology, and it becomes easier. They become more excited about the content; they want to learn, they want to develop their ability to become better, and they use the technology to thrive within the learning environment. So having those three elements within your teaching allows the teacher to take more responsibility for how each of those impacts the others.

We can’t just say, “Oh, just use this app. It looks fun. It looks really flashy and exciting,” and have students use it when there’s no impact. It doesn’t impact my teaching. It doesn’t impact the students. It doesn’t make the experience more educational or help them understand the content. So there’s really no point in providing new technology if there is no usefulness for it. Learning new technology can be frustrating sometimes, so we don’t want to just throw different apps, websites, and digital resources and tools at our students when they aren’t able to actually master those skills. The goal is really to find a good middle ground, find where all three of those areas overlap, and have intention for each of the technological tools that you introduce.

Now, moving on to the second framework. This one is more focused on the students themselves and their experience. TPACK is more of a teacher tool for assessing how to implement technology; as we take that and allow the students to learn with the technology, we decide what level of experience those students can have.

There are different levels within the SAMR framework. Substitution, augmentation, modification, and redefinition. So those are the four standards within the framework. You can look at the slide here on how it’s spelled, SAMR, so substitution, augmentation, modification, and redefinition. This is how we can measure our students’ learning experience.

For each of these, there is a standard for what it looks like. Substitution might be taking a textbook and switching it for an e-book. That’s a milder transition, just doing something small to change the student experience; it doesn’t have a large impact and is easier. The next one is augmentation, which is a step up. I’ll use the same textbook example: you transition to an e-book, but within the e-book, there are different features you can click on, such as clicking on a word to bring up a dictionary definition of what that word means. That’s a bit of a bump up from substitution: instead of having to pull out a dictionary to look up a word, it’s already built into the e-book.

Our next step up is modification. As far as that e-book, having the ability to highlight might be a modification. And being able to email those highlights to my class, so we can all share what we’ve highlighted. We can all learn together. So I am changing the learning experience by allowing those students some of that freedom. So the teachers are empowering their students to be able to take responsibility for how they learn.

Moving on to our next one, redefinition. What that looks like is difficult to describe, because we can’t predict what new technology will look like. For example, take VR, virtual reality. If we can access ASL through VR, we don’t know what that looks like yet, so I can’t fully imagine how a student would benefit from it. Think about it: is that substitution because they don’t have to go to class? They can sit in their bedroom and put an artificial classroom over their eyes with goggles. Is that substitution, or is that redefinition, redefining the student’s learning experience? It’s a little bit vague, but really, it depends on our technological knowledge. If we are too far ahead of someone who’s just learning, those learners will really fall behind if we use too much technology, so we want to make sure to match our students’ abilities and level.

This slide gives some examples of what I just explained. In an ASL class, we’ll usually have students give directions for going from point A to point B. The students need to use their 3D space to show where they’re going. For those who don’t know ASL, I’ll give you a quick lesson: I can trace a path. My home is here, right in front of my chest, and I go straight and then right, and this is where point B is, my favorite coffee shop. I go straight, make a right; you can see me tracing the path in my signing space.

And for many years, I would hand out a blank paper map, I would sign instructions and directions on how to get from point A to point B, and my students would trace where I told them to go. That’s honestly just old-school paper. Instead, we could use a projector on the whiteboard, and the students could get up and draw their pathway on the map that’s projected right on the whiteboard. A lot of people think that’s not even technology, but it is. It’s a different way to teach, and it’s a better way to teach. So that’s an example of substitution.

Augmentation, it’s similar to substitution, but the students are able to choose their own map. So I’m not handing out my one map that everybody has the same thing, but they can look up Google Maps, they can pull up wherever they would like, and they can make their own map. And that’s something that’s really cool that they can use to help learn.

Modification: using a digital whiteboard with a map on it, like Miro, M-I-R-O; it’s an app and a website. There’s a whiteboard, and you can add anything you want to it. The students can put the map up on the board, add pins and points of interest wherever they’d like, type notes, or sign their notes as well. That gives you a little bit of flexibility; it’s pretty cool. That enhances the experience for the student.

And redefinition. I love this. So I don’t know, some people maybe think it’s not safe, but using virtual reality goggles, the students can tell you what to do so you walk around in this virtual space. So like Apple Vision, you can have two people in the same space. So have one person watching the video and the other person actually in the virtual space. So they can walk, they see the city, and the teacher can describe where they’re going and what they need to do to get there. So that’s a really cool piece of technology that can enhance a student’s experience.

Now, I’ll share some real-world applications and some discoveries that I’ve made when implementing this framework within my ASL program. I made a lot of discoveries actually. And it looks like we have enough time built in that I will go ahead and mention a few of those.

As I mentioned, I work at Columbia College Chicago. It’s a wonderful college, a wonderful campus; it’s primarily an art school, so that’s the main focus of the college. Sometimes when I’m teaching, the technology does not work in my classroom, and a lot of times, they don’t understand why I need to supplement my teaching with some of this technology.

For instance, in this room right here, I have two lights on me. If I talk to my colleagues, they would say, “Well, you have the light from your computer screen. What do you need additional lights for?” They don’t understand that I need multiple screens, multiple lights, all of these things in order to teach. So I’m spending personal money to get what I need instead of the college paying for it, because they don’t understand my technological needs and some of the accommodations I need for my teaching. So I was looking into some of those specific problems and challenges. I can’t just say, “Oh, they’re just ignorant about technology.” I wanted to provide them very specific examples so that they could understand better what the needs were for my classroom.

In my research on how to use technology in the ASL classroom, I found that the research out there is very limited. There isn’t a lot of research. There have been some publications recently, but they correlate strongly with the COVID pandemic period, which is not what I’m looking for. I want more information about our everyday teaching, not just the band-aid fixes we used during the pandemic.

This research is a culmination of three years of work that I’ve done. The ADDIE model is another framework; here’s another one that we’re introducing. This is how I evaluate my department: the current way they run, how they function, what the students need, what the teachers want, et cetera. So I use this ADDIE model, and I’ll talk about what ADDIE stands for: A for analyze, D for design, D for develop, I for implement, and E for evaluate. So you assess where you are, you come up with an idea and a plan, you build and apply it, and then you evaluate what you would change. This ADDIE model is a practical-use model that was heavily used in the military. Now it’s widespread in education as well.

And this is a way to help us progress, standardize, and understand what phase we are currently in, so that we’re not jumping around and we can set a very clear standard. We start off in that assessment phase so that we can figure out where we’re at and where we need to go. There’s that saying about putting the cart before the horse; sometimes we tend to do that with technology. Should technology lead our decision-making, or should we establish how we teach, what we want to teach, and best practices, and then let technology support us in that?

As teachers, we’ve been really frustrated and sort of stuck, and I’ll explain some of the reasons for that. This list seems like a very simple, successful example of how we should teach ASL. These sub-parts: of course, we know ASL is a visual language. It includes the language component, but it also includes culture within the course that you’re teaching. It includes pedagogy. Those who are teaching include strategies that are effective for their teaching, which is great, and we have a lot of research on that.

Then we move into research-based curriculums, content, courses, assignments, assessments; we have a lot of research that’s been done over the years for that, which is also great. And then we move into technology and digital tools, and again, the TPACK method: we need to know which technologies we want to use. Right now, we’re lacking. We don’t have a lot of research on which technologies work. This is the gap, and it affects everything after it. It affects our research-backed digital pedagogy, how we use technology within the classroom, how we assess. We don’t have any research for that.

As teachers now, we sort of have this innate ability to teach what the students need to know, and we feel like, “Well, we’ve taught that. It’s good enough.” But we don’t really know what our impact is, and there’s no way to assess skill. We see student learning and we think, “Okay, our job is done.” But there’s more research that needs to be done, more data that needs to be collected. If all of these things were done and you had everything you need, you would be able to teach effectively: face-to-face, hybrid, or fully remote. You could do any of those if you had all of the resources.

My department recognized this. You see this yellow portion that I have listed here. This is for their spoken language, their foreign language department, and all of the things that they need. So of course, audio and text are heavily discussed in their spoken language courses. They talk about culture, which overlaps with our courses. And there are some similarities and differences: they have their research-backed pedagogies, and we have ours. But do you see this technology and digital tools section? I’ve listed ours in green, and it’s much smaller than what those teaching spoken language courses have.

The colleges tell us which specific technologies we need to use, and those often do not accommodate sign language instruction. When I ask, “Why does the college require this?” they say, “Well, this is successful in other spoken language, other foreign language courses.” They assume the application will be the same, that the effectiveness will be the same. But it’s not, because sign languages are very different from spoken language learning, and colleges just aren’t aware of that.

And then our ability to use technology face-to-face, in a hybrid method, and online is very lacking because we don’t have the right technology tools. They were not designed to support ASL education or deaf education. We don’t have the technology, so we rely on the tools that the foreign language program uses. We copy whatever they’re doing without really realizing how much we’re lacking, how problematic it is, how much it impedes a teacher’s ability to teach in the classroom. They struggle. It’s awkward. It doesn’t work. It’s like trying to put a round peg in a square hole; you can push as much as you want, but it doesn’t fit. So a lot of our teachers are lacking in this way and are not provided the appropriate accommodations that they need for their teaching.

These are issues based on research. We don’t have the appropriate research to show which technology tools will be most effective. Often, this information comes from a teacher’s past experience or what they’ve tried. Sometimes that can be enough, because as teachers, we share ideas about what works and what doesn’t. But we don’t often think about the impact on our students and their learning experience: whether the students like using that technology, whether they feel it’s effective. That’s often overlooked. We just see student skills improve and feel that’s enough evidence to validate our use of certain technologies. But there are multiple layers to this that we often don’t consider.

For instance, in a classroom setting: colleges will typically put projectors into classrooms automatically, so every time you walk into a college classroom, you’ll usually see a projector there. Historically, deaf instructors and ASL teachers have used them because we used to use overhead projectors, the old method with the clear film paper that you wrote on. We used digital projectors in the classroom just because they were similar to what we had been using before; the transition was easy. But when ASL teachers are teaching successfully in their classrooms, they don’t have projectors. They have multiple TV screens. We’ve noticed that ASL teachers love using these multiple screens instead of projectors, because projectors force us to stay out of the light they cast.

You can’t see someone signing when this projector is right in your face and you can’t see the student. They can’t really see you. So it forces us to stay in one place on the side of the screen for three hours in our classroom. We stay in one spot, and it’s boring. It’s not engaging. I would like to walk around my classroom. I would like to engage with my students without worrying about where I need to stand. So that’s a minor example, but that’s one way that technology can impede our ability in the classroom to teach effectively.

Another example is students using laptops in the classroom. Often, students don’t have laptops accessible to them to use in the classroom; often they only have their mobile devices, which is a restriction. Because if I want my students to do a recording activity, for instance a GoReact assignment, which we use within our classroom, we’re limited because not every student has a laptop to access it. So we have to have a lab set up so that everybody can access a computer. The college doesn’t enjoy that. They think it’s expensive. They think it’s difficult. They don’t understand why that would be important, and that’s another challenge.

These are examples of high-tech labs, and they’re expensive. They cost quite a bit of money, and colleges don’t have the funds available for that. When you request one, a lot of colleges will say, “Oh, it’s too expensive. The students already have their own laptops. They have their own technology nowadays.” But honestly, that’s not true. I know; I’m in the trenches, and I know that many students don’t have the funds, and they do need those labs available. But it’s expensive.

ACTFL, that is the American Council on the Teaching of Foreign Languages; it took me time to remember the name. They recommend that 90% of the content be taught in the target language. So in class, it’s all in American Sign Language, with only 10% of the content in English. Ninety percent in the target language: we want students to see ASL all the time. But as a deaf teacher, we have Canvas, and that’s already set to English, obviously. We have to have instructions in English, and I wonder if we can change that. Teachers of other languages can change their Canvas language to Chinese or German, but as an ASL teacher, how do we do that?

The Canvas platform has to be in English. So do I have to count that as my 10%? I’ve already got my 10% there on Canvas. So in class, I really need to try my best not to rely on English phrases, and I really just need to use pictures and videos in my communication. And it’s a challenge. It’s a big challenge. And if it’s an online course, that means all of my instructions and everything need to be self-recorded, edited, and uploaded by me. That’s honestly a lot of manpower.

We all know how it is. We don’t all know how to use technology. We don’t all know how to edit videos; we’re not all technologically savvy. We do know some who are, who would love that kind of challenge, but there are many who struggle, and it’s not fair to them.

Another example from my college is GoReact. We really struggled with deciding whether our students should pay for their own subscription or whether the university should pay for a GoReact license. My department really wanted that license. That's what they were advocating for. But the college was very resistant, because they questioned whether a license for one program was worth it. So we fought back and explained that GoReact would benefit many different programs and departments. We really tried to push this, and they had no idea. It was a bit of a struggle, and that's my current battle right now. We're still working through it. So our students are on self-pay currently. They're paying for their own subscription on top of tuition, and it's kind of crazy. It's expensive. We are adding one more expense for those students, but currently, it's their responsibility.

Digital resources, digital materials. Often, it's really popular to go with a subscription plan, where each student pays for their own access to different platforms and websites, and we know that's not ideal. The best practice is to have Canvas be the central point for everything and branch out from there.

For these subscriptions that students are paying for, it would be easier if they could just access everything directly through Canvas. They could access the materials, and maybe the textbook would be there too, so they wouldn't have to buy an extra textbook, but they would still have the ability to revisit it later. If they do pay for a textbook, at least it remains available to them afterwards. With a subscription, once they stop their payments, all of that information is gone. So I have been weighing the pros and cons between a textbook and those subscription costs.

In an ASL class, we have expressive homework, where students practice making a video while they're signing. That's typically done by recording themselves using GoReact. And there are many different platforms: there's Zoom, there's Panopto, there are many platforms you can use, but my school uses GoReact. It's great. It's set up. It's easy. But if a student isn't able to afford a subscription, then what do they do? Are they not able to participate in the class? How does that work? Do we send them away? That's something we struggled with. I want ASL to be accessible to all. All of our students should be able to be involved.

We have all of these technologies that students need to learn how to use, and that teachers need to learn how to use. We can teach them, and that might not be a problem in itself, but there is such a thing as digital fatigue, where we learn so many tools that we're just worn out. Sometimes students see yet one more tool and it immediately becomes a negative experience, because they just don't want to learn one more thing. And for the rest of the semester, they're jaded towards that technology. Technology is supposed to be fun. It's supposed to be a positive experience, but sometimes adding too many tools ruins that.

Okay. How much time do we have left? Eight minutes. Okay, eight minutes. I'll get moving then. As far as receptive and cognitive assignments, students have to produce videos, and that creates a time cost and a money cost, and to be able to watch those videos smoothly, they need a strong internet connection as well. Also, these videos are 2D, not 3D, while American Sign Language is a 3D language. So sometimes it's hard to see, and sometimes students are not able to use their 3D space because of that. There are real challenges with using a 2D medium for sign language.

Okay. So as far as ASL, technological knowledge is limited. There are so few options. And this whole diagram really affects student experiences. With ASL, we're limited. I can't empower these students to take responsibility for their own learning with a range of technologies. I really have to give them specific technologies and specific ways to use them. So there's no empowerment for their own independent study.

And then, to wrap up, my call to action. This is for all of you: push technology and digital tool developers to really start thinking about ASL and deaf education, and about how these new, available technologies can be used. We need to stop thinking, "Oh, they're not included. Everybody but them." If we do that, we're just neglecting the problem. And it could become a reason why ASL declines and eventually disappears, because we are not recognizing that ASL needs these technologies too. So I really encourage you to keep watching for how these technologies can be included in your ASL instruction.

And sometimes it's a struggle. Sometimes it's really hard. So that is my call to action: to really be creative and figure out how we can get these tools into the hands of our ASL learners. Okay. A three-hour lecture in one hour. We got this. Thank you so much.

I agree. It's really important for these language learners to be able to use these technologies, so I'm looking for more inclusion of American Sign Language. We've seen technology like the signing gloves and the interpreting gloves, and that's great, but is that an everyday-use technology? No, it's not something we can use every day. So I think we need to continue to investigate all of these technologies that are becoming available. If you want to talk more, feel free to contact me. You can send me an email anytime. I really love getting to know people.