A professor at the National Technical Institute for the Deaf is rolling out an artificial intelligence tool to help bridge language learning gaps between American Sign Language and English.
Erin Finton, a lecturer in the college’s Department of Liberal Studies, said over the past 15 years she has developed the curriculum that the Grammar Laboratory AI tool is based on.
“We know that American Sign Language is powerful and effective at reaching students directly,” Finton said through an interpreter. “It's the best language in the classroom, but there's no standardized curriculum for teaching literacy skills, and so this is the beginning of that, especially for students at this age level.”
The Grammar Lab includes video explanations of English grammatical rules using ASL with English captions and voiceovers — and it moves through lessons at the pace of the student.
“ASL and English are quite different. They're totally distinct languages. They have separate word order, different grammatical structures, syntax,” Finton said. “So both grammars have the same concepts being represented, but they're expressed in different modalities, in different ways. And this is ... a very cool app in that it bridges both of those languages.”
An AI agent generates prompts for students to practice literacy skills and provides answers to vocabulary questions. The tool was developed with support from Google.
“I think part of our job as teachers is to let students take charge of their own learning,” Finton said. “I shouldn't have to work directly, one-on-one with a student all the time for them to be successful in their education. And so it's great to have an app where students can engage with it on their own... so they're able to continue that learning process outside of the classroom.”
NTID alum Sam Sepah, an AI accessibility research program manager at Google, worked with Finton on the project.
“There's no standard approach out there, or tools, for deaf and hard of hearing students,” Sepah said. “You might have some kind of tutor AI tools that are out there, for sure. However, when you look at those tools closely, there aren't many that adapt to a deaf and hard of hearing student's needs.”
The app uses a Natively Adaptive Interface, meaning that accessibility features are built in from the beginning of the development process.
“It's a co-design philosophy, really, where you design in line with the user who you're targeting,” Sepah said. “You don't consider them after something's been built and try to adapt it retroactively to their needs but actually accommodate from the very beginning and incorporate them.”
Finton worked with a team on the engineering side of the project to build the agent, Sepah said.
“If a student didn't understand something in an English sentence and struggled with English grammar, they could ask the agent to kind of clarify. The agent then would customize and help — to their reading level — to help them understand what that English sentence was trying to get at,” he said. “And that's the concept of the adaptive interface.”
Going forward, Finton said, the tool could expand to include more lessons and reach more people.