The first five years of a child’s life are crucial for language development — shaping cognitive abilities, enhancing communication skills, and supporting social interactions.
Within the Deaf community, 90 percent of Deaf children are born to hearing parents, and their child may be the first Deaf person those parents have ever met. Helping hearing parents gain fluency in American Sign Language (ASL) can significantly benefit language development for Deaf and hard-of-hearing children.
University of Rochester researchers Zhen Bai and Wyatte Hall are developing AI-powered, augmented reality tools to help parents of Deaf children learn ASL. They’ve been working with the Deaf community to develop and test the software, including early collaborations with the Rochester School of the Deaf.
Guest host Sarah Abbamonte explores the new technology and its implications with the team.
In studio:
- Zhen Bai, Ph.D., assistant professor in the Department of Computer Science at the University of Rochester
- Wyatte Hall, Ph.D., assistant professor in the Department of Public Health Sciences, the Center for Community Health and Prevention, Neurology, Obstetrics and Gynecology, and Pediatrics, Neonatology at the University of Rochester Medical Center
- Karen Fisher-Malley, director of early childhood programs and kindergarten at the Rochester School for the Deaf
- Byron Behm, ASL interpreter
Notes:
- To read a transcript of this episode, click here. To request a transcript of any episode of Connections, please use this form.
- To learn more about this technology or to participate in the research, click here: https://universityrochester.co1.qualtrics.com/jfe/form/SV_2c41WXtqDQhl6PI