Accessibility Research

Jun ‘16 – Aug ‘16

At the Center for Accessibility and Inclusion Research, Rochester Institute of Technology, Rochester, New York

The EPSON Moverio BT-200 smartglasses on which we projected the ASL interpretation

I developed software for smartglasses to help deaf/Deaf and hard-of-hearing (DHH) students learn in mainstream classrooms. With two other students, I tested whether displaying an American Sign Language (ASL) interpretation on the glasses would improve lecture comprehension compared to having only an interpreter in the room.

I learned to program in Android Studio and improved an application (created the previous summer) that lets users adjust the video shown on the glasses from their smartphone. I helped design and conduct our experiment and then analyze its results. We found no significant effect on comprehension, positive or negative, with the smartglasses, but participants were enthusiastic about the technology and saw many benefits for their own lives. We expected a stronger effect in a future experiment in which students take notes, rather than only watch a lecture.

The setup for our experiment: the teacher was on the left monitor, the interpreter was on the right monitor, and the slides were projected up front. When the participant wore the glasses, the right monitor was turned off.

I presented at RIT’s Undergraduate Research Symposium and co-authored a paper accepted to the CHI ’17 conference. This experience strengthened my interpersonal skills through the fun challenge of communicating with my DHH peers while I was still learning ASL. The REU also sparked my passion for accessibility research.