This week, the television show Scheire en de Schepping featured a rather clever combination of two interesting subdomains of HCI: speech recognition and augmented reality. The segment was about glasses that show live subtitles for the conversation the user is having. They are called “SpraakZien” glasses (we find the name rather unimaginative), and they could be a handy tool for people with a hearing impairment.
Alas, we couldn’t find a video showing the technology, but here’s something even better (/s): a list of articles about it! (A little heads-up: the fourth article is NSFW.) Also, there’s a photo at the top of this page (think of it as a paused video).
For most deaf people, I think these glasses are risky. They could weaken the user’s ability to lip-read, leaving him or her in trouble whenever the glasses fail. Also, the user would no longer be looking the other person in the face during a conversation, which could be annoying.
On the other hand, this technology lets the user notice when someone outside their field of view is talking, for example when that person shouts their name to get their attention. There is another problem, though: the glasses are very conspicuous. I’m not sure, but maybe people with a hearing impairment don’t want to be recognizable as such in public. A Google Glass-style device comes to mind, but an even better solution would be bionic contact lenses.
For the brave souls who made it to the end of this post, here’s another AR application as a reward: the Sulon Cortex (with video this time :p).