A group of students from the engineering technology and industrial design programs at the University of Houston has come up with a concept for a device that can read sign language and translate it into audible words, as well as translate spoken words into sign language. It’s called MyVoice and is meant to be a handheld device with a built-in camera, speaker, soundboard, microphone and monitor.
Yeah, I’m not sure where the monitor is in that mock-up either, but that’s all we have to work with. We’ve seen a hack based on motion sensing designed to teach sign language, but it seems the students took a different approach with MyVoice – the device relies on a database of images to recognize signs. The students say they built a prototype that successfully translated the phrase “A good job, Cougars,” and they hope to find partners to turn their concept into a real product.
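The article doesn’t explain how the image-database matching actually works, but the general idea can be sketched as a nearest-neighbor lookup: compare a captured frame against stored sign templates and pick the closest one. Everything below (function names, the toy 8x8 templates, the mean-squared-difference metric) is illustrative, not MyVoice’s actual implementation:

```python
import numpy as np

def closest_sign(frame, database):
    """Return the label of the stored template most similar to `frame`,
    using mean squared pixel difference as a (very naive) distance."""
    best_label, best_dist = None, float("inf")
    for label, template in database.items():
        dist = np.mean((frame.astype(float) - template.astype(float)) ** 2)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Toy "database" of 8x8 grayscale sign templates (illustrative only).
db = {
    "good": np.full((8, 8), 200, dtype=np.uint8),
    "job":  np.full((8, 8), 50, dtype=np.uint8),
}

captured = np.full((8, 8), 190, dtype=np.uint8)  # pixel values near "good"
print(closest_sign(captured, db))  # prints "good"
```

A real system would of course need robust features (hand segmentation, lighting and pose invariance) rather than raw pixel comparison, which is presumably where most of the students’ work went.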