The first time we featured programmer Will Powell, we learned how he was able to make a crude version of Google’s Project Glass augmented reality glasses. It turns out that Powell has made another version of his hack that is capable of translating spoken language and displaying the translation in subtitles.
As with his earlier project, Powell used a pair of Vuzix STAR 1200 glasses as the base of the hack. If I understood Powell's blog post correctly, a Jawbone Bluetooth microphone picks up the audio and sends it to a mobile device, which then processes the words using a translation API made by Microsoft. The translation is then passed on to a Raspberry Pi, which sends the translated text to the Vuzix display and a transcript of the conversation taking place to a TV. Below is a shot of the subtitle being displayed on the glasses' monitor:
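Powell hasn't published his code, so the sketch below is purely illustrative: a minimal Python helper that builds a request for Microsoft's Translator Text API (shown here in its current v3 form, which postdates Powell's hack). The endpoint and parameter names follow Microsoft's documented API, but the helper function itself is a hypothetical name, and actually sending the request would also require a valid subscription key.

```python
import json

# Documented endpoint of the Microsoft Translator Text API (v3).
# Powell's hack used an earlier Microsoft translation service; this is
# just an illustrative modern equivalent.
ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"

def build_translate_request(text, from_lang, to_lang):
    """Build the URL, query parameters, and JSON body for one
    translation call. Sending it for real would also need the
    'Ocp-Apim-Subscription-Key' header with a valid key."""
    params = {
        "api-version": "3.0",
        "from": from_lang,  # e.g. "es" for Spanish
        "to": to_lang,      # e.g. "en" for English
    }
    body = json.dumps([{"Text": text}])  # the API accepts a list of texts
    return ENDPOINT, params, body

# Example: prepare a Spanish-to-English request.
url, params, body = build_translate_request("hola, ¿cómo estás?", "es", "en")
```

Each round trip through a service like this adds network latency on top of the speech recognition itself, which is consistent with the delay Powell describes.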
And here’s a shot of the transcript on the TV:
Finally, here's a demo of the hack in action. Note that there is a significant delay in the translation, which according to Powell occurs mainly when the audio goes through the translation API.
The sheer number of gadgets needed, plus the fact that the Raspberry Pi is physically connected to the glasses via an S-video connector, means that this is not a portable system, but I am still amazed at what one man armed with off-the-shelf parts can do. Besides, all devices – including the ones Powell needs – get smaller and more powerful over time. The day when we'll be able to reenact Casa de mi Padre is closer than we think.