As much as I dig my iPhone and HP Touchsmart touchscreens, there’s something I still miss about actually feeling buttons under my fingertips. And while I’ve gotten pretty good at typing without even looking at the screen anymore, I’ll never hit the kind of touch-typing speed I can on a traditional keyboard. So I was really intrigued when I came across this concept technology that could eventually make touchscreens just as tactile as the real deal.
Carnegie Mellon University grad student Chris Harrison and CS professor Scott Hudson have developed a tactile interface that lets you actually feel and press virtual buttons that emerge from a touchscreen. The prototype uses a combination of rear-projected images, infrared sensors and a layer of flexible latex from which the buttons rise and fall.
A custom-cut acrylic layer behind the buttons gives them their shapes, and air is pumped into and out of the chambers behind the buttons to automatically change the button states as the on-screen images change.
The system is capable of detecting more than one simultaneous press using readily available infrared multi-touch technology, and can even figure out how hard you’re pressing on the buttons by monitoring air pressure. To see the system in action, check out this video clip:
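To give a rough idea of how pressure-based press sensing could work, here’s a minimal sketch. Everything in it is an assumption for illustration (the function name, baseline, and thresholds are made up, not taken from the CMU prototype): pressing the latex button compresses the air in its chamber, so a reading above the resting baseline can be mapped to a press strength.

```python
# Hypothetical sketch of pressure-based press detection.
# All names and numbers below are illustrative assumptions,
# not details from the actual CMU prototype.

BASELINE_KPA = 101.3  # assumed resting chamber pressure (ambient)

def press_force(reading_kpa, baseline_kpa=BASELINE_KPA, threshold_kpa=0.5):
    """Return a normalized press strength in [0, 1].

    Pressing the flexible button compresses the air behind it,
    raising chamber pressure above the resting baseline. Readings
    below a small noise threshold count as no press at all.
    """
    delta = reading_kpa - baseline_kpa
    if delta < threshold_kpa:
        return 0.0
    # Treat +5 kPa over baseline as a full press (arbitrary scale),
    # clamping anything harder to 1.0.
    return min(delta / 5.0, 1.0)
```

In a real system you’d presumably calibrate the baseline per chamber and filter the sensor readings, but the basic mapping from pressure delta to press strength would look something like this.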
While the prototype version is clearly not practical or cost-effective for real-world use, the overall concept is really cool, and I’d love to see them figure out a way to miniaturize the technology so it could be crammed into a mobile device some day.
[via Technology Review]