Boffins working out new ways to replace the tiny touchscreens on computers have hit on the cunning plan of using skin so you can make contact with people just by touching yourself.
According to New Boffin, a new skin-based interface called Skinput lets users turn their own hands and arms into touchscreens by detecting the distinct ultralow-frequency sounds produced when different parts of the skin are tapped.
The technology was hatched by Chris Harrison at Carnegie Mellon University and Dan Morris and Desney Tan at Microsoft's research lab in Redmond, Washington.
Using Skinput, users tap their skin in order to control audio devices, play games, make phone calls, and navigate hierarchical browsing systems.
A keyboard and menu are beamed onto a user’s palm and forearm from a pico projector embedded in an armband. An acoustic detector in the armband then determines which part of the display is activated by the user’s touch.
Software matches sound frequencies to specific skin locations, allowing the system to determine which “skin button” the user pressed.
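That matching step can be pictured as a simple lookup: the real system uses machine-learned classifiers over richer acoustic features, but a minimal sketch, assuming each skin location has one calibrated dominant tap frequency (all names and numbers below are invented for illustration), would just pick the nearest calibrated location:

```python
# Toy sketch of matching a tap's sound frequency to a "skin button".
# Hypothetical calibration data: dominant tap frequency (Hz) per location.
CALIBRATION = {
    "thumb": 25.0,
    "palm": 40.0,
    "wrist": 60.0,
    "inner_forearm": 80.0,
    "outer_forearm": 100.0,
}

def classify_tap(dominant_freq_hz: float) -> str:
    """Return the skin location whose calibrated frequency is closest
    to the frequency measured by the armband's acoustic detector."""
    return min(CALIBRATION, key=lambda loc: abs(CALIBRATION[loc] - dominant_freq_hz))

print(classify_tap(43.0))  # a tap measured at 43 Hz maps to "palm"
```

A real classifier would use many features per tap and per-user training, which is how the researchers squeeze out their reported accuracy.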
The acoustic detector can identify five skin locations with an accuracy of 95.5 per cent, which is probably about the same as a touchscreen on a mobile. Of course, taking it off when you have a shower will involve howling, screaming, and the loss of body hair.