The University of Sussex-led study – funded by the Nokia Research Centre and the European Research Council – is the first to find a way for users to feel what they are doing when interacting with displays projected on their hand.
Their SkinHaptics system relies on an array of ultrasound transmitters that, when applied to the back of the hand, send sensations to the palm, which can therefore be left exposed to display the screen.
The device uses 'time-reversal' processing to send ultrasound waves through the hand. The technique works like ripples in water, but in reverse: as the ultrasound waves enter through the back of the hand, they begin as broad pulses that become more targeted as they move through to the other side, landing at a specific point on the palm.
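The focusing idea behind time-reversal processing can be illustrated with a simple one-dimensional sketch: if the response of a scattering medium to an impulse is recorded, transmitting that recording reversed in time makes all the echoes arrive back in step, concentrating energy at the original point. The impulse response below is invented purely for illustration and is not a model of the SkinHaptics hardware.

```python
import numpy as np

# Hypothetical multipath impulse response between a transducer on the
# back of the hand and one target point on the palm (delays and
# amplitudes are made up for illustration).
h = np.zeros(200)
for delay, amp in [(40, 1.0), (65, 0.6), (90, 0.3)]:
    h[delay] = amp

# Time-reversal: transmit the recorded response reversed in time.
probe = h[::-1]

# The signal arriving at the target is the channel applied to the
# probe, i.e. the autocorrelation of h -- a single sharp peak where
# all the scattered paths add up constructively.
received = np.convolve(probe, h)

focus = int(np.argmax(np.abs(received)))
print(focus)  # the peak lands at len(h) - 1, where all echoes align
```

Even though the individual paths arrive spread out in time, the time-reversed transmission re-aligns them so the energy concentrates at a single instant, which is the "broad pulses become more targeted" behaviour described above.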
Professor Sriram Subramanian, who leads the research team at the University of Sussex, says that technologies will inevitably need to engage other senses, such as touch, as we enter what designers are calling an eye-free age of technology.
"Wearables are already big business and will only get bigger. But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important," he says.

"If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user.

"What we offer people is the ability to feel their actions when they are interacting with the hand."
You can see a prototype of the SkinHaptics system demonstrated in the video below.