In a lab at the University of Washington in the US, researchers are testing a new sensor called Sideswipe, which will allow smartphones to recognise hand gestures.
The team, led by Chen Zhao and Matthew Reynolds, says the new technology works in a similar fashion to radar on an aircraft.
Matthew Reynolds explains: “If you think about a radar on an aircraft or a boat or something like that, in that case you have a transmitter that is sending energy out into the environment and it is being reflected by objects nearby.”
The object, in this case, is a hand, which acts as a mirror reflecting the phone signal back towards its source. Reynolds says the reflected signal can be used as a real-time map for the sensor. As the signal flow changes, he says, patterns emerge:
“And what we do is use a machine learning algorithm to match patterns of the changes due to gestures with previously recorded patterns, and when we see a match we say, ‘Oh, a particular gesture has been performed.’”
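The matching step Reynolds describes can be sketched as comparing an incoming pattern of signal changes against previously recorded gesture templates and picking the closest one. The sketch below is illustrative only: the template values, gesture names, distance measure, and threshold are all assumptions, not Sideswipe's actual algorithm.

```python
import math

def euclidean(a, b):
    """Distance between two equal-length patterns of signal changes."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_gesture(pattern, templates, threshold=1.0):
    """Return the name of the closest recorded template, or None if
    nothing is close enough (i.e. no recognised gesture was performed)."""
    best_name, best_dist = None, float("inf")
    for name, template in templates.items():
        d = euclidean(pattern, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Toy "previously recorded patterns": signal changes sampled over time.
templates = {
    "swipe_left":  [0.9, 0.5, 0.1, -0.4, -0.8],
    "swipe_right": [-0.8, -0.4, 0.1, 0.5, 0.9],
}
print(match_gesture([0.85, 0.45, 0.0, -0.5, -0.75], templates))  # swipe_left
```

A real gesture recogniser would use a trained classifier rather than a hand-set threshold, but the principle is the same: new signal patterns are labelled by their similarity to patterns recorded in advance.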
The team has demonstrated that the sensor works with 87 percent accuracy across multiple hand gestures. Reynolds says they are now fine-tuning the technology to bring it out of the lab and onto the market.