When you hear the term “no-touch interface”, you probably think of the computers in movies like Iron Man or Minority Report. You may have even heard about people with paralysis typing with their eye movements or controlling machines with their brains. However, there are three fairly common no-touch interfaces currently available, and I’ll bet you’ve used at least one of them.
Voice User Interface

Probably the most common form of no-touch interface is the voice user interface. If you call your bank for an account balance or call a movie theater ticketing service, chances are that you’ll be talking to some type of voice recognition system. The better the voice recognition software, the better your experience will be.
Cell phones have supported voice dialing for years, and many smartphones now include more advanced voice user interfaces. Siri, built into every iOS device released since October 2012, is one of the most common of these.
With the popularity of Siri, many other companies have developed their own voice user interfaces. For example, Google Glass, which does have a touchpad interface, can also be controlled using “voice actions”. As voice recognition software improves, I think that this type of interface will become even more common and part of everyday life.
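Under the hood, a voice user interface has two stages: a recognizer turns audio into text, and a dispatcher maps that text to an action. The sketch below shows only the second stage, assuming the recognizer has already produced a transcript; the command phrases and responses are hypothetical examples, not any real assistant’s API.

```python
# Toy command dispatcher: maps recognized speech (already transcribed
# to text) onto actions. Real assistants use far richer language
# models, but the mapping step looks roughly like this.

def handle_command(transcript: str) -> str:
    """Return a response for a recognized utterance, ignoring case."""
    text = transcript.lower().strip()
    if "balance" in text:
        return "Fetching your account balance..."
    if text.startswith("call "):
        contact = text[len("call "):]
        return f"Calling {contact}..."
    return "Sorry, I didn't understand that."

print(handle_command("Call Mom"))            # Calling mom...
print(handle_command("What's my balance?"))  # Fetching your account balance...
```

Keyword matching like this is brittle, which is exactly why the quality of the recognition and language-understanding software matters so much to the experience.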
Motion Sensor Interface

The most common motion sensor interface is probably Microsoft’s Kinect. First introduced in November 2010 as an add-on to the Xbox 360, Kinect uses motion sensors to turn your entire body into a game controller. This added a physical element to video game play.
Taking a step toward those science fiction interfaces, Microsoft has since released Kinect for Windows, which allows you to interact with gesture-driven applications. One app allows surgeons to use gestures to interact with medical scans and images on a computer without sacrificing sterility. Kinect for Windows also incorporates voice recognition software, effectively combining two different forms of no-touch interface.
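At its simplest, gesture recognition means watching a tracked point (such as a hand) over time and classifying its motion. This is a minimal sketch of that idea, assuming we already receive a sequence of normalized hand x-positions from a sensor; it is not Kinect’s actual algorithm, just an illustration of the principle.

```python
# Classify a horizontal hand trajectory as a swipe gesture.
# xs: hand x-positions over time (normalized 0..1 across the screen),
# as a depth sensor's skeletal tracker might report them.

def detect_swipe(xs, threshold=0.3):
    """Return 'swipe-right', 'swipe-left', or None for small motions."""
    if len(xs) < 2:
        return None
    delta = xs[-1] - xs[0]  # net horizontal displacement
    if delta > threshold:
        return "swipe-right"
    if delta < -threshold:
        return "swipe-left"
    return None

print(detect_swipe([0.1, 0.3, 0.6]))  # swipe-right
print(detect_swipe([0.8, 0.5, 0.3]))  # swipe-left
```

A surgeon flicking a hand to advance a scan, for instance, reduces to exactly this kind of trajectory classification, with no physical contact required.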
Touchless Touch Screen
In recent years, some smartphones have started including interfaces termed “touchless touch screens”. Pantech’s Perception includes a Motion Sense feature that allows you to make calls or scroll using hand motions instead of touching the screen. Samsung’s Galaxy S4 takes it a step further with its Smart Scroll feature. Smart Scroll uses your phone’s camera to detect head and eye movement, allowing you to interact with content on your phone. Even the accelerometer in your phone, which enables it to auto-rotate the screen, is a type of no-touch interface.
As these and other technologies improve, I think that we will continue to interact with our phones and computers in ways previously relegated to science fiction books and movies. The future is closer than ever.