Touch Screen Interaction and Essential Tremor

Recently, Rebecca posted on the topic of Accessibility and the Smart Phone, discussing how “touch and swipe” technology on smartphones affects how people with essential tremor (ET) use such devices.

Today, I saw an article about technology in development at Qeexo, called FingerSense and TapSense, that may or may not make smartphones easier to use, depending on the severity of a person’s ET. Researcher Chris Harrison developed software that expands smartphone interaction beyond simple touch to also recognize different kinds of taps. FingerSense technology allows a screen to detect how the finger is being used for input: fingertip, knuckle, or nail.

[Video: FingerSense Overview | Qeexo.com, from Qeexo on Vimeo]

If this technology is incorporated into devices, this more refined level of touch input may present a challenge for people with ET.

According to an article in Fast Company, Harrison’s team is in talks with Android handset manufacturers to integrate FingerSense into their phones. FingerSense requires an extra bit of hardware in order to work: an acoustic sensor that can recognize the unique vibration patterns that distinguish among fingertip, fingernail, and knuckle taps. That means you can’t just download FingerSense from Google Play and magically give your Galaxy Nexus a next-generation user interface, at least not yet.

“We are looking to partner with device makers to integrate this sensor, which our software needs,” Harrison explains.
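
To make the idea concrete, here is a minimal, purely illustrative sketch of how tap-type detection could work in principle. This is not Qeexo’s actual FingerSense code or API; the feature extraction, thresholds, and the `TapType` names are assumptions invented for the example, which simply shows a tap’s vibration signal being reduced to a couple of features and mapped to fingertip, knuckle, or nail.

```kotlin
// Illustrative sketch only: NOT Qeexo's FingerSense code or API.
// General idea: an acoustic sensor captures the vibration of a tap, simple
// features are extracted from that signal, and the tap is classified as a
// fingertip, knuckle, or nail so the app can react differently to each.

enum class TapType { FINGERTIP, KNUCKLE, NAIL }

// A toy feature pair: overall signal energy and how sharply the tap decays.
// A real system would use richer spectral features and a trained model.
data class TapFeatures(val energy: Double, val decayRate: Double)

fun extractFeatures(samples: DoubleArray): TapFeatures {
    val energy = samples.sumOf { it * it } / samples.size
    // Compare energy in the first and second halves to estimate decay.
    val half = samples.size / 2
    val early = samples.take(half).sumOf { it * it }
    val late = samples.drop(half).sumOf { it * it }
    val decayRate = if (late > 0.0) early / late else Double.MAX_VALUE
    return TapFeatures(energy, decayRate)
}

// Hypothetical thresholds chosen only to make the example run; a real
// classifier would be trained on labeled tap recordings.
fun classifyTap(features: TapFeatures): TapType = when {
    features.energy > 0.5 && features.decayRate > 4.0 -> TapType.KNUCKLE // hard, short impact
    features.decayRate > 4.0 -> TapType.NAIL                             // light, sharp impact
    else -> TapType.FINGERTIP                                            // softer, longer contact
}

fun main() {
    // A fake, decaying vibration signal standing in for real sensor data.
    val samples = DoubleArray(64) { i -> Math.exp(-i / 8.0) * Math.sin(i.toDouble()) }
    val tap = classifyTap(extractFeatures(samples))
    println("Detected tap type: $tap") // e.g. a knuckle tap could open a context menu
}
```

The real product would rely on a dedicated acoustic sensor and a trained model rather than hand-picked thresholds, which is why, as the article notes, the software needs hardware support from device makers.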

FingerSense’s two-handed touch screen input gestures seem much more useful for tablets, where two-handed interaction is more likely and practical, especially for people with ET.