CSUN prof makes touchscreens accessible to those without use of their arms

Vehicular accidents, electric shock, injuries in war and birth defects can result in temporary or permanent paralysis of the upper limbs, leaving millions of people worldwide with limited use of their hands and unable to interact with computers and other input devices designed for hand use.

With a simple swipe of the tongue, CSUN computer science professor Li Liu and his students hope to open the doors to the touchscreen world of smartphones and tablets for those with upper-body mobility challenges.

Some assistive technologies currently on the market can respond to eye movement, but they are limited in the range of tasks they can perform and lack precision in their responses.

“It was during a CSUN Conference (CSUN’s annual Assistive Technology Conference) a couple of years ago when I really got inspired to look for a more efficient and effective way for people who cannot use their arms to interact with their computers,” Liu said.

By tapping into the cameras embedded in laptop computers, Liu and a collaborator from Virginia Tech, Shuo Niu, developed a program that uses image processing on the camera feed to zero in on the motions of a person's tongue and translate them into action, whether that means moving the cursor on the screen or typing text in different software applications on a desktop computer.
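
The article does not detail Liu and Niu's tracking method, but the general pipeline it describes can be illustrated with a short sketch: grab frames from the built-in camera, estimate where the moving region of interest (in the real system, the tongue) is, and turn that displacement into cursor movement. The Python snippet below is a crude stand-in that uses OpenCV frame differencing and pyautogui; the detector, the GAIN constant and the motion threshold are illustrative assumptions, not the published system.

```python
# Crude illustration of the camera-to-cursor pipeline, NOT the published
# CSUN / Virginia Tech tracker: frame differencing stands in for the real
# tongue detector, and GAIN / the threshold value are arbitrary choices.
import cv2          # pip install opencv-python
import pyautogui    # pip install pyautogui

GAIN = 0.5          # screen pixels of cursor motion per pixel of image motion

cap = cv2.VideoCapture(0)                    # built-in laptop camera
ok, prev = cap.read()
if not ok:
    raise RuntimeError("could not read from the camera")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
prev_center = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Stand-in for tongue detection: threshold the difference between
    # consecutive frames and keep the largest moving blob.
    diff = cv2.absdiff(gray, prev_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    prev_gray = gray

    if contours:
        blob = max(contours, key=cv2.contourArea)
        m = cv2.moments(blob)
        if m["m00"] > 0:
            center = (m["m10"] / m["m00"], m["m01"] / m["m00"])
            if prev_center is not None:
                # Translate the blob's displacement into cursor movement.
                pyautogui.moveRel((center[0] - prev_center[0]) * GAIN,
                                  (center[1] - prev_center[1]) * GAIN)
            prev_center = center

    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):    # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```

A real tracker would swap the frame-differencing step for a trained tongue detector and smooth the motion estimate so camera noise does not jitter the cursor.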

As assistive technology transitions to mobile interfaces, Liu is continuing to develop his tongue-computer interface for mobile platforms. Liu and his recent graduate student Garret Richardson created a system-level service with TensorFlow Lite, an open-source deep-learning framework from Google, and added the service to the Android ecosystem.
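
The article does not spell out how the Android service invokes the model, but the core TensorFlow Lite call pattern is the same across platforms: load a .tflite model, feed it a preprocessed camera frame, and read back gesture scores. The sketch below shows that pattern in Python; the model file name tongue_gesture.tflite and the gesture label list are hypothetical, and on Android the equivalent calls would go through the TensorFlow Lite Interpreter API from inside a background Service.

```python
# Minimal sketch of running a gesture classifier with TensorFlow Lite.
# The model file and label list are hypothetical stand-ins, not the
# actual CSUN model.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="tongue_gesture.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# One camera frame, resized to the model's expected input shape
# (a zero array is used here as a placeholder).
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])[0]

LABELS = ["left", "right", "up", "down", "tap"]   # hypothetical gesture set
print("predicted gesture:", LABELS[int(np.argmax(scores))])
```

Packaging this as a system-level Android service means other apps would not need to embed the model themselves; they could simply receive the recognized gestures as input events.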
