MIT News recently published a summary of research by Lynette Jones, a senior research scientist in MIT’s Department of Mechanical Engineering. Her work has a number of applications for the blind and deaf, but it could also one day allow the able-bodied to read digital information on tactile displays through their skin rather than their eyes. Her current aim is to determine the best placement of tactile displays on the human body, and her findings will help researchers design more effective and efficient devices built from vibrating motors like those used in mobile phones.
Jones placed an array of vibrating motors on participants’ palms, forearms and thighs, with some useful results. She found the palm to be more sensitive than the forearm or thigh, but on all three body parts participants identified the four corner motors of the array more accurately than the inner ones. Jones suggests that people respond better to vibrations and other stimuli at the edges of their limbs than elsewhere. She also found that placement affected how the motors behaved: the forearm and thigh damped the motors’ vibrations more than the palm did.
Jones’ findings will likely contribute to guidelines for the optimal design of tactile displays. One of the main applications mentioned in the article, and perhaps the simplest, is helping drivers navigate without taking their eyes off the road. Paired with GPS, such a display could also help tourists look less conspicuous or help emergency workers traverse disaster areas. The MIT News article is the latest insight for a trend alert I wrote a few months ago, Electronic Sensory Expansion.
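To make the driver-navigation idea concrete, here is a minimal sketch of how navigation cues might be mapped onto a worn motor array. Everything here is hypothetical: the 3×3 grid, the cue names and the intensity scale are my own illustration, not an interface from Jones’ research. The one design choice taken from her findings is that cues are assigned only to the corner motors, which participants localized most reliably.

```python
# Hypothetical sketch: mapping navigation cues to a 3x3 vibration-motor
# array. Motor indices are row-major:
#   0 1 2
#   3 4 5
#   6 7 8
# Per Jones' findings, only the easily localized corner motors (0, 2, 6, 8)
# carry cues; the harder-to-localize inner motors stay idle.
CORNERS = {
    "turn_left": 0,     # top-left corner
    "turn_right": 2,    # top-right corner
    "u_turn": 6,        # bottom-left corner
    "destination": 8,   # bottom-right corner
}

def cue_to_pattern(cue: str, grid_size: int = 9) -> list[int]:
    """Return per-motor drive intensities (0-255) for a navigation cue."""
    motor = CORNERS.get(cue)
    if motor is None:
        raise ValueError(f"unknown cue: {cue}")
    pattern = [0] * grid_size
    pattern[motor] = 255  # drive a single corner motor at full intensity
    return pattern

print(cue_to_pattern("turn_left"))
```

A real device would also need to compensate for the damping Jones measured on the forearm and thigh, for example by driving motors at those sites harder than on the palm.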