Friday, April 4, 2014

App Helps the Blind "See" With Their Ears: New Technology Allows People to Perceive Color and Shape Through Music

The EyeMusic SSD app (shown on an iPhone screenshot above) translates visual objects into musical tones. PHOTOGRAPHS BY DR. AMIR AMEDI LAB, HUJI, EYEMUSIC
by Roni Jacobson | National Geographic | Apr 3, 2014
A woman who has been blind since birth sits at a table with a bowl of mostly green apples in front of her. When asked to find the single red one, she plucks it from the bowl without hesitation and holds it up to applause from the audience.
It's not a magic act but a demonstration of a new app that enables the visually impaired to hear information usually perceived through sight. The woman is wearing headphones and a miniature camera attached to a pair of glasses, which are connected to a laptop on the table. A series of musical cues—which combine into a pleasant tune—let her know the color, shape, and location of the fruit.
The app, called EyeMusic SSD (for sensory substitution device), uses a computer algorithm to construct a "soundscape" that conveys visual information through musical notes. With training to extract meaning from patterns of notes, people who have been blind since birth can learn to read letters and numbers, tell what they are looking at from far away, and even recognize facial expressions and body postures, says lead developer Amir Amedi, a neuroscientist and head of the multisensory research lab at the Hebrew University of Jerusalem.
When someone is smiling, for example, a visually impaired person would hear a string of high notes descending and then ascending again (sort of a U-shaped curve, like a smile). Frowning would be the opposite—low notes ascending and then descending.
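As a toy illustration of that mapping (the note choices below are mine, not EyeMusic's actual output), a smile could be rendered as a run of high notes that dips and recovers, and a frown as its low-register mirror image:

```python
# Hypothetical note sequences for the two facial cues described above.
SMILE = ["C6", "A5", "F5", "E5", "F5", "A5", "C6"]  # high notes: descend, then ascend
FROWN = ["C4", "E4", "G4", "A4", "G4", "E4", "C4"]  # low notes: ascend, then descend

print("smile:", " ".join(SMILE))
print("frown:", " ".join(FROWN))
```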
Although the idea for tools like EyeMusic has been around since the 1960s, earlier sensory substitution systems typically required a computer and were unpleasant to use for long periods of time, Amedi says. He and his team recently developed sensory substitution software that can run on a smartphone, minimizing the need for a lot of clunky equipment and making the technology easier to use.
After downloading EyeMusic, users simply plug in their headphones and hold their phone up to the scene in front of them to start listening to their surroundings. The phone's camera scans the environment every two seconds, and EyeMusic translates the captured image into sound, pixel by pixel.
"It's a little bit like how old TVs worked," Amedi says. Each pixel contains multiple auditory cues corresponding to different bits of visual information, which are then assembled into a musical composition representing the entire picture.
Seeing Through Sound: How It Works
Notes played earlier in the sequence correspond to things located toward the left of the scene, and notes played later correspond to things toward the right. Height is indicated by pitch, with higher notes signaling that something is toward the top of the scene and lower notes signaling that it is toward the bottom. A horizontal line, therefore, sounds like the steady tone of a flat line on an EEG, and a diagonal line running from the top left to the bottom right would sound like a descending scale.
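Under similar assumptions (an invented 24-row grid, semitone spacing, an arbitrary base pitch), the height-to-pitch rule could be sketched like this; EyeMusic's real scale may well differ:

```python
# Rows near the top of the frame map to high notes, rows near the bottom
# to low ones. The tuning here is illustrative only.
N_ROWS = 24          # assumed grid height
BASE_FREQ = 220.0    # arbitrary frequency for the bottom row (A3)

def row_to_freq(row: int) -> float:
    """Row 0 is the top of the image, so it gets the highest pitch."""
    semitones_up = (N_ROWS - 1) - row
    return BASE_FREQ * 2 ** (semitones_up / 12)

# A diagonal from top left to bottom right: as the sweep moves right, the
# lit row index grows, so the pitch steps downward -- the descending scale
# described above.
for x in range(6):
    print(f"column {x}: row {x} -> {row_to_freq(x):.1f} Hz")
```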
Color is conveyed through different types of instruments. White, for example, is signified by human voices, blue by a trumpet, yellow by a violin, and red by reggae chords played on an organ. Black is represented by silence.
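The article's palette reduces naturally to a lookup table. The entries below follow the mapping just described, while the representation (strings naming synth voices) is purely illustrative:

```python
# Color-to-instrument mapping as described in the article; None stands for
# black, which is rendered as silence rather than an instrument.
COLOR_TO_INSTRUMENT = {
    "white": "human voice",
    "blue": "trumpet",
    "yellow": "violin",
    "red": "reggae-style organ chords",
    "black": None,
}

def instrument_for(color: str):
    """Return the synth voice for a color, or None for silence."""
    return COLOR_TO_INSTRUMENT.get(color)

print(instrument_for("yellow"))  # -> "violin", e.g. for a yellow taxi
```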
So looking at a busy street, a person using EyeMusic might hear a yellow taxi as a violin sequence that grows louder as the vehicle approaches, and a crosswalk as two low, even tones played in unison.
"The general concept is that you don't need to teach each object individually, you teach the principles—just like the brain understands the principles of dots and lines and how to combine them," Amedi says.
The program takes about 70 hours to master. Training sessions teach users how to identify various broad categories of objects, such as faces, bodies, and landscapes, which are each processed in a different area of the visual cortex in the brain.
Each category has its own set of unique characteristics, Amedi says. For instance, a musical scale that goes up and down linearly indicates buildings and houses, which are made up of straight lines and 90-degree angles; the vertical lines of buildings can also be signaled by short bursts of several notes played at the same time. For faces, which are rounder and have softer contours, the musical scale descends and ascends exponentially.
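One way to picture the difference is to contrast the two pitch contours as functions of scan position. The shapes below are my own guesses at "linear up-and-down" versus "exponential descend-and-ascend", not EyeMusic's actual curves:

```python
import math

def building_contour(step: int, n_steps: int = 8) -> float:
    """Relative pitch for a building: rises and falls linearly, peaking mid-scan."""
    half = n_steps / 2
    return 1.0 - abs(step - half) / half

def face_contour(step: int, n_steps: int = 8) -> float:
    """Relative pitch for a face: a smooth exponential dip and recovery."""
    half = n_steps / 2
    return 1.0 - math.exp(-((step - half) ** 2) / half ** 2)

for s in range(9):
    print(f"step {s}: building={building_contour(s):.2f}  face={face_contour(s):.2f}")
```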
Blind Since Birth?
Brain imaging studies by Amedi and his team show that when people who have been blind since birth use EyeMusic to "see," they activate the same category-dependent processing areas of the brain as sighted people.
Body language and posture, for example, are normally processed in a part of the visual system known as the extrastriate body area. Even though they may never have actually seen a body, blind people show activation in this brain area when looking at people with EyeMusic. Instead of traveling through the visual cortex to get there, however, the signal enters the brain through the auditory cortex and is then diverted to the proper spot.
"Everyone thinks that the brain organizes according to the senses, but our research suggests this is not the case," Amedi says. The brain is more flexible than we realize, he adds—we just have to find the alternate routes to tap into areas previously blocked.
http://news.nationalgeographic.com/news/2014/04/140403-eyemusic-ssd-visual-impairment-software-science/
