
'Face' phone helps hearing impaired

From: CNN International - USA - Mar 17, 2005

By Julie Clothier for CNN
Thursday, March 17, 2005

LONDON, England (CNN) -- Telephone conversations are difficult if you are hearing-impaired, but a group of scientists has created technology that makes things easier.

Using automatic speech recognition technology, the Synface software -- short for synthetic face -- displays an animated head "speaking" the words being said over the telephone.

The software "listens" to what is being said then displays it in real-time in a "virtual face" on a laptop screen.

The initiative is a joint effort between University College London (UCL) and research groups in The Netherlands, Sweden and the UK, including the charity the Royal National Institute for Deaf People (RNID).

Research into the concept began three and a half years ago.

RNID head of product development Neil Thomas told CNN the software enabled the listener to lip-read what was being said, just as they would in face-to-face conversation.

"Most people, particularly those who are hard of hearing, lip-read to communicate. When you're on the telephone this becomes difficult because you can't see the person who is speaking to you."

Prototypes of the software are currently in field trials in the UK, Sweden and the Netherlands. RNID is overseeing trials in the UK, and Thomas said results showed 100 percent support for the concept.

He said those who have tested it found that the technology made them more confident about making phone calls.

"This technology helps confirm what they thought they were hearing. When a person loses their hearing one of the things that suffers is their confidence in making telephone calls."

He said some development was still needed, including improving the level of speech recognition, before it would be suitable for everyday use.

He said he did not know how much it would cost when it became commercially available, but that those involved in creating it were keen to keep the cost down.

There is a delay of 200 milliseconds between the person on the other end of the phone speaking and the receiver hearing the words.

This gives the software time to "listen" and display the face on the screen, though the delay is not noticeable and does not interfere with the flow of conversation, Thomas said.
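How that 200-millisecond buffer is arranged is not described in the article. The sketch below is a minimal, assumed illustration of the idea in Python: a fixed-length queue delays each audio sample by 200 ms before it is heard, giving the recognizer and the animated face a head start. The 8 kHz sample rate and the function names are assumptions for illustration, not details from Synface.

# Illustrative sketch of a fixed playback delay; an assumption about how a
# ~200 ms buffer could be arranged, not Synface's actual audio path.
from collections import deque

SAMPLE_RATE = 8000            # typical telephone audio, samples per second
DELAY_MS = 200                # delay quoted in the article
DELAY_SAMPLES = SAMPLE_RATE * DELAY_MS // 1000

def delayed_playback(incoming_samples):
    """Yield audio delayed by ~200 ms, so the recognizer and the animated
    face see each sample before the listener hears it."""
    buffer = deque([0] * DELAY_SAMPLES)   # start with 200 ms of silence
    for sample in incoming_samples:
        buffer.append(sample)             # newest sample enters the buffer
        yield buffer.popleft()            # oldest (200 ms old) sample is played

if __name__ == "__main__":
    # 400 ms of fake audio: the first 200 ms of output is the silent padding.
    fake_audio = [1] * (SAMPLE_RATE * 400 // 1000)
    out = list(delayed_playback(fake_audio))
    print(out[:5], "...", out[DELAY_SAMPLES:DELAY_SAMPLES + 5])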

Similar technology already on the market includes video telephony, which requires both telephone users to have the technology, whereas Synface requires only the receiver to have the software, Thomas said.

The software runs on a laptop, which can then be hooked up to any standard telephone.

Thomas said it was intended for hard-of-hearing people who are helped by lip-reading. It would not be suitable for people who are profoundly deaf, he said.

The Synface system could also be used in public voice information channels in noisy environments, including airports.

Dr Andrew Faulkner, of UCL's Department of Phonetics & Linguistics, who has helped develop the software, said: "For most people the telephone is an essential part of our lives, in both social and workplace settings.

"For people with a severe hearing loss, however, the phone is virtually useless, putting them at a severe disadvantage. This project seeks to address this imbalance, by using the principles of visual information in speech communication."

© 2005 Cable News Network LP, LLLP. A Time Warner Company. All Rights Reserved.