A smart computer system able to read deaf sign language has been developed by a PhD student at the University of NSW.
“With the help of samples provided by deaf people, we have been able to develop a system that recognises 95 signs from Australian Sign Language (Auslan) with about 98 per cent accuracy,” says PhD student Waleed Kadous.
Mr Kadous will describe his research at a ScienceNOW! media conference at the Melbourne Museum today (August 20).
Mr Kadous says deaf signers wear special gloves that let the computer know what their hands are doing.
Developing a computer system that learnt to classify the different Auslan signs required finding patterns such as ‘the hand was moving up’.
“The information coming from those gloves is updated 200 times a second, so the characteristics you would look at are not as obvious as you would think,” he says.
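The release does not describe the algorithm itself, but the general idea of turning a 200-samples-per-second glove stream into simple motion patterns (such as “the hand was moving up”) and then classifying them can be sketched roughly as follows. This is a hypothetical Python illustration, not the actual research code: the toy gestures, the feature choices and the nearest-centroid classifier are all assumptions.

```python
# Minimal sketch (assumed, not Mr Kadous's method): classify hand-motion
# streams by extracting a few simple temporal features from glove samples
# arriving at 200 readings per second.
import numpy as np

SAMPLE_RATE = 200  # glove readings per second, as described in the article


def extract_features(stream):
    """stream: array of shape (n_samples, 3) of hypothetical x, y, z hand positions."""
    duration = len(stream) / SAMPLE_RATE
    net_rise = stream[-1, 1] - stream[0, 1]          # how far the hand ended up above where it started
    return np.array([
        net_rise / duration,                         # average upward velocity ("the hand was moving up")
        stream[:, 1].max() - stream[:, 1].min(),     # vertical extent of the gesture
        stream[:, 0].max() - stream[:, 0].min(),     # sideways extent of the gesture
    ])


# Toy "training" gestures: an upward sweep and a side-to-side wave.
t = np.linspace(0.0, 1.0, SAMPLE_RATE)
up_sign = np.column_stack([np.zeros_like(t), t, np.zeros_like(t)])
wave_sign = np.column_stack([np.sin(6 * np.pi * t), np.zeros_like(t), np.zeros_like(t)])

centroids = {"up": extract_features(up_sign), "wave": extract_features(wave_sign)}


def classify(stream):
    """Nearest-centroid classification on the extracted features."""
    feats = extract_features(stream)
    return min(centroids, key=lambda label: np.linalg.norm(feats - centroids[label]))


# A noisy upward gesture should still be recognised as "up".
noisy_up = up_sign + np.random.default_rng(0).normal(0.0, 0.02, up_sign.shape)
print(classify(noisy_up))  # -> "up"
```

In practice the point made in the article still applies: with raw readings arriving 200 times a second, useful features are less obvious than in this toy example, and a real system would learn which patterns matter from the samples provided by deaf signers rather than hand-coding them.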
“It turns out that very similar techniques can be used to classify patients’ ECGs with about 72 per cent accuracy, about the same as human accuracy.”
According to Mr Kadous, computers are essentially limited because they are not able to learn – they are very fast but not very “smart”.
“Today’s desktop machines can easily add together 240 billion numbers a minute, but can’t do the things a four-year-old does every day,” he says. “In our field of machine learning we look at how to make computers learn from experience.”
He is one of sixteen young scientists presenting their discoveries to the media, public and students for the first time at Fresh Science.
“We’ve selected them from 105 national nominations, brought them to Melbourne, trained them and thrown them to the [media] lions,” said Niall Byrne, Chairman of Fresh Science. “It’s all about focussing public and media attention on Australian scientific achievement.”