Top boffin wants computer interface to be more natural

A top boffin is working out a way to make computers understand inputs without a keyboard or mouse.

According to Ground Report, Binghamton University’s Lijun Yin wants to find a more comfortable, intuitive and intelligent way to use the computer. He thinks it should work as if you are talking to a mate.

Yin’s team has worked out ways to provide information to the computer based on where a user is looking as well as through gestures or speech.

His next plan is to get the computer to recognise a user’s emotional state. He has settled on six basic emotions (anger, disgust, fear, joy, sadness and surprise) and is working out how to get the computer to tell them apart.
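To give a flavour of the sorting job involved, here is a toy sketch in Python of a classifier telling those six emotions apart. The two facial features, their numbers and the nearest-centroid approach are all invented for illustration and are not Yin’s actual method.

    import math

    EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

    # Hypothetical per-emotion centroids over two made-up features,
    # (brow_lowering, mouth_corner_raise), each normalised to 0..1.
    CENTROIDS = {
        "anger":    (0.9, 0.1),
        "disgust":  (0.7, 0.2),
        "fear":     (0.6, 0.3),
        "joy":      (0.1, 0.9),
        "sadness":  (0.4, 0.1),
        "surprise": (0.2, 0.5),
    }

    def classify(features):
        # Pick the emotion whose centroid is nearest to the feature vector.
        return min(EMOTIONS, key=lambda e: math.dist(features, CENTROIDS[e]))

    print(classify((0.15, 0.85)))  # prints "joy"

The real problem is of course far harder: features have to be extracted from video of an actual face, and expressions vary wildly from person to person, which is exactly why Yin needs a large database of examples.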

He is working with Binghamton University psychologist Peter Gerhardstein to explore ways this work could benefit children with autism. People with autism, like computers, have difficulty interpreting others’ emotions; therapists sometimes use photographs of people to teach children how to tell when someone is happy or sad.

Yin and Gerhardstein’s previous collaboration led to the creation of a 3D facial expression database, which includes 100 subjects with 2,500 facial expression models.

Armed with this information, and adding in artificial intelligence, he thinks it is possible to create a virtual-person model.

Yin thinks it is possible to have a computer understand how you feel.

Although, given the way my computer is responding at the moment, it might be better if it did not know what I am thinking of doing to it.
