A magazine where the digital world meets the real world.
Emotional Glasses
by Paul Curzon, Queen Mary University of London
It's fun to add emoticons to messages, and they help ensure people understand our feelings. They are helping some people understand feelings face-to-face too, with a bit of help from an Artificial Intelligence.
Reading faces
We take it for granted that we can look at someone's face and tell whether they are happy or sad, angry or surprised. Autistic children, however, often struggle to understand people's expressions. We all tend to avoid eye contact when we are anxious; some autistic children avoid it all the time. That makes them even less likely to see the clues in people's faces, and so less likely to learn to understand emotions. This can then make it harder to make friends.
From robots to glasses
Many hi-tech ways have been tried to help autistic children learn about emotions. One, for example, involves letting them play with robot 'friends', as some find the cartoon-like expressions on a robot face more comfortable and easier to follow. A different approach is based on wearable technology. Researchers at Stanford University have created a program for autistic children that works out people's expressions and displays a matching emoticon in a pair of smart glasses.
An AI reading faces for you
A camera in the glasses records what the wearer sees and the Artificial Intelligence (AI) program detects any faces. This kind of technology is also used in smartphones to detect faces in your photo collection. It uses 'machine learning': the program learns what a face is by being shown lots of images, some with and some without faces. The program uses all that data to work out the patterns in an image that mean there is a face. It then uses that pattern to spot new faces.
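The idea of learning patterns from labelled examples can be sketched in a few lines of Python. This is a toy illustration only, not how a real face detector works (those use far richer features and vastly more data): here each "image" is just a short list of made-up numbers, and the program classifies a new one by which labelled average it sits closest to.

```python
# Toy sketch of learning from labelled examples (NOT a real face detector):
# each "image" is a tiny list of made-up numbers standing in for features.
# Training averages the examples for each label; classifying picks the
# label whose average is nearest (a simple "nearest centroid" rule).

def average(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Squared straight-line distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(examples):
    """examples: list of (features, label) pairs -> label-to-centroid map."""
    by_label = {}
    for features, label in examples:
        by_label.setdefault(label, []).append(features)
    return {label: average(vs) for label, vs in by_label.items()}

def classify(model, features):
    """Pick the label whose training average is nearest the new image."""
    return min(model, key=lambda label: distance(model[label], features))

# Made-up training data: high values stand for "face", low for "no face".
training = [
    ([0.9, 0.8, 0.9], "face"),
    ([0.8, 0.9, 0.7], "face"),
    ([0.1, 0.2, 0.1], "no face"),
    ([0.2, 0.1, 0.3], "no face"),
]
model = train(training)
print(classify(model, [0.85, 0.75, 0.9]))  # near the "face" examples
```

The point of the sketch is that nobody writes rules like "a face has two eyes" by hand: the program works the patterns out from the labelled data itself.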
In a similar way it can be trained on faces with different expressions. A training set of faces is used, each labelled with the emotion shown in that image. This allows the program to work out which patterns make a happy face, which make a sad face, and so on. Having recognised an expression, the glasses finally act as a screen and show an emoticon, such as a smiley, corresponding to that expression. Superimposing digital images on the real world like this is called augmented reality. It makes looking at faces like a game and means that the child can use the emoticon to understand what the person in front of them is feeling. It also means they can start to learn for themselves - almost like the AI! The AI is labelling the faces for them, just as people once labelled faces for it. With the glasses, autistic children can be sure what each face is actually saying rather than having to guess. Eventually they might then form their own rules and so be able to do it on their own.
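The whole pipeline - recognise the expression, then show the matching emoticon - can be sketched the same way. Everything here is invented for illustration (the feature numbers, the labels, the function names); it just shows the shape of the idea, not Stanford's actual system.

```python
# Toy sketch of the glasses' pipeline (all names and numbers made up):
# classify an expression from simple pretend features, then look up the
# emoticon to show on the glasses' display.

EMOTICONS = {"happy": ":-)", "sad": ":-(", "surprised": ":-o"}

# Pretend labelled training faces: each pair of numbers might stand for
# things like mouth curve and eyebrow height.
TRAINING = {
    "happy":     [[0.9, 0.5], [0.8, 0.6]],
    "sad":       [[0.1, 0.4], [0.2, 0.3]],
    "surprised": [[0.5, 0.9], [0.6, 1.0]],
}

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

# One averaged example per emotion, learned from the labelled data.
MODEL = {emotion: centroid(vs) for emotion, vs in TRAINING.items()}

def recognise_expression(features):
    """Return the emotion whose training average is nearest."""
    def dist(c):
        return sum((x - y) ** 2 for x, y in zip(c, features))
    return min(MODEL, key=lambda emotion: dist(MODEL[emotion]))

def emoticon_for(features):
    """Full pipeline: face features -> emotion -> emoticon to display."""
    return EMOTICONS[recognise_expression(features)]

print(emoticon_for([0.85, 0.55]))  # near the "happy" examples
```

Notice the final step is just a lookup table: once the hard part (recognising the expression) is done, turning it into an emoticon for the display is easy.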
Making a difference
The Stanford system was trialled with autistic children in their own homes. They used it for several months, and their parents found it made a clear difference. By the end many of the children were engaging much more with their families, including making a lot more eye contact.
Emoticons are making a real difference to their lives.