In the mood to face the computer?
If you've ever found yourself pulling faces at your computer, becoming ever more frustrated as it fails to understand what you want it to do, then help from computer science researchers may be on its way. We use facial expressions to signal to others something about how we feel inside. It's one of the mechanisms we have evolved to let us live together in groups. Scientists are working on systems that teach a computer to recognise these facial expressions, and so get an idea of the moods or emotions those expressions might represent.
A face is a very complex moving image. It's very bendy, but as we grow up we learn how to interpret its movements and turn them into useful measures of facial expression. We can now also teach a computer to recognise how key parts of the face change as time progresses - changes known as 'spatio-temporal features'. By showing the computer a variety of expressive faces, and using some clever algorithms, we can train it to tell the difference between, for example, a smile and a scowl. The results can be used to help develop 'affective computing': computers that are aware of, respond to, and even change our emotional state. For example, if you run into a problem while working with a particular piece of software, and start to look frustrated or angry, the computer will be able to recognise that emotion in your face and automatically produce the right set of useful notes to help you. Such sensitive emotion-recognition software may appear on computers within the next few years.
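To give a flavour of how the training step might work, here is a minimal sketch in Python using the scikit-learn library. The data is made up purely for illustration: each 'clip' is assumed to be summarised as a fixed-length vector of spatio-temporal features (how the mouth corners, eyebrows and so on move between frames), and a support vector machine stands in for the 'clever algorithms' - real systems use richer features and models.

```python
# A minimal sketch of training an expression classifier on spatio-temporal
# features. All of the data here is invented for illustration.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Hypothetical data: 200 clips, each summarised by 64 numbers measuring how
# key points on the face move between frames.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 64))
labels = rng.integers(0, 2, size=200)   # 0 = smile, 1 = scowl (toy labels)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

# A support vector machine learns a boundary between the expression classes
# from the example faces it is shown.
classifier = SVC(kernel="rbf")
classifier.fit(X_train, y_train)

print("accuracy on unseen faces:", classifier.score(X_test, y_test))
```

With real expression data in place of the random numbers, the same pattern - collect labelled examples, extract spatio-temporal features, train a classifier - is what lets the computer tell a smile from a scowl.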
Our plastic pal who is fun to be with?
But there are other applications for emotion-recognition software, for example in the development of 'synthetic companions' for the elderly. Robots can play a valuable role in looking after the elderly or infirm - reminding them to take their medication, or to lock the front door, or even perhaps acting as a kind of pet you start to care for. But in order to build a relationship with an individual, to empathise with them, you need two-way communication. Expression-recognition software would let the robot recognise a person's emotions and mood and respond appropriately - including reflecting the appropriate expression on its own face. You don't want a 'friend' who just smirks whenever you are unhappy! Getting the emotions right is important in making the relationship between the individual and their robot carer much more socially realistic.
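As a toy illustration of the 'respond appropriately' step, the sketch below maps a recognised emotion label to a matching expression and action for the companion robot. The labels and responses are invented for the example, not taken from any real system.

```python
# A toy sketch: once an emotion has been recognised, the companion picks a
# matching expression and action. Labels and responses are illustrative only.

RESPONSES = {
    "happy":      ("smile", "share the good mood"),
    "sad":        ("concerned face", "offer comfort and speak softly"),
    "frustrated": ("calm face", "offer help with the current task"),
}

def respond(detected_emotion: str) -> None:
    expression, action = RESPONSES.get(
        detected_emotion, ("neutral face", "keep watching and listening"))
    print(f"Robot shows a {expression} and will {action}.")

respond("sad")   # Robot shows a concerned face and will offer comfort...
```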
It's going to be important in the future that humans and robots 'talk the same language', even if no words are spoken. Robots need emotions too!