A magazine where the digital world meets the real world.
Making faces
How do you get a robot to smile? Put it in front of a mirror. Researchers at the University of California, San Diego, have made a robot that can teach itself facial expressions. Made up of 31 artificial muscles nestled underneath a slightly creepy Einstein mask, their robot has so far used machine learning to teach itself to smile, frown, look surprised and look angry.
The robot watched itself in the mirror as it moved its facial muscles in essentially random ways. Researchers call this ‘body babbling’ and it may be how babies learn to imitate the people around them. The idea is that when the baby randomly makes a ‘smiley’ expression, it gets positive feedback from the grownups around it who think it’s cute. That helps it learn that a smile is a positive expression. A frown, on the other hand, might prompt the grownups to start trying to find out what’s wrong. The robot didn’t have any parents around to give it feedback, though, so instead it had a program that could recognise human facial expressions. Whenever it arranged its artificial features in a way that looked like a real expression, the program gave the robot a reward signal. After that, the robot was more likely to make expressions that would bring it the reward.
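The loop described above, random babbling plus a reward signal for recognisable expressions, can be pictured as a toy simulation. Everything here is invented for illustration: the 'smile' target pattern, the reward function and the one-muscle-at-a-time babbling are stand-ins for the real robot's camera-based expression recogniser and its far more sophisticated learning method.

```python
import random

NUM_MUSCLES = 31  # the real robot has 31 artificial muscles

# Hypothetical target: pretend the recogniser rewards muscle
# activations that are close to this 'smile' pattern.
SMILE = [random.random() for _ in range(NUM_MUSCLES)]

def reward(activation):
    """Higher (less negative) reward the closer the face is to a smile."""
    error = sum((a - s) ** 2 for a, s in zip(activation, SMILE))
    return -error

def babble(activation, step=0.1):
    """'Body babbling': randomly nudge one muscle, keeping it in [0, 1]."""
    trial = list(activation)
    i = random.randrange(NUM_MUSCLES)
    trial[i] = min(1.0, max(0.0, trial[i] + random.uniform(-step, step)))
    return trial

# Start from a random face; keep any babble the recogniser rewards more.
best = [random.random() for _ in range(NUM_MUSCLES)]
initial_reward = reward(best)
for _ in range(5000):
    trial = babble(best)
    if reward(trial) > reward(best):
        best = trial

print("starting error:", round(-initial_reward, 3))
print("error after learning:", round(-reward(best), 3))
```

Because the robot only keeps changes that earn a bigger reward, its expression drifts steadily towards the smile, which is the same idea as the baby keeping the faces that make the grownups coo.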
The researchers think that figuring out how a robot could learn facial expressions might tell them more about how babies do it. There are already studies that involve watching babies learn expressions, but the San Diego group compare their study to actually having a baby.
Their next step is to try to get the robot to socialise with humans and mimic their expressions in real time. Once it can do that, the researchers think they could be on the road to using it as an automatic tutor for students. One-on-one tutoring helps students do better, and a robot that can make appropriate facial expressions would come much closer to matching a human tutor.
Machine learning has also been used to make a robot called Blade respond to human speech with appropriate facial expressions, and in Japan a robot called CB2 is learning to read humans' facial expressions. It looks like we could be seeing a lot more expressive robot faces in the future!
Source: Wired Science