Tuesday, July 7, 2009
Robot Learns to Smile

A Calit2 photographer took great photos of the Einstein robot earlier this year:
http://www.flickr.com/photos/calit2/sets/72157614016199411/
The photo credit for any of these robot images on Flickr is: UC San Diego / Erik Jepsen.
A hyper-realistic Einstein robot at the University of California, San Diego learned to smile and make facial expressions through a process of self-guided learning. The UC San Diego researchers used machine learning to “empower” their robot to learn to make realistic facial expressions.
“As far as we know, no other research group has used machine learning to teach a robot to make realistic facial expressions,” said Tingfan Wu, the computer science PhD student from the UC San Diego Jacobs School of Engineering who presented this advance on June 6 at the IEEE International Conference on Development and Learning.
Robot faces are becoming increasingly realistic, and the number of artificial muscles that control them is rising. In light of this trend, UC San Diego researchers from the Machine Perception Laboratory are studying the face and head of their Einstein robot in order to find ways to automate the process of teaching robots to make realistic facial expressions. Read the full story here.
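The post itself doesn't go into the mechanics, but the general idea it describes (self-guided learning of facial expressions) can be sketched in a few lines of code. The sketch below is purely illustrative and is not the Machine Perception Laboratory's actual method: the robot, its servos, and its expression-recognition software are replaced by a made-up linear simulator, hypothetical feature vectors stand in for facial expressions, and plain least squares stands in for the group's machine learning. The robot tries random servo commands, observes the expressions they produce, and fits an inverse model it can then query for the commands that should produce a target expression such as a smile.

# Minimal, hypothetical sketch of self-guided learning of facial expressions.
# NOT the UC San Diego group's code: the robot face is simulated by a made-up
# linear map with noise, and the learner is ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)

N_SERVOS = 8      # hypothetical number of facial servos
N_FEATURES = 5    # hypothetical expression features (e.g., action-unit scores)

# Unknown "ground truth" of the simulated face: how servo activations map to
# observable expression features. The learner never sees this matrix directly.
TRUE_MAP = rng.normal(size=(N_FEATURES, N_SERVOS))

def observe_expression(servo_commands):
    """Simulated expression sensor: returns noisy expression features."""
    return TRUE_MAP @ servo_commands + 0.01 * rng.normal(size=N_FEATURES)

# 1) Self-guided exploration: try random servo commands and record
#    which expressions they produce.
n_trials = 500
commands = rng.uniform(-1.0, 1.0, size=(n_trials, N_SERVOS))
features = np.array([observe_expression(c) for c in commands])

# 2) Learn an inverse model (expression features -> servo commands)
#    with ordinary least squares.
inverse_model, *_ = np.linalg.lstsq(features, commands, rcond=None)

# 3) Query the learned model for the commands that should produce a target
#    expression, e.g., a hypothetical "smile" feature vector.
smile_target = np.array([1.0, 0.2, 0.0, 0.8, -0.3])
smile_commands = smile_target @ inverse_model

print("Servo commands predicted for 'smile':", np.round(smile_commands, 3))
print("Expression those commands produce:   ", np.round(observe_expression(smile_commands), 3))

In this toy setup, the random-command phase plays the role of the self-guided exploration described above, and querying the learned inverse model plays the role of the robot "learning to smile"; the real system would involve many more servos and a far richer learning method.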