More photos from the robot facial-expression story I posted yesterday:
Credit: UC San Diego / Erik Jepsen
A hyper-realistic Einstein robot at the University of California, San Diego has learned to smile and make facial expressions through self-guided learning. The UC San Diego researchers used machine learning to “empower” their robot to teach itself realistic facial expressions.
“As far as we know, no other research group has used machine learning to teach a robot to make realistic facial expressions,” said Tingfan Wu, the computer science Ph.D. student from the UC San Diego Jacobs School of Engineering who presented this advance on June 6 at the IEEE International Conference on Development and Learning.
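The post doesn't describe the algorithm itself, but one common way such self-guided learning is framed is "body babbling": the robot issues random commands to its facial servos, observes the resulting expression (e.g. through an automated expression recognizer), and fits an inverse model mapping desired expressions back to servo commands. The sketch below illustrates that general idea with made-up numbers and a linear stand-in for the robot's face; none of the names or dimensions here come from the UC San Diego work.

```python
import numpy as np

# Illustrative sketch only; hypothetical setup with N_SERVOS facial
# servos and N_FEATURES expression features from a recognizer.
rng = np.random.default_rng(0)
N_SERVOS, N_FEATURES, N_BABBLES = 6, 4, 500

# Stand-in for the physical face: an unknown servo -> expression
# mapping the robot can only probe by moving and observing itself.
true_map = rng.normal(size=(N_SERVOS, N_FEATURES))

def observe_expression(servo_cmds):
    """Pretend camera + recognizer: features produced by servo commands."""
    return servo_cmds @ true_map

# 1. "Body babbling": issue random servo commands and record the
#    expression features each one produces.
commands = rng.uniform(-1.0, 1.0, size=(N_BABBLES, N_SERVOS))
features = observe_expression(commands)

# 2. Learn an inverse model (features -> servo commands) by least
#    squares on the babbling data.
inverse_map, *_ = np.linalg.lstsq(features, commands, rcond=None)

# 3. To reproduce a desired expression, push its features through
#    the learned inverse model to get servo commands.
target = observe_expression(rng.uniform(-1.0, 1.0, size=N_SERVOS))
learned_cmds = target @ inverse_map
```

With a linear face model the learned commands reproduce the target expression almost exactly; a real robot would need a nonlinear model and far more careful perception, but the babble-then-invert loop is the same.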
Download the paper at:
Watch an overview video (and read related story) about the Einstein robot research program at the Machine Perception Laboratory at UC San Diego.