Robot Faces: How AI is Making Them Less Creepy
- Researchers at Columbia Engineering have developed a robot capable of learning facial lip movements for speaking and singing, a significant step toward more lifelike humanoid machines.
- Humans instinctively focus on lip movements during face-to-face interactions, dedicating nearly half of their attention to them.
- The Columbia Engineering team's findings, published on January 15 in Science Robotics, detail how the robot learned through observation rather than pre-programmed rules.
Robot Learns to Mimic Human Lip Movements
Researchers at Columbia Engineering have developed a robot capable of learning facial lip movements for speaking and singing, a significant step toward more lifelike humanoid machines. The breakthrough addresses a key challenge in robotics: creating natural and convincing facial expressions, notably lip movements, which humans rely on heavily during dialogue.
The Challenge of Robotic Facial Expressions
Humans instinctively focus on lip movements during face-to-face interactions, dedicating nearly half of their attention to them. However, robots often struggle to replicate these movements convincingly, frequently exhibiting stiff or exaggerated motions that contribute to the “uncanny valley” – a phenomenon where robots appear unsettling rather than lifelike. Even minor inaccuracies in facial motion can instantly stand out.
How the Robot Learned
The Columbia Engineering team’s findings, published on January 15 in Science Robotics, detail how the robot learned through observation rather than pre-programmed rules. The robot initially learned to control its 26 facial motors by watching its own reflection. It then studied hours of human speech and singing videos on YouTube to understand natural lip movements.
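The article describes a two-stage process: the robot first babbles with its 26 facial motors while watching its own reflection to learn how commands move its face, then maps observed human lip shapes onto that model. A minimal sketch of the first stage might look like the following, where the linear model, the dimensions, and the `observe` function are illustrative assumptions standing in for the robot's camera feedback, not the published method:

```python
import numpy as np

rng = np.random.default_rng(0)
N_MOTORS = 26      # the robot's 26 facial motors
N_LANDMARKS = 10   # e.g., 2-D positions of 5 tracked lip landmarks

# Unknown "ground truth" motor-to-landmark mapping; stands in for
# the physical face the robot observes in the mirror.
true_map = rng.normal(size=(N_MOTORS, N_LANDMARKS))

def observe(commands):
    """Simulated mirror observation: landmarks produced by motor commands."""
    return commands @ true_map

# 1. Motor babbling: issue random commands, record what the face does.
commands = rng.normal(size=(500, N_MOTORS))
landmarks = observe(commands)

# 2. Fit a forward model (commands -> landmarks) from self-observation.
learned_map, *_ = np.linalg.lstsq(commands, landmarks, rcond=None)

# 3. Invert the model: find motor commands that reproduce a target
#    lip shape (e.g., one extracted from a speech video).
target = observe(rng.normal(size=(1, N_MOTORS)))  # a reachable pose
solution, *_ = np.linalg.lstsq(learned_map.T, target.T, rcond=None)
error = np.abs(observe(solution.T) - target).max()
print(f"max landmark error: {error:.2e}")
```

The real system is far richer (soft skin, nonlinear dynamics, video-derived targets), but the sketch captures the core idea: no pre-programmed expression rules, only a model learned from the robot's own observations.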
The robot demonstrated its ability by forming words in multiple languages and even performing a song from its AI-generated debut album, “hello world_.”
“The more it interacts with humans, the better it will get,” said Hod Lipson, James and Sally Scapa Professor of Innovation in the Department of Mechanical Engineering and director of Columbia’s Creative Machines Lab.
Overcoming Technical Hurdles
Creating realistic robotic lip motion is difficult because of the complex hardware and coordination required. Human faces use dozens of muscles beneath soft skin, allowing fluid movement. Most humanoid robots have rigid faces with limited motion, relying on fixed rules that often produce unnatural expressions.
The Columbia team addressed these challenges by designing a flexible robotic face equipped with a high number of motors.
See a video of the “Lip Syncing Robot” here: Link to Video
