UBC Reports | Vol. 51 | No. 8 | Aug. 4, 2005
Chew on This
Unique robotic jaw will advance speech research
By Brian Lin
A group of UBC researchers has developed a unique robotic jaw to help them better understand the role jaw movements play in perceiving and understanding face-to-face communication.
The model is the first anthropomorphic robotic jaw to be built with six degrees of freedom: roll, pitch, and yaw rotations, as well as vertical, horizontal, and lateral translations. It not only accurately reproduces the complex motions of the human jaw but can also exaggerate normal motions up to three times, literally creating jaw-dropping effects.
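A six-degree-of-freedom pose like the one described, three rotations plus three translations, is conventionally represented as a rotation matrix and a translation vector. The sketch below is purely illustrative and is not the project's actual control software; the function names, the Z-Y-X Euler convention, and the `gain` parameter used to model the "up to three times" exaggeration are all assumptions for the example.

```python
import math

def rpy_to_matrix(roll, pitch, yaw):
    """Build a 3x3 rotation matrix from roll (x), pitch (y), and yaw (z)
    angles in radians, using the Z-Y-X convention (an assumption here)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # R = Rz(yaw) * Ry(pitch) * Rx(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def apply_pose(point, rpy, translation, gain=1.0):
    """Transform a jaw-fixed point by a 6-DOF pose (roll/pitch/yaw plus
    translation). A gain > 1 scales the whole motion, a simple way to
    model exaggerated jaw movement."""
    R = rpy_to_matrix(*(gain * a for a in rpy))
    t = [gain * c for c in translation]
    return [sum(R[i][j] * point[j] for j in range(3)) + t[i]
            for i in range(3)]

# Example: a point on the chin, 5 cm below the joint, under a small
# opening rotation and downward translation, exaggerated threefold.
chin = apply_pose([0.0, 0.0, -0.05], (0.1, 0.0, 0.0), (0.0, 0.0, -0.01),
                  gain=3.0)
```

With `gain=1.0` the pose reproduces a recorded motion; raising it scales both the rotation angles and the translation, which is one plausible way such "extended" motions could be commanded.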
“The extent to which the mechanical jaw can simulate both normal and exaggerated human jaw motions makes it a great tool in speech therapy and research,” says Sid Fels, an associate professor in electrical and computer engineering and director of the UBC Media and Graphics Interdisciplinary Centre (MAGIC).
In fact, Japan’s Advanced Telecommunication Research Laboratory (ATR), an independent research and development
corporation, has been closely following the development of the robotic model in order to adopt part of the design for its Infanoid, an upper-torso humanoid the size of a three-year-old child. With expressive eyes, lips and hands, the Infanoid has been helping researchers from around the world learn how young children communicate with others.
“Without an animated jaw, however, the Infanoid lacks some of the most important visual cues in non-verbal communication,” says Edgar Flores, a robot engineer who designed and built the robotic jaw from scratch. “This deficiency hampers the child-to-humanoid social interaction.”
“Research has shown that the jaw is a vital part of non-verbal communication,” says Fels, who is exploring how the jaw supports or negates verbal communication in adults.
Eric Vatikiotis-Bateson, a Tier 1 Canada Research Chair in the Department of Linguistics, participated in conceiving and directing the project to study the link between speech production and speech perception.
“For example, if the jaw isn’t moving naturally during speech, regardless of how subtle the inconsistency, at some point the listener begins to lose confidence in what the speaker is saying,” says Fels.
“If the inconsistency continues or worsens, the listener eventually shuts out non-verbal cues altogether and the ears take over again.”
Flores, who designed 3D simulation software that controls and monitors the robotic jaw’s every move, down to the cogwheel, says he and Fels discovered other potential applications of the model after they showed it off at conferences for acoustic professionals.
“We’ve been told that the model may serve as an improvement to current tools for studying chewing cycles, denture fitting and other orthodontics work,” says Flores. “It also has great potential in entertainment, linguistics, psychology, and human-computer interaction.”