The difference between how men and women form their mouths to smile has allowed artificial intelligence (AI) to automatically determine gender by analyzing the underlying muscle movements alone, according to new research by the University of Bradford in the U.K.
Although automatic gender recognition already exists, current methods analyze static images and compare fixed facial features. The new study is the first to use the dynamic movement of the smile to automatically distinguish between men and women.
For the study, the researchers mapped 49 points on the face, primarily around the eyes, mouth, and down the nose. They used this information to study how the face changes during a smile as a result of the underlying muscle movements. This involves two types of measurement: the change in distance between the different points, and the 'flow' of the smile, that is, how much, how far, and how fast the different points on the face moved as the smile was formed.
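The two kinds of measurement described above can be sketched in code. The snippet below is an illustrative reconstruction, not the paper's implementation: it uses random numbers in place of real landmark-tracker output, and the feature layout is an assumption.

```python
import numpy as np

# Hypothetical smile clip: T frames of the 49 tracked facial points (x, y).
# Random data stands in for real landmark-tracker output.
rng = np.random.default_rng(0)
T, P = 30, 49
landmarks = rng.random((T, P, 2))  # shape: (frames, points, coords)

# 1) Geometric change: pairwise distances between points in each frame,
#    then how much each distance changes from the first frame to the last.
diffs = landmarks[:, :, None, :] - landmarks[:, None, :, :]  # (T, P, P, 2)
dists = np.linalg.norm(diffs, axis=-1)                       # (T, P, P)
distance_change = dists[-1] - dists[0]                       # (P, P)

# 2) "Flow": per-point displacement between consecutive frames, capturing
#    how far and how fast each point moves as the smile forms.
step = np.diff(landmarks, axis=0)      # (T-1, P, 2)
speed = np.linalg.norm(step, axis=-1)  # (T-1, P)
total_travel = speed.sum(axis=0)       # distance each point travels
mean_speed = speed.mean(axis=0)        # average speed per point

# Stack everything into one feature vector per smile clip
# (one value per unique point pair, plus two per point).
feature_vector = np.concatenate([
    distance_change[np.triu_indices(P, k=1)],
    total_travel,
    mean_speed,
])
print(feature_vector.shape)  # → (1274,): 49*48/2 pairs + 49 + 49
```

With 49 points this yields 1,176 pairwise-distance features plus 98 flow features per clip, which is the kind of vector a downstream classifier could consume.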
Next, the researchers looked at whether there were any notable differences between men and women. They found that there were, with women’s smiles being more expansive.
“Anecdotally, women are thought to be more expressive in how they smile, and our research has borne this out. Women definitely have broader smiles, expanding their mouth and lip area far more than men,” said lead researcher Professor Hassan Ugail from the University of Bradford.
Based on this analysis, the researchers developed a new algorithm and tested it on video footage of 109 people as they smiled. The computer was able to correctly determine gender in 86 percent of cases, and the team believes the accuracy could easily be improved.
“We used a fairly simple machine classification for this research as we were just testing the concept, but more sophisticated AI would improve the recognition rates,” said Ugail.
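A "fairly simple machine classification" of the kind Ugail describes might look like a nearest-centroid rule over smile-dynamics feature vectors. The sketch below is an assumption for illustration only, using synthetic data; the study's actual features and classifier are not specified here.

```python
import numpy as np

# Synthetic stand-in for per-clip smile-dynamics feature vectors
# (e.g., the 1,274 distance-change and flow values per smile).
rng = np.random.default_rng(1)
n_features = 1274

# Fake training data: two classes whose means differ slightly, mimicking
# the reported finding that women's smiles are more expansive.
men = rng.normal(0.0, 1.0, size=(50, n_features))
women = rng.normal(0.5, 1.0, size=(59, n_features))

centroid_m = men.mean(axis=0)
centroid_w = women.mean(axis=0)

def predict(x):
    """Assign the label of the nearer class centroid."""
    d_m = np.linalg.norm(x - centroid_m)
    d_w = np.linalg.norm(x - centroid_w)
    return "male" if d_m < d_w else "female"

sample = rng.normal(0.5, 1.0, size=n_features)  # new, unlabeled smile clip
print(predict(sample))
```

A rule this simple is enough to test the concept on 109 videos; swapping in a more sophisticated model, as Ugail suggests, would be a drop-in replacement for the `predict` step.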
Although the underlying purpose of this research was to enhance machine learning capabilities, the new findings have raised a number of intriguing questions that the team hopes to investigate in future projects: one is how the machine might respond to the smile of a transgender person; the other is the effect of plastic surgery on recognition rates.
“Because this system measures the underlying muscle movement of the face during a smile, we believe these dynamics will remain the same even if external physical features change, following surgery for example,” said Ugail. “This kind of facial recognition could become a next-generation biometric, as it’s not dependent on one feature, but on a dynamic that’s unique to an individual and would be very difficult to mimic or alter.”
The study is published in The Visual Computer: International Journal of Computer Graphics.
Source: University of Bradford