Computer-aided techniques for analyzing human facial expressions are becoming increasingly sophisticated and efficient, and Hassan Ugail, a researcher at the University of Bradford, is leading a new project in this field.
Specifically, Ugail and his team are creating software that can identify false facial expressions, the ones we constantly make when talking to someone, from half-smiles to movements of the eyebrows or mouth. In particular, the software can analyze the smile on a person's face and determine whether or not it is authentic.
To do this, it analyzes even the slightest movements of other parts of the face, something a person can hardly do, not least because these tiny movements last only a fraction of a second. The researchers used machine learning: they fed a computer program a variety of data, essentially images of people producing genuine smiles alongside images of fake ones. After this “training” phase, the software was able to pick out significant differences in the subtlest movements of the mouth and cheeks, identifying the genuine expressions and discarding the false ones.
As Ugail himself explains, the differences between a genuine and a fake smile are minimal, and they concern above all the zygomaticus major muscle, which curves the mouth upwards, and the orbicularis oculi muscle, which produces the characteristic creases that form around the eyes when we smile: “In fake smiles it is often only the mouth muscles that move but, as humans, we often fail to detect the lack of movement around the eyes. Computer software can detect it much more reliably.”
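The general idea described above can be sketched as a small machine-learning example. This is not the researchers' actual system: the features, data, and classifier here are invented for illustration. Each smile is summarized by two hypothetical numbers, average mouth-corner movement and average eye-region (orbicularis oculi) movement, and a simple classifier learns that genuine smiles involve both while fake smiles involve mostly the mouth.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in data (the real project used images of smiling faces).
# Feature 1: mouth-corner movement; feature 2: eye-region movement.
rng = np.random.default_rng(0)
n = 200

# Genuine smiles: both the mouth and the eye region move.
genuine = np.column_stack([
    rng.normal(1.0, 0.2, n),   # mouth movement
    rng.normal(0.8, 0.2, n),   # eye-region movement
])
# Fake smiles: the mouth moves, but the eye region stays almost still.
fake = np.column_stack([
    rng.normal(1.0, 0.2, n),
    rng.normal(0.1, 0.1, n),
])

X = np.vstack([genuine, fake])
y = np.array([1] * n + [0] * n)   # 1 = genuine, 0 = fake

# "Training" phase: fit a classifier on the labeled examples.
clf = LogisticRegression().fit(X, y)

# A smile with strong mouth movement but no eye movement is flagged as fake;
# one with matching eye movement is classified as genuine.
print(clf.predict([[1.0, 0.05], [1.0, 0.85]]))
```

The point of the sketch is the one Ugail makes: once the model has seen enough labeled examples, the near-absence of movement around the eyes becomes a reliable, machine-detectable signal that a smile is false, even though humans routinely miss it.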