It's virtually a face-off! Scientists have developed software which they claim
can change the sex of a person on a computer by taking a live video feed of a person talking. The software has been developed by computer scientist Barry-John Theobald at the University of East Anglia in the UK and Iain Matthews, formerly at Carnegie Mellon University and now at Weta Digital in Wellington, New Zealand.
In fact, according to the scientists, the software can take a live video feed of a person talking and make them look and sound like somebody else, the 'New Scientist' reported. In their research, the scientists recorded video of volunteers performing 30 different facial expressions such as frowning, smiling and looking surprised. For each expression, the positions of key facial features, such as the eyes, nose and corners of the lips, were manually labelled.
That annotated footage was used to "train" software to recognise the face of each individual featured in the set. Once trained on a person in this way, it can closely track
every move of their face in video footage. Those movements can then be transferred onto the face of another "known" person by calculating how the recipient's features need to change to take on each new expression. Doing that and displaying the transformed face takes just 150 milliseconds, fast enough to allow a conversation over video link to continue in real time.
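The transfer step described above — tracking one face's movements and recalculating how another face's features must shift to take on the same expression — can be sketched roughly as follows. This is a hypothetical illustration, not the researchers' actual system: the landmark layout, the scaling heuristic, and the function name `transfer_expression` are all assumptions made for the example.

```python
import numpy as np

def transfer_expression(src_neutral, src_current, dst_neutral):
    """Map the source face's motion onto the destination face.

    Each argument is an (N, 2) array of 2-D landmark positions
    (e.g. eyes, nose, lip corners). The displacement of each source
    landmark from its neutral position is rescaled to the destination
    face's size and added to the destination's neutral landmarks.
    """
    # Rough per-face scale: mean distance of landmarks from their centroid.
    # (A real system would use a proper statistical face model; this
    # heuristic is only for illustration.)
    def scale(pts):
        return np.linalg.norm(pts - pts.mean(axis=0), axis=1).mean()

    displacement = src_current - src_neutral
    factor = scale(dst_neutral) / scale(src_neutral)
    return dst_neutral + displacement * factor

# Tiny worked example: a 4-landmark "face" whose lower landmarks move up
# (a smile-like motion); the same relative motion is applied to a face
# twice as large.
src_neutral = np.array([[0., 0.], [2., 0.], [0., 2.], [2., 2.]])
src_smile   = src_neutral + np.array([[0., 0.], [0., 0.],
                                      [0., -0.2], [0., -0.2]])
dst_neutral = src_neutral * 2.0  # a larger face

dst_smile = transfer_expression(src_neutral, src_smile, dst_neutral)
print(dst_smile)
```

Because the computation per frame is just a handful of array operations on a few dozen landmark coordinates, it is easy to see how a well-engineered version could fit within the 150-millisecond budget the article cites for real-time video conversation.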
To complete the effect, a person's voice can be manipulated to match their new face. Volunteers were asked to chat to one another in a video conference, but did not know if the face they saw was really that of the person they were talking with, or indeed if the other volunteer was seeing their own true face.
"The results suggest that our body language during conversation is more reactive to that of others than it is to their physical appearance. We've shown you can present a
female as herself or as a male, and the other participant's behaviour doesn't change," Theobald said. The results will soon be published in the Journal of Experimental Psychology.