A different kind of facial reconstruction


Using only data from an fMRI scan, researchers led by a Yale University undergraduate have accurately reconstructed images of human faces as viewed by other people. One of the paper’s authors, professor Marvin Chun, called it “a form of mind reading” in a news release.

According to the paper, published in the journal NeuroImage, fMRI data have already enabled scientists to predict, from scans taken as individuals viewed scenes, whether a subject was looking at, for instance, a beach or a city scene, an animal or a building.

One of Chun’s students, Alan S. Cowen, then a Yale junior and now pursuing an advanced degree at the University of California at Berkeley, wanted to know whether it would be possible to reconstruct a human face from patterns of brain activity. The task was daunting, because faces are more similar to one another than buildings are. Also, large areas of the brain are recruited in processing human faces, a testament to the importance of face perception in survival.

Working with funding from the Yale Provost’s office, Cowen and postdoctoral researcher Brice Kuhl, now an assistant professor at New York University, showed six subjects 300 different “training” faces while the subjects underwent fMRI scans. They used the data to create a sort of statistical library of how those brains responded to individual faces. They then showed the six subjects new sets of faces while they were undergoing scans. Using that fMRI data alone, the researchers turned to their statistical library to reconstruct the faces their subjects were viewing.
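Conceptually, the approach pairs each training face with the brain activity it evoked and fits a statistical model linking the two, then runs that model in reverse on new scans. Below is a minimal Python sketch of that idea, assuming an eigenface-style decomposition and ridge regression as illustrative choices; the array names and shapes are placeholders, not the study's actual data or pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

# Placeholder data standing in for the real experiment:
# train_faces:  (300, n_pixels) flattened training-face images
# train_voxels: (300, n_voxels) fMRI voxel activity recorded while each face was viewed
rng = np.random.default_rng(0)
n_train, n_pixels, n_voxels = 300, 64 * 64, 500
train_faces = rng.random((n_train, n_pixels))
train_voxels = rng.random((n_train, n_voxels))

# 1. Build the "statistical library": compress faces into a small set of
#    components and learn how voxel patterns map onto those components.
face_pca = PCA(n_components=50).fit(train_faces)
train_components = face_pca.transform(train_faces)
brain_to_face = Ridge(alpha=1.0).fit(train_voxels, train_components)

# 2. Reconstruct a novel face from brain activity alone: predict the face
#    components from the new voxel pattern, then invert the decomposition
#    to get back a pixel image.
new_voxels = rng.random((1, n_voxels))  # scan taken while viewing an unseen face
predicted_components = brain_to_face.predict(new_voxels)
reconstructed_face = face_pca.inverse_transform(predicted_components).reshape(64, 64)
```

The reconstruction can then be compared with the face the subject actually saw to gauge accuracy.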

Cowen said the accuracy of these facial reconstructions will improve with time, and he envisions their use as a research tool, for instance in studying how autistic children respond to faces.


Amanda Cuda
