Analyzing Human Appearance as a Cue for Dating Images



Geofaces

Abstract:

Given an image, we propose to use the appearance of people in the scene to estimate when the picture was taken. There are a wide variety of cues that can be used to address this problem. Most previous work has focused on low-level image features, such as color and vignetting. Recent work on image dating has used more semantic cues, such as the appearance of automobiles and buildings. We extend this line of research by focusing on human appearance. Our approach, based on a deep convolutional neural network, allows us to more deeply explore the relationship between human appearance and time. We find that clothing, hair styles, and glasses can all be informative features. To support our analysis, we have collected a new dataset containing images of people from many high school yearbooks, covering the years 1912-2014. While not a complete solution to the problem of image dating, our results show that human appearance is strongly related to time and that semantic information can be a useful cue.


Dataset:

Our dataset contains 719,229 face patches and 565,069 torso patches in total. Please contact us by email to receive access to the database.

Models:

face2year
torso2year
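The model names above suggest networks that map a face or torso patch to an estimated year. As an illustrative sketch only (not the released models' actual interface), and assuming the task is framed as classification over the 103 yearbook years 1912 to 2014, per-year network scores could be converted to a single year estimate like this:

```python
import math

# Assumption: one class per yearbook year covered by the dataset.
YEARS = list(range(1912, 2015))  # 103 classes, 1912..2014

def softmax(scores):
    """Convert raw per-class scores to probabilities (numerically stable)."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def year_from_scores(scores, mode="argmax"):
    """Map per-year network scores to a single year estimate.

    mode="argmax":   most probable year.
    mode="expected": probability-weighted mean year, rounded.
    """
    probs = softmax(scores)
    if mode == "argmax":
        return YEARS[max(range(len(probs)), key=probs.__getitem__)]
    return round(sum(y * p for y, p in zip(YEARS, probs)))
```

The expected-value variant gives a smoother estimate when the probability mass is spread over neighboring years, which is common for gradual trends in clothing and hair styles.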

See Also:

The work of Ginosar et al. was developed concurrently with, and independently of, our work. They also use yearbook imagery, applying weakly supervised, data-driven techniques to analyze appearance trends.