Vision Under Changing Scene Appearance: Describing The World Through Light And Symmetries

Abstract
Change is an inexorable aspect of the world that surrounds us. Night gives way to day as the Earth rotates on its axis, weather shifts, and buildings decay. All of these changes alter the appearance of our surroundings. In trying to understand our world, it is sometimes useful to factor out changes that are irrelevant to the subject of our study; for instance, when determining whether two pictures taken decades apart depict the same building, it helps to ignore the cracks and peeling paint caused by aging. At other times, a careful examination of change can reveal interesting phenomena, such as how the shading that sunlight casts on objects around us can tell the time of day.

In the first part of this thesis we examine changes in light that reveal information about materials and geometry. We introduce a simple pixel-wise statistic that we show is linked to ambient occlusion, a measure of how accessible a point is to light. This observation allows us to recover the albedo of the scene, which in turn lets us estimate the lighting of each input image. We begin with a simple setup: a static scene and camera, with each image captured under varying but unknown lighting. We then extend this foundation in two ways, both of which apply to Internet photo collections of outdoor landmarks. Such collections are a far more challenging source of data: the cameras are radiometrically uncalibrated and unregistered, natural lighting is far more complex than our initial model expects, and occluders obscure parts of the scene. First, we show how physically based models of outdoor illumination developed in the computer graphics community can be incorporated into our algorithm to capture many of the subtleties of outdoor lighting, such as the influence of geolocation on the sun's path through the sky and the changes in color and intensity that occur over the course of a day.
This advance allows us to correctly estimate illumination for outdoor scenes, which we show is useful for estimating the correct timestamp of an image. Second, we show how the estimated lighting can be distilled into a novel image descriptor, one that captures the distribution of light in a scene in a format that is independent of geometry. This descriptor supports reasoning about many lighting-related phenomena, such as weather conditions and time of day, and it enables queries to an image database based on how light is distributed in the scene, irrespective of geometry.

In the second part of this dissertation we look at change from another angle by tackling the problem of image matching. We ask how to match challenging image pairs of architectural structures when the changes between the images are too drastic for traditional methods to work. We devise novel feature detectors and descriptors based on local symmetries, a mid-level cue that we show can be more robust to drastic changes than more traditional edge-based methods.
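To make the idea of a pixel-wise statistic over varying lighting concrete, the following is a minimal sketch, not the thesis's actual statistic: given a stack of registered images of a static scene under varying illumination, it computes a per-pixel ratio of mean to peak intensity. A heavily occluded point is lit in few images, so its mean stays low relative to its maximum; the function name and the specific ratio are illustrative assumptions.

```python
import numpy as np

def brightness_ratio(stack):
    """Per-pixel mean/max intensity over a stack of grayscale images
    of a static scene captured under varying, unknown lighting.

    stack: float array of shape (n_images, H, W).
    Returns an (H, W) array; points that are rarely lit (heavily
    occluded) tend to have a lower ratio than well-exposed points.
    This is an illustrative statistic, not the one from the thesis.
    """
    mean = stack.mean(axis=0)   # average brightness per pixel
    peak = stack.max(axis=0)    # brightest observation per pixel
    eps = 1e-8                  # guard against never-lit pixels
    return mean / (peak + eps)

# Toy usage: two "pixels" observed under four lighting conditions.
# Pixel 0 is lit in every image; pixel 1 is lit in only one.
stack = np.array([
    [[0.9, 0.1]],
    [[0.8, 0.1]],
    [[0.7, 0.9]],
    [[0.9, 0.1]],
])  # shape (4, 1, 2)
ratio = brightness_ratio(stack)
```

Here `ratio[0, 0]` comes out higher than `ratio[0, 1]`, reflecting that the second point spends most images in shadow.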
Keywords
Ambient occlusion; Intrinsic images; Local symmetry
Committee Chair
Snavely, Keith Noah
Committee Member
Edelman, Shimon J.
Bala, Kavita
Degree Discipline
Computer Science
Degree Name
Ph. D., Computer Science
Degree Level
Doctor of Philosophy
Type
dissertation or thesis