A system for image-based modeling and photo editing

Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Architecture, 2002. === Includes bibliographical references (p. 169-178). === Traditionally in computer graphics, a scene is represented by geometric primitives composed of various materials and a collection of lights. Recently, techniques for modeling and rendering scenes from a set of pre-acquired images have emerged as an alternative approach, known as image-based modeling and rendering.


Bibliographic Details
Main Author: Oh, Byong Mok, 1969-
Other Authors: Julie Dorsey.
Format: Others
Language: English
Published: Massachusetts Institute of Technology 2005
Subjects: Architecture
Online Access:http://hdl.handle.net/1721.1/8511
id ndltd-MIT-oai-dspace.mit.edu-1721.1-8511
record_format oai_dc
spelling ndltd-MIT-oai-dspace.mit.edu-1721.1-85112019-05-02T16:21:49Z A system for image-based modeling and photo editing Oh, Byong Mok, 1969- Julie Dorsey. Massachusetts Institute of Technology. Dept. of Architecture. Massachusetts Institute of Technology. Dept. of Architecture. Architecture. Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Architecture, 2002. Includes bibliographical references (p. 169-178). Traditionally in computer graphics, a scene is represented by geometric primitives composed of various materials and a collection of lights. Recently, techniques for modeling and rendering scenes from a set of pre-acquired images have emerged as an alternative approach, known as image-based modeling and rendering. Much of the research in this field has focused on reconstructing and rerendering from a set of photographs, while little work has been done to address the problem of editing and modifying these scenes. On the other hand, photo-editing systems, such as Adobe Photoshop, provide a powerful, intuitive, and practical means to edit images. However, these systems are limited by their two-dimensional nature. In this thesis, we present a system that extends photo editing to 3D. Starting from a single input image, the system enables the user to reconstruct a 3D representation of the captured scene, and edit it with the ease and versatility of 2D photo editing. The scene is represented as layers of images with depth, where each layer is an image that encodes both color and depth. A suite of user-assisted tools are employed, based on a painting metaphor, to extract layers and assign depths. The system enables editing from different viewpoints, extracting and grouping of image-based objects, and modifying the shape, color, and illumination of these objects. As part of the system, we introduce three powerful new editing tools. These include two new clone brushing tools: the non-distorted clone brush and the structure-preserving clone brush. 
They permit copying of parts of an image to another via a brush interface, but alleviate distortions due to perspective foreshortening and object geometry. (cont.) The non-distorted clone brush works on arbitrary 3D geometry, while the structure-preserving clone brush, a 2D version, assumes a planar surface, but has the added advantage of working directly in 2D photo-editing systems that lack depth information. The third tool, a texture-illuminance decoupling filter, discounts the effect of illumination on uniformly textured areas by decoupling large- and small-scale features via bilateral filtering. This tool is crucial for relighting and changing the materials of the scene. There are many applications for such a system, for example architectural, lighting and landscape design, entertainment and special effects, games, and virtual TV sets. The system allows the user to superimpose scaled architectural models into real environments, or to quickly paint a desired lighting scheme of an interior, while being able to navigate within the scene for a fully immersive 3D experience. We present examples and results of complex architectural scenes, 360-degree panoramas, and even paintings, where the user can change viewpoints, edit the geometry and materials, and relight the environment. by Byong Mok Oh. Ph.D. 2005-08-23T20:44:32Z 2005-08-23T20:44:32Z 2002 2002 Thesis http://hdl.handle.net/1721.1/8511 50775298 eng M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission. http://dspace.mit.edu/handle/1721.1/7582 178 p. 13088112 bytes 13087869 bytes application/pdf application/pdf application/pdf Massachusetts Institute of Technology
collection NDLTD
language English
format Others
sources NDLTD
topic Architecture.
spellingShingle Architecture.
Oh, Byong Mok, 1969-
A system for image-based modeling and photo editing
description Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Architecture, 2002. === Includes bibliographical references (p. 169-178). === Traditionally in computer graphics, a scene is represented by geometric primitives composed of various materials and a collection of lights. Recently, techniques for modeling and rendering scenes from a set of pre-acquired images have emerged as an alternative approach, known as image-based modeling and rendering. Much of the research in this field has focused on reconstructing and rerendering from a set of photographs, while little work has been done to address the problem of editing and modifying these scenes. On the other hand, photo-editing systems, such as Adobe Photoshop, provide a powerful, intuitive, and practical means to edit images. However, these systems are limited by their two-dimensional nature. In this thesis, we present a system that extends photo editing to 3D. Starting from a single input image, the system enables the user to reconstruct a 3D representation of the captured scene, and edit it with the ease and versatility of 2D photo editing. The scene is represented as layers of images with depth, where each layer is an image that encodes both color and depth. A suite of user-assisted tools are employed, based on a painting metaphor, to extract layers and assign depths. The system enables editing from different viewpoints, extracting and grouping of image-based objects, and modifying the shape, color, and illumination of these objects. As part of the system, we introduce three powerful new editing tools. These include two new clone brushing tools: the non-distorted clone brush and the structure-preserving clone brush. They permit copying of parts of an image to another via a brush interface, but alleviate distortions due to perspective foreshortening and object geometry. === (cont.) 
The non-distorted clone brush works on arbitrary 3D geometry, while the structure-preserving clone brush, a 2D version, assumes a planar surface, but has the added advantage of working directly in 2D photo-editing systems that lack depth information. The third tool, a texture-illuminance decoupling filter, discounts the effect of illumination on uniformly textured areas by decoupling large- and small-scale features via bilateral filtering. This tool is crucial for relighting and changing the materials of the scene. There are many applications for such a system, for example architectural, lighting and landscape design, entertainment and special effects, games, and virtual TV sets. The system allows the user to superimpose scaled architectural models into real environments, or to quickly paint a desired lighting scheme of an interior, while being able to navigate within the scene for a fully immersive 3D experience. We present examples and results of complex architectural scenes, 360-degree panoramas, and even paintings, where the user can change viewpoints, edit the geometry and materials, and relight the environment. === by Byong Mok Oh. === Ph.D.
author2 Julie Dorsey.
author_facet Julie Dorsey.
Oh, Byong Mok, 1969-
author Oh, Byong Mok, 1969-
author_sort Oh, Byong Mok, 1969-
title A system for image-based modeling and photo editing
title_short A system for image-based modeling and photo editing
title_full A system for image-based modeling and photo editing
title_fullStr A system for image-based modeling and photo editing
title_full_unstemmed A system for image-based modeling and photo editing
title_sort system for image-based modeling and photo editing
publisher Massachusetts Institute of Technology
publishDate 2005
url http://hdl.handle.net/1721.1/8511
work_keys_str_mv AT ohbyongmok1969 asystemforimagebasedmodelingandphotoediting
AT ohbyongmok1969 systemforimagebasedmodelingandphotoediting
_version_ 1719039595571576832