Notice: This mailing list archive has been frozen since May 2001; it will stay online but will not be updated.
FWIW
- From: T3D john bercovitz <bercov@xxxxxxxxxxx>
- Subject: FWIW
- Date: Wed, 20 Nov 1996 07:44:30 -0800
This just came in. I of course have grave reservations when the writer
doesn't understand that "myself" is reflexive but this writer may have
other attributes. (Oooh, _bad_ form, John B!)
========================================================================
Modeling and Rendering Architecture from Photographs
Friday, November 22, 1996
16h00-17h00, 306 Soda
Paul Debevec
University of California at Berkeley
Imagine visiting your favorite place, taking a few pictures, and then
turning those pictures into a photorealistic three-dimensional computer
model. Dr. C.J. Taylor, Prof. Jitendra Malik, and myself have merged
techniques from computer vision and computer graphics to create a system
to make this possible. The applications range from architectural
planning and archaeological reconstructions to virtual environments and
Hollywood special effects.
In this talk I will present our new approach for modeling and rendering
architectural scenes from a sparse set of still photographs. The
modeling approach, which leverages both geometric and image-based
techniques, has two components. The first component is a
photogrammetric modeling method which facilitates the recovery of the
basic geometry of the photographed scene. Our photogrammetric modeling
approach is effective, convenient, and robust because it takes advantage
of the constraints that are characteristic of architectural scenes. The
second component is model-based stereo, which recovers how the real
scene deviates from the basic model. By making use of the model, this
stereo technique robustly recovers accurate depth from widely-spaced
image pairs. Consequently, our approach can model large architectural
environments with far fewer photographs than current image-based
modeling approaches. For producing renderings, I will present
view-dependent texture mapping, a method of compositing multiple views
of a scene that better simulates geometric detail and specular
reflectance than flat texture mapping. I will present results that
demonstrate our approach's ability to create realistic renderings of
architectural scenes from viewpoints far from the original photographs,
including images from the Rouen Revisited art installation presented at
the SIGGRAPH '96 art show.
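A rough way to picture the model-based stereo component: it behaves like
ordinary block-matching stereo, except that each pixel searches only a
narrow band around the disparity already predicted by the approximate
model, so the coarse model absorbs the wide baseline and stereo only has
to recover the small deviation from it. The numpy sketch below
illustrates that idea on already-rectified images; the function name, the
sum-of-absolute-differences cost, and the parameter choices are
illustrative assumptions, not the algorithm from the talk.

import numpy as np

def model_based_stereo(key, offset, model_disp, search=3, block=5):
    # key, offset : rectified grayscale images (H, W) of the same scene
    # model_disp  : (H, W) disparities predicted by the approximate model
    # search      : pixels to explore on either side of that prediction
    # block       : odd window size for the matching cost
    H, W = key.shape
    r = block // 2
    out = model_disp.astype(np.int64).copy()
    for y in range(r, H - r):
        for x in range(r, W - r):
            d0 = int(model_disp[y, x])
            patch = key[y - r:y + r + 1, x - r:x + r + 1].astype(float)
            best_cost, best_d = np.inf, d0
            # Search only the residual disparities near the model's guess.
            for d in range(d0 - search, d0 + search + 1):
                xs = x - d
                if xs - r < 0 or xs + r >= W:
                    continue
                cand = offset[y - r:y + r + 1, xs - r:xs + r + 1].astype(float)
                cost = np.abs(patch - cand).sum()   # SAD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            out[y, x] = best_d
    return out

Restricting the search to a few pixels around the model's prediction is
what makes widely spaced pairs workable in this picture: the hard part of
the correspondence problem is carried by the recovered geometry.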
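Similarly, the view-dependent texture mapping idea can be sketched as a
weighted blend in which each original photograph contributes to a
rendered pixel in proportion to how closely its viewing direction agrees
with the novel viewpoint. The inverse-angle weighting and the names below
are an illustrative sketch under that assumption, not the system
described in the talk.

import numpy as np

def vdtm_weights(view_dirs, novel_dir):
    # view_dirs : (N, 3) unit vectors from a surface point toward each
    #             camera that photographed it
    # novel_dir : (3,) unit vector from that point toward the novel viewpoint
    # Views whose direction best matches the novel one get the most weight,
    # so geometric detail and specular highlights follow the closest photo.
    cos_angles = np.clip(view_dirs @ novel_dir, -1.0, 1.0)
    angles = np.arccos(cos_angles)            # angular distance per view
    weights = 1.0 / (angles + 1e-6)           # smaller angle -> larger weight
    return weights / weights.sum()

def composite(colors, weights):
    # Blend the per-view texture samples (N, 3) with the weights (N,).
    return (weights[:, None] * colors).sum(axis=0)

# Example: three photographs observing the same surface point.
view_dirs = np.array([[0.0, 0.0, 1.0],
                      [0.6, 0.0, 0.8],
                      [0.0, 0.6, 0.8]])
novel_dir = np.array([0.1, 0.0, 0.995])
novel_dir /= np.linalg.norm(novel_dir)
colors = np.array([[200.0, 180.0, 160.0],    # the point as seen in photo 1
                   [210.0, 185.0, 150.0],    # photo 2
                   [190.0, 170.0, 165.0]])   # photo 3
w = vdtm_weights(view_dirs, novel_dir)
print(w, composite(colors, w))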
For more info, see:
http://www.cs.berkeley.edu/~debevec/Research/
------------------------------