|
|
|
|
Image-Based Lighting

Image-Based BRDF Measurement

Inverse Global Illumination
|
|
|
|
Allows us to place 3D objects into photos of
real scenes.

Creates accurate lighting interactions with 3D objects
placed in a scene.
|
|
|
|
Capturing real-world illumination as an
omnidirectional, high dynamic range image

Mapping the illumination onto a representation
of the environment

Placing the 3D object inside the environment

Simulating the light from the environment
illuminating the computer graphics object
|
|
|
|
Used as the captured lighting input for IBL

Created from multiple images to give an accurate
account of all the light in the scene

These images were constructed from two radiance
images of a mirrored sphere
|
|
|
|
Single high dynamic range images of a mirrored
ball have limitations:

the images show the camera and the photographer

they are not well sampled in the area opposite
the camera.
|
|
|
|
The “dynamic range” of a scene is the contrast ratio
between its brightest and darkest parts

An HDR image has a greater dynamic range than can be
shown on a standard display device

An HDR image has pixel values proportional to the
amount of light in the world corresponding to each pixel

HDR images are typically generated by combining
multiple ordinary images of the same scene taken at different
exposure levels, as sketched below
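A minimal sketch of such a merge (not from the source), assuming the
input images have a linear camera response and known exposure times;
real pipelines first recover the response curve, as in Debevec and
Malik's method. The function name and hat-shaped weighting are
illustrative choices:

    import numpy as np

    def assemble_hdr(images, exposure_times):
        """Merge differently exposed images into one HDR radiance map."""
        num = np.zeros_like(images[0])
        den = np.zeros_like(images[0])
        for img, t in zip(images, exposure_times):
            # Hat-shaped weight: trust mid-range pixels, ignore pixels that
            # are nearly under- or over-exposed (img assumed in [0, 1]).
            w = 1.0 - np.abs(2.0 * img - 1.0)
            num += w * img / t        # radiance estimate from this exposure
            den += w
        # Per-pixel radiance in relative units.
        return num / np.maximum(den, 1e-6)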
|
|
|
|
A Light Probe Image is mapped to a large sphere
surrounding the model

When a ray hits the IBL environment, it takes on
the pixel value of the corresponding
point in the light probe image, as sketched below.
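A minimal sketch of this lookup, assuming the probe is stored in the
angular-map layout used for Debevec's light probe images (other probes
use latitude-longitude or cube-map layouts); the nearest-pixel sampling
and function name are illustrative:

    import numpy as np

    def probe_lookup(probe, direction):
        """Return the radiance a ray picks up from an angular-map probe."""
        d = np.asarray(direction, dtype=float)
        dx, dy, dz = d / np.linalg.norm(d)
        denom = np.sqrt(dx * dx + dy * dy)
        # Angular map: distance from the image centre is proportional to
        # the angle between the ray and the probe's optical axis (+z here).
        r = 0.0 if denom == 0.0 else (1.0 / np.pi) * np.arccos(dz) / denom
        u, v = dx * r, dy * r                  # both in [-1, 1]
        h, w, _ = probe.shape
        col = int((u * 0.5 + 0.5) * (w - 1))
        row = int((v * 0.5 + 0.5) * (h - 1))
        return probe[row, col]                 # RGB radiance for this ray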
|
|
|
|
Scene rendered using Radiance before insertion
of the image for environmental lighting
|
|
|
|
|
These images are of real objects illuminated by
captured environmental light

This is done by taking a large set of images of the
object as illuminated from all possible directions

A linear combination of the images can produce
images under arbitrary lighting conditions

The IBL environment determines the weights of the
combination, as in the sketch below
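A minimal sketch of such a relighting step; the array shapes and the
idea of sampling the captured environment once per basis lighting
direction are assumptions for illustration:

    import numpy as np

    def relight(basis_images, env_weights):
        """Relight an object from a basis of single-direction images.

        basis_images: shape (N, H, W, 3); image i shows the object lit
                      from the i-th lighting direction.
        env_weights:  shape (N, 3); RGB intensity of the captured
                      environment sampled in each of those N directions.
        """
        # Each output pixel is a linear combination of the basis images,
        # weighted by how bright the environment is in each direction.
        return np.einsum('nhwc,nc->hwc', basis_images, env_weights)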
|
|
|
|
http://www.debevec.org/Research/IBL/
|
|
|
|
|
Can measure the BRDF of a material without a
gonioreflectometer

Uses fewer measurements to define the BRDF than
a gonioreflectometer

Less expensive than a gonioreflectometer

Drawback: can only perform measurements on
surfaces which can be placed on the geometry of a physical sphere,
such as:

Coatings

Sheet of flexible material
|
|
|
|
Fixed Position Primary Camera

Light Source

Secondary Camera for position measurement

Sample: Sphere painted with various coatings or
a sheet of flexible material

Photometric targets on the sample container
|
|
|
|
Illuminate the sample from a sequence of known
positions

Find the position of the light source using the
secondary camera
|
|
|
|
|
32 measurement images from the primary camera

96 when three filters are used for RGB

32 light source calibration images from the
secondary camera
|
|
|
|
Primary camera is in a known fixed position

Secondary camera is aimed at the sample and
mounted below the light source to image the photometric targets
|
|
|
|
Targets are mounted on the base of the sample

Targets have a known 3D position relative to the
sample and the primary camera

By analyzing the calibration image of the targets, the position
of the secondary camera, and thus of the light source, can be
determined, as in the sketch below

This method is accurate to a few millimeters
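A minimal sketch of this kind of pose recovery, using OpenCV's solvePnP
as a stand-in for whatever solver the authors actually used; the target
coordinates, detected pixel positions, and camera intrinsics below are
placeholder values:

    import numpy as np
    import cv2

    # Known 3D positions of the photometric targets relative to the
    # sample (placeholder values; the real rig fixes these by design).
    targets_3d = np.array([[0.00, 0.00, 0.0],
                           [0.10, 0.00, 0.0],
                           [0.10, 0.10, 0.0],
                           [0.00, 0.10, 0.0]], dtype=np.float64)

    # Pixel coordinates of the targets found in the secondary camera's
    # calibration image (placeholder detections).
    targets_2d = np.array([[412.0, 300.5],
                           [630.2, 298.1],
                           [628.7, 512.4],
                           [415.3, 515.0]], dtype=np.float64)

    # Intrinsics of the secondary camera (placeholder calibration).
    K = np.array([[1200.0, 0.0, 512.0],
                  [0.0, 1200.0, 384.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)  # assume negligible lens distortion

    ok, rvec, tvec = cv2.solvePnP(targets_3d, targets_2d, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    # Camera (and hence light source) position in the sample's frame.
    light_pos = (-R.T @ tvec).ravel()
    print("estimated camera / light position:", light_pos)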
|
|
|
|
For each pixel in the primary camera image,
determine the surface point and the normal

The direction of illumination is computed
relative to the surface point and the normal

Compute the relative irradiance from the known
source geometry

Compute the BRDF by dividing the radiance (pixel
value) by the irradiance, as in the sketch below
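A minimal sketch of that per-pixel computation, assuming a linear
camera and a small light source with known position and intensity; the
helper name and argument layout are illustrative:

    import numpy as np

    def brdf_sample(pixel_value, point, normal, light_pos, light_intensity):
        """Estimate one BRDF sample from a radiometrically linear pixel."""
        to_light = light_pos - point
        dist2 = np.dot(to_light, to_light)
        wi = to_light / np.sqrt(dist2)
        cos_i = np.dot(normal, wi)
        if cos_i <= 0.0:
            return None  # point not illuminated by the source
        # Irradiance from a small source: intensity falls off with the
        # squared distance and is foreshortened by the incident cosine.
        irradiance = light_intensity * cos_i / dist2
        # BRDF = outgoing radiance / incident irradiance.
        return pixel_value / irradiance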
|
|
|
|
The measured BRDF shows reciprocity when mapped
as a height field over the (θi, θe) plane; a simple symmetry check is
sketched below
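A tiny sketch of such a check, assuming the BRDF has been resampled
onto a regular (θi, θe) grid (the real measurements are scattered):

    import numpy as np

    def reciprocity_error(brdf_table):
        """Max asymmetry of a BRDF tabulated over (theta_i, theta_e)."""
        # Swapping incident and exit angles should leave the BRDF
        # unchanged, so the table should be approximately symmetric.
        return np.max(np.abs(brdf_table - brdf_table.T))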
|
|
|
|
|
|
Goal: To model a scene with realistic
reflectance properties, from images of the scene, so that it can be given
novel lighting conditions or have 3D objects placed in it.
|
|
|
|
Estimates the incident radiances of the surfaces
in a scene.

The radiance estimates are used to estimate the
reflectance properties of the surfaces in the scene, by an iterative
procedure

The reflectance property estimates can then be used to
re-estimate the incident radiances
|
|
|
|
|
|
The surfaces of the environment are broken into
a finite number of patches

Patches are assumed to have constant radiosity and
diffuse albedo

For each patch:

B_i = E_i + ρ_i Σ_j B_j F_ij

B_i is the radiosity

E_i is the emission

ρ_i is the diffuse albedo

F_ij is the form factor between the
patches

The form factor F_ij is the fraction of the power leaving patch
i that is received by patch j

B_i and E_i are measured
from a photo with known geometry, and F_ij is derived from the
geometry, so we can find the
diffuse portion of the reflection:

ρ_i = (B_i - E_i) / (Σ_j B_j F_ij)
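A small numerical sketch of this albedo recovery; the radiosities,
emissions, and form factors below are made-up toy values, not data
from the paper:

    import numpy as np

    B = np.array([3.0, 2.5, 2.0])   # measured radiosity of each patch
    E = np.array([2.0, 1.5, 1.0])   # measured emission of each patch
    F = np.array([[0.0, 0.3, 0.2],  # F[i, j]: fraction of power leaving
                  [0.4, 0.0, 0.3],  # patch i that reaches patch j
                  [0.2, 0.4, 0.0]])

    incident = F @ B                # sum_j B_j * F_ij for every patch i
    rho = (B - E) / incident        # rho_i = (B_i - E_i) / (sum_j B_j F_ij)
    print(rho)                      # estimated diffuse albedos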
|
|
|
|
L_i = (ρ_d/π + ρ_s K(α, Θ_i)) I_i

L_i is the radiance

I_i is the irradiance

ρ_d/π is the diffuse term

ρ_s K(α, Θ_i) is the specular term

α is the parameterized surface roughness

Θ_i is the azimuth of the
incident and viewing directions

With the data from the collected images we can solve
the nonlinear optimization problem and obtain parameters for ρ_s,
ρ_d, and α, as in the sketch below

The radiance image must cover an area with a
specular highlight, or we will not have enough information for recovering
the specular parameters.
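A minimal sketch of such a fit; the Gaussian half-angle lobe used for
K, the synthetic observations, and the least-squares solver are
illustrative stand-ins for the model and optimizer actually used:

    import numpy as np
    from scipy.optimize import least_squares

    def specular_kernel(alpha, theta_h):
        """Placeholder specular lobe K(alpha, .) in the half-angle."""
        return np.exp(-(theta_h / alpha) ** 2)

    def predicted_radiance(params, theta_h, irradiance):
        rho_d, rho_s, alpha = params
        return (rho_d / np.pi
                + rho_s * specular_kernel(alpha, theta_h)) * irradiance

    # Synthetic observations spanning a specular highlight (toy values).
    theta_h = np.linspace(0.0, 0.8, 40)       # half-angles in radians
    irradiance = np.full_like(theta_h, 2.0)   # incident irradiance
    observed = predicted_radiance([0.4, 0.6, 0.15], theta_h, irradiance)
    observed += np.random.default_rng(0).normal(0.0, 0.01, observed.shape)

    def residuals(params):
        return predicted_radiance(params, theta_h, irradiance) - observed

    fit = least_squares(residuals, x0=[0.2, 0.2, 0.3],
                        bounds=([0, 0, 1e-3], [1, 5, 1.0]))
    rho_d, rho_s, alpha = fit.x
    print(rho_d, rho_s, alpha)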
|
|
|
|
|
|
|
|
Debevec P., "Image-Based Lighting", IEEE
Computer Graphics and Applications, March/April 2002, pp. 26-34

Yu Y., Debevec P., Malik J., and Hawkins T., "Inverse
Global Illumination: Recovering Reflectance Models of Real Scenes from
Photographs", Proc. ACM SIGGRAPH 1999

Marschner S.R., Westin S.H., Lafortune E.P.F., and
Torrance K.E., "Image-Based Bidirectional Reflectance Distribution Function
Measurement", Applied Optics, 39(16), 2000
|