Photon Mapping

Introduction
Global illumination is critical in generating synthetic images that are indistinguishable from images of the real world. Two major categories of GI algorithms are radiosity methods and point-sampling techniques such as path tracing. Global illumination algorithms are designed to approximate the rendering equation, which describes the complete transport of light within an environment.
Photon mapping is a two-pass global illumination algorithm developed by Henrik Jensen as an efficient alternative to pure Monte Carlo techniques such as path tracing. In this write-up I plan to give an overview of the two passes of photon mapping. I will not touch upon participating media, the various optimizations relating to photon mapping, or its uses in other algorithms.

First Pass - Photon Tracing
Photon tracing is the process of emitting discrete photons from the light sources and tracing them through the scene. The primary goal of this pass is to populate the photon maps that are used in the rendering pass to calculate the reflected radiance at surfaces and the in-scattered radiance in participating media.

Photon Emission
A photon’s life begins at the light source. For each light source in the scene we create a set of photons and divide the overall power of the light source amongst them. Brighter lights emit more photons than dimmer lights. The number of photons to create at each light depends largely on whether decent radiance estimates can be made during the rendering pass: for good radiance estimates, the local density of the photons at surfaces needs to provide a good statistic of the illumination.
Jensen states that any type of light source can be used and describes several emission models. For point light sources we want to emit photons uniformly in all directions. For area light sources we pick a random position on the surface and then pick a random direction in the hemisphere above this position.
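
To make the point-light case concrete, here is a rough sketch in C++ of emitting photons from a point light and dividing its power evenly among them. The Vec3 type and trace_photon() are assumptions about the surrounding renderer, and the direction is drawn by rejection sampling the unit cube, similar to the scheme Jensen suggests.

#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

// Assumed to exist elsewhere in the renderer: scatters the photon through the
// scene and stores it in the photon map at diffuse hits.
void trace_photon(const Vec3& origin, const Vec3& dir, const Vec3& power);

// Emit n_photons from a point light, dividing the light's power evenly among
// them.  Directions are drawn uniformly over the sphere by rejection sampling
// points in the unit cube.
void emit_from_point_light(const Vec3& light_pos, const Vec3& light_power,
                           int n_photons, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u(-1.0f, 1.0f);
    const Vec3 photon_power = { light_power.x / n_photons,
                                light_power.y / n_photons,
                                light_power.z / n_photons };
    for (int i = 0; i < n_photons; ++i) {
        Vec3 d;
        float len2;
        do {                                   // reject points outside the unit ball
            d = { u(rng), u(rng), u(rng) };
            len2 = d.x * d.x + d.y * d.y + d.z * d.z;
        } while (len2 > 1.0f || len2 < 1e-6f);
        const float inv_len = 1.0f / std::sqrt(len2);
        d = { d.x * inv_len, d.y * inv_len, d.z * inv_len };
        trace_photon(light_pos, d, photon_power);
    }
}

Dividing the power evenly keeps the photon powers similar, which simplifies the radiance estimate later on.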

Photon Scattering
Emitted photons from light sources are scattered through a scene and are eventually absorbed or lost. When a photon hits a surface we can decide how much of its energy is absorbed, reflected and refracted based on the surface’s material properties.
Rather than scaling a photon’s power down at every interaction, photon mapping uses a sampling technique called Russian Roulette to decide probabilistically whether a photon is reflected, refracted, or absorbed. This keeps photon powers similar, keeps the number of stored photons manageable, and does not bias the result.
The expected value of the next radiance estimate is calculated as follows:

\[ E[\text{estimate}] = p \cdot \frac{L}{p} + (1 - p) \cdot 0 = L \]

where L is computed by tracing another ray and is weighted by its probability: the probability of taking another radiance estimate is p, otherwise, with probability (1 - p), we stop and contribute nothing.
The probability events we are concerned with in photon mapping are whether photons are absorbed, reflected, or refracted. If we only consider the probability events for reflection and absorption, we generate a uniform random number ξ in [0, 1] and compare it against the surface reflectivity d:

\[ \xi \in [0, d] \;\Rightarrow\; \text{reflection}, \qquad \xi \in (d, 1] \;\Rightarrow\; \text{absorption} \]
It is important to note that the power of the reflected photon is not modified; correctness of the overall result converges with more samples. The probability is given by the reflectivity of the surface. If the reflectivity of a surface is 0.5, then 50% of the photons will be reflected at full power while the other 50% will be absorbed. To show why Russian Roulette reduces computational and storage costs, consider shooting 1000 photons at a surface with reflectivity 0.5: we could reflect 1000 photons at half power, or we can reflect 500 at full power using Russian Roulette. Considering only diffuse reflection, specular reflection, and absorption, with diffuse reflectance P_d and specular reflectance P_s (where P_d + P_s ≤ 1), we have the following domain for the random number ξ:

\[ \xi \in [0, P_d] \;\Rightarrow\; \text{diffuse reflection}, \qquad \xi \in (P_d, P_d + P_s] \;\Rightarrow\; \text{specular reflection}, \qquad \xi \in (P_d + P_s, 1] \;\Rightarrow\; \text{absorption} \]
Again, we decide what to do with the photon by generating a random number within the domain. For more information see [1].
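
As a minimal sketch of that decision, assuming the surface’s diffuse and specular reflectances Pd and Ps are already available as scalars (for colored surfaces Jensen derives them from the per-channel reflectances), the choice reduces to a single comparison:

#include <random>

// Possible outcomes for a photon hitting a surface.
enum class PhotonEvent { DiffuseReflect, SpecularReflect, Absorb };

// Russian Roulette: one uniform random number selects the event from the
// domain [0, Pd] / (Pd, Pd+Ps] / (Pd+Ps, 1], where Pd and Ps are the surface's
// diffuse and specular reflectances (Pd + Ps <= 1).  The surviving photon
// keeps its full power.
PhotonEvent russian_roulette(float Pd, float Ps, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u(0.0f, 1.0f);
    const float xi = u(rng);
    if (xi <= Pd)       return PhotonEvent::DiffuseReflect;
    if (xi <= Pd + Ps)  return PhotonEvent::SpecularReflect;
    return PhotonEvent::Absorb;
}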

Photon Storing
For a given scene we may shoot millions of photons from the light sources, so it is desirable that our photon map is compact to reduce storage costs. We also want it to support fast three-dimensional spatial searches, as we will need to query the photon map millions of times during the rendering phase. The data structure that Jensen recommends for the photon map is a kd-tree; it is one of the few data structures well suited to the non-uniform distributions of photons that arise in practice. The worst-case complexity of locating photons in a kd-tree is O(n), whereas in a balanced kd-tree it is O(log n), so after all photons are stored in the map we want to make sure it is balanced. For each photon we store its position, power, and incident direction. Jensen proposes the following structure.
struct photon {
    float x, y, z;      // position
    char  p[4];         // power packed into 4 bytes (shared-exponent RGB)
    char  phi, theta;   // compressed incident direction
    short flag;         // flag used when building/balancing the kd-tree
};
Jensen prescribes Ward’s shared-exponent RGB format for packing the power into 4 bytes. The phi and theta fields are spherical coordinates that are mapped to 65536 possible directions. The position can possibly be compressed further into a 24-bit fixed-point format. For most serious implementations the structure will be as compressed as possible to allow for extremely large and complex scenes; for naïve renderers the structure can remain largely uncompressed for ease of use.
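
For reference, the shared-exponent packing itself takes only a few lines; this sketch follows the widely used float-to-RGBE conversion (the function name is my own):

#include <cmath>

// Pack a linear RGB power value into Ward's 4-byte shared-exponent (RGBE)
// format: the three mantissas share one exponent taken from the largest
// component.
void pack_rgbe(float r, float g, float b, unsigned char out[4])
{
    const float v = std::fmax(r, std::fmax(g, b));
    if (v < 1e-32f) {                    // too dim to represent: store black
        out[0] = out[1] = out[2] = out[3] = 0;
        return;
    }
    int e;
    const float scale = std::frexp(v, &e) * 256.0f / v;  // v = m * 2^e, m in [0.5, 1)
    out[0] = (unsigned char)(r * scale);
    out[1] = (unsigned char)(g * scale);
    out[2] = (unsigned char)(b * scale);
    out[3] = (unsigned char)(e + 128);   // biased exponent
}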

Second Pass - Rendering
Photon mapping is used to calculate the effects of indirect lighting and caustics. Too many photons would be needed in the photon map to accurately handle specular/glossy surfaces and direct lighting; those effects are handled more efficiently with standard ray tracing.

Approximating the Rendering Equation
During the rendering pass we use an approximation of the rendering equation to compute the reflected radiance at surface locations. The rendering equation, in the form that is typically used, is:

\[ L_r(x, \vec{\omega}) = \int_{\Omega_x} f_r(x, \vec{\omega}', \vec{\omega}) \, L_i(x, \vec{\omega}') \, (\vec{n}_x \cdot \vec{\omega}') \, d\omega' \]

where L_r is the reflected radiance leaving x in direction ω, f_r is the BRDF at x, L_i is the incoming radiance arriving from direction ω', n_x is the surface normal at x, and Ω_x is the hemisphere of incoming directions above x. In the above equation we are integrating, over the hemisphere above x, the incoming radiance per solid angle as seen by x.
We can use density estimation to estimate the illumination by querying for the nearest N photons from the photon map.
This density estimate gives us the incoming flux, or irradiance. If we ignore the wave phenomena of light and any time dependence, irradiance tells us how much light is arriving at a particular area. In other words, the incident irradiance E_i is the flux Φ_i, the summed power of the arriving photons, per unit area:

\[ E_i = \frac{d\Phi_i}{dA} \]
With a differential solid angle dω', the differential irradiance dE_i can be expressed in terms of the incident radiance L_i as:

\[ dE_i(x, \vec{\omega}') = L_i(x, \vec{\omega}') \, (\vec{n}_x \cdot \vec{\omega}') \, d\omega' \]
Using this relationship we can substitute for L_i in the rendering equation:

\[ L_r(x, \vec{\omega}) = \int_{\Omega_x} f_r(x, \vec{\omega}', \vec{\omega}) \, \frac{dE_i(x, \vec{\omega}')}{(\vec{n}_x \cdot \vec{\omega}') \, d\omega'} \, (\vec{n}_x \cdot \vec{\omega}') \, d\omega' \]
Canceling out the common terms we have:

\[ L_r(x, \vec{\omega}) = \int_{\Omega_x} f_r(x, \vec{\omega}', \vec{\omega}) \, dE_i(x, \vec{\omega}') \]
The nearest-neighbor query gives us a set of N photons covering a small area ΔA around x, a disc of radius r, so ΔA = πr², and for each photon p we know its power ΔΦ_p and incident direction ω_p. This gives us the following approximation:

\[ L_r(x, \vec{\omega}) \approx \sum_{p=1}^{N} f_r(x, \vec{\omega}_p, \vec{\omega}) \, \frac{\Delta\Phi_p(x, \vec{\omega}_p)}{\pi r^2} \]
In order to get a good radiance estimate we need a sufficient number of photons in our density estimate. This relates directly to the number of photons emitted from the light sources: the more photons we use, the more accurate our estimate. We also need to be careful how we collect photons. The main idea behind gathering photons is that we are hoping to get an idea of the illumination at x by examining the nearest N photons. If we include photons from other surfaces with drastically different surface normals, or from volumes, then we degrade the accuracy of the estimate.
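
Putting the approximation above into code, a radiance estimate over the N nearest photons might look like the following sketch; the NearPhoton record and the brdf() function are assumptions about the surrounding renderer, and the gathering area is taken to be the disc of radius r containing the photons.

#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

// One photon returned by the nearest-neighbour query (an assumed interface).
struct NearPhoton {
    Vec3  power;    // photon power, delta-Phi_p
    Vec3  dir;      // incident direction omega_p
    float dist2;    // squared distance from the photon to x
};

// Assumed BRDF evaluation at x for incoming direction wi and outgoing wo.
Vec3 brdf(const Vec3& x, const Vec3& wi, const Vec3& wo);

// Radiance estimate at x in direction wo: sum f_r * delta-Phi_p over the N
// nearest photons and divide by the area of the gathering disc, pi * r^2.
Vec3 radiance_estimate(const std::vector<NearPhoton>& nearest,
                       const Vec3& x, const Vec3& wo)
{
    const float kPi = 3.14159265f;
    Vec3 L = {0.0f, 0.0f, 0.0f};
    float r2 = 0.0f;
    for (const NearPhoton& p : nearest) {
        const Vec3 fr = brdf(x, p.dir, wo);
        L.x += fr.x * p.power.x;
        L.y += fr.y * p.power.y;
        L.z += fr.z * p.power.z;
        r2 = std::max(r2, p.dist2);             // radius of the gathering disc
    }
    if (r2 <= 0.0f) return {0.0f, 0.0f, 0.0f};
    const float inv_area = 1.0f / (kPi * r2);
    return { L.x * inv_area, L.y * inv_area, L.z * inv_area };
}

In practice the query would also reject photons whose incident direction lies below the surface at x, for exactly the reasons discussed above.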

Caustics
Rendering caustics is fairly straightforward, as we can visualize the photon map directly. Visualizing the photon map directly means we gather the N nearest photons and estimate the radiance from that information using the last equation from the previous section. During the photon tracing phase a separate photon map is used for caustics; otherwise many more photons would be needed in a single photon map to produce good caustics. For sharper caustics a filter can be used, such as a cone filter, where photons closer to the point x are weighted more than photons that are farther away.
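
As I recall it from Jensen’s course notes (so treat the exact constants as an assumption rather than something stated here), the cone filter weights photon p by its distance d_p to x and normalizes the estimate accordingly, with filter constant k ≥ 1:

\[ w_p = 1 - \frac{d_p}{k r}, \qquad L_r(x, \vec{\omega}) \approx \frac{\sum_{p=1}^{N} f_r(x, \vec{\omega}_p, \vec{\omega}) \, \Delta\Phi_p(x, \vec{\omega}_p) \, w_p}{\left(1 - \frac{2}{3k}\right) \pi r^2} \]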
Indirect Illumination
Indirect illumination is much trickier to get right. In this case, the information from the photon map can be used in a more accurate solution such as path tracing. However, good results can be obtained using photon mapping directly. Oftentimes a final gathering step is employed to improve the results from the photon map: during the final gather step we sample the hemisphere above the surface at x for incoming radiance, and for each sample we compute a radiance estimate directly from the photon map (a sketch is given below). The first image in this write-up uses this technique, as does the image on the right in the previous section. However, even without a final gather step it is possible to achieve images such as the one below. The image on the left shows the contribution from direct lighting only, and the image on the right was rendered using photon mapping without a final gather step. Only 26,000 photons actually made it into the window and 500 were used in the radiance estimate. Notice the color bleeding on the yellow ball and on the floor.
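
A rough sketch of such a final-gather step for a Lambertian surface follows; the Hit record and the helper functions declared at the top are assumptions about the surrounding renderer rather than anything prescribed by photon mapping itself.

#include <random>

struct Vec3 { float x, y, z; };
struct Hit  { Vec3 position; Vec3 normal; };   // assumed hit record

// Assumed to exist elsewhere in the renderer:
Vec3 cosine_sample_hemisphere(const Vec3& n, std::mt19937& rng);  // pdf = cos(theta)/pi
bool trace(const Vec3& origin, const Vec3& dir, Hit* hit);
Vec3 radiance_estimate(const Hit& hit);        // photon-map lookup at the hit point

// Final gather at x for a Lambertian surface with diffuse reflectance rho.
// With cosine-weighted sampling the cos(theta) and pi terms cancel against the
// Lambertian BRDF, so the indirect radiance is simply rho times the average
// photon-map estimate at the sampled hit points.
Vec3 final_gather(const Vec3& x, const Vec3& n, const Vec3& rho,
                  int n_samples, std::mt19937& rng)
{
    Vec3 sum = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < n_samples; ++i) {
        const Vec3 wi = cosine_sample_hemisphere(n, rng);
        Hit hit;
        if (trace(x, wi, &hit)) {
            const Vec3 Li = radiance_estimate(hit);
            sum.x += Li.x; sum.y += Li.y; sum.z += Li.z;
        }
    }
    const float inv_n = 1.0f / n_samples;
    return { rho.x * sum.x * inv_n, rho.y * sum.y * inv_n, rho.z * sum.z * inv_n };
}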
Even when using the photon map directly, images can take a long time to render, and adding final gathering to the rendering step can increase the render time dramatically. Ward [6] developed a method called irradiance caching in which the illumination is cached and re-used; illumination values from the cache can be interpolated at a point. The image on the left shows a visualization of the irradiance cache: the black points mark the locations where a cached value was used in the right image.
Flexibility
The rendering equation can be split up into four separate terms describing direct illumination, specular reflection, caustics, and indirect illumination. Splitting the incoming radiance into direct illumination from the lights L_{i,l}, caustics L_{i,c}, and indirect illumination L_{i,d}, and the BRDF into a specular/glossy part f_{r,S} and a diffuse part f_{r,D}, the reflected radiance becomes:

\[ L_r = \int_{\Omega_x} f_r \, L_{i,l} \, (\vec{n}_x \cdot \vec{\omega}') \, d\omega' + \int_{\Omega_x} f_{r,S} \, (L_{i,c} + L_{i,d}) \, (\vec{n}_x \cdot \vec{\omega}') \, d\omega' + \int_{\Omega_x} f_{r,D} \, L_{i,c} \, (\vec{n}_x \cdot \vec{\omega}') \, d\omega' + \int_{\Omega_x} f_{r,D} \, L_{i,d} \, (\vec{n}_x \cdot \vec{\omega}') \, d\omega' \]
Because the information from the photon map describes the incoming flux, and because this information is decoupled from the geometry, we have the flexibility to solve different parts of the rendering equation using multiple photon maps. This is also another reason why photon mapping can easily be incorporated into existing renderers. For example, we may have an existing path tracer in which we would like to use photon mapping to handle caustics: path tracing does a great job of computing the direct and indirect lighting, while photon mapping handles the caustics, which removes the high-frequency noise typically seen in path-traced images. In addition, we can use information from the photon map to help importance sample the hemisphere, for example in a path tracer. Another useful feature of the photon map is that it provides information that can be used as heuristics for other parts of the rendering computation. For example, with a little extra work, the photon map can indicate whether shadow rays are needed at a location, which alone can reduce rendering times significantly.

References