[Radiance-general] Computing irradiances for glow materials

Thomas Bleicher tbleicher at googlemail.com
Wed Aug 4 02:39:39 PDT 2010


Morning.

My comments are below to keep the context of the question.

2010/8/4 Claus Brøndgaard Madsen <cbm at create.aau.dk>:
> http://www.cvmt.dk/~cbm/tmp/radianceglowproblem/glowproblem.html

As a side note: you can hide light sources by switching off the "-dv"
option of rpict ("-dv-"). However, sky domes are usually made from the
"glow" primitive, which is treated differently from a proper "source".

In any case, to eliminate the sky dome in your second image, set an aft
clipping plane ("-va") just behind your geometry.

> In short: I have acquired an image of a scene for which I also have a 3D
> model. The camera is calibrated to the scene, and therefore I am able to
> backproject the image pixels onto the geometry. Now I want to use the rpict
> -i option to compute the irradiances in the scene, where the backprojected
> image should be the only illumination in the scene. I.e., the backprojected
> image should be treated as emitted radiance from diffusely emitting
> surfaces, and I want to render the irradiance received at all locations in
> the scene. But when I do that, the only thing which shows up in the render
> are the glow materials, not the irradiances.

If I understand correctly, you want to use the projected texture as an
HDR light source. Your problem is that you cannot get irradiance values
because your scene is made up entirely of glow materials, and rpict does
not show irradiance values for those (that's why it's called "glow").
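
To make the distinction concrete, the two primitives look roughly like
this (RGB values are placeholders, and the colorpict modifier that
would carry your projected texture is left out for brevity):

  void glow emitting_part
  0
  0
  4  .8 .8 .8  0

  void plastic receiving_part
  0
  0
  5  .5 .5 .5  0 0

With "rpict -i" the plastic surfaces show irradiance, while glow
surfaces keep showing their own emission.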

One idea would be to split the texture image into two or more parts and
define only the surfaces covered by a partial image as glow. The rest
becomes an ordinary plastic material, which will show irradiance values.
You then create separate renderings and add up all the irradiance values
to create the final image. You probably have to use "-dv" as well to keep
the glow surfaces out of the partial image (although I don't really know
if that affects "glow").
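
A rough sketch of that workflow (octree and file names are placeholders;
each octree has one part of the texture defined as glow and everything
else as plastic, and at least one ambient bounce is needed so the glow
actually illuminates the other surfaces):

  rpict -i -dv- -ab 1 -vf camera.vf scene_part1.oct > irr_part1.hdr
  rpict -i -dv- -ab 1 -vf camera.vf scene_part2.oct > irr_part2.hdr
  pcomb irr_part1.hdr irr_part2.hdr > irr_total.hdr

pcomb adds its input pictures by default, which gives you the sum of
the partial irradiance values.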

When you split the image you have to check which surfaces can illuminate
others. For example, the roof can illuminate the chimney but not the
walls. This also shows a problem with your method of projecting the image
from a single camera view point: the lantern on top of the building is
partially hidden by the chimney, so its texture will not cover the full
geometry. In reality the roof would receive light from the whole
structure, but because the hidden part carries no texture it will not
contribute to the irradiance on the roof. Of course that is only a
problem if you have a single camera view point.

Another more complex solution to your problem would be the following:

1) You generate the view point and direction of each image pixel with vwrays.
2) Along each ray you check for an intersection with the geometry.
3) At the intersection point you calculate the irradiance value with
"rtrace -I", using the intersection point and surface normal as origin
and direction. The calculated value gives you the RGB values for this
pixel.
4) You compile a new image pixel by pixel out of the individual irradiance values.

Although that sounds complicated, it is possible to do this in Radiance
with a single (long) command line. You can find an introduction to the
various tools and their command options here:

http://sites.google.com/site/tbleicher/radiance/stencil
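
To give you an idea, such a pipeline might look roughly like this
(untested; view file, octree, resolution and ambient settings are
placeholders, and rays that miss the geometry need some extra care):

  vwrays -ff -x 512 -y 512 -vf camera.vf \
    | rtrace -ff -h- -opn scene.oct \
    | rtrace -ffc `vwrays -d -x 512 -y 512 -vf camera.vf` -I+ -ab 1 scene.oct \
    > irradiance.hdr

The first rtrace returns the intersection point and surface normal for
every pixel ray ("-opn"), the second one turns each point/normal pair
into an irradiance value with "-I", and the resolution string from
"vwrays -d" makes it write the result as a picture.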


Happy reading,
Thomas


