[Radiance-general] [Fwd: Re: physically-based landscapes]

Rob Guglielmetti [email protected]
Tue, 03 Jun 2003 18:08:27 -0400


This went to Greg, but should have gone to the list.  More thoughts on 
this topic...

-------- Original Message --------
Subject: Re: physically-based landscapes
Date: Tue, 03 Jun 2003 16:58:47 -0400
From: Rob Guglielmetti <[email protected]>
Reply-To: [email protected]
To: Greg Ward <[email protected]>
References: <[email protected]>

Greg Ward wrote:
 >> From: Rob Guglielmetti <[email protected]>

 >> ... Carsten says that the -ar
 >> is based on the scene bounding cube, so even if I exclude the exterior
 >> values I need to crank it up, yes?
 >
 > Yes, though you could set -ar 0 and you might get around this problem.
 > The disadvantage is that the calculation can go a bit nuts in the little
 > corners, but it's only a problem on high-resolution renderings.  Rtrace
 > shouldn't be much affected.

Ahhh, but I *always* do renderings now, so that the ambient cache is
well populated! =8-)  No matter, I'm much more interested in making
option two work...
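(For the archive, Greg's -ar 0 suggestion in a rendering run would look
roughly like this; the octree and view-file names are made up:)

```
## Hedged sketch only -- scene.oct and interior.vf are placeholder names.
## -ar 0 removes the bounding-cube-based cap on ambient cache resolution,
## at the cost of possibly overworking small corners at high resolution.
rpict -vf interior.vf -ab 2 -ar 0 -aa .15 scene.oct > interior.hdr
```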

 >> The illum's luminous distribution function is the result of applying
 >> the lightmap to the window pane, just the same as if I were to use
 >> gensky?  The colorpict is purely for the view out, it does not
 >> contribute to the illuminance of the interior space?
 >
 > No, it's the same as if you used mkillum.  The only difference is that
 > rpict -vta -vh 180 -vv 180 computes the window's light distribution from
 > a single viewpoint, where mkillum would average it over the entire
 > window.  If your window is small relative to the closest geometry, the
 > difference is vanishingly small.

Well, the window mullions are pretty deep, but other than that, no.
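Just so I'm sure I follow the recipe, the single-viewpoint distribution
rendering you describe would be something like this (the viewpoint
coordinates and octree name are made-up placeholders):

```
## Hedged sketch: render the window's "view" as an angular fisheye from
## a point at the window center, facing out along the window normal.
rpict -vta -vp 2 3 1.5 -vd 0 -1 0 -vh 180 -vv 180 exterior.oct > winmap.hdr
```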
Now, your method utilizes a hemispherical fisheye view.  Since typical
HDR lightmaps are a re-mapping of two HDR images of a mirrored ball, one
90 degrees off-axis to the other, I'm confused as to how my
hemispherical image is to be rendered (or photographed).  If I do a
hemispherical rendering from 0 0 0 looking 0 0 1, I get the sky but no
ground; vice versa gives me the ground plane but no sky.  Do I need a
hemispherical view for each cardinal heading, or something like that?
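For concreteness, the two renderings I mean amount to these view
specifications (a sketch only; the viewpoint at the origin is just an
example):

```
## Hedged sketch of the two hemispherical views described above.
## Looking straight up: sky but no ground.
rvu -vta -vp 0 0 0 -vd 0 0 1 -vu 0 1 0 -vh 180 -vv 180
## Looking straight down: ground plane but no sky.
rvu -vta -vp 0 0 0 -vd 0 0 -1 -vu 0 1 0 -vh 180 -vv 180
```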

This brings me back to my original inquiry about this.  Maybe Santiago
asked the question better than I did, in his recent follow-up (Hi
Santiago!):

"...is there any way to map a HDR image in a source, something in the
way skies are generated (maybe replacing the skyfunc?) and then use it
as any other sky and/or ground? Somehow, for me this seems to be the
most natural solution."
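To sketch what I think Santiago means (with assumptions flagged: env.hdr
is a placeholder picture, and angmap.cal with coordinate functions
sb_u/sb_v is an assumed direction-to-image mapping file, not something I
know to ship with the distribution):

```
# Hedged sketch: wrap an HDR lightmap around the scene as a glowing
# source, in place of the usual gensky/skyfunc pair.
# env.hdr, angmap.cal, sb_u and sb_v are assumed names.
void colorpict envfunc
7 red green blue env.hdr angmap.cal sb_u sb_v
0
0

envfunc glow envglow
0
0
4 1 1 1 0

# 360 degrees to cover the full sphere (a gensky sky would use 180
# for each hemisphere).
envglow source envsphere
0
0
4 0 0 1 360
```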

I'm imagining either an HDR photograph lightmap of the entire
model-encompassing sphere, or a Radiance image of the same type of
thing, mapped to a large sphere, in essence.  Photometric accuracy of
the distribution of the light in the space, as well as the view out the
window wall, are both important, so I thought this method might combine
the two.  Sorry if I'm not explaining this well.

----

-RPG