[Radiance-general] Re: physically-based landscapes

Greg Ward [email protected]
Wed, 4 Jun 2003 12:15:02 -0700


If you photograph or render a fisheye view, you don't have to bother 
with the spheremap nonsense.  A fisheye view of the sky can replace 
gensky.  Likewise for a fisheye view out the window.  If you want both 
the sky and the ground for every possible viewpoint (rather than just 
out the window), you'll need one fisheye up and one fisheye down (from 
a bird's-eye view).  As I recommended for the window, you should use a 
lower resolution image for the light distribution than you use for the 
view.  You'll learn a lot by trying the example I suggested in my 
earlier e-mail.
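Sketching the window setup just described as a scene fragment (everything here is illustrative, not from the original message: the filenames, the resolutions, and "fish.cal" -- a small function file you would write yourself to map the ray direction onto angular-fisheye (u,v) picture coordinates; the variable names fish_u/fish_v are made up):

```
# Assumed inputs, produced beforehand with something like:
#   rpict -vta -vh 180 -vv 180 ... exterior.oct > window.hdr
#   pfilt -1 -x 32 -y 32 window.hdr > window_lo.hdr

# Low-resolution picture drives the illum's light distribution
void colorpict window_dist
7 red green blue window_lo.hdr fish.cal fish_u fish_v
0
0

window_dist illum window_illum
1 window_view
0
3 1 1 1

# Full-resolution picture supplies the view out, as the
# illum's alternate material (seen in direct views only)
void colorpict window_pic
7 red green blue window.hdr fish.cal fish_u fish_v
0
0

window_pic glow window_view
0
0
4 1 1 1 0
```

The point of the two resolutions is that the light distribution is sampled many times per interior ray, so a coarse image suffices there, while the directly viewed picture needs full detail.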

-Greg

> From: Rob Guglielmetti <[email protected]>

>>> The illum's luminous distribution function is the result of applying 
>>> the lightmap to the window pane, just the same as if I were to use 
>>> gensky?  The colorpict is purely for the view out, it does not 
>>> contribute to the illuminance of the interior space?
>>  No, it's the same as if you used mkillum.  The only difference is 
>> that rpict -vta -vh 180 -vv 180 computes the window's light 
>> distribution from a single viewpoint, where mkillum would average it 
>> over the entire window.  If your window is small relative to the 
>> closest geometry, the difference is vanishingly small.
>
> Well, the window mullions are pretty deep, but other than that, no. 
> Now, your method utilizes a hemispherical fisheye view.  Since typical 
> HDR lightmaps are a re-mapping of two HDR images of a mirrored ball, 
> one 90 degrees off-axis to the other, I'm confused as to how my 
> hemispherical image is to be rendered (or photographed).  If I do a 
> hemispherical rendering of 0 0 0 0 0 1, I get the sky, but no ground. 
> Vice versa gives me the ground plane but no sky.  Do I need a 
> hemispherical view for each cardinal heading, or something like that?
>
> This brings me back to my original inquiry about this.  Maybe Santiago 
> asked the question better than I did, in his recent follow-up (Hi 
> Santiago!):
>
> "...is there any way to map a HDR image in a source, something in the
> way skies are generated (maybe replacing the skyfunc?) and then use it 
> as any other sky and/or ground? Somehow, for me this seems to be the 
> most natural solution."
>
> I'm imagining either an HDR photograph lightmap of the entire 
> model-encompassing sphere, or a Radiance image of the same type of 
> thing, mapped to a large sphere, in essence.  Photometric accuracy of 
> the distribution of the light in the space, as well as the view out 
> the window wall are both important, so I thought this method might 
> combine the two.  Sorry if I'm not explaining this well.
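For what it's worth, the skyfunc-style mapping Santiago describes can be sketched with a colorpict pattern on a glow-modified source, one per hemisphere.  Everything below is an assumption for illustration (filenames, the cal-file contents, and the orientation convention, which may need flipping to match how the fisheye was rendered):

```
{ envmap.cal -- sketch: angular-fisheye (u,v) from ray direction,
  assuming the image centre looks straight up (+Z) }
dnorm : sqrt(Dx*Dx + Dy*Dy + Dz*Dz);
drad = sqrt(Dx*Dx + Dy*Dy);
r = acos(Dz/dnorm) * 2 / PI;
fish_u = if(drad - 1e-7, .5 + .5*r*Dx/drad, .5);
fish_v = if(drad - 1e-7, .5 + .5*r*Dy/drad, .5);
```

```
# Upper hemisphere: HDR fisheye looking up, applied to an upward source
void colorpict sky_pic
7 red green blue sky_up.hdr envmap.cal fish_u fish_v
0
0

sky_pic glow sky_glow
0
0
4 1 1 1 0

sky_glow source sky
0
0
4 0 0 1 180

# A second source aimed 0 0 -1, with its own downward fisheye picture
# and mapping, would cover the ground hemisphere
```

This is essentially the two-fisheye arrangement Greg describes above, written out as a source pair rather than applied to window geometry.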