[Radiance-general] Attempt to illuminate Paul Debevec's rnl_scene.rad with a hemispherical HDR image of the sky

kyle konis kskonis at gmail.com
Mon Oct 15 13:18:15 PDT 2012


Claus, thanks for the detailed explanation.

RE: > Your sky images naturally only cover a hemisphere. If done
correctly in HDRshop, your env maps, once remapped to the angular
mapping, would return black for everything BELOW the horizon (below
the xy-plane).

I found sample sky probes with this mapping on
http://projects.ict.usc.edu/graphics/skyprobes/

When I initially looked at these images, I had assumed they were
acquired using a chrome sphere (with the ground portion cropped)
rather than a fisheye lens, due to their "half-sphere" appearance.

I'll give these a try using Paul's .cal file and method. Assuming that
works, I will try to transform my hemispherical sky images to match
this angular mapping using HDRshop.

Thanks for taking a break from vacation to push this along!

-Kyle



On Mon, Oct 15, 2012 at 12:40 PM, Claus Brøndgaard Madsen
<cbm at create.aau.dk> wrote:
>
> Oops ... I didn't notice two threads were involved :-)
>
> Yep, panoramic conversion tools in HDRshop are second to none.
>
> At the conceptual level: it makes absolutely no difference what geometry your light probe is mapped onto. It can be a sphere, a box, a superquad, a tetrahedron, whatever, as long as it totally encompasses the scene.
>
> The .cal you had in your other post would work for any geometry provided you had the correct mapping (image format) of the env map.
>
> That .cal file is called whenever a light ray intersects the sky box (it is a box in Debevec's example, but it could just as well be a sphere). The .cal file takes the coordinates of that intersection, converts them to a spherical direction, and then transforms this direction into a look-up in an angular mapping of the env map.
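>
> Concretely, in the angmap.cal from your other post (quoted in full
> further down), that conversion is just these three lines, where
> (DDx, DDy, DDz) is the normalized direction:
>
> r = 0.159154943*acos(DDz)/sqrt(DDx*DDx + DDy*DDy);
> sb_u = 0.5 + DDx * r;
> sb_v = 0.5 + DDy * r;
>
> The constant 0.159154943 is 1/(2*pi): the angle between the ray and
> the +z axis becomes the radial distance from the image center, so the
> full sphere of directions (0 to pi) maps to image radii 0 to 0.5.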
>
> Therefore it does not make any difference what geometry you are using, because it is just a proxy geometry used to generate intersections of which only the directions, not the distances, are relevant. Hence the requirement for a big box in Debevec's example. Had the object/scene to be illuminated been 10000 by 10000 units, the box would have had to be at least an order of magnitude larger than that.
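>
> For instance (hypothetical numbers), the env box line from Debevec's
> tutorial would then become something like:
>
> !genbox env_glow boxenv 100000 100000 100000 -i | xform -t -50000 -50000 -50000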
>
> Your sky images naturally only cover a hemisphere. If done correctly in HDRshop, your env maps, once remapped to the angular mapping, would return black for everything BELOW the horizon (below the xy-plane).
>
> If you go ahead and try to make your own .cal file that suits your cropped fisheye images, then you don't have to worry about the black corners of the square image outside the circular fisheye projection of the sky. Those pixels would never be looked up, because they correspond to impossible ray directions.
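>
> In the meantime, a sketch of what such a .cal could look like
> (untested and written from memory, so treat it as a starting point
> rather than a working file; it assumes an equidistant 180-degree
> fisheye with +z up and the zenith at the image center, and you may
> need to negate Dx or Dy if your image turns out east/west mirrored).
> It deliberately reuses one of those black corners for directions
> below the horizon:
>
> {
> skyfish.cal - look up an upward 180-degree equidistant fisheye sky image.
> Uses the ray direction (Dx, Dy, Dz), so it also works with a distant
> "source" hemisphere, not just a proxy box.
> }
>
> { 0.318309886 = 1/pi: 90 degrees from the zenith lands at r = 0.5 }
> rfish = 0.318309886 * acos(Dz) / sqrt(Dx*Dx + Dy*Dy);
>
> { below the horizon (Dz <= 0), index the black image corner instead }
> sb_u = if(Dz, 0.5 + Dx*rfish, 0);
> sb_v = if(Dz, 0.5 + Dy*rfish, 0);
>
> Like Debevec's angmap.cal, this divides 0 by 0 for a ray aimed
> exactly at the zenith; in practice that direction is essentially
> never hit.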
>
> As you can see, I am writing from my iPod as I am on vacation. I may be able to provide more direct technical assistance once I'm back in my office next week. I can probably find some image examples and a .cal file we can tweak to your needs.
>
>
> Best
> Claus
>
> Sent from my iPod
>
> On 15/10/2012, at 19.32, "kyle konis" <kskonis at gmail.com> wrote:
>
>> Hello Claus,
>>
>> Thanks for the response, both on the HDRI list and here. I've never
>> gotten into custom .cal files, but they look like the place to start
>> to get exactly what I want.
>>
>> I quickly read through some of the HDRshop tutorials on: http://www.hdrshop.com/
>>
>> There are many interesting and useful tools (including scripting).
>> Perhaps the "Panoramic Image Conversion" tool is what you are
>> referring to?
>>
>> At the conceptual level, the step I am confused by is identifying
>> the appropriate virtual object in Radiance to map my HDR environment
>> onto.
>>
>> Paul uses an inward-pointing box, as in:
>>
>> ###############################################################
>> # specify a large inward-pointing box to be mapped with the HDR environment
>>
>> !genbox env_glow boxenv 500 500 500 -i | xform -t -250 -18 -250
>> ###############################################################
>>
>> All surfaces of the box contribute light.
>>
>> Because I have only captured light data from the horizon up, and don't
>> care (at this stage) about the ground plane, it would seem that I
>> should start with a box that encompasses a hemisphere positioned on
>> the xy plane.
>>
>> Assuming this step is correct, I would then aim to map the non-zero
>> pixels in my hemispherical image onto the surfaces of that box
>> (effectively stretching the circle into a rectangle).
>>
>> Alternatively, I could also try to map to a hemisphere, but then there
>> is the issue of what to do with the "black" pixels that make up the
>> corners of the HDR image.
>>
>> best,
>>
>> -Kyle
>>
>>
>>
>>
>> On Sun, Oct 14, 2012 at 12:23 PM, Claus Brøndgaard Madsen
>> <cbm at create.aau.dk> wrote:
>>>
>>>
>>> Hi Kyle,
>>>
>>> Solving your problem is easy and not so easy.
>>>
>>> There will be some overlap between this response and my response to your other post, but for completeness...
>>>
>>> The easiest solution would be to re-map your hemispherical sky image to the angular mapping that Debevec prefers. I myself prefer the lat-long mapping, so I have written my own mapping .cal file. Writing such a .cal is easy, but you have to know what you are doing, and there can be some debugging (trial and error) involved.
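>>>
>>> The core of such a lat-long .cal can be as small as this (a sketch
>>> from memory, untested; it assumes a 2:1 equirectangular image with
>>> +z up, and note that Radiance normalizes picture coordinates so the
>>> smaller dimension spans 0..1, which is why u runs from 0 to 2 here):
>>>
>>> sb_u = 1 + 0.318309886*atan2(Dy, Dx);   { azimuth; 0.318309886 = 1/pi }
>>> sb_v = 0.5 + 0.318309886*asin(Dz);      { elevation }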
>>>
>>> In order to remap your sky view to an angular mapping, the simple and easy route would be the HDRshop program (www.hdrshop.com), but there is no longer a free 1.0 version. Versions for academia are not too expensive, though.
>>>
>>> Best
>>> Claus
>>>
>>>
>>>
>>>
>>> Claus B. Madsen
>>> AD:MT/AAU
>>>
>>>
>>> On 14/10/2012, at 12.17, "kyle konis" <kskonis at gmail.com> wrote:
>>>
>>>> Dear list,
>>>>
>>>> I am attempting to illuminate Paul Debevec's rnl_scene.rad with a
>>>> hemispherical HDR image of a dynamic sky.
>>>>
>>>> Following the tutorial at http://www.pauldebevec.com/RNL/Source/
>>>>
>>>> I have succeeded in simply substituting in my image, but because it is
>>>> a 180-degree fisheye image (a zenith view of the sky) rather than a
>>>> light probe, I am encountering some issues with the realism of the
>>>> output image.
>>>>
>>>> I would appreciate suggestions for how to properly map this image data
>>>> to a hemisphere (i.e. the sky) in Radiance so that the image can serve
>>>> properly as a light source.
>>>>
>>>> I have included the relevant code below, and sample images on a
>>>> temporary webpage:
>>>>
>>>> https://sites.google.com/site/konisskytemp/
>>>>
>>>>
>>>> I would appreciate any suggestions!
>>>>
>>>> -Kyle
>>>>
>>>>
>>>> ### Here is the relevant description of the lighting environment (I
>>>> have only changed the image file name to source my 1350 x 1350 px
>>>> cropped sky image)
>>>>
>>>> ################################################################
>>>> #
>>>> #  Lighting Environment
>>>> #
>>>> ################################################################
>>>>
>>>> # specify the probe image and how it is mapped onto geometry
>>>>
>>>> void colorpict hdr_env
>>>> 7 red green blue cropped.hdr angmap.cal sb_u sb_v
>>>> 0
>>>> 0
>>>>
>>>> # specify a "glow" material that will use this image
>>>>
>>>> hdr_env glow env_glow
>>>> 0
>>>> 0
>>>> 4 1 1 1 0
>>>>
>>>> # specify a large inward-pointing box to be mapped with the HDR environment
>>>>
>>>> !genbox env_glow boxenv 500 500 500 -i | xform -t -250 -18 -250
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> ################################################################
>>>> ### and here is angmap.cal
>>>> ################################################################
>>>>
>>>> {
>>>> angmap.cal
>>>>
>>>> Convert from directions in the world to coordinates on the angular sphere image
>>>>
>>>> -z is forward (outer edge of sphere)
>>>> +z is backward (center of sphere)
>>>> +y is up (toward top of sphere)
>>>> }
>>>>
>>>> sb_u = 0.5 + DDx * r;
>>>> sb_v = 0.5 + DDy * r;
>>>>
>>>> r = 0.159154943*acos(DDz)/sqrt(DDx*DDx + DDy*DDy);
>>>>
>>>> DDy = Py*norm;
>>>> DDx = Px*norm;
>>>> DDz = Pz*norm;
>>>>
>>>> norm = 1/sqrt(Py*Py + Px*Px + Pz*Pz);
>>>>
>>>>
>>>>
>>>>
>>>> ### Any suggestions for solutions would be appreciated
>>>>
>>>> I assume that I need to map the image to a hemisphere (representing
>>>> the sky dome) rather than a box.
>>>>
>>>> Perhaps by passing the image in as a glow, in place of the standard sky glow?
>>>>
>>>> skyfunc glow skyglow
>>>> 0
>>>> 0
>>>> 4 1.000 1.000 1.000 0
>>>> skyglow source sky
>>>> 0
>>>> 0
>>>> 4 0 0 1 180
>>>>
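>>>> i.e., perhaps wiring my image-driven glow into the source instead,
>>>> something like (just a guess; presumably this would also need a
>>>> .cal that works from the ray direction (Dx, Dy, Dz) rather than
>>>> from the intersection point, since a "source" is at infinity):
>>>>
>>>> hdr_env glow env_glow
>>>> 0
>>>> 0
>>>> 4 1 1 1 0
>>>>
>>>> env_glow source sky
>>>> 0
>>>> 0
>>>> 4 0 0 1 180
>>>>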
>>>> Or to a sphere?
>>>>
>>>>
>>>>
>>>> And I assume that I need a different angmap.cal file, given that this
>>>> was written to interpret light probe images... and not hemispherical
>>>> fisheye images.
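>>>>
>>>> If I'm reading angmap.cal right, perhaps the change is small: for an
>>>> equidistant fisheye covering only the upper hemisphere, the constant
>>>> 1/(2*pi) would become 1/pi, e.g. (an untested guess):
>>>>
>>>> r = 0.318309886*acos(DDz)/sqrt(DDx*DDx + DDy*DDy);
>>>>
>>>> so that 90 degrees from the zenith lands at the image edge (r = 0.5)
>>>> ... plus some handling for directions below the horizon, which would
>>>> otherwise index outside the picture.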



More information about the Radiance-general mailing list