[Radiance-general] Re: per-vertex coloring

Greg Ward gward at lmi.net
Tue Nov 13 21:24:54 PST 2007


Hi Jelle,

I lost this thread due to some e-mail difficulties I was having with my  
online service.  Those seem to be sorted out now...

A quick look at the ra_xyz man page will confirm that this program is  
for converting between the RGB and XYZ color spaces; it has nothing to  
do with the pixel coordinates of an image.  (Confusing, I know.)

Is your mesh grid regular?  If not, you're sort of stuck because you  
need to resample the colors onto a grid, which is a hard problem.  If  
you have a grid to begin with, it may be as simple as establishing a  
correspondence between the local (u,v) coordinates and a dump of  
values into a Radiance picture (using pvalue -r ...).
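
Something like this ought to do it -- just a sketch, assuming a regular  
64x64 grid and a plain-text dump with one "R G B" triple per line in  
scanline order (the file names and the resolution are made up for  
illustration):

  # -h/-H: the input has no header or resolution string
  # -d: no pixel coordinates in the data, just RGB triples
  pvalue -r -h -H -d -y 64 +x 64 vertex_colors.txt > vertex_colors.hdr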

If you feel like hacking C code, the program src/hd/rhpict will take  
a set of point values and resample them into an image.  I don't think  
it would take that much to get it to accept colors and positions  
rather than a holodeck as input.
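
For reference, once you have such a picture, a colorpict pattern driven  
by the mesh's local (u,v) coordinates (the Lu and Lv variables) should  
paint it back onto the geometry.  A rough sketch with placeholder file  
names, using a glow material so the data colors come through independent  
of the lighting:

  # mesh.rtm would come from something like:  obj2mesh mesh.obj mesh.rtm
  void colorpict vcolor_pat
  7 red green blue vertex_colors.hdr . frac(Lu) frac(Lv)
  0
  0

  # self-luminous, so the lookup-table colors are reproduced directly
  vcolor_pat glow vcolor_mat
  0
  0
  4 1 1 1 0

  vcolor_mat mesh vcolor_mesh
  1 mesh.rtm
  0
  0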

-Greg

---------------
> Hi Thomas & Greg,
>
> Many thanks for your suggestions here!
>
> Thomas, the texture-based approach is something that crossed my mind
> as well.
> The other thing I was wondering about is whether the colordata
> primitive might be relevant here?
>
> For the texture-based approach, ra_xyz is probably my friend here,
> though I'm mostly wondering how I could generate the proper set of uv
> coordinates to get the right mapping.
> I'm not using textures in any sense; the use of per-vertex coloring is
> closer to scientific data visualization.
> Each vertex has a corresponding scalar value assigned to it, which is
> mapped to a color value by a lookup table, and the values of the
> lookup table are the ones I'm trying to map.
>
> Many thanks for your suggestions!
>
> -jelle
>
> > Nice to be right at least once, but I was thinking of your mail
> > about 'extraction of luminance values' from 17/08/2007, especially
> > the part with the coloured 3D plot at the end.
> >
> > Reading it again, it might not be that simple to apply to this
> > problem, though. My idea was to read the colour value and save
> > it to a known picture position, then assign this position as the
> > UV coordinate of the mesh vertex. The picture, used as a texture,
> > should then give the mesh the desired colouring.
> >
> > What I probably had in mind was something Blender users can do:
> > create a mesh (example: a sphere), 'paint' a mouth, nose, eyes and
> > hair onto the sphere in 3D and 'unwrap' it to a texture image.
> > Then you can use Photoshop etc. to refine that sketchy image to
> > a proper face texture.
> >
> > Applied here it would be:
> >
> > 1) get the mesh into Blender (...)
> > 2) unwrap and save texture of vertex colours
> > 3) export mesh to *.obj with uv-texture
> > 4) convert texture image to *.pic
> > 5) use obj2mesh to get shape and texture into Radiance
> >
> >
> > Regards,
> > Thomas


