[Radiance-general] memory errors in 3.4
Greg Ward
[email protected]
Wed, 30 Jan 2002 09:34:28 -0800
Hi Lars,
If you are getting out-of-memory errors, you should check that you
don't have a limit placed on your data size. Run this command in csh or
tcsh:
% limit
It should give you a list of resource limits currently in effect. If
the datasize says anything besides "unlimited", you should run the
command "unlimit datasize" before running Radiance. Better still, but
such a command in your .cshrc or .tcshrc file in your home directory.
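For example, assuming the usual csh startup files, you could add this
line to ~/.cshrc (or ~/.tcshrc):

unlimit datasize

so that every new shell starts with the data size limit removed.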
If you are still getting out-of-memory errors after that, there is
either a problem with your virtual memory size or a problem with
Radiance.
Raphael wrote to tell me that ximage segfaults when you re-expose the
image under his version of Linux. I found the problem -- a pointer was
being freed twice, because XDestroyImage() frees the image data itself;
either the library changed at some point, or it always did this and no
one noticed before. Anyway, I've fixed the problem and will quietly
fold it into this release rather than creating a patch, so you can
either download the release again or, if you have a slow connection,
make the following change yourself:
ray/src/px/x11image.c (line 781):
free(ourdata);
change to:
/* free(ourdata); This is done in XDestroyImage()! */
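For those curious, here is a minimal sketch of the failure mode (with
made-up names, not the actual ximage source): XDestroyImage() frees the
data buffer that was handed to XCreateImage(), so a separate free() of
the same pointer releases it twice.

#include <stdlib.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>

/* Hypothetical illustration only. XDestroyImage() frees img->data,
 * so the explicit free() below releases the same buffer twice. */
void double_free_demo(Display *dpy, Visual *vis, int depth, int w, int h)
{
	char *ourdata = malloc(w * h * 4);	/* 32-bit pixel buffer */
	XImage *img = XCreateImage(dpy, vis, depth, ZPixmap, 0,
			ourdata, w, h, 32, 0);
	/* ... draw with the image ... */
	free(ourdata);		/* first free of the buffer */
	XDestroyImage(img);	/* frees img->data again -- crash */
}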
Then simply run "rmake install" in that directory, presuming you've run
"makeall install" previously. Running "makeall install" again also
works.
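For example, assuming the source tree is unpacked under ./ray:

% cd ray/src/px
% rmake install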
Regarding pcond and macbethcal, I don't know what the problem is, but I
don't recommend this method anymore, anyway. Macbethcal works very well
for calibrating input paths, but less well for calibrating output paths.
Unfortunately, I don't have a really good solution to offer for this at
the moment....
-Greg