Did you *try* "ulimit -n"?
How many files are in your working directory when it bombs?
Try typing

    ls | wc -l    # that's a pipe after ls, not another lowercase L

and then

    ulimit -n
If the number of files is close to or greater than what the
"ulimit -n" command reports, then you actually are running into the
problem I mentioned.
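The two checks above can be combined into a quick sketch (run it in
the directory where the program fails; the variable names here are
just for illustration):

```shell
# Compare the number of files in the current directory with the
# per-process open-file limit of this shell.
nfiles=$(ls | wc -l)
limit=$(ulimit -n)
echo "files: $nfiles   open-file limit: $limit"
if [ "$nfiles" -ge "$limit" ] 2>/dev/null; then
    echo "file count meets or exceeds the descriptor limit"
fi
```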
Running the "ulimit" command without arguments reports "unlimited" on
Mac OS X, but that figure has nothing to do with the open-file limit.
You must raise the number of open files per process explicitly, with
something like "ulimit -n 1024".
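A session might look like this (1024 is just an illustrative value;
the hard limit, "ulimit -Hn", caps how high an unprivileged user can
raise the soft limit):

```shell
# Raise the soft open-file limit for the current shell session only,
# then verify the new value took effect.
ulimit -n 1024
ulimit -n      # reports the new soft limit for this shell
```

The change lasts only for the current shell and its children, so it
needs to be set in the same shell that launches the failing program.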
I went through this exact problem with the sieve command, and apart
from using the "ulimit" command, the only other option was to give up
on sieve.
Well, that's my take on it anyway, and I could be totally off base.
Perhaps there is a bug where file descriptors aren't being properly
disposed of during the sieve routine, but that's a question for the
powers that be.
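If someone wanted to check that leak theory (it is only a guess on my
part, not a diagnosis), one rough approach is to count a process's
open descriptors while it runs:

```shell
# Count the open descriptors of the current shell via /dev/fd, which
# works on both Mac OS X and Linux. For another process, substitute
# its PID in "lsof -p <pid> | wc -l" to get the same kind of count.
ls /dev/fd | wc -l
```

A count that climbs steadily while the program runs, and never comes
back down, would point toward descriptors not being released.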
On Jun 25, 2009, at 7:54 PM, Rachel Rice wrote:
> I get the bus error even with ulimit set to "unlimited". The error
> does not happen when the program is trying to write output files -
> it appears to occur when it attempts to read in data for the second
> set of clustering.
>
>
> On Thu, Jun 25, 2009 at 1:53 PM, David Watson <dewatson.olemiss.edu>
> wrote:
>
>> Rachel,
>>
>> Check the following message from the archive:
>> http://archive.ambermd.org/200901/0303.html
>>
>> I hope that helps.
>>
>>
>>
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Mon Jul 06 2009 - 11:18:52 PDT