Hello again Amber users and developers
I return with more questions. When running cpptraj hbond analyses that include lifetime analysis, the memory demand sometimes peaks at around 80 GB, which is a bit more than I have access to. I am assuming that the lifetime analysis keeps something in memory, since running just the hbond analysis only uses around 2-5% of the available memory.
So my question is: is there any way to perform the lifetime analysis on the entire data set while using intermediate files in some way, and thus reduce the memory requirement for the analysis?
This is the input I'm using:
hbond S1 series out series_file.out \
donormask :URA.N1 donorhmask :URA.H1 \
acceptormask :URA.O1 \
avgout average_file.out nointramol
run
runanalysis lifetime S1[solutehb] out lifetime_file.out
Keeping my fingers crossed!
Best regards
// Gustaf
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Tue Sep 25 2018 - 23:30:02 PDT