On Sat, Sep 13, 2025, Dulal Mondal wrote:
>Thank you for your response. Without ifreaf=1, the pmemd.cuda.MPI run is
>fine. But when I set ifreaf=1, it does not run.
>
>But on my local machine it runs properly.
>> > Running multipmemd version of pmemd Amber24
>> > Total processors = 24
>> > Number of groups = 24
>> >
>>
>> Try running a short test job with the serial version of pmemd (or even with
>> sander). That might give you more helpful error messages.
1. Are there any error messages in the mdout files when ifreaf=1?
2. When you say the "local machine is running properly", is that with
identical inputs, especially the same number of groups and number of
processors, and also using pmemd.cuda.MPI?
3. Are you willing to try what was suggested above, i.e. a test run with
just (say) two groups (and hence 2 MPI threads)?
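For reference, a minimal two-group multipmemd test might look like the
sketch below. The file names (mdin.1, prmtop, inpcrd, etc.) are
placeholders to be replaced with your own inputs; the -ng and -groupfile
flags are the standard pmemd mechanism for multi-group runs.

```shell
# groupfile: one line of pmemd command-line flags per group.
# File names here are hypothetical; substitute your own inputs.
cat > groupfile.2 <<'EOF'
-O -i mdin.1 -o mdout.1 -p prmtop -c inpcrd -r restrt.1 -x mdcrd.1
-O -i mdin.2 -o mdout.2 -p prmtop -c inpcrd -r restrt.2 -x mdcrd.2
EOF

# One MPI rank per group: -ng must match the number of groupfile lines,
# and the total rank count must be a multiple of -ng.
mpirun -np 2 pmemd.cuda.MPI -ng 2 -groupfile groupfile.2
```

If this small case fails the same way, the mdout.1/mdout.2 files are the
first place to look for error messages.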
Finally, please direct replies to the Amber mailing list, not just to me.
I know almost nothing about ifreaf, so you will want to get feedback from
those who do.
...dac
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Sun Sep 14 2025 - 11:30:02 PDT