Hi Jason,
Here is the list of files that were created:
-rw-rw-r-- 1 freedman freedman     262 Aug 20 11:34 _MMPBSA_gb_decomp_rec.mdin
-rw-rw-r-- 1 freedman freedman     251 Aug 20 11:34 _MMPBSA_gb_decomp_lig.mdin
-rw-rw-r-- 1 freedman freedman     307 Aug 20 11:34 _MMPBSA_gb_decomp_com.mdin
-rw-rw-r-- 1 freedman freedman  657295 Aug 20 11:35 _MMPBSA_receptor.pdb
-rw-rw-r-- 1 freedman freedman 1971541 Aug 20 11:35 _MMPBSA_receptor.mdcrd.0
-rw-rw-r-- 1 freedman freedman    7722 Aug 20 11:35 _MMPBSA_normal_traj_cpptraj.out
-rw-rw-r-- 1 freedman freedman    3571 Aug 20 11:35 _MMPBSA_ligand.pdb
-rw-rw-r-- 1 freedman freedman   10781 Aug 20 11:35 _MMPBSA_ligand.mdcrd.0
-rw-rw-r-- 1 freedman freedman  296227 Aug 20 11:35 _MMPBSA_dummyreceptor.inpcrd
-rw-rw-r-- 1 freedman freedman    1708 Aug 20 11:35 _MMPBSA_dummyligand.inpcrd
-rw-rw-r-- 1 freedman freedman  297833 Aug 20 11:35 _MMPBSA_dummycomplex.inpcrd
-rw-rw-r-- 1 freedman freedman  660886 Aug 20 11:35 _MMPBSA_complex.pdb
-rw-rw-r-- 1 freedman freedman 1982241 Aug 20 11:35 _MMPBSA_complex.mdcrd.0
-rw-rw-r-- 1 freedman freedman     439 Aug 20 11:35 progress.log
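
There are no mdout files among them; a quick check along these lines (the
name pattern is only my guess at what MMPBSA.py uses) finds nothing:

  ls -l _MMPBSA_*.mdout* 2>/dev/null || echo "no _MMPBSA_*.mdout* files"
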
Thanks,
Holly
On Wed, Aug 20, 2014 at 3:08 PM, Jason Swails <jason.swails.gmail.com>
wrote:
> On Wed, Aug 20, 2014 at 3:17 PM, Hallel Freedman <hfreedma.ualberta.ca>
> wrote:
>
> > Hi Jason,
> > We were in fact using a serial version of MMPBSA.py, and it passes all of
> > the serial tests. Do you have any other suggestions for me please?
> >
>
> What intermediate files _were_ made?
>
>
> > Thanks,
> > Holly
> >
> >
> > On Wed, Aug 20, 2014 at 11:41 AM, Jason Swails <jason.swails.gmail.com>
> > wrote:
> >
> > > On Wed, Aug 20, 2014 at 1:37 PM, Hallel Freedman <hfreedma.ualberta.ca>
> > > wrote:
> > >
> > > > It doesn't seem to be getting that far - there is no file with that name.
> > > > Maybe I should mention that I am getting a warning:
> > > > An MPI process has executed an operation involving a call to the
> > > > "fork()" system call to create a child process. Open MPI is currently
> > > > operating in a condition that could result in memory corruption or
> > > > other system errors; your MPI job may hang, crash, or produce silent
> > > > data corruption. The use of fork() (or system() or other calls that
> > > > create child processes) is strongly discouraged.
> > > >
> > > > The process that invoked fork was:
> > > >
> > > > Local host: jasper.westgrid.ca (PID 19790)
> > > > MPI_COMM_WORLD rank: 0
> > > >
> > > > Could this be related to the problem?
> > > >
> > >
> > > If you're having trouble in parallel, the first thing to do (if possible)
> > > is to run in serial. It's a very simple way to help reduce the number of
> > > places one has to look for the source of the problem.
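> > >
> > > For illustration only (the input, topology, and trajectory file names
> > > below are assumptions, not taken from your setup), the two invocations
> > > look roughly like:
> > >
> > >   # serial run: one process, easier to trace failures
> > >   MMPBSA.py -O -i mmpbsa.in -o FINAL_RESULTS_MMPBSA.dat \
> > >       -cp complex.prmtop -rp receptor.prmtop -lp ligand.prmtop -y prod.mdcrd
> > >
> > >   # the same job in parallel, only once the serial run finishes cleanly
> > >   mpirun -np 4 MMPBSA.py.MPI -O -i mmpbsa.in -o FINAL_RESULTS_MMPBSA.dat \
> > >       -cp complex.prmtop -rp receptor.prmtop -lp ligand.prmtop -y prod.mdcrd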
> > >
> > > I can't recall the exact name of the output file off the top of my head,
> > > but the name should indicate it is the mdout file of rank 0 (.mdout.0)
> > > for the complex. The output files often hide the true error messages.
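> > >
> > > A quick way to pull any such messages out of the intermediate outputs
> > > (just a sketch; adjust the file patterns to whatever was actually
> > > written) is something like:
> > >
> > >   grep -i -A2 "error" _MMPBSA_*.mdout* _MMPBSA_*.out progress.log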
> > >
> > > --
> > > Jason M. Swails
> > > BioMaPS, Rutgers University
> > > Postdoctoral Researcher
>
>
>
> --
> Jason M. Swails
> BioMaPS, Rutgers University
> Postdoctoral Researcher
_______________________________________________
AMBER mailing list
AMBER.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber
Received on Wed Aug 20 2014 - 14:30:03 PDT