Hi Claudio,
Can you send me the specs of your machine, please: processors, OS version,
MPI installation and version, compilers and versions, etc.? Also whether you
used Intel MKL and, if so, which version, and the options you gave to
./configure_amber.
Can you also please send me
$AMBERHOME/test/qmmm2/1NLN_test_diagonalizers/mdout.1NLN_auto
If you could also try this test case with the serial version of the code,
and on 2 processors as well, and let me know what happens, that would be
great.
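In case it helps, here is a sketch of those two runs. The directory guard and
the skip message are my additions (not part of the AMBER test harness), and the
csh syntax used in the original report is shown in the comments:

```shell
#!/bin/sh
# Sketch: rerun the failing test case serially and on 2 MPI ranks.
# Assumes an AMBER installation with $AMBERHOME set; skips otherwise.
if [ -n "$AMBERHOME" ] && [ -d "$AMBERHOME/test/qmmm2/1NLN_test_diagonalizers" ]; then
    cd "$AMBERHOME/test/qmmm2/1NLN_test_diagonalizers"

    # Serial run: make sure DO_PARALLEL is unset (csh: unsetenv DO_PARALLEL)
    unset DO_PARALLEL
    ./Run.1NLN_auto

    # Two-process run (csh: setenv DO_PARALLEL 'mpirun -np 2')
    DO_PARALLEL='mpirun -np 2'
    export DO_PARALLEL
    ./Run.1NLN_auto
else
    echo "AMBERHOME not set or test directory missing; nothing to run"
fi
```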
All the best
Ross
> -----Original Message-----
> From: owner-amber.scripps.edu [mailto:owner-amber.scripps.edu] On Behalf
> Of Carra, Claudio (JSC-SK)[USRA]
> Sent: Friday, May 02, 2008 6:37 AM
> To: amber.scripps.edu
> Subject: AMBER: test.parallel.QMMM problem
>
> Dear All,
> I encountered the following problem while running the
> parallel QM/MM tests:
>
> cd $AMBERHOME/test
> setenv DO_PARALLEL 'mpirun -np 4'
> make test.parallel.QMMM
>
> All the other tests run fine. I've downloaded and updated
> all the files for DFTB in $AMBERHOME/dat/slko.
>
>
> ==============================================================
> cd qmmm2/1NLN_test_diagonalizers && ./Run.1NLN_dsyevd
> diffing mdout.1NLN_dsyevd.save with mdout.1NLN_dsyevd
> PASSED
> ==============================================================
> cd qmmm2/1NLN_test_diagonalizers && ./Run.1NLN_dsyevr
> diffing mdout.1NLN_dsyevr.save with mdout.1NLN_dsyevr
> PASSED
> ==============================================================
> cd qmmm2/1NLN_test_diagonalizers && ./Run.1NLN_auto
> forrtl: severe (174): SIGSEGV, segmentation fault occurred
> Image        PC                Routine   Line      Source
> sander.MPI   000000000071ACF9  Unknown   Unknown   Unknown
> sander.MPI   00000000006EFDAE  Unknown   Unknown   Unknown
> sander.MPI   00000000006CFD88  Unknown   Unknown   Unknown
> sander.MPI   000000000066EDE8  Unknown   Unknown   Unknown
> sander.MPI   000000000082985A  Unknown   Unknown   Unknown
> sander.MPI   00000000004FE0C8  Unknown   Unknown   Unknown
> sander.MPI   00000000004B0500  Unknown   Unknown   Unknown
> sander.MPI   00000000004A9252  Unknown   Unknown   Unknown
> sander.MPI   00000000004058AA  Unknown   Unknown   Unknown
> libc.so.6    0000003534B1C3FB  Unknown   Unknown   Unknown
> sander.MPI   00000000004057EA  Unknown   Unknown   Unknown
> -----------------------------------------------------------------------------
> One of the processes started by mpirun has exited with a nonzero exit
> code. This typically indicates that the process finished in error.
> If your process did not finish in error, be sure to include a "return
> 0" or "exit(0)" in your C code before exiting the application.
>
> PID 19803 failed on node n0 (10.46.3.254) with exit status 174.
> -----------------------------------------------------------------------------
> make[1]: *** [test.sander.QMMM] Error 1
> make: *** [test.sander.QMMM.MPI] Error 2
> ./Run.1NLN_auto: Program error
>
> ----
>
> Any help is highly appreciated.
> Sincerely,
> Claudio
>
> -----------------------------------------------------------------------
> The AMBER Mail Reflector
> To post, send mail to amber.scripps.edu
> To unsubscribe, send "unsubscribe amber" to majordomo.scripps.edu
Received on Sun May 04 2008 - 06:07:40 PDT