RE: AMBER: FW: MPI error message

From: Ross Walker <ross.rosswalker.co.uk>
Date: Tue, 24 Jul 2007 14:31:12 -0700

Hi Taryn,
 
This error message isn't much help at all; the real error message has been
lost somewhere. Do you get any output file at all? Does the queuing system
give you the contents of stdout and stderr? Typically these are put into a
file that you specify in the queue submission script.
 
Also, have you tried this without MPI? If you run this with just regular
sander interactively on your machine, does it run okay? Then try submitting a
single-processor job to your queue, again NOT using mpirun and calling the
single-processor version of sander. Does this run correctly?
 
Only if the two runs above work should you move on to running in parallel.
Also note that when running in parallel you must use sander.MPI as the
executable, which, from your error message, it doesn't look like you were
doing.
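As a rough illustration of the two test runs above, a minimal single-processor PBS script might look like the following. The input and output filenames (md.in, md_serial.out, etc.) and the resource lines are placeholders to adapt to your site; only the .prmtop/.inpcrd names come from this thread.

```shell
#!/bin/sh
#PBS -l nodes=1:ppn=1       # single processor: no mpirun, plain sander
#PBS -l walltime=1:00:00
#PBS -j oe                  # merge stderr into stdout so errors are captured

cd $PBS_O_WORKDIR

# Serial test with the single-processor executable
$AMBERHOME/exe/sander -O -i md.in -o md_serial.out \
    -p bundlewat.prmtop -c bundlewat.inpcrd -r md_serial.rst

# Only once the serial run completes cleanly, switch to a parallel script:
# request more processors and call sander.MPI via mpirun, e.g.
#   mpirun -np 4 $AMBERHOME/exe/sander.MPI -O -i md.in -o md_par.out \
#       -p bundlewat.prmtop -c bundlewat.inpcrd -r md_par.rst
```

Running the serial case first separates genuine input-file problems (which will reproduce without MPI) from problems with the parallel launch itself.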
 
MPI: On host co-compute2, Program
/usr/apps/chemistry/amber/amber9/amber9/exe/sander.

All the best
Ross
 

/\
\/
|\oss Walker

| HPC Consultant and Staff Scientist |
| San Diego Supercomputer Center |
| Tel: +1 858 822 0854 | EMail:- ross.rosswalker.co.uk |
| http://www.rosswalker.co.uk <http://www.rosswalker.co.uk/> | PGP Key
available on request |

Note: Electronic Mail is not secure, has no guarantee of delivery, may not
be read every day, and should not be used for urgent or sensitive issues.

 


  _____

From: owner-amber.scripps.edu [mailto:owner-amber.scripps.edu] On Behalf Of
Taryn Hartley
Sent: Tuesday, July 24, 2007 12:00
To: amber.scripps.edu
Subject: AMBER: FW: MPI error message


Although I have yet to determine why the first attempt failed, I re-made the
.inpcrd and .prmtop files in xLeap and attempted to run the job again. This
is the error message I received this time... any thoughts?


set_SCR: using existing PBS job directory /scratch/batch/205048
MPI: On host co-compute2, Program
/usr/apps/chemistry/amber/amber9/amber9/exe/sander.
MPI, Rank 0, Process 7718 called MPI_Abort(<communicator>, 1)

MPI: --------stack traceback-------
Internal Error: Can't read/write file "/dev/mmtimer", (errno = 22)
Internal Error: Can't read/write file "/dev/sgi_fetchop", (errno = 22)
MPI: Intel(R) Debugger for Itanium(R)-based Applications, Version 9.0-20, Build 20060218
MPI: Reading symbolic information from /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI...done
MPI: Attached to process id 7718 ....
MPI: stopped at [0xa000000000010641]
MPI: >0 0xa000000000010641
MPI: #1 0x2000000005873a80 in __waitpid(...) in /lib/tls/libc.so.6.1
MPI: #2 0x20000000000e4170 in MPI_SGI_stacktraceback(...) in /usr/lib/libmpi.so
MPI: #3 0x20000000001208d0 in PMPI_Abort(...) in /usr/lib/libmpi.so
MPI: #4 0x20000000001bdda0 in mpi_abort__(...) in /usr/lib/libmpi.so
MPI: #5 0x40000000003641f0 in mexit_(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #6 0x4000000000211510 in getcor_(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #7 0x40000000001a5440 in sander_(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #8 0x4000000000194bc0 in MAIN__(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #9 0x4000000000004840 in main(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #10 0x2000000005791c50 in __libc_start_main(...) in /lib/tls/libc.so.6.1
MPI: #11 0x4000000000004580 in _start(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI

MPI: -----stack traceback ends-----
MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
MPI: aborting job





  _____

From: taryn_hartley.hotmail.com
To: amber.scripps.edu
Subject: MPI error message
Date: Mon, 23 Jul 2007 11:39:27 -0600


Regarding using sander from Amber 9 to run an MD simulation: my PBS script
called $AMBERHOME/exe/sander, and I was receiving this error message:
MPI: co-compute1: 0x34b5000042df8b72: forrtl: severe (64): input conversion error, unit 8, file /u/ac/thartley/test/bundlewat.prmtop
MPI: co-compute1: 0x34b5000042df8b72: Image        PC                Routine  Line     Source
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000C91390  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000C8C560  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000C34250  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000B882E0  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000B87810  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000BCB6B0  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       40000000001B13C0  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       400000000018B850  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       40000000001874F0  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000003D40  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: libc.so.6.1  2000000001D0DC50  Unknown  Unknown  Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000003A80  Unknown  Unknown  Unknown
MPI: could not run executable (case #4)

It was suggested to me to add .MPI after sander ($AMBERHOME/exe/sander.MPI),
which I did, and now my error message reads as follows:
set_SCR: using existing PBS job directory /scratch/batch/204612
forrtl: severe (64): input conversion error, unit 8, file
/u/ac/thartley/test/bundlewat.prmtop
Image        PC                Routine  Line     Source
sander.MPI 40000000007F3CD0 Unknown Unknown Unknown
sander.MPI 40000000007EEEA0 Unknown Unknown Unknown
sander.MPI 4000000000796B90 Unknown Unknown Unknown
sander.MPI 40000000006EEEE0 Unknown Unknown Unknown
sander.MPI 40000000006EE410 Unknown Unknown Unknown
sander.MPI 4000000000731DF0 Unknown Unknown Unknown
sander.MPI 40000000001CAA00 Unknown Unknown Unknown
sander.MPI 40000000001A4650 Unknown Unknown Unknown
sander.MPI 4000000000194BC0 Unknown Unknown Unknown
sander.MPI 4000000000004840 Unknown Unknown Unknown
libc.so.6.1 2000000005791C50 Unknown Unknown Unknown
sander.MPI 4000000000004580 Unknown Unknown Unknown
MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
MPI: aborting job

Is it my .prmtop file, as the second line indicates? .MPI was not used in the
PBS scripts in the tutorials (which I completed successfully before
attempting my own project), and those are what I am using as my guide. Help?

-Taryn












-----------------------------------------------------------------------
The AMBER Mail Reflector
To post, send mail to amber.scripps.edu
To unsubscribe, send "unsubscribe amber" to majordomo.scripps.edu
Received on Wed Jul 25 2007 - 06:07:40 PDT