MD run for the NVT and NPT steps is aborted when running on a cluster

GROMACS version: 2021.1
GROMACS modification: No
Dear colleagues,
I want to run a protein-in-water simulation on a cluster. I have already generated the .tpr files, but when I submit the MD run, the job is aborted. The .tpr file name is correct and the file is present in the directory. Moreover, when I run the same files on a workstation, they run without any error, so I don't understand what the problem is. I want to run on the cluster because it will save time. Please help me with this.
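For context, my PBS job script is essentially the following (a simplified sketch; the cd line and the mdrun command are the ones shown in the error output below, everything else is trimmed):

#!/bin/bash
#PBS -l nodes=1:ppn=20

# change to the directory that holds the .tpr files
cd /home/nppsls/subhasree_2022/1yamb_in

echo "Number of nodes is $(sort -u "$PBS_NODEFILE" | wc -l)"
echo "Number of processors is $(wc -l < "$PBS_NODEFILE")"
echo "Node list is $(sort -u "$PBS_NODEFILE" | tr '\n' ' ')"

cd /home/nppsls/subhasree_2022/1yamb_in
mpirun -np 20 gmx_mpi mdrun -v -s npt.tpr -deffnm npt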

For reference, I am attaching the error output:

/var/spool/pbs/mom_priv/jobs/167170.master1.cmsd.uohyd.SC: line 7: cd: /home/nppsls/subhasree_2022/1yamb_in: No such file or directory
Number of nodes is 1
Number of processors is 20
Node list is node05.cmsd.uohyd
/var/spool/pbs/mom_priv/jobs/167170.master1.cmsd.uohyd.SC: line 16: cd: /home/nppsls/subhasree_2022/1yamb_in: No such file or directory
:-) GROMACS - gmx mdrun, 2021.1 (-:

[standard GROMACS contributor and copyright banner omitted]

GROMACS: gmx mdrun, version 2021.1
Executable: /home/hpc/apps/gromacs-2021-openmpi-312-gcc820/bin/gmx_mpi
Data prefix: /home/hpc/apps/gromacs-2021-openmpi-312-gcc820
Working dir: /home/nppsls
Command line:
gmx_mpi mdrun -v -s npt.tpr -deffnm npt


Program: gmx mdrun, version 2021.1
Source file: src/gromacs/commandline/cmdlineparser.cpp (line 275)
Function: void gmx::CommandLineParser::parse(int*, char**)
MPI rank: 0 (out of 20)

Error in user input:
Invalid command-line options
In command-line option -s
File ‘npt.tpr’ does not exist or is not accessible.
The file could not be opened.
Reason: No such file or directory
(call to fopen() returned error code 2)

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors

[the same error was printed once by each of the 20 MPI ranks]


MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

[the MPI_ABORT notice was repeated once per rank]

Hi,

From the error output, this is the directory that GROMACS is being launched in:

Working dir: /home/nppsls

Is that really the directory that contains your npt.tpr? This earlier line shows that your run script tries to change to another directory, but fails because it doesn't exist:

cd: /home/nppsls/subhasree_2022/1yamb_in: No such file or directory

Possible typo?

Perhaps it’s as simple as correcting that line.
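For example, something along these lines (an untested sketch; the path is the one from your error output, and $PBS_O_WORKDIR assumes you submit the job from the directory that actually contains the .tpr files):

#!/bin/bash
#PBS -l nodes=1:ppn=20

# Fail the job immediately if the directory is missing or mistyped,
# instead of letting mdrun start in the wrong place:
cd /home/nppsls/subhasree_2022/1yamb_in || exit 1

# Alternatively, run from wherever the job was submitted:
# cd "$PBS_O_WORKDIR" || exit 1

mpirun -np 20 gmx_mpi mdrun -v -s npt.tpr -deffnm npt

With the || exit 1 guard, a wrong path stops the job with a clear message instead of the MPI_ABORT cascade you saw.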

Regards,
Petter

Dear Pjohansson,
Thank you so much for spotting this small mistake. I will try your suggestion and check whether it works. Thanks once again.

Thanks and regards,
Subhasree

Did you find a solution for the error you mentioned (regarding MPI_ABORT)? I am facing the same issue while running on a cluster.
I am using a Slurm script.
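The relevant part of my script looks roughly like this (a sketch; the path and resource numbers are placeholders, not my real setup):

#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks=20

cd /path/to/run_directory        # placeholder for my actual run directory
srun gmx_mpi mdrun -v -s npt.tpr -deffnm npt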
Looking forward to your reply.

Thanks!