Mpirun: cannot start gmx_mpi on n0 (o): No such file or directory

GROMACS version: 2020.4
I'm trying to run this command:

mpirun -np 4 gmx_mpi mdrun -s remd_.tpr -multi 36 -replex 500 -deffnm remd_ -v >& md.out &

I got

Mpirun: cannot start gmx_mpi on n0 (o): No such file or directory

You need to configure your environment so that it knows where the gmx_mpi binary is.
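If GROMACS was installed to its default prefix (an assumption on my part; use whatever prefix you passed to CMAKE_INSTALL_PREFIX), sourcing the GMXRC script that the installation provides will put gmx_mpi on your PATH:

source /usr/local/gromacs/bin/GMXRC
which gmx_mpi

After that, mpirun should be able to find the binary by name.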

Thanks for replying. The error changed to:

[proxy:0:0@user] HYDU_create_process (utils/launch/launch.c:74): execvp error on file gmx_mpi (No such file or directory)
[proxy:0:0@user] HYDU_create_process (utils/launch/launch.c:74): execvp error on file gmx_mpi (No such file or directory)
[proxy:0:0@user] HYDU_create_process (utils/launch/launch.c:74): execvp error on file gmx_mpi (No such file or directory)
[proxy:0:0@user] HYDU_create_process (utils/launch/launch.c:74): execvp error on file gmx_mpi (No such file or directory)

That’s the same problem, just printed multiple times by the different MPI processes.
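execvp is the call mpirun uses to start each rank, so every rank reports the same missing binary. As a quick test you can bypass PATH entirely and give mpirun an absolute path (the prefix below is the GROMACS default and just a guess on my part):

ls /usr/local/gromacs/bin/
mpirun -np 4 /usr/local/gromacs/bin/gmx_mpi mdrun -s remd_.tpr -multi 36 -replex 500 -deffnm remd_ -v

If gmx_mpi is not in that directory, the MPI-enabled binary was never installed.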

I did the following:

cmake ..  -DGMX_MPI=ON 

and got

CMake Warning at cmake/gmxManageSharedLibraries.cmake:74 (message):
Both BUILD_SHARED_LIBS and GMX_BUILD_MDRUN_ONLY are set.  Generally, an
mdrun-only build should prefer to use static libraries, which is the
default if you make a fresh build tree.  You may be re-using an old build
tree, and so may wish to set BUILD_SHARED_LIBS=off yourself.
Call Stack (most recent call first):
CMakeLists.txt:456 (include)
-- Configuring done
-- Generating done
-- Build files have been written to: /home/user/Desktop/gromacs-2020.4
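For reference, cmake only configures the build; gmx_mpi is only produced and installed by a subsequent make and make install. A typical sequence, sketched here with the default install prefix as an assumption, would be:

cmake .. -DGMX_MPI=ON
make -j 4
sudo make install
source /usr/local/gromacs/bin/GMXRC

Without the make install step, nothing new appears under /usr/local/gromacs/bin.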

Then I tried to find the path of gmx_mpi with:

whereis gmx_mpi

and got:

gmx_mpi:

I also checked echo $GMXDATA

and got /usr/local/gromacs/share/gromacs

and gmx --version runs fine.

Do I even have gmx_mpi?
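Since gmx --version runs, one way to check (the exact wording may differ between versions) is to look at the MPI line in its output:

gmx --version | grep -i "MPI library"

A gmx built without -DGMX_MPI=ON normally reports thread_mpi there; gmx_mpi only gets installed from a build configured with -DGMX_MPI=ON.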

I've been trying hard to figure this out and searched some more. I changed the command to use

mdrun_mpi instead of gmx_mpi mdrun

Is that right?
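As far as I know (worth double-checking against your own install), the two names come from different build types: a full MPI build installs gmx_mpi and you run mdrun as a subcommand, while an mdrun-only MPI build, like the GMX_BUILD_MDRUN_ONLY setup your cmake warning mentions, installs a standalone mdrun_mpi:

mpirun -np 4 gmx_mpi mdrun -s remd_.tpr -deffnm remd_ -v
mpirun -np 4 mdrun_mpi -s remd_.tpr -deffnm remd_ -v

Either form works for REMD as long as the binary actually exists and was built with MPI.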

I tried to run it:

mpirun -np 4 mdrun_mpi -s remd_.tpr -multi 20 -replex 500 -deffnm remd_ -v >& md.out &

and got

The number of ranks (1) is not a multiple of the number of simulations (20)

I have 20 temperature files, and my CPU has 6 cores.

and in the md.out file I got:

MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
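The REMD error means the rank count has to be a multiple of the replica count, so 20 replicas need -np 20 (or 40, 60, ...). A sketch of a matching launch, assuming your mpirun is Open MPI (md.out suggests it is) and that you are willing to oversubscribe the 6 cores:

mpirun -np 20 --oversubscribe mdrun_mpi -s remd_.tpr -multi 20 -replex 500 -deffnm remd_ -v >& md.out &

Note also that the run reported only 1 rank even though you asked for 4, which usually means the mpirun you launched with does not match the MPI library the binary was built against; and newer GROMACS versions replaced -multi with -multidir, so check which of the two your mdrun_mpi actually accepts.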

Thanks a lot for your time and help.