Unable to run fully populated MPI job

GROMACS version: 2019.3
GROMACS modification: No

First of all, I am brand new to GROMACS, so I apologize if what I say does not make sense. Here is my problem: I book a full 128-core node for my jobs, but if I run a fully populated MPI GROMACS job I get an error:

mpirun -n 128 gmx_mpi mdrun -resethway -noconfout -s ion_channel.tpr
mpirun was unable to start the specified application as it encountered an

Error code: 63
Error name: (null)

But I can run a job with 64 MPI processes:

mpirun -n 64 gmx_mpi mdrun -resethway -noconfout -s ion_channel.tpr
Using 64 MPI processes
Using 1 OpenMP thread per MPI process
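One workaround worth testing while the 128-rank launch is broken (my suggestion, not from the original post): keep the 64 MPI ranks that do start and give each rank 2 OpenMP threads via mdrun's -ntomp option, so all 128 cores are still used. A minimal sketch of the layout arithmetic:

```shell
# Hedged sketch: -ntomp is a standard gmx mdrun option; the actual run
# line (commented out) needs the cluster environment to execute:
#   mpirun -n 64 gmx_mpi mdrun -ntomp 2 -resethway -noconfout -s ion_channel.tpr
NRANKS=64          # MPI ranks that launch successfully
NTOMP=2            # OpenMP threads per rank
TOTAL=$((NRANKS * NTOMP))
echo "cores used: $TOTAL"   # prints "cores used: 128"
```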



The problem is with the configuration of your node and/or MPI, not with GROMACS itself.
It looks like a problem with an incorrectly compiled OpenMPI on the 2x AMD Epyc node together with the Mellanox/InfiniBand stack.
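A couple of hedged diagnostics that may narrow this down; the commands and flags below are standard OpenMPI, but whether they apply depends on how your site built it:

```shell
# Check which mpirun is actually being picked up (a locally compiled
# OpenMPI can shadow the module-provided one):
MPIRUN=$(command -v mpirun || echo "none")
echo "mpirun resolves to: $MPIRUN"

# If OpenMPI's slot count is wrong (e.g. it sees only one socket, or
# only physical cores), a fully populated launch can fail even on a
# 128-core node. Commented out; needs the cluster environment:
#   mpirun --oversubscribe      -n 128 gmx_mpi mdrun -s ion_channel.tpr
#   mpirun --use-hwthread-cpus  -n 128 gmx_mpi mdrun -s ion_channel.tpr
```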
See, for example, these threads on the official OpenMPI forums: