Error obtained from GROMACS simulation

GROMACS version: GROMACS 2018
GROMACS modification: No
Dear GROMACS users,

I am getting an error when running a GROMACS simulation on a supercomputer. My job runs on 6 nodes with 120 processors. The job runs, but no output files are written; sometimes output is written, but the simulation is very slow. Please see the error message below:

--------------------------------------------------------------------------
Open MPI failed to open the /dev/knem device due to a local error.
Please check with your system administrator to get the problem fixed,
or set the btl_sm_use_knem MCA parameter to 0 to run without /dev/knem
support.

  Local host: c212
  Errno:      2 (No such file or directory)

WARNING: Open MPI failed to open the /dev/knem device due to a local
error. Please check with your system administrator to get the problem
fixed, or set the btl_vader_single_copy_mechanism MCA variable to none
to silence this warning and run without knem support.

The vader shared memory BTL will fall back on another single-copy
mechanism if one is available. This may result in lower performance.

  Local host: c212
  Errno:      2 (No such file or directory)

WARNING: Linux kernel knem support was requested via the
btl_vader_single_copy_mechanism MCA parameter, but Knem support was
either not compiled into this Open MPI installation, or Knem support
was unable to be activated in this process.

The vader BTL will fall back on another single-copy mechanism if one is
available. This may result in lower performance.

  Local host: c326

Is anyone familiar with this kind of error? I am a bit confused about this issue. Any advice or suggestions would be deeply appreciated.

Thanks in advance,
Snehasis Chatterjee

Hi,
I suggest asking the supercomputer support team.
Best regards
Alessandra

Hi,

Thank you so much for your suggestion. I already contacted the supercomputer support team, but they were unable to solve the problem. They are also not familiar with GROMACS simulations.

Regards,
Snehasis

The error being reported comes from the MPI software; it is not coming from GROMACS. It seems like something is incorrectly configured with respect to MPI, which is something the system admins can look at even if they don't know the specifics of GROMACS.
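As the Open MPI messages themselves suggest, one possible workaround is to disable the knem single-copy mechanism explicitly when launching mdrun. A minimal sketch, assuming the job is started with a standard Open MPI `mpirun` and an MPI-enabled GROMACS binary named `gmx_mpi` (the exact launcher, module setup, process count, and file names such as `md.*` on your cluster may differ):

```
# Tell the vader BTL not to use knem, so it falls back to another
# single-copy mechanism instead of failing to open /dev/knem
# (this may cost some performance but silences the warnings).
mpirun -np 120 \
    --mca btl_vader_single_copy_mechanism none \
    gmx_mpi mdrun -deffnm md -v        # -deffnm md assumes inputs named md.*

# Equivalent alternative: export the MCA setting as an environment
# variable in the job script before the mpirun line.
export OMPI_MCA_btl_vader_single_copy_mechanism=none
mpirun -np 120 gmx_mpi mdrun -deffnm md -v
```

This only silences the knem warnings; whether the missing output and slow performance come from the same MPI misconfiguration is still something for the admins to confirm, for example by checking whether the knem kernel module is actually loaded on all compute nodes (Errno 2 means /dev/knem does not exist on that host).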

Hi Prof. Lemkul,

Thank you very much for your kind advice.

Regards,
Snehasis