Low performance and thread-MPI error when using multiple GPUs

GROMACS version: 2023.01
GROMACS modification: No

I installed GROMACS 2023.01 using the cmake command below.

cmake … -DGMX_USE_RDTSCP=ON -DGMX_SIMD=AVX2_256 -DGMX_BUILD_MDRUN_ONLY=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_BUILD_OWN_FFTW=ON -DGMX_MPI=ON -DGMX_THREAD_MPI=ON -DGMX_GPU=CUDA -DCMAKE_C_COMPILER=gcc-9 -DCMAKE_CXX_COMPILER=g++-9

But when I execute mdrun as below,
I get an error saying that thread-MPI was not compiled during installation.
mpirun -np 2 gmx mdrun -v -deffnm ${mini_prefix} -ntmpi 2 -npme 1 -gputasks 01

When I use another command to execute mdrun, the process runs normally, but the performance is much lower than when I use one GPU (77 ns/day → 32 ns/day).
mpirun -np 2 gmx mdrun -v -deffnm ${mini_prefix} -gpu_id 01

I want to execute mdrun with the -ntmpi and -npme settings to improve multi-GPU performance, but I can’t because of the thread-MPI error.
Is there an issue with my cmake build command?

Hi!

It looks like you are mixing up “real” MPI and threadMPI.

  • threadMPI is GROMACS’s internal “pseudo-MPI” implementation; you launch it like gmx mdrun -ntmpi NRANKS .... It only works on a single machine.

  • libMPI is an external, proper MPI implementation (OpenMPI, MPICH, etc.), able to scale across multiple compute nodes. With it, you launch GROMACS like mpirun -np NRANKS gmx_mpi mdrun .... If you want to use multiple GPUs efficiently in this case, you need a GPU-aware (CUDA-aware) MPI installation (if you just apt-installed it, it probably is not).

You cannot mix the two.

If you run on a single machine, don’t use mpirun. Build GROMACS with -DGMX_MPI=OFF -DGMX_THREAD_MPI=ON, and use gmx mdrun -ntmpi 2 -npme 1 ....
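To make the single-node route concrete, here is a sketch of the rebuild and launch steps. The flags are taken from your original cmake line and my suggestion above; the build directory and install step are assumptions, so adapt them to your setup, and keep ${mini_prefix} as in your own script:

```shell
# Reconfigure with thread-MPI only (no external MPI library).
# Other flags carried over from the original build; paths are illustrative.
cmake .. -DGMX_MPI=OFF -DGMX_THREAD_MPI=ON -DGMX_GPU=CUDA \
         -DGMX_SIMD=AVX2_256 -DGMX_BUILD_OWN_FFTW=ON \
         -DCMAKE_C_COMPILER=gcc-9 -DCMAKE_CXX_COMPILER=g++-9
make -j 8 && make install

# Launch directly with gmx (no mpirun): 2 thread-MPI ranks,
# one of them a dedicated PME rank, mapped to GPUs 0 and 1.
gmx mdrun -v -deffnm ${mini_prefix} -ntmpi 2 -npme 1 -gputasks 01
```

Note that mpirun does not appear anywhere: with thread-MPI, the rank count is set entirely by -ntmpi.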

It would also be helpful to set the GMX_ENABLE_DIRECT_GPU_COMM=1 environment variable; see https://manual.gromacs.org/current/user-guide/mdrun-performance.html#assigning-tasks-to-gpus
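A minimal way to set that variable for the run (the mdrun options just mirror the single-node command above):

```shell
# Enable GPU-direct communication between ranks (see the linked manual page)
export GMX_ENABLE_DIRECT_GPU_COMM=1
gmx mdrun -v -deffnm ${mini_prefix} -ntmpi 2 -npme 1 -gputasks 01
```

You can also prefix it to a single command (GMX_ENABLE_DIRECT_GPU_COMM=1 gmx mdrun ...) instead of exporting it.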

I got it. I understand your explanation.
I must have been confusing the two concepts because I’m still a beginner at GROMACS.

Thank you for your kind explanation.
I’ll try to rebuild GROMACS as you said.

Thank you, again.