Using GPU for GROMACS

GROMACS version: gromacs/2018.6-ompi-4.0.0/gcc/6.5.0-cuda9
GROMACS modification: No

Hi Gromacs Community,

To increase the simulation speed for my system (~100,000 atoms), I decided to try running on a GPU. However, with this mdrun command:

gmx_mpi mdrun -v -cpi state.cpt -gputasks 0001 -nb gpu -pme gpu -npme 1 -ntmpi 4

this error occurs:

Program: gmx mdrun, version 2018.6
Source file: src/gromacs/taskassignment/resourcedivision.cpp (line 681)
MPI rank: 1 (out of 4)

Fatal error:
Setting the number of thread-MPI ranks is only supported with thread-MPI and
GROMACS was compiled without thread-MPI

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors

Compiled SIMD: AVX_512, but for this host/run AVX2_256 might be better (see
log).

However, if I use this command instead:

gmx_mpi mdrun -v -cpi state.cpt -nb gpu -pme gpu

This error occurs:
Program: gmx mdrun, version 2018.6
Source file: src/gromacs/taskassignment/decidegpuusage.cpp (line 352)
Function: bool gmx::decideWhetherToUseGpusForPme(bool, gmx::TaskTarget, const std::vector<int>&, bool, int, int, bool)
MPI rank: 2 (out of 4)

Feature not implemented:
The input simulation did not use PME in a way that is supported on the GPU.

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors

Can someone explain to me: 1) is my system suitable for running on GPU+CPU, and 2) how can I make my system run on a GPU?

best,

ben

Hi Btam,

The first error you get is not related to using GPUs. Your gmx_mpi binary was built against an external MPI library, so it does not support -ntmpi; running gmx_mpi without -ntmpi is the first step here (alternatively, use a gmx binary compiled with thread-MPI if you are running on just one node).
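
For example, with an external-MPI build the number of ranks comes from the MPI launcher rather than from -ntmpi, so the equivalent of your original command would look roughly like the line below. This is only a sketch: the launcher name (mpirun vs. srun), the rank count of 4, and the GPU task string are assumptions that depend on your cluster and job script.

mpirun -np 4 gmx_mpi mdrun -v -cpi state.cpt -nb gpu -pme gpu -npme 1 -gputasks 0001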

If you can, try using GROMACS 2020, as more and more features have been ported to the GPU and the error messages have been improved.
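
Regarding the PME error: as far as I know, the PME-on-GPU path in the 2018 series only supports fairly standard setups, i.e. plain PME electrostatics with pme-order = 4, a single PME rank, and no LJ-PME. You can check what your run input actually contains with something like the line below (topol.tpr is a placeholder for your own .tpr file, and the exact field names printed by gmx dump may vary slightly between versions):

gmx_mpi dump -s topol.tpr | grep -iE 'coulombtype|vdw-type|pme-order'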

Can you say something more about your system? Is it charged? Is it a plain protein in water? Do you use plain periodic boundary conditions?

Hi,

My system is a 3,400-atom protein in 30,000+ water molecules, with ions (Cl- or Na+) added to neutralise the system, so yes, it is a charged system. I am also using normal PBC without any modification.

Unfortunately I cannot use GROMACS 2020, as it has not been installed on the cluster.

I am now stuck with this error:
Program: gmx mdrun, version 2018.6
Source file: src/programs/mdrun/runner.cpp (line 1005)
MPI rank: 9 (out of 10)

Fatal error:
Cannot run short-ranged nonbonded interactions on a GPU because there is none
detected.

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors

When I use the mdrun command:
gmx_mpi mdrun -v -cpi state.cpt -nb gpu -ntomp $SLURM_NTASKS_PER_NODE

Or this error appears:

Fatal error:
You limited the set of compatible GPUs to a set that included ID #1, but that
ID is not for a compatible GPU. List only compatible GPUs.

when I use:
gmx_mpi mdrun -v -cpi state.cpt -gpu_id 1 -nb gpu -ntomp $SLURM_NTASKS_PER_NODE

And this error comes up if I add -pme gpu:
Feature not implemented:
The input simulation did not use PME in a way that is supported on the GPU.

Thank you for your help.

Ben