Parallelization

GROMACS version: 5.0.7
GROMACS modification: Yes/No
Hello.
I am studying polymers and I am running GROMACS on a university cluster which has multiple nodes, each with a number of cores (8, 16, 32, etc.). GROMACS there is not compiled with MPI, so I use "gmx mdrun".
First of all, what is the proper way to run a simulation in parallel? I have read in the manual that you can specify -ntomp (OpenMP) and -ntmpi (thread-MPI), but I don't quite get the difference.
If I want to run on more than 1 node, is compilation with MPI mandatory?
Finally, I tried to run a rather big system (10,000 atoms) on 1 node using:
$ gmx mdrun -deffnm tpr
Choosing 8 cores, I get an error saying "an atom moved too far between two domain decompositions".
Increasing the number of cores, the error becomes "there is no domain decomposition for x ranks that is compatible with the given box". How can I fix these problems? It has to do with choosing the proper parallelization, right?
Thank you in advance.

Hi,

First off, if you can, use a newer version of GROMACS, because your simulations will run much faster under all circumstances.

GROMACS uses different levels of parallelisation. Indeed, if you want to run on multiple nodes, there is no way around MPI.
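For example (a minimal sketch, assuming an MPI-enabled build, conventionally named gmx_mpi, and a standard mpirun launcher; the rank count and install path are illustrative):

$ # configure a build with real (library) MPI support
$ cmake .. -DGMX_MPI=ON -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-mpi
$ make -j && make install
$ # run across 2 nodes x 16 cores as 32 MPI ranks, 1 OpenMP thread per rank
$ mpirun -np 32 gmx_mpi mdrun -deffnm tpr -ntomp 1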

However, you can often already obtain quite reasonable performance on a single node. There, GROMACS tries to grab whatever resources it can, but you can fine-tune this with -ntomp and -ntmpi.
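For example, on a single 16-core node (the numbers are illustrative; the product of thread-MPI ranks and OpenMP threads per rank should match the cores you were allocated):

$ # 16 thread-MPI ranks with 1 OpenMP thread each
$ gmx mdrun -deffnm tpr -ntmpi 16 -ntomp 1
$ # 4 thread-MPI ranks with 4 OpenMP threads each
$ gmx mdrun -deffnm tpr -ntmpi 4 -ntomp 4

The thread-MPI ranks are what the domain decomposition is split over, while the OpenMP threads parallelise the work within each rank; that is the practical difference between -ntmpi and -ntomp.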

The errors you get are more indicative of something being off with the way the physics of your system is described. 10,000 atoms is not that much by MD standards - did you run through all the initial steps: solvation, adding ions, energy minimisation, position-restrained equilibration, etc.?

Thanks for your reply.
My system is just a melt of PP. I don’t have neither solution or ions. In order to build it I used the “Amorphous Builder” which is included in MAPs (Material and Processes simulation). It has a gromacs expansion with which you can easily generate the 3 files that you need (gro, mdp, top).
For a smaller system of PP (2000 atoms) with the same force field the simulation worked fine.
Maybe, it has to do with equilibration. What about choosing a rather small “emtol” ? Does smaller “emtol” drive to better equilibration?

Ah, sorry - I just assumed protein, nice to see GROMACS usage in other contexts :)

You can try a smaller emtol, but the first step might be a smaller MD integration time-step.
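For reference, the minimisation block of an .mdp might look like this (just a sketch, the values are illustrative rather than a recommendation):

integrator = steep    ; steepest-descent energy minimisation
emtol      = 100.0    ; stop when the max force drops below 100 kJ mol^-1 nm^-1
emstep     = 0.01     ; initial step size in nm
nsteps     = 50000    ; upper limit on the number of minimisation steps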

Other culprits might be the pressure or temperature coupling algorithms, if you use them. You might see large box-size fluctuations, and that effect amplifies with system size, so adding an equilibration phase with Berendsen pressure coupling might help, because it is more robust.
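In the equilibration .mdp that could look roughly like this (a sketch; the reference pressure is generic and the compressibility is the usual water-like default, so adjust for a polymer melt):

pcoupl          = berendsen   ; robust, but suitable for equilibration only
pcoupltype      = isotropic
tau_p           = 1.0         ; ps
ref_p           = 1.0         ; bar
compressibility = 4.5e-5      ; bar^-1

For production you would then switch to Parrinello-Rahman, which samples the correct NPT ensemble.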

BTW, if your workstation has a decent NVIDIA GPU and a recent Linux version, you can try compiling the latest GROMACS 2020.4 release with GPU support, and you should be able to run this type of simulation quite fast on your workstation alone, without even having to bother with the university cluster.
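The build itself is roughly this (a sketch of the standard install procedure; the paths and the -j value are illustrative, and you need CUDA plus a recent compiler installed):

$ tar xfz gromacs-2020.4.tar.gz
$ cd gromacs-2020.4 && mkdir build && cd build
$ cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_GPU=ON -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-2020.4
$ make -j 8 && make check && make install
$ source $HOME/gromacs-2020.4/bin/GMXRC

mdrun then detects the GPU automatically; you can steer the offload explicitly with -nb gpu (and -pme gpu) if you want.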

Indeed, I am using a temperature coupling algorithm. I perform an NVT simulation with the velocity-rescale thermostat and a timestep of 0.01 ps. It is during this NVT run that I got the error. So I will try to experiment with smaller timesteps, as you said.
Thank you again.

Ah, okay - velocity rescale and NVT should be quite stable though. You could try setting the tau parameter for the temperature coupling to zero, which removes any excess energy from your system immediately while still being thermodynamically correct. Another guess might be that your density is a bit off and your system is quite strained because of that.

But isn’t the tau parameter the time constant with which the temperature couples? I see in the manual that v-rescale also works with tau-t, but what does that really mean?

Yes, tau-t is the time constant for the temperature coupling, so tau-t = 0 means that v-rescale is memory-less and removes excess energy from the system immediately.

If you use a timestep of 0.01 ps, that would be the first culprit for the instability, though. Lowering it to 0.002 ps should help; in newer GROMACS versions grompp would have issued a warning about this, saying something about the time scales in your system.
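In .mdp terms the relevant lines would look something like this (a sketch; the coupling group, reference temperature and run length are placeholders for whatever your system actually uses):

integrator  = md
dt          = 0.002     ; ps, i.e. 2 fs
nsteps      = 500000    ; 1 ns
constraints = h-bonds   ; constraining bonds to hydrogens is what makes a 2 fs step safe
tcoupl      = v-rescale
tc-grps     = System    ; couple everything as one group
tau_t       = 0.1       ; ps; smaller values couple more tightly
ref_t       = 300       ; K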

That’s clear. I will follow your instructions.
Thank you very much.