gmx grompp crashes with a very simple system

GROMACS version: 2023
GROMACS modification: No

Dear all,

Using version 2023, I have observed grompp crash without any further explanation for several of my systems. I was able to simplify one of the systems down to a single atom, and the crash is still reproducible. Minor modifications (e.g. changing the cell size or switching off the thermostat) make the crash disappear.
I use the following command:
gmx grompp -f run.mdp -c system.gro -p system.top -o run.tpr

The files to reproduce the problem are attached. Please note that with version 2021 the problem vanishes; other versions have not been tested.

Any help is appreciated.

Stephan
system.top (558 Bytes)
mdout.mdp (10.3 KB)
run.mdp (60 Bytes)

Does the GROMACS test suite pass? Did you compile GROMACS yourself? If so, try running make check from the directory where you built GROMACS.
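
For example, assuming a typical out-of-source build (the exact path below is just a placeholder for wherever you ran cmake):

cd /path/to/gromacs-2023/build
make check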

I have not compiled GROMACS myself, so I don’t know whether the test suite passes. In any case, the problem can be reproduced with the official installation on a major European supercomputer, so I would assume it is not related to a faulty compilation. But of course you never know.

Hi Stephan,

Please attach the system.gro file (you said that the cell size is relevant, and that is where it is defined, so we cannot reproduce the issue without it) and the full output log.

Here it is; I forgot to upload it in my original post, apologies.
(It is called system.gro.dat since .gro uploads seem to be blocked.)
system.gro.dat (71 Bytes)

Thanks, I can reproduce the problem with 2023.4 and 2024.0.

The problem is caused by large negative atom coordinates (specifically, Y = -8.229 nm), which are not handled properly in our code for average density estimation. We hope to fix it in the next release.

As a workaround, you can manually shift the system so that no coordinate is smaller than minus the box size (-6.2 nm in your case). Large positive coordinates or small negative coordinates should not be a problem.
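
One way to apply such a shift is with gmx editconf, which can translate all coordinates by a vector (in nm) via -translate. A minimal sketch, assuming the file names from your grompp command and an output name of my choosing, shifting everything by +10 nm along Y:

gmx editconf -f system.gro -o system_shifted.gro -translate 0 10 0

Then pass system_shifted.gro to grompp instead of system.gro.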

Hi @al42and,

Is it supposed to be fixed in the latest release? I think I am facing the same problem with 2024.1.

The fix was just a bit too late to make it into 2024.1. It will be in 2024.2, which will be released in about a month.