New GROMACS user

GROMACS version: gromacs/2018_gpu
GROMACS modification:
I am a new user of the GROMACS software package, and I have a couple of questions:

  1. I am trying to run it on PSC Bridges (XSEDE account), so is there anyone using this platform to run jobs?
  2. I am also new to molecular dynamics simulations. I have reached the stage where I can generate the topology files for the protein and ligand, but I am totally confused about how to proceed further.

The PSC user guide has some examples of how to load and use GROMACS, but the documentation is a bit sparse, and they warn that the user should be well acquainted with routine GROMACS usage before submitting jobs. There is a GPU-enabled version and likely a normal parallelized (OpenMP and/or MPI) version available.
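For orientation, a batch submission on Bridges might look something like the sketch below. The module name matches the one you mentioned (gromacs/2018_gpu), but the scheduler directives, resource requests, and GPU syntax are assumptions to check against the PSC user guide, and the run name is only a placeholder.

```bash
#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1     # one MPI rank
#SBATCH --cpus-per-task=4       # OpenMP threads for that rank
#SBATCH --gres=gpu:1            # GPU request syntax may differ; check the PSC docs
#SBATCH --time=01:00:00

# Module name taken from your post; confirm with `module avail gromacs`
module load gromacs/2018_gpu

# One MPI rank, four OpenMP threads; "md_0_1" is a placeholder run name
mpirun -n 1 gmx_mpi mdrun -v -deffnm md_0_1 -ntomp 4
```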

Might I suggest some tutorials, particularly: Protein-Ligand Complex

Make sure you’re familiar with routine GROMACS protocols before trying to do something complicated that involves topology hacking. Learn to walk before you run :)


Thank you for the response @jalemkul. I read some papers regarding its applications and also the tutorial on how to use GROMACS, but running it on XSEDE is proving a bit difficult. It would be a great help if you could connect me with someone who uses GROMACS on XSEDE for running simulations.

Thank you.

Hi @jalemkul. Thanks for your attention to Amar’s problem. I have looked at Amar’s files on Bridges-2, and the error that he is getting is:

Fatal error:
56 particles communicated to PME rank 2 are more than 2/3 times the cut-off
out of the domain decomposition cell of their charge group in dimension x.
This usually means that your system is not well equilibrated.

I see that this error was reported on the GROMACS forum on March 2 under the title:
Domain decomposition on multiple ranks.
Has there been a definitive answer to that problem? Just as in that case, I found that I can run your system with one MPI rank and 2 or 4 OpenMP threads as follows:

mpirun -n 1 gmx_mpi mdrun -v -deffnm md_0_1 -s md_0_1.tpr -ntomp 2

but not with more than one MPI rank. Could it be that, just as in the March 2 case, Amar prepared the files on his workstation and then attempted to run them on Bridges-2, and that is what caused the error?
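For comparison, the multi-rank invocations that crash with the error above look like this (the rank count here is only an example):

```bash
# Anything beyond one MPI rank reproduces the domain decomposition / PME error quoted above
mpirun -n 4 gmx_mpi mdrun -v -deffnm md_0_1 -s md_0_1.tpr -ntomp 2
```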

This is GROMACS 2020.1 compiled with Spack.

Marcela Madrid, Ph.D.
Senior Computational Scientist
Pittsburgh Supercomputing Center

The issue is that the system is unstable. It is unrelated to anything specific in the hardware or software.

Hi @jalemkul, yes, the error message says the system needs more equilibration. But does that affect the MPI domain decomposition? Because it works on one processor. Thanks, Marcela

Running in parallel triggers a lot more algorithms that are sensitive to unstable systems. If it’s unstable in parallel, it will eventually crash on one processor as well.

thanks!

Dr. Lemkul,

Thank you for your answer. I followed your lysozyme-in-water tutorial and generated an input file, then ran it simultaneously on my local workstation and on XSEDE. It again gave me the same system-instability error on XSEDE but ran without error on the local workstation.

In this case what would be your suggestion?

Try using fewer processors on the supercomputer. Perhaps your initial conditions are simply somewhat out of equilibrium, which leads to rapid changes in DD cell size that can’t be accounted for. This is still a form of instability, though one you may be able to bypass with a slightly different setup.
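On the cluster side, that can be as simple as cutting the MPI rank count and giving each rank more OpenMP threads instead; the numbers below are placeholders to adjust, not a prescription:

```bash
# Fewer MPI ranks, more OpenMP threads per rank (example values only)
export OMP_NUM_THREADS=4
mpirun -n 2 gmx_mpi mdrun -v -deffnm md_0_1 -ntomp 4
```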

Thank you Dr. Lemkul. I’ll try that.