I have a simple POPC membrane that runs well until I try to use deform to stretch it.
The only change is adding the deform line to the pressure-coupling section:
pcoupl = Parrinello-Rahman ; previously Berendsen
pcoupltype = surface-tension
tau_p = 2.0
ref_p = 1.0 1.0
compressibility = 0 4.5e-5
refcoord_scaling = com
deform = 1 1 0 0 0 0
grompp executes normally with and without the deform statement
mdrun throws an error: tMPI error: Invalid tMPI_Comm (in valid comm)
The cmake configuration I used to compile is straightforward (cmake 3.20, CUDA 11, gcc 9.3):
cmake … -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=CUDA -DGMX_USE_OPENCL=off
Do I need to recompile with MPI to use deform (assuming deform is the correct parameter to stretch a membrane)?
Firstly, yes: the mdp parameter deform is the correct one to use for strain development.
I am able to get the system to run by not using -ntmpi and -npme when starting the simulation. However, without these specifications only one GPU is used, which is likely related to the tMPI error.
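For context, the failing and working invocations look roughly like this (the -deffnm file name is a placeholder and the rank counts are just examples):

gmx mdrun -deffnm md -ntmpi 4 -npme 1   # fails with the tMPI error
gmx mdrun -deffnm md                    # runs, but only one GPU is used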
Are you using the 2022 release? Does the problem only appear when using multiple thread-MPI ranks, e.g. -ntmpi N with N >= 2? Does it happen both with and without direct communication? What happens if you use a regular MPI build?
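For reference, a regular MPI build is configured with the GMX_MPI option instead of the default thread-MPI, along these lines (keeping the other options from the cmake line above; this is a sketch, not a tested recipe):

cmake … -DGMX_MPI=ON -DGMX_BUILD_OWN_FFTW=ON -DGMX_GPU=CUDA

The resulting gmx_mpi binary is then started through the MPI launcher, e.g. mpirun -np N gmx_mpi mdrun …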
The issue with the setup in the original post is that a deformation speed of 1 nm per ps is insanely fast. Unless the system is a gas, it will explode due to instabilities. A deformation speed on the order of 1 nm per ns is more reasonable.
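Since deform takes box-vector velocities in nm/ps, 1 nm per ns corresponds to 0.001 nm/ps, so the line from the original post would become (assuming the same x/y stretch):

deform = 0.001 0.001 0 0 0 0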
We were able to circumvent the error by using fewer threads (-nt 32) or by using the 2020.4 version of GROMACS. Changing the deformation speed (in my case usually 0.1 nm/ns) had no influence on the issue.