Running parallelized simulations in Condor

GROMACS version: 2022
GROMACS modification: No

We are trying to run parallelized simulations. Our HPC cluster uses HTCondor, and OpenMPI 4.1.3 is installed.

The current scripts look like this:

em_mpi.sub

universe = parallel
executable = /opt/data/HPC01A/cjalmeciga/Daniel/ejemplo/openmpiscriptGROMACS_v3
getenv=true
when_to_transfer_output = on_exit_or_evict
requirements = (Machine == "pujnodo10.javeriana.edu.co")
output = em_$(cluster)_$(NODE).out
error = em_$(cluster)_$(NODE).err
log = em_$(cluster).log
notification = never
machine_count = 10
queue
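
As far as I understand, the stock openmpiscript that ships with HTCondor is meant to stay generic: the real MPI binary and its arguments are passed in through the submit file, and the wrapper hands them to mpirun. If that convention applies to our setup, the submit file would gain an arguments line along these lines (this is my guess at the syntax for our job, not something we have tested):

universe = parallel
executable = /opt/data/HPC01A/cjalmeciga/Daniel/ejemplo/openmpiscriptGROMACS_v3
arguments = gmx_mpi_d mdrun -deffnm /opt/data/HPC01A/cjalmeciga/Daniel/ejemplo/em
machine_count = 10
queue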

openmpiscriptGROMACS_v3

#!/bin/sh
##**************************************************************

## Copyright (C) 1990-2010, Condor Team, Computer Sciences Department,
## University of Wisconsin-Madison, WI.
##
## Licensed under the Apache License, Version 2.0 (the "License"); you
## may not use this file except in compliance with the License. You may
## obtain a copy of the License at
##
##    http://www.apache.org/licenses/LICENSE-2.0
##
## Unless required by applicable law or agreed to in writing, software
## distributed under the License is distributed on an "AS IS" BASIS,
## WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
## See the License for the specific language governing permissions and
## limitations under the License.

##**************************************************************

# YOU MUST CHANGE THIS TO THE PREFIX DIR OF OPENMPI

MPDIR=/hpc_cluster/apps/openmpi-4.1.3

#Gromacs
export PATH=$PATH:/hpc_cluster/apps/gromacs-2022/build/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/hpc_cluster/apps/gromacs-2022/build/lib

#OpenMPI
export PATH=$PATH:/hpc_cluster/apps/openmpi-4.1.3/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/hpc_cluster/apps/openmpi-4.1.3/lib

#fftw
export PATH=$PATH:/hpc_cluster/apps/fftw-3.3.10/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/hpc_cluster/apps/fftw-3.3.10/lib

#CMake
export PATH=$PATH:/hpc_cluster/apps/cmake-3.23.1/bin

/hpc_cluster/apps/gromacs-2022/build/bin/gmx_mpi_d mdrun -deffnm /opt/data/HPC01A/cjalmeciga/Daniel/ejemplo/em
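
My suspicion is that this last line is the problem: gmx_mpi_d is executed directly, so it runs as a single process on one slot instead of being launched across the 10 machines. A sketch of what I think the end of the script should look like, assuming the parallel universe exports _CONDOR_NPROCS (the slot count) and that the wrapper has already set up the host wiring the way the stock openmpiscript does:

# Sanity check: confirm the double-precision MPI binary resolves its
# MPI and FFTW libraries from the paths exported above.
ldd "$(which gmx_mpi_d)" | grep -Ei 'mpi|fftw'

# Launch one MPI rank per slot handed out by the parallel universe;
# _CONDOR_NPROCS should equal machine_count (10 here).
mpirun -np "${_CONDOR_NPROCS:-1}" \
    /hpc_cluster/apps/gromacs-2022/build/bin/gmx_mpi_d mdrun \
    -deffnm /opt/data/HPC01A/cjalmeciga/Daniel/ejemplo/em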

Does anyone have a suggestion or a working script for running GROMACS with MPI under Condor?