GROMACS version: 2021.3
Hi,
I have recently installed GROMACS on an Ubuntu platform.
I used the following cmake command:
cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_MPI=on -DGMX_GPU=CUDA -DMPI_C_COMPILER=mpicc -DGMX_MPI=on
I am getting the following error at the end:
/usr/bin/ld: ../../../../lib/libmdrun_test_infrastructure.a(moduletest.cpp.o): undefined reference to symbol 'MPI_Barrier'
/usr/bin/ld: /usr/lib/x86_64-linux-gnu/libmpich.so: error adding symbols: DSO missing from command line
collect2: error: ld returned 1 exit status
make[3]: *** [src/programs/mdrun/tests/CMakeFiles/mdrun-tpi-test.dir/build.make:113: bin/mdrun-tpi-test] Error 1
make[2]: *** [CMakeFiles/Makefile2:6646: src/programs/mdrun/tests/CMakeFiles/mdrun-tpi-test.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:2713: CMakeFiles/check.dir/rule] Error 2
make: *** [Makefile:249: check] Error 2
I did "make install" after this.
Running an MD job is giving me the following error:
gmx mdrun -deffnm md_0_1 -nb gpu
GROMACS version: 2021.3
Precision: mixed
Memory model: 64 bit
MPI library: thread_mpi
OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
GPU support: disabled
SIMD instructions: AVX_512
FFT library: fftw-3.3.8-sse2-avx-avx2-avx2_128-avx512
RDTSCP usage: enabled
TNG support: enabled
Hwloc support: disabled
Tracing support: disabled
C compiler: /usr/bin/cc GNU 9.3.0
C compiler flags: -mavx512f -mfma -Wno-missing-field-initializers -fexcess-precision=fast -funroll-all-loops -O3 -DNDEBUG
C++ compiler: /usr/bin/c++ GNU 9.3.0
C++ compiler flags: -mavx512f -mfma -Wno-missing-field-initializers -fexcess-precision=fast -funroll-all-loops -fopenmp
Running on 1 node with total 20 cores, 40 logical cores
Hardware detected:
CPU info:
Vendor: Intel
Brand: Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz
Family: 6 Model: 85 Stepping: 7
Number of AVX-512 FMA units: 2
Please let me know how to resolve the issue and how best to use the GPU capabilities.
Thank you
There is no error in your output, but I assume it stated that your binary does not have GPU support. That's because you built without GPU support, as you can see from the "GPU support: disabled" line in the version header.
Make sure to pass -DGMX_GPU= and specify your GPU platform (CUDA or OpenCL).
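For reference, a minimal re-configure from the build directory might look like the sketch below; this is an assumption based on the flags quoted in your first post, not the exact command you need:

```shell
# Re-run CMake from the existing GROMACS build directory with CUDA enabled.
# -DGMX_BUILD_OWN_FFTW=ON keeps the GROMACS-managed FFTW from the first configure.
cmake .. \
    -DGMX_BUILD_OWN_FFTW=ON \
    -DREGRESSIONTEST_DOWNLOAD=ON \
    -DGMX_GPU=CUDA
make -j 4
sudo make install
```

After installing, the "GPU support:" line printed by gmx --version should read "CUDA" instead of "disabled".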
Secondly, your install command should produce an MPI build, but that seems to fail; you then run the gmx mdrun binary, which is not an MPI binary (the "MPI library: thread_mpi" line indicates it is not using lib-MPI). If you want to run multi-node, make sure you have a functioning MPI build.
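For a lib-MPI build, a common approach is to point CMake at the MPI compiler wrappers for both C and C++, so that test binaries link against the MPI library correctly. A sketch, assuming your MPICH installation provides the usual mpicc/mpicxx wrappers:

```shell
# Configure an MPI-enabled GROMACS build; passing both compiler wrappers
# ensures the C++ link step also pulls in the MPI libraries.
cmake .. \
    -DGMX_MPI=ON \
    -DGMX_GPU=CUDA \
    -DCMAKE_C_COMPILER=mpicc \
    -DCMAKE_CXX_COMPILER=mpicxx
```

Mixing wrappers from different MPI implementations (or omitting the C++ wrapper) can produce exactly the "undefined reference to symbol 'MPI_Barrier'" link error shown above.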
I re-ran the cmake command:
==> "cmake .. -DGMX_GPU=CUDA"
output:
-- The GROMACS-managed build of FFTW 3 will configure with the following optimizations: --enable-sse2;--enable-avx;--enable-avx2;--enable-avx512
-- Configuring done
-- Generating done
-- Build files have been written to: /home/mypc/Documents/software/gromacs-2021.3/build
==> after this I ran "make"
output:
[ 1%] Built target fftwBuild
[ 1%] Built target scanner
[ 2%] Generating release version information
[ 2%] Built target release-version-info
[ 4%] Built target tng_io_obj
[ 4%] Built target tng_io_zlib
[ 4%] Built target lmfit_objlib
[ 5%] Built target thread_mpi
[ 25%] Built target linearalgebra
[ 26%] Built target modularsimulator
[ 93%] Built target libgromacs
[ 94%] Built target gmxapi
[ 95%] Built target nblib
[ 95%] Built target methane-water-integration
[ 95%] Built target argon-forces-integration
[ 95%] Built target gtest
[ 95%] Built target gmock
[ 98%] Built target testutils
[100%] Built target gmx_objlib
[100%] Built target mdrun_objlib
[100%] Built target view_objlib
[100%] Built target gmx
==> followed by “make check”
output:
[ 0%] Built target mdrun_objlib
[ 0%] Built target scanner
[ 1%] Generating release version information
[ 1%] Built target release-version-info
[ 2%] Built target fftwBuild
[ 3%] Built target tng_io_obj
[ 3%] Built target tng_io_zlib
[ 3%] Built target lmfit_objlib
[ 4%] Built target thread_mpi
[ 18%] Built target linearalgebra
[ 19%] Built target modularsimulator
[ 67%] Built target libgromacs
[ 68%] Built target gmx_objlib
[ 68%] Built target view_objlib
[ 68%] Built target gmx
[ 68%] Built target gmxtests
[ 68%] Built target gtest
[ 68%] Built target gmock
[ 70%] Built target testutils
[ 70%] Built target mdrun_test_infrastructure
[ 71%] Built target gmxapi
[ 72%] Built target workflow-details-test
[ 73%] Built target nblib
[ 73%] Built target nblib_test_infrastructure
[ 74%] Built target nblib-setup-test
[ 74%] Built target methane-water-integration
[ 74%] Built target argon-forces-integration
[ 74%] Built target nblib-integrator-test
[ 75%] Built target nblib-integration-test
[ 75%] Built target nblib-tests
[ 75%] Built target nblib-listed-forces-test
[ 75%] Built target nblib-util-test
[ 75%] Built target testutils-mpi-test
[ 75%] Built target testutils-test
[ 75%] Built target utility-mpi-test
[ 77%] Built target utility-test
[ 77%] Linking CXX executable ../../../../bin/mdlib-test
[ 78%] Built target mdlib-test
[ 79%] Built target awh-test
[ 79%] Built target density_fitting_applied_forces-test
[ 79%] Built target applied_forces-test
[ 79%] Built target listed_forces-test
[ 79%] Built target onlinehelp-test-shared
[ 80%] Built target commandline-test
[ 81%] Built target domdec-mpi-test
[ 81%] Built target domdec-test
[ 82%] Built target ewald-test
[ 82%] Built target fft-test
[ 82%] Linking CXX executable ../../../../bin/gpu_utils-test
[ 82%] Built target gpu_utils-test
[ 83%] Built target hardware-test
[ 84%] Built target math-test
[ 84%] Built target mdrunutility-test-shared
[ 84%] Built target mdrunutility-mpi-test
[ 84%] Built target mdrunutility-test
[ 85%] Built target mdspan-test
[ 85%] Built target mdtypes-test
[ 85%] Built target onlinehelp-test
[ 86%] Built target options-test
[ 86%] Built target pbcutil-test
[ 86%] Built target random-test
[ 86%] Built target restraintpotential-test
[ 86%] Built target table-test
[ 86%] Built target taskassignment-test
[ 86%] Built target topology-test
[ 86%] Built target pull-test
[ 87%] Built target simd-test
[ 87%] Built target compat-test
[ 87%] Built target gmxana-test
[ 88%] Built target pdb2gmx3-test
[ 88%] Built target pdb2gmx2-test
[ 88%] Built target pdb2gmx1-test
[ 89%] Built target gmxpreprocess-test
[ 89%] Built target correlations-test
[ 89%] Built target analysisdata-test-shared
[ 89%] Built target analysisdata-test
[ 90%] Built target coordinateio-test
[ 91%] Built target trajectoryanalysis-test
[ 91%] Built target energyanalysis-test
[ 92%] Built target tool-test
[ 92%] Built target fileio-test
[ 93%] Built target selection-test
[ 93%] Linking CXX executable ../../../../bin/mdrun-tpi-test
/usr/bin/ld: ../../../../lib/libmdrun_test_infrastructure.a(moduletest.cpp.o): undefined reference to symbol 'MPI_Barrier'
/usr/bin/ld: /usr/lib/x86_64-linux-gnu/libmpich.so: error adding symbols: DSO missing from command line
collect2: error: ld returned 1 exit status
make[3]: *** [src/programs/mdrun/tests/CMakeFiles/mdrun-tpi-test.dir/build.make:113: bin/mdrun-tpi-test] Error 1
make[2]: *** [CMakeFiles/Makefile2:6646: src/programs/mdrun/tests/CMakeFiles/mdrun-tpi-test.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:2713: CMakeFiles/check.dir/rule] Error 2
make: *** [Makefile:249: check] Error 2