Fatal error: Cannot run short-ranged nonbonded interactions on a GPU because no GPU is detected

GROMACS version: 2024.5
GROMACS modification: No
I have compiled GROMACS with the following cmake options:

cmake … -DGMX_MPI=ON -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=opencl -DOpenCL_LIBRARY=/usr/lib/x86_64-linux-gnu/libOpenCL.so -DGMX_GPU_NB_CLUSTER_SIZE=4

My hardware: NVIDIA NVS 5400M
Driver: NVIDIA 390.157 (installed through the apt repository; legacy version)
The OpenCL client driver nvidia-legacy-390xx-opencl-icd is also installed. My installation aborted with the following errors on make check:

Essential dynamics tests FAILED with 7 errors!


92% tests passed, 7 tests failed out of 91

Label Time Summary:
GTest              = 134.20 sec*proc (87 tests)
IntegrationTest    =  78.79 sec*proc (30 tests)
MpiTest            =  59.39 sec*proc (23 tests)
QuickGpuTest       =  35.72 sec*proc (20 tests)
SlowGpuTest        = 131.97 sec*proc (16 tests)
SlowTest           =  23.31 sec*proc (13 tests)
UnitTest           =  32.10 sec*proc (44 tests)

Total Test time (real) = 179.75 sec

The following tests FAILED:
	 28 - DomDecMpiTests (Failed)
	 35 - MdrunUtilityMpiUnitTests (Failed)
	 74 - MdrunMultiSimTests (Failed)
	 75 - MdrunMultiSimReplexTests (Failed)
	 76 - MdrunMultiSimReplexEquivalenceTests (Failed)
	 90 - regressiontests/complex (Failed)
	 91 - regressiontests/essentialdynamics (Failed)
Errors while running CTest
make[3]: *** [CMakeFiles/run-ctest-nophys.dir/build.make:71: CMakeFiles/run-ctest-nophys] Error 8
make[2]: *** [CMakeFiles/Makefile2:3310: CMakeFiles/run-ctest-nophys.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:3343: CMakeFiles/check.dir/rule] Error 2
make: *** [Makefile:628: check] Error 2

Essentially, I am trying to use my discrete GPU through OpenCL for GROMACS runs. Apart from the test failures above, the installation itself completed without complaints, and mdrun itself runs fine (on the CPU). However, when I try to use the GPU for computation it throws the following error:

~~> gmx_mpi mdrun -v -deffnm nvt_1 -nb gpu -pin on -pinoffset 0
                      :-) GROMACS - gmx mdrun, 2024.5 (-:

Executable:   /usr/local/gromacs/bin/gmx_mpi
Data prefix:  /usr/local/gromacs
Working dir:  /home/russellb/Documents/work/staph_nuclease/snase_charm/6eql
Command line:
  gmx_mpi mdrun -v -deffnm nvt_1 -nb gpu -pin on -pinoffset 0


Back Off! I just backed up nvt_1.log to ./#nvt_1.log.2#
Reading file nvt_1.tpr, VERSION 2024.5 (single precision)
Changing nstlist from 10 to 80, rlist from 1.007 to 1.171


-------------------------------------------------------
Program:     gmx mdrun, version 2024.5
Source file: src/gromacs/taskassignment/findallgputasks.cpp (line 85)

Fatal error:
Cannot run short-ranged nonbonded interactions on a GPU because no GPU is
detected.

For more information and tips for troubleshooting, please check the GROMACS
website at https://manual.gromacs.org/current/user-guide/run-time-errors.html
-------------------------------------------------------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them

Could someone shed some light on what I am missing, or on how to make use of my GPU with OpenCL? Many thanks in advance. :-)

Update:

My driver module is loaded:

~~> nvidia-smi
Thu Mar 13 21:20:01 2025       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 390.157                Driver Version: 390.157                   |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  NVS 5400M           Off  | 00000000:01:00.0 N/A |                  N/A |
| N/A   73C    P8    N/A /  N/A |    150MiB /   963MiB |     N/A      Default |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0                    Not Supported                                       |
+-----------------------------------------------------------------------------+

and

~~> lsmod | grep nvidia
nvidia_uvm            921600  4
nvidia_drm             53248  1
drm_kms_helper        204800  1 nvidia_drm
nvidia_modeset       1056768  7 nvidia_drm
nvidia              15888384  276 nvidia_uvm,nvidia_modeset
drm                   643072  5 drm_kms_helper,thinkpad_acpi,nvidia_drm
ipmi_msghandler        69632  2 ipmi_devintf,nvidia
video                  61440  2 nvidia,thinkpad_acpi

The GPU is detected by other programs such as PyMOL. Here, I am trying to get GROMACS to use it through OpenCL. Any help would be much appreciated. :-)

Hi!

Regarding the device detection:

GMX_GPU_NB_CLUSTER_SIZE=4 is the setting for consumer Intel GPUs; it is not compatible with NVIDIA GPUs at all. You should set GMX_GPU_NB_CLUSTER_SIZE=8 and rebuild GROMACS.
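For reference, the reconfigure step could look something like this; the build-directory path is a placeholder, and the remaining flags are copied from the original configure line:

```shell
# Hypothetical build directory -- adjust to your own source tree.
cd ~/extpacks/gromacs-2024.5/build

# Same flags as before, but with the cluster size changed from 4 to 8,
# which matches the 32-wide warps of NVIDIA hardware.
cmake .. -DGMX_MPI=ON -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON \
         -DGMX_GPU=opencl \
         -DOpenCL_LIBRARY=/usr/lib/x86_64-linux-gnu/libOpenCL.so \
         -DGMX_GPU_NB_CLUSTER_SIZE=8
make -j4 && sudo make install
```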

Have you checked the output of clinfo to see whether the OpenCL runtime itself is functioning?

And, well, you probably know that GROMACS 2024 does not officially support this GPU, so getting it working through OpenCL is the best option, but it is still not guaranteed to work. It might be worthwhile to use an older version of GROMACS and build it with CUDA 9.x instead of OpenCL.

Regarding the failed tests: please share the full output of make check. Just knowing that certain tests failed is not enough to understand why they failed.
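For example, the whole test log can be captured in one go (assuming a standard make-based build):

```shell
# Re-run the checks and save everything (stdout + stderr) to a file
# that can be attached to the post.
make check 2>&1 | tee make-check.log
```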

Thank you for this insight. I am rebuilding now with GMX_GPU_NB_CLUSTER_SIZE=8. Below is the output of clinfo:

~~> clinfo 
Number of platforms                               1
  Platform Name                                   NVIDIA CUDA
  Platform Vendor                                 NVIDIA Corporation
  Platform Version                                OpenCL 1.2 CUDA 9.1.84
  Platform Profile                                FULL_PROFILE
  Platform Extensions                             cl_khr_global_int32_base_atomics cl_khr_global_int32_extended_atomics cl_khr_local_int32_base_atomics cl_khr_local_int32_extended_atomics cl_khr_fp64 cl_khr_byte_addressable_store cl_khr_icd cl_khr_gl_sharing cl_nv_compiler_options cl_nv_device_attribute_query cl_nv_pragma_unroll cl_nv_copy_opts cl_nv_create_buffer
  Platform Extensions function suffix             NV

  Platform Name                                   NVIDIA CUDA
Number of devices                                 1
  Device Name                                     NVS 5400M
  Device Vendor                                   NVIDIA Corporation
  Device Vendor ID                                0x10de
  Device Version                                  OpenCL 1.1 CUDA
  Driver Version                                  390.157
  Device OpenCL C Version                         OpenCL C 1.1 
  Device Type                                     GPU
  Device Topology (NV)                            PCI-E, 0000:01:00.0
  Device Profile                                  FULL_PROFILE
  Device Available                                Yes
  Compiler Available                              Yes
  Max compute units                               2
  Max clock frequency                             950MHz
  Compute Capability (NV)                         2.1
  Max work item dimensions                        3
  Max work item sizes                             1024x1024x64
  Max work group size                             1024
  Preferred work group size multiple (kernel)     32
  Warp size (NV)                                  32
  Preferred / native vector sizes                 
    char                                                 1 / 1       
    short                                                1 / 1       
    int                                                  1 / 1       
    long                                                 1 / 1       
    half                                                 0 / 0        (n/a)
    float                                                1 / 1       
    double                                               1 / 1        (cl_khr_fp64)
  Half-precision Floating-point support           (n/a)
  Single-precision Floating-point support         (core)
    Denormals                                     Yes
    Infinity and NANs                             Yes
    Round to nearest                              Yes
    Round to zero                                 Yes
    Round to infinity                             Yes
    IEEE754-2008 fused multiply-add               Yes
    Support is emulated in software               No
    Correctly-rounded divide and sqrt operations  No
  Double-precision Floating-point support         (cl_khr_fp64)
    Denormals                                     Yes
    Infinity and NANs                             Yes
    Round to nearest                              Yes
    Round to zero                                 Yes
    Round to infinity                             Yes
    IEEE754-2008 fused multiply-add               Yes
    Support is emulated in software               No
  Address bits                                    64, Little-Endian
  Global memory size                              1010106368 (963.3MiB)
  Error Correction support                        No
  Max memory allocation                           252526592 (240.8MiB)
  Unified memory for Host and Device              No
  Integrated memory (NV)                          No
  Minimum alignment for any data type             128 bytes
  Alignment of base address                       4096 bits (512 bytes)
  Global Memory cache type                        Read/Write
  Global Memory cache size                        32768 (32KiB)
  Global Memory cache line size                   128 bytes
  Image support                                   Yes
    Max number of samplers per kernel             16
    Max 2D image size                             16384x16384 pixels
    Max 3D image size                             2048x2048x2048 pixels
    Max number of read image args                 128
    Max number of write image args                8
  Local memory type                               Local
  Local memory size                               49152 (48KiB)
  Registers per block (NV)                        32768
  Max number of constant args                     9
  Max constant buffer size                        65536 (64KiB)
  Max size of kernel argument                     4352 (4.25KiB)
  Queue properties                                
    Out-of-order execution                        Yes
    Profiling                                     Yes
  Profiling timer resolution                      1000ns
  Execution capabilities                          
    Run OpenCL kernels                            Yes
    Run native kernels                            No
    Kernel execution timeout (NV)                 Yes
    Concurrent copy and kernel execution (NV)     Yes
      Number of async copy engines                1
  Device Extensions                               cl_khr_global_int32_base_atomics cl_khr_global_int32_extended_atomics cl_khr_local_int32_base_atomics cl_khr_local_int32_extended_atomics cl_khr_fp64 cl_khr_byte_addressable_store cl_khr_icd cl_khr_gl_sharing cl_nv_compiler_options cl_nv_device_attribute_query cl_nv_pragma_unroll cl_nv_copy_opts cl_nv_create_buffer

NULL platform behavior
  clGetPlatformInfo(NULL, CL_PLATFORM_NAME, ...)  NVIDIA CUDA
  clGetDeviceIDs(NULL, CL_DEVICE_TYPE_ALL, ...)   Success [NV]
  clCreateContext(NULL, ...) [default]            Success [NV]
  clCreateContextFromType(NULL, CL_DEVICE_TYPE_DEFAULT)  No platform
  clCreateContextFromType(NULL, CL_DEVICE_TYPE_CPU)  No devices found in platform
  clCreateContextFromType(NULL, CL_DEVICE_TYPE_GPU)  No platform
  clCreateContextFromType(NULL, CL_DEVICE_TYPE_ACCELERATOR)  No devices found in platform
  clCreateContextFromType(NULL, CL_DEVICE_TYPE_CUSTOM)  Invalid device type for platform
  clCreateContextFromType(NULL, CL_DEVICE_TYPE_ALL)  No platform

ICD loader properties
  ICD loader Name                                 OpenCL ICD Loader
  ICD loader Vendor                               OCL Icd free software
  ICD loader Version                              2.3.1
  ICD loader Profile                              OpenCL 3.0

I am at a crossroads: I don’t think I can afford to go back to CUDA 9.x without a complete re-install of my OS ^_^. I would rather stick with this setup and try to get it working with OpenCL, if that is not impossible. Could you point me to the version of GROMACS that would officially support my GPU? I can try that. I will update this post with the outcome of GMX_GPU_NB_CLUSTER_SIZE=8 shortly. Many thanks for the support.

UPDATE

Here is my test failure log:

. . .
87/91 Test #87: MdrunRotationTests ........................   Passed    0.82 sec
      Start 88: MdrunSimulatorComparison
88/91 Test #88: MdrunSimulatorComparison ..................   Passed    0.64 sec
      Start 89: MdrunVirtualSiteTests
89/91 Test #89: MdrunVirtualSiteTests .....................   Passed    0.91 sec
      Start 90: regressiontests/complex
90/91 Test #90: regressiontests/complex ...................***Failed   86.03 sec
Will test on 8 MPI ranks (if possible)
Will test using executable suffix _mpi

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/acetonitrilRF gmx_mpi mdrun     -nb cpu   -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in acetonitrilRF for acetonitrilRF

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/aminoacids gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in aminoacids for aminoacids

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/argon gmx_mpi mdrun     -nb cpu   -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in argon for argon

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/awh_multibias gmx_mpi mdrun        -notunepme -cpi /home/russellb/extpacks/gromacs-2024.5/build/tests/regressiontests-2024.5/complex/awh_multibias/continue -noappend >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in awh_multibias for awh_multibias

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/awh_multidim gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in awh_multidim for awh_multidim

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/butane gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in butane for butane

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/cbt gmx_mpi mdrun     -nb cpu   -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in cbt for cbt

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/dd121 gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in dd121 for dd121

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/dec+water gmx_mpi mdrun     -nb cpu   -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in dec+water for dec+water

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/ethyleenglycol gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in ethyleenglycol for ethyleenglycol

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/field gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in field for field

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nacl gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nacl for nacl

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-energy-groups gmx_mpi mdrun     -nb cpu   -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-energy-groups for nbnxn-energy-groups

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-free-energy gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-free-energy for nbnxn-free-energy

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-free-energy-vv gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-free-energy-vv for nbnxn-free-energy-vv

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-ljpme-geometric gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-ljpme-geometric for nbnxn-ljpme-geometric

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-ljpme-LB gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-ljpme-LB for nbnxn-ljpme-LB

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-ljpme-LB-geometric gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-ljpme-LB-geometric for nbnxn-ljpme-LB-geometric

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-vdw-force-switch gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-vdw-force-switch for nbnxn-vdw-force-switch

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-vdw-potential-switch gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-vdw-potential-switch for nbnxn-vdw-potential-switch

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn-vdw-potential-switch-argon gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn-vdw-potential-switch-argon for nbnxn-vdw-potential-switch-argon

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn_pme gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn_pme for nbnxn_pme

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn_pme_order5 gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn_pme_order5 for nbnxn_pme_order5

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn_pme_order6 gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn_pme_order6 for nbnxn_pme_order6

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn_rf gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn_rf for nbnxn_rf

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn_rzero gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn_rzero for nbnxn_rzero

Abnormal return value for '/usr/bin/mpiexec -np 6 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nbnxn_vsite gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nbnxn_vsite for nbnxn_vsite

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/nst_mismatch gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in nst_mismatch for nst_mismatch

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/octahedron gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in octahedron for octahedron

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/position-restraints gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in position-restraints for position-restraints

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/pr-vrescale gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in pr-vrescale for pr-vrescale

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/pull_constraint gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in pull_constraint for pull_constraint

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/pull_cylinder gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in pull_cylinder for pull_cylinder

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/pull_geometry_angle gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in pull_geometry_angle for pull_geometry_angle

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/pull_geometry_angle-axis gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in pull_geometry_angle-axis for pull_geometry_angle-axis

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/pull_geometry_dihedral gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in pull_geometry_dihedral for pull_geometry_dihedral

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/reb gmx_mpi mdrun     -nb cpu   -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in reb for reb

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/swap_x gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in swap_x for swap_x

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/swap_y gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in swap_y for swap_y

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/swap_z gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in swap_z for swap_z

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/tip4p gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in tip4p for tip4p

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/tip4p_continue gmx_mpi mdrun        -notunepme -cpi /home/russellb/extpacks/gromacs-2024.5/build/tests/regressiontests-2024.5/complex/tip4p_continue/continue -noappend >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in tip4p_continue for tip4p_continue

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/tip4pflex gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in tip4pflex for tip4pflex

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/urea gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in urea for urea

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/complex/walls gmx_mpi mdrun        -notunepme >mdrun.out 2>&1' was -1
FAILED. Check mdrun.out, md.log file(s) in walls for walls
45 out of 48 complex tests FAILED

      Start 91: regressiontests/essentialdynamics
91/91 Test #91: regressiontests/essentialdynamics .........***Failed    9.70 sec
Will test on 8 MPI ranks (if possible)
Will test using executable suffix _mpi

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/essentialdynamics/linfix gmx_mpi mdrun       -ei sam.edi -eo linfix.xvg >mdrun.out 2>&1' was -1

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/essentialdynamics/linacc gmx_mpi mdrun       -ei sam.edi -eo linacc.xvg >mdrun.out 2>&1' was -1

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/essentialdynamics/radfix gmx_mpi mdrun       -ei sam.edi -eo radfix.xvg >mdrun.out 2>&1' was -1

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/essentialdynamics/radacc gmx_mpi mdrun       -ei sam.edi -eo radacc.xvg >mdrun.out 2>&1' was -1

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/essentialdynamics/radcon gmx_mpi mdrun       -ei sam.edi -eo radcon.xvg >mdrun.out 2>&1' was -1

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/essentialdynamics/flooding1 gmx_mpi mdrun       -ei /home/russellb/extpacks/gromacs-2024.5/build/tests/regressiontests-2024.5/essentialdynamics/flooding1/sam.edi -eo flooding1.xvg >mdrun.out 2>&1' was -1

Abnormal return value for '/usr/bin/mpiexec -np 8 -wdir /home/russellb/extpacks/gromacs-2024.5/build/tests/essentialdynamics/flooding2 gmx_mpi mdrun       -ei /home/russellb/extpacks/gromacs-2024.5/build/tests/regressiontests-2024.5/essentialdynamics/flooding2/sam.edi -eo flooding2.xvg >mdrun.out 2>&1' was -1
Essential dynamics tests FAILED with 7 errors!


92% tests passed, 7 tests failed out of 91

Label Time Summary:
GTest              = 131.80 sec*proc (87 tests)
IntegrationTest    =  76.69 sec*proc (30 tests)
MpiTest            =  58.38 sec*proc (23 tests)
QuickGpuTest       =  34.51 sec*proc (20 tests)
SlowGpuTest        = 131.18 sec*proc (16 tests)
SlowTest           =  23.25 sec*proc (13 tests)
UnitTest           =  31.86 sec*proc (44 tests)

Total Test time (real) = 178.09 sec

The following tests FAILED:
	 28 - DomDecMpiTests (Failed)
	 35 - MdrunUtilityMpiUnitTests (Failed)
	 74 - MdrunMultiSimTests (Failed)
	 75 - MdrunMultiSimReplexTests (Failed)
	 76 - MdrunMultiSimReplexEquivalenceTests (Failed)
	 90 - regressiontests/complex (Failed)
	 91 - regressiontests/essentialdynamics (Failed)
Errors while running CTest
make[3]: *** [CMakeFiles/run-ctest-nophys.dir/build.make:71: CMakeFiles/run-ctest-nophys] Error 8
make[2]: *** [CMakeFiles/Makefile2:3310: CMakeFiles/run-ctest-nophys.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:3343: CMakeFiles/check.dir/rule] Error 2
make: *** [Makefile:628: check] Error 2

Also, -np 8 should be impossible, since I only have 4 CPU cores. The "Abnormal return value" messages are something I am not able to understand; please correct me if my assertion is wrong. Many thanks in advance.
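One possible explanation, assuming the tests are launched through Open MPI: since Open MPI 3, mpiexec refuses by default to start more ranks than there are cores, so every `mpiexec -np 8` on a 4-core machine would abort before mdrun even starts, which matches the uniform return value of -1. Two workarounds worth trying (both should be verified against the Open MPI and GROMACS documentation):

```shell
# Workaround 1: allow oversubscription for this session, then re-run the tests.
export OMPI_MCA_rmaps_base_oversubscribe=1
make check

# Workaround 2: tell the GROMACS test harness to use fewer ranks
# (GMX_TEST_NUMBER_PROCS is a GROMACS CMake option; 4 matches the core count here).
cmake . -DGMX_TEST_NUMBER_PROCS=4
make check
```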

Update-2

I am still getting the same issue:

~~> gmx_mpi mdrun -v -deffnm md_0_31 -nb gpu
                      :-) GROMACS - gmx mdrun, 2024.5 (-:

Executable:   /usr/local/gromacs/bin/gmx_mpi
Data prefix:  /usr/local/gromacs
Working dir:  /home/russellb/Documents/work/Amyloid/MDSim/abeta42/7prd
Command line:
  gmx_mpi mdrun -v -deffnm md_0_31 -nb gpu

Reading file md_0_31.tpr, VERSION 2024.5 (single precision)
Changing nstlist from 10 to 100, rlist from 1 to 1.164


-------------------------------------------------------
Program:     gmx mdrun, version 2024.5
Source file: src/gromacs/taskassignment/findallgputasks.cpp (line 85)

Fatal error:
Cannot run short-ranged nonbonded interactions on a GPU because no GPU is
detected.

For more information and tips for troubleshooting, please check the GROMACS
website at https://manual.gromacs.org/current/user-guide/run-time-errors.html
-------------------------------------------------------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

Maybe I will try an earlier version of GROMACS. Which one would support my GPU?

Hi @al42and. Would it be reasonable to downgrade to an earlier version of GROMACS? If so, could you kindly advise which version would be better? Many thanks in advance.