GROMACS on cloud GPUs

GROMACS version: 2023 and later
GROMACS modification: Yes/No

Hi guys!

I have seen some material about GROMACS on cloud GPUs starting to appear online, including the GROMACS paper from last year.

So I guess some of you have already started using cloud GPUs. Could you share any tips or insights?

I am interested in running relatively small simulations (each typically runs on a single GPU over 1-3 days), but many of them. They are standard MD runs; nothing extra is needed.
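For context, a run of the kind described above is typically launched with full GPU offload. A minimal sketch (the `.tpr` filename and thread counts are illustrative; the offload flags are the standard `gmx mdrun` options):

```shell
# Single-GPU GROMACS run with nonbonded, PME, bonded, and update
# offloaded to the GPU; one thread-MPI rank with 8 OpenMP threads.
gmx mdrun -s topol.tpr -deffnm run \
    -nb gpu -pme gpu -bonded gpu -update gpu \
    -ntmpi 1 -ntomp 8
```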

I have never used clouds for calculations, so I am interested in:

  • any pointers on where to start (OK, I know Google);
  • which providers are good, and why;
  • what I need in my own infrastructure;
  • what the typical associated costs are;
  • what good/bad things you have discovered using clouds.

Thank you!

In case you have not already seen them: these slides and this paper.

Maybe @Carsten can weigh in with further resources.


Thank you! I had not seen these slides - they are great!

Hi Valentina,

there is also a slightly newer, more up-to-date version of the slides available here:



Oh even better! Thank you!

PS - I love your illustrations @Carsten !


@Carsten Do you have any more recent data, in particular comparing CPU instances (including EPYC Milan and Graviton 3)?

@pszilard Fig. 1 of this poster shows strong scaling across hpc6a AMD EPYC instances.

Apart from that, I benchmarked the g5g instances (orange in the attached plot), which combine Graviton2 CPUs with NVIDIA T4G GPUs.

But nothing more recent so far.
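For anyone wanting to reproduce this kind of benchmark, a typical throughput measurement looks something like the following (the `.tpr` filename is a placeholder; `-resethway` and `-noconfout` are standard `gmx mdrun` benchmarking options):

```shell
# Benchmark run: -resethway discards the first half of the run from the
# reported performance (to exclude load-balancing warm-up), and
# -noconfout skips writing the final configuration file.
gmx mdrun -s benchmark.tpr -nb gpu -pme gpu \
    -nsteps 20000 -resethway -noconfout
```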

Thanks @Carsten!

You can also try running cloud computations with GROMACS using the GROMACS Wizard in SAMSON. It lets you prepare, minimize, equilibrate, and run simulations both locally and in the cloud, including on cloud GPUs.
See the GROMACS Wizard tutorial and the GROMACS Wizard – Computing in the cloud tutorial.


Thank you Dmitriy,

I did not know about this.