Scaling factor for surface tension in Reduced Units

GROMACS version: 2027.0-dev-20251202-15e0cf6836-unknown
GROMACS modification: No

I’m trying to replicate the results of the paper “Development and application of a particle-particle particle-mesh Ewald method for dispersion interactions”. Here is a snippet of the simulation details I am trying to replicate:

“The Lennard-Jones simulations were performed in a box with volume 11.01 × 11.01 × 176.16 σ^3 and 4000 particles that were placed randomly in a subvolume at the center of the box. After minimization using a soft potential, the system was equilibrated for 100 000 timesteps. The timestep was set to 0.005 τ, where τ = σ√(m/ε). Simulations were executed at reduced temperatures T* = kB·T/ε ∈ {0.7, 0.85, 1.1, 1.2} using a Nosé-Hoover thermostat with damping factor 10 τ. … In simulations with the dispersion PPPM we used cutoffs of 3σ, 4σ, and 5σ. We used P = 5, β = 0.9 σ^-1 and a grid with 9 × 9 × 144 mesh points.”

I think GROMACS uses SPME instead of PPPM, so I used those parameters for SPME instead. I calculated the ewald-rtol-lj (0.11948574141923778) that corresponds to a beta of 0.9 σ^-1. As the manual suggests for reduced units, I used sigma = 1 nm, epsilon = 1 kJ/mol, and m = 1 amu; to simulate a reduced temperature of 0.7, I used T = 120.27236 × 0.7 = 84.190652 K.
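
To make these numbers reproducible, here is a short Python sketch of how I arrived at them. Two assumptions worth flagging: it uses the r^-6 Ewald splitting function g(x) = exp(-x^2)·(1 + x^2 + x^4/2) with x = β·rc, which is my understanding of how GROMACS relates ewald-rtol-lj to the LJ-PME beta, and a cutoff of rc = 2.5 σ, which is the value that reproduces the rtol I quoted above:

```python
import math

# Assumed r^-6 (dispersion) Ewald splitting function: the fraction of the
# LJ dispersion handled in reciprocal space at distance r is
#   g(x) = exp(-x^2) * (1 + x^2 + x^4/2),  x = beta * r
# ewald-rtol-lj is (as I understand GROMACS's LJ-PME setup) this function
# evaluated at the real-space cutoff.
def lj_pme_rtol(beta, rc):
    x = beta * rc
    return math.exp(-x * x) * (1.0 + x * x + x**4 / 2.0)

# Reduced units: sigma = 1 nm, epsilon = 1 kJ/mol, m = 1 amu
KB = 0.008314462618  # Boltzmann constant in kJ/(mol K)

def temperature_in_kelvin(t_reduced, epsilon=1.0):
    """Absolute temperature corresponding to T* = kB*T/epsilon."""
    return t_reduced * epsilon / KB

print(lj_pme_rtol(0.9, 2.5))       # ewald-rtol-lj for beta = 0.9 sigma^-1
print(temperature_in_kelvin(0.7))  # ref-t for T* = 0.7, about 84.19 K
```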

I did not use a soft potential to reduce overlaps; instead, using packmol I set the tolerance (1 nm) such that normal minimization worked without errors or clashing. I then performed all the simulations and calculated the surface tension myself, extracting the values from the energy files.

Since the surface tension output involves pressures in units of bar, I converted bar to amu/(nm · ps^2); the scaling factor I got was 0.06022140640952086 [(1e5 · 1e-9 · 1e-24) / 1.6605391e-27, i.e. bar → Pa, m^-1 → nm^-1, s^-2 → ps^-2, and kg → amu].
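
As a sanity check, here is the factor computed explicitly. The amu value in kg is the one I used, and I am assuming the energy-file surface tension is in bar·nm; with σ = 1 nm and ε = 1 kJ/mol (= 1 amu·nm^2/ps^2 in GROMACS internal units), the same numeric factor then takes bar·nm to the reduced unit ε/σ^2:

```python
# 1 bar = 1e5 Pa = 1e5 kg m^-1 s^-2. Converting each SI unit to
# GROMACS internal units (amu, nm, ps):
#   kg   -> amu   : divide by 1.6605391e-27
#   m^-1 -> nm^-1 : multiply by 1e-9
#   s^-2 -> ps^-2 : multiply by 1e-24
AMU_IN_KG = 1.6605391e-27

BAR_TO_INTERNAL = (1e5 * 1e-9 * 1e-24) / AMU_IN_KG
print(BAR_TO_INTERNAL)  # ~0.0602214 amu nm^-1 ps^-2 per bar

# Assuming the surface tension from the energy file is in bar nm, and
# with sigma = 1 nm, epsilon = 1 kJ/mol (= 1 amu nm^2/ps^2), the reduced
# surface tension gamma* = gamma * sigma^2 / epsilon is then:
def to_reduced_surface_tension(gamma_bar_nm):
    return gamma_bar_nm * BAR_TO_INTERNAL
```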

I am attaching a Python script and the last few lines of the surface tension .xvg file that I obtained from the simulation. My concern is that the final surface tension value I obtained is on a completely different scale from what the paper reports (my value: 979.9390508365599; paper value: 0.588).

Am I missing a scaling factor in my calculations?

Any help would be greatly appreciated. Thanks!

scale_surface_tension_py.dat (1.1 KB)

surf_ten_xvg.dat (105.5 KB)