...
Version | Installation Path | Modulefile | MPI | Comment |
---|---|---|---|---|
2018.4 | /sw/chem/gromacs/2018.4/skl/impi | gromacs/2018.4 | intelmpi | |
2018.4 | /sw/chem/gromacs/2018.4/skl/impi-plumed | gromacs/2018.4-plumed | intelmpi | with plumed |
2019.6 | /sw/chem/gromacs/2019.6/skl/impi | gromacs/2019.6 | intelmpi | |
2019.6 | /sw/chem/gromacs/2019.6/skl/impi-plumed | gromacs/2019.6-plumed | intelmpi | with plumed |
## Usage
Load the necessary modulefiles. Note that the Intel MPI modulefile must be loaded before the GROMACS modulefile.
...
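For example, to use GROMACS 2019.6 with Intel MPI (module versions taken from the table and job script below; adjust them to the versions available on your system):

```shell
# Load Intel MPI first, then the matching GROMACS build
module load impi/2019.5
module load gromacs/2019.6

# Optional: verify which GROMACS binary is now on the PATH
which gmx_mpi
```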
To run simulations, GROMACS must be started through the MPI launcher:

```
mpirun gmx_mpi mdrun MDRUNARGUMENTS
```
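Here `MDRUNARGUMENTS` is a placeholder for the usual `gmx mdrun` options. A typical invocation might look like the following sketch (the run name `md` is only an example; `-deffnm` sets the default filename for all input/output files, and `-maxh` tells mdrun to stop cleanly before the walltime limit):

```shell
# Run the system defined in md.tpr; write outputs as md.*
# Stop gracefully after ~11.5 hours to fit a 12-hour walltime limit
mpirun gmx_mpi mdrun -deffnm md -maxh 11.5
```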
## Job Script Examples
For Intel Cascade Lake compute nodes: a simple GROMACS job using a total of 960 CPU cores distributed over 10 nodes, running 96 tasks each, for 12 hours.
```bash
#!/bin/bash
#SBATCH -t 12:00:00
#SBATCH -p standard96
#SBATCH -N 10
#SBATCH --tasks-per-node 96

export SLURM_CPU_BIND=none

module load impi/2019.5
module load gromacs/2019.6

mpirun gmx_mpi mdrun MDRUNARGUMENTS
```
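Assuming the script above is saved as `gromacs.job` (the filename is arbitrary), it can be submitted and monitored with the standard Slurm commands:

```shell
sbatch gromacs.job   # submit the job; prints the assigned job ID
squeue -u $USER      # check the queue status of your jobs
```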