GROMACS is a molecular dynamics package designed primarily for simulations of proteins, lipids, and nucleic acids. It was originally developed in the Biophysical Chemistry department of the University of Groningen and is now maintained by contributors at universities and research centers worldwide. GROMACS is one of the fastest and most widely used molecular dynamics packages available, and it can run on both central processing units (CPUs) and graphics processing units (GPUs).
It reports calculation progress and the estimated time to completion during a run, provides a trajectory viewer, and includes an extensive library of trajectory-analysis tools. Support for several force fields makes GROMACS very flexible, and it can be executed in parallel using the Message Passing Interface (MPI) or threads. A bundled tool converts molecular coordinates from Protein Data Bank (PDB) files into the formats GROMACS uses internally. Once a configuration for the system to be simulated (possibly including solvent) has been prepared, the simulation run, which can be time-consuming, produces a trajectory file describing the movement of the atoms over time. That file can then be analyzed or visualized with the supplied tools.
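The PDB conversion mentioned above is done with the `gmx pdb2gmx` tool, typically followed by box definition and solvation. A minimal sketch of that preparation step (the input file `protein.pdb`, the SPC/E water model, and the box size are placeholder choices, not requirements of this cluster):

```shell
# Convert a PDB structure into GROMACS coordinate/topology files.
# protein.pdb is a placeholder input; -water spce selects the SPC/E water model.
gmx pdb2gmx -f protein.pdb -o processed.gro -p topol.top -water spce

# Define a cubic simulation box with a 1.0 nm solute-wall distance,
# then fill it with solvent (spc216.gro is the bundled water box).
gmx editconf -f processed.gro -o boxed.gro -c -d 1.0 -bt cubic
gmx solvate -cp boxed.gro -cs spc216.gro -o solvated.gro -p topol.top
```

The resulting coordinate and topology files are the inputs consumed by `grompp` in the sample script below.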
 


Sample Slurm Script:
#!/bin/bash
#SBATCH --job-name=Swater
#SBATCH --nodes=1                 ## Adjust to your requirement
#SBATCH --ntasks-per-node=24      ## MPI tasks per node; do not change
##SBATCH --mem=10gb               ## RAM size; do not change
#SBATCH --error=job.%J.err
#SBATCH --output=job.%J.out
#SBATCH --time=14-00:00:00        ## Adjust to your requirement
#SBATCH --partition=standard
##SBATCH --cpus-per-task=2
##SBATCH --constraint=gpu

################# Load or unload modules as required ########################
module load apps/gromacs/parallel_studio_xe_2019.3.062/2019.6-mpi
#############################################################################
export OMP_NUM_THREADS=1         ## One OpenMP thread per MPI task; do not change
###################### Your Job ###############################
## NVT equilibration: restrain heavy atoms and equilibrate temperature
gmx_mpi grompp -f nvt.mdp -c em.gro -r em.gro -p topol.top -o nvt.tpr
mpiexec.hydra -n $SLURM_NTASKS gmx_mpi mdrun -deffnm nvt
## NPT equilibration: continue from the NVT checkpoint and equilibrate pressure
gmx_mpi grompp -f npt.mdp -c nvt.gro -t nvt.cpt -r nvt.gro -p topol.top -o npt.tpr
mpiexec.hydra -n $SLURM_NTASKS gmx_mpi mdrun -deffnm npt
## Production MD run, continuing from the NPT checkpoint
gmx_mpi grompp -f md.mdp -c npt.gro -t npt.cpt -p topol.top -o md_0_10.tpr
mpiexec.hydra -n $SLURM_NTASKS gmx_mpi mdrun -deffnm md_0_10
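Once the production run `md_0_10` finishes, its trajectory can be post-processed with the bundled analysis tools mentioned above. A hedged sketch of two common first steps (the interactive group selections and output names are examples; file names follow the run in the script):

```shell
# Remove periodic-boundary jumps and center the molecule before analysis
# (gmx trjconv prompts for the centering and output groups).
gmx_mpi trjconv -s md_0_10.tpr -f md_0_10.xtc -o md_0_10_noPBC.xtc -pbc mol -center

# Backbone RMSD relative to the starting structure, reported in ns
gmx_mpi rms -s md_0_10.tpr -f md_0_10_noPBC.xtc -o rmsd.xvg -tu ns
```

The resulting `.xvg` files are plain-text tables that can be plotted with tools such as xmgrace or gnuplot.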