GROMACS is a software package for molecular dynamics simulations. SQUID provides GROMACS 2021.2 for use via batch request. Anyone who has applied to use SQUID is eligible to use it.
Basic use
GROMACS can be executed only via batch request. Connect to the front-end node, create the input files and the job script needed for the calculation, and then submit the job. Example job scripts and the job submission procedure are described below.
* For the GROMACS manual, please refer to the official website.
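As a minimal sketch of this workflow, assuming the gromacs module has been loaded on the front-end node and that md.mdp, conf.gro, and topol.top are hypothetical placeholder input files (the GROMACS binary on SQUID may be named gmx_mpi_d rather than gmx; adjust accordingly):

% gmx grompp -f md.mdp -c conf.gro -p topol.top -o topol.tpr
% qsub gromacs.sh

gmx grompp packs the run parameters (.mdp), starting structure (.gro), and topology (.top) into the binary run input (.tpr) that mdrun reads during the batch job.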
Writing a Job Script (for General Purpose CPU nodes)
The following is an example job script for running GROMACS with 152-way parallelism (2 nodes, 76 processes per node). There is no particular requirement for the file name; in this section it is named "gromacs.sh".
#!/bin/bash
#PBS -q SQUID
#PBS -l cpunum_job=76
#PBS --group=[group name]
#PBS -b 2
#PBS -T intmpi
#PBS -l elapstim_req=01:00:00
#PBS -v OMP_NUM_THREADS=1

module load BaseApp
module load gromacs/2021.2

cd $PBS_O_WORKDIR
mpirun ${NQSII_MPIOPTS} -np 152 -ppn 76 gmx_mpi_d mdrun (Input file)
For other lines of the job script, see here.
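The script above is a pure-MPI run: OMP_NUM_THREADS=1 with 152 MPI ranks, i.e. 2 nodes × 76 ranks per node, matching cpunum_job=76. For a hybrid MPI/OpenMP run, keep ranks × threads equal to 76 per node. A minimal sketch of the changed lines only, using an illustrative (not recommended) 2-thread split; the rest of gromacs.sh stays the same:

# changed lines only: 2 OpenMP threads per rank, 38 ranks per node
#PBS -v OMP_NUM_THREADS=2
mpirun ${NQSII_MPIOPTS} -np 76 -ppn 38 gmx_mpi_d mdrun (Input file)

Here 38 ranks × 2 OpenMP threads again occupy all 76 cores on each node.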
Writing a Job Script (for GPU nodes)
The following is an example job script for running GROMACS on GPU nodes. There is no particular requirement for the file name; in this section it is named "gromacs.sh".
#!/bin/bash
#PBS -q SQUID
#PBS --group=[group name]
#PBS -l elapstim_req=00:10:00
#PBS -b 2
#PBS -T openmpi
#PBS -l gpunum_job=8
#PBS -v NQSV_MPI_MODULE=BaseGCC/2021:cuda/11.2
#PBS -v OMP_NUM_THREADS=6

module load BaseApp
module load gromacs/2021.2mpi.GPU

cd $PBS_O_WORKDIR
export GMX_MPI=`which gmx_mpi`
gmx mdrun -s poly-ch2.tpr -ntmpi 6
mpirun ${NQSV_MPIOPTS} -np 2 -npernode 1 ${GMX_MPI} mdrun -s poly-ch2.tpr
For other lines of the job script, see here.
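In this example, mpirun starts 2 MPI ranks (-np 2, one per node via -npernode 1 on the 2 requested nodes), each running 6 OpenMP threads (OMP_NUM_THREADS=6). Note that the script shows two mdrun invocations: a single-node thread-MPI run (gmx mdrun -ntmpi 6) and a multi-node MPI run via mpirun; in practice you would normally keep only one of them. GROMACS detects and uses the allocated GPUs automatically; if you want to make the offload explicit, a hedged variant of the mpirun line using standard mdrun options (not SQUID-specific settings) is:

# force nonbonded kernels onto the GPU, 6 OpenMP threads per rank
mpirun ${NQSV_MPIOPTS} -np 2 -npernode 1 ${GMX_MPI} mdrun -s poly-ch2.tpr -nb gpu -ntomp 6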
How to execute
You can submit your job script with the "qsub" command.
% qsub gromacs.sh
If the job runs successfully, the calculation results are written to the output file.
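After submission you can check the job status and, once the job is running, follow the simulation log; a minimal sketch (mdrun writes its log to md.log by default, unless renamed with the -g option):

% qstat
% tail md.log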