Applications/Orca
Revision as of 09:56, 4 February 2019
Application Details
- Description : ORCA is an ab initio, density functional and semiempirical quantum chemistry package for electronic structure calculations on molecular systems.
- Versions : 3.0.0, 3.0.3, 4.0.0, 4.0.1, 4.1.0 and 4.1.1
- Module names : orca/3.0.0, orca/3.0.3, orca/4.0.0, orca/4.0.1, orca/4.1.0/openmpi212, orca/4.1.0/openmpi313 and orca/4.1.1/openmpi313
- License: Free for academic use
Note : ORCA versions >= 4.0.0 require OpenMPI version 2 or 3
Modules Available
- module add orca/3.0.0
- module add orca/3.0.3
- module add orca/4.0.0
- module add orca/4.0.1
- module add orca/4.1.0/openmpi212
- module add orca/4.1.0/openmpi313
- module add orca/4.1.1/openmpi313
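Because ORCA versions from 4.0.0 onwards need a matching OpenMPI in the environment (see the note above), the module suffix names the OpenMPI build the ORCA binary was linked against. A sketch of a typical session, assuming the module names listed above:

```shell
# Load ORCA 4.1.1 built against OpenMPI 3.1.3 (the openmpi313 suffix).
# Module names are the ones listed above; availability may vary per cluster.
module purge
module add orca/4.1.1/openmpi313
module list       # verify that orca and its OpenMPI dependency are loaded
```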
Usage Examples
Batch Submission
#!/bin/bash
#SBATCH -J orca_Zr-L1
#SBATCH -N 1
#SBATCH --ntasks-per-node 14
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH -p compute
#SBATCH --exclusive
#SBATCH --time=12:00:00
echo $SLURM_JOB_NODELIST
module purge
module add orca/3.0.3
module add openmpi/gcc/1.10.2
# note: the following I_MPI_* settings apply to Intel MPI and have no
# effect with the OpenMPI module loaded above
export I_MPI_FABRICS=shm:tmi
export I_MPI_FALLBACK=no
module list
mpirun --version
# ORCA should be invoked via its full path for parallel runs
ORCA=/trinity/clustervision/CentOS/7/apps/orca/3.0.3/orca
export OMP_NUM_THREADS=14
#CHANGE HERE FOR INPUT FILE (.inp)
inpfile=Zr-amine-cyclam-3-Ligand.inp
# derive the output filename from the input filename (.inp -> .out)
outfile=${inpfile/%.inp/.out}
echo writing output into : $outfile
SCRATCH=/tmp/$USER/$SLURM_JOB_ID
echo Creating temp dir $SCRATCH
mkdir -p $SCRATCH || exit $?
echo "Copying files to scratch (srun cp is equivalent to a loop over each node + scp)"
cp -r $SLURM_SUBMIT_DIR/$inpfile $SCRATCH || exit $?
cd $SCRATCH
$ORCA $inpfile > $SLURM_SUBMIT_DIR/$outfile
echo "calculation finished - copying files back to submission directory"
cp $SCRATCH/*.gbw $SLURM_SUBMIT_DIR
cp $SCRATCH/*.hess $SLURM_SUBMIT_DIR
cp $SCRATCH/*.trj $SLURM_SUBMIT_DIR
cp $SCRATCH/*.xyz* $SLURM_SUBMIT_DIR
echo "calculation finished - removing scratch dir"
rm -rf $SCRATCH
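The output filename in the script above is derived with bash pattern substitution. A small self-contained illustration of that step (the filename here is just an example): anchoring the pattern to the suffix avoids rewriting an "inp" that happens to appear elsewhere in the name.

```shell
#!/bin/bash
# Derive the .out filename from the .inp filename, as the job script does.
inpfile=input.inp

# Unanchored substitution replaces every "inp" in the name:
echo "${inpfile//inp/out}"       # outut.out -- not what we want here

# Anchoring the pattern to the end of the string (/%) only rewrites the suffix:
outfile=${inpfile/%.inp/.out}
echo "$outfile"                  # input.out
```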
[username@login01 ~]$ sbatch orca-test.job
Submitted batch job 189522
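The job reads whichever .inp file the script names. A minimal, hypothetical ORCA input requesting 14 parallel processes (matching --ntasks-per-node 14 in the script) could look like the following; the method, basis set and geometry are placeholders, not a recommendation:

```
! B3LYP def2-SVP Opt
%pal
  nprocs 14
end
* xyz 0 1
  O   0.000000   0.000000   0.000000
  H   0.000000   0.757000   0.587000
  H   0.000000  -0.757000   0.587000
*
```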
Note : ORCA version 4.0.0 requires OpenMPI version 2 (e.g. module add openmpi/2.0.2/gcc-5.2.0)