Applications/Orca
From HPC
Revision as of 10:05, 23 March 2017
Application Details
- Description : ORCA is an ab initio quantum chemistry program package that can perform electronic structure calculations on molecular systems, ranging from semi-empirical methods to DFT and wavefunction-based methods.
- Versions : 3.0.0, 3.0.3 and 4.0.0.2
- Module names : orca/3.0.0, orca/3.0.3 and orca/4.0.0
- License: Free for academic use
Note : ORCA version 4.0.0 requires OpenMPI version 2.
Modules Available
- module add orca/3.0.0
- module add orca/3.0.3
- module add orca/4.0.0
Usage Examples
Batch Submission
#!/bin/bash
#SBATCH -J orca_Zr-L1
#SBATCH -N 1
#SBATCH --ntasks-per-node 14
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH -p compute
#SBATCH --exclusive
#SBATCH --time=48:00:00

echo $SLURM_JOB_NODELIST

module purge
module add orca/3.0.3
module add openmpi/gcc/1.10.2
export I_MPI_FABRICS=shm:tmi
export I_MPI_FALLBACK=no
module list
mpirun --version

ORCA=/trinity/clustervision/CentOS/7/apps/orca/3.0.3/orca
export OMP_NUM_THREADS=14

# CHANGE HERE FOR INPUT FILE (.inp)
inpfile=Zr-amine-cyclam-3-Ligand.inp

# Derive the output file name (.out) from the input file name
outfile=${inpfile//inp/out}
echo writing output into : $outfile

SCRATCH=/tmp/$USER/$SLURM_JOB_ID
echo Creating temp dir $SCRATCH
mkdir -p $SCRATCH || exit $?

echo Copying files. srun cp is equivalent to loop over each node + scp
cp -r $SLURM_SUBMIT_DIR/$inpfile $SCRATCH || exit $?

cd $SCRATCH
$ORCA $inpfile > $SLURM_SUBMIT_DIR/$outfile

echo calculation finished - copying files back to home directory
cp $SCRATCH/*.gbw $SLURM_SUBMIT_DIR
cp $SCRATCH/*.hess $SLURM_SUBMIT_DIR
cp $SCRATCH/*.trj $SLURM_SUBMIT_DIR
cp $SCRATCH/*.xyz* $SLURM_SUBMIT_DIR

echo calculation finished - removing scratch dir
rm -rf $SCRATCH
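The output file name in the script is derived with bash pattern substitution. A quick sketch of how `${inpfile//inp/out}` behaves: the `//` form replaces every occurrence of `inp`, so a file name that happened to contain `inp` outside the extension would also be rewritten; trimming the suffix with `%` is a stricter alternative.

```shell
#!/bin/bash
# Demonstrates the substitution used in the job script to derive
# the .out file name from the .inp file name.
inpfile=Zr-amine-cyclam-3-Ligand.inp

# Replaces every occurrence of "inp" with "out"
outfile=${inpfile//inp/out}
echo $outfile   # Zr-amine-cyclam-3-Ligand.out

# Stricter alternative: strip only the trailing ".inp", then append ".out"
outfile2=${inpfile%.inp}.out
echo $outfile2  # Zr-amine-cyclam-3-Ligand.out
```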
[username@login01 ~]$ sbatch orca-test.job
Submitted batch job 189522
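The job above reads Zr-amine-cyclam-3-Ligand.inp, which is not shown here. Purely as an illustration of the general shape of an ORCA input file, a minimal parallel single-point calculation looks like the following; the method, basis set, and geometry are placeholders, not the ones used in the job:

```
! BP86 def2-SVP

# Request 14 parallel processes, matching --ntasks-per-node in the job script
%pal
  nprocs 14
end

# Charge 0, multiplicity 1, Cartesian coordinates in Angstrom
* xyz 0 1
  O   0.0000   0.0000   0.0000
  H   0.0000   0.7572   0.5865
  H   0.0000  -0.7572   0.5865
*
```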
ORCA version 4.0.0 requires OpenMPI version 2 (e.g. module load openmpi/2.0.2/gcc-5.2.0).