Applications/Orca

Application Details

  • Description : ORCA is an ab initio quantum chemistry program package featuring density functional theory, semi-empirical, and correlated wavefunction methods for calculations on molecular systems.
  • Versions : 3.0.0, 3.0.3, 4.0.0, 4.0.1, 4.1.0, 4.1.1 and 4.1.2
  • Module names : orca/3.0.0, orca/3.0.3, orca/4.0.0, orca/4.0.1, orca/4.1.0/openmpi313, orca/4.1.1/openmpi313 and orca/4.1.2/openmpi313
  • License: Free for academic use

Note : ORCA versions newer than 4.0.0 require OpenMPI 3
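
Before submitting a job with one of the newer versions, it is worth checking that the matching MPI stack has been picked up. A minimal check, using the module names listed below:

module purge
module add orca/4.1.2/openmpi313
mpirun --version    # should report an Open MPI 3.1.x release
which orca          # confirms the ORCA binary is on your PATH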


Modules Available

  • module add orca/3.0.0
  • module add orca/3.0.3
  • module add orca/4.0.0
  • module add orca/4.0.1
  • module add orca/4.1.0/openmpi313
  • module add orca/4.1.1/openmpi313
  • module add orca/4.1.2/openmpi313
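
To see which ORCA builds are installed before picking one, the standard module query can be used, for example:

[username@login01 ~]$ module avail orca
[username@login01 ~]$ module add orca/4.1.1/openmpi313
[username@login01 ~]$ module list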


Usage Examples

Batch Submission


#!/bin/bash
#SBATCH -J orc_4N
#SBATCH -N 1
#SBATCH --ntasks-per-node 28
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH -p compute
#SBATCH --exclusive

echo $SLURM_JOB_NODELIST

module purge
module load orca/4.0.1

# Intel MPI fabric settings (only relevant for builds linked against Intel MPI)
export I_MPI_FABRICS=shm:tmi
export I_MPI_FALLBACK=no

module list

mpirun --version

# ORCA must be called with its full path for parallel runs to work
export ORCA=$(which orca)
export OMP_NUM_THREADS=2
echo "Using ORCA located at $ORCA"

#CHANGE HERE FOR INPUT FILE (.inp) - an example input is shown after this script
inpfile=py-lpno-pccsd-freq.inp

# derive the output file name from the input file name
outfile=${inpfile%.inp}.out
echo "writing output into: $outfile"

SCRATCH=/local/$USER/$SLURM_JOB_ID
echo Creating temp dir $SCRATCH
mkdir -p $SCRATCH || exit $?
echo Copying input files to local scratch (for multi-node jobs, use srun cp to reach every node)
cp -r $SLURM_SUBMIT_DIR/$inpfile  $SCRATCH || exit $?
#cp -r $SLURM_SUBMIT_DIR/oh-pi-p-b3lyp-freq.hess  $SCRATCH || exit $?

cd $SCRATCH

$ORCA   $inpfile > $SLURM_SUBMIT_DIR/$outfile

echo calculation finished - copying files back to home directory

# copy everything back to the submission directory
# (the wildcard already includes the .gbw, .hess, .trj and .xyz files)
cp $SCRATCH/* $SLURM_SUBMIT_DIR

echo calculation finished - removing scratch dir
rm -rf  $SCRATCH
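
The script expects a standard ORCA input file (the py-lpno-pccsd-freq.inp used above is not reproduced here). As an illustration only, a minimal input for a water geometry optimisation, requesting 28 parallel processes to match --ntasks-per-node 28, might look like:

! B3LYP def2-SVP Opt
%pal
  nprocs 28
end
* xyz 0 1
O   0.000000   0.000000   0.000000
H   0.000000   0.757000   0.587000
H   0.000000  -0.757000   0.587000
*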


[username@login01 ~]$ sbatch orca-test.job
Submitted batch job 2189522
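
Once submitted, progress can be followed with the usual SLURM tools, and the ORCA output appears in the submission directory as the job runs:

[username@login01 ~]$ squeue -u $USER
[username@login01 ~]$ tail -f py-lpno-pccsd-freq.out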
Note : ORCA versions newer than 4.0.0 require OpenMPI 3 (load one of the openmpi313 module variants)

Further Information