Applications/Orca

Application Details

  • Description : ORCA is an ab initio quantum chemistry program package for electronic structure calculations, supporting density functional, semi-empirical, and correlated wavefunction methods.
  • Versions : 3.0.3 and 3.0.0
  • Module names : orca/3.0.3 and orca/3.0.0
  • License: Free for academic use


Modules Available

  • module load orca/3.0.0
  • module load orca/3.0.3

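To check which ORCA modules are installed before loading one (a quick sanity check; the exact output depends on the local installation):

[username@login01 ~]$ module avail orca
[username@login01 ~]$ module load orca/3.0.3
[username@login01 ~]$ module list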

Usage Examples

Batch Submission


#!/bin/bash

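# Slurm directives: job name, one node with 14 tasks, output/error file patterns
# (%N = node name, %j = job ID, %a = array index), partition, exclusive node use,
# and a 48-hour wall-time limit.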
#SBATCH -J orca_Zr-L1
#SBATCH -N 1
#SBATCH --ntasks-per-node 14
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH -p compute
#SBATCH --exclusive
#SBATCH --time=48:00:00

echo $SLURM_JOB_NODELIST

module purge
module add orca/3.0.3
module add openmpi/gcc/1.10.2

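# Intel MPI transport settings; these are only consulted by Intel MPI and are
# harmless when running with OpenMPI as loaded above.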
export I_MPI_FABRICS=shm:tmi
export I_MPI_FALLBACK=no

module list

mpirun --version

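# Full path to the ORCA executable (matching the orca/3.0.3 module) and the
# OpenMP thread count for any threaded libraries.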
ORCA=/trinity/clustervision/CentOS/7/apps/orca/3.0.3/orca
export OMP_NUM_THREADS=14

#CHANGE HERE FOR INPUT FILE (.inp)
inpfile=Zr-amine-cyclam-3-Ligand.inp
# derive the output file name (.out) from the input file name
outfile=${inpfile%.inp}.out
echo Writing output to: $outfile

SCRATCH=/tmp/$USER/$SLURM_JOB_ID
echo Creating temp dir $SCRATCH
mkdir -p $SCRATCH || exit $?
# on multi-node jobs, srun cp (or a loop of scp over the nodes) would be needed
# to stage the input to every node; plain cp is sufficient on a single node
echo Copying input file to scratch
cp -r $SLURM_SUBMIT_DIR/$inpfile  $SCRATCH || exit $?

cd $SCRATCH

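# ORCA starts its own MPI processes internally (controlled by the %pal block or
# PALn keyword in the input file), so the binary is invoked directly by its full
# path rather than through mpirun.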
$ORCA $inpfile > $SLURM_SUBMIT_DIR/$outfile

echo calculation finished - copying result files back to the submission directory

cp $SCRATCH/*.gbw $SLURM_SUBMIT_DIR
cp $SCRATCH/*.hess $SLURM_SUBMIT_DIR
cp $SCRATCH/*.trj $SLURM_SUBMIT_DIR
cp $SCRATCH/*.xyz* $SLURM_SUBMIT_DIR

echo calculation finished - removing scratch dir
rm -rf  $SCRATCH


[username@login01 ~]$ sbatch orca-test.job
Submitted batch job 189522

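The job script reads whatever ORCA input file is named in inpfile (Zr-amine-cyclam-3-Ligand.inp above). As a rough illustration of the format only, not the actual input used here, a minimal ORCA input file requesting a parallel run over 14 processes could look like this (method, basis set, and geometry are placeholders):

! BP86 def2-SVP Opt
%pal nprocs 14 end

* xyz 0 1
O   0.000000   0.000000   0.000000
H   0.000000   0.757000   0.587000
H   0.000000  -0.757000   0.587000
*

The nprocs value should match the --ntasks-per-node setting in the job script so that ORCA does not start more MPI processes than Slurm has allocated.
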
Further Information