Applications/CP2K

Application Details

  • Description : CP2K is a quantum chemistry and solid-state physics software package that can perform atomistic simulations of solid-state, liquid, molecular, periodic, material, crystal, and biological systems.
  • Versions : 3.0, 6.1.0, 9.1.0
  • Module names : cp2k/3.0 ; cp2k/6.1.0/gcc-7.3.0/intelmpi-2018 ; cp2k/6.1.0/gcc-7.3.0/openmpi-3.0.0 ; cp2k/9.1.0/gcc-8.5.0/openmpi-4.1.1
  • License: Freely available under the GPL license

Modules Available

  • module add cp2k/3.0
  • module add cp2k/6.1.0/gcc-7.3.0/intelmpi-2018
  • module add cp2k/6.1.0/gcc-7.3.0/openmpi-3.0.0
  • module add cp2k/9.1.0/gcc-8.5.0/openmpi-4.1.1
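
For example, to load the most recent build interactively and check that the parallel binary it provides is on your path (the prompt below is illustrative):

[username@login01 ~]$ module add cp2k/9.1.0/gcc-8.5.0/openmpi-4.1.1
[username@login01 ~]$ which cp2k.psmp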

Usage Examples

Batch Submission

OpenMPI Submission Script

The following is a sample submission script for an OpenMPI CP2K task running across 3 nodes with the input file cp2ktask.inp (change this to reflect your particular task):

#!/bin/bash
#SBATCH -J cp2k_openmpi
#SBATCH -N 3
#SBATCH --ntasks-per-node 27
#SBATCH -o cp2k-%j.out
#SBATCH -e cp2k-%j.err
#SBATCH -p compute
#SBATCH --exclusive

module purge
module add cp2k/9.1.0/gcc-8.5.0/openmpi-4.1.1

# calculating the number of processes
NP=$(( $SLURM_JOB_NUM_NODES * $SLURM_NTASKS_PER_NODE ))
echo $NP "processes"

INPUTFILE=cp2ktask.inp
OUTPUTFILE=cp2ktask.out

# run the MPI-parallel (psmp) binary with one OpenMP thread per MPI rank
export OMP_NUM_THREADS=1
mpirun -np $NP cp2k.psmp $INPUTFILE > $SLURM_SUBMIT_DIR/$OUTPUTFILE

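Save the script to a file (cp2k-openmpi.job is just an illustrative name here) and submit it to the scheduler with sbatch, as in the examples below:

[username@login01 ~]$ sbatch cp2k-openmpi.job
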
Compute Queue

The following is a sample submission script for the compute partition, using the Intel compiler, MKL and Intel MPI modules together with a user-built cp2k.psmp binary to run the H2O-64-RI-MP2-TZ benchmark input (change the node count and input file to reflect your particular task):

#!/bin/bash
#SBATCH -J cp2k-cpu
#SBATCH -N 120
#SBATCH --ntasks-per-node 14
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH -p compute
#SBATCH --exclusive
#SBATCH --mail-user=<your email address here>

echo $SLURM_JOB_NODELIST

module purge

module add intel/mkl/64/11.3.2 
module add intel/mpi/64/5.1.3.181  
module add intel/compiler/64/2016
module add cp2k/3.0

# use shared memory within a node and the TMI (PSM) fabric between nodes,
# and do not fall back to another fabric if it is unavailable
export I_MPI_FABRICS=shm:tmi
export I_MPI_FALLBACK=no

module list
mpirun --version

# calculating the number of processes
NP=$(( $SLURM_JOB_NUM_NODES * $SLURM_NTASKS_PER_NODE ))
echo $NP "processes"

# path to a user-built MPI + OpenMP (psmp) binary, run with two OpenMP threads per MPI rank
CP2K=/home/user/cp2k/cp2k/exe/Linux-x86-64-intel-host/cp2k.psmp
export OMP_NUM_THREADS=2

# -genvall passes the full environment to every MPI rank; PSM_TRACEMASK adjusts PSM library tracing
mpirun -genvall -np $NP env PSM_TRACEMASK=0x101 $CP2K H2O-64-RI-MP2-TZ.inp > H2O-64-RI-MP2-TZ-omp2.out


[username@login01 ~]$ sbatch cp2k-test.job
Submitted batch job 189522
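
You can then monitor the job while it is queued or running with squeue (replace username with your own account name):

[username@login01 ~]$ squeue -u username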

GPU Queue

The following is a sample submission script for the gpu partition, requesting a single GPU and running the CUDA-enabled cp2k.sopt binary on the H2O-64 benchmark input:

#!/bin/bash
#SBATCH -J cp2k-gpu
#SBATCH -N 1
#SBATCH --ntasks-per-node 24
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH -p gpu
#SBATCH --gres=gpu:1
#SBATCH --exclusive

module purge

module add cp2k/3.0
module add cuda/7.5.18
module add intel/mkl/64/11.3.2
module add intel/mpi/64/5.1.3.181

module list

nvidia-smi -a

mpirun --version

# calculating the number of processes
NP=$(( $SLURM_JOB_NUM_NODES * $SLURM_NTASKS_PER_NODE ))
echo $NP "processes"

# path to the CUDA-enabled serial (sopt) CP2K binary
CP2K=/trinity/clustervision/CentOS/7/apps/cp2k/build-v1intel-cp2k-20151010-120408/cp2k/exe/Linux-x86-64-cuda/cp2k.sopt

$CP2K H2O-64.inp > H2O-64.out


[username@login01 ~]$ sbatch cp2k-test-gpu.job
Submitted batch job 189523

Further information

  • https://www.cp2k.org/