Applications/Mvapich2
Application Details
- Description: The MVAPICH2 software, based on the MPI 3.1 standard, delivers performance, scalability and fault tolerance for high-end computing systems and servers using Omni-Path (Viper) networking technologies.
- Version: 2.2
- Modules: mvapich2/2.2/gcc-5.2.0, mvapich2/2.2/gcc-6.3.0 and mvapich2/2.2/intel-2017 (see the note after this list for checking which builds are installed)
- Licence: Open source
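The installed builds may change over time, so it is worth confirming what is available before loading a module. A minimal sketch using the standard module command (the login node prompt is reused from the submission example below):
[username@login01 ~]$ module avail mvapich2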
Usage Examples
Message Passing Interface (MPI) is a standardized and portable message-passing system designed by a group of researchers from academia and industry to function on a wide variety of parallel computing architectures. The standard defines the syntax and semantics of a core of library routines useful to a wide range of users writing portable message-passing programs in C, C++, and Fortran. The MVAPICH2 software family is ABI compatible with the version of MPICH it is based on.
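Once the appropriate modules are loaded, MPI programs are compiled with the wrapper compilers that MVAPICH2 places on the PATH. A minimal sketch, assuming a C source file mvapichDEMO.c (the source file name is hypothetical; the resulting binary matches the mvapichDEMO program used in the examples below):
[username@login01 ~]$ module add gcc/5.2.0
[username@login01 ~]$ module add mvapich2/2.2/gcc-5.2.0
[username@login01 ~]$ mpicc -O2 -o mvapichDEMO mvapichDEMO.c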
Interactive Mode
This example runs on a reserved node (e.g. c001):
[username@c001 ~]$ module add gcc/5.2.0
[username@c001 ~]$ module add mvapich2/2.2/gcc-5.2.0
[username@c001 ~]$ srun -n16 --mpi=pmi2 mvapichDEMO
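If you do not already have a node reserved, an interactive allocation can be requested through SLURM first. A minimal sketch, assuming the compute partition used in the batch script below (local practice for interactive sessions may differ):
[username@login01 ~]$ salloc -N 1 --ntasks-per-node 16 -p compute
Once the allocation is granted, the module and srun commands above will run the job on the allocated node.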
This can also be run with a host file; here the file is called machinefile (an example launch command is sketched below):
[username@c001 ~]$ module add mvapich2/2.2/gcc-5.2.0
The 'machinefile' is of the form:
c001
c002:2
c003:4
c004:1
- The ':2', ':4' and ':1' suffixes specify the number of processes to run on each node.
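A launch command using this host file might look as follows. This is a minimal sketch: mpiexec -f is the MPICH-style hydra launcher that MVAPICH2 provides, the program name mvapichDEMO is reused from the example above, and -n 8 matches the total process count (1+2+4+1) given in the machinefile:
[username@c001 ~]$ module add mvapich2/2.2/gcc-5.2.0
[username@c001 ~]$ mpiexec -f machinefile -n 8 ./mvapichDEMO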
Non-interactive job
This example runs as a batch job under the SLURM scheduler:
#!/bin/bash
#SBATCH -J MPI-testXX
#SBATCH -N 1
#SBATCH --ntasks-per-node 16
#SBATCH -D /home/user1/CODE_SAMPLES/MPICH
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH -p compute
#SBATCH --exclusive

echo $SLURM_JOB_NODELIST

module purge
module add gcc/5.2.0
module add mvapich2/2.2/gcc-5.2.0

export I_MPI_DEBUG=5
export I_MPI_FABRICS=shm:tmi
export I_MPI_FALLBACK=no

srun -n16 --mpi=pmi2 /home/user1/CODE_SAMPLES/OPENMPI/scatteravg 100
And submitting it to SLURM:
[username@login01 ~]$ sbatch mpidemo.job
Submitted batch job 889552
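Once submitted, the job can be monitored with the standard SLURM commands. A minimal sketch, using the job ID returned by sbatch above; the glob pattern is just one way of finding the output files written according to the %N.%j.%a pattern set in the script:
[username@login01 ~]$ squeue -j 889552
[username@login01 ~]$ ls *.889552.*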