RAxML

Application Details

  • RAxML (Randomized Axelerated Maximum Likelihood) is a tool for phylogenetic analysis and post-analysis of large phylogenies.
  • Version: 8.2.11 (MPI AVX build)
  • Licence: Open source

Usage

Interactive

While logged into a Viper login node, you can find the example alignment used below at /home/ViperAdmin/software/source/iqtree/iqtree-mpi-1.4.4-Linux/example.phy. Copy it to the location where you would like to run the steps below.
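For example, to copy it into a fresh working directory (the directory name raxml_test and the renamed file myalign.phy are only illustrations, chosen to match the commands below):

[username@login01 ~]$ mkdir -p ~/raxml_test
[username@login01 ~]$ cp /home/ViperAdmin/software/source/iqtree/iqtree-mpi-1.4.4-Linux/example.phy ~/raxml_test/myalign.phy
[username@login01 ~]$ cd ~/raxml_test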

[username@login01 ~]$ interactive
salloc: Granted job allocation 296769
Job ID 296769 connecting to c170, please wait...
Last login: Wed Jan 25 09:10:51 2017 from 10.254.5.246
[username@c170 ~]$ module add openmpi/gcc/1.10.2
[username@c170 ~]$ /trinity/clustervision/CentOS/7/apps/raxml/8.2.11/openmpi-1.10.2/gcc-4.9.3/bin/raxmlHPC-MPI-AVX -N 10 -m GTRCAT -s myalign.phy -n myalign.phy
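In this command, -N 10 requests ten independent maximum-likelihood searches, -m GTRCAT selects the GTR substitution model with per-site rate categories, -s names the input alignment, and -n sets the run name that RAxML appends to its output file names. A run like the one above therefore typically leaves files such as these in the working directory (names shown assume -n myalign.phy):

RAxML_info.myalign.phy           run log: model, settings and final likelihoods
RAxML_bestTree.myalign.phy       best-scoring ML tree across the ten searches
RAxML_result.myalign.phy.RUN.0   tree from an individual search (RUN.0 up to RUN.9)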

Batch Session

Below is how to submit the same example using the Slurm scheduler. Please replace /home/USERNAME/PATH/OF/DIR with the path of the directory where you copied the example.phy file, or with the directory containing the file you want to use.

#!/bin/bash
#SBATCH -J raxml_example    # Job name, you can change it to whatever you want
#SBATCH -o %N.%j.out        # Standard output will be written here
#SBATCH -e %N.%j.err        # Standard error will be written here
#SBATCH -n 14               # Number of MPI tasks (cores)
#SBATCH -N 1                # Number of nodes
#SBATCH -p highmem          # Slurm partition, where you want the job to be queued
#SBATCH --exclusive         # Request exclusive access to a node (all 28 cores, 128GB of RAM)

module add openmpi/gcc/1.10.2
cd /home/USERNAME/PATH/OF/DIR

# mpirun picks up the task count from the Slurm allocation
mpirun /trinity/clustervision/CentOS/7/apps/raxml/8.2.11/openmpi-1.10.2/gcc-4.9.3/bin/raxmlHPC-MPI-AVX -N 10 -m GTRCAT -s myalign.phy -n myalign.phy

The next step is to submit the above script to Slurm. For example, if you saved the script as raxml.sh:


[user@login01 ~]$ sbatch raxml.sh
Submitted batch job 409671
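You can then monitor the job with standard Slurm commands. The output and error files follow the %N.%j pattern set in the script, so their names include the node and the job ID (the file name below is illustrative):

[user@login01 ~]$ squeue -u $USER       # list your queued and running jobs
[user@login01 ~]$ cat c170.409671.out   # standard output once the job has run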

Further Information

  • https://sco.h-its.org/exelixis/web/software/raxml/cluster.html