Applications/BayesTraits


Application Details

Usage Examples

Interactive Session


[user@login01 ~]$ interactive
salloc: Granted job allocation 409670
Job ID 409670 connecting to c128, please wait...
[username@c128 ~]$ module add bayesTraits/3.0.0
[username@c128 ~]$ BayesTraitsV3 /trinity/clustervision/CentOS/7/apps/bayesTraits/3.0.0/Mammal.trees bob.tress
BayesTraits V3.0 (Mar  2 2017)
Mark Pagel and Andrew Meade
www.evolution.reading.ac.uk
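
Without a command file, BayesTraitsV3 presents an interactive menu for choosing the model and analysis options; to script a run, the same commands can be redirected from a file as in the batch example below. When finished, leaving the shell on the compute node normally returns you to the login node and releases the allocation:

[username@c128 ~]$ exit
[user@login01 ~]$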

Batch Session

#!/bin/bash

#SBATCH -J BayesTrait     # Job name
#SBATCH -N 1              # Number of nodes to use
#SBATCH -n 28             # Number of CPUs
#SBATCH -o %N.%j.%a.out   # Output file name
#SBATCH -e %N.%j.%a.err   # Error file name
#SBATCH -p compute        # Partition to run on
#SBATCH --exclusive       # Do not let SLURM run any other job on the allocated node(s)

module add bayesTraits/3.0.0
BayesTraitsV3 Mammal.trees bob.tress < command.txt


[user@login01 ~]$ sbatch bayestrait-test.job
Submitted batch job 409671
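
The job script redirects its standard input from command.txt, which holds the BayesTraits commands that would otherwise be typed at the interactive menu. The contents depend entirely on the analysis; as a rough sketch only (the model choice and MCMC settings below are placeholders, not values taken from this page), a file selecting MultiState with MCMC could be created like this:

# Example only: 1 = MultiState, 2 = MCMC; iteration and burn-in values are placeholders
cat > command.txt <<'EOF'
1
2
Iterations 1010000
Burnin 10000
Run
EOF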

Multi-Batch Jobs

The following job script runs three identical BayesTraits analyses as a SLURM job array so that their outputs can be compared. Each array task runs in its own sub-directory, and when a task finishes its log and schedule files are copied back to the submission directory.

#!/bin/bash
#SBATCH -J BayesTrait_array
#SBATCH -N 1
#SBATCH -n 28
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH -p compute
#SBATCH --exclusive
#SBATCH --array 1-3
 
module add bayesTraits/3.0/intel-2017
 
# Each array task runs in its own sub-directory, named after its task ID
RUNDIR=$(pwd)
mkdir -p $SLURM_ARRAY_TASK_ID
cd $SLURM_ARRAY_TASK_ID
 
# Update here to set command, tree and data files 
COMMAND_FILE=MCcSM_RJHPexp05_WCLCBNoZero_command.txt
TREE_FILE=FritzHARD_WCLCBNoZero.trees
DATA_FILE=MCcSM_RJHPexp05_WCLCBNoZero.txt

# Link the command, tree and data files into this task's sub-directory
ln -s ../$COMMAND_FILE .
ln -s ../$TREE_FILE .
ln -s ../$DATA_FILE .
 
BayesTraitsV3 $TREE_FILE $DATA_FILE < $COMMAND_FILE
 
cd $RUNDIR
 
# Copy this task's output files back to the submission directory as run_<task_id>_<data_file>.Log.txt and .Schedule.txt
cp $SLURM_ARRAY_TASK_ID/${DATA_FILE}.Log.txt run_${SLURM_ARRAY_TASK_ID}_${DATA_FILE}.Log.txt
cp $SLURM_ARRAY_TASK_ID/${DATA_FILE}.Schedule.txt run_${SLURM_ARRAY_TASK_ID}_${DATA_FILE}.Schedule.txt

The job is submitted in the same way as a standard batch job.
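
For example, assuming the script above has been saved as bayestrait-array.job (the filename here is only an assumption), it can be submitted with sbatch and the three array tasks monitored with squeue:

[user@login01 ~]$ sbatch bayestrait-array.job
[user@login01 ~]$ squeue -u $USER

Once all three tasks have finished, the run_1_*, run_2_* and run_3_* log and schedule files in the submission directory can be compared directly.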

Further Information