Applications/ABySS

Application Details

  • Description: ABySS is a de novo sequence assembler.
  • Version: 1.5.2 (gcc-4.9.3)
  • Modules: abyss/1.5.2/gcc-4.9.3
  • Licence: Free, open-source

Introduction

ABySS is a de novo sequence assembler intended for short paired-end reads and genomes of all sizes.

Usage Examples

Assemble a small synthetic data set


[username@login] module add abyss/1.5.2/gcc-4.9.3
[username@login] wget http://www.bcgsc.ca/platform/bioinfo/software/abyss/releases/1.3.4/test-data.tar.gz
[username@login] tar xzvf test-data.tar.gz
[username@login] abyss-pe k=25 name=test \
    in='test-data/reads1.fastq test-data/reads2.fastq'
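
The output files are prefixed with the value of name, so this run writes test-unitigs.fa and, for a paired-end assembly, test-contigs.fa as well; test-unitigs.fa is the file used in the next example. A quick way to list them, assuming the assembly finished in the current working directory:


[username@login] ls test-*.fa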

Calculate assembly contiguity statistics


[username@login] module add abyss/1.5.2/gcc-4.9.3
[username@login] abyss-fac test-unitigs.fa
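
abyss-fac accepts several FASTA files at once and prints one row of statistics per file, which makes it easy to compare assembly stages side by side. A sketch, assuming the paired-end run above also produced test-contigs.fa:


[username@login] abyss-fac test-unitigs.fa test-contigs.fa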

Parallel processing

The np option of abyss-pe specifies the number of processes to use for the parallel MPI job. This will allow you to use multiple cores on a single machine without any MPI configuration. To use multiple machines for assembly, you must create a host file for mpirun, which is described on the mpirun man page.
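
A host file is simply a list of machine names with the number of slots available on each. The sketch below assumes Open MPI's --hostfile option and the mpirun parameter of abyss-pe, which replaces the mpirun command the driver script calls; the node names and core counts are only examples, so check the mpirun and abyss-pe man pages for your installation:


# hypothetical host file listing two 40-core nodes
cat > hostfile <<EOF
c230 slots=40
c231 slots=40
EOF

# abyss-pe starts mpirun itself, so the host file is passed through its mpirun parameter
abyss-pe np=80 k=25 name=test \
    in='test-data/reads1.fastq test-data/reads2.fastq' \
    mpirun="mpirun --hostfile hostfile"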

Do not run mpirun -np 8 abyss-pe. To run ABySS with 8 threads, use abyss-pe np=8. The abyss-pe driver script will start the MPI process, like so: mpirun -np 8 ABYSS-P.

The paired-end assembly stage is multithreaded but must run on a single machine. The number of threads to use may be specified with the parameter j. The default value for j is the value of np.
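
For example, if the MPI stage runs with more processes than a single node has cores, j can be set explicitly so that the multithreaded stage stays within one node; the values below are placeholders for a 40-core node:


abyss-pe np=80 j=40 k=25 name=test \
    in='test-data/reads1.fastq test-data/reads2.fastq'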

Note: this example is run interactively on a high-memory node; access would normally be obtained through the scheduler.


[username@login01 ~]$ interactive -phighmem
salloc: Granted job allocation 3619680
Job ID 3619680 connecting to c231, please wait...
c231.vc-main

[username@c231 ~]$ module add abyss/1.5.2/gcc-4.9.3
[username@c231 ~]$ abyss-pe np=40

Through SLURM this would become the script:

#!/bin/bash

#SBATCH -J abyss
#SBATCH -p highmem
#SBATCH -N 2
#SBATCH --ntasks-per-node=40
#SBATCH -o %N.%j.%a.out
#SBATCH -e %N.%j.%a.err
#SBATCH --exclusive
#SBATCH -t 00:30:00

module purge
module add abyss/1.5.2/gcc-4.9.3

#Run your ABySS commands

abyss-pe np=80 j=40 name=test k=48 n=8 in='test-1.fa test-3.fa'
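
Here np=80 matches the 80 MPI ranks requested by the allocation (2 nodes with 40 tasks each) and j=40 keeps the multithreaded stage within one node's cores. Assuming the script above is saved as abyss.slurm (the file name is only an example), it is submitted and monitored in the usual way:


[username@login] sbatch abyss.slurm
[username@login] squeue -u $USER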

Further Information

  • https://www.bcgsc.ca/resources/software/abyss
  • https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2694472/