conda activate snakemake_rapidrun

You are ready to run the analysis!

## Download example data

The example data set can be downloaded and extracted into the `resources/test` folder with the following commands:

```
curl -JLO http://gitlab.mbb.univ-montp2.fr/edna/snakemake_rapidrun_data_test/-/raw/master/test_rapidrun_data.tar.gz
tar -xzf test_rapidrun_data.tar.gz -C resources/test/
```

* The data is downloaded into [resources/test/test_rapidrun_data](https://gitlab.mbb.univ-montp2.fr/edna/snakemake_rapidrun_swarm/-/tree/master/resources/test)

* This is a tiny subset of a real metabarcoding analysis in RAPIDRUN format

* The folder contains a reference database for four markers (Teleo01, Mamm01, Vert01, Chond01), raw NGS metabarcoding data, and the metadata required to handle demultiplexing of RAPIDRUN-format data

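Note that `tar -C` extracts into an existing directory; it does not create it. A self-contained sketch of the same extraction pattern on a throw-away archive (all file and folder names below are made up for illustration):

```shell
# Build a tiny throw-away archive, then extract it the same way the
# tutorial extracts test_rapidrun_data.tar.gz (paths are illustrative).
set -eu
workdir=$(mktemp -d)
mkdir -p "$workdir/payload"
echo "marker01" > "$workdir/payload/markers.txt"
tar -czf "$workdir/data.tar.gz" -C "$workdir" payload

# tar -C does not create the destination, hence the mkdir -p first.
mkdir -p "$workdir/resources/test"
tar -xzf "$workdir/data.tar.gz" -C "$workdir/resources/test/"
cat "$workdir/resources/test/payload/markers.txt"   # prints: marker01
rm -rf "$workdir"
```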

## Run the workflow

Simply type the following command to process the example data (estimated time: 25 minutes):

```
snakemake --configfile config/config_test_rapidrun.yaml --cores 4 --use-conda
```

* The `--configfile` argument [config/config_test_rapidrun.yaml](https://gitlab.mbb.univ-montp2.fr/edna/snakemake_rapidrun_swarm/-/blob/master/config/config_test_rapidrun.yaml) points to the file containing the parameters to apply

* The `--cores` argument (**4** here) sets the number of CPU cores the workflow is allowed to use

* The `--use-conda` flag is only needed if you installed the program and its dependencies with conda

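Rather than hard-coding **4**, the core count can be derived from the machine. A small sketch, assuming GNU coreutils `nproc` or POSIX `getconf` is available (the command is only echoed here, not executed):

```shell
# Detect the number of CPU cores (nproc on GNU systems, getconf elsewhere).
cores=$(nproc 2>/dev/null || getconf _NPROCESSORS_ONLN)
echo "snakemake --configfile config/config_test_rapidrun.yaml --cores $cores --use-conda"
```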
* This will generate OTU occurrence tables in [results/06_assignment/04_table](https://gitlab.mbb.univ-montp2.fr/edna/snakemake_rapidrun_swarm/-/tree/master/results/06_assignment/04_table)

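The occurrence tables are plain text. Assuming a tab-separated layout with a header row of sample names and one row per OTU (an assumption — check the actual files produced by your run), a quick `awk` one-liner summarises a table; the toy file below stands in for a real results file:

```shell
# Toy stand-in for a results table (layout is an assumption):
# header row = id + sample names, one row per OTU.
table=$(mktemp)
printf 'id\tsampleA\tsampleB\nOTU_1\t12\t0\nOTU_2\t3\t7\n' > "$table"

# Count samples (header fields minus the id column) and OTUs (data rows).
awk -F'\t' 'NR==1 {print "samples:", NF-1} END {print "OTUs:", NR-1}' "$table"
# prints: samples: 2
#         OTUs: 2
rm -f "$table"
```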
# To go further
Please check the [wiki](https://gitlab.mbb.univ-montp2.fr/edna/snakemake_rapidrun_swarm/-/wikis/home).
# Cluster MBB