Thank you for your answer.
1. Have you tried a rerun yet? This error usually occurs for a very small fraction of jobs and is not reproducible (a small cluster hiccup). But I’ve also seen problematic inputs be the root cause when the error is reproducible.
Yes, I tried rerunning the mapping, but I still get the errors.
- If a rerun also fails…
Are you running the most current version of the mapping tool(s)? Which? Please post back the tool names and versions for each or, better, try the latest version(s) first, then post back which were used and which failed.
I am using the tools available on the usegalaxy.org server:
- RNA STAR: Gapped-read mapper for RNA-seq data (Galaxy Version 2.7.8a+galaxy0)
- HISAT2: A fast and sensitive alignment program (Galaxy Version 2.2.1+galaxy0)
- TopHat: Gapped-read mapper for RNA-seq data (Galaxy Version 2.1.1)
This is the error I get for each tool:
Does a tool like FastQC execute properly against the fastq input datasets?
Yes, I started with a FastQC quality-control step, trimmed my adapters with the Trim Galore! tool (output format: fastqsanger), then reran FastQC/MultiQC to check whether my files were ready for mapping; every step executed properly.
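In case it helps rule out input problems: a mislabeled quality encoding (old Phred+64 data tagged as fastqsanger) is one thing QC steps can miss. Below is a minimal sketch one could run locally to inspect the quality encoding; it is plain Python, not a Galaxy tool, and the file name is a placeholder.

```python
# Minimal sketch (file name is a placeholder): report the ASCII range of
# FASTQ quality characters to check the Phred+33 ("fastqsanger") assumption.
import gzip

def quality_ascii_range(path, max_reads=10_000):
    """Return the (min, max) ASCII codes seen on quality lines."""
    opener = gzip.open if path.endswith(".gz") else open
    lo, hi = 255, 0
    with opener(path, "rt") as fh:
        for i, line in enumerate(fh):
            if i // 4 >= max_reads:
                break
            if i % 4 == 3:  # the 4th line of each FASTQ record is the quality string
                codes = [ord(c) for c in line.rstrip("\n")]
                if codes:
                    lo, hi = min(lo, min(codes)), max(hi, max(codes))
    return lo, hi

lo, hi = quality_ascii_range("trimmed_R1.fastq.gz")
# Phred+33 data usually shows a minimum near 33 ('!'); a minimum of 64
# or higher suggests old Phred+64 data, which can confuse mappers.
print(lo, hi)
```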
How are the read data organized? Single end or paired? Individual datasets, multiple datasets, or a dataset collection?
I have paired-end reads and I select multiple datasets.
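Since problematic inputs were mentioned as a possible root cause, another quick local check is whether the two files of each pair are still in sync after trimming (same number of reads, same IDs in the same order); broken pairing is a classic cause of mapper failures. A minimal sketch, with placeholder file names:

```python
# Minimal sketch (file names are placeholders): confirm two paired-end
# FASTQ files have matching read IDs in matching order.
import gzip
from itertools import islice, zip_longest

def read_ids(path):
    """Yield the read name from each 4-line FASTQ record."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt") as fh:
        for i, line in enumerate(fh):
            if i % 4 == 0:  # header line, e.g. "@READ_ID/1 description"
                yield line[1:].split()[0].split("/")[0]

def pairs_in_sync(r1, r2, sample=100_000):
    """Compare the first `sample` read IDs; zip_longest pads the shorter
    file with None, so unequal read counts also fail the check."""
    return all(a == b and a is not None
               for a, b in islice(zip_longest(read_ids(r1), read_ids(r2)), sample))

print(pairs_in_sync("trimmed_R1.fastq.gz", "trimmed_R2.fastq.gz"))
```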
Are you using a built-in index for the mapping (available in the drop-down menu on the tool form) or using a custom reference genome (fasta from the history)?
Yes, I am using a built-in reference genome (the human genome, hg38) without a built-in gene model. For the gene model, I specify a .gtf file (GRCh38) downloaded from the Ensembl database. Then I set the length of the genomic sequence around annotated junctions to ReadLength-1 and execute my job.
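One thing that may be worth checking with this combination: Ensembl and UCSC use different chromosome naming schemes. An Ensembl GRCh38 GTF names chromosomes '1', '2', ..., 'MT', while a UCSC-style hg38 build uses 'chr1', 'chr2', ..., 'chrM'; if the GTF's names do not match the built-in genome's, the mapper may reject the annotation or fail outright. A minimal sketch to list the names used in the GTF (file name is a placeholder):

```python
# Minimal sketch (file name is a placeholder): list the sequence names
# used in a GTF so they can be compared with the genome build's naming.
import gzip

def gtf_chromosomes(path):
    """Collect column 1 (seqname) from every non-comment GTF line."""
    opener = gzip.open if path.endswith(".gz") else open
    names = set()
    with opener(path, "rt") as fh:
        for line in fh:
            if not line.startswith("#"):
                names.add(line.split("\t", 1)[0])
    return names

print(sorted(gtf_chromosomes("annotation.gtf.gz")))
# Ensembl output like ['1', '2', ..., 'MT'] will not match a UCSC-style
# hg38 index, which expects 'chr1', 'chr2', ..., 'chrM'.
```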
I should mention that the tutorial I am following uses a different genome (dm6), and I had no problems with those files.
So I do not know whether I am using the wrong file formats or unsuitable tools, but I hope someone can enlighten me on this matter.