Trinity error: Remote job server indicated a problem running or monitoring this job.

Hello, I ran two Trinity jobs with paired .fastq files. I uploaded my fastq.gz files (two files of 1.6 GB each and two files of 1.9 GB each) and the format fastqsanger (fastqsanger.gz) was automatically assigned to them. Before uploading the files I checked their quality with FastQC and there was no problem: they had a Phred score of 36 and no overrepresented sequences were detected. However, Trinity ended with the following error in both jobs:
tool error. An error occurred with this dataset: Remote job server indicated a problem running or monitoring this job.
I checked the Dataset Error Report and it reported the following in both jobs: JOB (number of the job) ON 1004 CANCELLED AT 2020-10-15T21:24:56 DUE TO TIME LIMIT ***
I re-ran both jobs but they showed the same error again. I would appreciate it if you could help me solve this problem. Thanks.


Hi @juan_martinez

This error in the stderr logs means that the job is running for longer than the maximum execution time the public service can provide. There could be an input problem that you can fix – or the data really is too large to assemble at the public resource.

There are no known server issues with this tool. A test was just completed successfully yesterday for this exact same tool at this exact same server – after a similar report from another end-user (perhaps even you?).

  1. Try at least one rerun to eliminate server/cluster factors.
  2. Trinity requires both ends of a read when assembling pairs. Double-check if that is true for your read inputs.
  3. Do some more QA/QC on your reads (tools: FastQC > Trimmomatic > FastQC > MultiQC). A bonus here is that Trimmomatic will sort your reads into paired versus unpaired outputs after trimming.
  4. Consider downsampling your reads. (tool: Seqtk)
  • If the job remains too large to execute at the public server after making adjustments, setting up your own server with more resources is less complicated than many realize. Non-technical researchers around the world choose that route every day for ongoing, large, or time-sensitive projects, or simply to have more flexibility. You can certainly still keep an account on the public server to publish data publicly, etc., even when running your own private server for routine work.
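On the downsampling suggestion (step 4): the key is to subsample the forward and reverse files with the same random seed so that mates stay matched, which is what running `seqtk sample` with the same `-s` seed on both files does. Below is a minimal Python sketch of that idea using toy read names rather than real FASTQ records (the function names are hypothetical, not a Galaxy or seqtk API):

```python
import random

def subsample_indices(n_reads, fraction, seed=100):
    """Pick one subset of read positions to keep.

    Reusing the SAME seed for the forward and reverse files (as with
    `seqtk sample -s100` run on each file) selects the same positions
    in both, so mate pairs survive downsampling together.
    """
    rng = random.Random(seed)
    return [i for i in range(n_reads) if rng.random() < fraction]

def subsample_pair(r1_reads, r2_reads, fraction, seed=100):
    """Apply one shared index set to both ends of the pairs."""
    keep = subsample_indices(len(r1_reads), fraction, seed)
    return [r1_reads[i] for i in keep], [r2_reads[i] for i in keep]

# Toy example: 10 mate pairs, keep roughly half.
r1 = [f"read{i}/1" for i in range(10)]
r2 = [f"read{i}/2" for i in range(10)]
s1, s2 = subsample_pair(r1, r2, 0.5)
```

After this, position i in the downsampled forward list is still the mate of position i in the downsampled reverse list, which is the property Trinity needs.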



Thanks for sending in a bug report. This is your problem ^^ – both ends of all pairs are not present in each of the inputs.


So is it necessary to trim my files to fix the problem? Or is there something more that I can do?


This is one choice:

If you don’t want to use Trimmomatic, then you can use FASTQ interlacer followed by FASTQ de-interlacer. Only pairs that have both ends present will remain in the result.
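What the interlacer/de-interlacer combination accomplishes can be illustrated with a toy sketch (the function and read IDs below are hypothetical, not the Galaxy tools themselves): only read IDs present in both files survive as pairs, and everything else is set aside as singletons.

```python
def complete_pairs(forward_ids, reverse_ids):
    """Keep only read IDs present in BOTH files.

    This mimics the net effect of FASTQ interlacer -> de-interlacer:
    reads whose mate is missing are separated out as singletons.
    """
    common = set(forward_ids) & set(reverse_ids)
    paired = [rid for rid in forward_ids if rid in common]
    singles = [rid for rid in forward_ids + reverse_ids
               if rid not in common]
    return paired, singles

# Toy data: read2 has no mate in the reverse file, read4 none in the forward.
fwd = ["read1", "read2", "read3"]
rev = ["read1", "read3", "read4"]
paired, singles = complete_pairs(fwd, rev)
# paired  -> ["read1", "read3"]
# singles -> ["read2", "read4"]
```

Only the `paired` set should go into Trinity's forward/reverse inputs.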

Prior Q&A related to those functions is here:


If I wanted to perform a de novo assembly to obtain a reference transcriptome from 6 paired-end samples (each ~1.9 GB) uploaded as individual datasets, should I first perform sequence trimming to obtain the pairs and then concatenate the files into all right (forward) and all left (reverse) to be able to work in Trinity? Or are there any other tips or steps that I should consider or follow?
Again, thank you very much for the help you have given me.


You can do that (usually)


Set the option at the top of the form Are you pooling sequence datasets? to Yes. Make sure that matched pairs are added in the same order. You could even put everything into a paired dataset collection and run that as pooled.

Search this forum with the keyword “collection” if you are not sure how to use them. Collections are optional but a good way to organize multiple datasets.

The GTN tutorials also have many examples of dataset collection creation and usage.

Example creating a collection during data Upload:

Example creating a collection after Upload:


Jennifer, thank you very much for helping me solve these doubts and problems. I really appreciate the time you have taken to answer all my questions so quickly. I hope you have a nice day.
