Large job running on Salmon quant

I am attempting to run Salmon quant on a large data set (100 sets of paired-end reads) with a decoy-aware index. The job has been gray (queued) for 4+ days now, and I am wondering if there is just a long queue, or if I should split this into multiple smaller quantification runs over fewer sets.

Hi @wad

If you are running the pairs inside a collection, the individual jobs will run the same as if run by themselves. Using a collection just simplifies how you launch the jobs (in a batch). Everything is queued at once, then will run when resources free up.

Right under where you input the short reads, there is this message in blue text on the tool form. Whenever you see it, the jobs are not pooled; each dataset is run individually.

This is a batch mode input field. Individual jobs will be triggered for each dataset.
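For context, the Galaxy tool wraps the standard Salmon command line, and each collection element triggers its own independent run. A rough sketch of what one element's job looks like outside Galaxy is below; the file names, thread count, and output directory are placeholders, not values from your history:

```shell
# Build a decoy-aware index once (gentrome = transcriptome + genome decoy
# sequences, decoys.txt = list of decoy sequence names). Placeholder paths.
salmon index \
    -t gentrome.fa.gz \
    -d decoys.txt \
    -i salmon_index \
    -k 31

# One quantification per read pair. With a 100-pair collection, Galaxy
# queues 100 independent jobs like this one, each sized like a single run.
salmon quant \
    -i salmon_index \
    -l A \
    -1 sample_R1.fastq.gz \
    -2 sample_R2.fastq.gz \
    -p 8 \
    -o quant_sample
```

Because each job is this size (one read pair, not all 100), splitting the collection into smaller batches would not make any individual job lighter; it only changes how many are queued at once.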

If you click into the output collection, you may be able to see which jobs have completed, which are queued, and which are still running or have failed. FAQ: Understanding job statuses

If any happen to fail, you can rerun just those jobs and replace the results back into the original output collection(s). Use the “rerun” button and check the extra box on the tool form. FAQ: Different dataset icons and their usage

I hope this helps. Please ask more questions if anything is not clear. :slight_smile:

This helped a lot, thank you! I just need to wait until resources free up to see if I set it up correctly.

Hi @wad

Hope all this works out. This FAQ has some help if you do have some troubles: FAQ: Extended Help for Differential Expression Analysis Tools