Jobs paused and web interface (localhost:8080) does not respond.

Hi,

I’m quite new to running Galaxy locally (but have used galaxy.org and galaxy.eu for years), so I might be unaware of some basics.
There seem to be several issues, although the console does not show any error messages.

Some background:

  • I ran a workflow (Fasta Split, Bowtie2, ivar variants, … VCFcombine) on a dataset collection of 10,000 FASTA files (one 4 kb sequence per file). It seemed to run fine for three days, until tonight, when it got stuck before the last step, VCFcombine.
  • The entire workflow was tested on a collection of 20 files and worked perfectly.
  • At approximately the same time my computer’s WiFi connection broke; it has since been repaired (I don’t know whether this is related to the Galaxy issue).
  • My history (450 MB used) shows the last step (VCFcombine) paused, and the two previous steps finished in green.
  • I have Galaxy installed in a virtual machine (VirtualBox) with 20 GB of RAM and 23 GB of disk space left.

The issue:

  • Every time I load the localhost:8080 interface and click any button, it gets stuck for a while.
  • I tried to download a whole dataset collection from the interface, but nothing was produced.
  • When I look at the local folder database/jobs_directory/, there are several folders, but they are ALL empty.
  • When I resume the paused job from the history menu, the message says:

“was paused before the job started, Input dataset ‘Variants (VCF)’ was paused before the job started, Input dataset ‘Variants (VCF)’ was paused before the job started. To resume this job fix the input dataset(s).”

How could I resume the last job or fix the input dataset(s)?
I would also like to download the dataset collection, in case I have to do the last step on the command line. Anything that saves the work already done.

I would appreciate very much any help and advice.

Cheers,
Ana


Hi @AValero, this means that one of your input jobs has failed. Go to “User → Workflow Invocations” and expand your last workflow run. Then expand the step before the last one and find the job that failed.


Find the job ID and copy-paste the following into your browser:
<your_galaxy_url>/root?job_id=<the_job_id_you_just_found_there>

So in my example that is http://127.0.0.1:8000/root?job_id=61da110c947ecb3c
You can then run the job again, and if you select “Yes” for “Resume dependencies for this job”, the remaining jobs should continue.
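
If the interface is sluggish, you can also inspect the job through the API. This is just a sketch: the job ID below is the one from my example above, and YOUR_API_KEY is a placeholder for the key from “User → Preferences → Manage API Key”. The returned JSON includes the job state and exit code.

$ # ask the Galaxy API for the details of a single job (ID and key are placeholders)
$ curl 'http://127.0.0.1:8000/api/jobs/61da110c947ecb3c?key=YOUR_API_KEY'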

Open the dataset collection you want to download. This may take a moment if the collection is very large. Then click on the download symbol.

I hope that helps.


Hi @mvdbeek

I finally managed to get the job ID, but found out that there is a different one for each of the 10,000 datasets. The history only loads the first 1,000, which show no errors. When I try to access a job ID from the workflow invocation, it fails: the connection (or pipe) breaks before I get an answer.

Similarly, when I try to download the dataset collection from the last step that worked, the connection fails before the download completes.

  • Since I’m working locally, is there anywhere within my local files and folders where this data is stored?
  • Could changing the max-wait/timeout setting in some config file fix the broken pipes and let the download finish?
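
Edit, for anyone searching later: I believe a default local Galaxy writes the actual dataset files under its database/ directory (the location is governed by the file_path option in config/galaxy.yml; database/files/ is the classic default, and newer releases may use database/objects/ instead). Assuming that default layout, something like this lists the stored datasets:

$ # list the raw dataset files Galaxy has written to disk (default layout assumed)
$ find database/files -name '*.dat' | head
$ # the .dat files are the datasets themselves; VCFs start with a ##fileformat header
$ grep -rl '^##fileformat=VCF' database/files | head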

Well, I finally found a workaround.
I managed to download the dataset collection with the 10,000 VCF files that was producing the error, using the command line and the dataset ID copied from the history panel of the localhost:8080 server.

$ wget --no-check-certificate 'http://localhost:8080/api/dataset_collections/MY_DATASET_ID/download?key=MY_API_KEY' -O dataset.txt
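
If the connection breaks partway through, wget can usually resume the same download instead of starting over (assuming the server honours range requests); just add the -c flag:

$ # resume a partially downloaded file where it left off
$ wget -c --no-check-certificate 'http://localhost:8080/api/dataset_collections/MY_DATASET_ID/download?key=MY_API_KEY' -O dataset.txt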

Due to the huge number of datasets (10,000), it was impossible to download from localhost:8080 by clicking on the “floppy disk” icon in the history: the waiting time was too long and the pipe broke.

I also found out that twelve datasets were missing, probably because they did not pass some filter in a previous step, so the whole workflow couldn’t complete. Now I’m trying to run VCFcombine directly on a newly uploaded dataset collection.
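
For that command-line step, the plan looks roughly like this (a sketch, assuming the downloaded archive is a zip and that vcflib’s vcfcombine is on the PATH; file names are placeholders):

$ # unpack the collection archive downloaded above
$ unzip dataset.txt -d vcfs/
$ # merge all the VCFs into a single file with vcflib's vcfcombine
$ vcfcombine vcfs/*.vcf > combined.vcf

(With 10,000 files, the shell glob may hit the argument-length limit on some systems.)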
Thanks!


Reference FAQ: Downloading Data