Workflow submission failed

Hello!

I am trying to install Galaxy and our workflow on another server. On the old server this workflow works like a charm. However, when I exported the workflow, imported it into the new server and ran it, it said:

Workflow submission failed

The server could not complete the request. Please contact the Galaxy Team if this error persists.

{
    "new_history_name": "h1902_he4722",
    "history_id": null,
    "resource_params": {},
    "replacement_params": {},
    "parameters": {
        "0": {
            "sample": "h1902_he4722",
            "left": {
                "values": [
                    {
                        "hid": 1,
                        "id": "f2db41e1fa331b3e",
                        "keep": false,
..
..

and so on: a long enumeration of the parameters of all 55 steps, but nothing about the actual error. And no task was queued.

By deleting steps one by one I found that the root of all evil is the step “AddOrReplaceReadGroup tumor transcriptome”, which is described as follows:

    "16": {
        "annotation": "",
        "content_id": "toolshed.g2.bx.psu.edu/repos/devteam/picard/picard_AddOrReplaceReadGroups/2.18.2.1",
        "errors": null,
        "id": 16,
        "input_connections": {
            "inputFile": {
                "id": 13,
                "output_name": "outFile"
            }
        },
        "inputs": [
            {
                "description": "runtime parameter for tool AddOrReplaceReadGroups",
                "name": "read_group_id_conditional"
            }
        ],
        "label": "AddOrReplaceReadGroup  tumor transcriptome",
        "name": "AddOrReplaceReadGroups",
        "outputs": [
            {
                "name": "outFile",
                "type": "bam"
            }
        ],
        "position": {
            "bottom": 678.1381988525391,
            "height": 163.7804718017578,
            "left": -710.7962646484375,
            "right": -576.7962493896484,
            "top": 514.3577270507812,
            "width": 134.00001525878906,
            "x": -710.7962646484375,
            "y": 514.3577270507812
        },
        "post_job_actions": {},
        "tool_id": "toolshed.g2.bx.psu.edu/repos/devteam/picard/picard_AddOrReplaceReadGroups/2.18.2.1",
        "tool_shed_repository": {
            "changeset_revision": "9ffcddf6f9c0",
            "name": "picard",
            "owner": "devteam",
            "tool_shed": "toolshed.g2.bx.psu.edu"
        },
        "tool_state": "{\"CN\": \"\", \"DS\": \"\", \"DT\": \"\", \"PI\": \"\", \"PL\": \"ILLUMINA\", \"PU\": \"run\", \"inputFile\": {\"__class__\": \"ConnectedValue\"}, \"read_group_id_conditional\": {\"do_auto_name\": \"false\", \"__current_case__\": 1, \"ID\": {\"__class__\": \"RuntimeValue\"}}, \"read_group_lb_conditional\": {\"do_auto_name\": \"false\", \"__current_case__\": 1, \"LB\": \"LIB1\"}, \"read_group_sm_conditional\": {\"do_auto_name\": \"false\", \"__current_case__\": 1, \"SM\": \"\"}, \"validation_stringency\": \"LENIENT\", \"__page__\": null, \"__rerun_remap_job_id__\": null}",
        "tool_version": null,
        "type": "tool",
        "uuid": "507b3f9d-a7a3-45bc-9251-161d66e647b4",
        "workflow_outputs": [
            {
                "label": "AddOrReplaceReadGroups on input dataset(s): BAM with replaced/modified readgroups",
                "output_name": "outFile",
                "uuid": "e9fe451f-226d-4f8c-8527-38b5804d15bb"
            }
        ]
    },

I don’t think it is caused by the absence of the Picard tools (which had been installed via the admin panel, though Galaxy cannot run them anyway, but that is another question), because other Picard tool invocations do get queued successfully and then fail with a “picard: command not found” error in the stderr output.

On the other hand, I changed absolutely nothing when exporting and importing the workflow, and both Galaxy instances are 20.09, so it should not be a version mismatch.

Can you please tell me what is going on, and how I can defeat this error?

Thanks in advance.

I installed the Picard tools, but the error is still there. Moreover, I can execute this particular step on its own (successfully!), and the workflow without this step, but I cannot execute the whole workflow!

Hi @wormball, I think you were unlucky enough to update to a commit that was broken on release 21.01. If you update to the latest commit of release 21.01 you should be able to replace this step. Importantly, do not copy the step in the workflow editor, but add picard_AddOrReplaceReadGroups from the left-hand side and set the parameters as necessary.

Thanks, but I am using release 20.09 on both servers!

Has the workflow ever seen a 21.01 server? If you can download the workflow and put it somewhere, I can fix it for you; or you can do this yourself by removing

        "inputs": [
            {
                "description": "runtime parameter for tool AddOrReplaceReadGroups",
                "name": "read_group_id_conditional"
            }
        ],

and the reference to read_group_id_conditional in this step’s tool_state entry (or by following the instructions I posted earlier).
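
If you prefer to script that edit, here is a rough sketch that should work on the exported .ga file (untested; the filenames are placeholders and it assumes the step index is "16" as shown above; .ga files are plain JSON, and tool_state is itself a JSON-encoded string):

    import json

    # Load the exported workflow (.ga files are plain JSON).
    with open("workflow.ga") as fh:
        wf = json.load(fh)

    step = wf["steps"]["16"]  # the AddOrReplaceReadGroups step shown above

    # Drop the "inputs" entry that declares read_group_id_conditional
    # as a runtime parameter.
    step["inputs"] = [
        i for i in step.get("inputs", [])
        if i.get("name") != "read_group_id_conditional"
    ]

    # Remove the RuntimeValue reference from the tool state, as suggested above.
    state = json.loads(step["tool_state"])
    state.pop("read_group_id_conditional", None)
    step["tool_state"] = json.dumps(state)

    with open("workflow.fixed.ga", "w") as fh:
        json.dump(wf, fh, indent=4)

Then re-import the resulting workflow.fixed.ga and check the step in the editor before running.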

I found that if I specify the read group ID and SM it works, and if I do not specify them it does not work, in both cases regardless of the suggested edits. However, when I first encountered this error, I had specified those fields and still got the error.

I got this error on the old Galaxy too. Galaxy’s Picard 2.18.2 gave me errors, so I uninstalled it on the “manage dependencies” tab and installed 2.20.5 outside of Galaxy. After that, when I run Picard via “run again” in some history, it runs perfectly, but when I invoke the whole workflow, I get the aforementioned error.

It magically worked again (on the old server), but I am not sure why, and I am not sure it will work tomorrow.

When I upload my files and invoke the workflow, it tells me in some of the file selection controls:

the previously selected dataset has been deleted

even though I have not deleted anything in the current history, and it says “Workflow submission failed” if I press “Run”. However, if I “copy” the whole history and invoke the workflow on the “copied” history, it works fine.

I also made an automated workflow as advised here: Workflow automation? - #5 by wormball (a sketch of invoking a workflow via the API is shown further below). It began to schedule, but only 7 of 60 steps were scheduled, and then it said:

Invocation scheduling failed - Galaxy administrator may have additional details in logs.

But when I looked at “View Error Logs” in the admin panel, I saw “No errors available.”, so the previous “Workflow submission failed” errors also left no trace in the logs.
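
For reference, such automated invocations go through the Galaxy workflow API. A minimal BioBlend sketch of that kind of call (with placeholder URL, API key and IDs; not necessarily the exact script from the linked thread) looks like this:

    from bioblend.galaxy import GalaxyInstance

    # Placeholder URL and API key; substitute your own.
    gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

    # Map each workflow input (by its step index) to an existing history dataset.
    inputs = {
        "0": {"src": "hda", "id": "DATASET_ID"},
    }

    invocation = gi.workflows.invoke_workflow(
        "WORKFLOW_ID",
        inputs=inputs,
        history_name="h1902_he4722",  # a new history is created with this name
    )
    print(invocation["id"], invocation.get("state"))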

Then I looked at galaxy.log and found plenty of output there. Here are all the error entries for March and April:

galaxy.web.framework.decorators ERROR 2021-03-18 18:16:54,941 [p:484294,w:1,m:0] [uWSGIWorker1Core0] Uncaught exception in exposed API method:
galaxy.jobs DEBUG 2021-03-18 19:05:44,300 [p:484294,w:1,m:0] [LocalRunner.work_thread-2] (1042) setting dataset 1154 state to ERROR
galaxy.jobs DEBUG 2021-03-18 19:05:44,327 [p:484294,w:1,m:0] [LocalRunner.work_thread-0] (1040) setting dataset 1152 state to ERROR
galaxy.jobs DEBUG 2021-03-18 19:07:10,039 [p:484294,w:1,m:0] [LocalRunner.work_thread-0] (1048) setting dataset 1160 state to ERROR
galaxy.jobs.runners ERROR 2021-03-18 19:19:51,740 [p:484294,w:1,m:0] [LocalRunner.work_thread-2] (1049/) Job wrapper finish method failed
galaxy.jobs DEBUG 2021-03-19 20:46:51,586 [p:484294,w:1,m:0] [LocalRunner.work_thread-1] (1105) setting dataset 1224 state to ERROR
galaxy.jobs DEBUG 2021-03-19 23:21:16,758 [p:484294,w:1,m:0] [LocalRunner.work_thread-0] (1111) setting dataset 1234 state to ERROR
galaxy.jobs DEBUG 2021-03-19 23:21:16,766 [p:484294,w:1,m:0] [LocalRunner.work_thread-1] (1057) setting dataset 1172 state to ERROR
galaxy.jobs DEBUG 2021-03-23 14:23:12,842 [p:484294,w:1,m:0] [LocalRunner.work_thread-3] (1132) setting dataset 1257 state to ERROR
galaxy.jobs ERROR 2021-03-23 14:32:17,280 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1043
galaxy.jobs ERROR 2021-03-23 14:32:17,381 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1045
galaxy.jobs ERROR 2021-03-23 14:32:22,933 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1069
galaxy.jobs ERROR 2021-03-23 14:32:23,030 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1120
galaxy.jobs ERROR 2021-03-23 14:32:24,704 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1063
galaxy.jobs ERROR 2021-03-23 14:32:24,850 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1113
galaxy.jobs ERROR 2021-03-23 14:32:25,019 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1115
galaxy.jobs ERROR 2021-03-23 14:32:26,295 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1078
galaxy.jobs ERROR 2021-03-23 14:32:26,630 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1122
galaxy.jobs ERROR 2021-03-23 14:32:27,064 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1123
galaxy.jobs ERROR 2021-03-23 14:32:27,238 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1125
galaxy.jobs ERROR 2021-03-23 14:32:27,419 [p:484294,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1126
galaxy.web.framework.decorators ERROR 2021-03-25 15:22:15,094 [p:44177,w:1,m:0] [uWSGIWorker1Core3] Uncaught exception in exposed API method:
galaxy.web.framework.decorators ERROR 2021-03-25 15:22:23,517 [p:44177,w:1,m:0] [uWSGIWorker1Core0] Uncaught exception in exposed API method:
galaxy.web.framework.decorators ERROR 2021-03-25 15:22:23,805 [p:44177,w:1,m:0] [uWSGIWorker1Core1] Uncaught exception in exposed API method:
galaxy.jobs DEBUG 2021-03-26 19:13:47,828 [p:180146,w:1,m:0] [LocalRunner.work_thread-3] (1208) setting dataset 1339 state to ERROR
galaxy.jobs DEBUG 2021-03-26 19:21:21,702 [p:180146,w:1,m:0] [LocalRunner.work_thread-2] (1211) setting dataset 1342 state to ERROR
galaxy.jobs DEBUG 2021-04-01 15:51:37,634 [p:180146,w:1,m:0] [LocalRunner.work_thread-3] (1440) setting dataset 1598 state to ERROR
galaxy.jobs DEBUG 2021-04-06 13:45:12,989 [p:180146,w:1,m:0] [LocalRunner.work_thread-3] (1590) setting dataset 1763 state to ERROR
galaxy.jobs DEBUG 2021-04-06 13:58:24,889 [p:180146,w:1,m:0] [LocalRunner.work_thread-0] (1592) setting dataset 1765 state to ERROR
galaxy.jobs DEBUG 2021-04-06 19:19:32,792 [p:180146,w:1,m:0] [LocalRunner.work_thread-3] (1594) setting dataset 1767 state to ERROR
galaxy.jobs.runners ERROR 2021-04-06 19:19:36,931 [p:180146,w:1,m:0] [LocalRunner.work_thread-3] (1597/) Job wrapper finish method failed
galaxy.jobs DEBUG 2021-04-06 19:19:41,142 [p:180146,w:1,m:0] [LocalRunner.work_thread-3] (1598) setting dataset 1772 state to ERROR
galaxy.jobs ERROR 2021-04-06 19:19:41,958 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1601
galaxy.jobs ERROR 2021-04-06 19:19:44,578 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1615
galaxy.jobs ERROR 2021-04-06 19:19:45,945 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1627
galaxy.jobs ERROR 2021-04-06 19:19:47,349 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1637
galaxy.jobs ERROR 2021-04-06 19:19:47,772 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1638
galaxy.tools.toolbox.base ERROR 2021-04-06 19:33:57,892 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-06 19:33:57,893 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-06 19:33:57,894 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-06 19:33:57,895 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'melanoma_tools/scripts_hla_la.xml', '[Errno 2] No such file or directory')
galaxy.tools.toolbox.base ERROR 2021-04-06 19:33:57,902 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-06 19:33:57,903 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-06 19:33:57,904 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-07 11:52:47,046 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-07 11:52:47,048 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-07 11:52:47,049 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-07 11:52:47,049 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'melanoma_tools/scripts_hla_la.xml', '[Errno 2] No such file or directory')
galaxy.tools.toolbox.base ERROR 2021-04-07 11:52:47,056 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-07 11:52:47,057 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-07 11:52:47,058 [p:180146,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.jobs ERROR 2021-04-07 13:04:59,058 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1648
galaxy.jobs ERROR 2021-04-07 13:04:59,275 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1649
galaxy.jobs ERROR 2021-04-07 13:04:59,496 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1650
galaxy.jobs ERROR 2021-04-07 13:05:00,754 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1659
galaxy.jobs ERROR 2021-04-07 13:05:01,143 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1660
galaxy.jobs ERROR 2021-04-07 13:05:01,416 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1661
galaxy.jobs ERROR 2021-04-07 13:05:02,765 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1672
galaxy.jobs ERROR 2021-04-07 13:05:02,974 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1673
galaxy.jobs DEBUG 2021-04-07 13:07:00,262 [p:180146,w:1,m:0] [LocalRunner.work_thread-0] (1683) setting dataset 1867 state to ERROR
galaxy.jobs DEBUG 2021-04-07 13:44:40,904 [p:180146,w:1,m:0] [LocalRunner.work_thread-2] (1721) setting dataset 1910 state to ERROR
galaxy.jobs DEBUG 2021-04-07 13:44:41,303 [p:180146,w:1,m:0] [LocalRunner.work_thread-3] (1722) setting dataset 1911 state to ERROR
galaxy.jobs ERROR 2021-04-07 13:44:51,501 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1729
galaxy.jobs ERROR 2021-04-07 13:44:51,709 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1730
galaxy.jobs ERROR 2021-04-07 13:44:53,034 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1736
galaxy.jobs ERROR 2021-04-07 13:44:54,297 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1748
galaxy.jobs ERROR 2021-04-07 13:44:55,726 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1756
galaxy.jobs ERROR 2021-04-07 13:44:56,257 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1757
galaxy.jobs ERROR 2021-04-07 13:44:57,795 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1761
galaxy.jobs ERROR 2021-04-07 13:44:58,296 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1762
galaxy.jobs ERROR 2021-04-07 13:44:58,882 [p:180146,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1763
galaxy.jobs DEBUG 2021-04-07 14:57:09,115 [p:310870,w:1,m:0] [LocalRunner.work_thread-1] (1770) setting dataset 1965 state to ERROR
galaxy.jobs.runners ERROR 2021-04-07 15:10:17,426 [p:310870,w:1,m:0] [LocalRunner.work_thread-3] (1772/) Job wrapper finish method failed
galaxy.jobs DEBUG 2021-04-07 15:33:59,106 [p:310870,w:1,m:0] [LocalRunner.work_thread-0] (1773) setting dataset 1970 state to ERROR
ERROR conda.core.link:_execute(568): An error occurred while installing package 'conda-forge::_libgcc_mutex-0.1-conda_forge'.
ERROR conda.core.link:_execute(568): An error occurred while installing package 'conda-forge::ncurses-6.2-h58526e2_4'.
ERROR conda.core.link:_execute(568): An error occurred while installing package 'conda-forge::_libgcc_mutex-0.1-conda_forge'.
galaxy.jobs ERROR 2021-04-08 15:53:19,449 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1781
galaxy.jobs.runners ERROR 2021-04-08 15:53:20,554 [p:315295,w:1,m:0] [LocalRunner.work_thread-1] (1780) Failure preparing job
galaxy.jobs ERROR 2021-04-08 15:53:21,096 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1782
galaxy.jobs ERROR 2021-04-08 15:53:21,620 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1783
galaxy.jobs ERROR 2021-04-08 15:53:22,200 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1784
galaxy.jobs ERROR 2021-04-08 15:53:23,566 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1789
galaxy.jobs ERROR 2021-04-08 15:53:23,841 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1790
galaxy.jobs ERROR 2021-04-08 15:53:25,111 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1800
galaxy.jobs ERROR 2021-04-08 15:53:25,339 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1801
galaxy.jobs ERROR 2021-04-08 15:53:25,577 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1802
galaxy.jobs ERROR 2021-04-08 15:53:27,214 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1807
galaxy.jobs ERROR 2021-04-08 15:53:27,455 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1808
galaxy.jobs ERROR 2021-04-08 15:53:28,858 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1817
galaxy.jobs ERROR 2021-04-08 15:53:29,107 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1818
galaxy.jobs ERROR 2021-04-08 16:12:44,256 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1886
galaxy.jobs ERROR 2021-04-08 16:12:45,127 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1887
galaxy.jobs ERROR 2021-04-08 16:12:47,221 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1888
galaxy.jobs ERROR 2021-04-08 16:12:47,971 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1889
galaxy.jobs ERROR 2021-04-08 16:12:50,675 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1837
galaxy.jobs ERROR 2021-04-08 16:12:51,952 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1838
galaxy.jobs ERROR 2021-04-08 16:12:52,820 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1839
galaxy.jobs ERROR 2021-04-08 16:12:55,946 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1899
galaxy.jobs ERROR 2021-04-08 16:13:00,542 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1850
galaxy.jobs ERROR 2021-04-08 16:13:01,122 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1851
galaxy.jobs ERROR 2021-04-08 16:13:01,635 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1907
galaxy.jobs ERROR 2021-04-08 16:13:03,460 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1858
galaxy.jobs ERROR 2021-04-08 16:13:03,886 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1859
galaxy.jobs ERROR 2021-04-08 16:13:05,793 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1870
galaxy.jobs ERROR 2021-04-08 16:13:06,038 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1871
galaxy.jobs ERROR 2021-04-08 16:13:06,241 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1872
galaxy.jobs ERROR 2021-04-08 16:13:06,518 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1873
galaxy.jobs ERROR 2021-04-08 16:13:10,954 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1925
galaxy.jobs ERROR 2021-04-08 16:13:12,082 [p:315295,w:1,m:0] [JobHandlerQueue.monitor_thread] Unable to cleanup job 1927
galaxy.tools.toolbox.base ERROR 2021-04-08 18:00:29,910 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:00:29,912 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:00:29,913 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:00:29,914 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'melanoma_tools/scripts_hla_la.xml', '[Errno 2] No such file or directory')
galaxy.tools.toolbox.base ERROR 2021-04-08 18:00:29,925 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:00:29,940 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:00:29,942 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:12,706 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:12,708 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:12,709 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:12,710 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'melanoma_tools/scripts_hla_la.xml', '[Errno 2] No such file or directory')
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:12,733 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:12,734 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:12,736 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:32,510 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:32,511 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:32,512 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:32,513 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'melanoma_tools/scripts_hla_la.xml', '[Errno 2] No such file or directory')
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:32,526 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:32,528 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-08 18:05:32,532 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 13:23:00,375 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 13:23:00,376 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 13:23:00,378 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 13:23:00,378 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'melanoma_tools/scripts_hla_la.xml', '[Errno 2] No such file or directory')
galaxy.tools.toolbox.base ERROR 2021-04-09 13:23:00,385 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 13:23:00,386 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 13:23:00,387 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.workflow.run ERROR 2021-04-09 15:48:56,907 [p:315295,w:1,m:0] [WorkflowRequestMonitor.monitor_thread] Failed to execute scheduled workflow.
galaxy.tools.toolbox.base ERROR 2021-04-09 15:55:26,230 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 15:55:26,231 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 15:55:26,232 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 15:55:26,233 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'melanoma_tools/scripts_hla_la.xml', '[Errno 2] No such file or directory')
galaxy.tools.toolbox.base ERROR 2021-04-09 15:55:26,253 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/codingSnps.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/codingSnps.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 15:55:26,254 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'evolution/add_scores.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/add_scores.loc'")
galaxy.tools.toolbox.base ERROR 2021-04-09 15:55:26,256 [p:315295,w:1,m:0] [Thread-1] ("Error reading tool configuration file from path '%s': %s", 'phenotype_association/sift.xml', "[Errno 2] No such file or directory: '/data/galaxy/tool-data/sift_db.loc'")
galaxy.jobs DEBUG 2021-04-09 16:02:21,691 [p:315295,w:1,m:0] [LocalRunner.work_thread-1] (2005) setting dataset 2226 state to ERROR
galaxy.workflow.run ERROR 2021-04-09 16:19:44,894 [p:315295,w:1,m:0] [WorkflowRequestMonitor.monitor_thread] Failed to execute scheduled workflow.

It seems like:

  • When I invoke a workflow with tool versions that differ from the installed ones, I get the yellow warning “Some tools are being executed with different versions compared to those available when this workflow was last saved because the other versions are not or no longer available on this Galaxy instance. To upgrade your workflow and dismiss this message simply edit the workflow and re-save it.”, and then the aforementioned error.

  • Once I get this error, every subsequent workflow invocation on the same history results in the same error, even if I save the workflow again so that it contains the correct tool versions.

  • However, if I “copy” the history, I have a good chance of scheduling the workflow correctly from that copy.

  • But if I, for example, schedule the workflow with wrong parameters, purge that history and then invoke the workflow again, or schedule two workflows in a short time, it can result in either the same error or “An error occurred while updating information with the server. Please contact a Galaxy administrator if the problem persists. Ok”. The latter error also manifests as partial scheduling of the last workflow (e.g. 16 of 60 steps scheduled); which steps actually got scheduled can be checked via the API (see the sketch below).
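
To see which steps of an invocation actually got scheduled, the invocation itself can be inspected through the API. A rough BioBlend sketch (placeholder IDs; it assumes a recent BioBlend with the invocations client, and the step field names may vary slightly between Galaxy releases):

    from bioblend.galaxy import GalaxyInstance

    gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

    # "INVOCATION_ID" is a placeholder; invoke_workflow() returns it,
    # and gi.invocations.get_invocations() lists recent invocations.
    inv = gi.invocations.show_invocation("INVOCATION_ID")
    print("invocation state:", inv.get("state"))
    for step in inv.get("steps", []):
        print(step.get("order_index"), step.get("workflow_step_label"), step.get("state"))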