Downloading histories with curl or wget: the export link only downloads an HTML file

Hi team,

I’m trying to download my Galaxy histories to free up room for more analysis. However, when I select ‘Export History to File’ and use the resulting link with wget or curl (on macOS; I’ve tried both), it only downloads an HTML file instead of a tar.gz archive of the datasets in the history.
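
For reference, this is roughly what I’m running (the server name is a placeholder here and the archive ID is redacted):

    # link copied from the 'Export History to File' page
    wget 'https://<my-galaxy-server>/history/export_archive?id=XXXXXXXXXX'
    # or
    curl -O 'https://<my-galaxy-server>/history/export_archive?id=XXXXXXXXXX'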

The file looks like this:
    ==> export_archive?id=XXXXXXXXXX <==
    <!DOCTYPE HTML>
    <html>
    <!--base.mako-->

Hope that this makes sense,

Best,
Tim.

admin edit: removed archive ID

Hello,

Try downloading the archive directly from the web page. curl/wget will only work for individual datasets, and datasets inside collections additionally require that your Galaxy API key is included.
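
For a plain dataset, something like this should work (the URL below is just a placeholder for whatever link you copy from the dataset’s disc icon):

    # download a single dataset using the link copied from its disc icon
    wget '<link-copied-from-the-disc-icon>'
    # or, keeping the filename the server suggests
    curl -O -J '<link-copied-from-the-disc-icon>'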

Reference FAQ for anyone else reading: Downloading Data

Thanks!

Hi Jen,

I would really like to use wget or another terminal-based command, since the downloads can sometimes be large and would take too long in the browser. Is it not possible to do this from the command line? I don’t understand the API instructions in the FAQ.

Best,
Tim.

I agree this is a very good idea, and so do the developers; it just hasn’t been implemented yet.

Open ticket: https://github.com/galaxyproject/galaxy/issues/2968

Please feel free to comment there, too. I linked your post in the ticket. More community feedback can sometimes help bump an enhancement idea up in priority.

For the API instructions: those are for downloading collection datasets. Under User > Preferences you can generate an API key for your account, which is then added to the end of the dataset link (copied from the “disc icon”). But this won’t work for entire histories, just individual datasets.
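
As a rough sketch (the link and key below are placeholders, and I’m assuming the key is passed as a `key=` parameter; check the Downloading Data FAQ for the exact form, and use `&` instead of `?` if the copied link already contains a query string):

    # append the API key generated under User > Preferences to the dataset link
    wget '<link-copied-from-the-disc-icon>?key=<your-api-key>'
    # or
    curl -O -J '<link-copied-from-the-disc-icon>?key=<your-api-key>'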

Great, thank you for your help, Jen! 🙂
