Downloading Large Files - 1Gb Limit (Galaxy Main | wget | curl | API Key)

@KEELE Yes, the data is large. The connection dropped for me too.

I remembered that this has come up before. See this prior Q&A on how to resume interrupted downloads with curl or wget, and try whether one of those works for you.
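As a quick sketch of those resume flags (the dataset URL below is a placeholder; use the actual download link from your history):

```shell
# Both tools can pick up where a dropped connection left off, as long as
# you keep writing to the same output file. Placeholder URL - replace it
# with the link copied from your Galaxy dataset.
URL='https://usegalaxy.org/datasets/DATASET_ID/display?to_ext=fastq.gz'

# wget: -c / --continue resumes a partial file of the same name
wget -c -O mydata.fastq.gz "$URL"

# curl: -C - tells curl to work out the resume offset from the existing file
curl -C - -o mydata.fastq.gz "$URL"
```

Re-running either command after an interruption appends only the missing bytes instead of starting over, provided the server supports HTTP range requests.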

Your other option is to create a History archive and download that. Once uncompressed, you’ll find your data inside it. Tip: Copy just the data you need into a new history to make the archive smaller and faster to process. Copies of datasets you already have in your account do not consume any additional quota space. I’m guessing that you don’t need everything in the original history, just the results, but either way should work; it just takes longer to create, then download or import into another Galaxy server, a really large history archive.

  • The option to create and download/generate a link to a History archive is under the History menu (gear icon).
  • The option to import a History archive (an already-downloaded archive file, or a URL from a publicly accessible Galaxy server) is at the top of the Saved History page, if you have a local Galaxy and want to store data in context.
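If you have a Galaxy API key, the generated archive link can also be fetched from the command line. The endpoint path and key below are placeholders, not the exact URL Galaxy will give you; use the link shown after the archive is generated:

```shell
# Placeholders: substitute your own API key and the archive link
# that Galaxy displays once the export finishes.
GALAXY_KEY='YOUR_API_KEY'
ARCHIVE_URL='https://usegalaxy.org/PATH_FROM_GENERATED_LINK'

# Galaxy accepts the API key in the x-api-key request header;
# -C - lets you resume this download too if the connection drops.
curl -C - -H "x-api-key: $GALAXY_KEY" -o my_history.tar.gz "$ARCHIVE_URL"
```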

Be sure to create a share link to the history (and its objects) – just a link is fine, you don’t need to publish it – before creating the archive, if you decide to go this route. Sharing afterwards (while the archive is being created) doesn’t work as well.