Hi @nirjana_dewan
Yes, over a terabyte is quite large for a single file! And downloading individual files or entire large collections is tedious (although likely possible!).
I would suggest trying this:
- Use Copy Datasets to split up the bulk of the data into new, smaller histories. → FAQ: Copy a dataset between histories
- Try to generate RO-Crate links for those new smaller histories.
Exact copies of data do not consume additional quota space, so you can repeat this copy action as many times as you want while testing. Keeping the original history intact will allow you to extract a workflow before you purge it. Even if you never run that workflow, it will capture the steps, parameters, and exact tools for you, with context. You can also export the citations! → FAQ: How can I reduce quota usage while still retaining prior work (data, tools, methods)? && FAQ: How do I cite the tools I used in my history?
I’m not sure exactly what the “small enough” threshold at UseGalaxy.org.au is, so maybe start with half (~500 GB) and test whether that works. Then drop down to 250 GB (this should definitely work!). You don’t need to be exact with sizes – copy logical groupings, such as entire collections. When you copy a collection, all the datasets inside it are copied too (you don’t need to click on each – just the top-level collection folder).
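If it helps to plan the split, a quick greedy grouping can sort collections into histories under a size cap. This is just a planning sketch – the collection names and sizes below are made-up examples; use the actual sizes shown in your history panel:

```python
# Greedy split of collections into per-history batches under a size cap.
# All collection names and sizes here are hypothetical examples.
SIZE_CAP_GB = 250  # a conservative per-history target

collections = {
    "raw_reads_batch1": 180,
    "raw_reads_batch2": 160,
    "trimmed_reads": 120,
    "aligned_bams": 310,
    "counts_tables": 5,
}

batches = []
for name, size in sorted(collections.items(), key=lambda kv: -kv[1]):
    # Place each collection in the first batch it still fits into.
    for batch in batches:
        if batch["size"] + size <= SIZE_CAP_GB:
            batch["size"] += size
            batch["names"].append(name)
            break
    else:
        batches.append({"size": size, "names": [name]})

for i, batch in enumerate(batches, 1):
    print(f"History {i}: {batch['names']} ({batch['size']} GB)")
```

Note that any single collection larger than the cap (like the 310 GB example) ends up in its own history and would need to be split further, e.g. by copying individual datasets out of it.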
When deciding on the size, you’ll also need to consider the speed of the internet on your side! Moving data to another cloud-based location would be most reliable; then download from that other site if you need the data locally. Your own local server would be within your network, and a commercial cloud service will likely have infrastructure to support faster downloads (probably!). If you are downloading to your computer itself, a home internet connection might work, but institutional connections tend to be better. You can use a web-based “speedtest” to see what to expect.
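As a rough sanity check before you start, you can turn a measured speed into an expected transfer time. A back-of-the-envelope sketch – the speeds here are made-up examples; plug in your own speedtest result:

```python
def transfer_hours(size_gb: float, speed_mbps: float) -> float:
    """Estimate transfer time in hours for size_gb at speed_mbps (megabits/s)."""
    size_megabits = size_gb * 1000 * 8  # 1 GB ~ 1000 MB, 8 bits per byte
    return size_megabits / speed_mbps / 3600

# Hypothetical speeds: home broadband vs. an institutional link.
for label, mbps in [("home, 50 Mbps", 50), ("work, 500 Mbps", 500)]:
    print(f"250 GB at {label}: ~{transfer_hours(250, mbps):.1f} h")
```

Real transfers usually run well below the headline speed, so treat these numbers as optimistic lower bounds.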
Regional considerations: This advice is mostly anecdotal, and probably depends on what the larger “internet” is experiencing that day, but some users have had better luck with large downloads not aborting when they download from the UseGalaxy server in their own region. Example: if you are in the EU, moving data to the UseGalaxy.eu server first (from another Galaxy server), then downloading to a local computer, worked “better”.
Then, importantly, I can let you know that Galaxy AU supports custom file repositories. This means you can set one of these up, and your data can be read from and written to your remote location during the analysis process itself! This avoids the download steps entirely since you already have the files, and the compressed version of the history can be created there too if you want. You can mix a custom file storage space with others, just as you would with the permanent and temporary data storage locations on the server itself.
Add a New File Source at a UseGalaxy server
1. Navigate to https://usegalaxy.org.au/

2. Click on your username at the top right of the masthead menu

3. Click on Preferences in the drop down User menu

4. Click on Manage Your Repositories. These are custom data storage locations you can set up. Galaxy can read and write to these locations, allowing you to avoid large data transfers entirely. Useful for big HTP projects!

5. Click on Create new file source to see the supported options at each server. These may be different on different Galaxies and will update to gain more options over time.

So – two steps but lots of details to consider!! Please give that a try and we can follow up more with the AU people here at this forum if my advice is not enough.
Thanks! 
ping @igor do you have other ideas or advice?