Negative disk quota


I currently have about 111 GB of data loaded into my account, but it says I have a negative quota of -346 GB (-138% of quota). When I delete items the number becomes more negative; when I add files it becomes less negative. When I try to run jobs, it says I am using too much memory.

Any suggestions on how to fix this?


Try logging out of Galaxy then logging in again.

FAQ: The account usage quota seems incorrect

If that doesn’t work, please reply with the URL of the public Galaxy server being used.

I tried logging out and back in several times without any luck. Now it says I have exceeded my quota at ~460 GB.


Thanks, I found your account.

Some tools still need uncompressed fastq inputs even when compressed files are given as input. These tools create an uncompressed version of the data as part of processing – that is what happened in your case, putting you over quota.

The compressed data is over 50 GB per paired-end input, so over 100 GB combined. That is too much data to send to Trinity on the public server.
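A quick way to see how much space a compressed fastq will actually occupy once a tool uncompresses it is to count the decompressed bytes directly. This is a minimal sketch using standard Unix tools; the file name here is hypothetical and stands in for your real data:

```shell
# Work in a scratch directory so no files are left behind.
cd "$(mktemp -d)"

# Hypothetical small fastq.gz standing in for real sequencing data.
printf '@read1\nACGTACGT\n+\nFFFFFFFF\n' | gzip > example.fastq.gz

# Exact uncompressed size: stream through gunzip and count bytes.
# (`gzip -l` also reports this, but its count wraps for files over 4 GB,
# so streaming is the reliable check for large sequencing data.)
uncompressed=$(( $(gunzip -c example.fastq.gz | wc -c) ))
compressed=$(( $(wc -c < example.fastq.gz) ))

echo "compressed:   ${compressed} bytes"
echo "uncompressed: ${uncompressed} bytes"
```

Comparing the two numbers tells you roughly how much extra quota a run will consume when the tool inflates the input.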


  • Downsample your data and continue to use the public server. If you go this route, permanently delete the larger combined files to recover working space. You may need to pre-process the data in batches before combining them. We can help with space short term – see below.

  • Consider running your own Galaxy server, where more resources can be allocated. Cloudman is a popular choice for scientists; see the Cloudman documentation for details.
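The downsampling suggested above can be sketched with standard tools. This example takes the first N read records from both files of a paired-end set so the mates stay in sync; the file names and data are hypothetical, and for real work a random sampler (e.g. `seqtk sample` with the same seed on both files) avoids the positional bias of taking reads from the start of the file:

```shell
# Work in a scratch directory so no files are left behind.
cd "$(mktemp -d)"

# Hypothetical tiny paired-end inputs (2 records each) standing in
# for real R1/R2 sequencing files.
printf '@r1\nAAAA\n+\nFFFF\n@r2\nCCCC\n+\nFFFF\n' | gzip > R1.fastq.gz
printf '@r1\nTTTT\n+\nFFFF\n@r2\nGGGG\n+\nFFFF\n' | gzip > R2.fastq.gz

# Keep the first N records (4 lines per fastq record) from both files,
# applying the same cut to each so read pairs remain matched.
N=1
gunzip -c R1.fastq.gz | head -n $((4 * N)) | gzip > R1.sub.fastq.gz
gunzip -c R2.fastq.gz | head -n $((4 * N)) | gzip > R2.sub.fastq.gz

echo "R1 subset records: $(( $(gunzip -c R1.sub.fastq.gz | wc -l) / 4 ))"
echo "R2 subset records: $(( $(gunzip -c R2.sub.fastq.gz | wc -l) / 4 ))"
```

Once the subsets assemble cleanly, the full-size originals can stay permanently deleted to keep the account under quota.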

This is for academic work, correct? We can help with a bit more quota in the short term. Note this won't help Trinity assemble the full 100 GB+ dataset; it will just give you more space to store and prep the data in your account. Send us an email (from your registered academic email address) to the private mailing list for Galaxy Main and briefly explain your project. Include a link to this post for reference.

2 posts were split to a new topic: Upload data size limits and vcf datasets