Cluster command error (Mothur analysis) -- Job too large for public Galaxy? Try a custom Galaxy server

Hi everyone,
I got an error with the Mothur command “cluster”:
“This job was terminated because it used more memory than it was allocated”
The preceding command, “dist.seqs”, was run with a cutoff of “0.20”, and its output file is ~170 GB.

I know that I could use the alternative command “cluster.split”, which uses less memory, but for some reason I have to use the cluster command.

BTW I am using

Any advice?
I appreciate your help in advance.
Thank you


This page can be helpful: Account quotas. It looks like you are exceeding your quota.


Hi @Jalal

We discussed this directly via email, correct?

Summary for others:

  1. The input dataset is so large that it exceeds the available computing resources, and it would probably exceed the resources at any public Galaxy server.

  2. Computing resources are distinct from data storage (quota space) associated with accounts.

  3. The 16S tutorial here explains why splitting clusters is a reasonable choice for many analyses: 16S Microbial Analysis with mothur (extended)

  4. If that method does not suit your needs (the same applies to anyone running an analysis that is too large for public servers), then moving to a private Galaxy server where you can control the resources is one way forward.
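For reference, here is a rough sketch of the two approaches in mothur batch syntax. The filenames (`final.fasta`, `final.dist`, `final.count_table`, `final.taxonomy`) are placeholders for your own data, and the `taxlevel=4` / `cutoff=0.03` values in the last line are the common tutorial defaults, not requirements. The first two lines reproduce the memory-heavy path from the original post (dist.seqs with a 0.20 cutoff, then cluster on the full distance matrix); the last line is the lighter cluster.split alternative, which splits sequences by taxonomy and clusters each group independently:

```
dist.seqs(fasta=final.fasta, cutoff=0.20)
cluster(column=final.dist, count=final.count_table)
cluster.split(fasta=final.fasta, count=final.count_table, taxonomy=final.taxonomy, taxlevel=4, cutoff=0.03)
```

Because cluster.split never holds the full distance matrix in memory at once, it is usually the only option when the matrix grows to the ~170 GB range described above.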



Hi Jen,
Yes, we already discussed this via email.
Thank you for your help.