Hi @DHowell23
The memory error can come up with any tool, and it is usually a clue that something may be wrong with the input data or parameters. A recent topic covers what to check in more detail, and most of it applies to your case too → Troubleshooting Snippy memory allocation error
Then, if the data is OK, there is one more option to try for this specific tool: filtering your BAM file may reduce the memory the job needs. Remove unmapped reads and low-MAPQ alignments, and consider keeping only properly paired reads and primary alignments. Downstream analysis steps usually expect filtered data anyway (ChIP-seq, for example). See our tutorials for how this filtering is commonly applied and why it improves results.
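As a sketch of what that filtering looks like outside of Galaxy, the equivalent `samtools view` command is below. The file names and the MAPQ cutoff of 20 are just placeholders; inside Galaxy the same settings are available in the **Samtools view** tool.

```shell
# SAM FLAG bits to exclude: 0x4 = unmapped, 0x100 = secondary alignment,
# 0x800 = supplementary alignment. Combined they give the -F value.
EXCLUDE=$((0x4 | 0x100 | 0x800))
echo "$EXCLUDE"   # → 2308

# Hypothetical file names; -f 0x2 keeps properly paired reads,
# -q 20 drops alignments below MAPQ 20, -b writes BAM output.
echo "samtools view -b -f 0x2 -F $EXCLUDE -q 20 input.bam -o filtered.bam"
```

The same flag values are what the Galaxy tool form sets behind the scenes, so the filtered result should match either way.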
If the filtering is not enough, you can try a different Galaxy server to see what happens. To move data from the UseGalaxy.org server to another server, for example UseGalaxy.eu, follow this guide. You can move a single dataset or a batch of data.
Xref
- error with Picard's MarkDuplicates - This job was terminated because it used more memory than it was allocated.
- Some details about the processing limits for this tool across different UseGalaxy servers are here: Enhancement: Increase memory allocation to 18 GB for picard_MarkDuplicates3.1.1.0 at ORG · Issue #719 · galaxyproject/usegalaxy-tools · GitHub.
Hope this helps!