Hello, I am attempting to run MarkDuplicates on data that was aligned with Bowtie2 and then coordinate-sorted. Every time I attempt to run this tool, I receive the error below:
"The job terminated because it used more memory than allocated"
This is the command line the job ran:
_JAVA_OPTIONS=${_JAVA_OPTIONS:-"-Xmx2048m -Xms256m -Djava.io.tmpdir=${TMPDIR:-${_GALAXY_JOB_TMPDIR}}"} && export _JAVA_OPTIONS && ln -sf '/corral4/main/objects/8/6/4/dataset_864f72d3-51b1-4ab5-a694-4e8925bb7933.dat' 'Rep_2_1-250127-A02_fastq' && picard MarkDuplicates --INPUT 'Rep_2_1-250127-A02_fastq' --OUTPUT '/corral4/main/jobs/064/890/64890385/outputs/dataset_8e9d140c-9b3d-4a7f-acc4-194d99bcf1e4.dat' --METRICS_FILE '/corral4/main/jobs/064/890/64890385/outputs/dataset_6e50595b-8618-4d4c-88be-86e49ae876af.dat' --COMMENT 'using BAM with read group, 1000 for nxtsq 2k' --REMOVE_DUPLICATES 'false' --ASSUME_SORTED 'true' --DUPLICATE_SCORING_STRATEGY SUM_OF_BASE_QUALITIES --OPTICAL_DUPLICATE_PIXEL_DISTANCE '1000' --VALIDATION_STRINGENCY 'LENIENT' --TAGGING_POLICY All --QUIET true --VERBOSITY ERROR
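As I understand it, the Java heap cap for this run comes from the -Xmx2048m value in _JAVA_OPTIONS above. If I were running Picard directly on my own machine (outside Galaxy), I assume I could raise that limit like this; the 8g figure and the file names are placeholders for illustration, not my actual Galaxy datasets:

# Assumed sketch for a local run: raise the Java heap before invoking Picard.
export _JAVA_OPTIONS="-Xmx8g -Xms256m"
picard MarkDuplicates \
    --INPUT aligned_sorted.bam \
    --OUTPUT marked.bam \
    --METRICS_FILE marked.metrics.txt \
    --REMOVE_DUPLICATES false \
    --ASSUME_SORTED true

But within Galaxy I do not see where this value can be changed from the tool form.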
How do I allocate more memory to this tool?