Z-score calculation (Table Compute tool) - wrong output

Hello community,

My apologies if this is a basic question; I just started the tutorial recently. I have a quick question about the Z-score calculation.

I am following the Galaxy tutorial for Z-score calculation (Hands-on: Reference-based RNA-Seq data analysis / Transcriptomics, in "Hands-on: Compute the Z-score of all genes"). As input I am using normalized counts for some differentially expressed genes, but after the first Table Compute step I am getting the same values as output. In theory, I should get as output the mean-centered values per row, right?

Here are the steps I followed in the Table Compute tool:

  • “Input Single or Multiple Tables”: Single Table
    • param-file “Table”: Normalized counts file on ... (output of DESeq2 tool)
    • “Type of table operation”: Perform a full table operation
      • “Operation”: Custom
        • “Custom expression on ‘table’, along ‘axis’ (0 or 1)”: table.sub(table.mean(1), 0)
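For reference, the custom expression above is evaluated with pandas, so its effect can be reproduced outside Galaxy. This is a minimal sketch with made-up toy numbers (the real input would be the DESeq2 normalized counts table, genes as rows and samples as columns):

```python
import pandas as pd

# Toy stand-in for the normalized-counts table (genes x samples).
table = pd.DataFrame(
    {"sample1": [10.0, 5.0], "sample2": [20.0, 7.0], "sample3": [30.0, 9.0]},
    index=["geneA", "geneB"],
)

# table.mean(1) is the per-row (per-gene) mean; .sub(..., 0) subtracts it
# from every column, broadcasting along axis 0 (the row index).
centered = table.sub(table.mean(1), 0)
print(centered)
```

For geneA the row mean is 20, so the centered values are -10, 0, 10: the first step does not return the means themselves, but the original values with each row's mean subtracted. If the output looks identical to the input, it is worth double-checking that the file loaded with the expected rows/columns and that the operation actually ran on the intended table.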

Many thanks in advance for your feedback.

Juber Herrera

Hi @Jurrera18

The computation is described here and uses this tool twice, not just once. The second step is where the mean-centered values are scaled relative to the original spread and converted to a z-score ("standard deviations away from the mean").
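The two-step computation can be sketched in pandas with toy numbers. This assumes the second Table Compute expression divides by the per-row standard deviation (e.g. `table.div(table.std(1), 0)` applied to the centered table); check the tutorial page for the exact expression it uses:

```python
import pandas as pd

# Toy normalized-counts table (genes x samples).
table = pd.DataFrame(
    {"s1": [10.0, 5.0], "s2": [20.0, 7.0], "s3": [30.0, 9.0]},
    index=["geneA", "geneB"],
)

# Step 1 (the expression from the question): subtract the per-row mean.
centered = table.sub(table.mean(1), 0)

# Step 2: divide by the per-row standard deviation of the original values,
# giving the z-score (standard deviations away from the row mean).
z = centered.div(table.std(1), 0)
print(z)
```

Only after this second division do the values become z-scores; after step 1 alone they are merely centered, which is why the intermediate output can look so similar to the input when the counts vary little.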

topics/transcriptomics/tutorials/ref-based/tutorial.html#hands-on-compute-the-z-score-of-all-genes
