DESeq2 for SLAM-seq #126
Hi, I am trying to use DESeq2 to analyze differential read counts in a SLAM-seq experiment. I have two identical cell samples, one of which was pulsed/chased in the presence of a drug, so the non-transformed (total) read counts should be identical between the two conditions, whereas the T>C-transformed read counts should indicate the degree of nascent transcription.

I am passing the values from the ReadCount column to DESeq2; however, I find that DESeq2 is highly conservative, and my unadjusted p-value histogram is not uniform. This makes me worried that DESeq2 is for some reason not appropriate for these data. Do you have any advice or thoughts on this matter?
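For context, a minimal sketch of this kind of DESeq2 run and p-value check, assuming a genes x samples matrix assembled from the slamdunk count tables; the file name, sample names, and layout below are placeholders, not part of the original report:

```r
library(DESeq2)

## Hypothetical input: a genes x samples matrix of (integer) read counts
## assembled from the slamdunk count output. File name and sample layout
## are placeholders; substitute your own.
cts <- as.matrix(read.delim("tc_readcounts.tsv", row.names = 1))
coldata <- data.frame(
  condition = factor(c("ctrl", "ctrl", "drug", "drug")),
  row.names = colnames(cts)
)

dds <- DESeqDataSetFromMatrix(countData = cts,
                              colData   = coldata,
                              design    = ~ condition)
dds <- DESeq(dds)
res <- results(dds)

## Under the null, unadjusted p-values should be roughly uniform, with a
## spike near 0 from truly differential genes. A conservative, non-uniform
## histogram (as described above) hints that the model's assumptions are
## not being met for these counts.
hist(res$pvalue, breaks = 50, col = "grey",
     main = "Unadjusted p-values", xlab = "p-value")
```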
Comments

So you have two cell samples side by side on which you performed SLAM-seq, and one of them additionally received a drug treatment, correct? Do you also know what the drug does, e.g. is it something like global transcriptional inhibition? Or why would you assume that the T>C read counts should differ between the two conditions?
Hi Tobias, I was wondering whether the DESeq2 results would differ if I started from the filtered TC BAM file instead of the TcReadCounts? Best wishes,
In theory they should not, because the filtered TC read BAM file contains exactly those reads upon which we base our TcReadCount quantification.
Hi Tobias, thank you for such a quick response. I do get a different number of DEGs in my analysis; however, the gene set enrichment looks quite similar to the analysis based on the TcReadCounts. Best wishes,
How do you quantify the reads from the TC read BAM file?
I am using summarizeOverlaps in R to quantify the reads.
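A minimal sketch of that quantification with GenomicAlignments::summarizeOverlaps, assuming the slamdunk filtered TC BAM files and a GTF annotation; all file names are placeholders:

```r
library(GenomicFeatures)
library(GenomicAlignments)
library(Rsamtools)

## Placeholder file names: substitute your own annotation and the
## slamdunk filtered TC BAM files.
txdb  <- makeTxDbFromGFF("annotation.gtf", format = "gtf")
genes <- exonsBy(txdb, by = "gene")

bams <- BamFileList(c("ctrl_filtered_tc.bam", "drug_filtered_tc.bam"),
                    yieldSize = 1e6)

se <- summarizeOverlaps(features = genes,
                        reads    = bams,
                        mode     = "Union",
                        singleEnd = TRUE,       # SLAM-seq libraries are typically single-end
                        ignore.strand = FALSE)  # check your protocol's strandedness
counts <- assay(se)   # genes x samples count matrix
```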
Does that somehow filter flagged multimappers? |
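For reference: summarizeOverlaps does not filter multimappers on its own unless you pass a restrictive ScanBamParam. A sketch of limiting the count to primary alignments with non-zero MAPQ, continuing from the snippet above; whether slamdunk marks its multimappers as secondary alignments or as MAPQ 0 is an assumption worth verifying on your own BAM files:

```r
library(Rsamtools)

## Count only primary alignments with MAPQ >= 1. Whether multimappers
## in the filtered TC BAM are flagged as secondary alignments or carry
## MAPQ 0 is an assumption here; inspect a few records to confirm.
param <- ScanBamParam(flag = scanBamFlag(isSecondaryAlignment = FALSE),
                      mapqFilter = 1)

se_unique <- summarizeOverlaps(genes, bams,
                               mode = "Union",
                               singleEnd = TRUE,
                               ignore.strand = FALSE,
                               param = param)
```

Comparing the resulting counts with the TcReadCount values reported by slamdunk should then show whether multimapper handling explains the differing DEG numbers.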