Hello, I am interested in analysing this data and was wondering if it is possible to get access to the fastq files for the samples? Currently only the BAM files are available. Many thanks, Devika

Created by Devika Agarwal dpag0891
This is true and escaped being in the documentation. My apologies. The RNA prep was neither poly A selection nor ribozero depletion. According to our vendor, RNA levels were really low in the microglial samples, so the procedure was to use the Nugen Ovation RNA Seq V2 system. The resulting cDNA was then sheared using our Covaris and the libraries were prepared using the Kapa LTP Library Preparation Kit.
Hi Corey, Since I first posted the query two months ago, I believe I now have an answer to my initial question about why the mapping rates were about 45% when the FQs I re-generated were quantified against transcripts with Salmon. I have since run Picard CollectRnaSeqMetrics and found a high percentage of non-coding reads in the BAM files. This makes me think that the RNA-seq on these samples was not done with a poly(A) selection method, so a low mapping rate with Salmon, which quantifies against a reference transcriptome (rather than the genome), makes sense. The only reason I asked for the FQ files was the low mapping rate: I was encouraged to request them to make sure there wasn't a problem with the regeneration. Sorry, I should have posted an update to my query. If I am correct, then I believe there is no need to take the time to upload the FQ files. To convert the BAM files to FQ, I first used samtools to queryname-sort them and then used the bedtools bamtofastq command. This gave me a lot of warnings about certain reads not being paired correctly or having a missing mate, which is why I was worried. The commands used were based on the bedtools documentation (http://bedtools.readthedocs.io/en/latest/content/tools/bamtofastq.html). Thank you for taking the time to respond to my query. Devika
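The commands, roughly as I ran them per the bedtools docs (file names here are placeholders, not the actual sample names):

```shell
# Queryname-sort the BAM so mates sit next to each other,
# as bamtofastq expects for paired-end output.
samtools sort -n sample.bam -o sample.qsort.bam

# Emit paired-end FASTQ; unpaired/missing-mate reads trigger warnings here.
bedtools bamtofastq -i sample.qsort.bam \
    -fq sample_R1.fq \
    -fq2 sample_R2.fq
```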
@dpag0891 Devika, Hello, I'm one of the individuals who runs the RNASeq pipeline so Dave reached out to me and encouraged me to reply. I'm looking at our FQ files for this project and we have separated these files by batch for each sample, so I'm seeing 192 batches for only 32 samples. Before I take the time to stitch these FQ files back together and upload ~431GB of data, do you mind sharing the command you used to convert the BAMs to FQ along with the version of bedtools? Unfortunately we use a custom script to convert FQ to BAMs so I don't have any experience with bedtools. Thanks, Corey
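For what it's worth, the stitching step I mean is just a per-sample concatenation of the batch files; the directory layout and file names below are invented for illustration:

```shell
# Build a tiny demo: two per-batch FASTQ files for one sample
# (one 4-line record each; names are made up for this sketch).
mkdir -p demo
printf '@r1\nACGT\n+\nIIII\n' > demo/sampleA_batch1_R1.fastq
printf '@r2\nTTTT\n+\nIIII\n' > demo/sampleA_batch2_R1.fastq

# The shell expands the glob in lexical order, so batches
# concatenate in order into one FASTQ per sample.
cat demo/sampleA_batch*_R1.fastq > demo/sampleA_R1.fastq
```

The same must be done for the R2 files, keeping the batch order identical so read pairing is preserved.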
Hi @david_c_airey - I have created this folder syn11036177 within your project and gave you access. It will stay private until the data is uploaded. We have a new method for uploading data, provenance and annotations in bulk based on creating a manifest. See here - http://docs.synapse.org/articles/uploading_in_bulk.html Since these are fastq, don't worry about provenance, but it would be great if we could capture the annotations on upload. I will send you a manifest template that has the keys for the annotations we would like to see (+ a list of values) Thanks!
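A minimal manifest sketch; `path` and `parent` follow the bulk-upload docs, while the annotation column here is a placeholder until the template with the expected keys and values arrives:

```shell
# Write a tab-separated manifest: required "path" and "parent" columns,
# plus one illustrative annotation column ("assay" is a placeholder key).
printf 'path\tparent\tassay\nsample1_R1.fastq\tsyn11036177\trnaSeq\n' > manifest.tsv

# Upload with the Synapse command-line client per the linked docs, e.g.:
# synapse sync manifest.tsv
```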
Hi Devika and Ben, Yes, we can upload these files. Where do we put them, Ben? Is there a particular Synapse ID we should use? It's been too long since I uploaded to Synapse. -Dave
Thanks Ben
Hi Devika, I'm adding @david_c_airey on this thread as he may be able to answer your question. Ben
Hi Ben, I have managed to regenerate the fastq files using bedtools. However, my mapping rate to the transcriptome with Salmon was not ideal, and I was worried that it might be due to a lot of reads not being recognised as paired during the conversion. Hence I wanted to double-check my work and thought to ask for the fastq files to make sure I haven't gone wrong somewhere along the way. Devika
Hi Devika, I'm not sure it is, but you should be able to regenerate the fastqs from the BAMs using the Picard tool: http://broadinstitute.github.io/picard/command-line-overview.html#BamToBfq Ben
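(For paired-end FASTQ output, Picard's SamToFastq is the usual choice; the linked page lists BamToBfq, which writes the Bfq format instead. A rough sketch, with placeholder file names:)

```shell
# Picard SamToFastq: write standard paired FASTQ from a BAM.
# "sample.bam" and the output names are placeholders.
java -jar picard.jar SamToFastq \
    I=sample.bam \
    FASTQ=sample_R1.fastq \
    SECOND_END_FASTQ=sample_R2.fastq
```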

syn5478323 is it possible to get the fastq files for the samples