PacBio Forecast 2015

As already predicted, Illumina is not the only company announcing innovations for its NGS portfolio. Here you can read about the improvements Pacific Biosciences plans for this year. I think the good news for many users of PacBio machines is that they are not talking about new instruments, but about improvements that affect already installed machines (GenomeWeb):

  • PacBio plans to improve the sequencing chemistry, including the active loading of single polymerase enzymes onto the chip
  • PacBio plans to improve the workflows for an easier and faster handling of samples
  • PacBio plans to improve bioinformatics for faster de novo genome assemblies & better full-length HLA analysis

With these changes PacBio wants to extend the data output to more than 4 gigabases per SMRT cell and increase the average read lengths to 15-20 kbp.
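
For a sense of scale, here is a quick back-of-the-envelope calculation in Python, using only the throughput and read-length targets quoted above (everything else follows from those figures):

```python
# Rough sketch: how many reads would a >4 Gb SMRT cell yield at the
# projected average read lengths? Uses only the figures quoted above.

target_yield_bp = 4e9                    # >4 gigabases per SMRT cell
for avg_read_len in (15_000, 20_000):    # projected 15-20 kbp average
    n_reads = target_yield_bp / avg_read_len
    print(f"~{n_reads:,.0f} reads/SMRT cell at {avg_read_len:,} bp average")
# -> ~266,667 reads at 15 kbp; ~200,000 reads at 20 kbp
```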

Read more about it here.

I still wonder whether there will be news from PacBio about a new system this year. Maybe a benchtop instrument, like everyone else has?

I will keep you updated!


Assessment of NGS Tools for Crime Laboratories

US researchers have been awarded $825,000 to evaluate the use of NGS technology for forensic applications.

Pennsylvania State University will work in conjunction with the Battelle Memorial Institute, the lead institution on the grant, and 6 other laboratories. As the sole university partner, Penn State will be performing evaluations of forensic investigative tools that will expand the capabilities of forensic DNA laboratories.

The project will test the feasibility of new instruments, laboratory materials and software tools in the field of DNA-based forensics. The study’s aim is to vet tools based on next generation sequencing technology and introduce them into working crime laboratories.

According to the grant abstract, DNA samples are provided by the National Institute of Standards and Technology. They will be sequenced using Illumina’s MiSeq platform or Life Technologies’ Ion PGM Sequencing System.

The laboratories hope to bring NGS-based tools into working crime laboratories to replace current, less informative forensic methods. The new technology will increase efficiency in forensic work and could also help generate investigative leads and identify individuals from only traces of genetic evidence.

Visit forensics.psu.edu/research for more information about the Department of Forensic Science at Pennsylvania State University.

New Illumina Instruments

New Year – New Innovations. Illumina starts off 2015 directly with a huge announcement: the launch of four new systems (GenomeWeb, 12th Jan).

Here is a short overview of the new systems; a quick cost-per-throughput sketch follows the list:

  • HiSeq X Five – scaled-down version of the X Ten; costs: $6 million
  • HiSeq 3000 – uses a single flow cell and offers a lower price per data point than the HiSeq 2500; half the throughput (750 Gb) of the HiSeq 4000; costs: $740,000
  • HiSeq 4000 – uses a dual flow cell and can sequence up to 12 genomes or 180 exomes in 3.5 days or less; costs: $900,000
  • NextSeq 550 – combines microarray scanning with NGS; applications: cytogenetics & prenatal genetic diagnosis; costs: $275,000
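
As promised above, here is a small Python sketch comparing the two HiSeq models using only the list prices and per-run throughput figures quoted in the list; real-world output and pricing will of course differ:

```python
# Rough comparison using only the figures quoted above; per-run throughput
# is a maximum, and actual output will vary.

systems = {
    "HiSeq 3000": {"price_usd": 740_000, "run_gb": 750},
    "HiSeq 4000": {"price_usd": 900_000, "run_gb": 1_500},  # dual flow cell, 2x the 3000
}
for name, s in systems.items():
    per_gb = s["price_usd"] / s["run_gb"]
    print(f"{name}: ${per_gb:,.0f} list price per Gb of per-run capacity")

# HiSeq 4000: up to 12 genomes per 3.5-day run
print(f"HiSeq 4000: ~{12 / 3.5 * 365:,.0f} genomes/year at full utilization")
```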

Now I am curious to see whether other providers will come up with news as surprising as Illumina’s. We will keep you posted…

Transcriptome assemblers put to the test

Next Generation Sequencing produces millions to billions of reads – and the interpretation of these reads relies on bioinformatic tools.

Especially for de novo assemblies of genomes or transcriptomes, the results can vary considerably depending on the assembler used.

In a recent publication, Shorash Amin and his co-workers sequenced the transcriptome of the non-model gastropod Nerita melanotragus with the Ion PGM. Afterwards they used different software tools and compared the quality of the resulting transcriptome assemblies (Amin et al.).

Oases, Trinity, Velvet and Geneious Pro were the four de novo transcriptome assemblers used for this study. The assemblers were compared on parameters such as contig length, N50 statistics, and BLAST and annotation success.
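
For readers unfamiliar with the N50 metric used in this comparison, here is a minimal Python sketch (the example contig lengths are invented):

```python
# N50: the contig length at which half of the total assembly size is
# contained in contigs of that length or longer.

def n50(contig_lengths):
    """Return the N50 of a list of contig lengths."""
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running >= total / 2:
            return length

print(n50([1700, 900, 800, 500, 300, 100]))  # -> 900
```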

The longest contig was created with the Oases assembler (1,700 bp), and overall Trinity and Oases delivered much better results for the de novo assembly of Ion PGM reads than Velvet or Geneious Pro.

Furthermore, mapping to a reference genome showed that Ion PGM transcriptome sequencing and subsequent de novo assembly with either Trinity or Oases generate reliable and accurate results.

Read the complete publication here.

Different QM/QA Levels for Genomics Analyses

High quality standards are essential for non-clinical QC testing. Since we obtained GLP certification, people have been asking me about the relevant QM/QA levels for genomics analyses. This is what I tell them:

ISO 9001 – the basis

The ISO 9001 standard is a global quality management standard built around process orientation, customer orientation and satisfaction, and continuous improvement. ISO 9001 provides the basis for a quality management system ensuring that all processes are documented and defined in SOPs. In an ISO 9001 compliant laboratory, responsibilities are clearly defined, and the entire work environment and infrastructure are suited to their intended purpose. Equipment and facilities are qualified and maintained, and measuring and testing equipment is calibrated regularly. Staff are qualified and well trained, and all training is recorded. Supplier management and purchasing are controlled processes. Non-conforming work and failures are corrected and documented. Processes for corrective and preventive actions are implemented, as is proper complaint management. In an ISO 9001 QM system, all business processes are monitored (e.g. by internal and external audits). Customer feedback and all data obtained are analysed on a regular basis; these data and information form the basis for continuous improvement of the QM system.

ISO 17025 – assures technically valid results

ISO 17025 is derived from ISO 9001. With an ISO 17025 accreditation, a laboratory demonstrates its technical competence and its ability to generate technically valid and correct results. In addition to the ISO 9001 requirements, participation in external proficiency testing is mandatory. Furthermore, the documentation of lab procedures is far more detailed and involves dedicated protocolling procedures.

GLP – the gold standard for conducting non-clinical safety studies

The GLP (Good Laboratory Practice) standard adds on top of that a framework in which non-clinical safety studies are planned, performed, monitored, recorded, reported and archived. GLP helps to assure regulatory authorities that submitted data are a true reflection of the results obtained during the study and can therefore be relied upon when making risk/safety assessments. In addition to the requirements of ISO 9001 and ISO 17025, GLP involves the nomination of a study director and dedicated trained personnel for GLP compliant processes. A study always involves the creation of a study plan, which is signed by the study director. All processes applied in the study need to be described within the study plan, and any deviation from the study plan leads to an amendment of it. After completion of the analyses, the study director generates a final report signed by the study director and QA/QM; it also includes a signed QA and GLP compliance statement. Each study is audited by quality assurance staff. Furthermore, there needs to be restricted laboratory access and restricted access to relevant data, as well as dedicated archiving procedures (GLP archive) for all GLP documents and raw data.

GCP – similar to GLP with focus on clinical studies and patient safety

The GCP (Good Clinical Practice) standard is very similar to the GLP standard; however, it is relevant only for clinical studies and thus focuses on patient safety and the reporting of adverse drug events. In a study that involves GCP compliance, it has to be assured that only those analyses are performed to which the study patient has consented.

Feel free to write a comment for further clarification. I am looking forward to getting in contact with you.

Cheers, Katrin

Don’t forget the controls!

Almost every day, new data on the composition of microbiomes are published. Many of these studies analyse the human microbiome, but environmental samples are studied as well.

Today we are able to sequence microbiomes at much greater depth than a couple of years ago. Looking deeper sheds light on an important point: contamination! In a very interesting publication, Salter et al. showed that contaminating DNA is present in DNA extraction kits and other lab reagents.

The researchers sent dilutions of pure cultures of Salmonella bongori to three different institutes for DNA extraction and PCR, followed by sequencing on the Illumina MiSeq. While S. bongori was the only organism identified in the undiluted samples, contaminating bacteria increased in relative abundance with higher degrees of dilution, and finally became dominant after the fifth dilution.

They did a similar analysis performing shotgun metagenomics of a pure S. bongori culture. This time, they used four different DNA extraction kits. Again, they saw that contamination increased with the degree of dilution, with contamination being the predominant feature after the fourth dilution. Also, they could show that each kit gave a different bacterial profile.
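
The pattern they observed follows from simple arithmetic: sample DNA drops with each dilution step while the reagent-derived contamination stays roughly constant. Here is a toy model in Python (the absolute copy numbers are invented; only the 10-fold dilution logic matters):

```python
# Toy model of the effect described above: sample DNA drops 10-fold per
# dilution step while kit-derived contaminant DNA stays roughly constant.
# The absolute copy numbers are invented for illustration only.

sample_copies = 1e8        # hypothetical S. bongori input, undiluted
contaminant_copies = 1e3   # hypothetical constant reagent background

for dilution in range(8):
    frac = contaminant_copies / (sample_copies + contaminant_copies)
    print(f"dilution {dilution}: contaminant fraction = {frac:.2%}")
    sample_copies /= 10    # next 10-fold dilution step

# The contaminant fraction passes 50% once the sample is diluted below the
# constant reagent background, matching the published dilution series.
```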

They also report on a study of the nasopharyngeal microbiota of children, analysed over two years. They could show that using four different DNA extraction kits over time led to the false conclusion that differences in the microbial spectrum were associated with age. When DNA extraction was repeated on the original samples using a different kit lot, the OTUs previously identified as contaminants were no longer detected.

In conclusion, contamination affected both 16S and metagenomic shotgun sequencing projects and was especially critical for samples with low biomass. Salter et al. present a list of potential contaminating organisms, as well as recommendations on how to cope with this problem. One recommendation is very obvious, and very effective: use negative controls!
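
As a minimal sketch of what the negative-control recommendation can look like in practice, the following Python snippet removes every OTU that also appears in a reagent/extraction blank. The function name, data layout and threshold are illustrative, not from the publication:

```python
# Minimal sketch of acting on the "use negative controls" advice: drop any
# OTU that also shows up in an extraction/reagent blank.

def filter_contaminant_otus(sample_counts, blank_counts, min_blank_reads=1):
    """Remove OTUs observed in negative-control (blank) samples.

    sample_counts / blank_counts map OTU id -> read count.
    """
    contaminants = {otu for otu, n in blank_counts.items() if n >= min_blank_reads}
    return {otu: n for otu, n in sample_counts.items() if otu not in contaminants}

sample = {"OTU_1": 5000, "OTU_2": 120, "OTU_3": 40}
blank = {"OTU_2": 80, "OTU_3": 15}            # reads seen in the blank control
print(filter_contaminant_otus(sample, blank))  # -> {'OTU_1': 5000}
```

Plain presence/absence filtering like this can be overly aggressive for taxa genuinely shared between sample and background; abundance-aware approaches are gentler, but the principle is the same.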

Altogether, we should be very careful in planning our experiments in order to deliver results instead of artefacts. Above all, we need to be very careful when interpreting the data!

Prepare NGS for clinical use

Molecular diagnostics (MDx) is, in my opinion, the most sensitive application for all kinds of molecular biology techniques such as PCR, Sanger sequencing or Next Generation Sequencing. Today, NGS is still a niche application and needs further improvement to become a common tool for MDx. One thing that is lacking is the standardisation of NGS for clinical use.

The NGS Working Group, established by the Friends of Cancer Research, has worked out a master plan (The ASCO Post) with critical points that need to be addressed before NGS can be used more commonly:

1. Define a regulatory pathway for cancer panels (a selection of multimarker gene assays) intended to identify actionable oncogenic alterations (those with supporting data to create risk-benefit assessment of treatment choice) that allow flexibility in the appropriate FDA medical device pathway—for instance, one based on risk classification of different panel components depending on the specific marker.

2. Approaches to validation studies should be based on the types of alterations measured by the assay rather than on every alteration individually.

3. Determine the contents of a cancer panel by classifying potential markers based on current utility in clinical care and clinical trials and peer-reviewed publications, as well as recognized clinical guidelines. Draw upon various sources to determine the recommended marker set for an actionable cancer panel.

4. Promote standardization of cancer panels through development and use of a common set of samples to ensure reproducibility on each platform.

5. Establish a framework for determining an appropriate reference method rather than relying on any single method for all studies.

Get more information on each proposal here.

Whole Genome Sequences Of World’s Oldest Living People Published

Researchers looked at the genomes of some of the oldest living people. While they did not find a significant association with extreme longevity, the researchers published their genome findings. At least the data will be available as a resource for future researchers looking into the “genetic basis” of longevity.

There are 74 supercentenarians (110 years or older) alive worldwide, with 22 living in the United States. The authors of this study performed whole genome sequencing on 17 of them to explore the genetic basis underlying extreme human longevity.

“We were looking for a really simple explanation in a single gene,” said Stuart K. Kim, a Stanford geneticist and molecular biologist. “And we know now that it’s a lot more complicated, and it will take a lot more experiments and a lot more data from the genes of more supercentenarians to find out just what might account for their ages.”

Due to the limited sample size, the researchers were not able to find protein-altering variants associated with extreme longevity, according to a study in PLOS ONE by Hinco Gierman from Stanford University and colleagues, published November 12, 2014. But they did find that one supercentenarian had a genetic variant related to a heart condition that had very little effect on his health, considering he reached such an elderly age. The researchers noted that the American College of Medical Genetics and Genomics recommends reporting this as an incidental finding.
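
To illustrate why a sample of 17 is so limiting, here is a small, self-contained Fisher’s exact calculation in Python with invented carrier counts; even a strongly enriched variant yields a p-value nowhere near genome-wide significance:

```python
# Illustration only: hypothetical counts showing why n = 17 supercentenarians
# gives limited power once genome-wide multiple testing is considered.
from math import comb

def fisher_one_sided_p(a, b, c, d):
    """One-sided Fisher's exact p-value for a 2x2 table [[a, b], [c, d]]:
    probability of observing >= a carriers among the cases under the null."""
    row1, col1, n = a + b, a + c, a + b + c + d
    return sum(
        comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
        for x in range(a, min(row1, col1) + 1)
    )

# 7 of 17 supercentenarians carry a variant vs 10 of 100 controls (invented)
p = fisher_one_sided_p(7, 10, 10, 90)
print(f"p = {p:.2g}")  # nominally small, yet far above the ~5e-8 genome-wide cutoff
```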

The whole genome sequences of all 17 supercentenarians are now available as a public resource so that they can be used to assist the discovery of the genetic basis of extreme longevity in future studies.

 

Compare to: Large genome sequencing studies in the USA (posted August 26, 2014)