Need for Strict Quality Control Will Limit the Number of Parallel Assays Possible
While writing about encoded micro/nanoparticles for multiplexed bioassays, two thoughts crossed my mind:
- Do we really need very large-scale multiplexing technologies for protein assays? and
- What if all these technology platforms are commercially realized tomorrow?
Many of these encoded-particle technologies claim anywhere from a few hundred to 2^20 individual codes, and hence a similar number of individual assays is theoretically possible. I will deal with the first question, whether we really need this many assays for relevant biological information or clinical diagnosis, in a later post. Let me focus on the 'what if' question: what other challenges need to be solved if such multiplexing technologies are to be biologically and clinically relevant? Two recent articles dealing with multiplexing technologies and the quality-control/reproducibility issue are pertinent here. The first is an article in Clinical Chemistry that presents a case study on measuring 15 proteins from the plasma of 2322 participants using protein arrays. The second was a multicenter study testing the reproducibility of measuring 20 test proteins across 27 different laboratories using liquid chromatography-mass spectrometry (LC-MS).
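Where do code counts like 2^20 come from? For spectrally encoded particles, the usual combinatorial estimate is that m distinguishable color channels, each at one of n intensity levels, yield n^m - 1 unique codes (the all-zero combination is excluded because an unlabeled particle cannot be read). A minimal sketch of this arithmetic, with illustrative channel/level counts not tied to any specific platform:

```python
def code_capacity(colors: int, levels: int) -> int:
    """Theoretical number of unique optical codes for particles encoded
    with `colors` spectral channels, each at one of `levels` distinguishable
    intensity levels. The all-zero combination is excluded because an
    unlabeled particle cannot be detected."""
    return levels ** colors - 1

# Illustrative values only: capacity grows geometrically with channel count.
print(code_capacity(3, 10))  # 999
print(code_capacity(6, 10))  # 999999
print(2 ** 20)               # 1048576
```

The catch, of course, is that theoretical capacity assumes every intensity level remains distinguishable after assay chemistry, which is precisely the quality-control question raised below.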
Protein arrays can test hundreds to thousands of analytes in a single sample, and hence, as the authors of the Clinical Chemistry article point out, can lead to “—specimen conservation, limited sample handling and decreased time and cost—”. The study tested 15 proteins from 2322 participants using commercial validated assays, resulting in a total of 95,040 measurements. However, the study concluded that, based on the FDA guideline that acceptable intra- and inter-assay precision be <30%, a substantial portion of the measurements would be rejected. The major challenges of multiplexed assays pointed out by the authors were:
- Preanalytical/non-biological variations that are laboratory- and technician-dependent
- Large variation in protein concentration in blood/plasma/serum, which means a single dilution of the sample may not be sufficient to bring every analyte within the dynamic range of the assay. Protein concentrations in blood span roughly 12 orders of magnitude (a 10^12 range), so any multiplexed assay claiming to measure thousands of analytes simultaneously would either need that kind of dynamic range or require multiple dilutions of each sample, defeating the purpose.
- Sample processing/handling, which leads to significant changes in the measured values
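The <30% precision criterion above can be applied mechanically: compute the coefficient of variation (CV = SD/mean) for each analyte's replicate measurements and reject any analyte that exceeds the cutoff. A minimal sketch, where the 30% limit follows the FDA guideline cited in the article but the analyte names and replicate readings are made up for illustration:

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation (%) = 100 * sample SD / mean."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical replicate readings (arbitrary units) for three analytes.
measurements = {
    "analyte_A": [102, 98, 105, 99],        # tight replicates -> low CV
    "analyte_B": [50, 85, 30, 70],          # scattered replicates -> high CV
    "analyte_C": [10.1, 9.8, 10.4, 10.0],
}

CV_LIMIT = 30.0  # FDA acceptance criterion cited in the study
for name, reps in measurements.items():
    cv = cv_percent(reps)
    verdict = "accept" if cv < CV_LIMIT else "reject"
    print(f"{name}: CV = {cv:.1f}% -> {verdict}")
```

Run per analyte and per plate, this is exactly the screen that rejected a substantial fraction of the 95,040 measurements, and the bookkeeping only multiplies as the analyte count grows.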
The second study checked the reproducibility of LC-MS-based approaches. The use of MS for protein identification has exploded recently, with claims that it can solve everything from clinical diagnostics to proteomics problems. The study sent a sample containing 20 test proteins to 27 different laboratories, and only seven could identify all 20 test proteins. Worse still, only 1 of the 27 could identify all 22 tryptic peptides of 1250 Da in the 20-protein sample. According to the authors:
“Our centralized analysis determined missed identifications (false negatives), environmental contamination, database matching and curation of protein identifications as sources of problems.”
What is remarkable about these two studies is that although both protein arrays and LC-MS-based technologies claim to be high-throughput assays, the studies tested only 15 and 20 analytes, respectively. Talk of thousands of assays sounds like a joke!
That brings me back to the ultra-high-throughput encoding technologies and the 'what if' question. Although the multiplexing technologies have evolved rapidly, the biological assay part has received relatively little attention and may prove to be a major bottleneck limiting wider application of multiplexing technologies. Key challenges in need of immediate attention include:
- Sample handling and processing
- Sensitivity and dynamic range to cover all the analytes being tested
- Successful QC of a multiplexed assay involving thousands of proteins will require stocking purified controls for each protein, leading to significant cost overruns
- Validation of these assays will need studies involving large cohorts, as was done for GWAS (genome-wide association studies)
- Extensive training
Do we really need to go to all this trouble to make multiplexing technologies successful, or should we stick with our standard glucose sensor or pregnancy dipstick?