A Glorious Past and Relevant Present: Leading the Lab into the Future
October 1988: I remember it well. My lab overlooked the parking lot at Varian Associates. We were expecting a visit from Bill Wham and Ken Halaby, the founders of American Laboratory magazine. A limo pulled up and two people emerged. One was a well-dressed Scotsman (Bill Wham); the other was Ken Halaby, who was notorious for showing up in various disguises, and that day was no exception. The dynamic duo made their pitch about how American Laboratory would work closely with vendors, customers, and subscribers to help develop the laboratory market by publishing practical applications of modern instrumentation and supplies. This was akin to linking features and benefits. Today this is nothing new, but at that time, it was ground-breaking, since most instruments were sold on specifications that were difficult to understand and impractical, if not impossible, to verify.
For 45 years, articles in American Laboratory have chronicled each step in the evolution of modern instrumentation. Early data recording devices included strip chart recorders. The most sophisticated had multiple pens to monitor two detectors simultaneously. Peaks were measured with mechanical ball and disk integrators or rulers. All the pens used ink reservoirs as in fountain pens. Ten percent of the runs were wasted since the pen stopped writing. Another 10% failed since the ink ran all over the chart, obscuring the signal trace. Plus, lab rats were identified at lunch by the distinctive blood red ink on their hands and clothing.
In the early 1970s, computers started to appear to capture data directly, distill them to metadata, and prepare reports via a teleprinter (an electronically activated typewriter). At about the same time, digital integrators (DIs) came on the scene. These used derivative detection of peaks and valleys implemented with discrete transistors. DIs were easily confused by electronic noise and quickly earned a reputation for being temperamental. However, DIs gave better precision than the manual evaluation technology, plus they had a wider dynamic range than chart paper.
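The derivative logic of those early DIs is easy to sketch in software: flag a peak start when the upslope exceeds a noise threshold, mark the apex where the slope changes sign, and close the peak when the downslope flattens out, summing the signal in between as the area. The sketch below is illustrative only; the Gaussian test trace and the threshold value are invented for the example, not taken from any DI schematic:

```python
# Illustrative sketch of the derivative peak detection used conceptually
# by early digital integrators. The slope threshold plays the role of the
# noise discriminator that made real DIs so temperamental.
import math

def detect_peaks(signal, slope_threshold):
    """Return (start, apex, end, area) tuples found by slope tests."""
    peaks = []
    start = apex = None
    for i in range(1, len(signal)):
        slope = signal[i] - signal[i - 1]       # first derivative
        if start is None and slope > slope_threshold:
            start = i - 1                       # upslope exceeds noise: peak begins
        elif start is not None and apex is None and slope < 0:
            apex = i - 1                        # slope changes sign: apex found
        elif apex is not None and slope > -slope_threshold:
            area = sum(signal[start:i + 1])     # crude area by summation
            peaks.append((start, apex, i, area))
            start = apex = None                 # downslope flattens: peak ends
    return peaks

# Synthetic Gaussian peak on a flat, noise-free baseline (an assumption;
# noisy real chromatograms are what confused the transistor-based DIs)
trace = [math.exp(-((t - 50) / 8.0) ** 2) for t in range(100)]
print(detect_peaks(trace, slope_threshold=0.01))
```

Add noise to the trace and the same logic begins to fire spuriously, which is exactly why the threshold setting on a real DI mattered so much.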
In the late 1970s, the first microprocessors started to appear in instruments. Microprocessors were programmable and ideal for automating instrument functions. When they were combined with a cathode ray tube, one could enter and display much more information. Overnight, unattended operation was promised, and indeed, it generally worked. However, microprocessors required a wizard to program and debug. People had many more ideas than anyone could keep up with, and firms often sold on the basis of future designs, which led to pejorative tales about “vaporware.”
As microprocessors became ever more powerful, instrument control and data processing were integrated first in external boxes and then into the chassis of integrated instruments such as the HP/Agilent GC series. In HPLC, the modules (pumps, detectors, and data processors) were more complicated; each module needed its own microprocessor. Fortunately, Moore’s law was in effect, so these were affordable.
The exponential growth phase of HPLC preceded the corresponding adoption of the microprocessor by about 10 years. Column technology was the key technical driver or gatekeeper. Columns used smaller and smaller particles, which gave ever-narrower peaks and faster run times. It seemed that the detectors were always fighting to catch up. Engineers would use long time-constant filtering to reduce the short-term noise level, but this ruined chromatographic resolution. Even today, HPLC column technology is far from mature. The need for a new round of instruments is visible, but more about that later.
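The trade-off those engineers faced can be reproduced in a few lines: a single-pole RC filter with a long time constant suppresses short-term noise, but it also smears sharp chromatographic peaks, cutting their height and filling the valley between close neighbors. Everything below (the two-Gaussian trace and the time constants) is an illustrative assumption:

```python
# Illustrative sketch of why long time-constant filtering ruined
# chromatographic resolution: the same single-pole RC filter that
# smooths noise also broadens and shortens narrow peaks.
import math

def rc_filter(signal, tau):
    """Single-pole low-pass: y[i] = a*x[i] + (1 - a)*y[i-1], a = 1/(1 + tau)."""
    a = 1.0 / (1.0 + tau)
    y, out = 0.0, []
    for x in signal:
        y = a * x + (1.0 - a) * y
        out.append(y)
    return out

# Two narrow peaks, 20 points apart, on a flat baseline
trace = [math.exp(-((t - 40) / 3.0) ** 2) + math.exp(-((t - 60) / 3.0) ** 2)
         for t in range(120)]

gentle = rc_filter(trace, tau=1)   # short time constant: peaks survive
heavy = rc_filter(trace, tau=15)   # long time constant: peaks smear and shrink
print(max(gentle), max(heavy))     # the heavy filter cuts peak height sharply
```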
By the 1980s, people were starting to see that biochemistry was emerging from the descriptive stage to the predictive stage. Basic bioengineering was discussed in keynote and plenary lectures. The genome was understood first by biochemists, then high school students, and later by the lay public.
But DNA was so complex and novel, the question was: How do you work with it?
Dr. Kary Mullis, working at Cetus, shocked the world with the polymerase chain reaction (PCR). This one invention fueled the next 25 years of genomics, including the sequencing of thousands of genomes, the human genome among them. The first human genome sequence cost about $3 billion; today, whole genome sequencing (WGS) runs about $1,000 per genome. Yes, analysis of the WGS data costs more than 10 times that, but this will come down rapidly as the algorithms connecting disease states to DNA are improved.
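PCR’s impact follows directly from its arithmetic: each thermal cycle can at most double the number of target copies, so roughly 30 cycles turn a single molecule into a billion. A back-of-the-envelope sketch (the 90% per-cycle efficiency figure is an illustrative assumption; real efficiencies vary with the assay):

```python
# Idealized PCR amplification: N = N0 * (1 + efficiency)**cycles,
# where efficiency = 1.0 means perfect doubling each thermal cycle.
# The 0.9 efficiency below is an illustrative assumption.

def pcr_copies(initial_copies, cycles, efficiency=1.0):
    """Copies of the target sequence after a given number of cycles."""
    return initial_copies * (1 + efficiency) ** cycles

print(pcr_copies(1, 30))                  # ideal doubling: 2**30, about 1.07e9
print(pcr_copies(1, 30, efficiency=0.9))  # a more realistic, sub-ideal run
```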
The successful interfacing of HPLC to mass spectrometry is also one of the most significant advances in practical applications of instrumentation. Generally, these use electrospray ionization, but there are a variety of other interface designs. HPLC/MS/MS is a further related extension, providing even more detection sensitivity and information content. These instruments are used in the development of small-molecule drugs and biotherapeutics. Proteomics would not be possible without LC/MSn. The same is true for lipidomics, glycomics, metabolomics, and systems biology.
All of these and more have been the subject of many reports in American Laboratory.
Top technologies that have impacted biochemical and chemical labs
My ranking of the top technologies that have impacted the chemical and biochemical lab is:
1. Microprocessors

Microprocessors are clearly the overall leader, since the impact is so general.
2. HPLC

My vote for second place is HPLC, since it is one of the most broadly used technologies developed in the last 45 years. While some consider it mature, it seems that we are on the edge of still more advances in column technology.
By the numbers, HPLC is the clear leader in publications in American Laboratory, with over 100 in the last decade. Some titles include:
- “Capillary UHPLC Coupled With MRM-MS for Quantitative, High-Sensitivity Bioanalysis”1
- “Determination of Free Formaldehyde in Cosmetic Preservatives and Surfactants by HPLC With Postcolumn Derivatization.”2
These articles range from basics of the technology to new applications. Plus, I’ve authored annual reports that cover advances in HPLC columns, instrumentation, and applications shown at Pittcon®, such as “Pittcon® 2013: Developments in Liquid Chromatography.”3
3. Polymerase chain reaction
PCR is my nomination for third. PCR is a novel and eminently useful technology that has been widely adopted.
Some recent articles include:
- “A Practical Guide to Publishing RT-qPCR Data That Conform to the MIQE Guidelines”4
- “Use of qPCR to Detect Biomarkers for Assessing Stem Cells”5
- “Overcoming the Challenges of Polymerase Chain Reaction”6
- “Product Intelligence: Thermal Cyclers: Behind the Technology.”7
4. LC/MSn

LC/MSn is my fourth choice. MS technology is still expanding rapidly. Its marriage with HPLC has been synergistic for both MS and LC. The combination is much more than the simple sum of the parts.
A total of 174 articles mentioning LC/MS have appeared in American Laboratory during the last decade. About half mention some form of MSn. Some interesting titles include:
- “Rapid, Quantitative Analysis of Multiple Mycotoxins by Liquid Chromatography Tandem Mass Spectrometry”8
- “Achieving Ultrahigh-Speed Analysis With LC-MS-MS”9
- “Fused Core Particles for HPLC Columns.”10
American Laboratory today: A horizontal publication in a vertical world
Remaining true to the original concept, American Laboratory continues to provide a valuable venue for a broad spectrum of topics. American Laboratory has emerged as a horizontal publication running across an increasingly vertical technical world. Some recent examples:
An article in the June/July 2013 issue described the design of a high-performance ion mobility spectrometer for rapid screening of food.11 Ions are produced by electrospray ionization and separated by their mobility in a drift tube. This is faster than chromatographic separations and avoids the expense of creating and controlling a vacuum as required by MS.
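The physics behind the speed claim is simple: in the drift tube an ion quickly reaches a terminal velocity v = K·E set by its mobility K and the field E, so its drift time is t_d = L/(K·E), typically milliseconds. A minimal sketch (the tube length, field strength, and mobility values are illustrative assumptions, not figures from the cited article):

```python
# Drift-tube ion mobility: t_d = L / (K * E). Higher-mobility ions
# arrive sooner, giving a millisecond-scale separation. All numbers
# here are illustrative assumptions.

def drift_time_ms(length_cm, field_v_per_cm, mobility_cm2_per_vs):
    """Drift time in milliseconds for an ion of mobility K in field E."""
    velocity_cm_per_s = mobility_cm2_per_vs * field_v_per_cm
    return length_cm / velocity_cm_per_s * 1000.0

# Two ions of different mobility in a 20-cm tube at 300 V/cm
for k in (1.2, 2.0):  # mobilities in cm^2 V^-1 s^-1
    print(f"K = {k}: t_d = {drift_time_ms(20, 300, k):.1f} ms")
```

Compare those tens of milliseconds against chromatographic run times measured in minutes, and the appeal for rapid screening is clear.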
Neutron crystallography for the characterization of proteins was reported in the March issue.12 Neutron crystallography is the only technique that shows the location of hydrogen (1H and 2H) in enzymes. The case in point was carbonic anhydrases, which are important medical targets in the treatment of glaucoma, altitude sickness, epilepsy, and hypertension.
A study on the analysis of water from various sources, including fracking fluids and production water, was featured in the March issue as well.13 Most methods used ion chromatography. The reshoring of the petroleum industry is reviving the entire market for technical jobs, particularly along the Gulf of Mexico. However, many have pointed out the associated risks and need for science-based regulation.
Many of today’s products for the lab utilize very sophisticated technology. Accordingly, American Laboratory now publishes purchasing guides to aid in the selection of instruments and systems that are fit-for-purpose.
Future technical advances
At the risk of being faulted for myopia, let me offer my views of pending technical advances by 2020.
Bioinformatics will emerge as a major and expensive endeavor for research and health-care providers. The ultimate driving force is to connect the dots between assay results and improved patient outcomes. Prevention costs less than treatment. Plus, there is a large multiplier if assay results can provide high-efficacy therapeutics.
Efficacious bioinformatics requires high-quality data. Research in genomics, proteomics, lipidomics, glycomics, and ultimately the connectome will provide large data sets. But, since they come from many sources, each using different methods, correlating the results will be challenging at best. The FDA’s Drug Development Tools (DDT) Qualification Programs on biomarkers and trans-species toxicology may help.14 Spinout of NSA’s data farms will help quickly process, store, and retrieve zettabytes of data necessary for the precision treatment of billions of people.
Automation is another important contributor to high-quality data. Usually, automation can improve data quality by reducing interlaboratory bias. It also reduces the impact of operator variance. Today, even the poorest automated systems probably provide more reliable data than the corresponding manual protocol. Yes, there are cases where the experienced eye is still essential, but the engineers see replacement as an engineering challenge and probably a commercial opportunity.
Capillary liquid chromatography with “slip flow” mode will be the next advance in column technology. It will require new instrumentation. In this case, a complete redesign will be in order since the flow rates will be in the low to sub-nanoliter/min range. Enabled applications will include proteomics, lipidomics, bacterial identification, and systems biology.
Multidimensional techniques such as ICP/MS will work at the cutting edge of the hybrid techniques. LC × LC × MSn will emerge as the technology of choice for elucidating the complexity of lipidomics, metabolomics, glycomics, the microbiome, and post-translational modifications (PTMs), both covalent and dynamic.
Solid-state optics will enable a variety of new spectroscopic techniques. Some examples: plasmonic15 and infrared spectroscopic imaging16 will greatly expand the information content relating structure to function in living systems and nanomaterials. This will be aided by advanced fluorescent probes, including quantum dots.17
Laser imaging will continue to evolve with shorter pulse widths and higher repetition frequency. This may be useful in probing nano environments in biofluids and tissues.18 For example, hole-burning spectroscopy is poised to probe biostructures and liquids.19
Technologies to determine brain structure–function relationships
The connectome will require new analytical technologies to figure out and describe the structure and function of brains, particularly in living humans. NMR imaging is but the first step. Growing brain tissue from stem cells may enable spectroscopic measurement of synapse function in living tissue. Hopefully mental diseases can ultimately be understood on a biochemical level, which may enable more effective treatment.
Advances in battery technology will empower portable instruments used on-site in applications including remediation, supply chain management, process control, and ultimately point of sale to consumers. Here’s how it could work: You want some swordfish. The butcher shows you a whitefish steak, and you ask for a wipe of the fish surface. You put the wipe into the food authenticator, which you carry in your shopping bag. The food authenticator uses DNA hybridization along with other chemical sensors. The DNA matches mako shark, and the amine monitor shows high readings, indicating “not fresh.”
Scientific communications are also changing. Print is no longer the only medium. Indeed, American Laboratory currently has close to 18,000 digital subscribers. In addition to reading online, readers are increasingly accessing information on mobile devices such as smartphones and tablets.
In closing, I’ve enjoyed my association with American Laboratory over the decades. There have been so many changes and advances in the industry, and I’ve learned a lot. Over the years I’ve tried to communicate my excitement to our readers, and will continue to do so in the years ahead.
Thank you for your support.
With a smile,
Robert L. Stevenson
2. www.americanlaboratory.com/914-Application-Notes/504-Determination-of-Free-Formaldehyde-in-Cosmetic-Preservativesand-Surfactants-by-HPLC-With-Postcolumn-Derivatization/.
15. Kawata, S. Plasmonics for nanoimaging and nanospectroscopy. Appl. Spectrosc. 2013, 67(2), 117–25.
16. Bhargava, R. Infrared spectroscopic imaging: the next generation. Appl. Spectrosc. 2012, 66(10), 1091–1120.
17. Petryayeva, E.; Algar, W.R.; et al. Quantum dots in bioanalysis: a review of applications across various platforms for fluorescence spectroscopy and imaging. Appl. Spectrosc. 2013, 67(3), 215–52.
18. Su, X.; Xiao, X.; et al. Nucleic acid fluorescent probes for biological sensing. Appl. Spectrosc. 2012, 66(11), 1249–62.
19. Wagie, H.E.; Geissinger, P. Hole-burning spectroscopy as a probe of nano-environments and processes in biomolecules: a review. Appl. Spectrosc. 2012, 66(6), 609–27.
Robert L. Stevenson, Ph.D., is a Consultant and Editor of Separation Science for American Laboratory/Labcompare; e-mail: email@example.com.