Optimizing Value With New Lab Technology

The upcoming year promises to be busy and exciting for early adopters of laboratory technology. Daydreams will be split between advances in instrumentation (technology-push) and advances in applications (demand-pull). Looking ahead, a few of these interesting developments focus on the energy efficiency of instruments, including low thermal mass gas chromatographs and updates in gas chromatography and mass spectrometry instruments that make it easier to move lab work to the sampling site. Ultra-HPLC will also garner more attention in 2012, driven by even higher column performance. For the user, the end goal with all of this new technology is to connect the dots between new capabilities and new needs, within the constraint of optimizing value.

Portable lab equipment

Portable instruments will advance quickly, powered by the development of small, lightweight batteries. Heavy batteries have been an anchor holding these instruments back, but new batteries are much lighter and will become even more so over the next 10 years. Efficient solar cells will also help, as will the energy efficiency of new instrument designs. For example, new low thermal mass (LTM) GCs with resistively heated temperature-controlled zones have appeared, and next-generation designs are expected shortly. LTM reduces the power requirement by 90% or more if air conditioning is factored in. This will enable more lab work to be moved to the sampling site, providing real-time results and greatly improving response time and assay value. Specific instruments that will take advantage of this ability include GC, GC-MS, LC-MS, and optical and X-ray spectroscopy in many forms.

Computer technology advances

Advances in computer technology, including tablets and smartphones, may soon appear as the communication interface between a new series of instruments and the rest of the world. Could the camera on a smartphone be replaced with a spectral imaging device such as surface-enhanced Raman spectroscopy? What about an iPhone with a flash source and an array of band-pass filters to measure fluorescence? Adding the electronics required for ion-selective electrodes, including pH electrodes, might be another example. It is true that laboratory instruments might offer better resolution and detection sensitivity, but the average person might be interested in a handheld sensor to evaluate fish freshness or pesticide residue while grocery shopping.

Liquid sample preparation

Preparing samples is time consuming (and boring!), expensive, and problem prone. Mettler Toledo (Columbus, OH) has developed a suite of instruments that prepare liquid samples gravimetrically, which facilitates automation and reduces reagent consumption. Dry solids are agitated and weighed into the vial, and solvent is weighed in next. Shakers and similar devices then take care of dissolving the sample. This should be a major breakthrough that improves productivity dramatically, with payback measured in months. Implementation may require revision of basic laboratory protocols that specify dilution to volume, but that should be a short-term problem, given the technique’s strong economic, safety, and environmental advantages.
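The arithmetic behind converting a "dilute to volume" protocol to a gravimetric one is straightforward: nominal volumes become target masses via the solvent density, and the masses actually weighed give the true concentration. The sketch below illustrates the idea; the function names, density, and masses are illustrative assumptions, not taken from any vendor's method.

```python
# Hypothetical sketch of gravimetric sample prep arithmetic.
# A "dilute to 25 mL" step becomes "weigh in ~19.8 g of methanol",
# and the concentration is back-calculated from the weighed masses.

def target_solvent_mass(target_volume_ml: float, density_g_per_ml: float) -> float:
    """Mass of solvent to dispense for a nominal fill-to volume."""
    return target_volume_ml * density_g_per_ml

def actual_concentration(solute_mass_g: float, solvent_mass_g: float,
                         density_g_per_ml: float) -> float:
    """Concentration in mg/mL from the masses actually weighed.
    Assumes the dissolved solute adds negligible volume (dilute solution)."""
    volume_ml = solvent_mass_g / density_g_per_ml
    return solute_mass_g * 1000.0 / volume_ml

# Nominal prep: 25 mg of solid diluted to 25 mL with methanol (~0.791 g/mL).
print(target_solvent_mass(25.0, 0.791))            # solvent mass to weigh in, g
print(actual_concentration(0.0253, 19.64, 0.791))  # true concentration, mg/mL
```

Because the result is computed from what was actually weighed rather than a nominal mark on a flask, small dispensing errors no longer propagate into the reported concentration.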

Ultra-HPLC

Low-dispersion ultra-HPLCs (UHPLCs) will be another attention getter, opening up unprecedented column performance with sub-2-μm core-shell column packings. Fasha Mahjoor, President of Phenomenex (Torrance, CA), mentioned recently that we have only begun to “scratch the shell” of this technique, a comment that raises my expectations. This technology may lead to another round of instrument development built to support core-shell particles in the 1-μm range, perhaps even sub-1-μm. Such instruments would probably have working pressures of 30,000 to perhaps 50,000 psi to deliver sub-minute chromatograms with a hundred peaks or so. It is difficult to pinpoint many applications that would require this speed, but in chromatography, if you build it, applications often appear in time.
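The pressure figures follow from Darcy's law for packed columns: at a fixed column length and linear velocity, the pressure drop scales with the inverse square of particle diameter, so halving the particle size roughly quadruples the pressure. A back-of-envelope sketch, assuming an illustrative reference pressure for today's sub-2-μm columns (not a specification for any particular instrument):

```python
# Why sub-1-um particles push pumps to tens of thousands of psi:
# Darcy's law gives dP proportional to 1/dp^2 with length and velocity fixed.

def scaled_pressure_psi(ref_pressure_psi: float, ref_dp_um: float,
                        new_dp_um: float) -> float:
    """Pressure drop after changing particle diameter, other factors held fixed."""
    return ref_pressure_psi * (ref_dp_um / new_dp_um) ** 2

# Assume ~12,000 psi with 1.7-um particles (an illustrative figure).
for dp in (1.7, 1.0, 0.85):
    print(f"{dp:4.2f} um -> {scaled_pressure_psi(12_000, 1.7, dp):,.0f} psi")
```

With these assumptions, 1-μm particles land near 35,000 psi and 0.85-μm particles near 48,000 psi, consistent with the 30,000 to 50,000 psi range mentioned above.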

MS instrumentation

The rapid pace of improvements in MS should continue. Bruker (Billerica, MA) will add marketing muscle to the Advance™ ESI interface, which the company acquired from Michrom Bioresources (Auburn, CA) in 2011. The Advance provides nanospray performance with conventional analytical columns in legacy LCs. Surface acoustic wave (SAW) ionization devices will offer improved detection for complex samples such as drugs in plasma. Concern about food safety will also increase the need for quantitative MS detection in both LC and GC.

Semantic technology

The focus of semantic technology (ST) will jump from data integration to natural language, which should greatly improve how scientists find and connect published results. As user skills develop, inference engines associated with ST will facilitate thought integration, which invariably leads to asking new and better questions. Elsevier owns the copyright on approximately 25% of the world’s scientific literature; thus, they have a privileged position regarding natural language integration. Can they capitalize on this opportunity, however? The track records of other firms capitalizing on discovered strategic advantages in IT are abysmal. In the meantime, other firms are racing to commercialize semantic technology. For example, with a machine called Watson, IBM used ST implemented with 2,880 processor cores to win the Jeopardy! challenge in early 2011. Clinical diagnostics is Watson’s next challenge.

Flow FFF

Technical advances do not always catch on immediately. Supercritical fluid chromatography (SFC) and field flow fractionation (FFF) are two such cases, with more than 30 years spent in infancy. Flow FFF with capillary fibers appears to be just what biopharm needs to study aggregation, higher-order structures, and particulate contamination. Nanotechnology should also develop into a significant market for FFF instruments for analysis and prep.

Optical imaging

Rapid optical imaging addresses two important applications. At the high end, fluorescence imagers for high-content assays will continue to improve in speed, depth of field, and image resolution. This will provide quicker response and facilitate time-course studies as opposed to today’s fixed-endpoint protocols. Already-gigantic data files will grow accordingly, but the value of the results should justify the expense.

With good reason, the FDA is concerned about particulate contamination of biological products. Flow imagers for subvisible particles will find use in formulation research as well as in QC.

Post-translational modification of proteins

Traditionally, a strong applications pull first leads to the adaptation of existing instruments to the problem. Then, if demand warrants, new instruments are developed with features focused on the specific needs of the application. Post-translational modification (PTM) of proteins is an example of this phenomenon. PTM will grow in importance as proteomics drills down in the quest to explain the biochemistry of normal and disease states. PTMs include over 100 covalent and many more dynamic modifications of protein structure that are not programmed directly by DNA coding. Elucidating the chemistry of PTMs will require improved separation of proteins or very large fragments plus MS with electron capture dissociation (ECD). On the informatics side, a need for improved data integration can be anticipated as one looks for patterns, perhaps even short-lived ones, in noisy data.

Higher-ordered structures

One of the intriguing properties of proteins is their ability to change shape and binding partners in response to various stimuli. This changes activity, often through a complex chain of events. Understanding the biochemistry of life will entail explaining the structure–activity relationships involving single or multiple proteins, hormones, lipids, and nucleic acids. In addition to LC-MS, fluorescent probes and associated fast, high-resolution imagers will be required.

Nanomaterials

Nanomaterials are showing promise and starting to appear in commercial products. Their unique properties open up the possibility of building highly specific analyzers tailored to particular assays.

Regulatory science

Nearly all of the various regulatory agencies in the developed world salute the idea of science-based regulation. This is a much more rational approach than responding to the loudest alarmist. All too often, alarmists make unreasonable demands that exceed need and current technology: assay at any cost! Regulatory science is a new discipline that will need targeted analytics, particularly for toxicological studies of diverse populations. Regulators will need to respond carefully to rare events that may affect only a small subpopulation. In addition to good analytics, they will need excellent informatics to look for common causes in apparently random events. After all, most crises start with a single report.

Food safety

With good reason, nothing grabs headlines better than a food scare. Supplying food is a huge global effort, and the people involved may have the best in analytics or none at all. Food safety testing relies heavily on GC and LC, often with MS. And, as in regulatory science, tracking food problems involves tracing fragmentary information to a common source.

LC-MS for clinical diagnosis

Mass spectrometers, particularly triple quads, are now fast and have useful detection sensitivity. This makes them attractive for a variety of assays in clinical diagnostics. Today, most of the analytes are small molecules, but when proteomics starts to deliver as promised, then we can expect a similar focus on protein biomarkers. To date, finding biomarkers in plasma has not been very fruitful, but recent work using tissue samples is showing more promising results.

Meetings in 2012

I am excited about several meetings in the upcoming year. Planners know that post-holiday meetings in California, Arizona, and Florida will draw increased attendance. (And snowbirds know that although California’s High Sierra can measure its snow accumulation in feet per day, it’s a very different story in coastal cities such as San Diego, San Francisco, and Santa Barbara.)

The year will start with the usual flurry of meetings in California, beginning with High-Content Analysis (January 10–13) in San Francisco’s Fairmont Hotel. A few days later, Well Characterized Biologics (WCBP 2012, January 22–25) will be held at the Intercontinental Hotel, also in San Francisco. The first joint meeting of the SLAS (Society for Laboratory Automation and Screening) will run February 6–8 in San Diego. Next, the Molecular Medicine Tri-Conference is scheduled for San Francisco’s Moscone Center from February 21 to 23. In March, Pittcon will return to the Orange County Convention Center in Orlando, FL (March 11–15), followed by HPLC 2012 in Anaheim, CA, June 16–21. I hope to see you there!

Robert L. Stevenson, Ph.D. is a Consultant and Editor of Separation Science for American Laboratory/Labcompare; e-mail: rlsteven@comcast.net.