In an opening plenary, Prof. Barry
Karger of the Barnett Institute of
Chemical and Biological Analysis
(Boston, MA) concisely defined analytical
chemistry as “The science of generating
relevant information by means of chemical
and biological measurements.” Much of
the Joint Congress 2011 focused on doing
this with higher speed, resolution, and reliability.
Frequently, subdisciplines in science
emerge in parallel silos using similar technology,
usually with different ontologies
and little cross-talk. The market-driving
applications vary, but the core technical
principles span the applications. Leaders
at CASSS (Emeryville, CA) recognized
the need, and the opportunity, to improve
communication by creating a forum for
technical cross-talk between
scientists in adjacent silos. Thus, CASSS
organized the Joint Congress 2011, held
May 1–5 in San Diego, CA. The technical
program consisted of three parallel tracks
organized by three different scientific committees
focused on Capillary Separations
(35th International Symposium on Capillary Chromatography), Microscale Separations
(26th International Symposium on
MicroScale Bioseparations), and Comprehensive GC (8th GC×GC Symposium).
Correcting the human genome
The opening plenary session started things off on a
critical note. Prof. David C. Schwartz (University
of Wisconsin–Madison) is correcting
the numerous errors in genomes sequenced
by shotgun technology by analyzing linear
DNA using capture genome hybridization
(CGH). Individual strands of DNA
are stretched into a line on the surface of
a microscope slide and cut by restriction
enzymes into sequence-specific fragments. Since the
connectivity is established and confirmed
by the site specificity of the restriction
enzymes, assembling the genome is much
more accurate. This technique catches
duplicates, insertions, and inversions, all of
which are common. The output is a series of
bars that read like a bar code.
For comparison, bar codes from different individuals are
aligned, revealing similarities and differences
in sequence. Capture genome hybridization
comparing humans and chimps shows that
humans have 510 deletions, generally
in genes for proteins associated with mental development.
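To make the alignment step concrete, here is a minimal Python sketch of how two restriction-fragment "bar codes" might be compared; the fragment lengths, the 10% sizing tolerance, and the scoring are illustrative assumptions, not Prof. Schwartz's actual algorithm.

```python
# Minimal sketch of aligning two restriction-fragment "bar codes."
# Fragment lengths (in kb) and the 10% sizing tolerance are
# illustrative assumptions, not values from Prof. Schwartz's work.

def align_barcodes(a, b, tol=0.10):
    """Edit-distance alignment of two fragment-length lists.

    Two fragments "match" when their lengths agree within `tol`
    (fractional). Gaps model insertions/deletions; runs of
    mismatches would flag rearrangements for closer review.
    """
    m, n = len(a), len(b)
    # dp[i][j] = minimum edits needed to align a[:i] with b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            match = abs(a[i-1] - b[j-1]) <= tol * max(a[i-1], b[j-1])
            dp[i][j] = min(
                dp[i-1][j-1] + (0 if match else 1),  # match/mismatch
                dp[i-1][j] + 1,                      # fragment deleted
                dp[i][j-1] + 1,                      # fragment inserted
            )
    return dp[m][n]

# Hypothetical maps: the sample is missing the 21.9-kb fragment.
ref    = [12.1, 33.4, 21.9, 8.7, 45.0]
sample = [12.3, 33.0, 8.9, 44.2]
print(align_barcodes(ref, sample))  # -> 1 edit (the deletion)
```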
Another study compared wild-type
and cancerous breast tissue. Prof. Schwartz
found that the genes from cancerous tissue
had been scrambled, with long sequences
exchanged among the 20 genes studied. So 10 years
after the human genome was hailed as completed,
we are slowly getting a much more
accurate structure. Plus, we now have tools
to avoid ambiguities associated with shotgun
sequencing. With a correct genome to study,
perhaps it will be more useful. However, it is
doubtful that genomics will deliver anything
close to the expectations of 5–15 years ago.
After 10 years and billions of dollars, scientists
are recognizing that serum biomarkers are in
danger of being broadly and charitably classed
as hype. The problem is that the concentration
range of relevant analytes spans 10¹¹ or more,
which is about 10⁷ times the dynamic range
of the best detectors (mass spectrometers).
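To put the mismatch in numbers: the 10⁴ detector dynamic range below is inferred from the two figures above, not stated explicitly in the talks.

```latex
\frac{\text{serum analyte concentration range}}{\text{MS dynamic range (inferred)}}
  \approx \frac{10^{11}}{10^{4}} = 10^{7}
```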
In one of the most memorable lectures of all time,
Prof. Pier Giorgio Righetti (Politecnico di Milano,
Italy) focused on the hopeless inadequacies
of current enrichment techniques,
including depletions. He concluded
that affinity-based depletions fail to deliver relevant
information on biomarkers for serum
samples. Further, Prof. Righetti anticipated
the question, “What technology should we use
now to win fame and fortune for biomarkers?”
His advice was to take any available funds to a
casino and bet them. Such refreshing honesty!
Surely everyone in the audience, except one, felt
relieved that they did not have to follow Prof.
Righetti at the podium.
In the next lecture, Dr. Andrei Drabovich
(Mt. Sinai Hospital, Toronto, Canada)
showed that the search for biomarkers can
succeed if one starts with cleaner and more
concentrated samples. Blood collects the
waste from all organs and hence is too dilute.
Other fluids in the body, including seminal
fluid in males, are more tractable sources.
His specific goal was differential diagnosis of
azoospermia in infertile males by looking for
biomarkers associated with blockage or other
problems. Using nanoelectrospray ionization (ESI) with selected reaction monitoring
(SRM), his team validated 18 proteins in
several dozen clinical samples. From this,
they developed a smaller panel of discriminating
biomarkers with near-absolute specificity
and sensitivity. This obviates the need
for testicular biopsy, which is a significant
benefit to patients. So the take-home message
is to focus on simple systems, which are
more compatible with today’s tools.
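For readers who want the "near-absolute specificity and sensitivity" claim in concrete terms, here is a minimal Python sketch of how a panel is typically scored against clinical truth; the confusion-matrix counts are hypothetical, not Dr. Drabovich's data.

```python
# Minimal sketch of scoring a biomarker panel against clinical truth.
# The counts below are hypothetical illustrations, not results from
# Dr. Drabovich's azoospermia study.

def panel_performance(tp, fp, tn, fn):
    """Sensitivity: fraction of true positives the panel detects.
    Specificity: fraction of true negatives it correctly clears."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# E.g., 30 obstructive cases all flagged, 29 of 30 controls cleared:
sens, spec = panel_performance(tp=30, fp=1, tn=29, fn=0)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
# -> sensitivity=1.00, specificity=0.97
```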
For similar reasons, Prof. Karger focused on
tissue samples. Today, the Barnett Institute
has two major thrusts: translational medicine
and translational regulatory science. Years
ago, he recognized the problems with serum
proteins described above. Tissue was selected because
it is more complex and biologically relevant
than cell culture, since the structural features
are potentially important. Cells are isolated
by laser capture microdissection. Currently, he
needs about 10,000 cells, but hopes to soon
reduce the sample to 1000.
The proteins are measured with bottom-up
peptide sequencing using LC-MS. The
data are displayed according to the spectral
index that Prof. Karger reported on
in 2008. When one compares cancerous
and normal cells, the spectral index readily
separates the proteins that are elevated or
suppressed in cancer tissue relative to normal.
The next question is: Are the correlations
meaningful or just flukes? After all, biological
systems present huge numbers of measurable
species, so random, nonsensical correlations are
probable: when many proteins are screened, even a
99% confidence level lets through about 1% of them
as false positives. Even so, 121 proteins were
significantly different at that level.
Network analysis of these proteins suggested
two networks, one of which
appears to make biological sense and may have some
therapeutic utility in the clinic.
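As an illustration of the kind of screen involved, a spectral-count comparison with a per-protein significance test might look like the Python sketch below. The index formula, the data, and the α = 0.01 cutoff (mirroring the 99% confidence level above) are my assumptions; Prof. Karger's 2008 spectral index is defined in his paper and differs in detail.

```python
# Illustrative sketch of flagging differential proteins from spectral
# counts. The index convention and data are assumptions for
# demonstration, not Prof. Karger's published definition.
from scipy import stats

def differential_proteins(normal, cancer, alpha=0.01):
    """normal/cancer: {protein: [spectral counts per sample]}.
    Returns proteins whose counts differ at the given alpha.
    Note: screening many proteins at alpha=0.01 admits ~1% false
    positives by chance, hence the follow-up network analysis."""
    hits = []
    for protein in normal.keys() & cancer.keys():
        t, p = stats.ttest_ind(normal[protein], cancer[protein])
        if p < alpha:
            # Signed, normalized difference: +1 = cancer-only,
            # -1 = normal-only (one simple index convention).
            n_mean = sum(normal[protein]) / len(normal[protein])
            c_mean = sum(cancer[protein]) / len(cancer[protein])
            index = (c_mean - n_mean) / (c_mean + n_mean)
            hits.append((protein, index, p))
    return hits

normal = {"P1": [5, 6, 4, 5], "P2": [0, 1, 0, 0]}
cancer = {"P1": [5, 5, 6, 4], "P2": [9, 11, 10, 8]}
print(differential_proteins(normal, cancer))  # P2 flagged, index ~ +0.95
```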
The move down to single cells was reported in a poster from Lukas
Galla and colleagues at Bielefeld University (Germany). They used
laser-induced fluorescence (λex = 266 nm) to detect a green fluorescent
protein (GFP)-labeled protein (γ-PKC [protein kinase C])
from a single cell of the insect Spodoptera frugiperda. The cell was
lysed with a 2500-V pulse at the head of a 3-cm-long CE separation
capillary. The S/N was more than 10.
As the sun sets on the serum protein era, we have daybreak over
the silo of GC×GC (aka comprehensive GC and multidimensional GC). Although multidimensional LC has been used for about 20
years, GC×GC has languished for nearly 50. The first report of two-dimensional GC
was by M.C. Simmons and L.R. Snyder in 1958.¹ Until very recently,
GC×GC appeared to be a curiosity looking for a compelling application.
Analysis of petroleum and petroleum streams produced pretty
and inspiring chromatograms, but little more. The GC×GC track of
the Joint Congress was driven by applications including:
- Assay of edible oils for adulterants
- Screening of food supplements for active and illicit analytes
- Sports doping
- Assay of sulfur compounds in coal and heavy petroleum
- Assay of essential oils in fragrances
- Assay of enantiomeric purity.
Instrumentation has improved sufficiently that it is now easier and
quicker to use GC×GC for a difficult separation than to optimize a
long run for a difficult pair. Plus, the ability to concentrate analytes
during the run improves method detection sensitivity. For example,
a poster presented by Brian Barnes (Seton Hall University, South
Orange, NJ) reported development of a solid-phase microextraction
(SPME)-GC×GC-time-of-flight (ToF)-MS assay that detects cocaine and salvinorin
A at 41 and 33 ppb, respectively, in urine. This is significantly more sensitive
than other assays. It was pointed out that GC×GC-ToF-MS is particularly
powerful for assaying drug metabolites, even in fast metabolizers.
Two lectures focused on GC×GC technology that can dramatically
increase separation speed. Prof. R.J. Simonson (Sandia National
Laboratories, Albuquerque, NM) needed to miniaturize a GC and
generate 100 peaks per second for military applications.
The goal was a portable instrument that could assay
airborne toxins in 4 sec at the 1-ppt level. This requires producing
50 peaks per second using air as both the sample
stream and the carrier gas, a peak production rate that is only possible with
GC×GC. The final design used a micromachined column set: a 90-cm
first-dimension column and a 30-cm wax-coated second-dimension column
machined into a 1-cm² wafer. Injection and flow modulation are
controlled by micromachined valves.
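A quick back-of-the-envelope shows why only GC×GC can meet such a target; the 20 × 10 split in the last line is an illustrative assumption, not Sandia's actual figure.

```latex
% Target from the talk: a 4-sec analysis at 50 peaks/sec
n_c \approx 50\,\mathrm{s^{-1}} \times 4\,\mathrm{s} = 200
% Comprehensive 2-D peak capacity is (ideally) the product of the
% per-dimension capacities, so the target is reachable as, e.g.,
n_{c,\mathrm{2D}} = n_{c,1} \times n_{c,2} \approx 20 \times 10 = 200
```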
If GC×GC is powerful, what about adding even another GC stage?
After all, in LC, up to six stages have been commercialized. For
triple-stage GC, Prof. Robert E. Synovec (University of Washington,
Seattle) estimated that the second stage of the GC should
have a 4-sec run time, and the third stage would need to be complete
in about 200 msec. He started experimental work with the
third stage and found that commercial big-box instruments were
not fast enough to be useful.
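The 20-fold step between stages follows from the usual modulation-sampling rule of thumb: a complete analysis in dimension n+1 must fit several times inside one peak from dimension n. The 3–4 samples per peak below is the conventional assumption, not a number Prof. Synovec quoted.

```latex
% Each added stage must run fast enough to sample upstream peaks:
t_{n+1} \approx \frac{w_{\mathrm{peak},\,n}}{3\text{--}4},
\qquad \frac{t_2}{t_3} = \frac{4\,\mathrm{s}}{0.2\,\mathrm{s}} = 20
```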
This led to a critical review of the sorry state-of-the-art in big-box
GCs. Prof. Synovec focused on extracolumn band broadening for low-k′
peaks. An unretained peak (k′ = 0.0) could be as narrow as 5 msec
with no column installed, but the minimum width that could be measured with a
big box was about 2 sec, even with flame ionization detection (FID).
When a column is added (40 m × 180 μm), the peak width should
be 400 msec, but the measured width is 3 sec. While the extracolumn
band broadening may not be objectionable in some applications, in
GC×GC it really slows things down. In one example, the column set
could deliver a peak production rate of 1000 peaks per minute, about 10 times
more than is commonly achieved with GC×GC.
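The impact is easy to quantify because peak variances add in quadrature; the back-calculation below uses the 400-msec expected and 3-sec measured widths quoted above, and the arithmetic is mine rather than a figure from the talk.

```latex
% Observed variance = column variance + extracolumn variance:
\sigma_{\mathrm{obs}}^{2} = \sigma_{\mathrm{col}}^{2} + \sigma_{\mathrm{ec}}^{2}
\;\Rightarrow\;
\sigma_{\mathrm{ec}} \approx \sqrt{3^{2} - 0.4^{2}} \approx 2.97\,\mathrm{s}
% i.e., essentially all of the observed width is extracolumn.
```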