The second annual meeting of the Society for Laboratory Automation and Screening (SLAS 2013), held January 12–16, attracted 3800 scientists and engineers to the Gaylord Hotel in Orlando, FL, for an opportunity to talk shop on laboratory automation and screening. A day earlier, High-Content Analysis (HCA 2013) had closed in San Francisco, CA. The two meetings have much in common, but there are also significant differences.
I’ve been attending meetings on laboratory automation for more than 25 years. In the early years, automation focused on improving turnaround time in clinics and hospitals by transporting samples to a central lab through pneumatic tubes (~3 in. diameter), the same ones that were used in department stores to process cash transactions. Whoosh! To the lab. Whoosh! Clang! Results went back to the nursing station via the return tube or by telephone. Later, fax machines replaced the return lines. Testing protocols were generally manual, except for a few assay panels that were run on segmented flow analyzers (aka bubble machines) using technology developed by Technicon Corporation (Tarrytown, NY) in the 1960s.
Automation in the chemistry and biochemistry lab developed along a different track. Automation was generally an accessory to the chromatograph, spectrometer, etc. Digital integrators with thermal printers were the output devices. A bit later, larger production labs used computers (such as the PDP-8) to process and report data via teletype. The motivation was to reduce labor cost per sample. But along the way, scientists began to recognize that automation often led to better-quality data, with improved ability to discriminate. Today, over half of the chromatographs use dedicated automated samplers, even when labor is cheap.
Fast-forwarding to the present, chemistry and biochemistry are advancing at an exponential pace. Advances in analytics now give detailed chemical profiles of single living cells. The cells can be challenged with materials from million-compound libraries. IC50 and LD50 values are computed and tabulated. Human eyes need only focus on the extremes and, occasionally, the errors. Informatics summarizes the data into groups by activity and common mechanism of action. Structure–activity relationships are essential in lead optimization.
This was the focus of two meetings in early January, which differed primarily in scale. The 10th High-Content Analysis meeting at the Fairmont Hotel in San Francisco attracted about 300 multidisciplinary scientists involved in core facilities designed to process screens of 10,000 wells or more, generally with several industrial robots. Outputs are in terabytes per year and occasionally per experiment. Globally, there are only a few hundred of these facilities. Major applications are lead discovery and evaluation for pharmaceuticals, cancer, and genetics research.
The SLAS 2013 meeting opened January 12th at the Gaylord Hotel in Orlando, FL. The typical attendee works in a research center, often in a lab supporting the mission of the firm, or in a lab led by a senior principal investigator. Typical experiments involve fewer than 5000 wells. Samples are processed with benchtop liquid handlers. This year, however, both meetings shared a common concern about data quality and relevancy. All too often, results from the experiments produced false positives and negatives, and in vitro toxicity predictions did not correlate with in vivo results.
Assay integrity was the most common concern shared by HCA and SLAS. For example, cardiac myocytes plated on the bottom of a well, i.e., in 2-D, died with nearly all compounds in one screen. However, when the screen was run on a suspended cluster of myocytes, LD50 values were higher (less toxic), often by 2–4 logs. This is prompting a large technology shift to 3-D samples, which require new imagers. In addition, 3-D assays may be run for several days and occasionally for weeks; this is called 4-D screening. Other important experimental factors include edge effects, the effect of metal ions, defects in multiple-well plates, and liquid handling technique.
Four-dimensional assays follow the growth of live and often label-free cells. Between scans, the plates are usually in a warm incubator, typically at 37 °C. Prof. Peter O’Brien of University College (Dublin, Ireland) noticed that results from wells on the perimeter of the plate were showing up as outliers. In one study of renal cells, the cell count for wells on the plate edge decreased 46%. At the corners, the decrease was 72%. He investigated further and found that wells along the edge also suffered from evaporation of the liquid; wells at the corners lost about half their volume after four days. O’Brien warned that the parameters most sensitive to edge effects are nuclear area, membrane permeability, cell proliferation, and mitochondrial membrane polarization. He collaborated with Prof. Anthony M. Davies (also of University College) in evaluating the use of thixotropic gels to reduce evaporation while allowing addition of nutrients and removal of waste.
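The edge-well pattern O’Brien describes can also be screened for computationally. The following is a minimal sketch (the well signals and plate format are invented for illustration, not data from the talk) that partitions a plate’s wells into corner, edge, and interior groups and compares their mean readouts:

```python
# Sketch: flag possible edge effects in a multiwell plate by comparing
# mean signal in corner and edge wells against interior wells.
# Geometry assumption: a 96-well plate laid out as 8 rows x 12 columns.

def classify_well(row: int, col: int, n_rows: int = 8, n_cols: int = 12) -> str:
    on_row_edge = row in (0, n_rows - 1)
    on_col_edge = col in (0, n_cols - 1)
    if on_row_edge and on_col_edge:
        return "corner"
    if on_row_edge or on_col_edge:
        return "edge"
    return "interior"

def group_means(plate):
    """plate: dict mapping (row, col) -> signal; returns mean per group."""
    sums, counts = {}, {}
    for (r, c), value in plate.items():
        g = classify_well(r, c)
        sums[g] = sums.get(g, 0.0) + value
        counts[g] = counts.get(g, 0) + 1
    return {g: sums[g] / counts[g] for g in sums}

# Toy data mimicking the reported pattern: interior ~100, edges -46%, corners -72%
plate = {(r, c): {"corner": 28.0, "edge": 54.0, "interior": 100.0}[classify_well(r, c)]
         for r in range(8) for c in range(12)}
print(group_means(plate))
```

A large gap between the interior mean and the edge or corner means is a cue to inspect the plate layout before trusting perimeter wells.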
Idiosyncrasies of liquid handling
A lecture by Dr. Sean Ekins of Collaborations in Chemistry (http://www.collabchem.com/) compared drug activity results obtained with tip versus acoustic pipetting technology and showed that acoustic pipetting generally produced fewer false positives and stronger apparent potency as measured by IC50 values.1 This indicates that tip-based liquid handlers suffer from loss of drug during the aspirate-and-dispense cycle; adsorption on or in the tip seems the most likely cause. Since the volumetric technology used for many assays is seldom reported, is it valid to integrate or compare data from different sources? Is current liquid handling technology suitable for the intended use?
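A back-of-the-envelope calculation shows why compound loss in the tip inflates apparent IC50 values. In the sketch below, the loss fraction, transfer count, and IC50 are illustrative assumptions, not figures from the lecture:

```python
# Sketch: how a fixed fractional compound loss per tip transfer shifts the
# apparent IC50 measured against NOMINAL (intended) concentrations.
# The 30% loss and 100 nM IC50 below are assumed for illustration.

def apparent_ic50(true_ic50_nM: float, loss_fraction: float, n_transfers: int = 1) -> float:
    """If each aspirate/dispense cycle retains (1 - loss_fraction) of the
    compound, the actual well concentration is nominal * retained, so the
    nominal concentration producing 50% inhibition is inflated by 1/retained."""
    retained = (1.0 - loss_fraction) ** n_transfers
    return true_ic50_nM / retained

# A compound with a true IC50 of 100 nM, losing 30% per transfer over two
# serial transfers, appears roughly 2x weaker on the nominal scale:
print(round(apparent_ic50(100.0, 0.30, 2), 1))  # → 204.1 (nM)
```

The effect compounds with each serial transfer, which is one reason serially diluted tip-based screens can diverge sharply from acoustic ones.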
Lest you think that contactless dispensing is the preferred mode, Dr. Nadine Losleben et al. of Roche Diagnostics (Mannheim, Germany) compared two contactless dispensers. One was positive displacement and the other was time–pressure dependent. They found that performance varied with the properties of the liquid, especially temperature and viscosity, over a range of 4–20 °C. Low-viscosity liquids had more satellite drops, which scattered easily, producing a sample loss of 7.4%. Temperature changes of only 1 °C can cause more than a 3% error in volume.
Because most screening compounds dissolve readily in it, compound libraries are generally dissolved in DMSO (dimethyl sulfoxide) for dispensing. DMSO’s viscosity differs from that of water, which raises concern about performance during aspirate-and-dispense cycles with pipettors. Artel (Westbrook, ME) introduced a dual-dye volume measurement kit for its Multichannel Verification System (MVS). This gives the user a quick way to verify the performance of liquid handlers at volumes as small as 10 nL in 384-well plates; previously, the lower limit was 30 nL. For 96-well plates, the volume range is 100–999 nL.
Pipet tips and pins
Without exception, the assays described at SLAS and HCA involved at least one liquid transfer. Larger-volume transfers were performed with plastic tips. Smaller volumes were handled with pins, small plastic tips, or acoustically, as described above. Questions: Can the plastic tips be reused? If not, should they be cleaned before disposal? For the pins, what about carryover? IonField Systems Inc. (Moorestown, NJ) introduced the TipCharger™, which uses a plasma to clean tips, cannulae, and pins; the plasma volatilizes residues quickly. Cleaned products are free of the surface defects that can lead to errors from microbubbles. High-volume labs report payback in a year or less.
Contamination of results by metal ions
A poster at SLAS from Qing-Fen Gan et al. of Hoffmann-La Roche (Nutley, NJ) reported that Zn²⁺ caused false positives in high-throughput screens. Specifically, Zn ions react with thiols to form products that are often insoluble and can be scored as positives. These false positives were leading to bad structure–activity relationships. A retrospective analysis of 175 screens found 41 in which the hit rate for Zn-containing compounds was at least 25%. Gan also warned that, since metal ions are seldom specified, other metals may perturb results. He recommended chelators such as ethylenediaminetetraacetic acid (EDTA) or N,N,N′,N′-tetrakis(2-pyridylmethyl)ethylenediamine (TPEN) to sequester free metal ions.
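A retrospective check like the one described can be expressed in a few lines. The sketch below applies the 25% hit-rate threshold from the poster to invented screen records (the data and names are illustrative only):

```python
# Sketch: flag screens where Zn-containing compounds hit at >= 25%,
# the threshold used in the retrospective analysis described above.
# The screen records here are invented illustrations.

def zn_hit_rate(results):
    """results: list of (compound_id, contains_zn: bool, is_hit: bool)."""
    zn_hits = [hit for _, contains_zn, hit in results if contains_zn]
    return sum(zn_hits) / len(zn_hits) if zn_hits else 0.0

def flag_screens(screens, threshold=0.25):
    """Return names of screens whose Zn hit rate meets the threshold."""
    return [name for name, results in screens.items()
            if zn_hit_rate(results) >= threshold]

screens = {
    "screen_A": [("c1", True, True), ("c2", True, True),
                 ("c3", True, False), ("c4", False, False)],  # Zn rate 2/3
    "screen_B": [("c5", True, False), ("c6", True, False),
                 ("c7", False, True)],                        # Zn rate 0/2
}
print(flag_screens(screens))  # → ['screen_A']
```

Flagged screens would then be candidates for re-testing with a chelator such as EDTA or TPEN in the buffer.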
Detection of manufacturing defects in multiple-well plates
A group from the Translational Research Institute of the Scripps Screening Center (Jupiter, FL) reported on their use of the Brooks Plate Auditor™ to find and document defects in multiple-well plates used for compound management and screening. Air bubbles trapped in the interstitial area of 384-well plates resulted in warping of the plates. This produced a 1.4-mm gap between the plate holder and plate bottom at one edge. A warp of this magnitude would make the plate unsuitable for imaging. In 1536-well plates, they found evidence of bent pins in the injection mold, which gave an offset of 0.28 mm, apparently at the top. Normal well-to-well distance is 2.25 mm. The bend could move the well bottom even more.
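To put the reported numbers in perspective, a quick calculation expresses the 0.28-mm offset as a fraction of the 2.25-mm well pitch (the arithmetic is mine, not from the report):

```python
# Sketch: express a measured well-position offset as a fraction of the
# well-to-well pitch. Values are the 1536-well figures quoted above.

def offset_fraction(offset_mm: float, pitch_mm: float) -> float:
    """Fraction of the well pitch consumed by the positional offset."""
    return offset_mm / pitch_mm

frac = offset_fraction(0.28, 2.25)
print(f"{frac:.1%}")  # → 12.4%
```

An offset of roughly an eighth of the pitch at the top of the well, possibly larger at the bottom, explains why such plates fail imaging applications.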
For all of the above reasons, and probably much more invisible tribal knowledge, assays are unique. Assays that work on a small scale often present problems when transitioned to HCA. Instruments are different; workflows need to be different. Successful laboratory models use two systems: one for assay development and the other for HCA screens.
This is a general introduction of common concerns and advances reported in January 2013. For a more focused study of the HCA 2013 meeting, please see http://www.americanlaboratory.com/913-Technical-Articles/130104-High-Content-Analysis-HCA-2013-3-D-Reduces-the-Gap-Between-In-Vitro-and-In-Vivo/.
1. Ekins, S.; Olechno, J. et al. Dispensing Processes Impact Computational and Statistical Analyses; lecture at SLAS 2013, Jan 6, 2013.
Robert L. Stevenson, Ph.D., is a Consultant and Editor of Separation Science for American Laboratory/Labcompare; e-mail: email@example.com.