Henry Ford quipped, “There are no big problems; there are just a lot of little problems.” In the laboratory, everyday “little” problems can be costly. Best practices for improving workflow and financial performance are given below.

Avoid inventory mismanagement

When laboratories unexpectedly run out of a consumable, they need to reorder quickly and often pay an additional fee for expedited delivery, which is sometimes called hot-shotting.

For the most part, inventory is fairly predictable within a lab that repeatedly runs certain tests and has consistent workflows. For example, consider a disposable well tray: lab staff who routinely analyze samples for E. coli collectively know what is required for each workflow and how many tests are run annually, and thus should know how many trays to keep in inventory. This is sound budgeting based on patterns of projected use.

But budgeting is only the first step in disciplined inventory management. The lab must also track what has been used, when and by whom. This information is important because high-demand items in the lab often disappear. Consider gas chromatography vials, which are frequently stowed for future use. If a shortage is discovered, the vials have to be ordered quickly to minimize disruption, and the lab may pay as much as double the price to accelerate delivery.

That better budgeting and tracking are important may be obvious, but accomplishing this is not easy. Many labs rely on spreadsheets for inventory management, introducing time-consuming data entry, as well as human error, into daily work. Furthermore, spreadsheets are simply not dynamic enough to establish a management system that enables planning and budgeting. With so much at stake, new approaches to inventory are critical.
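The tracking and planning described above can be reduced to a simple reorder-point calculation. The sketch below is illustrative only; the item names, usage rates and lead times are assumptions, not figures from any particular lab.

```python
# Minimal reorder-point check (illustrative values, not real lab data).
# An item is flagged when projected stock at the supplier's lead time
# would fall below the lab's safety threshold.

def needs_reorder(on_hand, weekly_usage, lead_time_weeks, safety_stock):
    """Return True if stock will dip below the safety level before resupply."""
    projected = on_hand - weekly_usage * lead_time_weeks
    return projected < safety_stock

inventory = {
    # item: (on_hand, weekly_usage, lead_time_weeks, safety_stock)
    "GC vials":      (400, 120, 2, 200),
    "96-well trays": (150,  10, 3,  50),
}

for item, params in inventory.items():
    if needs_reorder(*params):
        print(f"Reorder {item}: projected shortfall before next delivery")
```

A LIMS performs this kind of projection continuously against actual consumption records, which is precisely what a static spreadsheet cannot do.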

Spot analytical trends early and often

Correcting errors can be costly in terms of materials and time spent to resolve issues. Even seemingly inconsequential errors can mushroom into much larger, systemic quality issues or create productivity gaps that eventually require expensive reconfiguration. In a typical lab, where analysts run 50 or more tests each week and many results are still recorded on paper or in spreadsheets, it is nearly impossible to discern trends related to time or material overruns.

Figure 1 – SampleManager LIMS delivers a complete informatics solution for method execution, laboratory and data management, directly addressing the increasing pressures on companies to improve time to market, ensure compliance and realize cost savings in QA/QC and analytical labs. With an in-built Lab Execution System in the LIMS, lab managers gain complete control to manage their own methods and SOPs, without the need to purchase, integrate and validate additional software from multiple vendors.

It is unrealistic to expect lab analysts, managers and technicians to manually identify subtle patterns and trends in data, especially at the scale at which most labs operate. High-throughput laboratories benefit from statistical quality control (SQC) software that can identify out-of-conformance trends long before critical thresholds are reached. Some enterprise-level laboratory information management systems (LIMS) such as SampleManager LIMS (Thermo Fisher Scientific, Philadelphia, Penn.) include SQC functionality, eliminating the need to purchase, implement and validate SQC software sold separately (see Figure 1). Access to this type of trending data is crucial for laboratories engaged in analytical testing, and real-time monitoring using statistical algorithms is essential for decision-making. Analysts should look at data trends while an experiment is running, within the established workflow and while fully engaged in the project's standard operating procedures (SOPs).

Minor errors are not trivial—they can lead to lost productivity, product recalls and consumable waste. A LIMS with built-in SQC functionality can alert a fast-paced, high-performance (and maybe understaffed) lab to minor problems before they become major.
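One way SQC software catches a drift "before it becomes major" is with run rules of the kind popularized by the Western Electric handbook: a run of consecutive points on the same side of the process mean signals a shift long before any single point crosses a control limit. The sketch below shows one such rule; the run length and data are illustrative assumptions, not SampleManager's actual algorithm.

```python
# Illustrative SQC run rule: flag `run_length` consecutive measurements
# on the same side of the historical mean. Such a run suggests a process
# shift even though every individual point may still be within limits.

from statistics import mean

def run_rule_violation(values, run_length=8):
    """Return True if run_length consecutive points sit on one side of the mean."""
    center = mean(values)
    run = 0
    last_side = 0
    for v in values:
        side = 1 if v > center else -1 if v < center else 0
        run = run + 1 if side == last_side and side != 0 else 1
        last_side = side
        if run >= run_length:
            return True
    return False
```

In a LIMS with built-in SQC, a rule like this would fire an alert to the analyst at the bench, inside the established workflow, rather than being discovered weeks later in a retrospective review.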

Control SOPs

Developing and documenting SOPs is time-consuming, but it is important to ensure that processes are consistent across all personnel; that data is reliably captured; and that results are defensible to management, end users and regulatory authorities. Process inconsistency is a failure point in many labs; innovative workarounds that promise time savings rarely pay off in the long run. Innovations are acceptable only if they undergo rigorous SOP evaluation. SampleManager LIMS, for example, allows laboratories to define electronic SOPs (ESOPs) that establish preset workflows with clearly outlined technical corrective actions. If these ESOPs do not exist, or paper SOPs are not available or are not clearly understood, errors can easily occur as personnel try to establish their own procedures.

When developing ESOPs, thoroughness, standardization, distribution and compliance are paramount. Important parameters like lab performance and data defensibility depend on how successfully these concepts are addressed.
  • Thoroughness—The more detailed the process, the better. With ESOPs, there will be no question of process and no opportunity for technicians to add the small, ad hoc substeps that can often upset workflow continuity the most.
  • Standardization—Existing ESOPs should be updated. Any new workflow or SOP can be built into the LIMS, removing guesswork and making it easier for personnel to follow the procedures necessary to achieve standardization and data reliability.
  • Distribution—When the LIMS is managing workflow distribution based on ESOPs, the process can be dramatically simplified.
  • Compliance—Staff should understand and be accountable for each SOP—every step is important. Requiring analysts to enter data as they generate it, for example, not only reduces errors but allows trends to be identified and improves QA/QC. It also helps labs adhere to standards such as ISO 17025 and FDA and EPA requirements.

Uphold data defensibility

A laboratory may be responsible for hundreds of tests each week, and with each test a litany of information may be required. For example: Where did a sample originate? What is the instrument’s maintenance history? What reagents and standards are used for the test? When was the analyst last certified? Which vendor supplied the consumables?

Defending data in a lab without a LIMS usually involves painstakingly retracing steps, many of which are so embedded in the fabric of the lab and its workflows that it may be impossible to isolate them. Hours can be spent sorting through handwritten notes without identifying the problem. In an environment in which regulations change almost continuously, being able to defend data is no longer optional.

LIMS are used for much more than basic sample management and data reporting. Enterprise-level LIMS can integrate with data in manufacturing execution systems (MES), process industry modeling systems (PIMS) and other enterprise systems to simplify and accelerate data defensibility. Everything needed to defend a result is securely captured, stored and archived for rapid access, analysis and reporting.
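The questions listed earlier (sample origin, instrument history, reagents, certification, vendor) translate directly into the structured record a LIMS keeps for every result. The sketch below shows the shape of such a record; the field names and values are hypothetical, not an actual LIMS schema.

```python
# Hypothetical shape of a result's audit-trail record in a LIMS.
# Every field answers one of the questions a defensible result must
# be able to answer on demand; names and values are illustrative.

from dataclasses import dataclass
from datetime import date

@dataclass
class ResultRecord:
    sample_origin: str        # Where did the sample originate?
    instrument_id: str        # Which instrument ran the test?
    last_maintenance: date    # What is the instrument's maintenance history?
    reagent_lots: list        # What reagents and standards were used?
    analyst: str              # Who performed the analysis?
    analyst_certified: date   # When was the analyst last certified?
    consumable_vendor: str    # Which vendor supplied the consumables?

record = ResultRecord(
    sample_origin="Intake site 7",
    instrument_id="GC-02",
    last_maintenance=date(2013, 5, 14),
    reagent_lots=["LOT-1182", "LOT-0944"],
    analyst="J. Doe",
    analyst_certified=date(2013, 1, 9),
    consumable_vendor="Vendor A",
)
```

Because the record is captured as the work is done, defending a result becomes a query rather than an archaeology project.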

Keep up with instrument maintenance

Instruments that operate properly and without interruption are the lifeline of the laboratory, ensuring that forecast daily output is achieved, orders are filled and efficiencies are optimized. Data such as area counts, baseline conductivity and retention time provide valuable evidence which, if trended and analyzed, reveals a great deal about the health of an instrument. Regular instrument maintenance is a very important SOP, and enterprise-level LIMS enable managers to monitor instruments so that regular maintenance schedules can be effectively incorporated in the workflow. Notification of upcoming maintenance needs, even of anticipated part failure due to wear, allows maintenance to be scheduled before failure occurs, and any impact on output expectations to be considered. Signs of impending failure may not be readily apparent, but with a few easy steps using a LIMS, users can watch for deviations.
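Watching for deviations of the kind described above can be as simple as comparing recent runs against a validated baseline. The sketch below uses GC retention time as the health metric; the baseline, tolerance and run values are assumptions for illustration, not method-specific limits.

```python
# Hedged sketch: monitor an instrument health metric (here, retention
# time in minutes; all values illustrative) and flag drift beyond a
# tolerance band around the validated baseline, so maintenance can be
# scheduled before the method fails outright.

BASELINE_RT = 4.20   # validated retention time (assumed)
TOLERANCE = 0.05     # acceptable drift before intervention (assumed)

def maintenance_due(recent_runs, baseline=BASELINE_RT, tol=TOLERANCE):
    """Flag when the average of recent runs drifts out of tolerance."""
    avg = sum(recent_runs) / len(recent_runs)
    return abs(avg - baseline) > tol

print(maintenance_due([4.21, 4.24, 4.27, 4.29]))  # drifting upward
```

A LIMS applies this comparison automatically to every run it records, and can raise the maintenance flag while there is still scheduling room before failure.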

Conclusion

LIMS allow users to establish instrument maintenance schedules as part of their routine workflow, which can lead to better decision-making, improved efficiency from less downtime and, ultimately, better and more defensible results. Combined with disciplined inventory tracking, real-time SQC trending and electronic SOPs, a LIMS keeps the "little" problems little.

Trish Meek is director of product strategy, Informatics, Thermo Fisher Scientific, 1601 Cherry St., Philadelphia, Penn. 19102, U.S.A.; tel.: 215-964-6020; e-mail: [email protected]; www.thermoscientific.com/SM11