Evolving Software Enables Smarter Workflows: An Interview With Gene Tetreault of BIOVIA

The digital revolution in the laboratory began in the late 1960s, with digital integrators measuring the area of chromatographic peaks in microvolt-seconds. The next step was to use microcomputers, such as the PDP-8 from Digital Equipment Corp., to gather signals from a bank of GCs and provide a formatted report via a teleprinter.

Computer networks started to appear by the mid-1970s, allowing electronic reports to be routed to other computers, but usually only within the same facility. With the arrival of the Internet two decades later, report files could be sent globally. Soon after came the ability to access reports and historical data in real time with intercomputer communication. It was predicted that paper would be replaced by pixels, and to a large extent this has happened.

Laboratories have responded to the need for more data to support decision-making and research with ever-faster analytics. For example, in chromatography, subminute separations are now common, and various spectroscopic instruments are being used in real time. Next-generation sequencing (NGS) and results from ’omics technologies have placed more emphasis on the handling of very large data files that may span several evolutionary changes in operating systems. Metadata has become important, and natural language processing has opened the scientific literature to smart, convenient computer searching.

The ability to describe and model workflows and processes now supports interaction with data all the way from report to prediction. Advances in process analytics can provide near-real-time data that supports within-run optimization.

With this as a background, I interviewed Gene Tetreault, senior director of products and marketing for Dassault Systèmes BIOVIA (San Diego, Calif.).

RLS: We are currently at a convergence of high-throughput analytics with super-computing. How do you view this?

GT: Yes, at BIOVIA, we are impressed with the rapid growth of chemical and especially biochemical information, including next-generation sequencing. It makes little sense to have big databases unless you can use the data. Historically, Accelrys [San Diego, Calif.] served the laboratory space, while Dassault Systèmes [Vélizy-Villacoublay, France] is strong in the engineering space, especially 3D design, 3D digital mock-up and product lifecycle management. Together, we [BIOVIA is the combination of Accelrys and Dassault Systèmes] offer a complete package that covers discovery through product.

RLS: To me, data visualization is a weak point in the utility of large databases. What is coming?

GT: Yes, you have a point. The combinations and associations of data types are limitless these days. With the correct context, such as location, timing [and] control parameters, to name a few, powerful heat maps or spider graphs will provide new insight. But we need visualization tools that are compatible with more dimensions.

RLS: Process monitoring is a traditional role for analytics. Please tell us more about adding predictive capability for optimizing a particular run.

GT: Okay, let’s look at a couple of examples.

Example 1: With a proper correlation of empirical data from prior situations, we can imagine process monitoring that flags critical variations automatically, without having to program the scenarios ahead of time.

Example 2: Another possible use of this data is to tune the process dynamically to provide the desired outcome, such as the yield or the amount of critical material consumed in the process.

I think that these show the power of the confluence of high-speed analytics with super-computing.
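To make Example 1 a bit more concrete, here is a minimal, hypothetical sketch (not BIOVIA's implementation; the data values and function names are illustrative only) of how empirical data from prior runs could drive automatic flagging of critical variations in the current run:

```python
from statistics import mean, stdev

def control_limits(prior_runs, k=3.0):
    """Derive empirical control limits from measurements of prior runs."""
    flat = [x for run in prior_runs for x in run]
    mu, sigma = mean(flat), stdev(flat)
    return mu - k * sigma, mu + k * sigma

def monitor(stream, limits):
    """Flag in-run points that drift outside the empirical limits."""
    lo, hi = limits
    return [(i, x) for i, x in enumerate(stream) if not (lo <= x <= hi)]

# Hypothetical data: three prior runs and one current run with a late drift.
prior = [[10.1, 9.9, 10.0, 10.2], [9.8, 10.0, 10.1, 9.9], [10.0, 10.2, 9.9, 10.1]]
current = [10.0, 10.1, 9.9, 11.4]

print(monitor(current, control_limits(prior)))  # -> [(3, 11.4)]: flag the deviation
```

In practice, the same out-of-limit signal could feed a controller that adjusts the run on the fly, which is the dynamic tuning described in Example 2.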

RLS: Can we expect similar advances in the clinic?

GT: This is happening now with NGS. The analytics is currently ahead of the computing, but this will change very quickly.

RLS: What is your view of the new human interfaces such as Google Glass (Google, Mountain View, Calif.) and the Apple Watch (Apple Inc., Cupertino, Calif.)? What impact do you foresee, particularly in the laboratory space?

GT: Wearable devices and the Internet of Things (IoT) will increase the amount of available data. Additionally, these devices will seamlessly combine the view of the physical world with data overlays that enhance what a scientist sees: a scientist can look at a sample or a device and instantly see additional information, immediately augmenting his or her understanding of the situation. For example, wearable devices could present a quick display of trend lines and control charts; instead of having to look up the run history, the analyst could see how the current run compares with prior runs.

RLS: How will instrument design respond to changes in computing and communication?

GT: Instruments will automatically connect to the Internet, and their data will be made available to vendors for diagnostics or to the business for better utilization. This connection will also provide seamless integration with scientists as they perform experiments and tests.

RLS: In addition to the predictive capability, what will the lab of the future be like?

GT: Wi-Fi has reduced communication constraints, but this has only partly played out. Each instrument or workstation will self-identify and report its status to the analyst and other stakeholders. This will be integrated with the LIMS and the BIOVIA Unified Lab to keep track of samples, reagents, staff and deadlines. I anticipate that self-checking will become more sophisticated, including cross-checking that the proposed sample and method are consistent.

Out-of-specification results and quality audits will be anticipated and supported over the product lifetime, which may span several operating systems, from the instrument to the enterprise level.

RLS: How will laboratory staff be affected?

GT: Electronic notebooks will become more useful in the lab, [and] not just in stationary situations; mobility will finally enable the elimination of paper. In addition, near-field communication (NFC) will provide detailed logs of people and events. Today this is partially implemented with RFID [radiofrequency identification] chips and fingerprint readers. Future developments will include biometric information, such as an iris scan, or a personal ID bracelet that automatically logs the analyst in.

Further out, heads-up displays will read the method, guide the lab tech to the next step and illustrate the desired process. This is already being implemented with personal eyewear such as Google Glass. This may lead to live video exchanges with colleagues, which could be very useful in method transfers and in maintaining proficiency across many sites.

RLS: What about sustainability? I understand that in 2013, the electrical power consumed by data farms was about 2% of the global energy diet, and the doubling time is less than two years. Storing a genome consumes about 7 kilowatt-hours. Is this sustainable?

GT: Moore’s law applies to energy efficiency as well. We will need new, more energy-efficient storage devices, and these are in development. Spinning disks will be replaced, hopefully soon, by solid-state memory, which should offer greatly improved speed, reliability and energy efficiency.
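As a purely illustrative back-of-the-envelope check on the figures quoted in the question (about 2% of global electricity in 2013, doubling roughly every two years; these are the interviewer's numbers, not verified data), a short sketch shows why that growth rate cannot continue unchecked:

```python
# Illustrative extrapolation using the assumed figures from the question:
# ~2% of global electricity in 2013, doubling about every two years.
base_share, base_year, doubling_time = 0.02, 2013, 2.0

for year in (2013, 2017, 2021, 2025):
    share = base_share * 2 ** ((year - base_year) / doubling_time)
    print(f"{year}: ~{share:.0%} of global electricity")
# The share soon exceeds anything plausible, which is why per-bit
# energy efficiency has to improve at least as fast as data volumes grow.
```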

RLS: How do you see the evolution of future technology? I see a contrast with the way that sound recording, primarily music, jumps from one format to the next about every 10 years.

GT: I anticipate that laboratory communication will be slower to develop than consumer electronics. The market is smaller, and instruments have longer useful lives. I expect that adoption will be more a continuum than a disruptive breakthrough.

For example, wearable displays and heads-up safety glasses will probably evolve over decades. After all, despite all the advantages, adoption of electronic laboratory notebooks has been slower than many predicted. But the biometric bracelet or necklace may be a simple extension of the company ID card.

RLS: Social media use in the lab has lagged behind its adoption by consumers. What do you see?

GT: Social media could play a role in networking among users of particular instruments. Points of technique could be especially useful, and video communication could help in method transfer. Some of this is already done inside an enterprise, but I could see it being extended to consultants.

In forensic science, I can see that continuous recording via body cameras could help restore credibility, much as cell phones and body cameras are now used in policing. In the lab, the cameras could be tied to ELNs [electronic laboratory notebooks] so that visual evidence would corroborate the analytics, particularly in sample manipulation and preparation. The recent, well-publicized problems with crime labs must be addressed.

RLS: What should we lab rats be implementing now?

GT: Lab staff should be concerned about “truth and trust.” As laboratory work advances using ever more sophisticated technology, it is essential that we validate our results to support valid decision-making. Metrology without integrity is an oxymoron. Look for ways that computers can help systematize our work and results. Humans too often fail at routine tasks, but computers are not good at recognizing the unexpected. Clearly, good science needs both. So automate where you can, but do not lose sight of the process. Continue to look for opportunities for improvement. I think that in-process optimization routines are an excellent current opportunity.

Robert L. Stevenson, Ph.D., is Editor Emeritus, American Laboratory/Labcompare; e-mail: [email protected].