ELN in the Bioanalytical Laboratory
- xyli83
- Jun 1, 2017
- 4 min read
Toxicokinetics (TK) is the generation of kinetic data to assess systemic exposure and toxicity of a drug. These studies help relate the observed toxicity to the administered dose. TK evaluation is important in the drug development phase from both regulatory and scientific perspectives, and regulatory bodies such as the OECD publish guidelines for conducting TK studies in animals. TK evaluation supports the selection of dose, dosing form and alternative dosing routes, the evaluation of toxicological mechanisms, and the setting of safe dose levels for clinical phases. TK studies also help reduce the number of animals used (replacement, reduction and refinement). In drug discovery, TK data are used for purposes such as lead optimization and candidate selection.
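Systemic exposure in a TK study is typically summarized by metrics such as Cmax and the area under the concentration-time curve (AUC). As a brief, hypothetical illustration (the sampling times and concentrations below are invented, and this calculation is not part of the original discussion), a minimal sketch of the linear trapezoidal AUC calculation looks like this:

```python
# Minimal sketch (not from the article): estimating systemic exposure (AUC)
# from a hypothetical plasma concentration-time profile using the
# linear trapezoidal rule, a standard TK exposure metric.

def auc_trapezoidal(times_h, concs_ng_ml):
    """Area under the concentration-time curve (ng*h/mL)."""
    if len(times_h) != len(concs_ng_ml) or len(times_h) < 2:
        raise ValueError("need matched time/concentration lists of length >= 2")
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (concs_ng_ml[i] + concs_ng_ml[i - 1]) / 2.0
    return auc

# Hypothetical sampling schedule for a single dose group
times = [0, 0.5, 1, 2, 4, 8, 24]          # hours post-dose
concs = [0, 120, 310, 280, 150, 60, 5]    # ng/mL

print(f"Cmax = {max(concs)} ng/mL")
print(f"AUC  = {auc_trapezoidal(times, concs):.1f} ng*h/mL")
```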
Bioanalytical laboratories play a crucial role in the development of pharmaceutical drug products. Labs in large biopharmaceutical companies typically partner with other internal groups, such as pre-clinical toxicology and clinical pharmacology, to support their analytical requirements. Many, but certainly not all, of the sample analyses performed support pharmacokinetic (PK) and toxicokinetic (TK) studies. Some studies are exploratory, while most measure drug product properties for regulatory submissions.
In a medium-sized company, or at a smaller site of a large company, it is not uncommon to find these labs also supporting discovery research kinetics. In even smaller operations, data generation is most commonly outsourced entirely to a contract partner, though outsourcing is, of course, a growing trend across the full spectrum of organizations.
Bioanalytical departments are under tremendous pressure to increase efficiency and throughput, while at the same time maintaining compliance with good laboratory practice (GLP) regulations. Further adding to the stress on operations, their scientists are expected to keep up with leading-edge instrumentation technologies to develop new assays and improve sensitivity.
In the last few years, the drive to escalate the number of drug candidates has resulted in a growing number of pre-clinical and clinical studies. With labor being the major operational expense, increasing numbers of biological samples must be processed with the same level of staffing in a tight budgetary environment. For an internal operation, the cost structure must be competitive with contract research organizations. For chemically synthesized small-molecule drugs, liquid chromatography/mass spectrometry (LC/MS) has been the dominant analytical technique for many years; for large-molecule biologics, electrochemiluminescence and fluorescence instruments are the norm. Robotics for sample preparation and technologies that shorten analysis time (e.g., UPLC) have been adopted in the last few years to increase sample throughput at the bench. With the large volumes of data generated, a laboratory information management system (LIMS) is commonplace for sample and result management, the dominant product in this market space being Thermo Scientific's Watson LIMS.
Despite the millions spent on advanced analytical technology, robotics automation and LIMS, there are process bottlenecks that must be overcome to further streamline workflow and reduce cycle time. One of these is the wide-scale use of paper lab notebooks for proving regulatory compliance with standard operating procedures (SOP), developing methods, preparing solutions, quality controls (QC) and standards, documenting experiments, and other tasks. Combined with poor LIMS report automation and paper-notebook information silos, the generation and review of study reports is also labor-intensive. Samples are rapidly processed only to have the delivery of results logjammed.
Opportunities for deviations from established SOPs and the introduction of transcription and calculation errors increase with manual activities. Consequently, to minimize regulatory risk exposure, there are additional delays to double-check entries, perform peer reviews, and run final quality assurance checks. Our analysis indicates this has a 15 to 20 percent negative impact on overall efficiency. As sample loads increase further, additional personnel must be added to expand capacity if there are no process improvements.
Managing by exception
The IDBS BioBook deployment undertaken by Abbott Laboratories’ BioAnalytical department (Abbott Park, IL, and Ludwigshafen, Germany) specifically targeted laboratory throughput as the primary project justification.1 After an analysis of laboratory processes, the department concluded that reporting and reviews were the biggest barriers to expediting FDA submissions. Robotics, LC/MS, Watson LIMS, and NuGenesis Scientific Data Management System (SDMS) all helped to dramatically decrease cycle times over the last 10 years, but there was much room for improvement.
According to IT Program Manager, Dr. Yan Song, “The manual and paper-based processes added weeks to the release of a final report. There were too many reviews to capture any deviations that might have been introduced.”
If a deviation from policy was discovered, it was addressed, and the report had to undergo yet another review cycle. The Abbott team, therefore, decided to implement a two-pronged strategy: management by exception and report automation.
Management by exception is the philosophy of reviewing only deviations from established policies. For example, instead of a quality control review of every toxicokinetic study report, errors can be prevented or captured at the source. Any deviations are reviewed while the rest of the report passes through the process. Tracking nonconformities also allows management to apply lean practices to quality and continuous improvement. An ELN was, therefore, required to take advantage of its template, tracking and audit trail capabilities. According to Song, “SOP requirements are built into ELN templates that enforce capture of required information and perform any necessary calculations. All deviations in required fields are easily searched and tabulated.” There are built-in checks to alert a user before an error is accidentally introduced, such as warnings that a certificate of analysis (COA) has expired or that a balance check has to be performed. In some cases, where a deviation might be acceptable, it is flagged for an additional review by a principal investigator.
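As a rough sketch of what such template-driven checks look like in practice (the field names, tolerances and rules below are invented for illustration and are not Abbott's or IDBS's actual implementation), an entry either passes silently or yields a short list of deviations, some of which are routed to a principal investigator:

```python
# Hypothetical illustration of management by exception: template rules
# validate an entry at capture time, and only deviations are routed for
# review. Field names, rules and tolerances are invented for this sketch.

from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class Deviation:
    rule: str
    detail: str
    needs_pi_review: bool = False

@dataclass
class PrepEntry:
    analyst: str
    coa_expiry: date           # certificate of analysis expiration
    balance_checked_on: date   # last daily balance verification
    nominal_conc: float        # target standard concentration
    actual_conc: float         # concentration back-calculated from weights

def check_entry(entry: PrepEntry, today: date) -> List[Deviation]:
    deviations: List[Deviation] = []
    if entry.coa_expiry < today:
        deviations.append(Deviation("COA", f"certificate expired {entry.coa_expiry}"))
    if (today - entry.balance_checked_on).days > 1:
        deviations.append(Deviation("Balance", "daily balance check not performed"))
    # Example tolerance: +/- 5% of nominal; outside that, flag for PI review
    if abs(entry.actual_conc - entry.nominal_conc) / entry.nominal_conc > 0.05:
        deviations.append(Deviation("Concentration",
                                    "prepared concentration outside 5% of nominal",
                                    needs_pi_review=True))
    return deviations

entry = PrepEntry("jdoe", date(2017, 5, 1), date(2017, 5, 30), 100.0, 94.0)
for d in check_entry(entry, today=date(2017, 6, 1)):
    flag = " -> route to principal investigator" if d.needs_pi_review else ""
    print(f"[{d.rule}] {d.detail}{flag}")
```

An entry with no findings simply produces an empty list and flows straight through, which is the whole point of reviewing exceptions rather than everything.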
Abbott integrated BioBook with Watson LIMS for the exchange of sample lists for preparation and the creation of LC/MS sequence files that are sent to the instrument for execution. At the end of a study, a report is generated from Watson using UptoData’s iStudy Reporter. A future release of iStudy Reporter will pull study data from BioBook at the same time, but for now, notebook data are copied and pasted into the auto-generated MS Word file. The quality assurance department will simply review the exception and audit log in the ELN for any deviations, which has “eliminated weeks” from the old process, according to Song.
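The actual formats exchanged between Watson LIMS, BioBook and the acquisition software are vendor-specific, but the kind of handoff described above can be pictured with a generic sketch (the column names, plate layout and file formats below are assumptions, not the real interface): a LIMS sample list is transformed into an injection sequence file that the instrument software can execute.

```python
# Generic, hypothetical sketch of a LIMS-to-instrument handoff: read an
# exported sample list and write an injection sequence CSV. Column names,
# plate layout and file formats are assumptions for illustration only.

import csv

def write_sequence_file(sample_list_path: str, sequence_path: str,
                        method: str, plate: str = "1") -> None:
    with open(sample_list_path, newline="") as src, \
         open(sequence_path, "w", newline="") as dst:
        reader = csv.DictReader(src)              # expects SampleName, SampleType columns
        writer = csv.writer(dst)
        writer.writerow(["Index", "SampleName", "SampleType", "Method", "Vial"])
        for i, row in enumerate(reader, start=1):
            vial = f"{plate}:{i}"                 # naive sequential vial assignment
            writer.writerow([i, row["SampleName"], row["SampleType"], method, vial])

# Example usage (assumes a Watson-style export exists on disk):
# write_sequence_file("watson_export.csv", "lcms_sequence.csv",
#                     method="TK_Study_001")
```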