Fielding Questions about Data Integrity from Regulatory Inspectors
Streamline interactions with inspectors by taking steps to minimize errors, writing comprehensive SOPs, and more
Regulatory inspections can be stressful experiences for a lab manager. Aside from the inherent stress of hoping that the lab passes the inspection, lab managers are also tasked with answering questions and providing data in a timely, efficient manner to minimize disruption to lab operations. In this interview, Heather Longden, independent regulatory compliance consultant to the pharmaceutical industry, sheds light on the types of questions to expect, the importance of error documentation, addressing challenges associated with answering regulators’ questions, and more.
Q: For what types of inspections would one need to answer questions about laboratory data? Who are the inspectors?
A: Laboratory analytical data can be essential at various key points in the pharmaceutical or medical device development or manufacturing process:
- To assess the biological efficacy of the drug in clinical trials
- To verify the manufacturing process development, stability, shelf-life claims, and future analytical methods in the development phase
- As a critical assurance of quality during manufacturing steps and of the final packaged and delivered drug
This leads to regulatory health authorities having a keen interest in the data created in the first two steps above, in order to issue a license or authorization to put the drug on the market. Inspection of the facilities as well as the data are part of the “pre-authorization” process. The third step is essential for ongoing quality and may be inspected by regulatory health authorities at any regular/periodic GMP inspection.
Q: What kinds of questions would these inspectors ask and why?
A: Inspectors (or auditors from the businesses that are buying the product) are primarily concerned with any data that supports a “quality decision” that was either made about a drug in development or a drug already on the market. Inspections are typically scheduled to verify that data being generated by internal processes are accurate and truthfully recorded. This is true for manufacturing records as well as the analytical data that supports quality as described above.
A good data governance program would include these kinds of questions as part of regular internal audits, in order to proactively uncover and correct any concerns or onerous processes that make accurate data difficult to create, review, or retrieve.
In the GMP space, auditors and inspectors are keen to see how standard operating procedures (SOPs) are followed and documented, in order to prevent manufacturing errors or analytical errors, which could result in substandard or unsafe products being supplied to patients or consumers.
Typically, in a laboratory setting an inspector will be interested to see that equipment, instruments, and software are correctly operated and assured as “fit for purpose,” especially where the operation is automated. This is generally illustrated through documented validation processes that verify the correct operation periodically throughout the lifetime of the equipment use.
A big focus, however, is ensuring that all analysts and reviewers have been trained on the SOPs, have a proper understanding of the SOPs, and that they can document that all steps were followed correctly.
With regards to both equipment and staff, inspectors are particularly interested in how errors are documented, verified, and addressed within an appropriate response timeframe.
- Which mitigating procedures are in place to prevent errors occurring? (Proactive mitigation)
- When errors do occur, how are they corrected and/or mitigated to prevent further harm to patients? (Immediate remediation)
- How are error patterns trended/evaluated and reviewed to determine impact and build in process improvements? (Reactive mitigation)
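The trending step above can be sketched in a few lines. This is a hypothetical illustration only: the record structure (a date plus an error category) is an assumption for the example, not a prescribed deviation-log format from any regulation or lab system.

```python
# Hypothetical sketch of trending documented lab errors by category
# and month, so recurring patterns stand out for process improvement.
from collections import Counter
from datetime import date

def trend_errors(error_log):
    """error_log: iterable of (date, category) tuples.
    Returns a Counter keyed by (year, month, category)."""
    return Counter((d.year, d.month, cat) for d, cat in error_log)

# Example log entries (invented for illustration).
log = [
    (date(2024, 1, 5), "wrong column"),
    (date(2024, 1, 19), "wrong column"),
    (date(2024, 2, 2), "expired solvent"),
]
trends = trend_errors(log)
# A category repeating within one month suggests a systemic issue
# worth a built-in process improvement, not one-off remediation.
```

Even a simple tally like this makes the difference between isolated remediation and genuine trend review visible to a reviewer or inspector.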
In order to assess these steps, any auditor will typically be interested in tracing a particular analytical result—or, more likely, a particular error—to see how it was created, documented, and reviewed.
Q: Why are regulators interested in error handling?
A: Clearly, errors may lead to incorrect results, which could be hiding a failed product behind a passing result. It is important to see how lab staff reacted to an error, corrected it, and assessed its impact, as well as how they formulated the wider impact assessment or mitigation plan. Out of specification (OOS) results, repeat analyses, orphan (non-reported) data, error messages on instruments, results rejected by reviewers, and corrective and preventive actions (CAPAs) could all be part of a regulatory inspection, in addition to training records, SOPs, and validation documentation.
Orphan data, or data that does not trace directly to a reported result, may simply result from errors that were observed and corrected without a recorded error in the computerized system. The non-reported data are simply ignored, and a new, correct result is generated. However, these data could be an indication of more egregious analyst behavior that might lead to failing product tests being hidden rather than reported and reviewed.
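One way to surface orphan data is to reconcile the instrument acquisition log against the results actually reported. The sketch below assumes both are available as simple CSV exports with a shared `run_id` column; the file layout and column name are illustrative assumptions, not features of any specific lab data system.

```python
# Hypothetical sketch: flag orphan runs by comparing an instrument
# acquisition log against the reported-results export.
import csv

def find_orphan_runs(acquired_csv, reported_csv):
    """Return run IDs present in the acquisition log but absent
    from the reported results (candidate orphan data)."""
    with open(acquired_csv, newline="") as f:
        acquired = {row["run_id"] for row in csv.DictReader(f)}
    with open(reported_csv, newline="") as f:
        reported = {row["run_id"] for row in csv.DictReader(f)}
    return sorted(acquired - reported)
```

Any run ID returned by such a check is not proof of wrongdoing, but it is exactly the kind of unreported data an inspector would expect the lab to have found, investigated, and explained first.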
Q: Why are regulators concerned about error documentation?
A: The most obvious concern is that an error might allow a failing product to reach patients, whether the failure is a sub- or super-potent drug, a “higher than safe” level of an impurity, microbial contamination, or inadequate packaging to assure the safe stability of the drug on its journey to the patient.
Regulators are concerned that a more worrying scenario might be occurring. With expert users, the desire to not create any OOS or failing results could be pressuring them to hide any failing results—whether triggered by a simple error or a truly faulty product. The pressure may be because of basic metrics that penalize failing results or because investigating and documenting failed results is onerous and time consuming. Wishing that the failed result never happened could lead analysts, or entire lab cultures, to repeat the analysis until it passes—with whatever “adjustments” may be required—and might lead to failed product being packaged and introduced into the supply chain.
Investigating orphan data may well be the only way to see if a test that was not reported indicates a failed test or a failing product.
Q: What are common challenges in answering questions relating to laboratory errors?
A: There are three main challenges that can lead to difficulties answering questions relating to laboratory errors.
- Almost all analytical methodologies are a blend of manual preparation procedures and a manual, hybrid, or fully electronic measurement procedure
Computerized measurement systems that have been validated and contain the technological capabilities for authentication, proceduralization, automatic calculation, and generation of results, with changes that are documented and audit-trailed, are still operated with human intervention—and analysts could use them incorrectly.
Sample preparation can be automated to assist the analyst, but there remain many steps that the analyst must carry out to present the right sample and reagents to the automation robot.
There are also still analytical tests that are wholly reliant on the analyst for both test preparation and the measurement of results.
Whenever a manual procedure is required of the analyst, there is an equivalent need to document the actions of the analyst to have a fully documented process.
- Analysts are trained experts who may know more about the analysis than is written in the SOP
Often, analysts know more about the analysis than what is actually documented, especially in regard to addressing errors. Steps to troubleshooting and diagnosing an analytical error are unlikely to be recorded, and analysts may tend to just fix the error and forego documentation if they encounter an error they made or discover an erroneous analytical result. For instance, in high-performance liquid chromatography (HPLC), a trial injection will allow an experienced analyst to quickly determine if:
- The wrong method, column, or solvents have been used.
- The column has degenerated, or the solvents have expired.
- The sample has degraded.
- The whole system is not equilibrated.
- The pressure is too high, or the retention times are not correct.
- There is carry-over or air in the system.
- The integration parameters or component identification parameters are incorrect.
- The calibration curve is incorrect.
- The instrument is not configured correctly, or faulty.
- The connection to the data system is faulty.
Instead of initiating a full documented investigation, the expert analyst may make a quick note of it, but generally will just fix the error and carry on with their work. In other cases, they may reject the data (if any was collected) and simply repeat the test, hoping not to repeat the error they made previously.
A less experienced analyst, however, may not notice their own manual errors (because that detailed expert knowledge is unlikely to be recorded in the SOP), simply carrying on with the analyses. If the results are in specification, the error is extremely unlikely to be discovered. Incorrect results would be recorded, potentially “passing” a product that ought to have failed the analytical test designed to prevent that batch of drug from being shipped and could now be a hazard to patient safety.
If the results are OOS, however, a simple error is escalated into a far more significant event. OOS results are expected to be trended, fully documented, and investigated, and regulatory guidance details how they should be handled. An OOS investigation can be very lengthy, trigger significant scrutiny of the lab, and derail it from the day-to-day workload. Typically, the “root cause” of the OOS result is that the analyst did something in error; however, recording “human error” as a root cause is very much frowned upon as superficial and “too easy” an explanation.
In both cases there may be incorrect analytical data generated that cannot be relied on and should not be used to determine the quality of the product or process. Orphan data are only collated, secured, and investigated when a result transparently triggers an OOS.
- While most analysts are experienced experts in the lab processes, it is unusual for reviewers and approvers to be such experts
It is especially uncommon for reviewers to have equivalent expertise when they’re part of a non-lab function, such as QA, and/or have limited laboratory experience (possibly limited to a very specific area of analytical chemistry). Expecting them to be able to spot errors in analytical results is a big challenge. Often, final approval is performed on a very high-level summary of the results, not the original raw or processed data, which makes identifying errors impossible. This is why a peer review process, with analysts checking each other’s results, is critical.
Furthermore, reviewers will rarely be expert enough to search for orphan data that are related to a specific sample but ignored in the final report.
Q: How can such a variety of challenges be addressed?
A: As mentioned above, with manual or combined manual and electronic processes, a critical role of the reviewer and/or QA is to question every result and verify that it is truly representative of the complete data associated with that sample (the term “complete” was added under the “+” of ALCOA+ used to define data integrity; maybe it should have replaced the “c” representing “contemporaneous” in the original ALCOA definition). Trained reviewers with knowledge of lab SOPs, actual laboratory practices, specific analytical techniques, and specific commonly used lab software are much more likely to be able to spot errors and orphan data in the original data.
To ensure proper documentation of all errors, the OOS investigation SOPs should permit lab analysts to transparently and quickly document error mitigation, and lab managers or QA to simply affirm the correct invalidation of errored data, helping prevent failed results from being hidden from QA due to an over-onerous procedure or lab performance metric.
The most effective approach would be to prevent errors from occurring in the first place, which is why the “human error” root cause is frowned upon. Why did the analyst make a mistake? What was it about the setup of the laboratory or method that allowed the mistake to happen? Obviously, the easiest way to address this is to have clear and definitive SOPs and SOP training; including guidance in the SOPs on how to prevent and correctly mitigate simple errors is also critical. Those SOPs should be periodically revised with expert input to enhance analysts’ knowledge of possible analytical errors.
Automation may be the most effective mitigation possible. This might be via simple automation at key points in the procedure, most likely focused on preventing the analyst from making a bad choice in the software or instrument panel, or on minimizing the chance of an error in sample preparation/chemistry that is not recorded automatically in a computerized system.
The long-term solution would be to remove as many of the manual steps as possible by converting them into a validated autonomous process. This can be achieved using sample preparation robots that feed prepared samples directly into automatically prepared and set-up measurement instruments. While even these rarely result in a truly hands-off analysis (someone still needs to program the robots and software, as well as bring the sample and chemicals to the robot and analytical instruments), with each step towards automation and digitization, the opportunity for human error is reduced. This in turn removes a step that requires detailed expert review and approval, and eliminates another process subject to real or perceived concerns from auditors or regulatory inspectors.
Heather Longden is an independent regulatory compliance consultant to the pharmaceutical industry. She specialises in data integrity, regulatory compliance, data management, GLP and GMP, and computer system validation. She advises various regulatory groups, holds a leadership role on the GAMP® Americas steering committee, and serves on the ISPE Boston Area Chapter Board and Education Program Committee (EPC).