DATA INTEGRITY RESOURCE GUIDE
Safeguarding your Lab's Profitability and Reputation
Regulations, tools, and expert guidance on how to protect data assets
Highlights:
- Clarifying guiding principles
- Data backups: Why and how
- Tips for streamlining regulatory audits
Table of Contents
- Preserving Laboratory Data Integrity
- Managing the Integrity of Data
- What Lab Managers Should Know About Data Backups
- The Role of ELNs in Lab Data Security
- Fielding Questions about Data Integrity from Regulatory Inspectors
Introduction
Preserving Laboratory Data Integrity
Principles, tools, and expert advice
Safeguarding data integrity—the accuracy, consistency, and reliability of data—is multifactorial and involves robust security measures. Data integrity is critical to regulatory compliance, and secure, high-quality data is essential to preserve a lab's profitability and reputation.
There are a number of ways that data integrity may be compromised, including instrumentation errors, poor record keeping, human error, and cybersecurity threats. Maintaining data integrity is challenging and involves managing the interaction between laboratory personnel and technology.
The US Food and Drug Administration (FDA) introduced the ALCOA (Attributable, Legible, Contemporaneous, Original, and Accurate) and ALCOA+ (data is also Complete, Consistent, Enduring, and Available) principles to describe the most important data integrity requirements. The principles were originally intended for the pharmaceutical industry but have since been adopted by numerous others, including the healthcare and food and beverage industries.
Labs have access to a variety of tools to preserve data integrity. Quality management systems, data encryption tools, data backup and recovery systems, method validation software, and more can be implemented to safeguard data. Many modern instruments also have data integrity features such as electronic signatures and audit trails built in. With so many options, lab managers must carefully consider many factors to ensure the best fit for their lab's current and future needs. In addition to these tools, it is critical to ensure proper training of lab personnel on data integrity procedures and develop comprehensive documentation protocols.
These principles and tools should be part of a holistic solution for data security that encompasses effective data management practices and fosters a culture of transparency within the organization.
This resource guide reviews the ALCOA+ principles of data integrity and introduces the concept of "TRUST" to help labs establish effective practices. It explores data management requirements and solutions to help guide purchasing decisions and ensure the best fit for the lab, with particular focus on different data backup options and features to look for in an ELN platform. Also included is expert guidance on how to streamline daunting regulatory inspections.
Managing the Integrity of Data
Lab managers must refocus their efforts on managing data due to increased regulatory agency activity
by Dan Zuccarello
Managing data integrity has always been difficult. Technology evolves, more complex data management systems appear, and workers interface with systems in new ways. It has taken decades to develop and finalize regulations to ensure data integrity based on constantly changing conditions. Today, regulations require conformance to the principles of ALCOA (Attributable, Legible, Contemporaneous, Original, and Accurate). But even ALCOA continues to evolve. Additional concepts have created ALCOA+, and more are likely to come, making the process even more confusing and challenging. Perhaps a new acronym can provide additional clarity to this difficult topic: TRUST (Tangible, Reliable, Unique, Sustainable, and Tested). In this article, we'll explain how TRUST along with ALCOA can help ensure that your data systems meet integrity criteria.
Tangible
"If it's not documented, it didn't happen." As a lab manager, you are routinely asked to provide tangible documented evi- dence of data, which questions integrity. Regulatory agencies focus on electronic dynamic data and audit trail review. This is currently where many observations of non-compliance are being cited. A real-time review of electronic data and place- ment of review or approval signatures within the data record is required. Think of this as the "Thitnessed by:" that was in hard cover notebooks. Critical for lab managers is that older equipment may not meet current regulatory requirements. Previously, gaps in compliance to Part 11 and data integrity could be fixed procedurally. However, with stronger guid- ance as to dynamic data and audit trail review, these work- arounds are no longer permitted and are being cited during regulatory audits.
This change may force you to remediate observations by evaluating your current capabilities. Older software may actually have hidden or unknown audit or data review functions that were not tested in the original computer validation and that might now be used. If not, then consider software upgrades or new software or hardware that provides the functionality needed. Keep in mind that new equipment or software purchases may significantly impact your budgets.
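To make this concrete, here is a minimal sketch (not any vendor's implementation) of the kind of attributable, time-stamped, tamper-evident audit record that dynamic data review depends on; the `AuditTrail` class and its fields are illustrative only:

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only audit trail: each entry is chained to the previous
    entry's hash so that retroactive edits are detectable."""

    def __init__(self):
        self._entries = []

    def record(self, user: str, action: str, detail: str) -> dict:
        prev_hash = self._entries[-1]["hash"] if self._entries else "0" * 64
        entry = {
            "user": user,                                         # Attributable
            "timestamp": datetime.now(timezone.utc).isoformat(),  # Contemporaneous
            "action": action,
            "detail": detail,
            "prev_hash": prev_hash,
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self._entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the hash chain; any tampered entry breaks it."""
        prev = "0" * 64
        for e in self._entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev_hash"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True
```

Chaining each entry to the previous entry's hash means a retroactive edit anywhere in the trail causes `verify()` to fail, which is the property a purely procedural workaround cannot provide.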
Reliable
Data integrity is all about reliability. The computer systems, instrumentation, wiring, and connections must work together and be dependable. Otherwise, your laboratory may end up reviewing nothing but blank files. For systems to be reliable, it's important to generate a user requirements specification (URS) that suits your lab's infrastructure and workflows to inform the decision of which system to buy. However, this is often not the case, and post-purchase URS become more about making equipment work and not about the user's needs.
Developing URS before the purchase builds robustness and reliability into your system and vendor selection process.
Unique
Generating, storing, and protecting original data is the basic tenet of electronic records. Most systems can time and date stamp when additional files are created or copied from the original. To further protect the uniqueness of data and eliminate the potential to copy, alter, or delete raw data, it's possible to disable removable storage functionality like optical disk drives or USB ports so that external drives can't be used to copy data from servers. Inspectors review computer systems, directories, and histories to determine if the users are moving or copying data.
The unintentional creation of unsecured networks is a growing concern within a post-COVID, remote workforce. Remote workers wanting access to network applications through virtual private networks (VPN) may accidentally create open networks capable of scheduling and starting analyses, review, processing, and even printing of data. Open systems must meet different electronic signature requirements that likely were not considered during the closed system's validation.
Furthermore, personal computers may render critical systems vulnerable if used to transfer data between unregulated storage devices and your organization's hardware.
One suggestion for lab managers is to work with your IT departments to determine boundaries for using a VPN. It may become necessary to restrict access to critical validated software programs or confirm through testing that remote users cannot gain access, especially if your systems are to remain closed. However, if your operations are moving toward open systems and remote access, then review your current validation requirements and your organization's SOPs on computer system validation and risk assessment relative to open systems. Simple test scripts can be generated to confirm passwords, electronic signatures, and functionality of software applications. Finally, contact the software/hardware vendor to determine if open software functions exist, can be applied easily, and what security features they offer to protect critical aspects of the program.
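As an illustration of such a test script, the sketch below models a closed system with a toy `ClosedLabSystem` class (a stand-in for your validated application's API, not a real one) and confirms that a remote VPN session cannot start an analysis:

```python
import pytest

class ClosedLabSystem:
    """Toy model of a closed system: remote (VPN) sessions may read
    but may not execute. Replace with your application's real API."""

    def __init__(self, user: str, via_vpn: bool):
        self.user, self.via_vpn = user, via_vpn

    def start_analysis(self, method: str) -> str:
        if self.via_vpn:
            raise PermissionError("remote execution is not permitted")
        return f"{method} started by {self.user}"

def test_remote_user_cannot_start_analysis():
    session = ClosedLabSystem(user="remote_analyst", via_vpn=True)
    # A closed system should reject remote execution outright.
    with pytest.raises(PermissionError):
        session.start_analysis(method="assay_42")

def test_local_user_can_start_analysis():
    session = ClosedLabSystem(user="bench_analyst", via_vpn=False)
    assert "assay_42" in session.start_analysis(method="assay_42")
```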
Sustainable
Demonstrating that you can store, back up, and recover data from different servers or locations is critical to data integrity. Media aging should be considered, as backups are made to removable media and then stored at off-site locations. Aging studies may be required, as some locations, such as a bank vault, may not be environmentally controlled, which could impact the integrity of the data or the media itself.
The concept of sustainability also includes traceability of the validation data relative to URS and the functional and design specifications of the application. Creation of legacy systems and proper long-term storage of the workstation, applications, and data must be considered once equipment ages out or is replaced with updated versions.
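One lightweight way to keep that traceability auditable is to hold the URS-to-test links as data and flag gaps automatically; a small sketch with invented IDs, purely for illustration:

```python
# Minimal traceability matrix linking URS items to validation evidence.
TRACEABILITY = {
    "URS-001": {"requirement": "Audit trail records user, time, action",
                "tests": ["OQ-014", "PQ-003"]},
    "URS-002": {"requirement": "Electronic signatures applied at review",
                "tests": ["OQ-021"]},
    "URS-003": {"requirement": "Data recoverable from off-site backup",
                "tests": []},
}

def untested_requirements(matrix: dict) -> list[str]:
    """Flag URS items with no linked test evidence."""
    return [urs for urs, item in matrix.items() if not item["tests"]]

print(untested_requirements(TRACEABILITY))  # ['URS-003']
```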
Tested
Test scripts are vital to maintaining lab data integrity. The primary purpose of test scripts is to find errors before they happen, not to test your systems into compliance. Stress testing pushes every aspect of the electronic system to a point above where it is expected to operate normally. If a system is expected to be used at 40°C but is designed or only tested at room temperature, the system may fail once it's placed in the harsher environment. What if all of the equipment in your lab was scheduled to collect data at the same time at a high collection rate? The amount of data being generated and sent to the network may cause unknown stress on the system. Evaluate the potential load based on the number of systems, then test whether you can actually collect that level of information without error.
Test your computer's CPU, memory, graphics, and storage for potential stress errors on normal operations and data collection. This is particularly valuable if equipment has extremely fast collection rates. There are commercial programs that will perform millions of parallel operations concurrently to stress-test a computer and report the results.
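As a rough illustration of that kind of load evaluation (a toy model, not a commercial stress-testing product), the sketch below has 16 simulated instruments each generate a million readings concurrently and reports the slowest burst:

```python
import concurrent.futures
import time

def simulate_instrument(instrument_id: int, points: int) -> float:
    """Generate `points` readings as fast as possible and return the
    elapsed time, standing in for one instrument's burst load."""
    start = time.perf_counter()
    _data = [(instrument_id, i, 0.0) for i in range(points)]
    return time.perf_counter() - start

if __name__ == "__main__":
    # All instruments collecting at once at a high rate.
    with concurrent.futures.ThreadPoolExecutor(max_workers=16) as pool:
        timings = list(pool.map(lambda i: simulate_instrument(i, 1_000_000),
                                range(16)))
    print(f"slowest burst: {max(timings):.2f}s across {len(timings)} instruments")
```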
With the advent of an ever-increasing remote workforce and heightened regulatory scrutiny related to electronic records, laboratory managers must re-evaluate their data integrity programs. Risk assessments may find that day-to-day operations or procedural fixes must be revised or eliminated to become compliant.
Evaluating and managing your programs based on the concepts of TRUST should provide an additional framework that, along with ALCOA, can help effectively manage the integrity of your laboratory's data.
What is ALCOA+?
Data integrity refers to the completeness, consistency, and accuracy of data.1 The FDA released the ALCOA+ principles to describe the most important data integrity requirements.
Attributable
Data should be attributable to the individual who created it.
Legible
Data should be clear and easy to read by anyone reviewing it.
Contemporaneously recorded
Data should be recorded in real-time.
Original
Whenever possible, the original record should be preserved.
Accurate
Data should reflect actual observations or measurements.
Additions to create ALCOA+:
Complete
Data should include all the information needed for interpretation.
Consistent
Data should be consistent within and across datasets.
Enduring
Data should be kept for the entire period specified by regulations or protocols.
Available
Data should be accessible whenever needed.
1. https://www.fda.gov/media/119267/download
What Lab Managers Should Know About Data Backups
A hybrid backup solution could be one of the best investments for the lab
by Holden Galusha
While IT system administrators handle backups at most organizations, it is prudent for lab managers to understand what kind of backup options exist so they can advocate for the option that best suits their lab. To maximize the benefits of having a robust backup system, lab managers must understand that backups are not just a necessary expense but a strategic investment in the lab's future.
What characterizes a robust backup system?
Comprehensive: All sources of data in the lab—LIMS, ELNs, instrument-generated data, administrative records, client account information, project correspondence, etc.—should be backed up. A comprehensive approach encompassing everything from experiment data to administrative records will ensure that no critical aspect of the lab's operations is vulnerable to data loss.
Consistent: Backups must occur regularly to be effective. Restoring from an outdated backup will result in data loss. To facilitate consistent backups, it is important that they not be reliant on any one individual carrying them out manually. Automating your lab's backups is vital to ensuring consistency.
Automated: Automated backups are more reliable, as there is no risk of anyone forgetting to run the backup program. They are also much more efficient, freeing staff time and effort for higher-value activities. (A minimal automation sketch follows this list.)
Tested: A backup solution is worth nothing if not regularly tested. Lab managers should build backup testing into their security SOPs.
Redundant: It is vital that backup data exists in multiple places. In typical modern labs, there are local and cloud backups.
Local backups occur on the lab's hardware, such as a server or network-attached storage. Often, another copy of the data is written to other media, such as tape, and then delivered to a secure storage facility within driving distance of the lab to act as an offsite backup. Cloud backups funnel data over the internet to a server managed by a cloud service provider. While cloud backups can be seen as another type of offsite backup, they come with additional benefits, such as easy accessibility.
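Here is a minimal sketch of that automation: a dated, checksummed archive job meant to be scheduled externally (for example via cron or Task Scheduler). The paths and file naming are illustrative only:

```python
import hashlib
import tarfile
from datetime import date
from pathlib import Path

def backup(source: Path, dest_dir: Path) -> Path:
    """Archive `source` into a dated tarball and record its SHA-256
    alongside it so later restores can be verified."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    archive = dest_dir / f"lab-data-{date.today():%Y%m%d}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source, arcname=source.name)
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    archive.with_suffix(".sha256").write_text(digest + "\n")
    return archive

if __name__ == "__main__":
    # Illustrative paths; schedule this script rather than running it by hand.
    backup(Path("/data/lims"), Path("/backups/local"))
```

The same archive can then be replicated to tape or pushed to a cloud bucket for the offsite copies described above.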
Local backups vs cloud backups
There are two basic destinations for backups: local storage and cloud storage. Each approach has pros and cons that will be weighed differently depending on your organization's unique needs, wants, and financial standing.
LOCAL BACKUPS

| Pros | Cons |
| --- | --- |
| More configuration/control over the backup solution and its storage hardware | Vulnerable to local disasters (malware, floods, fires) |
| Lower associated operating expenses | Higher initial capital investment |
| Not dependent on the internet | Requires more effort to manage and maintain |
| Rapidly recover large data volumes | Does not scale easily |
CLOUD BACKUPS

| Pros | Cons |
| --- | --- |
| Easy to implement | Dependence on web connectivity |
| Offsite by nature, protecting against local disasters | Slower to recover large datasets |
| Scalable (pay for more storage) | Higher associated operating expenses |
| Lessens the burden of management | Requires careful qualification of service vendors |
Questions to consider when testing backups (a restore-verification sketch follows this list):
- Can all data be restored across every onsite and remote backup avenue?
- Once restored, is any of the data corrupted?
- Does restoring from a backup have unforeseen effects on the lab's broader digital infrastructure?
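A hedged sketch of the second check above: comparing restored files against checksums captured at backup time. The manifest format here is invented for illustration:

```python
import hashlib
from pathlib import Path

def verify_restore(restored_root: Path, manifest: dict[str, str]) -> list[str]:
    """Return the relative paths whose restored contents no longer match
    the SHA-256 recorded when the backup was taken."""
    corrupted = []
    for rel_path, expected in manifest.items():
        target = restored_root / rel_path
        if not target.exists():
            corrupted.append(rel_path + " (missing)")
            continue
        if hashlib.sha256(target.read_bytes()).hexdigest() != expected:
            corrupted.append(rel_path)
    return corrupted
```

Running a check like this after every test restore turns "is any of the data corrupted?" from a judgment call into a pass/fail result.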
Best of both worlds: A hybrid backup solution
If your budget allows, implementing a hybrid backup solution that uses both local and cloud systems is a very robust approach. While it may be the costliest option, it fits all the above characteristics of what a backup solution should be and allows a degree of customization, flexibility, and redundancy that neither local nor cloud backups offer independently.
Advantages of a Hybrid Backup Solution
While a hybrid solution may be the most expensive approach initially, its scalability, flexibility, and redundancy justify the cost for most labs. It's not merely a safeguard against data loss—it's a strategic investment that hedges the lab's resilience and longevity against threats of all kinds.
Regulatory flexibility
Meet regulatory requirements without compromising the data's safety—more sensitive data can be stored locally while the rest is backed up to the cloud.
Redundancy
Two-way protection schema with data stored locally and in the cloud.
Cost-effective scalability
The fixed cost associated with local storage increases only if the lab outgrows the server, while the recurring cost associated with cloud storage scales in step with needs and budget.
Flexible security
Cloud storage makes backups impervious to local disasters. Local backups prevent loss by cybercriminals attempting to sabotage cloud backups, are suitable for information too sensitive for cloud upload, and offer more flexible user configuration capabilities.
Accessibility
If a local disaster wipes out backups, cloud access is invaluable.
The Role of ELNs in Lab Data Security
Electronic lab notebooks enable reproducible research with secure data
by Gail Dutton
It's easy to imagine the data in your lab is safe from outside threats like viruses and hackers because it is protected by a layer of enterprise-level security. The top electronic lab notebooks (ELNs) limit that risk by taking a zero-trust approach in which all users are authorized, authenticated, and continuously validated before they can access the applications and data. They also provide user access controls, traceability, and data encryption.
Industry standards and best practices
One of the overarching protections ELNs provide is compliance with industry standards and best practices, as well as with relevant government regulations. In the US, this means the Federal Risk and Authorization Management Program (FedRAMP). This program standardizes a risk-based data security approach for cloud service providers throughout the federal government.
Also expect an ELN to comply with the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) standard 27001 for information security management. It advocates a robust, holistic risk management approach to current and future threats applying to people and technologies, as well as data security policies.
Add-on modules enable some ELNs to comply with industry-specific regulations, but the onus is still on the user to meet relevant compliance standards. Life science labs, for example, can benefit from an ELN that helps them become compliant with 21 CFR Part 11 in the US and Annex 11 in the EU.
An ELN should take a compliance-by-design approach in the software development lifecycle, providing features that help users meet industry standards and regulatory commitments. This can include an audit trail, electronic signatures, time stamps, and other security features that ensure any changes to data can be traced back to the individual who made the changes and when they were made. As projects advance, adherence to these regulations assures regulators that the submitted data is reliable.
"One of the overarching protections ELNs provide is compliance with industry standards and best practices, as well as with relevant government regulations."
Automatic backups protect data
Ideally, ELN data will be backed up automatically to the cloud and cross-replicated to multiple data centers in different geographic locations. This enables data retrieval from another site if there is a problem in a different location. Note that data privacy and storage regulations shift depending on where a data center is located, so users should seek providers that have data centers located in their country. ELN cloud storage providers should host data centers in North America, Europe, Asia, and Oceania to reach as many users as possible.
Lab data also should be encrypted. Industry best practices call for using the Advanced Encryption Standard (AES) with 256-bit keys (AES-256). It is outlined in FedRAMP and ISO 27001 and approved in the US for top-secret government information. On top of that, another protocol, Transport Layer Security (TLS), further protects data on the move. The latest version, TLS 1.3, was released in 2018.
This multi-tier encryption thwarts man-in-the-middle attacks, which, as the name implies, surreptitiously reroute or alter data in transit.
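For data at rest, here is a minimal sketch using the third-party `cryptography` package's AES-GCM interface (an authenticated 256-bit mode; any given ELN vendor's actual scheme and key management will differ):

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # AES-256 key
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # must be unique per message
ciphertext = aesgcm.encrypt(nonce, b"chromatogram raw data", None)

# Decryption fails loudly if the ciphertext was tampered with.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"chromatogram raw data"
```

TLS protection for data in transit comes from the connection layer (for example, an HTTPS client), not from application code like this; the two tiers are complementary.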
Cloud storage
Large cloud platform providers like Amazon Web Services put comprehensive data security measures in place for their ELN customers. Because providing cloud services is their core business, they have the resources and expertise to ensure it is the best in the world.
Because many ELNs provide their applications as a service (Software-as-a-Service, or SaaS), the threat of security risks caused by unpatched software is reduced. Security doesn't depend upon whether lab personnel have time to perform software updates themselves. For onsite deployments, software updates should always be installed on the local IT infrastructure.
In a virtual, cloud-based environment, a customer's instance of the ELN can be independent of those of other clients. Each instance is like a safety deposit box inside a bank vault. Although many boxes share the vault, only specific users can access an individual box. This type of ELN deployment further increases the level of security.
Archiving
Data from an ELN has value to multiple business units outside the laboratory, and that value can persist for years. Rules for data retention may differ across industries, but five years is often the minimum requirement.
Ideally, an ELN will not delete any data for the duration of a client's account with the service. Therefore, users can see the evolution of the data as it is accessed, analyzed, augmented, and compiled into reports, as well as how, when, and by whom it was changed.
Role-based access control
ELNs also help lab managers control users' access to data. For example, a lab intern may be able to read data but not change it, and scientists and their collaborators may access some projects but not others. Controlling access is important not only from a security aspect but also in terms of managing the risks of errors or modifications that may affect data integrity.
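A toy sketch of that kind of role-based access control; the role names and permission sets are invented for illustration:

```python
from enum import Enum, auto

class Permission(Enum):
    READ = auto()
    WRITE = auto()
    SIGN = auto()

# Illustrative role-to-permission mapping.
ROLES = {
    "intern":    {Permission.READ},
    "scientist": {Permission.READ, Permission.WRITE},
    "reviewer":  {Permission.READ, Permission.SIGN},
}

def can(role: str, permission: Permission) -> bool:
    """Check whether a role carries a given permission."""
    return permission in ROLES.get(role, set())

assert can("intern", Permission.READ)
assert not can("intern", Permission.WRITE)   # read-only, as described above
```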
Mobile security
Mobile apps that function as a companion to browser-based ELNs are still relatively uncommon. By accessing their ELN from their phones or tablets, scientists can easily check protocols and make notes at the bench without having to manually transcribe those notes, thus minimizing potential errors. Because these notations are synchronized with the ELN in real time, data is current and reliable.
The ability to segment access to data and reports provides the security labs need to allow collaborations with external organizations—including companies with which they sometimes may compete—while protecting proprietary information.
Authentication methods can be both secure and user-friendly. Single sign-on authentication technologies let users access many digital platforms with one log-in credential while providing multi-factor authentication. Typically, a user may sign in with a username and password and authenticate themselves with a biometric identifier (like a face or fingerprint scan), a PIN, a one-time code sent to a smartphone, or a physical token.
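One-time codes of this kind are commonly generated with the TOTP algorithm from RFC 6238; here is a compact standard-library sketch (SHA-1, six digits, 30-second window), shown for illustration rather than as any ELN's actual mechanism:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period          # current time step
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0]
            & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# Demo with an illustrative base32 secret.
print(totp("JBSWY3DPEHPK3PXP"))
```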
For added protection, a session timeout feature ensures that files aren't left open indefinitely, thus limiting the ability of an unauthorized party to access or alter data surreptitiously if the user walks away. Human-readable formats also make it easier for users to spot entries that may be amiss.
"With AES-256 encryption, even if TLS is breached, the data is still secure. It's like having a lockbox inside an armored car."
The same security features that manage access to the browser version of the ELN—notably, multi-factor authentication—also should be in place for mobile apps. For example, a mobile authentication flow could require users to log into the ELN and generate a single-use, time-sensitive QR code that they then scan with their camera before accessing the ELN and signing into their appropriate sections.
Fielding Questions about Data Integrity from Regulatory Inspectors
Streamline interactions with inspectors by taking steps to minimize errors, writing comprehensive SOPs, and more
by Heather Longden
Regulatory inspections can be stressful experiences for a lab manager. Aside from the inherent stress of hoping that the lab passes the inspection, lab managers are also tasked with answering questions and providing data in a timely, efficient manner to minimize disruption to lab operations.
In this interview, Heather Longden, independent regulatory compliance consultant to the pharmaceutical industry, sheds light on the types of questions to expect, the importance of error documentation, addressing challenges associated with answering regulators' questions, and more.
Q: For what types of inspections would one need to answer questions about laboratory data?
A: Laboratory analytical data can be essential at various key points in the pharmaceutical or medical device development or manufacturing process:
- To assess the biological efficacy of the drug in clinical trials
- To verify the manufacturing process development, stability, shelf-life claims, and future analytical methods in the development phase
- As a critical assurance of quality during manufacturing steps and of the final packaged and delivered drug
Q: What kinds of questions would these inspectors ask and why?
A: Inspectors (or auditors from the businesses that are buying the product) are primarily concerned with any data that supports a "quality decision" that was either made about a drug in development or a drug already on the market.
Inspections are typically scheduled to verify that data being generated by internal processes are accurate and truthfully recorded.
A good data governance program would include these kinds of questions as part of regular internal audits, in order to proactively uncover and correct any concerns or onerous processes that make accurate data difficult to create, review, or retrieve.
In the GMP space, auditors and inspectors are keen to see how standard operating procedures (SOPs) are followed and documented. Ultimately, this ensures that substandard or unsafe products are not supplied to patients or consumers. A big focus is ensuring that all analysts and reviewers have been trained on the SOPs.
In a laboratory setting, an inspector will be interested to see that equipment, instruments, and software are correctly operated and assured as "fit for purpose," especially where the operation is automated.
With regard to both equipment and staff, inspectors are particularly interested in how errors are documented, verified, and addressed within an appropriate response timeframe, asking, for example:
- Which mitigating procedures are in place to prevent errors occurring? (Proactive mitigation)
- When errors do occur, how are they corrected and/or mitigated to prevent further harm to patients? (Immediate remediation)
- How are error patterns trended, evaluated, and reviewed to determine impact and build in process improvements? (Reactive mitigation)
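For the last question, even a simple tally over the error log can reveal patterns worth investigating; a toy sketch with invented log entries:

```python
from collections import Counter
from datetime import date

# Illustrative error log: (date observed, error category).
error_log = [
    (date(2024, 5, 2), "mislabeled vial"),
    (date(2024, 5, 9), "integration parameter typo"),
    (date(2024, 6, 1), "mislabeled vial"),
    (date(2024, 6, 14), "mislabeled vial"),
]

# Count recurrences per category, most frequent first.
for category, count in Counter(c for _, c in error_log).most_common():
    print(f"{category}: {count}")
```

A recurring category (here, the hypothetical "mislabeled vial") is exactly the kind of trend inspectors expect to see feeding back into process improvements.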
Q: Why are regulators interested in error handling?
A: Errors may lead to incorrect results, which could be hiding a failed product behind a passing result. It is important to see how lab staff reacted to an error, corrected it, and how they assessed its impact, as well as how they formulated the wider impact assessment or mitigation plan.
Orphan data
Orphan data, or data that does not trace directly to a reported result, may result from errors that were observed and corrected without a recorded error in the computerized system. The data are ignored, and a new, correct result is generated.
What's the risk? These data could indicate other egregious analyst behavior that may lead to hidden failing product tests that are not reported or reviewed.
Q: Why are regulators concerned about error documentation?
A: The most obvious concern is that an error might allow a failing product to reach patients. Regulators are also concerned that a more worrying scenario might be occurring. With expert users, the desire to not create any out-of-specification (OOS) or failing results could be pressuring them to hide any failing results—whether triggered by a simple error or a truly faulty product.
Q: What are common challenges in answering questions relating to laboratory errors?
A: There are three main challenges that can lead to difficulties answering questions relating to laboratory errors:
- Almost all analytical methodologies are a blend of manual preparation procedures and a manual, hybrid, or fully electronic measurement procedure
- Analysts are trained experts who may know more about the analysis than is written in the SOP
- While most analysts are experienced experts in the lab processes, it is unusual for reviewers and approvers to be such experts
Q: How can such a variety of challenges be addressed?
A: With manual or combined manual and electronic processes, a critical role of the reviewer and/or QA is to question every result and verify that it is truly representative of the complete data associated with that sample. Trained reviewers with knowledge of lab SOPs, actual laboratory practices, specific analytical techniques, and specific commonly used lab software are much more likely to be able to spot errors and orphan data in the original data.
To ensure proper documentation of all errors, the OOS investigation SOPs should permit lab analysts to transparently and quickly document error mitigation, and lab managers or QA to simply affirm the correct invalidation of errored data. This helps prevent failed results from being hidden from QA by an over-onerous procedure or lab performance metric.
The most effective approach would be to prevent errors from occurring, which is why the "human error" root cause is frowned upon. Why did the analysts make a mistake? What was it about the setup in the laboratory or the method that allowed them to make a mistake? The easiest way to address this is to have clear and definitive SOPs and SOP training; including guidance in the SOPs on how to prevent and correctly mitigate simple errors is also critical. Those SOPs should be periodically revised with expert input to enhance the analyst's knowledge about possible analytical errors.
Automation may be the most effective mitigation possible. This might be via simple automation at key points in the procedure, most likely focused on preventing the analyst from making a bad choice in the software or instrument panel, or on minimizing the chances of manual error.
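A minimal sketch of such an automated guard: rejecting instrument parameters that fall outside SOP limits before collection starts. The parameter names and limits are invented for illustration:

```python
# Hypothetical pre-run check against limits written into the SOP.
SOP_LIMITS = {
    "column_temp_c": (25.0, 40.0),
    "flow_ml_min": (0.1, 2.0),
}

def validate_run(params: dict[str, float]) -> list[str]:
    """Return a list of violations; an empty list means the run may start."""
    violations = []
    for name, (low, high) in SOP_LIMITS.items():
        value = params.get(name)
        if value is None or not (low <= value <= high):
            violations.append(f"{name}={value} outside [{low}, {high}]")
    return violations

# A typo (flow of 10 instead of 1.0) is caught before any data is collected.
print(validate_run({"column_temp_c": 30.0, "flow_ml_min": 10.0}))
```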
Product Spotlight
Drive Lab Digital Transformation with Clinisys Laboratory Solution™
Clinisys Laboratory Solution™ is an out-of-the-box (OOTB) SaaS laboratory information management system (LIMS) designed to drive digital transformation in the lab. Clinisys Laboratory Solution™ delivers greater efficiency, scalability, and heightened analytical insights to future-proof your informatics investment, whilst reducing total cost of ownership.
Transform your lab workflows and processes with configurable industry packages that solve the specific needs of labs operating in different sectors, disciplines, and specializations across environmental, water, wastewater, food safety & sustainability, healthcare, life sciences, and public health & safety.
Clinisys enables healthier and safer communities as a global provider of intelligent laboratory informatics solutions and expertise that redefine the modern laboratory across healthcare, life sciences, and public health and safety. Millions of laboratory results and data insights are generated every day using Clinisys' platform and cloud-based solutions in over 4,000 laboratories across 39 countries.