If you are interested in data integrity, you have likely read many articles on the topic. This blog delivers 7 new insights into data integrity, looking at it from different angles. Seven positive ways to think about a subject that is often underestimated or dismissed as fuzzy. Seven tips beyond the theory, to give you a head start in your data integrity project plans. And seven reasons to convince your management that data integrity is vital and worth investing in.
In today’s world, data is the driving force in business. Collecting and processing (huge amounts of) data is not just the business model of the Googles and Facebooks in this world. Every business, including yours, revolves around data.
The concept of collecting and processing data has been applied in the Life Sciences world (and pharmaceuticals in particular) for many decades. Data is collected throughout the product lifecycle: research data, clinical data, manufacturing data, and product data, to name a few. This data documents the quality and efficacy of the product. We are already accustomed to writing everything down in procedures, utilizing good documentation practices, and implementing system lifecycles. But is this enough for data integrity? Maybe not.
Sure, we could show you the (high) numbers of FDA warnings concerning data integrity issues. And often these warnings are the trigger for companies to start working on data integrity. However, this is a negative trigger and may result in quick fixes, produced in a pressure cooker, that address the specific findings instead of establishing comprehensive data governance and fully understanding the processes and data flows.
Only looking at potential FDA warnings may convince you or your management to act right now, but it will not give you business value in the long run. Instead, we present seven positive benefits your company will realize when data governance is fully applied.
The topic ‘Data Integrity’ is on everybody’s mind. But how do you measure what you are already doing and what you should be doing? Using a popular tool like a maturity model will help management to assess the current situation (your baseline) and define the desired situation. The ISPE GAMP Guide: “Records and Data Integrity” has defined the Data Integrity Maturity Model, which you can put to good use.
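To make this concrete, here is a minimal sketch of how a baseline assessment could be recorded and compared with the desired state. The dimension names and the 1-5 scores are purely illustrative; the GAMP maturity model defines its own levels and assessment criteria.

```python
# Minimal sketch of a maturity baseline, assuming a simple 1-5 scale and
# illustrative dimension names; the GAMP Data Integrity Maturity Model
# defines its own levels and criteria.
current = {"procedures": 2, "technology": 3, "behavior": 1, "data lifecycle": 2}
target = {"procedures": 4, "technology": 4, "behavior": 3, "data lifecycle": 4}

# The gap per dimension shows where improvement plans should focus first.
gaps = {dim: target[dim] - current[dim] for dim in current}
for dim, gap in sorted(gaps.items(), key=lambda item: item[1], reverse=True):
    print(f"{dim}: current {current[dim]}, target {target[dim]}, gap {gap}")
```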
How do you define the right measures and controls to improve data integrity in your company? To understand data integrity, you need to understand what data you are collecting and the interaction of your processes with this data.
By creating detailed process maps you will gain insight into the various flows within and between your processes. By analyzing the data inputs and outputs for each process step you can start building your data maps.
Ask yourself questions such as: what data is created at each step, where does it come from, how is it used and transformed, and where does it go next?
Maps can be extended by adding details about the user roles involved: who is performing an action, when, and where. You can add even more value by not only documenting the process as designed, but also tracking the actual performance of your process by defining data points as measurements and placing them in your maps.
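As an illustration only, one entry of such a map could be captured in a simple structure like the sketch below; the step, roles, systems, and measurements are hypothetical, not a prescribed format.

```python
from dataclasses import dataclass, field

# Illustrative sketch of one entry in a data map; all names are hypothetical.
@dataclass
class ProcessStep:
    name: str                     # e.g. "Weigh raw material"
    performed_by: str             # user role executing the step
    system: str                   # where the action takes place
    data_in: list[str] = field(default_factory=list)
    data_out: list[str] = field(default_factory=list)
    measurements: list[str] = field(default_factory=list)  # data points for actual performance

weighing = ProcessStep(
    name="Weigh raw material",
    performed_by="Operator",
    system="Balance + MES terminal",
    data_in=["batch record", "material ID"],
    data_out=["net weight", "timestamp", "operator ID"],
    measurements=["time to complete step", "number of re-weighs"],
)
print(weighing)
```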
Once you have insight into your processes and data, you can start defining the steps to implement process improvements. You will see where process and data controls can be applied. With both process and data maps, plus a baseline maturity level, you are practically fully informed about your current data integrity status. Now you are ready to move on to creating specific improvement plans.
Looking to the future, process and data maps can be used for more purposes than just gaining insight into your operations. By looking beyond the basics, setting up and maintaining good process and data maps will enable you to gain maximum benefit and prepare for future practice. For example, by digitizing your process maps you can create a digital representation of your company. This is known as a digital twin, and there are many ways of using this in PHARMA 4.0. To give one example, by using a digital twin to simulate your operational processes you can virtually try out various scenarios for process improvements. This will help you better estimate the outcome of a proposed change.
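To make the simulation idea tangible, here is a deliberately simplified sketch that compares the current process with a proposed change using made-up cycle times; a real digital twin would of course be far richer.

```python
import random

# Toy comparison of two scenarios for one process step; all numbers are invented.
def simulate_batches(n_batches: int, mean_cycle_h: float, spread_h: float) -> float:
    """Return the average cycle time (hours) over n simulated batches."""
    times = [random.gauss(mean_cycle_h, spread_h) for _ in range(n_batches)]
    return sum(times) / len(times)

random.seed(42)
current = simulate_batches(1000, mean_cycle_h=12.0, spread_h=2.0)    # process as-is
proposed = simulate_batches(1000, mean_cycle_h=10.5, spread_h=1.5)   # after proposed change
print(f"Estimated average cycle time: current {current:.1f} h, proposed {proposed:.1f} h")
```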
Guidelines on data integrity, e.g. the WHO Guideline on Data Integrity or the MHRA 'GXP' Data Integrity Guidance and Definitions, focus on three elements, or baskets if you like, to address for data governance: Technology, Behavior, and Procedures. The inspectors have also issued a guidance, PIC/S Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments, which addresses these baskets. More than that, it emphasizes appropriate organizational culture and behaviors.
Your efforts to improve data integrity should never be focused on a single basket. Spreading your efforts will not only create more effective controls; the controls will also enhance and reinforce each other. Let's use these three baskets, and make it as simple as '1-2-3'.
The take-away message here is this: when identifying appropriate controls, be sure all three baskets are covered.
A cross-functional team, including the process owner, data owners, and possibly Quality, IT, and Engineering, can help you carry the baskets.
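As a simple illustration of that take-away, a planned set of controls can be checked for basket coverage; the controls listed below are invented examples.

```python
# Hypothetical list of planned data integrity controls, each tagged with its basket.
controls = [
    ("audit trail review SOP", "Procedures"),
    ("restricted admin access on the LIMS", "Technology"),
    ("data integrity awareness training", "Behavior"),
]

baskets = {"Technology", "Behavior", "Procedures"}
covered = {basket for _, basket in controls}
missing = baskets - covered
print("All baskets covered" if not missing else f"Missing baskets: {', '.join(sorted(missing))}")
```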
It is important to have all necessary procedural and technical controls in place, but without the awareness of your employees regarding the importance of data integrity, these controls will not be as effective, or may even fail.
Many companies are still focused on creating procedures and implementing technology as the sole means to ensure data integrity. All too often, the context of why these procedures and technologies exist is assumed, or even considered irrelevant. But you can probably think of an example in your company where the 'creativity' of employees has circumvented or ignored these controls: when there is no easy way, one will be created.
In an organization where data integrity is a 'way of living', employees are aware of the importance of data integrity and understand how it impacts day-to-day operations. This enables them to spot issues and report them so they can be resolved quickly, no harm done.
Not every (harmful) situation can be explicitly described in a procedure. But by understanding the context in which procedures are used, rather than just blindly following them, employees can identify potential data integrity issues. An organization that creates a high level of data integrity awareness, by training all employees to understand why data integrity matters and how procedures and technology help them achieve it, will benefit from good data governance and from the reporting of (potential) issues for improvement.
Data integrity requirements can be found in all GxP regulations, from GLP to GVP. Data governance should overarch all phases of the healthcare product lifecycle: from product discovery and research to final drug surveillance, and from data input to data retention and removal.
Define in each phase of the lifecycle, and for each process, what data falls under regulations and what data is critical and should be kept. For an even greater insight, also document what data you will not collect and store, including the reason.
Think about how this data is created and how and where it is used, changed, and transformed throughout its lifetime. For each of these steps, create the appropriate data integrity controls. Don't forget to consider when and how to archive and ultimately remove the data. Having a clear understanding of when to remove data will ensure your data remains relevant and reduces e-waste. It also reduces unnecessary exposure of outdated data in the case of, for example, a legal hold.
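A minimal sketch of how such lifecycle decisions could be written down per data set is shown below; the data set name, retention period, and removal method are invented for illustration.

```python
from dataclasses import dataclass

# Illustrative only: record, per data set, why it is kept and for how long.
@dataclass
class DataLifecycle:
    data_set: str
    regulation: str        # which GxP requirement drives retention
    critical: bool
    retention_years: int   # after this period, archive review and removal
    removal_method: str

batch_records = DataLifecycle(
    data_set="electronic batch records",
    regulation="GMP",
    critical=True,
    retention_years=10,          # invented value; use your own regulatory retention requirement
    removal_method="documented destruction after QA approval",
)
print(batch_records)
```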
The essential message here is: understand what data you need and why you need it. Apply this to every process in your company. Make it a habit not just to talk about processes but also about the data, especially with the key stakeholders. The next time a new computerized system is implemented that introduces a new set of data into your organization, discuss and determine not only the system lifecycle but also the data lifecycle, preventing data integrity issues after go-live.
We as humans are not good at consistent execution and consistent data capturing. Sorry to pop the bubble, but that is reality. But we are great at analyzing and critical thinking!
Automated systems, on the other hand, are really good at consistent execution and data capturing, but epically fail at creatively solving your issues. Make sure to use each resource to the best of its capabilities.
You should realize that manual entries can never be 100% flawless and, as such, form a weak link in your chain of data processing. Even the 4-eyes principle (a witness or reviewer) helps only to some extent: vision gets blurred after many repetitions or under stress. (Have you also seen colleagues signing blind or assuming it is okay?) That is why automation is recommended.
Moving from manual or even hybrid data capturing to automated data capturing will not only improve your data integrity levels, it will also create an environment for future enhancements. Consistent data capture and integrity of your data are, for example, prerequisites for moving towards PHARMA 4.0.
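As a sketch of the difference, an automated capture routine can attach the contextual metadata (who, what, when, source) consistently every time, which manual transcription cannot guarantee; the instrument interface and identifiers below are hypothetical.

```python
from datetime import datetime, timezone

def capture_reading(instrument_id: str, value: float, unit: str, user_id: str) -> dict:
    """Record a reading together with its contextual metadata, the same way every time."""
    return {
        "instrument": instrument_id,
        "value": value,
        "unit": unit,
        "recorded_by": user_id,            # the account under which the capture ran
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "source": "automated interface",   # original record, no manual transcription step
    }

# Example: a balance reading pushed straight into the record, no retyping involved.
record = capture_reading("BAL-001", 12.482, "g", "system.interface")
print(record)
```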
But be aware: while automation may lower some risks, it does not eliminate all data integrity risks. Technical and procedural controls still need to be defined. As many companies rely heavily on computerized systems, and computer system validation is common practice, the pitfall is to think: 'it is validated, so the data must be safe'. However, data integrity requirements for the system and the data should be defined in the user requirements. If no one tells the appropriate IT/Validation colleague which data is critical, who is allowed to access the data, and when to perform a data integrity risk assessment, how can validation by itself safeguard the data?
Data integrity is not just a part of pharma law and regulation, or the focus of inspectors; it should be your focus as well. Good data governance results in increased trust in your data, less effort spent searching for data, reduced troubleshooting, and so on. In the end, this will lower your costs.
How much time has your organization spent on (manually) verifying data in each process step because there are no validated technical controls in place? How many good decisions have been spoiled by bad data?
If your data is rubbish, you can expect trouble ahead ('garbage in, garbage out'). However, if your data is trustworthy (complete, consistent, and accurate), your management will be able to make better business decisions. This will result in lower costs and improved revenue.
Lean and continuous improvement principles such as "First time right" and "Elimination of waste" also apply to data integrity. Repairing a (potential) data breach or performing a recall are… ah well, that is clear to everybody…
With a data governance framework in place, consistent execution, and demonstrable control, resulting in trustworthy data, your company will grow in esteem. Clients will choose you as a reliable partner.
To prevent data integrity issues (read: 'high and unnecessary costs'), all employees (including management) should have a deeper understanding and recognition of data integrity, so they can implement efficient processes with appropriate controls, commensurate with the defined risks. Thoughtful and alert employees will help by promptly recognizing, reporting, and resolving data integrity issues.
Now, wouldn't it be nice to be relaxed and prepared for audits and inspections concerning data integrity? To be able to rely on your data governance, and to trust the data on which you base your decisions? Less stress within the company and more time to focus on growing the business!
So, there you have it, our 7 “aha moments” regarding data integrity. Does it match any of your experiences? We hope this article helps you in your data integrity journey and gives some new perspective and inspiration.
Share your moment of deeper insight. Let’s talk.