Dr. Nimita Limaye, PhD
CEO & Principal Consultant, Nymro Clinical Consulting Services
Interviewed by Dr. Manoj Jadhav, PhD, FCP
Translational Clinical Pharmacologist, CRC Pharma, LLC, New Jersey, USA


Dr. Nimita Limaye, CEO & Principal Consultant, Nymro Clinical Consulting Services, has over twenty years of experience working across the global and local pharma and CRO industry in key leadership roles. She has led global operations, managed strategic relationships, and headed risk-based monitoring, medical writing, clinical data management (CDM), clinical analytics, and healthcare offerings. She has led teams of domain experts, contributed to technology roadmaps and data visualization strategies, provided key insights for platform development, identified white spaces, and created new business opportunities.

She has chaired multiple conferences, presented at diverse global forums, and has multiple publications to her credit. She was on the editorial board of the Journal of Applied and Translational Genomics, has given a keynote talk in London on ‘Disruptive Innovation in Clinical Trials’, and has authored a chapter in the book ‘How India Found Its Feet: The Story of India Business Leadership and Value Creation 1991–2010’. She is also the past chair of the Society for Clinical Data Management (SCDM), and the first person from India to chair the association. She recently presented at the SCDM Leadership Forum and Conference in San Diego on risk-based monitoring (RBM) strategy and implementation challenges.

Q.1. What does one really understand by the term “data strategies”?

NL: ‘Data strategy’ per se is a generic term and can incorporate a lot of things: the modalities of data acquisition, governance, integration, and analytics; whether the data is structured, unstructured, or a blend of the two; the number of vendors involved; and whether data capture will be centralized or not. Multiple factors impact data strategy, including the scale and complexity of the data, whether data is flowing in real time from sensors/devices, whether it is being used for a regulatory submission, whether it is an interventional or a non-interventional study, the number of protocol amendments, the number of applications from which data is flowing in, the budget, etc. It is good to have data managers, data scientists and the clinical team involved in drafting the data strategy for a clinical study. A well-defined governance plan is critical.

Q.2. Do you sense a change in the data management landscape globally over time?

NL: To an extent, I would say, ‘Yes’. In the conventional sense of the term, ‘clinical data management’ has matured. On the other hand, with novel data acquisition strategies coming in, including the ‘chip in a pill’ involving IEMs (Ingestible Event Markers), wearables, data from social media, and massive data flowing in real time, and with the need to derive meaningful insight from all of this while multiple vendors and systems are involved, the data manager of yore has to don multiple hats. He/she needs to become a project manager, a data scientist, a techie and, of course, understand what clinical data management is all about. Technologies, roles and processes are all changing. The ability to proactively embrace this change and reinvent oneself is key.

Q.3. How does the quality of data directly impact the clinical outcome and the cost involved?

NL: Poor quality means rework – effectively, this is the Cost of Poor Quality (CPQ). No one bears that cost but oneself. Well-defined SOPs, compliance with Good Clinical Data Management Practices (GCDMP), the key publication of the Society for Clinical Data Management (SCDM), and the implementation of the right percentage of QC at the right stages of the life cycle of the project are important. It is also necessary to incorporate quality at the grassroots level: when defining the edit checks, when building the database, while designing the CRF and the CCGs (CRF Completion Guidelines), while training the investigator, and when framing the queries. Quality issues at this level can result in a cascade of quality issues. Error proofing, or poka yoke (a simple example that we can all relate to is the ‘patient weight’ field, which allows only numeric data of a fixed length to be entered), is extensively used in CDM. In addition, we should collect only data that is relevant and critical to the trial. More data means more errors and more queries. Every query comes at a cost. Queries that are poorly framed or leading may actually result in multiple re-queries and data being manipulated to provide the ‘correct’ answer. It has been reported that the average cost of resolving a query is as high as $53.87 (Medidata Solutions, Sep ’12). Considering the hundreds of queries that go out on a trial, by collecting only relevant data and framing queries well, costs can be significantly contained.
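As an illustration of the field-level error proofing and query costs described above, here is a minimal sketch in Python. The field length, weight range and query count are illustrative assumptions, not values from the interview; only the $53.87 per-query figure is quoted above.

```python
# Minimal sketch of a poka-yoke style edit check for a 'patient weight' field,
# plus a rough query-cost estimate. All limits below are illustrative assumptions.

AVG_QUERY_COST_USD = 53.87  # reported average cost of resolving one query (Medidata, Sep '12)

def check_patient_weight(raw_value: str, max_length: int = 5,
                         min_kg: float = 30.0, max_kg: float = 250.0) -> list[str]:
    """Return a list of query messages; an empty list means the value passes the edit check."""
    queries = []
    if len(raw_value) > max_length:
        queries.append(f"Weight '{raw_value}' exceeds the {max_length}-character field length.")
    try:
        weight = float(raw_value)
    except ValueError:
        queries.append(f"Weight '{raw_value}' is not numeric.")
        return queries
    if not (min_kg <= weight <= max_kg):
        queries.append(f"Weight {weight} kg is outside the expected range {min_kg}-{max_kg} kg.")
    return queries

if __name__ == "__main__":
    for value in ["72.5", "seventy", "999999"]:
        print(value, "->", check_patient_weight(value))
    # Rough cost of 400 queries on a trial (illustrative count):
    print(f"Estimated cost of 400 queries: ${400 * AVG_QUERY_COST_USD:,.2f}")
```

In a real EDC system such checks would be configured during the database build rather than hand-coded, but the underlying logic is the same.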

Q.4. Which are the key guidelines/regulations shaping this dynamic domain?

NL: There are multiple guidelines and regulations that are shaping the CDM industry, but some of the significant ones issued by the US FDA include the draft guidances ‘Use of Electronic Health Record Data in Clinical Investigations’ (May ’16) and ‘Acceptance of Medical Device Clinical Data from Studies Conducted Outside the US’ (Nov ’15); the guidances ‘Oversight of Clinical Investigations: A Risk-Based Approach to Monitoring’ (Aug ’13), ‘General Principles of Software Validation’ (Jan ’02) and ‘Guidance for Industry: Computerized Systems Used in Clinical Investigations’ (May ’07); and 21 CFR Part 11 (Code of Federal Regulations Title 21, Part 11, Mar ’97).

Similarly, the EMA issued a reflection paper on eSource in 2010, the ‘Reflection Paper on Expectations for Electronic Source Data and Data Transcribed to Electronic Data Collection Tools in Clinical Trials’, a data integrity guidance (Aug ’16), and the ‘Reflection Paper on Risk Based Quality Management in Clinical Trials’ (Aug ’11). The most critical, I would say, is the ICH E6 R2 guideline (Nov ’16). Several key position papers have been published by TransCelerate BioPharma as well. Another important change is the shift from Safe Harbor to Privacy Shield.

The EU–U.S. Privacy Shield Framework was jointly designed by the U.S. and the EU to ensure compliance with EU data protection requirements when transferring personal data from the EU to the US. Companies self-certifying for Privacy Shield (early certification timeline – 30th Sep ’16) needed to ensure, within nine months, that third parties with whom they were engaging in data transactions conformed to the Accountability for Onward Transfer Principle. This principle mandates that data be transferred only to third parties that comply with Privacy Shield, and in a manner consistent with the notice and choices that have been provided to the consumers. The transfer of such data should be limited to the specified purposes for which it was collected, and necessary steps should be taken to remediate unauthorized processing by third parties.

The new EU GDPR (General Data Protection Regulation) becomes law in 2018. If a company does not meet the early certification timeline, it has to comply with all of the Privacy Shield Principles, including the Onward Transfer Principle, from the outset.

Q.5. What are the current gaps, and how can they be bridged to improve data quality?

NL: Gaps arise when concepts of data integrity are not ingrained in all stakeholders who touch the data. The quality of the data is determined not only by the people who process the data, but also by the people who generate it. Zero tolerance for non-conformance and a QbD (Quality by Design) approach, designing quality-centric processes that integrate quality at every stage, are essential. Data flows in through various devices, sensors, EDC systems, labs, eSource, CTMS, etc. Data can get corrupted intentionally, as a result of fraud; owing to a lack of calibration of equipment; at points of exchange of the data across systems; as a result of duplication or deletion; while transforming the data to standardize data formats; and so on. So, to begin with, there needs to be intensive training on best practices, on 21 CFR Part 11 compliance (Electronic Records and Electronic Signatures), on the importance of IQ (Installation Qualification), OQ (Operational Qualification) and PQ (Performance Qualification) of systems, and on making people understand the need for access control, audit trails, and regular backups of data. Data needs to be ALCOA-CCEA (Attributable, Legible, Contemporaneous, Original, Accurate, Complete, Consistent, Enduring and Available).
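As a loose illustration (not from the interview) of the attributable and contemporaneous audit trails mentioned above, the sketch below records who changed which field, when, and why. The record structure and field names are assumptions for illustration only.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit-trail record: who changed what, when, and why."""
    user: str          # attributable
    timestamp: str     # contemporaneous (UTC, ISO 8601)
    subject_id: str
    field_name: str
    old_value: str
    new_value: str
    reason: str        # e.g. the query response that prompted the change

audit_trail: list[AuditEntry] = []

def record_change(user: str, subject_id: str, field_name: str,
                  old_value: str, new_value: str, reason: str) -> AuditEntry:
    """Append a new entry; existing entries are never edited or deleted (enduring, original)."""
    entry = AuditEntry(user, datetime.now(timezone.utc).isoformat(),
                       subject_id, field_name, old_value, new_value, reason)
    audit_trail.append(entry)
    return entry

if __name__ == "__main__":
    record_change("site_crc_01", "SUBJ-0001", "weight_kg", "720", "72.0",
                  "Query resolved: decimal point missing in original entry")
    for e in audit_trail:
        print(e)
```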

As M&As and restructuring become the norm and trials grow larger and more complex every day, one increasingly has to deal with big data flowing in real time, in multiple formats and from multiple systems. The implementation of appropriate data governance and change management systems becomes critical. It is at times desirable for an organization to drive ‘unlearning’ and ‘relearning’ so that employees move towards a culture of quality centricity. Conducting periodic audits, as well as Computerized System Risk Assessments (CSRA), is necessary. It is important to remember that the final ownership of the integrity and quality of the data lies with the sponsor, not with an Application Service Provider (ASP). It always helps to have an external auditor. Leveraging analytics to monitor outliers (or, for that matter, too much consistency), as well as simple logic checks, goes a long way in assessing the integrity of the data. As the need for seamless interoperability becomes a reality, the industry is trying to assess how Fast Healthcare Interoperability Resources (FHIR), an interoperability standard for the electronic exchange of healthcare information developed by Health Level Seven International (HL7), could be used to provide a standard API for evaluating eligibility criteria, supporting patient recruitment, and harvesting data from Electronic Health Records (EHRs) to support clinical trials and provide real-world evidence. Violations of data integrity, or a Breach in Data Integrity (BDI), can result in the issuance of warning letters or FDA 483s. Most importantly, they cost patient lives. Thus, training on the importance of data integrity and best practices in an organization is critical.
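To make the FHIR idea above concrete, here is a minimal sketch of a pre-screening query against a FHIR server's REST search API. The endpoint (the public HAPI FHIR test server) and the eligibility criteria are illustrative assumptions, not anything referenced in the interview.

```python
import requests

# Public HAPI FHIR R4 test server -- an illustrative assumption, not an
# endpoint referenced in the interview.
FHIR_BASE = "https://hapi.fhir.org/baseR4"

def find_candidate_patients(born_on_or_before: str, gender: str, max_results: int = 20) -> list[str]:
    """Search Patient resources matching simple demographic pre-screening criteria."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient",
        params={
            "birthdate": f"le{born_on_or_before}",  # FHIR date search prefix 'le' = on or before
            "gender": gender,
            "_count": max_results,
        },
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # a FHIR Bundle resource containing matching Patient entries
    return [entry["resource"]["id"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    # Hypothetical criterion: female patients born on or before 2000-01-01
    print(find_candidate_patients("2000-01-01", "female"))
```

In practice, eligibility screening would combine many more criteria (diagnoses, lab values, medications) via additional FHIR resources such as Condition and Observation, and would run within a secured EHR environment rather than against a public test server.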

Q.6. How is the new guidance on RBM being perceived and implemented across the developed and developing world?

NL: The ICH E6 R2 guidance has given a strong push in favour of RBM. It puts a significant focus on the sponsor’s responsibility for quality management and outlines stepwise requirements for implementing a quality management system. These include critical process and data identification, risk identification, risk evaluation, risk control, risk communication, risk review, and risk reporting. It proposes a three-factor risk evaluation methodology akin to the Risk Priority Number (RPN) of Lean Six Sigma: risks are evaluated as the product of the likelihood of the risk occurring, the impact or severity of the risk, and the extent to which the risk can be detected. It has stressed that “Predefined quality tolerance limits should be established, based on the medical and statistical characteristics of the variables as well as the statistical design of the trial”. It has also mandated that, irrespective of whether RBM is being utilized by the sponsor or not, the sponsor is required to develop a monitoring plan that is tailored to the specific human subject protection and data integrity risks of the trial. In addition, computer systems are considered a critical trial process from a risk identification perspective at the system level.
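As a small worked illustration of the three-factor scoring described above, the sketch below multiplies the three factors on a 1–5 scale and flags risks above a tolerance threshold. The scales, example risks and threshold are illustrative assumptions, not values from ICH E6 R2.

```python
# Illustrative three-factor risk scoring, akin to an RPN: each factor is rated 1-5.
# The scales, example risks and threshold below are assumptions for illustration only.

def risk_score(likelihood: int, severity: int, detectability: int) -> int:
    """Product of likelihood of occurrence, impact/severity, and difficulty of detection."""
    for factor in (likelihood, severity, detectability):
        if not 1 <= factor <= 5:
            raise ValueError("Each factor must be rated on a 1-5 scale")
    return likelihood * severity * detectability

RISK_THRESHOLD = 27  # scores above this trigger risk control / mitigation (assumed cut-off)

example_risks = {
    # risk name: (likelihood, severity, detectability)
    "Primary endpoint data entered late": (4, 4, 2),
    "eCOA device not synchronising":      (3, 5, 3),
    "Minor typo in optional comment":     (2, 1, 1),
}

for name, factors in example_risks.items():
    score = risk_score(*factors)
    action = "mitigate / monitor closely" if score > RISK_THRESHOLD else "accept / routine review"
    print(f"{name}: RPN={score} -> {action}")
```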

Q.7. What, according to you, should developing countries in the APAC region strive for to improve data quality in clinical trials?

NL: In many ways, we have, as a geography, matured in the implementation of clinical data management operations. Areas for development would be a deeper understanding of, and involvement in, the development of regulations for the CDM industry globally, and a better understanding and implementation of data standards and of laws regarding data privacy and security. Formal training on the GCDMP would certainly help. Data management has largely developed in these geographies in the FSP (functional service provider) model, wherein people develop expertise in different functions of data management. However, it is important that data managers have a holistic understanding of all aspects of data management, and the necessary training and opportunities need to be provided.

Q.8. With new approaches and technologies such as data science, RBM, data transparency, IEMs and virtual trials inundating the industry, is the cost of drug development really decreasing?

NL: Tufts University reported in Mar ’16 that the cost of drug development is around $2.558 billion. This is significantly higher than the figures of approximately $1.2 to $1.4 billion reported earlier, and despite all the initiatives that the industry has jointly taken, such as TransCelerate, the Clinical Trials Transformation Initiative (CTTI), etc. Analysts like Matthew Herper and Bernard Munos have actually indicated that the cost of drug development is significantly higher than the figure reported above, primarily as a result of the loaded costs of the drugs that never made it through. The cost of drug development is increasing as regulations become more stringent and as new methodologies involve new technologies, which come at a premium. As patents expire and the number of blockbusters declines, the focus is shifting to niche areas such as rare diseases and to a precision medicine approach, which means targeting a smaller patient population. All of these factors are directly driving an increase in drug development costs. Real-time, high quality data, supported by analytics, that can drive fast approvals and kill ‘no-gos’ quickly, is invaluable to the pharma industry.

Conclusion

The clinical data management industry is in a dynamic state, with continuously evolving technologies, standards and regulations. The clinical data manager needs to evolve as well, maintaining a clear understanding of the dynamic changes in the industry and upskilling himself/herself to adapt to this changing environment. Data integrity, security, privacy and ownership should be prioritized, and all efforts should be made to ensure that the patient comes first.