TMF Week 2024 - Day 4 - Session 2: Leveraging Data and Reporting to Drive Process Improvement in TMF

Montrium


Published: September 9, 2024

Insights

This presentation, delivered by the TMF Operations group from Zenor, provides an in-depth look at how leveraging Trial Master File (TMF) metrics and reporting is crucial for identifying operational gaps, driving process enhancements, and ensuring continuous inspection readiness. The team—comprising the Head of TMF Operations, the Document Quality Manager, the eTMF System Manager, and a Quality Analyst—shared unique perspectives on the considerations, challenges, and actions associated with evaluating TMF data. The core methodology presented involves a continuous improvement loop: identifying key performance indicators (KPIs), setting targets, tracking metrics (monthly dashboards), analyzing root causes, adjusting strategies, communicating results, and implementing changes across systems, scope of work, and business processes.

The speakers detailed three distinct examples of process improvement triggered by metric analysis. The first involved refining the eTMF system based on quality metrics. After transitioning indexing to a new CRO, the team found that QC rejection reports were unreliable because too many rejections fell into a "miscellaneous/other" category. By analyzing the content of these rejections, the team revised the eTMF pick list to include more granular, specific rejection reasons. This simple system enhancement made it significantly easier for the Document Quality Manager to identify specific training needs for indexers, leading to faster resolution and improved overall quality.

The second and third examples focused on timeliness and completeness metrics, respectively. A persistent "red" status on the overall timeliness report (measuring time from document finalization to eTMF approval) led to the discovery that content owners were using external repositories before uploading to Veeva, causing delays and duplicates. The solution involved a scope of work adjustment: minimizing the burden on content owners (allowing simple drag-and-drop upload without classification/metadata input) and shifting the indexing and remediation burden to internal indexers. This change established the eTMF as the single source of truth, drastically improved the timeliness metric (especially between finalization and creation date), and reduced duplicates.

Separately, an observed drop in completeness metrics during study closeout prompted the team to implement a new business process: requiring all Subject Matter Experts (SMEs) from functional groups (e.g., Data Management, Safety) to conduct periodic TMF reviews throughout the study life cycle, rather than relying solely on Clinical Operations. While this initially lowered the completeness metric (as missing documents were identified sooner), it made the metric more reliable and ensured the TMF was genuinely inspection-ready earlier.
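
The "deep dive" described here amounts to splitting total turnaround time into a pre-upload gap and an in-system gap. A minimal sketch of that decomposition is below; the record fields and dates are illustrative assumptions, not Veeva fields or the presenters' actual data.

```python
from datetime import date
from statistics import median

# Hypothetical document records; field names are assumptions for illustration.
docs = [
    {"finalized": date(2024, 3, 1),  "created": date(2024, 3, 20), "approved": date(2024, 3, 25)},
    {"finalized": date(2024, 3, 5),  "created": date(2024, 4, 2),  "approved": date(2024, 4, 6)},
    {"finalized": date(2024, 3, 10), "created": date(2024, 3, 12), "approved": date(2024, 3, 18)},
]

# Split finalization-to-approval time into the pre-upload gap
# (finalization -> eTMF creation) and the in-system gap (creation -> approval).
pre_upload = [(d["created"] - d["finalized"]).days for d in docs]
in_system = [(d["approved"] - d["created"]).days for d in docs]

# A large pre-upload median points at a bottleneck before the eTMF,
# such as documents sitting in an external repository.
print("median pre-upload days:", median(pre_upload))
print("median in-system days:", median(in_system))
```

In this sketch the pre-upload median dwarfs the in-system one, which is exactly the signature that pointed the team at external repositories rather than at their own indexing workflow.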

The presentation culminated with a comprehensive case study: the decision to bring TMF indexing in-house. This major action was driven by all three core TMF metrics (Quality, Timeliness, Completeness) and incorporated all three types of changes (system, scope, process). Outsourcing had led to inconsistencies because CRO indexers worked across multiple TMF systems with different processes, hindering quick implementation of Zenor's specific nuances and training. Bringing indexing in-house enabled faster process and system changes, increased consistency across studies, allowed for more nuanced QC workflows (e.g., graduated handling of minor vs. major issues), and significantly reduced overall outsourcing costs. Crucially, it required adjusting metrics—for instance, separating the quality of indexing from the quality of the source documentation, and using pre-upload timeliness as the definitive CRO evaluation metric.

Detailed Key Takeaways

  • Metrics Must Inform Actionable Training: When quality metrics show high rejection rates, the reporting mechanism must provide granular detail (e.g., specific rejection reasons) to pinpoint exact training deficiencies, moving beyond a generic "other" category.
  • Simplicity Drives Compliance: To enforce direct upload into the eTMF and eliminate external repositories, the system must minimize the burden on content owners (e.g., simple drag-and-drop upload) and shift the complexity of indexing and classification to dedicated internal TMF staff.
  • Establish eTMF as the Single Source of Truth (SSOT): Eliminating dual storage repositories (external drives, shared folders) by mandating direct eTMF upload resolves issues related to duplication, lack of visibility, and confusion over the true status of essential documents.
  • Timeliness Metrics Require Deep Dive: When timeliness is consistently poor, analysts should investigate the time gap between document finalization date and document creation/upload date in the eTMF, as this often reveals pre-system bottlenecks (like external repository storage).
  • Cross-Functional Reviews Enhance Completeness Reliability: Relying solely on Clinical Operations for TMF reviews is insufficient; requiring periodic (quarterly/semi-annual) reviews by functional Subject Matter Experts (SMEs) throughout the study life cycle ensures accurate identification of missing documents earlier.
  • Lower Completeness Can Indicate Better Readiness: A temporary drop in the completeness metric after implementing cross-functional reviews is a positive sign, as it reflects a more accurate, reliable status of missing documents being identified sooner, making the TMF more genuinely inspection-ready.
  • Standardize Review Tools for Non-TMF Users: When involving cross-functional teams in TMF reviews, provide standardized, intuitive tools based on filtered inventory reports (e.g., filtered by date range, site, document type) to prevent overlap and minimize the burden of review.
  • In-House Indexing Increases Agility: Bringing indexing in-house allows for faster implementation of system changes, immediate training/retraining, and consistent adherence to specific company TMF nuances and complex workflows that are often lost when outsourced to CROs managing multiple systems.
  • Adjust Metrics Post-Process Change: When responsibilities shift (e.g., bringing indexing in-house), metrics must be redefined. For example, separate the quality metric into "indexing quality" vs. "document quality" to accurately evaluate internal indexer performance.
  • Prioritize Accuracy Over Vanity Metrics: TMF metrics (Quality, Timeliness, Completeness) should be viewed as diagnostic tools to inform process decisions, not just targets to hit. A 90% accurate metric is more valuable than a misleading 100% metric.
  • Implement a TMF Metrics Program Structure: A successful program involves eight steps: identifying KPIs, setting clear thresholds, developing a tracking method (e.g., monthly dashboards), integrating KPIs into daily operations, regular review/analysis, strategy adjustment, stakeholder communication, and continuous improvement.
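
The threshold-setting and monthly-dashboard steps above can be sketched as a simple red/amber/green (RAG) status check. The metric names and threshold values here are illustrative assumptions, not the presenters' actual targets.

```python
# Illustrative green/amber thresholds (percentages); values below amber are red.
GREEN = {"quality": 95.0, "timeliness": 90.0, "completeness": 98.0}
AMBER = {"quality": 90.0, "timeliness": 80.0, "completeness": 95.0}

def rag_status(metric: str, value: float) -> str:
    """Classify one metric value against its green/amber thresholds."""
    if value >= GREEN[metric]:
        return "green"
    if value >= AMBER[metric]:
        return "amber"
    return "red"

# One month's hypothetical readings rolled up into a dashboard.
monthly = {"quality": 96.2, "timeliness": 74.5, "completeness": 95.8}
dashboard = {m: rag_status(m, v) for m, v in monthly.items()}
print(dashboard)  # a persistent "red" flags a metric for root-cause analysis
```

The value of this structure is that a metric staying red month over month (as timeliness did in the second example) becomes an explicit trigger for root-cause analysis rather than a number that is merely reported.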

Tools/Resources Mentioned

  • Veeva: Mentioned as the eTMF system used by Zenor, specifically referencing the eTMF inbox and system configuration.
  • TMF Reference Model: Used as the structural basis for Zenor's TMF index.

Key Concepts

  • TMF Metrics (Core Three): The three primary metrics leveraged for process improvement:
    • Quality: Assessment of document accuracy, adherence to standards, and filing correctness (measured via QC rejection reports).
    • Timeliness: Measurement of the time elapsed from document finalization (e.g., signature date) to document approval in the eTMF.
    • Completeness: Assessment of whether all required documents are present in the TMF according to the TMF index.
  • eTMF Inbox: A holding area within the eTMF system (like Veeva) where documents reside after upload but before final classification, indexing, and approval, allowing for internal remediation workflows.
  • Single Source of Truth (SSOT): The concept that the eTMF should be the sole, definitive repository for all essential clinical trial documentation, eliminating the use of external, uncontrolled repositories.
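
The three core metrics above can be sketched as simple computations over a document inventory. The record shape, the 14-day timeliness window, and the pass/fail definitions are assumptions for illustration, not the presenters' actual formulas.

```python
# Hypothetical TMF inventory: each record is a document expected by the TMF index.
inventory = [
    {"present": True,  "qc_rejected": False, "days_to_approve": 8},
    {"present": True,  "qc_rejected": True,  "days_to_approve": 30},
    {"present": True,  "qc_rejected": False, "days_to_approve": 12},
    {"present": False},  # expected by the TMF index but never filed
]

filed = [d for d in inventory if d["present"]]

# Quality: share of filed documents passing QC (no rejection recorded).
quality = 100 * sum(not d["qc_rejected"] for d in filed) / len(filed)
# Timeliness: share of filed documents approved within an assumed 14-day target.
timeliness = 100 * sum(d["days_to_approve"] <= 14 for d in filed) / len(filed)
# Completeness: filed documents as a share of everything the TMF index expects.
completeness = 100 * len(filed) / len(inventory)
```

Note how the completeness denominator depends entirely on what the index says is expected: adding cross-functional reviews that surface more expected-but-missing documents lowers this number while making it more truthful, which is the dynamic described in the takeaways.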

Examples/Case Studies

  • QC Rejection Pick List Revision: Triggered by quality metrics showing high "other" rejections, the eTMF pick list for rejection reasons was revised to be more granular, enabling targeted retraining for indexers.
  • Direct Upload Mandate: Triggered by poor timeliness metrics, the process was changed to encourage drag-and-drop direct upload to the eTMF by content owners, eliminating external repositories and shifting indexing complexity to internal TMF staff.
  • Cross-Functional Review Implementation: Triggered by completeness metrics dropping sharply at closeout, a new business process mandated periodic TMF reviews by all functional SMEs (Data Management, Safety, etc.) throughout the study lifecycle to ensure earlier identification of missing documents.
  • Bringing Indexing In-House: A major initiative driven by all three core metrics, resulting in system simplification, increased consistency, faster training, and significant reduction in outsourcing costs by managing TMF indexing internally.