What does 21 CFR Part 11 dictate regarding the minimum expectation for EDC training prior to access?
Training must be performed
Training must include an exam
Training must be in the user's native language
Training must be face to face
Under FDA 21 CFR Part 11, organizations using electronic systems must ensure that all system users are trained to perform their assigned functions before gaining access to the system. The regulation requires documented evidence of training but does not specify how it should be conducted (e.g., exam-based, in person, or language-specific).
The GCDMP (Chapter: Computerized Systems and Compliance) further clarifies that personnel training should include instruction on system functionality, audit trails, data entry procedures, and electronic signatures to maintain compliance and data integrity. Training must be performed and documented, but no specific format or delivery method is required.
Therefore, option A (Training must be performed) is correct, as it reflects the minimum regulatory expectation per FDA and SCDM standards.
Reference (CCDM-Verified Sources):
FDA 21 CFR Part 11, Section 11.10(i) – Personnel Training Requirements
SCDM GCDMP, Chapter: Computerized Systems and Compliance, Section 5.4 – System Training and Documentation
ICH E6(R2) GCP, Section 2.8 – Qualified Personnel and Training Requirements
A Data Manager receives an audit finding of missing or undocumented training for two database developers according to the organization's training SOP and matrix. Which is the best response to the audit finding?
Identify the root cause and improve the process to prevent it
Remove the training items from the training matrix
Reprimand the person responsible for maintaining training documentation
Send the two developers to the required training
When an audit identifies missing or undocumented training, the most appropriate and compliant response is to identify the root cause of the issue and implement corrective and preventive actions (CAPA) to ensure that similar findings do not recur.
According to Good Clinical Data Management Practices (GCDMP, Chapter: Quality Management and Auditing), effective quality systems require root cause analysis (RCA) for all audit findings. The process involves:
Investigating why the documentation gap occurred (e.g., poor tracking, outdated SOP, or lack of oversight).
Correcting the immediate issue (e.g., ensuring the developers complete or document training).
Updating processes, training systems, or oversight mechanisms to prevent recurrence.
While sending the two developers to training (D) addresses the symptom, it does not resolve the systemic issue identified by the audit. Options B and C are non-compliant and do not address quality system improvement.
Therefore, option A (Identify the root cause and improve the process) is the best and CCDM-compliant response.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Quality Management and Auditing, Section 6.2 – Corrective and Preventive Actions (CAPA)
ICH E6(R2) GCP, Section 5.1.1 – Quality Management and Continuous Process Improvement
FDA 21 CFR Part 820.100 – Corrective and Preventive Action (CAPA) Requirements
Which type of edit check would be implemented to check the correctness of data present in a text box?
Manual Check
Back-end check
Front-end check
Programmed check
A front-end check is a type of real-time validation performed at the point of data entry, typically within an Electronic Data Capture (EDC) system or data entry interface, designed to ensure that the data entered in a text box (or any input field) is valid, logically correct, and within expected parameters before the user can proceed or save the record.
According to the Good Clinical Data Management Practices (GCDMP, Chapter on Data Validation and Cleaning), edit checks are essential components of data validation that ensure data accuracy, consistency, and completeness. Front-end checks are implemented within the data collection interface and are triggered immediately when data are entered. They prevent invalid entries (such as letters in numeric fields, out-of-range values, or improper date formats) from being accepted by the system.
Examples of front-end checks include:
Ensuring a numeric field accepts only numbers (e.g., weight cannot include text characters).
Validating that a date is within an allowable range (e.g., not before the subject’s date of birth).
Requiring mandatory fields to be completed before moving forward.
This differs from back-end checks or programmed checks, which are typically run later in batch processes to identify data inconsistencies after entry. Manual checks are human-performed reviews, often for context or data that cannot be validated automatically (e.g., narrative assessments).
Front-end edit checks are preferred wherever possible because they prevent errors at the source, reducing the number of downstream data queries and cleaning cycles. They contribute significantly to data quality assurance, regulatory compliance, and efficiency in data management operations.
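As a purely illustrative sketch (not taken from the GCDMP or any specific EDC product), a front-end range check on a weight field might look like the following Python; the field name and the 30-250 kg range are assumptions:

# Hypothetical front-end check: fires at entry time, before the value is saved.
def check_weight_kg(raw_value):
    """Return a query message if the entered weight fails validation, else None."""
    try:
        weight = float(raw_value)
    except ValueError:
        return "Weight must be numeric."  # e.g., letters in a numeric field
    if not 30.0 <= weight <= 250.0:       # assumed protocol-plausible range
        return "Weight out of expected range (30-250 kg); please confirm."
    return None

print(check_weight_kg("7o"))    # -> Weight must be numeric.
print(check_weight_kg("500"))   # -> out-of-range message
print(check_weight_kg("82.5"))  # -> None (accepted)

An EDC system would surface the equivalent message in the entry screen before accepting the value.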
Reference (CCDM-Verified Sources):
Society for Clinical Data Management (SCDM), Good Clinical Data Management Practices (GCDMP), Chapter: Data Validation and Cleaning, Section 6.2 – Edit Checks and Real-Time Data Validation
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6 – Data Entry and Verification Controls
ICH E6 (R2) Good Clinical Practice, Section 5.5 – Data Handling and Record Integrity
CDISC Operational Data Model (ODM) Specification – Edit Check Implementation Standards
A Data Manager is asked to manage SOPs for a department. Given equal availability of the following systems, which of the following is the best choice for managing the organizational SOPs?
Document management system
Customized Excel spreadsheet
Learning management system
Existing paper filing system
The best choice for managing Standard Operating Procedures (SOPs) in a compliant and auditable manner is a Document Management System (DMS).
According to the GCDMP (Chapter: Regulatory Requirements and Compliance) and ICH E6 (R2), SOPs must be version-controlled, securely stored, retrievable, and auditable. A validated DMS supports controlled access, document lifecycle management (draft, review, approval, and archival), and electronic audit trails, ensuring full compliance with FDA 21 CFR Part 11 and Good Documentation Practices (GDP).
While Learning Management Systems (C) track training, they are not intended for document control. Spreadsheets (B) and paper systems (D) cannot provide adequate version tracking, access security, or audit capability required for regulatory inspection readiness.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Regulatory Requirements and Compliance, Section 5.2 – SOP Management and Document Control
ICH E6 (R2) GCP, Section 5.5.3 – Document and Record Management
FDA 21 CFR Part 11 – Electronic Records and Signatures, Section 11.10 – System Validation and Document Controls
Which action has the most impact on the performance of a relational database system?
Entering data into the database from CRFs
Loading a large lab data file into the database
Executing a properly designed database query
Making updates to data previously entered into the database
In a relational database system used in clinical data management, performance refers to how efficiently the system processes transactions, retrieves data, and handles large volumes of information without delay or data integrity issues. Among the listed options, loading a large lab data file into the database (Option B) has the most significant impact on database performance.
According to the Good Clinical Data Management Practices (GCDMP, Chapter on Database Design and Build), the bulk data load process, such as importing large external datasets (e.g., central lab data, ECG results, or imaging metadata), can be computationally intensive. This process engages the database's input/output (I/O) subsystem, indexing mechanisms, and transaction logs simultaneously, often locking tables temporarily and consuming significant memory and processing resources.
Unlike standard CRF data entry (Option A) or record updates (Option D), which are incremental and typically processed in smaller transactional batches, bulk loading operations handle thousands or millions of rows at once. If not optimized (e.g., via staging tables, indexing strategies, or commit frequency control), such operations can degrade system performance, slow down concurrent user access, and increase the risk of transaction failure.
Executing a properly designed query (Option C) can also be resource-intensive depending on data volume and join complexity, but when queries are properly optimized (using indexed keys, efficient SQL joins, and selective retrieval), their impact is generally controlled and transient compared to large data imports.
Therefore, as outlined in the GCDMP Database Design and Build chapter and FDA Computerized Systems Guidance, the most performance-impacting activity in a relational database is bulk loading large external datasets, making Option B the correct answer.
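To make the performance point concrete, here is a minimal sketch, assuming an invented lab table and Python's standard-library SQLite driver, of the commit-frequency control mentioned above:

# Sketch of commit-frequency control during a bulk lab load (hypothetical schema).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lab (subject TEXT, test TEXT, value REAL)")

rows = [("S%04d" % i, "ALT", 25.0 + i % 40) for i in range(100000)]  # fake records
BATCH = 10000  # committing in batches bounds transaction-log growth and lock time

for start in range(0, len(rows), BATCH):
    conn.executemany("INSERT INTO lab VALUES (?, ?, ?)", rows[start:start + BATCH])
    conn.commit()  # release the write lock so concurrent readers are not starved

print(conn.execute("SELECT COUNT(*) FROM lab").fetchone()[0])  # -> 100000

Production systems would use staging tables and index management as well; the batching idea is the same.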
Reference (CCDM-Verified Sources):
Society for Clinical Data Management (SCDM), Good Clinical Data Management Practices (GCDMP), Chapter: Database Design and Build, Section 6.7 – Database Performance and Optimization
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6 – System Performance and Data Handling Efficiency
ICH E6 (R2) Good Clinical Practice, Section 5.5 – Data Handling and Record Integrity
CDISC Operational Data Model (ODM) Implementation Guide – Bulk Data Transfer and Validation Considerations
A study takes body-composition measurements at baseline using a DEXA scanner. Which information is needed to correctly associate the body-composition data to the rest of the study data?
Study number and subject number
Subject number
Study number and visit number
Subject number and visit number
To properly associate body-composition data (from a DEXA scanner) with other study data, both the subject number and the visit number are required.
According to the GCDMP (Chapter: Data Management Planning and Study Start-up), every clinical data record must be uniquely identifiable and linkable to a specific subject and study event. The subject number identifies the participant, while the visit number defines the temporal context in which the measurement was taken.
Without both identifiers, data integration becomes ambiguous, especially if multiple assessments occur over time (e.g., baseline, week 12, end of study). Including both ensures data traceability, integrity, and alignment with the protocol-defined schedule of events.
Study number and subject number (option A) identify the participant but not the visit, subject number alone (option B) cannot distinguish repeated assessments, and study number and visit number (option C) lack linkage to the individual participant.
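A toy illustration, with invented values, of why the composite (subject, visit) key is needed:

# Toy illustration: (subject, visit) as the composite key that joins DEXA data
# to the rest of the study data; all values are invented.
dexa = {("1001", "BASELINE"): 22.4, ("1001", "WEEK12"): 21.9}    # % body fat
vitals = {("1001", "BASELINE"): 76.0, ("1001", "WEEK12"): 74.5}  # weight, kg

for key in dexa:
    subject, visit = key
    print(subject, visit, dexa[key], vitals.get(key))
# Keying on subject alone would collapse BASELINE and WEEK12 into one ambiguous record.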
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Management Planning and Study Start-up, Section 4.4 – Data Linking and Identification Requirements
ICH E6 (R2) GCP, Section 5.5.3 – Data Traceability Principles
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations – Data Identification Requirements
Before the EDC system used for the trial is upgraded, what should be the data manager's first task?
Notify the sites of the upgrade
Update the user manual
Assess the impact on the data
Redesign the eCRF
Before implementing an EDC system upgrade, the first task of the Data Manager is to assess the impact on the data.
According to the GCDMP (Chapter: Electronic Data Capture Systems) and FDA 21 CFR Part 11, any system upgrade must undergo impact assessment to determine how the change might affect data integrity, functionality, validation, and ongoing study operations. This assessment ensures that no data are lost, corrupted, or rendered inconsistent during or after the upgrade.
The Data Manager should evaluate:
Potential effects on existing data, edit checks, and reports,
System functionality impacting current workflows, and
Any revalidation requirements.
Only after the impact is understood should the Data Manager proceed to communicate with sites (option A), update documentation (option B), or modify CRFs if required (option D).
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Electronic Data Capture Systems, Section 7.3 – System Upgrades and Change Control
FDA 21 CFR Part 11 – Change Control and Validation Requirements
ICH E6 (R2) Good Clinical Practice, Section 5.5.3 – Change Impact on Data Integrity and System Validation
A Data Manager is designing a CRF for a study for which the efficacy data are not covered by the current SDTM domains. Which search should the Data Manager perform?
Use controlled terminology covering the needed concepts
Work with the study team to define new data elements
Search for relevant data element standards
Advise the study team not to collect the data
When existing SDTM (Study Data Tabulation Model) domains do not cover specific efficacy data, the best practice is to first search for relevant data element standards that may be available through CDISC CDASH (Clinical Data Acquisition Standards Harmonization) or other recognized industry standards.
Per the GCDMP (Chapter: Standards and Data Integration), Data Managers must ensure that new CRF elements are consistent with standardized definitions, controlled terminology, and data models to support interoperability, future analysis, and regulatory submission.
If no applicable standards exist, only then should the Data Manager collaborate with the study team to define new elements; the standards search always comes first.
Thus, option C is correct: searching for relevant data element standards ensures alignment with CDISC best practices and regulatory expectations.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Standards and Data Integration, Section 5.1 – Use of CDISC Standards in CRF Design
CDISC CDASH Implementation Guide, Section 4.1 – Standardization of Data Collection Fields
FDA Study Data Technical Conformance Guide (SDTCG), Section 2.4 – Use of Standard and Custom Domains
In development of CRF Completion Guidelines (CCGs), which is a minimum requirement?
CCGs are designed from the perspective of the Study Biostatistician to ensure that the data collected can be analyzed
CCGs must be signed before database closure to include all possible protocol changes affecting CRF completion
CCGs must include a version control on the updated document
CCGs are developed with representatives of Data Management, Biostatistics, and Marketing departments
Case Report Form Completion Guidelines (CCGs) are essential study documents that instruct site staff on how to complete each field of the CRF correctly. A minimum requirement for CCGs, according to Good Clinical Data Management Practices (GCDMP, Chapter: CRF Design and Data Collection), is that they must include version control.
Version control ensures that all updates or revisions to the CCG, whether arising from protocol amendments or clarification of data entry rules, are documented, dated, and traceable. This guarantees that site personnel are always using the most current version and supports audit readiness.
Option A describes an important design consideration but not a minimum compliance requirement. Option B is inaccurate, as CCGs must be approved and implemented before data collection begins, not after. Option D includes an irrelevant stakeholder (Marketing).
Therefore, option C ("CCGs must include a version control on the updated document") is correct and compliant with CCDM and GCP standards.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: CRF Design and Data Collection, Section 4.3 – Development and Maintenance of CRF Completion Guidelines
ICH E6(R2) GCP, Section 8.2.1 – Essential Documents and Version Control Requirements
Which protocol section most concisely conveys timing of data collection throughout a study?
Study endpoints section
Study schedule of events
Protocol synopsis
ICH essential documents
The Study Schedule of Events (SoE) section in the protocol is the most concise and comprehensive representation of the timing of data collection throughout a study.
According to the Good Clinical Data Management Practices (GCDMP, Chapter: Data Management Planning and Study Start-up) and ICH E6 (R2) GCP, the SoE outlines which assessments, procedures, and data collections occur at each study visit (e.g., screening, baseline, treatment visits, follow-up). This table is a foundational tool for CRF design, database structure, and edit-check development, ensuring alignment between the protocol and data management systems.
While the study endpoints section (A) defines what is measured, and the protocol synopsis (C) summarizes the design, only the schedule of events (B) specifies when data collection occurs for each parameter. The ICH essential documents (D) pertain to regulatory documentation, not study visit timing.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Management Planning and Study Start-up, Section 4.1 – Using the Schedule of Events for Database Design
ICH E6 (R2) GCP, Section 6.3 – Trial Design and Schedule of Assessments
FDA Guidance for Industry: Protocol Design and Data Collection Standards
In a cross-functional team meeting, a monitor mentions performing source data verification (SDV) on daily diary data entered by patients on mobile devices. Which of the following is the best response?
All diary data should be source data verified
The diary data should not be source data verified
Diary data to be source data verified should be selected using a risk-based approach
Diary data to be source data verified should be randomly selected
The best response is that diary data to be source data verified should be selected using a risk-based approach.
According to the GCDMP (Chapter: Data Quality Assurance and Control) and FDA Guidance on Risk-Based Monitoring (RBM), not all data require full SDV. Electronic patient-reported outcome (ePRO) or mobile diary data are typically direct electronic source data (eSource) captured at the time of entry, which already ensures authenticity and traceability.
A risk-based SDV approach focuses verification efforts on data critical to subject safety and primary efficacy endpoints, as defined in the study's Risk Assessment Plan or Monitoring Plan. Random or full verification of low-risk data (like diary compliance metrics) adds unnecessary effort and cost.
Thus, Option C aligns with current regulatory expectations and data management best practices.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Quality Assurance and Control, Section 7.3 – Risk-Based Monitoring and SDV
ICH E6 (R2) Good Clinical Practice, Section 5.18 – Risk-Based Quality Management
FDA Guidance for Industry: Oversight of Clinical Investigations — A Risk-Based Approach to Monitoring (2013)
A Clinical Data Manager reads a protocol for a clinical trial to test the efficacy of an antiviral to counteract a new epidemic. The stated primary efficacy endpoint is 3-month survival. Which data element is needed for the primary efficacy endpoint?
Death date
Date of autopsy
Cause of death
Birth date
When the primary efficacy endpoint in a clinical trial is 3-month survival, the key data element required is the death date. This is because the survival endpoint is determined by calculating whether the subject lived or died within a defined time frame from study enrollment or randomization.
According to the GCDMP (Chapter: Data Management Planning and Study Start-up), the Clinical Data Manager (CDM) must identify and ensure the capture of all critical data elements necessary to evaluate the study endpoints. For time-to-event analyses (e.g., survival studies), accurate event dates (the death date) are essential for endpoint derivation and statistical analysis.
Other data elements such as the date of autopsy or cause of death (options B and C) may support secondary analyses or safety reviews but are not necessary to determine the survival endpoint itself. Similarly, birth date (option D) contributes to demographic data but is unrelated to the primary efficacy outcome.
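A minimal sketch of the endpoint derivation, assuming hypothetical field names and treating the 3-month window as 90 days (an actual study would follow the statistical analysis plan, including censoring rules):

# Sketch: derive the 3-month survival flag from randomization and death dates.
from datetime import date

def survived_3_months(randomization_date, death_date):
    """True if the subject was alive 90 days after randomization (simplified)."""
    if death_date is None:
        return True  # no death recorded; real analyses would apply censoring rules
    return (death_date - randomization_date).days > 90

print(survived_3_months(date(2024, 1, 10), date(2024, 2, 1)))  # False (death on day 22)
print(survived_3_months(date(2024, 1, 10), None))              # True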
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Management Planning and Study Start-up, Section 4.4 – Critical Data Identification for Endpoints
ICH E9 – Statistical Principles for Clinical Trials, Section 2.2.3 – Time-to-Event Data Considerations
FDA Guidance for Industry: Clinical Trial Endpoints for Drug Development
A protocol is updated mid-study to add an additional procedure about which data needs to be collected. Which of these statements applies?
The DMP should be updated to reflect the changes to the protocol, but this update does not need to be communicated
The DMP should be updated to reflect the changes to the protocol and stakeholders notified
The DMP does not need to be updated as it represents the data at the beginning of the trial only
The DMP does not need to be updated until the end of the trial and all updates are included in the DMP to indicate what happened in the trial
When a protocol is amended mid-study, resulting in additional data collection requirements, the Data Management Plan (DMP) must be updated accordingly and all relevant stakeholders must be notified.
According to the GCDMP (Chapter: Data Management Planning and Study Start-up), the DMP is a living document that defines all data management processes for a clinical study. It must accurately reflect the current data flow, CRF design, validation procedures, and reporting structure. Any protocol amendments affecting data capture, structure, or analysis require prompt DMP revision and distribution to ensure alignment across data management, clinical, and biostatistics teams.
Failure to update and communicate DMP changes can lead to misalignment in data handling and introduce compliance risks during audits or inspections. Therefore, Option B is correct: the DMP must be updated and the change communicated to all stakeholders (e.g., sponsor, CRO, clinical operations, biostatistics).
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Management Plan (DMP), Section 5.3 – Maintaining and Updating the DMP
ICH E6 (R2) Good Clinical Practice, Section 5.5.3 – Documentation of Protocol Changes and Data Handling Procedures
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations – Section on Data Management Documentation
A Data Manager is importing lab data for a study. The lab data and the associated audit trail are kept at the central lab. What is necessary to maintain traceability of the transferred data at the Data Manager's location?
Making changes only after data have been imported
Maintaining a copy of the data as received
Making changes only for exceptions
Making changes only on the copy of the received data
Maintaining traceability of external data imports (such as laboratory results) is a fundamental principle of clinical data management. According to the GCDMP (Chapter: External Data Transfers and Integration), Data Managers must retain an unaltered copy of the raw data exactly as received from the vendor.
This archived version serves as a reference for:
Data provenance verification,
Audit trail review, and
Discrepancy resolution between vendor and study database.
Since the central lab maintains its own audit trail, the Data Manager's responsibility is to preserve the original data transmission file before applying transformations, merges, or validations.
Options A, C, and D describe procedural safeguards but do not meet the regulatory requirement of traceable data lineage. Only option B (Maintaining a copy of the data as received) ensures compliance with ICH E6(R2) and FDA 21 CFR Part 11 standards for data traceability and integrity.
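One common way to preserve the as-received copy, shown here as a hedged sketch with hypothetical file paths, is to fingerprint and archive the file before any processing:

# Sketch: fingerprint and archive the vendor file as received, before any edits.
# Paths are hypothetical; hashlib and shutil are Python standard library.
import hashlib, shutil

def archive_as_received(received_path, archive_path):
    with open(received_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    shutil.copy2(received_path, archive_path)  # preserve the untouched original
    return digest  # log this checksum so later copies can be verified against it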
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: External Data Transfers and Integration, Section 5.2 – Data Traceability and Version Control
ICH E6(R2) GCP, Section 5.5.3 – Data Integrity and Source Data Verification
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.4 – Source Data Traceability and Archiving
A site study coordinator attempts to make an update in a study database in an EDC system after lock. What occurs?
The old value is replaced in all locations by the new value
The change is approved by the Data Manager before it is applied
The site study coordinator is not able to make the change
The change is logged as occurring after lock
Once a clinical database is locked, it becomes read-only: no further data modifications can be made by any user, including site personnel. This ensures that the data are finalized, consistent, and auditable for statistical analysis and regulatory submission.
According to the GCDMP (Chapter: Database Lock and Archiving), the lock process involves freezing the database to prevent accidental or unauthorized changes. After lock, access permissions are restricted, and all edit and update functions are disabled. If any corrections are required post-lock, the database must be unlocked under controlled procedures (with full audit trail documentation).
Thus, option C (The site study coordinator is not able to make the change) correctly reflects standard EDC functionality and regulatory compliance.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Database Lock and Archiving, Section 5.2 – Database Lock Procedures and Controls
ICH E6(R2) GCP, Section 5.5.3 – Data Integrity and Audit Trail Requirements
FDA 21 CFR Part 11 – Controls for Electronic Records and System Lock Functions
Which of the following statements would be BEST included in a data management plan describing the process for making self-evident corrections in a clinical database?
A senior level data manager may make audited changes to the database without further documentation.
Self-evident corrections made in the database will be reviewed and approved by a team leader or manager.
No changes will be made in the database without a query response signed by the investigator.
Self-evident changes may be made per the listed conventions and documented to the investigative site.
A self-evident correction (SEC) refers to a data correction that is obvious, logical, and unambiguous, such as correcting an impossible date (e.g., 31-APR-2024) or standardizing a known abbreviation (e.g., "BP" to "Blood Pressure"). According to the Good Clinical Data Management Practices (GCDMP), SECs can be applied by data management staff following pre-approved conventions defined in the Data Management Plan (DMP).
The DMP should explicitly describe the criteria for SECs, including the types of errors eligible for this correction method, the required documentation, and the communication procedure to inform the investigative site. The process must maintain audit trail transparency and ensure that all changes are traceable and justified.
Options A and B suggest unauthorized or informal change procedures, which violate audit and compliance standards. Option C is too restrictive, as it prevents the efficient correction of non-clinical transcription or formatting errors.
Therefore, option D is correct: "Self-evident changes may be made per the listed conventions and documented to the investigative site." This approach aligns with CCDM expectations for balancing efficiency, accuracy, and regulatory compliance.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Data Validation and Cleaning, Section 6.2 – Self-Evident Corrections
FDA 21 CFR Part 11 – Electronic Records; Audit Trails and Traceability Requirements
For a study, body mass index is calculated from weight and height. Which information is needed to document the transformation?
Algorithm and algorithm version associated with the calculated value
Algorithm associated with the calculated value
User ID making the change and reason for change
Algorithm documented in the Data Management Plan
When derived or calculated variables (like Body Mass Index) are created, it is essential to document the algorithm used and its version to ensure full data traceability and reproducibility.
According to the GCDMP (Chapter: Database Design and Derived Data), every derived field must include metadata describing:
The derivation algorithm (e.g., BMI = weight [kg] / height² [m²])
The version of the algorithm (if updates or revisions occur)
Any associated data sources or transformation rules
This ensures consistent calculation across systems, prevents discrepancies during regulatory submissions, and aligns with FDA and CDISC documentation expectations.
Option B lacks version control, which is critical for traceability. Option C describes audit trail data (not derivation metadata), and option D refers to broader documentation, not specific algorithm traceability.
Hence, option A (Algorithm and algorithm version associated with the calculated value) is the correct and compliant answer.
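A minimal sketch of how the algorithm and its version might travel with the derived value; the version label and record layout are assumptions, not a prescribed GCDMP format:

# Sketch: carry the algorithm identifier and version alongside the derived value.
BMI_ALGORITHM = "BMI = weight_kg / height_m**2"
BMI_ALGORITHM_VERSION = "1.0"  # hypothetical version label

def derive_bmi(weight_kg, height_m):
    value = round(weight_kg / height_m ** 2, 1)
    return {"bmi": value,
            "algorithm": BMI_ALGORITHM,
            "algorithm_version": BMI_ALGORITHM_VERSION}

print(derive_bmi(70.0, 1.75))
# -> {'bmi': 22.9, 'algorithm': 'BMI = weight_kg / height_m**2', 'algorithm_version': '1.0'}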
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Derived Data and Algorithms, Section 5.3 – Documentation and Metadata Requirements
ICH E6(R2) GCP, Section 5.5.3 – Derived Data and Validation Traceability
FDA Guidance for Industry: Providing Regulatory Submissions in Electronic Format – Data Definitions (Define.xml)
Which method would best identify inaccuracies in safety data tables for an NDA?
Compare counts of appropriate patients from manual CRFs to counts in table cells
Compare counts of appropriate patients from line listings of CRF data to counts in table cells
Review the tables to identify any values that look odd
Review the line listings to identify any values that look odd
The best method for identifying inaccuracies in safety data tables prepared for a New Drug Application (NDA) is to compare counts of appropriate patients from line listings of CRF data to the counts in table cells.
According to the GCDMP (Chapter: Data Quality Assurance and Control), line listings represent raw, patient-level data extracted directly from the clinical database, whereas summary tables are aggregated outputs used for reporting and submission. Comparing these two sources ensures data traceability and accuracy, verifying that tabulated results correctly reflect the underlying patient data.
Manual CRF checks (option A) are less efficient and error-prone, as data entry is typically already validated electronically. Simply reviewing tables or listings for “odd values” (options C and D) lacks the systematic verification necessary for regulatory data integrity.
Thus, comparing line listings to tables (option B) provides a quantitative cross-check between the database and output deliverables, a standard practice in NDA data validation and statistical quality control.
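A toy sketch of the cross-check, using invented listing records and an invented table count:

# Sketch: recount a table cell from the patient-level listing (invented records).
listing = [
    {"subject": "1001", "ae_term": "Headache"},
    {"subject": "1002", "ae_term": "Nausea"},
    {"subject": "1003", "ae_term": "Headache"},
]
table_cell_headache = 3  # count reported in the draft safety table

recount = sum(1 for row in listing if row["ae_term"] == "Headache")
if recount != table_cell_headache:
    print(f"Discrepancy: listing shows {recount}, table shows {table_cell_headache}")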
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Quality Assurance and Control, Section 5.2 – Validation of Tables, Listings, and Figures (TLFs)
FDA Guidance for Industry: Submission of NDA Safety Data, Section on Data Verification and Accuracy
ICH E6 (R2) GCP, Section 5.5.3 – Validation of Derived Data Outputs
A study has an expected enrollment period of one year but has subject recruitment issues. Twelve new sites are added toward the end of the expected enrollment period to help boost enrollment. What is the most likely impact on data flow?
The database set-up will need to be changed to allow for additional sites as they are added to the study.
The distribution of subjects selected for quality control will need to be stratified to allow for the twelve new sites.
A bolus of CRFs at the end of the study will result in the need to increase data entry and cleaning rates to meet existing timelines.
Additional sites will likely have increased query rates since site training is occurring closer to study close.
Adding multiple new sites late in the enrollment period creates a concentrated influx of new data near the end of the study. These sites typically start enrolling patients later, resulting in a "bolus" of Case Report Forms (CRFs) that must be entered, validated, and cleaned within a shorter timeframe to meet database lock deadlines.
According to the Good Clinical Data Management Practices (GCDMP, Chapter: Project Management and Data Flow), late site activation compresses the timeline for data management tasks, necessitating increased resources for data entry, query management, and cleaning. Data management teams must anticipate this surge and plan accordingly, either by increasing staffing or revising timelines to prevent bottlenecks and maintain quality.
While option D (increased query rates) can occur, it is a secondary effect. The most direct and consistent impact is the surge in data volume requiring expedited processing near study end.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Project Management, Section 5.3 – Managing Changes in Site Activation and Data Flow
ICH E6(R2) GCP, Section 5.1 – Quality Management and Oversight
Which of the following data verification checks would most likely be included in a manual or visual data review step?
Checking an entered value against a valid list of values
Checking adverse event treatments against concomitant medications
Checking mandatory fields for missing values
Checking a value against a reference range
Manual or visual data review is used to identify complex clinical relationships and contextual inconsistencies that cannot be detected by automated edit checks.
According to the GCDMP (Chapter: Data Validation and Cleaning), automated edit checks are ideal for structured validations, such as missing fields (option C), reference ranges (option D), or predefined value lists (option A). However, certain clinical cross-checks, such as verifying adverse event treatments against concomitant medication records, require clinical judgment and contextual understanding.
For example, if an adverse event of "severe headache" was reported but no analgesic appears in the concomitant medication log, the data may warrant manual review and query generation. These context-based checks are best performed by trained data reviewers or medical data managers during manual data review cycles.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Validation and Cleaning, Section 6.3 – Manual Review and Clinical Data Consistency Checks
ICH E6 (R2) Good Clinical Practice, Section 5.18.4 – Clinical Data Review Responsibilities
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations – Data Verification Principles
Which of the following tasks would be reasonable during a major upgrade of a clinical data management system?
All of the data formats in the archive should be updated to new standards.
The ability to access and read the clinical data archive should be tested.
The data archive should be migrated to an offsite database server.
All of the case report forms should be pulled and compared to the archive.
During a major system upgrade, it is critical to verify that archived data remain accessible, readable, and intact following the implementation.
According to the GCDMP (Chapter: Database Lock and Archiving), regulatory requirements such as 21 CFR Part 11 and ICH E6(R2) mandate that archived data must remain retrievable in a human-readable format for the duration of retention (often years after study completion).
Therefore, as part of validation and verification testing, organizations must confirm that existing archives can still be accessed using the upgraded system or compatible tools.
Option A: Updating archive formats could alter original data integrity (noncompliant).
Option C: Offsite migration is an IT infrastructure task, not directly tied to the upgrade process.
Option D: Comparing CRFs to archives is unnecessary unless data corruption is suspected.
Hence, option B (testing archive accessibility) is the correct and compliant approach.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Database Lock and Archiving, Section 5.4 – System Upgrades and Archive Validation
ICH E6(R2) GCP, Section 5.5.3 – System Validation and Data Retention
FDA 21 CFR Part 11 – Data Archiving, Retention, and Retrieval Requirements
A study team member wants to let sites enroll patients before the system is ready. Which are important considerations?
Without the ability to capture the data electronically, the data cannot be checked or used to monitor and manage the study
If the study were audited, enrolling subjects prior to having the EDC system ready would become an audit finding
There is no way to identify, report and track adverse events and serious adverse events without the EDC system in place
Starting the study prior to the EDC system being ready will delay processing of milestone-based site payments
Enrolling subjects before the Electronic Data Capture (EDC) system is ready poses major data integrity and compliance risks. The primary issue is that data cannot be accurately captured, validated, or monitored without the system in place.
Per the GCDMP (Chapter: Data Management Planning and Study Start-up), data collection systems must be fully validated, tested, and released before enrollment begins to ensure:
Real-time data entry and quality control
Proper tracking of adverse events (AEs/SAEs)
Audit trails and traceability for regulatory compliance
Option A highlights the most critical consequence: without an operational EDC, data collection and verification processes cannot occur, compromising data quality and study oversight.
While options B, C, and D may be partially true, they are secondary effects. The fundamental consideration is data capture capability and monitoring control, making option A correct.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Data Management Planning and Study Start-up, Section 4.2 – EDC Readiness and System Validation
ICH E6(R2) GCP, Section 5.5.3 – Computerized Systems Validation Before Use
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.1 – System Qualification Prior to Data Entry
Which statement is true regarding User Acceptance Testing (UAT) in an EDC application?
System tools in EDC do not remove the need for UAT
Data should not be collected in a production environment until UAT is completed
Every rule should be tested with at least one "pass" and one "fail" scenario
The extent of UAT (i.e., the number of test cases and rules) cannot be risk-based
In Electronic Data Capture (EDC) system validation, User Acceptance Testing (UAT) is a mandatory phase that must be completed before data collection begins in the production environment.
According to the GCDMP (Chapter: Database Design, Validation, and Testing) and FDA 21 CFR Part 11, UAT ensures that the EDC system meets all protocol-specific, functional, and regulatory requirements before it is deployed for live use. The goal is to verify that the system performs exactly as intended by simulating real-world user interactions with test data in a validated test environment.
Data collection prior to UAT completion would violate validation requirements and risk noncompliance with ICH E6 (R2) GCP Section 5.5.3, which mandates that all computerized systems be validated and tested before use.
While options A and C describe correct components of a testing strategy, the key regulatory requirement is that UAT must be completed and approved before live data entry begins. Option D is incorrect: risk-based UAT is an accepted modern validation approach under both FDA and GAMP 5 principles.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Database Design and Validation, Section 5.3 – User Acceptance Testing
FDA 21 CFR Part 11 – Validation of Electronic Systems (Section 11.10(a))
ICH E6 (R2) GCP, Section 5.5.3 – Validation Before Use in Production Environment
The result set from the query below would be which of the following?
SELECT * FROM patient WHERE medical_record_number > 9000
Longer than the patient table
Shorter or of equal length than the patient table
Narrower than the patient table
Wider than the patient table
In Structured Query Language (SQL), the WHERE clause is used to filter records based on specified criteria. The query retrieves all columns from the patient table (SELECT *) but only those rows where the medical_record_number value is greater than 9000.
This means:
The number of columns (fields) remains the same as the original table.
The number of rows (records) will be equal to or less than the number of rows in the patient table, depending on how many patients meet the filter condition.
Hence, the result set can only be shorter than or equal in length to the original table. It cannot be longer, wider, or narrower, since no new rows or columns are created.
Therefore, option B ("Shorter or of equal length than the patient table") is correct.
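The behavior can be demonstrated with Python's standard-library sqlite3 module (table contents invented for illustration):

# Demonstration: a WHERE filter can only keep or drop rows, never add them.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patient (pt_id TEXT, medical_record_number INTEGER)")
conn.executemany("INSERT INTO patient VALUES (?, ?)",
                 [("A", 8500), ("B", 9001), ("C", 9500)])

all_rows = conn.execute("SELECT * FROM patient").fetchall()
filtered = conn.execute(
    "SELECT * FROM patient WHERE medical_record_number > 9000").fetchall()
print(len(all_rows), len(filtered))  # 3 2 -- never more rows than the base table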
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Database Design and Build, Section 5.2 – Relational Database Queries and Filtering Logic
ICH E6(R2) GCP, Section 5.5.3 – Data Retrieval, Filtering, and Storage Principles
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.4 – Query Logic and Record Subsetting
At a cross-functional study team meeting, a statistician suggests collecting blood gases electronically through the existing continuous hemodynamic monitoring system at sites rather than having a person record the values every five minutes during the study procedure. Assuming that sending, receiving, and integrating these data are possible, what is the best response?
Manual recording is preferred because healthcare devices are not validated to 21 CFR Part 11 standards
Manual recording is preferred because the sites may forget to turn on the machine and lose data
Electronic acquisition is preferable because more data points can be acquired
Electronic acquisition is preferable because the chance for human error is removed
Assuming the data transfer, integration, and validation processes are properly controlled and compliant, electronic acquisition of clinical data from medical devices is preferred because it allows more frequent and accurate data collection, leading to higher data resolution and integrity.
Per the GCDMP (Chapter: Technology and Data Integration), automated data collection minimizes manual transcription and reduces latency in data capture, ensuring both efficiency and completeness. While manual processes introduce human transcription errors and limit frequency, continuous electronic data capture can record thousands of accurate, time-stamped measurements, improving the study's analytical power.
However, option D slightly overstates the case: human error is reduced, not entirely eliminated, since setup, calibration, and integration still involve human oversight. Therefore, option C is the best and most precise response, emphasizing the advantage of more robust and complete data capture.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Technology and Data Integration, Section 5.4 – Automated Data Acquisition and Validation
ICH E6(R2) GCP, Section 5.5.3 – Validation of Computerized Systems and Electronic Data Sources
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.3 – Direct Data Capture from Instruments and Devices
Which metric reveals the timeliness of the site-work dimension of site performance?
Time from Last Patient Last Visit to database lock
Time from final protocol to first patient enrolled
Time from site contract execution to first patient enrolled
Median and range of time from query generation to resolution
The site-work dimension of site performance evaluates how efficiently sites manage and resolve data-related tasks, particularly query resolution, data entry, and correction timelines. Among the given metrics, the median and range of time from query generation to resolution (D) directly measures the site's responsiveness and data management efficiency.
According to the GCDMP (Chapter on Metrics and Performance Measurement), this indicator helps identify sites that delay query resolution, which can impact overall study timelines and data quality. Tracking this metric allows the data management team to proactively provide additional training or communication to underperforming sites.
Other options measure different aspects of project progress:
A reflects overall database closure speed.
B and C relate to study startup and enrollment readiness, not ongoing data work.
Thus, option D accurately represents a site performance timeliness metric, aligning with CCDM principles for operational performance measurement.
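Computing the metric itself is straightforward; a sketch with invented per-query resolution times:

# Sketch: median and range of query resolution times for one site (invented data).
from statistics import median

resolution_days = [2, 3, 3, 5, 9, 14]  # days from query generation to resolution
print("median:", median(resolution_days))                         # median: 4.0
print("range:", min(resolution_days), "-", max(resolution_days))  # range: 2 - 14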
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Metrics and Performance Management, Section 5.4 – Site Query Resolution Metrics
ICH E6(R2) Good Clinical Practice, Section 5.18 – Monitoring and Site Performance Oversight
Which document contains the details of when, to whom, and in what manner the vendor data will be sent?
Project Plan
Communication Plan
Data Transfer Agreement
Data Management Plan
A Data Transfer Agreement (DTA) defines the operational and technical details for transferring data between a sponsor and an external vendor (e.g., central lab, ECG vendor). It is a formalized, controlled document specifying what data will be sent, when transfers will occur, the transfer method, file structure, encryption or security protocols, and the recipients of the data.
The DTA is developed jointly by the sponsor and vendor before production data transfers begin. According to the GCDMP, Chapter on External Data Transfers, this agreement ensures both parties share a clear understanding of timing, responsibility, and data content to minimize errors and ensure regulatory compliance.
The Data Management Plan (DMP) outlines general data handling processes but does not capture the technical specifics of vendor data transfer logistics. The Project Plan (A) and Communication Plan (B) are broader operational tools and not specific to data transfer protocols.
Hence, option C (Data Transfer Agreement) is the correct answer, as it precisely governs the procedural and technical framework of vendor data exchange.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: External Data Transfers, Section 4.1 – Data Transfer Agreements and Specifications
ICH E6(R2) Good Clinical Practice, Section 5.5 – Trial Management, Data Handling, and Record Keeping
Electronic submission standards require that an individual subject's complete CRF should be provided as what type of file:
Portable Document Format (.pdf)
Rich Text Format (.rtf)
Microsoft Word (.docx)
Statistical Analysis System (.sas)
Electronic submission standards, as established by FDA, CDISC, and ICH, require that an individual subject's complete Case Report Form (CRF) be submitted as a Portable Document Format (.pdf) file. The PDF format is universally recognized and accepted because it ensures that the structure, format, and visual fidelity of the CRF are preserved exactly as originally designed, regardless of software or hardware environment.
According to the FDA Guidance for Industry: Providing Regulatory Submissions in Electronic Format (2006) and CDISC SDTM standards, sponsors must include a subject-level CRF in PDF form for each participant in the submission dataset. This requirement ensures that reviewers can trace data points from analysis datasets back to their source entries in the CRF, fulfilling the principles of data traceability and transparency.
The Good Clinical Data Management Practices (GCDMP) also support this requirement, emphasizing that CRF archiving should maintain readability and regulatory accessibility. Formats like RTF, DOCX, or SAS datasets are not acceptable substitutes for regulatory CRF submission because they may alter formatting or structure, or introduce modifiable content, violating FDA data integrity principles.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Archiving and Submission
FDA Guidance for Industry: Providing Regulatory Submissions in Electronic Format, April 2006
CDISC SDTM Implementation Guide, Section 5.3 – CRF Representation and Traceability
Which mode of data entry is most commonly used in EDC systems?
Double entry
Blind verification
Single entry
Third party compare
The most common mode of data entry in Electronic Data Capture (EDC) systems is single data entry.
According to the GCDMP (Chapter: Electronic Data Capture Systems), EDC systems have built-in edit checks, validation rules, and audit trails that ensure data accuracy and integrity at the point of entry. These real-time validation capabilities make double data entry (a legacy practice from paper studies) unnecessary.
EDC systems automatically verify data as they are entered by site staff, generating queries for inconsistencies or out-of-range values immediately. Blind verification (option B) and third-party comparisons (option D) are not standard data entry modes but may be used for specialized reconciliation or external data imports.
Thus, single data entry (Option C) is the industry-standard approach, ensuring both efficiency and compliance with FDA 21 CFR Part 11 and ICH E6 (R2) data integrity requirements.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Electronic Data Capture (EDC) Systems, Section 5.4 – Data Entry and Verification Processes
ICH E6 (R2) Good Clinical Practice, Section 5.5.3 – Computerized Systems and Data Validation
FDA 21 CFR Part 11 – Electronic Records and Electronic Signatures: Validation and Data Entry Requirements
Which competency is necessary for EDC system use in a study using the medical record as the source?
Screening study subjects
Using ePRO devices
Resolving discrepant data
Training on how to log into Medical Records system
In studies where the medical record serves as the source document, the Electronic Data Capture (EDC) system users (typically study coordinators or site personnel) must have appropriate training on how to access and log into the medical record system. This competency ensures that data abstracted from the electronic medical record (EMR) are complete, accurate, and verifiable in compliance with Good Clinical Practice (GCP) and Good Clinical Data Management Practices (GCDMP).
According to the GCDMP (Chapter: EDC Systems and Data Capture) and ICH E6(R2), all personnel involved in data entry and verification must be trained in both the EDC and the primary source systems (e.g., EMR). This ensures that the integrity of the data flow, from source to EDC, is maintained, and that personnel understand system access controls, audit trails, and proper documentation of source verification.
While resolving discrepant data (C) and screening subjects (A) are part of study operations, the competency directly related to EDC system use in EMR-based studies is the ability to properly log into and navigate the medical records system to extract source data.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Electronic Data Capture (EDC), Section 5.1 – Source Data and System Access Requirements
ICH E6(R2) Good Clinical Practice, Section 4.9 – Source Documents and Data Handling
FDA Guidance: Use of Electronic Health Record Data in Clinical Investigations, Section 3 – Investigator Responsibilities
Data characterizing the safety profile of a drug are collected to provide information for which of the following?
Survival curves
Efficacy meta-analyses
Product labeling
Quality of life calculations
Safety data collected during a clinical trial are used primarily to support product labeling, ensuring accurate communication of a drug's risks, contraindications, and adverse reactions to healthcare providers and patients.
According to the GCDMP (Chapter: Safety Data Handling and Reconciliation) and ICH E2A/E2F guidelines, all adverse events (AEs), serious adverse events (SAEs), and laboratory abnormalities are analyzed and summarized to define the safety profile of an investigational product. These data form the basis for regulatory submissions such as the Clinical Study Report (CSR) and product labeling (e.g., prescribing information), as required by the FDA and other regulatory authorities.
While safety data may contribute indirectly to analyses such as survival curves (option A) or quality of life metrics (option D), their primary regulatory function is to inform product labeling and post-marketing surveillance documentation.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Safety Data Handling and Reconciliation, Section 4.3 – Use of Safety Data in Regulatory Submissions
ICH E2A – Clinical Safety Data Management: Definitions and Standards for Expedited Reporting
FDA Guidance for Industry: Adverse Event Reporting and Labeling Requirements
Which of the following laboratory findings is a valid adverse event reported term that facilitates auto coding?
Elevated HDL
ALT
Abnormal SGOT
Increased alkaline phosphatase, increased SGPT, increased SGOT, and elevated LDH
When coding adverse events (AEs) using MedDRA (Medical Dictionary for Regulatory Activities), valid AE terms must correspond to specific, medically meaningful concepts that match directly to a Preferred Term (PT) or Lowest Level Term (LLT) in the dictionary.
Among the options, "Elevated HDL" (High-Density Lipoprotein) represents a single, medically interpretable, standard term that can match directly to a MedDRA LLT or PT. This makes it suitable for auto-coding, where the system automatically maps verbatim terms to MedDRA entries without manual intervention.
In contrast:
ALT (B) and Abnormal SGOT (C) are incomplete or nonspecific; they describe test names or qualitative interpretations rather than events.
Option D lists multiple findings, making it too complex for automatic mapping. Such compound entries would require manual coding review.
According to the GCDMP (Chapter: Medical Coding and Dictionaries), a valid AE term should be:
Clinically interpretable (not just a lab test name)
Unambiguous
Single-concept based, not a collection of results
Thus, option A (Elevated HDL) is correct, as it aligns with MedDRA's single-concept, standard terminology structure suitable for auto-coding.
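A deliberately simplified sketch of the auto-coding logic described above; the dictionary entry and the compound-term test are illustrative assumptions, not actual MedDRA content or a real coding engine:

# Toy auto-coder: single-concept verbatims match the dictionary; compound or
# bare-test-name entries fall out for manual review. Terms are illustrative only.
dictionary = {"elevated hdl": "High density lipoprotein increased"}  # verbatim -> LLT

def autocode(verbatim):
    if "," in verbatim or " and " in verbatim.lower():
        return None  # compound entry: route to manual coding review
    return dictionary.get(verbatim.strip().lower())  # None if no direct match

print(autocode("Elevated HDL"))                    # matches a dictionary term
print(autocode("ALT"))                             # None: test name, not an event
print(autocode("increased SGPT, increased SGOT"))  # None: compound entry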
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Medical Coding and Dictionaries, Section 5.3 – Auto-coding and Verbatim Term Management
ICH M1 MedDRA Term Selection: Points to Consider, Section 2.1 – Coding Principles
ICH E2B(R3) – Clinical Safety Data Management: Data Elements for Transmission of Individual Case Safety Reports
Which information should an auditee expect prior to an audit?
Auditor's credentials and certification number
Corrective action requests
Standard operating procedures
Audit plan or agenda
Prior to an audit, the auditee should expect to receive an audit plan or agenda, which outlines the scope, objectives, schedule, and logistics of the audit.
According to the GCDMP (Chapter: Quality Assurance and Audits), an audit plan ensures transparency, preparation, and efficient execution. It typically includes details such as:
The audit scope and objectives,
The audit team members,
Documents or processes to be reviewed, and
The audit schedule and timeframe.
This allows the auditee to prepare the necessary records, staff, and facilities. While the auditor's credentials (option A) may be shared informally, they are not a regulatory requirement. Corrective actions (option B) are outcomes of the audit, not pre-audit materials. Standard Operating Procedures (option C) may be requested during the audit but are not provided in advance.
Thus, Option D (Audit Plan or Agenda) is the correct and compliant answer.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Quality Assurance and Audits, Section 6.1 – Pre-Audit Planning and Communication
ICH E6 (R2) Good Clinical Practice, Section 5.19.3 – Audit Procedures and Responsibilities
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations – Section 8.1 – Audit Preparation and Planning
Which data are needed to monitor site variability in eligibility screening?
Number of sites with low enrollment
Number of subjects screened and number of subjects enrolled
Number of subjects enrolled
Number of sites with high enrollment
To monitor site variability in eligibility screening, you must analyze the number of subjects screened versus the number of subjects enrolled at each site. This allows identification of sites that are over- or under-screening relative to their enrollment yield.
The GCDMP (Chapter: Data Quality Assurance and Metrics) emphasizes that screening-to-enrollment ratios are critical indicators of protocol compliance and data quality. Sites with unusually low conversion rates may misunderstand the inclusion/exclusion criteria, warranting targeted training or monitoring.
The other options (A, C, D) provide enrollment counts alone, which cannot reveal screening efficiency or variability; both screening and enrollment data are required.
Thus, option B correctly identifies the data necessary for monitoring eligibility screening performance across sites.
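As a hedged sketch of how such a metric might be computed, the Python snippet below derives a per-site conversion rate from screened and enrolled counts; the site names, counts, and 50% flagging threshold are all invented for illustration.

```python
# Illustrative computation of screening-to-enrollment conversion per site.
# Counts and the review threshold are invented example data.
site_metrics = {
    "Site 101": {"screened": 40, "enrolled": 30},
    "Site 102": {"screened": 55, "enrolled": 12},   # unusually low conversion
    "Site 103": {"screened": 35, "enrolled": 28},
}

for site, m in site_metrics.items():
    conversion = m["enrolled"] / m["screened"] if m["screened"] else 0.0
    flag = "  <-- review eligibility practices" if conversion < 0.5 else ""
    print(f"{site}: screened={m['screened']}, enrolled={m['enrolled']}, "
          f"conversion={conversion:.0%}{flag}")
```

Note that the calculation is impossible with enrollment counts alone, which is why option B (both data points) is required.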
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Data Quality Assurance and Metrics, Section 5.4 – Site Performance Metrics
ICH E6(R2) GCP, Section 5.18 – Monitoring and Site Oversight Requirements
There is a modification to the CRF and a sudden increase in the number of queries generated in the EDC system. Which action is most likely to reduce the number of queries?
Make some of the existing edit checks manually
Introduce a source data verification process
Review the edit checks for correctness
Have the monitor close the queries
When a CRF modification leads to a sudden increase in EDC queries, the most likely cause is an error or misconfiguration in the edit checks introduced during or after the change. Therefore, the first step should be to review the edit checks for correctness.
The GCDMP (Chapter: Database Design and Validation) emphasizes that any database or CRF modification should trigger retesting of affected validation rules. Incorrect logic, wrong thresholds, or missing conditional statements in automated edit checks can cause false or redundant queries, leading to unnecessary data management burden and site frustration.
Manually handling edit checks (option A) or adding SDV (option B) does not address the root cause. Having monitors close queries (option D) would mask the problem rather than resolve it.
Thus, the correct corrective measure is Option C: review and validate the edit checks to ensure proper functionality.
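For illustration only, a misconfigured range check is one common way a CRF change produces false queries. The sketch below shows the kind of simple regression test a review might run; the field name, range limits, and test values are hypothetical, not from any real study.

```python
# Hypothetical example: after a CRF change, confirm the systolic BP range
# check still passes valid values, so it does not fire false queries.
def systolic_bp_check(value, low=90, high=180):
    """Return True if the value passes the range check (no query raised)."""
    return low <= value <= high

# Simple regression test a reviewer might run after the CRF modification.
valid_values = [95, 120, 175]
for v in valid_values:
    assert systolic_bp_check(v), f"False query on valid value {v}"
print("All valid values pass; no spurious queries from this check.")
```

If such a test fails, the check's logic or thresholds, not the site data, are the likely source of the query spike.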
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Database Design and Validation, Section 5.5 – Edit Check Testing and Review
ICH E6 (R2) GCP, Section 5.5.3 – Validation and Change Control for Electronic Systems
FDA 21 CFR Part 11 – System Validation and Change Documentation
The result set from the query below would be which of the following?
SELECT Pt_ID, MRN, SSN FROM patient
Wider than the patient table
Shorter than the patient table
Longer than the patient table
Narrower than the patient table
In a SQL (Structured Query Language) database, the SELECT statement specifies which columns to return from a table. In this query, only three columns (Pt_ID, MRN, and SSN) are selected from the patient table.
This means the resulting dataset will contain:
The same number of rows (records) as the original table (assuming no WHERE filter), and
Fewer columns than the full table.
In database terminology:
“Wider” refers to more columns (fields).
“Narrower” refers to fewer columns (fields).
Since this query retrieves only three columns (out of potentially many in the original table), the result set is narrower than the patient table, making option D correct.
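A runnable demonstration using Python's built-in sqlite3 module follows; the extra columns (Name, DOB) and the sample rows are invented to show that the result keeps every row but only the three selected columns.

```python
import sqlite3

# Build a hypothetical patient table with more columns than the query selects.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE patient
                (Pt_ID TEXT, MRN TEXT, SSN TEXT, Name TEXT, DOB TEXT)""")
conn.executemany("INSERT INTO patient VALUES (?,?,?,?,?)",
                 [("P01", "M100", "S100", "A", "1970-01-01"),
                  ("P02", "M200", "S200", "B", "1980-02-02")])

cursor = conn.execute("SELECT Pt_ID, MRN, SSN FROM patient")
rows = cursor.fetchall()
print(len(rows), "rows (same count as the table)")   # 2 rows
print(len(cursor.description), "columns (of 5)")     # 3 columns -> narrower
```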
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Database Design and Build, Section 5.1 – Relational Databases and Query Logic
ICH E6(R2) GCP, Section 5.5.3 – Data Retrieval and Integrity Principles
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.4 – Database Query Controls
ePRO data are collected for a study using study devices given to subjects. Which is the most appropriate quality control method for the data?
Programmed edit checks to detect out of range values after submission to the database
Manual review of data by the site study coordinator at the next visit
Data visualizations to look for site-to-site variation
Programmed edit checks to detect out of range values upon data entry
When electronic patient-reported outcome (ePRO) devices are used, data are captured directly by subjects through validated devices and transmitted electronically to the study database. To ensure real-time data quality control, programmed edit checks should be implemented at the point of data entry, that is, as subjects enter data into the device.
According to Good Clinical Data Management Practices (GCDMP, Chapter: Data Validation and Cleaning), front-end programmed edit checks are the optimal method to prevent entry of invalid or out-of-range values in ePRO systems. This helps maintain data accuracy at the source, minimizing downstream queries and data cleaning workload.
Options A and B involve post-submission or manual review, which is less efficient and does not follow the principle of first-pass data validation. Option C (visualization) is a valuable secondary QC method for trends, but not for immediate data validation.
Therefore, option D is correct: programmed edit checks upon data entry ensure immediate validation and higher data integrity.
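As a sketch of the idea (the pain-score field and its 0-10 range are hypothetical, not from the question), a front-end check rejects an out-of-range value at entry, before it ever reaches the database:

```python
# Illustrative front-end check: validate a 0-10 pain score as the subject
# enters it on the ePRO device, before the value is submitted.
def validate_pain_score(raw):
    score = int(raw)                      # non-numeric input raises ValueError
    if not 0 <= score <= 10:
        raise ValueError("Pain score must be between 0 and 10.")
    return score

try:
    validate_pain_score("42")             # rejected at entry, never stored
except ValueError as err:
    print(f"Re-prompt subject: {err}")
```

Because the device re-prompts immediately, no query is ever generated for this value, which is the efficiency gain the explanation above describes.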
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Data Validation and Cleaning, Section 5.3 – Automated Edit Checks and Front-End Validation
ICH E6(R2) GCP, Section 5.5.3 – Computerized System Controls and Validation
FDA Guidance for Industry: Electronic Source Data in Clinical Investigations (2013), Section 6 – Real-Time Data Quality Control
QA is conducting an audit on a study for ophthalmology which is ready for lock. Inconsistencies are found between the database and the source. Of the identified fields containing potential data errors, which fields are considered critical for this particular study?
Subject Identifier
Concomitant Medications
Weight
Medical History
In an ophthalmology clinical study, data criticality is determined by how directly a data element affects safety evaluation, efficacy assessment, and regulatory decision-making. According to Good Clinical Data Management Practices (GCDMP, Chapter: Data Validation and Cleaning), critical data fields are those that:
Have a direct impact on the primary and secondary endpoints, or
Are essential for safety interpretation and adverse event causality assessment.
Among the listed options, Concomitant Medications (Option B) is considered critical data for ophthalmology studies. Many ocular treatments and investigational products can interact with systemic or topical medications, potentially affecting ocular response, intraocular pressure, corneal healing, or visual function outcomes. Any inconsistency in concomitant medication data could directly influence safety conclusions or efficacy interpretations.
Other options, while important, are less critical for this study type:
Subject Identifier (A) is essential for data traceability and audit purposes but is not directly related to safety or efficacy outcomes.
Weight (C) may be relevant in dose-dependent drug trials but is rarely a pivotal variable in ophthalmology, where local administration (eye drops, intraocular injections) is common.
Medical History (D) provides contextual background but does not have the same immediate impact on endpoint analysis as current concomitant treatments, which can confound the therapeutic effect or cause ocular adverse events.
Per GCDMP and ICH E6 (R2) GCP guidelines, data validation plans must define critical data fields during study setup, reflecting therapeutic area-specific priorities. For ophthalmology, concomitant medications, ocular assessments (visual acuity, intraocular pressure, retinal thickness, etc.), and adverse events are typically designated as critical fields requiring heightened validation, source verification, and reconciliation accuracy before database lock.
Thus, when QA identifies discrepancies between the CRF and source, the Concomitant Medications field (Option B) is the most critical to address immediately to ensure clinical and regulatory data integrity.
Reference (CCDM-Verified Sources):
Society for Clinical Data Management (SCDM), Good Clinical Data Management Practices (GCDMP), Chapter: Data Validation and Cleaning, Section 6.4 – Critical Data Fields and Data Validation Prioritization
ICH E6 (R2) Good Clinical Practice, Section 5.18 – Monitoring and Source Data Verification
FDA Guidance for Industry: Oversight of Clinical Investigations — A Risk-Based Approach to Monitoring, Section 5.3 – Identification of Critical Data and Processes
SCDM GCDMP Chapter: Data Quality Assurance and Control – Therapeutic Area–Specific Data Criticality Examples (Ophthalmology Studies)
According to ICH E6, developing a Monitoring Plan is the responsibility of whom?
Sponsor
CRO
Data Manager
Monitor
According to ICH E6(R2) Good Clinical Practice (GCP), Section 5.18.1, the Sponsor is ultimately responsible for developing and implementing the Monitoring Plan.
The Monitoring Plan defines:
The extent and nature of monitoring (e.g., on-site, remote, risk-based).
The responsibilities of monitors.
The communication and escalation procedures for data quality and protocol compliance.
While the CRO (B) or Monitor (D) may perform monitoring activities under delegation, the Sponsor retains legal accountability for ensuring a compliant and effective plan is developed and maintained. The Data Manager (C) may contribute by outlining data review workflows but is not responsible for authoring or owning the plan.
Therefore, option A (Sponsor) is the correct answer.
Reference (CCDM-Verified Sources):
ICH E6(R2) GCP, Section 5.18.1 – Purpose and Responsibilities for Monitoring
SCDM GCDMP, Chapter: Regulatory Compliance and Oversight, Section 5.3 – Sponsor Responsibilities in Monitoring and Quality Assurance
FDA Guidance for Industry: Oversight of Clinical Investigations – Sponsor Responsibilities (2013)
Which is the best reason why front-end checks are usually kept minimal, when compared to back-end checks, in a paper-based clinical study?
Data entry staff should be able to enter a value into the database just as it appears in the paper CRF
There is no need to alert the site personnel immediately about a data issue, as the study has happened already
There are approvals required to raise a Data Clarification Form which could take time
Data review can be performed at a later time due to the paper-based studies being smaller in size
In paper-based clinical studies, front-end data checks (those performed during data entry) are intentionally kept minimal to ensure that data are entered exactly as recorded on the paper CRF. This principle preserves data integrity by maintaining fidelity between source and electronic records before any cleaning or edit validation occurs.
The GCDMP (Chapter: Data Validation and Cleaning) explains that data entry operators should input values as written, even if they appear incorrect or inconsistent, because the purpose of data entry is to capture data faithfully, not to interpret it. The back-end edit checks, performed later by data managers, are designed to identify inconsistencies, out-of-range values, or logical errors that require clarification through queries.
This approach separates data capture from data cleaning, minimizing bias and preserving the investigator's original input. Hence, option A accurately states the rationale for keeping front-end checks minimal in paper-based studies.
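By contrast with a front-end check, a back-end check runs in batch after faithful entry and raises queries rather than blocking the operator. The minimal sketch below illustrates this division of labor; the field, the 30-250 kg plausibility range, and the records are all invented for illustration.

```python
# Illustrative back-end check: values are stored exactly as written on the
# CRF, then scanned in batch; discrepancies become queries, not entry blocks.
entered_records = [
    {"subject": "001", "field": "weight_kg", "value": "72"},
    {"subject": "002", "field": "weight_kg", "value": "7200"},  # as on the CRF
]

queries = []
for rec in entered_records:
    if not 30 <= float(rec["value"]) <= 250:      # hypothetical plausible range
        queries.append(f"Query {rec['subject']}: weight_kg={rec['value']} "
                       "outside expected range; please clarify against source.")

print("\n".join(queries) or "No queries raised.")
```

The suspicious value is entered as written and only afterward routed to the site for clarification, preserving the capture/cleaning separation described above.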
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Data Validation and Cleaning, Section 4.2 – Data Entry, Edit Checks, and Query Process
ICH E6(R2) GCP, Section 5.5.3 – Data Handling and System Controls
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.1 – Data Entry and Verification Processes
When reviewing local lab data from a paper study, a Data Manager notices there are lab values not entered. What should the Data Manager request data-entry personnel do?
Flag the module for review
Call the patient to verify the information
Issue a query
Nothing
When laboratory data are missing from a paper-based clinical study, the Data Manager should direct data-entry personnel to issue a query to the investigative site for clarification or correction.
According to Good Clinical Data Management Practices (GCDMP, Chapter: Data Validation and Cleaning), every missing, inconsistent, or out-of-range data point must be reviewed and, if necessary, resolved through the formal query management process. This ensures that all discrepancies between the source documents and database entries are properly documented, traceable, and auditable.
Data-entry staff are not authorized to infer or fill in missing information. They must escalate such discrepancies to the site via query, preserving data integrity and regulatory compliance with ICH E6 (R2) and FDA 21 CFR Part 11. Calling the patient directly (option B) would violate confidentiality and site communication protocols, while simply flagging or ignoring the issue (options A and D) would not meet GCDMP query resolution standards.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Validation and Cleaning, Section 5.2 – Query Management and Resolution
ICH E6 (R2) Good Clinical Practice, Section 5.18.4 – Communication of Data Discrepancies
FDA 21 CFR Part 11 – Electronic Records; Query Audit Trails Requirements
During a database audit, it was determined that there were more errors than expected. Who is responsible for assessing the overall impact on the analysis of the data?
Data Manager
Statistician
Quality Auditor
Investigator
The Statistician is responsible for assessing the overall impact of data errors on the analysis and study results.
According to Good Clinical Data Management Practices (GCDMP, Chapter: Data Quality Assurance and Control) and ICH E9 (Statistical Principles for Clinical Trials), the Data Manager ensures data accuracy and completeness through cleaning and validation, while the Statistician determines whether the observed discrepancies are statistically significant or may affect the validity, power, or interpretability of the study's outcomes.
The Quality Auditor (C) identifies and reports issues but does not quantify analytical impact. The Investigator (D) is responsible for clinical oversight, not statistical assessment. Thus, after a database audit, the Statistician (B) performs a formal evaluation to determine whether the magnitude and nature of the errors could bias results or require reanalysis.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Quality Assurance and Control, Section 7.3 – Data Audit and Impact Assessment
ICH E9 – Statistical Principles for Clinical Trials, Section 3.2 – Data Quality and Analysis Impact Assessment
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations – Data Validation and Analysis Review
A study is using blood pressure as an efficacy measure. Which is the best way to collect the data?
Collecting the data from the medical record
Measurement using existing equipment at sites
Measurement using study-provisioned equipment
Asking the study subjects what their blood pressure usually runs
When a clinical study uses blood pressure (BP) as an efficacy endpoint, the most reliable and standardized method of data collection is study-provisioned equipment.
According to the GCDMP (Chapter: CRF Design and Data Collection), data collected for primary efficacy endpoints must be consistent, accurate, and standardized across all investigative sites. Using study-provided, calibrated equipment ensures that measurements are taken under uniform conditions, eliminating inter-site variability due to differences in devices, calibration, or measurement methods.
Collecting BP data from medical records (option A) risks inconsistent timing and techniques. Using each site’s own equipment (option B) introduces variability, while patient self-reports (option D) lack reliability and objectivity.
Thus, the best practice is to provision and standardize all equipment used to collect endpoint-related physiological data, ensuring regulatory-quality results suitable for analysis.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: CRF Design and Data Collection, Section 5.1 – Standardization of Clinical Measurements
ICH E6 (R2) GCP, Section 5.5.3 – Data Accuracy and Equipment Standardization
FDA Guidance for Industry: Electronic Source Data in Clinical Investigations, Section 4.3 – Data Capture and Standardization Requirements
A Data Manager is establishing a timeline for database lock for a 100-person study where the data have been maintained almost all clean throughout the study. All data from external labs have been received and reconciled. Which is the best estimate of the amount of time needed to lock the database after Last Patient Last Visit?
A few hours
A few days
A few months
A few weeks
For a well-maintained 100-subject study with ongoing data cleaning and completed reconciliations, the database lock process typically takes a few days after the Last Patient Last Visit (LPLV).
According to the GCDMP (Chapter: Database Lock and Archiving), the duration of the lock process depends on the level of data cleanliness at LPLV. If the study team has conducted continuous data cleaning, query resolution, and external data reconciliation throughout the trial, the final lock steps (e.g., final data review, documentation, and approvals) can be completed in 2–5 days.
However, if significant cleaning or reconciliation remains outstanding, lock may take several weeks. Since the question states that the data have been maintained almost all clean, Option B – a few days – is the appropriate estimate.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Database Lock and Archiving, Section 6.2 – Database Lock Preparation and Timelines
ICH E6 (R2) Good Clinical Practice, Section 5.5.3 – Data Quality and Lock Procedures
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations – Data Lock and Archiving Procedures
Which document describes what study subjects expect with respect to data disclosure during and after a study?
Study data sharing plan
ICH essential documents
Informed consent form
Study protocol
The Informed Consent Form (ICF) is the document that explicitly describes what study subjects can expect regarding data disclosure, privacy, and confidentiality during and after participation in a clinical trial. According to ICH E6 (R2) Good Clinical Practice and FDA Human Subject Protection Regulations (21 CFR Parts 50 and 56), participants must be fully informed about how their personal and clinical data will be collected, used, stored, and shared, both during the study and in any subsequent data-sharing or publication activities.
The GCDMP reiterates that clinical data managers must ensure that all data handling practices align with the privacy commitments made in the ICF. This includes compliance with data protection regulations such as HIPAA (in the U.S.) and GDPR (in the EU). The ICF defines the permissible scope of data use, ensuring ethical management and subject protection.
Documents like the protocol or data sharing plan may outline procedures and responsibilities but do not directly inform participants of their rights and data use expectations. Only the ICF is designed for that ethical communication purpose.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Ethics, Privacy, and Data Security
ICH E6 (R2) Good Clinical Practice, Sections 4.8.10 & 4.8.12
FDA 21 CFR Part 50 – Protection of Human Subjects, Informed Consent Requirements