PIC/S Data Integrity Guidance Implementation (Part 3)

Data Criticality

The criticality of data varies depending on the degree to which the data influences decision-making processes. When assessing data criticality, the following considerations must be evaluated:

1. Assessment of Impact on Decision-Making

It is essential to clearly identify which decisions are influenced by specific data. For example, data supporting product disposition (release approval) is significantly more critical than routine warehouse cleaning records, as it directly impacts product quality and patient safety.

2. Impact on Product Quality and Safety

The extent to which data affects product quality and safety must be evaluated. For instance, in the manufacture of oral solid dosage forms (tablets), Active Pharmaceutical Ingredient (API) test data has a greater impact on product quality and safety than tablet appearance non-conformance data. Since API purity and content directly affect patient therapeutic outcomes, it should be assigned higher criticality.

Data Risk

Scope of Data Integrity Requirements

Data integrity requirements apply to all Good Manufacturing Practice (GMP) and Good Distribution Practice (GDP) data. This means that data completeness, accuracy, and reliability must be ensured at all stages of manufacturing, quality control, storage, and distribution.

Importance of Risk-Based Approach

Data criticality assessment is essential for prioritizing responses. The rationale for prioritization must be documented in accordance with ICH Q9(R1) “Quality Risk Management” principles. This approach allows organizations to focus limited resources on managing the most critical data integrity risks.
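As an illustration only, the prioritization logic described above can be sketched as a simple risk-ranking helper in the spirit of ICH Q9(R1). The 1–3 scales, the inversion of detectability, and the example records are hypothetical and are not values prescribed by PIC/S or ICH:

```python
# Illustrative only: a simple risk-ranking helper in the spirit of ICH Q9(R1)
# quality risk management. Scales and example scores are hypothetical.

def risk_priority(criticality: int, likelihood: int, detectability: int) -> int:
    """Higher score = higher remediation priority.

    Each factor is scored 1 (low) to 3 (high). Detectability is inverted:
    failures that are hard to detect should *raise* the priority.
    """
    for factor in (criticality, likelihood, detectability):
        if factor not in (1, 2, 3):
            raise ValueError("each factor must be scored 1, 2 or 3")
    return criticality * likelihood * (4 - detectability)

# Example: batch release data outranks a routine cleaning record.
records = [
    ("Batch release test data", 3, 2, 1),    # critical, hard to detect
    ("Warehouse cleaning record", 1, 2, 3),  # low impact, easy to detect
]
for name, c, l, d in sorted(records, key=lambda r: risk_priority(*r[1:]), reverse=True):
    print(f"{name}: priority {risk_priority(c, l, d)}")
```

A multiplicative score of this kind is one common QRM-style tool; the documented rationale behind each factor score matters more than the arithmetic itself.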

Assessment of Data Vulnerabilities

Data risk assessment must comprehensively consider the following vulnerabilities:

  • Unintentional Changes: Unexpected data modifications due to human error, system errors, or procedural deficiencies
  • Deletion: Accidental deletion or deletion due to improper data lifecycle management
  • Data Loss: Loss due to hardware failures, software defects, or security breaches (cyberattacks, unauthorized access, etc.)
  • Data Recreation: Inaccurate data created based on memory or assumptions after original data loss
  • Intentional Falsification: Planned data modification or fabrication with fraudulent intent

Ensuring Detectability

The ability to detect data integrity breaches or data anomalies is a critical element of an effective data governance system. Systems must be established to rapidly detect data anomalies or unauthorized changes through audit trails, periodic data reviews, exception reports, and system alert functions.

Data Recovery Capability

It is important to implement measures that ensure complete and timely data recovery in the event of disasters, system failures, or security incidents. This includes regular backups, Disaster Recovery Plans, Business Continuity Plans, and periodic testing of these plans.

Implementation of Control Measures

Control measures that prevent fraud and enhance visibility/detectability are critically important for risk mitigation. These include:

  • Preventive Controls: Access control, user privilege management, validation functions at data entry
  • Detective Controls: Audit trail reviews, data integrity checks, anomaly detection systems
  • Corrective Controls: Incident response procedures, data recovery processes, Root Cause Analysis (RCA)
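A minimal sketch of a preventive control of the first kind (access control and user privilege management), assuming a hypothetical role-to-permission mapping; a production system would normally rely on a validated identity and access management layer rather than an in-code table:

```python
# Hypothetical roles and actions, for illustration only.
ROLE_PERMISSIONS = {
    "analyst": {"create_record", "view_record"},
    "reviewer": {"view_record", "approve_record"},
    "admin": {"create_record", "view_record", "approve_record", "configure_system"},
}

def is_permitted(role: str, action: str) -> bool:
    """Deny by default: allow only actions explicitly granted to the role."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default design choice is the point of the sketch: an unknown role or action is rejected rather than silently allowed.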

Risk Assessment Focused on Business Processes

Process-Centric Approach

Risk assessment must focus on the overall business processes (manufacturing, quality control, packaging, storage, etc.) and comprehensively evaluate data flows and methods of data generation and processing, not just IT system functionality or technical complexity.

Considerations in Assessment

1. Process Complexity

  • Multi-stage Processes: Processes involving multiple steps increase data integrity risks at each stage
  • Data Transfer Between Systems: Appropriate controls are necessary to maintain data completeness and accuracy when transferring data between different systems
  • Complex Data Processing: Processes involving integration from multiple data sources, complex calculations, or data transformations increase error risks

2. Data Management Methods

Evaluate methods of generating, processing, storing, and archiving data, as well as functions implemented to ensure data quality and integrity. This includes:

  • Controls at Data Generation: Automatic recording, electronic signatures, timestamps
  • Validation During Data Processing: Calculation verification, data integrity checks, exception handling
  • Data Storage and Archiving: Appropriate storage, backup, long-term preservation assurance
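To make the data-generation controls above concrete, here is an illustrative sketch (not a PIC/S-prescribed design) of an append-only log: each entry is stamped automatically with the user and a UTC timestamp and is chained to the previous entry with a SHA-256 hash, so later modification of any stored entry becomes detectable on review.

```python
# Illustrative sketch of controls at data generation: automatic recording,
# timestamps, and hash chaining for tamper evidence.
import hashlib
import json
from datetime import datetime, timezone

GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

class AuditedLog:
    """Append-only record log with hash chaining for tamper evidence."""

    def __init__(self):
        self.entries = []

    @staticmethod
    def _digest(body: dict) -> str:
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

    def append(self, user: str, payload: dict) -> dict:
        body = {
            "user": user,                                         # attributable
            "timestamp": datetime.now(timezone.utc).isoformat(),  # contemporaneous
            "payload": payload,
            "prev_hash": self.entries[-1]["hash"] if self.entries else GENESIS,
        }
        entry = dict(body, hash=self._digest(body))
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; False means an entry was altered or reordered."""
        prev = GENESIS
        for e in self.entries:
            body = {k: e[k] for k in ("user", "timestamp", "payload", "prev_hash")}
            if e["prev_hash"] != prev or self._digest(body) != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Commercial systems implement these controls at the database and application layers; the sketch only shows why a chained audit record makes undetected edits hard.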

3. Process Consistency and Variability

Data integrity risks differ depending on the characteristics of production processes and analytical testing:

  • Biological Products: Bioprocesses may exhibit high variability, requiring more stringent data monitoring and controls
  • Small Molecule Chemical Drugs: Often more reproducible processes, but appropriate data integrity controls are still necessary
  • Analytical Testing: Test method complexity, operator skill, and equipment condition affect result variability

4. Degree of Automation and Human Intervention

Highly automated systems can reduce human error risks but must be appropriately designed, validated, and maintained:

  • Fully Automated Systems: Minimize human intervention and reduce data integrity risks
  • Semi-automated Systems: Combinations of automation and manual operations require appropriate controls at each stage
  • Manual Processes: High risk of human error makes double-checking, training, and clear procedures critical

5. Subjectivity of Results

Consider the balance between objective measurements and subjective judgments:

  • Objective Data: Automatic measurements by instruments have low subjectivity
  • Subjective Evaluations: Assessments depending on operator judgment, such as visual inspection or color determination, require more stringent controls and training

6. Consistency Between Electronic Data and Manual Records

Comparing electronic system data with manually recorded events can detect data integrity issues. For example:

  • Timestamp Discrepancies: Inconsistencies between analytical report creation time and raw data acquisition time
  • Sequence Anomalies: When the sequence of recorded events is not logical
  • Missing Data: When expected data points are not recorded
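The three checks above can be sketched as a simple consistency review over a list of timestamped events expected in logical order; the function name and event format are hypothetical, for illustration only.

```python
# Sketch: flag missing data and sequence/timestamp anomalies in a list of
# (step name, ISO timestamp or None) events expected in logical order.
from datetime import datetime

def check_record_consistency(events):
    """Return human-readable findings; an empty list means no anomaly found."""
    findings = []
    parsed = [(name, datetime.fromisoformat(ts) if ts else None)
              for name, ts in events]
    # Missing data: an expected step was never recorded.
    for name, ts in parsed:
        if ts is None:
            findings.append(f"missing data: no timestamp recorded for '{name}'")
    # Sequence/timestamp anomaly: recorded times contradict the logical order.
    timed = [(n, t) for n, t in parsed if t is not None]
    for (n1, t1), (n2, t2) in zip(timed, timed[1:]):
        if t2 < t1:
            findings.append(f"sequence anomaly: '{n2}' is timestamped before '{n1}'")
    return findings

events = [
    ("raw data acquisition", "2025-07-01T10:30:00"),
    ("analytical report created", "2025-07-01T10:05:00"),  # before the raw data
    ("second-person review", None),                        # never recorded
]
for finding in check_record_consistency(events):
    print(finding)
```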

7. Inherent Integrity Controls of Systems

Evaluate inherent data integrity control functions built into systems or software:

  • Audit Trail Functionality: Automatic recording of all data changes
  • Access Control: User authentication, privilege management, security settings
  • Data Validation: Input validity checks, range checks, calculation verification
  • Backup and Recovery: Automatic backup, disaster recovery functions
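As a small illustration of the "Data Validation" item above, an input validity and range check applied at the point of entry might look like the following sketch; the helper name and limits are hypothetical.

```python
# Hypothetical input range check applied at the point of data entry.
def validate_entry(value: float, low: float, high: float) -> float:
    """Reject an entry outside the permitted range instead of storing it."""
    if not (low <= value <= high):
        raise ValueError(f"value {value} outside permitted range [{low}, {high}]")
    return value
```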

Special Considerations for Computerised Systems

When using computerised systems, the risk assessment process must carefully consider the manual interfaces between IT systems and the humans who operate them.

Effective Risk Mitigation Through Automation

Configuration settings that do not permit or minimize human intervention, creating fully automated and appropriately validated processes, can significantly reduce data integrity risks and represent a desirable approach. Such systems minimize risks of human error, intentional data falsification, and inappropriate data processing.

Latest Regulatory Trends and Best Practices

In July 2025, the European Medicines Agency (EMA) and PIC/S published draft revisions to EU GMP Annex 11 (Computerised Systems). This revision includes the following key elements:

  • Enhancement of Quality Risk Management (QRM): Thorough application of risk-based approaches throughout the system lifecycle
  • Emphasis on Data Integrity: Addition of data integrity requirements for “Data in Motion”
  • Cloud Services Response: Clarification of requirements for cloud-based service providers
  • Cybersecurity: Clear control requirements for firewalls, patch management, antivirus protection, and disaster recovery
  • AI/ML Systems: New Annex 22 introduces requirements for the use of artificial intelligence and machine learning in GMP environments

Additionally, draft revisions to EU GMP Chapter 4 (Documentation) were published in July 2025, emphasizing the following:

  • Integration of Risk Management Principles: Central integration of risk management into data governance systems
  • Response to Diverse Documentation Formats: Ensuring accuracy, completeness, availability, and legibility of all documents in paper, digital, and hybrid formats
  • Response to New Technologies: Requirements that all forms of documentation (text, images, video, audio) remain complete and readable throughout their lifecycle

Critical Thinking by Inspectors and Data Governance Maturity Assessment

Importance of Critical Thinking

Inspectors must use critical thinking skills to determine whether implemented controls and review procedures effectively achieve intended results. This includes:

  • Evaluating Control Effectiveness: Assessing not just whether controls exist, but whether they actually function
  • Confirming Procedural Effectiveness: Verifying that written procedures align with actual practices
  • Verifying Risk-Based Approaches: Confirming that risk assessments are appropriately conducted and controls are implemented based on results

Data Governance Maturity

Data governance maturity is indicated by how systematically an organization understands and accepts residual risks when prioritizing its actions. Organizations with mature data governance systems:

  • Clearly identify and document risks
  • Implement appropriate controls based on risks
  • Evaluate residual risks and determine whether they are at acceptable levels
  • Improve data integrity through continuous improvement activities

Dangers of “No Risk” Perception

Organizations claiming there is “no risk” of data integrity failures may not be appropriately evaluating risks inherent in data lifecycles. Inherent risks exist in all data management processes. A “no risk” attitude may suggest:

  • Lack of or inadequate risk assessment implementation
  • Insufficient understanding of data integrity importance
  • Organizational lack of quality culture
  • Existence of potential data integrity issues

Focus Areas During Inspections

Inspectors must therefore carefully examine an organization's approach to evaluating the data lifecycle, data criticality, and data risks. Weaknesses in that approach may indicate potential non-conformances that should be investigated during inspection.

Inspectors focus on the following:

  • Risk Assessment Documentation: Whether risk assessments are appropriately conducted and documented
  • Appropriateness of Controls: Whether appropriate controls are implemented for identified risks
  • Effectiveness of Data Review: Whether data review processes actually function and can detect anomalies
  • Continuous Improvement: When data integrity issues are discovered, whether they are appropriately addressed with recurrence prevention measures implemented
  • Quality Culture: Whether the importance of data integrity is understood and practiced throughout the organization

Relationship with ALCOA+ Principles

PIC/S PI 041-1 guidance adopts ALCOA+ principles as fundamental data integrity principles. ALCOA+ refers to the following nine attributes:

  • Attributable: Data can be identified as to who generated it and when
  • Legible: Data is readable throughout its lifecycle
  • Contemporaneous: Data is recorded at the time the activity is performed
  • Original: Data is retained in the format in which it was first recorded
  • Accurate: Data is accurate and free from errors
  • Complete: All data is recorded with no omissions
  • Consistent: Data is chronologically consistent with no contradictions
  • Enduring: Data is preserved throughout its lifecycle
  • Available: Data can be retrieved when needed
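As a hedged illustration only, some ALCOA+ attributes can be supported by required metadata on an electronic record, with their absence flagged at review time. The field names below are hypothetical, and only a subset of attributes map cleanly onto record fields (Enduring and Available, for example, are properties of archival and retrieval arrangements rather than of any single record):

```python
# Hypothetical mapping from record fields to the ALCOA+ attributes they support.
REQUIRED_METADATA = {
    "value": "Accurate/Complete",
    "recorded_by": "Attributable",
    "recorded_at": "Contemporaneous",
    "source_format": "Original",
}

def alcoa_metadata_gaps(record: dict) -> list:
    """Return the ALCOA+ attributes whose supporting metadata is missing."""
    return [attr for key, attr in REQUIRED_METADATA.items() if not record.get(key)]
```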

These principles provide core criteria in data criticality assessment, risk assessment, and control measure implementation explained in this column.

This article is based on PIC/S PI 041-1 “Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments” (effective July 2021), reflecting the latest regulatory trends (including 2025 draft revisions to EU GMP Annex 11 and Chapter 4, and newly established Annex 22 AI requirements).
