Addressing Misconceptions About Validation: The True Meaning of “Validated for Intended Use”
Introduction
As a consultant who frequently participates in FDA inspections, I have observed an increasing trend in Form FDA 483 observations related to computerized systems. One of the most common citations states that systems are “not validated for their intended use.” This article examines why such observations are issued and clarifies the fundamental concept of validation that is often misunderstood in the healthcare industry.
The FDA Definition of Validation
The FDA defines validation as “confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use can be consistently fulfilled.” This definition, found in 21 CFR 820.3(z) and FDA’s guidance on General Principles of Software Validation, emphasizes a critical point: validation is fundamentally about confirming alignment between user requirements and system specifications.
In Japanese regulatory terminology, this concept is expressed as “妥当性の確認” (confirmation of adequacy or fitness for purpose), which accurately captures the essence of validation—verifying that a system is suitable for its intended purpose.
| Key Validation Concepts | Description |
| --- | --- |
| User Requirements | The specific needs and intentions of the system user, documented in a User Requirements Specification (URS) |
| System Specifications | The functional capabilities and features that the system provides |
| Validation Objective | Confirming that system specifications consistently meet user requirements |
| Regulatory Expectation | Documented evidence that the system performs as intended throughout its lifecycle |
The IT Industry vs. Healthcare Regulatory Perspectives
In the general IT industry, “validation” typically refers to software testing—repeatedly testing systems to eliminate bugs and ensure technical functionality. However, regardless of how many times a system is tested, if the system specifications do not align with the intended use, the system cannot be considered validated.
To illustrate this critical distinction, consider a surgical scenario: if a scalpel is unavailable during surgery and someone purchases the highest-quality knife from the finest cutlery shop, that knife—despite being excellent—would be unsuitable for the “user’s intended use” of performing surgery. Using such a knife would pose obvious risks to patient safety. No amount of quality testing of the knife itself would make it appropriate for surgery.
In healthcare regulations governing pharmaceuticals and medical devices, there is a strict requirement that user requirements and system specifications must be completely aligned. The system must not only function correctly (verification) but must also be fit for its intended purpose (validation).
Common Examples of “Not Validated for Intended Use”
Recent FDA inspections have identified several recurring patterns where systems fail to meet the “validated for intended use” requirement:
Example 1: Inflexible Status Management in Quality Databases
A complaint and CAPA (Corrective and Preventive Action) database system does not allow sequential status updates. However, the company’s Standard Operating Procedure (SOP) requires that records be updated as investigations progress. This mismatch between system capability and procedural requirement represents a validation failure.
Regulatory Impact: This deficiency can result in inadequate investigation tracking and failure to meet 21 CFR Part 820.100 (Corrective and Preventive Action) requirements, potentially leading to Form FDA 483 observations or Warning Letters.
Example 2: Inappropriate Access Rights Configuration
Quality Assurance personnel and auditors have editing privileges in the electronic quality management system (eQMS). However, the SOP explicitly prohibits Quality Assurance and audit personnel from modifying records to maintain objectivity and independence. This system configuration contradicts the intended use defined in the SOP.
Data Integrity Concern: This violates the ALCOA+ principles (Attributable, Legible, Contemporaneous, Original, Accurate, plus Complete, Consistent, Enduring, and Available), which are fundamental to data integrity and increasingly emphasized in FDA inspections since 2018.
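The mismatch in Example 2 can be caught during validation by testing the configured access model directly against the SOP. The sketch below is purely illustrative, assuming hypothetical role names and a simplified permission model rather than any specific eQMS product:

```python
# Hypothetical sketch: verifying that an eQMS role configuration matches the SOP.
# Role names and the permission model are illustrative assumptions.

READ_ONLY_ROLES = {"quality_assurance", "auditor"}  # SOP: may view, never edit

def can_edit_record(user_role: str) -> bool:
    """Return True only for roles the SOP permits to modify records."""
    return user_role not in READ_ONLY_ROLES

# Validation test cases derived from the SOP, not from vendor documentation:
assert can_edit_record("complaint_owner") is True
assert can_edit_record("quality_assurance") is False  # SOP prohibits QA edits
assert can_edit_record("auditor") is False            # SOP prohibits audit edits
```

A test like this traces directly to a user requirement (“QA and audit personnel shall not modify records”), which is exactly the alignment that validation must demonstrate.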
Example 3: Missing Audit Trail Functionality
The laboratory information management system (LIMS) lacks the capability to review audit trails. In the current regulatory environment, where data integrity is paramount, the absence of comprehensive audit trail functionality makes the system inherently unsuitable for GxP use, regardless of its other capabilities.
Current Regulatory Focus: FDA’s 2024-2025 inspection trends show that data integrity and audit trail deficiencies remain among the top observations. The ability to review who did what, when, and why is now considered a non-negotiable requirement for computerized systems.
Example 4: Overwritable Analytical Results
Analytical instrument software allows test results to be overwritten without maintaining a complete history. The SOP prohibits result modification, but the system design permits it. This fundamental mismatch poses serious compliance and data integrity risks.
21 CFR Part 11 Violation: This configuration violates 21 CFR Part 11.10(e), which requires “use of secure, computer-generated, time-stamped audit trails to independently record the date and time of operator entries and actions that create, modify, or delete electronic records.”
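A compliant design records every create and modify action in an append-only history rather than overwriting values in place. The following is a minimal sketch in the spirit of 21 CFR 11.10(e); the class and field names are invented for illustration, not drawn from any real LIMS or instrument software:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: an append-only result store with a computer-generated,
# time-stamped audit trail. All names here are illustrative assumptions.

@dataclass
class ResultStore:
    _results: dict = field(default_factory=dict)     # latest value per test ID
    audit_trail: list = field(default_factory=list)  # append-only history

    def record(self, test_id: str, value: float, user: str,
               reason: str = "initial entry") -> None:
        action = "modify" if test_id in self._results else "create"
        # Every create/modify is logged; prior values are never silently lost.
        self.audit_trail.append({
            "test_id": test_id,
            "action": action,
            "old": self._results.get(test_id),
            "new": value,
            "user": user,
            "reason": reason,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        self._results[test_id] = value

store = ResultStore()
store.record("assay-001", 98.7, user="analyst1")
store.record("assay-001", 99.1, user="analyst1",
             reason="transcription error corrected")
# The full change history remains reviewable:
assert [e["action"] for e in store.audit_trail] == ["create", "modify"]
assert store.audit_trail[1]["old"] == 98.7
```

The design point is that modification is permitted but never destructive: the old value, the user, the reason, and the timestamp survive every change.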
Industry Misconceptions and Vendor Practices
Many software vendors market their products as “CSV-compliant” or claim to provide “validated systems.” However, most vendors fundamentally misunderstand validation requirements. They often focus solely on Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ) testing—essentially verifying that the software functions as designed—without addressing whether those functions actually meet the user’s intended use.
This vendor-centric approach to validation is insufficient and often leads to FDA observations during inspections. True validation requires:
- User-Centric Perspective: Starting with documented user requirements that reflect the actual business process and regulatory obligations
- Traceability: Demonstrating clear links between user requirements, functional specifications, design specifications, and test cases
- Risk-Based Approach: Applying validation rigor proportionate to the system’s impact on product quality, data integrity, and patient safety
- Lifecycle Management: Maintaining the validated state through change control, periodic review, and continuous monitoring
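The traceability requirement above can be checked mechanically: every documented user requirement must map to at least one test case. This sketch uses invented IDs purely to illustrate the check:

```python
# Hypothetical sketch: a minimal traceability-matrix check ensuring every
# user requirement (URS item) is covered by at least one test case.
# All IDs and data are invented for illustration.

requirements = {"URS-01", "URS-02", "URS-03"}
test_cases = {
    "TC-10": {"URS-01"},
    "TC-11": {"URS-02", "URS-03"},
}

def untraced_requirements(reqs: set, tests: dict) -> set:
    """Return requirements with no linked test case (traceability gaps)."""
    covered = set().union(*tests.values()) if tests else set()
    return reqs - covered

assert untraced_requirements(requirements, test_cases) == set()
```

Any non-empty result would flag a requirement for which no objective evidence of fulfillment exists, which is precisely what a “not validated for intended use” observation points to.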
Contemporary Guidance: From CSV to CSA
The validation landscape has evolved significantly in recent years. The International Society for Pharmaceutical Engineering (ISPE) published GAMP 5 Second Edition in July 2022, which represents a paradigm shift from traditional Computer System Validation (CSV) to Computer Software Assurance (CSA).
Key Changes in GAMP 5 Second Edition
| Aspect | Traditional Approach (GAMP 5 1st Edition) | Modern Approach (GAMP 5 2nd Edition) |
| --- | --- | --- |
| Development Model | Linear V-Model (waterfall) | Agile and iterative methodologies supported |
| Focus | Compliance and documentation | Patient safety, product quality, data integrity |
| Critical Thinking | Standardized approaches | Emphasis on critical, risk-based thinking |
| Legacy Documents | IQ/OQ/PQ protocols as standard | IQ/OQ/PQ less relevant; continuous validation emphasized |
| Technology Coverage | Traditional on-premises systems | Cloud, SaaS, AI/ML, blockchain addressed |
| Alignment | Carried forward GAMP 4-era concepts | Aligned with FDA Computer Software Assurance draft guidance (September 2022) |
FDA Computer Software Assurance Guidance
In September 2022, the FDA published the draft guidance “Computer Software Assurance for Production and Quality System Software,” which promotes a more efficient, risk-based approach to software validation. The CSA approach encourages:
- Critical thinking over procedural compliance
- Least burdensome validation activities that are fit-for-purpose
- Unscripted testing for lower-risk software
- Leveraging vendor documentation where appropriate
- Focus on patient risk rather than documentation volume
This guidance explicitly states that the traditional extensive documentation and scripted testing may be excessive for lower-risk software, and encourages manufacturers to apply critical thinking to determine appropriate assurance activities.
Critique of Japan’s Computerized System Management Guideline
Japan’s “Guideline for Appropriate Management of Computerized Systems for Manufacturers of Pharmaceutical Products and Quasi-pharmaceutical Products” was issued on October 21, 2010, and became effective on April 1, 2012. While this guideline represents an important step forward for the Japanese pharmaceutical industry, it contains several problematic elements that warrant discussion.
Issue 1: Absence of Validation Definition
The guideline does not provide a clear definition of “validation.” Instead, it defines “verification activities” (検証業務) as “confirming that a computerized system is designed, installed, and demonstrates its functions and performance in accordance with the requirements specified in the requirement specifications, in the system’s operating environment and operating state.”
While this definition captures important aspects of system qualification, it conflates validation (confirming fitness for intended use) with verification (confirming correct implementation). This is a significant conceptual gap that can lead to misunderstanding in the industry.
Issue 2: Inappropriate Terminology for Software Validation
The guideline applies terminology from process validation and equipment qualification—specifically DQ (Design Qualification), IQ (Installation Qualification), OQ (Operational Qualification), and PQ (Performance Qualification)—to software validation. This is highly problematic and has created confusion in the Japanese pharmaceutical industry.
These qualification stages are appropriate for hardware equipment and manufacturing processes but are fundamentally different from software validation approaches. The key differences include:
| Characteristic | Equipment/Process Qualification | Software Validation |
| --- | --- | --- |
| Nature of Subject | Physical equipment with measurable properties | Logic and algorithms without physical properties |
| Testing Approach | Measuring physical parameters (temperature, pressure, etc.) | Testing functional behavior, data flow, calculations |
| Qualification Stages | DQ → IQ → OQ → PQ (sequential, hardware-focused) | Requirements → Design → Build → Test (iterative, logic-focused) |
| Change Frequency | Relatively stable after installation | Frequent updates, patches, and version changes |
| Industry Standards | ISO 9001, EU GMP Annex 15 | GAMP 5, IEC 62304, IEC 82304-1, FDA CSA |
Issue 3: International Divergence
To my knowledge, no regulatory authority outside Japan uses DQ, IQ, OQ, and PQ terminology as regulatory requirements for Computer System Validation. International standards and guidelines focus on different concepts:
FDA (United States):
- 21 CFR Part 11 (Electronic Records and Electronic Signatures)
- FDA Guidance on General Principles of Software Validation (2002)
- FDA Computer Software Assurance Guidance (Draft, 2022)
- Emphasis on risk-based approach and software lifecycle management
European Union:
- EudraLex Volume 4, Annex 11: Computerised Systems (revised 2011)
- PIC/S Good Practices for Computerised Systems in Regulated “GXP” Environments (PI 011-3, 2007)
- Focus on risk management and data integrity
International Standards:
- ISPE GAMP 5 Second Edition (2022): The de facto industry standard, with no mandate for traditional IQ/OQ/PQ documentation
- ISO 13485:2016 (Medical Devices): Requires software validation but does not prescribe IQ/OQ/PQ
- IEC 62304 (Medical Device Software Lifecycle Processes)
- IEC 82304-1 (Health Software: General Requirements for Product Safety)
This international divergence creates challenges for Japanese pharmaceutical companies operating globally, as they must reconcile domestic guidance with international expectations.
Issue 4: Mixing CSV and Process Validation Concepts
The Japanese guideline conflates computerized system validation with process validation, leading to conceptual confusion. While both are types of validation, they have different objectives, methodologies, and regulatory frameworks:
Computer System Validation focuses on ensuring that the software and hardware reliably perform their intended functions and maintain data integrity throughout the system lifecycle.
Process Validation demonstrates that a manufacturing process consistently produces a product meeting predetermined specifications and quality attributes.
These are complementary but distinct activities that should not be interchanged or confused in regulatory guidance.
Modern Validation Approach: Aligning with International Best Practices
Given the limitations of traditional approaches and the evolution of international guidance, organizations should adopt a modernized validation strategy that emphasizes:
1. Risk-Based Validation
Apply validation rigor proportionate to the system’s impact on product quality, data integrity, and patient safety. Lower-risk systems require less extensive validation than high-risk systems. The GAMP 5 Second Edition software categorization (Category 1: Infrastructure Software, Category 3: Non-configured Products, Category 4: Configured Products, Category 5: Custom Applications) provides a framework for risk-based approaches.
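One lightweight way to operationalize this categorization is a simple lookup from category to a suggested level of rigor. The rigor descriptions below are illustrative summaries of common practice, not text from GAMP 5 itself:

```python
# Hypothetical sketch: mapping GAMP 5 Second Edition software categories to a
# relative level of validation rigor. The rigor labels are illustrative
# assumptions, not quoted from GAMP 5.

GAMP_CATEGORIES = {
    1: ("Infrastructure Software", "record version; rely on supplier assurance"),
    3: ("Non-configured Products", "verify intended-use functions; leverage supplier testing"),
    4: ("Configured Products", "validate configuration against user requirements"),
    5: ("Custom Applications", "full lifecycle validation from requirements through testing"),
}

def suggested_rigor(category: int) -> str:
    """Return a one-line rigor summary for a GAMP software category."""
    name, rigor = GAMP_CATEGORIES[category]
    return f"{name}: {rigor}"

assert "configuration" in suggested_rigor(4)
```

Note that Category 2 is intentionally absent: it was retired in GAMP 5, so the gaps in the numbering are deliberate.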
2. Agile and Iterative Methodologies
Embrace modern software development practices including Agile, DevOps, and continuous integration/continuous deployment (CI/CD), with appropriate controls and documentation. Validation activities can be integrated into development sprints rather than conducted as a separate phase.
3. Critical Thinking and Scientific Judgment
Move beyond checkbox compliance to thoughtful evaluation of what validation activities genuinely assure product quality and patient safety. Question whether traditional documents add value or merely create documentation burden.
4. Data Integrity by Design
Build ALCOA+ principles into system design from the outset rather than attempting to retrofit controls later. Ensure that systems inherently prevent unauthorized changes, maintain complete audit trails, and support data review and investigation.
5. Leveraging Supplier Assurance
For commercial off-the-shelf (COTS) software and Software as a Service (SaaS) platforms, appropriately leverage vendor documentation, certifications, and testing while focusing user validation on configuration, interfaces, and intended use.
6. Continuous Validation and Monitoring
Shift from periodic re-validation to continuous system performance monitoring, trend analysis, and risk-based assessment. Utilize automated monitoring tools where feasible.
Recommendations for Industry Practitioners
To avoid “not validated for intended use” observations and build truly compliant computerized systems:
For Quality Assurance and Validation Professionals:
- Always start validation with clearly documented User Requirements that reflect actual business processes and regulatory obligations
- Ensure complete traceability from user requirements through functional specifications, design, configuration, and testing
- Verify that system capabilities align with SOP requirements before deployment
- Implement robust change control that assesses impact on the validated state
- Conduct periodic reviews to confirm continued fitness for intended use
- Stay current with evolving guidance (GAMP 5 Second Edition, FDA CSA) rather than relying solely on outdated approaches
For System Implementers and IT Professionals:
- Engage quality and validation teams early in the system selection and design process
- Understand that “the vendor validated it” is not sufficient—user validation of intended use is required
- Configure systems to support, not contradict, SOPs and quality procedures
- Ensure audit trail and data integrity capabilities are built into system design
- Document system specifications clearly and maintain configuration management
- Establish effective communication with quality and regulatory teams to understand intended use requirements
For Management and Leadership:
- Allocate sufficient resources and time for proper validation activities
- Support critical thinking and risk-based approaches rather than demanding unnecessary documentation
- Foster collaboration between quality, IT, operations, and validation teams
- Invest in training on modern validation concepts and data integrity principles
- Ensure that vendor contracts clearly define validation responsibilities and documentation deliverables
- Establish metrics that measure validation effectiveness (not just completion of documents)
Addressing Common Validation Pitfalls
Pitfall 1: “The Vendor Validated It”
Many organizations mistakenly believe that if a vendor provides validation documentation (IQ/OQ/PQ protocols), the system is validated. However, vendor testing only verifies that the software functions as designed—it does not confirm that those functions meet your specific intended use. User validation, focusing on your requirements, configurations, and business processes, is always required.
Pitfall 2: Excessive Documentation Without Value
Creating hundreds of pages of validation documentation does not ensure that a system is fit for purpose. Focus validation efforts on activities that genuinely assess whether the system meets user requirements and supports data integrity. As GAMP 5 Second Edition emphasizes, “doing the right thing” is more important than “documenting everything.”
Pitfall 3: Validating the Wrong Thing
Organizations sometimes validate the software package itself rather than validating the configured system for their intended use. Validation should focus on how you have configured, customized, and will use the system in your specific environment, not on generic software features.
Pitfall 4: Ignoring the Human Element
Computerized systems include not only hardware and software but also the people who operate them and the procedures they follow. Validation must consider the entire system, including user training, SOPs, and organizational controls. A technically perfect system can still fail if users are inadequately trained or if procedures are not followed.
Pitfall 5: “One and Done” Validation
Validation is not a one-time activity. Systems must remain in a validated state throughout their lifecycle. This requires ongoing change control, periodic review, performance monitoring, and incident investigation. Organizations that treat validation as a checkbox activity at implementation often face significant compliance issues later.
Data Integrity: The Modern Validation Imperative
Since 2018, FDA inspections have placed increasing emphasis on data integrity, with Form FDA 483 observations frequently citing violations of ALCOA+ principles. Data integrity is now recognized as a fundamental component of system validation. A system that lacks adequate data integrity controls cannot be considered validated for GxP use, regardless of its functional capabilities.
ALCOA+ Principles and System Requirements
| Principle | System Requirement |
| --- | --- |
| Attributable | All actions must be linked to a specific individual; no shared logins; audit trails must record user identity |
| Legible | Data and records must be readable and understandable; clear presentation; appropriate retention throughout lifecycle |
| Contemporaneous | Data must be recorded at the time the activity is performed; timestamps must be accurate and synchronized |
| Original | Records must be in their original form or true copies; source data must be preserved |
| Accurate | Data must be correct, complete, and reflect actual observations or activities |
| Complete | All data must be available, including metadata, audit trails, and associated documentation |
| Consistent | Data must be internally consistent; timestamps must align with workflow; no unexplained discrepancies |
| Enduring | Records must be durable and preserved throughout the retention period; protected from loss or degradation |
| Available | Data must be readily retrievable throughout retention period; searchable and reportable |
Systems that cannot satisfy these principles for GxP-relevant data should be identified as unsuitable during the validation process, before they are placed into production use.
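A pre-production screening of this kind can be partially automated by checking that each GxP record carries the metadata the ALCOA+ attributes demand. The field names and mapping below are illustrative assumptions, not a regulatory schema:

```python
# Hypothetical sketch: checking that a GxP record carries the metadata needed
# for selected ALCOA+ attributes. Field names are illustrative assumptions.

REQUIRED_FIELDS = {
    "user_id":   "Attributable: action linked to a specific individual",
    "timestamp": "Contemporaneous: recorded at the time of the activity",
    "value":     "Accurate/Complete: the observation itself",
    "source":    "Original: pointer to the preserved source data",
}

def alcoa_gaps(record: dict) -> list:
    """List the ALCOA+ expectations a record fails to meet."""
    return [desc for name, desc in REQUIRED_FIELDS.items() if not record.get(name)]

good = {"user_id": "analyst1", "timestamp": "2026-01-15T09:30:00Z",
        "value": 99.1, "source": "instrument raw file"}
bad = {"value": 99.1}  # shared login, no timestamp, no source reference

assert alcoa_gaps(good) == []
assert len(alcoa_gaps(bad)) == 3
```

A metadata check like this cannot prove data integrity on its own, but it quickly surfaces systems that structurally cannot satisfy ALCOA+, before they reach production.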
Conclusion: Validation as Fitness for Purpose
The fundamental principle of validation—confirming that a system is fit for its intended use—is often lost amid the mechanics of testing and documentation. Organizations and vendors must remember that validation is not merely about proving that software works; it is about demonstrating that the system, as configured and used, consistently fulfills the specific requirements for its intended purpose.
FDA observations regarding systems “not validated for intended use” stem from this fundamental misalignment. No amount of testing documentation can compensate for a system that lacks the functionality required by SOPs, fails to support data integrity principles, or allows practices that contradict regulatory requirements.
As the industry evolves toward Computer Software Assurance and adopts modern development methodologies, the core principle remains unchanged: validation must confirm fitness for intended use. By embracing critical thinking, risk-based approaches, and genuine alignment between system capabilities and user requirements, organizations can build compliant computerized systems that truly support product quality and patient safety.
Final Thoughts: The path forward requires industry practitioners to look beyond checkbox compliance and vendor claims, applying scientific judgment to ensure that every computerized system genuinely serves its intended purpose within the quality system. Only through this thoughtful, user-centric approach can we achieve the true objective of validation: protecting product quality and patient safety through reliable, fit-for-purpose systems.
About the Regulatory Landscape:
This article reflects best practices and regulatory expectations as of January 2026. Practitioners should monitor ongoing developments including:
- FDA’s finalization of Computer Software Assurance guidance
- Updates to international standards (ISO, IEC, ICH)
- Evolution of PIC/S and regional regulatory expectations
- Industry adoption of GAMP 5 Second Edition principles
- Emerging technologies (AI/ML, cloud computing, blockchain) and their validation requirements
Organizations are encouraged to maintain flexible validation programs that can adapt to evolving regulatory expectations while maintaining the fundamental principle of confirming fitness for intended use.