Prospective Validation - Design and Code Phase
During this phase of the life-cycle, the design is documented, verified by a design review, and the system coding/configuration is performed against pre-determined standards. Design specifications may take a variety of forms. Diagrams should be used where appropriate to assist readability. Where large systems are being developed it is permissible to split the design specifications into a number of separate documents. Where this approach is adopted, effective change control must be implemented to ensure that the effect of changing one component on others is fully assessed.
Contents of the design specification should be cross-referenced to the system and functional requirements to demonstrate traceability.
System Overview
There should be a single document (clear, concise, accurate and complete) describing the purpose and function of the system. The system overview should be written in non-technical language so that personnel not trained in computing can understand it. It should include diagrams indicating the physical layout of the system hardware, any automated and manual interfaces to other systems, inputs, outputs and main data flows.
System overviews have similar content to business requirements documents. The system overview, however, is aimed at use during inspections, providing a summary of the system scope, hardware and key functions. Business requirements may contain business case and business strategy information that is not appropriate for GxP regulatory inspection.
The system overview should be reviewed and approved prior to use of the system.
Functional and Design Specification
Functional specification documents are commonly used as the highest-level design document from which more detailed design documents are developed.
The functional specification provides a response to the URS. Functional specifications will typically address the following (a brief illustrative sketch follows this list):
• All inputs that the computerised system will receive.
• All functions that the computerised system will perform.
• All outputs that the computerised system will produce.
• All performance requirements that the computerised system will meet (e.g. data throughput, reliability, timing).
• The definition of internal, external and user interfaces.
• What constitutes an error and how errors should be managed.
• The intended operating environment for the computerised system (e.g. hardware platform, operating system, etc. if this is a design constraint).
• All ranges, limits, defaults and specific values that the computerised system will accept.
• Safety considerations where inclusion of this information is deemed appropriate.
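How much detail this implies for inputs, ranges, limits, defaults and error handling can be illustrated with a minimal sketch, shown below. All identifiers, field names and values in it are hypothetical and are used only to show that each functional requirement should be specific, testable and cross-referenced back to the URS.

```python
# Minimal, hypothetical sketch of functional requirements captured as structured
# records. Identifiers, field names, ranges and defaults are illustrative only.

FUNCTIONAL_REQUIREMENTS = [
    {
        "id": "FS-014",                 # functional specification item (hypothetical)
        "traces_to_urs": ["URS-021"],   # cross-reference to the URS for traceability
        "input": "batch_weight_kg",
        "range": (0.5, 250.0),          # accepted range of values
        "default": None,                # no default; a value must be entered
        "on_error": "Reject entry, display a message and record the event",
    },
    {
        "id": "FS-015",
        "traces_to_urs": [],            # deliberate gap for the example
        "input": "operator_id",
        "range": None,
        "default": None,
        "on_error": None,               # deliberate gap for the example
    },
]

def review_requirement(req: dict) -> list[str]:
    """Return a list of gaps that would make the requirement hard to test."""
    gaps = []
    if not req["traces_to_urs"]:
        gaps.append(f"{req['id']}: no cross-reference to the URS")
    if req["on_error"] is None:
        gaps.append(f"{req['id']}: error handling not defined")
    return gaps

for req in FUNCTIONAL_REQUIREMENTS:
    print(review_requirement(req) or f"{req['id']}: complete")
```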
Detailed design specifications document the equipment hardware and/or software in sufficient detail and clarity to enable the hardware and software to be built and tested. The use of formal design techniques is encouraged.
Detailed design specifications should include but are not limited to:
• System architecture (software and hardware, modularity).
• System functionality (including reporting).
• Data processing and integrity.
• Security.
• Back-up, archiving and restoration.
• System interfaces (including human/operator interface).
• Disaster/failure recovery.
Detailed design specifications may be split into discrete elements, for example a hardware design specification and software design specifications. Further subdivision is common for larger systems, where, for example, unit/module design specifications may be produced. Other documentation includes:
• System architecture (including relevant design drawings).
• Software program specifications (All inputs, outputs, error/handling and alarm messages, ranges limits, defaults and calculations should be defined ready for programming and testing).
• Cabling/wiring schedules.
The design of a system should consider the partitioning of GxP and non-GxP elements such that they can be validated and supported separately.
Data Definition
Data dictionaries, data architectures and data flows should be defined. Actual data to be loaded into tables, files and databases should be specified by reference to its source. Data dictionaries should be used to describe the different data types.
The data definition may standalone as a separate document or be incorporated within functional/design specification.
Specific functional aspects to be covered by the data definition include the following (a brief illustrative sketch follows this list):
• ERES requirements
• Built in checks for valid data entry and data processing.
• Access controls to ensure data can only be entered or amended by persons authorised to do so.
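A minimal sketch of how a data dictionary entry and its built-in checks might be expressed is shown below. The field names, validity rules and role names are hypothetical assumptions used only to illustrate the idea of defining data types, entry checks and access controls in one place.

```python
# Hypothetical data dictionary sketch: each entry defines the data type, the
# validity check applied at entry, and the roles permitted to enter/amend the value.

DATA_DICTIONARY = {
    "batch_number": {
        "type": str,
        "check": lambda v: len(v) == 10 and v.isalnum(),
        "editable_by": {"production_supervisor"},
    },
    "ph_result": {
        "type": float,
        "check": lambda v: 0.0 <= v <= 14.0,   # built-in check for valid data entry
        "editable_by": {"qc_analyst"},
    },
}

def enter_value(field: str, value, user_role: str):
    """Reject values that fail the dictionary's type, validity or access rules."""
    entry = DATA_DICTIONARY[field]
    if user_role not in entry["editable_by"]:
        raise PermissionError(f"{user_role} is not authorised to amend {field}")
    if not isinstance(value, entry["type"]) or not entry["check"](value):
        raise ValueError(f"{value!r} is not a valid value for {field}")
    return value  # a real system would also record the entry in the audit trail

enter_value("ph_result", 6.8, "qc_analyst")        # accepted
# enter_value("ph_result", 15.2, "qc_analyst")     # would raise ValueError
```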
Design Review
A design review is undertaken to ensure that all documentation from the requirements and design phases has been produced and is:
• Clear and concise: The specification should conform to documentation standards and should be readily understandable.
• Complete: To establish that the specifications unambiguously and adequately define the system and that the requirements traceability matrix (RTM) (or equivalent mechanism) has been maintained (a minimal traceability check is sketched after this list).
• Testable: Criteria within the specification to be used for user acceptance should be specific, measurable, achievable, realistic and traceable to the functional requirements and design specification.
• Fit for purpose: To generate confidence that the system will satisfy the user's requirements, have the necessary attributes of reliability, maintainability and usability, and minimise hazards. Methods such as FMEA (failure mode effects analysis) or CHAZOP (computer hazard and operability study) should be used to verify the design.
• Include ERES requirements (where applicable): The design review should confirm whether or not electronic record and electronic signature functionality has been addressed.
• Current: Verify that the documentation is current and that the necessary change control has been applied.
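What "maintaining the RTM" can mean in practice is illustrated by the minimal sketch below. The matrix structure and identifiers are hypothetical; the check simply confirms that every user requirement traces forward to at least one design element and at least one test.

```python
# Hypothetical requirements traceability matrix (RTM) sketch: each URS item maps
# to the design elements and tests that cover it. Identifiers are illustrative only.

RTM = {
    "URS-001": {"design": ["FS-010", "DS-031"], "tests": ["OQ-105"]},
    "URS-002": {"design": ["FS-014"],           "tests": []},          # gap: untested
    "URS-003": {"design": [],                   "tests": ["OQ-110"]},  # gap: no design link
}

def rtm_gaps(rtm: dict) -> list[str]:
    """List URS items that lack forward traceability to design or test."""
    gaps = []
    for urs_id, links in rtm.items():
        if not links["design"]:
            gaps.append(f"{urs_id}: no design element")
        if not links["tests"]:
            gaps.append(f"{urs_id}: no acceptance test")
    return gaps

print(rtm_gaps(RTM))   # a design review would expect this list to be empty
```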
After the review, a report should be prepared and approved summarising the design review. The report should clearly state whether the quality of the design is acceptable and list any deficiencies together with details of planned remedial action.
Design reviews may be known as design qualifications (DQ). Their scope of application may include use of the computerised system in its wider context including equipment, procedures, and operator interaction.
It should be understood that the requirement for this design review does not mean that other routine reviews are no longer appropriate. Activities and documents should be reviewed and approved as defined in the validation plan and management procedures.
Coding, Configuration and System Build
Prior to commencement of coding, configuration or system build, the design review should have been successfully completed. All software (including configuration) should be developed with good programming practices to appropriate standards.
Software/configuration developed internally should adopt programming standards. Such standards should reflect the type of programming language used (e.g. structured, object oriented, ladder logic). Typically, programming standards will address the following (a brief illustrative sketch follows this list):
• Content of header information in software listings (e.g. author, version, change details, etc.).
• Software/configuration structure and consistency (e.g. modular structure).
• Avoiding creation of non-executable code (i.e. dead code).
• In-code documentation.
• Naming and definition of variables.
• Data definition and scope (e.g. global versus local).
• Use of subroutines.
• Branching.
• Error/exception handling.
• Expandability considerations.
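A short sketch of how several of these points (header information, naming, in-code documentation and error handling) might look under a hypothetical in-house standard is given below; the module name, author placeholder and version details are illustrative and not taken from any real standard.

```python
# ---------------------------------------------------------------------------
# Module  : dosing_calc.py          (hypothetical example module)
# Author  : <author name>
# Version : 0.2   Change: added range check on target_volume_ml (illustrative)
# ---------------------------------------------------------------------------
"""Illustrative module showing header information, naming, in-code
documentation and error handling as a programming standard might require."""

MAX_VOLUME_ML = 1000.0   # named constant rather than a "magic number"


def calculate_dose_ml(target_volume_ml: float, concentration_pct: float) -> float:
    """Return the dose volume in millilitres for the given concentration."""
    # Explicit error handling: reject out-of-range inputs rather than failing silently.
    if not 0.0 < target_volume_ml <= MAX_VOLUME_ML:
        raise ValueError(f"target_volume_ml out of range: {target_volume_ml}")
    if not 0.0 < concentration_pct <= 100.0:
        raise ValueError(f"concentration_pct out of range: {concentration_pct}")
    return target_volume_ml * concentration_pct / 100.0
# Note: no unreachable ("dead") code is left after the final return statement.
```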
Hardware should be assembled and constructed in accordance with good practice, taking into account aspects such as regulatory requirements and manufacturer's recommendations.
Code/Configuration Review
A source code or configuration review should be performed on all bespoke/custom application software and configurations prior to formal testing. A two-tier approach is advocated: a high-level overview of all software identifying areas of code, and a low-level walkthrough of critical areas.
The source code review aims to provide confidence in the operability of the system and should:
• Verify expected use of good programming practices and adherence to programming standards referenced in development documentation.
• Determine a level of assurance that the code has been constructed against the approved requirements and design specifications.
• Provide assurance that the code is maintainable by a competent programmer.
• Detect possible coding errors.
• Identify evidence of dead code.
Configuration details should be reviewed to provide confidence in the operability of the system and should verify that (an illustrative check is sketched at the end of this section):
• 'Unused' options are deselected and cannot function.
• The configurable elements of the application fulfil the design specifications.
The outcome of the source code or configuration review will typically be a report providing an overview of the review conducted, together with a list of all observations that have been noted. Care must be taken to place actions only on what is required from a regulatory or tangible business benefit perspective.
All actions should be completed before progressing to testing of the software unit/module.
Note: The correction of typographical errors is not needed if there is no impact on GxP functionality. Equally, reports do not need to identify each individual typographical error where a general statement of observation will suffice.
Complete hand-written annotated listings of software subject to detailed low-level walkthrough should be retained and attached to, or referenced by, the report. Where suppliers withhold software source code, access agreements should be established. Other review evidence should also be retained and similarly managed.
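As a simple illustration of the configuration checks described above, the sketch below compares a hypothetical exported configuration against the approved design specification values; the parameter names and values are assumptions made purely for this example.

```python
# Hypothetical configuration review sketch: compare an exported configuration
# against the approved design specification and flag any differences.

APPROVED_CONFIG = {          # values taken from the (hypothetical) design specification
    "audit_trail_enabled": True,
    "password_expiry_days": 90,
    "web_api_module": "disabled",    # 'unused' option must be deselected
}

EXPORTED_CONFIG = {          # values exported from the configured system
    "audit_trail_enabled": True,
    "password_expiry_days": 180,
    "web_api_module": "disabled",
}

def config_deviations(approved: dict, exported: dict) -> list[str]:
    """Report configured values that do not match the approved design specification."""
    return [
        f"{key}: expected {approved[key]!r}, found {exported.get(key)!r}"
        for key in approved
        if exported.get(key) != approved[key]
    ]

print(config_deviations(APPROVED_CONFIG, EXPORTED_CONFIG))
# ['password_expiry_days: expected 90, found 180']  -> an observation for the review report
```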