Verification & validation (V&V) in software testing – what is the difference?

What is the difference between verification and validation in software testing?

Verification answers the question, “Did we build the software right?*” while validation addresses, “Did we build the right software?*”

Verification versus validation (V&V) – definitions & objectives

Verification in software testing – a definition
Verification is the process in which a project deliverable is evaluated during the development phase to determine whether it meets the identified and approved requirement specifications.

Validation in software testing – a definition

Validation is the process in which the project deliverable is evaluated at the end of the development life cycle to determine whether it meets the stakeholder expectations and requirements or not.

Software testing

In software testing, both verification and validation are part of the V-model, in which the development and testing activities are started based on the identified and confirmed requirement specifications. The verification process always comes before the validation process.

Verification versus validation (V&V) – actions and deliverables explained

  • Verification: evaluation of whether a specific project deliverable, produced in a particular phase, complies with the identified requirement specifications. Validation: assurance that the final project deliverable meets the needs and expectations of the identified stakeholders.
  • Verification: concerned with whether the deliverable meets the developer’s expectation (confirmation of the requirement specifications). Validation: concerned with whether the deliverable meets the customer’s expectations and actual needs (fitness for use).
  • Verification: may or may not involve the deliverable of a particular phase itself. Validation: always involves the actual project deliverable.
  • Verification: typical activities are reviews of the artifacts produced during development (e.g., requirements specification, traceability matrix, flow charts, code, and test plan). Validation: typical activities are the execution and completion of functional testing, system testing, and user acceptance testing.
  • Verification: checks that the software meets the specification. Validation: checks that the specification captures the stakeholder’s needs.
  • Verification: does not involve code execution. Validation: involves code execution.
  • Verification: typical methods are reviews, audits, inspections, and desk research. Validation: typical methods are functional testing, script testing, non-functional testing, and regression testing.
  • Verification: checks whether the deliverable conforms to a specification. Validation: checks whether the deliverable meets the requirements and expectations.
  • Verification: targets the architecture, design, and requirements. Validation: targets the actual project deliverable (the final software product).
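One practical consequence of the comparison above (verification without code execution, validation with it) can be sketched in Python. The `discount` function and its “specification” are invented for illustration only:

```python
import inspect

# Hypothetical deliverable: its specification (an assumption for this
# example) says discount() must take a price and a percentage and
# return the reduced price.
def discount(price, percent):
    """Return price reduced by percent (0-100)."""
    return price * (1 - percent / 100)

# Verification (no code execution): review the artifact against the
# specification, e.g. check the signature and documentation statically.
spec_params = ["price", "percent"]
assert list(inspect.signature(discount).parameters) == spec_params
assert "Return price reduced by percent" in discount.__doc__

# Validation (code execution): run the deliverable and compare the
# result with the stakeholder's expectation.
assert discount(200, 10) == 180   # a 10% discount on 200 should give 180
print("verification and validation checks passed")
```

The static checks would pass even if the arithmetic were wrong; only the execution step catches that, which is why both activities are needed.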

* Barry Boehm, Software Engineering Economics, 1981

GAMP5 – The Good Automated Manufacturing Practice (GAMP) Guide for Validation of Automated Systems in Pharmaceutical Manufacture.

Introduction GAMP5

In this article we will share examples of GAMP software categories as described in The Good Automated Manufacturing Practice (GAMP5) Guide for Validation of Automated Systems in Pharmaceutical Manufacture. But before we start, what is GAMP5 all about?

GAMP5 Key Objectives:

  • Patient Safety
  • Product Quality
  • Data Integrity
  • Regulatory Compliance Requirements

The GAMP5 software categories – An introduction

The GMP impact (product quality, patient safety, data integrity, and regulatory compliance requirements) determines the level of verification through software testing. Because the GAMP5 software categories can be open to interpretation, implementing a risk-based approach is a key element in ensuring the correct level of validation effort is executed. A risk-based approach is used to identify, evaluate, control, and review the identified risks. The outcome of the risk assessment (level of risk and potential system impact) is used to determine whether validation is required and which elements must be validated.
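As an illustrative sketch only, the risk-based selection of validation effort can be thought of as a lookup from GAMP category and assessed risk to a set of activities. The category/risk combinations and activity lists below are assumptions for the example, not prescribed by GAMP5:

```python
# Illustrative sketch: map a GAMP5 software category and an assessed risk
# level to a validation-effort profile. The activity lists are examples
# only; the actual scope must come from a documented risk assessment.
VALIDATION_EFFORT = {
    # (category, risk): activities
    (1, "low"):  ["record name and version"],
    (3, "low"):  ["supplier assessment", "requirements-based testing"],
    (3, "high"): ["supplier assessment", "requirements-based testing",
                  "operational checks"],
    (4, "high"): ["supplier audit", "validate bespoke configuration",
                  "full life-cycle documentation"],
    (5, "high"): ["supplier audit", "validate all code",
                  "full life-cycle documentation"],
}

def required_activities(category: int, risk: str) -> list:
    """Return the validation activities for a category/risk combination."""
    try:
        return VALIDATION_EFFORT[(category, risk)]
    except KeyError:
        raise ValueError(f"no profile defined for category {category}, "
                         f"risk '{risk}'; perform a risk assessment")

print(required_activities(4, "high"))
```

The point of the table shape is that the effort is driven by both the category and the assessed risk, never by the category alone.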

The GAMP5 key principles of product and process understanding, quality risk management, and leveraging suppliers’ activities are fully aligned with the Computer Software Assurance (CSA) risk-based approach.

The GAMP5 categories

  • Category 1: Infrastructure software
  • Category 2: No longer used in GAMP 5 (it was present in GAMP 4, where it covered firmware)
  • Category 3: Non-configured products
  • Category 4: Configured products
  • Category 5: Custom applications

GAMP Software Category 1 – Infrastructure Software

This category includes “established or commercially available layered software” and infrastructure software tools. Infrastructure software in its simplest form is the operating system on which the application software resides. Infrastructure is qualified but not validated: validation is performed on the hosted application, not on the infrastructure. Infrastructure should be built, configured, and deployed in accordance with the applicable procedures, including configuration management.

GAMP5 Software Category 1 – Infrastructure software examples:

  • Standard operating systems (e.g., Windows XP, Linux)
  • Anti-virus software
  • Active Directory / Domain Controller
  • Database Software (SQL / Oracle)
  • Server and Network Hardware
  • Virtual Environments
  • Firewalls, including configuration
  • Server and Network Monitoring Tools
  • Backup Systems

Validation requirements GAMP5 Software Category 1 – Infrastructure software

  • Record ID and version

GAMP Software Category 3 – Non-configured products

This category includes software packages in which existing code is used and run-time parameters (set points) may be entered, with the software used either unconfigured or with the standard defaults provided by the supplier. Non-configured software is also referred to as COTS (commercial off-the-shelf software) or simply OTS (off-the-shelf software). The software can operate and automate the business process without any modification. There is no fixed rule for the validation approach for GAMP category 3 systems.

GAMP5 Software Category 3 – Non-configured products examples:

  • Programmable Logic Controllers (PLCs)
  • Electronic chart recorder
  • Commercial off-the-shelf software (COTS)
  • Laboratory Instruments
  • Laboratory Software
  • Remote Terminal Units
  • Vernier calipers
  • Analytical balance
  • Autoclave
  • Building Management System (BMS)
  • Environmental Monitoring System (EMS)
  • Stand-alone HPLC system
  • Quality Control (QC) equipment

Validation requirements GAMP Software Category 3 – Non-configured products

  • Audit the supplier
  • Validate any code
  • Validate functionality

GAMP Software Category 4 – Configured products

GAMP5 software category 4 includes configured software, where the user has the means and knowledge to change the functionality of the application. Category 4 software applications are configured to meet user-specific business needs. This is possibly the biggest and most complex category: the functionality of the software can be configured to return different outputs depending on the configuration.

GAMP5 Software Category 4 – Configured products examples:

  • Laboratory Information Management System (LIMS)
  • Supervisory Control And Data Acquisition (SCADA)
  • Distributed Control System (DCS)
  • Chromatography Data System (CDS)
  • DCS / SCADA Mimics
  • Alarm handling functionalities

Validation requirements GAMP Software Category 4 – Configured products

  • Audit supplier and code
  • Validate any bespoke configurations
  • Apply full life cycle requirements

GAMP Software Category 5 – Custom applications

This includes any “application, module, user-defined program, or macro” that has been written in-house or by a third party that “needs to be specified, version controlled, built, and tested (including integration testing with the commercial application, as applicable) as a minimum to ensure the quality of the software.”

Custom software is generally written from scratch to fulfil a business need. It may be written in-house and is possibly the highest-risk software category, as it is customized and there is a higher risk of errors within the application code.

GAMP5 Software Category 5 – Custom applications examples:

  • PLC logic (Ladder, Sequential Function Chart, C++)
  • Custom scripts within SCADA / DCS system
  • Electronic batch record

Validation requirements GAMP Software Category 5 – Custom applications

  • Audit supplier
  • Validate all code
  • Apply full life cycle requirements

Agile | Scrum engineering and Quality Assurance best practices

Scrum engineering and quality assurance best practices

Never separate testing and development. According to the Agile principles, developers and QA specialists must work together on features.

Address all bugs found during the current sprint in the following sprint to maintain commitment consistency and a stable work rhythm.

Use continuous integration to execute automated tests for every finalized feature. This helps to identify potential bugs at an early stage and to take the appropriate actions to resolve them.
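A minimal sketch of the kind of automated check a CI server would run on every commit; the feature function `validate_username` and its rules are invented for illustration:

```python
import re

# Hypothetical feature under test; the function name and rules are
# invented for illustration only.
def validate_username(name: str) -> bool:
    """A username is 3-20 characters: letters, digits, or underscores."""
    return re.fullmatch(r"\w{3,20}", name) is not None

# Automated checks a CI pipeline would execute on every commit
# (with pytest these would live in a test_*.py file).
def test_accepts_valid_name():
    assert validate_username("dev_42")

def test_rejects_too_short_name():
    assert not validate_username("ab")

def test_rejects_illegal_characters():
    assert not validate_username("dev 42")   # space is not allowed

# Run the checks directly so the example is self-contained.
test_accepts_valid_name()
test_rejects_too_short_name()
test_rejects_illegal_characters()
print("all feature checks passed")
```

Because the checks run on every commit, a regression in the feature surfaces within minutes rather than at the end of the sprint.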

Continuous improvement: always strive for better results by evaluating the team’s track record and learning from it.

Connect with us and join the conversation

Thank you for reading this article. Please feel free to share your feedback on the practices we suggested.

With your help we can continue to improve our Best Practices Library and share our knowledge with the world.

Scrum tracking and predicting best practices | AGILE

Scrum tracking and predicting best practices

  1. Use burndown charts to visualize the progress of the sprint and determine whether it is progressing according to schedule.
  2. Use the release burndown chart to estimate how many sprints are needed to complete the project on schedule and whether the timeframes must be adjusted.
  3. Use velocity as a measurement to visualize how many stories are completed during each sprint compared to the initial estimates.
  4. Take time to choose the right collaboration software that fits the team’s needs.
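The burndown and velocity measurements in points 1–3 can be computed from plain sprint data. A small sketch with made-up numbers:

```python
# Illustrative sprint data: story points remaining at the end of each day
# of a 10-day sprint (made-up numbers).
remaining = [40, 36, 33, 30, 26, 24, 19, 14, 8, 3]

# Burndown comparison: the ideal line assumes a constant pace from the
# initial commitment down to zero.
committed = remaining[0]
days = len(remaining)
ideal = [committed - committed * d / (days - 1) for d in range(days)]

for day, (actual, plan) in enumerate(zip(remaining, ideal), start=1):
    status = "behind" if actual > plan else "on track"
    print(f"day {day:2d}: actual {actual:5.1f}, ideal {plan:5.1f} ({status})")

# Velocity: completed points per sprint, used for release forecasting.
completed_per_sprint = [34, 38, 31]          # last three sprints (made up)
velocity = sum(completed_per_sprint) / len(completed_per_sprint)
backlog_points = 180
sprints_needed = -(-backlog_points // velocity)   # ceiling division
print(f"average velocity: {velocity:.1f} points/sprint")
print(f"estimated sprints to finish backlog: {sprints_needed:.0f}")
```

The same arithmetic underlies what most collaboration tools chart automatically; the value of doing it by hand once is seeing that velocity is an average, not a promise.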

Agile – Scrum managing the sprint and product backlogs best practices

Scrum managing the sprint and product backlogs best practices

  1. The product backlog and sprint backlog must be managed separately to ensure you can plan, efficiently estimate, and forecast your sprints.
  2. Stimulate team members and stakeholders to verify all outcomes and results and provide feedback as applicable by making all Scrum-related documentation available and accessible.
  3. Use task-prioritization techniques in the product backlog to identify priorities and key features.
  4. Always use unique IDs to identify each user story or item in the backlog. This makes it easy to keep track of tasks and stories no matter what you call them.
  5. Use Excel as a tool to create your backlog and to prioritize and reprioritize tasks quickly and easily.
  6. Use a Scrum board to create visibility of each sprint.
  7. Reprioritize your product backlog each time you add to it; it will save you trouble in the long run.
  8. A sprint backlog must be centered around one specific goal.
  9. Be as specific as possible in each part of your sprint and product backlog stories.

Agile – Scrum planning and estimates best practices

Scrum planning and estimates best practices

  1. Work together with stakeholders to estimate the product backlog; this establishes transparency and helps reduce the chance of conflicts around estimates.
  2. Ensure sufficient product backlog items are identified and confirmed before you start sprint planning. This helps to determine the correct scope of the project.
  3. Translate team and stakeholder objectives into clear sprint goals to ensure they are aligned and transparent.
  4. Use Planning Poker to make better estimates.
  5. Use six-hour days, leaving the remaining two hours each day for risk mitigation or unexpected events.
  6. Never stretch or cut sprint time; fixed timeframes are one of the key elements of the Scrum approach.
  7. Reprioritize the work that needs to be done to meet the project goals.
  8. Use discussions during meetings as input for continuous improvement of the Scrum project.
  9. The sprint planning meeting should take four hours: no longer and no shorter.

Agile – Scrum teamwork and meetings best practices

Scrum teamwork and meetings best practices

  1. Define clear objectives: fill the product backlog together with all involved stakeholders and the Scrum team to align their vision and understanding of the future product or service.
  2. Always begin and end Scrum meetings on time. Stick to the agreed schedule no matter who attends the meeting.
  3. Don’t break up existing teams to start a new project; use their collaboration experience from previous Scrum projects.
  4. Practice stand-up meetings (stand-ups / daily meetings) to get the team comfortable with this approach.
  5. Ask stakeholders and product owners to participate on a regular basis in the daily Scrum meetings to improve communication and understanding.
  6. Create communication guidelines to ensure team members are trained in effective communication skills and understand the importance of communication.
  7. Empower team members to adapt their development approach as changes occur or problems arise.
  8. Use a central hub for user stories, task lists, burndown charts, notifications, comments, and feedback.
  9. Promote peer-to-peer collaboration to improve mutual respect and create transparency.
  10. Use a dedicated location to facilitate the daily Scrum meeting (same place and same time every workday).
  11. Keep things short: a daily Scrum meeting should be fifteen minutes or less to stay focused and efficient.
  12. During the daily Scrum meeting, each member focuses on what was done yesterday, what will be done today, and what issues may cause problems for progress.
  13. Always prepare a Scrum meeting agenda to increase the effectiveness of the meeting.
  14. Ask team members to come prepared to meetings to ensure constructive participation and quick responses to questions or issues.
  15. Increase engagement during meetings by adding a small dose of humor.
  16. Create a distraction-free environment for meetings to improve focus and participation.
  17. Use separate meetings to discuss unsolved problems, with only those members who are directly related to the problem.
  18. Create an environment where team members proactively look for help.
  19. Ask team members to give each other timely, constructive feedback.

Business management Best Practices examples

Business management Best Practices examples

  1. Engage workers;
  2. Set team expectations and be performance driven;
  3. It is all about people;
  4. Initiatives must inspire shared insights;
  5. Reward effort;
  6. Having a clear process;
  7. Create a foolproof foundation;
  8. Be vulnerable;
  9. Stay committed;
  10. Implement continuous improvement thinking;
  11. Be transparent;
  12. Document the processes;
  13. Building for the future;
  14. Use data-driven decision making;
  15. Invest in the right people;
  16. Focus on excellent customer service;
  17. Seek clarity;
  18. Identify shared business values;
  19. Focus team effort;
  20. Schedule time for periodic evaluation;
  21. Managing without ego;
  22. Own the operation from start to finish.

Business management KPIs

  1. Sales Revenue
  2. Net Profit Margin
  3. Gross Margin
  4. MRR (Monthly Recurring Revenue)
  5. Net Promoter Score
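The margin KPIs above follow directly from the income statement. A short sketch with invented figures:

```python
# Illustrative figures (made up) for computing the margin KPIs above.
revenue = 500_000.0              # sales revenue for the period
cogs = 300_000.0                 # cost of goods sold
operating_and_other = 150_000.0  # all remaining costs, taxes, interest

gross_profit = revenue - cogs
net_profit = gross_profit - operating_and_other

gross_margin = gross_profit / revenue * 100       # % of revenue left after COGS
net_profit_margin = net_profit / revenue * 100    # % of revenue left after all costs

print(f"gross margin: {gross_margin:.1f}%")            # 40.0%
print(f"net profit margin: {net_profit_margin:.1f}%")  # 10.0%
```

Tracking both margins matters because a healthy gross margin can hide operating costs that erase the net result.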

Innovation management: 22 Best Practices examples

Innovation management: 22 Best Practices examples

  1. Allocate resources properly to support your strategy;
  2. Stimulate ideas by asking staff for input;
  3. Involve everyone in the organisation;
  4. Identify goals and stay focused;
  5. Embrace the opportunity mindset;
  6. Walk the talk;
  7. Plan for execution;
  8. Implement with speed;
  9. Cultivate a risk-taking culture;
  10. Develop organizational culture to support innovation;
  11. Create an innovative company culture;
  12. Collaborate with customers and strategic partners;
  13. Periodic evaluation of effectiveness of innovation programs;
  14. Apply continuous improvement thinking;
  15. Create an innovation strategy;
  16. Create value for your customers & stakeholders;
  17. Identify the best opportunities;
  18. Identify and remove barriers;
  19. Set up dedicated and cross functional innovation teams;
  20. Set-up research and technology networks;
  21. Incent and reward;
  22. Communicate results and milestones.

7 Innovation management KPIs

  1. Innovation rate = revenue share of innovations / total turnover * 100
  2. Innovation rate = number of innovations / number of products * 100
  3. The number of innovation projects started
  4. Number of new products launched in X amount of time
  5. Actual vs. targeted breakeven time for new products
  6. Revenue/profit growth from new products
  7. ROI of innovation activities
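The two innovation-rate formulas above, computed with invented figures:

```python
# Made-up figures to illustrate the two innovation-rate formulas above.
revenue_from_innovations = 1_200_000.0
total_turnover = 8_000_000.0
innovation_rate_by_revenue = revenue_from_innovations / total_turnover * 100

number_of_innovations = 6
number_of_products = 40
innovation_rate_by_products = number_of_innovations / number_of_products * 100

print(f"innovation rate (revenue share): {innovation_rate_by_revenue:.1f}%")
print(f"innovation rate (product share): {innovation_rate_by_products:.1f}%")
```

The two rates answer different questions: the revenue-based rate measures how much the business already depends on new offerings, while the product-based rate measures how much of the portfolio is being renewed.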