This PowerPoint presentation, delivered during the July 16, 2014 OC user group meeting, gives an overview of what you need in order to get FDA approval for off-the-shelf software in medical devices. It also addresses tools that NI has available to assist in this process.
One of the requirements for FDA submission is validation of the software test tools in addition to the software.
I'm referring to the FDA document 'General Principles of Software Validation' (2002). Section 6 outlines 'VALIDATION OF AUTOMATED PROCESS EQUIPMENT AND QUALITY SYSTEM SOFTWARE'.
Does anyone have experience validating the Unit Test Framework for an application? Based on FDA documentation, it appears that the validation approach changes depending on the application and the risk assessment, but in high-risk applications all functions must be validated.
Could anyone suggest an example of what a validation strategy might look like for the Unit Test Framework?
I do FDA validation in the OC area. You're asking a complex question. (Contact me if you wish: PM email@example.com.)
Yes: you do need to 'validate' the tools you use in performing a validation.
To what degree depends on your company's quality SOPs. In many FDA-regulated companies, Commercial Off-the-Shelf (COTS) software requires very little or no validation. It DOES need to be controlled software, meaning your company's Doc Control should dispense the software tools so that you are using a known version. However, complex software packages may require validation of specific functions.
'Validate every function' is a common misconception, proposed only by those who don't have to do it or pay for it...
So do you need to validate the unit test software? Possibly, within the scope of your use, which is derived from your risk assessment.
I'm happy to discuss this with you... it's easy to get started down the wrong path and create a mountain of work that obscures the actual goal of product validation.
Thanks for your comment. I had not yet considered vaulting installers for COTS software tools.
Regarding question complexity, my intent was to aim as high as possible first. Actually, our 'Level of Concern' is Moderate, so validation of every feature is certainly not a requirement. But if National Instruments or its partners have a test script (or validation strategy ideas) they are willing to share with the community for 'Major Level of Concern' developments, it would be helpful for the community to have that for reference, here or elsewhere.
The question for the community in that case is: what is an example of a validation strategy for the NI Unit Test Framework at a Major Level of Concern?
I don't think NI is in a position to provide a 'test script'. Even if they did, it would not necessarily alleviate your burden to validate your use of the software/tool.
Also, I'm a big advocate of keeping 'tool' validations separate from process validations. It makes the tool validation simpler, and it decouples changes in the tool or process from forcing a revalidation of both. Also, if I were an auditor, I could make your life hell asking which requirement is testing the tool and which is testing the process.
Further (I used to lecture on FDA validations... sorry for preaching here), understand what a 'requirement' is and what 'specifications' are. This is the most misunderstood, overlooked part of validations. Many think they mean the same thing... they do not. FDA validations deal with verifying that the requirements are met. Verifying something is running at 25 psi +/- 2 psi is not a requirement... that is a specification! Yes, it's easy to measure, but someone just made up the +/- part because they needed a pass/fail criterion.
A validation is not a SAT test, where you tick the "PASS" boxes...
Many processes are not repeatable and can't pass a limit criterion, such as blood pressure or saline dilution. It's a convoluted path when people make up specifications for requirements, losing track of what is being verified. Yes, you can generate great stacks of paper with mind-numbing details, but it does not always result in an effective validation.
I think the Unit Test Framework can be scoped in about 10-15 requirements, and a very clean validation done from those.
"Can the Unit Test Framework, faithfully, connect to a given LabVIEW function, pass variables to the function, execute the function in a identical manner to the Run-time engine, acquire the function outputs and display and outputs?"
I'd start with this fundamental requirement.
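Since LabVIEW is graphical, a text-language analogy may help make this fundamental requirement concrete. The Python sketch below is purely hypothetical (none of these names are NI APIs): it checks that a stand-in test harness executes a function with known behavior exactly as a direct call would, which is the essence of "execute the function in a manner identical to the run-time engine":

```python
# Hypothetical sketch: verify that a test harness executes a function
# exactly as a direct call would. All names here are illustrative
# stand-ins, not NI Unit Test Framework APIs.

def scale_and_offset(x, gain, offset):
    """The 'VI under test': a simple function with known behavior."""
    return x * gain + offset

def run_through_harness(func, *args):
    """Stand-in for the test tool's execution path."""
    return func(*args)

# Direct execution and harness execution must agree for known inputs.
inputs = [(0.0, 2.0, 1.0), (5.0, -1.0, 0.5), (100.0, 0.1, -3.0)]
for x, gain, offset in inputs:
    direct = scale_and_offset(x, gain, offset)
    via_tool = run_through_harness(scale_and_offset, x, gain, offset)
    assert via_tool == direct, f"harness diverged for {(x, gain, offset)}"
print("harness matches direct execution for all baseline inputs")
```

The point of the exercise is that the tool validation tests the execution path of the tool itself, against a target whose behavior is known in advance.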
You bring up a lot of interesting subject matter. I'd like to organize this and make distinctions for the sake of other readers.
Validation of the Unit Test Framework
If we look at what the FDA says in the document referenced in my first post, on page 26, they tell us, "documentation providing evidence of the validation of these software tools for their intended use should be maintained."
As to what the auditor will ask for... they would receive said documentation from us, with validation criteria and data, for any test tools we use.
Tool vs. process validation
We have a process that's like a development flowchart that includes unit testing at key points. Validation of the tool (i.e. Unit Test Framework) is done prior to using the tool in that process.
Requirements vs. specifications
Perhaps my statement, "validation of every feature is certainly not a requirement," should read, "validation of every feature will not be necessary for FDA submission."
The use of unit testing derives from what the FDA needs, not from our SRS. Unit testing does not go into our SRS because it's essentially a component of our risk mitigation strategy, not our system design.
So naturally I'm curious how you would scope the Unit Test Framework in 10-15 'requirements.'
The value of a test script
A test script could automate low-level testing of the Unit Test Framework, much like the Unit Test Framework automates low-level code analysis of LabVIEW code. I'm calling it a 'test script' - but it could be a LabVIEW project that makes it easier to manually exercise all of the low-level functionality of the tool (e.g. project menus, automated features, etc.).
Page 32 suggests documenting test cases that 'exercise the system', and it's common for utilities and support software to be validated with automated scripts.
It's worth asking if it saves us time or gives us ideas for how to better 'exercise the system'.
Ultimately, the FDA is looking for safety and efficacy. It certainly feels 'convoluted' until you see it from the system perspective.
Agreed, it's good to clarify the points, and I like your outline.
Validation of the Unit Test Framework
I worked for a client who looked at this tool and walked away because of the burden of validating the Unit Test Framework application itself, mainly (IMO) because they did not have a tool validation method in place. They only did process and software validations.
Auditor submission and data: agreed! And as I stated, 'P' for pass is not data. I see so many companies just litter their forms with checkboxes. It's not a shopping list.
Unit Test Framework Requirements in 10-15 Statements
The best way to think of this is that I could scope all the basic requirements I would have for a car in 10 to 15 requirements. From those, the risk assessment would add mitigation requirements, which would add more.
BAD: The software shall run on a PC with an i7 core or better, Win7 64-bit, 16 GB of RAM, a 60 GB hard drive, and a CD/DVD-ROM drive.
Why bad? It mixes specifications with requirements, and it specs hardware available now; in a year we might have a marshmallow-sized quantum computing cube that runs Win7 or any OS environment, and then all the specs are irrelevant and hinder doing a replacement.
Better: The host computer shall provide for running the Win7 OS and shall provide for permanent data storage.
Requirements should stand the test of time and be stated that way. Calling out a J-type thermocouple is a bad idea. Stating that you need to measure temperature within a certain range and be NIST-traceable allows you to upgrade the system a few years later with new technology and have the requirements still stand. You still have to change the test protocol and re-run the validation, but the requirements don't need revision.
I know I seem to dwell on this quite a bit, but it takes years to get this hammered into people's heads. Getting the requirements right is critically important to getting off in the right direction. It's so easy to get lost in specs and max/min limits... I see this as the act of the desperate who don't have clear vision and direction.
So how do I scope the requirements of the Unit Test Framework?
Start by forgetting such a tool exists! Imagine you are going to contract a software company to create it for you. You have to give them general requirements; you don't state that you want a red stop button... (that's a design specification).
(This is a very rough, off-the-top-of-my-head go at this...)
The UTF application shall provide for running a LabVIEW VI, including all of its sub-VI hierarchy, as a single function.
The UTF shall provide for passing variables into the target function via its normal variable input terminals and executing the function in a manner identical to the native LabVIEW run-time environment for which it was written.
The UTF shall provide for receiving the executed outputs of the function, including output variables and/or the outputs of subroutine calls nested within the top-level VI.
The UTF shall provide a framework to call and execute the target VI sequentially with a set of user-defined inputs and/or conditions.
The UTF shall provide for loading a table or list of input values and/or conditions with which the VI under test is driven sequentially.
The UTF shall provide for acquiring, displaying, and storing the output variables or subroutine outputs of the VI under test.
The UTF shall provide for comparing the actual function outputs and/or actions against a prescribed, user-defined expected output/action.
Etc., etc.... As you can see, the test cases are where you, as the engineer, specify how each requirement is met.
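To illustrate how a test case might specify that the table-driven requirements above are met, here is a hypothetical Python sketch (LabVIEW itself is graphical, so this is only a text-language analogy, and all names are mine): load a table of user-defined inputs and expected outputs, drive the function under test sequentially, and acquire, display, and store the results row by row:

```python
# Hypothetical sketch of a table-driven test case: load a table of
# inputs, drive the function sequentially, compare actual vs. expected.
# All names are illustrative, not NI Unit Test Framework APIs.

def dilution_ratio(stock_ml, diluent_ml):
    """The function under test: fraction of stock in the final volume."""
    return stock_ml / (stock_ml + diluent_ml)

# User-defined table: ((inputs), expected output) pairs.
test_table = [
    ((10.0, 90.0), 0.10),
    ((25.0, 75.0), 0.25),
    ((50.0, 50.0), 0.50),
]

results = []
for (stock, diluent), expected in test_table:
    actual = dilution_ratio(stock, diluent)
    passed = abs(actual - expected) < 1e-9
    results.append((stock, diluent, expected, actual, passed))

# Display and store actual values per row, not just a bare "P" checkbox.
for row in results:
    print("stock=%.1f diluent=%.1f expected=%.3f actual=%.3f pass=%s" % row)
assert all(r[-1] for r in results)
```

Note that the stored record contains the actual measured output for each row; a column of checkmarks alone would not be data an auditor could evaluate.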
In closing, it would be good to have a target test VI and script that provide a known baseline to exercise the test tool. That is what the tool validation would do, limiting its scope to that alone.
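One property such a known baseline should have (again sketched as a hypothetical Python analogy, since an actual Target Test VI would be graphical) is at least one deliberately failing case, so the tool validation shows the tool detects failures as faithfully as it reports passes:

```python
# Hypothetical baseline 'target function' with known outputs, including
# one deliberately wrong expectation: the tool validation should confirm
# the test tool flags failures, not only that it reports passes.
# All names are illustrative stand-ins.

def baseline_target(x):
    """Known-behavior target: doubles its input."""
    return 2 * x

cases = [
    (1, 2, True),    # correct expectation: tool should report PASS
    (3, 6, True),    # correct expectation: tool should report PASS
    (4, 9, False),   # deliberately wrong: tool should report FAIL
]

for x, expected, should_pass in cases:
    reported_pass = (baseline_target(x) == expected)
    # The tool's verdict must match what the baseline predicts.
    assert reported_pass == should_pass, f"tool verdict wrong for x={x}"
print("test tool reports passes and failures as expected")
```

A baseline that can only pass proves nothing about the comparison logic; the deliberate failure is what demonstrates the tool's pass/fail mechanism works.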