Validation of calibration software, as required by ISO 17025 for instance, is a subject that people are reluctant to talk about. Often there is uncertainty concerning the following: Which software actually needs to be validated? Who should be responsible for it? Which requirements must the validation satisfy? How does one proceed efficiently, and how is it documented? The following post explains the background and provides a recommendation for implementation in five steps.
In a calibration laboratory, software can be used for anything from supporting the evaluation of results up to fully automated calibration. Whatever the degree of automation of the program, validation always covers the complete process into which the software is integrated. Behind validation, therefore, lies the fundamental question of whether the calibration process fulfils its purpose and achieves all its intended goals, that is to say, does it provide the required functionality with sufficient accuracy?
Anyone who has to carry out validation tests should be aware of two basic principles of software testing:
Full testing is not possible.
Testing is always dependent on the environment.
The former states that testing all possible inputs and configurations of an application is impossible due to the sheer number of combinations. Depending on the application, users must therefore decide which functions, configurations and quality characteristics should be prioritised and which are not relevant for them.
Which decision is made often depends on the second point: the operating environment of the program. In practice, there are always different requirements and priorities for how the software is used. There are also customer-specific adjustments to the program, for example concerning the contents of the certificate. In addition, the individual conditions in the laboratory environment, with its particular array of instruments, generate further variance. The wide variety of requirement perspectives and the near-endless complexity of software configurations across customer-specific application areas therefore make it impossible for a manufacturer to test for all the needs of a specific customer.
Taking the above points into account, validation therefore falls to the user. To make this process as efficient as possible, a procedure along the following five points is recommended:
The data for typical calibration configurations should be defined as "test sets".
At regular intervals, typically once a year, but at the very least after any software update, these test sets should be entered into the software.
The resulting certificates should be compared with those from the previous version (see the sketch after this list).
For an initial validation, a cross-check, e.g. using MS Excel, can be carried out.
The validation evidence should be documented and archived.
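As an illustration of step 3, the comparison of certificate values against the previous software version can be automated. The following is a minimal sketch in Python, assuming the certificate results for a test set can be exported as CSV files with "test_point", "value" and "uncertainty" columns; the file names, column names and tolerance are illustrative assumptions, not part of any specific calibration software.

import csv

TOLERANCE = 1e-6  # acceptable numerical deviation between versions (assumption)

def load_results(path):
    """Read a certificate export as {test point: (value, uncertainty)}."""
    results = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            results[row["test_point"]] = (float(row["value"]),
                                          float(row["uncertainty"]))
    return results

def compare(previous_path, current_path, tol=TOLERANCE):
    """Return a list of deviations between two certificate exports."""
    previous = load_results(previous_path)
    current = load_results(current_path)
    deviations = []
    for point, (old_val, old_unc) in previous.items():
        if point not in current:
            deviations.append(f"{point}: missing in new version")
            continue
        new_val, new_unc = current[point]
        if abs(new_val - old_val) > tol or abs(new_unc - old_unc) > tol:
            deviations.append(f"{point}: value {old_val} -> {new_val}, "
                              f"uncertainty {old_unc} -> {new_unc}")
    return deviations

if __name__ == "__main__":
    # Hypothetical exports of the same test set from the old and new version
    issues = compare("testset_pressure_old.csv", "testset_pressure_new.csv")
    print("PASS" if not issues else "\n".join(issues))

Archiving the output of such a comparison alongside the test sets also covers step 5, the documentation of the validation evidence.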
WIKA provides PDF documentation of the calculations carried out in the software.
Note
For more info on our calibration software and calibration laboratories, visit the WIKA website.
