Basics
Definitions and requirements for the selection, verification and validation of methods are given in clauses 3.8, 3.9 and 7.2 of ISO/IEC 17025:2017.
Verification:
Standard methods need verification to ensure that the laboratory is capable of performing the stated activities. Verification is the demonstration that the laboratory is capable of replicating a standard method with an acceptable level of performance. Verification under conditions of use is demonstrated by meeting the system suitability specifications established for the method, as well as by demonstrating accuracy and precision, or other performance parameters relevant to the type of method.
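A minimal sketch of how such a check could be recorded is given below; the replicate results, the reference value and the acceptance criteria are purely illustrative and not taken from any particular method.

# Illustrative verification check: bias and repeatability of replicate
# measurements of a reference item, compared against acceptance criteria
# defined by the laboratory (all values are hypothetical).
from statistics import mean, stdev

replicates = [10.02, 9.98, 10.05, 10.01, 9.97, 10.03]  # laboratory results
reference_value = 10.00                                 # certified value
max_bias = 0.05           # acceptance criterion for accuracy (same unit)
max_repeatability = 0.04  # acceptance criterion for precision (std. dev.)

bias = mean(replicates) - reference_value
repeatability = stdev(replicates)  # sample standard deviation (n - 1)

print(f"bias = {bias:+.3f}, repeatability s = {repeatability:.3f}")
print("accuracy OK: ", abs(bias) <= max_bias)
print("precision OK:", repeatability <= max_repeatability)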
Validation:
ISO/IEC 17025:2017, clause 7.2.2.1: “The laboratory shall validate non-standard methods, laboratory-developed methods and standard methods used outside their intended scope or otherwise modified. …”
Factors to consider
Selection:
The customer may specify the method to be used; otherwise, the laboratory can select an appropriate one and inform the customer. Customer acceptance is usually given in written form; the agreement can be part of the contract.
When a deviation from the method occurs, the customer shall be notified, unless a specific statement covering it has already been included in the contract. Deviation from a standard method requires validation of the method.
Verification should be documented in such a way as to provide evidence that the laboratory is capable of achieving the required performance characteristics of the method; this can include:
- Estimation of repeatability and/or reproducibility (see the sketch after this list)
- Instrument characteristics
- Operator qualification (training, experience, competences, …)
- Environmental conditions
- Materials or reagents
- Any other characteristics that could influence the result
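The repeatability and a within-laboratory reproducibility (intermediate precision) could, for instance, be estimated from replicates measured on several days, as in the following sketch; the data, the number of days and the number of replicates are hypothetical.

# Illustrative estimate of repeatability and within-laboratory
# reproducibility (intermediate precision) from replicates measured
# on several days (balanced design, hypothetical data).
from statistics import mean, variance

days = [
    [10.02, 9.98, 10.05],   # day 1 replicates
    [10.08, 10.04, 10.06],  # day 2 replicates
    [9.97, 10.01, 9.99],    # day 3 replicates
]
n = len(days[0])  # replicates per day

ms_within = mean(variance(day) for day in days)         # repeatability variance
ms_between = n * variance([mean(day) for day in days])  # n * variance of day means

s_r = ms_within ** 0.5                                   # repeatability
s_days = max(0.0, (ms_between - ms_within) / n) ** 0.5   # between-day component
s_intermediate = (s_r**2 + s_days**2) ** 0.5             # intermediate precision

print(f"repeatability s_r = {s_r:.4f}")
print(f"intermediate precision s_I = {s_intermediate:.4f}")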
General cases are listed
below:
- Methods in national or international standards should be regarded as validated. Nevertheless, it must be verified that all conditions are fulfilled in the laboratory's application. This includes the stated uncertainty. If the uncertainty of the result is not mentioned or stated in the national or international standard, the laboratory using it should give this some consideration.
- Seldom used methods. When a method is used only occasionally, the maintenance of personal competence or the fitness of the equipment may be questioned. Here a rationale should be given, considering e.g. the experience and education of the personnel in areas close to the method in question, or the straightforwardness of the method.
Example: When testing the strength and deformations of 24-foot containers once every second year, the verification should consider whether the personnel have extensive training in mechanics or solid mechanics, and whether other large-scale mechanical tests are regularly performed in the laboratory.
Validation:
When planning a validation, much work can be saved by having technical competence available and by using a systematic approach. One aim is to judge which factors are the most important and deserve the most attention. Three main stages could be used:
- Distinguish between the method of testing and the method of producing and processing the specimen, including sampling
- Consider the test or measurement factors (equipment and calibration, handling of the specimen, testing or measurement procedure, analysis and form of results)
- Consider supplementary varying factors (environment, education and experience of the operator, frequency of use of the method)
The documentation
should clearly describe which factors are of significance and why, and how they
are treated in the validation. Conditions and limitations should be described.
Note: One important distinction is that a method may be valid but not necessarily relevant, i.e. the result is what is stated, but it does not provide what is really needed. Many examples may be found in old but still used standards for product testing.
The two main principles
for validation
Validation may be
obtained using the following principles, often in combination.
- Use scientific
knowledge and acknowledged experience to describe and demonstrate the validity
of factors involved.
Example: Time to obtain
thermodynamic equilibrium in a climate chamber may be assessed either by
dimensional analysis of the laws of heat flow, or by experience from
measurements in similar situations.
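As an illustration of such an assessment (a lumped heat-flow model with generic symbols, not taken from any particular method), the time constant of a specimen of mass $m$, specific heat capacity $c_p$, exposed area $A$ and surface heat-transfer coefficient $h$ is

\[
\tau = \frac{m\,c_p}{h\,A}, \qquad \frac{T(t) - T_\infty}{T_0 - T_\infty} = e^{-t/\tau},
\]

so that after roughly $t \approx 3\tau$ the remaining deviation from the chamber temperature $T_\infty$ is below about 5 %.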
- Use, if possible, interlaboratory comparisons, proficiency tests or reference materials to show that the complete chain of testing or analysis gives the stated result, including its uncertainty, over the range of interest.
Example: Chemical
analyses by “black box” equipment may be validated by reference materials and
proficiency tests.
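For proficiency tests, the commonly used performance statistics (the $z$ score and the $E_n$ number, see e.g. ISO 13528) give an objective criterion for a laboratory result $x$:

\[
z = \frac{x - x_{pt}}{\sigma_{pt}}, \qquad E_n = \frac{x - x_{ref}}{\sqrt{U_{lab}^2 + U_{ref}^2}},
\]

where $x_{pt}$ is the assigned value, $\sigma_{pt}$ the standard deviation for proficiency assessment, and $U_{lab}$, $U_{ref}$ the expanded uncertainties of the laboratory result and the reference value; $|z| \le 2$ and $|E_n| \le 1$ are normally regarded as satisfactory.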
Different types of
methods
The validation procedure should be chosen in accordance with the actual type of method.
Method extensions or variations of methodologies are very important for services to innovative branches of industry. For efficient accreditation of a flexible scope, such validation is important. It is recommended to use scientific knowledge or experience; good competence of the laboratory staff is essential.
Example: EMC investigations at increasing frequency ranges require both a scientific basis and experience with the actual anechoic chamber in order to judge the number of geometries and antenna configurations necessary to achieve the resulting uncertainty.
In-house methods have
to be validated by the laboratory, but with consideration of a cost-benefit
perspective and in agreement with the customers. Often the method is an
extension or a simple combination of known methods.
Example: The torque required to open the lid of a can may be tested in a simple way with an uncertainty of, say, 3 per cent, but it may be very difficult to achieve an uncertainty of 1 per cent. If the variation in torque between cans is typically 10 per cent and the intention is to check whether elderly people can open the cans, the 3 per cent is obviously sufficient.
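A rough calculation supports this: if the between-can variation and the measurement uncertainty are independent, the observed spread is approximately

\[
\sqrt{10^2 + 3^2}\,\% \approx 10.4\,\% \qquad \text{versus} \qquad \sqrt{10^2 + 1^2}\,\% \approx 10.0\,\%,
\]

so reducing the measurement uncertainty from 3 per cent to 1 per cent would change the observed spread by less than half a percentage point.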
Validation is a relative concept, and its extent should always be chosen with consideration of the intended use of the results. This is implicit in clause 7.2.2.1 cited above.
New Method
According to the above, each new method has to be validated or verified prior to implementation. The validation and/or verification has to be documented and approved.
Uncertainty fit for purpose as part of the validation procedure
Uncertainty assessment may seem complicated, and a full assessment is not always possible. However, there are most often simple ways to obtain robust assessments of uncertainty. A continuously updated list of useful documents is available on the EUROLAB web-site (www.eurolab.org); the GUM is the reference document.
If possible, the concepts of instrumental uncertainty and target uncertainty (described in the VIM) could be included in the assessment.
Some rules of thumb are the following:
- One may distinguish between the dispersion in the tested objects (the representativity of a sample) and the dispersion (uncertainty) of the test method.
- The choice between Type A and Type B evaluation should be made according to the quality of the contribution.
- If Type B estimates have to be used and combined, it is important to find the ones contributing most. The others (smaller than 5% of the biggest one) can normally be discarded, as illustrated in the sketch below.
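The following sketch combines a few standard uncertainty contributions by root-sum-square, as for uncorrelated inputs in the GUM; the contribution names and values are purely illustrative.

# Illustrative combination of standard uncertainty contributions by
# root-sum-square (uncorrelated inputs); all values are hypothetical.
contributions = {
    "calibration of instrument": 0.50,
    "resolution":                0.02,   # < 5 % of the biggest contribution
    "temperature drift":         0.20,
    "operator / reading":        0.01,   # < 5 % of the biggest contribution
}

u_all = sum(u**2 for u in contributions.values()) ** 0.5
dominant = {k: u for k, u in contributions.items()
            if u >= 0.05 * max(contributions.values())}
u_dominant = sum(u**2 for u in dominant.values()) ** 0.5

print(f"combined u, all terms:      {u_all:.4f}")
print(f"combined u, dominant only:  {u_dominant:.4f}")
# The two results are practically identical, illustrating why contributions
# below about 5 % of the largest one can normally be discarded.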
In e.g. chemical analysis, a local uncertainty measure, the repeatability, is used for controlling the stability of production processes etc.; such results may contain bias, a systematic error that contributes to the global uncertainty. In other areas, such as products intended for safety-critical applications, it is necessary to use the global uncertainty, relating the results to the true value.
A related concept is the reproducibility, which describes, typically for a number of laboratories and operators, the capability to produce similar results over time when applying the method.
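Following the precision model of ISO 5725, the two measures are related by

\[
s_R^2 = s_L^2 + s_r^2,
\]

where $s_r$ is the repeatability standard deviation, $s_L$ the between-laboratory standard deviation and $s_R$ the reproducibility standard deviation, so that $s_R \ge s_r$ always.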
Note: The ISO/IEC 17025 standard mentions a number of measures of the properties of a test method, such as robustness, sensitivity, detection limit etc.; these terms are sector-specific and, if need be, their definitions should be looked up in the VIM.
See also:
JCGM 100 (GUM)
JCGM 200 (VIM)