CHAPTER 6
Data Verification and Validation
Data verification and validation are the first steps in assessing project data quality
and usefulness. They represent a standardized review process for determining the
analytical quality of discrete sets of chemical data. The primary purpose is to
summarize the technical defensibility of the analytical data for the end users and
decision makers.
Direction is provided by the EPA under the Contract Laboratory Program (CLP)
in the form of the National Functional Guidelines for Organic Data Review (EPA-
540/R-94/012, February 1994) and the National Functional Guidelines for Inorganic
Data Review (EPA-540/R-94/013, February 1994). This guidance must be interpreted
and applied to individual programs and projects at the operational level and
incorporated into the Sampling and Analysis Plan for a given investigation.
Verification and validation must be consistent with the project data
quality objectives, laboratory scope of work, and designated analytical methods.
Data verification/validation represent the decision process by which established
quality control criteria are applied to the data. Individual sample results are accepted,
rejected, or qualified as estimated. Data that meet all criteria are acceptable and can
be used as needed by the project. Data not meeting critical criteria are “rejected” (R)
and should not be used for any project objectives. Other data meet all critical
measures but have quality control deficiencies that call their accuracy, precision,
or sensitivity into question. These data are qualified as "estimated" (J) to indicate
that certain validation criteria were not met. Estimated data may or may not be
usable depending on the intended data use, and every estimated result should be
accompanied by a rationale for its qualification so that the data user can make an
informed decision about its use in the decision process.
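The accept/estimate/reject decision described above can be expressed compactly, as in
the minimal sketch below. The sketch is not taken from the EPA guidance; the lists of
failed criteria are hypothetical inputs, and the actual criteria and their criticality
are defined by the project data quality objectives and Sampling and Analysis Plan.

# Minimal sketch of the accept / estimate / reject decision described above.
# The inputs are hypothetical lists of failed criteria; real projects define the
# criteria and their criticality in the data quality objectives and the
# Sampling and Analysis Plan.
def qualify_result(critical_failures, noncritical_failures):
    """Return a (flag, rationale) pair for a single sample result."""
    if critical_failures:
        return "R", "Rejected: " + "; ".join(critical_failures)
    if noncritical_failures:
        return "J", "Estimated: " + "; ".join(noncritical_failures)
    return "", "All verification/validation criteria met"

# Example: an out-of-range matrix spike recovery yields an estimated (J) result,
# not a rejected one.
flag, rationale = qualify_result([], ["matrix spike recovery below lower control limit"])
print(flag, "-", rationale)  # J - Estimated: matrix spike recovery below lower control limit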
EPA Region I has provided the environmental community with a useful “Tiered
Approach” to validation that allows a program or project to establish the level of
intensity and depth of review applicable to its needs. Guidance to this approach appears
in “Region I, EPA-New England Data Validation Functional Guidelines for Evaluating
Environmental Analyses,” July 1996, revised December 1996. This document and its
appendices may prove useful during project data validation development.
Data verification is the first step in a tiered or graded approach (Tier I). It is
performed for the purpose of “verifying” that the laboratory provided the reporting
information required by the Sampling and Analysis Plan and met a few minimum
requirements that ensure the data are usable. Data verification involves evaluating
laboratory analytical data packages to confirm that (a simple sketch of these checks
follows the list):
• The data packages are complete and contain all of the information specified in the
Sampling and Analysis Plan, e.g., all samples and analyses requested, the case
narrative, the summary data report, the completed chain-of-custody form, analytical
quality control data (blanks, matrix spikes, matrix spike duplicates, etc.), and the
date and time when each analysis was performed;
• The laboratory ran the correct analytical methods specified in the Sampling and
Analysis Plan;
• Samples did not exceed the maximum analytical holding times specified in the
Sampling and Analysis Plan;
• Sample chain-of-custody was not broken from the time the sample was collected
through analysis and reporting of the data; and
• The laboratory reported analytical results for each analytical method and each
analyte required by the laboratory statement of work and the project Sampling and
Analysis Plan.
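As a simple illustration of the verification checks listed above, the following sketch
(not drawn from the EPA guidance) tests a single data-package record for completeness,
correct method, holding time, and chain-of-custody. The record fields, required
deliverables, and method identifier are hypothetical placeholders; actual requirements
come from the Sampling and Analysis Plan.

from datetime import date

# Deliverables assumed to be required by the Sampling and Analysis Plan (illustrative only).
REQUIRED_DELIVERABLES = {"case narrative", "summary data report",
                         "chain-of-custody form", "QC data"}

def verify_package(sample):
    """Tier I verification of one sample record; returns a list of findings."""
    findings = []
    missing = REQUIRED_DELIVERABLES - set(sample["deliverables"])
    if missing:
        findings.append(f"incomplete package, missing: {sorted(missing)}")
    if sample["method_run"] != sample["method_required"]:
        findings.append("incorrect analytical method")
    held = (sample["analysis_date"] - sample["collection_date"]).days
    if held > sample["max_holding_time_days"]:
        findings.append(f"holding time exceeded ({held} d)")
    if not sample["custody_unbroken"]:
        findings.append("chain-of-custody break")
    return findings  # an empty list means the record passes Tier I verification

example = {
    "deliverables": ["case narrative", "summary data report",
                     "chain-of-custody form", "QC data"],
    "method_run": "EPA 901.1", "method_required": "EPA 901.1",
    "collection_date": date(2000, 6, 1), "analysis_date": date(2000, 6, 20),
    "max_holding_time_days": 180, "custody_unbroken": True,
}
print(verify_package(example))  # [] -> passes verification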
Data verification does not require an extensive effort, and all analytical data
packages should undergo this level of evaluation prior to use. This level of
verification results in minimal qualification of the data, and its applicability to
objectives such as risk assessment would be questionable. However, for site
characterization studies and for defining the nature and extent of contamination for
engineering planning purposes, Tier I may be all that is needed, since the
consequences of decision error are relatively minor.
Basic data validation is a more extensive evaluation than data verification and
represents the next tier in the review process (Tier II). Basic data validation adds
an evaluation of laboratory quality control data and analytical procedures. This
ensures that the analytical process and instrumentation used to perform the analyses
met all of the data quality requirements specified in the data quality objectives
(DQOs) and Sampling and Analysis Plan. Focus is given to laboratory/instrument
performance criteria, sample preparation and matrix effects evaluation, and field
quality control measures. In addition to Tier I verification, basic data validation
involves evaluating laboratory analytical data packages to confirm that (two of the
underlying calculations are sketched after these lists):
Laboratory/instrument performance criteria
• Laboratory case narrative documentation is clear and accurate.
• Analytical preparation procedures are acceptable and documented.
• Instrument operational and method calibration criteria have been achieved.
• Laboratory calibration blank contamination is under control.
• Laboratory control standard criteria are being met.
Sample preparation and matrix effects criteria
• Laboratory method blank contamination is under control.
• Sample surrogate compound recovery, tracer recovery, and internal standard
criteria have been achieved.
• Sample matrix spike recoveries meet minimum accuracy requirements specified
in the DQOs and Sampling and Analysis Plan.
• Sample matrix spike duplicate or duplicate comparisons meet minimum precision
requirements specified in the DQOs and Sampling and Analysis Plan.
• Sample dilution review and reanalyses are performed.
Field quality control measures
• Field source water blank, equipment rinsate blank, and sample trip blank contents
have not impacted the project data results.
• Field duplicate comparisons meet minimum precision requirements specified
in the DQOs and Sampling and Analysis Plan.
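Two of the quantitative checks that underlie the criteria above are the matrix spike
percent recovery (an accuracy measure) and the relative percent difference between
duplicate results (a precision measure). The sketch below is illustrative only; the
acceptance limits shown are placeholders, and the actual limits come from the project
DQOs and Sampling and Analysis Plan.

def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Matrix spike recovery, in percent."""
    return 100.0 * (spiked_result - unspiked_result) / spike_added

def relative_percent_difference(result_1, result_2):
    """Relative percent difference (RPD) between a sample and its duplicate."""
    mean = (result_1 + result_2) / 2.0
    return 100.0 * abs(result_1 - result_2) / mean if mean else 0.0

# Hypothetical acceptance limits, for illustration only.
RECOVERY_LIMITS = (75.0, 125.0)  # percent
RPD_LIMIT = 20.0                 # percent

rec = percent_recovery(spiked_result=14.2, unspiked_result=4.1, spike_added=10.0)
rpd = relative_percent_difference(5.2, 4.7)
print(f"recovery = {rec:.0f}% (limits {RECOVERY_LIMITS})")  # recovery = 101%
print(f"RPD = {rpd:.1f}% (limit {RPD_LIMIT}%)")             # RPD = 10.1%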
Tier II validation does require more effort; however, this level enables meaningful
qualification of the data and is considered acceptable documentation for data uses
such as risk assessment, transport modeling, cleanup confirmation, and site closure.
This degree of review conforms with that identified by the EPA in "Guidance for Data
Usability in Risk Assessment" (EPA-540/G90/009, October 1990). Tier II validation
should also provide acceptable documentation to confirm legal defensibility.
Complete data validation encompasses both Tier I and Tier II information and
adds a detailed examination of the analytical raw data. This level of review requires
all information generated by the laboratory to be presented as part of the data
deliverable. This would include copies of all chromatograms, spectral printouts,
quantification details, preparation logbooks, standard logbooks, calculation programs,
etc., produced by the laboratory. In addition to the previously reviewed material,
comprehensive data validation will include:
• A detailed examination of analyte identification in the raw data;
• A check of the calculations used to quantify analyte results (normally a minimum of
10% of the reported concentrations are recalculated from the original raw data); and
• Verification of raw data results against the final reported concentrations to
preclude transcription errors (a sketch of this recalculation check follows the list).
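The recalculation and transcription checks can be illustrated with the sketch below,
which recomputes a fraction of the reported results from raw data and flags
disagreements. The record fields, the simplified quantitation formula (raw instrument
response times a calibration factor), and the 1% agreement tolerance are hypothetical
assumptions, not prescribed by any guidance.

import random

def recalculation_check(results, fraction=0.10, tolerance=0.01, seed=0):
    """Recompute a subset of reported results from raw data and flag mismatches."""
    rng = random.Random(seed)
    n = max(1, round(fraction * len(results)))
    discrepancies = []
    for rec in rng.sample(results, n):
        # Simplified quantitation for illustration: response times calibration factor.
        recalculated = rec["raw_response"] * rec["calibration_factor"]
        reported = rec["reported_result"]
        if abs(recalculated - reported) > tolerance * max(abs(reported), 1e-12):
            discrepancies.append((rec["sample_id"], reported, recalculated))
    return discrepancies  # a non-empty list points to possible transcription errors

results = [
    {"sample_id": "SS-01", "raw_response": 1250.0, "calibration_factor": 0.004,
     "reported_result": 5.0},
    {"sample_id": "SS-02", "raw_response": 900.0, "calibration_factor": 0.004,
     "reported_result": 4.6},  # recalculates to 3.6 and is flagged
]
print(recalculation_check(results, fraction=1.0))  # [('SS-02', 4.6, 3.6...)]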
Complete data validation does require an extensive effort, and unless there are
special project goals requiring this level of review, it is probably not necessary to
implement such intensive inspection of the data. This level of validation is considered
acceptable for all data uses including legal actions. When Tier I and Tier II reviews
show persistent laboratory errors or indicate a possibility of laboratory impropriety,
complete validation is warranted to justify possible contractual recourse relative to
laboratory performance.
Regardless of which level is implemented, validation must be documented formally for
project records. The principal mechanisms utilized to establish this documentation are
qualification flags (U, J, UJ, R) attached to the data values and relevant checklists
targeted at verification and method-specific validation. The qualification flags U, J,
UJ, and R represent below detection, estimated value, below detection–estimated value,
and rejected, respectively. Validation flags should be placed on both hardcopy
"Form 1" results and in electronic databases. More sophisticated and useful
documentation is implemented by several organizations in the form of reason codes
applied to the traditional qualification flags. These codes express in detail why the
primary qualification was made. Data users can employ this information to ascertain
potential bias in the reported concentrations and their utility to the project.
Additional documentation may be written in validation summary reports (per sample
delivery group or per project), Quality Control Summary Reports, or Data Quality
Assessments.
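A simple way to carry qualification flags and reason codes alongside each result in an
electronic database is sketched below. The reason codes shown are invented for
illustration; organizations using this approach define their own code lists.

from dataclasses import dataclass, field

FLAGS = {"U": "below detection", "J": "estimated value",
         "UJ": "below detection, estimated", "R": "rejected"}

REASON_CODES = {  # hypothetical examples of reason codes
    "H": "holding time exceeded",
    "B": "analyte detected in an associated blank",
    "S": "matrix spike recovery outside control limits",
    "D": "field or laboratory duplicate precision outside control limits",
}

@dataclass
class QualifiedResult:
    sample_id: str
    analyte: str
    result: float                                 # reported concentration
    flag: str = ""                                # "", "U", "J", "UJ", or "R"
    reasons: list = field(default_factory=list)   # reason codes explaining the flag

    def summary(self):
        why = ", ".join(REASON_CODES[c] for c in self.reasons)
        return f"{self.sample_id} {self.analyte}: {self.result} {self.flag} ({why})"

r = QualifiedResult("SS-02", "Cs-137", 4.6, flag="J", reasons=["S"])
print(r.summary())  # SS-02 Cs-137: 4.6 J (matrix spike recovery outside control limits)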
Various combinations of data review, verification, and validation can be imposed
on the project data depending on the intended use of the information. However, the
end result of the verification/validation process must provide a consistent and
defensible data set.
REFERENCES
EPA (Environmental Protection Agency), Data Useability in Risk Assessment, EPA-540/G90/009, October 1990.
EPA (Environmental Protection Agency), National Functional Guidelines for Organic Data Review, EPA-540/R-94/012, February 1994.
EPA (Environmental Protection Agency), National Functional Guidelines for Inorganic Data Review, EPA-540/R-94/013, February 1994.
EPA (Environmental Protection Agency), Region I, New England Data Validation Functional Guidelines for Evaluating Environmental Analyses, July 1996 (revised December 1996).