
Laboratory Data Review Guidance | Guidance Document 29


Printable Version (PDF: 50 KB / 3 pages)
GD29 Attachment: Laboratory Data Review Checklist (PDF: 391 KB / 4 pages)

The following is the Minnesota Department of Agriculture’s (MDA) informal guidance, which should be used with the Attachment to review analytical data. The Attachment should be completed by the consultant and included with all laboratory data submitted to the MDA. This form is based on the Minnesota Pollution Control Agency’s Laboratory Data Review Checklist Guidance and follows the general format of the National Functional Guidelines, the primary data review tool used in the U.S. Environmental Protection Agency’s Contract Laboratory Program for Superfund analytical work.

1. Chain of Custody, Preservation and Holding Times

The validity of analytical results is partially based on the condition of the sample when received and the length of the holding time from sample collection to sample analysis. The requirements for sample preservation and holding times are found in the specific MDA approved method (for MDA parameters in ground water, these criteria are included in Attachment 3 to Incident Response Unit Guidance Document 12) or in specific U.S. Environmental Protection Agency approved methods. The MDA approves the freezing of soil samples for up to six (6) months to extend holding times for pesticide and nutrient analyses.

How to check: Review the Chain of Custody Form, the Sample Condition on Receipt Form and the report narrative to determine if the samples were preserved and arrived at the laboratory in the proper condition. In addition, look at the date of sample collection on the Chain of Custody Form and compare this to the date of sample preparation and/or to the date of analysis. The number of days must be less than or equal to the required technical or program holding times.

If the samples were not analyzed within the technical holding time, the results may be impacted. Detected results for these samples should be qualified or flagged as estimated (“J”), and non-detects as either estimated (“J”) or unusable (“R”). If problems are noted, the integrity of the samples has been compromised and professional judgment must be used to evaluate the impact on the sample results.

Any problems with the condition of the sample, preservation of the sample, or failure to meet holding times must be properly flagged on the results for the specific sample and described in the report narrative and on the Sample Condition on Receipt Form.
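The holding-time comparison described above can be sketched in a few lines. This is a minimal illustration, not part of the MDA guidance; the function name and the 14-day limit shown in the example are hypothetical (actual limits come from the MDA approved method or Guidance Document 12, Attachment 3).

```python
from datetime import date

def within_holding_time(collected, analyzed, max_days):
    """Return True if the sample was prepared/analyzed within the
    required holding time, measured from the collection date."""
    return (analyzed - collected).days <= max_days

# Hypothetical 14-day holding time for illustration only
print(within_holding_time(date(2024, 5, 1), date(2024, 5, 10), 14))  # True
print(within_holding_time(date(2024, 5, 1), date(2024, 5, 20), 14))  # False
```

A result of False means the associated detects and non-detects should be qualified as described above.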

2. Calibration

The calibration process consists of an initial calibration and continuing calibration verifications. Requirements for initial calibrations are specified in the MDA approved method reference or by the MPCA Quality Control (QC) Policy.

How to check: Calibration information may not be available for review by the data user. Look at the report narrative or any attached data discussing calibration. Look at the data report and check any flagged data that indicates there was a calibration issue. If you find flagged data, consider the impact on the affected compounds in making decisions. Any problems with calibration of the instruments must be described in the report narrative by the laboratory.

3. Blanks

There are numerous types of blanks that are analyzed by the laboratory. These include field blanks, trip blanks and method blanks. These blanks are analyzed to determine the existence and magnitude of contamination resulting from field, trip or laboratory activities. The method blank is used to determine the levels of contamination associated with the processing and analysis of samples. There must be one method blank reported for each batch of samples prepared by the laboratory for most analyses. The concentration of each target analyte in the method blank must be less than the associated reporting level. If the method blank is contaminated, measures must be taken by the laboratory to eliminate the problem.

How to check: Look at the blank analysis results and the narrative. If any target analyte is detected above the reporting level in the blanks, sample results may need to be qualified to indicate the problem. The presence of unexpected compounds in the field or trip blanks likely indicates cross-contamination of sampling equipment or contamination from an unexpected source.

All concentration levels for the affected target analyte in the sample which are less than 10 times the concentration of the analyte in the method blank should be qualified with a “B” to indicate that the sample results may contain a bias related to the blank contamination. Concentrations of the affected analyte in the sample which are above 10 times the blank contamination will not need to be qualified. If a compound of concern on a site is flagged due to blank contamination, the sample result may be a false positive and care must be used in making site decisions.
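The 10-times rule above can be expressed as a short check. This is an illustrative sketch only; the function name is hypothetical, and the concentrations in the example are invented.

```python
def qualify_for_blank(sample_conc, blank_conc):
    """Apply the 10x rule: qualify the result with "B" when the sample
    concentration is less than 10 times the method-blank concentration."""
    if blank_conc > 0 and sample_conc < 10 * blank_conc:
        return "B"   # result may carry a bias from blank contamination
    return None      # no blank-related qualifier needed

print(qualify_for_blank(4.0, 0.5))   # 4.0 < 10 x 0.5, prints B
print(qualify_for_blank(12.0, 0.5))  # 12.0 >= 5.0, prints None
```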

4. Surrogates – Organic Analysis

Surrogates are synthesized compounds that are not normally found in the environment. Surrogates are added to every sample and all batch QC samples by the laboratory to monitor laboratory performance in analyses of organic compounds. Laboratories develop surrogate recovery limits based on recoveries from submitted samples. Acceptable recovery limits are defined in each method.

How to check: Review the recoveries of the surrogates for each method where surrogates are added. Acceptable ranges for surrogate recoveries must be listed in each report. If any recovery is outside of the acceptable range, the associated data should have been flagged as follows:

  1. If any recovery is greater than the upper acceptance limit, any associated analytes detected above reporting limits should be flagged as estimated (“J”).
  2. If any recovery is below the lower acceptance limit, any associated non-detects should be flagged as either estimated (“J”) or as unusable (“R”).
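The two flagging rules above can be sketched as a simple decision. The function name and the 70–120 percent acceptance range in the example are hypothetical (the laboratory's own limits govern); cases not addressed by the two rules return no flag here.

```python
def flag_surrogate(recovery, low, high, detected):
    """Map a surrogate percent recovery to the qualifier implied by the
    two rules above; None means no flag is required by those rules."""
    if recovery > high and detected:
        return "J"        # rule 1: detect above reporting limit, high recovery
    if recovery < low and not detected:
        return "J or R"   # rule 2: non-detect, low recovery (judgment call)
    return None

print(flag_surrogate(125.0, 70, 120, detected=True))   # J
print(flag_surrogate(55.0, 70, 120, detected=False))   # J or R
print(flag_surrogate(90.0, 70, 120, detected=True))    # None
```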

5. Laboratory Control Sample/Laboratory Control Sample Duplicates

Data for Laboratory Control Sample (LCS) and Laboratory Control Sample Duplicates (LCSD) are generated to monitor the accuracy and precision of the analytical process on a non-contaminated material such as homogeneous sand or purified water.

How to check: Review the LCS and LCSD recoveries and the Relative Percent Differences (RPD) between the LCS and LCSD for each compound for each method. Acceptable ranges for LCS and LCSD recoveries must be listed in the report. If any recovery or RPD is outside of the acceptable range, the samples and associated QC samples will need to be re-analyzed. If the second set of LCS/LCSD results still fails, contact MDA Incident Response Unit (IRU) staff.

  1. If any recovery or RPD is greater than the upper acceptance limit, any associated analytes detected above the reporting limits should be flagged as estimated (“J”).
  2. If any recovery is below the lower acceptance limit, any associated non-detects should be flagged as estimated (“J”) or as unusable (“R”).
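The Relative Percent Difference used to compare the LCS and LCSD is the absolute difference divided by the mean of the two results, expressed as a percentage. A minimal sketch (recovery values in the example are invented):

```python
def rpd(lcs, lcsd):
    """Relative Percent Difference between LCS and LCSD results:
    |a - b| / mean(a, b) * 100."""
    return abs(lcs - lcsd) / ((lcs + lcsd) / 2.0) * 100.0

print(round(rpd(95.0, 105.0), 1))  # 10.0
```

The computed RPD is then compared against the acceptance range listed in the laboratory report.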

6. Matrix Spike/Matrix Spike Duplicates/Sample Duplicates

Data for matrix spike (MS) and matrix spike duplicate (MSD) and sample duplicates (DUPs) are generated to monitor accuracy and precision of the analytical process on the sample matrix, i.e., one of the samples submitted to the laboratory. For organic analyses, MS/MSDs are prepared and analyzed at a 5 percent frequency. For inorganic analyses, matrix spikes and duplicates are prepared and analyzed at a 10 percent frequency. Note: Labs do not always choose the samples submitted for your MDA project to run as MS/MSD samples. You may request that the lab use your MDA project samples for QC analyses instead of samples submitted from another site.

How to check: Review the recoveries and RPDs for each method. Acceptable ranges for MS/MSD recoveries must be listed in the report. If any recovery or RPD is outside of the acceptable range, the samples must be re-analyzed. If the second set of MS/MSD results still fails, the data must be flagged with the following qualifiers and a narrative must be included that describes the issues.

  1. If any recovery or RPD is greater than the upper acceptance limit, any associated analytes detected above the reporting limits should be flagged as estimated (“J”).
  2. If any recovery is below the lower acceptance limit, any associated non-detects should be flagged as estimated (“J”) or as unusable (“R”).

7. Method Detection Limits/Reporting Limits

Method Detection Limits (MDLs) and Reporting Limits (RLs) are determined initially by the laboratory for each method and redetermined at a minimum of every two (2) years, or after a major change to the instrument conditions. The MDLs are to be determined per the procedure defined in 40 Code of Federal Regulations (CFR) Part 136, Appendix B. The RLs should be at least three (3) times the MDLs. Reporting limits depend on program needs and can change as new information becomes available. Reporting limits are verified after each calibration and at least monthly. Consult MDA IRU Guidance Document 24 for reporting limits for target analytes.

Reporting limits and method detection limits can vary between laboratories performing the same test method, and within the same laboratory from one year to another as new MDL studies are performed. Reporting limits must be met for each analysis. If the reporting limits have been raised, the laboratory must provide an explanation in the report narrative. In the case of dilutions, the laboratory must report concentrations of analytes from multi-compound lists that were detected in the sample prior to dilution of the sample.
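The three-times relationship between the RL and the MDL can be checked mechanically. This is an illustrative sketch; the function name and the limit values in the example are hypothetical.

```python
def rl_meets_guidance(rl, mdl):
    """Check the guidance that the reporting limit be at least
    three (3) times the method detection limit."""
    return rl >= 3 * mdl

print(rl_meets_guidance(0.5, 0.1))  # True
print(rl_meets_guidance(0.2, 0.1))  # False
```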

8. Sample Information

The laboratory must ensure that all sample numbers are cross-referenced correctly and that this information is clear to the consultant and MDA IRU staff.

Soil Reporting: Results for soil/solid samples must be reported by dry weight. The percent moisture results must also be reported.
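The dry-weight conversion behind the soil-reporting requirement can be sketched as follows. This is an illustrative calculation, not prescribed by the guidance; it assumes the percent moisture is reported on a total (wet) weight basis.

```python
def dry_weight_result(wet_result, percent_moisture):
    """Convert a wet-weight soil result to a dry-weight basis,
    assuming percent_moisture is on a total (wet) weight basis."""
    return wet_result / (1.0 - percent_moisture / 100.0)

# A sample reported at 8.0 mg/kg (wet) with 20% moisture
print(round(dry_weight_result(8.0, 20.0), 2))  # 10.0
```

The laboratory performs this conversion; reporting the percent moisture lets the reviewer verify it.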

9. Report Narrative

The laboratory must provide a case narrative with each analytical report. The narrative must explain all problems and issues with sample analysis and fully discuss any data that is flagged or footnoted. The narrative must also indicate why a result which was flagged or otherwise identified to have quality assurance/quality control concerns should be considered usable data.

