Triangle NoTES January 2017


On the anniversary of our return to independent status I would like to revisit a couple of things that are very important in evaluating analytical services. Recently I have been asked to review reports and give advice on how to proceed concerning the results, which has been interesting, to say the least.

 

Over the years I have seen reports that ranged from less than a page of reported concentrations to dozens of pages of raw data. To be honest, a single page of results may represent the highest quality of work, but without any supporting evidence there is no way to really know the quality represented in the handling or analysis of the samples. This is especially true when the single page covers multiple analyses following multiple methods and using multiple detectors. There is a level of uncertainty with every analysis, which must be acknowledged in some fashion to allow a reviewer to determine the quality of those results. Are the results from a single injection or the average of multiple injections? The QA for the average of multiple injections is, of course, higher than that for a single injection. For example, the Method 3-C analysis calls for duplicate injections, which in most cases is sufficient to show agreement. It is in that rare case where the injections differ that a third injection is so useful. The problem is that for many laboratories, a single injection is considered effective with a QA duplicate every few samples. They may have one of the unusual variations but never see it because the odds are against them doing so.
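The duplicate-injection logic described above can be sketched in a few lines of Python. This is only an illustration, not any method's actual acceptance criterion: the 5% relative percent difference tolerance, the function names, and the example values are all my own assumptions.

```python
def rpd(a, b):
    """Relative percent difference between two injection results."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

def evaluate_duplicate(inj1, inj2, tolerance_pct=5.0):
    """Report the average when the duplicate injections agree;
    otherwise flag the sample as needing a third injection."""
    if rpd(inj1, inj2) <= tolerance_pct:
        return {"result": (inj1 + inj2) / 2.0, "third_injection_needed": False}
    return {"result": None, "third_injection_needed": True}

# Agreeing duplicates (RPD = 4.0%): the average is reported.
print(evaluate_duplicate(102.0, 98.0))
# Disagreeing duplicates: the rare case where the third injection matters.
print(evaluate_duplicate(102.0, 60.0))
```

A laboratory running single injections with only an occasional QA duplicate never gets to apply a check like this to most of its samples, which is the point of the paragraph above.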

 

On the other side of the coin, it is possible to overwhelm people with too much data. For example, we provide a sample chromatogram for one of the individual analytical runs and for a calibration run, in addition to the data for every injection of every sample and calibration included in the report. This is because adding all of the chromatograms would add six to twelve pages of raw data for each individual sample and up to twenty-four additional pages for the calibrations, depending on the type of sample. The size of the report quickly grows to unmanageable levels as this raw data is included. For example, a set of three inlet and three outlet samples could have eighty-four pages or more of raw data that merely inflate the volume of the report. In our case the raw area counts are electronically transferred into the program used to generate the report, so there is no manual transfer of that data to be checked and no reason to unnecessarily bulk up the report.

 

Sometimes the modifications some laboratories make to the methods are specifically acknowledged, and other times there are only general claims. If a laboratory says the standard methods are listed for reference only and the actual procedure may be different but will meet the requirements on a performance basis, I become very concerned. The ASTM methods, for example, are fairly generous in their performance-based applications, but the EPA methods are not flexible unless there is an explicit notation within the method making such an allowance. This is because the EPA reference methods legally define the parameters for which they are used. Thus, any change, however minor, may result in a changed legal definition for those results, and not one supported by the regulations. For example, when Method 25-C analysis was being performed with the SCAQMD 25.1 analytical systems, the USEPA contracted RTI to do a comparison after our round robin with laboratories using the modified systems. In my talks with Dr. Jayanty on the subject, it appeared that some of the column sets used were not compared due to claims of proprietary information, but others were tested in comparison with the set specified in Methods 25 and 25-C. The best correlation they found was a 20% negative bias for the new columns. The worst any of us saw was a 100% negative bias, where a non-detect was reported compared to significant amounts detected using the correct columns and by a GC/MS scan of the sample using an internal standard for quantification of the total peaks.

 

Another instance of a comparison was totally internal. We had been told of a “better” column set by Varian Instruments, but it was based on the SCAQMD method, so we needed to make some comparisons prior to requesting permission to use it. We performed all of the tests required in the method, and the set passed with flying colors. The peaks were much more pronounced, sharper, and cleaner, which made the data review even better. Technically this showed the column set met the performance criteria for the method, but I needed to know whether the results were similar to those of the method in the application of this set. We analyzed a series of samples using this new column set on one analyzer after they were analyzed with the standard set of columns on another analyzer. The resulting comparison of the reported concentrations was not even close. The new set showed a very significant negative bias when compared to the standard columns. We then repeated the initial tests required under the method, but this time they failed in a spectacular manner. In the space of a couple of days the column set had completely failed the comparison and the second round of initial performance tests; however, the daily tests and calibrations, which use different compounds, were still perfect. There was no indication of the impending failure of the column set based on the first round of initial performance testing required by the method. Had we been looking to use a performance review based on the fairly numerous initial performance tests required by the method, we would have thought we had a successful replacement and been very wrong. This is why I am always concerned with changes to the methods, however slight, without a great deal of review and comparison to the original method.
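A paired analyzer-to-analyzer comparison like the one described above reduces to a simple mean percent bias calculation. The concentrations below are hypothetical, chosen only to illustrate a negative bias of roughly the 20% magnitude mentioned earlier; nothing here comes from the actual study or our internal data.

```python
def percent_bias(reference, candidate):
    """Mean percent bias of candidate-column results relative to
    reference-column results, paired sample by sample."""
    diffs = [(c - r) / r * 100.0 for r, c in zip(reference, candidate)]
    return sum(diffs) / len(diffs)

# Hypothetical paired concentrations (ppm) from the two analyzers.
ref_columns = [120.0, 95.0, 210.0]   # standard column set
new_columns = [96.0, 74.0, 170.0]    # candidate column set
print(round(percent_bias(ref_columns, new_columns), 1))  # negative bias
```

A column set can pass every initial performance test in a method and still show a bias like this against real samples, which is why the side-by-side comparison was worth running.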

 

Another caveat which I find a little disturbing is when a laboratory denies any liability for anything relating to the use of anything, including results, contained in its report. Thus, if a laboratory provides a report that someone needs to use to determine compliance or even engineering, the use of those results is solely at the liability of someone else. If a modification made by the laboratory lost a significant portion of the reported sample and was later discovered, the laboratory would bear no liability for any damages, such as fines or improper engineering. I imagine that none of the other groups involved in such a project would want to take that liability either, but someone will eventually have to pay the price for the difference when it is discovered.

 

I believe every report should have all of the data required to allow anyone to go from the raw area counts to the final reported concentrations. This transparency helps everyone who may be involved understand the report and the level of accuracy. It is also why I never want to normalize a sample in a report. If the totals do not add to exactly 100%, that shows the uncertainty, but if everything looks perfect there is no way to grasp that uncertainty. I also believe everything of potential impact should be in the comments of the report. There may need to be an expanded explanation later, but if it is listed there is no concern if a question involving that situation arises later.
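As a minimal sketch of why I avoid normalization: the component names and concentrations below are hypothetical, but they show how forcing the components to total exactly 100% erases the very shortfall that signals the uncertainty.

```python
def report(concentrations):
    """Report measured concentrations as-is; the distance of the
    total from 100% is a visible indicator of overall uncertainty."""
    total = sum(concentrations.values())
    return {"components": concentrations, "total_pct": round(total, 2)}

def normalized(concentrations):
    """Scale the components to sum to exactly 100%, hiding that indicator."""
    total = sum(concentrations.values())
    return {name: value / total * 100.0 for name, value in concentrations.items()}

measured = {"CH4": 61.2, "CO2": 35.1, "N2": 2.4}  # hypothetical gas sample
print(report(measured))       # total is 98.7% -- the shortfall is informative
print(normalized(measured))   # sums to exactly 100% -- looks perfect
```

The as-reported version lets a reviewer see the 1.3% shortfall and judge it; the normalized version offers nothing to judge.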

 

I had hoped the accreditation of laboratories would have made this concern over modifications and limited information on the analysis moot. The increased costs, the limited scope in many areas, and the vast knowledge required of an auditor to know what is and is not a potential problem have caused accreditation to have less of a positive impact than I would have liked. That leaves the regulators, testers, their clients, and anyone else who needs to use the reports of the various laboratories performing the work in a position where they may be asked to trust those reports based solely on faith rather than supporting science. We must have faith in the people doing the sampling and the analysis, but confirmation of that faith is also necessary for the long term.

 

Wayne Stollings

Triangle Environmental Services, Inc.

 

Wstollings@aol.com

 

P.O. Box 13294, Research Triangle Park, NC 27709

122 US Hwy 70 E, Hillsborough, NC 27278

 

(919) 361-2890 | (800) 367-4862 | Fax: (919) 361-3474