May 2004

Forensic Crime Labs: Scrutinizing Results, Audits & Accreditation -- Part 2
By Frederic Whitehurst

13. Raw data for the complete measurement sequence (opening and closing quality control included) that includes the subject samples. For GC-MS analysis, this would include: areas and retention times, injection volumes, dilution factors, chromatograms and mass spectra. As prepared and as determined values for all quality control samples.


Modern analytical chemistry instrumentation is generally run by computers, and data is collected in those computers. There is no reason to accept a claim that forensic crime labs lack the capacity to store that data. Paper printouts of mass spectral and other instrumental data represent data which may very well have been manipulated with computer algorithms to take out “noise,” smooth the graph, remove information, and so on. Counsel should demand the raw, unmanipulated data so that review addresses the actual information collected rather than the forensic lab’s version of it.
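To make the manipulation concern concrete, the sketch below (in Python, purely illustrative; the detector readings and the three-point moving average are my own assumptions, not any vendor’s actual algorithm) shows how even a mild smoothing pass changes the numbers that end up on a printout:

```python
# Illustrative sketch: how a simple smoothing algorithm changes "raw" instrument data.
# The signal values and the 3-point moving average are hypothetical; real vendor
# software may apply far more aggressive noise reduction before printing a chromatogram.

def moving_average(signal, window=3):
    """Return a smoothed copy of the signal using a centered moving average."""
    half = window // 2
    smoothed = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        smoothed.append(sum(signal[lo:hi]) / (hi - lo))
    return smoothed

raw = [2.0, 2.1, 9.8, 2.2, 2.0, 2.3, 5.1, 2.1]   # hypothetical detector readings
print("raw:     ", raw)
print("smoothed:", [round(x, 2) for x in moving_average(raw)])
# The small peak near the end shrinks and the noise spikes blur together,
# a change a reviewer cannot detect from a paper printout alone.
```

The point is not that smoothing is inherently improper, only that a paper chromatogram and the raw data file are not the same evidence.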

14. A description of the library used for spectral matches for the purpose of qualitative identification of controlled substances, including source(s) and number of reference spectra.

Once data has been collected, it is used to identify materials through comparison with previously collected data for many materials. These comparisons are performed by computers, which objectively rank the best choices among the many possibilities found in spectral libraries. James W. Shellow, in “The End of a Confidence Game, A Possible Defense to the Impossible Drug Prosecution,”26 provides a description of these libraries:

Forensic laboratories now possess infrared and mass spectrometers which contain computers with spectral libraries. These computers are programmed so that with the push of a button they will automatically search spectral data bases and compare the spectrum of the suspect drug with thousands of reference spectra.


Comprehensive research on the computerized searching of spectra libraries was undertaken by Professor F.W. McLafferty and his colleagues at Cornell University. These or similar automated comparison algorithms are incorporated in both infrared and mass spectrometers. If such procedures are used, the computer will generate a list of compounds whose spectra are most similar to that of the suspected drug. The computer will rank order these spectra and attach to each an arithmetic assessment of closeness of fit; this allows an objective estimate of the probability that the examined specimen is the suspected illegal drug.

When the computer does not match the data to the material the analyst believes is present, a favorite argument of forensic scientists is that computers cannot be trusted as well as human beings can. The computer algorithm that matches spectra may give a list of the five or ten materials that are the best fits. A reviewer may very well find that although the computer matches the unknown to substance A, the forensic scientist picks another choice from the list. That disregard for the output of the computer algorithm provides fruitful ground for cross-examination. One would ask what specific reasons the examiner had for ignoring the “best match.” There will generally be no sound answer other than “based on my experience.”

One way to keep a large spectral library from designating a “wrong” material as the best fit for the data is to create a small library populated by very different types of materials, thereby forcing a fit with the desired match. Counsel should ask how the spectral library was populated. An example would be a library populated only by drugs. The analyst who has been told by the law enforcement officer who collected the evidence that it must be cocaine can use that small library of spectra and get a match. However, if one uses the full spectral libraries available from instrument manufacturers and usually purchased by forensic labs, data from other materials may match the experimental data better, raising reasonable doubt.
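The following sketch illustrates, under stated assumptions, how a ranked library search works and why the composition of the library matters. The “spectra,” substance names, and cosine-similarity scoring are hypothetical stand-ins for the far more sophisticated commercial search algorithms described above:

```python
import math

# Hypothetical "spectra": intensity values at a fixed set of m/z positions.
# Real library spectra contain hundreds of peaks; this only shows the ranking idea.
library = {
    "cocaine (ref)":   [0.1, 0.9, 0.3, 0.0, 0.7],
    "lidocaine (ref)": [0.2, 0.8, 0.4, 0.1, 0.6],
    "sugar (ref)":     [0.9, 0.1, 0.0, 0.8, 0.1],
}

def cosine_similarity(a, b):
    """Arithmetic assessment of closeness of fit between two spectra."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def rank_matches(unknown, lib):
    """Return library entries ranked by closeness of fit to the unknown spectrum."""
    scores = [(name, cosine_similarity(unknown, ref)) for name, ref in lib.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

unknown = [0.2, 0.85, 0.35, 0.05, 0.65]   # hypothetical casework spectrum
for name, score in rank_matches(unknown, library):
    print(f"{name:18s} fit = {score:.3f}")

# Remove the non-drug entries and whatever remains becomes the "best match",
# which is why the size and composition of the library matter.
```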

15. Copy of records documenting computation of illicit drug laboratory’s theoretical production yield, including the basis for the computation, and the algorithm used, as appropriate.

Suspected illicit drug labs are found with precursor materials. From the amounts of those precursors, forensic lab examiners determine theoretical yields of final product. Counsel reviewing these opinions must have copies of the computer programs that calculate those yields, and must have the process of determining the yields spelled out completely and grounded in the available scientific literature, including information on the proposed synthetic routes. The fruit of this tree may very well be that the suggested yields depend upon conditions of synthesis which the alleged illicit lab never applied.
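As a purely illustrative sketch of how such a computation typically reduces to a few assumptions, consider the following. The precursor mass, molecular weights, and efficiency figures are placeholders, not values for any real synthesis; the point is that the assumed efficiency drives the answer:

```python
# Illustrative sketch of a theoretical-yield computation.
# All numbers are placeholders; the answer is driven by the assumed reaction
# efficiency, which depends on synthesis conditions the alleged lab may never
# have achieved.

def theoretical_yield(precursor_grams, precursor_mw, product_mw, assumed_efficiency):
    """Grams of product predicted from a given mass of precursor."""
    moles_precursor = precursor_grams / precursor_mw
    return moles_precursor * product_mw * assumed_efficiency

grams = 500.0   # seized precursor mass, hypothetical
for eff in (0.9, 0.6, 0.3):
    est = theoretical_yield(grams, precursor_mw=200.0, product_mw=180.0,
                            assumed_efficiency=eff)
    print(f"assumed efficiency {eff:.0%}: {est:.0f} g of product")
# The same seizure supports very different "yields" depending on the assumption.
```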

16. Procedure(s) for operation and calibration checks of analytical balances used to weigh controlled substances.

As simple as the concept of weighing evidence may seem, an analytical balance can be operated improperly and data from the balance misinterpreted. Where the amount of illicit material seized can be a huge factor in sentencing, one needs to know whether the analyst truly knows how to use an analytical balance.

17. Results of calibration checks and documentation of mass traceability for gravimetric determinations.

Does the balance used to measure weight actually function correctly, or is it poorly calibrated or even broken? How does the analyst know? When an analytical balance is not working correctly, the fallout can affect many analyses. Not only will evidence be misweighed, but standards will be prepared incorrectly and data interpreted without a proper foundation.
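A minimal sketch of what a documented calibration check amounts to appears below. The certified check-weight values and tolerances are hypothetical; a real laboratory’s tolerances come from its quality manual and the balance manufacturer:

```python
# Illustrative sketch of a balance calibration check against certified check weights.
# Certified values and tolerances below are hypothetical.

CHECK_WEIGHTS = [  # (certified mass in g, observed reading in g, tolerance in g)
    (1.0000, 1.0002, 0.0005),
    (10.000, 10.004, 0.002),
    (100.00, 100.01, 0.01),
]

def balance_in_tolerance(checks):
    """Return an overall pass/fail flag plus the individual check results."""
    results = []
    ok = True
    for certified, observed, tolerance in checks:
        deviation = abs(observed - certified)
        passed = deviation <= tolerance
        ok = ok and passed
        results.append((certified, deviation, passed))
    return ok, results

ok, results = balance_in_tolerance(CHECK_WEIGHTS)
for certified, deviation, passed in results:
    print(f"{certified:>8} g  deviation {deviation:.4f} g  {'PASS' if passed else 'FAIL'}")
print("Balance in tolerance:", ok)
# In this invented example the 10 g check fails, so every weight reported
# while that balance was in use is open to question.
```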

18. Results of contamination control surveys for trace level analytes relevant to test methods at the time of analysis, including sampling design and analytical procedures.

If one is searching for nitroglycerine in a testing laboratory that tests for the presence of nitroglycerine on a regular basis, what may be found is not nitroglycerine that was on the evidence before it reached the lab but nitroglycerine that contaminated the evidence while it was processed in the lab.


If one is searching for the presence of cocaine residue on an item, the residue found may have been deposited on the evidence by contamination from the lab conducting the analysis.


Forensic crime labs have ignored concerns about contamination in the past. Materials that cannot be seen with the naked eye can still be detected and taken to indicate culpability. Unseen and unnoticed, these materials can spread like a common cold, undetected until sensitive instrumentation is used.


Imagine an individual who investigates a methamphetamine lab, finds a lot of white powder, and gets some of that powder on his hands, shoes, and clothes. He returns to the crime lab, gets out of a vehicle which he has contaminated, opens a door to the lab, contaminating the door handle, uses a phone which he contaminates, sits in a chair which he contaminates, shakes hands with friends whom he contaminates, and uses equipment which he contaminates, all without realizing the contamination vectors for which he is responsible. Other lab employees open the door, use the phone, shake hands, sit in that chair, and use the equipment, and soon the lab has a background level of methamphetamine on many surfaces. Evidence from another case is submitted and processed on a contaminated table, and methamphetamine is detected when that evidence is processed. The methamphetamine originated not from the evidence itself but from the lab. Unless labs conduct regular audits for contamination, examiners cannot say whether residue detected originated from the lab or from the evidence.
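The sketch below shows, under assumed numbers, the kind of comparison a contamination-control survey makes possible. The wipe-survey values and the mean-plus-three-standard-deviations threshold are illustrative choices only:

```python
import statistics

# Illustrative sketch: is a detected trace level distinguishable from the lab's
# own background? Wipe-survey values (ng per swab) and the 3-sigma threshold
# are hypothetical.

background_wipes = [0.0, 0.4, 0.2, 0.8, 0.3, 0.5, 0.6, 0.2]   # routine surface surveys
detected_on_evidence = 1.1                                     # amount reported in the case

mean = statistics.mean(background_wipes)
sd = statistics.stdev(background_wipes)
threshold = mean + 3 * sd

print(f"lab background: mean {mean:.2f}, sd {sd:.2f}, threshold {threshold:.2f}")
if detected_on_evidence <= threshold:
    print("Result is indistinguishable from lab background contamination.")
else:
    print("Result exceeds the documented background level.")
# Without a background survey there is no threshold to compare against at all.
```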

19. Records and results of internal reviews of subject data.

Reports are peer reviewed in forensic labs. The results of those reviews should themselves be reviewed. Very possibly, wording used by the examiner was changed during the review. These changes may be very fruitful areas to explore during cross-examination.

20. Method validation records documenting the laboratory’s performance characteristics for qualitative identification and quantitative determinations of the controlled substance, to include data documenting specificity, accuracy, precision, linearity, and method detection limits.

Just because I give you an answer, that does not mean the answer is correct. Of course you recognize that. The fact that a scientist uses equipment with large names and complicated electronics to find an answer to a problem does not mean that the answer is correct. The process by which a scientist determines whether the method he is using gives correct answers is protocol validation. Protocols and methods must be validated, tested. Otherwise the scientist is simply collecting data. Validation studies for many of the materials analyzed by forensic labs have never been conducted. Dr. Maureen Bradley, an FBI Lab paint analyst, testified in a civil deposition27:

Q. Okay. Now let’s assume you’ve run all six tests, all the tests you’ve described, and every one come...is consistent with, that you’ve seen consistent. At that point, how many false negatives would show up.
A. I’m not certain I understand what you mean by false negatives.

Q. Okay. In other words, excuse me, how many false positives would show up? In other words, you run through your analysis and it’s come up comparable, comparable, comparable. You’re done with it right. But what’s the percentage that the two paints were not a match? In other words, there’s some other explanation to why the two paints came up consistent with each other, such as two people had the same, went to the store and purchased the same paint and the same manufacturer at roughly the same time. Do you know what the percentage of false positives?
A. No, I don’t know.

Q. Do you know if the laboratory has undergone an analysis of the percentage of false positives?
A. No I don’t.

Q. Do you know what the percent of reliability, without knowing the percent of false positives, do you know how reliable your testimony is. . . .”


What one may infer here is that if Dr. Bradley, a Ph.D. chemist qualified by the FBI crime lab in the forensic analysis of paints, does not know the reliability of her opinions concerning paints, then very probably no research has been conducted by the FBI concerning the validity of its forensic paint analysis program. One must wonder, if the examiner does not know how many times she is wrong, when she is wrong, or whether she is wrong, how is forensic paint analysis evidence relevant or probative to a finding of guilt? If under Daubert the trial judge must determine whether the method has a known error rate, and even the analyst does not know that error rate, then no jury viewing this evidence will be in any position to decide how much weight to assign to that testimony.
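To give a sense of what the requested validation records should contain, the sketch below computes three of the figures of merit named in item 20: linearity, precision, and a method detection limit. All of the data points are invented for illustration, and the three-times-standard-deviation detection limit is only one common convention:

```python
import statistics

# Illustrative sketch of basic method-validation figures of merit.
# All data points are invented; a real validation study uses many more
# concentration levels and replicates.

# Linearity: instrument response at known concentrations (calibration curve).
concentrations = [1.0, 2.0, 5.0, 10.0, 20.0]
responses      = [10.2, 19.8, 50.5, 99.1, 201.0]

mean_x = statistics.mean(concentrations)
mean_y = statistics.mean(responses)
sxx = sum((x - mean_x) ** 2 for x in concentrations)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(concentrations, responses))
slope = sxy / sxx
intercept = mean_y - slope * mean_x
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(concentrations, responses))
ss_tot = sum((y - mean_y) ** 2 for y in responses)
r_squared = 1 - ss_res / ss_tot

# Precision: relative standard deviation of replicate measurements of one sample.
replicates = [4.9, 5.1, 5.0, 5.2, 4.8]
rsd_percent = 100 * statistics.stdev(replicates) / statistics.mean(replicates)

# Detection limit (one common convention): 3 x standard deviation of low-level replicates.
low_level_replicates = [0.11, 0.09, 0.12, 0.10, 0.08]
mdl = 3 * statistics.stdev(low_level_replicates)

print(f"linearity R^2 = {r_squared:.4f}")
print(f"precision     = {rsd_percent:.1f}% RSD")
print(f"detection lim = {mdl:.3f} (same units as the replicates)")
```

A lab that cannot produce numbers of this kind for the substance at issue has, in effect, never measured how often its method is wrong.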
 
21. Copy of the laboratory’s Quality Manual in effect at the time the subject samples were tested as well as the laboratory’s most recent Quality Manual (however named; the document that describes the laboratory’s quality objectives and policies).

The Quality Assurance manual outlines the methods a laboratory must have in place to determine the quality of its work product. Counsel required to review crime laboratory work product must have a copy of the quality manual that was in place at the time the evidence was analyzed and of the manual in place at the time of court. These manuals will be a road map to ferreting out issues in the lab work product. A comparison of the two manuals, the one used when the evidence was analyzed and the one in effect when testimony is presented by the government, may also show differences which reflect past discovery by crime lab managers of problems within the lab...problems which might affect a case at court.

22. Copy of the laboratory’s ASCLD-LAB application for accreditation, and most recent Annual Accreditation Review Report, as appropriate.

U.S. crime labs can be accredited by the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD-LAB). As the ASCLD name implies, crime laboratory directors review each other’s labs. At present the accreditation process is viewed as proof that the crime lab is producing valid work product. That is not necessarily true. The ASCLD-LAB process simply declares that a crime lab has in place the documentation required for an audit to be conducted successfully. Virtually no crime labs have been fully audited to date. Crime labs have simply refused to allow themselves to be subjected to audit. The usual excuse is that crime labs do what no one else does or can do and therefore must audit themselves. That excuse flies in the face of the scientific method and renders the labs that use it useless. Science is open. Science continually reviews the “truths” of yesterday. Science that exists in secret degrades. The 2003 US DOJ IG inspection of the FBI crime lab’s DNA analysis program28 shows us that ASCLD-LAB accreditation does not ensure that a crime lab is producing valid work product. The FBI was last accredited by ASCLD-LAB in 1997. Jacquelyn Blake, the FBI DNA technician who was found in 2002 to have failed to follow required scientific procedures while analyzing 103 DNA samples, was hired by the FBI after the 1997 ASCLD-LAB inspection and accreditation, produced her flawed work product undetected for a number of months, and resigned while under investigation as a result of that poor work product. It becomes very obvious that the ASCLD-LAB accreditation process does not ensure quality work product. Only a thorough audit will ensure that work product. The ASCLD-LAB accreditation process, however, is very valuable in that it means that the laboratory is finally saying publicly that it is prepared for an audit. Counsel now need only conduct that audit. Recalling that the North Carolina crime lab’s Dr. Waggoner did not even know whether the instruments used to analyze for the presence of cocaine were functioning correctly leads one to realize that an audit is necessary in every case involving forensic science.

23. Statement of qualifications of each analyst and/or technician responsible for processing case samples to include all names, locations and jurisdictions of cases in which these personnel testified concerning the same substances found in the present case.

Professor James Starrs, in his article Mountebanks Among Forensic Scientists,29 recounts numerous incidents of forensic scientists who overstated their credentials and lied during testimony about tests performed. Among the individuals Starrs describes are the following:
David Bruce Tredwell passed himself off as having obtained bachelor of science and doctorate degrees in geology as well as having been a staff geologist and laboratory manager for a National Aeronautics and Space Administration Laboratory, all of which testimony was false. Tredwell found himself testifying in the Love Canal environmental pollution litigation before the government moved to strike his testimony.


Supervisory Special Agent Thomas N. Curran, FBI Laboratory, was described in a 1979 Maine Supreme Judicial Court decision as lying under oath regarding tests he had conducted for the FBI Laboratory and reporting results of lab tests that he did not in fact conduct. In a rape and murder trial in 1974 in the District of Columbia, Curran testified to having a bachelor’s degree and a master’s degree in science when he had never acquired a graduate degree.


And in an FBI report from Jay Cochran to White, the FBI Laboratory Director, dated February 5, 1975, Cochran wrote of Curran that “he chose to ignore the virtue of integrity and to lie when asked if specific tests were conducted.”

Richard Zielinski of the Toledo, Ohio Police Department Crime Laboratory misstated his academic credentials, claiming a bachelor’s degree in pharmacy which he did not have. He apparently never attended the FBI Academy school on handwriting as he claimed, was not familiar with infrared spectrometry, and never sent bullets to the FBI laboratory as he said he did. Zielinski also represented himself as having performed laboratory tests that he did not perform.

As we can see from the examples given above by Professor Starrs, counsel should never simply accept the word of experts concerning their credentials. One should demand that the agency presenting the expert vouch for that expert’s credentials. An expert who is found to have lied about his credentials should immediately be disqualified from presenting any further opinions to the trier of fact.


The US DOJ IG investigation of the FBI crime lab found that “experts” from that crime lab were testifying outside their areas of expertise. Establishing that an “expert” who is testifying about chemical analysis has no scientific education, experience, or training in chemical analysis can lend powerful weight to the argument that the witness is not an expert at all. This may seem ridiculously obvious, but numerous counsel in the past have simply accepted credentials or not even asked for them.

24. Copy of the laboratory’s ASCLD-LAB on-site inspection report, as appropriate, as well as any reports of on-site inspections by any other testing laboratory audit organization.

Forensic crime labs that have been accredited by ASCLD-LAB refuse to give up the results of the inspections. ASCLD-LAB now refuses even to describe the inspection process and will never provide any information concerning the failures of crime labs. The accreditation process is voluntary, and ASCLD-LAB would not be allowed to inspect the labs if the labs knew that the results of the inspections would become common knowledge. That the labs refuse to provide any indication of their flaws and failure rates indicates that they have something to hide and are not, indeed, even scientific laboratories. Triers of fact will not be blind to that fact. The labs, by refusing to discuss failure rates, raise the level of reasonable doubt.

25. Copy of internal audit reports generated during the period subject samples were tested.

As time goes on and defense counsel learn to closely question forensic work product, these internal audit reports will come under closer and closer scrutiny. What now passes for an internal audit will tomorrow be something to laugh at. It is obvious that the FBI’s internal audit process was not functioning if it allowed Jacquelyn Blake to produce flawed work product for many months before discovery. It is obvious that any internal audits in the crime labs described above have not functioned as intended. Internal audits should not be the period at the end of the sentence but simply a comma, in preparation for external, independent testing laboratory audits.
 
26. List of capital instrumentation in the laboratory at the time subject testing was performed, including manufacturer, model number, and major accessories.

Questions concerning the extent of testing of materials before opinions can be rendered may require knowledge of what kinds of equipment crime labs actually have. One lab may test for the sameness of two paint samples with three tests and the next with seven tests. The two labs generally cannot render opinions of equal weight about sameness. Also, the technician who decides not to conduct analyses for whatever reason, who engages in “protocol drift,” may be discovered once a reviewer is aware of the full range of instrumentation available.


While I was employed at the FBI crime lab, a great deal of pressure was applied to examiners to write one-liner reports so as not to wave flags at possible problems which defense experts might ferret out. Attempts to list in the laboratory report the types of instruments utilized in an analytical protocol were strongly resisted, both by examiners who were eventually found to be testifying outside their areas of expertise and by managers who knew that more complete reports would open the doors to discovery of problems within the lab.

27. Production throughput data for the drug testing section: numbers of tests performed per month or per year, and the number of Full Time Equivalent personnel in the drug testing section of the laboratory.

As has been noted earlier in this series, forensic crime labs are generally horribly understaffed and over-tasked. Following protocols takes a certain amount of time, which generally cannot be shortened without taking shortcuts and failing to follow the protocols carefully. Determining whether staff are overworked, and therefore more prone to error, gives counsel an important piece of data for understanding the weaknesses of the forensic lab work he is facing.
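A back-of-the-envelope calculation shows why throughput and staffing numbers matter. Every figure below is hypothetical; the exercise is simply to compare the hours available per test against the hours a careful protocol requires:

```python
# Illustrative back-of-the-envelope workload check. All figures are hypothetical.

tests_per_year = 12000          # from the lab's throughput data
full_time_equivalents = 4       # drug section staffing
working_hours_per_year = 1800   # rough figure per analyst after leave and training
protocol_hours_per_test = 1.0   # time a careful protocol actually requires (assumed)

available_hours = full_time_equivalents * working_hours_per_year
hours_per_test = available_hours / tests_per_year

print(f"hours available per test: {hours_per_test:.2f}")
if hours_per_test < protocol_hours_per_test:
    print("Staff cannot follow the protocol as written without cutting corners.")
else:
    print("Throughput is at least arithmetically consistent with the protocol.")
```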


Just as we as attorneys ask in-depth questions concerning the legal aspects of all facets of a criminal case, so we should be asking in-depth questions concerning forensic laboratory work product. We have many examples of forensic crime lab failures that went undetected by our justice system for years, due possibly to the misperception that these are unaddressable issues. It is both possible and absolutely necessary for counsel defending, and even prosecuting, to know whether the work product from crime labs is valid. It is obvious that because crime labs have refused to submit to outside independent review in the past, there is a concern within those labs about having their failures discovered. The crime lab that refuses to provide clear and complete answers to the questions provided above should be viewed as hiding something and therefore creating reasonable doubt concerning its own work product.

Any successful and valid scientific inquiry requires openness and in-depth peer review. That is particularly important when the work product of science is being used in our justice system.

This series has been presented as a primer that will, it is hoped, lead to more effective review of the work product presented by crime laboratories.


Notes
26. James W. Shellow, The End of a Confidence Game, A Possible Defense to the Impossible Drug Prosecution, The Champion, August/September 2000.
27. Villaneuva v. FBI, Deposition of Maureen J. Bradley, November 26, 2001.
28. Associated Press, Justice Department Broadens Probe of FBI’s DNA Lab Practices, The Daily Reflection, April 28, 2003.
29. James E. Starrs, Mountebanks Among Forensic Scientists, 2 Forensic Science Handbook 1 (1995).


