
NISTIR 8225 DRAFT

NIST Scientific Foundation Reviews

John M. Butler
Hari Iyer
Rich Press
Melissa K. Taylor
Peter M. Vallone
Sheila Willis*

*International Associate under contract; former director of Forensic Science Ireland

This publication is available free of charge from:
https://doi.org/10.6028/NIST.IR.8225-draft

NISTIR 8225 DRAFT

NIST Scientific Foundation Reviews

John M. Butler
Melissa K. Taylor
Sheila Willis*
Special Programs Office
Associate Director of Laboratory Programs

Hari Iyer
Statistical Engineering Division
Information Technology Laboratory

Peter M. Vallone
Biomolecular Measurement Division
Material Measurement Laboratory

Rich Press
Public Affairs
Director’s Office

*International Associate under contract; former director of Forensic Science Ireland

This publication is available free of charge from:
https://doi.org/10.6028/NIST.IR.8225-draft

September 2018

U.S. Department of Commerce
Wilbur L. Ross, Jr., Secretary

National Institute of Standards and Technology
Walter Copan, NIST Director and Under Secretary of Commerce for Standards and Technology

National Institute of Standards and Technology Internal Report 8225 DRAFT
(September 2018)

Acknowledgments: Input and suggestions on this document and project were made by Richard Cavanagh, Barbara Guttman, Bill MacCrehan, Kathryn Miller, Kathy Sharpless, Jack Ballantyne, Todd Bille, Jennifer Breaux, Robin Cotton, Roger Frappier, Bruce Heidebrecht, Keith Inman, Eugene Lien, Tamyra Moretti, Lisa Schiermeier-Wood, Joel Sutton, Ray Wickenheiser, and Charlotte Word.

Points of view are those of the authors and do not necessarily represent the official position or policies of the National Institute of Standards and Technology. Certain commercial entities are identified in order to specify experimental procedures as completely as possible. In no case does such identification imply a recommendation or endorsement by the National Institute of Standards and Technology, nor does it imply that any of the entities identified are necessarily the best available for the purpose.

Public comment period: September 24, 2018 through November 19, 2018

National Institute of Standards and Technology
Attn: Special Programs Office – Scientific Foundation Review
100 Bureau Drive, MS 4701
Gaithersburg, MD 20899-4701

Email: scientificfoundationreviews@nist.gov

All comments, including commenter name and affiliation, will be published at https://www.nist.gov/topics/forensic-science/draft-nistir-8225-nist-scientific-foundation-reviews.


Abstract

The National Institute of Standards and Technology (NIST) is a scientific research agency that works to advance measurement science, standards, and technology and that has been working to strengthen forensic science methods for almost a century. In recent years, several scientific advisory bodies [2-4] have expressed the need for scientific foundation reviews of forensic disciplines and identified NIST as an appropriate agency for conducting them. The purpose of a scientific foundation review is to document and consolidate the information supporting the methods used in forensic analysis and to identify knowledge gaps where they exist. In fiscal year 2018, Congress appropriated funds for NIST to conduct scientific foundation reviews ([5], p. 22). NIST has begun reviews of DNA mixture interpretation and bitemark analysis. In addition to providing insights into these specific disciplines, the initial reviews serve as pilot studies that will guide future efforts of this type. This document outlines NIST’s approach to conducting scientific foundation reviews, including data sources, evaluation criteria, and expected outputs.

Keywords

forensic science, scientific foundation review, technical merit evaluation


Table of Contents

1. What is a Scientific Foundation Review?
1.1. What Data Sources Will We Use?
1.2. How Will We Evaluate the Data?
1.3. What Information Will We Report?
2. Why NIST?
2.1. Calls for NIST to Conduct Scientific Foundation Reviews
3. Previous Efforts: A Historical Overview
3.1. Analytical Chemistry Biennial Application Reviews (1983 to 2011)
3.2. INTERPOL Literature Review
3.3. Measurement Science Workshops
3.4. National Research Council 2009 Report
3.5. White House Subcommittee on Forensic Science (SoFS)
3.6. NSF/NIJ-Funded Workshop
3.7. Insights from the National Commission on Forensic Science
3.7.1. Appropriate Scientific Literature
3.7.2. Identifying and Evaluating Literature
3.7.3. A Proposal for NIST to Perform Scientific Foundation Reviews
3.7.4. NIST Announcement at the September 2016 NCFS Meeting
3.7.5. NCFS Technical Merit Review Documents
3.8. PCAST 2016 Report
3.9. AAAS Studies
4. Other Similar International Activities
4.1. Australian NIFS Forensic Fundamentals
4.2. UK Forensic Science Regulator
5. Terminology and Concepts
6. References


1. What is a Scientific Foundation Review?

A scientific foundation review is a study that seeks to document and evaluate the foundations of a scientific discipline, that is, the trusted and established knowledge that supports and underpins the discipline’s methods. NIST is conducting scientific foundation reviews in forensic science. These reviews seek to answer the question: “What empirical data exist to support the methods that forensic science practitioners use to analyze evidence?”

The central activity of forensic science is to make associations between pieces of evidence or between evidence and known items in order to shed light on past events and actions. Forensic practitioners do this by comparing and classifying items based on selected features such as the minutiae of a fingerprint, the alleles in a DNA sample, or the toolmarks on a fired bullet.

For each forensic method studied, we will evaluate whether the selected features are characterized and measurable; to what extent the discriminating power of those features is known; and whether the factors that affect the transferability and persistence of those features are understood.

Each foundation review will differ depending on the specifics of the discipline, but all will be based on the following generalized approach.

1.1. What Data Sources Will We Use?

Because peer-reviewed publications are essential building blocks of a respected edifice of scientific knowledge, studies that address the reliability of forensic methods would ideally be present in a discipline’s published, peer-reviewed, and well-cited scientific literature. However, a focus on peer-reviewed literature alone may not provide a complete picture of a discipline’s available body of knowledge. For instance, data from laboratory validation studies may not be publicly available or published. Therefore, NIST scientific foundation reviews are designed to seek input by:

• collecting and evaluating the peer-reviewed literature
• assessing available data from interlaboratory studies, proficiency tests, and laboratory validation studies
• exploring other available information, including position statements and non-peer-reviewed literature
• obtaining input from members of the relevant community through interviews, workshops, working groups, and other formats for the open exchange of ideas and information.

Obtaining input from experts outside of NIST is an integral component of a NIST scientific foundation review. This will help ensure that these reviews capture the full breadth of knowledge that forensic practitioners and researchers consider foundational to their discipline.


1.2. How Will We Evaluate the Data?

After gathering information, we will evaluate it against the following criteria:

1) Retrievable: Does the information appear in a peer-reviewed journal or book that is indexed? Is the reference citation accessible by search engines? If not, is it published online or otherwise reasonably available for review by others?
2) Reliable: Can the information be verified against other sources? Can the reported methods do what they claim to do? Are the capabilities and limitations of the methods understood? Are the methods clearly explained so that they can be reproduced? Are experimental materials with known values used to show that accurate conclusions can be made? Are sample sizes large enough to support statistically meaningful conclusions?
3) Respected: Has the information been cited as useful by other researchers or practitioners in the scientific literature? Has it been scrutinized or reviewed by others?

Retrievability is among the criteria because transparency and openness are hallmarks of good science [6]. Therefore, we believe that for something to be considered foundational, it must be reasonably accessible to anyone who wishes to review it.

Where peer-reviewed publications are not available, transparency and accessibility can help fill the gap. For instance, publishing validation data from forensic laboratories online would allow for “open peer review” [7].

1.3. What Information Will We Report?

The outcome of each NIST scientific foundation review will be a publicly available report that may be accompanied by additional online resources. We expect that these reports will include:

1) an introduction to the issues involved
2) historical perspectives of the field and current methods in use
3) a discussion of the NIST review team’s efforts to collect and evaluate data sources, literature, and input received from experts in the field
4) a complete list of literature and other sources used
5) a discussion of our findings with regard to scientific foundations
6) key takeaways and considerations for the field

We anticipate that scientific foundation reviews will be useful in a number of ways. First, identifying those methods that are built on a solid scientific foundation will increase trust in those methods. Second, by identifying those parts of the foundation that would benefit from strengthening, a foundation review can provide strategic direction for future research efforts. Third, in an interdisciplinary environment in which legal, academic, and forensic professionals need to understand one another’s perspectives, consolidating key points and principles can promote a shared understanding of critical concepts and lead to more effective communication. Fourth, in many disciplines, hundreds of forensic science research articles are published every year, yet time to absorb and discuss those articles is limited. Identifying a

discipline’s foundational literature can help a community develop a shared understanding of core principles. In addition, establishing a comprehensive and curated canon can promote a better appreciation for the capabilities and limitations of methods, increase competency, and reduce variability across the field.
What is Science?

When conducting a scientific foundation review, it is important to define the word “science” and what attributes we consider to be “scientific.” A succinct statement usually attributed to German philosopher Immanuel Kant is that “science is organized knowledge.”1

The UK Science Council defines science as “the pursuit and application of knowledge and understanding of the natural and social world following a systematic methodology based on evidence.”2 Thus, science involves data collected, evaluated, and understood in a systematic and logical fashion.3

The UK Science Council notes some key attributes of scientific study, including:

• Repetition (a phenomenon can be demonstrated repeatedly)
• Measurements and data
• Experiments
• (Falsifiable) hypotheses
• Critical analyses that consider more than one possibility
• Verification and testing
• Exposure to scrutiny, peer review, and assessment

Several decades ago the National Academy of Sciences (NAS) prepared a report entitled On Being a Scientist that describes “the ethical foundations of scientific practices” and how these foundations “safeguard the integrity of the scientific enterprise” [1]. This NAS report emphasizes that “science is not done in isolation” and “takes place within a broad social and historical context, which gives substance, direction, and, ultimately, meaning to the work of individual scientists” [1].

On Being a Scientist notes that

“[scientists] submit their work to be examined by others with the hope that it will be accepted. This process of public, systematic skepticism is critical in science. It minimizes the influence of individual subjectivity by requiring that research results be accepted by other scientists. It also is a powerful inducement for researchers to be critical of their own conclusions, because they know that their objective must be to convince their ablest colleagues, including those with contrasting views… Publication in a scientific journal includes important aspects of quality control – particularly, critical review by peers who can detect mistakes, omissions, and alternative explanations” [1].

1 https://quoteinvestigator.com/2015/05/18/science/
2 https://sciencecouncil.org/about-science/our-definition-of-science/
3 We recognize that methods used in various forensic disciplines may have differing levels of supporting background information available. As noted in the NRC 2009 report (p. 39): “…the term ‘forensic science’ is used with regard to a broad array of activities, with the recognition that some of these activities might not have a well-developed research base, are not informed by scientific knowledge, or are not developed within the culture…”


2. Why NIST?

NIST was founded in 1901 and is one of the nation's oldest physical science laboratories. Over its long history, NIST has cultivated deep scientific expertise that cuts across a wide range of disciplines. Having that expertise without being a regulatory agency allows NIST to work closely with a broad spectrum of partners.

NIST provides industry, academia, and other government agencies with:

• Expertise in measurement science and best practices in many disciplines, including physics, chemistry, materials science, information technology, and engineering
• World-class, unique, cutting-edge research facilities
• Leadership in the development of consensus-based standards, test methods, and specifications that define technical and performance requirements

Drawing on these capabilities, its national networks, international partnerships, and relationships with industry, NIST works to address complex measurement challenges, ranging from the physical (renewable energy sources) to the virtual (cybersecurity and cloud computing), and from the fundamental (quantum measurements) to the applied (fire spread rates).

NIST has been involved in forensic science since the 1920s, when physicist Wilmer Souder conducted precision measurements to assist hundreds of investigations involving handwriting, typewriting, and ballistic examinations [8]. NIST’s direct involvement in criminal investigations ended in the 1950s, but NIST has worked since then to strengthen the measurements and technologies underpinning methods for analyzing DNA, fingerprints, firearms and toolmarks, and digital evidence, among others. In addition, NIST provides standard reference materials, including human DNA, standard bullets, and mass spectral data, to U.S. forensic laboratories to help ensure accurate and reliable measurements.

Because NIST is not directly involved in the criminal justice system, its scientists are able to offer an independent perspective on scientific matters bearing on forensic science. NIST furthered its involvement in forensic science following a 2013 Memorandum of Understanding (MOU) between NIST and the Department of Justice (DOJ). This MOU stated that “Scientifically valid and accurate forensic science strengthens all aspects of our justice system” [10]. Under this MOU, which established the National Commission on Forensic Science (NCFS) and the Organization of Scientific Area Committees for Forensic Science (OSAC), NIST had the following four responsibilities:

1) appoint a senior NIST official to serve as the Co-Chair of the Commission
2) administer and coordinate all necessary support for OSAC
3) conduct research supporting the development and dissemination of methods, standards, and technical guidance for forensic science measurements
4) test and validate select forensic science practices and standards as appropriate

NIST’s scientific foundation reviews fulfill the responsibilities outlined in the fourth element of that MOU.

In February 2014, NIST launched OSAC to support the development of documentary standards. Through OSAC, NIST convenes stakeholders and provides technical and scientific guidance and expertise to help stakeholder groups reach consensus.

2.1. Calls for NIST to Conduct Scientific Foundation Reviews

Several entities have specified the need for scientific foundation reviews. In 2009, the National Research Council published a report entitled Strengthening Forensic Science in the United States: A Path Forward, which requested “studies establishing the scientific bases demonstrating the validity of forensic methods” ([3], p. 22). More recently, the President’s Council of Advisors on Science and Technology (PCAST) [4], the National Commission on Forensic Science (NCFS) [11], and the American Association for the Advancement of Science (AAAS) [12, 13] have published recommendations encouraging further research and studies assessing the scientific foundations of forensic disciplines.

In September 2016, both PCAST and NCFS requested that NIST examine the scientific literature and conduct technical merit evaluations and validation studies of forensic science methods and practices. The NCFS recommended that the results of these technical merit evaluations “be issued by NIST as publicly available resource documents” and that “NIST’s evaluation may include but is not limited to: a) research performed by other agencies and laboratories, b) its own intramural research program, or c) research studies documented in already published scientific literature” [14]. NCFS also requested that these evaluation documents “be broadly disseminated in the scientific and criminal justice communities and accompanied by judicial trainings” [14].

During the September 12, 2016 NCFS meeting, NIST leadership announced that the agency would respond to the NCFS requests by conducting a “pilot” scientific foundation review of DNA mixture interpretation, to be followed by reviews of bitemark analysis and firearms identification [15]. At the final NCFS meeting, held on April 10, 2017, then-Acting NIST Director Kent Rochford reiterated these plans [16].

In fiscal year 2018, Congress appropriated funding for NIST to conduct “technical merit evaluations.” NIST scientific foundation reviews are intended to fulfill this mandate. The first NIST scientific foundation review, a study on DNA mixture interpretation, began in September 2017. A review of bitemark analysis began later that year.

3. Previous Efforts: A Historical Overview

Our approach to the review of forensic science literature builds upon previous efforts and experiences, which are summarized below. These activities, which have often been conducted independently of other ongoing or previous efforts, include literature reviews, input from advisory groups, and workshops. Many of these previous efforts have been prospective (i.e., looking to where the field needs to go) rather than introspective (i.e., reflecting on the


foundations and support that exist for specific methods). An important goal of our NIST scientific foundation reviews is to consider, compile, and integrate information from previous efforts.

3.1. Analytical Chemistry Biennial Application Reviews (1983 to 2011)

Over the years, a number of literature summaries have been compiled to reflect the range of topics published in forensic science. For three decades, the journal Analytical Chemistry published a brief review of activities focused primarily on three areas: drugs and poisons; forensic biochemistry; and trace evidence. In the 15 review articles published in alternate years between 1983 and 2011, a total of 9263 publications were reviewed, of which 1565 were articles related to DNA methods [17].

These Analytical Chemistry application reviews surveyed articles published in the Journal of Forensic Sciences, Science and Justice, Forensic Science International, Forensic Science International: Genetics, Journal of the Canadian Society of Forensic Science, Journal of Forensic Identification, Forensic Science Review, Analytical Toxicology, The Microscope, and Chemical Abstracts. While each of these reviews provided a useful summary of the breadth of information published in the previous two years, there was no attempt to assess the quality of the publications or to prioritize them in any way. Moreover, as noted previously [17], these reviews were methods-focused to enable readers to find information that might aid forensic laboratory work.

3.2. INTERPOL Literature Review

The International Forensic Science Managers Symposium provides another approach to gathering and discussing forensic science literature. Experts from around the world speak at this symposium, which is held every three years at INTERPOL headquarters in Lyon, France. As part of this gathering, a summary of the published literature from the previous three years is organized into a review article. The approach taken for each discipline varies, and the number of publications examined, summarized, and reported on can range from a few dozen to over a thousand.

The 2010-2013 literature summary contains 4832 references from the following disciplines (with the number of listed references in parentheses): firearms (159), gunshot residue (49), toolmarks (189), paint (201), fibers and textiles (68), forensic geology (102), arson and fire debris analysis (140), explosives and explosive residues (1341), drug evidence (668), toxicology (324), forensic audio analysis (133), forensic video analysis (31), imaging (256), digital evidence (190), fingermarks and other impressions (472), body fluid identification and DNA typing in forensic biology (114), questioned documents (275), and forensic science management (120). These compiled literature summaries can be accessed on the INTERPOL website at https://www.interpol.int/content/download/21910/206602/version/1/file/IFSMSReviewPapers2013.pdf. Authors of these forensic discipline summaries come from Australia, Belgium, Canada, Finland, France, Hong Kong, Israel, Japan, The Netherlands, Switzerland, the United Kingdom, and the United States (see [17]).


The 2013-2016 literature summary contains 4891 references from the following disciplines: firearms (179), gunshot residue (77), toolmarks (104), paint and glass (102), fibers and textiles (92), forensic geosciences (245), fire investigation and debris analysis (194), explosives (646), drugs (1434), toxicology (600), audio analysis (88), video and imaging (108), digital evidence (100), fingermarks and other impressions (536), DNA and biological evidence (75), questioned documents (255), and forensic science management (56). These literature summaries are available at https://www.interpol.int/content/download/33314/426506/version/1/file/INTERPOL%2018th%20IFSMS%20Review%20Papers.pdf.

The reviews on DNA cover only 114 articles from 2010 to 2013 and 75 articles from 2013 to 2016. These reviews are minimal in nature, typically consist of a summary listing of the material, and focus on topics of interest to the authors rather than attempting to be comprehensive. For example, among the 75 articles discussed in the 2013 to 2016 review, selected topics include rapid DNA analysis (11 references), analysis of complex DNA profiles including mixtures and low-template DNA (4 references), and the development of next-generation sequencing and its application to DNA phenotyping (60 references). Many more references on DNA mixture interpretation published during this period were not covered, which points to the challenge any literature review faces in being both effective and thorough.

3.3. Measurement Science Workshops

Since 2012, NIST has conducted or sponsored a number of measurement science workshops to assist the transition of research in specific forensic fields into more effective practice. These workshops are typically webcast from the NIST campus in Gaithersburg, Maryland. Topics have included firearms analysis, DNA mixture interpretation, emerging trends in synthetic drugs, handwriting analysis, cloud computing, mobile forensics, probabilistic genotyping, validation, and forensic science error management (e.g., see Table 7 in [17]).1 Forensics@NIST conferences have been held biennially since 2012 to share research conducted at NIST with the forensic community.

1 An updated list of past NIST events is available at https://www.nist.gov/topics/forensic-science/conferences-and-events.

3.4. National Research Council 2009 Report

In November 2005, the United States Congress authorized the National Academy of Sciences (NAS) to conduct a study on forensic science [3]. From January 2007 to November 2008, a 17-member committee met eight times, heard from 70 presenters, and discussed the information received. In February 2009, the National Research Council (NRC) arm of the NAS issued a 352-page report entitled Strengthening Forensic Science in the United States: A Path Forward.

This 2009 NRC report, often referred to in forensic circles as “the NAS report,” proposed 13 recommendations to improve forensic science in the United States. Recommendation #3 emphasized the need for research “to address issues of accuracy, reliability, and validity in the forensic science disciplines” and encouraged funding of peer-reviewed research involving “(a) studies establishing the scientific bases demonstrating the validity of forensic methods, (b) the development and establishment of quantifiable measure of the reliability and accuracy of forensic analyses [that can be expected as forensic evidence conditions vary]…, (c) the development of quantifiable measures of uncertainty in the conclusions of forensic analyses, and (d) automated techniques capable of enhancing forensic technologies” ([3], pp. 22-23). This recommendation also emphasized that research results should be published in respected scientific journals.

443 In this report, nuclear DNA testing from single-source, high-quality samples are given high
444 marks with statements like: “Among existing forensic methods, only nuclear DNA analysis
445 has been rigorously shown to have the capacity to consistently, and with a high degree of
446 certainty, demonstrate a connection between an evidentiary sample and a specific individual
447 or source” ([3], p. 100; see also p. 7). The 2009 NRC report does not discuss DNA mixture
448 interpretation beyond a brief mention on page 100: “There may be problems in a particular
449 case with how the DNA was collected, examined in the laboratory, or interpreted, such as
450 when there are mixed samples, limited amounts of DNA, or biases due to the statistical
451 interpretation of data from partial profiles.”
452
453 The 2009 NRC assessment of bitemark analysis notes on page 176: “Despite the inherent
454 weaknesses involved in bitemark comparison, it is reasonable to assume that the process can
455 sometimes reliably exclude suspects.” However, “[t]he committee received no evidence of an
456 existing scientific basis for identifying an individual to the exclusion of all others.” They
457 emphasized that “[s]ome research is warranted in order to identify the circumstances within which
458 the methods of forensic odontology can provide probative value.”
459
460 The report opines that “the interpretation of forensic evidence is not always based on
461 scientific studies to determine its validity” and that “a body of research is required to
462 establish the limits and measures of performance and to address the impact of sources of
463 variability and potential bias” (p. 8).
464
465 3.5. White House Subcommittee on Forensic Science (SoFS)
466
467 From July 2009 to December 2012, the White House Office of Science and Technology
468 Policy (OSTP) established a federal government effort – a Subcommittee on Forensic
469 Science (SoFS) under the National Science and Technology Council (NSTC) – to work
470 towards potential solutions that could help address the 2009 NRC report recommendations.
471 For a brief timeline of recent U.S. efforts to strengthen forensic science, see Ref. [17].
472
473 Five interagency working groups (IWGs) met almost monthly while the SoFS was in
474 existence. The IWG activities involved nearly 200 subject
475 matter experts from 23 Federal departments and agencies as well as 49 participants
476 representing state and local forensic laboratories. One of these IWGs covered Research,
477 Development, Testing, and Evaluation (RDT&E) and as part of their work wrote to the then-
478 existing Scientific Working Groups (SWGs) to request information on literature supporting
479 the scientific foundations of their disciplines.

480
481 In 2011 and 2012, annotated bibliographies were provided to the SoFS RDT&E IWG for 10
482 forensic disciplines in response to questions raised (see Ref. [17]). The forensic disciplines
483 represented include: (1) firearms and toolmarks, (2) bloodstain pattern analysis, (3) bitemark
484 (odontology) analysis, (4) fiber analysis, (5) shoeprint and tire tread, (6) latent print analysis,
485 (7) arson investigation and burn pattern analysis, (8) digital evidence, (9) hair analysis, and
486 (10) paints and other coatings. Links to these bibliographies can be found at
487 https://www.nist.gov/topics/forensic-science/working-groups/legacy-scientific-working-
488 groups.
489
490 For example, with firearms and toolmarks, SWGGUN and AFTE (Association of Firearms
491 and Toolmark Examiners) prepared a 94-page response to 25 questions on the foundations of
492 their field [18]. These questions ranged from “What literature documents the scientific
493 domains used to inform the foundations of firearm/toolmark analysis?” to “What statistical
494 research has been conducted and applied to firearm and toolmark examinations? What
495 statistical models for firearms and toolmarks have been published?”
496
497 Some of the bibliographies provided contained only meeting presentation abstracts to address
498 the foundational questions raised. The limited responses obtained by the SoFS RDT&E
499 IWG on some of the foundational questions ultimately led to the National Commission on
500 Forensic Science (NCFS) position statements on scientific literature, the request for NIST to
501 perform technical merit reviews, and the American Association for the Advancement of
502 Science (AAAS) studies described below.
503
504 3.6. NSF/NIJ-Funded Workshop
505
506 In May 2015, a workshop was held at the American Association for the Advancement of
507 Science (AAAS) in Washington, D.C. that was co-funded by the National Science
508 Foundation (NSF) and the National Institute of Justice (NIJ) [19]. This workshop, entitled
509 “Forensic Science Research and Evaluation Workshop: A Discussion on the Fundamentals of
510 Research Design and an Evaluation of Available Literature,” brought together 17 experts to
511 cover topics of experimental design and statistics, interpretation and assessment, and policy
512 implications regarding scientific foundations of forensic disciplines. Each participant
513 submitted a short essay on their presented topic at the workshop. An important output from
514 this workshop was a 122-page report, which is available from NIJ [20].
515
516 One purpose of this workshop was to inform AAAS regarding approaches to examining the
517 literature for foundational studies on selected forensic disciplines (see below). Concurrent
518 with the AAAS forensic science assessments that began in 2015, the Department of Justice
519 and NIST had begun discussing many of these issues and had begun working on other efforts
520 to strengthen forensic science via the National Commission on Forensic Science.
521
522 3.7. Insights from the National Commission on Forensic Science
523
524 The National Commission on Forensic Science (NCFS), which served as a Federal Advisory
525 Committee to the U.S. Department of Justice from 2013 to 2017, held 13 meetings and

526 approved 43 work products. The adopted work products were either Recommendations to the
527 Attorney General [14] or Views of the Commission [21]. There were seven subcommittees
528 that prepared and presented the work products to the full NCFS.
529
530 The Scientific Inquiry and Research Subcommittee drafted and championed five documents
531 that were approved by NCFS:
532
533 1) Views on Scientific Literature in Support of Forensic Science and Practice (January
534 2015) [22]
535 2) Recommendation to Fund Post-Doctoral Projects to Facilitate Translation of Research
536 into Forensic Science Practice (March 2016) [23]
537 3) Views on Identifying and Evaluating Literature that Supports the Basic Principles of a
538 Forensic Science Method or Forensic Science Discipline (March 2016) [24]
539 4) Views on Technical Merit Evaluation of Forensic Science Methods and Practices
540 (June 2016) [25]
541 5) Recommendation on Technical Merit Evaluation of Forensic Science Methods and
542 Practice (September 2016) [26]
543
544 Four of these documents (the only exception being the one on funding post-doctoral projects)
545 apply directly to scientific foundation reviews. Each of these four will be discussed further.
546
547 3.7.1. Appropriate Scientific Literature
548
549 Some members of the NCFS Scientific Inquiry and Research Subcommittee had been part of
550 the SoFS RDT&E IWG and were familiar with the submissions made a few years before in
551 response to inquiries about foundational literature. As stated in the January 2015 Views
552 document: “A cursory review of the literature citations raised concerns within the NCFS that
553 extend beyond these specific bibliographies: (1) In some cases, it was unclear which
554 literature citations are crucial to support the foundation of a particular forensic science
555 discipline. (2) Some of the cited literature had not undergone a rigorous peer-review
556 process.”
557
558 These observations fueled a desire to describe what constitutes appropriate scientific
559 literature to support methods used in forensic practice. The NCFS states: “The goal of this
560 [January 2015] Views document is to provide the framework necessary to address these and
561 broader concerns regarding the status of the scientific foundation of forensic science across
562 its many disciplines and practices.”
563
564 In January 2015, the NCFS unanimously approved Views of the Commission on Scientific
565 Literature in Support of Forensic Science and Practice: “The NCFS believes that a
566 comprehensive evaluation of the scientific literature is critical for the advancement of
567 forensic science policy and practice in the United States. While other forms of dissemination
568 of research and practice (e.g., oral and poster presentations at meetings, workshops, personal
569 communications, editorials, dissertations, theses, and letters to editors) play an important role
570 in science, the open, peer-reviewed literature is what endures and forms a foundation for
571 further advancements” [27].

572
573 The NCFS Views document states that “foundational, scientific literature supportive of
574 forensic practice should meet criteria such as the following:
575
576 • Peer-reviewed in the form of original research, substantive reviews of the original
577 research, clinical trial reports, or reports of consensus development conferences.
578 • Published in a journal or book that has an International Standard Number…and
579 recognized expert(s) as authors (for books) or on its Editorial Board (for journals).
580 • Published in a journal that maintains a clear and publicly available statement of
581 purpose that encourages ethical conduct such as disclosure of potential conflicts
582 of interest integral to the peer review process.
583 • Published in a journal that utilizes rigorous peer review with independent external
584 reviewers to validate the accuracy in its publications and their overall consistency
585 with scientific norms of practice.
586 • Published in a journal that is searchable using free, publicly available search
587 engines (e.g., PubMed, Google Scholar, National Criminal Justice Reference
588 Service) that search major databases of scientific literature (e.g., Medline, …).
589 • Published in a journal that is indexed in databases that are available through
590 academic libraries and other services (e.g., JSTOR, Web of Science, …).”
591
592 This Views of the Commission document points out that “the term ‘foundation’ was used no
593 less than thirty times [in the 2009 NRC report [3]] to emphasize that each forensic discipline
594 must have a scientifically robust and validated basis to its methods, its technologies, and its
595 process of interpreting data.” It also notes: “…each forensic discipline must have an
596 underlying foundation that is the result of a rigorous vetting process and that is ultimately
597 captured in the peer-reviewed scientific literature.”
598
599 It continues: “Scientific literature comprises manuscripts that report empirical data and have
600 been independently peer-reviewed for quality, originality, and relevance to the discipline. To
601 strengthen confidence in results obtained in forensic examinations, each forensic discipline
602 must identify resources that are scientifically credible, valid and with a clear scientific
603 foundation. Such foundational literature in forensic practice should conform to norms across
604 all scientific disciplines. Accordingly, the [NCFS] proposes criteria [those listed above] by
605 which scientific literature can be assessed for its consistency with principles of scientific
606 validity.”
607
608 3.7.2. Identifying and Evaluating Literature
609
610 In March 2016, the NCFS approved “Views of the Commission Regarding Identifying and
611 Evaluating Literature that Supports the Basic Principles of a Forensic Science Method or
612 Forensic Science Discipline” [2].
613
614 This Views document states: “In any scientific discipline, an on-going process to evaluate the
615 weight and merit of published materials must be established. The NCFS is aware of past and
616 on-going efforts to establish the scientific foundation of forensic discipline[s] through
617 literature reviews and generation of bibliographies. As part of these efforts, it is the view of

618 the NCFS that scientific literature must be evaluated and be vetted through an objective and
619 critical review process using tenets based on general scientific principles and practices. These
620 tenets must be satisfied before any form of scientific literature is included in, and considered
621 part of, a forensic discipline’s scientific foundation. Herein, foundational literature is
622 intended to refer to that upon which a discipline has derived, developed, or defined practices
623 and procedures examined and validated by a given discipline and applied within a legal,
624 medicolegal, or judicial setting.”
625
626 The Commission provides some specific guidance by asking 15 questions to provide a
627 framework for an objective and critical review. The document states: “The following tenets
628 of literature review should be considered in a critical review process that evaluates the merit
629 of an individual article:
630
631 • Does the publication adhere to the guidelines stated in the Views Document
632 “Scientific Literature in Support of Forensic Science and Practice”?
633 • Is the problem or hypothesis clearly stated?
634 • Is the scope of the article clearly stated as appropriate (article, case study, review,
635 technical note, etc.)?
636 • Is the literature review current, thorough, and relevant to the problem being studied?
637 • Does this work fill a clear gap in the literature or is it confirmatory and/or
638 incremental?
639 • Are the experimental procedures clear and complete such that the work could be
640 easily reproduced?
641 • Are the experimental methods appropriate to the problem?
642 • Are the methods fully validated to the necessary level of rigor (fit for purpose)?
643 • Are the data analysis and statistical methodology appropriate for the problem, and
644 explained clearly so it can be reproduced?
645 • Are the experimental results clearly and completely presented and discussed?
646 • Are omissions and limitations to the study discussed and explained?
647 • Are the results and conclusions reasonable and defensible based on the work and the
648 supporting literature?
649 • Are the citations and references complete and accurate?
650 • Are the references original (primary) and not secondary?
651 • Are funding sources and other potential sources of conflict of interest clearly stated?”
652
653 The document also points out: “Evaluations of the literature using a universal systematic
654 process will provide a means to determine which studies are truly foundational. As an on-
655 going effort, these reviews will document the evolution of a given discipline with respect to
656 the expectations outlined in the National Research Council Report on Forensic Science in
657 2009. Such an approach could allow for strengths and weaknesses of a given discipline to be
658 discovered which could result in systematic exploration of these weaknesses through future
659 research.”
660
661 The Commission document further notes: “Compilations of accepted foundational literature
662 serves additional purposes. First, compilations generated under stringent review criteria
663 define general scientific acceptance and should be used to assist in admissibility decisions

664 and gatekeeping functions. Second, priorities can be established for translational studies
665 designed to bring the most promising developments into mainstream forensic practice. Third,
666 research needs can be identified and used to develop initiatives and calls for proposals to fill
667 these needs and to spur investigator-initiated research. Success in these endeavors depends
668 on current and complete understanding of the foundational literature.”
669
670 In this same document, the NCFS opined: “Documentation of the literature that supports the
671 underlying scientific foundation for each forensic discipline is a critical component in
672 determining if methods, technologies, interpretation guidelines and conclusions are supported
673 by science.”
674
675 3.7.3. A Proposal for NIST to Perform Scientific Foundation Reviews
676
677 Following the NSF/NIJ-funded workshop described earlier, the NCFS Scientific Inquiry and
678 Research Subcommittee reached out to NIST leadership with a request for NIST to perform
679 what was referred to as “technical merit” reviews of forensic disciplines. As described
680 previously, the MOU between DOJ and NIST that established NCFS and OSAC had agreed
681 that NIST would “test and validate select forensic science practices and standards as
682 appropriate.” NCFS felt that its request fell within NIST’s agreed-upon responsibilities
683 under the MOU.
684
685 During the September 2016 NCFS meeting, Dr. Richard Cavanagh, as Director of the NIST
686 Special Programs Office, responded to the NCFS request by outlining how NIST might
687 approach the issue of examining the scientific foundations of forensic disciplines.
688
689 3.7.4. NIST Announcement at the September 2016 NCFS Meeting
690
691 A “Technical Merit” panel was held on September 12, 2016 as part of the 11th meeting of the
692 NCFS. The proceedings can be viewed at https://www.nist.gov/topics/forensic-science/ncfs-
693 meeting-11-webcast (Meeting 11, Part 2 [1:20:37]; the NIST plan is described from 4:40 to
694 17:45 and the Q&A portion begins at 1:08:30). Slides for the NIST plan are available on the
695 archived NCFS website: https://www.justice.gov/archives/ncfs/page/file/893966/download.
696
697 The proposed NIST plan presented at that time called for performing three pilot studies
698 (dependent on available funding) involving DNA, bitemarks, and firearms and toolmark
699 identification. These three diverse examples were selected to learn whether the
700 approach(es) taken could be effective. The stated goals involved examining the scientific
701 maturity and technical merit of selected methods and practices by considering research
702 performed by other agencies and laboratories, NIST research, and studies documented in the
703 literature.
704
705 For each area studied, the NIST proposal involved (1) assembling a NIST review team with a
706 range of expertise in order to view issues from multiple perspectives, (2) seeking input on
707 issues to consider from a variety of outside experts, (3) examining the scientific literature to
708 evaluate available support for claims made, (4) conducting interlaboratory studies where
709 appropriate and possible, (5) publishing a written report of findings and recommendations,

710 and (6) sharing findings with the scientific and criminal justice communities to convey the
711 capabilities and limitations of studied forensic disciplines to practitioners, judges, lawyers,
712 jurors, and other stakeholders.
713
714 The NIST scientific foundation reviews grew out of the NIST plan presented at the
715 September 2016 NCFS meeting.
716
717 During the question and answer portion of this September 2016 panel discussion, several
718 members of the Commission discussed issues involved in pursuing technical merit (scientific
719 foundation) reviews. A desire was expressed to avoid duplication of effort by being aware
720 of and learning from other ongoing efforts, such as the AAAS forensic science assessments,
721 which will be discussed below. Concern was expressed regarding the amount of time
722 required to perform such studies, as well as about the idea, articulated by some, that a field
723 should not and could not move forward until a foundation review was completed.
724
725 The challenge of remaining “independent” in these assessments was also discussed, given that a
726 certain level of expertise and connection to the community is needed to evaluate scientific
727 details of any method. One Commissioner stated that there were different perceptions
728 regarding what “methodology” can mean and the extent to which a forensic method or entire
729 discipline might be reviewed. Finally, there was a desire expressed for open access to
730 published reports of findings so that the information could be freely and widely available.
731
732 3.7.5. NCFS Technical Merit Review Documents
733
734 The NCFS approved two documents expressing its desires regarding technical merit reviews:
735 (1) “Views of the Commission: Technical Merit Evaluation of Forensic Science Methods and
736 Practices,” which was published in June 2016 and (2) “Recommendation to the Attorney
737 General: Technical Merit Evaluation of Forensic Science Methods and Practices,” which was
738 approved in September 2016 following the technical merit panel discussion mentioned
739 above.
740
741 The Views document begins: “Forensic data, results, interpretations, and conclusions have
742 life-changing consequences for individuals and society. It is vital that the analytical data be
743 generated through reliable methods and practices built upon valid core scientific principles
744 and methodology.” Three views of the Commission are stated in the document:
745
746 “(1) All forensic science methodologies should be evaluated by an independent scientific
747 body to characterize their capabilities and limitations in order to accurately and
748 reliably answer a specific and clearly-defined forensic question. The independent
749 scientific body should evaluate how forensic science test methods and practices meet
750 the standards of technical merit as defined in the OSAC Technical Merit Worksheet 2.
751 (2) The National Institute of Standards and Technology (NIST) should assume the role of
752 independent scientific evaluator within the justice system for this purpose.

2
The OSAC Technical Merit Worksheet has evolved over time. Version 4 was the one available at the time the NCFS voted:
https://www.nist.gov/sites/default/files/documents/forensics/osac/4-OSAC-QIC-Form-01-Technical-Merit-Worksheet-Form-V4.pdf. For a
more recent version, see
https://www.nist.gov/sites/default/files/documents/2018/01/05/technical_merit_guide_and_worksheet_january_3_2018.pdf.

753 (3) Additional resources should be made available to support this new capacity.”
754
755 The word “independent” as defined in this document “refers to a body that is fair, impartial,
756 and without conflict of interest in the results of the evaluation.” It is also noted that “an
757 entity’s independence does not imply that this work will be conducted without the
758 contribution of individuals who are knowledgeable of a specific discipline. It is expected that
759 an independent scientific body will be able to retain the relevant experts to advise the
760 independent body as to the real life forensic application of the science.” A DNA Mixture
761 Resource Group, which provides expert input to the current DNA mixture interpretation
762 review at regular intervals during the study, helps fulfill this vital role. Other NIST scientific
763 foundation reviews may seek input from relevant experts using this approach or perhaps by
764 gathering a larger group of perspectives in a single workshop near the start of the study, such
765 as is anticipated for the bitemark effort.
766
767 The Views document defines “technical merit” as “the process that ensures the accuracy,
768 capabilities, and limitations of forensic science tests” and states “the data and research that
769 need to be gathered to support technical merit include, but are not limited to, clearly defined
770 terminology, quality control, uncertainty, limitations, validation, fitness-for-purpose, and
771 general acceptance in both the forensic and the general scientific communities.” It continues:
772 “While NIST may have a centralized evaluative role, the Commission envisions that the data
773 and research NIST will evaluate will be generated by the robust and diverse scientific
774 research community as well as by NIST. The resulting resource documents will be
775 continually updated as the state of the science develops. Centralizing the evaluative role will
776 facilitate the development of a knowledge base at NIST that will build over time.”
777
778 The Views document concludes: “It is the view of the NCFS that an institutional entity
779 assigned a permanent independent scientific evaluation function would facilitate the
780 gathering of scientific research, knowledge, and expertise over time, creating a service
781 resource for forensic science, technology research, and user communities. Development of a
782 trusted and impartial process of evaluating technical merit of forensic practices and the
783 presentation of data will ensure that all decisions rendered by the justice system are based on
784 sound and current science.”
785
786 The second document approved by the NCFS on technical merit evaluation proposed “that
787 the Attorney General endorse and refer to the Director of NIST the following [three]
788 recommendations:”
789
790 “Recommendation #1: NIST should establish an in-house entity with the capacity to
791 conduct independent scientific evaluations of the technical merit of test methods and
792 practices used in forensic science disciplines.
793
794 “Recommendation #2: The results of the evaluations will be issued by NIST as publicly
795 available resource documents. NIST’s evaluation may include but is not limited to: a)
796 research performed by other agencies and laboratories, b) its own intramural research
797 program, or c) research studies documented in already published scientific literature.
798 NIST should begin its work by piloting three resource documents to establish their design

799 and requirements. The release of these documents should be broadly disseminated in the
800 scientific and criminal justice communities and accompanied by judicial trainings.
801
802 “Recommendation #3: The Organization of Scientific Area Committees for Forensic
803 Science (OSAC) leadership, the Forensic Science Standards Board (FSSB), should
804 commit to placing consensus documentary standards on the OSAC Registry of Approved
805 Standards for only those forensic science test methods and practices where technical
806 merit has been established by NIST, or in the interim, established by an independent
807 scientific body. An example of an interim independent scientific body could be an OSAC
808 created Technical Merit Resource Committee composed of measurement scientists and
809 statisticians appointed by NIST and tasked with the evaluation of technical merit.”
810
811 In providing these recommendations, NCFS recognized “that NIST is a non-regulatory
812 agency and is not recommending that NIST’s function here will be regulatory in nature.” The
813 document concludes with “the vision and hope of the NCFS is that NIST will develop
814 resource documents for all forensic science disciplines, but [recognizes] that process will
815 take time.”
816
817 The NCFS concluded its appointed role as a Federal Advisory Committee to DOJ in April
818 2017. However, the deliberations held, insights provided, and documents approved serve as
819 important background material and as a roadmap for NIST scientific foundation reviews.
820
821 3.8. PCAST 2016 Report
822
823 In September 2016 a report entitled “Forensic Science in Criminal Courts: Ensuring
824 Scientific Validity of Feature-Comparison Methods” [4] was provided to President Barack
825 Obama from the President’s Council of Advisors on Science and Technology (PCAST). This
826 PCAST group was led by co-chairs John P. Holdren (Assistant to the President for Science
827 and Technology and Director of the White House Office of Science and Technology Policy)
828 Eric S. Lander (President of the Broad Institute of Harvard and Massachusetts Institute of
829 Technology and one of the leaders of the Human Genome Project).
830
831 The Executive Summary notes that “in the course of its study, PCAST compiled and
832 reviewed a set of more than 2,000 papers from various sources – including bibliographies
833 prepared by the Subcommittee on Forensic Science of the National Science and Technology
834 Council and the relevant Working Groups organized by the National Institute of Standards
835 and Technology (NIST); submissions in response to PCAST’s request for information from
836 the forensic science stakeholder community; and PCAST’s own literature searches” (p. 2).
837 See Ref. [28] for full reference list.
838
839 During their study, “PCAST concluded that there are two important gaps: (1) the need for
840 clarity about the scientific standards for the validity and reliability of forensic methods and
841 (2) the need to evaluate specific forensic methods to determine whether they have been
842 scientifically established to be valid and reliable” (p. 1). The PCAST report examines and
843 comments on “foundational validity” and “validity as applied” for six forensic feature-
844 comparison methods: (1) DNA analysis of single-source and simple-mixture samples, (2)

845 DNA analysis of complex-mixture samples, (3) bitemarks, (4) latent fingerprints, (5) firearms
846 identification, and (6) footwear analysis. Expressing the desire for peer-reviewed research
847 publications with data to support claims, PCAST notes “the publication and critical review of
848 methods and data is an essential component in establishing scientific validity” (p. 68).
849
850 Commenting on bitemark analysis: “In its own review of the literature [involving 407
851 entries] PCAST found few empirical studies that attempted to study the validity and
852 reliability of the methods to identify the source of a bitemark” (p. 85). They conclude:
853 “Among those studies that have been undertaken, the observed false positive rates were so
854 high that the method is clearly scientifically unreliable at present… [A]vailable scientific
855 evidence strongly suggests that examiners cannot consistently agree on whether an injury is a
856 human bitemark and cannot identify the source of a bitemark with reasonable accuracy.” (p.
857 87).
858
859 PCAST found “that DNA analysis of single-source samples or simple mixtures of two
860 individuals, such as from many rape kits, is an objective method that has been established to
861 be foundationally valid,” (p. 75) but expressed some concerns with complex mixtures (pp.
862 75-83). PCAST concludes that, “NIST should play a leadership role in this process [of
863 conducting scientific studies], by ensuring the creation and dissemination of materials and
864 stimulating studies by independent groups through grants, contracts, and prizes; and by
865 evaluating the results of these studies” (p. 83).

Regarding the need for assessments of foundational validity, PCAST recommended:

"It is important that scientific evaluations of the foundational validity be conducted, on an ongoing basis, to assess the foundational validity of current and newly developed forensic feature-comparison technologies. To ensure the scientific judgments are unbiased and independent, such evaluations must be conducted by a science agency which has no stake in the outcome. (A) The National Institute of Standards and Technology (NIST) should perform such evaluations and should issue an annual public report evaluating the foundational validity of key forensic feature-comparison methods. (B) The President should request and Congress should provide increased appropriations to NIST of (a) $4 million to support the evaluation activities described above and (b) $10 million to support increased research activities in forensic science, including on complex DNA mixtures, latent fingerprints, voice/speaker recognition, and face/iris biometrics" (pp. 128-129).

It is important to keep in mind that funding levels are determined by Congress regardless of recommendations made by PCAST or any other group. In fiscal year 2018, Congress provided funding to NIST to perform "technical merit evaluations," which we have termed "scientific foundation reviews."

There were numerous reactions to the PCAST report, with some applauding its findings, some ignoring them, and some criticizing them. Critics raised at least six distinct points, as noted by one legal scholar [29]: (1) the PCAST committee was biased against forensic science, (2) PCAST offered an overly narrow and idiosyncratic definition of scientific validity, (3) PCAST ignored strong evidence that proves the scientific validity of various forensic sciences, (4) PCAST usurped the role of judges and juries by inserting its own opinions about forensic science, (5) forensic science evidence should not be held to scientific standards of validity because the evidence includes technical or specialized knowledge, and (6) practitioners' personal experiences and observations should be given weight when assessing the scientific validity of forensic science.

An Addendum to the PCAST Report on Forensic Science in Criminal Courts, released on January 6, 2017 [30], emphasized that "an empirical claim cannot be considered scientifically valid until it has been empirically tested" (p. 1) and continues that "while scientists may debate the precise design of a study, there is no room for debate about the absolute requirement for empirical testing" (p. 2). The addendum further notes that "the test problems used in the empirical study define the specific bounds within which the validity and reliability of the method has been established (e.g., is a DNA analysis method reliable for identifying a sample that comprises only 1% of a complex mixture?)" (p. 2).

3.9. AAAS Studies

The American Association for the Advancement of Science (AAAS) announced a partnership with the Laura and John Arnold Foundation in 2015, with plans to explore the "underlying scientific bases for the forensic tools and methods currently used in the criminal justice system." AAAS planned to begin with ten forensic disciplines: (1) bloodstain pattern analysis, (2) digital evidence, (3) fire investigations, (4) firearms and toolmarks/ballistics, (5) footwear and tire tracks, (6) forensic odontology – bitemark analysis, (7) latent fingerprints, (8) trace evidence – fibers, (9) trace evidence – hair, and (10) trace evidence – paint and other coatings. Their website notes that the project goals were to evaluate "the scientific underpinnings the forensic community relies on to support their practices and, where these fall short, recommend areas requiring further study" [31].

Reports were released for fire investigations (July 2017 [12]) and latent fingerprint examination (September 2017 [13]). The fire investigation report offers 25 recommendations that provide a roadmap for future research efforts [12], while the latent fingerprint examination report provides 14 recommendations to assist future research [13]. These reports were authored by a small group (four authors – William Thompson, John Black, Anil Jain, and Joseph Kadane – for the latent fingerprint examination report; five authors – Jose Almirall, Hal Arkes, John Lentini, Frederick Mowrer, and Janusz Pawliszyn – for the fire investigation report) with three contributing AAAS staff and a seven-member advisory committee.

According to AAAS, any future work on their forensic science assessments is subject to the availability of funding (personal communication from Deborah Runkle, AAAS).

4. Other Similar International Activities

4.1. Australian NIFS Forensic Fundamentals

In 2016, the Australia New Zealand Policing Advisory Agency (ANZPAA) National Institute of Forensic Science (NIFS) [32] released "A Guideline to Forensic Fundamentals," which describes their plan for evaluating the scientific foundations of human-based forensic disciplines in which comparisons of features by an expert inform the final result obtained. This guideline document is available at http://www.anzpaa.org.au/ArticleDocuments/220/A%20Guideline%20to%20Forensic%20Fundamentals.pdf.aspx. The goal of this effort is to help forensic managers, researchers, and practitioners "to assess the validity of current methods and opinions and to consider the suitability of new techniques being considered for implementation in forensic casework" (p. 3).

The ANZPAA NIFS effort notes that "the application of human-based forensic disciplines is based on underlying feature set assumptions which should be quantified and assessed as they form the basis of all methods and opinions that are derived. These assumptions relate not only to the nature and frequency of the feature set, but also to whether they can be used as a means to distinguish between groups or individuals" (p. 5). Therefore, their effort focuses on eight areas: "(1) how the features originate and whether they are random or ordered, (2) the persistence of the features, (3) the transference of the features, (4) the potential for something foreign/unrelated to be mistaken as a feature, (5) the dependence or independence of the subcomponents of the feature set, (6) whether unrelated items have the potential to resemble one another, (7) population studies to determine the level of variation and frequency of variants, and (8) whether there are established databases to determine the frequency of concurring features" (p. 5).

ANZPAA NIFS encourages a review of published empirical studies available in the literature and states: "a good published scientific validation study would include the following: [(1)] explanation of the methodology and the opinions that can be derived, [(2)] publication in a recognized, peer reviewed scientific journal, [(3)] use of ground truth known experimental materials, and [(4)] use of a statistically significant sample size" (p. 6).

The Forensic Fundamentals guidelines point out that "acceptance in court does not provide confirmation that a method is scientifically valid" (p. 7). The document emphasizes that "appropriate experimental design is important to ensure that the correct processes are validated" and provides examples of the types of factors that need to be tested, including "accuracy, precision, specificity, sensitivity, reliability, and reproducibility" (p. 7). The guidelines stress: "The test materials should be prepared based on studies of how closely unrelated items may resemble one another. Experimental design should include an equal mixture of randomly presented test materials that include: items that are related [and] items that are unrelated with the highest degree of similarity" (p. 7). This section of the guidelines concludes: "The ground truth of test items should be known" (p. 7).

A section on limitations encourages acknowledgment of the limitations of a method "to ensure that the evidence provided can be appropriately assessed by the Court" (p. 7). Some examples are provided, including "element or general discipline-specific limitations, case-specific limitations (where appropriate), and applicable error rates that may exist" (p. 7). A section on assumptions stresses the importance of "acknowledg[ing] any assumptions that have been made" and the need "to disclose [any] assumptions in any scientific report that is prepared" (p. 8). Examples provided include "underlying principles of the feature set on which the basis for the analysis is being performed and case-specific assumptions required to perform the analysis, where appropriate" (p. 8).

These guidelines also cover implementation considerations for proficiency testing, accreditation, presenting opinions, reporting scales, propositions, peer review, and human bias. The document concludes: "Forensic science evidence has served the Courts well for many years and its continued success will be dependent on ensuring that there is empirical support for the validity and reliability of the underlying science. It is anticipated that if each of the considerations presented in this document can be satisfied, for each of the elements identified within a given forensic science discipline, a sound scientific basis will be available for the Court to assess the strength of the forensic evidence appropriately" (p. 10).

4.2. UK Forensic Science Regulator

In the United Kingdom, a Forensic Science Regulator has been appointed since 2008 to oversee and coordinate quality efforts in forensic science across the entire criminal justice system. Codes of Practice and Conduct have been developed over the years, with the fourth version issued in October 2017 [33]. This document notes: "This Code of Conduct provides a clear statement to customers and the public of what they have a right to expect" (p. 12). For example, the tenth requirement for a practitioner is to "conduct casework using methods of demonstrable validity and comply with the quality standards set by the Regulator relevant to the area in which you work" (p. 12).

The UK Forensic Science Regulator has published over 140 documents with guidance on a variety of topics, including interpreting DNA evidence (December 2012), DNA contamination detection (September 2014), validation (November 2014), cognitive bias effects relevant to forensic science examinations (October 2015), laboratory DNA: anti-contamination (December 2015), crime scene DNA: anti-contamination (July 2016), expert report content (October 2017), DNA mixture interpretation (July 2018), and software validation for DNA mixture interpretation (July 2018). These publications are available at https://www.gov.uk/government/publications?departments%5B%5D=forensic-science-regulator.

5. Terminology and Concepts

Information is conveyed in how terms are defined. Having a common vocabulary is important for communicating ideas and developing a shared understanding of concepts. Therefore, the terms and concepts below are defined as we are using them in NIST scientific foundation reviews.

Repeatability: a measure of the variability that exists even when measurement conditions are kept as constant as possible (same laboratory, same protocol, same technician, same method, same batch of materials, same temperature, and so on). Repeatability represents the smallest amount of variability that can be achieved when all influential factors are kept as constant as possible.

Reproducibility: a measure of the variability that can exist under field application conditions (different operators, different laboratories, different equipment, different software programs, etc.). Reproducibility represents a more relevant quantity than repeatability when assessing the variability that may be present in practice.
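To make the distinction concrete, the two quantities can be contrasted with a minimal numerical sketch. The measurement values below are invented for illustration only and do not come from any study:

```python
import statistics

# Hypothetical replicate measurements of the same item (arbitrary units).
# Repeatability conditions: same laboratory, same protocol, same technician,
# same method, same batch of materials, measured in quick succession.
same_conditions = [10.1, 10.2, 10.0, 10.1, 10.2]

# Reproducibility conditions: the same item measured once in each of
# five different laboratories with different operators and equipment.
across_labs = [10.1, 10.6, 9.7, 10.4, 9.9]

# The standard deviation summarizes the variability under each condition.
repeatability_sd = statistics.stdev(same_conditions)
reproducibility_sd = statistics.stdev(across_labs)

# Reproducibility variability includes all repeatability sources plus
# lab-to-lab effects, so it is generally at least as large.
print(f"repeatability SD:   {repeatability_sd:.3f}")
print(f"reproducibility SD: {reproducibility_sd:.3f}")
```

With these invented numbers, the across-laboratory spread is several times the within-laboratory spread, which is the typical pattern: repeatability is a floor on variability, while reproducibility is closer to what will be encountered in practice.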

Measurement error: the difference between a reported value and the true value when the true value is known and can be calculated (otherwise, this is a conceptual quantity).

Degree of reliability: a quantity that summarizes the average magnitude of the measurement errors. For continuous measurements, the degree of reliability is often reported as a root mean square error or mean absolute error computed using test cases of known value. For binary decisions (present/absent, positive/negative, etc.), reliability may be judged using error rates. The test cases with known values used to assess the degree of reliability must be representative of cases that may be encountered in practice.
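For continuous measurements, both summary statistics can be computed directly once test cases of known value are available. The sketch below uses invented reported values against a known truth purely for illustration:

```python
import math

# Hypothetical test cases: values reported by a method versus the known
# (ground-truth) value of each test case, in arbitrary units.
reported = [4.9, 5.3, 5.1, 4.7, 5.2]
truth = [5.0, 5.0, 5.0, 5.0, 5.0]

# Measurement error for each test case: reported value minus true value.
errors = [r - t for r, t in zip(reported, truth)]

# Root mean square error: square root of the mean of the squared errors.
rmse = math.sqrt(sum(e ** 2 for e in errors) / len(errors))

# Mean absolute error: mean of the absolute errors.
mae = sum(abs(e) for e in errors) / len(errors)

print(f"RMSE: {rmse:.3f}")
print(f"MAE:  {mae:.3f}")
```

RMSE weights large errors more heavily than MAE does (RMSE is always at least as large as MAE), so the two statistics can rank methods differently when occasional large errors matter most.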

False positive error rate: the proportion of times a known negative sample is classified as positive by a binary decision rule over a large number of independent tests that are representative of casework.

False negative error rate: the proportion of times a known positive sample is classified as negative by a binary decision rule over a large number of independent tests that are representative of casework.
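Both rates reduce to simple proportions once ground truth is known for every test decision. A minimal sketch with invented results (not data from any actual study):

```python
# Hypothetical validation results for a binary decision rule: each pair is
# (ground truth, decision), with ground truth known for every test item.
results = [
    ("positive", "positive"), ("positive", "positive"),
    ("positive", "negative"), ("positive", "positive"),
    ("negative", "negative"), ("negative", "positive"),
    ("negative", "negative"), ("negative", "negative"),
]

known_negatives = [d for t, d in results if t == "negative"]
known_positives = [d for t, d in results if t == "positive"]

# False positive rate: known negatives that the rule classified as positive.
fp_rate = known_negatives.count("positive") / len(known_negatives)

# False negative rate: known positives that the rule classified as negative.
fn_rate = known_positives.count("negative") / len(known_positives)

print(f"false positive rate: {fp_rate:.2f}")
print(f"false negative rate: {fn_rate:.2f}")
```

In practice the denominators must be large, and the test items representative of casework, for these proportions to be meaningful estimates.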

Error rates: false positive and false negative rates are often reported as global averages, i.e., average error rates across all laboratories, all examiners, samples of various complexities, etc. For such error rates to be useful in casework, it is important to assess error rates in cases similar to the current case samples being considered; these may be called case-specific error rates. Attempts to use case-specific error rates still involve some subjectivity, in the sense that someone has to decide what it means to be similar to the current case, but it is important to note that global error rates may not be relevant in a particular application.

Validation: the process of empirically demonstrating the suitability or fitness for purpose of a method of analysis. A validation exercise should explicitly state the criteria that must be met to demonstrate fitness for purpose. In the absence of clearly stated criteria for a method to be regarded as validated, it is possible to calculate and share clearly defined metrics such as repeatability, reproducibility, degree of reliability, and error rates. When such metrics are available, the user can determine whether the method is fit for purpose; this is the preferred output from a validation exercise. Since it is impractical to carry out test runs that exhaustively cover all use cases, sufficient information needs to be made available for the user to determine the reasonable limits of extrapolation from the test scenarios actually conducted to new scenarios not explicitly considered as part of a validation study.

Note that validation is neither a universal nor a binary concept (i.e., validated versus not validated), because the same method may be considered valid for one application and not for another. Even for two similar applications, a method may be considered valid or not depending on the seriousness of the consequences of errors. Therefore, it is good practice to report metrics that allow assessment of fitness for purpose rather than to report suitability for each specific application.

6. References

[1] National Academy of Sciences (NAS), Committee on the Conduct of Science (1989) On being a scientist. Proceedings of the National Academy of Sciences of the United States of America 86(23):9053–9074.

[2] National Commission on Forensic Science (2016) Views of the Commission Regarding Identifying and Evaluating Literature that Supports the Basic Principles of a Forensic Science Method or Forensic Science Discipline. Available at https://www.justice.gov/archives/ncfs/file/839716/download. Accessed September 18, 2018.

[3] National Research Council (2009) Strengthening Forensic Science in the United States: A Path Forward. National Academies Press, Washington, D.C.

[4] President's Council of Advisors on Science and Technology (PCAST) (September 20, 2016) Report to the President: Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods. Available at https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/PCAST/pcast_forensic_science_report_final.pdf. Accessed June 28, 2018.

[5] Committee on Appropriations (2017) Departments of Commerce and Justice, Science, and Related Agencies Appropriations Bill, 2018. Report 115-139. Available at https://www.congress.gov/115/crpt/srpt139/CRPT-115srpt139.pdf. Accessed September 18, 2018.

[6] Nosek BA, Alter G, Banks GC, Borsboom D, Bowman SD, Breckler SJ, Buck S, Chambers CD, Chin G, Christensen G, Contestabile M, Dafoe A, Eich E, Freese J, Glennerster R, Goroff D, Green DP, Hesse B, Humphreys M, Ishiyama J, Karlan D, Kraut A, Lupia A, Mabry P, Madon T, Malhotra N, Mayo-Wilson E, McNutt M, Miguel E, Paluck EL, Simonsohn U, Soderberg C, Spellman BA, Turitto J, VandenBos G, Vazire S, Wagenmakers EJ, Wilson R, Yarkoni T (2015) Promoting an open research culture. Science 348(6242):1422-1425. https://doi.org/10.1126/science.aab2374

[7] Ross-Hellauer T (2017) What is open peer review? A systematic review. F1000Research 6:588. https://doi.org/10.12688/f1000research.11369.2

[8] Press R (2017) Who was Detective X? Available at https://www.nist.gov/featured-stories/who-was-detective-x. Accessed August 30, 2018.

[9] Federal Bureau of Investigation (2011) Quality Assurance Standards for Forensic DNA Testing Laboratories. Available at https://www.fbi.gov/file-repository/quality-assurance-standards-for-forensic-dna-testing-laboratories.pdf/view. Accessed August 30, 2018.

[10] Memorandum of Understanding (MOU) between NIST and the Department of Justice (DOJ) (2013). Available at https://www.justice.gov/archives/ncfs/file/761051/download. Accessed September 19, 2018.

[11] National Commission on Forensic Science (NCFS). Available at https://www.justice.gov/archives/ncfs. Accessed June 28, 2018.

[12] American Association for the Advancement of Science (AAAS) (July 11, 2017) Forensic Science Assessments: A Quality and Gap Analysis – Fire Investigation. Available at https://www.aaas.org/page/forensic-science-assessments-quality-and-gap-analysis and https://www.aaas.org/report/fire-investigation. Accessed June 28, 2018.

[13] American Association for the Advancement of Science (AAAS) (September 15, 2017) Forensic Science Assessments: A Quality and Gap Analysis – Latent Fingerprint Examination. Available at https://www.aaas.org/page/forensic-science-assessments-quality-and-gap-analysis and https://www.aaas.org/report/latent-fingerprint-examination. Accessed June 28, 2018.

[14] National Commission on Forensic Science, Recommendation to the Attorney General (September 12, 2016) Technical Merit Evaluation of Forensic Science Method and Practices. Available at https://www.justice.gov/archives/ncfs/page/file/905541/download. Accessed June 28, 2018.

[15] (2016) NCFS Meeting #11 webcast. Available at https://www.nist.gov/topics/forensic-science/ncfs-meeting-11-webcast (see part 2). Accessed July 9, 2018.

[16] (2017) NCFS Meeting #13 webcast. Available at https://www.nist.gov/topics/forensic-science/ncfs-meeting-13-webcast; transcript available at https://www.nist.gov/sites/default/files/documents/2017/04/27/1_april_10_doj_part_1.pdf (see page 8). Accessed July 9, 2018.

[17] Butler JM (2015) U.S. initiatives to strengthen forensic science & international standards in forensic DNA. Forensic Science International: Genetics 18:4-20. https://doi.org/10.1016/j.fsigen.2015.06.008

[18] Scientific Working Group for Firearms and Toolmarks, Association of Firearms and Toolmark Examiners (June 14, 2011) Response to 25 foundational firearm and toolmark examination questions received from the Subcommittee on Forensic Science (SoFS), Research, Development, Testing, & Evaluation Interagency Working Group (RDT&E IWG). Available at https://www.nist.gov/sites/default/files/documents/forensics/Annotated-Bibliography-Firearms-Toolmarks.pdf. Accessed September 18, 2018.

[19] Bartick EG, Floyd MA, eds. (2016) Forensic Science Research and Evaluation Workshop: A Discussion on the Fundamentals of Research Design and an Evaluation of Available Literature. May 26-27, 2015, Washington, D.C. Available at https://www.ncjrs.gov/pdffiles1/nij/250088.pdf. Accessed June 28, 2018.

[20] National Institute of Justice (May 26–27, 2015) Forensic Science Research and Evaluation Workshop: A Discussion on the Fundamentals of Research Design and an Evaluation of Available Literature. Available at https://www.ncjrs.gov/pdffiles1/nij/250088.pdf. Accessed September 18, 2018.

[21] National Commission on Forensic Science, Views of the Commission (May 1, 2015) Defining Forensic Science and Related Terms. Available at https://www.justice.gov/archives/ncfs/file/786571/download. Accessed July 11, 2018.

[22] National Commission on Forensic Science (NCFS), Scientific Inquiry and Research Subcommittee (January 2015) Views on Scientific Literature in Support of Forensic Science and Practice. Available at https://www.justice.gov/archives/ncfs/file/786591/download. Accessed September 18, 2018.

[23] National Commission on Forensic Science (NCFS), Scientific Inquiry and Research Subcommittee (March 2016) Recommendation to Fund Post-Doctoral Projects to Facilitate Translation of Research into Forensic Science Practice. Available at https://www.justice.gov/archives/ncfs/page/file/839721/download. Accessed September 18, 2018.

[24] National Commission on Forensic Science (NCFS), Scientific Inquiry and Research Subcommittee (March 2016) Views on Identifying and Evaluating Literature that Supports the Basic Principles of a Forensic Science Method or Forensic Science Discipline. Available at https://www.justice.gov/archives/ncfs/file/839716/download. Accessed September 18, 2018.

[25] National Commission on Forensic Science (NCFS), Scientific Inquiry and Research Subcommittee (June 2016) Views on Technical Merit Evaluation of Forensic Science Methods and Practices. Available at https://www.justice.gov/archives/ncfs/file/881796/download. Accessed September 18, 2018.

[26] National Commission on Forensic Science (NCFS), Scientific Inquiry and Research Subcommittee (September 2016) Recommendation on Technical Merit Evaluation of Forensic Science Methods and Practice. Available at https://www.justice.gov/archives/ncfs/page/file/905541/download. Accessed September 18, 2018.

[27] National Commission on Forensic Science (2015) Views of the Commission: Scientific Literature in Support of Forensic Science and Practice. Available at https://www.justice.gov/archives/ncfs/file/786591/download. Accessed September 18, 2018.

[28] PCAST (2016) References in Report to the President: Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods. Available at https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/PCAST/pcast_forensics_references.pdf. Accessed June 28, 2018.

[29] Koehler JJ (2018) How Trial Judges Should Think About Forensic Science Evidence. Judicature 102(1):28-38.

[30] PCAST (January 2017) An Addendum to the PCAST Report on Forensic Science in Criminal Courts. Available at https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/PCAST/pcast_forensics_addendum_finalv2.pdf. Accessed June 28, 2018.

[31] American Association for the Advancement of Science (AAAS) (2015) Forensic Science Assessments: A Quality and Gap Analysis. Available at https://www.aaas.org/page/forensic-science-assessments-quality-and-gap-analysis. Accessed September 18, 2018.

[32] ANZPAA NIFS publications. Available at http://www.anzpaa.org.au/forensic-science/our-work/products/publications. Accessed June 28, 2018.

[33] UK Forensic Science Regulator publications. Available at https://www.gov.uk/government/publications?departments%5B%5D=forensic-science-regulator. Accessed June 28, 2018.