1. Validating Machine and Human Decision-Making in Forensic Fire Debris Analysis
Whitehead, Frances A. (01 January 2024)
This work presents a background on the chemical complexity of fire debris analysis, including the ever-present matrix of pyrolysis products, which was the catalyst for creating the National Center for Forensic Science's Fire Debris Database. A selection of these 1,000+ casework-relevant ground-truth samples was used to develop two newly proposed analyst workflows that connect the current practice of categorical reporting with evaluative reporting that reflects the strength of the evidence. Both workflows use linear sequential unmasking to help mitigate bias, a discrete scoring system to quantify the analysis, and receiver operating characteristic (ROC) curves to bridge categorical and probabilistic reporting by indicating the optimum decision threshold at which analysts operate when making a decision. Both workflows also allow a machine-learning component to be included in evaluating the evidence, and both offer practical methods for obtaining validated performance for human and machine decisions. The second workflow adds subjective logic, which provides a means of determining the uncertainty inherent in the opinions formed by the analyst and by the machine-learning model. 'Fuzzy categories' and an opinion triangle connect the analyst's opinion, given their perceived uncertainty, to the ROC curve so that a categorical decision can be made. For each workflow, three analysts independently assessed 20 randomly chosen samples from the Fire Debris Database, following the ASTM E1618-19 standard method for fire debris analysis. The resulting area under the ROC curve for each analyst under each workflow was 0.90 or higher, placing all of them in the very good to excellent range for diagnostic classifiers, as was the machine-learning model tested in the second workflow. Recommendations for implementing a performance-validation workflow, observations on how repeated engagement can help the individual analyst, and insights on using these workflows for performance validation and training are also included.
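As a rough, hypothetical illustration of how a discrete scoring system can be tied to a ROC curve and an optimum decision threshold, the short Python sketch below uses scikit-learn; the ground-truth labels, analyst scores, and the -3 to +3 support scale are invented for illustration and are not drawn from the Fire Debris Database or the study itself.

    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # Invented example data: 1 = ignitable liquid residue present, 0 = substrate/pyrolysis only.
    ground_truth = np.array([1, 1, 0, 1, 0, 0, 1, 0, 1, 0])
    # Invented discrete analyst scores on an assumed -3..+3 scale (higher = stronger support).
    analyst_scores = np.array([3, 2, -1, 1, -2, 0, 2, -3, 3, -1])

    fpr, tpr, thresholds = roc_curve(ground_truth, analyst_scores)
    auc = roc_auc_score(ground_truth, analyst_scores)

    # Youden's J statistic (TPR - FPR) is one common way to read an
    # "optimum" categorical decision threshold off the ROC curve.
    best_threshold = thresholds[np.argmax(tpr - fpr)]

    print(f"AUC = {auc:.2f}, optimum decision threshold = {best_threshold}")

A per-analyst AUC computed this way is what a statement like "0.90 or higher" summarizes: how well the analyst's scores separate ground-truth positive samples from negative ones across all possible thresholds.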
2. Optimization Techniques for Performance and Power Dissipation in Test and Validation
Jayaraman, Dheepakkumaran (01 May 2012)
The high cost of chip testing makes testability an important aspect of any chip design. Two important testability considerations are addressed: power consumption and test quality. Power consumption during shift is reduced by efficiently adding control logic to the design. Test quality is studied by determining the sensitization characteristics of the path to be tested, using path delay fault models. Another important aspect of chip design is performance validation, which is increasingly perceived as the major bottleneck in integrated circuit design. Given synthesizable HDL code, the proposed technique efficiently identifies infeasible paths and subsequently determines the worst-case execution time (WCET) of the HDL code.
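The abstract names infeasible-path identification and WCET analysis only at a high level; as a loose conceptual sketch (not the thesis's method, which operates on synthesizable HDL), the Python below treats WCET estimation as a longest-path search over a DAG of block delays after pruning edges that lie only on infeasible paths. The graph, delay values, and the infeasible edge are all invented.

    from collections import defaultdict
    from functools import lru_cache

    # Invented toy structure: per-block delays and a DAG of possible execution orderings.
    delays = {"A": 2, "B": 5, "C": 3, "D": 4, "E": 1}
    edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("D", "E")]
    infeasible = {("B", "D")}  # assumed to lie only on paths shown infeasible by analysis

    graph = defaultdict(list)
    for u, v in edges:
        if (u, v) not in infeasible:  # prune infeasible transitions before the search
            graph[u].append(v)

    @lru_cache(maxsize=None)
    def longest_delay(node):
        """Longest cumulative delay from `node` to any sink (memoized DFS on the pruned DAG)."""
        return delays[node] + max((longest_delay(succ) for succ in graph[node]), default=0)

    print("Estimated WCET from entry block A:", longest_delay("A"))

Pruning before the search matters: without it, an infeasible path could dominate the longest-path result and make the WCET estimate needlessly pessimistic.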