141 |
Evaluation of Environmental Concentrators for Trace Actinide Measurements. Lavelle, Kevin B. January 2016 (has links)
No description available.
|
142 |
Risk Prediction in Forensic Psychiatry: A Path Forward. Watts, Devon January 2020 (has links)
Background: Actuarial risk estimates are considered the gold-standard way to assess whether forensic psychiatry patients are likely to commit prospective criminal offences. However, these risk estimates cannot individually predict the type of criminal offence a patient will subsequently commit, and often simply assess the general likelihood of crime occurring in a group sample. In order to advance the predictive utility of risk assessments, better statistical strategies are required.
Aim: To develop a machine learning model to predict the type of criminal offense committed in forensic psychiatry patients, at an individual level.
Method: Machine learning algorithms (Random Forest, Elastic Net, SVM), were applied to a representative and diverse sample of 1240 patients in the forensic mental health system. Clinical, historical, and sociodemographic variables were considered as potential predictors and assessed in a data-driven way. Separate models were created for each type of criminal offence, and feature selection methods were used to improve the interpretability and generalizability of our findings.
Results: Sexual and violent crimes can be predicted at an individual level with 83.26% sensitivity and 77.42% specificity using only 20 clinical variables. Likewise, nonviolent and sexual crimes can be individually predicted with 74.60% sensitivity and 80.65% specificity using 30 clinical variables.
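As a rough illustration of the kind of per-offence modelling pipeline described in the Method section, the sketch below pairs univariate feature selection with a Random Forest classifier. The data, variable counts, and hyperparameters are invented for illustration and do not reproduce the thesis models.

```python
# Hypothetical sketch: one binary classifier per offence type, with
# data-driven selection of the 20 most informative variables.
# All data and settings here are invented placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import Pipeline
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1240, 100))           # 1240 patients, 100 candidate predictors
y_violent = rng.integers(0, 2, size=1240)  # 1 = committed a violent offence

model = Pipeline([
    ("select", SelectKBest(f_classif, k=20)),                       # keep 20 variables
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
])

# Cross-validated predictions for every patient, then the two headline metrics.
pred = cross_val_predict(model, X, y_violent, cv=5)
sensitivity = recall_score(y_violent, pred)               # true positive rate
specificity = recall_score(y_violent, pred, pos_label=0)  # true negative rate
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```

A separate model of this shape would be fit for each offence type, as the abstract describes.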
Conclusion: The current results suggest that machine learning models have accuracy comparable to existing risk assessment tools (AUCs .70-.80). However, unlike existing risk tools, this approach allows for the prediction of cases at an individual level, which is more clinically useful. The accuracy of prospective models is expected to only improve with further refinement. / Thesis / Master of Science (MSc) / Individuals end up in the forensic mental health system when they commit crimes and are found not criminally responsible because of a mental disorder. They are released back into the community when deemed to be low risk. However, it is important to consider the accuracy of the method we use to determine risk at the level of an individual person. Currently, we use group averages to assess individual risk, which does not work very well. The range of our predictions becomes so large that the predictions are virtually meaningless. In other words, the average of a group is meaningless with respect to you.
Instead, statistical models can be developed that make accurate predictions at an individual level. Therefore, the current work sought to predict the types of criminal offences committed among 1240 forensic patients. Making accurate predictions of the crimes people may commit in the future is urgently needed to identify better strategies to prevent these crimes from occurring in the first place.
Here, we show that it is possible to predict the type of criminal offense an individual will later commit, using data that is readily available to clinicians. These models perform similarly to the best risk assessment tools available but, unlike those tools, can make predictions at an individual level. It is suggested that approaches similar to the ones outlined in this paper could be used to improve risk prediction models and aid crime prevention strategies.
|
143 |
Skeletal manifestations of child abuse and associated sociological risk factors. Thomas, Lindsey M. 01 January 2008 (has links)
Children are at a greater risk for abuse due to their small size and powerlessness. As pregnancies and births can be easily hidden, a child's death can be equally unnoticeable. Often, these deaths are unknown until skeletal evidence is discovered. At this point, any incriminating evidence that may have been soft tissue in nature is gone or of no use, and all that remains is the skeleton. This is especially important in areas of the United States that are characterized by hot and humid climates, as in the Southeast, or in situations that mimic such conditions. These circumstances favor a faster rate of decomposition and thus a quicker and earlier loss of soft tissue, along with any of the important information it could provide about identification and time and manner of death. It is important for law enforcement agents and forensic anthropologists to be familiar with the patterns of child abuse and to be able to differentiate intentional from non-intentional trauma; this requires a basic knowledge of bone biology and healing rates in order to sequence injuries and aid in the determination of cause and manner of death. It is also necessary to understand what other events, such as disease, can mimic child abuse. In addition to the skeletal evidence, the sociological risk factors that can increase the risk of child abuse must also be taken into consideration.
|
144 |
High throughput quantitative analysis of four commonly encountered drug metabolites in synthetic urine via biocompatible-solid phase microextraction and direct analysis in real time-mass spectrometry. Knake-Wheelock, Kelsey 04 November 2024 (has links)
Since 2011, toxicology labs across the United States have faced an ever-increasing caseload, with backlogs and analytical turn-around times that continue to grow despite efforts to increase sample throughput. Sample preparation is often the more time-consuming, labor-intensive, and error-prone portion of the analytical process. Biocompatible Solid Phase Microextraction (Bio-SPME) is a single-step sample preparation technique that is rapid, simple, and amenable to automation. Direct Analysis in Real Time (DART) is an ambient ionization technique that can be paired with mass spectrometry (MS) to rapidly detect, identify, and quantitate drugs of abuse in urine samples. When these methods are combined, the entire analytical process, from start to finish, can be completed manually in just over 2 minutes per sample. By increasing sample throughput and decreasing human labor, toxicology labs would benefit greatly by adopting such a rapid analytical technique.
This project focused on optimization of the Bio-SPME sample preparation process to maximize signal intensity and minimize preparation time per sample. 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THC-COOH), benzoylecgonine, norfentanyl, and methamphetamine were selected as target analytes for detection and quantitation in specialty matrix urine (SMx urine). To prepare samples, octadecyl (C18) Bio-SPME fibers were subjected to a conditioning and extraction process prior to analysis via DART-MS. It was determined that a 15-minute conditioning period was sufficient to prepare the fibers for extraction. The extraction period was analyte-dependent, with ideal analytes adsorbing to the fibers in as little as 15 minutes. Under ideal conditions, the entire sample preparation process was found to take as little as 30 minutes.
Multiple Reaction Monitoring (MRM) methods were built for each target analyte and its deuterated analog. For qualitative purposes, two to three transitions were included for each analyte, whereas the quantitative methods included only one transition per analyte. The limit of detection (LOD) for benzoylecgonine and methamphetamine was found to be 50 ng/mL in urine. The LOD for norfentanyl was 75 ng/mL in urine, and the LOD for THC-COOH was 250 ng/mL in urine. Three of the four analytes were successfully quantitated using DART-MS when only a single analyte was present in the sample: benzoylecgonine and methamphetamine can be reliably quantitated between 50 ng/mL and 1000 ng/mL, while norfentanyl can be quantitated between 75 ng/mL and 1000 ng/mL. Data collected from the optimized sample preparation and targeted analytical process demonstrate that rapid detection, identification, and quantitation of metabolites from various drug classes are possible via DART-MS.
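Quantitation within a linear range of the kind reported above rests on a calibration curve. A minimal sketch, with invented signal values standing in for measured MRM responses:

```python
# Hypothetical sketch: linear calibration for back-calculating analyte
# concentration. The signal values below are invented placeholders, not
# measured DART-MS data.
import numpy as np

# Calibration standards spanning the reported linear range (ng/mL)
conc = np.array([50, 100, 250, 500, 750, 1000], dtype=float)
signal = np.array([0.9, 1.8, 4.6, 9.1, 13.8, 18.3])  # analyte/internal-standard ratio

# Fit signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)

def quantitate(measured_signal):
    """Back-calculate concentration (ng/mL) from the calibration line."""
    return (measured_signal - intercept) / slope

unknown = quantitate(7.0)
print(f"estimated concentration: {unknown:.0f} ng/mL")
```

In practice the curve would be validated per analyte and results outside the calibrated range (e.g. below the LOD) would not be reported quantitatively.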
|
145 |
Completing the Picture: Fragments and Back Again. Karresand, Martin January 2008 (has links)
<p>Better methods and tools are needed in the fight against child pornography. This thesis presents a method for file type categorisation of unknown data fragments, a method for reassembly of JPEG fragments, and the requirements put on an artificial JPEG header for viewing reassembled images. To enable empirical evaluation of the methods, a number of tools based on the methods have been implemented.</p><p>The file type categorisation method identifies JPEG fragments with a detection rate of 100% and a false positive rate of 0.1%. The method uses three algorithms, Byte Frequency Distribution (BFD), Rate of Change (RoC), and 2-grams. The algorithms are designed for different situations, depending on the requirements at hand.</p><p>The reconnection method correctly reconnects 97% of a Restart (RST) marker enabled JPEG image, fragmented into 4 KiB pieces. When dealing with fragments from several images at once, the method is able to correctly connect 70% of the fragments at the first iteration.</p><p>Two parameters in a JPEG header are crucial to the quality of the image: the size of the image and the sampling factor (actually factors) of the image. The size can be found using brute force, and the sampling factors only take on three different values. Hence it is possible to use an artificial JPEG header to view all or parts of an image. The only requirement is that the fragments contain RST markers.</p><p>The results of the evaluations of the methods show that it is possible to find, reassemble, and view JPEG image fragments with high certainty.</p>
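Two of the fragment statistics named above, BFD and RoC, lend themselves to a compact sketch. The fragment below is a toy example, and the actual classification step (matching fragment statistics against per-file-type models) is omitted:

```python
# Hypothetical sketch of two fragment statistics: the Byte Frequency
# Distribution (histogram of byte values) and the Rate of Change
# (histogram of absolute differences between consecutive bytes).
import numpy as np

def byte_frequency_distribution(fragment: bytes) -> np.ndarray:
    """Normalised histogram of byte values 0..255."""
    counts = np.bincount(np.frombuffer(fragment, dtype=np.uint8), minlength=256)
    return counts / max(len(fragment), 1)

def rate_of_change(fragment: bytes) -> np.ndarray:
    """Normalised histogram of |b[i+1] - b[i]| over consecutive bytes."""
    data = np.frombuffer(fragment, dtype=np.uint8).astype(int)
    diffs = np.abs(np.diff(data))
    counts = np.bincount(diffs, minlength=256)
    return counts / max(len(diffs), 1)

frag = bytes(range(256)) * 16   # 4 KiB toy fragment
bfd = byte_frequency_distribution(frag)
roc = rate_of_change(frag)
print(bfd.shape, roc.shape)     # (256,) (256,)
```

A categoriser in this spirit would compare these vectors against reference distributions learned from known JPEG (and non-JPEG) data.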
|
146 |
Digital evidence: representation and assurance. Schatz, Bradley Lawrence January 2007 (has links)
The field of digital forensics is concerned with finding and presenting evidence sourced from digital devices, such as computers and mobile phones. The complexity of such digital evidence is constantly increasing, as is the volume of data which might contain evidence. Current approaches to interpreting and assuring digital evidence rely implicitly on the use of tools and representations made by experts in addressing the concerns of juries and courts. Current forensics tools are best characterised as not easily verifiable, lacking in ease of interoperability, and burdensome on human process. The tool-centric focus of current digital forensics practice impedes access to and transparency of the information represented within digital evidence as much as it assists, by nature of the tight binding between a particular tool and the information that it conveys. We hypothesise that a general and formal representational approach will benefit digital forensics by enabling higher degrees of machine interpretation, facilitating improvements in tool interoperability and validation. Additionally, such an approach will increase human readability. This dissertation summarises research which examines at a fundamental level the nature of digital evidence and digital investigation, in order that improved techniques which address investigation efficiency and assurance of evidence might be identified. The work follows three themes related to this: representation, analysis techniques, and information assurance. The first set of results describes the application of a general purpose representational formalism towards representing diverse information implicit in event based evidence, as well as domain knowledge, and investigator hypotheses. This representational approach is used as the foundation of a novel analysis technique which uses a knowledge based approach to correlate related events into higher level events, which correspond to situations of forensic interest.
The second set of results explores how digital forensic acquisition tools scale and interoperate, while assuring evidence quality. An improved architecture is proposed for storing digital evidence, analysis results and investigation documentation in a manner that supports arbitrary composition into a larger corpus of evidence. The final set of results focuses on assuring the reliability of evidence. In particular, these results focus on assuring that timestamps, which are pervasive in digital evidence, can be reliably interpreted to a real-world time. Empirical results are presented which demonstrate how simple assumptions cannot be made about computer clock behaviour. A novel analysis technique for inferring the temporal behaviour of a computer clock is proposed and evaluated.
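The clock-behaviour problem can be illustrated, in a much-simplified form, by regressing locally recorded timestamps against a trusted reference. The timestamp pairs below are invented, and the thesis technique is considerably more involved:

```python
# Hypothetical sketch: estimating a computer clock's offset and drift by
# fitting a line through (reference_time, local_clock_time) observations.
# All values here are invented for illustration.
import numpy as np

# Paired observations, in seconds since some common epoch
reference = np.array([0.0, 3600.0, 7200.0, 10800.0])
local     = np.array([120.5, 3724.1, 7327.8, 10931.2])  # clock runs fast, ~+120 s offset

# Model: local = drift * reference + offset
drift, offset = np.polyfit(reference, local, 1)

def to_real_time(local_timestamp):
    """Map a timestamp read from the machine back to reference time."""
    return (local_timestamp - offset) / drift

print(f"offset ~ {offset:.1f} s, drift ~ {(drift - 1) * 1e6:.0f} ppm")
```

The thesis's empirical point, that such behaviour cannot simply be assumed, is exactly why the parameters must be inferred from evidence rather than presumed to be zero offset and unit drift.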
|
147 |
Developing a one-semester course in forensic chemical science for university undergraduates. Salem, Roberta Sue January 1900 (has links)
Doctor of Philosophy / Curriculum and Instruction Programs / Tweed R. Ross / John R. Staver / The purpose of this study was to research, develop, and validate a one-semester course for the general education of university undergraduates in forensic chemical science. The course outline was developed using the research and development (R&D) methodology recommended by Gall, Borg, and Gall (2003) and Dick and Carey (2001) through a three-step developmental cycle.
Information was gathered and analyzed through review of literature and proof of concept interviews, laying the foundation for the framework of the course outline.
A preliminary course outline was developed after a needs assessment showed need for such a course. Professors expert in the area of forensic science participated in the first field test of the course. Their feedback was recorded, and the course was revised for a main field test. Potential users of the guide served as readers for the main field test and offered more feedback to improve the course.
|
148 |
Comparative Analysis & Study of Android/iOS Mobile Forensics Tools. Shakir, Amer; Hammad, Muhammad; Kamran, Muhammad January 2021 (has links)
This report aims to draw a comparison between two commercial mobile forensics and recovery tools, Magnet AXIOM and MOBILedit. A thorough look at previously conducted studies was helpful in determining which aspects of the data extractions must be compared and which areas are the most important to focus upon. This work focuses on how the data extracted by one tool compares with the other and provides comprehensive extraction based on different scenarios, circumstances, and aspects. The performance of both tools is compared based on various benchmarks and criteria. This study helped establish that MOBILedit outperformed Magnet AXIOM on more of the data extraction and recovery aspects examined, making it the comparatively better tool of the two.
|
150 |
Enhancing the Admissibility of Live Box Data Capture in Digital Forensics: Creation of the Live Box Computer Preservation Response (LBCPR) and Comparative Study Against Dead Box Data Acquisition. Emilia Mancilla (14202911) 05 December 2022 (has links)
<p>There are several techniques and methods for capturing data during a Live Box response in computer forensics, but the key to these acquisitions is to keep the collected data admissible in a judicial court process. Different approaches during a Live Box examination will lead to data changes in the computer, due to the volatile nature of data stored in memory. The inevitable changes of volatile data are what cause the controversy when admitting digital evidence to courtroom proceedings.</p>
<p>The main goal of this dissertation was to create a process model, titled Live Box Computer Preservation Response (LBCPR), that would assist in ensuring the validity, reliability, and accuracy of evidence in a court of law. This approach maximizes the admissibility of digital data derived from a Live Box response.</p>
<p>The LBCPR was created to meet legal and technical requirements in acquiring data from a live computer. With captured Live Box computer data, investigators can further add value to their investigation when processing and analyzing the captured data set, which would have otherwise been permanently unrecoverable upon powering down the machine. By collecting the volatile data prior to conducting Dead Box forensics, there is an increased amount of information that can be utilized to understand the state of the machine upon collection when combined with the stored data contents.</p>
<p>This study conducted a comparative analysis of data collection with the LBCPR method versus traditional Dead Box forensics techniques, confirming the expected result that Live Box techniques capture volatile data. In addition, because of the structure of the LBCPR, the logs collected as part of the process model made it possible to obtain value from the randomization of memory dumps. Finally, in keeping with the legal admissibility focus, techniques were incorporated to keep data admissible in a court of law.</p>
|