61

Assisting digital forensic analysis via exploratory information visualisation

Hales, Gavin January 2016 (has links)
Background: Digital forensics is a rapidly expanding field, due to the continuing advances in computer technology and increases in the data storage capabilities of devices. However, the tools supporting digital forensics investigations have not kept pace with this evolution, often leaving the investigator to analyse large volumes of textual data and rely heavily on their own intuition and experience. Aim: Given the ability of information visualisation to provide an end user with an intuitive way to rapidly analyse large volumes of complex data, this research proposes that such approaches could be applied to digital forensics datasets. These methods are investigated, supported by a review of the literature on the use of such techniques in other fields. The hypothesis of this body of research is that by utilising exploratory information visualisation techniques in the form of a tool to support digital forensic investigations, gains in investigative effectiveness can be realised. Method: To test the hypothesis, this research examines three case studies, each looking at a different form of information visualisation and its implementation with a digital forensic dataset. Two of these case studies take the form of prototype tools developed by the researcher, and one utilises a tool created by a third-party research group. The researcher conducted a pilot study on each case, with the strengths and weaknesses of each being carried into the next case study. The culmination of these case studies is a prototype tool presenting a timeline visualisation of user behaviour on a device. This tool was subjected to an experiment involving a class of university digital forensics students, who were given a number of questions about a synthetic digital forensic dataset. Approximately half were given the prototype tool, named Insight, to use; the others were given a common open-source tool. The assessed metrics included how long the participants took to complete all tasks, how accurate their answers were, and how easy the participants found the tasks to complete. They were also asked for feedback at multiple points throughout the task. Results: The results showed a statistically significant increase in accuracy for one of the six tasks for the participants using the Insight prototype tool. Participants also found two of the six tasks significantly easier to complete when using the prototype tool. There was no statistically significant difference between the completion times of the two participant groups, and no statistically significant differences in answer accuracy for five of the six tasks. Conclusions: The results from this body of research suggest that there is potential for gains in investigative effectiveness when information visualisation techniques are applied to a digital forensic dataset. Specifically, in some scenarios, the investigator can draw conclusions which are more accurate than those drawn when using primarily textual tools. There is also evidence to suggest that investigators reached these conclusions significantly more easily when using a tool with a visual format. None of the scenarios left the investigators at a significant disadvantage in terms of accuracy or usability when using the prototype visual tool rather than the textual tool.
It is noted that this research did not show that the use of information visualisation techniques leads to any statistically significant difference in the time taken to complete a digital forensics investigation.
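The abstract does not name the statistical tests used. As a rough illustration only, the sketch below shows the kind of two-group comparison it describes, contrasting hypothetical per-participant task-accuracy scores for an Insight group and a control-tool group with a non-parametric test; the data and the choice of test are assumptions, not the thesis's method.

```python
# A minimal sketch (not from the thesis): comparing two independent groups'
# task accuracy with a Mann-Whitney U test. All scores are illustrative.
from scipy.stats import mannwhitneyu

insight_scores = [0.9, 0.8, 1.0, 0.7, 0.9, 0.85]   # hypothetical accuracy, Insight group
control_scores = [0.6, 0.7, 0.65, 0.8, 0.55, 0.7]  # hypothetical accuracy, control group

stat, p_value = mannwhitneyu(insight_scores, control_scores, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Statistically significant difference between groups")
```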
62

DFMF: a digital forensic management framework

Grobler, Cornelia Petronella 22 August 2012 (has links)
D.Phil. (Computer Science) / We are living in an increasingly complex world in which much of society is dependent on technology and its various offshoots and incarnations (Rogers & Siegfried, 2004). There is ample evidence of the influence of technology on our daily lives: we communicate via e-mail, interact in chat groups and conduct business through e-commerce. People relate each other's existence to a presence on Facebook. The convergence of the products, systems and services of information technology is changing the way we live. The latest smartphones and cell phones have cameras, applications, and access to social networking sites, and contain sensitive information, for example photographs, e-mail, spreadsheets, documents, and presentations. The loss of a cell phone may therefore pose a serious problem to an individual or an organisation, when considering privacy and intellectual property issues from an information security (Info Sec) perspective (Pieterse, 2006). Organisations have accepted the protection of information and information assets as a fundamental business requirement, and managers are therefore implementing an increasing number of security countermeasures, such as security policies, intrusion detection systems, access control mechanisms, and anti-virus products, to protect information and information assets from potential threats. However, incidents still occur, as no system is 100% secure. These incidents must be investigated to determine their root cause and potentially to prosecute the perpetrators (Louwrens, von Solms, Reeckie & Grobler, 2006b). Humankind has long been interested in the connection between cause and event, wishing to know what happened, what went wrong and why. The need for computer forensics emerged when an increasing number of crimes were committed using computers and the required evidence was stored on them. In 1984, a Federal Bureau of Investigation (FBI) laboratory began to examine computer evidence (Baryamureeba & Tushabe, 2004), and in 1991 the International Association of Computer Investigative Specialists (IACIS) in Portland, Oregon coined the term 'computer forensics' during a training session.
63

Docker forensics: Investigation and data recovery on containers / Dockerforensik: Undersökning och datautvinning av containers

Davidsson, Pontus, Englund, Niklas January 2020 (has links)
Container technology continuously grows in popularity, and the forensic area is less explored than other areas of research concerning containers. The aim of this thesis is, therefore, to explore Docker containers in a forensic investigation to test whether data can be recovered from deleted containers and how malicious processes can be detected in active containers. The results of the experiments show that, depending on which container is used, and how it is configured, data sometimes persists after the container is removed. Furthermore, file carving is tested and evaluated as a useful method of recovering lost files from deleted containers, should data not persist. Lastly, tests reveal that malicious processes running inside an active container can be detected by inspection from the host machine.
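The host-side process inspection the abstract mentions can be illustrated with a short sketch. This is not code from the thesis; it is a minimal Python example assuming a Linux host with the standard Docker CLI installed, and the container name "suspect_ctr" is a placeholder.

```python
# A minimal sketch (not from the thesis) of inspecting a running container
# from the host: the Docker daemon is asked for the container's init PID,
# and the container's processes are listed with `docker top`.
import subprocess

def init_pid(container: str) -> int:
    # `docker inspect` exposes the container's init process PID as seen by the host
    out = subprocess.check_output(
        ["docker", "inspect", "--format", "{{.State.Pid}}", container], text=True
    )
    return int(out.strip())

def list_processes(container: str) -> str:
    # `docker top` reports the processes running inside the container
    return subprocess.check_output(["docker", "top", container], text=True)

container = "suspect_ctr"  # hypothetical container name
print(f"init PID on host: {init_pid(container)}")
print(list_processes(container))
# Each listed process can then be examined from the host side, e.g. via
# /proc/<pid>/exe, /proc/<pid>/cmdline and /proc/<pid>/maps, without
# executing anything inside the (potentially compromised) container.
```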
64

Sociolinguistic Differences in Deceptive Speech: Analyzing Speech Patterns Between Different Groups in Police Interrogations

Davidson, Isabel D. 23 May 2022 (has links)
No description available.
65

Prevalence of alcohol and drugs in New York City drivers

Kazaryan, Ani 24 October 2018 (has links)
PURPOSE: The purpose of this study was to investigate alcohol and drug prevalence in impaired-driving cases in New York City (NYC) between January 1, 2015 and December 31, 2017 and to determine how this prevalence changed over time. The study also investigated the demographic characteristics of drivers to determine whether certain groups are consistently involved in alcohol and/or drug use while operating a motor vehicle. METHODS: This retrospective study determined the alcohol and drug prevalence in individual drivers, represented as cases per year over three consecutive years. A total of 613 cases were included, for individuals aged 16 to 75 years arrested on suspicion of driving while intoxicated (DWI) in NYC. Data collected included basic demographic information, time and day of the incident, the borough in which the incident occurred, the type of matrix used for toxicological analysis, and the presence or absence of alcohol and/or drugs. Drug findings were combined into classes based on their likely effect: alcohol, antidepressants, cannabinoids, narcotic analgesics, sedatives, stimulants and other. RESULTS: The study compared data from DWI cases over three consecutive years (2015 to 2017). Comparing the prevalence of drug classes by year, the percentage of cases testing positive for cannabinoids, narcotic analgesics and stimulants changed significantly from 2015 to 2017. Delta-9-tetrahydrocannabinol (THC), the active component of marijuana, was the most frequent individual drug identified using a screening method. The prevalence rate of cannabinoids increased significantly to 43.0% in 2017, from 32.5% the previous year and 29.3% in 2015. The narcotic analgesics prevalence rate increased significantly to 28.5% in 2016, from 13.4% the previous year, and decreased slightly to 26.9% in 2017. Stimulant prevalence showed a significant increase in 2017: 28.1%, versus 19.0% in 2016 and 18.3% in 2015. Daytime drug prevalence in 2017 was significantly higher than in the two previous years. In evaluating race and drug use, white drivers were significantly more likely to test positive for sedatives and stimulants than drivers of other races. Manhattan had a significantly higher alcohol detection rate than the other boroughs, and Staten Island a significantly higher narcotic analgesics detection rate. Comparing the top five individual drugs identified by borough, cannabinoids were the most common drug across all boroughs. Alprazolam and cocaine (identified by its metabolite, benzoylecgonine, 98% of the time) were the next most frequently encountered drugs, alternating as the second and third most common in four boroughs: Manhattan, Queens, the Bronx, and Staten Island. Phencyclidine (PCP, "angel dust") was in the top five for Manhattan, the Bronx and Staten Island. A statistically significant negative association was found between cannabinoid-positive and alcohol-positive drivers: the percentage of drivers with a BAC greater than 0.08 g/dL was significantly lower among cannabinoid-positive drivers than among those who tested negative for cannabinoids. Although there were no strong correlations between drug classes, sedatives showed significant correlations with the most other drug classes (6 of 6 categories).
CONCLUSIONS: This study is the first to summarize toxicological findings in DWI cases from the OCME FTL of NYC in order to estimate the prevalence of alcohol- and drug-involved driving. It is important to note that this is a prevalence study, not a study of the risks associated with drugged driving. Since many drugs may be detected long after their impairing effects are gone, the focus of this study was merely to convey the use of particular drugs in the driving population.
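As an illustration only of the year-over-year prevalence comparisons reported above, the sketch below applies a chi-square test of independence to positive/negative case counts per year; the counts are hypothetical stand-ins, not the study's data, and the study's actual tests are not specified in the abstract.

```python
# A minimal sketch (not from the study): testing whether cannabinoid
# positivity rates differ across years with a chi-square test.
from scipy.stats import chi2_contingency

# rows: cannabinoid-positive / -negative; columns: 2015, 2016, 2017
table = [
    [59, 66, 88],     # hypothetical positive counts
    [142, 137, 121],  # hypothetical negative counts
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```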
66

The use of microchip capillary electrophoresis/tandem mass spectrometry for the detection and quantification of opioids

Silver, Brianna Danielle 28 February 2021 (has links)
Forensic toxicology is a critical field in which scientific techniques are employed to establish the presence or absence of pharmacological substances and/or their metabolites within an individual. The results of such analyses can have legal implications, and toxicology has a number of important applications, including post-mortem investigations, workplace drug testing, therapeutic drug monitoring, and impaired-driving studies. The focus of this body of work is the use of toxicology in the detection and quantification of drugs of abuse, specifically opioids, in biological samples. In recent years there has been a surge in opioid abuse, and the need for forensic toxicology labs to process samples from such cases quickly and accurately continues to increase. As a result, it is imperative to research techniques and technologies that can improve the efficiency of sample processing while remaining sensitive and specific. Many toxicology laboratories today use immunoassay techniques for screening and a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for quantification. While these methods are established and reliable, the need to analyze an increasing number of samples in less time makes it essential to develop and validate new analytical methods. This study sought to validate microchip capillary electrophoresis-tandem mass spectrometry (CE-MS/MS) as a method for detecting and quantifying a panel of fourteen opioids. The experiments were run using a ZipChip (908 Devices, Boston, MA) as the separation scheme, which contains a small capillary in which analytes are separated by electrophoretic mobility, dictated largely by size and charge. The analytes were then ionized by electrospray ionization (ESI) at the end of the chip, and detected, fragmented, and analyzed in a SCIEX 4500 Triple Quadrupole Mass Spectrometer (Framingham, MA). The analytical run time of the method was two and a half minutes per sample. Calibration curves were run and the method was assessed for a number of validation parameters, including bias, precision, limit of detection, common analyte interferences, matrix interferences, and carryover, as recommended by the American Academy of Forensic Sciences Standards Board. The fourteen drugs and metabolites examined were 6-monoacetylmorphine, buprenorphine, codeine, dihydrocodeine, 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine (EDDP), fentanyl, heroin, methadone, morphine, naloxone, norfentanyl, oxycodone, oxymorphone, and tramadol. All standards, as well as the deuterated internal standards used for quantification, were ordered from Cerilliant (Round Rock, TX). This study showed that, as the method currently stands, it can reliably detect this panel of opioids at limits of detection between 1 and 15 ng/mL, with the exception of buprenorphine and morphine, for which the method appeared less sensitive. While some applications demand higher sensitivity, this level of detection could be very useful for a screening technique that is quick and far more specific than current immunoassay screens, with the additional advantage of quantification for samples at slightly higher concentrations.
Quality control samples at 100 ng/mL and 150 ng/mL generally showed consistent results and acceptable levels of bias and precision, indicating that the method can reliably quantify this panel of opioids at those concentrations. In addition, interference signals detected during analysis of other common analytes often encountered alongside opioids were negligible, with the exception of heroin and norfentanyl. Analysis of ten lots of blank urine for matrix interferences also demonstrated low potential for interference, again with the exception of heroin. Finally, there was no evidence of significant carryover between samples, or of interference from the deuterated internal standards. While some potential instrumentation issues, such as mass spectrometer calibration, prompt further study, the method shows promise as a high-throughput analysis tool in forensic toxicology labs. CE-MS/MS has the added benefits of faster run times, significantly less sample consumption per run, and less sample preparation. CE is a viable separation scheme for metabolites and forensic applications, and could have a large impact as an effective way to analyze toxicological samples.
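The quantification step described above can be sketched briefly. This is not the thesis's workflow; it is a minimal illustration, assuming a simple unweighted linear fit, of how an analyte/internal-standard peak-area ratio maps to concentration via a calibration curve; all numbers are invented.

```python
# A minimal sketch (not from the thesis) of internal-standard calibration:
# fit a line to analyte/deuterated-IS area ratios, then invert it to
# estimate an unknown's concentration. All values are illustrative.
import numpy as np

conc = np.array([1, 5, 10, 50, 100, 150])                       # calibrators, ng/mL
area_ratio = np.array([0.011, 0.052, 0.10, 0.51, 0.99, 1.52])   # analyte area / d-IS area

slope, intercept = np.polyfit(conc, area_ratio, 1)  # weighted fits are also common
unknown_ratio = 0.74
estimate = (unknown_ratio - intercept) / slope
print(f"estimated concentration ~ {estimate:.1f} ng/mL")
```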
67

Investigating stutter characteristics via isoalleles in massively parallel sequencing of a family pedigree

Wu, Ping Yi 01 March 2021 (has links)
Despite the prevalent utilization of capillary electrophoresis (CE) in the analysis of short tandem repeats (STRs) to generate deoxyribonucleic acid (DNA) profiles for forensic comparisons, the method is not without inherent drawbacks. Massively parallel sequencing (MPS) is still a relatively novel technology in the forensics field, but it has the capacity to address current challenges faced by the traditional CE approach, such as degraded samples, low-template DNA, and artifacts, while also providing additional information such as isoalleles (same-length alleles with sequence variation) and ancestry-, mixture-, and phenotype-informative single nucleotide polymorphisms (SNPs). One of the principal ongoing challenges faced by both technologies is the presence of artifacts such as stutter, a byproduct of slipped-strand mispairing during amplification of STRs, which can further complicate the interpretation of DNA profiles. Understanding and predicting the behavior of stutter is important in establishing appropriate thresholds to distinguish these artifacts from true alleles. With complex MPS data, new approaches to characterizing stutter have been established, such as the block length of missing motif (BLMM) and simplified sequence. In this study, libraries containing 58 STR regions and 98 identity SNPs were prepared from twenty-one oral samples from individuals belonging to the same family using Verogen's ForenSeq™ DNA Signature Prep Kit and sequenced on the MiSeq FGx™ Forensic Genomics System. Isoallele and stutter sequences were extracted from the data and simplified using the longest uninterrupted stretch (LUS), BLMM and simplified sequence approaches. It was found that the stutter ratios for the 11 isoallele pairs at the D13S317 locus did not follow a linear correlation with increasing LUS; instead, the isoallele with the higher LUS demonstrated an equal or lower stutter ratio. Additionally, three different stutter patterns were identified for the same locus. Based on the provided pedigree, ten different relations were defined, and the amount of allele sharing between the individuals in the pedigree was analyzed with and without isoallelic information to determine its impact on predicting relatedness. There was an average difference of 1.3% across the ten defined categories when isoalleles were taken into consideration. However, the difference in the percentage of shared alleles between the results before and after consideration of the isoallelic data was not found to be significant for any of the relation categories.
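The stutter-ratio calculation underlying the abstract can be illustrated with a short sketch. This is not the thesis's code; it is a minimal example, with invented read counts, of computing the ratio of stutter reads to parent-allele reads and tracking it against the parent's LUS.

```python
# A minimal sketch (not from the thesis): stutter ratio = stutter reads /
# parent allele reads, recorded per parent allele alongside its LUS.
from dataclasses import dataclass

@dataclass
class AllelePair:
    locus: str
    parent_lus: int     # longest uninterrupted stretch of the parent allele
    parent_reads: int   # read count of the parent allele
    stutter_reads: int  # read count of its n-1 stutter product

    @property
    def stutter_ratio(self) -> float:
        return self.stutter_reads / self.parent_reads

pairs = [  # hypothetical counts for two isoalleles at one locus
    AllelePair("D13S317", parent_lus=11, parent_reads=1480, stutter_reads=118),
    AllelePair("D13S317", parent_lus=13, parent_reads=1320, stutter_reads=112),
]
for p in sorted(pairs, key=lambda x: x.parent_lus):
    print(f"{p.locus} LUS={p.parent_lus}: SR={p.stutter_ratio:.3f}")
```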
68

A Platform Independent Investigative Process Model for Smartphones

Dancer, Frances Chevonne 15 December 2012 (has links)
A properly conducted forensic examination is one of the most fundamental aspects of a digital investigation. Examiners are obligated to obtain the skills necessary to use forensic tools and methodologies and to rely on sound judgment when analyzing a digital device. At any time during this process, the quality of the methods, skills, and expertise of the examiner may be challenged, placing the forensic value of the evidence collected in jeopardy. To address the challenges posed by the forensic examination process, the digital forensics community must ensure that suitable protocols are used throughout the analysis. Currently, there is no standard methodology forensic examiners use to analyze a digital device. Examiners have made use of a model derived from the Digital Forensic Research Workshop in 2001, and the application of ad-hoc techniques has become routine. While these approaches may reveal data of potential evidentiary value when applied to digital devices, their core purpose is specifically the analysis of computers. It is not clear how effective these methods are when examining other digital technologies, in particular Small Scale Digital Devices (SSDDs). Because of these factors, it is critical to develop standard, scientifically sound methodologies in the area of digital forensics that allow us to evaluate various digital technologies while considering their distinctive characteristics. This research addresses these issues by introducing the concept of an extendable forensic process model applicable to smartphones regardless of platform. The model has been developed using the property of invariance to construct a core components list, which serves as the foundation of the proposed methodology. This dissertation provides a description of the forensic process, the models currently used, the developed model, and experiments to show its usefulness.
69

Forensic Application of Chemometric Analysis to Visible Absorption Spectra Collected from Dyed Textile Fibers

Flores, Alejandra 01 January 2015 (has links)
Forensic analysis of evidence consists of the comparison of physical, spectroscopic, or chemical characteristics of a questioned sample to a set of knowns. Currently, decisions as to whether the questioned sample can be associated or grouped with the knowns are left to the discretion of the forensic analyst, and the implications of these outcomes are presented as evidence to a jury in a court of law to determine whether a defendant is guilty of committing a crime. Leading up to, and since, the publication of the National Academy of Sciences (NAS) report entitled "Strengthening Forensic Science in the United States: A Path Forward," the inadequacies of allowing potentially biased forensic opinion to carry such weight in the courtroom have been unmasked. This report exposed numerous shortcomings in many areas of forensic science, but also made recommendations on how to fortify the discipline. The main suggestions directed towards disciplines that analyze trace evidence include developing error rates for commonly employed practices and evaluating method reliability and validity. This research focuses on developing a statistical method for comparing visible absorption profiles collected via microspectrophotometry (MSP) from highly similarly colored textile fibers. Several chemometric techniques were applied to the spectral data and utilized to help discriminate fibers beyond the point where traditional methods of microscopical examination may fail. Because a dye's chemical structure dictates the shape of the absorption profile, two fibers dyed with chemically similar dyes can be very difficult to distinguish from one another using traditional fiber examination techniques. The application of chemometrics to multivariate spectral data can help elicit latent characteristics that aid in fiber discrimination. The three sample sets analyzed were dyed fabric swatches (three pairs of fabrics dyed with chemically similar dye pairs), commercially available blue yarns (100% acrylic), and denim fabrics (100% cotton). The custom-dyed swatches were each dyed uniformly with a single dye, whereas the dye formulations of both the yarns and the denims are unknown. As a baseline, spectral comparisons were first performed by visual analysis only, according to the guidelines published by the Scientific Working Group for Materials Analysis (SWGMAT) Fiber Subgroup. In the next set of tests, principal components analysis (PCA) was utilized to reduce the dimensionality of the large multivariate data sets and to visualize the natural groupings of samples. Comparisons were performed using the resulting PCA scores, where group membership of the questioned object was evaluated against the known objects using the score value as the distance metric. The score value is calculated using the score and orthogonal distances, the respective cutoff values based on a quantile percentage, and an optimization parameter. Lastly, likelihood ratios (LR) were generated from density functions modelled on similarity values from comparisons of the sample population data. R code was written in-house to execute all of the fiber comparison methods described here. The SWGMAT method performed with 62.7% accuracy; the optimal accuracy rate for the score value method was 75.9%; and when the LR method was applied, the accuracy rates for swatch-yarn and denim comparisons were 97.7% and 67.1%, respectively.
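The dissertation's comparisons were implemented in in-house R code. As an illustration only, the sketch below shows the PCA step it describes in Python with scikit-learn, using random stand-ins for real MSP spectra and a plain Euclidean distance in score space rather than the dissertation's score-value metric.

```python
# A minimal sketch (not the dissertation's code): reduce absorption spectra
# with PCA, then compare a questioned fiber to knowns in score space.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
known_spectra = rng.normal(size=(30, 200))  # 30 known fibers x 200 wavelengths (stand-ins)
questioned = rng.normal(size=(1, 200))      # one questioned fiber (stand-in)

pca = PCA(n_components=3)
known_scores = pca.fit_transform(known_spectra)
questioned_score = pca.transform(questioned)

# simplest possible distance-based comparison in PCA score space
dists = np.linalg.norm(known_scores - questioned_score, axis=1)
print("nearest known fiber:", int(dists.argmin()), "distance:", round(float(dists.min()), 2))
```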
70

Android Memory Capture and Applications for Security and Privacy

Sylve, Joseph T 17 December 2011 (has links)
The Android operating system is quickly becoming the most popular platform for mobile devices. As Android's use increases, so does the need for both forensic and privacy tools designed for the platform. This thesis presents the first methodology and toolset for acquiring full physical memory images from Android devices, a proposed methodology for forensically securing both volatile and non-volatile storage, and details of a vulnerability discovered by the author that allows the bypass of the Android security model and enables applications to acquire arbitrary permissions.
