161

Leveraging Personal Internet-of-Things Technology To Facilitate User Identification in Digital Forensics Investigations

Shinelle Hutchinson (16642559) 07 August 2023 (has links)
Despite the many security and privacy concerns associated with Internet-of-Things (IoT) devices, we continue to be barraged by new IoT devices every day. These devices have infiltrated almost every aspect of our lives, from government and corporations to our homes, and now, on and within our person, in the form of smartphones and wearables. These personal IoT devices can collect some of the most intimate pieces of data about their user. For instance, a smartwatch can record its wearer's heart rate, skin temperature, physical activity, and even GPS location data. At the same time, a smartphone has access to almost every piece of information related to its user, including text messages, social media activity, web browser history, and application-specific data. Due to the quantity and quality of data these personal IoT devices record, these devices have become critical sources of evidence during forensic investigations. However, there are instances in which digital forensic investigators need to make doubly sure that the data obtained from these smart devices, in fact, belong to the alleged owner of the smart device and not someone else. To that end, this dissertation provides the first look at using personal IoT device handling as a user identification technique with machine learning models to aid forensic investigations. The results indicated that this technique is capable of significantly differentiating device owners, with performance metrics of .9621, .9618, and .9753 for accuracy, F1, and AUC, respectively, when using a smartwatch with statistical time-domain features. For the smartphone, performance was only marginally acceptable, with accuracy, F1, and AUC values of .8577, .8560, and .8891, respectively. The results also indicate that female users handled their devices notably differently from male users. This study thus lays the foundation for performing user identification during a forensic investigation to determine whether the smart device owner did, in fact, use the device at the time of the incident.
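The pipeline itself is only summarized above; as a rough sketch of the general approach described (statistical time-domain features computed over windows of wearable motion data, fed to a machine-learning classifier and scored with accuracy, F1, and AUC), something like the following could be used. The window length, feature set, and choice of a random forest are illustrative assumptions, not the dissertation's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

def time_domain_features(window):
    """Statistical time-domain features for one window of tri-axial accelerometer data (n_samples x 3)."""
    feats = []
    for axis in range(window.shape[1]):
        x = window[:, axis]
        feats += [x.mean(), x.std(), x.min(), x.max(),
                  np.median(x), np.sqrt(np.mean(x ** 2))]  # mean, std, min, max, median, RMS
    return np.array(feats)

def windows(signal, size=128, step=64):
    """Overlapping fixed-length windows over a (n_samples x 3) motion signal."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

def build_dataset(raw_sessions):
    """raw_sessions: list of (accelerometer_array, user_id) pairs -- placeholder input."""
    X, y = [], []
    for signal, user in raw_sessions:
        for w in windows(signal):
            X.append(time_domain_features(w))
            y.append(user)
    return np.array(X), np.array(y)

def evaluate(raw_sessions):
    """Train a classifier on device-handling windows and report accuracy, F1, and AUC (assumes more than two users)."""
    X, y = build_dataset(raw_sessions)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    proba = clf.predict_proba(X_te)
    return {
        "accuracy": accuracy_score(y_te, pred),
        "f1": f1_score(y_te, pred, average="weighted"),
        "auc": roc_auc_score(y_te, proba, multi_class="ovr"),
    }
```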
162

Dark Web Forensics : An Investigation of Tor and I2P Artifacts on Windows 11

Abolghsemi, Seyedhesam, Chukwuneta, Chukwudalu January 2024 (has links)
With the rising use of the Internet by businesses and individuals for their regular activities and transactions, there has been increased attention to user privacy and data security on the web. While the adoption of dark web networks has ensured that users' privacy and anonymity concerns are being addressed, there has also been a consequential increase in illicit activities on the internet. The dark web remains a critical area for law enforcement investigations, providing a platform for criminal activities to thrive unchecked. This study evaluates the digital traces deposited by dark web browsers on the client side of user devices, providing a deep insight into the security features of Tor and I2P and outlining the potential areas where digital artifacts can be retrieved on a Windows 11 computer. By detailing the forensic acquisition process and subsequent artifact analysis, this research aims to enhance the capabilities of digital forensic examiners in tracking and prosecuting cybercriminals, thereby contributing to the broader field of digital forensics and cybersecurity.
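The thesis's acquisition and analysis workflow is described in the full text; purely as an illustration of client-side artifact triage of the kind mentioned, a script might sweep a mounted Windows 11 user profile for locations and prefetch entries commonly associated with Tor Browser and I2P. Every path and keyword below is a generic assumption, not a finding of this study.

```python
import pathlib

# Candidate locations sometimes checked for dark-web browser traces on a mounted
# Windows profile; these are illustrative assumptions, not the thesis's artifact list.
CANDIDATE_PATHS = [
    "Desktop/Tor Browser/Browser/TorBrowser/Data",  # default portable Tor Browser data directory
    "AppData/Local/I2P",                            # possible I2P application data
    "AppData/Roaming/i2p",
]
KEYWORDS = ("tor.exe", "torrc", "i2p", "i2psvc")

def triage(profile_root, prefetch_dir="C:/Windows/Prefetch"):
    """Report which candidate paths exist and which prefetch entries mention the keywords."""
    root = pathlib.Path(profile_root)
    hits = [p for p in CANDIDATE_PATHS if (root / p).exists()]
    prefetch = pathlib.Path(prefetch_dir)
    pf_hits = ([f.name for f in prefetch.glob("*.pf")
                if any(k.upper() in f.name.upper() for k in KEYWORDS)]
               if prefetch.exists() else [])
    return {"paths_present": hits, "prefetch_matches": pf_hits}

# Example (hypothetical mounted image): triage("E:/mounted_image/Users/suspect")
```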
163

Digital Forensics Tool Interface Visualization

Altiero, Roberto A. 15 January 2015 (has links)
Recent trends show digital devices utilized with increasing frequency in most crimes committed. Investigating crime involving these devices is labor-intensive for the practitioner applying digital forensics tools that present possible evidence in tabular lists for manual review. This research investigates how enhanced digital forensics tool interface visualization techniques can be shown to improve the investigator's cognitive capacities to discover criminal evidence more efficiently. This paper presents visualization graphs and contrasts their properties with the outputs of The Sleuth Kit (TSK) digital forensic program; the contrast with the textual-based interface demonstrates the effectiveness of enhanced data presentation. Further demonstrated is the potential of the computer interface to present to the digital forensic practitioner an abstract, graphic view of an entire dataset of computer files. Enhanced interface design of digital forensic tools means more rapidly linking suspicious evidence to a perpetrator. Introduced in this study is a mixed methodology of ethnography and cognitive load measures. Ethnographically defined tasks, developed from interviews with digital forensics subject matter experts (SMEs), shape the context for the cognitive measures. Cognitive load testing of digital forensics first-responders utilizing both a textual-based and a visualized application established a quantitative mean of the mental workload during operation of the applications under test. A dependent-samples t-test on the mean workloads tested the null hypothesis that there was no significant difference between the operators' workloads under the two applications. Results of the study indicate a significant difference, affirming the hypothesis that a visualized application would reduce the cognitive workload of the first-responder analyst. With the supported hypothesis, this work contributes to the body of knowledge by validating a method of measurement and by providing empirical evidence that the use of a visualized digital forensics interface provides more efficient performance by the analyst, saving labor costs and compressing the time required for the discovery phase of a digital investigation.
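As a small worked example of the dependent-samples t-test described (comparing operator workload between the textual and visualized interfaces), the sketch below uses hypothetical paired workload scores; the numbers and the 0.05 threshold are assumptions for illustration only, not the study's measurements.

```python
from scipy import stats

# Hypothetical paired workload scores for the same first-responders using each interface.
textual_workload    = [72, 65, 80, 77, 69, 74, 81, 70]
visualized_workload = [58, 54, 66, 61, 57, 60, 68, 55]

# Dependent-samples (paired) t-test: H0 is that mean workload does not differ between interfaces.
t_stat, p_value = stats.ttest_rel(textual_workload, visualized_workload)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: workload differs significantly between the two interfaces.")
```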
164

Digital forensics : an integrated approach for the investigation of cyber/computer related crimes

Hewling, Moniphia Orlease January 2013 (has links)
Digital forensics has become a predominant field in recent times, and courts have had to deal with an influx of related cases over the past decade. As computer/cyber related criminal attacks become more prevalent in today’s technologically driven society, the need for, and use of, digital evidence in courts has increased. There is an urgent need to hold perpetrators of such crimes accountable and to prosecute them successfully. The process used to acquire this digital evidence (to be used in cases in courts) is digital forensics. The procedures currently used in the digital forensic process were developed focusing on particular areas of the digital evidence acquisition process. This has resulted in very little regard being paid to the core components of the digital forensics field, for example the legal and ethical aspects, along with other integral aspects of investigations as a whole. These core facets are important for a number of reasons, including the fact that other forensic sciences have included them; to survive as a true forensic discipline, digital forensics must ensure that they are accounted for, because digital forensics, like other forensic disciplines, must ensure that the evidence it produces is able to withstand the rigors of a courtroom. Digital forensics is a new and developing field, still in its infancy when compared to traditional forensic fields such as botany or anthropology. Over the years, development in the field has been tool-centered, driven by commercial developers of the tools used in the digital investigative process. This, along with the absence of set standards to guide digital forensics practitioners operating in the field, has led to issues regarding the reliability, verifiability and consistency of digital evidence when presented in court cases. Additionally, some developers have neglected the fact that the mere mention of the word forensics suggests courts of law, and thus legal practitioners will be intimately involved. Such omissions have resulted in the digital evidence acquired for use in various investigations facing major challenges when presented in a number of cases. Mitigation of such issues is possible with the development of a standard set of methodologies flexible enough to accommodate the intricacies of all fields to be considered when dealing with digital evidence. This thesis addresses issues regarding digital forensics frameworks, methods, methodologies and standards for acquiring digital evidence, using the grounded theory approach. Data were gathered electronically using literature surveys, questionnaires and interviews; collecting data electronically proved useful given the need to collect data from different jurisdictions worldwide. Initial surveys indicated that there were no existing standards in place and that the terms models/frameworks and methodologies were used interchangeably. A framework and methodology have been developed to address the identified issues and represent the major contribution of this research. The dissertation outlines solutions to the identified issues and presents the 2IR Framework of standards, which governs the 2IR Methodology, supported by a mobile application and a curriculum of studies. These designs were developed using an integrated approach incorporating all four core facets of the digital forensics field.
This research lays the foundation for a single integrated approach to digital forensics and can be further developed to ensure the robustness of process and procedures used by digital forensics practitioners worldwide.
165

Applications of Raman spectroscopic techniques in forensic and security contexts : the detection of drugs of abuse and explosives in scenarios of forensic and security relevance using benchtop and portable Raman spectroscopic instrumentation

Ali, Esam Mohamed Abdalla January 2010 (has links)
Drug trafficking and smuggling is an ongoing challenge for law enforcement agencies. Cocaine smuggling is a high-value pursuit for smugglers and has been attempted using a variety of concealment methods including the use of bottled liquids, canned milk, wax and suspensions in cans of beer. In particular, traffickers have used clothing impregnated with cocaine for smuggling. Handling, transportation or re-packaging of drugs of abuse and explosives will inevitably leave residual material on the clothing and other possessions of the involved persons. The nails and skin of the person may also be contaminated through the handling of these substances. This research study describes the development of Raman spectroscopic techniques for the detection of drugs of abuse and explosives on biomaterials of forensic relevance including undyed natural and synthetic fibres and dyed textile specimens, nail and skin. Confocal Raman microscopy has been developed and evaluated for the detection and identification of particulates of several drugs of abuse and explosives on different substrates. The results show that excellent spectroscopic discrimination can be achieved between single particles and substrate materials, giving a ubiquitous non-destructive approach to the analysis of picogram quantities of the drugs and explosives in-situ. Isolating the particle in this way corresponds to an analytical sensitivity comparable with the most sensitive analytical techniques currently available, e.g. the highly sensitive, yet destructive, ionization desorption mass spectrometry. With the confocal Raman approach, this work demonstrates that definitive molecular-specific information can be achieved within seconds without significant interference from the substrate. The potential for the application of this technique as a rapid preliminary forensic screening procedure is obvious and attractive to non-specialist operators, as it does not involve prior chemical pretreatment or detachment of the analyte from the substrate. As a result, evidential materials can be analysed without compromising their integrity for future investigation. Also, the applications of benchtop and portable Raman spectroscopy for the in-situ detection of drugs of abuse in clothing impregnated with the drugs have been demonstrated. Raman spectra were obtained from a set of undyed natural and synthetic fibres and dyed textiles impregnated with these drugs. The spectra were collected using three Raman spectrometers: one benchtop dispersive spectrometer coupled to a fibre-optic probe and two portable spectrometers. High quality spectra of the drugs could be acquired in-situ within seconds and without any sample preparation or alteration of the evidential material. A field-portable Raman spectrometer is a reliable instrument that can be used by emergency response teams to rapidly identify unknown samples. This method lends itself well to further development for the in-situ examination by law enforcement officers of items associated with users, handlers and suppliers of drugs of abuse in the forensics arena. In the last section of this study, a portable prototype Raman spectrometer (DeltaNu Advantage 1064) equipped with 1064 nm laser excitation has been evaluated for the analysis of drugs of abuse and explosives. The feasibility of the instrument for the analysis of the samples both as neat materials and whilst contained in plastic and glass containers has been investigated.
The advantages, disadvantages and the analytical potential in the forensics arena of this instrument have been discussed.
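The identification step in this work is spectroscopic rather than computational, but the comparison it relies on, matching a measured Raman spectrum against reference spectra, can be sketched as a simple correlation search over a spectral library. The library contents and correlation threshold below are illustrative assumptions, not part of the thesis.

```python
import numpy as np

def best_library_match(measured, library, threshold=0.95):
    """
    Compare a baseline-corrected, normalised Raman spectrum (1-D intensity array on a
    common wavenumber grid) against a library of reference spectra and return the best
    Pearson-correlation match, provided it exceeds the threshold.
    """
    best_name, best_score = None, -1.0
    for name, reference in library.items():
        score = np.corrcoef(measured, reference)[0, 1]
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical usage, with spectra pre-interpolated onto the same wavenumber grid:
# match, score = best_library_match(sample_spectrum, {"cocaine HCl": ref1, "PETN": ref2})
```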
166

Corroboration, consent and community : a 'meaning finitist' account of the forensic medical examination of rape and penetrative sexual assault complainers in Scotland

Rees, Gethin January 2009 (has links)
This thesis examines the construction of forensic medical evidence in penetrative sexual assault cases and the procedures that Forensic Medical Examiners (FMEs) employ in order to ensure the authority of that evidence. Drawing upon interviews and on the texts and artefacts that FMEs use in their work, the thesis employs the concept of “meaning finitism” to analyse how FMEs perform forensic examinations for evidential purposes. The thesis starts with an exploration of how medical practitioners are taught to identify and classify injuries of medico-legal significance, culminating in their being judged “safe” to provide expert testimony by other members of the clinical forensic medical community. The thesis next addresses the construction of what I call the “morphological account”: a set of judgements about the nature of a case based upon a combination of the observed injuries, the FME’s training and their previous experience of cases. While there is considerable agreement amongst practitioners about how to interpret injuries (a result of their training), because the morphological account involves personal judgement, there is also scope for differences of opinion. The thesis therefore explores the methods that FMEs employ to limit the risk of being seen to disagree with one another during trials. The thesis also examines the role that guidelines play in the forensic medical examination. The thesis argues that standardised medical kits and associated guidance documents were originally introduced in the early 1980s in response to sustained criticism of FMEs’ practices, and further developed in the late 1990s and early 2000s with the rise of Evidence-Based Medicine. Kits and guidance documents provide a means for FMEs to legitimate and explain their work to others, particularly during trials: they codify collective practice and provide FMEs with an aide memoire of the requisite procedures, without overly determining or constraining practice. Finally, I will argue that FMEs’ concern to ensure the authority of their evidence may sometimes limit the value of that evidence. Caution over drawing inferences that might be challenged in court, and a concern not to be seen as “prosecution-minded”, commonly leads FMEs to compose so-called “Neutral Reports” which neither confirm nor deny the complainer’s allegations. As Scottish Procedural Law makes provision for non-contentious evidence to be removed from trial, such neutral reports are likely to be dismissed from consideration.
167

Database Forensics in the Service of Information Accountability

Pavlou, Kyriacos Eleftheriou January 2012 (has links)
Regulations and societal expectations have recently emphasized the need to mediate access to valuable databases, even by insiders. At one end of a spectrum is the approach of restricting access to information; at the other is information accountability. The focus of this work is on effecting information accountability of data stored in relational databases. One way to ensure appropriate use and thus end-to-end accountability of such information is through continuous assurance technology, via tamper detection in databases built upon cryptographic hashing. We show how to achieve information accountability by developing and refining the necessary approaches and ideas to support accountability in high-performance databases. These concepts include the design of a reference architecture for information accountability and several of its variants, the development of a sequence of successively more sophisticated forensic analysis algorithms and their forensic cost model, and a systematic formulation of forensic analysis for determining when the tampering occurred and what data were tampered with. We derive a lower bound for the forensic cost and prove that some of the algorithms are optimal under certain circumstances. We introduce a comprehensive taxonomy of the types of possible corruption events, along with an associated forensic analysis protocol that consolidates all extant forensic algorithms and the corresponding type(s) of corruption events they detect. Finally, we show how our information accountability solution can be used for databases residing in the cloud. In order to evaluate our ideas we design and implement an integrated tamper detection and forensic analysis system named DRAGOON. This work shows that information accountability is a viable alternative to information restriction for ensuring the correct storage, use, and maintenance of high-performance relational databases.
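DRAGOON's actual architecture is described in the dissertation; as a minimal sketch of the underlying idea of tamper detection built on cryptographic hashing, one can chain a hash over each row of an append-only audit table so that modifying any earlier row invalidates every later link and localizes the first corrupted entry. The row layout and SHA-256 choice below are illustrative assumptions, not DRAGOON's design.

```python
import hashlib

def link_hash(previous_hash, row):
    """Hash a row together with the previous link to form a tamper-evident chain."""
    payload = previous_hash + "|" + "|".join(str(v) for v in row)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(rows, seed="genesis"):
    """Return the per-row hash chain for an append-only sequence of rows."""
    chain, h = [], seed
    for row in rows:
        h = link_hash(h, row)
        chain.append(h)
    return chain

def first_tampered_index(rows, stored_chain, seed="genesis"):
    """Recompute the chain and report the index of the first mismatch (None if intact)."""
    h = seed
    for i, row in enumerate(rows):
        h = link_hash(h, row)
        if h != stored_chain[i]:
            return i
    return None
```

Recomputing the chain against a notarized copy gives both detection (any mismatch) and a coarse answer to "what was tampered with and where", which is the kind of forensic question the dissertation's algorithms refine with cost models and optimality results.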
168

An insider misuse threat detection and prediction language

Magklaras, Georgios Vasilios January 2012 (has links)
Numerous studies indicate that amongst the various types of security threats, the problem of insider misuse of IT systems can have serious consequences for the health of computing infrastructures. Although incidents of external origin are also dangerous, the insider IT misuse problem is difficult to address for a number of reasons. A fundamental reason that makes mitigation of the problem difficult relates to the level of trust legitimate users possess inside the organization. The trust factor makes it difficult to detect threats originating from the actions and credentials of individual users. An equally important difficulty in the process of mitigating insider IT threats is based on the variability of the problem. The nature of insider IT misuse varies amongst organizations. Hence, the problem of expressing what constitutes a threat, as well as the process of detecting and predicting it, are non-trivial tasks that add to the multi-factorial nature of insider IT misuse. This thesis is concerned with the process of systematizing the specification of insider threats, focusing on their system-level detection and prediction. The design of suitable user audit mechanisms and semantics forms a Domain Specific Language to detect and predict insider misuse incidents. As a result, the thesis proposes in detail ways to construct standardized descriptions (signatures) of insider threat incidents, as a means of aiding researchers and IT system experts in mitigating the problem of insider IT misuse. The produced audit engine (LUARM – Logging User Actions in Relational Mode) and the Insider Threat Prediction and Specification Language (ITPSL) are two utilities that can be added to the IT insider misuse mitigation arsenal. LUARM is a novel audit engine designed specifically to address the needs of monitoring insider actions. These needs cannot be met by traditional open source audit utilities. ITPSL is an XML-based markup that can standardize the description of incidents and threats and thus make use of the LUARM audit data. Its novelty lies in the fact that it can be used to detect as well as predict instances of threats, a task that has not been achieved to date by a domain specific language addressing threats. The research project evaluated the produced language using a cyber-misuse experiment approach derived from real-world misuse incident data. The results of the experiment showed that ITPSL and its associated audit engine LUARM provide a good foundation for insider threat specification and prediction. Some language deficiencies relate to the fact that the insider threat specification process requires a good knowledge of the software applications used in a computer system. As the language is easily expandable, future developments to improve the language in this direction are suggested.
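ITPSL's XML vocabulary is defined in the thesis itself and is not reproduced here; purely to illustrate the general idea of matching standardized threat signatures against user audit records, a sketch might express a signature as a set of predicates over audit fields. The field names and the example signature below are hypothetical and are not ITPSL syntax or LUARM's schema.

```python
# Hypothetical audit records of the kind a user-action logger might emit
# (field names are illustrative, not LUARM's actual schema).
audit_log = [
    {"user": "jdoe", "action": "file_copy", "target": "/srv/payroll/salaries.db", "device": "usb"},
    {"user": "jdoe", "action": "login", "target": "hr-server", "hour": 23},
]

# A "signature" is a named list of predicates; the incident fires if any record satisfies all of them.
signatures = {
    "sensitive-data-to-removable-media": [
        lambda r: r.get("action") == "file_copy",
        lambda r: r.get("device") == "usb",
        lambda r: "payroll" in r.get("target", ""),
    ],
}

def matches(record, predicates):
    """True when a single audit record satisfies every predicate of a signature."""
    return all(p(record) for p in predicates)

def detect(log, sigs):
    """Return the names of signatures matched by at least one audit record."""
    return [name for name, preds in sigs.items()
            if any(matches(rec, preds) for rec in log)]

print(detect(audit_log, signatures))  # ['sensitive-data-to-removable-media']
```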
169

Development of Peer Instruction Material for a Cybersecurity Curriculum

Johnson, William 19 May 2017 (has links)
Cybersecurity classes focus on building practical skills alongside the development of the open mindset that is essential to tackle the dynamic cybersecurity landscape. Unfortunately, traditional lecture-style teaching is insufficient for this task. Peer instruction is a non-traditional, active learning approach that has proven to be effective in computer science courses. The challenge in adopting peer instruction is the development of conceptual questions. This thesis presents a methodology for developing peer instruction questions for cybersecurity courses, consisting of four stages: concept identification, concept trigger, question presentation, and development. The thesis analyzes 279 questions developed over two years for three cybersecurity courses: introduction to computer security, network penetration testing, and introduction to computer forensics. Additionally, it discusses examples of peer instruction questions in terms of the methodology. Finally, it summarizes the usage of a workshop for testing a selection of peer instruction questions as well as gathering data outside of normal courses.
170

SQLite Carving och Analys : En jämförelse av metoder / SQLite Carving and Analysis : A Comparison of Methods

Johansson, Marcus January 2016 (has links)
SQLite files are used by several different programs to store important information, information that can be valuable in forensic investigations, so the ability to recover SQLite files is a growing concern. The problem with recovering SQLite files is that, unlike other files, SQLite files have no defined end or marker showing where the file ends. This work presents a new method for determining the end of SQLite files and compares it against the method most commonly used today to recover SQLite files. To compare these methods, two programs were developed, stpextr and blkextr. Blkextr implements the method developed in this work. Stpextr proved to be faster and to use less working memory than blkextr, but in certain situations information is lost when stpextr is run, unlike with blkextr.
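Neither stpextr nor blkextr is documented in this abstract, so the sketch below is not a reconstruction of either tool; it only illustrates one well-known way to estimate where an SQLite file ends, by reading the page size and in-header page count from the 100-byte database header, which is the boundary problem the thesis addresses.

```python
import struct

MAGIC = b"SQLite format 3\x00"

def sqlite_length_from_header(data):
    """
    Estimate the length in bytes of an SQLite database that starts at data[0],
    using the page size (offset 16) and the in-header page count (offset 28).
    Returns None if the header is absent or the page count cannot be trusted.
    """
    if len(data) < 100 or not data.startswith(MAGIC):
        return None
    (page_size,) = struct.unpack(">H", data[16:18])
    if page_size == 1:              # per the file format, the value 1 encodes a 65536-byte page
        page_size = 65536
    (change_counter,) = struct.unpack(">I", data[24:28])
    (page_count,) = struct.unpack(">I", data[28:32])
    (valid_for,) = struct.unpack(">I", data[92:96])
    if page_count == 0 or change_counter != valid_for:
        return None                 # in-header page count not reliable; fall back to other carving cues
    return page_size * page_count

# Example: length = sqlite_length_from_header(open("carved.bin", "rb").read(100))
```

The in-header page count cannot always be trusted (it is only valid when the change counter matches the version-valid-for field), which is one reason dedicated carving methods such as those compared in the thesis are needed.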
