  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Android Memory Capture and Applications for Security and Privacy

Sylve, Joseph T 17 December 2011
The Android operating system is quickly becoming the most popular platform for mobile devices. As Android’s use increases, so does the need for both forensic and privacy tools designed for the platform. This thesis presents the first methodology and toolset for acquiring full physical memory images from Android devices, a proposed methodology for forensically securing both volatile and non-volatile storage, and details of a vulnerability discovered by the author that allows the bypass of the Android security model and enables applications to acquire arbitrary permissions.
42

Data Exploration Interface for Digital Forensics

Dontula, Varun 17 December 2011
The fast capacity growth of cheap storage devices presents an ever-growing problem of scale for digital forensic investigations. One aspect of this scale problem is the need for new approaches to visually presenting and analyzing large amounts of data. The current generation of tools universally employs three basic GUI components—trees, tables, and viewers—to present all relevant information. This approach is not scalable, as increasing the size of the input data leads to a proportional increase in the amount of data presented to the analyst. We present an alternative approach that leverages data visualization techniques to provide a more intuitive interface for exploring the forensic target. We use tree visualization techniques to give the analyst both a high-level view of the file system and an efficient means to drill down into the details. Further, we provide the means to search for keywords and filter the data by time period.
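The high-level-view-plus-drill-down idea described in the abstract can be sketched with a simple per-directory size aggregation. The file paths, sizes, and function names below are invented for illustration and are not from the thesis:

```python
# Sketch: aggregate per-directory sizes so an analyst sees a compact
# high-level view first, then drills into one subtree on demand.
# The file list below is illustrative, not thesis data.
from collections import defaultdict

def build_tree(files):
    """files: dict of path -> size. Returns dict of directory -> total size."""
    totals = defaultdict(int)
    for path, size in files.items():
        parts = path.strip("/").split("/")
        # Credit the file's size to every ancestor directory.
        for i in range(1, len(parts)):
            totals["/" + "/".join(parts[:i])] += size
    return dict(totals)

def drill_down(totals, prefix):
    """Return the immediate child directories of `prefix`, largest first."""
    base = prefix.rstrip("/")
    depth = base.count("/") + 1
    children = {d: s for d, s in totals.items()
                if d.startswith(base + "/") and d.count("/") == depth}
    return sorted(children.items(), key=lambda kv: -kv[1])

files = {
    "/evidence/docs/report.pdf": 120,
    "/evidence/docs/notes.txt": 30,
    "/evidence/media/video.mp4": 900,
}
totals = build_tree(files)
```

Sorting children by size mirrors the treemap intuition: the analyst's attention goes first to the subtrees that dominate the target.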
43

Creating Volatility Support for FreeBSD

Bond, Elyse 11 August 2015
Digital forensics is the investigation and recovery of data from digital hardware. The field has grown in recent years to include support for operating systems such as Windows, Linux, and Mac OS X. However, little to no support has been provided for less well-known systems such as the FreeBSD operating system. The project presented in this paper focuses on creating foundational support for FreeBSD via Volatility, a leading forensic tool in the digital forensic community. The FreeBSD kernel and source code were studied to understand how to recover various data from analysis of a given system’s memory image. This paper focuses on the base Volatility support that was implemented, as well as the additional plugins created to recover desired data, including but not limited to the retrieval of a system’s process list and mounted file systems.
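Recovering a process list from a memory image, as the thesis describes, amounts to locating and walking a kernel's linked list of process structures. The 16-byte record layout below is invented purely to illustrate the walk; it bears no relation to FreeBSD's actual struct proc or to Volatility's real plugin API:

```python
# Sketch of the underlying technique: follow a linked list of fixed-layout
# process records through a raw memory image. The record layout
# (pid, next-record offset, 8-byte name) is made up for illustration.
import struct

REC = struct.Struct("<II8s")  # pid, offset of next record, padded name

def walk_proc_list(image, head_offset):
    """Follow next-offsets through `image` until a null (0) link."""
    procs, off = [], head_offset
    while off != 0:
        pid, nxt, name = REC.unpack_from(image, off)
        procs.append((pid, name.rstrip(b"\x00").decode()))
        off = nxt
    return procs

# Build a tiny fake "memory image" containing two linked records.
image = bytearray(64)
REC.pack_into(image, 8, 1, 24, b"init")
REC.pack_into(image, 24, 42, 0, b"sshd")
```

A real plugin must first find the list head (e.g. from kernel symbols) and translate virtual to physical addresses; both steps are elided here.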
44

A Digital Tool to Improve the Efficiency of IT Forensic Investigations

Hansen, Tone January 2019
The IT forensic process causing bottlenecks in investigations is an identified issue with multiple underlying causes, one of the main ones being a lack of expertise among those responsible for ordering IT forensic investigations. The focus of the study is to create and evaluate a potential solution to this problem, aiming to answer research questions related to a suitable architecture, structure, and design of a digital tool that would assist individuals in creating IT forensic orders. This work evaluates concepts for such a digital tool using a grounded theory approach, in which a series of test sessions, together with the answers from a survey, were examined and analyzed in an iterative process. A low-fidelity prototype was used in the process. The resulting conclusion of the study is a set of concepts, ideas, and principles for a digital tool that would aid the IT forensic ordering process, as well as improve the efficiency of the IT forensic process itself. Future work could involve developing the concept further into a finished product, or using it to improve existing systems and tools, raising the efficiency and quality of the IT forensic process.
45

The Comprehensive Digital Forensic Investigation Process Model (CDFIPM) for digital forensic practice

Montasari, Reza January 2016
No description available.
46

Semantic-aware Stealthy Control Logic Infection Attack

Kalle, Sushma 06 August 2018
In this thesis work we present CLIK, a new, automated, remote attack on the control logic of a programmable logic controller (PLC) in industrial control systems. The CLIK attack automatically modifies the control logic running in a remote target PLC to disrupt a physical process. We implement the CLIK attack on a real PLC. The attack is initiated by subverting the security measures that protect the control logic in a PLC. We found a critical (zero-day) vulnerability that allows the attacker to overwrite the password hash in the PLC during the authentication process. Next, CLIK retrieves and decompiles the original logic, injects a malicious logic into it, and then transfers the infected logic back to the PLC. To hide the infection, we propose a virtual PLC that engages the software: the virtual PLC intercepts the software's requests for the control logic and responds with the original (uninfected) control logic.
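The hiding step can be sketched as an interposer that answers logic-read requests with a cached clean copy. The request names, message shapes, and logic bytes below are invented for illustration; real PLC protocols differ:

```python
# Sketch: a "virtual PLC" interposes on logic-read requests, so monitoring
# software sees the original program while the device runs infected logic.
# Request names and logic contents are invented for illustration.

class VirtualPLC:
    def __init__(self, original_logic, infected_logic):
        self.original = original_logic   # cached clean copy shown to tools
        self.infected = infected_logic   # what actually runs on the device

    def handle(self, request):
        if request == "READ_LOGIC":
            # Hide the infection: answer the software with the clean copy.
            return self.original
        if request == "EXECUTE":
            # The physical process is driven by the infected logic.
            return self.infected
        raise ValueError("unknown request: " + request)

vplc = VirtualPLC(original_logic=b"LD X0\nOUT Y0",
                  infected_logic=b"LD X0\nOUT Y1")
```

The point of the sketch is the asymmetry: the same device state yields different answers depending on who is asking, which is why read-back verification alone cannot detect the infection.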
47

Risk, Privacy, and Security in Computer Networks

Årnes, Andre January 2006
With an increasingly digitally connected society comes complexity, uncertainty, and risk. Network monitoring, incident management, and digital forensics are of increasing importance with the escalation of cybercrime and other network-supported serious crimes. New laws and regulations governing electronic communications, cybercrime, and data retention are continuously being proposed, requiring new methods and tools.

This thesis introduces a novel approach to real-time network risk assessment based on hidden Markov models, which represent the likelihood of transitions between security states. The method measures risk as a composition over individual hosts, providing a precise, fine-grained model for assessing risk and providing decision support for incident response. The approach has been integrated with an existing framework for distributed, large-scale intrusion detection, and the results of the risk assessment are applied to prioritize the alerts produced by the intrusion detection sensors. Using this implementation, the approach is evaluated on both simulated and real-world data.

Network monitoring can encompass large networks and process enormous amounts of data, and the practice and its ubiquity can represent a great threat to the privacy and confidentiality of network users. Existing measures for anonymization and pseudonymization are analyzed with respect to the trade-off between performing meaningful data analysis and protecting the identities of the users. The results demonstrate that most existing solutions for pseudonymization are vulnerable to a range of attacks. As a solution, some remedies for strengthening the schemes are proposed, and a method for unlinkable transaction pseudonyms is considered.

Finally, a novel method for performing digital forensic reconstructions in a virtual security testbed is proposed. Based on a hypothesis of the security incident in question, the testbed is configured with the appropriate operating systems, services, and exploits. Attacks are formulated as event chains and replayed on the testbed. The effects of each event are analyzed in order to support or refute the hypothesis. The purpose of the approach is to facilitate reconstruction experiments in digital forensics. Two examples are given to demonstrate the approach: one overview example based on the Trojan defense and one detailed example of a multi-step attack. Although a reconstruction can neither prove a hypothesis with absolute certainty nor exclude the correctness of other hypotheses, a standardized environment combined with event reconstruction and testing can lend credibility to an investigation and can be a valuable asset in court.
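The hidden-Markov-model risk idea summarized in this abstract can be sketched numerically. The security states, transition probabilities, and cost values below are illustrative assumptions, not figures from the thesis:

```python
# Sketch of HMM-style risk assessment for one host: a security-state
# distribution is propagated through a transition matrix, and risk is the
# expected cost of the current distribution. All numbers are illustrative.

STATES = ["good", "attacked", "compromised"]
COST = [0.0, 5.0, 20.0]        # assumed cost of being in each state
# Row i: probabilities of moving from state i to each state in one step.
TRANS = [
    [0.95, 0.04, 0.01],
    [0.30, 0.50, 0.20],
    [0.05, 0.10, 0.85],
]

def step(dist):
    """One transition update of the state distribution."""
    return [sum(dist[i] * TRANS[i][j] for i in range(3)) for j in range(3)]

def risk(dist):
    """Expected cost given the current state distribution."""
    return sum(p * c for p, c in zip(dist, COST))

dist = [1.0, 0.0, 0.0]         # host starts in the "good" state
dist = step(dist)
```

Network-level risk would then compose these per-host values (for example by summation), which is what allows alerts from the riskiest hosts to be prioritized.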
49

Accelerating digital forensic searching through GPGPU parallel processing techniques

Bayne, Ethan January 2017
Background: String searching within a large corpus of data is a critical component of digital forensic (DF) analysis techniques such as file carving. The continuing increase in the capacity of consumer storage devices requires similar improvements to the performance of the string searching techniques employed by DF tools used to analyse forensic data. As string searching is a trivially parallelisable problem, general-purpose graphics processing unit (GPGPU) approaches are a natural fit. To date, only some GPGPU programming research has been transferred to the field of DF, and that work used a closed-source GPGPU framework, the Compute Unified Device Architecture (CUDA). These earlier studies found that the local storage devices from which forensic data are read present an insurmountable performance bottleneck. Aim: This research hypothesises that modern storage devices no longer present a performance bottleneck to the processing techniques currently used in the field, and proposes that an open-standards GPGPU framework, the Open Computing Language (OpenCL), would be better suited to accelerate file carving, with wider compatibility across an array of modern GPGPU hardware. It further hypothesises that a modern multi-string searching algorithm may be better adapted to fulfil the requirements of DF investigation. Methods: This research presents a review of existing research and tools used to perform file carving and acknowledges related work within the field. To test the hypothesis, parallel file carving software was created using C# and OpenCL, employing both a traditional string searching algorithm and a modern multi-string searching algorithm to analyse forensic data. A set of case studies demonstrating and evaluating the potential benefits of these string searching methods on forensic data is given. The research concludes with a final case study that evaluates file carving performance with the best-proposed string searching solution and compares the result with an existing file carving tool, Foremost. Results: The results establish that the parallelised OpenCL and Parallel Failureless Aho-Corasick (PFAC) solution delivers significant processing improvements on both single and multiple GPUs on modern hardware. In comparison to CPU approaches, GPGPU processing models minimised the time required to search for larger numbers of patterns. The results also show that PFAC delivers significant performance increases over the Boyer-Moore (BM) algorithm. The method employed to read data from storage devices was also seen to have a significant effect on the time required to perform string searching and file carving. Conclusions: Empirical testing shows that the proposed string searching method is more efficient than the widely adopted Boyer-Moore algorithms when applied to string searching and file carving. The developed OpenCL GPGPU processing framework was found to be more efficient than CPU counterparts when searching for larger numbers of patterns within data. This research also refutes claims that file carving is limited solely by the performance of the storage device, and presents compelling evidence that performance is bound by the combination of storage device performance and the processing technique employed.
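PFAC is a GPU adaptation of the Aho-Corasick automaton (it drops failure links and launches one search per text offset). The serial automaton it builds on can be sketched in pure Python; the patterns and text below are the classic textbook example, not forensic data:

```python
# Serial Aho-Corasick multi-string search. PFAC, used in the thesis, is a
# GPU variant of this automaton; this CPU sketch shows the core idea only.
from collections import deque

def build(patterns):
    goto, fail, out = [{}], [0], [set()]
    for p in patterns:                     # build the trie of all patterns
        s = 0
        for ch in p:
            if ch not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        out[s].add(p)
    q = deque(goto[0].values())            # BFS to compute failure links
    while q:
        s = q.popleft()
        for ch, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]         # inherit matches ending here
    return goto, fail, out

def search(text, patterns):
    """Return sorted (start_index, pattern) for every match in text."""
    goto, fail, out = build(patterns)
    s, hits = 0, []
    for i, ch in enumerate(text):
        while s and ch not in goto[s]:
            s = fail[s]
        s = goto[s].get(ch, 0)
        for p in out[s]:
            hits.append((i - len(p) + 1, p))
    return sorted(hits)
```

One pass over the text reports every occurrence of every pattern, which is why multi-string automata outperform running Boyer-Moore once per pattern as the pattern set grows.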
50

Collaborative Digital Forensics: Architecture, Mechanisms, and Case Study

January 2011 (has links)
In order to catch the smartest criminals in the world, digital forensics examiners need a means of collaborating and sharing information with each other and with outside experts that is not prohibitively difficult. However, standard operating procedures and the rules of evidence generally disallow the use of currently available collaboration software and techniques, because they do not fully adhere to the dictated procedures for the handling, analysis, and disclosure of items relating to cases. The aim of this work is to conceive and design a framework that provides a completely new architecture that 1) can perform fundamental functions that are common and necessary to forensic analyses, and 2) is structured such that it is possible to include collaboration-facilitating components without changing the way users interact with the system sans collaboration. This framework is called the Collaborative Forensic Framework (CUFF). CUFF is constructed from four main components: Cuff Link, Storage, Web Interface, and Analysis Block. With the Cuff Link acting as a mediator between components, CUFF is flexible in both the method of deployment and the technologies used in implementation. The details of a realization of CUFF are given, which uses a combination of Java, the Google Web Toolkit, Django with Apache for a RESTful web service, and an Ubuntu Enterprise Cloud using Eucalyptus. The functionality of CUFF's components is demonstrated by the integration of an acquisition script designed for Android OS-based mobile devices that use the YAFFS2 file system. While this work has obvious application to examination labs which work under the mandate of judicial or investigative bodies, security officers at any organization would benefit from the improved ability to cooperate in electronic discovery efforts and internal investigations. / Dissertation/Thesis / M.S. Computer Science 2011
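The Cuff Link's mediator role can be sketched as a simple message router: components register with the mediator and address each other only through it, so any one component can be replaced without the others changing. The component names follow the abstract, but the register/send interface below is an invented illustration, not the thesis's actual API:

```python
# Sketch of a mediator in the spirit of the Cuff Link. The interface and
# message shapes are invented for illustration.

class CuffLink:
    def __init__(self):
        self.components = {}

    def register(self, name, handler):
        """Attach a component under a name; handler is a callable."""
        self.components[name] = handler

    def send(self, target, message):
        """Route a message to the named component and return its reply."""
        if target not in self.components:
            raise KeyError("no such component: " + target)
        return self.components[target](message)

link = CuffLink()
store = {}
# A toy Storage component backed by a dict.
link.register("storage",
              lambda msg: store.update({msg["key"]: msg["value"]}) or "stored")
# A toy Analysis component that persists its result via the mediator.
link.register("analysis",
              lambda msg: link.send("storage",
                  {"key": msg["case"], "value": "hash:" + msg["data"]}))
```

Because Analysis never touches Storage directly, the dict-backed store could be swapped for a cloud service without changing the Analysis handler, which is the flexibility the abstract attributes to the Cuff Link.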
