
Forensic Insights: Analyzing and Visualizing Fitbit Cloud Data

Poorvi Umesh Hegde (17635896) 15 December 2023 (has links)
Wearable devices are ubiquitous: there are over 1.1 billion wearable devices on the market today [1], and the market is projected to grow at 14.6% annually through 2030 [2]. These devices collect and store a large amount of data [3], much of which is stored in the cloud. For many years, law enforcement organizations have encountered cases that involve a wearable device in some capacity, and there are documented examples of wearable devices aiding crime and insurance-fraud investigations [4],[5],[6],[7],[8]. The article [4] analyzes 5 case studies and 57 news articles and shows how framing wearables in the context of the crimes helped those cases. However, there is still not enough awareness and understanding among law enforcement agencies of how to leverage the data collected by these devices to solve crimes. Most fitness trackers and smartwatches on the market today offer broadly similar functionality, tracking an individual's fitness-related activities, heart rate, sleep, temperature, and stress [9]. One of the major players in the smartwatch space is Fitbit. Fitbit synchronizes the data it collects directly to the Fitbit Cloud [10]. It provides an Android app and a web dashboard for users to access some of this data, but not all of it. Application developers, on the other hand, can use Fitbit APIs to access user data, and law enforcement agencies can leverage the same APIs to aid digital forensic investigations. Previous studies have developed tools that use the Fitbit Web APIs [11],[12],[13], but for other purposes, not forensic research. There are a few studies on using fitness-tracker data for forensic investigations [14],[15], but very few have used the Fitbit developer APIs [16].

Thus this study aims to propose a proof-of-concept platform that law enforcement agencies can use to access and view the data stored on the Fitbit Cloud about a person of interest. The results display data in 12 categories - activity, body, sleep, breathing, devices, friends, nutrition, heart rate variability, ECG, temperature, oxygen level, and cardio data - in a tabular format that is easily viewable and searchable. This data can be further used for various analyses. The tool developed is open source and well documented, so anyone can reproduce the process.
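The API-based access described above can be sketched as follows. This is an illustrative outline, not the thesis's actual tool: the endpoint paths follow public Fitbit Web API conventions for a few of the twelve categories, the OAuth 2.0 bearer token is assumed to have been obtained separately, and `flatten` only illustrates the kind of tabular rows such a platform might display.

```python
# Hedged sketch: building Fitbit Web API requests for a subset of the data
# categories. Endpoint templates follow the public Fitbit Web API docs;
# OAuth 2.0 authorization (obtaining the token) is omitted.
BASE = "https://api.fitbit.com"

# Category -> endpoint template; {date} is filled in per request.
ENDPOINTS = {
    "sleep": "/1.2/user/-/sleep/date/{date}.json",
    "heart_rate_variability": "/1/user/-/hrv/date/{date}.json",
    "breathing": "/1/user/-/br/date/{date}.json",
    "temperature": "/1/user/-/temp/skin/date/{date}.json",
    "oxygen_level": "/1/user/-/spo2/date/{date}.json",
    "devices": "/1/user/-/devices.json",
}

def build_request(category: str, date: str, token: str) -> tuple[str, dict]:
    """Return the URL and Authorization header for one category on one date."""
    url = BASE + ENDPOINTS[category].format(date=date)
    return url, {"Authorization": f"Bearer {token}"}

def flatten(category: str, payload: dict) -> list[dict]:
    """Flatten a JSON payload into rows suitable for a searchable table."""
    return [{"category": category, **entry} for entry in payload.get(category, [])]
```

In practice each URL would be fetched with an HTTP client and the JSON response passed through `flatten` before tabular display.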

Comparison of Persistence of Deleted Files on Different File Systems and Disk Types

Chinmay Amul Chhajed (18403644) 19 April 2024 (has links)
The presence of digital devices in various settings, from workplaces to personal spaces, necessitates reliable and secure data storage solutions. These devices store data on non-volatile media like Solid State Drives (SSDs) and Hard Disk Drives (HDDs), ensuring data preservation even after power loss. Files, the fundamental units of data storage, are created, modified, and deleted through user activities such as application installations or file management. File systems, acting as the backbone of the system, manage these files on storage devices.

This research explores how three key factors influence the persistence of deleted files: (1) different operating systems running various file system types (ext4, NTFS, FAT, etc.), (2) different disk types (SSD and HDD), and (3) common user activities (system shutdowns, reboots, web browsing, downloads, etc.).

This research aims to fill a gap in understanding by examining how these factors influence how quickly new data overwrites deleted files. This is especially important for digital forensics, where investigators need confidence that they can find all the evidence on a device. The research focuses on how operating systems handle deleted files and how everyday activities affect the chances of recovering them, which can ultimately improve data security and make digital forensics more reliable.

Digital incursion: Breaching the android lock screen and liberating data

Oskarsson, Tim January 2021 (has links)
Android is the most used operating system in the world; because of this, the probability of an Android device being acquired in an investigation is high. To extract data from an Android device you first need to gain access to it, and mechanisms like full-system encryption can make this very difficult. In this paper, the advantages and disadvantages of different methods of gaining access to, and extracting data from, an Android device with an unlocked bootloader are discussed. Many users unlock the bootloader of their Android device to gain a much greater level of control over it, while Android forensics on a device without an unlocked bootloader is very limited. It is therefore interesting to study how data can be extracted from an Android device that does not have this limitation. A literature study of previous related research is done to gather methods for gaining access and extracting data. The collected methods are then tested in experiments on a OnePlus 3 running Android 9 and a OnePlus 8 running Android 11. The research found that it is possible to perform a brute-force attack within a reasonable time against a PIN of length 4-5 or a pattern of length 4-6 on the lock screen of an Android device, and that the attack can be optimised into a dictionary attack by using public lists of the most used PIN codes. A list of all possible pattern combinations, sorted and optimised for a dictionary attack, is generated based on statistics of pattern starting location and length. A proof of concept is made by creating a copy of a fingerprint with common cheap materials to gain access through the fingerprint sensor. A device image was extracted by using a root shell through Android Debug Bridge and common command-line tools. Memory forensics was performed using Frida and was able to extract usernames, passwords, and emails from Google Chrome and Gmail.

The custom recovery image TWRP was used to boot the device and gain root access, and a full device image was extracted with common command-line tools. The results of the TWRP backup feature are also analysed. The results of the data extraction are then analysed manually and with Autopsy.
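The dictionary-then-brute-force ordering described above can be sketched as follows. The common-PIN list is a placeholder sample (real attacks use published frequency lists), and actually submitting each guess to the lock screen, e.g. via `adb shell input text <PIN>` followed by an Enter keyevent, is deliberately omitted.

```python
from itertools import product

# Placeholder sample; a real dictionary attack would load a published
# frequency-ordered list of the most used PIN codes.
COMMON_PINS = ["1234", "0000", "1111", "1212", "7777"]

def candidate_pins(length: int = 4):
    """Yield PIN guesses in attack order: common PINs of the right length
    first (dictionary phase), then every remaining combination in numeric
    order (exhaustive brute-force phase), with no duplicates."""
    common = [p for p in COMMON_PINS if len(p) == length]
    yield from common
    seen = set(common)
    for digits in product("0123456789", repeat=length):
        pin = "".join(digits)
        if pin not in seen:
            yield pin
```

The same front-load-by-frequency idea applies to lock patterns: enumerate all valid patterns of a given length, then sort them by published statistics on starting corner and length before submitting guesses.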

Towards Automation in Digital Investigations : Seeking Efficiency in Digital Forensics in Mobile and Cloud Environments

Homem, Irvin January 2016 (has links)
Cybercrime and related malicious activity in our increasingly digital world have become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand, and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with challenges, the most prominent of which include the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting specimen, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities, and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not helped by the fact that digital forensics today still involves manual, time-consuming tasks within the processes of identifying evidence, performing evidence acquisition, and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, industry-standard tools are largely evidence-oriented, have limited support for evidence integration, and only automate certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reduced time and human labour, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments.

Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed to further automate it by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture through harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition helps to improve efficiency (the time and effort involved) in digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition, and that a multi-TCP-connection paradigm is required. The automated integration, correlation, and reasoning over multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation. Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
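The multi-connection acquisition paradigm the experiments favour can be sketched in miniature, with the remote link abstracted as a `read_chunk(offset, length)` callable so the structure is visible without any network code. This is an illustration of the idea, not the LEIA implementation: the evidence image is split into ranges and several workers fetch ranges concurrently, each standing in for its own TCP connection.

```python
import threading

def acquire_parallel(read_chunk, total_size: int, chunk_size: int,
                     workers: int = 4) -> bytes:
    """Fetch an evidence image as fixed-size ranges pulled concurrently by
    several workers (one per notional connection), then reassemble in order."""
    n_chunks = -(-total_size // chunk_size)  # ceiling division
    parts = [b""] * n_chunks
    lock = threading.Lock()
    next_chunk = [0]  # shared cursor over the chunk queue

    def worker():
        while True:
            with lock:                 # claim the next unfetched range
                i = next_chunk[0]
                if i >= n_chunks:
                    return
                next_chunk[0] += 1
            offset = i * chunk_size
            length = min(chunk_size, total_size - offset)
            parts[i] = read_chunk(offset, length)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return b"".join(parts)
```

With `read_chunk` backed by real sockets, a failed range can be retried on another connection, which is where the reliability gain over a single long-lived TCP stream comes from.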

Exploring IoT Security Threats and Forensic Challenges: A Literature Review and Survey Study

Al Allaf, Abdulrahman, Totonji, Waseem January 2023 (has links)
Internet of Things (IoT) devices have proliferated rapidly in recent years, revolutionizing many industries, including healthcare, manufacturing, and transportation, and bringing benefits to both individuals and industries. However, this growth in IoT device usage has exposed IoT ecosystems to numerous security threats and digital forensic challenges. This thesis investigates the most common IoT security threats and attacks, students' awareness of them and of mitigation strategies, and the key challenges associated with IoT forensic investigations. A mixed-method approach is adopted, combining a literature review with a survey study. The survey assesses students' knowledge of IoT security threats, mitigation techniques, and perceptions of the most effective ways to enhance IoT security; it also highlights the importance of user training and awareness in mitigating IoT threats, alongside strategies such as stronger regulations and improved device security by manufacturers. The literature review provides a comprehensive overview of the most common IoT security threats and attacks, such as malware, malicious code injection, replay attacks, Man-in-the-Middle (MITM) attacks, botnets, and Distributed Denial of Service (DDoS) attacks. Mitigation techniques for these threats are reviewed, and real-world incidents, such as the Mirai botnet, the St. Jude Medical implantable cardiac device hack, and the Verkada hack, are examined to understand the consequences of these attacks. Moreover, this work covers the definition and process of digital and IoT forensics, the importance of IoT forensics, and the different data sources in IoT ecosystems. The key challenges associated with IoT forensics, and how they impact the effectiveness of digital investigations in the IoT ecosystem, are examined in detail.
Overall, the results of this work contribute to ongoing research to improve IoT device security, highlight the importance of increased awareness and user training, and address the challenges associated with IoT forensic investigations.

Providing Context to the Clues: Recovery and Reliability of Location Data from Android Devices

Bell, Connie 01 January 2015 (has links)
Mobile device data continues to increase in significance in both civil and criminal investigations. Location data is often of particular interest. To date, research has established that the devices are location aware, incorporate a variety of resources to obtain location information, and cache the information in various ways. However, a review of the existing research suggests varying degrees of reliability of any such recovered location data. In an effort to clarify the issue, this project offers case studies of multiple Android mobile devices utilized in controlled conditions with known settings and applications in documented locations. The study uses data recovered from test devices to corroborate previously identified accuracy trends noted in research involving live-tracked devices, and it further offers detailed analysis strategies for the recovery of location data from devices themselves. A methodology for reviewing device data for possible artifacts that may allow an examiner to evaluate location data reliability is also presented. This paper also addresses emerging trends in device security and cloud storage, which may have significant implications for future mobile device location data recovery and analysis. Discussion of recovered cloud data introduces a distinct and potentially significant resource for investigators, and the paper addresses the cloud resources' advantages and limitations.
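One concrete artifact behind this kind of recovery can be sketched under stated assumptions: older Android framework releases cached recent cell and Wi-Fi location fixes in small binary files (`cache.cell`, `cache.wifi`). The field layout below follows the big-endian record format documented by community tooling for those releases; treat it as an assumption to be validated against the device image at hand, not a specification for all Android versions.

```python
import struct

def parse_location_cache(blob: bytes) -> list:
    """Parse a legacy Android location-cache blob (assumed layout):
    header = version:int16, count:int16; each record = key_len:int16,
    key (ASCII cell/AP identifier), accuracy:int32, confidence:int32,
    latitude:float64, longitude:float64, fix time (ms):int64; big-endian."""
    version, n = struct.unpack_from(">hh", blob, 0)
    off, records = 4, []
    for _ in range(n):
        (key_len,) = struct.unpack_from(">h", blob, off)
        off += 2
        key = blob[off:off + key_len].decode("ascii")
        off += key_len
        accuracy, confidence, lat, lon, t = struct.unpack_from(">iiddq", blob, off)
        off += struct.calcsize(">iiddq")
        records.append({"key": key, "accuracy": accuracy,
                        "confidence": confidence, "lat": lat,
                        "lon": lon, "time_ms": t})
    return records
```

For an examiner, each recovered record's accuracy and confidence fields matter as much as the coordinates: they are exactly the reliability indicators this study argues must be evaluated before location data is given evidentiary weight.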

Authorship Attribution of Source Code

Tennyson, Matthew Francis 01 January 2013 (has links)
Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is presented, focusing on the two state-of-the-art methods: SCAP and Burrows. The primary goal was to develop a new method for authorship attribution of source code that is even more effective than the current state-of-the-art methods. Toward that end, a comparative study of the methods was performed in order to determine their relative effectiveness and establish a baseline. A suitable set of test data was also established, in a manner intended to support the vision of a universal data set suitable for standard use in authorship attribution experiments. The data set chosen consists of 7,231 open-source and textbook programs written in C++ and Java by thirty unique authors. The baseline study showed both the Burrows and SCAP methods were indeed state-of-the-art: the Burrows method correctly attributed 89% of all documents, while the SCAP method correctly attributed 95%. The Burrows method inherently anonymizes the data by stripping all comments and string literals, while the SCAP method does not, so the methods were also compared using anonymized data. The SCAP method correctly attributed 91% of the anonymized documents, compared to 89% by Burrows. The Burrows method was improved in two ways: the set of features used to represent programs was updated, and the similarity metric was updated. As a result, the improved method successfully attributed nearly 94% of all documents, compared to 89% in the baseline. The SCAP method was also improved in two ways: the technique used to anonymize documents was changed, and the amount of information retained in the source code author profiles was determined differently.
As a result, the improved method successfully attributed 97% of anonymized documents and 98% of non-anonymized documents, compared to 91% and 95% that were attributed in the baseline, respectively. The two improved methods were used to create an ensemble method based on the Bayes optimal classifier. The ensemble method successfully attributed nearly 99% of all documents in the data set.
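The SCAP approach discussed above can be sketched in miniature: an author profile is the set of the most frequent byte-level n-grams of that author's code, and attribution picks the author whose profile shares the most n-grams with the unknown document (simplified profile intersection). The `n` and profile-size values here are illustrative defaults, not the tuned parameters from the study.

```python
from collections import Counter

def profile(source: str, n: int = 6, top: int = 1500) -> set:
    """Build an SCAP-style author profile: the `top` most frequent
    character-level n-grams of the author's concatenated source code."""
    grams = Counter(source[i:i + n] for i in range(len(source) - n + 1))
    return {g for g, _ in grams.most_common(top)}

def attribute(unknown: str, author_profiles: dict) -> str:
    """Attribute a document to the author whose profile has the largest
    intersection with the document's own n-gram profile."""
    doc = profile(unknown)
    return max(author_profiles, key=lambda a: len(author_profiles[a] & doc))
```

On a toy corpus this already separates authors with distinct coding idioms; the real evaluation, of course, runs over thousands of programs per the data set described above.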

Database Forensics in the Service of Information Accountability

Pavlou, Kyriacos 04 November 2011 (has links)
Poster won first place in the graduate division of Physical Sciences, Mathematics, Computer Engineering and Computer Science at the GPSC Student Showcase 2011.

Regulations and societal expectations have recently expressed the need to mediate access to valuable databases, even by insiders. At one end of the spectrum is the approach of restricting access to information; at the other, that of information accountability. The focus of the proposed work is effecting information accountability for data stored in databases. One way to ensure appropriate use, and thus end-to-end accountability, of such information is tamper detection in databases via a continuous assurance technology based on cryptographic hashing. In our current research we are working to show how to develop the approaches and ideas necessary to support accountability in high-performance databases. This will include the design of a reference architecture for information accountability and several of its variants, the development of forensic analysis algorithms and their cost model, and a systematic formulation of forensic analysis for determining when the tampering occurred and what data were tampered with. Finally, for privacy, we would like to create mechanisms for allowing, as well as (temporarily) preventing, the physical deletion of records in a monitored database. To evaluate our ideas we will design and implement an integrated tamper detection and forensic analysis system. This work will show that information accountability is a viable alternative to information restriction for ensuring the correct storage, use, and maintenance of databases.
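The core hashing idea can be illustrated minimally as a hash chain over records: each link hashes a record together with the previous link, so altering any record invalidates every later link, and comparing against previously notarized links localizes the earliest tampering. This is only the kernel of the idea; the proposed system's architecture, notarization scheme, and forensic-analysis algorithms are considerably more sophisticated.

```python
import hashlib

def chain_hashes(records: list) -> list:
    """Build a cryptographic hash chain over database records: link i is
    SHA-256 of (link i-1 || record i), so each link commits to all history."""
    links, prev = [], b""
    for rec in records:
        h = hashlib.sha256(prev + repr(rec).encode()).digest()
        links.append(h)
        prev = h
    return links

def first_tampered(records: list, trusted_links: list):
    """Recompute the chain and return the index of the first record whose
    link no longer matches the trusted (notarized) value, or None if intact."""
    for i, link in enumerate(chain_hashes(records)):
        if link != trusted_links[i]:
            return i
    return None
```

Periodically notarizing the chain's head with a third party turns this into continuous assurance: any later mismatch both proves tampering occurred and bounds when and where it happened.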

Naming the Dead: Identification and Ambiguity Along the U.S.-Mexico Border

Reineke, Robin Christine January 2016 (has links)
Since the beginning of the 21st century, the deaths of migrants have become a regular occurrence in southern Arizona where an average of 170 bodies are recovered from the desert each year. This dissertation examines the causes and effects of death and disappearance along the U.S.-Mexico border, seeking to address the contradiction present in the fact that thousands of people have died or disappeared in one of the world’s most heavily surveilled landscapes. It interrogates the ways in which the dead, the missing, and their families are simultaneously erased and exposed in a biopolitical process that has powerful implications beyond the space of the borderlands. The observations for this dissertation are drawn from nearly a decade of both ethnographic research and applied humanitarian assistance in the field of forensic human identification, primarily at the Pima County Office of the Medical Examiner, in Tucson, Arizona. Although the majority of migrant fatalities have been determined by the medical examiner to be accidental, resulting from exposure to the elements or unknown causes, a historical analysis reveals the violent nature of these deaths and disappearances, which are a structured result of U.S. border and immigration policies. From their homes to their destinations, migrants in the Americas face a particular kind of structural violence and social invisibility that is revealed when they disappear at the border. This disappearance is then made more thorough by the structured lack of access for families of the missing to services to assist them in their search. Practices of care, whether occurring within families of the missing and dead, during the desert crossing itself, or in the forensic work to identify the dead, powerfully contest the invisibility and erasure experienced by migrants in the Americas today.

Detecting Objective-C Malware through Memory Forensics

Case, Andrew 13 May 2016 (has links)
Memory forensics is increasingly used to detect and analyze sophisticated malware. In the last decade, major advances in memory forensics have made analysis of kernel-level malware straightforward. Kernel-level malware has been favored by attackers because it essentially provides complete control over a machine. This has changed recently, as operating system vendors now routinely enforce driver signing, and strategies for protecting kernel data, such as Patch Guard, have made userland attacks much more attractive to malware authors. In this thesis, new techniques for detecting userland malware written in Objective-C on Mac OS X are presented. As the thesis illustrates, Objective-C provides a rich set of APIs that malware uses to manipulate and steal data and to perform other malicious activities. The novel memory forensics techniques presented in this thesis deeply examine the state of the Objective-C runtime, identifying a number of suspicious activities, from keystroke logging to pointer swizzling.
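One runtime check of the kind described above can be modelled abstractly. This is an illustrative model, not memory-forensics plugin code: it assumes the analyst has already extracted each Objective-C class's method table (selector to IMP address) and the address ranges of legitimately loaded binary images from the memory capture. A method implementation pointing outside every known image is a classic sign of injected or swizzled code.

```python
def find_suspicious_imps(classes: dict, image_ranges: list) -> list:
    """Flag Objective-C method implementations (IMPs) whose pointers fall
    outside every legitimately mapped binary image.

    classes: {class_name: {selector: imp_address}}
    image_ranges: [(start, end)] address ranges of loaded executables/libraries
    Returns [(class_name, selector, hex_imp)] for each suspicious IMP."""
    def in_known_image(addr: int) -> bool:
        return any(start <= addr < end for start, end in image_ranges)
    return [(cls, sel, hex(imp))
            for cls, methods in classes.items()
            for sel, imp in methods.items()
            if not in_known_image(imp)]
```

In a real capture the method tables come from walking the runtime's class list in memory; the same walk also exposes swizzled pointers, where a selector's IMP lands in a different image than the class's own binary.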
