  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

A study of computer forensics from a cross-cultural perspective: Australia and Taiwan

Lin, Yi-Chi January 2008 (has links)
The rise in the number and variety of digital devices has changed all facets of society, from the way we work and communicate to how our lives are recorded. Computers, notebooks, networks, mobile phones, digital cameras and embedded devices are all readily available, and it is difficult to imagine a person who does not own at least some of these. However, with this rise in the use of electronics comes the potential for misuse. Electronic devices have transformed how existing crimes may occur and have also allowed for several new forms of illegal activity. Given the global nature of the Internet, such crimes may take place across multiple jurisdictions and countries. Where the investigation into computer-based crime spans two or more countries, the investigating teams need to understand the legal and cultural differences between them. Whilst the legal differences are written, interpreted and can be made explicit, there is less knowledge of the cultural differences between two countries working in an emerging scientific field. The literature not only shows that the applications of science are affected by culture, but also demonstrates that computer forensics is a subject within the field of science. This research seeks to answer the question: does culture have an impact on the application of computer forensics? To answer this question, at least two countries with different cultural backgrounds and customs will be examined. Specifically, this work will discuss Australia and Taiwan, as these countries are examples of the distinctive cultural variations found in the Asia Pacific. Culture is difficult to measure directly, and this work utilises the Delphi survey technique and case-study interviews of computer forensic professionals and experts in both Australia and Taiwan as its primary sources of data collection. Analysis of this data provides both a view of the field at large, from the Delphi survey results, and more detailed knowledge from specific experts, gathered through the interviews. Specifically, the Delphi survey has 40 questions in five dimensions (Current Situation, Policy and Organization, Education, Law, and Personal Preference and Skill), and the interview comprises 13 questions, each asking for more depth than the Delphi can provide. The outcomes of this research directly compare the Australian and Taiwanese cultures as they apply to the field of computer forensics. The most tangible outcome is a framework for Australian and Taiwanese law enforcement, the forensic science community, and the courts. Suggestions for cross-cultural, cross-border and collaborative digital forensic investigations are provided based on the findings of this research. This thesis thereby supports mutual understanding between Australian and Taiwanese computer forensic investigators, improving the chances of success of future cooperation between Australia and Taiwan.
2

An investigation into the identification, reconstruction, and evidential value of thumbnail cache file fragments in unallocated space

Morris, Sarah Louise Angela January 2013 (has links)
This thesis establishes the evidential value of thumbnail cache file fragments identified in unallocated space. A set of criteria to evaluate the evidential value of thumbnail cache artefacts was created by researching the evidential constraints present in Forensic Computing. The criteria were used to evaluate the evidential value of live system thumbnail caches and of thumbnail cache file fragments identified in unallocated space. Thumbnail caches can contain visual thumbnails and associated metadata which may be useful to an analyst during an investigation; the information stored in the cache may provide information on the contents of files and on any user or system behaviour which interacted with the file. There is a standard definition of the purpose of a thumbnail cache, but not of its structure or implementation; this research has shown that this has led to some thumbnail caches storing a variety of other artefacts, such as network place names. The growing interest in privacy and security has led to an increase in users attempting to remove evidence of their activities; information removed by the user may still be available in unallocated space. This research adapted popular methods for the identification of contiguous files to enable the identification of single-cluster-sized fragments in Windows 7, Ubuntu, and Kubuntu. Of the four methods tested, none was able to identify each of the classifications without false positive results; this led to the creation of a new approach which improved the identification of thumbnail cache file fragments. After the identification phase, further research was conducted into the reassembly of file fragments; this reassembly was based solely on the potential thumbnail cache file fragments and on structural and syntactical information. In both the identification and reassembly phases of this research, image-only file fragments proved the most challenging, highlighting a potential area for future research. Finally, this research compared the evidential value of live system thumbnail caches with that of identified and reassembled fragments. It was determined that both types of thumbnail cache artefacts can provide unique information which may assist with a digital investigation. This research has produced a set of criteria for determining the evidential value of thumbnail cache artefacts; it has also identified the structure and related user and system behaviour of popular operating system thumbnail cache implementations. It has also adapted contiguous file identification techniques to single-fragment identification and has developed an improved method for thumbnail cache file fragment identification. Finally, this research has produced a proof-of-concept software tool for the automated identification and reassembly of thumbnail cache file fragments.
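A minimal sketch of how signature-based fragment identification might look, assuming a fixed cluster size and the "CMMM" entry marker used by Windows thumbcache files; the cluster offsets, signature table and function names are illustrative and are not the method developed in the thesis:

```python
# Illustrative only: scan unallocated clusters for thumbnail cache entry
# signatures. Cluster size and signature table are assumptions for this sketch.

CLUSTER_SIZE = 4096                                   # assumed NTFS cluster size
SIGNATURES = {b"CMMM": "windows_thumbcache_entry"}    # assumed entry marker

def scan_unallocated(image_path, cluster_offsets):
    """Return (offset, label) pairs for clusters that look like cache fragments."""
    hits = []
    with open(image_path, "rb") as img:
        for offset in cluster_offsets:                # offsets of unallocated clusters
            img.seek(offset)
            cluster = img.read(CLUSTER_SIZE)
            for magic, label in SIGNATURES.items():
                pos = cluster.find(magic)
                if pos != -1:
                    hits.append((offset + pos, label))
    return hits

# Usage (cluster offsets would come from walking the filesystem's free-space map):
# fragments = scan_unallocated("disk.img", [16384, 20480, 24576])
```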
3

An Investigation into the identification, reconstruction, and evidential value of thumbnail cache file fragments in unallocated space

Morris, S L A 08 October 2013 (has links)
©Cranfield University / This thesis establishes the evidential value of thumbnail cache file fragments identified in unallocated space. A set of criteria to evaluate the evidential value of thumbnail cache artefacts was created by researching the evidential constraints present in Forensic Computing. The criteria were used to evaluate the evidential value of live system thumbnail caches and of thumbnail cache file fragments identified in unallocated space. Thumbnail caches can contain visual thumbnails and associated metadata which may be useful to an analyst during an investigation; the information stored in the cache may provide information on the contents of files and on any user or system behaviour which interacted with the file. There is a standard definition of the purpose of a thumbnail cache, but not of its structure or implementation; this research has shown that this has led to some thumbnail caches storing a variety of other artefacts, such as network place names. The growing interest in privacy and security has led to an increase in users attempting to remove evidence of their activities; information removed by the user may still be available in unallocated space. This research adapted popular methods for the identification of contiguous files to enable the identification of single-cluster-sized fragments in Windows 7, Ubuntu, and Kubuntu. Of the four methods tested, none was able to identify each of the classifications without false positive results; this led to the creation of a new approach which improved the identification of thumbnail cache file fragments. After the identification phase, further research was conducted into the reassembly of file fragments; this reassembly was based solely on the potential thumbnail cache file fragments and on structural and syntactical information. In both the identification and reassembly phases of this research, image-only file fragments proved the most challenging, highlighting a potential area for future research. Finally, this research compared the evidential value of live system thumbnail caches with that of identified and reassembled fragments. It was determined that both types of thumbnail cache artefacts can provide unique information which may assist with a digital investigation. This research has produced a set of criteria for determining the evidential value of thumbnail cache artefacts; it has also identified the structure and related user and system behaviour of popular operating system thumbnail cache implementations. It has also adapted contiguous file identification techniques to single-fragment identification and has developed an improved method for thumbnail cache file fragment identification. Finally, this research has produced a proof-of-concept software tool for the automated identification and reassembly of thumbnail cache file fragments.
4

Judges' Awareness, Understanding, and Application of Digital Evidence

Kessler, Gary Craig 01 January 2010 (has links)
As digital evidence grows in both volume and importance in criminal and civil courts, judges need to fairly and justly evaluate the merits of the offered evidence. To do so, judges need a general understanding of the underlying technologies and applications from which digital evidence is derived. Due to the relative newness of the computer forensics field, there have been few studies on the use of digital forensic evidence and none about judges' relationship with digital evidence. This study addressed judges' awareness, knowledge, and perceptions of digital evidence, using grounded theory methods. The interaction of judges with digital evidence has a social aspect that makes a study of this relationship well suited to grounded theory. This study gathered data via a written survey distributed to judges in the American Bar Association and National Judicial College, followed by interviews with judges from Massachusetts and Vermont. The results indicated that judges generally recognize the importance of evidence derived from digital sources, although they are not necessarily aware of all such sources. They believe that digital evidence needs to be authenticated just like any other type of evidence and that it is the role of attorneys rather than of judges to mount challenges to that evidence, as appropriate. Judges are appropriately wary of digital evidence, recognizing how easy it is to alter or misinterpret such evidence. Less technically aware judges appear even more wary of digital evidence than their more knowledgeable peers. Judges recognize that they need additional training in computer and Internet technology, as well as in the computer forensics process and digital evidence, but cite a lack of availability of such training. This training would enable judges to better understand the arguments presented by lawyers, the testimony offered by technical witnesses, and the judicial opinions forming the basis of decisional law. A framework for such training is provided in this report. This study is the first in the U.S. to analyze judges and digital forensics, thus opening up a new avenue of research. It is the second time that grounded theory has been employed in a digital forensics study, demonstrating the applicability of that methodology to this discipline.
5

DFMF : a digital forensic management framework

Grobler, Cornelia Petronella 22 August 2012 (has links)
D.Phil.(Computer Science) / We are living in an increasingly complex world in which much of society is dependent on technology and its various offshoots and incarnations (Rogers & Siegfried, 2004). There is ample evidence of the influence of technology on our daily lives. We communicate via e-mail, use chat groups to interact and conduct business by using e-commerce. People relate each other’s existence to a presence on Facebook. The convergence of the products, systems and services of information technology is changing the way we live. The latest smartphones and cell phones have cameras, applications, and access to social networking sites. These phones contain sensitive information, for example photographs, e-mail, spreadsheets, documents, and presentations. The loss of a cell phone may therefore pose a serious problem to an individual or an organisation when considering privacy and intellectual property issues from an information security (Info Sec) perspective (Pieterse, 2006). Organisations have accepted the protection of information and information assets as a fundamental business requirement, and managers are therefore implementing an increasing number of security countermeasures, such as security policies, intrusion detection systems, access control mechanisms, and anti-virus products, to protect the information and information assets from potential threats. However, incidents still occur, as no system is 100% secure. The incidents must be investigated to determine their root cause and potentially to prosecute the perpetrators (Louwrens, von Solms, Reeckie & Grobler, 2006b). Humankind has long been interested in the connection between cause and event, wishing to know what happened, what went wrong and why it happened. The need for computer forensics emerged when an increasing number of crimes were committed with the use of computers and the evidence required was stored on the computer. In 1984, a Federal Bureau of Investigation (FBI) laboratory began to examine computer evidence (Barayumureeba & Tushabe, 2004), and in 1991 the International Association of Computer Investigation Specialists (IACIS) in Portland, Oregon coined the term ‘computer forensics’ during a training session.
6

Certifying Computer Forensics Skills

Watson, Michael Charles 14 June 2021 (has links)
Computer forensics is an ever-growing technological field of complexity and depth. Individuals must strive to keep learning and growing their skills as they help combat cybercrime throughout the world. This study attempts to establish a method of evaluating conceptual expertise in computer forensics to help indicate whether or not an individual understands the five basic phases of computer forensics: preparation, seizure of evidence, acquisition of data, analysis of data, and reporting the findings of the analysis. A survey was presented to a university class of 30 students taking a computer forensics course, and was also posted online asking computer forensics professionals to participate. Results show that novices who were enrolled in a computer forensics course were able to identify the phases of computer forensics more readily than professionals.
7

Selecting Keyword Search Terms in Computer Forensics Examinations Using Domain Analysis and Modeling

Bogen, Alfred Christopher 09 December 2006 (has links)
The motivation for computer forensics research includes the increase in crimes that involve the use of computers, the increasing capacity of digital storage media, a shortage of trained computer forensics technicians, and a lack of computer forensics standard practices. The hypothesis of this dissertation is that domain modeling of the computer forensics case environment can serve as a methodology for selecting keyword search terms and planning forensics examinations. This methodology can increase the quality of forensics examinations without significantly increasing the combined effort of planning and executing keyword searches. The contributions of this dissertation include:
• A computer forensics examination planning method that utilizes the analytical strengths and knowledge sharing abilities of domain modeling in artificial intelligence and software engineering,
• A computer forensics examination planning method that provides investigators and analysts with a tool for deriving keyword search terms from a case domain model, and
• The design and execution of experiments that illustrate the utility of the case domain modeling method.
Three experiment trials were conducted to evaluate the effectiveness of case domain modeling, and each experiment trial used a distinct computer forensics case scenario: an identity theft case, a burglary and money laundering case, and a threatening email case. Analysis of the experiments supports the hypothesis that case domain modeling results in more evidence found during an examination with more effective keyword searching. Additionally, experimental data indicates that case domain modeling is most useful when the evidence disk has a relatively high occurrence of text-based documents and when vivid case background details are available. A pilot study and a case study were also performed to evaluate the utility of case domain modeling for typical law enforcement investigators. In these studies the subjects used case domain models in a computer forensics service solicitation activity. The results of these studies indicate that typical law enforcement officers have a moderate comprehension of the case domain modeling method and that they recognize a moderate amount of utility in the method. Case study subjects also indicated that the method would be more useful if supported by a semi-automated tool.
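As a rough illustration of deriving keyword search terms from a case domain model, the sketch below flattens a toy model into a search-term list; the model structure, entity names and terms are invented for this example and are far simpler than the dissertation's actual modeling notation:

```python
# Toy case domain model: entities mapped to related terms (invented example data).
case_domain_model = {
    "suspect": {"aliases": ["jdoe", "john.doe@example.com"]},
    "offence": {"concepts": ["identity theft", "credit card", "SSN"]},
    "artefacts": {"concepts": ["statement", "application form", "wire transfer"]},
}

def derive_keywords(model):
    """Flatten a case domain model into a deduplicated, sorted keyword list."""
    keywords = set()
    for entity in model.values():
        for terms in entity.values():
            keywords.update(term.lower() for term in terms)
    return sorted(keywords)

print(derive_keywords(case_domain_model))
```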
8

Automated Timeline Anomaly Detection

Barone, Joshua M 17 May 2013 (has links)
Digital forensics is the practice of trained investigators gathering and analyzing evidence from digital devices such as computers and smart phones. On these devices, it is possible to change the system time for purposes other than those intended. Currently there are no documented techniques to determine when this has occurred. This research seeks to prove out a technique for determining when the time has been changed on a forensic disk image by analyzing the log files found on the image. Out of this research a tool was created to perform this analysis in an automated fashion. This tool is TADpole, a command-line program that analyzes the log files on a disk image and determines whether a timeline anomaly has occurred.
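The core idea, detecting a clock change from a backward jump between consecutive log timestamps, can be illustrated with a short sketch; the log format, tolerance and function names below are assumptions and do not reproduce TADpole itself:

```python
# Timestamps in an append-only log should be non-decreasing, so a large backward
# jump between consecutive entries suggests the system time was reset.
from datetime import datetime, timedelta

def find_time_anomalies(timestamps, tolerance=timedelta(seconds=5)):
    """Return (index, earlier, later) for every backward jump beyond the tolerance."""
    anomalies = []
    for i in range(1, len(timestamps)):
        if timestamps[i] < timestamps[i - 1] - tolerance:
            anomalies.append((i, timestamps[i - 1], timestamps[i]))
    return anomalies

log_times = [
    datetime(2013, 5, 17, 9, 0, 0),
    datetime(2013, 5, 17, 9, 5, 0),
    datetime(2013, 5, 16, 22, 0, 0),   # clock moved backwards between entries
    datetime(2013, 5, 16, 22, 1, 0),
]
print(find_time_anomalies(log_times))
```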
9

EFFECTIVE AND EFFICIENT COMPUTATION SYSTEM PROVENANCE TRACKING

Shiqing Ma (7036475) 02 August 2019 (has links)
Provenance collection and analysis is one of the most important techniques used in analyzing computation system behaviors. For forensic analysis in enterprise environments, existing provenance systems are limited. On one hand, they tend to log many redundant and irrelevant events, causing high runtime and space overhead as well as long investigation times. On the other hand, they lack application-specific provenance data, leading to an ineffective investigation process. Moreover, emerging machine learning (especially deep learning) based artificial intelligence systems are hard to interpret and vulnerable to adversarial attacks. Using provenance information to analyze such systems and defend against adversarial attacks is potentially very promising but not yet well studied.

In this dissertation, I try to address the aforementioned challenges. I present an effective and efficient operating system level provenance data collector, ProTracer. It features the idea of alternating between logging and tainting to perform on-the-fly log filtering and reduction, achieving low runtime and storage overhead. Tainting is used to track the dependence relationships between system call events, and logging is performed only when useful dependencies are detected. I also develop MPI, an LLVM based analysis and instrumentation framework which automatically transforms existing applications to be provenance-aware. It requires the programmers to annotate the desired data structures used for partitioning, and then instruments the program to actively emit application-specific semantics to provenance collectors, which can be used for multi-perspective attack investigation. Finally, I propose NIC, a provenance collection and analysis technique for deep learning systems. It analyzes a deep learning system's internal variables to generate system invariants as provenance for such systems, which can then be used as a general way to detect adversarial attacks.
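A highly simplified sketch of the alternating logging-and-tainting idea follows; the event format, taint rules and function names are invented for illustration, and ProTracer's real design operates at the operating system audit layer rather than on Python tuples:

```python
# Taint objects touched by write events and emit a log record only when a read
# creates a dependency on a tainted object, dropping irrelevant events.
def reduce_provenance(events):
    """events: iterable of (syscall, subject, obj). Returns the reduced log."""
    tainted = set()        # objects whose history we care about
    log = []
    for syscall, subject, obj in events:
        if syscall == "write":
            tainted.add(obj)                          # taint instead of logging immediately
        elif syscall == "read" and obj in tainted:
            log.append((syscall, subject, obj))       # a dependency worth keeping
            tainted.add(subject)                      # the reader now carries the taint
    return log

events = [
    ("write", "proc_a", "/tmp/payload"),
    ("read",  "proc_b", "/etc/hostname"),   # irrelevant, dropped
    ("read",  "proc_b", "/tmp/payload"),    # dependency on tainted file, logged
]
print(reduce_provenance(events))
```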
10

Hash Comparison Module for OCFA

Axelsson, Therese, Melani, Daniel January 2010 (has links)
Child abuse content on the Internet is today an increasing problem and difficult to deal with. The techniques used by paedophiles are getting more sophisticated, which means it takes more effort for law enforcement to locate this content. To help solve this issue, an EU-funded project named FIVES is developing a set of tools to help investigations involving large amounts of image and video material. One of these tools aims to help identify potentially illegal files by hash signatures derived using classification information from another project. / FIVES
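A minimal sketch of hash-based identification of known files, assuming a plain-text reference list of hex digests and SHA-256 as the digest; the FIVES module's actual integration with OCFA and its signature source are not reproduced here:

```python
# Hash each file under a directory and flag those whose digest appears in a
# reference set of known signatures (illustrative layout and digest choice).
import hashlib
from pathlib import Path

def load_known_hashes(path):
    """One hex digest per line in a plain-text reference file."""
    return {line.strip().lower() for line in Path(path).read_text().splitlines() if line.strip()}

def flag_known_files(root, known_hashes):
    """Return paths under root whose SHA-256 digest appears in the reference set."""
    flagged = []
    for file in Path(root).rglob("*"):
        if file.is_file():
            digest = hashlib.sha256(file.read_bytes()).hexdigest()  # whole file in memory; fine for a sketch
            if digest in known_hashes:
                flagged.append(file)
    return flagged

# Usage:
# hits = flag_known_files("/evidence/export", load_known_hashes("known_hashes.txt"))
```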
