11

An architecture for the forensic analysis of Windows system generated artefacts

Hashim, Noor Hayati January 2011 (has links)
Computer forensic tools have been developed to enable forensic investigators to analyse software artefacts to help reconstruct possible scenarios for activity on a particular computer system. A number of these tools allow the examination and analysis of system generated artefacts such as the Windows registry. Examination and analysis of these artefacts is focussed on recovering the data and extracting information relevant to a digital investigation. This information is currently underused in most digital investigations. With this in mind, this thesis considers system generated artefacts that contain information concerning the activities that occur on a Windows system and that will often contain evidence relevant to a digital investigation. The objective of this research is to develop an architecture that simplifies and automates the collection of forensic evidence from system generated files where the data structures may be either known or in a structured but poorly understood (unknown) format. The hypothesis is that it is feasible to develop an architecture able to integrate forensic data extracted from a range of system generated files, and to implement a proof of concept prototype tool capable of visualising the Event logs and Swap files. This thesis presents an architecture to enable the forensic investigator to analyse and visualise a range of system generated artefacts for which the internal arrangement of data is either well structured and understood, or unclear and less publicised (known and unknown data structures). The architecture sets out methods to access, view and analyse system generated artefacts, and is intended to facilitate the extraction and analysis of operating system generated artefacts while being extensible, flexible and reusable. The architectural concepts are tested using a prototype implementation focussed on the Windows Event logs and Swap files. Event logs reveal evidence regarding logons, authentication, account and privilege use, and can address questions relating to which user accounts were being used and which machines were accessed. The Swap file contains fragments of data (remnants of or entire documents, e-mail messages, or results of internet browsing) which reveal past user activities. Issues relating to understanding and visualising artefact data structures are discussed and possible solutions are explored. The architecture is developed by examining the requirements and methods with respect to the needs of computer forensic investigations and forensic process models, with the intention of developing a new multiplatform tool to visualise the content of Event logs and Swap files. This tool is aimed at displaying the data contained in Event logs and Swap files graphically, which should enable the detection of information that may support the investigation. Visualisation techniques can also aid forensic investigators in identifying suspicious events and files, making such techniques feasible for consideration in a wider range of cases and, in turn, improving standard procedures. The tool fills a gap left by certain other open source tools, which visualise Event log and Swap file data in a text-based format only.
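The "known structure" side of this problem is concrete: the legacy Windows Event Log (.evt) record layout is published, so its fixed header can be parsed directly. Below is a minimal sketch along those lines, not the thesis's prototype; it assumes a pre-Vista .evt file and merely tallies event IDs rather than visualising them:

import struct
from collections import Counter

def scan_evt(path):
    """Tally event IDs in a legacy Windows .evt file.

    Records are located by their 'LfLe' magic, which sits 4 bytes into
    each EVENTLOGRECORD; the fields around it follow the published
    layout (length, record number, times, event ID, event type).
    """
    data = open(path, "rb").read()
    counts = Counter()
    pos = data.find(b"LfLe")
    while pos != -1:
        start = pos - 4                          # record begins at its length field
        if start >= 0 and start + 56 <= len(data):
            (length, _magic, _rec_no, _t_generated, _t_written,
             event_id, _ev_type) = struct.unpack_from("<6IH", data, start)
            if length >= 56:                     # skips the 48-byte file header
                counts[event_id & 0xFFFF] += 1   # low word is the event code
        pos = data.find(b"LfLe", pos + 4)
    return counts

# e.g. print(scan_evt("SecEvent.Evt").most_common(5))
# On pre-Vista systems, IDs 528/540 are interactive/network logons.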
12

Forensic framework for honeypot analysis

Fairbanks, Kevin D. 05 April 2010 (has links)
The objective of this research is to evaluate and develop new forensic techniques for use in honeynet environments, in an effort to address areas where anti-forensic techniques defeat current forensic methods. The fields of Computer and Network Security have expanded over time to encompass many complex ideas and algorithms. A student of these fields can easily fall into thinking of preventive measures as the only major thrust of the topics, but it is equally important to be able to determine the cause of a security breach. Thus, the field of Computer Forensics has grown. In this field, there exist toolkits and methods that are used to forensically analyze production and honeypot systems. To counter the toolkits, anti-forensic techniques have been developed. Honeypots and production systems have several intrinsic differences, and these differences can be exploited to produce honeypot data sources that are not currently available from production systems. This research seeks to examine possible honeypot data sources and cultivate novel methods to combat anti-forensic techniques. In this document, three parts of a forensic framework are presented which were developed specifically for honeypot and honeynet environments. The first, TimeKeeper, is an inode preservation methodology which utilizes the Ext3 journal. This is followed by an examination of dentry logging, which is primarily used to map inode numbers to filenames in Ext3. The final component presented is the initial research behind a toolkit for the examination of the recently deployed Ext4 file system. Each respective chapter includes the necessary background information and an examination of related work, as well as the architecture, design, conceptual prototyping, and results from testing each major framework component.
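The Ext3 journal that TimeKeeper mines is itself a documented structure (JBD): every journal block opens with a magic number, a block type, and a sequence number, and descriptor blocks announce which filesystem blocks, inode tables included, were copied into the journal. A minimal sketch of that first parsing step, assuming the journal has already been dumped to a flat file and a 4 KiB block size; this illustrates the idea, not Fairbanks's tool:

import struct

JBD_MAGIC = 0xC03B3998          # signature of every ext3 (JBD) journal block
BLOCK_TYPES = {1: "descriptor", 2: "commit", 3: "superblock_v1",
               4: "superblock_v2", 5: "revoke"}

def walk_journal(path, block_size=4096):
    """Count JBD block types in a journal dumped to a flat file.

    Descriptor blocks list which filesystem blocks (inode tables among
    them) were journaled; the older copies that follow them are what an
    inode-preservation approach like TimeKeeper mines for prior
    timestamps and metadata.
    """
    counts = {}
    with open(path, "rb") as f:
        while (block := f.read(block_size)):
            if len(block) < 12:
                break
            magic, btype, _sequence = struct.unpack(">III", block[:12])
            if magic == JBD_MAGIC:
                key = BLOCK_TYPES.get(btype, "unknown")
                counts[key] = counts.get(key, 0) + 1
    return counts

# Journal extracted beforehand, e.g.:  debugfs -R "dump <8> journal.bin" /dev/sdb1
# print(walk_journal("journal.bin"))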
13

New cryptographic schemes with application in network security and computer forensics

Jiang, Lin, 蒋琳 January 2010 (has links)
published_or_final_version / Computer Science / Doctoral / Doctor of Philosophy
14

What is the impact of the Cyber Crime Act on the business community in Mauritius?

Jamalkhan, Nasserkhan. January 2004 (has links)
At this early age of the internet, the e-business environment is almost a lawless territory. Fast movers are making fortunes, whereas rebels can act with impunity and move on before the legal process can catch up. The rapid expansion of cyber crime in the world, and its devastating effects in developing countries, motivated this research into its impact on the business community in Mauritius. Organisations that are not keeping pace with these realities are becoming vulnerable to cyber criminals or hackers. An analysis of the situation in the world from the literature review has provided a better understanding of the most common crimes that are causing trouble to businesses and creating obstacles to the advancement of e-commerce. Compared to earlier technological changes, the internet has proliferated rapidly. Organisations have to be ready to face this challenge or they may face the dangers of being attacked, or even prosecuted for not having secured their systems properly. While securing the internet remains a major challenge for every country, businesses have to cope with limited protection until an international law comes into force to control this wild territory. Available reports on crime trends show a steady increase in computer-related crime in the world. The research was conducted on a sample of IT-literate participants; interviews and a focus group discussion have also contributed to the accuracy of the findings. The results and findings demonstrate that there is room for improvement but also a lack of awareness of the Cyber Crime Act. Hopefully, this research will help to shed light on the major concerns of the business community. / Thesis (MBA)-University of KwaZulu-Natal, Durban, 2004.
15

A formalised ontology for network attack classification

Van Heerden, Renier Pelser January 2014 (has links)
One of the most popular attack vectors against computers is their network connections. Attacks on computers through their networks are commonplace and have various levels of complexity. This research describes network-based computer attacks in the form of a story, formally, and within an ontology. The ontology categorises network attacks, with attack scenarios as the focal class. This class consists of: Denial-of-Service, Industrial Espionage, Web Defacement, Unauthorised Data Access, Financial Theft, Industrial Sabotage, Cyber-Warfare, Resource Theft, System Compromise, and Runaway Malware. The ontology was developed by building a taxonomy and a temporal network attack model. Network attack instances (also known as individuals) are classified according to their respective attack scenarios with the use of an automated reasoner within the ontology. The automated reasoner's deductions are verified formally, and via the automated reasoner a relaxed set of scenarios is determined which is relevant in a near real-time environment. A prototype system (called Aeneas) was developed to classify network-based attacks; Aeneas integrates the sensors into a detection system that can classify network attacks in a near real-time environment. To verify the ontology and the Aeneas prototype, a virtual test bed was developed in which network-based attacks were generated to exercise the detection system. Aeneas was able to detect incoming attacks and classify them according to their scenario. The novel part of this research is the attack scenarios, which are described in the form of a story as well as formally and in an ontology. The ontology is used in a novel way to determine to which class attack instances belong and how the network attack ontology is affected in a near real-time environment.
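The classification mechanism described, a defined class plus an automated reasoner deciding which scenario an individual belongs to, can be illustrated in miniature. The sketch below uses owlready2 and invented class names; it is a toy in the spirit of the thesis, not its actual ontology (sync_reasoner shells out to the HermiT reasoner, which needs a Java runtime):

from owlready2 import Thing, ObjectProperty, get_ontology, sync_reasoner

onto = get_ontology("http://example.org/netattack.owl")   # invented IRI

with onto:
    class Attack(Thing): pass
    class Effect(Thing): pass
    class hasEffect(ObjectProperty):
        domain = [Attack]
        range = [Effect]
    class ServiceOutage(Effect): pass

    # Defined class: an Attack is a DenialOfService iff one of its
    # effects is a ServiceOutage.
    class DenialOfService(Attack):
        equivalent_to = [Attack & hasEffect.some(ServiceOutage)]

    # An unclassified individual, as a sensor might assert it.
    flood = Attack("syn_flood_01")
    flood.hasEffect = [ServiceOutage("outage_01")]

sync_reasoner()           # runs HermiT and reclassifies individuals
print(flood.is_a)         # now includes DenialOfService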
16

The use of electronic evidence in forensic investigation

Ngomane, Amanda Refiloe 06 1900 (has links)
For millions of people worldwide the use of computers has become a central part of life, and criminals are exploiting these technological advances for illegal activities. This growth of technology has therefore produced a completely new source of evidence referred to as ‘electronic evidence’. In light of this, the researcher focused on the collection of electronic evidence and its admissibility at trial. The study intends to assist and give guidance to investigators to collect electronic evidence properly and legally, and to ensure that it is admitted as evidence in court. Electronic evidence is fragile and volatile by nature and therefore requires the investigator always to exercise reasonable care during its collection, preservation and analysis to protect its identity and integrity. The legal requirements that collected electronic evidence must satisfy to be admissible in court are relevance, reliability, and authenticity. When presenting the evidence in court the investigator should keep in mind that judges are not specialists in the computing environment, and must therefore be able to explain how the chain of custody was maintained during the collection, preservation and analysis of the evidence. The complex technology behind electronic evidence must be explained clearly enough that the court, like any ordinary person who may never have used a computer, can understand it. This is because the court relies on the expertise of the investigator to understand electronic evidence and to make rulings on matters related to it. / Police Practice / M. Tech. (Forensic Investigation)
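A core part of protecting the identity and integrity the passage refers to is cryptographic hashing at acquisition time. A minimal sketch, with a placeholder file name; matching digests computed at seizure and again before trial support the reliability and authenticity requirements:

import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so large evidence images fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hash recorded at seizure; re-computing the same value before analysis
# and at trial helps demonstrate an unbroken chain of custody.
acquisition_hash = sha256_of("evidence_image.dd")    # placeholder path
assert sha256_of("evidence_image.dd") == acquisition_hash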
17

The Response Of American Police Agencies To Digital Evidence

Yesilyurt, Hamdi 01 January 2011 (has links)
Little is known about the variation in digital forensics practice in the United States as adopted by large local police agencies. This study investigated how environmental constraints, contextual factors, organizational complexity, and organizational control relate to the adoption of digital forensics practice. The study integrated three theoretical perspectives from organizational studies to guide the analysis of the relations: institutional theory, contingency theory, and adoption-of-innovation theory. Institutional theory was used to analyze the impact of environmental constraints on the adoption of innovation, and contingency theory was used to examine the impacts of organizational control on the adoption of innovation. Adoption-of-innovation theory was employed to describe the degree to which digital forensics practice has been adopted by large municipal police agencies having 100 or more sworn police officers. The data set was assembled primarily from the Law Enforcement Management and Administrative Statistics (LEMAS) surveys of 2003 and 1999; Dr. Edward Maguire's survey was used to obtain one variable. Joining the data sets to construct the sample resulted in 345 large local police agencies. The descriptive results on the degree of adoption of digital forensics practice indicate that 37.7% of large local police agencies have dedicated personnel to address digital evidence, 32.8% address digital evidence but do not have dedicated personnel, and only 24.3% have a specialized unit with full-time personnel to address digital evidence. About 5% of local police agencies do nothing to address digital evidence in any circumstance. These descriptive statistics indicate that digital evidence is a matter of concern for most large local police agencies and that they respond to varying degrees to digital evidence at the organizational level; agencies that have not adopted digital forensics practice are in the minority. A structural equation model was used to test the hypothesized relations, enabling rigorous analysis of the relations between latent constructs and several indicator variables. Environmental constraints have the largest impact on the adoption of innovation, exerting a positive influence. No statistically significant relation was found between organizational control and adoption of digital forensics practice. Contextual factors (task scope and personnel size) positively influence the adoption of digital forensics. Structural control factors, including administrative weight and formalization, have no significant influence on the adoption of innovation. The conclusions of the study are as follows. Police agencies adopt digital forensics practice primarily in response to environmental constraints: agencies exposed to higher environmental constraints are more likely to adopt digital forensics practice. Because organizational control of police agencies is not significantly related to digital forensics practice adoption, police agencies do not take their organizational control extensively into consideration when considering adoption. The positive influence of task scope and size on adoption was expected, since the extent of task scope and the number of personnel indicate a higher capacity for police agencies to adopt digital forensics practice. Administrative weight and formalization do not influence the adoption of digital forensics practice; therefore, structural control and coordination are not important for large local police agencies in adopting it. The results of the study indicate that the adoption of digital forensics practice is based primarily on environmental constraints, so more drastic impacts on digital forensics practice should be expected from local police agencies' environments than from internal organizational factors. Researchers investigating the influence of various factors on the adoption of digital forensics practice should further examine environmental variables, and the unexpected results concerning the impact of administrative weight and formalization should be researched with broader considerations.
18

The Implications Of Virtual Environments In Digital Forensic Investigations

Patterson, Farrah M 01 January 2011 (has links)
This research paper discusses the role of virtual environments in digital forensic investigations. With virtual environments becoming more prevalent as an analysis tool in digital forensic investigations, it is becoming more important for digital forensic investigators to understand the limitations and strengths of virtual machines. The study aims to expose limitations within commercial closed source virtual machines and open source virtual machines. It provides a brief overview of the history of digital forensic investigations and virtual environments, and concludes with an experiment on four common open and closed source virtual machines, examining the effects of the virtual machines on the host machine as well as the performance of the virtual machines themselves. The findings show that while the open source tools provided more control and freedom to the operator, the closed source tools were more stable and consistent in their operation. The significance of these findings can be further researched by applying them to demonstrating the reliability of forensic techniques when a virtual environment is presented as an analysis tool in litigation.
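The host-impact half of such an experiment reduces to sampling host resources while the guest runs. A rough sketch of that measurement, assuming psutil; the durations and the before/after comparison are illustrative, not the paper's protocol:

import psutil

def sample_host(duration_s=30, interval_s=1.0):
    """Sample host-wide CPU and memory utilization at fixed intervals."""
    samples = []
    for _ in range(int(duration_s / interval_s)):
        cpu = psutil.cpu_percent(interval=interval_s)   # blocks for the interval
        mem = psutil.virtual_memory().percent
        samples.append((cpu, mem))
    return samples

baseline = sample_host()        # 1. host idle
# 2. start the virtual machine under test here, then:
with_vm = sample_host()         # 3. host with the guest running
mean_cpu = lambda s: sum(c for c, _ in s) / len(s)
print(f"CPU: {mean_cpu(baseline):.1f}% idle vs {mean_cpu(with_vm):.1f}% with VM")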
19

Liforac - A Model For Live Forensic Acquisition

11 October 2010 (has links)
D.Phil.
20

Um sistema para análise e detecção de ataques ao navegador Web / A system for analysis and detection of browser attacks

Afonso, Vitor Monte, 1987- 10 June 2011 (has links)
Advisor: Paulo Lício de Geus / Dissertation (master's) - Universidade Estadual de Campinas, Instituto de Computação / Abstract: Malicious Web pages are a significant threat to computer security today. They are the main way through which attackers manage to install malware on end-user systems. In order to develop protection mechanisms against these threats, the attacks themselves need to be deeply studied and understood. Several analysis systems exist to analyze Web pages, provide information about them and classify them as malicious or benign. However, these systems are limited regarding the types of attack that can be detected and the programming languages that can be analyzed. In order to fill this gap, a system called BroAD (Browser Attacks Detection) was developed to analyze these pages dynamically, monitoring both the system calls made by the browser while processing a page and the actions performed by the JavaScript code the page contains. Detection of malicious behavior is performed in four steps, using machine learning techniques and signatures; these steps include shellcode detection, anomaly detection over JavaScript behavior, and the analysis of system calls and JavaScript code signatures. The results of the tests performed show that the system has better detection rates than state-of-the-art systems for malicious Web page analysis and also provides more information about these pages, giving a better understanding of their behavior. In addition, code that can easily detect and evade the analysis of some of the systems used in the comparison is presented, demonstrating their fragility. / Master's / Computer Science / Master in Computer Science
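The shellcode-detection step mentioned above is commonly approximated by looking for long runs of %u-escaped characters destined for unescape() ahead of a heap spray. A simplified heuristic sketch of that single step, not BroAD's actual detector:

import re

# Long runs of %uXXXX escapes are a classic sign of shellcode being staged
# for unescape() before a heap spray; 32 consecutive escapes is an
# arbitrary threshold chosen for this sketch.
UNICODE_ESCAPES = re.compile(r"(?:%u[0-9a-fA-F]{4}){32,}")

def looks_like_staged_shellcode(js_source: str) -> bool:
    return bool(UNICODE_ESCAPES.search(js_source))

sample = "var s = unescape('" + "%u9090" * 64 + "');"   # synthetic NOP sled
print(looks_like_staged_shellcode(sample))              # True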
