191

Risk, Privacy, and Security in Computer Networks

Årnes, Andre January 2006 (has links)
With an increasingly digitally connected society come complexity, uncertainty, and risk. Network monitoring, incident management, and digital forensics are of increasing importance with the escalation of cybercrime and other network-supported serious crimes. New laws and regulations governing electronic communications, cybercrime, and data retention are continuously being proposed, requiring new methods and tools.

This thesis introduces a novel approach to real-time network risk assessment based on hidden Markov models, which represent the likelihood of transitions between security states. The method measures network risk as a composition of the risk of individual hosts, providing a precise, fine-grained model for assessing risk and supporting decisions during incident response. The approach has been integrated with an existing framework for distributed, large-scale intrusion detection, and the results of the risk assessment are applied to prioritize the alerts produced by the intrusion detection sensors. Using this implementation, the approach is evaluated on both simulated and real-world data.

Network monitoring can encompass large networks and process enormous amounts of data, and its very ubiquity can pose a serious threat to the privacy and confidentiality of network users. Existing measures for anonymization and pseudonymization are analyzed with respect to the trade-off between performing meaningful data analysis and protecting the identities of users. The results demonstrate that most existing solutions for pseudonymization are vulnerable to a range of attacks. As a remedy, several ways of strengthening the schemes are proposed, and a method for unlinkable transaction pseudonyms is considered.

Finally, a novel method for performing digital forensic reconstructions in a virtual security testbed is proposed. Based on a hypothesis of the security incident in question, the testbed is configured with the appropriate operating systems, services, and exploits. Attacks are formulated as event chains and replayed on the testbed, and the effects of each event are analyzed in order to support or refute the hypothesis. The purpose of the approach is to facilitate reconstruction experiments in digital forensics. Two examples demonstrate the approach: an overview example based on the Trojan defense and a detailed example of a multi-step attack. Although a reconstruction can neither prove a hypothesis with absolute certainty nor exclude the correctness of other hypotheses, a standardized environment combined with event reconstruction and testing can lend credibility to an investigation and can be a valuable asset in court.
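The abstract does not spell out the model, but the core idea can be sketched: track each host's security-state distribution with a hidden Markov model updated by IDS observations, and read off risk as an expected cost over states. A minimal sketch follows; the state labels, probabilities, and costs are illustrative assumptions, not parameters from the thesis.

```python
import numpy as np

# Assumed security states for a single host (the thesis defines its own set).
STATES = ["Good", "Attacked", "Compromised"]

# Hypothetical HMM parameters -- illustrative values only.
A = np.array([  # transition probabilities P(s_t | s_{t-1})
    [0.95, 0.04, 0.01],
    [0.30, 0.60, 0.10],
    [0.05, 0.15, 0.80],
])
B = np.array([  # observation probabilities P(alert | state)
    # columns: no alert, low-severity alert, high-severity alert
    [0.90, 0.08, 0.02],
    [0.40, 0.45, 0.15],
    [0.20, 0.30, 0.50],
])
COST = np.array([0.0, 5.0, 20.0])  # assumed cost of being in each state

def update_belief(belief, observation):
    """One forward-algorithm step: predict via A, correct via B, normalize."""
    predicted = belief @ A
    corrected = predicted * B[:, observation]
    return corrected / corrected.sum()

def host_risk(belief):
    """Risk as the expected cost over the host's state distribution."""
    return float(belief @ COST)

# Example: a host starts in the Good state and IDS alerts arrive over time.
belief = np.array([1.0, 0.0, 0.0])
for obs in [0, 1, 2, 2]:  # stream of observed alert severities
    belief = update_belief(belief, obs)
    print(f"risk = {host_risk(belief):.2f}, belief = {belief.round(3)}")
# Network risk is then a composition (e.g. the sum) of the per-host risks.
```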
192

Forensic analysis of the ESE database in Internet Explorer 10

Malmström, Bonnie, Teveldal, Philip January 2013 (has links)
With Internet Explorer 10, Microsoft changed the way web-related information is stored. Instead of the old index.dat files, Internet Explorer 10 uses an ESE database called WebCacheV01.dat to maintain its web cache, history, and cookies. This database contains a wealth of information that can be of great interest to a forensic investigator. This thesis explores the structure of the new database, what information it contains, and how it behaves in different situations, and shows that it is possible to recover deleted database records – even when the InPrivate browsing mode has been used.
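As a starting point for this kind of examination, the database can be opened with the open-source pyesedb bindings for libesedb (the tooling here is an assumption; the thesis does not name it). In WebCacheV01.dat, a Containers table maps container ids to the Container_N tables that hold the history, cache, and cookie records:

```python
# Minimal sketch: enumerate the tables of an IE10 WebCacheV01.dat
# using pyesedb (libesedb Python bindings).
import pyesedb

esedb_file = pyesedb.file()
esedb_file.open("WebCacheV01.dat")  # assumed local copy of the database

for i in range(esedb_file.get_number_of_tables()):
    table = esedb_file.get_table(i)
    # "Containers" maps names (History, Content, Cookies, ...) to the
    # numbered "Container_N" tables that hold the actual records.
    print(table.name, table.get_number_of_records())

esedb_file.close()
```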
193

Forward model calculations for determining isotopic compositions of materials used in a radiological dispersal device

Burk, David Edward 29 August 2005 (has links)
In the event that a radiological dispersal device (RDD) is detonated in the U.S. or near U.S. interests overseas, it will be crucial to identify the actors involved quickly. If irradiated nuclear fuel is used as the dispersion material for the RDD, it will be beneficial for law enforcement officials to quickly identify where the irradiated nuclear fuel originated. One signature which may lead to the identification of the spent fuel's origin is the isotopic composition of the RDD debris. The objective of this research was to benchmark a forward-model methodology for predicting the isotopic composition of spent nuclear fuel used in an RDD, while optimizing the fidelity of the model to reduce computational time. The code used in this study was Monteburns-2.0, a Monte Carlo based neutronics code utilizing both MCNP and ORIGEN. The size of the burnup step used in Monteburns was tested and found to converge at a value of 3,000 MWd/MTU per step; to ensure a conservative answer, 2,500 MWd/MTU per step was used for the benchmarking process. Five levels of model fidelity were considered: a 2-dimensional pin cell, a multiple radial-region pin cell, a modified pin cell, a 2D assembly, and a 3D assembly. The results showed that while the multi-region pin cell gave the highest level of accuracy, the difference in uncertainty between it and the 2D pin cell (0.07% for 235U) did not warrant the additional computational time required: the computational time for the multiple radial-region pin cell was 7 times that of the 2D pin cell. For this reason, the 2D pin cell was used to benchmark the isotopics against data from other reactors. The reactors against which the methodology was benchmarked were Calvert Cliffs Unit #1, Takahama Unit #3, and Trino Vercelles. Calvert Cliffs is a pressurized water reactor (PWR) using Combustion Engineering 14×14 assemblies; Takahama is a PWR using Mitsubishi Heavy Industries 17×17 assemblies; Trino Vercelles is a PWR using non-standard lattice assemblies. The measured isotopic concentrations from all three reactors showed good agreement with the calculated values.
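For context, the depletion step in codes such as ORIGEN amounts to solving coupled Bateman-type equations for the nuclide inventory over each burnup interval; a generic form (not taken from the thesis) is:

```latex
\frac{dN_i}{dt} = \sum_{j \neq i} \left( \ell_{ij}\,\lambda_j + f_{ij}\,\sigma_j\,\phi \right) N_j
                - \left( \lambda_i + \sigma_i\,\phi \right) N_i
```

where $N_i$ is the atom density of nuclide $i$, $\lambda_j$ are decay constants, $\sigma_j$ are one-group cross sections, $\phi$ is the neutron flux, and $\ell_{ij}$, $f_{ij}$ are the fractions of decays and neutron reactions of nuclide $j$ that produce nuclide $i$. Burnup steps such as the 2,500 MWd/MTU used here discretize the flux history over which these equations are integrated.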
194

Forensic framework for honeypot analysis

Fairbanks, Kevin D. 05 April 2010 (has links)
The objective of this research is to evaluate and develop new forensic techniques for use in honeynet environments, in an effort to address areas where anti-forensic techniques defeat current forensic methods. The fields of computer and network security have expanded over time to encompass many complex ideas and algorithms, and a student of these fields can easily fall into thinking of preventive measures as their only major thrust. It is equally important to be able to determine the cause of a security breach; thus, the field of computer forensics has grown. In this field, there exist toolkits and methods that are used to forensically analyze production and honeypot systems, and anti-forensic techniques have been developed to counter the toolkits. Honeypots and production systems have several intrinsic differences, which can be exploited to produce honeypot data sources that are not currently available from production systems. This research seeks to examine possible honeypot data sources and cultivate novel methods to combat anti-forensic techniques. In this document, three parts of a forensic framework are presented which were developed specifically for honeypot and honeynet environments. The first, TimeKeeper, is an inode preservation methodology which utilizes the Ext3 journal. This is followed by an examination of dentry logging, which is primarily used to map inode numbers to filenames in Ext3. The final component presented is the initial research behind a toolkit for the examination of the recently deployed Ext4 file system. Each respective chapter includes the necessary background information and an examination of related work, as well as the architecture, design, conceptual prototyping, and results from testing each major framework component.
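As an illustration of the kind of journal analysis an approach like TimeKeeper builds on (a sketch under assumptions, not the thesis's tool), a dumped Ext3/JBD journal can be scanned for block headers; the descriptor and commit blocks bracket transactions that may still hold old copies of inode table blocks:

```python
# Minimal sketch: walk a raw Ext3 journal dump and classify JBD blocks.
# The dump path and block size are assumptions for illustration.
import struct

JBD_MAGIC = 0xC03B3998   # magic number of a JBD journal block header
BLOCK_SIZE = 4096        # assumed filesystem/journal block size

BLOCK_TYPES = {1: "descriptor", 2: "commit", 3: "superblock v1",
               4: "superblock v2", 5: "revoke"}

with open("journal.dump", "rb") as f:  # e.g. extracted with debugfs or TSK
    block_no = 0
    while (block := f.read(BLOCK_SIZE)):
        if len(block) >= 12:
            # JBD header: big-endian magic, block type, tx sequence number.
            magic, btype, sequence = struct.unpack(">III", block[:12])
            if magic == JBD_MAGIC:
                kind = BLOCK_TYPES.get(btype, "unknown")
                print(f"block {block_no}: {kind} (tx sequence {sequence})")
        block_no += 1
```

Blocks between a descriptor and its matching commit are copies of filesystem metadata at commit time, which is what makes recovery of earlier inode states possible.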
195

Analysis of domestic dog mitochondrial DNA sequence variation for forensic investigations

Angleby, Helen January 2005 (has links)
The first method for DNA analysis in forensics was presented in 1985. Since then, the introduction of the polymerase chain reaction (PCR) has made it possible to analyse small amounts of DNA, and automated sequencing and fragment analysis techniques have facilitated the analyses. In most cases, short tandemly repeated regions (STRs) of nuclear DNA are analysed in forensic investigations, but not all samples can be successfully analysed using this method. For samples containing minute amounts of DNA or degraded DNA, such as shed hairs, analysis of mitochondrial DNA (mtDNA) is generally more successful due to the presence of thousands of copies of mtDNA molecules per cell.

In Sweden, ~40 % of all households have cats or dogs. With ~9 million humans shedding ~100 scalp hairs per day, and ~1.6 million cats and ~1 million dogs shedding hairs, it is not surprising that shed hairs are among the most common biological evidence found at crime scenes. However, the match probability for domestic dog mtDNA analysis has only been investigated in a few minor studies. Furthermore, although breed–sequence correlations of the noncoding mtDNA control region (CR) have been analysed in a few studies, showing limited correlations, no large-scale studies have been performed previously. Thus, there have not been any comprehensive studies of the forensic informativity of dog mtDNA. In the two papers presented in this thesis, we have tried to lay a foundation for the forensic use of domestic dog mtDNA analysis. In the first paper, CR sequences were analysed and the exclusion capacity was investigated for a number of different populations. This is also the first comprehensive study of the correlation between mtDNA CR type and the breed, type, and geographic origin of domestic dogs. Since the exclusion capacity for analysis of domestic dog CR sequences is relatively low, the second paper investigated to what extent the discrimination power is improved by analysis of coding sequence. The exclusion capacity improved considerably when 3,000 base pairs of coding sequence were analysed in addition to CR sequences. This study will hopefully serve as a basis for the future development of dog mtDNA analysis for forensic purposes.
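The exclusion capacity measured here is, in essence, the probability that two randomly chosen dogs carry different mtDNA types, i.e. one minus the random match probability. A minimal sketch with made-up haplotype frequencies:

```python
# Exclusion capacity from a sample of mtDNA haplotypes:
# 1 - sum(p_i^2) over the observed haplotype frequencies p_i.
from collections import Counter

def exclusion_capacity(haplotypes):
    """One minus the random match probability for a haplotype sample."""
    counts = Counter(haplotypes)
    n = sum(counts.values())
    random_match = sum((c / n) ** 2 for c in counts.values())
    return 1.0 - random_match

# Illustrative sample: haplotype labels and frequencies are invented.
sample = ["A18"] * 40 + ["A17"] * 25 + ["B1"] * 20 + ["C3"] * 10 + ["D8"] * 5
print(f"exclusion capacity: {exclusion_capacity(sample):.3f}")  # 0.725
```

Adding coding-region sites splits common CR haplotypes into rarer combined types, which drives the sum of squared frequencies down and the exclusion capacity up.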
196

A comparison of nuclide production and depletion using MCNPX and ORIGEN-ARP reactor models and a sensitivity study of reactor design parameters using MCNPX for nuclear forensics purposes

Chambers, Angela Sue 22 February 2011 (has links)
The Oak Ridge Isotope Generation and Depletion – Automatic Rapid Processing (ORIGEN-ARP) deterministic code has been extensively utilized for determining nuclide concentrations at specific burnup values for a variety of nuclear reactor designs. Given nuclide concentrations or ratios, such calculations can be used in nuclear forensics and nuclear non-proliferation applications to reverse-calculate the type of reactor and the specific burnup of the fuel from which the nuclides originated. Recently, Los Alamos National Laboratory has released a version of its probabilistic radiation transport code, MCNPX 2.6.0, which incorporates a fuel burnup feature that can also determine, via the probabilistic Monte Carlo method, nuclide concentrations as a function of fuel burnup. This dissertation compares the concentrations of 46 nuclides significant to nuclear forensics analyses for different reactor types using results from the ORIGEN-ARP and MCNPX 2.6.0 codes. Three reactor types were chosen: the Westinghouse 17x17 Pressurized Water Reactor (PWR), the GE 8x8-4 Boiling Water Reactor (BWR), and the Canadian Deuterium Uranium (CANDU-37) reactor. Additionally, a sensitivity study of the reactor parameters within the MCNPX Westinghouse 17x17 PWR model was performed. This study analyzed the nuclide concentration changes resulting from minor perturbations of the following parameters: assembly rod pitch, initial moderator boron concentration, fuel pin cladding thickness, moderator density, and fuel temperature.
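A perturbation study of this kind is typically summarized with relative sensitivity coefficients; a minimal sketch with hypothetical stand-in numbers (not results from the dissertation):

```python
# Relative sensitivity of a nuclide concentration C to a reactor
# parameter p: S = (dC/C) / (dp/p). All values below are invented.

def relative_sensitivity(c_base, c_perturbed, p_base, p_perturbed):
    """Relative change in concentration per relative change in parameter."""
    dc = (c_perturbed - c_base) / c_base
    dp = (p_perturbed - p_base) / p_base
    return dc / dp

# Example: Pu-239 concentration response to a 1% increase in rod pitch.
pitch_base, pitch_pert = 1.26, 1.2726       # cm (hypothetical pitch values)
pu239_base, pu239_pert = 5.10e-3, 5.16e-3   # g/gU (hypothetical outputs)
S = relative_sensitivity(pu239_base, pu239_pert, pitch_base, pitch_pert)
print(f"sensitivity of Pu-239 to rod pitch: {S:.2f}")  # ~1.18
```

A coefficient near zero means the signature is robust to uncertainty in that design parameter, which is what makes it useful for reverse-calculating reactor type and burnup.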
197

Methodology of user activities reconstruction for forensic purposes in cloud storage

Saikauskas, Nerijus 26 August 2013 (has links)
Even though the creation of cloud computing technology has provided opportunities to increase the effectiveness of companies, it has also generated new problems, one of which is performing digital forensics in remote environments. It is generally agreed that if a cloud service does not record appropriate logs, identification of evidence becomes hard, if not impossible. Unfortunately, the existing functionality for this purpose is limited or absent altogether. In this Master's thesis a new method and tool, Žurnalizavimo Paramos Sistema (ŽPS, roughly "Logging Support System"), is proposed, which uses the Python programming language to combine the open-source digital forensic tools The Sleuth Kit and The Volatility Framework, and helps to record and reconstruct user activities as evidence in cloud storage environments. ŽPS implements a unified logging format for such settings proposed by other authors and creates a data-centric effect, which is thought to be an important step towards effective crime investigation in cloud storage environments. During experimental evaluation the method proved highly effective, managing to reconstruct more than 65 % of user actions, depending on user activity, provided that copies of the virtual machines were created and analyzed at least every 5 minutes.
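ŽPS itself is not reproduced here, but the underlying snapshot-diffing idea can be sketched with The Sleuth Kit's pytsk3 bindings: list the files in two periodic VM disk-image snapshots and report created or deleted paths as candidate user actions. The image paths and the snapshot cadence are assumptions for illustration.

```python
# Minimal sketch: diff the root directory of two VM disk snapshots
# taken a few minutes apart, using pytsk3 (The Sleuth Kit bindings).
import pytsk3

def list_root(image_path):
    """Return the set of file names in the image's root directory."""
    img = pytsk3.Img_Info(image_path)
    fs = pytsk3.FS_Info(img)
    return {entry.info.name.name.decode("utf-8", "replace")
            for entry in fs.open_dir(path="/")}

before = list_root("snapshot_t0.raw")  # snapshot at time t
after = list_root("snapshot_t1.raw")   # snapshot ~5 minutes later

for name in sorted(after - before):
    print("created:", name)
for name in sorted(before - after):
    print("deleted:", name)
```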
198

Reconstruction of major male and female lineages of the Strand Muslim community

Geduld, Tasneem January 2010 (has links)
Initially, a pilot study was carried out in order to reconstruct the major paternal and maternal lineages of the Muslim population living in the Cape metropolitan area. The study showed the ability of molecular genetic tools to give insight into the origins and history of local communities, and it was also used as a point of reference for the Strand Muslim Community project. Genetic variations of the Y-chromosome and mitochondrial DNA in the pilot study were analyzed using the RFLP technique. The SNaPshot mini-sequencing technique was then used to genotype single nucleotide polymorphisms (SNPs) on the Y-chromosome and mitochondrial DNA in 115 males from the Strand Muslim community.
199

Network Forensics and Log Files Analysis: A Novel Approach to Building a Digital Evidence Bag and Its Own Processing Tool

Qaisi, Ahmed Abdulrheem Jerribi January 2011 (has links)
Intrusion Detection System (IDS) tools are deployed within networks to monitor data that is transmitted to particular destinations, such as MySQL or Oracle databases or log files. The data is normally dumped to these destinations without a standard forensic structure, so when digital evidence is needed, forensic specialists are required to analyse a very large volume of data. Even though forensic tools can be utilised, most of this process has to be done manually, consuming time and resources. In this research, we aim to address this issue by combining several existing tools to archive the original IDS data into a new container (a Digital Evidence Bag, or DEB) that has a structure based upon standard forensic processes. The aim is to develop a method to improve the current IDS database function in a forensic manner; this database will be optimised for future forensic analysis. Since evidence validity is always an issue, a secondary aim of this research is to develop a new monitoring scheme that provides the necessary evidence to prove that an attacker had surveyed the network prior to the attack. To achieve this, we will set up a network that will be monitored by multiple IDSs. Open-source tools will be used to carry out input validation attacks against the network, including SQL injection. We will design a new tool to obtain the original data in order to store it within the proposed DEB; this tool will collect the data from the databases of the different IDSs. We will assume that the IDSs will not have been compromised.
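A minimal sketch of the Digital Evidence Bag idea (in the spirit of Turner's DEB concept, not this thesis's exact container): wrap captured IDS records in a directory with a tag file holding provenance metadata and per-unit SHA-256 hashes, so integrity can be verified later. All field and file names below are assumptions.

```python
# Sketch: build a simple evidence-bag directory from IDS data dumps.
import hashlib
import json
import time
from pathlib import Path

def create_deb(bag_dir, evidence_files, examiner):
    """Copy evidence units into a bag directory and write a tag file."""
    bag = Path(bag_dir)
    bag.mkdir(parents=True, exist_ok=True)
    units = []
    for src in map(Path, evidence_files):
        data = src.read_bytes()
        (bag / src.name).write_bytes(data)       # evidence unit, copied as-is
        units.append({
            "name": src.name,
            "sha256": hashlib.sha256(data).hexdigest(),  # integrity check
            "size": len(data),
        })
    tag = {  # provenance metadata for the whole bag
        "examiner": examiner,
        "created_utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "units": units,
    }
    (bag / "tag.json").write_text(json.dumps(tag, indent=2))

# Example usage (hypothetical paths and names):
# create_deb("case42.deb", ["snort_alerts.log", "mysql_dump.sql"], "A. Qaisi")
```

Re-hashing each unit against tag.json at analysis time is what lets an examiner demonstrate that the archived IDS data has not changed since acquisition.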
200

Detection and analysis of connection chains in network forensics

Almulhem, Ahmad 06 April 2010 (has links)
Network forensics is a young member of the bigger family of the digital forensics discipline. In particular, it refers to digital forensics in networked environments, and it represents an important extension to the model of network security, where emphasis is traditionally put on prevention and, to a lesser extent, on detection. It focuses on the collection and analysis of network packets and events caused by an intruder for investigative purposes. A key challenge in network forensics is to ensure that the network itself is forensically ready, by providing an infrastructure to collect and analyze data in real time. In this thesis, we propose an agent-based network forensics system, which is intended to add real-time network forensics capabilities to a controlled network, and we evaluate the proposed system by deploying and studying it in a real-life environment. Another challenge in network forensics arises because of attackers' ability to move around in the network, which results in a chain of connections, commonly known as a connection chain. In this thesis, we provide an extensive review and taxonomy of connection chains and then propose a novel framework to detect them. The framework adopts a black-box approach by passively monitoring inbound and outbound packets at a host and analyzing the observed packets using association rule mining. We assess the proposed framework using public network traces and demonstrate both its efficiency and its detection capabilities. Finally, we propose a profiling-based framework to investigate connection chains that are distributed over several IP addresses. The framework utilizes a simple yet extensible hacker model that integrates information about a hacker's language, operating system, and time of activity. We establish the effectiveness of the proposed approach through several simulations and an evaluation with real attack data.
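A minimal sketch of the association-rule approach, with illustrative events and thresholds: treat each time window as a transaction of connections observed at a host, and flag inbound-to-outbound pairs whose support and confidence exceed thresholds as possible stepping-stone links.

```python
# Sketch: mine (inbound source -> outbound destination) rules from
# windowed connection observations at a monitored host.
from collections import Counter
from itertools import product

# Each transaction: (inbound sources, outbound destinations) in one window.
# Addresses are illustrative placeholders.
windows = [
    ({"10.0.0.5"}, {"192.168.1.9"}),
    ({"10.0.0.5"}, {"192.168.1.9"}),
    ({"10.0.0.5", "10.0.0.7"}, {"192.168.1.9"}),
    ({"10.0.0.7"}, set()),
]

pair_count, in_count = Counter(), Counter()
for inbound, outbound in windows:
    for src in inbound:
        in_count[src] += 1
    for src, dst in product(inbound, outbound):
        pair_count[(src, dst)] += 1

MIN_SUPPORT, MIN_CONFIDENCE = 0.5, 0.8  # assumed thresholds
n = len(windows)
for (src, dst), c in pair_count.items():
    support, confidence = c / n, c / in_count[src]
    if support >= MIN_SUPPORT and confidence >= MIN_CONFIDENCE:
        print(f"possible chain: {src} -> host -> {dst} "
              f"(support={support:.2f}, confidence={confidence:.2f})")
```

In this toy trace, inbound traffic from 10.0.0.5 co-occurs with outbound traffic to 192.168.1.9 in three of four windows (support 0.75, confidence 1.0), so the host is flagged as a possible stepping stone between them.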
