131
Métodos formais algébricos para geração de invariantes / Algebraic formal methods for invariant generation
Rebiha, Rachid, 1977- 08 December 2011 (has links)
Advisor: Arnaldo Vieira Moura / Thesis (doctorate) - Universidade Estadual de Campinas, Instituto de Computação
Abstract: It is well known that the automation and effectiveness of formal verification of software, embedded systems or hybrid systems depends on the ease with which precise invariants can be automatically generated from the source specifications. An invariant is a property that holds at a specific location in the specification code whenever an execution reaches that location. Despite tremendous progress over the years, the problem of invariant generation remains very challenging for both non-linear discrete programs and non-linear hybrid systems. In this thesis, we first present new computational methods that can automate the discovery and strengthening of non-linear interrelationships among the variables of a program that contains non-linear loops, that is, programs that exhibit multivariate polynomial and fractional manipulations. Moreover, most safety-critical systems, such as aircraft, cars, chemical and power plants, and biological systems, operate semantically as non-linear hybrid systems. In this work, we demonstrate powerful computational methods that can generate bases for the non-linear invariant ideals of non-linear hybrid systems. Secondly, we present the first verification methods that automatically generate bases for invariants expressed by multivariate formal power series and transcendental functions. We also discuss their convergence over hybrid systems that exhibit non-linear models. The formal power series invariants generated are often composed of the expansions of well-known transcendental functions, e.g. log and exp. They also have an analysable closed form, which facilitates the use of the invariants when verifying safety properties. For each invariant generation problem, we establish very general sufficient conditions that guarantee the existence, and allow for the computation, of invariant ideals for situations that cannot be treated by previously known invariant generation approaches. Finally, we extend the domain of applications for invariant generation methods to encompass security problems. More precisely, we provide an extensible platform based on pre-computed invariants to be used as semantics-aware signatures for malware analysis, and we show how the most virulent intrusion attacks can be detected using these invariants. We propose to generate invariants automatically, directly from the specified malware code, so that they remain unchanged by most obfuscation techniques. Following the design of such platforms, we propose host-based intrusion detection systems, using automatically generated models in which system and function calls are guarded by pre-computed invariants, in order to report any deviation observed during the execution of an application. In a broad sense, in this thesis we propose to reduce invariant generation problems to linear algebraic problems. By reducing the problems of non-trivial, non-linear invariant generation for programs and hybrid systems to related linear algebraic problems, we are able to address various deficiencies of state-of-the-art invariant generation methods, including the efficient treatment of complicated non-linear loop programs and non-linear hybrid systems. Such linear algebraic methods have much lower computational complexities than the mathematical foundations of previous approaches, which use techniques such as Gröbner basis computation, quantifier elimination and cylindrical algebraic decomposition. / Doctorate / Computer Science / Doctor of Computer Science
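To make the central notion concrete, the following minimal sketch (a hypothetical loop, not an example from the thesis) shows a polynomial invariant of a non-linear loop, checked here by assertion rather than generated algebraically:

```python
# A minimal sketch (hypothetical loop, not from the thesis): the polynomial
# relation y**2 - 2*y - 4*x + 1 == 0 is an invariant of this loop, since after
# k iterations x = k**2 and y = 2*k + 1. Methods like those in the thesis aim
# to *generate* such relations automatically; here we merely check one.
x, y = 0, 1
for k in range(100):
    assert y * y - 2 * y - 4 * x + 1 == 0, "invariant violated"
    x, y = x + y, y + 2
print("invariant held at the loop head for 100 iterations")
```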
132
Sequential detection and isolation of cyber-physical attacks on SCADA systems / Détection et localisation séquentielle d’attaques cyber-physiques aux systèmes SCADA
Do, Van Long 17 November 2015 (has links)
This PhD thesis was carried out within the framework of the project “SCALA”, which received financial support from the ANR through the programme ANR-11-SECU-0005. Its ultimate objective is the on-line monitoring of Supervisory Control And Data Acquisition (SCADA) systems against cyber-physical attacks. The problem is formulated as the sequential detection and isolation of transient signals in stochastic dynamical systems in the presence of unknown system states and random noise. It is solved using the analytical redundancy approach, which consists of two steps: residual generation and residual evaluation. The residuals are first generated by both Kalman filter and parity space approaches. They are then evaluated using sequential analysis techniques that take certain criteria of optimality into account. However, these classical criteria are not adequate for the surveillance of safety-critical infrastructures. For such applications, it is proposed to minimize the worst-case probability of missed detection subject to acceptable levels of the worst-case probabilities of false alarm and false isolation. For the detection task, the optimization problem is formulated and solved in two scenarios: with exactly known and with partially known transient-signal parameters. Sub-optimal tests are obtained and their statistical properties investigated. Preliminary results for the isolation task are also obtained. The proposed algorithms are applied to the detection and isolation of malicious attacks on a simple SCADA water network.
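As an illustration of the two-step scheme described above, the following sketch (hypothetical scalar dynamics, noise levels and thresholds, not the thesis's models) generates residuals with a Kalman filter and evaluates them sequentially with a two-sided CUSUM test:

```python
# A minimal sketch of residual generation (Kalman filter) followed by
# sequential residual evaluation (two-sided CUSUM). All numbers are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Scalar state-space model: x[k+1] = a*x[k] + w, y[k] = x[k] + v
a, q, r = 0.9, 0.01, 0.04          # dynamics, process/measurement noise variances
x, P = 0.0, 1.0                    # filter state and covariance
h_threshold, drift = 5.0, 0.5      # CUSUM decision threshold and reference drift
g_pos = g_neg = 0.0                # CUSUM statistics for positive/negative shifts

x_true = 0.0
for k in range(200):
    x_true = a * x_true + rng.normal(0.0, np.sqrt(q))
    y = x_true + rng.normal(0.0, np.sqrt(r))
    if k >= 100:                   # injected sensor attack: constant bias
        y += 0.8

    # Residual generation: Kalman predict step, then the normalised innovation
    x_pred = a * x
    P_pred = a * P * a + q
    S = P_pred + r                 # innovation variance
    residual = (y - x_pred) / np.sqrt(S)
    K = P_pred / S
    x = x_pred + K * (y - x_pred)
    P = (1.0 - K) * P_pred

    # Residual evaluation: two-sided CUSUM on the normalised residual
    g_pos = max(0.0, g_pos + residual - drift)
    g_neg = max(0.0, g_neg - residual - drift)
    if max(g_pos, g_neg) > h_threshold:
        print(f"alarm at step {k}")
        break
```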
133
A framework for high speed lexical classification of malicious URLs
Egan, Shaun Peter January 2014 (has links)
Phishing attacks employ social engineering to target end-users, with the goal of stealing identifying or sensitive information. This information is used in activities such as identity theft or financial fraud. During a phishing campaign, attackers distribute URLs which, along with false information, point to fraudulent resources in an attempt to deceive users into requesting the resource. These URLs are obscured through the use of several techniques that make automated detection difficult. Current methods used to detect malicious URLs face multiple problems which attackers use to their advantage. These problems include the time required to react to new attacks, shifts in trends in URL obfuscation, and usability problems caused by the latency of the lookups these approaches require. A new method of identifying malicious URLs using Artificial Neural Networks (ANNs) has been shown to be effective by several authors. The simple classification performed by ANNs results in very high classification speeds with little impact on usability. Samples used for the training, validation and testing of these ANNs are gathered from PhishTank and the Open Directory. Words selected from the different sections of the samples are used to create a 'bag of words' (BOW), which is used as a binary input vector indicating the presence of a word in a given sample. Twenty additional features that measure lexical attributes of the sample are used to increase classification accuracy. A framework capable of generating these classifiers in an automated fashion is implemented, and the classifiers are automatically stored on a remote update distribution service built to supply updates to classifier implementations. An example browser plugin that uses ANNs provided by this service is created; it is capable of classifying URLs requested by a user in real time and of blocking these requests. The framework is tested in terms of training time and classification accuracy, and the classification speed and the effectiveness of compression algorithms on the data required to distribute updates are tested. It is concluded that it is possible to generate these ANNs frequently, and in a form small enough to distribute easily. It is also shown that classifications are made at high speed with high accuracy, resulting in little impact on usability.
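A minimal sketch of the approach described above, with a hypothetical vocabulary, a reduced set of lexical features and toy training data (the thesis's actual BOW and twenty lexical features are not reproduced here):

```python
# Lexical URL classification sketch: binary bag-of-words vector plus a few
# lexical attributes, fed to a small neural network. Vocabulary, features and
# training URLs are illustrative assumptions, not the thesis's data.
import re
from sklearn.neural_network import MLPClassifier

VOCAB = ["login", "secure", "account", "update", "verify", "paypal", "bank"]

def features(url: str) -> list[float]:
    tokens = set(re.split(r"[\W_]+", url.lower()))
    bow = [1.0 if w in tokens else 0.0 for w in VOCAB]   # binary BOW vector
    lexical = [                                          # a few lexical attributes
        float(len(url)),
        float(url.count(".")),
        float(url.count("-")),
        float(sum(c.isdigit() for c in url)),
    ]
    return bow + lexical

# Toy training data: 1 = phishing, 0 = benign (illustrative only)
urls = [
    ("http://secure-login.paypal.example.ru/verify/account", 1),
    ("http://update-bank-login.example.tk/secure", 1),
    ("https://www.wikipedia.org/wiki/Neural_network", 0),
    ("https://github.com/user/repo", 0),
]
X = [features(u) for u, _ in urls]
y = [label for _, label in urls]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([features("http://paypal-account-verify.example.com/login")]))
```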
134
Visualisation of PF firewall logs using open source
Coetzee, Dirk January 2015 (has links)
If you cannot measure, you cannot manage. This is an age-old saying, but it remains very true, especially within the current South African cybercrime scene and the ever-growing Internet footprint. Due to the significant increase in cybercrime across the globe, information security specialists are starting to see the intrinsic value of logs that can 'tell a story'. Logs do not only tell a story; they also provide a tool to measure a normally dark force within an organisation. The collection of current logs from installed systems, operating systems and devices is imperative in the event of a hacking attempt, data leak or even data theft, whether the attempt is successful or not. No logs mean no evidence, and in many cases not even the opportunity to find the mistake or fault in the organisation's defence systems. It remains difficult to choose which logs your organisation requires. A number of questions should be considered: should a centralised or a decentralised approach to collecting these logs be followed, or a combination of both? How many events will be collected, how much additional bandwidth will be required, and will the log collection be near real time? How long must the logs be kept, and what hashing and encryption (for data integrity), if any, should be used? Lastly, what system must be used to correlate and analyse the logs and to make alerts and reports available? This thesis addresses these myriad questions, examining the current lack of log analysis and practical implementations in modern organisations, and shows how the latter need can be fulfilled by means of a basic approach. South African organisations must use the technology at hand in order to know what electronic data enter and leave their networks. Concentrating only on FreeBSD PF firewall logs, this thesis demonstrates the excellent results that are possible when logs are collected to obtain a visual display of what data are traversing the corporate network and which parts of those data pose a threat to it. This threat is easily determined via a visual interpretation of statistical outliers. This thesis aims to show that, in the field of corporate data protection, if you can measure, you can manage.
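A minimal sketch of the underlying idea, using simplified, hypothetical log lines rather than real tcpdump-decoded pflog output: aggregate blocked events per source address and flag statistical outliers for visual inspection:

```python
# Outlier-spotting sketch over simplified firewall log lines. The line format
# and the z-score cut-off are assumptions for illustration only.
from collections import Counter
from statistics import mean, stdev

log_lines = (
    ["block in on em0 from 203.0.113.7 to 192.0.2.10 port 22"] * 60
    + ["block in on em0 from 198.51.100.4 to 192.0.2.10 port 80"] * 5
    + ["block in on em0 from 198.51.100.9 to 192.0.2.11 port 443"] * 3
    + ["block in on em0 from 192.0.2.77 to 192.0.2.10 port 25"] * 4
)

# Count blocked events per source IP
counts = Counter(line.split(" from ")[1].split()[0] for line in log_lines)

mu, sigma = mean(counts.values()), stdev(counts.values())
for ip, n in counts.most_common():
    marker = "  <-- outlier" if sigma and (n - mu) / sigma > 1.0 else ""
    print(f"{ip:15} {'#' * (n // 2)} ({n}){marker}")
```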
135
A context-aware collaborative decision making framework for combating terrorism in Africa
Odhiambo, Nancy Achieng 19 June 2020 (has links)
PhD (Business Information Systems) / Department of Business Information Systems / Collaborative Decision Making (CDM) is a never-ending challenge in complex-problem situations where multiple actors are involved. Complex-problem situations involve problems that are ill-defined, ill-structured and wicked, such as terrorism. Problems of this nature usually warrant a collaborative effort between actors (organizations) with multiple skill sets and expertise that may at times be at variance with one another. To address this challenge, three sub-objectives were derived from the main research objective: "To determine how optimal/effective CDM can be realized amongst counter-terrorism organizations through context-aware technologies."
Using the theory of synergetics and following deductive thematic analysis, the socio-technical nature of the terrorism problem was depicted by postulating a Digital Terrorism Ecology consisting of Open Digital Infrastructure (ODI), Digital Information Ecosystem (DIE), Digital Terrorism Enactment (DTE), Digital Capability and Digital Enslavement. Based on institutional theory and using the PLS-SEM technique, Group/departmental relationships, Organizational co-operation, Organizational form, Technical infrastructure and interoperability, and Information and knowledge sharing were identified as the factors influencing the attainment of optimal/effective CDM (OCDM) amongst counter-terrorism organizations. To explicate the role of context-aware technologies in enhancing CDM amongst counter-terrorism organizations, a context-aware CDM framework was developed following the Design Science Research (DSR) methodology.
The findings made it evident that the attainment of OCDM in counter-terrorism contexts is challenging even though it is essential. Among the factors considered as possible influencers of OCDM, Organizational form (OF) was found to influence Organizational co-operation (OC) and Technical infrastructure and interoperability (TI); Group/departmental relationships (GDR) were found to influence OF and OC; TI was found to influence OC and GDR; and Information and knowledge sharing (IKS) was found to influence OCDM. Of the three pillars of institutional theory, the regulative pillar offered the most insight into issues related to rules, discourse and practice, and hence into the challenges of attaining OCDM. Practically, this study aims to re-orient the thinking of counter-terrorism organizations by presenting the socio-technical nature of the terrorism problem and explicating the role of digital technologies in terrorism. / NRF
136
IMPACT OF ANTI-FORENSICS TECHNIQUES ON DIGITAL FORENSICS INVESTIGATION
Etow, Tambue Ramine January 2020 (has links)
Computer crimes have become very complex in terms of investigation and prosecution, mainly because forensic investigations are based on artifacts left on computers and other digital devices. In recent times, perpetrators of computer crimes have kept abreast of digital forensics dynamics and are thus equipped to use anti-forensics measures and techniques to obfuscate the investigation process. In cases where such techniques are employed, it becomes extremely difficult, expensive and time-consuming to carry out an effective investigation, which might cause a digital forensics expert to abandon the investigation. This project work practically demonstrates how numerous anti-forensics techniques can be deployed by criminals to derail the smooth process of a digital forensic investigation, with the main focus on data hiding and encryption techniques; it then conducts a comparative study of the effectiveness of selected digital forensics tools in analyzing and reporting shreds of evidence.
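As one concrete illustration of the detection side of this problem (a sketch with an assumed cut-off, not a procedure from the project): high, near-uniform byte entropy is a common heuristic forensic tools use to flag encrypted or deliberately hidden payloads:

```python
# Entropy heuristic sketch: near-uniform byte entropy suggests an encrypted or
# otherwise hidden payload. The 7.5 bits/byte cut-off is an assumption.
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte; 8.0 is the maximum (uniform bytes)."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

plaintext = b"The quick brown fox jumps over the lazy dog. " * 100
ciphertext_like = os.urandom(4096)   # stands in for an encrypted blob

for label, blob in [("plaintext", plaintext), ("random/encrypted", ciphertext_like)]:
    h = shannon_entropy(blob)
    flag = "suspicious" if h > 7.5 else "ordinary"
    print(f"{label:18} entropy={h:.2f} bits/byte -> {flag}")
```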
137
A Multi-Modal Insider Threat Detection and Prevention based on Users' Behaviors
Hashem, Yassir 08 1900 (has links)
Insider threat is one of the greatest concerns in information security and can cause more significant financial losses and damage than any other attack. However, implementing an efficient detection system is a very challenging task. It has long been recognized that solutions to insider threats are mainly user-centric, and several psychological and psychosocial models have been proposed. Measures of a user's psychophysiological behavior can provide an excellent source of information for detecting malicious behavior and mitigating insider threats. In this dissertation, we propose a multi-modal framework, based on a user's psychophysiological measures and computer-based behaviors, to distinguish a user's behavior during regular activities from that during malicious activities. We utilize several psychophysiological measures, such as electroencephalogram (EEG), electrocardiogram (ECG), and eye-movement and pupil behaviors, along with computer-based behaviors, such as mouse-movement dynamics and keystroke dynamics, to build our framework for detecting malicious insiders. We conduct human-subject experiments to capture the psychophysiological measures and computer-based behaviors for a group of participants while they perform several computer-based activities in different scenarios. We analyze the behavioral measures, extract useful features, and evaluate their capability to detect insider threats. We investigate each measure separately, and then use data-fusion techniques to build two modules and a comprehensive multi-modal framework. The first module combines the synchronized EEG and ECG psychophysiological measures, and the second module combines the eye-movement and pupil behaviors with the computer-based behaviors to detect malicious insiders. The multi-modal framework utilizes all the measures and behaviors in one model to achieve better detection accuracy. Our findings demonstrate that psychophysiological measures can reveal valuable knowledge about a user's malicious intent and can serve as effective indicators in the design of insider-threat monitoring and detection frameworks. Our work lays the foundation for a new generation of insider-threat detection and mitigation mechanisms that are based on a user's involuntary behaviors, such as psychophysiological measures, and that learn from real-time data to determine whether a user is malicious.
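A minimal sketch of feature-level fusion, the core of the data-fusion step described above, using synthetic features with hypothetical dimensions rather than real EEG/ECG or keystroke data:

```python
# Feature-level fusion sketch: per-modality feature vectors are concatenated
# into one input on which a single classifier is trained. Feature dimensions,
# labels and the injected shift are synthetic assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200

# Synthetic per-modality features: physiological (8 dims), computer-based (6 dims)
physio = rng.normal(size=(n, 8))
computer = rng.normal(size=(n, 6))
y = rng.integers(0, 2, size=n)        # 1 = malicious session (toy labels)
physio[y == 1] += 0.8                 # inject a detectable shift for the demo

X = np.hstack([physio, computer])     # feature-level fusion
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```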
138
Moral Judgment and Digital Piracy: Predicting Attitudes, Intention, and Behavior Regarding Digital Piracy Using a Modified Version of the Defining Issues Test
Wang, Jie (Financial professional) 12 1900 (has links)
Digital piracy, the illegal copying or downloading of copyrighted digital products without approval from the copyright holders, has brought great economic loss to the software and digital media industries. Previous studies using moral developmental theory have not found consistent relationships between moral judgment and attitudes towards digital piracy. While some researchers have developed individual test items to assess relationships between moral judgment and attitudes toward digital piracy, others have relied on the Defining Issues Test (DIT). However, because the DIT is a general measure of moral judgment based on broad social issues, it, too, may not adequately assess an individual's reasoning about digital piracy specifically. The purpose of this study was to create a reliable instrument (the DP-DIT), modeled after the DIT and designed to assess moral judgment regarding digital piracy, as well as to examine and compare the ability of both the DP-DIT and the DIT2-short to predict college students' attitudes, intentions and behaviors regarding digital piracy. Results indicated that the reliability of both the DIT2-short and the DP-DIT was discounted, quite likely due to the small number of stories contained in each. The DP-DIT appeared to have greater predictive ability owing to its advantage in predicting attitudes toward digital piracy, especially using the DP-DIT MNS. However, although the DP-DIT MNS was the strongest predictor of attitudes toward digital piracy, it explained a limited amount of variance. Further research to improve the reliability and validity of the DP-DIT is warranted.
139
Computer seizure as technique in forensic investigation
Ndara, Vuyani 19 March 2014 (has links)
The problem encountered by the researcher was that the South African Police Service Cyber-Crimes Unit is experiencing problems in seizing computer evidence. The following problems were identified by the researcher in practice: evidence is destroyed or lost because of mishandling by investigators; computer evidence is often not obtained or recognised, due to a lack of knowledge and skills on the part of investigators to properly seize it; it is difficult to establish authenticity and to initiate a chain of custody for the seized evidence; the training currently offered does not cover critical steps in the seizure of computer evidence; and computer seizure as a technique requires specialised knowledge and continuous training, because information technology is an ever-changing industry.
An empirical research design, followed by a qualitative research approach, allowed the researcher to obtain information from practice as well. A thorough literature study, complemented by interviews, was done to collect the required data for the research. Members of the South African Police Service Cyber-Crime Unit and prosecutors dealing with cyber-crime cases were interviewed to obtain their input into, and experiences on, the topic.
The aim of the study was to explore the role of computers in the forensic investigation process, and to determine how computers can be seized without compromising evidence. The study therefore also aimed at creating an understanding and awareness of the slippery nature of computer evidence, and of how it can find its way to a court of law without being compromised. The research revealed that computer crime differs from common-law or traditional crimes: it is complicated, and therefore only skilled and qualified forensic experts should be used to seize computer evidence, to ensure that the evidence is not compromised. The training of cyber-crime technicians has to be a priority in order for computer seizures to be successful. / Department of Criminology / M.Tech. (Forensic Investigation)
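One safeguard implied by these findings can be sketched as follows (hypothetical log format and names, not a procedure from the study): hash seized media at acquisition and re-hash at every custody transfer, so that authenticity can later be demonstrated in court:

```python
# Chain-of-custody sketch: record a SHA-256 digest of the evidence item at
# every transfer. The entry fields and officer name are illustrative.
import hashlib
import json
from datetime import datetime, timezone

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def custody_entry(path: str, officer: str, action: str) -> dict:
    return {
        "item": path,
        "sha256": sha256_of(path),      # re-hash at every transfer
        "officer": officer,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Example: record acquisition of a (hypothetical) disk image
with open("evidence.img", "wb") as f:   # stand-in file for the demo
    f.write(b"raw disk image bytes")
log = [custody_entry("evidence.img", "Insp. A. Example", "acquired")]
print(json.dumps(log, indent=2))
```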
140
Developing a multidisciplinary digital forensic readiness model for evidentiary data handling
Pooe, El Antonio 05 1900 (has links)
There is growing global recognition of the importance of outlawing malicious computer-related acts in a timely manner, yet few organisations have the legal and technical resources necessary to address the complexities of adapting criminal statutes to cyberspace. Literature reviewed in this study suggests that a coordinated public-private partnership to produce a model approach can help reduce the potential dangers arising from the inadvertent creation of cybercrime havens. It is against this backdrop that the study seeks to develop a digital forensic readiness model (DFRM) using a coordinated, multidisciplinary approach involving both the public and private sectors, thus enabling organisations to reduce the potential dangers arising from the inadvertent destruction and negation of evidentiary data, which, in turn, results in the non-prosecution of digital crimes.
The thesis makes use of ten hypotheses to address the five research objectives, which are aimed at investigating the problem statement. The study constitutes qualitative research and adopts a post-modernist approach. It begins by investigating each of the ten hypotheses, utilising a systematic literature review and interviews, followed by a triangulation of findings in order to identify and explore common themes and strengthen the grounded-theory results. The output of the latter process is used as the theoretical foundation for the development of a DFRM, which is then validated and verified against actual case law. Findings show that a multidisciplinary approach to digital forensic readiness can aid in preserving the integrity of evidentiary data within an organisation. The study identifies three key domains and their critical components, and demonstrates how the interdependencies between the domains and their respective components can enable organisations to identify and manage vulnerabilities that may contribute to the inadvertent destruction and negation of evidentiary data. The Multidisciplinary Digital Forensic Readiness Model (M-DiFoRe) provides a proactive approach to creating and improving organisational digital forensic readiness. This study contributes to the greater body of knowledge in digital forensics in that it reduces the complexities associated with achieving digital forensic readiness and streamlines the handling of digital evidence within an organisation. / Information Science / Ph.D. (Information Systems)