1 |
Protokollwechsel zur Realisierung von Covert Channels und Header-Strukturveränderungen zur Vermeidung von Covert Channels [Protocol switching for realising covert channels, and header structure modification for preventing covert channels]. Wendzel, Steffen. 13 May 2009 (has links) (PDF)
This diploma thesis addresses several subtopics of covert communication channels (covert channels) and, above all, aims to introduce and discuss new topics. First detailed treatment of protocol hopping covert channels: protocol hopping covert channels are storage channels that switch, while they exist, the network protocol in which the hidden information is embedded. Introduction of the idea of protocol channels: in contrast to protocol hopping covert channels, protocol channels are harder to detect, since they transmit hidden data solely by switching between protocols, without hiding any additional information. For both protocol hopping covert channels and protocol channels, this thesis describes the respective technique and examines detection possibilities. Introduction of the idea of header structure modification: the goal of header structure modification is to restrict the options attackers have when creating storage channels inside packet headers. With header structure modification, the layout of packet headers is changed for every newly sent packet. The corresponding structure information that determines the header layout is accessible only to trusted components at the receiver and sender. This thesis presents both a theoretical model of header structure modification and a practical implementation.
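The protocol-channel idea can be conveyed with a small sketch. This is a pure simulation (no real network traffic): the choice of two protocols and the one-bit-per-packet encoding are illustrative assumptions, not the thesis's exact implementation.

```python
# Simulated protocol channel: the covert message is carried solely by
# WHICH protocol each otherwise unremarkable packet uses.
# Illustrative assumption: two protocols encode one bit per packet.
BIT_TO_PROTOCOL = {0: "ARP", 1: "ICMP"}
PROTOCOL_TO_BIT = {p: b for b, p in BIT_TO_PROTOCOL.items()}

def encode(message: bytes) -> list:
    """Turn a byte string into a sequence of protocol choices."""
    packets = []
    for byte in message:
        for shift in range(7, -1, -1):  # most significant bit first
            packets.append(BIT_TO_PROTOCOL[(byte >> shift) & 1])
    return packets

def decode(packets: list) -> bytes:
    """Recover the message from the observed protocol sequence."""
    bits = [PROTOCOL_TO_BIT[p] for p in packets]
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

secret = b"hi"
assert decode(encode(secret)) == secret
```

A detector inspecting only packet payloads sees nothing unusual here; the information lives entirely in the protocol sequence, which is what makes protocol channels hard to spot.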
|
2 |
A design-by-contract based approach for architectural modelling and analysis. Ozkaya, M. January 2014 (has links)
Research on software architectures has been active since the early nineties, leading to a number of different architecture description languages (ADLs). Given their importance in facilitating the communication of crucial system properties to different stakeholders, and in enabling analysis early on in the development of a system, this is understandable. However, practitioners rarely use ADLs and instead insist on using the Unified Modelling Language (UML) for specifying software architectures. I attribute this to three main issues that have not been addressed together by the existing ADLs. Firstly, in their attempt to support formal analysis, current ADLs employ formal notations (mostly process algebras) that are rarely used among practitioners. Secondly, many ADLs focus on components when specifying software architectures, neglecting the first-class specification of complex interaction protocols as connectors. They view connectors as simple interaction links that merely identify the communicating components and their basic communication style (e.g., procedure call). So, complex interaction protocols are specified as part of components, which, however, reduces the re-usability of both. Lastly, some ADLs do support complex connectors. However, these include a centralised glue element in their connector structure that imposes a global ordering of actions on the interacting components. Such global constraints are not always realisable in a decentralised manner by the components that participate in these protocols. In this PhD thesis, I introduce a new architecture description language called XCD that supports the formal specification of software architectures without employing a complex formal notation and offers first-class connectors for maximising the re-use of components and protocols.
Furthermore, by omitting any units for specifying global constraints (i.e., glue), the architecture specifications in XCD are guaranteed to be realisable in a decentralised manner. I show in the thesis how XCD extends Design-by-Contract (DbC) for specifying (i) protocol-independent components and (ii) complex connectors, which can impose only local constraints so as to guarantee their realisability. The use of DbC will hopefully make the language easier for practitioners to adopt than languages based on process algebras. I also show a precise translation of XCD into SPIN's formal ProMeLa language for formally verifying that, in a software architecture, (i) services offered by components are always used correctly, (ii) component behaviours are always complete, (iii) there are no race conditions, (iv) there is no deadlock, and (v) for components with event communications, event buffers do not overflow. Finally, I evaluate XCD via five well-known case studies and illustrate XCD's enhanced modularity, expressive DbC-based notation, and guaranteed realisability for architecture specifications.
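Design-by-Contract, which XCD builds on, can be sketched in a few lines. The decorator and the bounded-stack component below are generic illustrations of the DbC idea (checked pre- and postconditions on each service), not XCD's actual notation.

```python
# Minimal Design-by-Contract sketch: pre- and postconditions are checked
# around each call, so a contract violation fails fast at the call site.
def contract(pre=None, post=None):
    def wrap(method):
        def checked(self, *args):
            if pre is not None:
                assert pre(self, *args), "precondition violated"
            result = method(self, *args)
            if post is not None:
                assert post(self, result), "postcondition violated"
            return result
        return checked
    return wrap

class BoundedStack:
    """Toy component whose services carry contracts."""
    def __init__(self, capacity):
        self.items, self.capacity = [], capacity

    @contract(pre=lambda self, x: len(self.items) < self.capacity,
              post=lambda self, _: len(self.items) >= 1)
    def push(self, x):
        self.items.append(x)

    @contract(pre=lambda self: len(self.items) > 0)
    def pop(self):
        return self.items.pop()
```

A model checker such as SPIN explores all interleavings of such contracts; the runtime checks above only catch violations on the executions actually taken, which is why the thesis translates contracts to ProMeLa for exhaustive verification.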
|
3 |
A secure and scalable communication framework for inter-cloud services. Sajjad, Ali. January 2015 (has links)
Many contemporary cloud computing platforms offer the Infrastructure-as-a-Service provisioning model, which delivers basic virtualized computing resources such as storage, hardware, and networking as on-demand, dynamic services. However, a single cloud service provider does not have limitless resources to offer to its users, and users increasingly demand extensibility and inter-operability with other cloud service providers. This has increased the complexity of the cloud ecosystem and resulted in the emergence of the concept of an Inter-Cloud environment, in which a cloud computing platform can use the infrastructure resources of other cloud computing platforms to offer greater value and flexibility to its users. However, no common models or standards exist that allow the users of cloud service providers to provision even basic services across multiple cloud service providers seamlessly, although admittedly this is not due to any inherent incompatibility or proprietary nature of the foundation technologies on which these cloud computing platforms are built. There is therefore a justified need to investigate models and frameworks which allow the users of cloud computing technologies to benefit from the added value of the emerging Inter-Cloud environment. In this dissertation, we present a novel security model and protocols that aim to cover one of the most important gaps in this field: the problem of provisioning secure communication within the context of a multi-provider Inter-Cloud environment. Our model offers a secure communication framework that enables a user of multiple cloud service providers to provision a dynamic, application-level secure virtual private network on top of the participating cloud service providers.
We accomplish this by leveraging the scalability, robustness, and flexibility of peer-to-peer overlays and distributed hash tables, together with novel applications of applied cryptography techniques, to design secure and efficient admission control and resource discovery protocols. The peer-to-peer approach helps us eliminate the problems of manual configuration, key management, and peer churn that are encountered when setting up secure communication channels dynamically, whereas the secure admission control and secure resource discovery protocols plug the security gaps that are commonly found in peer-to-peer overlays. In addition to the design and architecture of our research contributions, we present the details of a prototype implementation containing all of the elements of our research, as well as experimental results detailing the performance, scalability, and overheads of our approach, carried out on actual (as opposed to simulated) commercial and non-commercial cloud computing platforms. These results demonstrate that our architecture incurs minimal latency and throughput overheads (5% and 10% respectively) for the Inter-Cloud VPN connections among the virtual machines of a service deployed on multiple cloud platforms. Our results also show that our admission control scheme is approximately 82% more efficient, and our secure resource discovery scheme about 72% more efficient, than a standard PKI-based (Public Key Infrastructure) scheme.
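The resource-discovery side of such a framework rests on a distributed hash table. A minimal consistent-hashing sketch conveys the core lookup mechanism; the node names and the 32-bit identifier ring are illustrative assumptions, and the thesis's actual overlay (with its secure admission and discovery protocols) is far richer.

```python
import hashlib
from bisect import bisect_right

def ring_position(key: str) -> int:
    """Place a key on a 2**32 identifier ring via SHA-256."""
    return int(hashlib.sha256(key.encode()).hexdigest(), 16) % 2**32

class TinyDHT:
    """Each node owns the arc of the ring ending at its own position."""
    def __init__(self, nodes):
        self.ring = sorted((ring_position(n), n) for n in nodes)

    def lookup(self, resource_key: str) -> str:
        """Route a resource key to its responsible node (its successor
        on the ring, wrapping around past the last node)."""
        pos = ring_position(resource_key)
        points = [p for p, _ in self.ring]
        idx = bisect_right(points, pos) % len(self.ring)
        return self.ring[idx][1]

# Hypothetical VMs from two providers participating in one overlay.
dht = TinyDHT(["cloud-a-vm1", "cloud-b-vm1", "cloud-b-vm2"])
owner = dht.lookup("vpn-endpoint/service-42")
```

Because every peer computes the same hash, any peer can locate the node responsible for a resource without central coordination, which is what removes the manual-configuration burden the abstract mentions.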
|
4 |
Visualisation for household energy analysis: techniques for exploring multiple variables across scale and geography. Goodwin, Sarah M. January 2015 (has links)
The visualisation of large volumes of data can provide rich and meaningful representations that enable users to gain insights quickly and efficiently. This thesis explores household energy consumer characteristics using innovative interactive visualisation techniques. Initial research with energy analysts from a major UK utility company investigates visual possibilities and opportunities for future (smart home) energy analytics, and explicitly uses creativity techniques for information visualisation requirements gathering. The results, along with exploratory visual analysis combining geodemographic groups and energy consumption, identify a need for profiling consumers by typical traits. While energy consumption has been a popular topic of research in recent years, there is still limited understanding of the relationship between energy consumption and measurable characteristics of the general population. An investigation of the process of creating an energy-based geodemographic classification led to the proposal and design of a new theoretical framework for visually comparing multivariate data across scale and geography: a necessary step when selecting reliable variables for clustering algorithms, such as during the creation of a geodemographic classification. This framework for including geography and scale in multivariate comparison forms the major contribution of the thesis. The framework is demonstrated and justified through the building of an interactive visualisation prototype, using input variables deemed relevant for an energy-based geodemographic classification. Important transitions in the framework are highlighted in the proposed design, which uses both statistical and spatial representations. The utility of the framework is validated in the context of energy-based geodemographic variable selection, where the multivariate geography of the UK is explored.
The sensitivities of varying scale and geography (through varying resolution and extent, and the calculation of locally weighted summary statistics) are investigated in context and are shown to be important elements to consider during the variable selection process. The broader applicability of the framework is demonstrated through two further scenarios where multivariate visualisation across scale and geography is shown to be important. The research provides a framework and viable solutions through which geographical visual parameter space analysis (gvPSA) can be undertaken. It uses a design science approach that results in a series of artifacts that open up new visualisation possibilities. This project covers a wide topic where the breadth of research options is extensive, and many possibilities for continued research are identified.
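A locally weighted summary statistic of the kind mentioned above can be sketched as a distance-decayed weighted mean. The Gaussian kernel, the bandwidth, and the toy consumption figures are illustrative assumptions; the thesis's own weighting scheme may differ.

```python
import math

def local_weighted_mean(points, values, x, y, bandwidth=1.0):
    """Mean of `values`, weighted by a Gaussian kernel of the distance
    from (x, y) to each observation, so nearby observations dominate."""
    weights = []
    for px, py in points:
        d2 = (px - x) ** 2 + (py - y) ** 2
        weights.append(math.exp(-d2 / (2 * bandwidth ** 2)))
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total

# Hypothetical consumption readings at four locations; the local mean
# near (0, 0) is pulled toward the nearby low readings, while the
# distant high readings contribute almost nothing.
pts = [(0, 0), (0.1, 0), (5, 5), (5.1, 5)]
vals = [10.0, 12.0, 100.0, 104.0]
near_origin = local_weighted_mean(pts, vals, 0, 0)
```

Varying the bandwidth is one way to expose the scale sensitivity the abstract describes: a large bandwidth approaches the global mean, a small one isolates purely local structure.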
|
5 |
Creativity support in games for motivated learning. Sisarica, Anja. January 2015 (has links)
A natural extension of play for creative thinking can innovatively drive technology-led changes to the facilitation of creative problem solving, and generate a new genre in serious gaming. Whilst the use of serious games has grown considerably in recent years, support for players to think creatively is often implicit in the game, and does not exploit the wide range of creativity techniques and software tools available. The work reported in this thesis is the first to explicitly integrate creativity support into serious games. The results show that creative serious games can systematically support acquisition of creativity skills, generation of creative learning outcomes, and induction of motivational and learning benefits amongst the players. Therefore, this thesis introduces the concept of explicit creativity support in serious games, with a focus on games for motivated learning in adult professional settings, and reports formative and summative evaluations of new prototype games for this setting, in order to instantiate, refine and validate the concept. The creative learning objective of the prototype games was to train carers in creativity techniques to deliver more person-centred care to people with dementia. The findings are delivered in the form of a new framework, which proposes recommendations for the design and understanding of creative serious games. Four formative evaluations of three prototypes of creative serious games with carers provided results that led to refinements of the framework and the design of more usable and effective games. A subsequent summative evaluation partially validated the framework, delivering both a framework and a prototype creative serious game that demonstrated the potential to improve person-centred dementia care training. The thesis provides a proof-of-concept of the value of creative serious games, and shows the potential for the framework to be applied to, and have impact on, other application domains.
|
6 |
Mining intrusion detection alert logs to minimise false positives & gain attack insight. Shittu, Riyanat O. January 2016 (has links)
Utilising Intrusion Detection System (IDS) logs in security event analysis is crucial in the process of assessing, measuring and understanding the security state of a computer network, often defined by its current exposure and resilience to network attacks. The study of understanding network attacks through event analysis is thus a fast-growing area. In comparison to its first appearance a decade ago, the complexities involved in achieving effective security event analysis have significantly increased. With such increased complexities, advances in security event analytical techniques are required in order to maintain timely mitigation and prediction of network attacks. This thesis focusses on improving the quality of analysing network event logs, particularly intrusion detection logs, by exploring alternative analytical methods which overcome some of the complexities involved in security event analysis. This thesis provides four key contributions. Firstly, we explore how the quality of intrusion alert logs can be improved by eliminating the large volume of false positive alerts contained in intrusion detection logs. We investigate probabilistic alert correlation, an alternative to traditional rule-based correlation approaches. We hypothesise that probabilistic alert correlation aids in discovering and learning the evolving dependencies between alerts, further revealing attack structures and information which can be vital in eliminating false positives. Our findings support the hypothesis, aligning consistently with existing literature. In addition, evaluating the model using recent attack datasets (in comparison to the outdated datasets used in many research studies) allowed the discovery of a new set of issues relevant to modern security event log analysis which have only been introduced and addressed in a few research studies.
Secondly, we propose a set of novel prioritisation metrics for the filtering of false positive intrusion alerts using knowledge gained during alert correlation. A combination of heuristic, temporal and anomaly detection measures is used to define metrics which capture characteristics identifiable in common attacks, including denial-of-service attacks and worm propagations. The most relevant of the novel metrics, Outmet, is based on the well-known Local Outlier Factor algorithm. Our findings show that, with a slight trade-off in sensitivity (i.e. true positive performance), Outmet reduces false positives significantly. In comparison to the prior state of the art, our findings show that it performs more efficiently across a variation of attack scenarios. Thirdly, we extend a well-known real-time clustering algorithm, CluStream, in order to support the categorisation of attack patterns represented as graph-like structures. Our motive behind attack pattern categorisation is to provide automated methods for capturing consistent behavioural patterns across a given class of attacks. To our knowledge, this is a novel approach to intrusion alert analysis. The extension of CluStream resulted in a novel lightweight real-time clustering algorithm for graph structures. Our findings are new and complement existing literature. We discovered that in certain case studies, repetitive attack behaviour could be mined; such a discovery could facilitate the prediction of future attacks. Finally, we acknowledge that due to the intelligence and stealth involved in modern network attacks, automated analytical approaches alone may not suffice in making sense of intrusion detection logs. Thus, we explore visualisation and interactive methods for effective visual analysis which, if combined with the automated approaches proposed, would improve the overall results of the analysis.
The result of this is a visual analytic framework, integrated and tested in a commercial Cyber Security Event Analysis Software System distributed by British Telecom.
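The Outmet metric builds on the Local Outlier Factor. The sketch below is a compact pure-Python rendering of the classic LOF computation, not the thesis's Outmet itself, and it is simplified (e.g. ties beyond the first k neighbours are ignored); the 2-D "alert feature" points are illustrative assumptions.

```python
import math

def knn(points, i, k):
    """Indices of the k nearest neighbours of point i."""
    dists = sorted((math.dist(points[i], points[j]), j)
                   for j in range(len(points)) if j != i)
    return [j for _, j in dists[:k]]

def lof(points, k=3):
    """Local Outlier Factor: ratio of each point's neighbours' density
    to its own; values well above 1 indicate outliers."""
    n = len(points)
    neigh = [knn(points, i, k) for i in range(n)]
    k_dist = [math.dist(points[i], points[neigh[i][-1]]) for i in range(n)]

    def lrd(i):  # local reachability density
        reach = [max(k_dist[j], math.dist(points[i], points[j]))
                 for j in neigh[i]]
        return k / sum(reach)

    dens = [lrd(i) for i in range(n)]
    return [sum(dens[j] for j in neigh[i]) / (k * dens[i]) for i in range(n)]

# A tight cluster of "normal" alert feature vectors plus one isolated alert.
alerts = [(0, 0), (0, 1), (1, 0), (1, 1), (0.5, 0.5), (10, 10)]
scores = lof(alerts, k=3)
```

Because LOF is a ratio of densities, it flags alerts that are locally sparse even when the global density varies, which is what makes it attractive for prioritising unusual alert patterns over the bulk of repetitive false positives.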
|
7 |
Representation decomposition for knowledge extraction and sharing using restricted Boltzmann machines. Tran, Son. January 2016 (has links)
Restricted Boltzmann machines (RBMs), with many variations and extensions, are an efficient neural network model that has recently been applied very successfully as a building block for deep networks in diverse areas ranging from language generation to video analysis and speech recognition. Despite their success and the creation of increasingly complex network models and learning algorithms based on RBMs, the question of how knowledge is represented, and could be shared, by such networks has received comparatively little attention. Neural networks are notorious for being difficult to interpret. The area of knowledge extraction addresses this problem by translating network models into symbolic knowledge. Knowledge extraction has normally been applied to feed-forward neural networks trained in a supervised fashion using the back-propagation learning algorithm. More recently, research has shown that the use of unsupervised models may improve the performance of network models at learning structures from complex data. In this thesis, we study and evaluate the decomposition of the knowledge encoded by training stacks of RBMs into symbolic knowledge that can offer: (i) a compact representation for recognition tasks; (ii) an intermediate language between hierarchical symbolic knowledge and complex deep networks; (iii) an adaptive transfer learning method for knowledge reuse. These capabilities are the fundamentals of a Learning, Extraction and Sharing (LES) system, which we have developed. In this system, learning can automate the process of encoding knowledge from data into an RBM, extraction then translates the knowledge into symbolic form, and sharing allows parts of the knowledge base to be reused to improve learning in other domains. To this end, in this thesis we introduce confidence rules, which allow the combination of symbolic knowledge and quantitative reasoning.
Inspired by Penalty Logic, introduced for Hopfield networks, confidence rules establish a relationship between logical rules and RBMs. However, instead of representing propositional well-formed formulas, confidence rules are designed to account for the reasoning of a stack of RBMs, to support modular learning and hierarchical inference. This approach shares common objectives with the work on neural-symbolic cognitive agents. We show, both in theory and through empirical evaluations, that a hierarchical logic program in the form of a set of confidence rules can be constructed by decomposing representations in an RBM or a deep belief network (DBN). This decomposition is at the core of a new knowledge extraction algorithm which is computationally efficient. The extraction algorithm seeks to benefit from the symbolic knowledge representation that it produces in order to improve network initialisation in the case of transfer learning. To this end, confidence rules offer a language for encoding symbolic knowledge into a deep network, resulting, as shown empirically in this thesis, in an improvement in modular learning and reasoning. As far as we know, this is the first attempt to extract, encode, and transfer symbolic knowledge among DBNs. In a confidence rule, a real value, named the confidence value, is associated with a logical implication rule. We show that the logical rules with the highest confidence values can perform similarly to the original networks. We also show that by transferring and encoding representations learned from one domain onto another related or analogous domain, one may improve the performance of representations learned in this other domain. To this end, we introduce a novel algorithm for transfer learning called "Adaptive Profile Transferred Likelihood", which adapts transferred representations to target domain data.
This algorithm is shown to be more effective than the simple combination of transferred representations with the representations learned in the target domain. It is also less sensitive to noise and therefore more robust when dealing with the problem of negative transfer.
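The basic shape of a confidence rule, a real-valued confidence attached to a logical implication read off an RBM's weights, can be sketched as follows. The thresholding scheme and the toy weight matrix are illustrative assumptions; the thesis's extraction algorithm for stacked RBMs is considerably more involved.

```python
def extract_confidence_rules(W, threshold=0.5):
    """Read one confidence rule per hidden unit from an RBM weight
    matrix W (rows = visible units, columns = hidden units).
    A strongly positive weight contributes a positive literal, a
    strongly negative weight a negated one; the confidence value is
    the total absolute weight mass behind the rule."""
    rules = []
    num_hidden = len(W[0])
    for j in range(num_hidden):
        column = [row[j] for row in W]
        body = []
        for i, w in enumerate(column):
            if w >= threshold:
                body.append(f"x{i}")
            elif w <= -threshold:
                body.append(f"not x{i}")
        confidence = sum(abs(w) for w in column)
        rules.append((confidence, f"h{j}", body))
    # Highest-confidence rules first: these approximate the network best.
    return sorted(rules, reverse=True)

# Hypothetical 3-visible / 2-hidden weight matrix.
W = [[ 2.0, 0.1],
     [-1.5, 0.1],
     [ 0.2, 3.0]]
rules = extract_confidence_rules(W)
```

Keeping only the top of this confidence-ordered list mirrors the thesis's observation that the highest-confidence rules alone can perform similarly to the original network.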
|
8 |
The diagnostic efficacy of JPEG still image compression in three radiological imaging modalities. Patefield, Steven. January 2002 (has links)
No description available.
|
9 |
A story model of report and work in neuroradiology. Rooksby, J. January 2002 (has links)
No description available.
|
10 |
An object layer for conventional file-systems. Wheatman, Martin J. January 1999 (has links)
No description available.
|