41

Taxonomies for software security

Corcalciuc, Horia V. January 2014 (has links)
A recurring problem with software security is that programmers are encouraged to reason about correctness either at the code level or at the design level, while attacks often take place on intermediary layers of abstraction. The code itself may seem correct and secure as long as its functionality has been demonstrated - for example, by showing that some invariant has been maintained. However, from a high-level perspective, parallel executing processes can be seen as one single large program consisting of smaller components that work together in order to accomplish a task, and for the duration of that interaction several smaller invariants have to be maintained. It is frequently the case that an attacker manages to subvert the behavior of a program when the invariants of these intermediary steps can be invalidated. Such invariants become difficult to track, especially when the programmer does not explicitly have security in mind. This thesis explores the mechanisms of interaction between concurrent processes and tries to bring some order to synchronization by studying attack patterns, not only at code level, but also from the perspective of abstract programming concepts.
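As a hedged illustration of the kind of intermediate-step invariant described above (a generic sketch, not an example taken from the thesis), the following Python snippet shows a check-then-act sequence whose invariant - the balance never goes negative - holds for each statement in isolation, yet can be violated when two threads interleave between the check and the update:

```python
import threading

balance = 100  # intended invariant: balance >= 0

def withdraw(amount):
    global balance
    # The check and the update are individually "correct", but the
    # invariant only holds if no other thread runs between them --
    # exactly the intermediate-step invariant an attacker can target.
    if balance >= amount:          # check
        # interleaving point: another thread may withdraw here
        balance -= amount          # act

threads = [threading.Thread(target=withdraw, args=(80,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(balance)  # may print -60: the invariant broke between check and act
```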
42

A framework for the analysis and comparison of process mining algorithms

Weber, Philip January 2014 (has links)
Process mining algorithms use event logs to learn and reason about business processes. Although process mining is essentially a machine learning task, little work has been done on systematically analysing algorithms to understand their fundamental properties, such as how much data is needed for confidence in mining. Nor does any rigorous basis exist on which to choose between algorithms and representations, or compare results. We propose a framework for analysing process mining algorithms. Processes are viewed as distributions over traces of activities and mining algorithms as learning these distributions. We use probabilistic automata as a unifying representation to which other representation languages can be converted. To validate the theory we present analyses of the Alpha and Heuristics Miner algorithms under the framework, and two practical applications. We propose a model of noise in process mining and extend the framework to mining from ‘noisy’ event logs. From the probabilities and sub-structures in a model, bounds can be given for the amount of data needed for mining. We also consider mining in non-stationary environments, and a method for recovery of the sequence of changed models over time. We conclude by critically evaluating this framework and suggesting directions for future research.
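To make the "processes as distributions over traces" view concrete, here is a minimal sketch (the states, activities and probabilities are invented for illustration, not taken from the thesis) of a probabilistic automaton assigning a probability to an activity trace:

```python
# Transitions: state -> {activity: (next_state, probability)}
# A tiny invented process: after 'a', either 'b' (0.7) or 'c' (0.3), then 'd'.
pa = {
    "s0": {"a": ("s1", 1.0)},
    "s1": {"b": ("s2", 0.7), "c": ("s2", 0.3)},
    "s2": {"d": ("end", 1.0)},
}

def trace_probability(trace, start="s0"):
    """Probability that the automaton generates the given trace of activities."""
    state, prob = start, 1.0
    for activity in trace:
        if state not in pa or activity not in pa[state]:
            return 0.0  # trace cannot be generated by this model
        state, p = pa[state][activity]
        prob *= p
    return prob if state == "end" else 0.0

print(trace_probability(["a", "b", "d"]))  # 0.7
print(trace_probability(["a", "c", "d"]))  # 0.3
print(trace_probability(["a", "d"]))       # 0.0
```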
43

Learning deep representations for robotics applications

Aktaş, Ümit Ruşen January 2018 (has links)
In this thesis, two hierarchical learning representations are explored in computer vision tasks. First, a novel graph-theoretic method for statistical shape analysis, called Compositional Hierarchy of Parts (CHOP), was proposed. The method utilises line-based features as the building blocks of its shape representation. A deep, multi-layer vocabulary is learned by recursively compressing this initial representation. The key contribution of this work is to formulate layer-wise learning as a frequent sub-graph discovery problem, solved using the Minimum Description Length (MDL) principle. The experiments show that CHOP exploits part shareability and data compression, and yields state-of-the-art shape retrieval performance on 3 benchmark datasets. In the second part of the thesis, a hybrid generative-evaluative method was used to solve the dexterous grasping problem. This approach combines a learned dexterous grasp generation model with two novel evaluative models based on Convolutional Neural Networks (CNNs). The data-efficient generative method learns from a human demonstrator. The evaluative models are trained in simulation, using the grasps proposed by the generative approach and depth images of the objects from a single view. On a real grasp dataset of 49 scenes with previously unseen objects, the proposed hybrid architecture outperforms the purely generative method, with a grasp success rate of 77.7% versus 57.1%. The thesis concludes by comparing the two families of deep architectures, compositional hierarchies and DNNs, providing insights into their strengths and weaknesses.
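As a rough, hypothetical sketch of the MDL intuition behind layer-wise vocabulary learning (the cost model and numbers below are simplified stand-ins, not the thesis's actual formulation): a candidate sub-part is worth adding to the vocabulary when the cost of storing it once is outweighed by the savings from replacing all of its occurrences with a single new symbol.

```python
import math

def description_length(alphabet_size, n_occurrences):
    # Crude cost model: each symbol occurrence costs log2(alphabet size) bits.
    return n_occurrences * math.log2(max(alphabet_size, 2))

def mdl_gain(part_size, frequency, alphabet_size):
    """Bits saved by adding a sub-part of `part_size` primitives occurring
    `frequency` times, replacing each occurrence by one new vocabulary symbol."""
    before = description_length(alphabet_size, part_size * frequency)
    after = (description_length(alphabet_size + 1, part_size)     # store the part once
             + description_length(alphabet_size + 1, frequency))  # reference it per occurrence
    return before - after

# A part of 4 primitives appearing 50 times over a 20-symbol alphabet:
print(mdl_gain(part_size=4, frequency=50, alphabet_size=20))  # positive -> worth compressing
```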
44

A value and debt aware framework for evaluating compliance in software systems

Ojameruaye, Bendra January 2016 (has links)
Today's software systems need to be aligned with relevant laws and other prevailing regulations in order to ensure compliance. Compliance refers to the ability of a system to satisfy its functional and quality goals to levels that are acceptable against predefined standards, guidelines, principles, legislation or other norms within the application domain. Addressing compliance requirements at an early stage of software development is vital for successful development, as it saves the time, cost, resources and effort of repairing software defects. We argue that the management of compliance and compliance requirements is ultimately an investment activity that requires value-driven decision-making. The work presented in this thesis revolves around improving decision support for compliance by making decisions value, risk and debt aware. Specifically, this thesis presents an economics-driven approach, which combines goal-oriented requirements engineering with portfolio-based thinking and technical debt analysis to enhance compliance-related decisions at design time. The approach is value-driven and systematic; it builds on influential work on portfolio thinking and technical debt to make the link between compliance requirements, risks, value and debt explicit to software engineers. The approach is evaluated with two case studies to illustrate its applicability and effectiveness.
45

Architecture-centric testing for security

Al-Azzani, Sarah January 2014 (has links)
This thesis presents a novel architecture-centric approach which uses Implied Scenarios (ISs) to detect design vulnerabilities in the software architecture. It reviews security testing approaches and highlights their limitations in addressing unpredictable behaviour in the face of evolution. The thesis introduces the concept of security ISs as unanticipated (possibly malicious) behaviours that indicate potential insecurities in the architecture. The IS approach uses the architecture as the appropriate level of abstraction to tackle the complexity of testing, offering the potential to scale to large, complex applications. It proposes a three-phased method for security testing: (1) detecting design-level vulnerabilities in the architecture in an incremental manner by composing functionalities as they evolve; (2) classifying the impact of detected ISs on the security of the architecture; (3) using the detected ISs and their impact to guide the refinement of the architecture. The refinement is test-driven and incremental: refinements are tested before they are committed. The thesis also presents SecArch, an extension to the IS approach that enlarges its search space to detect hidden race conditions. The thesis reports on the application of the proposed approach and its extension to three case studies, testing the security of distributed and cloud architectures in the presence of uncertainty in the operating environment, unpredictability of interaction, and possible security ISs.
46

Trusted execution : applications and verification

Batten, Ian Gilbert January 2016 (has links)
Useful security properties arise from sealing data to specific units of code. Modern processors featuring Intel’s TXT and AMD’s SVM achieve this by a process of measured and trusted execution. Only code which has the correct measurement can access the data, and this code runs in an environment protected from observation and interference. We discuss the history of attempts to provide security for hardware platforms, and review the literature in the field. We propose some applications which would benefit from trusted execution, and discuss the functionality it enables. We present in more detail a novel variation on Diffie-Hellman key exchange which removes some reliance on random number generation. We present a modelling language with primitives for trusted execution, along with its semantics. We characterise an attacker who has access to all the capabilities of the hardware. In order to achieve automatic analysis of systems using trusted execution without attempting to search a potentially infinite state space, we define transformations that reduce the number of times the attacker needs to use trusted execution to a pre-determined bound. Given reasonable assumptions, we prove the soundness of the transformation: no secrecy attacks are lost by applying it. We then describe using the StatVerif extensions to ProVerif to model the bounded invocations of trusted execution. We show the analysis of realistic systems, for which we provide case studies.
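For context only - this is the textbook Diffie-Hellman exchange, not the thesis's variant, and the parameters are toy values - the sketch below marks where fresh randomness normally enters the protocol, which is the reliance the proposed variation aims to reduce:

```python
import secrets

# Small demonstration parameters; a real deployment uses a standardised
# large prime group. These values are for illustration only.
p = 0xFFFFFFFB  # a small prime, NOT cryptographically sound
g = 5

# Each party's ephemeral secret is where the textbook protocol
# relies on random number generation.
a = secrets.randbelow(p - 2) + 1   # Alice's random exponent
b = secrets.randbelow(p - 2) + 1   # Bob's random exponent

A = pow(g, a, p)   # Alice sends A to Bob
B = pow(g, b, p)   # Bob sends B to Alice

shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob  # both parties derive the same shared secret
```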
47

Prototyping parallel functional intermediate languages

Ben-Dyke, Andrew David January 1999 (has links)
Non-strict higher-order functional programming languages are elegant, concise, mathematically sound and contain few environment-specific features, making them obvious candidates for harnessing high-performance architectures. The validity of this approach has been established by a number of experimental compilers. However, while there have been a number of important theoretical developments in the field of parallel functional programming, implementations have been slow to materialise. The myriad design choices and demands of specific architectures lead to protracted development times. Furthermore, the resulting systems tend to be monolithic entities, and are difficult to extend and test, ultimately discouraging experimentation. The traditional solution to this problem is the use of a rapid prototyping framework. However, as each existing system tends to prefer one specific platform and a particular way of expressing parallelism (including implicit specification), it is difficult to envisage a general-purpose framework. Fortunately, most of these systems have at least one point of commonality: the use of an intermediate form. Typically, these abstract representations explicitly identify all parallel components, but without the background noise of syntactic and (potentially arbitrary) implementation details. To this end, this thesis outlines a framework for rapidly prototyping such intermediate languages. Based on the traditional three-phase compiler model, the design process is driven by the development of various semantic descriptions of the language. Executable versions of the specifications help to both debug and informally validate these models. A number of case studies, covering the spectrum of modern implementations, demonstrate the utility of the framework.
48

Metric learning for incorporating privileged information in prototype-based models

Fouad, Shereen January 2013 (has links)
Prototype-based classification models, and particularly Learning Vector Quantization (LVQ) frameworks with adaptive metrics, are powerful supervised classification techniques with good generalization behaviour. This thesis proposes three advanced learning methodologies, in the context of LVQ, aiming at better classification performance under various classification settings. The first contribution presents a direct and novel methodology for incorporating valuable privileged knowledge in the LVQ training phase, but not in testing. This is done by manipulating the global metric in the input space, based on distance relations revealed by the privileged information. Several experiments have been conducted that serve as illustration, and demonstrate the benefit of incorporating privileged information on the classification accuracy. Subsequently, the thesis presents a relevant extension of LVQ models, with metric learning, to the case of ordinal classification problems. Unlike in existing nominal LVQ, in ordinal LVQ the class order information is explicitly utilized during training. Competitive results have been obtained on several benchmarks, which improve upon standard LVQ as well as benchmark ordinal classifiers. Finally, a novel ordinal-based metric learning methodology is presented that is principally intended to incorporate privileged information in ordinal classification tasks. The model has been verified experimentally through a number of benchmark and real-world data sets.
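As a minimal, generic illustration of prototype-based classification with an adaptive metric (an LVQ1-style sketch with diagonal relevance weights; the thesis's actual models, update rules and use of privileged information are more involved than this):

```python
import numpy as np

def weighted_dist(x, w, lam):
    """Adaptive squared distance: lam holds one relevance weight per feature."""
    return np.sum(lam * (x - w) ** 2)

def lvq1_step(x, y, prototypes, labels, lam, lr=0.05):
    """Move the closest prototype toward x if the labels match, away otherwise."""
    d = [weighted_dist(x, w, lam) for w in prototypes]
    k = int(np.argmin(d))
    sign = 1.0 if labels[k] == y else -1.0
    prototypes[k] += sign * lr * lam * (x - prototypes[k])
    return k

rng = np.random.default_rng(0)
prototypes = rng.normal(size=(2, 3))    # one prototype per class, 3 features
labels = np.array([0, 1])
lam = np.ones(3) / 3                    # relevances; adapted during training in relevance LVQ

x, y = rng.normal(size=3), 0
lvq1_step(x, y, prototypes, labels, lam)
```

In relevance-learning variants the weights in lam are themselves updated from the data, so that informative features dominate the distance; privileged information can likewise be used to shape such a metric during training only.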
49

The discursive constitution of software development

Cornut, Francis January 2009 (has links)
The successful development of software continues to be of central interest, both as an academic topic and in professional practice. Consequently, several software development approaches and methodologies have been developed and promoted over the past decades. However, despite the attention given to the subject and the methodical support available, software development and how it should be practiced continue to be controversial. This thesis examines how beliefs about software development come to be socially established as legitimate, and how they come to constitute software development practices in an organization. It is argued that the emergence of a dominant way of conceiving of and practicing software development is the outcome of power relations that permeate the discursive practices of organizational actors. The theoretical framework of this study is guided by Pierre Bourdieu’s theory of symbolic violence and organizational discourse theory. As a research method, ethnographic research techniques are utilized as part of a case study to gain deep insights into the standardization of software development practices. The research site is the IT division of a large financial services organization and is composed of ten units distributed across eight countries. The tumultuous development of a knowledge management programme intended to institutionalize a standard software development process across the organization’s units provides the case for this research. This thesis answers the call for studies providing detailed accounts of the sociopolitical process by which technically oriented practices are transferred and standardized within organizations. It is submitted that a discourse theoretical approach informed by Bourdieu’s thinking enables us to conceptualize this process in a more meaningful, and theoretically rigorous, manner. In providing this theoretical approach, the thesis seeks to contribute to current research on technology and innovation management, and to offer guidance on some issues concerning the management of the software development process.
50

ACTAS : Adaptive Composition and Trading with Agents for Services

Kloos, Reinhold January 2013 (has links)
The vision of flexible, adaptive service environments, particularly in business domains, rests on the standardization and practical proliferation of (Semantic) Web Services, ontologies, and agents. Web Services and their Service-oriented Architectures (SOA) have become the standard paradigm for software component integration. Dynamic change and the steadily increasing number of available e-services across domains pose a challenge for Service Discovery and Composition. Mediation between different approaches and expert knowledge is often necessary for the composition of services from different domains. Semantic enhancements, Autonomic Service Discovery, and research into more holistic concepts for the classification of e-services are current attempts to overcome this challenge, in order to reach the ultimate goal of Autonomic SOC.
