211

Design and evaluate a fair exchange protocol based on online trusted third party (TTP)

Alotaibi, Abdullah S. January 2012 (has links)
One of the most crucial properties that e-commerce protocols should provide is fair exchange. In this research, an advanced method of cryptography is coupled with the pay-per-use technique, and a new electronic commerce protocol for the exchange of commodities is introduced. The proposed protocol guarantees both features while addressing the main drawbacks of related protocols. The protocol is composed of two stages: a pre-exchange stage and an exchange stage. When subjected to scrupulous protocol analysis, it attains fair exchange and a secure method of payment, and it is more efficient than other related existing protocols. In this research, a protocol prototype and model checking are used for the purpose of validation. The prototype verifies that the proposed protocol is executable when used in a real context. Through experimental design, this research shows that the length of the asymmetric keys is the biggest factor affecting the protocol's efficiency. When model checking is applied to the protocol, the outcome indicates that it achieves the required fairness properties. Protocol extensions give the parties involved in the protocol resilience to failure. Using three methods of validation, this research confirms that the new proposed protocol is well formulated. The work reported in this thesis first studies the existing fair exchange protocols that address the fairness problem, and then proposes a more efficient protocol to solve it. The original idea in this thesis is to reduce the communication overheads and risks, and to solve the bottleneck problems, in protocols that involve an online TTP.
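The abstract does not give the protocol's message flow, but the role of an online TTP in guaranteeing fairness can be sketched as follows. This is a minimal illustration, not the thesis's actual protocol: both parties commit to digests of their decryption keys in the pre-exchange stage, and in the exchange stage the TTP releases the keys only once both deposits are present and verified, so neither party can abort with an advantage.

```python
import hashlib
import secrets

def h(data: bytes) -> str:
    """Hash commitment used in the pre-exchange stage."""
    return hashlib.sha256(data).hexdigest()

class OnlineTTP:
    """Trusted third party: verifies deposits against commitments and
    releases keys only when both parties have deposited (fairness)."""
    def __init__(self):
        self.deposits = {}

    def deposit(self, party: str, item_key: bytes, expected_digest: str):
        if h(item_key) != expected_digest:
            raise ValueError("key does not match the committed digest")
        self.deposits[party] = item_key

    def release(self):
        # Keys are released only once BOTH deposits are present.
        if {"merchant", "customer"} <= self.deposits.keys():
            return self.deposits["merchant"], self.deposits["customer"]
        return None

# Pre-exchange stage: each side commits to the digest of its key
# (merchant's key decrypts the commodity, customer's the payment token).
merchant_key = secrets.token_bytes(16)
customer_key = secrets.token_bytes(16)
ttp = OnlineTTP()

# Exchange stage: deposits are checked against commitments, then swapped.
ttp.deposit("merchant", merchant_key, h(merchant_key))
assert ttp.release() is None          # fairness: nothing released yet
ttp.deposit("customer", customer_key, h(customer_key))
assert ttp.release() == (merchant_key, customer_key)
```

Because every exchange passes through the TTP, this simple online design has exactly the bottleneck and overhead problems the thesis sets out to reduce.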
212

Efficient enforcement of security policies in distributed systems

Alzahrani, Ali Mousa G. January 2013 (has links)
Policy-based management (PBM) is an adaptable security policy mechanism for information systems (IS) that ensures only authorised users can access resources. Traditionally, PBM has focused on closed systems, where enforcement mechanisms are trusted by the system administrators who define access control policies. Most current work on PBM systems focuses on designing a centralised policy decision point (PDP) — the component that evaluates an access request against a policy and reports the decision back — which can have performance and resilience drawbacks. Performance and resilience are a major concern for applications in military, health and national security domains, where performance is desirable to increase situational awareness through collaboration and to decrease the length of the decision-making cycle. The centralised PDP also represents a single point of failure: if it fails, all resources in the system may cease to function. The efficient distribution of enforcement mechanisms is therefore key to building large-scale policy-managed distributed systems. Moving from traditional PBM systems to dynamic PBM systems supports dynamic adaptability of behaviour by changing policy without recoding or stopping the system. The SANTA history-based dynamic PBM system has a formal underpinning in Interval Temporal Logic (ITL), allowing formal analysis and verification to take place. The main aim of the research is to automatically distribute enforcement mechanisms in the distributed system in order to provide resilience against network failure whilst preserving the efficiency of policy decision making. The policy formalisation is based on the SANTA policy model to provide a high level of assurance. The contribution of this work addresses the challenges of performance, manageability and security by designing a decentralised PBM framework and a corresponding Distributed Enforcements Architecture (DENAR).
The ability of enforcing static and dynamic security policies in DENAR is the prime research issue, which balances the desire to distribute systems for flexibility whilst maintaining sufficient security over operations. Our research developed mechanisms to improve the efficiency of the enforcement of security policy mechanisms and their resilience against network failures in distributed information systems.
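The resilience argument above can be made concrete with a small sketch. This is not DENAR itself — the names and failover strategy are illustrative assumptions — but it shows why replicating the decision point removes the single point of failure: an enforcement point that can fail over to a backup PDP keeps making (and failing closed on) decisions when the primary becomes unreachable.

```python
class PDP:
    """Policy decision point: evaluates a request against a policy table."""
    def __init__(self, policies):
        self.policies = policies  # {(subject, resource): "permit"/"deny"}
        self.alive = True

    def decide(self, subject, resource):
        if not self.alive:
            raise ConnectionError("PDP unreachable")
        return self.policies.get((subject, resource), "deny")

class ResilientPEP:
    """Policy enforcement point that fails over across replicated PDPs,
    so no single PDP is a single point of failure."""
    def __init__(self, pdps):
        self.pdps = pdps

    def enforce(self, subject, resource):
        for pdp in self.pdps:
            try:
                return pdp.decide(subject, resource)
            except ConnectionError:
                continue          # try the next replica
        return "deny"             # fail closed if every PDP is down

policies = {("alice", "report"): "permit"}
primary, backup = PDP(policies), PDP(policies)
pep = ResilientPEP([primary, backup])

primary.alive = False             # simulate network failure of the primary
assert pep.enforce("alice", "report") == "permit"   # backup still decides
backup.alive = False
assert pep.enforce("alice", "report") == "deny"     # total failure: fail closed
```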
213

Framework for the adoption of online banking

Alsulimani, Tagreed January 2013 (has links)
Information technology represents the most important tool for any business to grow and increase profits in this century. Online banking represents one type of business change driven by revolutionary changes in technology. There are limited studies on the adoption of online banking in Saudi Arabia, which is one of the largest economies in the world. For that reason, my study focused on the adoption of online banking by countries in general and in Saudi society in particular. In many situations there is a gap between business and information technology; in particular, there is a gap between online banking users and the technology. It is necessary to bridge this gap in order to achieve online banking targets. My study investigated the different reasons for the formation of this gap (between online banking and information technology) and how to bridge it. This research focuses on the different factors that enhance the adoption of online banking services by general users. The framework was established by drawing from several theoretical studies, and the proposed research framework contains the most important factors for online banking. These are covered by the following hypotheses: (H1) personal information, (H2) personal experience, (H3) disposition to trust, (H4) reputation, (H5) trusting belief, (H6) structural assurance and (H7) perceived site quality. These hypotheses were tested experimentally through a questionnaire, which was analysed using SPSS Version 14. The results showed that (H1) personal information, (H2) personal experience, (H3) disposition to trust, (H4) reputation, (H5) trusting belief, (H6) structural assurance and (H7) perceived site quality are positive factors affecting customer adoption of online banking. There was a significant correlation between the different online banking adoption factors (hypotheses) and personal information (age, gender and education), with P values of <0.005 in most cases.
214

Analysis and synthesis of inductive families

Ko, Hsiang-Shang January 2014 (has links)
Based on a natural unification of logic and computation, Martin-Löf’s intuitionistic type theory can be regarded simultaneously as a computationally meaningful higher-order logic system and an expressively typed functional programming language, in which proofs and programs are treated as the same entities. Two modes of programming can then be distinguished: in externalism, we construct a program separately from its correctness proof with respect to a given specification, whereas in internalism, we encode the specification in a sophisticated type such that any program inhabiting the type also encodes a correctness proof, and we can use type information as guidance for program construction. Internalism is particularly effective in the presence of inductive families, whose design can have a strong influence on program structure. Techniques and mechanisms for facilitating internalist programming are still lacking, however. This dissertation proposes that internalist programming can be facilitated by exploiting an interconnection between internalism and externalism, expressed as isomorphisms between inductive families into which data structure invariants are encoded and their simpler variants paired with predicates expressing those invariants. The interconnection has two directions: one analysing inductive families into simpler variants and predicates, and the other synthesising inductive families from simpler variants and specific predicates. They respectively give rise to two applications, one achieving a modular structure of internalist libraries, and the other bridging internalist programming with relational specifications and program derivation. The datatype-generic mechanisms supporting the applications are based on McBride’s ornaments. Theoretically, the key ornamental constructs — parallel composition of ornaments and relational algebraic ornamentation — are further characterised in terms of lightweight category theory.
Most of the results are completely formalised in the Agda programming language.
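Python cannot express inductive families or the isomorphisms the dissertation studies, but the externalism/internalism contrast itself can be gestured at: an "invariant as a separate predicate" view of sorted lists versus a type whose only constructor preserves the invariant. This is only an analogy under that caveat — the real development lives in dependent types and Agda.

```python
import bisect

def is_sorted(xs):
    """Externalist view: sortedness is a predicate stated separately from
    the data and checked (proved) after the fact."""
    return all(a <= b for a, b in zip(xs, xs[1:]))

class SortedList:
    """Internalist-style view: the invariant holds by construction, because
    insertion is the only way to add an element and it preserves order.
    The 'analysis' direction of the thesis relates such a type to the pair
    (plain list, is_sorted predicate)."""
    def __init__(self):
        self._xs = []

    def insert(self, x):
        bisect.insort(self._xs, x)
        return self

    def to_list(self):
        return list(self._xs)

s = SortedList().insert(3).insert(1).insert(2)
assert s.to_list() == [1, 2, 3]
assert is_sorted(s.to_list())   # the predicate holds for every value
```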
215

Theoretical and practical aspects of typestate

McGinniss, Iain January 2014 (has links)
The modelling and enforcement of typestate constraints in object-oriented languages has the potential to eliminate a variety of common and difficult-to-diagnose errors. While the theoretical foundations of typestate are well established in the literature, less attention has been paid to the practical aspects: is the additional complexity justifiable? Can typestate be reasoned about effectively by "real" programmers? To what extent can typestate constraints be inferred, to reduce the burden of large type annotations? This thesis aims to answer these questions and provide a holistic treatment of the subject, with original contributions to both the theoretical and practical aspects of typestate.
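The kind of error typestate eliminates can be illustrated with the classic file protocol (open → read* → close). The sketch below is an informal encoding in plain Python, not the thesis's formal system: each state is a separate class, and a state-changing operation returns a new-state object, so "read after close" simply has no method to call.

```python
class OpenFile:
    """Typestate 'open': reading is permitted; closing transitions state."""
    def __init__(self, contents):
        self._contents = contents

    def read(self):
        return self._contents

    def close(self):
        # The transition returns a value in the new state rather than
        # mutating a flag, so stale 'open' handles are not re-usable by type.
        return ClosedFile()

class ClosedFile:
    """Typestate 'closed': no read method exists, so a read-after-close is a
    (static, in a typed language) protocol violation, not a runtime surprise."""
    pass

f = OpenFile("data")
assert f.read() == "data"
f = f.close()
assert not hasattr(f, "read")   # the protocol violation is unrepresentable
```

In a language with typestate checking, reassigning `f` would be enforced by the compiler; here the pattern merely makes the violation an `AttributeError` instead of silent misbehaviour.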
216

Context-aware aided parking solutions based on VANET

Alhammad, Abdulmalik January 2014 (has links)
Vehicular Ad-hoc Network (VANET) is a special application of the Mobile Ad-hoc Network (MANET) for managing road traffic, and it contributes substantially to the development of Intelligent Transportation Systems (ITS). VANET was introduced as a standard for data communication between moving vehicles, with and without fixed infrastructure. It aims to support drivers by improving safety and driving comfort, as a step towards constructing a safer, cleaner and more intelligent environment. Nowadays, vehicles are manufactured with a number of sensors and devices called On-Board Units (OBU), which help the vehicle sense the surrounding environment and then process the context information to effectively manage communication with surrounding vehicles and the associated infrastructure. A number of challenges have emerged in VANET that have encouraged researchers to investigate this concept further. Many recent studies have applied different technologies to intelligent parking management. However, despite all the technological advances, researchers are no closer to developing a system that enables drivers to easily locate and reserve a parking space. Limited resources such as energy, storage space, availability and reliability are factors which could have contributed to the lack of success and progress in this area. The task then is to close these gaps and present a novel solution for parking. This research addresses this need by developing a novel architecture for locating and reserving the parking space that best matches the driver's preferences and vehicle profile without distracting the driver. The simple and easy-to-use mechanism focuses on the domain of an intelligent parking system that exploits the concepts of the InfoStation (IS) and the context-aware system, creating a single framework to locate and reserve a parking space.
A three-tier network topology comprising vehicles, ISs and the InfoStation Centre (ISC) has been proposed as the foundation of the on-street parking system architecture. The thesis develops the architecture of a parking management solution as a comfort-enhancing application that reduces congestion-related stress and improves the driver experience by reducing the time it takes to identify and use an available parking space.
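The abstract describes matching a space to "the driver's preferences and vehicle profile" without specifying the matching function, so the sketch below is purely illustrative: the attribute names and weights are assumptions, but it shows the shape of a context-aware match — score each free space against the driver's profile and return the best.

```python
def match_space(spaces, preferences):
    """Score each free parking space against driver preferences and return
    the best match. The weights below are illustrative, not the thesis's."""
    def score(space):
        s = 0.0
        s += preferences["max_price"] - space["price"]   # cheaper is better
        s -= space["distance_m"] / 100.0                 # closer is better
        if space["covered"] and preferences["wants_covered"]:
            s += 2.0
        return s

    free = [sp for sp in spaces if sp["free"]]
    return max(free, key=score) if free else None

spaces = [
    {"id": "A1", "free": True,  "price": 2.0, "distance_m": 400, "covered": False},
    {"id": "B2", "free": True,  "price": 2.5, "distance_m": 100, "covered": True},
    {"id": "C3", "free": False, "price": 1.0, "distance_m": 50,  "covered": True},
]
prefs = {"max_price": 3.0, "wants_covered": True}
assert match_space(spaces, prefs)["id"] == "B2"   # occupied C3 is skipped
```

In the proposed architecture this computation would run at the IS/ISC tier, so the in-vehicle side only receives the chosen space and reservation, keeping driver distraction minimal.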
217

Security management system for 4G heterogeneous networks

Alquhayz, Hani January 2015 (has links)
There is constant demand for the development of mobile networks to meet the service requirements of users, and their development is a significant topic of research. The current fourth generation (4G) of mobile networks is expected to provide high-speed connections anywhere at any time. Various existing 4G architectures such as LTE and WiMAX support only wireless technologies, while an alternative architecture, Y-Comm, has been proposed to combine existing wired and wireless networks. Y-Comm seeks to meet the main service requirements of 4G by converging the existing networks, so that the user can get better service anywhere and at any time. One of the major characteristics of Y-Comm is heterogeneity: networks with different topologies work together to provide seamless communication to the end user. However, this heterogeneity leads to technical issues which may compromise quality of service, vertical handover and security. Due to the convergence characteristic of Y-Comm, security is considered more significant than in the existing LTE and WiMAX networks. These security concerns have motivated this research study to propose a novel security management system. The research aims to meet the security requirements of 4G mobile networks, e.g. preventing end-user devices from being used as attack tools. This requirement has not been clearly met in previous studies of Y-Comm, but this study proposes a security management system which does meet it. This research follows the ITU-T recommendation M.3400 in dealing with security violations within Y-Comm networks. It proposes a policy-based security management system to deal with events that trigger actions in the system, and uses Ponder2 to implement it. The proposed system, located in the top layer of the Y-Comm architecture, interacts with components of Y-Comm to enforce the appropriate policies.
Its four main components are the Intelligent Agent, the Security Engine, the Security Policies Database and the Security Administrator. These are represented in this research as managed objects to meet design considerations such as extensibility and modifiability. This research demonstrates that the proposed system meets the security requirements of the Y-Comm environment. Its deployment is possible with managed objects built with Ponder2 for all of the components of Y-Comm, which means that the security management system is able to prevent end user devices from being used as attack tools. It can also achieve other security goals of Y-Comm networks.
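The "events that trigger actions" design is an event–condition–action (obligation) policy pattern, which Ponder2 expresses over managed objects. A minimal Python sketch of that pattern — with an assumed traffic-rate event and quarantine action standing in for the system's real policies — shows how a policy engine can stop an end-user device being used as an attack tool:

```python
class SecurityPolicy:
    """Event-condition-action policy: when an event of the given type
    satisfies the condition, the action fires (Ponder2-style obligation,
    sketched in plain Python)."""
    def __init__(self, event_type, condition, action):
        self.event_type = event_type
        self.condition = condition
        self.action = action

class SecurityEngine:
    """Evaluates registered policies against incoming events."""
    def __init__(self):
        self.policies = []
        self.quarantined = set()

    def register(self, policy):
        self.policies.append(policy)

    def handle(self, event):
        for p in self.policies:
            if p.event_type == event["type"] and p.condition(event):
                p.action(self, event)

engine = SecurityEngine()
# Illustrative policy: a device reporting attack-level traffic rates is
# quarantined so it cannot be used as an attack tool.
engine.register(SecurityPolicy(
    "traffic_report",
    condition=lambda e: e["packets_per_sec"] > 10_000,
    action=lambda eng, e: eng.quarantined.add(e["device"]),
))

engine.handle({"type": "traffic_report", "device": "ue-42",
               "packets_per_sec": 50_000})
assert "ue-42" in engine.quarantined
```

In the proposed system the engine, policy store and administrator are separate managed objects, which is what makes the design extensible and modifiable as described.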
218

A novel process model-driven approach to comparing educational courses using ontology alignment

Chernikova, Elena January 2014 (has links)
Nowadays, the comparison of educational courses and modules is performed manually by experts in the field of education. The main objective of this research is to create an approach for automating this process. The main contribution of this work is a novel, ontology alignment-based methodology for the automated comparison of academic courses and modules belonging to the cognitive learning domain. The results of this work are appropriate for tasks such as prior learning and degree recognition, the introduction of joint educational programmes, and quality assurance in higher education institutions. Set-theoretical models of an educational course, its modules, learning outcomes and keywords were created and converted to an ontology. The choice of the information to be represented in the ontology was based on careful analysis of programme specifications, module templates and the Bologna recommendations for the comparison of educational courses. An ontology was chosen as the data model due to its ability to formally specify semantics, to represent taxonomies and to make inferences over data. Formal grammars of a keyword and a learning outcome were created to enable the semi-automated population of the ontology from the module templates, and the corresponding annotators were designed in the General Architecture for Text Engineering 6.1. The algorithm for the comparison of educational courses and modules is based on the alignment of the ontologies of their keywords and learning outcomes. A novel measure for calculating the similarity between the action verbs in the learning outcomes was introduced and utilised. Both the measure and the algorithm were implemented in Java. For evaluation purposes, we utilised the module templates from De Montfort University and the Bauman Moscow State Technical University. The automatically produced annotations of the keywords and the learning outcomes were evaluated against a manually created gold standard.
The high values of the precision, recall and f-measure proved their quality and their suitability for the task. The results produced by the alignment algorithm were compared with those produced by human judgement. The results returned by the experts and the algorithm were comparable, thus showing that the proposed approach is applicable for the partial automation of the comparison of educational modules and courses.
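The abstract does not define the novel action-verb similarity measure, but since learning outcomes in the cognitive domain are conventionally phrased with Bloom's taxonomy verbs, one plausible illustration (an assumption, not the thesis's actual measure) is to score verbs by the distance between their taxonomy levels:

```python
# Bloom's taxonomy levels for cognitive action verbs (illustrative subset).
BLOOM_LEVEL = {
    "define": 1, "list": 1,        # remember
    "explain": 2, "describe": 2,   # understand
    "apply": 3, "implement": 3,    # apply
    "analyse": 4, "compare": 4,    # analyse
    "evaluate": 5, "justify": 5,   # evaluate
    "design": 6, "create": 6,      # create
}

def verb_similarity(v1, v2):
    """Similarity in [0, 1] based on the gap between taxonomy levels:
    verbs at the same cognitive level score 1.0, the extremes score 0.0."""
    l1, l2 = BLOOM_LEVEL[v1], BLOOM_LEVEL[v2]
    return 1.0 - abs(l1 - l2) / 5.0

assert verb_similarity("define", "list") == 1.0    # same level
assert verb_similarity("define", "create") == 0.0  # opposite ends
assert verb_similarity("apply", "analyse") == 0.8  # adjacent levels
```

An alignment algorithm can then combine such verb scores with lexical similarity of the outcome's object phrase to align whole learning-outcome ontologies.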
219

Security-driven software evolution using a model driven approach

Guan, Hui January 2014 (has links)
A high security level must be guaranteed in applications in order to mitigate risks during the deployment of information systems in open network environments. However, a significant number of legacy systems remain in use, posing security risks to the enterprise's assets due to the outdated technologies used and the lack of security concerns when they were designed. Software reengineering is a way to improve their security levels systematically. Model-driven engineering is an approach in which a model, as defined by its type, directs the execution of the process. The aim of this research is to explore how a model-driven approach can facilitate software reengineering driven by security demands. The research in this thesis involves three phases. Firstly, legacy system understanding is performed using reverse engineering techniques. The task of this phase is to reverse engineer the legacy system into UML models, partition it into subsystems with the help of a model slicing technique, and detect existing security mechanisms to determine whether the security provided in the legacy system satisfies the user's security objectives. Secondly, security requirements are elicited using a risk analysis method — the process of analysing key aspects of the legacy system in terms of security. A new risk assessment method, taking into consideration assets, threats and vulnerabilities, is proposed and used to elicit the security requirements, generating detailed security requirements in a specific format to direct the subsequent security enhancement. Finally, security enhancement of the system is performed using the proposed ontology-based security pattern approach. This is the stage at which security patterns, derived from security expertise and fulfilling the elicited security requirements, are selected and integrated into the legacy system models with the help of the proposed security ontology. The proposed approach is evaluated using a selected case study.
Based on the analysis, conclusions are drawn and future research is discussed at the end of this thesis. The results show this thesis contributes an effective, reusable and suitable evolution approach for software security.
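The abstract names asset, threat and vulnerability as the inputs to the risk assessment but not the formula, so the following uses the common multiplicative model as a stand-in (an assumption, not the thesis's method) to show how risk scores can be turned into prioritised security requirements that drive pattern selection:

```python
def risk_score(asset_value, threat_likelihood, vulnerability_severity):
    """Risk as the product of normalised asset value, threat likelihood and
    vulnerability severity (a common multiplicative model, assumed here)."""
    return asset_value * threat_likelihood * vulnerability_severity

def elicit_requirements(assets, threshold=0.5):
    """Turn high-risk assets into prioritised security requirements that
    direct the subsequent security-pattern selection."""
    reqs = []
    for a in assets:
        r = risk_score(a["value"], a["threat"], a["vulnerability"])
        if r >= threshold:
            reqs.append({"asset": a["name"], "risk": r,
                         "requirement": f"protect {a['name']}"})
    return sorted(reqs, key=lambda q: q["risk"], reverse=True)

assets = [
    {"name": "customer_db",  "value": 1.0, "threat": 0.9, "vulnerability": 0.8},
    {"name": "public_site",  "value": 0.3, "threat": 0.5, "vulnerability": 0.4},
]
reqs = elicit_requirements(assets)
# Only the database clears the threshold (0.72 vs 0.06), so only it yields
# a requirement for the security-pattern integration phase.
assert [q["asset"] for q in reqs] == ["customer_db"]
```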
220

A computer-based holistic approach to managing progress of distributed agile teams

Alyahya, Sultan January 2013 (has links)
One of the co-ordination difficulties of remote agile teamwork is managing the progress of development. Several technical factors affect agile development progress; hence, their impact on progress needs to be explicitly identified and co-ordinated. These factors include source code versioning, unit testing (UT), acceptance testing (AT), continuous integration (CI), and releasing. These factors play a role in determining whether software produced for a user story (i.e. feature or use case) is ‘working software’ (i.e. the user story is complete) or not. One of the principles introduced by the Agile Manifesto is that working software is the primary measure of progress. In distributed agile teams, informal methods, such as video-conference meetings, can be used to raise the awareness of how the technical factors affect development progress. However, with infrequent communications, it is difficult to understand how the work of one team member at one site influences the work progress of another team member at a different site. Furthermore, formal methods, such as agile project management tools are widely used to support managing progress of distributed agile projects. However, these tools rely on team members’ perceptions in understanding change in progress. Identifying and co-ordinating the impact of technical factors on development progress are not considered. This thesis supports the effective management of progress by providing a computer-based holistic approach to managing development progress that aims to explicitly identify and co-ordinate the effects of the various technical factors on progress. The holistic approach requires analysis of how the technical factors cause change in progress. With each progress change event, the co-ordination support necessary to manage the event has been explicitly identified. The holistic approach also requires designing computer-based mechanisms that take into consideration the impact of technical factors on progress. 
A progress tracking system has been designed that keeps track of the impact of the technical factors by placing them under the control of the tracking system. This has been achieved by integrating the versioning functionality into the progress tracking system and linking the UT tool, AT tool and CI tool with it. The approach has been evaluated through practical scenarios, which have been validated through a research prototype. The results show that the holistic approach is achievable and helps raise distributed agile teams' awareness of changes in progress as soon as they occur, overcoming the limitations of both the informal and the formal methods. Team members no longer need to spend time determining how their change will impact the work of other team members in order to notify those affected; they are provided with a system that achieves this as they carry out their technical activities. In addition, they do not rely on static information about progress registered in a progress tracking system, but are updated continuously with relevant information about progress changes occurring to their work.
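The core idea — a story counts as "working software" only when every technical factor reports success, and each factor change is propagated to remote team members — can be sketched as follows. The factor names and the notification mechanism are illustrative assumptions, not the thesis's actual tool integrations:

```python
class Member:
    """A (possibly remote) team member who receives progress-change events."""
    def __init__(self):
        self.events = []

    def notify(self, story, factor, passed):
        self.events.append((story, factor, passed))

class UserStory:
    """'Working software is the primary measure of progress': a story is
    complete only when every technical factor reports success."""
    def __init__(self, name):
        self.name = name
        self.status = {"committed": False, "unit_tests": False,
                       "acceptance_tests": False, "ci_build": False}
        self.watchers = []   # remote team members to keep aware

    def update(self, factor, passed):
        self.status[factor] = passed
        for member in self.watchers:        # awareness as soon as it occurs
            member.notify(self.name, factor, passed)

    def is_working_software(self):
        return all(self.status.values())

remote_dev = Member()
story = UserStory("checkout-page")
story.watchers.append(remote_dev)

for factor in ["committed", "unit_tests", "acceptance_tests"]:
    story.update(factor, True)
assert not story.is_working_software()      # CI has not yet passed
story.update("ci_build", True)
assert story.is_working_software()
assert len(remote_dev.events) == 4          # every change was propagated
```

The thesis goes further by placing versioning, UT, AT and CI under the tracking system's control, so these updates come from the tools themselves rather than from team members' perceptions.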