  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Algorithmic skeletons for exact combinatorial search at scale

Archibald, Blair January 2018 (has links)
Exact combinatorial search is essential to a wide range of application areas including constraint optimisation, graph matching, and computer algebra. Solutions to combinatorial problems are found by systematically exploring a search space, either to enumerate solutions, determine if a specific solution exists, or to find an optimal solution. Combinatorial searches are computationally hard in both theory and practice, and efficiently exploring the huge number of combinations is a real challenge, often addressed using approximate search algorithms. Alternatively, exact search can be parallelised to reduce execution time. However, parallel search is challenging due to both highly irregular search trees and sensitivity to search order, leading to anomalies that can cause unexpected speedups and slowdowns.

As core counts continue to grow, parallel search becomes increasingly useful for improving the performance of existing searches and for allowing larger instances to be solved. A high-level approach to parallel search allows non-expert users to benefit from increasing core counts. Algorithmic Skeletons provide reusable implementations of common parallelism patterns that are parameterised with user code which determines the specific computation, e.g. a particular search. We define a set of skeletons for exact search, requiring the user to provide, in the minimal case, a single class that specifies how the search tree is generated and a parameter that specifies the type of search required. The five skeletons are: Sequential search; three general-purpose parallel search methods, Depth-Bounded, Stack-Stealing, and Budget; and a specific parallel search method, Ordered, that guarantees replicable performance.

We implement and evaluate the skeletons in a new C++ parallel search framework, YewPar. YewPar provides both high-level skeletons and low-level search-specific schedulers and utilities to deal with the irregularity of search and knowledge exchange between workers. YewPar is based on the HPX library for distributed task-parallelism, potentially allowing search to execute on multi-cores, clusters, cloud, and high performance computing systems. Underpinning the skeleton design is a novel formal model, MT^3, a parallel operational semantics that describes multi-threaded tree traversals, allowing reasoning about parallel search, e.g. describing common parallel search phenomena such as performance anomalies.

YewPar is evaluated using seven different search applications (and over 25 specific instances): Maximum Clique, k-Clique, Subgraph Isomorphism, Travelling Salesperson, Binary Knapsack, Enumerating Numerical Semigroups, and the Unbalanced Tree Search Benchmark. The search instances are evaluated at multiple scales, from 1 to 255 workers, on a 17-host, 272-core Beowulf cluster. The overheads of the skeletons are low, with a mean 6.1% slowdown compared to hand-coded sequential implementations. Crucially, for all search applications YewPar reduces search times by an order of magnitude, i.e. from hours/minutes to minutes/seconds, and we commonly see average parallel efficiency above 60% for up to 255 workers. Comparing skeleton performance reveals that no one skeleton is best for all searches, highlighting a benefit of the skeleton approach: multiple parallelisations can be explored with minimal refactoring.

The Ordered skeleton avoids slowdown anomalies where, due to search knowledge being order dependent, a parallel search takes longer than a sequential search. Analysis of Ordered shows that, while it is 41% slower on average (73% in the worst case) than Depth-Bounded, in nearly all cases it maintains the following replicable performance properties: 1) parallel executions are no slower than single-worker sequential executions, 2) runtimes do not increase as workers are added, and 3) variance between repeated runs is low. In particular, where Ordered maintains a relative standard deviation (RSD) of less than 15%, Depth-Bounded suffers from an RSD greater than 50%, showing the importance of carefully controlling search orders for repeatability.
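As a rough illustration of the skeleton idea (YewPar itself is C++ and this API is invented for exposition, not YewPar's actual interface): the user supplies only a node-expansion function, and a parameter selects the kind of search performed.

```python
# Hypothetical sketch of a sequential search "skeleton": the user provides a
# children() function describing how the search tree is generated, plus a
# mode parameter selecting the type of search (enumeration or optimisation).

def search(root, children, mode="enumerate", value=None):
    """Depth-first skeleton: 'enumerate' counts leaf solutions,
    'optimise' tracks the best leaf under the user-supplied value()."""
    best, count = None, 0
    stack = [root]
    while stack:
        node = stack.pop()
        kids = children(node)
        if not kids:  # leaf: a candidate solution
            count += 1
            if mode == "optimise" and (best is None or value(node) > value(best)):
                best = node
        else:
            stack.extend(kids)
    return count if mode == "enumerate" else best

# Example tree: lazily enumerate all subsets of {1, 2, 3}.
items = [1, 2, 3]
def expand(state):
    chosen, i = state
    if i == len(items):
        return []
    return [(chosen + [items[i]], i + 1), (chosen, i + 1)]
```

A parallel skeleton such as Depth-Bounded would keep the same user-facing interface but spawn tasks for subtrees above a cut-off depth; that scheduling detail is exactly what the skeleton hides from the user.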
32

From components to compositions : (de-)construction of computer-controlled behaviour with the robot operating system

Lyyra, Antti Kalervo January 2018 (has links)
Robots and autonomous systems play an increasingly important role in modern societies, and this role is expected to grow as computational methods and capabilities advance. Robots and autonomous systems produce goal-directed and context-dependent behaviour with the aim of loosening the coupling between the machines and their operators. These systems are a domain of complex digital innovation that intertwines the physical and digital worlds with computer-controlled behaviour, as robots and autonomous systems render their behaviour from interaction with the surrounding environment. The literature on complex product and system innovation maintains that designers are expected to have detailed knowledge of different components and their interactions. By contrast, the digital innovation literature holds that end-product-agnostic components from heterogeneous sources can be generatively combined using standardised interfaces. An in-depth case study of the Robot Operating System (ROS) was conducted to explore this conceptual tension between the specificity of designs and the distributedness of knowledge and control in the context of complex digital innovation. The thematic analysis of documentary evidence, field notes and interviews produced three contributions. First, the case description presents how ROS has evolved over the past ten years into a global open-source community that is widely used in the development of robots and autonomous systems. Second, a model that conceptualises robots and autonomous systems as contextually bound and embodied chains of transformation is proposed to describe the structural and functional dynamics of complex digital innovation. Third, the generative-integrative mode of development is proposed to characterise the process of innovation that begins with a generative combination of components and subsequently proceeds to an integration phase, during which the system behaviour is experimented with, observed and adjusted. As the initial combination builds upon underspecification and constructive ambiguity, the generative combination is gradually crafted into a more dependable composition through the iterative removal of semantic incongruences.
33

Detection of suspicious URLs in online social networks using supervised machine learning algorithms

Al-Janabi, Mohammed Fadhil Zamil January 2018 (has links)
This thesis proposes the use of several supervised machine learning classification models that were built to detect the distribution of malicious content in OSNs. The main focus was on ensemble learning algorithms such as Random Forest, gradient boosting trees, extra trees, and XGBoost. Features were used to identify social network posts that contain malicious URLs derived from several sources, such as domain WHOIS record, web page content, URL lexical and redirection data, and Twitter metadata. The thesis describes a systematic analysis of the hyper-parameters of tree-based models. The impact of key parameters, such as the number of trees, depth of trees and minimum size of leaf nodes on classification performance, was assessed. The results show that controlling the complexity of Random Forest classifiers applied to social media spam is essential to avoid overfitting and optimise performance. The model complexity could be reduced by removing uninformative features, as the complexity they add to the model is greater than the advantages they give to the model to make decisions. Moreover, model-combining methods were tested, which are the voting and stacking methods. Both show advantages and disadvantages; however, in general, they appear to provide a statistically significant improvement in comparison to the highest singular model. The critical benefit of applying the stacking method to automate the model selection process is that it is effective in giving more weight to more topperforming models and less affected by weak ones. Finally, 'SuspectRate', an online malicious URL detection system, was built to offer a service to give a suspicious probability of tweets with attached URLs. A key feature of this system is that it can dynamically retrain and expand current models.
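To illustrate the voting and stacking combiners in miniature (the base classifiers, feature thresholds, and weights below are invented stand-ins, not the thesis's trained models):

```python
# Three hypothetical base classifiers, each mapping a feature dict for a
# URL-bearing post to 0 (benign) or 1 (malicious).
def clf_domain_age(x):  return 1 if x["domain_age_days"] < 30 else 0
def clf_redirects(x):   return 1 if x["redirect_count"] > 3 else 0
def clf_lexical(x):     return 1 if x["url_length"] > 75 else 0

BASE = [clf_domain_age, clf_redirects, clf_lexical]

def vote(x):
    """Hard majority vote over the base classifiers."""
    return int(sum(c(x) for c in BASE) >= 2)

def stack(x, weights=(0.5, 0.3, 0.2)):
    """Weighted combiner: a stand-in for a trained meta-model, which is
    what lets stacking weight top-performing models more heavily."""
    return int(sum(w * c(x) for w, c in zip(weights, BASE)) >= 0.5)

suspicious = {"domain_age_days": 5, "redirect_count": 6, "url_length": 40}
benign = {"domain_age_days": 900, "redirect_count": 0, "url_length": 30}
```

In a real pipeline the meta-model's weights would be learned on held-out predictions of the base models rather than fixed by hand.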
34

An evaluation model for information security strategies in healthcare data systems

Almutiq, Mutiq Mohammed January 2018 (has links)
This thesis presents a newly developed evaluation model, EMISHD (an "Evaluation Model for Information Security Strategies in Healthcare Data Systems"), which addresses the specific requirements of information security in the healthcare sector. Based on a systematic literature review and a case study, the information security requirements and the existing evaluation models used to examine the information security strategies of healthcare data systems were analysed. The requirements of information security in any sector generally vary in line with changes in laws and regulations and with the emergence of new technologies and threats, which require existing information security strategies to be strengthened to deal with new challenges. The systematic review of the existing evaluation models identified from previous research resulted in the development of a new evaluation model (EMISHD) specifically designed to examine the information security strategies of healthcare data systems according to their specific requirements. A case study of a healthcare organisation in Saudi Arabia was conducted to apply the newly developed evaluation model (EMISHD) in a real-life case and to validate the evaluation results through observation.
35

A methodological framework for policy design & analysis focusing on problem identification & investigation

Teehan, Catherine January 2018 (has links)
Traditional public policy decision making has been supported by a cyclical framework based on the rational model, first introduced in the 1950s by Harold Lasswell. However, public policy problems are intrinsically complex and usually inherently multi-disciplinary, and critics of the cyclical model call for more holistic approaches to public policy decision making to address this complexity. This means that methodologies, tools and techniques that support multiple perspectives, involve multiple stakeholders and draw on multiple sources of information are essential for the investigation, analysis and support of public policy decision making. The framework proposed in this thesis has been developed to address the issues arising when investigating public policy problems. It addresses the complexity and multiplicity inherent in public policy decision making, concentrating on problem identification and definition. The thesis presents the existing frameworks for policy decision making and their limitations, discusses issues with problem recognition and definition, and proposes a methodological framework that provides a thorough investigation of the problem domain to identify areas for policy action and critical information needs, and that enables simulation and experimentation to identify unintended consequences. Traditional approaches to policy decision making have been criticised for failing to take into account the wealth of information generated and used by the policy process. This has led to the emergence of Policy Informatics as a new field of research. This thesis shows that a methodological framework for policy design and analysis can be created, based on the core concepts of Policy Informatics and Systems Thinking, that investigates the problem space more thoroughly than previous approaches and addresses common issues with problem recognition and definition that exist in more traditional policy decision making frameworks.
36

An investigation on question answering for an online feedable Chatbot

Abdul-Kader, Sameera A'amer January 2018 (has links)
This thesis presents the design and implementation of a Chatbot that is able to answer questions about an entity it is learning about. The Chatbot is capable of automatically generating content of multiple genres, using a unique technique to populate its SQL database from the Web. Our Online Feedable Chatbot can hold a conversation with the user about the information it has extracted from the Web; it creates Question-Answer pairs (QAPs) and acquires imperative sentences specifically targeted at the entity it gives information about. A method to select the best response to a Chatbot query from a set of sentences, using hybrid term-based, syntactic, and semantic features, is developed as the response search system of the Chatbot. This tutor Chatbot can expand its training knowledge base by automatically extracting more QAPs and imperative sentences from the Web whenever the user needs to learn about a new entity, without any instructor's supervision, amendment, or control.
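As a hedged sketch of response selection (the thesis combines term, syntactic, and semantic features; this example uses term-count cosine similarity only, with invented candidate sentences):

```python
# Pick the best response for a query by cosine similarity over term counts.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two sentences as bags of lowercase tokens."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_response(query, candidates):
    """Return the candidate sentence most similar to the query."""
    return max(candidates, key=lambda s: cosine(query, s))

candidates = [
    "The Eiffel Tower is in Paris and was completed in 1889.",
    "Bananas are rich in potassium.",
]
```

A fuller system would add syntactic features (e.g. POS patterns) and semantic features (e.g. word embeddings) into the ranking function alongside the term score.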
37

Coupling matrix based integration of the active components with microwave filters

Gao, Yang January 2018 (has links)
This thesis introduces novel integrated millimetre wave components for amplification and filtering. The conventional coupling matrix theory for passive filters is extended to the design of ‘filter-amplifiers’, which have both filtering and amplification functionalities. The design is based on the coupling matrix theory, and for this approach extra elements are added to the standard coupling matrix to represent the transistor. Based on the specification of the filter and small-signal parameters of the transistor, the active coupling matrices for the ‘filter-amplifier’ can be synthesised. Adopting the active coupling matrices, the resonators of the filter adjacent to the transistor and the coupling between them are modified mathematically to provide a Chebyshev filter response with amplification. Although the transistor has complex input and output impedances, it can be matched to the filters by choice of coupling structure and resonance frequency. This is particularly useful as the filter resonators can be of a different construction (e.g. waveguide) to the amplifier (e.g. microstrip).
38

XML to facilitate management of multi-vendor networks

Halse, Guy Antony, Wells, George, Terzoli, Alfredo 09 1900 (has links) (PDF)
Many standards aimed at managing networks currently exist, and yet networks remain notoriously difficult to maintain. Template-based management systems go a long way towards solving this problem. By developing an XML based language to describe network elements, as well as the topology of a network, we can create tools that are free from vendor specific idiosyncrasies, and are capable of managing both today’s networks and those of the future.
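As an illustration only (the thesis's actual XML schema is not shown in this abstract, so the element and attribute names below are made up), such a vendor-neutral description can be parsed with standard tooling to drive management scripts:

```python
# Parse a hypothetical XML description of network elements and topology
# using only the Python standard library.
import xml.etree.ElementTree as ET

doc = """
<network name="campus">
  <element id="sw1" type="switch" vendor="acme">
    <interface name="eth0" vlan="10"/>
    <interface name="eth1" vlan="20"/>
  </element>
  <link from="sw1:eth1" to="rt1:ge0"/>
</network>
"""

root = ET.fromstring(doc)
# A template-based tool walks the vendor-neutral model rather than
# issuing vendor-specific CLI commands:
vlans = sorted(i.get("vlan") for i in root.iter("interface"))
```

The point of the approach is that the same document drives configuration generation for any vendor's equipment; only the final template that renders device commands is vendor-specific.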
39

An ecologically valid evaluation of an observation-resilient graphical authentication mechanism

Maguire, Joseph Noel January 2013 (has links)
Alphanumeric authentication, by means of a secret, is not only a powerful mechanism, in theory, but prevails over all its competitors in reality. Passwords, as they are more commonly known, have the potential to act as a fairly strong gateway. In practice, though, password usage is problematic. They are (1) easily shared, (2) trivial to observe and (3) maddeningly elusive when forgotten. Moreover, modern consumer devices only exacerbate the problems of passwords as users enter them in shared spaces, in plain view, on television screens, on smartphones and on tablets. Asterisks may obfuscate alphanumeric characters on entry but popular systems, e.g. Apple iPhone and Nintendo Wii, require the use of an on-screen keyboard for character input.

A number of alternatives to passwords have been proposed but none, as yet, have been adopted widely. There seems to be a reluctance to switch from tried and tested passwords to novel alternatives, even if the most glaring flaws of passwords can be mitigated. One argument is that there has not been sufficient investigation into the feasibility of the password alternatives and thus no convincing evidence that they can indeed act as a viable alternative.

Graphical authentication mechanisms, solutions that rely on images rather than characters, are a case in point. Pictures are more memorable than the words that name them, meaning that graphical authentication mitigates one of the major problems with passwords. This dissertation sets out to investigate the feasibility of one particular observation-resilient graphical authentication mechanism called Tetrad. The authentication mechanism attempted to address two of the core problems with passwords: improved memorability and resistance to observability (with on-screen entry).

Tetrad was tested in a controlled lab study, that delivered promising results and was well received by the evaluators. It was then deployed in a realistic context and its viability tested in three separate field tests. The unfortunate conclusion was that Tetrad, while novel and viable in a lab setting, failed to deliver a usable and acceptable experience to the end users. This thorough testing of an alternative authentication mechanism is unusual in this research field and the outcome is disappointing. Nevertheless, it acts to inform inventors of other authentication mechanisms of the problems that can manifest when a seemingly viable authentication mechanism is tested in the wild.
40

An assessment model for Enterprise Clouds adoption

Nasir, Usman January 2017 (has links)
Context: Enterprise Cloud Computing (or Enterprise Clouds) refers to the use of Cloud Computing services by a large-scale organisation to migrate its existing IT services or to adopt new Cloud-based services. Many issues and challenges act as barriers to the adoption of Enterprise Clouds, and these challenges have to be addressed for better assimilation of Cloud-based services within the organisation. Objective: The aim of this research was to develop an assessment model for the adoption of Enterprise Clouds. Method: Key challenges reported as barriers to the adoption of Cloud Computing were identified from the literature using the Systematic Literature Review methodology. A survey was carried out to elicit from Cloud Computing experts the industrial approaches and practices that help in overcoming these challenges. Both the key challenges and the practices were used in formulating the assessment model. Results: The results highlight that the key challenges in the adoption of Enterprise Clouds are security and reliability concerns, resistance to change, vendor lock-in issues, data privacy, and difficulties in application and service migration. The industrial practices to overcome these challenges are: planning and executing a pilot project, assessment of IT needs, use of open-source APIs, involvement of the legal team in vendor selection, identification of the processes to change, involvement of a senior executive as change champion, using vendor partners to support application/service migration to Cloud Computing, and creating employee awareness of Cloud Computing services. Conclusion: Using the key challenges and practices, an assessment model was developed that assesses an organisation's readiness to adopt Enterprise Clouds. The model measures readiness in four dimensions: technical, legal and compliance, IT capabilities, and end-user readiness. The model's results can help an organisation overcome the adoption challenges for successful assimilation of newly deployed or migrated IT services on Enterprise Clouds.
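As a loose illustration of a four-dimension readiness assessment (the 0-5 scoring scale, dimension keys, and attention threshold here are invented for exposition, not the thesis's actual instrument):

```python
# Score an organisation's Cloud-adoption readiness across four dimensions
# and flag the dimensions that need attention before adoption.

def readiness(scores):
    """scores: dimension -> 0..5 self-assessment.
    Returns (overall mean, dimensions below a hypothetical threshold of 3)."""
    DIMS = ("technical", "legal_compliance", "it_capabilities", "end_user")
    overall = sum(scores[d] for d in DIMS) / len(DIMS)
    gaps = [d for d in DIMS if scores[d] < 3]
    return overall, gaps

overall, gaps = readiness({"technical": 4, "legal_compliance": 2,
                           "it_capabilities": 3, "end_user": 5})
```

In practice each dimension's score would itself be derived from the challenge/practice checklist the model defines, rather than a single self-assessed number.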
