11

The generation of parsers for two-level grammars

Fisher, Anthony James January 1982 (has links)
The body of this thesis falls into two parts. Chapter II discusses the problems associated with van Wijngaarden grammars, in particular the problem of translating a van Wijngaarden grammar into an equivalent, more manageable one-level form. A strategy is presented for generating, from a van Wijngaarden grammar, a data-base suitable for a one-track interpretive parser, and a computer program is described which performs this transformation. Appendix A contains van Wijngaarden grammars for various languages, and shows how the grammars are transformed by the parser-generator. Chapter III presents a new two-level meta-language, called TAG (for Two-level Attribute Grammar). A parser-generator program is described which generates, from a TAG grammar, a data-base suitable for a one-track interpretive parser. TAG has been used to define the syntax of a dialect of the language Comal, and an interpreter for this dialect has been built whose parser-table is generated automatically from a TAG grammar. Appendix B lists the TAG grammar.
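As a toy illustration of the two-level idea (not the thesis's parser-generator), the Python sketch below expands hyper-rules into one-level productions by consistently substituting metanotion values. The grammar fragments and names are invented for illustration, and the metanotion domains are kept finite, whereas in a real van Wijngaarden grammar they may be infinite — which is precisely what makes the thesis's transformation non-trivial.

```python
from itertools import product

# Metaproductions: each metanotion ranges over a (here finite) set of values.
metanotions = {
    "TYPE": ["int", "bool"],
    "TAG": ["x", "y"],
}

# Hyper-rules: productions whose symbol lists may contain metanotions.
hyper_rules = [
    (["TYPE", " declaration"], ["TYPE", " tag ", "TAG"]),
]

def instantiate(rule):
    """Yield every one-level production obtained by consistently replacing
    each metanotion with one of its values (the same value on both sides,
    which is the defining feature of a two-level grammar)."""
    lhs, rhs = rule
    used = [m for m in metanotions if m in lhs + rhs]
    for values in product(*(metanotions[m] for m in used)):
        env = dict(zip(used, values))
        subst = lambda syms: "".join(env.get(s, s) for s in syms)
        yield subst(lhs), subst(rhs)

for rule in hyper_rules:
    for lhs, rhs in instantiate(rule):
        print(f"{lhs} ::= {rhs}")
```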
12

Testing boundaries : a theory of adaption and framing effects in ongoing tasks

Harrison, Timothy Samuel January 2012 (has links)
This thesis investigates how information presentation affects decisions in ongoing task scenarios. For this purpose it reapplies the principles of bounded rationality, and specifically framing effects, to this domain. Over a number of studies, unique properties concerning both frame effectiveness and additional measures such as confidence are observed. A theory of cognitive adaptation to novel scenarios, and a redefinition of the concept of framing effects, are proposed as a result.
13

Fast and efficient algorithms on strings and sequences

Rahman, Mohammad Sohel January 2007 (has links)
The last few decades have witnessed the fascinating outgrowth of the field of string algorithms. And, with the ever-increasing mass of information and the rapid pace of its dissemination and sharing, the importance and relevance of these algorithms can be expected to grow even further in the near future. In this thesis, we have studied a number of interesting problems on strings and sequences and made an effort to devise efficient algorithms to solve them. The problems we have studied fall into two main categories, namely string matching problems and longest common subsequence problems. The first string matching problem handled in this thesis is the pattern matching problem under the don't-care paradigm. Don't-care pattern matching problems have been studied since 1974 and the quest for improved algorithms is still on. We present efficient algorithms for different versions of this problem. We also present efficient algorithms for a more general problem, namely the pattern matching problem with degenerate strings. Next, we consider another well-studied problem, namely the swap matching problem. We present a new graph-theoretic approach to model the problem, which opens a new and hitherto unexplored avenue to solve it. Then, using the model, we devise an efficient algorithm, which works particularly well for short patterns. Computer-assisted music analysis and music information retrieval have a number of tasks that can be related to stringology. To this end, we consider the problem of identifying rhythms in a musical text, which can be used in automatic song classification. Since there is no complete agreement on the definitions of the related features, we present a new paradigm to model this problem and devise an efficient algorithm to solve it. Next, we consider a number of relatively new, interesting pattern matching problems from an indexing point of view. We present efficient indexing schemes for the problem of pattern matching in given intervals, the property matching problem and the circular pattern matching problem. We conclude our study of string matching problems by focusing on devising an efficient data structure to index gapped factors. In the second part of this thesis, we study the longest common subsequence (LCS) problem and variants thereof. We first introduce a number of new LCS variants by applying the notion of gap constraints. The new variants are motivated by practical applications from computational molecular biology, and we present efficient algorithms to solve them. Then, using the techniques developed while solving these variants, we present efficient algorithms for the classic LCS problem and for another relatively new variant, namely the Constrained LCS (CLCS) problem. Finally, we efficiently solve the LCS and CLCS problems for degenerate strings.
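For orientation, the classic LCS recurrence that the thesis's variants build upon can be sketched as a standard dynamic program; the following is the textbook algorithm, not any of the improved variants developed in the thesis.

```python
def lcs_length(a: str, b: str) -> int:
    """Classic O(len(a) * len(b)) dynamic program for the length of the
    longest common subsequence; gap-constrained and constrained (CLCS)
    variants add conditions on top of this basic recurrence."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

assert lcs_length("AGGTAB", "GXTXAYB") == 4  # LCS is "GTAB"
```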
14

Sequential Monte Carlo methods for extended and group object tracking

Petrov, Nikolay January 2013 (has links)
This dissertation deals with the challenging tasks of real-time extended and group object tracking. The problems are formulated as joint parameter and state estimation of dynamic systems. The solutions proposed are formulated within a general nonlinear framework and are based on the Sequential Monte Carlo (SMC) method, also known as the Particle Filtering (PF) method. Four different solutions are proposed for the extended object tracking problem. The first two are based on border parametrisation of the visible surface of the extended object. The likelihood functions are derived for two different scenarios - one without clutter in the measurements and another in the presence of clutter. In the third approach the kernel density estimation technique is utilised to approximate the joint posterior density of the target dynamic state and static size parameters. The fourth proposed approach solves the extended object tracking problem based on the recently emerged SMC method combined with interval analysis, called the Box Particle Filter (Box PF). Simulation results for all of the developed algorithms show accurate online tracking, with very good estimates both for the target kinematic states and for the parameters of the target extent. In addition, the performance of the Box PF and the border-parametrised PF is validated utilising real measurements from laser range scanners obtained within a prototype security system replicating an airport corridor.
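For readers unfamiliar with SMC, a minimal bootstrap particle filter for a toy one-dimensional model is sketched below. The model, noise levels, and particle count are illustrative assumptions; the thesis replaces this simple Gaussian likelihood with ones derived for extended objects and cluttered measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(observations, n_particles=500, q=0.1, r=0.5):
    """Generic bootstrap particle filter for a 1-D random-walk state
    observed in Gaussian noise: predict, weight by likelihood, resample."""
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for y in observations:
        particles = particles + rng.normal(0.0, q, n_particles)   # predict
        weights = np.exp(-0.5 * ((y - particles) / r) ** 2)        # weight
        weights /= weights.sum()
        idx = rng.choice(n_particles, n_particles, p=weights)      # resample
        particles = particles[idx]
        estimates.append(particles.mean())
    return estimates

# Toy run: track a slowly drifting state from noisy measurements.
truth = np.cumsum(rng.normal(0, 0.1, 50))
obs = truth + rng.normal(0, 0.5, 50)
print(bootstrap_pf(obs)[-1], truth[-1])
```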
15

Modelling and assessing the environmental impacts of software

Williams, Daniel R. January 2013 (has links)
Software is used on billions of devices every day for numerous activities and services that the modern world relies upon. It also has the potential to control energy consumption and maximise device efficiency, thus minimising device and service environmental impact. The academic background to the environmental impact of software, at the beginning of this research, was minor and little utilised by information and communications technology (ICT) organisations. Consequently, this research analysed and modelled three main software types, enabling the environmental analysis of software use to take place with real data and information. Firstly, Operating Systems (OSs) and their power management features were analysed, as these form the foundation of any software-based activity and control device componentry, and thus overall device energy consumption. The overall impact of an OS, not just a device's idle or maximum power, was investigated with a new set of measurements, methods, and models constructed in this research. Secondly, new methodologies to measure the energy consumption of electronic software distribution (ESD) and cloud computing were created, which spanned the entire life cycle of the data route. Both technologies were modelled and measured across a number of real scenarios. ESD and cloud computing can be utilised to dematerialise and increase the efficiency of some high energy impact activities. Additionally, it was found that cloud computing can be utilised to reduce the impact of data- and process-intensive computing, but can also be wasteful compared to traditional computing when utilised for certain types of low-resource software activities. Finally, a novel framework was created to enable the analysis and comparison of the energy consumption and greenhouse gas (GHG) emissions of the same software activity across a range of devices and OSs, according to the technical functionality and price of a set of devices. This framework enables the analysis of devices according to their software GHG emission efficiencies across different platforms and may allow manufacturers and consumers to maximise and drive forward environmental awareness. In conclusion, this research has set the foundations for quantifying, and has demonstrated, the large potential that software has to reduce energy consumption and overall environmental impact.
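As a hedged illustration of the kind of accounting such a framework performs, the sketch below converts assumed device power draws and usage times into energy and GHG figures. All numbers here are placeholders, not measurements or models from the thesis.

```python
def software_energy_kwh(active_power_w, idle_power_w, active_hours, idle_hours):
    """Energy attributable to a software activity, split into active and
    idle device states. Power draws are illustrative placeholders."""
    return (active_power_w * active_hours + idle_power_w * idle_hours) / 1000.0

def ghg_kg(energy_kwh, grid_intensity_kg_per_kwh=0.4):
    """Convert energy to greenhouse-gas emissions using an assumed
    grid carbon intensity (kg CO2e per kWh)."""
    return energy_kwh * grid_intensity_kg_per_kwh

# Example: compare the same activity on two hypothetical devices.
laptop = software_energy_kwh(30, 5, active_hours=2, idle_hours=6)
desktop = software_energy_kwh(120, 40, active_hours=2, idle_hours=6)
print(ghg_kg(laptop), ghg_kg(desktop))
```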
16

The design and construction of deadlock-free concurrent systems

Martin, Jeremy Malcolm Randolph January 1996 (has links)
Throughout our lives we take for granted the safety of complex structures that surround us. We live and work in buildings with scant regard for the lethal currents of electricity and flammable gas coursing through their veins. We cross high bridges with little fear of them crumbling into the depths below. We are secure in the knowledge that these objects have been constructed using sound engineering principles. Now, increasingly, we are putting our lives into the hands of complex computer programs. One could cite aircraft control systems, railway signalling systems, and medical databases as examples. But whereas the disciplines of electrical and mechanical engineering have long been well understood, software engineering is in its infancy. Unlike other fields, there is no generally accepted certification of competence for its practitioners. Formal scientific methods for reliable software production have been developed, but these tend to require a level of mathematical knowledge beyond that of most programmers. Engineers, in general, are usually more concerned with practical issues than with the underlying scientific theory of their particular discipline. They want to get on with the business of building powerful systems. They rely on scientists to provide them with safety rules which they can incorporate into their designs. For instance, a bridge designer needs to know certain formulae to calculate how wide to set the span of an arch - he does not need to know why the formulae work. Software engineering is in need of a battery of similar rules to provide a bridge between theory and practice. The demand for increasing amounts of computing power makes parallel programming very appealing. However, additional dangers lurk in this exciting field. In this thesis we explore ways to circumvent one particularly dramatic problem - deadlock. This is a state where none of the constituent processes of a system can agree on how to proceed, so nothing ever happens. Clearly we would desire that any sensible system we construct could never arrive at such a state, but what can we do to ensure that this is indeed the case? We might think to use a computer to check every possible state of the system. But, given that the number of states of a parallel system usually grows exponentially with the number of processes, we would most likely find the task too great.
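The exhaustive check described at the end of the abstract can be sketched directly: enumerate every reachable state and flag those with no enabled transition. The toy lock-ordering system below is invented for illustration, not taken from the thesis, but it exhibits the classic circular-wait deadlock.

```python
from collections import deque

def find_deadlocks(initial, successors):
    """Breadth-first enumeration of reachable states, reporting any state
    with no outgoing transitions (a deadlock). This is the brute-force
    check whose cost grows exponentially with the number of processes."""
    seen = {initial}
    queue = deque([initial])
    deadlocks = []
    while queue:
        state = queue.popleft()
        nexts = successors(state)
        if not nexts:
            deadlocks.append(state)
        for s in nexts:
            if s not in seen:
                seen.add(s)
                queue.append(s)
    return deadlocks

# Toy system: two processes acquiring locks "a" and "b" in opposite order.
# If each holds one lock and waits for the other, nothing can proceed.
graph = {
    "start":         ["p1_has_a", "p2_has_b"],
    "p1_has_a":      ["p1_has_ab", "both_hold_one"],
    "p2_has_b":      ["p2_has_ba", "both_hold_one"],
    "p1_has_ab":     ["start"],   # P1 finishes and releases both locks
    "p2_has_ba":     ["start"],
    "both_hold_one": [],          # circular wait: no transition enabled
}
print(find_deadlocks("start", lambda s: graph[s]))  # ['both_hold_one']
```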
17

Automatic summarization of opinions in service reviews

Di Fabrizzio, Giuseppe January 2012 (has links)
No description available.
18

Improving the Process of Model Checking through State Space Reductions

Turner, Edward Nanakorn January 2007 (has links)
Model checking is a technique for finding errors in systems and algorithms. The technique requires a formal definition of the system with a set of correctness conditions, and the use of a tool, the model checker, that searches for model behaviours violating these correctness conditions. The value of existing model checkers depends largely on the complexity of the system being checked. Systems involving complex data structures quickly encounter the problem of state explosion, and checking becomes intractable. Furthermore, auxiliary feedback originally designed to aid the practitioner (e.g., process automata) becomes less useful. This thesis develops a set of techniques to address these problems. The main contributions of this thesis are methods that improve model checking in the formal language of B through reductions in the size of a system's state space. Methods are described that enable a user to view various succinct properties of a system's behaviour through automatic analysis of reached state spaces, and a technique is developed to improve the efficiency of generating state spaces during model checking, using algorithms for identifying symmetries via graph isomorphism. Soundness proofs are shown using refinement in B. Each technique has been implemented in the B model checker ProB, and is shown to be effective through experimentation and evaluation. This research has stimulated three complementary approaches for improving the generation of state spaces, which are also presented and evaluated. Although this work concerns the context of B and ProB, the techniques could be generalised to verification tools for other languages.
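A simplified sketch of the symmetry-reduction idea follows: the explorer stores only a canonical representative of each class of symmetric states, so each equivalence class is expanded once. Sorting the components of interchangeable processes stands in here for the graph-isomorphism-based canonicalisation the thesis uses; the toy system is illustrative.

```python
from collections import deque

def explore(initial, successors, canon):
    """Explore a state space, recording only canonical representatives of
    symmetry-equivalent states; returns the number of states expanded."""
    seen = {canon(initial)}
    queue = deque([initial])
    count = 0
    while queue:
        state = queue.popleft()
        count += 1
        for s in successors(state):
            c = canon(s)
            if c not in seen:
                seen.add(c)
                queue.append(s)
    return count

# Toy system: three identical processes, each counting 0 -> 1 -> 2.
def successors(state):
    return [state[:i] + (state[i] + 1,) + state[i + 1:]
            for i in range(len(state)) if state[i] < 2]

full = explore((0, 0, 0), successors, canon=lambda s: s)
reduced = explore((0, 0, 0), successors, canon=lambda s: tuple(sorted(s)))
print(full, reduced)  # 27 states unreduced vs 10 equivalence classes
```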
19

Action in context - context in action : towards a grounded theory of software design

Webb, Brian Robert January 2001 (has links)
This thesis develops a model and a theory of software design. Thirty-two transcripts of interviews with software designers were analysed using the Grounded Theory method. The first set of sixteen interviews, drawn from the field of Digital Interactive Multimedia (Data-set A), was used to develop the model and theory; the second set of sixteen interviews, drawn from one source of technical literature (Data-set B), was used to test and enhance the initial outcomes. Final outcomes are then grounded in the general literature on problem solving and design. The model aims to capture a rich, holistic picture of software design. It is descriptive rather than prescriptive, concerned with how software design is done rather than with how it ought to be done. The theory is a development of the model and is presented initially as a theoretical framework and then as a series of propositions. The theoretical framework is a function of the juxtaposition of specific properties or attributes of the "core category", which uniquely explains the phenomenon. Its outcome is four design scenarios. Each scenario is of interest as an explanation of software design practice, but the two scenarios in which such practice does not "fit" the design context are of most interest. It is argued that these scenarios can be used to identify and explain design breakdowns. Finally, the thesis purports to explicate the "meta-process" - the process through which the inductive model and theory were developed. This is an unusual objective for a piece of IS research, but valid nonetheless and significant, given the complexity of the research method used and the dearth of good process accounts in the IS literature and elsewhere.
20

Practical strategies for agent-based negotiation in complex environments

Williams, Colin Richard January 2012 (has links)
Agent-based negotiation, whereby the negotiation is automated by software programs, can be applied to many different negotiation situations, including negotiations between friends, businesses or countries. A key benefit of agent-based negotiation over human negotiation is that it can be used to negotiate effectively in complex negotiation environments, which consist of multiple negotiation issues, time constraints, and multiple unknown opponents. While automated negotiation has been an active area of research in the past twenty years, existing work has a number of limitations. Specifically, most of the existing literature has considered time constraints in terms of the number of rounds of negotiation that take place. In contrast, in this work we consider time constraints which are based on the amount of time that has elapsed. This requires a different approach, since the time spent computing the next action has an effect on the utility of the outcome, whereas the actual number of offers exchanged does not. In addition to these time constraints, in the complex negotiation environments which we consider, there are multiple negotiation issues, and we assume that the opponents’ preferences over these issues and the behaviour of those opponents are unknown. Finally, in our environment there can be concurrent negotiations between many participants. Against this background, in this thesis we present the design of a range of practical negotiation strategies, the most advanced of which uses Gaussian process regression to coordinate its concession against its various opponents, whilst considering the behaviour of those opponents and the time constraints. In more detail, the strategy uses observations of the offers made by each opponent to predict the future concession of that opponent. By considering the discounting factor, it predicts the future time which maximises the utility of the offers, and we then use this in setting our rate of concession. Furthermore, we evaluate the negotiation agents that we have developed, which use our strategies, and show that, particularly in the more challenging scenarios, our most advanced strategy outperforms other state-of-the-art agents from the Automated Negotiating Agent Competition, which provides an international benchmark for this work. In more detail, our results show that, in one-to-one negotiation, in the highly discounted scenarios, our agent reaches outcomes which, on average, are 2.3% higher than those of the next best agent. Furthermore, using empirical game theoretic analysis we show the robustness of our strategy in a variety of tournament settings. This analysis shows that, in the highly discounted scenarios, no agent can benefit by choosing a different strategy (taken from the top four strategies in that setting) than ours. Finally, in the many-to-many negotiations, we show how our strategy is particularly effective in highly competitive scenarios, where it outperforms the state-of-the-art many-to-many negotiation strategy by up to 45%.
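A sketch of the prediction step is given below, using scikit-learn's Gaussian process regression on assumed observations of an opponent's offers; the kernel choice, data, and discounting factor are illustrative assumptions, not the agent's actual configuration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Observed opponent offers as (normalised time, utility to us) pairs.
# These values are invented for illustration.
t_obs = np.array([[0.0], [0.1], [0.2], [0.3], [0.4]])
u_obs = np.array([0.30, 0.33, 0.38, 0.42, 0.45])

# Fit a GP to the opponent's concession so far and predict future offers.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(t_obs, u_obs)

t_future = np.linspace(0.5, 1.0, 6).reshape(-1, 1)
u_pred, u_std = gp.predict(t_future, return_std=True)

# With a discounting factor d, utility received at time t decays as d**t;
# target the future time that maximises discounted predicted utility.
d = 0.5  # illustrative discounting factor
discounted = u_pred * d ** t_future.ravel()
best = t_future.ravel()[np.argmax(discounted)]
print(f"concede towards t = {best:.2f}")
```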
