1 | Games and full abstraction for non-deterministic languages. Harmer, Russell Spencer, January 1999.
No description available.
2 | Parallelism with limited nondeterminism. Finkelstein, Jeffrey, 5 March 2017.
Computational complexity theory studies which computational problems can be solved with limited access to resources. The past fifty years have seen a focus on the relationship between intractable problems and efficient algorithms. However, the relationship between inherently sequential problems and highly parallel algorithms has not been as well studied. Are there efficient but inherently sequential problems that admit some relaxed form of highly parallel algorithm? In this dissertation, we develop the theory of structural complexity around this relationship for three common types of computational problems.
Specifically, we show tradeoffs between time, nondeterminism, and parallelizability. By clearly defining the notions and complexity classes that capture our intuition for parallelizable and sequential problems, we create a comprehensive framework for rigorously proving parallelizability and non-parallelizability of computational problems. This framework provides the means to prove whether otherwise tractable problems can be effectively parallelized, a need highlighted by the current growth of multiprocessor systems. The views adopted by this dissertation—alternate approaches to solving sequential problems using approximation, limited nondeterminism, and parameterization—can be applied practically throughout computer science.
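As a rough illustration of the relaxation discussed above (not code from the dissertation), the Python sketch below deterministically simulates a small budget of guessed bits by enumerating every guess and checking the guesses in parallel; the names `verify` and `solve_with_limited_nondeterminism`, and the toy subset-sum check, are hypothetical.

```python
# Sketch only: `verify`, `solve_with_limited_nondeterminism`, and the toy
# subset-sum check are hypothetical, not the dissertation's definitions.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def verify(instance, guess):
    """Deterministic check: does the guessed bit-vector certify the instance?
    Toy example: does the selected subset of the integers sum to zero?"""
    chosen = [x for x, bit in zip(instance, guess) if bit]
    return bool(chosen) and sum(chosen) == 0

def solve_with_limited_nondeterminism(instance, guess_bits, workers=4):
    """Deterministically simulate `guess_bits` bits of nondeterminism.
    Cost is 2**guess_bits verifier runs; with guess_bits in O(log n) this is
    polynomial, and the independent runs can be checked in parallel."""
    guesses = list(product((0, 1), repeat=guess_bits))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(verify, [instance] * len(guesses), guesses)
    return any(results)

if __name__ == "__main__":
    print(solve_with_limited_nondeterminism([3, -1, 4, -3], guess_bits=4))  # True
```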
3 | A Survey of Limited Nondeterminism in Computational Complexity Theory. Levy, Matthew Asher, 1 January 2003.
Nondeterminism is typically used as an inherent part of the computational models used in computational complexity. However, much work has been done looking at nondeterminism as a separate resource added to deterministic machines. This survey examines several different approaches to limiting the amount of nondeterminism, including Kintala and Fischer's β hierarchy and Cai and Chen's guess-and-check model.
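The following Python sketch is one way to read the guess-and-check idea: a language sits in a guess-and-check class when membership of an input x can be certified by a guess of length bounded by s(|x|) that a deterministic checker accepts. It is an illustration, not the survey's formalism; the names `s`, `checker`, and `in_guess_check_class` are invented.

```python
# Illustration only: `s`, `checker`, and `in_guess_check_class` are invented
# names, not the survey's notation.
from itertools import product

def in_guess_check_class(x, s, checker):
    """Return True iff some binary guess of length at most s(len(x)) makes
    checker(x, guess) accept.  Brute force, exponential in s(len(x))."""
    for length in range(s(len(x)) + 1):
        for guess in product("01", repeat=length):
            if checker(x, "".join(guess)):
                return True
    return False

# Toy instance: guess a three-character substring of x consisting only of '1's.
print(in_guess_check_class(
    "00110111",
    s=lambda n: 3,
    checker=lambda x, g: len(g) == 3 and set(g) == {"1"} and g in x,
))  # True: "111" occurs in the input
```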
4 | Modeling Nondeterminism in Program Semantics using Lifted Binary Multirelations. Saladi, Srikanth, 1 May 2007.
No description available.
5 | Requirements specification using concrete scenarios. Au, Oliver T. S., January 2009.
The precision of formal specifications allows us to prove program correctness. Even if formal methods are not used throughout the software project, formalisation improves our understanding of the problem. Formal specifications are amenable to automated analysis and consistency checking. However, using them is challenging. Customers do not understand formal notations. Specifiers have difficulty tackling large problems. Once systems are built, formal specifications quickly become outdated during software maintenance. A method of developing formal specifications using concrete scenarios is proposed to tackle these disadvantages. A concrete scenario describes system behaviour as a sequence of steps. The pre- and post-states of scenario steps are expressed with actual data rather than variables. Concrete scenarios are expressed in a natural language or a formal notation. They increase customer involvement in the creation of formal specifications. Scenarios may be ranked by priority, allowing specifiers to focus on a small part of the system. Formal specifications are constructed incrementally. New requirements are also captured in concrete scenarios, which guide the modification of the formal specifications. On the one hand, concrete scenarios assist the creation and maintenance of formal specifications. On the other hand, they facilitate program correctness proofs without using conventional formal specifications. This is achieved by adding implementation details to customer scenarios. The resulting developer scenarios, encapsulating decisions about data structures and algorithms, are generalised to operation schemas. With the implementation details, the schemas written in formal notations are programs rather than specifications.
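As a hypothetical illustration of the idea (not taken from the thesis), the sketch below records one concrete scenario step with actual data in its pre- and post-states and replays it against a candidate implementation; the library-loan domain and every name in it are invented.

```python
# Hypothetical example: the library-loan domain and all names below are
# invented to illustrate a concrete scenario step.
from dataclasses import dataclass, field

@dataclass
class Library:
    on_shelf: set = field(default_factory=set)
    on_loan: dict = field(default_factory=dict)   # book title -> borrower

    def borrow(self, book, member):
        # Candidate implementation of the operation being specified.
        if book in self.on_shelf:
            self.on_shelf.remove(book)
            self.on_loan[book] = member

# One concrete scenario step: an operation plus pre- and post-states given
# with actual data rather than variables.
scenario_step = {
    "operation": ("borrow", "Dune", "alice"),
    "pre":  {"on_shelf": {"Dune", "Emma"}, "on_loan": {}},
    "post": {"on_shelf": {"Emma"}, "on_loan": {"Dune": "alice"}},
}

def replay(step):
    """Run the operation on the pre-state and compare with the post-state."""
    lib = Library(set(step["pre"]["on_shelf"]), dict(step["pre"]["on_loan"]))
    op, *args = step["operation"]
    getattr(lib, op)(*args)
    return (lib.on_shelf, lib.on_loan) == (
        set(step["post"]["on_shelf"]), dict(step["post"]["on_loan"]))

print(replay(scenario_step))  # True: the implementation satisfies this scenario
```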
6 | The Copycat Project: An Experiment in Nondeterminism and Creative Analogies. Hofstadter, Douglas, 1 January 1984.
A micro-world is described, in which many analogies involving strikingly different concepts and levels of subtlety can be made. The question "What differentiates the good ones from the bad ones?" is discussed, and then the problem of how to implement a computational model of the human ability to come up with such analogies (and to have a sense for their quality) is considered. A key part of the proposed system, now under development, is its dependence on statistically emergent properties of stochastically interacting "codelets" (small pieces of ready-to-run code created by the system and selected at random to run, with probability proportional to heuristically assigned "urgencies"). Another key element is a network of linked concepts of varying levels of "semanticity", in which activation spreads and indirectly controls the urgencies of new codelets. There is pressure in the system toward maximizing the degree of "semanticity" or "intensionality" of descriptions of structures, but many such pressures, often conflicting, must interact with one another, and compromises must be made. The shifting of (1) perceived boundaries inside structures, (2) descriptive concepts chosen to apply to structures, and (3) features perceived as "salient" or not, is called "slippage". What can slip, and how, are emergent consequences of the interaction of (1) the temporary ("cytoplasmic") structures involved in the analogy with (2) the permanent ("Platonic") concepts and links in the conceptual proximity network, or "slippability network". The architecture of this system is postulated as a general architecture suitable for dealing not only with fluid analogies, but also with other types of abstract perception and categorization tasks, such as musical perception, scientific theorizing, Bongard problems, and others.
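A minimal Python sketch of the codelet-selection mechanism described above: codelets are small runnable tasks with heuristically assigned urgencies, and one is repeatedly chosen at random with probability proportional to its urgency. This illustrates only the selection mechanism and is not a reconstruction of the Copycat implementation.

```python
# Illustration of urgency-weighted codelet selection only; not the actual
# Copycat architecture or code.
import random

class Coderack:
    def __init__(self):
        self.codelets = []                    # list of (urgency, callable)

    def post(self, urgency, codelet):
        self.codelets.append((urgency, codelet))

    def step(self):
        """Pick one codelet at random, with probability proportional to its
        urgency, remove it from the rack, and run it."""
        if not self.codelets:
            return
        urgencies = [u for u, _ in self.codelets]
        idx = random.choices(range(len(self.codelets)), weights=urgencies)[0]
        _, codelet = self.codelets.pop(idx)
        codelet()                             # a real codelet may post new ones

rack = Coderack()
rack.post(10, lambda: print("describe the salient letter group"))
rack.post(1,  lambda: print("try a low-urgency alternative description"))
rack.step()
rack.step()
```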
7 | Multiparty Communication Complexity. David, Matei, 6 August 2010.
Communication complexity is an area of complexity theory that studies an abstract model of computation called a communication protocol. In a $k$-player communication protocol, an input to a known function is partitioned into $k$ pieces of $n$ bits each, and each piece is assigned to one of the players in the protocol. The goal of the players is to evaluate the function on the distributed input by using as little communication as possible. In a Number-On-Forehead (NOF) protocol, the input piece assigned to each player is metaphorically placed on that player's forehead, so that each player sees everyone else's input but its own. In a Number-In-Hand (NIH) protocol, the piece assigned to each player is seen only by that player. Overall, the study of communication protocols has been used to obtain lower bounds and impossibility results for a wide variety of other models of computation.
Two of the main contributions presented in this thesis are negative results on the NOF model of communication, identifying limitations of NOF protocols. Together, these results constitute stepping stones towards a better fundamental understanding of this model. As the first contribution, we show that randomized NOF protocols are exponentially more powerful than deterministic NOF protocols, as long as $k \le n^c$ for some constant $c$. As the second contribution, we show that nondeterministic NOF protocols are exponentially more powerful than randomized NOF protocols, as long as $k \le \delta \cdot \log n$ for some constant $\delta < 1$.
For the third major contribution, we turn to the NIH model and we present a positive result. Informally, we show that a NIH communication protocol for a function $f$ can simulate a Stack Machine (a Turing Machine augmented with a stack) for a related function $F$, consisting of several instances of $f$ bundled together. Using this simulation and known communication complexity lower bounds, we obtain the first known (space vs. number of passes) trade-off lower bounds for Stack Machines.
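As a toy illustration of the Number-In-Hand model (not a protocol from the thesis), the sketch below simulates a k-player NIH protocol that decides whether all players hold the same n-bit string, counting the bits written to a shared blackboard.

```python
# Toy simulation of a Number-In-Hand protocol for k-party EQUALITY; each
# player sees only its own piece plus the shared blackboard.  The cost is
# n + (k - 1) bits: player 0 publishes its piece, the others reply with one
# bit each.  Invented for illustration, not taken from the thesis.
def nih_equality(pieces):
    """pieces: list of k equal-length bit strings, one per player."""
    blackboard = []
    bits_sent = 0

    # Player 0 writes its entire input on the blackboard.
    blackboard.append(pieces[0])
    bits_sent += len(pieces[0])

    # Every other player compares its private piece with what player 0 wrote.
    for piece in pieces[1:]:
        blackboard.append("1" if piece == blackboard[0] else "0")
        bits_sent += 1

    all_equal = all(flag == "1" for flag in blackboard[1:])
    return all_equal, bits_sent

print(nih_equality(["1011", "1011", "1011"]))   # (True, 6)
print(nih_equality(["1011", "1011", "0011"]))   # (False, 6)
```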
8 | The Difficulty of Designing a General Heuristic Agent Navigation Strategy. Fors, Mikael; Hermelin, Madelen, January 2011.
We consider an abstract representation of some environment in which an agent is located. Given a goal sequence, we ask what strategy the agent, using readily available algorithmic tools, should adopt to find a valid traversal route that is optimal within a predefined error margin. We present four scenarios, each incorporating aspects common to general navigation, to illustrate some of the difficult problems that must be solved in any general navigation strategy. Two reinforcement learning algorithms and four graph path planning algorithms are studied and applied to these scenarios. Through the introduction of a long-term strategy model we enable a comparative study of the results of these applications, and note a distinct difference in performance. Further, we discuss the lack of a probabilistic algorithmic approach and why it should be an option in any general strategy, since it allows verifiably "good" approximate solutions, which is useful when the problem at hand is NP-hard. Several meta-level concepts are introduced and discussed to further illustrate the difficulty of producing an optimal strategy with an explicit long-term horizon. We argue for a non-deterministic approach, looking at the apparent gain from epsilon-randomness when incorporated by a reinforcement learning agent. Several problems that may arise with non-determinism are discussed, based on the notion that such an agent's performance can be viewed as a Markov chain, which can result in paths that are suboptimal with respect to the chosen norm.
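A compact sketch of the epsilon-randomness the authors discuss: a Q-learning agent that, with probability epsilon, takes a random action instead of the greedy one. The corridor environment and all names are invented for illustration and are not from the thesis.

```python
# Invented toy setup: a five-state corridor and an epsilon-greedy Q-learning
# agent.  The point is the single line where epsilon-randomness replaces the
# greedy, deterministic choice.
import random
from collections import defaultdict

class Corridor:
    """States 0..4; start at 0; reward 1 on reaching state 4 (the goal)."""
    actions = ("left", "right")

    def reset(self):
        self.state = 0
        return self.state

    def step(self, action):
        self.state = max(0, self.state - 1) if action == "left" else min(4, self.state + 1)
        done = self.state == 4
        return self.state, (1.0 if done else 0.0), done

def epsilon_greedy_q_learning(env, episodes=500, alpha=0.1, gamma=0.95, epsilon=0.1):
    q = defaultdict(float)                          # (state, action) -> value

    def choose(state):
        if random.random() < epsilon:               # the nondeterministic step
            return random.choice(env.actions)
        values = [q[(state, a)] for a in env.actions]
        best = max(values)
        return random.choice([a for a, v in zip(env.actions, values) if v == best])

    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            action = choose(state)
            next_state, reward, done = env.step(action)
            best_next = max(q[(next_state, a)] for a in env.actions)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = next_state
    return q

q = epsilon_greedy_q_learning(Corridor())
print(max(Corridor.actions, key=lambda a: q[(0, a)]))   # expected: "right"
```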
9 | State Complexity of Tree Automata. Piao, Xiaoxue, 4 January 2012.
Modern applications of XML use automata operating on unranked trees. A common definition of tree automata operating on unranked trees uses a set of vertical states that define the bottom-up computation, and the transitions on vertical states are determined by so-called horizontal languages recognized by finite automata on strings. The bottom-up computation of an unranked tree automaton may be either deterministic or nondeterministic, and further variants arise depending on whether the horizontal string languages defining the transitions are represented by DFAs or NFAs. There is also an alternative syntactic definition of determinism introduced by Cristau et al.
It is known that a deterministic tree automaton with the smallest total number of states need not be unique, nor need it have the smallest possible number of vertical states. We consider the question of how much the total number of states can be reduced by introducing additional vertical states. We give an upper bound for the state trade-off for deterministic tree automata where the horizontal languages are defined by DFAs, and a lower bound construction that, for variable-sized alphabets, is close to the upper bound.
We establish upper and lower bounds for the state complexity of conversions between different types of deterministic and nondeterministic unranked tree automata. The bounds are usually tight for the number of vertical states. Because a minimal deterministic unranked tree automaton need not be unique, establishing lower bounds for the number of horizontal states, that is, the combined size of the DFAs used to define the horizontal languages, is challenging. Based on existing lower bound results for unambiguous finite automata, we develop a lower bound criterion for the number of horizontal states.
We consider the state complexity of operations on regular unranked tree languages. The concatenation of trees can be defined either as a sequential or a parallel operation. Furthermore, there are two essentially different ways to iterate sequential concatenation. We establish tight state complexity bounds for concatenation-like operations. In particular, for sequential concatenation and bottom-up iterated concatenation the bounds differ by an order of magnitude from the corresponding state complexity bounds for regular string languages. (Thesis, Ph.D., Computing, Queen's University, 2012.)
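As an illustrative sketch of the model described in this abstract (not code from the thesis), the following Python example runs a deterministic bottom-up unranked tree automaton in which each transition is decided by a DFA over the sequence of the children's vertical states; the Boolean-tree example and all names are invented.

```python
# Invented example: a deterministic bottom-up unranked tree automaton whose
# transitions are decided by DFAs (the horizontal languages) run over the
# sequence of the children's vertical states.  It evaluates Boolean trees
# with unranked "and"/"or" nodes.
class DFA:
    def __init__(self, start, delta, accepting):
        self.start, self.delta, self.accepting = start, delta, accepting

    def accepts(self, word):
        state = self.start
        for symbol in word:
            state = self.delta[(state, symbol)]
        return state in self.accepting

# Horizontal DFAs over the vertical-state alphabet {"T", "F"}.
all_T  = DFA(0, {(0, "T"): 0, (0, "F"): 1, (1, "T"): 1, (1, "F"): 1}, {0})
some_T = DFA(0, {(0, "T"): 1, (0, "F"): 0, (1, "T"): 1, (1, "F"): 1}, {1})

# label -> (target vertical state, horizontal DFA); if the DFA rejects the
# children's state string, the node falls back to the vertical state "F".
transitions = {
    "and": ("T", all_T),    # true iff every child evaluates to "T"
    "or":  ("T", some_T),   # true iff some child evaluates to "T"
    "1":   ("T", all_T),    # a leaf's empty child string is in T*
    "0":   ("F", all_T),
}

def run(tree):
    """tree = (label, [subtrees]); returns the node's vertical state."""
    label, children = tree
    child_states = [run(child) for child in children]
    target, dfa = transitions[label]
    return target if dfa.accepts(child_states) else "F"

example = ("and", [("1", []), ("or", [("0", []), ("1", [])])])
print(run(example))  # "T"
```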