  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Multiagent classical planning

Crosby, Matthew David January 2014 (has links)
Classical planning problems consist of an environment in a predefined state; a set of deterministic actions that, under certain conditions, change the state of the environment; and a set of goal conditions. A solution to a classical planning problem is a sequence of actions that leads from the initial state to a state satisfying the problem’s goal conditions. There are many methods for finding solutions to classical planning problems, and a popular technique is to exploit structures that commonly occur. One such structure, apparent in many planning domains, is a breakdown of the problem into multiple agents. However, methods for finding and exploiting multiagent structures are not prevalent in the literature and are currently not competitive. This thesis sets out to rectify this problem.

Its first main contribution is to introduce a domain-independent algorithm for extracting multiagent structure from classical planning problems. The algorithm relies on identifying a generalisable property of agents in planning; namely, that agents are entities with an internal state, a part of the planning problem that, under a certain distribution of actions, only they can modify. Once this is appropriately formalised, the decomposition algorithm is introduced and is shown to produce identifiably multiagent decompositions over all of the classical planning domains used in the International Planning Competitions, in certain cases even finding more detailed decompositions than those used by humans.

Solving multiagent planning problems can be challenging because a solution may require complex inter-agent coordination. The second main contribution of the thesis is a heuristic planning algorithm that effectively exploits the structure of decomposed domains. The algorithm transforms the coordination problem into a process of subgoal generation that can be solved efficiently under a well-known relaxation in planning. The generated subgoals guide the search so that it always works on one single-agent subproblem at a time. The algorithm is evaluated and shown to greatly outperform current state-of-the-art planners over decomposable domains.

The thesis also discusses possible extensions of this work to the multiagent concepts of self-interested agents and concurrent actions. Results from the multiagent planning literature are improved upon, and a new solution concept is presented that accounts for the ‘farsightedness’ inherent in planning. A method is then presented that can find stable solutions for a certain class of multiagent planning problems. A new method is introduced for modelling concurrent actions that allows them to be written without requiring knowledge of the other agents in the domain, and it is shown how such problems can be solved by a translation to single-agent planning.
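To make the idea of agent decomposition concrete, the following minimal Python sketch groups state variables by "ownership": a variable is treated as internal to an agent if only that agent's actions ever modify it. This is an illustration of the general property described above, not the thesis's actual decomposition algorithm, and the action triples are invented for the example.

    # Illustrative sketch only: a simplified "variable ownership" decomposition.
    # Actions are hypothetical (name, actor, modified_variables) triples.
    from collections import defaultdict

    def decompose(actions):
        """Split variables into per-agent internal sets and a shared public set."""
        modifiers = defaultdict(set)           # variable -> actors that modify it
        for name, actor, modified in actions:
            for var in modified:
                modifiers[var].add(actor)
        internal = defaultdict(set)            # actor -> variables only it modifies
        public = set()                         # variables touched by several actors
        for var, actors in modifiers.items():
            if len(actors) == 1:
                internal[next(iter(actors))].add(var)
            else:
                public.add(var)
        return dict(internal), public

    # Tiny hypothetical logistics-style example: each truck's location is internal
    # to that truck; the package's location is public because both trucks move it.
    actions = [
        ("drive-t1", "truck1", {"at-truck1"}),
        ("drive-t2", "truck2", {"at-truck2"}),
        ("load-t1",  "truck1", {"in-pkg"}),
        ("load-t2",  "truck2", {"in-pkg"}),
    ]
    print(decompose(actions))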
2

Additive abstraction-based heuristics

Yang, Fan 06 1900 (has links)
In this thesis, we study additive abstraction-based heuristics theoretically and empirically. First, we present formal general definitions for abstractions that extend to general additive abstractions. We show that the general definition makes proofs of admissibility, consistency, and additivity easier, by proving that several previous methods for defining abstractions and additivity satisfy three simple conditions. Second, we investigate two general methods for defining additive abstractions and run experiments to determine the effectiveness of these methods for two benchmark state spaces: TopSpin and the Pancake puzzle. Third, we propose that the accuracy of the heuristics generated by abstraction can be improved by checking for infeasibility. The theory and experiments demonstrate how to detect infeasibility and how to apply this technique to different domains. Finally, we explore the applications of additive abstraction-based heuristics in two state spaces with nonuniform edge costs: the Sequential Ordering Problem (SOP) and the weighted Pancake puzzle. We formalize a novel way of generating additive and non-additive heuristics for these state spaces. Furthermore, we investigate the key concepts needed to generate good additive and non-additive abstractions. Experiments show that, compared to some simple alternative heuristics, well-chosen abstractions can enhance the quality of suboptimal solutions for large SOP instances and reduce search time for the weighted Pancake problems.
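As a hedged illustration of why additive abstraction heuristics stay admissible, the sketch below uses a toy "counters" domain rather than the thesis's TopSpin or Pancake benchmarks: each operator increments a single counter at unit cost, so projecting the state onto disjoint groups of counters yields abstractions whose goal distances can safely be summed, because every operator's cost is counted in at most one abstraction.

    # Illustrative sketch only: additive abstractions on a hypothetical "counters"
    # domain. Each operator increments one counter by 1 (cost 1); an abstraction
    # keeps only the counters in its pattern.

    def abstract_distance(state, goal, pattern):
        """Exact goal distance in the projection onto the counters in `pattern`."""
        return sum(goal[i] - state[i] for i in pattern)

    def additive_heuristic(state, goal, patterns):
        # The patterns must be disjoint for the sum to remain admissible.
        return sum(abstract_distance(state, goal, p) for p in patterns)

    state = (0, 2, 1, 0)
    goal  = (3, 4, 2, 5)
    patterns = [(0, 1), (2, 3)]              # disjoint groups of counters
    h = additive_heuristic(state, goal, patterns)
    true_cost = sum(g - s for s, g in zip(state, goal))
    print(h, true_cost, h <= true_cost)      # in this toy domain the sum is exact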
3

Additive abstraction-based heuristics

Yang, Fan Unknown Date
No description available.
4

Symbolic Bidirectional Breadth-First Heuristic Search

Richards, Simon Kim 11 December 2004 (has links)
A Reduced Ordered Binary Decision Diagram (BDD) is a symbolic data structure introduced to the model checking community by Bryant in 1986 to help verify properties of systems with very large state spaces. Recently, BDDs have been used in heuristic search algorithms as an approach to representing and solving search problems with very large state spaces. However, these algorithms are still not memory efficient. This thesis presents a symbolic heuristic search algorithm that uses BDDs in a memory-efficient way by performing bidirectional breadth-first heuristic search. The approach is evaluated empirically against existing symbolic methods and is shown to provide a significant improvement in performance.
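The sketch below illustrates the bidirectional breadth-first idea with explicit Python sets standing in for BDD-encoded frontiers; in the symbolic setting each frontier would be a BDD and successor generation a relational product. The ring-shaped toy state space and the expand-the-smaller-frontier policy are assumptions of this example, not details taken from the thesis.

    # Illustrative sketch only: bidirectional breadth-first search with frontiers
    # kept as sets of states (a stand-in for BDD-represented state sets).

    def neighbours(s):
        return {(s + 1) % 8, (s - 1) % 8}    # hypothetical 8-state ring domain

    def bidirectional_bfs(start, goal):
        fwd, bwd = {start}, {goal}
        fwd_seen, bwd_seen = set(fwd), set(bwd)
        depth = 0
        while fwd and bwd:
            if fwd & bwd:                    # frontiers meet: shortest path found
                return depth
            if len(fwd) <= len(bwd):         # expand the smaller frontier
                fwd = {n for s in fwd for n in neighbours(s)} - fwd_seen
                fwd_seen |= fwd
            else:
                bwd = {n for s in bwd for n in neighbours(s)} - bwd_seen
                bwd_seen |= bwd
            depth += 1
        return None

    print(bidirectional_bfs(0, 5))           # -> 3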
5

Learning Optimal Bayesian Networks with Heuristic Search

Malone, Brandon M 11 August 2012 (has links)
Bayesian networks are widely used graphical models that formalize reasoning under uncertainty. Unfortunately, construction of a Bayesian network by an expert is time-consuming, and, in some cases, not all experts may agree on the best structure for a problem domain. Additionally, for some complex systems such as those present in molecular biology, experts with an understanding of the entire domain and how individual components interact may not exist. In these cases, we must learn the network structure from available data. This dissertation focuses on score-based structure learning. In this context, a scoring function is used to measure the goodness of fit of a structure to data. The goal is to find the structure which optimizes the scoring function. The first contribution of this dissertation is a shortest-path finding perspective on the problem of learning optimal Bayesian network structures. This perspective builds on earlier dynamic programming strategies but, as we show, offers much more flexibility. Second, we develop a set of data structures to improve the efficiency of many of the integral calculations for structure learning. Most of these data structures benefit our algorithms as well as dynamic programming and other formulations of the structure learning problem. Next, we introduce a suite of algorithms that leverage the new data structures and the shortest-path finding perspective for structure learning. These algorithms take advantage of a number of new heuristic functions to ignore provably sub-optimal parts of the search space. They also exploit regularities in the search that previous approaches could not. All of the algorithms we present have their own advantages. Some minimize work in a provable sense; others use external memory such as hard disk to scale to datasets with more variables. Several of the algorithms quickly find solutions and improve them as long as they are given more resources. Our algorithms improve the state of the art in structure learning by running faster, using less memory and incorporating other desirable characteristics, such as anytime behavior. We also pose unanswered questions to drive future research.
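A hedged sketch of the shortest-path ("order graph") perspective follows: nodes are subsets of variables already placed in an ordering, and an edge adds one more variable at the cost of the best local score achievable with parents drawn from that subset. The local score table is invented for illustration; a real learner would compute scores such as BDeu or MDL from data, and the dissertation's algorithms add heuristics and specialised data structures that this sketch omits.

    # Illustrative sketch only: Dijkstra over the "order graph" for Bayesian
    # network structure learning, with a made-up local score table (lower = better).
    import heapq
    from itertools import combinations, count

    VARS = ("A", "B", "C")

    def best_local_score(var, candidates, local_score):
        """Cheapest score of `var` over all parent sets drawn from `candidates`."""
        best = float("inf")
        for k in range(len(candidates) + 1):
            for parents in combinations(sorted(candidates), k):
                best = min(best, local_score.get((var, parents), float("inf")))
        return best

    def optimal_structure_cost(local_score):
        start, goal = frozenset(), frozenset(VARS)
        dist, tie = {start: 0.0}, count()
        heap = [(0.0, next(tie), start)]
        while heap:
            d, _, u = heapq.heappop(heap)
            if u == goal:
                return d
            if d > dist.get(u, float("inf")):
                continue
            for var in set(VARS) - u:        # add one more variable to the order
                v = u | {var}
                nd = d + best_local_score(var, u, local_score)
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, next(tie), v))
        return float("inf")

    local_score = {                          # hypothetical (variable, parents) costs
        ("A", ()): 5.0, ("B", ()): 6.0, ("C", ()): 7.0,
        ("B", ("A",)): 3.0, ("C", ("B",)): 2.0, ("C", ("A", "B")): 1.5,
    }
    print(optimal_structure_cost(local_score))   # -> 9.5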
6

A Greedy Search Algorithm for Maneuver-Based Motion Planning of Agile Vehicles

Neas, Charles Bennett 29 December 2010 (has links)
This thesis presents a greedy search algorithm for maneuver-based motion planning of agile vehicles. In maneuver-based motion planning, vehicle maneuvers are solved offline and saved in a library to be used during motion planning. From this library, a tree of possible vehicle states can be generated through the search space. A depth-first, library-based algorithm called AD-Lib is developed and used to quickly provide feasible trajectories along the tree. AD-Lib combines greedy search techniques with hill climbing and effective backtracking to guide the search process rapidly towards the goal. Using simulations of a four-thruster hovercraft, AD-Lib is compared to existing suboptimal search algorithms in both known and unknown environments with static obstacles. AD-Lib is shown to be faster than existing techniques, at the expense of increased path cost. The motion planning strategy of AD-Lib along with a switching controller is also tested in an environment with dynamic obstacles. / Master of Science
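As a rough structural analogy for greedy, depth-first search over a maneuver library with backtracking, the sketch below plans on a grid using four hypothetical "maneuvers"; AD-Lib itself searches over precomputed vehicle maneuvers in a continuous state space, so the primitives, cost model and obstacle set here are assumptions of the example.

    # Illustrative sketch only: greedy depth-first search over a tiny library of
    # grid "maneuvers", ordering successors by distance to the goal and
    # backtracking from dead ends.

    LIBRARY = [(1, 0), (-1, 0), (0, 1), (0, -1)]      # hypothetical maneuvers

    def greedy_dfs(start, goal, obstacles, budget=500):
        def h(p):                                     # straight-line distance to goal
            return ((p[0] - goal[0]) ** 2 + (p[1] - goal[1]) ** 2) ** 0.5

        path, visited = [start], {start}

        def recurse(p, remaining):
            if p == goal:
                return True
            if remaining <= 0:
                return False
            # greedy ordering: try the maneuver that ends closest to the goal first
            for dx, dy in sorted(LIBRARY, key=lambda m: h((p[0] + m[0], p[1] + m[1]))):
                q = (p[0] + dx, p[1] + dy)
                if q in visited or q in obstacles:
                    continue
                visited.add(q)
                path.append(q)
                if recurse(q, remaining - 1):
                    return True
                path.pop()                            # backtrack
            return False

        return path if recurse(start, budget) else None

    obstacles = {(1, -1), (1, 0), (1, 1)}             # a small wall to route around
    print(greedy_dfs((0, 0), (3, 0), obstacles))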
7

Automatic detection and classification of leukaemia cells

Ismail, Waidah Binti January 2012 (has links)
Today, there is a substantial number of software packages and research groups focused on the development of image processing software to extract useful information from medical images, in order to assist and improve patient diagnosis. The work presented in this thesis is centred on the processing of images of blood and bone marrow smears of patients suffering from leukaemia, a common type of cancer. In general, cancer is due to aberrant gene expression, which is caused by either mutations or epigenetic changes in DNA. Poor diet and an unhealthy lifestyle may trigger or contribute to these changes, although the underlying mechanism is often unknown. Importantly, many cancer types including leukaemia are curable, and patient survival and treatment can be improved, subject to prompt diagnosis. In particular, this study focuses on Acute Myeloid Leukaemia (AML), which can be of eight distinct types (M0 to M7), with the main objective to develop a methodology to automatically detect and classify leukaemia cells into one of the above types. The data was collected from the Department of Haematology, Universiti Sains Malaysia, in Malaysia. Three main methods are employed: Cellular Automata, Heuristic Search, and classification using Neural Networks. In the case of Cellular Automata, an improved method based on the 8-neighbourhood and a set of rules was developed to remove noise from images and estimate the radius of the potential blast cells contained in them. The proposed methodology selects the starting points, corresponding to potential blast cells, for the subsequent seeded heuristic search. The Seeded Heuristic employs a new fitness function for blast cell detection. Furthermore, the WEKA software is utilised for classification of blast cells, and hence images, into AML subtypes. As a result, an accuracy of 97.22% was achieved in the classification of blasts into M3 and other AML subtypes. Finally, these algorithms are integrated into an automated system for image processing. In brief, the research presented in this thesis involves the use of advanced computational techniques for the processing and classification of medical images, that is, images of blood samples from patients suffering from leukaemia.
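The sketch below shows a single cellular-automaton style noise-removal pass over a binary image using the 8-neighbourhood; the specific rule used here (remove foreground pixels with fewer than two foreground neighbours) is a simplified stand-in for, not a reproduction of, the rules developed in the thesis.

    # Illustrative sketch only: one 8-neighbourhood cellular-automaton update that
    # removes isolated foreground pixels from a binary image.

    def denoise_step(img):
        h, w = len(img), len(img[0])
        out = [row[:] for row in img]
        for y in range(h):
            for x in range(w):
                if img[y][x] == 1:
                    neighbours = sum(
                        img[ny][nx]
                        for ny in range(max(0, y - 1), min(h, y + 2))
                        for nx in range(max(0, x - 1), min(w, x + 2))
                        if (ny, nx) != (y, x)
                    )
                    if neighbours < 2:       # isolated speck: treat as noise
                        out[y][x] = 0
        return out

    img = [
        [0, 0, 0, 0, 1],                     # the lone corner pixel is removed
        [0, 1, 1, 0, 0],
        [0, 1, 1, 0, 0],                     # the 2x2 blob (a candidate cell) survives
        [0, 0, 0, 0, 0],
    ]
    for row in denoise_step(img):
        print(row)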
8

Structure and inference in classical planning

Lipovetzky, Nir 07 December 2012 (has links)
Classical planning is the problem of finding a sequence of actions for achieving a goal from an initial state assuming that actions have deterministic effects. The most effective approach for finding such plans is based on heuristic search guided by heuristics extracted automatically from the problem representation. In this thesis, we introduce alternative approaches for performing inference over the structure of planning problems that do not appeal to heuristic functions, nor to reductions to other formalisms such as SAT or CSP. We show that many of the standard benchmark domains can be solved with almost no search or a polynomially bounded amount of search, once the structure of planning problems is taken into account. In certain cases we can characterize this structure in terms of a novel width parameter for classical planning.
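As one concrete, simplified example of structural inference without heuristic functions, the sketch below runs breadth-first search with a novelty-based pruning rule: a newly generated state is kept only if it makes some atom true for the first time in the search. This is offered as an illustration in the spirit of width-based methods, not as the thesis's exact definitions, and the toy chain domain is invented.

    # Illustrative sketch only: breadth-first search that prunes states containing
    # no atom seen for the first time ("novelty 1" pruning). States are frozensets
    # of atoms; the chain domain below is hypothetical.
    from collections import deque

    def novelty_bfs(initial, goal_atom, successors):
        seen_atoms = set(initial)
        queue = deque([(initial, 0)])
        while queue:
            state, depth = queue.popleft()
            if goal_atom in state:
                return depth
            for nxt in successors(state):
                new_atoms = nxt - seen_atoms
                if not new_atoms:            # nothing novel: prune this state
                    continue
                seen_atoms |= new_atoms
                queue.append((nxt, depth + 1))
        return None

    def successors(state):                   # toy chain: "move" adds the next at-i
        i = max(int(a.split("-")[1]) for a in state if a.startswith("at-"))
        return [frozenset(state | {f"at-{i + 1}"})] if i < 5 else []

    print(novelty_bfs(frozenset({"at-0"}), "at-5", successors))   # -> 5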
9

Exact and heuristic algorithms for the Euclidean Steiner tree problem

Van Laarhoven, Jon William 01 July 2010 (has links)
In this thesis, we study the Euclidean Steiner tree problem (ESTP), which arises in the field of combinatorial optimization. The ESTP asks for a network of minimal total edge length spanning a set of given terminal points in R^d, with the ability to add auxiliary connecting points (Steiner points) to decrease the overall length of the network. The graph theory literature contains extensive studies of exact, approximation, and heuristic algorithms for the ESTP in the plane, but less is known in higher dimensions. The contributions of this thesis include a heuristic algorithm and enhancements to an exact algorithm for solving the ESTP. We present a local search heuristic for the ESTP in R^d for d ≥ 2. The algorithm utilizes the Delaunay triangulation to generate candidate Steiner points for insertion, the minimum spanning tree to create a network on the inserted points, and second-order cone programming to optimize the locations of Steiner points. Unlike other ESTP heuristics relying on the Delaunay triangulation, the algorithm inserts Steiner points probabilistically into Delaunay triangles to achieve different subtrees on subsets of terminal points. The algorithm extends effectively into higher dimensions, and we present computational results on benchmark test problems in R^d for 2 ≤ d ≤ 5. We also develop new geometric conditions, derived from properties of Steiner trees, which bound from below the number of Steiner points on paths between terminals in the Steiner minimal tree. We describe conditions for trees with a Steiner topology and show how these conditions relate to the Voronoi diagram. We describe more restrictive conditions for trees with a full Steiner topology and their implementation into the algorithm of Smith (1992). We present computational results on benchmark test problems in R^d for 2 ≤ d ≤ 5.
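A hedged sketch of the Delaunay-based insertion idea follows: candidate Steiner points (here simply triangle centroids) are proposed from the Delaunay triangulation of the terminals and kept whenever they shorten the minimum spanning tree. The thesis's heuristic inserts points probabilistically and refines their coordinates with second-order cone programming, both of which this sketch omits; NumPy and SciPy are assumed to be available.

    # Illustrative sketch only: propose Delaunay triangle centroids as Steiner
    # points and keep those that shorten the minimum spanning tree.
    import numpy as np
    from scipy.spatial import Delaunay, distance_matrix
    from scipy.sparse.csgraph import minimum_spanning_tree

    def mst_length(points):
        return minimum_spanning_tree(distance_matrix(points, points)).sum()

    def greedy_steiner(terminals):
        points = np.asarray(terminals, dtype=float)
        best_len = mst_length(points)
        for simplex in Delaunay(points).simplices:
            candidate = points[simplex].mean(axis=0)     # triangle centroid
            trial = np.vstack([points, candidate])
            trial_len = mst_length(trial)
            if trial_len < best_len - 1e-9:              # keep improving candidates
                points, best_len = trial, trial_len
        return points, best_len

    terminals = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.9)]
    points, length = greedy_steiner(terminals)
    print(len(points) - len(terminals), "Steiner point(s) added; length", round(length, 4))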
10

Water quality modeling and rainfall estimation: a data driven approach

Roz, Evan Phillips 01 July 2011 (has links)
Water is vital to man, and its quality is a serious topic of concern. Addressing sustainability issues requires new understanding of water quality and water transport. Past research in hydrology has focused primarily on physics-based models to explain hydrological transport and water quality processes. The widespread use of in situ hydrological instrumentation has provided researchers a wealth of data to use for analysis, and therefore the use of data mining for data-driven modeling is warranted. In fact, this relatively new field of hydroinformatics makes use of the vast data collection and communication networks that are prevalent in the field of hydrology. In this thesis, a data-driven approach for analyzing water quality is introduced. Improvements in data collection and information systems allow the collection of large volumes of data. Although improvements in data collection systems have given researchers sufficient information about various systems, they must be used in conjunction with novel data-mining algorithms to build models and recognize patterns in large data sets. Since the mid-1990s, data mining has been successfully used for model extraction and for describing various phenomena of interest.
