1

Memory-efficient graph search applied to multiple sequence alignment

Zhou, Rong 06 August 2005 (has links)
Graph search is used in many areas of computer science. It is well-known that the scalability of graph-search algorithms such as A* is limited by their memory requirements. In this dissertation, I describe three complementary strategies for reducing the memory requirements of graph-search algorithms, especially for multiple sequence alignment (a central problem in computational molecular biology). These search strategies dramatically increase the range and difficulty of multiple sequence alignment problems that can be solved. The first strategy uses a divide-and-conquer method of solution reconstruction, and one of my contributions is to show that when divide-and-conquer solution reconstruction is used, a layer-by-layer strategy for multiple sequence alignment is more memory-efficient than a best-first strategy. The second strategy is a new approach to duplicate detection in external-memory graph search that involves partitioning the search graph based on an abstraction of the state space. For graphs with sufficient local structure, it allows graph-search algorithms to use external memory, such as disk storage, almost as efficiently as internal memory. The third strategy is a technique for reducing the memory requirements of sub-alignment search heuristics that are stored in lookup tables. It uses the start and goal states of a problem instance to restrict the region of the state space for which a table-based heuristic is needed, making it possible to store more accurate heuristic estimates in the same amount of memory. These three strategies dramatically improve the scalability of graph search not only for multiple sequence alignment, but for many other graph-search problems, and generalizations of these search strategies for other graph-search problems are discussed throughout the dissertation.
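A minimal sketch of the layer-by-layer memory idea, shown here for pairwise alignment only (the dissertation treats the much harder multiple-alignment case): the cost is computed one layer at a time so that only two layers are ever resident in memory, with recovery of the actual alignment left to a separate divide-and-conquer reconstruction pass.

```python
# Illustrative sketch only: pairwise alignment cost computed layer by layer,
# keeping just the previous and current layers in memory.  The dissertation's
# algorithms apply this idea (plus divide-and-conquer solution reconstruction
# and structured duplicate detection) to multiple sequence alignment.

def alignment_cost(a: str, b: str, match=0, mismatch=1, gap=1) -> int:
    """Edit-style alignment cost of a and b using only two layers of the DP lattice."""
    if len(b) > len(a):                    # keep the shorter string along the layer
        a, b = b, a
    prev = [j * gap for j in range(len(b) + 1)]    # layer i-1
    for i in range(1, len(a) + 1):
        curr = [i * gap] + [0] * len(b)            # layer i
        for j in range(1, len(b) + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            curr[j] = min(prev[j - 1] + sub,       # align a[i-1] with b[j-1]
                          prev[j] + gap,           # gap in b
                          curr[j - 1] + gap)       # gap in a
        prev = curr                                # the older layer is discarded
    return prev[-1]

print(alignment_cost("ACACACTA", "AGCACACA"))      # -> 2
```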
2

Automated 3-D segmentation and analysis of retinal optical coherence tomography images

Garvin, Mona Kathryn 01 January 2008 (has links)
Optical coherence tomography (OCT) is becoming an increasingly important modality for the noninvasive assessment of a variety of ocular diseases such as glaucoma, diabetic macular edema, and age-related macular degeneration. Even though individual layers of the retina are visible on OCT images, current commercial quantitative assessment is limited to measuring the thickness of only one layer. Because each intraretinal layer may be affected differently by disease, an intraretinal layer segmentation approach is needed to enable quantification of individual layer properties, such as thickness or texture. Furthermore, with the latest generation of OCT scanner systems producing true volumetric image data, processing these images using 3-D methods is important for maximal extraction of image information. In this thesis, an optimal 3-D graph search approach for the intraretinal layer segmentation of OCT images is presented. It is built upon the optimal 3-D multiple surface graph-theoretic approach presented by Li et al. (K. Li, X. Wu, D. Z. Chen, and M. Sonka, "Optimal surface segmentation in volumetric images - a graph-theoretic approach," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 1, pp. 119-134, 2006). In this method, multiple surfaces can be found simultaneously by transforming the 3-D segmentation problem into finding a minimum-cost closed set in a corresponding vertex-weighted geometric graph. However, the original formulation of this approach did not incorporate varying feasibility constraints or true regional information, two extensions that would aid in the intraretinal layer segmentation of OCT images. Thus, the major contributions of this thesis include: 1) extending the optimal 3-D graph-theoretic segmentation approach to allow for the incorporation of varying feasibility constraints and regional information, 2) developing a method for learning varying constraints and cost functions from examples for use in the approach, 3) developing and validating a method for the 3-D segmentation of intraretinal layers in both time-domain and spectral-domain OCT images (achieving error rates comparable to those of multiple human experts), and 4) analyzing layer thickness properties in normal subjects and in patients with anterior ischemic optic neuropathy (AION).
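For context, the minimum-cost closed set computation at the heart of the Li et al. approach reduces to a single minimum s-t cut. The sketch below (Python with networkx) illustrates that reduction on an arbitrary directed graph; it is not the thesis's segmentation code, which builds the graph from surface feasibility constraints and image-derived costs.

```python
# Sketch of the closure-to-min-cut reduction used by minimum-cost closed set
# formulations (illustrative only).
import networkx as nx

def min_cost_closed_set(vertex_cost, edges):
    """Minimum-cost closed set: choose S minimizing the sum of vertex costs,
    subject to the closure constraint  u in S and (u, v) in edges  =>  v in S."""
    G = nx.DiGraph()
    s, t = "_source", "_sink"
    G.add_nodes_from([s, t])
    for v, c in vertex_cost.items():
        if c < 0:
            G.add_edge(s, v, capacity=-c)   # reward for including v
        elif c > 0:
            G.add_edge(v, t, capacity=c)    # penalty for including v
    for u, v in edges:
        G.add_edge(u, v)                    # no capacity attribute = infinite capacity
    _, (source_side, _) = nx.minimum_cut(G, s, t)
    return source_side - {s}

costs = {"a": -5, "b": 2, "c": 1}
print(min_cost_closed_set(costs, [("a", "b")]))   # {'a', 'b'}: total cost -5 + 2 = -3
```

In the surface-segmentation setting, each voxel column contributes a chain of vertices whose weights encode the surface cost function, and the closure edges encode the smoothness and surface-interaction constraints.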
3

MRI fat-water separation using graph search based methods

Cui, Chen 01 August 2017 (has links)
The separation of water and fat from multi-echo images is a classic problem in magnetic resonance imaging (MRI) with a wide range of important clinical applications. For example, removal of fat signal can provide better visualization of other signals of interest in MRI scans. In other cases, the fat distribution map can be of great importance in diagnosis. Although many methods have been proposed over the past three decades, robust fat water separation remains a challenge as radiological technology and clinical expectations continue to grow. The problem presents three key difficulties: a) the presence of B0 field inhomogeneities, often large in state-of-the-art research and clinical settings, which makes the problem non-linear and ill-posed; b) the ambiguity of signal modeling in locations with only one metabolite (either fat or water), which can manifest as spurious fat water swaps in the separation; c) the computational expense of fat water separation as data sizes increase with evolving MRI hardware, which hampers its clinical applicability. The main focus of this thesis is to develop novel graph-based algorithms to estimate the B0 field inhomogeneity maps and separate fat water signals with global accuracy and computational efficiency. We propose a new smoothness-constrained framework, GlObally Optimal Surface Estimation (GOOSE), in which the spatial smoothness of the B0 field is modeled as a finite constraint between adjacent voxels in a uniformly discretized graph. We further develop a new non-equidistant graph model that enables Rapid GlObally Optimal Surface Estimation (R-GOOSE) in a subset of the fully discretized graph in GOOSE. Extensions of the above frameworks are also developed to achieve high computational efficiency for processing large 3D datasets. Global convergence of the optimization formulation is proven in all frameworks. The developed methods have also been extensively compared to existing state-of-the-art fat water separation methods on a variety of datasets, consistently showing high accuracy and efficiency.
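To make the signal-model side concrete, a hedged per-voxel sketch is shown below (Python/NumPy): for a candidate field-map value psi, water and fat amplitudes follow from linear least squares, and the residual over a discretized range of psi is the kind of node cost that graph formulations such as GOOSE connect across neighbouring voxels under smoothness constraints. The single-peak fat shift and the 3 T value are assumptions for illustration; practical methods use a multi-peak fat spectrum, and the GOOSE graph construction itself is not reproduced here.

```python
import numpy as np

FAT_SHIFT_HZ = -434.0   # assumed single-peak fat/water shift near 3 T; sign convention varies

def water_fat_ls(signal, echo_times_s, psi_hz):
    """Least-squares water/fat estimate at one voxel for a candidate field offset
    psi (Hz), under a single-peak fat model.  Returns (water, fat, residual)."""
    t = np.asarray(echo_times_s, dtype=float)
    s = np.asarray(signal, dtype=complex) * np.exp(-2j * np.pi * psi_hz * t)  # remove field term
    A = np.stack([np.ones_like(t, dtype=complex),
                  np.exp(2j * np.pi * FAT_SHIFT_HZ * t)], axis=1)
    x, *_ = np.linalg.lstsq(A, s, rcond=None)
    return x[0], x[1], float(np.linalg.norm(A @ x - s))

# Sweeping psi over a discretized range yields the per-voxel cost profile whose
# spatially smooth combination across voxels the graph search then optimizes.
echoes = [0.0012, 0.0022, 0.0032]                                # echo times in seconds
sim = (1.0 + 0.3 * np.exp(2j * np.pi * FAT_SHIFT_HZ * np.asarray(echoes))) \
      * np.exp(2j * np.pi * 50.0 * np.asarray(echoes))           # water 1.0, fat 0.3, psi = 50 Hz
costs = {psi: water_fat_ls(sim, echoes, psi)[2] for psi in range(-200, 201, 10)}
print(min(costs, key=costs.get))                                  # -> 50
```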
4

Automated 3-D segmentation of intraretinal surfaces from optical coherence tomography images centered on the optic nerve head

Antony, Bhavna Josephine 01 December 2009 (has links)
Optical coherence tomography (OCT), being a noninvasive imaging modality, has begun to find vast use in the diagnosis and management of retinal diseases. These high-resolution images of the retina allow structural changes to be detected and tracked. For instance, in glaucoma, the retinal nerve fiber layer (RNFL) has been known to thin. The recent availability of considerably larger volumetric data from spectral-domain OCT scanners has further increased the need for new processing techniques. This body of work is centered around an automated 3-D graph-theoretic approach for the segmentation of 7 surfaces (6 layers) of the retina from 3-D spectral-domain OCT images centered on the optic nerve head (ONH). The multiple surfaces are detected through the computation of a minimum-cost closed set in a vertex-weighted graph constructed using edge/regional information, and subject to a priori determined varying surface interaction and smoothness constraints. The method also addresses the challenges posed by the presence of the neural canal and the large blood vessels found at the ONH. The method was used to study RNFL thickness maps of normal and glaucomatous eyes, which showed average thicknesses of 73.72 +/- 32.72 um and 60.38 +/- 25.22 um (p < 0.01), respectively.
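Downstream of the segmentation, the reported thickness statistics come from simple per-A-scan differencing of two segmented surfaces; a hedged sketch follows (Python/NumPy/SciPy, with made-up example numbers, and Welch's t-test standing in for whatever statistical test was actually used).

```python
import numpy as np
from scipy import stats

def mean_thickness_um(top_surface, bottom_surface, voxel_depth_um):
    """Mean layer thickness (um) from two segmented surfaces given as 2-D arrays
    of z-voxel indices, one value per A-scan."""
    diff = np.asarray(bottom_surface) - np.asarray(top_surface)
    return float(np.mean(diff) * voxel_depth_um)

# Group comparison of per-eye mean RNFL thicknesses (the values below are illustrative only).
normal_eyes = [78.1, 70.4, 75.9, 72.5, 74.0]
glaucoma_eyes = [61.3, 58.7, 63.0, 59.9, 60.2]
t_stat, p_value = stats.ttest_ind(normal_eyes, glaucoma_eyes, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```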
5

Axiom relevance decision engine : technical report

Frank, Mario January 2012 (has links)
This document presents an axiom selection technique for classical first-order theorem proving based on the relevance of axioms for the proof of a conjecture. It is based on the unifiability of predicates and does not need statistical information such as symbol frequency. The scope of the technique is the reduction of the axiom set and the increase in the number of provable conjectures in a given time. Since the technique generates a subset of the axiom set, it can be used as a preprocessor for automated theorem proving. This technical report describes the conception, implementation and evaluation of ARDE. The selection method, which is based on a breadth-first graph search by unifiability of predicates, is a weakened form of the connection calculus and uses specialised variants of unifiability to speed up the selection. The implementation of the concept is evaluated by comparison with the results of the world championship of theorem provers of the year 2012 (CASC J6). It is shown that both the theorem prover leanCoP, which uses the connection calculus, and E, which uses equality reasoning, can benefit from the selection approach. The evaluation also shows that the concept is applicable to theorem-proving problems with thousands of formulae and that the selection is independent of the calculus used by the theorem prover. / This technical report describes the design, implementation and evaluation of a method for selecting logical formulae according to their relevance for the proof of a given formula. The method is applied exclusively to first-order predicate logic, although it is also suitable for higher-order predicate logics. It uses a unification-based breadth-first search on a graph in which every node is a predicate and every edge a unifiability relation. The goal of the method is to reduce a given set of formulae to a size that current theorem provers can handle, which makes it suitable as a preprocessing step for automated theorem proving. To speed up the search, a weakened form of unification is used in addition to standard unification. The system was submitted together with the theorem prover leanCoP at the 2012 world championship of theorem provers (CASC J6) in Manchester and helped leanCoP solve problems that leanCoP alone cannot handle. Tests with leanCoP and the theorem prover E after the championship show that the method is independent of the calculus used and, for both provers, improves the provability of problems with large sets of formulae.
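A heavily simplified sketch of the selection idea (Python; the weakened "unifiability" is reduced here to matching predicate name and arity, whereas the report uses specialised unification variants over actual first-order terms):

```python
# Simplified illustration of unifiability-based axiom selection via breadth-first
# search.  Formulae are represented only by their (predicate, arity) pairs; the
# real system parses first-order syntax and applies weakened unification to terms.
def select_axioms(conjecture_preds, axioms, depth_limit=3):
    """axioms: list of sets of (predicate, arity) pairs.  Returns indices of axioms
    reachable from the conjecture within depth_limit relevance levels."""
    reachable = set(conjecture_preds)
    selected, remaining = set(), set(range(len(axioms)))
    for _ in range(depth_limit):
        newly = {i for i in remaining if axioms[i] & reachable}
        if not newly:
            break
        selected |= newly
        remaining -= newly
        for i in newly:
            reachable |= axioms[i]
    return sorted(selected)

conjecture = {("mortal", 1)}
axioms = [{("human", 1), ("mortal", 1)},   # human(X) => mortal(X)
          {("human", 1)},                  # human(socrates)
          {("prime", 1), ("odd", 1)}]      # unrelated to the conjecture
print(select_axioms(conjecture, axioms))   # -> [0, 1]
```

The selected subset is then handed to the prover (leanCoP or E in the report's evaluation) in place of the full axiom set.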
6

Graph Search as a Feature in Imperative/Procedural Programming Languages

January 2018 (has links)
Graph theory is a critical component of computer science and software engineering, with algorithms for graph traversal and comprehension powering solutions to many of the largest problems in both industry and research. Engineers and researchers often have an accurate view of their target graph; however, they struggle to implement a correct and efficient search over that graph. To facilitate rapid, correct, efficient, and intuitive development of graph-based solutions, we propose a new programming language construct - the search statement. Given a supra-root node, a procedure that determines the children of a given parent node, and optional definitions of the fail-fast acceptance or rejection of a solution, the search statement can conduct a search over any graph or network. Structurally, this statement is modelled after the common switch statement and is put into a largely imperative/procedural context to allow for immediate and intuitive development by most programmers. The Go programming language has been used as a foundation and proof of concept for the search statement. A Go compiler that implements this construct is provided. / Dissertation/Thesis / Masters Thesis Software Engineering 2018
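For readers who want the semantics without the Go compiler, a rough functional analogue of the proposed construct might look like the following (written in Python rather than Go to keep the code samples in this listing uniform; the parameter names mirror the description above and are not the thesis's actual syntax):

```python
from collections import deque

def search(supra_root, children, accept, reject=lambda node: False):
    """Conceptual analogue of the proposed search statement: traverse the graph
    reachable from supra_root via the user-supplied children procedure, pruning
    rejected nodes (fail-fast) and yielding accepted ones."""
    frontier, visited = deque([supra_root]), {supra_root}
    while frontier:
        node = frontier.popleft()
        if reject(node):
            continue                      # fail-fast rejection: do not expand this node
        if accept(node):
            yield node
        for child in children(node):
            if child not in visited:
                visited.add(child)
                frontier.append(child)

# Example: numbers reachable from 1 by n+3 or n*2, accepting the even ones, capped at 50.
found = search(1, lambda n: [m for m in (n + 3, n * 2) if m <= 50], lambda n: n % 2 == 0)
print(sorted(found))
```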
7

Ranked Search on Data Graphs

Varadarajan, Ramakrishna R. 10 March 2009 (has links)
Graph-structured databases are widely prevalent, and the problem of effective search and retrieval from such graphs has been receiving much attention recently. For example, the Web can be naturally viewed as a graph. Likewise, a relational database can be viewed as a graph where tuples are modeled as vertices connected via foreign-key relationships. Keyword search querying has emerged as one of the most effective paradigms for information discovery, especially over HTML documents in the World Wide Web. One of the key advantages of keyword search querying is its simplicity: users do not have to learn a complex query language, and can issue queries without any prior knowledge about the structure of the underlying data. The purpose of this dissertation was to develop techniques for user-friendly, high-quality and efficient searching of graph-structured databases. Several ranked search methods on data graphs have been studied in recent years. Given a top-k keyword search query on a graph and some ranking criteria, a keyword proximity search finds the top-k answers where each answer is a substructure of the graph containing all query keywords, which illustrates the relationship between the keywords present in the graph. We applied keyword proximity search on the web and the page graph of web documents to find top-k answers that satisfy the user's information need and increase user satisfaction. Another effective ranking mechanism applied to data graphs is authority-flow based ranking. Given a top-k keyword search query on a graph, an authority-flow based search finds the top-k answers where each answer is a node in the graph ranked according to its relevance and importance to the query. We developed techniques that improve authority-flow based search on data graphs by creating a framework to explain and reformulate such queries, taking into consideration user preferences and feedback. We also applied the proposed graph search techniques to information discovery over biological databases. Our algorithms were experimentally evaluated for performance and quality. The quality of our method was compared to that of current approaches using user surveys.
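As a rough illustration of the authority-flow idea (not the dissertation's specific framework or its explanation and reformulation machinery), a personalised-PageRank ranking seeded at the keyword-matching nodes can be sketched as follows (Python with networkx; the graph and node names are made up):

```python
import networkx as nx

def authority_flow_topk(graph, keyword_nodes, k=10, damping=0.85):
    """Rank nodes by the authority flowing out of the nodes that contain the query
    keywords (personalised PageRank), and return the top-k node ids."""
    keyword_set = set(keyword_nodes)
    seed = {n: (1.0 if n in keyword_set else 0.0) for n in graph}
    scores = nx.pagerank(graph, alpha=damping, personalization=seed)
    return sorted(scores, key=scores.get, reverse=True)[:k]

G = nx.DiGraph([("paper1", "gene_a"), ("paper2", "gene_a"), ("paper2", "gene_b"),
                ("gene_a", "pathway_x"), ("gene_b", "pathway_x")])
print(authority_flow_topk(G, keyword_nodes=["paper1", "paper2"], k=3))
```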
8

Query on Knowledge Graphs with Hierarchical Relationships

Liu, Kaihua 27 October 2017 (has links)
The dramatic popularity of graph databases has resulted in a growing interest in graph queries. Graph queries span two major topics. One is based on structural relationships to find meaningful results, such as subgraph pattern matching and shortest-path queries. The other focuses on semantics-based queries that answer questions over knowledge bases. However, most of these queries treat knowledge graphs as flat and use only ordinary, non-hierarchical relationships to mine them, which may lead to mistakes in the query results. In this thesis, we identify hierarchical relationships in knowledge graphs based on their semantic relations and make use of them when querying knowledge graphs; we then propose a meaningful query and a corresponding efficient query algorithm to obtain top-k answers on hierarchical knowledge graphs. We also design algorithms for distributed frameworks, which improves performance. To demonstrate the effectiveness and efficiency of our algorithms, we use CISCO-related product information crawled from official websites to run experiments on distributed frameworks.
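A hedged sketch of the hierarchical-expansion idea (Python with networkx; the relation label, scoring function, and data are placeholders, and the thesis's actual ranking model and distributed implementation are not reproduced here):

```python
import networkx as nx

def descendants_in_hierarchy(kg, concept, hierarchical_relation="subClassOf"):
    """Concepts at or below `concept`, following only edges tagged with the
    hierarchical relation (stored as child --subClassOf--> parent)."""
    below, stack = {concept}, [concept]
    while stack:
        parent = stack.pop()
        for child, _, data in kg.in_edges(parent, data=True):
            if data.get("relation") == hierarchical_relation and child not in below:
                below.add(child)
                stack.append(child)
    return below

def topk_answers(kg, concept, score, k=5):
    """Top-k entities matching the query concept or any of its hierarchical
    descendants, ranked by a caller-supplied score function."""
    return sorted(descendants_in_hierarchy(kg, concept), key=score, reverse=True)[:k]

KG = nx.DiGraph()
KG.add_edge("router_x", "Routers", relation="subClassOf")
KG.add_edge("Routers", "NetworkDevices", relation="subClassOf")
KG.add_edge("switch_y", "Switches", relation="subClassOf")
print(topk_answers(KG, "NetworkDevices", score=len, k=3))
```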
9

Video game pathfinding and improvements to discrete search on grid-based maps

Anguelov, Bobby 02 March 2012 (has links)
The most basic requirement for any computer-controlled game agent in a video game is to be able to successfully navigate the game environment. Pathfinding is an essential component of any agent navigation system. Pathfinding is, at the simplest level, a search technique for finding a route between two points in an environment. The real-time, multi-agent nature of video games places extremely tight constraints on the pathfinding problem. This study aims to provide the first complete review of the current state of video game pathfinding, both with regard to the graph search algorithms employed and to the implications of pathfinding within dynamic game environments. Furthermore, this thesis presents novel work in the form of a domain-specific search algorithm for use on grid-based game maps: the spatial grid A* algorithm, which is shown to offer significant improvements over A* within the intended domain. Copyright / Dissertation (MSc)--University of Pretoria, 2011. / Computer Science / unrestricted
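For reference, the baseline that grid-specific variants such as the spatial grid A* set out to improve is plain A* on a grid. A minimal sketch follows (Python, 4-connected grid, Manhattan heuristic); the spatial grid A* algorithm itself is the thesis's contribution and is not reproduced here.

```python
import heapq

def astar_grid(grid, start, goal):
    """Plain A* on a 4-connected grid (0 = free, 1 = blocked) with a Manhattan
    heuristic; returns the path as a list of (row, col) cells, or None."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start)]
    g = {start: 0}
    parent = {start: None}
    while open_heap:
        _, cost, node = heapq.heappop(open_heap)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        if cost > g[node]:
            continue                      # stale heap entry, a cheaper path was found
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                new_cost = cost + 1
                if new_cost < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = new_cost
                    parent[(nr, nc)] = node
                    heapq.heappush(open_heap, (new_cost + h((nr, nc)), new_cost, (nr, nc)))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar_grid(grid, (0, 0), (2, 0)))
```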
10

Problemas de Corte e Empacotamento: Uma abordagem em Grafo E/OU / Cutting and packing problems: an AND/OR-Graph approach

Vianna, Andréa Carla Gonçalves 19 December 2000 (has links)
The cutting problem consists of cutting larger objects in order to produce smaller pieces in such a way that a given objective function is optimized, for example, so that waste is minimized. The packing problem can also be seen as a cutting problem, where the position at which each smaller piece is arranged inside the objects can be seen as the place it was cut from. An AND/OR-graph approach to solving cutting and packing problems was initially proposed by Morabito (1989) for the two-dimensional cutting problem and later extended to three-dimensional problems (Morabito, 1992). That approach uses a hybrid search, combining depth-first search under a depth bound with a hill-climbing strategy; heuristics were devised based on upper and lower bounds. Computational experiments demonstrated its practical feasibility. The AND/OR-graph approach was later generalized by Arenales (1993) based on simple hypotheses: he showed that different cutting problems could be solved using the AND/OR-graph approach, independently of the dimension and of the shapes of objects and items, without, however, carrying out computational studies. The main objective of this thesis is the practical extension of the AND/OR-graph approach to handle other cases not considered by previous work, including situations involving different cutting processes, together with the computational implementation of methods based on the approach, thereby showing its versatility for handling many classes of practical cutting and packing problems and its computational feasibility.
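To give a flavour of the AND/OR decomposition for guillotine cutting (a sketch only: plain exhaustive recursion with memoisation, not the hybrid depth-limited/hill-climbing search developed in the Morabito/Arenales line of work or in this thesis; the piece data are made up):

```python
from functools import lru_cache

# AND/OR view of guillotine cutting: an OR choice selects either a piece to keep
# or a cut position, and each cut is an AND node requiring both sub-rectangles
# to be solved.
PIECES = [(3, 2, 4), (2, 2, 3), (5, 1, 2)]   # (width, height, value), illustrative data

def best_value(width, height, pieces=tuple(PIECES)):
    @lru_cache(maxsize=None)
    def solve(w, h):
        # OR: keep the best single piece that fits, or ...
        best = max((v for pw, ph, v in pieces if pw <= w and ph <= h), default=0)
        for pw, ph, _ in pieces:
            if pw < w:   # ... cut vertically at x = pw (AND: solve both halves)
                best = max(best, solve(pw, h) + solve(w - pw, h))
            if ph < h:   # ... cut horizontally at y = ph
                best = max(best, solve(w, ph) + solve(w, h - ph))
        return best
    return solve(width, height)

print(best_value(6, 4))   # -> 18
```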
