Evaluating and Improving the Efficiency of Software and Algorithms for Sequence Data Analysis

Eaves, Hugh L 01 January 2016
With the ever-growing size of sequence data sets, data processing and analysis are an increasingly large portion of the time and money spent on nucleic acid sequencing projects. Correspondingly, the performance of the software and algorithms used to perform that analysis has a direct effect on the time and expense involved. Although the analytical methods are widely varied, certain types of software and algorithms are applicable to a number of areas. Targeting improvements to these common elements has the potential for wide-reaching rewards. This dissertation research consisted of several projects to characterize and improve the efficiency of several common elements of sequence data analysis software and algorithms. The first project sought to improve the efficiency of the short read mapping process, as mapping is the most time-consuming step in many data analysis pipelines. The result was a new short read mapping algorithm and software package, demonstrated to be more computationally efficient than existing software and enabling more of the raw data to be utilized. While developing this software, it was discovered that a widely used bioinformatics software library introduced a great deal of inefficiency into the application. Given the potential impact of similar libraries on other applications, and because little research had been done to evaluate library efficiency, the second project evaluated the efficiency of seven of the most popular bioinformatics software libraries, written in C++, Java, Python, and Perl. This evaluation showed that two of the libraries written in the most popular language, Java, were an order of magnitude slower and used more memory than expected based on the language in which they were implemented. The third and final project, therefore, was the development of a new general-purpose bioinformatics software library for Java. This library, known as BioMojo, incorporated a new design approach resulting in vastly improved efficiency. Assessing the performance of this new library using the benchmark methods developed for the second project showed that BioMojo outperformed all of the other libraries across all benchmark tasks, being up to 30 times more CPU-efficient than existing Java libraries.
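
For a sense of the benchmark methodology, here is a minimal sketch in Python of the kind of timing-and-memory harness such a comparison might use; `parse_fn` stands in for each library's parser and is our illustrative assumption, not the dissertation's actual benchmark code (which targeted C++, Java, Python, and Perl libraries).

```python
import time
import tracemalloc

def benchmark(parse_fn, path, repeats=3):
    """Time a full parse of `path` and report peak allocation (illustrative only)."""
    best, peak = float("inf"), 0
    n_records = 0
    for _ in range(repeats):
        tracemalloc.start()
        t0 = time.perf_counter()
        n_records = sum(1 for _ in parse_fn(path))  # force full iteration
        elapsed = time.perf_counter() - t0
        _, run_peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        best, peak = min(best, elapsed), max(peak, run_peak)
    return n_records, best, peak

# usage (hypothetical parsers): benchmark(lib_a.read_fasta, "reads.fa")
```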

Algorithms for trigonometric polynomial and rational approximation

Javed, Mohsin January 2016
This thesis presents new numerical algorithms for approximating functions by trigonometric polynomials and trigonometric rational functions. We begin by reviewing trigonometric polynomial interpolation and the barycentric formula for trigonometric polynomial interpolation in Chapter 1. Another feature of this chapter is the use of the complex plane, contour integrals, and phase portraits for visualising various properties of and relationships between periodic functions and their Laurent and trigonometric series. We also derive a periodic analogue of the Hermite integral formula, which enables us to analyze interpolation error using contour integrals; we have not been able to find such a formula in the literature. Chapter 2 discusses trigonometric rational interpolation and trigonometric linearized rational least-squares approximation. To our knowledge, this is the first attempt to solve these problems numerically. The contribution of this chapter is a robust algorithm for computing trigonometric rational interpolants of prescribed numerator and denominator degrees at an arbitrary grid of interpolation points. The algorithm can also be used to compute trigonometric linearized rational least-squares and trigonometric polynomial least-squares approximations. Chapter 3 deals with the problem of trigonometric minimax approximation of functions, first in a space of trigonometric polynomials and then in a set of trigonometric rational functions. The contribution of this chapter is an algorithm which, to our knowledge, is the first Remez-like algorithm to numerically compute trigonometric minimax polynomial and rational approximations. Our algorithm also uses trigonometric barycentric interpolation and Chebyshev eigenvalue-based rootfinding. Chapter 4 discusses the Fourier-Padé (also called trigonometric Padé) approximation of a function. We review two existing approaches to the problem, both of which are based on rational approximations of a Laurent series. We present a numerical algorithm with examples and compute trigonometric Padé approximants of various types (m, n).
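
As an illustration of the barycentric approach the thesis builds on, here is a sketch of trigonometric barycentric interpolation at an odd number of equispaced points; this is the textbook formula (weights (-1)^k against a 1/sin((x - x_k)/2) kernel), stated here under that assumption, not the thesis's own algorithm.

```python
import numpy as np

def trig_barycentric(xk, fk, x):
    """Evaluate the trigonometric interpolant through (xk, fk) at the points x.

    Assumes an odd number of equispaced nodes xk on [0, 2*pi), for which the
    barycentric weights are (-1)^k against a 1/sin((x - xk)/2) kernel.
    """
    xk, fk = np.asarray(xk), np.asarray(fk)
    x = np.atleast_1d(np.asarray(x, float))
    assert len(xk) % 2 == 1, "this form of the formula needs an odd node count"
    w = (-1.0) ** np.arange(len(xk))
    d = 0.5 * (x[:, None] - xk[None, :])
    with np.errstate(divide="ignore", invalid="ignore"):
        kernel = w / np.sin(d)
        t = (kernel @ fk) / kernel.sum(axis=1)
    rows, cols = np.nonzero(np.isclose(np.sin(d), 0.0))  # x landed on a node
    t[rows] = fk[cols]                                   # use the data value there
    return t

# interpolate f(x) = exp(sin(x)) through 11 equispaced nodes
xk = 2 * np.pi * np.arange(11) / 11
tt = trig_barycentric(xk, np.exp(np.sin(xk)), np.linspace(0, 2 * np.pi, 200))
```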

Strategies for Encoding XML Documents in Relational Databases: Comparisons and Contrasts.

Leonard, Jonathan Lee 06 May 2006
The rise of XML as a de facto standard for document and data exchange has created a need to store and query XML documents in relational databases, today's de facto standard for data storage. Two common strategies for storing XML documents in relational databases, a process known as document shredding, are Interval encoding and ORDPATH encoding. Interval encoding, which uses a fixed mapping for shredding XML documents, tends to favor selection queries, at a potential cost of O(N) for supporting insertion queries. ORDPATH encoding, which uses a looser mapping for shredding XML, supports fixed-cost insertions, at a potential cost of longer-running selection queries. Experiments conducted for this research suggest that the break-even point between the two algorithms occurs when users issue an average of 1 insertion for every 5.6 queries, for documents between 1.5 MB and 4 MB in size. However, heterogeneous tests with varying mixes of selects and inserts indicate that Interval encoding always outperforms ORDPATH for mixes ranging from 76% to 88% selects. Queries for this experiment and sample documents were drawn from the XMark benchmark suite.
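
To make the interval-encoding idea concrete, here is a minimal Python sketch of document shredding: a depth-first walk assigns each element a (start, end) interval, and ancestry reduces to interval containment, which a relational database can test with simple comparisons. The function and row layout are illustrative assumptions, not the encoding exactly as benchmarked in the thesis.

```python
import xml.etree.ElementTree as ET

def shred(elem, rows=None, counter=None):
    """Flatten an element tree into (tag, start, end) rows via a depth-first walk."""
    if rows is None:
        rows, counter = [], [0]
    counter[0] += 1
    row = [elem.tag, counter[0], None]   # start number on the way down
    rows.append(row)
    for child in elem:
        shred(child, rows, counter)
    counter[0] += 1
    row[2] = counter[0]                  # end number on the way back up
    return rows

doc = ET.fromstring("<a><b><c/></b><d/></a>")
print(shred(doc))  # [['a', 1, 8], ['b', 2, 5], ['c', 3, 4], ['d', 6, 7]]
# In SQL terms, x is an ancestor of y iff x.start < y.start AND y.end < x.end.
```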

The Apprentices' Tower of Hanoi

Ball, Cory BH 01 May 2015
The Apprentices' Tower of Hanoi is introduced in this thesis. Several bounds are found with regard to optimal algorithms which solve the puzzle. Graph-theoretic properties of the associated state graphs are explored. A brief summary of other Tower of Hanoi variants is also presented.
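
For context, the classic three-peg puzzle that this and other variants build on can be solved with the familiar recursion sketched below; the Apprentices' variant's rules are not reproduced in the abstract, so only the standard puzzle is modeled here.

```python
def hanoi(n, src="A", dst="C", aux="B", moves=None):
    """Classic three-peg Tower of Hanoi: 2**n - 1 moves, provably optimal."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, src, aux, dst, moves)   # clear the top n-1 disks out of the way
        moves.append((src, dst))             # move the largest disk
        hanoi(n - 1, aux, dst, src, moves)   # restack the n-1 disks on top of it
    return moves

print(len(hanoi(4)))  # 15 == 2**4 - 1
```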

Electrodynamical Modeling for Light Transport Simulation

Saunders, Michael G 01 May 2017
Modern work in the computer graphics community is characterized by a burgeoning interest in physically based rendering techniques; that is to say, mathematical reasoning from first principles is widely preferred to ad hoc, approximate reasoning in blind pursuit of photorealism. Accordingly, the purpose of our research is to investigate the efficacy of explicit electrodynamical modeling by means of the generalized Jones vector given by Azzam [1] and the generalized Jones matrix given by Ortega-Quijano & Arce-Diego [2] in the context of stochastic light transport simulation for computer graphics. Augmenting the status quo path tracing framework with such a modeling technique would permit a plethora of complex optical effects (including dispersion, birefringence, dichroism, and thin-film interference) and the physical optical elements associated with these effects to become naturally supported, fully integrated features in physically based rendering software.
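
As a point of reference, classic 2x2 Jones calculus (which the generalized Jones vector and matrix cited above extend) models polarized light as a two-component complex vector and optical elements as matrices acting on it. The sketch below assumes an ideal linear polarizer and is our illustration, not drawn from the thesis itself.

```python
import numpy as np

def linear_polarizer(theta):
    """Jones matrix of an ideal linear polarizer with its axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s],
                     [c * s, s * s]])

horizontal = np.array([1.0, 0.0])                 # Jones vector: x-polarized light
out = linear_polarizer(np.pi / 4) @ horizontal    # pass through a 45-degree polarizer
print(np.vdot(out, out).real)                     # transmitted intensity: 0.5 (Malus's law)
```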

Application of Huffman Data Compression Algorithm in Hashing Computation

Devulapalli Venkata, Lakshmi Narasimha 01 April 2018
Cryptography is the art of protecting information by encrypting the original message into an unreadable format. A cryptographic hash function is a hash function which takes a text message of arbitrary length as input and converts it into a fixed-length digest in a way that is infeasible to invert. The values returned by the hash function are called the message digest, or simply hash values. Because of their versatility, hash functions are used in many applications such as message authentication, digital signatures, and password hashing [Thomsen and Knudsen, 2005]. The purpose of this study is to apply the Huffman data compression algorithm to the SHA-1 hash function in cryptography. The Huffman data compression algorithm is an optimal prefix-code compression algorithm in which the frequencies of the letters are used to compress the data [Huffman, 1952]. An integrated approach is applied to achieve a new compressed hash function by integrating Huffman compressed codes into the core hashing computation of the original hash function.
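
For reference, a minimal sketch of classic Huffman coding is shown below; repeatedly merging the two lightest subtrees yields an optimal prefix code. How such codes are integrated into SHA-1's computation is the thesis's own contribution and is not modeled here.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Assign shorter bit strings to more frequent symbols (classic Huffman)."""
    # heap entries: [weight, tiebreak, {symbol: code-so-far}]
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # the two lightest subtrees...
        w2, _, right = heapq.heappop(heap)
        for s in left:
            left[s] = "0" + left[s]
        for s in right:
            right[s] = "1" + right[s]
        heapq.heappush(heap, [w1 + w2, tiebreak, {**left, **right}])  # ...merge
        tiebreak += 1
    return heap[0][2]

print(huffman_codes("abracadabra"))  # 'a' (5 occurrences) gets the shortest code
```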

Exploring the Effect of Different Numbers of Convolutional Filters and Training Loops on the Performance of AlphaZero

Prince, Jared 01 October 2018
In this work, the algorithm used by AlphaZero is adapted for Dots and Boxes, a two-player game. The algorithm is explored using different numbers of convolutional filters and training loops in order to better understand the effect these parameters have on the learning of the player. Different board sizes are also tested to compare these parameters in relation to game complexity. AlphaZero originated as a Go player using an algorithm which combines Monte Carlo tree search (MCTS) and convolutional neural networks. This novel approach, integrating a reinforcement learning method previously applied to Go (MCTS) with a supervised learning method (neural networks), led to a player which beat all of its competitors.
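
The step that couples the network's policy output to the tree search is commonly written as the PUCT rule, Q(s,a) + c * P(s,a) * sqrt(sum N) / (1 + N(s,a)). The sketch below assumes that published form and illustrative constants; it is not the author's training code.

```python
import math

def puct_select(children, c_puct=1.5):
    """Pick the child maximizing Q + U under the PUCT rule (illustrative constants).

    Each child holds: N (visit count), W (total value), P (network prior).
    """
    total = sum(ch["N"] for ch in children.values())
    def score(ch):
        q = ch["W"] / ch["N"] if ch["N"] else 0.0                # mean value so far
        u = c_puct * ch["P"] * math.sqrt(total) / (1 + ch["N"])  # exploration bonus
        return q + u
    return max(children, key=lambda move: score(children[move]))

children = {"edge_1": {"N": 10, "W": 6.0, "P": 0.3},
            "edge_2": {"N": 2, "W": 1.5, "P": 0.5}}
print(puct_select(children))  # the less-visited, higher-prior move wins here
```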

Combinatorial Polynomial Hirsch Conjecture

Miller, Sam 01 January 2017
The Hirsch Conjecture states that for a d-dimensional polytope with n facets, the diameter of the graph of the polytope is at most n - d. This conjecture was disproven in 2010 by Francisco Santos Leal. However, a bound on the diameter that is polynomial in n and d may still exist. Finding a polynomial bound would provide a worst-case runtime bound for the simplex method of linear programming. However, working directly with polytopes in higher dimensions can prove challenging, so other approaches are welcome. There are many equivalent formulations of the problem, one of which is the Combinatorial Polynomial Hirsch Conjecture, which turns the problem into a matter of counting sets. This thesis explores the Combinatorial Polynomial Hirsch Conjecture.
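
One common statement of the combinatorial formulation, following Eisenbrand, Hähnle, Razborov, and Rothvoss, is sketched below; we quote it from the literature as background, since the abstract does not reproduce the thesis's own notation.

```latex
% f(n): the maximum length t of a sequence F_1, ..., F_t of nonempty,
% pairwise disjoint families of subsets of {1, ..., n} satisfying
\forall\, i < j < k,\ \forall A \in F_i,\ \forall B \in F_k:\quad
\exists\, C \in F_j \ \text{such that}\ A \cap B \subseteq C.
% The conjecture asserts that f(n) grows at most polynomially in n.
```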

An algorithm for two-dimensional density reconstruction in proton computed tomography (PCT)

Tafas, Jihad 01 January 2007
The purpose of this thesis is to develop an optimized and effective iterative reconstruction algorithm, together with hardware acceleration methods that work in concert with it, for reconstruction in proton computed tomography (pCT), which accurately maps the electron density.
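
As background on the kind of iterative scheme used in such reconstructions, here is a sketch of a Kaczmarz-style ART sweep, a standard method in the pCT literature; the matrix A, relaxation parameter, and loop structure are illustrative assumptions, not the thesis's optimized algorithm.

```python
import numpy as np

def art_reconstruct(A, b, sweeps=50, lam=0.5):
    """Kaczmarz-style ART: project the estimate onto each row's hyperplane in turn.

    A is the system matrix (one row per proton path), b the measured integrals,
    lam a relaxation parameter in (0, 2).
    """
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] > 0.0:
                residual = b[i] - A[i] @ x
                x += lam * (residual / row_norms[i]) * A[i]
    return x
```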

Introducing Non-Determinism to the Parallel C Compiler

Concepcion, Rowen 01 June 2014
The Planguages project is the birthplace of the Planguage programming approach, which is designed to alleviate the task of writing parallel programs and to harness massively parallel computers and networks of workstations. Planguage has two existing translators, Parallel C (PC) and Pfortran, for its base languages, C and Fortran77. The translators work with MPI (Message Passing Interface) for communications. SOS (ipStreams, Overlapping, and Shortcutting), a function library that supports the three named functionalities, can be used to further optimize parallel algorithms. This project is the next step in the continuing effort to update the PC Compiler. The goal is to test the viability of using "shortcutting" functions. Parallel programs with the ability to shortcut can be generated by the updated version of the PC Compiler. In addition, this project introduces the ability of the PC Compiler to translate a race condition into a non-deterministic solution. This document explores the different phases of the project in detail. The following phases are included: software design, algorithm design, analysis, and results. The deliverables, source code, and diagrams are included as appendices.
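
To illustrate what a non-deterministic (rather than racy) formulation can look like at the MPI level, here is a hedged mpi4py sketch: receiving with MPI.ANY_SOURCE accepts messages in whatever order they arrive, making the nondeterminism explicit and well-defined. This is our illustration only, not code generated by the PC Compiler.

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
if comm.Get_rank() == 0:
    # Accept results in whatever order they arrive: explicitly non-deterministic.
    for _ in range(comm.Get_size() - 1):
        status = MPI.Status()
        value = comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
        print(f"got {value} from rank {status.Get_source()}")
else:
    comm.send(comm.Get_rank() ** 2, dest=0)  # run with: mpirun -n 4 python demo.py
```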
