561

New Algorithm and Data Structures for the All Pairs Shortest Path Problem

Hashim, Mashitoh January 2013 (has links)
In 1985, the Moffat-Takaoka (MT) algorithm was developed to solve the all pairs shortest path (APSP) problem. This algorithm achieves an expected running time of O(n² log n) under the end-point independent probabilistic model. However, the critical point introduced in the algorithm makes its implementation quite complicated and its running time difficult to analyze. Therefore, this study introduces a new deterministic algorithm for the APSP problem that provides an alternative to the existing MT algorithm. The major advantages of this approach compared to the MT algorithm are its simplicity, intuitive appeal and ease of analysis. Moreover, the algorithm was shown to be efficient, with the same expected running time of O(n² log n). The performance of such an algorithm depends on the data structures used to speed up the operations it needs, such as insert, delete-min and decrease-key. In this study, two new data structures have been implemented, namely the quaternary and dimensional heaps. In the experiments carried out, the quaternary heap, which employs a concept similar to the trinomial heap together with a special insertion cache function, performed better than the trinomial heap when the number of vertices n was small. Likewise, the dimensional heap data structure executed the decrease-key operation efficiently by maintaining the thinnest structure possible through the use of thin and thick edges, far surpassing the existing binary, Fibonacci and 2-3 heap data structures when a special acyclic graph was used. Taken together, these promising findings suggest that a new improved algorithm running on a good data structure can be implemented to enhance the computing accuracy and speed of today's computing machines.
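As a point of reference for how heap operations drive an APSP computation, the sketch below solves APSP by running a heap-based Dijkstra search from every vertex. This is only a generic illustration, not the MT algorithm or the thesis's new algorithm; decrease-key is simulated lazily by re-inserting a vertex with its improved distance.

```python
# Illustrative APSP sketch: repeated Dijkstra with a binary heap (heapq).
# The heap supplies the insert and delete-min operations mentioned in the
# abstract; decrease-key is emulated by lazy re-insertion.
import heapq

def all_pairs_shortest_paths(graph):
    """graph: {u: {v: weight, ...}, ...} with non-negative weights."""
    def dijkstra(src):
        dist = {src: 0}
        heap = [(0, src)]                          # (distance, vertex)
        while heap:
            d, u = heapq.heappop(heap)             # delete-min
            if d > dist.get(u, float("inf")):      # stale entry from lazy decrease-key
                continue
            for v, w in graph[u].items():
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))  # insert / lazy decrease-key
        return dist
    return {u: dijkstra(u) for u in graph}

# Example
g = {"a": {"b": 2, "c": 5}, "b": {"c": 1}, "c": {"a": 3}}
print(all_pairs_shortest_paths(g))
```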
562

A statistical model for locating regulatory regions in novel DNA sequences

Byng, Martyn Charles January 2001 (has links)
No description available.
563

Three Essays on Networks and Public Economics

Bouchard St Amant, Pier-André 15 October 2013 (has links)
This thesis is a collection of three essays. The first two study how ideas spread through a network of individuals, and how an advertiser can exploit this. In the model I develop, users choose their sources of information based on their perceived usefulness. This contrasts with previous literature, in which network users make no such choice and the information flow is therefore fixed. I provide a complete theoretical characterization of the solution and define a natural measure of influence based on users' choices. I also present an algorithm to solve the model in polynomial time on any network, regardless of its scale or topology. I also discuss the properties of a network technology from a public economics standpoint. In essence, a network allows the advertiser to reproduce ideas for free. If there is a free-riding problem, I show that coalitions of users on the network can solve it. I also discuss the social value of networks, a value that cannot be captured for profit. The third essay is completely distinct from the network paradigm and instead studies funding rules for public universities. I show that a funding rule that depends solely on enrolment leads to "competition by franchise" and that such behavior is sometimes inefficient. I suggest instead an alternative funding rule that allows the government to increase welfare without increasing spending on universities. / Thesis (Ph.D, Economics) -- Queen's University, 2013-10-11 17:16:42.178
564

Content-Based Image Retrieval for Tattoos: An Analysis and Comparison of Keypoint Detection Algorithms

Kemp, Neal 01 January 2013 (has links)
The field of biometrics has grown significantly in the past decade due to an increase in interest from law enforcement. Law enforcement officials are interested in adding tattoos alongside irises and fingerprints to their toolbox of biometrics. They often use these biometrics to aid in the identification of victims and suspects. Like facial recognition, tattoos have seen a spike in attention over the past few years. Tattoos, however, have not received as much attention from researchers. This lack of attention stems from the difficulty inherent in matching tattoos. Such difficulties include image quality, affine transformation, warping of tattoos around the body, and in some cases, excessive body hair covering the tattoo. We will utilize content-based image retrieval to find a tattoo in a database, which means using one image as a query against the database in order to find similar tattoos. We will focus specifically on the keypoint detection process in computer vision. In addition, we are interested in finding not just exact matches but also similar tattoos. We will conclude that the ORB detector pulls the most relevant features and thus offers the best chance of yielding an accurate result from content-based image retrieval for tattoos. However, we will also show that even ORB will not work on its own in a content-based image retrieval system. Other processes will have to be involved in order to return accurate matches. We will give recommendations on next steps to create a better tattoo retrieval system.
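A minimal sketch of the kind of ORB-based matching described above, using OpenCV. The file names are placeholders, and a real retrieval system would also need indexing, geometric verification, and a ranking stage, as the abstract itself notes.

```python
# ORB keypoint detection and brute-force matching between a query tattoo and
# one database image (paths are hypothetical placeholders).
import cv2

query = cv2.imread("query_tattoo.png", cv2.IMREAD_GRAYSCALE)
candidate = cv2.imread("db_tattoo.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=500)            # ORB keypoint detector + descriptor
kp1, des1 = orb.detectAndCompute(query, None)
kp2, des2 = orb.detectAndCompute(candidate, None)

# Hamming-distance brute-force matching suits ORB's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# A crude similarity score: count of reasonably close matches.
score = sum(1 for m in matches if m.distance < 64)
print(f"{len(matches)} matches, {score} of them strong")
```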
565

Wireless Transmission Method of Emergency Response---An Implementation for Multiple Sinks Data Collection in Wireless Sensor Networks

Chen, Nan January 2013 (has links)
In wireless sensor networks, source nodes usually send their data packets to a single sink following a specific routing protocol, so unicast delivery is the dominant means of data transmission through the network. However, if one of the links in the route fails, a dynamic routing protocol requires the routers to search for a new transmission route, and this search takes additional time. This paper envisions an emergency scenario in which source nodes must deliver emergency information packets to any reachable sink as soon as possible. In such a case, multicast delivery should be used instead of unicast delivery to save time.
566

Labirintų generavimo programa / Labyrinth’s Generation Program

Steišiūnas, Aivaras 03 September 2010 (has links)
Objective: to create a labyrinth generation program. Aspects examined in the analytical part: labyrinth classification and labyrinth generation algorithms. Prototypes of the product under development: deadalus 2.2, mazemaker 1.0, mazeworks. Technological solution: HTML, Java, XML, PNG. Software used: NetBeans, Paint, Microsoft Word, Internet Explorer, Altova Umodel, Natepad++. UML diagrams used: a use case diagram illustrates what actions the user can perform with the software; a business context diagram depicts the material flows between the user and the software; a state diagram shows the labyrinth's states while the program runs; two sequence diagrams show the order in which actions are performed during labyrinth generation; two activity diagrams give a detailed realization of the algorithm; a class diagram shows how many classes there are and how they are related; a component diagram shows the software subsystems. Scope of testing: generation, rendering, solving, editing, saving and loading were tested. Components of the user documentation: about the program, settings, labyrinth generation, loading a labyrinth from a file, saving a labyrinth, saving the labyrinth image, file structure, solving a labyrinth, creating a labyrinth, labyrinth generation. / This document describes maze creation and generation. It begins with some history of labyrinths, followed by a maze classification and the types of mazes, and presents five maze generation algorithms. The next chapter establishes the requirements for the software and gives some information about existing solutions. The third chapter graphically describes the code that implements the algorithms: how the algorithms work, how many classes the project uses, and how they are related. Software test results and a user guide are also included.
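The abstract mentions five maze generation algorithms without naming them here. As a representative example only (not necessarily one the author implemented), the sketch below generates a maze with the classic depth-first-search "recursive backtracker".

```python
# Depth-first-search ("recursive backtracker") maze generation sketch.
# The result is a spanning tree of passages over a width x height grid.
import random

def generate_maze(width, height):
    """Return a dict mapping each cell to the set of neighbours it connects to."""
    passages = {(x, y): set() for x in range(width) for y in range(height)}
    visited = {(0, 0)}
    stack = [(0, 0)]
    while stack:
        x, y = stack[-1]
        neighbours = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if (x + dx, y + dy) in passages and (x + dx, y + dy) not in visited]
        if neighbours:
            nxt = random.choice(neighbours)
            passages[(x, y)].add(nxt)          # carve a passage both ways
            passages[nxt].add((x, y))
            visited.add(nxt)
            stack.append(nxt)
        else:
            stack.pop()                        # dead end: backtrack
    return passages

maze = generate_maze(8, 8)
print(len(maze), "cells,", sum(len(v) for v in maze.values()) // 2, "passages")
```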
567

A Hybrid Hole-filling Algorithm

Long, Junhui 12 September 2013 (has links)
A polygon mesh, or 3D mesh, consisting of a collection of vertices, edges, and polygons in three-dimensional space, is the standard way of representing 3D objects. In practice, polygon meshes acquired from the 3D scanning process fail to meet the quality requirements of most practical applications. Mesh defects such as holes, duplicate elements, and non-manifold elements are introduced during the scanning process, which lowers the quality of the output meshes. In this thesis, we describe a complete mesh-repairing process that fixes all defects within a polygon mesh. This process is divided into two parts: the mesh-cleaning part and the hole-filling part. In the mesh-cleaning part, we describe ways of repairing different types of mesh defects. In the hole-filling part, we discuss two main hole-filling approaches: the surface-based approach and the volumetric approach. In addition, we present a hybrid algorithm that combines the surface-based and volumetric approaches. We compare the meshes created by different hole-filling algorithms and show that the new algorithm is a good alternative to the existing ones. / Thesis (Master, Computing) -- Queen's University, 2013-09-11 23:45:08.591
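A common first step in any hole-filling pipeline (generic preprocessing, not the hybrid algorithm described in the thesis) is locating the hole boundaries: in a closed manifold triangle mesh every edge is shared by exactly two triangles, so edges used by only one triangle lie on a hole boundary.

```python
# Find boundary (hole) edges of a triangle mesh: edges that belong to exactly
# one triangle.
from collections import Counter

def boundary_edges(triangles):
    """triangles: list of (i, j, k) vertex-index triples."""
    edge_count = Counter()
    for a, b, c in triangles:
        for u, v in ((a, b), (b, c), (c, a)):
            edge_count[tuple(sorted((u, v)))] += 1
    return [e for e, n in edge_count.items() if n == 1]

# Two triangles sharing one edge: the shared edge (1, 2) is interior,
# the other four edges are boundary.
mesh = [(0, 1, 2), (1, 3, 2)]
print(boundary_edges(mesh))
```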
568

Exact and Monte-Carlo algorithms for combinatorial games / Exakta och Monte-Carlo algoritmer för kombinatoriska spel

Leino, Anders January 2014 (has links)
This thesis concerns combinatorial games and algorithms that can be used to play them. Basic definitions and results about combinatorial games are covered, and an implementation of the minimax algorithm with alpha-beta pruning is presented. Following this, we give a description and implementation of the common UCT (Upper Confidence bounds applied to Trees) variant of MCTS (Monte-Carlo tree search). Then, a framework for testing the behavior of UCT as first player, at various numbers of iterations (namely 2, 7, ..., 27), versus minimax as second player, is described. Finally, we present the results obtained by applying this framework to the 2.2 million smallest non-trivial positional games having winning sets of size either 2 or 3. It is seen that on almost all classifications of the games studied, UCT converges quickly to near-perfect play.
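The abstract mentions an implementation of minimax with alpha-beta pruning; below is a generic, game-agnostic sketch of that algorithm. The `game` interface (is_terminal, value, moves, play) is a hypothetical stand-in for whatever game representation the thesis uses.

```python
# Generic minimax with alpha-beta pruning. Values are taken from the first
# player's point of view (e.g. +1 win, 0 draw, -1 loss).
def alphabeta(game, state, depth, alpha=float("-inf"), beta=float("inf"), maximizing=True):
    if depth == 0 or game.is_terminal(state):
        return game.value(state)
    if maximizing:
        best = float("-inf")
        for move in game.moves(state):
            best = max(best, alphabeta(game, game.play(state, move), depth - 1,
                                       alpha, beta, False))
            alpha = max(alpha, best)
            if alpha >= beta:                  # beta cut-off: opponent avoids this branch
                break
        return best
    else:
        best = float("inf")
        for move in game.moves(state):
            best = min(best, alphabeta(game, game.play(state, move), depth - 1,
                                       alpha, beta, True))
            beta = min(beta, best)
            if alpha >= beta:                  # alpha cut-off
                break
        return best
```

The UCT player referenced in the abstract instead grows a search tree by repeatedly descending to the child i that maximizes the standard selection rule Q_i/N_i + c*sqrt(ln N / N_i), where Q_i and N_i are the child's total reward and visit count, N is the parent's visit count, and c is an exploration constant.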
569

Design and Analysis of Intelligent Fuzzy Tension Controllers for Rolling Mills

Liu, Jingrong January 2002 (has links)
This thesis presents a fuzzy logic controller aimed at maintaining constant tension between two adjacent stands in tandem rolling mills. The fuzzy tension controller monitors tension variation by comparing the electric currents of different operation modes and sets the reference for the speed controller of the upstream stand. Based on modeling the rolling stand as a single-input single-output linear discrete system, which works in the normal mode and is subject to internal and external noise, the element settings and parameter selections in the design of the fuzzy controller are discussed. To improve the performance of the fuzzy controller, a dynamic fuzzy controller is proposed. By switching the fuzzy controller elements in relation to the step response, both transient and stationary performance are enhanced. To endow the fuzzy controller with the intelligence of generalization, flexibility and adaptivity, self-learning techniques are introduced to obtain the fuzzy controller parameters. With the inclusion of supervision and concern for conventional control criteria, the parameters of the fuzzy inference system are tuned by a backpropagation algorithm, or their optimal values are located by means of a genetic algorithm. In simulations, the neuro-fuzzy tension controller exhibits real-time applicability, while the genetic fuzzy tension controller reveals an outstanding global optimization ability.
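A minimal Mamdani-style sketch of the general idea of fuzzy tension control: fuzzify the tension error, fire a few rules, and defuzzify by a weighted average to obtain a speed-reference correction for the upstream stand. The membership shapes, rule set, and output levels below are hypothetical placeholders, not the controller designed in the thesis.

```python
# Toy fuzzy inference step for tension control (all numbers are illustrative).
def tri(x, a, b, c):
    """Triangular membership function with peak at b and feet at a and c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_speed_correction(tension_error):
    # Fuzzification: degrees of "low", "ok", "high" tension error.
    low  = tri(tension_error, -2.0, -1.0, 0.0)
    ok   = tri(tension_error, -1.0,  0.0, 1.0)
    high = tri(tension_error,  0.0,  1.0, 2.0)
    # Rules: low tension -> speed up the upstream stand; high tension -> slow it down.
    firing = [(low, +0.5), (ok, 0.0), (high, -0.5)]   # (rule strength, crisp output)
    total = sum(w for w, _ in firing)
    # Weighted-average defuzzification.
    return sum(w * out for w, out in firing) / total if total else 0.0

print(fuzzy_speed_correction(0.3))   # small positive error -> small negative correction
```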
570

Reconstruction and Visualization of Polyhedra Using Projections

Hasan, Masud January 2005 (has links)
Two types of problems are studied in this thesis: reconstruction and visualization of polygons and polyhedra.

Three problems are considered in the reconstruction of polygons and polyhedra, given a set of projection characteristics. The first problem is to reconstruct a closed convex polygon (polyhedron) given the number of visible edges (faces) from each direction in a set of directions S. The main results for this problem include necessary and sufficient conditions for the existence of a polygon that realizes the projections. This characterization gives an algorithm to construct a feasible polygon when one exists. The other main result is an algorithm to find the maximum and minimum size of a feasible polygon for the given set S. Some special cases for non-convex polygons and for perspective projections are also studied.

For the reconstruction of polyhedra, it is shown that when the projection directions are co-planar, a feasible polyhedron (i.e. a polyhedron satisfying the projection properties) can be constructed from a feasible polygon and vice versa. When the directions are covered by two planes and the number of visible faces from each direction is at least four, an algorithm is presented to decide the existence of a feasible polyhedron and to construct one when it exists. When the directions see an arbitrary number of faces, the same algorithm works, except for a particular sub-case.

A polyhedron is, in general, called equiprojective if from any direction the size of the projection or the projection boundary is fixed, where "size" means the number of vertices, edges, or faces. A special problem in the reconstruction of polyhedra is to find all equiprojective polyhedra. For the case when the size is the number of vertices in the projection boundary, the main results include a characterization of all equiprojective polyhedra, an algorithm to recognize them, and the identification of the minimum equiprojective polyhedra. Other measures of equiprojectivity are also studied.

Finally, the problem of efficient visualization of polyhedra under given constraints is considered. A user might wish to find a projection that highlights certain properties of a polyhedron. In particular, the problem considered is: given a set of vertices, edges, and/or faces of a convex polyhedron, determine all projections of the polyhedron such that the elements of the given set are on the projection boundary. The results include efficient algorithms for both perspective and orthogonal projections, and an improved adaptive algorithm when only edges are given and they form disjoint paths. A related problem of finding all projections where the given edges, faces, and/or vertices are not on the projection boundary is also studied.
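The reconstruction problems above build on a basic projection fact: under orthogonal projection, a face of a convex polyhedron with outward unit normal n is visible from a view direction d (pointing from the polyhedron toward the viewer) exactly when n · d > 0, while faces with n · d = 0 project onto the silhouette. The sketch below counts visible faces for a cube; the cube data is just an example, not taken from the thesis.

```python
# Count faces of a convex polyhedron visible along a view direction d
# (d points from the object toward the viewer).
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def count_visible_faces(face_normals, direction):
    return sum(1 for n in face_normals if dot(n, direction) > 0)

# Outward face normals of an axis-aligned cube.
cube = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
print(count_visible_faces(cube, (1, 1, 1)))   # 3 faces visible from a corner direction
print(count_visible_faces(cube, (1, 0, 0)))   # 1 face visible head-on
```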
