  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
361

A Faster Algorithm for Computing Straight Skeletons

Mencel, Liam A. 06 May 2014
We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomised and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε)) time.
362

UNSUPERVISED AND SEMI-SUPERVISED LEARNING IN AUTOMATIC INDUSTRIAL IMAGE INSPECTION

Weitao Tang (12462516) 27 April 2022
<p>Applying computer vision to X-ray images for automatic visual inspection has been widely studied in industrial production environments. Traditional methods embrace image processing techniques and require a custom design for each product. Although the accuracy of this approach varies, it often falls short of expectations in the production environment. Recently, deep learning algorithms have significantly advanced the capability of computer vision in various tasks and provided new prospects for automatic inspection systems. Numerous studies have applied supervised deep learning to inspect industrial images and reported promising results. However, the methods used in these studies are often supervised, which requires heavy manual annotation. This is unrealistic in many manufacturing scenarios because products are constantly updated: data collection, annotation, and algorithm training can only be performed after the manufacturing process is complete, causing a significant delay in training the models and establishing the inspection system. This research aimed to tackle that problem using unsupervised and semi-supervised methods so that these computer vision-based machine learning approaches can be rapidly deployed in real-life scenarios. More specifically, this dissertation proposed an unsupervised approach and a semi-supervised deep learning method to identify defective products from industrial inspection images. The proposed methods were evaluated on several open-source inspection datasets and a dataset of X-ray images obtained from a die casting plant. The results demonstrated that the proposed approach achieved better results than other state-of-the-art techniques on several occasions.</p>
363

A Genetic Algorithm Model for Financial Asset Diversification

Onek, Tristan 01 April 2019
Machine learning models can produce balanced financial portfolios through a variety of methods. Genetic algorithms are one such method that can optimally combine different funds that may occupy a portfolio. This study introduces a genetic algorithm model that finds optimal combinations of funds for a portfolio through a new approach to fitness formula calculation. Each fund in a given population has a base fitness score consisting of the sum of several technical analysis indicators. Each indicator chosen measures a different performance aspect of a fund, allowing for a balanced fitness score. Additionally, each fund has multiple category variables that determine diversity when combined into a portfolio. The base fitness score for each portfolio is the sum of its funds' individual fitness scores. Portfolio fitness scores adjust based on the included funds' category variable diversity. Portfolios that consist of funds with largely similar categories receive lower adjusted fitness scores and do not cross over. This process encourages strong and diversified portfolios to reproduce. This model creates diverse portfolios that outperform market benchmarks and demonstrates future potential as a diversification-aware investment strategy.
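The fitness scheme described in this abstract — per-fund base scores summed into a portfolio score, then adjusted downward when categories overlap — can be sketched roughly as follows. The tuple representation, the category labels, and the multiplicative diversity penalty are illustrative assumptions, not the thesis's exact formula:

```python
# A hypothetical fund: (base_score, category), where base_score stands in
# for the sum of several technical-analysis indicator scores.
def portfolio_fitness(portfolio):
    """Sum of fund scores, scaled down when fund categories overlap."""
    base = sum(score for score, _ in portfolio)
    distinct = len({category for _, category in portfolio})
    diversity = distinct / len(portfolio)  # 1.0 when all categories differ
    return base * diversity

def select_parents(population, k=2):
    """Pick the k fittest portfolios; only these go on to cross over."""
    return sorted(population, key=portfolio_fitness, reverse=True)[:k]
```

A portfolio of two funds from the same category keeps only half its base score, so diversified portfolios win selection even against slightly higher-scoring but concentrated ones — the "diversification-aware" pressure the abstract describes.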
364

Evaluating the Integration of Online, Interactive Tutorials into a Data Structures and Algorithms Course

Breakiron, Daniel Aubrey 28 May 2013
OpenDSA is a collection of open source tutorials for teaching data structures and algorithms. It was created with the goals of visualizing complex, abstract topics; increasing the amount of practice material available to students; and providing immediate feedback and incremental assessment. In this thesis, I first describe aspects of the OpenDSA architecture relevant to collecting user interaction data. I then present an analysis of the interaction log data gathered from three classes during Spring 2013. The analysis focuses on determining the time distribution of student activity, determining the time required for assignment completion, and exploring "credit-seeking" behaviors and behavior related to non-required exercises. We identified clusters of students based on when they completed exercises, verified the reliability of estimated time requirements for exercises, provided evidence that a majority of students do not read the text, discovered a measurement that could be used to identify exercises that require additional development, and found evidence that students complete exercises after obtaining credit. Furthermore, we determined that slideshow usage was fairly high (even when credit was not offered), and skipping to the end of slideshows was more common when credit was offered but also occurred when it was not. / Master of Science
365

Helical Antenna Optimization Using Genetic Algorithms

Lovestead, Raymond L. 06 October 1999
The genetic algorithm (GA) is used to design helical antennas that provide a significantly larger bandwidth than conventional helices with the same size. Over the bandwidth of operation, the GA-optimized helix offers considerably smaller axial-ratio and slightly higher gain than the conventional helix. Also, the input resistance remains relatively constant over the bandwidth. On the other hand, for nearly the same bandwidth and gain, the GA-optimized helix offers a size reduction of 2:1 relative to the conventional helix. The optimization is achieved by allowing the genetic algorithm to control a polynomial that defines the envelope around which the helix is wrapped. The fitness level is defined as a combination of gain, bandwidth and axial ratio as determined by an analysis of the helix using NEC2. To experimentally verify the optimization results, a prototype 12-turn, two-wavelength high, GA-helix is built and tested on the Virginia Tech outdoor antenna range. Far-field radiation patterns are measured over a wide frequency range. The axial-ratio information is extracted from the measured pattern data. Comparison of measured and NEC-2 computed radiation patterns shows excellent agreement. The agreement between the measured and calculated axial-ratios is reasonable. The prototype GA-helix provides a peak gain of more than 13 dB and an upper-to-lower frequency ratio of 1.89. The 3-dB bandwidth of the antenna is 1.27 GHz (1.435 GHz - 2.705 GHz). Over this bandwidth the computed gain varies less than 3 dB and the axial-ratio remains below 3 dB. / Master of Science
366

Query AutoAwesome

Suryavanshi, Chetna 01 August 2019
This research investigates how to improve legacy queries. Legacy queries are queries that programmers have coded and are used in applications. A database application typically has tens to hundreds of such queries. One way to improve legacy queries is to add new, interesting queries that are similar to or based on the set of queries. We propose Query AutoAwesome, a tool to generate new queries from legacy queries. The Query AutoAwesome philosophy is taken from Google’s AutoAwesomizer tool for photos, which automatically improves a photo uploaded to Google by animating the photo or adding special effects. In a similar vein, Query AutoAwesome automatically enhances a query by ingesting a database and the query. Query AutoAwesome produces a set of enhanced queries that a user can then choose to use or discard. A key problem that we solve is that the space of potential enhancements is large, so we introduce objective functions to narrow the search space to a tractable space. We describe our plans for implementing Query AutoAwesome and discuss our ideas for future work.
367

A genetic algorithm approach to best scenarios selection for performance evaluation of vehicle active safety systems

Gholamjafari, Ali January 2015
Indiana University-Purdue University Indianapolis (IUPUI) / Gholamjafari, Ali MSECE, Purdue University, May 2015. A Genetic Algorithm Approach to Best Scenarios Selection for Performance Evaluation of Vehicle Active Safety Systems. Major Professor: Dr. Lingxi Li. One of the most crucial tasks for Intelligent Transportation Systems is to enhance driving safety. During the past several years, active safety systems have been broadly studied and have played a significant role in vehicular safety. The Pedestrian Pre-Collision System (PCS) is a type of active safety system aimed at pedestrian safety. Such a system uses a camera, radar, or a combination of both to detect the position of pedestrians relative to the vehicle. Based on the speed and direction of the car, the position of the pedestrian, and other useful information, the system can anticipate collision or near-collision events and take proper action to reduce the damage from potential accidents. The action could be triggering the braking system to stop the car automatically, or simply sending a warning signal to the driver, depending on the type of event. To evaluate the performance of PCS systems, we need to design proper testing scenarios, perform vehicle testing, and collect and analyze data. It is impossible, however, to test all possible accident scenarios because of the high cost of the experiments and the time limits. Therefore, a subset of the complete testing scenarios (one that is critical with respect to different types of cost, such as fatalities, social costs, and the number of crashes) needs to be considered instead. Note that selecting a subset of testing scenarios is equivalent to an optimization problem: maximizing a cost function while satisfying a set of constraints. In this thesis, we develop an approach based on a genetic algorithm to solve such optimization problems. We then use crash and field databases to validate the accuracy of our algorithm. We show that our method is effective and robust, and runs much faster than exhaustive search algorithms. We also present some crucial testing scenarios produced by our approach, which can be used in PCS field testing.
368

State Estimation Using a Parametric Approximation of the Viterbi Algorithm

Jakob, Åslund January 2021
In this work, a new method of approximating the maximum-likelihood estimate is presented. The method consists of first using the Viterbi algorithm to estimate the log likelihood of the state, and then approximating that log likelihood to keep the computational complexity down. Various methods for approximating the log likelihood are introduced, most of them using linear regression and feature vectors. The methods were compared to a Kalman filter or extended Kalman filter (depending on whether the system was linear or nonlinear), as well as a particle filter modified to return a maximum-likelihood estimate. Two systems were used for testing: a very simple linear system and a complex nonlinear system, both one-dimensional. When applied to the simple system, the presented method outperformed both the Kalman filter and the particle filter. While many approximation methods gave good results, the best used a cubic spline. For the more complex system, the method presented here could not outperform the particle filter; the most promising approximation method for this system was a Chebyshev approximation.
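As background for the abstract above, the classical Viterbi recursion on a discrete hidden Markov model can be sketched as follows — this is the standard textbook algorithm, not the thesis's parametric (regression/spline-based) approximation of it; the model parameters below are illustrative:

```python
import numpy as np

def viterbi(log_trans, log_emit, log_init, obs):
    """Most likely state sequence of a discrete HMM, in the log domain.

    log_trans[i, j]: log P(next state j | current state i)
    log_emit[i, k]:  log P(observe symbol k | state i)
    log_init[i]:     log P(initial state i)
    """
    n_states = log_trans.shape[0]
    T = len(obs)
    delta = np.empty((T, n_states))      # best log likelihood ending in each state
    back = np.zeros((T, n_states), int)  # backpointers to the previous best state
    delta[0] = log_init + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans  # (from-state, to-state)
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    # Backtrack from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]
```

The thesis's idea, as described, is to replace the tabulated `delta` (the log likelihood over states) with a cheap parametric fit such as a cubic spline, so the per-step cost no longer grows with the size of a state grid.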
369

Klasifikace paketů s využitím technologie FPGA / Packet Classification Using FPGA Technology

Puš, Viktor January 2008
This diploma thesis deals with packet classification in computer networks. The problem of packet classification is described, together with the requirements for a classification algorithm. Then, the necessary theoretical background is introduced. Contemporary approaches to classification are described, together with a critique of the current state of the field. The main focus of the work is a new packet classification algorithm based on problem decomposition. A unique property of the algorithm is its constant time complexity in terms of external memory accesses. An implementation of the algorithm using an FPGA and one external memory is proposed. The planned prototype may achieve a throughput of 64 Gbit/s in the worst case.
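The decomposition idea behind such classifiers — look up each header field independently, get a bit vector of candidate rules per field, then intersect the vectors and take the highest-priority survivor — can be illustrated with a toy two-field software sketch. The rule set and field choice are hypothetical, and the thesis's FPGA pipeline and constant-memory-access design are not reproduced here:

```python
# Toy rules: (priority index, src, dst); None is a wildcard. Lower index
# means higher priority, so rule 0 wins ties.
RULES = [
    (0, "10.0.0.1", "10.0.0.2"),   # most specific, highest priority
    (1, "10.0.0.1", None),
    (2, None, None),               # default rule
]

def build_field_table(rules, field_idx):
    """Map each concrete field value to a bitmask of matching rules."""
    wildcard_mask = 0
    table = {}
    for i, rule in enumerate(rules):
        value = rule[field_idx]
        if value is None:
            wildcard_mask |= 1 << i
        else:
            table[value] = table.get(value, 0) | (1 << i)
    # Every concrete value also matches all wildcard rules for this field.
    return {v: m | wildcard_mask for v, m in table.items()}, wildcard_mask

def classify(rules, src, dst):
    """Intersect per-field bit vectors; return the best matching rule index."""
    src_table, src_wild = build_field_table(rules, 1)
    dst_table, dst_wild = build_field_table(rules, 2)
    mask = src_table.get(src, src_wild) & dst_table.get(dst, dst_wild)
    if mask == 0:
        return None
    return (mask & -mask).bit_length() - 1  # lowest set bit = best priority
```

In hardware, each per-field lookup is an independent memory access that can be pipelined, which is what makes the external-memory access count per packet constant regardless of rule-set structure. (A real implementation would precompute the tables once rather than rebuild them per packet.)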
370

Animace algoritmů v prostředí Silverlight / Algorithm Animation in Silverlight

Gargulák, David January 2009
The goal of this work was to create a program for the animation of algorithms in Silverlight. The .NET platform and the C# programming language were used to develop this Silverlight module. This work also contains basic information about Silverlight and the similar Flash technology.
