21

Algorithms for Library-Based Microbial Source Tracking

Montana, Aldrin 01 June 2013 (has links) (PDF)
Pyroprinting is a novel, library-based microbial source tracking method developed by the Biology department at Cal Poly, San Luis Obispo. This method consists of two parts: (1) a collection of bacterial fingerprints, called pyroprints, from known host species, and (2) a method for pyroprint comparison. Currently, the Cal Poly Library of Pyroprints (CPLOP), a web-based database application, provides storage and analysis of over 10,000 pyroprints. This number is quickly growing as students and researchers continue to use pyroprinting for research. Biologists conducting research using pyroprinting rely on methods for partitioning collected bacterial isolates into bacterial strains. Clustering algorithms are commonly used for bacterial strain analysis of organisms in computational biology. Unfortunately, agglomerative hierarchical clustering, a commonly used clustering algorithm, is inadequate given the nature of data collection for pyroprinting. While the clusters produced by agglomerative hierarchical clustering are acceptable, pyroprinting requires a method of analysis that is scalable and incorporates useful metadata into the clustering process. We propose ontology-based hierarchical clustering (OHClust!), a modification of agglomerative hierarchical clustering that expresses metadata-based relationships as an ontology to direct the order in which hierarchical clustering algorithms analyze the data. In this thesis, the strengths and weaknesses of OHClust! are discussed, and its performance is analyzed in comparison to agglomerative hierarchical clustering.
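To make the idea concrete, here is a minimal, hypothetical sketch of metadata-directed hierarchical clustering in the spirit of OHClust! (it is not the thesis's implementation; the grouping keys, distance metric, and threshold are illustrative assumptions): isolates are clustered within each metadata group first, and the resulting clusters are merged across groups in a second pass.

```python
# Hypothetical sketch of metadata-directed hierarchical clustering, inspired
# by (but not taken from) OHClust!: cluster pyroprints within each metadata
# group first, then merge the group-level clusters in a second pass.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist


def agglomerative(points, threshold):
    """Plain average-linkage clustering; returns a list of index arrays."""
    if len(points) < 2:
        return [np.arange(len(points))]
    labels = fcluster(linkage(pdist(points), method="average"),
                      t=threshold, criterion="distance")
    return [np.where(labels == k)[0] for k in np.unique(labels)]


def metadata_directed_clustering(points, groups, threshold):
    """groups maps a metadata value (e.g. host species) to row indices."""
    local, centroids = [], []
    for indices in groups.values():
        indices = np.asarray(list(indices))
        for members in agglomerative(points[indices], threshold):
            local.append(indices[members])
            centroids.append(points[indices[members]].mean(axis=0))
    # Second pass: merge clusters across metadata groups via their centroids.
    merged = agglomerative(np.vstack(centroids), threshold)
    return [np.concatenate([local[i] for i in group]) for group in merged]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pyroprints = rng.normal(size=(30, 8))        # stand-in for real pyroprints
    groups = {"cow": range(0, 15), "human": range(15, 30)}
    for cluster in metadata_directed_clustering(pyroprints, groups, threshold=3.0):
        print(sorted(cluster.tolist()))
```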
22

Real-Time TDDFT-Based Filtered Spectroscopy

Williams, Ivan 18 December 2020 (has links)
We demonstrate the ability to simulate targeted excitation of atomistic systems using our real-time TDDFT-based simulation framework NESSIE. Traditional approaches for extracting spectra through real-time TDDFT involve exciting all frequencies via an impulse, which requires long simulation times and yields poor resolution. By exciting the system with a modulated sinc pulse between the frequencies of interest, we are able to obtain a spectral response with far more precision in a significantly shorter time frame than competing implementations.
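As a standalone numerical illustration of the excitation scheme described above (not code from NESSIE; the band edges and sampling step are made-up values): a sinc envelope of bandwidth f_hi - f_lo, modulated by a carrier at the band centre, concentrates its spectrum between f_lo and f_hi.

```python
# Illustrative sketch of band-limited excitation (not code from NESSIE):
# a sinc envelope of bandwidth (f_hi - f_lo) modulated by a carrier at the
# band centre has a spectrum concentrated between f_lo and f_hi.
import numpy as np

def modulated_sinc_pulse(t, f_lo, f_hi, t0):
    """Time-domain pulse whose spectrum is (ideally) flat on [f_lo, f_hi]."""
    bandwidth = f_hi - f_lo
    carrier = 0.5 * (f_lo + f_hi)
    envelope = np.sinc(bandwidth * (t - t0))   # np.sinc(x) = sin(pi x)/(pi x)
    return envelope * np.cos(2 * np.pi * carrier * (t - t0))

if __name__ == "__main__":
    dt = 1e-3                                   # arbitrary time step
    t = np.arange(0.0, 20.0, dt)
    pulse = modulated_sinc_pulse(t, f_lo=2.0, f_hi=5.0, t0=10.0)
    spectrum = np.abs(np.fft.rfft(pulse))
    freqs = np.fft.rfftfreq(len(t), d=dt)
    in_band = spectrum[(freqs >= 2.0) & (freqs <= 5.0)].sum()
    print(f"fraction of spectral weight inside the band: {in_band / spectrum.sum():.3f}")
```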
23

IoTA: Internet of Things Assistant

Okumura, Brandon M 01 July 2017 (has links)
The Internet of Things is the networking of electronic devices, or “Things”, that enables them to collect and share data, as well as interact with their physical surroundings. Analyzing this collected data allows us to make smarter economic decisions. These interconnected networks are usually driven by low-powered micro-controllers or cheap CPUs that are designed to function optimally with very little hardware. As scale and computational requirements increase, these micro-controllers are unable to grow without being physically replaced. This thesis proposes a system, IoTA, that assists the Internet of Things by providing a shared computational resource for endpoint devices. This solution extends the functionality of endpoint devices without the need for physical replacement. The IoTA system is designed to be easily integrable into any existing IoT network. This system presents a model that allows for seamless processing of jobs submitted by endpoint devices while keeping scalability and flexibility in mind. Additionally, IoTA is built on top of existing IoT protocols. Evaluation shows there is a significant performance benefit in processing computationally heavy algorithms on the IoTA system as compared to processing them locally on the endpoint devices themselves.
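A purely hypothetical sketch of the offloading pattern the abstract describes follows: an endpoint device ships a job to a shared compute node instead of running it locally. The endpoint URL, job format, and the choice of HTTP as transport are assumptions for illustration, not details of the IoTA implementation.

```python
# Hypothetical offloading pattern: an endpoint posts a job description to a
# shared compute node and reads back the result. URL, payload shape, and the
# HTTP transport are assumptions, not IoTA's actual protocol.
import json
import urllib.request

def offload_job(compute_node_url, job_name, payload):
    """Send a job to the shared compute node and return its decoded result."""
    body = json.dumps({"job": job_name, "payload": payload}).encode("utf-8")
    request = urllib.request.Request(
        compute_node_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        return json.loads(response.read().decode("utf-8"))

# Example: a sensor node asks the compute node to run an FFT that would be
# too heavy for its own micro-controller (endpoint URL is a placeholder).
# result = offload_job("http://iota-node.local/jobs", "fft", {"samples": [...]})
```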
24

Artificial Neural Network-Based Robotic Control

Ng, Justin 01 June 2018 (has links)
Artificial neural networks (ANNs) are highly capable alternatives to traditional problem-solving schemes due to their ability to solve nonlinear systems with a nonalgorithmic approach. The applications of ANNs range from process control to pattern recognition and, with increasing importance, robotics. This paper demonstrates continuous control of a robot using the deep deterministic policy gradients (DDPG) algorithm, an actor-critic reinforcement learning strategy originally conceived by Google DeepMind. After training, the robot performs controlled locomotion within an enclosed area. The paper also details the robot design process and explores the challenges of implementation in a real-time system.
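For reference, the core DDPG updates in their standard published form (the general algorithm, not this thesis's specific networks or hyperparameters) are a bootstrapped critic target, a mean-squared critic loss, the deterministic policy gradient for the actor, and soft target-network updates:

```latex
% Standard DDPG updates: Q is the critic, \mu the actor, and primed symbols
% denote the slowly-updated target networks.
\begin{align*}
  y_i &= r_i + \gamma\, Q'\!\big(s_{i+1}, \mu'(s_{i+1}\mid\theta^{\mu'})\mid\theta^{Q'}\big) \\
  L(\theta^{Q}) &= \tfrac{1}{N}\sum_i \big(y_i - Q(s_i, a_i\mid\theta^{Q})\big)^2 \\
  \nabla_{\theta^{\mu}} J &\approx \tfrac{1}{N}\sum_i
      \nabla_a Q(s,a\mid\theta^{Q})\big|_{s=s_i,\,a=\mu(s_i)}\;
      \nabla_{\theta^{\mu}} \mu(s\mid\theta^{\mu})\big|_{s=s_i} \\
  \theta^{Q'} &\leftarrow \tau\,\theta^{Q} + (1-\tau)\,\theta^{Q'}, \qquad
  \theta^{\mu'} \leftarrow \tau\,\theta^{\mu} + (1-\tau)\,\theta^{\mu'}
\end{align*}
```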
25

Verification of acoustic dissipation in two-phase dilute dispersed flow models in computational fluid dynamics

Reeder, Brennan 10 December 2021 (has links)
With existing numerical models for fluid-particle systems in CHEM, the acoustic-particle interactions associated with two-phase dilute dispersed flow can be captured, and the particle model can be validated against experimental and analytical data and verified using numerical techniques. The experimental and analytical data come from Zink and Delsasso and cover particles with diameters of 5 to 15 microns at frequencies between 500 Hz and 13,600 Hz. The particle number density measurements by Zink and Delsasso carry an estimated error range of 10%. Using the fourth-order skew-symmetric flux in CHEM and the built-in Eulerian and Lagrangian particle models, the sound wave dissipation was captured and found to be within the margin of error. Two additional tests were conducted to measure the effect of nonlinear acoustics and increased bulk density on the dissipation. Nonlinear acoustics had no significant effect, and a linear increase in bulk density produced a linear increase in dissipation.
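The acceptance check implied above can be written down very simply; the sketch below uses placeholder numbers rather than the actual Zink and Delsasso data, and only illustrates the "within the 10% measurement error" comparison.

```python
# Trivial sketch of the validation check: simulated attenuation is accepted
# when it falls inside the 10% estimated error range of the measurements.
# The numbers below are placeholders, not the Zink and Delsasso data.
import numpy as np

def within_error_band(simulated, measured, relative_error=0.10):
    """True where |simulated - measured| falls inside the measurement error."""
    simulated, measured = np.asarray(simulated), np.asarray(measured)
    return np.abs(simulated - measured) <= relative_error * np.abs(measured)

measured  = np.array([0.021, 0.054, 0.110])   # attenuation, illustrative only
simulated = np.array([0.023, 0.050, 0.118])
print(within_error_band(simulated, measured))  # [ True  True  True]
```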
26

Leveraging Intermediate Artifacts to Improve Automated Trace Link Retrieval

Rodriguez, Alberto D 01 June 2021 (has links) (PDF)
Software traceability establishes a network of connections between diverse artifacts such as requirements, design, and code. However, given the cost and effort of creating and maintaining trace links manually, researchers have proposed automated approaches using information retrieval techniques. Current approaches focus almost entirely upon generating links between pairs of artifacts and have not leveraged the broader network of interconnected artifacts. In this paper we investigate the use of intermediate artifacts to enhance the accuracy of the generated trace links – focusing on paths consisting of source, target, and intermediate artifacts. We propose and evaluate combinations of techniques for computing semantic similarity, scaling scores across multiple paths, and aggregating results from multiple paths. We report results from five projects, including one large industrial project. We find that leveraging intermediate artifacts improves the accuracy of end-to-end trace retrieval across all datasets and accuracy metrics. After further analysis, we discover that leveraging intermediate artifacts is only helpful when a project’s artifacts share a common vocabulary, which tends to occur in refinement and decomposition hierarchies of artifacts. Given our hybrid approach that integrates both direct and transitive links, we observed little to no loss of accuracy when intermediate artifacts lacked a shared vocabulary with source or target artifacts.
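A rough sketch of the transitive-tracing idea follows (not the paper's exact method): each source-target pair is scored both directly and through intermediate artifacts, and the stronger signal is kept. TF-IDF with cosine similarity stands in for the semantic-similarity techniques the thesis evaluates, and the min/max path scaling and aggregation choices are illustrative assumptions.

```python
# Sketch of trace retrieval through intermediate artifacts: score each
# source-target pair directly and via source -> intermediate -> target paths,
# then keep the stronger (hybrid) signal. TF-IDF + cosine similarity are
# stand-ins for the similarity techniques evaluated in the thesis.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def trace_scores(sources, intermediates, targets):
    vec = TfidfVectorizer().fit(sources + intermediates + targets)
    S, I, T = (vec.transform(docs) for docs in (sources, intermediates, targets))
    direct = cosine_similarity(S, T)        # source -> target
    s_to_i = cosine_similarity(S, I)        # source -> intermediate
    i_to_t = cosine_similarity(I, T)        # intermediate -> target
    # Path score: the weaker hop bounds the path; aggregate over intermediates
    # by taking the best path for each source-target pair.
    paths = np.minimum(s_to_i[:, :, None], i_to_t[None, :, :])
    transitive = paths.max(axis=1)
    # Hybrid: combine direct and transitive evidence.
    return np.maximum(direct, transitive)

if __name__ == "__main__":
    reqs = ["the pump shall stop on low pressure"]
    designs = ["low pressure shutdown logic for the pump controller"]
    code = ["def stop_pump_on_low_pressure(sensor): ..."]
    print(trace_scores(reqs, designs, code))
```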
27

Optimizing the Distributed Hydrology Soil Vegetation Model for Uncertainty Assessment with Serial, Multicore and Distributed Accelerations

Adriance, Andrew 01 May 2018 (has links) (PDF)
Hydrology is the study of water. Hydrology tracks various attributes of water such as its quality and movement. As a tool, hydrology allows researchers to investigate topics such as the impacts of wildfires, logging, and commercial development. With perfect and complete data collection, researchers could answer these questions with complete certainty. However, due to cost and potential sources of error, this is impractical. As such, researchers rely on simulations. The Distributed Hydrology Soil Vegetation Model (also referred to as DHSVM) is a scientific mathematical model that numerically represents watersheds. Hydrology, like all fields, continues to produce large amounts of data. As the stores of data increase, the scientific models that process them require occasional improvements to better handle the masses of information. This paper investigates DHSVM as a serial C program. The paper implements and analyzes various high performance computing advancements to the original code base. Specifically, this paper investigates compiler optimization, implementing parallel computing with OpenMP, and adding distributed computing with OpenMPI. DHSVM was also tuned to run many instances on California Polytechnic State University, San Luis Obispo's high performance computer cluster. These additions to DHSVM help speed up the results returned to researchers and improve DHSVM's ability to be used with uncertainty analysis methods. This paper was able to improve the performance of DHSVM by a factor of two with serial and compiler optimization. In addition to the serial and compiler optimizations, this paper found that OpenMP provided a noticeable speed-up that also scaled as the hardware improved. The parallel optimization doubled DHSVM's speed again on commodity hardware. Finally, it was found that OpenMPI was best used for running multiple instances of DHSVM. All combined, this paper was able to improve the performance of DHSVM by 4.4 times per instance and allow it to run multiple instances on computing clusters.
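The thesis parallelizes DHSVM itself in C with OpenMP and OpenMPI; as a language-neutral illustration of the "many independent instances for uncertainty analysis" pattern it settles on, the hypothetical sketch below fans parameter sets out over a pool of worker processes. The `simulate_watershed` function is a stand-in for launching one DHSVM run, not part of DHSVM.

```python
# Hypothetical illustration of running many independent model instances for an
# uncertainty-analysis ensemble (a Python analogue, not the thesis's C/OpenMP
# or OpenMPI code). simulate_watershed stands in for one DHSVM run.
from multiprocessing import Pool
import random

def simulate_watershed(parameter_set):
    """Placeholder for one model instance (e.g. one DHSVM configuration)."""
    random.seed(parameter_set["seed"])
    streamflow = sum(random.random() for _ in range(1000))   # fake model output
    return parameter_set["seed"], streamflow

if __name__ == "__main__":
    parameter_sets = [{"seed": i} for i in range(32)]        # 32-member ensemble
    with Pool(processes=8) as pool:                          # 8 concurrent instances
        for seed, flow in pool.imap_unordered(simulate_watershed, parameter_sets):
            print(f"run {seed:02d} finished, simulated flow index {flow:.1f}")
```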
28

Incorporating Histograms of Oriented Gradients Into Monte Carlo Localization

Norris, Michael K 01 June 2016 (has links) (PDF)
This work presents improvements to Monte Carlo Localization (MCL) for a mobile robot using computer vision. Solutions to the localization problem aim to provide fine resolution on location approximation and to be resistant to changes in the environment. One such environment change is the kidnapped/teleported robot problem, where a robot is suddenly transported to a new location and must re-localize. The standard method of "Augmented MCL" uses particle filtering combined with the addition of random particles under certain conditions to solve the kidnapped robot problem. This solution is robust, but not always fast. This work combines Histogram of Oriented Gradients (HOG) computer vision with particle filtering to speed up the localization process. The major slowdown in Augmented MCL is the conditional addition of random particles, which depends on the ratio of a short-term and long-term average of particle weights. This ratio does not change quickly when a robot is kidnapped, leading the robot to believe it is in the wrong location for a period of time. This work replaces this average-based conditional with a comparison of the HOG image directly in front of the robot with a cached version. This resulted in a speedup ranging from 25.3% to 80.7% (depending on the parameters used) in localization time over the baseline Augmented MCL.
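A simplified sketch of that kidnapped-robot check is shown below (the HOG parameters, the distance measure, and the threshold are illustrative placeholders, not the thesis's values): the HOG descriptor of the current forward-facing image is compared against the descriptor cached for the believed pose, and a large mismatch triggers the injection of random particles.

```python
# Simplified check: compare the HOG descriptor of the current camera view with
# a cached descriptor for the believed location; a large mismatch suggests the
# robot has been kidnapped. Parameter values and threshold are illustrative.
import numpy as np
from skimage.feature import hog

def hog_descriptor(gray_image):
    # Both images must be the same size so the descriptors are comparable.
    return hog(gray_image, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

def kidnapped(current_view, cached_view, threshold=0.5):
    """True when the current view no longer matches the cached expectation."""
    current, cached = hog_descriptor(current_view), hog_descriptor(cached_view)
    distance = np.linalg.norm(current - cached) / max(np.linalg.norm(cached), 1e-9)
    return distance > threshold

# if kidnapped(camera_frame, cache[believed_cell]):
#     inject_random_particles(particle_filter)   # hypothetical helper
```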
29

Sparse Sampling of Velocity MRI

Chinta, Venkateswarao Yogesh 10 1900 (has links)
Standard MRI is used to image objects at rest. In addition to standard MRI images, which measure tissues at rest, Phase Contrast MRI can be used to quantify the motion of blood and tissue in the human body. The current method used in Phase Contrast MRI is time consuming. The development of new trajectories has reduced imaging time, but creates sub-sampling errors. The proposed method uses regularization of velocities and proton densities to eliminate errors arising from k-space under-sampling. / Master of Applied Science (MASc)
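One generic form such a regularized reconstruction can take is a penalized least-squares fit to the acquired k-space samples; the formulation below is illustrative only (the encoding operator E, the regularizers R, and the weights lambda are assumptions, since the abstract does not specify them).

```latex
% Generic regularized reconstruction of under-sampled phase-contrast data:
% rho is proton density, v the velocity field, F_u the under-sampled Fourier
% operator, y the acquired k-space samples, and R(.) the regularizers.
\[
  (\hat{\rho}, \hat{v}) \;=\; \arg\min_{\rho,\,v}\;
    \big\| F_u\, E(\rho, v) - y \big\|_2^2
    \;+\; \lambda_{\rho}\, R_{\rho}(\rho)
    \;+\; \lambda_{v}\, R_{v}(v)
\]
% E(rho, v) maps proton density and velocity to the complex, velocity-encoded
% image whose phase carries the motion information.
```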
30

Condition Monitoring for Rotational Machinery

Volante, Daniel C. 10 1900 (has links)
Vibrating screens are industrial machines used to sort aggregates through their high rotational accelerations. Utilized in mining operations, they are able to screen dozens of tonnes of material per hour. To enhance maintenance and troubleshooting, this thesis introduces a vibration-based condition monitoring system capable of observing machine operation. Using acceleration data collected from remote parts of the machine, software continuously checks for abnormal operation triggered by fault conditions. Users are notified in the event of a fault and provided with relevant information.

Acceleration data is acquired from a set of sensor devices that are mounted to specified points on the vibrating screen. Data is then wirelessly transmitted to a centralized unit for digital signal processing. Existing sensor devices developed for a previous project have been upgraded and integrated into the monitoring system. Alternative communication technologies and the utilized Wi-Fi network are examined and discussed.

The condition monitoring system's hardware and software were designed following engineering principles. Development produced a functional prototype system implementing the monitoring process. The monitoring technique uses signal filtering and processing to compute a set of variables that reveal the status of the machine. Decision-making strategies are then employed to determine when a fault has occurred.

Testing performed on the developed monitoring system has also been documented. The performance of the prototype system is examined as different fault scenarios are induced and monitored. Results and descriptions of virtual simulations and live industrial experiments are presented. The relationships between machine faults and detected fault signatures are also discussed. / Master of Applied Science (MASc)
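A bare-bones sketch of such a monitoring step is shown below (the filter band, window length, and tolerance are illustrative placeholders, not the thesis's values): band-pass the acceleration signal around the machine's operating frequencies, compute an RMS level per window, and flag a fault when the level drifts away from the baseline learned during normal operation.

```python
# Bare-bones vibration monitoring sketch: band-pass the acceleration signal,
# compute a per-window RMS level, and flag a fault when the level deviates
# from a healthy baseline. Band, window, and tolerance are placeholders.
import numpy as np
from scipy.signal import butter, filtfilt

def rms_level(accel, fs, band=(10.0, 60.0)):
    """RMS of the band-passed acceleration signal (accel sampled at fs Hz)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    filtered = filtfilt(b, a, accel)
    return np.sqrt(np.mean(filtered ** 2))

def is_fault(accel_window, fs, baseline_rms, tolerance=0.25):
    """Fault if the vibration level deviates more than 25% from the baseline."""
    level = rms_level(accel_window, fs)
    return abs(level - baseline_rms) > tolerance * baseline_rms

# Example: learn the baseline from healthy data, then check new windows.
# baseline = rms_level(healthy_accel, fs=1000.0)
# alarm = is_fault(new_window, fs=1000.0, baseline_rms=baseline)
```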
