191

A High Performance Parallel Sparse Linear Equation Solver Using CUDA

Martin, Andrew John 14 July 2011 (has links)
No description available.
192

Pipelined IEEE-754 Double Precision Floating Point Arithmetic Operators on Virtex FPGA’s

Pathanjali, Nandini 22 May 2002 (has links)
No description available.
193

Power Corrections in e⁺e⁻ → M₁M₂ and in Charmless Two Body B meson Decays

Duraisamy, Murugeswaran January 2009 (has links)
No description available.
194

Development of A Fast Converging Hybrid Method for Analyzing Three-Dimensional Doubly Periodic Structures

Wang, Feng January 2013 (has links)
No description available.
195

Evaluation of PM2.5 Components and Source Apportionment at a Rural Site in the Ohio River Valley Region

Deshpande, Seemantini R. 27 September 2007 (has links)
No description available.
196

Latent Factor Models for Recommender Systems and Market Segmentation Through Clustering

Zeng, Jingying 29 August 2017 (has links)
No description available.
197

Asymptotic and Factorization Analysis for Inverse Shape Problems in Tomography and Scattering Theory

Govanni Granados (18283216) 01 April 2024 (has links)
<p dir="ltr">Developing non-invasive and non-destructive testing in complex media continues to be a rich field of study (see e.g.[22, 28, 36, 76, 89] ). These types of tests have applications in medical imaging, geophysical exploration, and engineering where one would like to detect an interior region or estimate a model parameter. With the current rapid development of this enabling technology, there is a growing demand for new mathematical theory and computational algorithms for inverse problems in partial differential equations. Here the physical models are given by a boundary value problem stemming from Electrical Impedance Tomography (EIT), Diffuse Optical Tomography (DOT), as well as acoustic scattering problems. Important mathematical questions arise regarding existence, uniqueness, and continuity with respect to measured surface data. Rather than determining the solution of a given boundary value problem, we are concerned with using surface data in order to develop and implement numerical algorithms to recover unknown subregions within a known domain. A unifying theme of this thesis is to develop Qualitative Methods to solve inverse shape problems using measured surface data. These methods require very few a priori assumptions on the regions of interest, boundary conditions, and model parameter estimation. The counterpart to qualitative methods, iterative methods, typically require a priori information that may not be readily available and can be more computationally expensive. Qualitative Methods usually require more data.</p><p dir="ltr">This thesis expands the library of Qualitative Methods for elliptic problems coming from tomography and scattering theory. We consider inverse shape problems where our goal is to recover extended and small volume regions. For extended regions, we consider applying a modified version of the well-known Factorization Method [73]. Whereas for the small volume regions, we develop a Multiple Signal Classification (MUSIC)-type algorithm (see for e.g. [3, 5]). In all of our problems, we derive an imaging functional that will effectively recover the region of interest. The results of this thesis form part of the theoretical forefront of physical applications. Furthermore, it extends the mathematical theory at the intersection of mathematics, physics and engineering. Lastly, it also advances knowledge and understanding of imaging techniques for non-invasive and non-destructive testing.</p>
198

High-Dimensional Generative Models for 3D Perception

Chen, Cong 21 June 2021 (has links)
Modern robotics and automation systems require high-level reasoning capability in representing, identifying, and interpreting the three-dimensional data of the real world. Understanding the world's geometric structure from visual data is known as 3D perception. The necessity of analyzing irregular and complex 3D data has led to the development of high-dimensional frameworks for data learning. Here, we design several sparse learning-based approaches for high-dimensional data that effectively tackle multiple perception problems, including data filtering, data recovery, and data retrieval. The frameworks offer generative solutions for analyzing complex and irregular data structures without prior knowledge of the data. The first part of the dissertation proposes a novel method that simultaneously filters point cloud noise and outliers and completes missing data by utilizing a unified framework consisting of a novel tensor data representation, an adaptive feature encoder, and a generative Bayesian network. In the next section, a novel multi-level generative chaotic Recurrent Neural Network (RNN) is proposed that uses a sparse tensor structure for image restoration. In the last part of the dissertation, we address detection followed by localization, extracting features from sparse tensors for data retrieval.

/ Doctor of Philosophy / The development of automation systems and robotics has brought the modern world unrivaled affluence and convenience. However, current automated tasks are mainly simple repetitive motions; tasks that require more advanced visual cognition remain an unsolved problem for automation. Many high-level cognition-based tasks require accurate visual perception of the environment and of dynamic objects from the data received by the optical sensor. The capability to represent, identify, and interpret complex visual data in order to understand the geometric structure of the world is 3D perception. To better tackle the existing 3D perception challenges, this dissertation proposes a set of generative learning-based frameworks on sparse tensor data for various high-dimensional robotics perception applications: underwater point cloud filtering, image restoration, deformation detection, and localization. Underwater point cloud data are relevant for many applications, such as environmental monitoring and geological exploration. The data collected with sonar sensors are, however, subject to several types of defects, including holes, noisy measurements, and outliers. In the first chapter, we propose a generative model for point cloud data recovery using Variational Bayesian (VB) sparse tensor factorization methods to tackle these three defects simultaneously. In the second part of the dissertation, we propose an image restoration technique to tackle missing data, which is essential for many perception applications; an efficient generative chaotic RNN framework is introduced for recovering the sparse tensor from a single corrupted image for various types of missing data. In the last chapter, a multi-level CNN for high-dimensional tensor feature extraction is proposed for underwater vehicle localization.
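To give a flavor of the point-cloud filtering problem described above, here is a deliberately simple baseline: a generic k-nearest-neighbor statistical outlier filter applied to a synthetic cloud. This is a stand-in for illustration only, far simpler than the dissertation's Variational Bayesian sparse tensor factorization, and every parameter and the toy data are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_statistical_outliers(points, k=16, std_ratio=2.0):
    """Generic k-NN statistical outlier removal for an (N, 3) point cloud.

    A point is kept if its mean distance to its k nearest neighbors falls
    within `std_ratio` standard deviations of the cloud-wide mean. This is a
    simple baseline, not the dissertation's Bayesian tensor method, but it
    illustrates one of the defects (outliers) that the dissertation targets.
    """
    tree = cKDTree(points)
    # k + 1 because the nearest "neighbor" of each point is the point itself.
    dists, _ = tree.query(points, k=k + 1)
    mean_knn = dists[:, 1:].mean(axis=1)
    mu, sigma = mean_knn.mean(), mean_knn.std()
    keep = mean_knn < mu + std_ratio * sigma
    return points[keep], keep

# Toy usage: a thin noisy plane plus a handful of gross outliers (synthetic data).
rng = np.random.default_rng(0)
plane = np.c_[rng.uniform(-1, 1, (500, 2)), 0.01 * rng.normal(size=500)]
outliers = rng.uniform(-1, 1, (20, 3)) + np.array([0.0, 0.0, 3.0])
cloud = np.vstack([plane, outliers])
clean, mask = remove_statistical_outliers(cloud)
print(f"kept {mask.sum()} of {len(cloud)} points")
```

A learned or Bayesian approach such as the one the dissertation develops would additionally recover the missing data and holes that this baseline simply discards.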
199

Air Quality in Mexico City: Spatial and Temporal Variations of Particulate Polycyclic Aromatic Hydrocarbons and Source Apportionment of Gasoline-Versus-Diesel Vehicle Emissions

Thornhill, Dwight Anthony Corey 21 August 2007 (has links)
The Mexico City Metropolitan Area (MCMA) is one of the largest cities in the world, and as with many megacities worldwide, it experiences serious air quality and pollution problems, especially with ozone and particulate matter. Ozone levels exceed the health-based standard, which is equivalent to the U.S. standard, on approximately 80% of all days, and concentrations of particulate matter 10 μm and smaller (PM10) exceed the standard on more than 40% of all days in most years. Particulate polycyclic aromatic hydrocarbons (PAHs) are a class of semi-volatile compounds that are formed during combustion, and many of these compounds are known or suspected carcinogens. Recent studies on PAHs in Mexico City indicate that very high concentrations have been observed there and may pose a serious health hazard. The first part of this thesis describes results from the Megacities Initiative: Local and Regional Observations (MILAGRO) study in Mexico City in March 2006. During this field campaign, we measured PAH and aerosol active surface area (AS) concentrations at six different locations throughout the city using the Aerodyne Mobile Laboratory (AML). The different sites encompassed a mix of residential, commercial, industrial, and undeveloped land use. The goals of this research were to describe spatial and temporal patterns in PAH and AS concentrations, to gain insight into sources of PAHs, and to quantify the relationships between PAHs and other pollutants. We observed that the highest measurements were generally found at sites with dense traffic networks. Also, PAH concentrations varied considerably in space. An important implication of this result is that for risk assessment studies, a single monitoring site will not adequately represent an individual's exposure. Source identification and apportionment are essential for developing effective control strategies to improve air quality and therefore reduce the health impacts associated with fine particulate matter and PAHs. However, very few studies have separated gasoline- versus diesel-powered vehicle emissions under a variety of on-road driving conditions. The second part of this thesis focuses on distinguishing between the two types of engine emissions within the MCMA using positive matrix factorization (PMF) receptor modeling. The Aerodyne Mobile Laboratory drove throughout the MCMA in March 2006 and measured on-road concentrations of a large suite of gaseous and particulate pollutants, including carbon dioxide, carbon monoxide (CO), nitric oxide (NO), benzene (C6H6), formaldehyde (HCHO), ammonia (NH3), fine particulate matter (PM2.5), PAHs, and black carbon (BC). These pollutant species served as the input data for the receptor model. Fuel-based emission factors and annual emissions within Mexico City were then calculated from the source profiles of the PMF model and fuel sales data. We found that gasoline-powered vehicles were responsible for 90% of mobile source CO emissions and 85% of volatile organic compound (VOC) emissions, while diesel-powered vehicles accounted for almost all NO emissions (99.98%). Furthermore, the annual emissions estimates for CO and VOCs were lower than those estimated during the MCMA-2003 field campaign. The number of megacities is expected to grow dramatically in the coming decades. As one of the world's largest megacities, Mexico City serves as a model for studying air quality problems in highly populated, extremely polluted environments.
The results of this work can be used by policy makers to improve air quality and reduce related health risks in Mexico City and other megacities. / Master of Science
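The positive matrix factorization step can be sketched with a toy example. PMF proper weights every observation by its measurement uncertainty; the snippet below substitutes plain non-negative matrix factorization (scikit-learn's NMF) on synthetic two-source data as an unweighted, simplified stand-in. The species list, "true" source profiles, and mixing data are invented for illustration and are not MILAGRO measurements.

```python
import numpy as np
from sklearn.decomposition import NMF

# Simplified source-apportionment sketch: unweighted NMF as a stand-in for PMF.
# Species list and "true" source profiles are invented for illustration only.
species = ["CO", "NO", "benzene", "HCHO", "NH3", "PM2.5", "PAH", "BC"]
gasoline_profile = np.array([8.0, 0.3, 1.5, 0.8, 0.9, 0.6, 0.4, 0.2])
diesel_profile = np.array([1.0, 5.0, 0.2, 1.2, 0.1, 2.5, 1.8, 3.0])

rng = np.random.default_rng(1)
n_samples = 300
contrib = rng.exponential(scale=1.0, size=(n_samples, 2))      # per-sample source strengths
X = contrib @ np.vstack([gasoline_profile, diesel_profile])    # mixed on-road observations
X += rng.normal(scale=0.05, size=X.shape)                      # measurement noise
X = np.clip(X, 0.0, None)                                      # keep the matrix non-negative

# Factorize X ~ W @ H: rows of H are source profiles, columns of W are contributions.
model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
W = model.fit_transform(X)
H = model.components_

for idx, profile in enumerate(H):
    # Normalize each recovered profile to its largest species for readability.
    scaled = profile / profile.max()
    print(f"factor {idx}:", dict(zip(species, np.round(scaled, 2))))
```

With real on-road data, the recovered source profiles would then be combined with fuel sales data to produce the fuel-based emission factors and annual emissions described in the abstract.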
200

Accuracy and Interpretability Testing of Text Mining Methods

Ashton, Triss A. 08 1900 (has links)
Extracting meaningful information from large collections of text data is problematic because of the sheer size of the database. However, automated analytic methods capable of processing such data have emerged. These methods, collectively called text mining, first began to appear in 1988. A number of additional text mining methods quickly developed in independent research silos, each based on unique mathematical algorithms. How well each of these methods analyzes text is unclear. Method development typically evolves from some silo-centric requirement, with the success of the method measured by a custom requirement-based metric. Results of the new method are then compared to another method that was similarly developed. The proposed research introduces an experimentally designed testing method for text mining that eliminates research silo bias and simultaneously evaluates methods from all of the major context-region text mining method families. The proposed research method follows a randomized block factorial design with two treatments consisting of three and five levels (RBF-35) with repeated measures. The contribution of the research is threefold. First, the users perceived a difference in the effectiveness of the various methods. Second, while still not fully characterized, there are characteristics within the text collection that affect an algorithm's ability to extract meaningful results. Third, this research develops an experimental design process for testing the algorithms that is adaptable to other areas of software development and algorithm testing. This design eliminates the bias-based practices historically employed by algorithm developers.
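As a rough illustration of analyzing a randomized block factorial design with 3- and 5-level treatments (RBF-35), the sketch below generates synthetic scores, treats each participant as a block, and runs a two-way ANOVA with statsmodels. It ignores the repeated-measures covariance structure that the actual study would model, and all factor names, effect sizes, and data are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Sketch of analyzing a randomized block factorial design with one 3-level and
# one 5-level treatment (RBF-35). Data are synthetic; participants act as blocks.
rng = np.random.default_rng(42)
rows = []
for subj in [f"s{i:02d}" for i in range(30)]:
    block_effect = rng.normal(scale=0.5)          # participant (block) effect
    for a in range(3):                            # treatment A, 3 levels (assumed)
        for b in range(5):                        # treatment B, 5 levels (assumed)
            score = 3.0 + 0.4 * a - 0.2 * b + block_effect + rng.normal(scale=0.3)
            rows.append({"subject": subj, "A": f"a{a}", "B": f"b{b}", "score": score})
df = pd.DataFrame(rows)

# Blocks enter as a fixed effect here; a mixed model (e.g., statsmodels MixedLM)
# would treat them as random and capture the repeated-measures structure properly.
model = smf.ols("score ~ C(subject) + C(A) * C(B)", data=df).fit()
print(anova_lm(model, typ=2))
```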
