221

Classification of multisite electrode recordings via variable dimension Gaussian mixtures

Nguyen, David P. 08 1900 (has links)
No description available.
222

A Gaussian mixture modeling approach to text-independent speaker identification

Reynolds, Douglas A. 08 1900 (has links)
No description available.
223

Inference for Continuous Stochastic Processes Using Gaussian Process Regression

Fang, Yizhou January 2014 (has links)
Gaussian process regression (GPR) is a long-standing technique for statistical interpolation between observed data points. Originally applied to spatial analysis in the 1950s, GPR offers highly nonlinear predictions whose uncertainty adjusts to the degree of extrapolation, while requiring only a few model parameters to be fit. GPR has thus gained considerable popularity in statistical applications such as machine learning and nonparametric density estimation. In this thesis, we explore the potential for GPR to improve the efficiency of parametric inference for continuous-time stochastic processes. For almost all such processes, the likelihood function based on discrete observations cannot be written in closed form. However, it can be approximated very well when the inter-observation time is small. A popular strategy for parametric inference is therefore to introduce missing data between the actual observations. In a Bayesian context, samples from the posterior distribution of the parameters and missing data are then typically obtained using Markov chain Monte Carlo (MCMC) methods, which can be computationally very expensive. Here, we consider using GPR to impute the marginal distribution of the missing data directly. These imputations can then be leveraged to produce independent draws from the joint posterior by importance sampling, for a significant gain in computational efficiency. To illustrate the methodology, three continuous processes are examined: the first is based on a neural excitation model with a non-standard periodic component; the second and third are popular financial models often used for option pricing. While preliminary inferential results are quite promising, we point out several improvements to the methodology that remain to be explored.
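The importance-sampling step mentioned in the abstract can be illustrated in miniature. This sketch is not from the thesis; the one-dimensional target and proposal densities are toy choices picked for illustration. Self-normalized importance sampling reweights independent proposal draws toward a target distribution, avoiding the serial dependence of MCMC:

```python
import numpy as np

rng = np.random.default_rng(1)

def importance_weights(log_target, log_proposal, draws):
    """Self-normalized importance sampling: reweight independent draws from
    the proposal toward the target, avoiding MCMC's serial dependence."""
    lw = log_target(draws) - log_proposal(draws)
    w = np.exp(lw - lw.max())        # subtract max for numerical stability
    return w / w.sum()               # weights sum to one

# Toy example: target N(1, 0.5^2), proposal N(0, 1), both known up to a constant.
draws = rng.standard_normal(50_000)
log_q = lambda x: -0.5 * x ** 2
log_p = lambda x: -0.5 * ((x - 1.0) / 0.5) ** 2
w = importance_weights(log_p, log_q, draws)
post_mean = float(np.sum(w * draws))   # weighted estimate of the target mean, ~1
```

Because the draws are independent, the weighted sample can be generated and evaluated in one pass, which is the source of the computational gain the abstract refers to.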
224

Generalized Degrees of Freedom for Gaussian Interference Channel with Discrete Constellations

Pang, Chu 26 November 2012 (has links)
In wireless channels and many other channels, interference is a central phenomenon. Mitigating interference is a key to improving system performance. To find the limit of the achievable rates for these channels in the presence of interference, the two-user Gaussian interference channel has been the subject of intensive study in network information theory. However, most current results have been obtained by assuming Gaussian input distributions. While optimal in single-user Gaussian channels, the issue with this assumption is that the Gaussian noise becomes the worst noise when the input distribution is also Gaussian. In this thesis, we propose a class of discrete constellations. We show that this class of constellations can automatically achieve the same sum rates as schemes that treat interference as noise or perform time sharing.
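For context on the two baseline schemes named in the abstract, the sum rates of treating interference as noise and of time sharing can be written down directly for a symmetric two-user Gaussian channel with Gaussian inputs. The helper names and the SNR/INR values below are illustrative assumptions, not taken from the thesis:

```python
import math

def c(x):
    """Capacity of a real Gaussian channel, 0.5*log2(1 + SNR), in bits/use."""
    return 0.5 * math.log2(1.0 + x)

def sum_rate_tin(snr, inr):
    """Sum rate when each receiver treats the interference as Gaussian noise."""
    return 2.0 * c(snr / (1.0 + inr))

def sum_rate_time_sharing(snr):
    """Sum rate under time sharing: each user transmits half the time at double power."""
    return c(2.0 * snr)

# Weak interference favors treating it as noise; strong interference favors time sharing.
weak = (sum_rate_tin(100.0, 0.1), sum_rate_time_sharing(100.0))
strong = (sum_rate_tin(100.0, 50.0), sum_rate_time_sharing(100.0))
```

The thesis's claim is that suitably chosen discrete constellations match these sum rates without needing the Gaussian-input assumption.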
226

Data spacing and uncertainty

Wilde, Brandon Jesse 11 1900 (has links)
Modeling spatial variables involves uncertainty. Uncertainty is affected by the degree to which a spatial variable has been sampled: decreased spacing between samples leads to decreased uncertainty. How much uncertainty is reduced by additional sampling depends on the properties of the variable being modeled: a densely sampled erratic variable may have a level of uncertainty similar to that of a sparsely sampled continuous variable. A simulation-based approach is developed to quantify the relationship between uncertainty and data spacing. Reference realizations are simulated and sampled at different spacings. The samples are used to condition additional realizations, from which uncertainty is quantified. A number of factors complicate the relationship between uncertainty and data spacing, including the proportional effect, nonstationary variograms, the classification threshold, the number of realizations, data quality and modeling scale. A case study of the relationship between uncertainty and data density for bitumen-thickness data from northern Alberta is presented. / Mining Engineering
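The core simulation idea — sample a reference realization at different spacings and measure the resulting uncertainty — can be sketched in one dimension. This is a simplified stand-in, not the thesis workflow: it uses a moving-average field and linear-interpolation error in place of geostatistical simulation and conditional realizations:

```python
import numpy as np

rng = np.random.default_rng(0)

def reference_realization(n=1000, smooth=25):
    """Correlated 1-D 'spatial' field: a moving average of white noise,
    with correlation range roughly equal to the smoothing window."""
    z = rng.standard_normal(n + smooth)
    kernel = np.ones(smooth) / smooth
    return np.convolve(z, kernel, mode="valid")[:n]

def uncertainty_at_spacing(field, spacing):
    """Sample the field every `spacing` cells, interpolate linearly between
    samples, and report the RMS error at unsampled locations as a simple
    stand-in for model uncertainty."""
    x = np.arange(len(field))
    xs = x[::spacing]
    est = np.interp(x, xs, field[xs])
    return float(np.sqrt(np.mean((est - field) ** 2)))

field = reference_realization()
errs = [uncertainty_at_spacing(field, s) for s in (2, 10, 50)]
```

As the abstract argues, uncertainty grows with spacing, and the rate at which it grows is governed by the field's continuity (here, the `smooth` parameter).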
227

Valid estimation and prediction inference in analysis of a computer model

Nagy, Béla 11 1900 (has links)
Computer models or simulators are becoming increasingly common in many fields of science and engineering, powered by the phenomenal growth in computer hardware over the past decades. Many of these simulators implement a particular mathematical model as a deterministic computer code, meaning that running the simulator again with the same input gives the same output. Running the code often involves computationally expensive tasks, such as numerically solving complex systems of partial differential equations. When simulator runs take too long, their usefulness is limited. To overcome time or budget constraints by making the most of limited computational resources, a statistical methodology has been proposed, known as the "Design and Analysis of Computer Experiments". The main idea is to run the expensive simulator at only a relatively small number of carefully chosen design points in the input space and, based on the outputs, to construct an emulator (statistical model) that can emulate (predict) the output at new, untried locations at a fraction of the cost. This approach is useful provided that we can measure how much the predictions of the cheap emulator deviate from the true response surface of the original computer model. One way to quantify emulator error is to construct pointwise prediction bands designed to envelope the response surface and to assert that the true response (simulator output) is enclosed by these envelopes with a certain probability. Of course, to make such probabilistic statements one needs to introduce some kind of randomness. A common strategy, used here, is to model the computer code as a random function, also known as a Gaussian stochastic process. We concern ourselves with smooth response surfaces and use the Gaussian covariance function, which is ideal when the response function is infinitely differentiable.
In this thesis, we propose Fast Bayesian Inference (FBI), which is both computationally efficient and can be implemented as a black box. Simulation results show that it achieves remarkably accurate prediction uncertainty assessments, in terms of matching the coverage probabilities of the prediction bands, and that the associated reparameterizations also aid parameter uncertainty assessments.
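A minimal sketch of a Gaussian-process emulator with the Gaussian covariance function and pointwise ±2σ prediction bands. All names, the toy "simulator", and the hyperparameter values are assumptions for illustration; this is the generic emulator setup the abstract describes, not the FBI method itself:

```python
import numpy as np

def gauss_kernel(a, b, ell=0.3):
    """Gaussian (squared-exponential) covariance; suited to infinitely
    differentiable response surfaces."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def emulator_bands(X, y, Xs, ell=0.3, jitter=1e-6):
    """Pointwise mean and ~95% prediction bands of a zero-mean GP emulator
    conditioned on deterministic simulator runs (X, y)."""
    K = gauss_kernel(X, X, ell) + jitter * np.eye(len(X))
    Ks = gauss_kernel(Xs, X, ell)
    mean = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    sd = np.sqrt(np.clip(var, 0.0, None))
    return mean, mean - 2.0 * sd, mean + 2.0 * sd

simulator = lambda x: np.sin(2.0 * np.pi * x)   # cheap stand-in "computer model"
X = np.linspace(0.0, 1.0, 8)                    # a few design points
y = simulator(X)                                # deterministic: same input, same output
mean, lo, hi = emulator_bands(X, y, np.linspace(0.0, 1.0, 101))
# Far from the design, the band width approaches the prior's (4 sd units here).
_, lo_far, hi_far = emulator_bands(X, y, np.array([3.0]))
```

The bands are tight near the design points (where the deterministic code pins the surface down) and widen with extrapolation, which is exactly what the coverage-probability assessments in the thesis measure.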
228

Gaussian cellular multiple access channels

Letzepis, Nicholas Alexander January 2006 (has links)
The ever-increasing demand for wireless services using mobile cellular communication technology has led to intensive efforts by researchers to determine the information-theoretic limitations of these systems. Toward this end, numerous cellular models have been formulated and analyzed using information-theoretic principles to succinctly identify the impact of system constraints and design parameters on performance. One such model that captures the essence of the uplink of a cellular channel is the Gaussian cellular multiple access channel (GCMAC), which is the focus of this dissertation. / Thesis (PhD)--University of South Australia, 2006
229

GMMEDA : A demonstration of probabilistic modeling in continuous metaheuristic optimization using mixture models

Naveen Kumar Unknown Date (has links)
Optimization problems are common throughout science, engineering and commerce. The desire to continually improve solutions and to solve larger, more complex problems has kept this field of research prominent for several decades and has led to the development of a range of optimization algorithms for different classes of problems. Estimation of Distribution Algorithms (EDAs) are a relatively recent class of metaheuristic optimization algorithms that use probabilistic modeling techniques to control the search process. Within the general EDA framework, a number of probabilistic models have previously been proposed for both discrete and continuous optimization problems. This thesis focuses on GMMEDAs: continuous EDAs based on Gaussian Mixture Models (GMMs), with parameter estimation performed by the Expectation Maximization (EM) algorithm. To date, this type of model has received only limited attention in the literature: there are few experimental studies of the algorithms, and a number of implementation details of continuous iterated density estimation algorithms based on Gaussian mixture models have not previously been documented. This thesis provides a clear description of the GMMEDAs, discusses the implementation decisions and details, and presents an experimental study evaluating the performance of the algorithms. The effectiveness of GMMEDAs of varying model complexity (structure of the covariance matrices and number of mixture components) was tested on five benchmark functions (Sphere, Rastrigin, Griewank, Ackley and Rosenbrock) of varying dimensionality (2-, 10- and 30-D). The effect of the selection pressure parameter was also studied. The results of the 2-D experiments show that a variant of moderate complexity (Diagonal GMMEDA) was able to optimize both unimodal and multimodal functions.
Further analysis of the 10- and 30-D results indicates that the simplest variant (Spherical GMMEDA) was the most effective of the three, although the most complex variant (Full GMMEDA) achieved greater consistency across runs. A comparison on four artificial test functions (Sphere, Griewank, Ackley and Rosenbrock) showed that the GMMEDA variants optimized most of the complex functions better than existing continuous EDAs, owing to the ability of the GMM components to model the functions effectively. The analysis also showed that the number of components and the selection pressure affect the optimum value obtained on the test functions. Convergence of the GMMEDA variants to a function's best local optimum was driven primarily by the complexity of the GMM: complexity grows with the number of components as well as with the structure of the covariance matrices, but on complex functions the complexity due to the covariance structure dominates that due to the number of components. Moreover, the effect of the number of components on convergence diminishes for most functions as the selection pressure increases; this appears in the results as greater stability across runs. Other factors affecting convergence to local optima are the initialization of the GMM parameters, the number of EM iterations, and the reset condition. Although not graphically visible in the 10-D optimization, different initializations of the GMM parameters in 2-D were found to affect the optimum value of the functions.
Initialization of the population in evolutionary algorithms is known to affect convergence to the global optimum. The similar effects observed here for the initialization of the GMM parameters in 2-D suggest that convergence of the GMM in 10-D could be affected as well, which in turn could affect the optimum value obtained for the respective functions. The estimates of the covariance and mean over the EM iterations in 2-D indicated that some functions required more EM iterations to reach their optimum; too few EM iterations could therefore hinder the fitting of the components to the selected population in 10-D, and with it the effective modeling of functions of varying complexity. Finally, the reset condition was observed to reset the covariance and the best fitness value in each generation in 2-D, which is certain to affect convergence of the GMMEDA variants to a function's best local optimum: each time the condition is invoked, the component covariances return to their initial values, degrading the model's fit to the selected fraction of the population. Taking all of these effects into account, the results indicate that a smaller number of components and a smaller selected-population percentage, using the simpler S-GMMEDA, modeled most functions of varying complexity well.
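A much-simplified sketch of the EDA loop the abstract describes, with the EM-fitted Gaussian mixture replaced by a single Gaussian (EMNA-style) to keep it self-contained; all names and parameter values are illustrative assumptions, not the thesis's implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def eda_minimize(f, x0, pop=200, keep=0.3, iters=60):
    """Simplified continuous EDA: each generation, fit a single Gaussian to
    the truncation-selected individuals and resample from it. GMMEDA swaps
    this single Gaussian for an EM-fitted Gaussian mixture."""
    dim = len(x0)
    mean = np.asarray(x0, dtype=float)
    cov = 4.0 * np.eye(dim)                               # initial search spread
    for _ in range(iters):
        X = rng.multivariate_normal(mean, cov, size=pop)  # sample the model
        fitness = np.apply_along_axis(f, 1, X)
        sel = X[np.argsort(fitness)[: int(keep * pop)]]   # truncation selection
        mean = sel.mean(axis=0)                           # refit the model...
        cov = np.cov(sel.T) + 1e-8 * np.eye(dim)          # ...with regularization
    return mean, float(f(mean))

sphere = lambda x: float(np.sum(np.asarray(x) ** 2))      # unimodal benchmark
x_best, f_best = eda_minimize(sphere, x0=[2.0, 2.0])
```

The `keep` parameter plays the role of the selection pressure studied in the thesis, and the covariance refit corresponds to the choice of covariance structure (spherical, diagonal, or full) that drives the complexity trade-offs discussed above.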
