81

Preamble Design for Symbol Timing Estimation from SOQPSK-TG Waveforms

Erkmen, Baris I., Tkacenko, Andre, Okino, Clayton M. October 2009 (has links)
ITC/USA 2009 Conference Proceedings / The Forty-Fifth Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2009 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Data-aided symbol synchronization for bursty communications utilizes a predetermined modulation sequence, i.e., a preamble, preceding the payload. For effective symbol synchronization, this preamble must be designed in accordance with the modulation format. In this paper, we analyze preambles for shaped offset quadrature phase-shift keying (SOQPSK) waveforms. We compare the performance of several preambles by deriving the Cramér-Rao bound (CRB), and identify a desirable one for the Telemetry Group variant of SOQPSK. We also demonstrate, via simulation, that the maximum likelihood estimator with this preamble approaches the CRB at moderate signal-to-noise ratio.
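The CRB the authors derive is waveform-specific, but the general shape of the data-aided timing bound is standard; as an illustrative (not paper-exact) form, for a known preamble $s(t)$ with spectrum $S(f)$ in additive white Gaussian noise:

```latex
\operatorname{var}(\hat{\tau}) \;\ge\; \mathrm{CRB}(\tau)
  \;=\; \frac{1}{\dfrac{2E_s}{N_0}\,(2\pi)^2\,\overline{f^2}},
\qquad
\overline{f^2} \;=\; \frac{\int f^2\,|S(f)|^2\,df}{\int |S(f)|^2\,df},
```

where $E_s/N_0$ is the symbol SNR and $\overline{f^2}$ the mean-square (Gabor) bandwidth of the preamble. In this framing, preamble design amounts to shaping $|S(f)|^2$ so that $\overline{f^2}$ is large while respecting the modulation's constant-envelope constraints.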
82

On Parameter Estimation Employing Sinewave Fit and Phase Noise Compensation in OFDM Systems

Negusse, Senay January 2015 (has links)
In today’s modern society, we are surrounded by a multitude of digital devices. The number of available digital devices is set to grow even more. As the trend continues, product life-cycle is a major issue in mass production of these devices. Testing and verification is responsible for a significant percentage of the production cost of digital devices. Time-efficient procedures for testing and characterization are therefore sought for. Moreover, the need for flexible and low-cost solutions in the design architecture of radio frequency devices, coupled with the demand for high data rate, has presented a challenge caused by interferences from the analog circuit parts. Study of digital signal processing based techniques which would alleviate the effects of the analog impairments is therefore a pertinent subject. In the first part of this thesis, we address parameter estimation based on waveform fitting. We look at the sinewave model for parameter estimation, which is eventually used to characterize the performance of a device. The underlying goal is to formulate and analyze a set of new parameter estimators which provide a more accurate estimate than well-known estimators. Specifically, we study the maximum-likelihood (ML) SNR estimator employing the three-parameter sine fit and derive alternative estimators based on its statistical distribution. We show that the mean square error (MSE) of the alternative estimators is lower than the MSE of the ML estimator for a small sample size, and a few of the new estimators are very close to the Cramér-Rao lower bound (CRB). Simply put, the number of acquired measurement samples translates to measurement time, implying that the fewer the number of samples required for a given accuracy, the faster the test would be. We also study a sub-sampling approach for the frequency estimation problem in a dual-channel sinewave model with common frequency. 
A coprime subsampling technique is used where the signals from both channels are uniformly subsampled with a coprime pair of sparse samplers. Such a subsampling technique is especially beneficial to lower the sampling frequency required in applications with high bandwidth requirements. The CRB based on the coprime-subsampled data set is derived, and numerical illustrations are given showing the relation between the cost in performance, based on the mean squared error, and the employed coprime factors for a given measurement time. In the second part of the thesis, we deal with the problem of phase noise (PHN). First, we look at a scheme in an orthogonal frequency-division multiplexing (OFDM) system where pilot subcarriers are employed for joint PHN compensation, channel estimation and symbol detection. We investigate a method where the PHN statistics are approximated by a finite number of vectors and design a PHN codebook. A method of selecting the element in the codebook that is closest to the current PHN realization, together with the corresponding channel estimate, is discussed. We present simulation results showing improved performance compared to state-of-the-art techniques. We also look at a sequential Monte Carlo based method for combined channel impulse response and PHN tracking employing known OFDM symbols. Such a technique allows time-domain compensation of PHN such that simultaneous cancellation of the common phase error and reduction of the inter-carrier interference occurs.
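The three-parameter (known-frequency) sine fit underlying the ML SNR estimator reduces to linear least squares; a minimal sketch, with variable names and the SNR definition (signal power over residual noise variance) assumed rather than taken from the thesis:

```python
import numpy as np

def three_param_sine_fit(y, f, fs):
    """IEEE-1057-style three-parameter sine fit: frequency f is known;
    solve linear least squares for in-phase/quadrature amplitudes and DC offset."""
    t = np.arange(len(y)) / fs
    D = np.column_stack([np.cos(2 * np.pi * f * t),
                         np.sin(2 * np.pi * f * t),
                         np.ones_like(t)])
    (A, B, C), *_ = np.linalg.lstsq(D, y, rcond=None)
    resid = y - D @ np.array([A, B, C])
    noise_var = resid @ resid / (len(y) - 3)   # residual variance estimate
    snr = (A**2 + B**2) / 2 / noise_var        # signal power / noise power
    return A, B, C, snr
```

Because the frequency is fixed, the design matrix is linear in the parameters, so no iteration is needed; the four-parameter variant (unknown frequency) is the nonlinear extension.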
83

USE OF COMPUTER GENERATED HOLOGRAMS FOR OPTICAL ALIGNMENT

Zehnder, Rene January 2011 (has links)
The necessity to align a multi-component null corrector that is used to test the 8.4 m off-axis parabola segments of the primary mirror of the Giant Magellan Telescope (GMT) initiated this work. Computer Generated Holograms (CGHs) are often a component of these null correctors, and their capability for multiple functionality allows them not only to contribute to the measurement wavefront but also to support the alignment. The CGH can also be used as an external tool to support the alignment of complex optical systems, although for the applications shown in this work the CGH is always a component of the optical system. In general, CGHs change the shape of the illuminating wavefront, which can then produce optical references. The uncertainty of position of those references depends not only on the uncertainty of position of the CGH with respect to the illuminating wavefront but also on the uncertainty of the shape of the illuminating wavefront. A complete analysis of the uncertainty of the position of the projected references therefore includes the illuminating optical system, which is typically an interferometer. This work provides the relationships needed to calculate the combined propagation of uncertainties on the projected optical references. This includes a geometrical-optical description of how light carries information of position and how diffraction may alter it. Any optical reference must be transferred to a mechanically tangible quantity for the alignment. The process of obtaining the position of spheres, attached to the CGH, relative to the CGH pattern is provided and applied to the GMT null corrector. Knowing the location of the spheres relative to the CGH pattern is equivalent to knowing the location of the spheres with respect to the wavefront the pattern generates. This work provides various tools for the design and analysis of CGHs for optical alignment, including the statistical foundation that goes with it.
84

Modeling Stochastic Processes in Gamma-Ray Imaging Detectors and Evaluation of a Multi-Anode PMT Scintillation Camera for Use with Maximum-Likelihood Estimation Methods

Hunter, William Coulis Jason January 2007 (has links)
Maximum-likelihood estimation or other probabilistic estimation methods are underused in many areas of applied gamma-ray imaging, particularly in biomedicine. In this work, we show how to use our understanding of stochastic processes in a scintillation camera and their effect on signal formation to better estimate gamma-ray interaction parameters such as interaction position or energy. To apply statistical estimation methods, we need an accurate description of the signal statistics as a function of the parameters to be estimated. First, we develop a probability model of the signals conditioned on the parameters to be estimated by carefully examining the signal generation process. Subsequently, the likelihood model is calibrated by measuring signal statistics for an ensemble of events as a function of the estimate parameters. In this work, we investigate the application of ML-estimation methods for three topics. First, we design, build, and evaluate a scintillation camera based on a multi-anode PMT readout for use with ML-estimation techniques. Next, we develop methods for calibrating the response statistics of a thick-detector gamma camera as a function of interaction depth. Finally, we demonstrate the use of ML estimation with a modified clinical Anger camera.
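A toy illustration of an ML position estimate from PMT signals, assuming a Poisson signal model and a hypothetical inverse-square light-collection response; both the response model and all names here are invented for illustration, not taken from the calibrated model the abstract describes:

```python
import numpy as np

def mean_response(pos, pmt_xy, total_light=1000.0):
    """Hypothetical mean PMT signals: each tube's light share falls off with
    distance from the scintillation point (a toy stand-in for calibration data)."""
    d2 = np.sum((pmt_xy - pos) ** 2, axis=1)
    w = 1.0 / (1.0 + d2)
    return total_light * w / w.sum()

def ml_position(signals, pmt_xy, candidates):
    """Grid-search ML estimate of the interaction position under a Poisson
    signal model (log-likelihood up to the parameter-free factorial term)."""
    best, best_ll = None, -np.inf
    for pos in candidates:
        mu = mean_response(pos, pmt_xy)
        ll = np.sum(signals * np.log(mu) - mu)
        if ll > best_ll:
            best, best_ll = pos, ll
    return best
```

In practice the mean response and its statistics come from the calibration step the abstract describes, and the search is refined rather than done on a coarse grid.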
85

Inverse Optical Design and Its Applications

Sakamoto, Julia January 2012 (has links)
We present a new method for determining the complete set of patient-specific ocular parameters, including surface curvatures, asphericities, refractive indices, tilts, decentrations, thicknesses, and index gradients. The data consist of the raw detector outputs of one or more Shack-Hartmann wavefront sensors (WFSs); unlike conventional wavefront sensing, we do not perform centroid estimation, wavefront reconstruction, or wavefront correction. Parameters in the eye model are estimated by maximizing the likelihood. Since a purely Gaussian noise model is used to emulate electronic noise, maximum-likelihood (ML) estimation reduces to nonlinear least-squares fitting between the data and the output of our optical design program. Bounds on the estimate variances are computed with the Fisher information matrix (FIM) for different configurations of the data-acquisition system, thus enabling system optimization. A global search algorithm called simulated annealing (SA) is used for the estimation step, due to multiple local extrema in the likelihood surface. The ML approach to parameter estimation is very time-consuming, so rapid processing techniques are implemented with the graphics processing unit (GPU). We are leveraging our general method of reverse-engineering optical systems in optical shop testing for various applications. For surface profilometry of aspheres, which involves the estimation of high-order aspheric coefficients, we generated a rapid ray-tracing algorithm that is well-suited to the GPU architecture. Additionally, reconstruction of the index distribution of GRIN lenses is performed using analytic solutions to the eikonal equation. Another application is parameterized wavefront estimation, in which the pupil phase distribution of an optical system is estimated from multiple irradiance patterns near focus. The speed and accuracy of the forward computations are emphasized, and our approach has been refined to handle large wavefront aberrations and nuisance parameters in the imaging system.
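Simulated annealing for a likelihood surface with multiple local extrema can be sketched generically; the proposal step, temperature schedule, and cooling factor below are arbitrary illustrative choices, not those of the thesis:

```python
import numpy as np

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.995,
                        iters=5000, rng=None):
    """Minimal simulated-annealing sketch: random-walk proposals, Metropolis
    acceptance of uphill moves, geometric cooling."""
    rng = rng or np.random.default_rng(0)
    x, fx = x0, cost(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.normal(0.0, step)
        fc = cost(cand)
        # Always accept improvements; accept worsenings with prob exp(-df/t).
        if fc < fx or rng.random() < np.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x, best_f
```

The uphill-acceptance term is what lets the search escape local minima of the cost (here, a negative log-likelihood) early on, degenerating to greedy descent as the temperature decays.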
86

Information Entropy and Ecological Energetics : Predicting and Analysing Structure and Energy Flow in Ecological Networks applying the Concept of MaxEnt

Brinck, Katharina January 2014 (has links)
Ecological networks are complex systems forming hierarchical structures in which energy and matter are transferred between the network’s compartments. Predicting energy flows in food webs usually involves complex, parameter-rich models. In this thesis, the application of the principle of maximum entropy (MaxEnt) to obtain least-biased probability distributions based on prior knowledge is proposed as an alternative to predict the most likely energy flows in food webs from the network topology alone. This approach not only simplifies the characterisation of food web flow patterns based on little empirical knowledge but can also be used to investigate the role of bottom-up and top-down controlling forces in ecosystems resulting from the emergent phenomena based on the complex interactions on the level of species and individuals. The integrative measure of “flow extent”, incorporating both bottom-up and top-down controlling forces on ecosystems, is proposed as a principle behind ecosystem evolution and evaluated against empirical data on food web structure. It is demonstrated that the method of predicting energy flow with the help of MaxEnt is very flexible and applicable to many different settings and types of questions in ecology, and therefore provides a powerful tool for modelling the energy transfer in ecosystems. Further research has to show to what extent the most likely flow patterns are realised in real-world ecosystems. The concept of flow-extent maximisation as a selection principle during ecosystem evolution can enhance the understanding of emergent phenomena in complex ecosystems and may help to draw a link between thermodynamics and ecology.
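The MaxEnt construction of a least-biased distribution from a moment constraint has a standard exponential-family solution; a small sketch under assumed names (a finite support and a prescribed mean, with the Lagrange multiplier found by root-finding — a generic illustration, not the thesis's food-web formulation):

```python
import numpy as np
from scipy.optimize import brentq

def maxent_dist(values, target_mean):
    """Maximum-entropy distribution on a finite support subject to a mean
    constraint: p_i proportional to exp(-lam * x_i), with lam chosen so the
    mean matches. Support is centered to keep exp() well-scaled."""
    x = np.asarray(values, float)
    c = x.mean()
    def mean_at(lam):
        w = np.exp(-lam * (x - c))
        return (x * w).sum() / w.sum()
    # mean_at is monotone decreasing in lam, so bracketing finds the root.
    lam = brentq(lambda l: mean_at(l) - target_mean, -20.0, 20.0)
    w = np.exp(-lam * (x - c))
    return w / w.sum()
```

When the constraint equals the unconstrained mean, the multiplier is zero and the uniform (maximum-entropy) distribution is recovered; tighter constraints tilt the distribution exponentially, which is exactly the "least biased given what is known" behavior the thesis exploits.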
87

Models for target detection times.

Bae, Deok Hwan January 1989 (has links)
Approved for public release; distribution is unlimited. / Some battlefield models have a component in them which models the time it takes for an observer to detect a target. Different observers may have different mean detection times due to various factors such as the type of sensor used, environmental conditions, fatigue of the observer, etc. Two parametric models for the distribution of time to target detection are considered which can incorporate these factors. Maximum likelihood estimation procedures for the parameters are described. Results of simulation experiments to study the small-sample behavior of the estimators are presented. / http://archive.org/details/modelsfortargetd00baed / Major, Korean Air Force
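For intuition, MLEs for two simple candidate detection-time families are available in closed form; this sketch uses exponential and lognormal models as illustrative stand-ins, since the abstract does not name its two parametric families:

```python
import numpy as np

def fit_detection_models(times):
    """Closed-form MLEs for two candidate detection-time distributions
    (exponential and lognormal) -- illustrative stand-ins, not the thesis's models."""
    t = np.asarray(times, float)
    rate = len(t) / t.sum()               # exponential: lambda_hat = n / sum(t)
    logs = np.log(t)
    mu, sigma = logs.mean(), logs.std()   # lognormal: MLEs of log-mean, log-sd
    return {'exp_rate': rate, 'lognorm_mu': mu, 'lognorm_sigma': sigma}
```

Observer-dependent factors (sensor type, conditions, fatigue) would enter as covariates on the rate or location parameter, turning the closed forms into a regression-style likelihood maximized numerically.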
88

Optimal designs for maximum likelihood estimation and factorial structure design

Chowdhury, Monsur 06 September 2016 (has links)
This thesis develops methodologies for the construction of various types of optimal designs, with applications in maximum likelihood estimation and factorial structure design. The methodologies are applied to real data sets throughout the thesis. We start with a broad review of optimal design theory, including various types of optimal designs along with some fundamental concepts. We then consider a class of optimization problems and determine the optimality conditions. An important tool is the directional derivative of a criterion function, whose properties we study extensively. In order to determine the optimal designs, we consider a class of multiplicative algorithms indexed by a function which satisfies certain conditions. The most important and popular design criterion in applications is D-optimality. We construct such designs for various regression models and develop some useful strategies for better convergence of the algorithms. The remainder of the thesis is devoted to some important applications of optimal design theory. We first consider the problem of determining maximum likelihood estimates of the cell probabilities under the hypothesis of marginal homogeneity in a square contingency table. We formulate the Lagrangian function and remove the Lagrange parameters by substitution, transforming the problem into one of maximizing some functions of the cell probabilities simultaneously. We apply this approach to real data sets, namely US migration data and data on grading of unaided distance vision. We solve another estimation problem, determining the maximum likelihood estimates of the parameters of latent variable models such as the Bradley-Terry model, where the data come from a paired-comparisons experiment. We approach this problem by treating the observed frequency as binomially distributed and then expressing the binomial parameters in terms of optimal design weights. 
We apply this approach to a data set from American League baseball teams. Finally, we construct some optimal structure designs for comparing test treatments with a control. We introduce different structure designs and establish their properties using the incidence and characteristic matrices. We also develop methods of obtaining optimal R-type structure designs and show how such designs are trace-, A- and MV-optimal. / October 2016
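The multiplicative algorithms the abstract refers to have, for D-optimality, a well-known form: w_i ← w_i · d_i(w)/m, where d_i is the variance function and m the number of model parameters. A compact sketch (iteration count and stopping rule simplified; not the thesis's tuned variant):

```python
import numpy as np

def d_optimal_weights(X, iters=500):
    """Multiplicative algorithm for approximate D-optimal design weights on
    candidate rows of X: w_i <- w_i * d_i(w) / m, with
    d_i(w) = x_i^T M(w)^{-1} x_i the variance function."""
    n, m = X.shape
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        M = X.T @ (w[:, None] * X)                    # information matrix M(w)
        d = np.einsum('ij,jk,ik->i', X, np.linalg.inv(M), X)
        w *= d / m            # sum(w * d) = m, so the weights stay normalized
        w /= w.sum()          # numerical guard against rounding drift
    return w
```

For quadratic regression on [-1, 1] this recovers the classical D-optimal design, equal weight 1/3 on the points -1, 0 and 1, with the interior candidates driven to zero.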
89

Model-based recursive partitioning

Zeileis, Achim, Hothorn, Torsten, Hornik, Kurt January 2005 (has links) (PDF)
Recursive partitioning is embedded into the general and well-established class of parametric models that can be fitted using M-type estimators (including maximum likelihood). An algorithm for model-based recursive partitioning is suggested for which the basic steps are: (1) fit a parametric model to a data set, (2) test for parameter instability over a set of partitioning variables, (3) if there is some overall parameter instability, split the model with respect to the variable associated with the highest instability, (4) repeat the procedure in each of the daughter nodes. The algorithm yields a partitioned (or segmented) parametric model that can effectively be visualized and that subject-matter scientists are used to analyzing and interpreting. / Series: Research Report Series / Department of Statistics and Mathematics
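The four steps can be sketched as a toy recursion; note this uses a naive residual-sum-of-squares split gain in place of the paper's formal parameter-instability tests, with invented names throughout:

```python
import numpy as np

def fit_ols(X, y):
    """Step 1: fit a parametric model (here OLS) in a node."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, resid @ resid

def mob(X, y, Z, min_size=20, tol=0.1):
    """Toy model-based partitioning. Steps 2-3 are approximated: choose the
    partitioning variable in Z whose median split most reduces residual SS,
    split if the relative gain exceeds tol; step 4 recurses into children."""
    beta, ss = fit_ols(X, y)
    node = {'beta': beta, 'n': len(y)}
    if len(y) < 2 * min_size:
        return node
    best = None
    for j in range(Z.shape[1]):
        cut = np.median(Z[:, j])
        left = Z[:, j] <= cut
        if min(left.sum(), (~left).sum()) < min_size:
            continue
        _, ss_l = fit_ols(X[left], y[left])
        _, ss_r = fit_ols(X[~left], y[~left])
        gain = ss - (ss_l + ss_r)   # separate fits never increase SS
        if best is None or gain > best[0]:
            best = (gain, j, cut, left)
    if best is not None and best[0] > tol * ss:
        gain, j, cut, left = best
        node.update(var=j, cut=cut,
                    left=mob(X[left], y[left], Z[left], min_size, tol),
                    right=mob(X[~left], y[~left], Z[~left], min_size, tol))
    return node
```

The paper's actual stopping rule is a statistical test for parameter instability (with multiplicity adjustment), which is what distinguishes model-based partitioning from the greedy gain criterion used in this sketch.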
90

Variability of Suspended-Sediment Concentration in the Connecticut River Estuary

Cuttler, Michael Vincent William January 2012 (has links)
Thesis advisor: Gail Kineke / Turbidity maxima are areas of elevated suspended-sediment concentration commonly found at the head of the salt intrusion in partially-mixed estuaries. The suspended-sediment distribution in the Connecticut River estuary was examined to determine where turbidity maxima exist and how they form. Areas of enhanced suspended-sediment concentration were found to exist at all phases of the tide near the head of the salt intrusion as well as downstream of this point in deeper parts of the estuarine channel. These areas are locations where peaks in the longitudinal salinity gradient exist, suggesting the presence of a front, or zone of flow convergence. During flood conditions there is a layer of landward-flowing water in the middle of the water column that decelerates upon entering deep parts of the estuary; thus enhancing particle settling. During ebb conditions, stratification and therefore settling from surface waters is enhanced. The combination of processes acting throughout the tidal cycle focuses and, potentially, traps sediment in the deeper parts of the Connecticut River estuary. / Thesis (BS) — Boston College, 2012. / Submitted to: Boston College. College of Arts and Sciences. / Discipline: College Honors Program. / Discipline: Geology & Geophysics Honors Program. / Discipline: Earth and Environmental Sciences.
