About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
531

Multitaper Higher-Order Spectral Analysis of Nonlinear Multivariate Random Processes

He, Huixia 04 November 2008 (has links)
In this work, I describe a new statistical tool, the canonical bicoherence, which combines the canonical coherence and the bicoherence. I give its definitions and properties, its estimation by multitaper methods, the associated statistics, and the variance of the estimates as obtained by the weighted jackknife method. I discuss its applicability and usefulness for detecting and analyzing nonlinear quadratic phase coupling in multivariate random processes. Furthermore, I develop the time-varying canonical bicoherence for the nonlinear analysis of non-stationary random processes. In this thesis, the canonical bicoherence is applied mainly to two types of data: a) three-component geomagnetic field data, and b) high-dimensional brain electroencephalogram data. The results obtained are linked with physical or physiological interpretations. In particular, this thesis is the first work in which the novel method of "canonical bicoherence" is introduced and applied to the detection and analysis of nonlinear quadratic phase coupling in multivariate random processes. / Thesis (Ph.D, Mathematics & Statistics) -- Queen's University, 2008-10-31 15:03:57.596
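For background, the ordinary (univariate) bicoherence that the canonical bicoherence generalizes can be written in the following common form, where X(f) denotes the (multitaper) spectral coefficient of the process at frequency f; normalization conventions vary across authors, so this is one standard definition rather than necessarily the thesis's exact one. A value near one at (f_1, f_2) indicates quadratic phase coupling among f_1, f_2, and f_1 + f_2.

```latex
% One common normalization of the (squared) bicoherence; X(f) is the
% spectral coefficient at frequency f and * denotes complex conjugation.
\begin{equation}
  b^{2}(f_1, f_2) =
  \frac{\bigl|\mathbb{E}\bigl[X(f_1)\,X(f_2)\,X^{*}(f_1+f_2)\bigr]\bigr|^{2}}
       {\mathbb{E}\bigl[\lvert X(f_1)\,X(f_2)\rvert^{2}\bigr]\;
        \mathbb{E}\bigl[\lvert X(f_1+f_2)\rvert^{2}\bigr]}
\end{equation}
```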
532

Analysis of crime trends and patterns in Quebec using a crime severity index

Goupil, Jean-Philippe 04 1900 (has links)
The traditional measure of crime (the rate per 100,000 inhabitants) is problematic when analyzing variations in crime over time or across space, because the crime rate is determined essentially by less serious but very frequent offences. The present study tested the usefulness of a new tool developed by Statistics Canada, a "crime severity" index in which each offence is weighted by its seriousness score (based on average sentencing decisions in Canada from 2002 to 2007 for each type of crime). Applied to official Quebec statistics from 1977 to 2008, our analyses show that the severity index is a useful measure for drawing a more accurate picture of year-to-year trends in violent crime. More precisely, the severity index shows that the violent crime rate remained stable from 1977 to 1992, in contrast to the picture given by the traditional rate, which instead shows a dramatic rise over that period. The severity index can also be useful for violent crime when comparing territories with one another, in order to identify those with more serious criminality. For overall crime and non-violent crime, however, the severity index offers no advantage and gives the same reading of crime as the traditional measure. This is because the same offences (thefts, mischief, and breaking and entering) contribute the bulk of both measures.
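The mechanics of such an index are simple to illustrate. The sketch below contrasts the traditional per-capita rate with a severity-weighted index in the spirit of the Statistics Canada tool described above; the offence counts, seriousness weights, and base-year standardization constant are all made up for illustration.

```python
population = 8_000_000
offences = {            # (count, seriousness weight from avg. sentencing)
    "theft":           (90_000,   37.0),
    "mischief":        (55_000,   30.0),
    "break_and_enter": (40_000,  187.0),
    "robbery":         ( 7_000,  583.0),
    "homicide":        (   100, 7042.0),
}

# Traditional rate: every offence counts equally, so the frequent minor
# offences dominate the measure.
total = sum(n for n, _ in offences.values())
rate_per_100k = total / population * 100_000

# Severity index: each offence is weighted by its seriousness score and
# the result is standardized against a base-year value (hypothetical here).
weighted = sum(n * w for n, w in offences.values())
BASE_YEAR_VALUE = 95.0   # illustrative standardization constant
severity_index = weighted / population * 100_000 / BASE_YEAR_VALUE

print(f"rate per 100k: {rate_per_100k:.1f}, severity index: {severity_index:.1f}")
```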
533

Small Area Estimation for Survey Data: A Hierarchical Bayes Approach

Karaganis, Milana 14 September 2009 (has links)
Model-based estimation techniques have been widely used in small area estimation. This thesis focuses on Hierarchical Bayes (HB) estimation techniques as applied to small area estimation for survey data. We study the impact of applying a spatial structure to the area-specific effects and of using a specific generalized linear mixed model, in comparison with the traditional Fay-Herriot estimation model. We also analyze different loss functions in application to a small area estimation problem and compare the estimates obtained under these loss functions. Overall, for the case study under consideration, area-specific geographical effects are shown to have a significant effect on the estimates, and using a generalized linear mixed model proves more advantageous than the usual Fay-Herriot model. We also demonstrate the benefits of using a weighted balanced-type loss function to balance the precision of the estimates against their closeness to the direct estimates.
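For reference, the baseline Fay-Herriot model against which the extensions are compared has the following standard area-level form; the flat hyperprior shown is one common diffuse choice for completing the HB specification, not necessarily the thesis's exact prior.

```latex
% Standard Fay-Herriot area-level model, i = 1, ..., m small areas.
\begin{align}
  y_i \mid \theta_i &\sim N(\theta_i,\, \psi_i)
      &&\text{sampling model; } \psi_i \text{ known design variances}\\
  \theta_i \mid \beta, \sigma_v^2 &\sim N(x_i^{\top}\beta,\, \sigma_v^2)
      &&\text{linking model with area-level covariates } x_i\\
  \pi(\beta, \sigma_v^2) &\propto 1
      &&\text{diffuse hyperprior (one common HB choice)}
\end{align}
```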
534

Novel feedback and signalling mechanisms for interference management and efficient modulation

Abu-alhiga, Rami January 2010 (has links)
In order to meet the ever-growing demand for mobile data, a number of different technologies have been adopted by the fourth-generation standardization bodies. These include multiple access schemes such as spatial division multiple access (SDMA), and efficient modulation techniques such as orthogonal frequency division multiplexing (OFDM)-based modulation. The specific objectives of this thesis are to develop an effective feedback method for interference management in smart antenna SDMA systems and to design an efficient OFDM-based modulation technique, in which an additional dimension is added to conventional two-dimensional modulation techniques such as quadrature amplitude modulation (QAM). In SDMA time division duplex (TDD) systems, where channel reciprocity is maintained, the uplink (UL) channel sounding method is considered one of the most promising feedback methods due to its bandwidth and delay efficiency. Conventional channel sounding (CCS) only conveys the channel state information (CSI) of each active user to the base station (BS). Because system performance is limited by co-channel interference (CCI) from adjacent cells in interference-limited scenarios, CSI is only a suboptimal metric for multiuser spatial multiplexing optimization. The first major contribution of this thesis is a novel interference feedback method proposed to provide the BS with implicit knowledge of the interference level received by each mobile station (MS). More specifically, it is proposed to weight the conventional channel sounding pilots by the level of the interference experienced at the user's side. Interference-weighted channel sounding (IWCS) acts as a spectrally efficient feedback technique that provides the BS with implicit knowledge of the CCI experienced by each MS, and it significantly improves the downlink (DL) sum capacity for both greedy and fair scheduling policies. For the sake of completeness, a novel procedure is developed to make the IWCS pilots usable for UL optimization: the optimization metric obtained from the IWCS pilots is divided by the interference experienced at the BS's antennas. The resulting metric, the channel gain divided by the product of the DL and UL interference, provides link-protection awareness and is used to optimize both the UL and the DL. Using a maximum-capacity scheduling criterion, the link-protection-aware metric yields a gain in median system sum capacity of 26.7% in the DL and 12.5% in the UL compared with conventional channel sounding. Moreover, a heuristic algorithm is proposed to facilitate practical optimization and to reduce the computational complexity. The second major contribution of this thesis is an innovative transmission approach, referred to as subcarrier-index modulation (SIM), which is proposed to be integrated with OFDM. The key idea of SIM is to employ the subcarrier index itself to convey information to the receiver. Furthermore, a closed-form analytical bit error ratio (BER) of SIM OFDM in a Rayleigh channel is derived. Simulation results show a BER performance gain of 4 dB over 4-QAM OFDM for both coded and uncoded data without a power saving policy; alternatively, the power saving policy maintains an average gain of 1 dB while using only half the OFDM symbol transmit power.
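To make the subcarrier-index idea concrete, here is a deliberately simplified toy version in Python: half of the bits switch individual subcarriers on or off (the index information), while the active subcarriers carry ordinary BPSK payload symbols. The bit-to-subcarrier mapping, block size, and detector are illustrative assumptions, not the exact scheme analyzed in the thesis.

```python
import numpy as np

N = 8  # subcarriers per OFDM symbol (illustrative)

def sim_ofdm_modulate(bits):
    """bits: length-2N 0/1 array. The first N bits choose the on/off state
    of each subcarrier; the remaining N are BPSK payload, of which only
    those on active subcarriers are actually transmitted."""
    on = bits[:N].astype(bool)
    payload = 2.0 * bits[N:] - 1.0          # 0/1 -> BPSK -1/+1
    freq = np.where(on, payload, 0.0)       # inactive subcarriers stay silent
    return np.fft.ifft(freq) * np.sqrt(N)   # OFDM: IFFT to the time domain

def sim_ofdm_demodulate(x):
    freq = np.fft.fft(x) / np.sqrt(N)
    on = np.abs(freq) > 0.5                 # energy detection of index bits
    payload = (freq.real > 0).astype(int)   # BPSK decision per bin
    return on.astype(int), np.where(on, payload, 0)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 2 * N)
idx_bits, payload_bits = sim_ofdm_demodulate(sim_ofdm_modulate(bits))
assert np.array_equal(idx_bits, bits[:N])   # index bits recovered noiselessly
# Payload bits assigned to "off" subcarriers are never transmitted in this
# toy mapping, which is why real SIM schemes split the bit stream adaptively.
```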
535

Visualization of Brain White Matter Tracts Using Heavily T2-Weighted Three-Dimensional Fluid-Attenuated Inversion-Recovery Magnetic Resonance Imaging

Kawai, Hisashi, Bokura, Kiminori, Naganawa, Shinji, Yamazaki, Masahiro 08 1900 (has links)
No description available.
537

Algebraic decoder specification: coupling formal-language theory and statistical machine translation

Büchse, Matthias 28 January 2015 (has links) (PDF)
The specification of a decoder, i.e., a program that translates sentences from one natural language into another, is an intricate process, driven by the application and lacking a canonical methodology. The practical nature of decoder development inhibits the transfer of knowledge between theory and application, which is unfortunate because many contemporary decoders are in fact related to formal-language theory. This thesis proposes an algebraic framework in which a decoder is specified by an expression built from a fixed set of operations. So far, this framework accommodates contemporary syntax-based decoders, it spans two levels of abstraction, and, above all, it encourages mutual stimulation between the theory of weighted tree automata and the application.
538

Is 3-Tesla Gd-EOB-DTPA-enhanced MRI with diffusion-weighted imaging superior to 64-slice contrast-enhanced CT for the diagnosis of hepatocellular carcinoma?

Maiwald, Bettina, Lobsien, Donald, Kahn, Thomas, Stumpp, Patrick 11 November 2014 (has links) (PDF)
Objectives: To compare 64-slice contrast-enhanced computed tomography (CT) with 3-Tesla magnetic resonance imaging (MRI) using Gd-EOB-DTPA for the diagnosis of hepatocellular carcinoma (HCC) and to evaluate the utility of diffusion-weighted imaging (DWI) in this setting. Methods: Triple-phase liver CT was performed in fifty patients (42 male, 8 female) with suspected or proven HCC. The patients also underwent a 3-Tesla MRI examination with Gd-EOB-DTPA and diffusion-weighted imaging (DWI) at b-values of 0, 50 and 400 s/mm². The apparent diffusion coefficient (ADC) value was determined for each lesion detected on DWI. The histopathological report after resection or biopsy of a lesion served as the gold standard; for unresected lesions, a surrogate of follow-up or complementary imaging techniques in combination with clinical and paraclinical parameters was used. Diagnostic accuracy, sensitivity, specificity, and positive and negative predictive values were evaluated for each technique. Results: MRI detected slightly more lesions considered suspicious for HCC per patient than CT (2.7 versus 2.3, respectively). ADC measurements in HCC showed notably heterogeneous values, with a median of 1.2±0.5×10⁻³ mm²/s (range 0.07±0.1 to 3.0±0.1×10⁻³ mm²/s). MRI showed diagnostic accuracy, sensitivity, and positive and negative predictive values similar to those of CT (AUC 0.837, sensitivity 92%, PPV 80% and NPV 90% for MRI vs. AUC 0.798, sensitivity 85%, PPV 79% and NPV 82% for CT; not significant). Specificity was 75% for both techniques. Conclusions: Our study did not show a statistically significant difference in the detection of HCC between MRI and CT. Gd-EOB-DTPA-enhanced MRI tended to detect more lesions per patient than contrast-enhanced CT; we would therefore recommend this modality as the first-choice imaging method for the detection of HCC and for therapeutic decisions. However, contrast-enhanced CT was not inferior in our study, so it remains a useful imaging modality for follow-up examinations.
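The accuracy figures above all derive from a 2×2 table of test results against the gold standard; a minimal sketch of the computation, with made-up counts rather than the study's data:

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 confusion table.
# The counts are illustrative only, not taken from the study.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # P(test positive | disease present)
        "specificity": tn / (tn + fp),  # P(test negative | disease absent)
        "ppv":         tp / (tp + fp),  # P(disease present | test positive)
        "npv":         tn / (tn + fn),  # P(disease absent | test negative)
    }

print(diagnostic_metrics(tp=46, fp=12, fn=4, tn=36))
```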
539

An XML-Based Content-Based Image Retrieval System with MPEG-7 Descriptors

Arslan, Serdar 01 December 2004 (has links) (PDF)
Very large collections of images and videos have grown rapidly in recent years. In parallel with this growth, content-based retrieval and querying of the indexed collections are required to access visual information. The three main components of visual information are color, texture and shape. In this thesis, an XML-based content-based image retrieval system is presented that combines three visual descriptors of MPEG-7 and measures the similarity of images by applying a distance function. An XML database is used for storing these three descriptors. The system is also extended to support high-dimensional indexing for efficient search and retrieval from its XML database. To this end, an index structure called the M-Tree is implemented, which uses a weighted Euclidean distance function as its similarity measure. Ordered Weighted Aggregation (OWA) operators are used to define the weights of the distance function and to combine the three features' distance functions into one. The system supports nearest-neighbor queries and three types of fuzzy queries: feature-based, image-based and color-based. Experimental results and an analysis of retrieval effectiveness show that the content-based retrieval system is effective in terms of both retrieval and scalability.
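A sketch of how such a fused distance might look: the defining property of an OWA operator is that its weights apply to the sorted arguments rather than to fixed positions, so it can interpolate between max, average, and min behavior. The feature vectors, weights, and the plain per-descriptor Euclidean distances below are illustrative assumptions, not the MPEG-7 distance definitions used in the thesis.

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Aggregation: weights are applied to the values
    sorted in descending order, emphasizing larger or smaller distances
    depending on how the weight mass is distributed."""
    assert np.isclose(sum(weights), 1.0)
    return float(np.dot(sorted(values, reverse=True), weights))

def combined_distance(query, image, weights=(0.5, 0.3, 0.2)):
    """Fuse color, texture and shape distances into the single score an
    M-Tree would index. Each per-descriptor distance is Euclidean here."""
    dists = [np.linalg.norm(np.asarray(query[k]) - np.asarray(image[k]))
             for k in ("color", "texture", "shape")]
    return owa(dists, weights)

q      = {"color": [0.2, 0.4], "texture": [0.1, 0.9], "shape": [0.5, 0.5]}
db_img = {"color": [0.3, 0.4], "texture": [0.2, 0.7], "shape": [0.4, 0.6]}
print(combined_distance(q, db_img))
```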
540

Some Aspects on Confirmatory Factor Analysis of Ordinal Variables and Generating Non-normal Data

Luo, Hao January 2011 (has links)
This thesis, which consists of five papers, is concerned with various aspects of confirmatory factor analysis (CFA) of ordinal variables and the generation of non-normal data. The first paper studies the performance of different estimation methods used in CFA when ordinal data are encountered. To take ordinality into account, four estimation methods, i.e., maximum likelihood (ML), unweighted least squares, diagonally weighted least squares, and weighted least squares (WLS), are used in combination with polychoric correlations. The effects of model size and number of categories on the parameter estimates, their standard errors, and the common chi-square measure of fit are examined for both correctly specified and misspecified models. The second paper focuses on the appropriate estimator of the polychoric correlation when fitting a CFA model. A non-parametric polychoric correlation coefficient based on the discrete version of Spearman's rank correlation is proposed to handle non-normal underlying distributions. The simulation study shows the benefits of using the non-parametric polychoric correlation under conditions of non-normality. The third paper raises the issue of simultaneous factor analysis. We study the effect of pooling multi-group data on the estimation of factor loadings. Given the same factor loadings but different factor means and correlations, we investigate how much information is lost by pooling the groups together and estimating only the combined data set using the WLS method. The parameter estimates and their standard errors are compared with results obtained by multi-group analysis using ML. The fourth paper uses a Monte Carlo simulation to assess the reliability of Fleishman's power method under various conditions of skewness, kurtosis, and sample size. Based on the generated non-normal samples, the power of D'Agostino's (1986) normality test is studied. The fifth paper extends the evaluation of algorithms to the generation of multivariate non-normal data. Apart from the requirement of generating reliable skewness and kurtosis, the generated data also need to possess the desired correlation matrices. Four algorithms are investigated in terms of simplicity, generality, and reliability of the technique.
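Fleishman's power method transforms a standard normal variate through a cubic polynomial whose coefficients are chosen to match target moments. A minimal sketch, assuming the standard moment-matching system from Fleishman (1978) and illustrative skewness/kurtosis targets:

```python
import numpy as np
from scipy.optimize import fsolve

def fleishman_system(coef, skew, ex_kurt):
    """Moment equations for Y = a + b*Z + c*Z**2 + d*Z**3 with Z ~ N(0, 1),
    constrained to E[Y] = 0 and Var[Y] = 1 (which forces a = -c)."""
    b, c, d = coef
    var = b**2 + 6*b*d + 2*c**2 + 15*d**2 - 1
    sk = 2*c*(b**2 + 24*b*d + 105*d**2 + 2) - skew
    ku = 24*(b*d + c**2*(1 + b**2 + 28*b*d)
             + d**2*(12 + 48*b*d + 141*c**2 + 225*d**2)) - ex_kurt
    return [var, sk, ku]

def fleishman_sample(n, skew=0.75, ex_kurt=0.8, seed=None):
    b, c, d = fsolve(fleishman_system, x0=[1.0, 0.0, 0.0],
                     args=(skew, ex_kurt))
    z = np.random.default_rng(seed).standard_normal(n)
    return -c + b*z + c*z**2 + d*z**3   # a = -c centres the distribution

y = fleishman_sample(100_000, seed=1)
print(round(y.mean(), 3), round(y.std(), 3))  # ~0 and ~1 by construction
```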
