291

Tonsils: a risk factor for moderate and severe chronic periodontitis?

Wynn, William Bernard. January 2002 (has links) (PDF)
Thesis--University of Oklahoma. / Includes bibliographical references (leaves 39-41).
292

SAR remote sensing of soil moisture

Snapir, Boris 12 1900 (has links)
Synthetic Aperture Radar (SAR) has been identified as a good candidate to provide high-resolution soil moisture information over extended areas. SAR data could be used as observations within a global Data Assimilation (DA) approach to benefit applications such as hydrology and agriculture. Prior to developing an operational DA system, one must tackle the following challenges of soil moisture estimation with SAR: (1) the dependency of the measured radar signal on both soil moisture and soil surface roughness, which leads to an ill-conditioned inverse problem, and (2) the difficulty in characterizing the spatial/temporal surface roughness of natural soils and its scattering contribution. The objectives of this project are (1) to develop a roughness measurement method to improve the spatial/temporal characterization of soil surface roughness, and (2) to investigate to what extent the inverse problem can be solved by combining multi-polarization, multi-incidence, and/or multi-frequency radar measurements. The first objective is achieved with a measurement method based on Structure from Motion (SfM). It is tailored to monitor changes in natural surface roughness, which have often been assumed negligible without supporting evidence. The measurement method is flexible, affordable, and straightforward, and generates Digital Elevation Models (DEMs) for a SAR-pixel-size plot with mm accuracy. A new processing method based on band-filtering of the DEM and its 2D Power Spectral Density (PSD) is proposed to compute the classical roughness parameters. Time series of DEMs show that non-negligible changes in surface roughness can happen within two months at scales relevant for microwave scattering. The second objective is achieved using maximum likelihood fitting of the Oh backscattering model to (1) full-polarimetric Radarsat-2 data and (2) simulated multi-polarization / multi-incidence / multi-frequency radar data.
Model fitting with the Radarsat-2 images leads to poor soil moisture retrieval, which is related to inaccuracy of the Oh model. Model fitting with the simulated data quantifies the amount of multilooking needed, for different combinations of measurements, to mitigate the critical effect of speckle on soil moisture uncertainty. Results also suggest that dual-polarization measurements at L- and C-bands are a promising combination to achieve the observation requirements of soil moisture. In conclusion, the SfM method along with the recommended processing techniques is a good candidate to improve the characterization of surface roughness. A combination of multi-polarization and multi-frequency radar measurements appears to be a robust basis for a future Data Assimilation system for global soil moisture monitoring.
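The inversion strategy described above can be illustrated with a toy sketch. The forward model below is an invented stand-in, not the published Oh model, and its coefficients are made up for illustration; the point is that maximum-likelihood retrieval under Gaussian noise reduces to a least-squares search over moisture and roughness:

```python
import numpy as np

def forward_model(mv, ks):
    # Toy backscatter "model" in dB -- invented coefficients, NOT the
    # published Oh model: sigma0 rises with moisture mv and roughness ks.
    svv = -15.0 + 20.0 * mv + 5.0 * np.log10(ks)   # co-pol channel
    shv = svv - 12.0 + 3.0 * ks                    # cross-pol channel
    return np.array([svv, shv])

def invert(obs, mv_grid, ks_grid):
    # Maximum-likelihood inversion under Gaussian noise reduces to
    # least-squares; here a brute-force grid search over (mv, ks).
    best, best_cost = None, np.inf
    for mv in mv_grid:
        for ks in ks_grid:
            cost = np.sum((forward_model(mv, ks) - obs) ** 2)
            if cost < best_cost:
                best, best_cost = (mv, ks), cost
    return best

truth = (0.25, 0.8)                      # volumetric moisture, roughness
obs = forward_model(*truth)              # noise-free "measurement"
mv_hat, ks_hat = invert(obs,
                        np.linspace(0.05, 0.45, 81),
                        np.linspace(0.2, 2.0, 91))
```

With two channels and two unknowns the noise-free inversion recovers the truth exactly; speckle, as the abstract notes, is what degrades this in practice.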
293

Statistical Idealities and Expected Realities in the Wavelet Techniques Used for Denoising

DeNooyer, Eric-Jan D. 01 January 2010 (has links)
In the field of signal processing, one of the underlying enemies in obtaining a good quality signal is noise. The most common examples of signals that can be corrupted by noise are images and audio signals. Since the early 1980s, when wavelet transformations became a formally defined tool, statistical techniques have been incorporated into processes that use wavelets with the goal of maximizing signal-to-noise ratios. We provide a brief history of wavelet theory, going back to Alfréd Haar's 1909 dissertation on orthogonal functions, as well as its important relationship to the earlier work of Joseph Fourier (circa 1801), which brought about that famous mathematical transformation, the Fourier series. We demonstrate how wavelet theory can be used to reconstruct an analyzed function, and hence that it can be used to analyze and reconstruct images and audio signals as well. Then, in order to ground the understanding of the application of wavelets to the science of denoising, we discuss some important concepts from statistics. From all of these, we introduce the subject of wavelet shrinkage, a technique that combines wavelets and statistics into a "thresholding" scheme that effectively reduces noise without doing too much damage to the desired signal. Subsequently, we discuss how the effectiveness of these techniques is measured, both in the ideal sense and in the expected sense. We then look at an illustrative example of the application of one technique. Finally, we analyze this example more generally, in accordance with the underlying theory, and draw some conclusions as to when wavelets are an effective technique for increasing a signal-to-noise ratio.
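A minimal sketch of the wavelet shrinkage idea, using a one-level Haar transform and the classical universal threshold of Donoho and Johnstone (the test signal and noise level here are invented for illustration):

```python
import numpy as np

def haar_dwt(x):
    # One level of the orthonormal Haar discrete wavelet transform.
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    # Inverse of the one-level Haar transform.
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft_threshold(d, t):
    # "Wavelet shrinkage": pull detail coefficients toward zero.
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

rng = np.random.default_rng(0)
n = 1024
clean = np.sign(np.sin(2 * np.pi * np.arange(n) / 256))  # piecewise-constant
noisy = clean + 0.3 * rng.standard_normal(n)

a, d = haar_dwt(noisy)
sigma = np.median(np.abs(d)) / 0.6745       # robust noise-level estimate
t = sigma * np.sqrt(2 * np.log(n))          # universal threshold
denoised = haar_idwt(a, soft_threshold(d, t))

snr = lambda s, e: 10 * np.log10(np.sum(s**2) / np.sum((s - e)**2))
```

For this piecewise-constant signal the Haar details of the clean part are sparse, so thresholding removes mostly noise and the output SNR exceeds the input SNR.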
294

Relationship of rehabilitation counselors' ethnicity match and cultural competency to service provision and employment outcome for vocational rehabilitation consumers

Jorgensen-Wagers, Kendra Lara 27 April 2015 (has links)
The purpose of this dissertation is to explore whether rehabilitation counselor/consumer racial similarity is related to rehabilitation services provided and employment outcomes achieved. This research contributes additional information to the research base on how to educate and train practicing vocational rehabilitation counselors to support culturally and linguistically diverse (CLD) consumers more effectively. Information regarding racial dyads may help to articulate whether CLD counselors are more effective with CLD consumers. Utilizing the RSA-911 data file submitted by California for fiscal year 2006 as well as 189 Multicultural Counseling Inventory (MCI) surveys collected on practicing vocational rehabilitation counselors, ANOVA, MANOVA, ANCOVA and chi-square analysis were used to measure relationships among counselor and consumer dyads that were either similar or dissimilar in ethnicity and case service variables and outcome variables. Additional analysis focused on the influence of high versus low cultural competency of rehabilitation counselors and the same consumer case service and outcome variables. Study results revealed statistically significant differences on MCI total scores between counselor ethnic groups, with Hispanic counselors demonstrating the highest mean scores overall on the MCI. In addition, significant differences existed between counselor ethnic groups on MCI total scores, as well as all four subscale scores. Chi-square was used to investigate the effect of counselor ethnicity and cultural competency scores on the case service variables of job search, rehabilitation technology, maintenance and college training, as well as outcome variables of competitive employment. MANOVA and ANCOVA were used for the continuous outcome variables of cost of services, wages at closure and weekly earnings at closure. 
Significant chi-square results were found for some of the service and outcome variables for both the matched counselor/consumer dyads and the counselor competency score groups. More significant findings existed in the ANCOVA analysis for cultural competency scores than for matched ethnicity groups. Implications of the findings and suggestions for future research are discussed.
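As a generic illustration of the kind of chi-square analysis applied above (the counts below are invented for illustration, not the study's data):

```python
import numpy as np

# Hypothetical 2x2 contingency table:
# rows = counselor/consumer ethnicity matched vs. not matched,
# cols = competitive employment achieved vs. not achieved.
obs = np.array([[60, 40],
                [45, 55]])

# Expected counts under independence of rows and columns.
row = obs.sum(axis=1, keepdims=True)
col = obs.sum(axis=0, keepdims=True)
expected = row * col / obs.sum()

# Pearson chi-square statistic, 1 degree of freedom for a 2x2 table.
chi2 = np.sum((obs - expected) ** 2 / expected)
significant = chi2 > 3.841   # critical value at alpha = 0.05, df = 1
```

With these made-up counts the statistic is about 4.51, exceeding the 0.05 critical value, so the null hypothesis of independence would be rejected.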
295

Mixture distributions with application to microarray data analysis

Lynch, O'Neil 01 June 2009 (has links)
The main goal in analyzing microarray data is to determine the genes that are differentially expressed across two types of tissue samples or samples obtained under two experimental conditions. In this dissertation we propose two methods to determine differentially expressed genes. For the penalized normal mixture model (PNMM), we penalized both the variance and the mixing proportion parameters simultaneously. The variance parameter was penalized so that the log-likelihood is bounded, while the mixing proportion parameter was penalized so that its estimates are not on the boundary of its parameter space. The null distribution of the likelihood ratio test statistic (LRTS) was simulated so that we could perform a hypothesis test for the number of components of the penalized normal mixture model. In addition to simulating the null distribution of the LRTS for the penalized normal mixture model, we showed that the maximum likelihood estimates are asymptotically normal, a necessary first step toward proving the asymptotic null distribution of the LRTS. This result is a significant contribution to the field of normal mixture models. The modified p-value approach for detecting differentially expressed genes is also discussed in this dissertation. The modified p-value approach was implemented so that a hypothesis test for the number of components can be conducted using the modified likelihood ratio test. In the modified p-value approach we penalized the mixing proportion so that its estimates are not on the boundary of its parameter space. The null distribution of the LRTS was simulated so that the number of components of the uniform-beta mixture model can be determined. Finally, both methods, the penalized normal mixture model and the modified p-value approach, were applied to simulated and real data.
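A minimal sketch of fitting a two-component normal mixture by EM. The clipping floors below are a crude stand-in for the penalties discussed above (they keep the variances away from zero, so the likelihood stays bounded, and keep the mixing proportion off the boundary of its parameter space); the data are simulated, not microarray measurements:

```python
import numpy as np

def norm_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def em_two_component(x, n_iter=200, floor=1e-3):
    # EM for a two-component normal mixture with clipping floors.
    pi = 0.5
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each observation.
        p1 = pi * norm_pdf(x, mu[0], var[0])
        p2 = (1 - pi) * norm_pdf(x, mu[1], var[1])
        r = p1 / (p1 + p2)
        # M-step, with floors on the proportion and variances.
        pi = np.clip(r.mean(), floor, 1 - floor)
        mu = np.array([np.average(x, weights=r),
                       np.average(x, weights=1 - r)])
        var = np.maximum(np.array(
            [np.average((x - mu[0]) ** 2, weights=r),
             np.average((x - mu[1]) ** 2, weights=1 - r)]), floor)
    return pi, mu, var

rng = np.random.default_rng(1)
# "Null" genes centered at 0, "differentially expressed" genes at 3:
x = np.concatenate([rng.normal(0, 1, 700), rng.normal(3, 1, 300)])
pi, mu, var = em_two_component(x)
```

The fitted responsibilities of the second component would then serve to flag genes as differentially expressed.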
296

Bayesian Inference Approaches for Particle Trajectory Analysis in Cell Biology

Monnier, Nilah 28 August 2013 (has links)
Despite the importance of single particle motion in biological systems, systematic inference approaches to analyze particle trajectories and evaluate competing motion models are lacking. An automated approach for robust evaluation of motion models that does not require manual intervention is highly desirable to enable analysis of datasets from high-throughput imaging technologies that contain hundreds or thousands of trajectories of biological particles, such as membrane receptors, vesicles, chromosomes or kinetochores, mRNA particles, or whole cells in developing embryos. Bayesian inference is a general theoretical framework for performing such model comparisons that has proven successful in handling noise and experimental limitations in other biological applications. The inherent Bayesian penalty on model complexity, which avoids overfitting, is particularly important for particle trajectory analysis given the highly stochastic nature of particle diffusion. This thesis presents two complementary approaches for analyzing particle motion using Bayesian inference. The first method, MSD-Bayes, discriminates a wide range of motion models--including diffusion, directed motion, anomalous and confined diffusion--based on mean-square displacement analysis of a set of particle trajectories, while the second method, HMM-Bayes, identifies dynamic switching between diffusive and directed motion along individual trajectories using hidden Markov models. These approaches are validated on biological particle trajectory datasets from a wide range of experimental systems, demonstrating their broad applicability to research in cell biology.
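The flavor of MSD-based model comparison can be sketched with a simpler stand-in: fit competing MSD models to a simulated drifting trajectory and compare them with BIC, which, like the Bayesian evidence used in MSD-Bayes, penalizes extra parameters (all numbers here are illustrative):

```python
import numpy as np

def msd(traj, max_lag):
    # Time-averaged mean-square displacement of a 2D trajectory.
    return np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                     for lag in range(1, max_lag + 1)])

def bic(residuals, n_params, n_obs):
    # Gaussian BIC: goodness of fit plus a complexity penalty.
    rss = np.sum(residuals ** 2)
    return n_obs * np.log(rss / n_obs) + n_params * np.log(n_obs)

rng = np.random.default_rng(2)
dt, n, D = 0.1, 500, 0.5
v = np.array([1.0, 0.0])                           # drift velocity
steps = rng.normal(0, np.sqrt(2 * D * dt), size=(n, 2))
traj = np.cumsum(steps + v * dt, axis=0)           # diffusion + drift

lags = np.arange(1, 51) * dt
m = msd(traj, 50)

# Model D: MSD = 4 D t            (pure diffusion, 1 parameter)
# Model V: MSD = 4 D t + v^2 t^2  (diffusion + directed motion, 2 params)
A_d = 4 * lags[:, None]
A_v = np.column_stack([4 * lags, lags ** 2])
coef_d, *_ = np.linalg.lstsq(A_d, m, rcond=None)
coef_v, *_ = np.linalg.lstsq(A_v, m, rcond=None)
bic_d = bic(m - A_d @ coef_d, 1, m.size)
bic_v = bic(m - A_v @ coef_v, 2, m.size)
```

For this drifting trajectory the quadratic term is essential, so the directed-motion model wins despite its extra parameter; for a purely diffusive trajectory the penalty would favor the simpler model.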
297

Metamorphosis of Statue Square

吳智豪, Ng, Chi-ho. January 1994 (has links)
Thesis (Master of Architecture), Architecture
298

Statistical methods for investigating infarction treatment with different drugs

Stasiukaitytė, Irma 10 June 2004 (has links)
The goal of the present thesis is to ascertain the impact of different drugs intended for infarction treatment, and to investigate other factors that may cause bleeding during the operation and within the post-operative period. The investigation was carried out in two stages. During the first stage the data were prepared for processing (investigation of the sample homogeneity and normality); the second stage involved solution of the statistical tasks corresponding to the goals of the thesis. The methods of data analysis and the models of binary logistic and linear logistic regression were applied. 89 patients who survived myocardial infarction were investigated, and it was ascertained that there is no substantial difference between tranexamic acid and aprotinin. Bleeding complications may be caused by aspirin used before the operation. One of the complications, drainage, may be predicted from the amounts of haemoglobin, haematocrit, and creatinine in the blood. The binary logistic regression model supported the conclusion that smoking, hypothermia, EuroSCORE, and other factors have an impact on bleeding complications.
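A generic sketch of binary logistic regression of the kind applied above, fit by gradient ascent on the log-likelihood; the predictors and coefficients below are invented for illustration, not the study's variables:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    # Binary logistic regression by gradient ascent (no regularization).
    X = np.column_stack([np.ones(len(X)), X])   # add intercept column
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-X @ w))            # predicted probabilities
        w += lr * X.T @ (y - p) / len(y)        # log-likelihood gradient
    return w

rng = np.random.default_rng(3)
n = 400
# Hypothetical standardized predictors of a bleeding complication:
X = rng.standard_normal((n, 3))
true_w = np.array([-0.5, 1.2, -0.8, 0.6])       # intercept + 3 slopes
p = 1 / (1 + np.exp(-(true_w[0] + X @ true_w[1:])))
y = (rng.random(n) < p).astype(float)           # simulated outcomes

w_hat = fit_logistic(X, y)
```

Each fitted coefficient is a log-odds ratio: exp(w_hat[k]) is the multiplicative change in the odds of the outcome per unit change in predictor k.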
299

Framing pluralism: a reconfiguration of the Robson Square complex in downtown Vancouver, British Columbia

Bligh, Christopher Graham 09 March 2009 (has links)
This thesis addresses the question of how a dialogue may be developed between the socio-political notion of pluralism and a pluralistic public sphere, and the design of public space and public architecture. More particularly, it considers how architecture may accommodate multiple publics while simultaneously framing connection or association between them. Following an exploration of this question at an abstract, theoretical level, the thesis moves to a more specific architectural investigation. This investigation takes the form of a design project situated within the Robson Square complex in downtown Vancouver. The complex, constructed in the mid-1970s, was designed by Canada's pre-eminent architect Arthur Erickson and is the major civic space in the city. The thesis design project undertakes a theoretical re-evaluation and physical renovation of this utopian mega-structure, with the intention of shifting the existing homogeneity and institutionality of the complex to align it with the thesis argument. Through the vehicle of a design project, the abstract theoretical argument is translated and focused through the particularities of an architecture embedded in its site. This methodology requires the project to address issues connected to the existing masterwork, including: the question of how to conceptualize monumentality in a grid city; the relationship between the 'sacred' space of the civic circumstance and the 'profane' space of the street and commercial program; the form of the institution within the city; and the issue of working within, and manipulating, a pre-existing architectural language. Further, the scale of the site is reflected in the scope of the design project. The project moves from the development of urban design strategies to the detailed consideration of the material and construction of the different interventions.
The architecture remains, however, conceptual and is a demonstration of how the developed strategies may generate form and guide program. The project does not attempt to fully develop a building in detail. The thesis concludes with an afterword on the success of the project as a demonstration of the thesis argument. At the same time, the limits of architecture to act as a socio-political device are acknowledged. Further, speculations are made as to the way in which the position and the strategies developed within the thesis might inform wider discussions on architecture and the urbanism of grid cities such as Vancouver. The key concept forming the basis of these speculations is the idea of a 'difficult' co-existence of parts and a whole, reflecting the pluralistic ideal of association within fragmentation.
300

The Square Root Function of a Matrix

Gordon, Crystal Monterz 24 April 2007 (has links)
Having its origins in the increasingly popular field of matrix theory, the square root function of a matrix has received notable attention in recent years. In this thesis, we discuss some of the more common matrix functions and their general properties, but we specifically explore the square root function of a matrix and the most efficient method (Schur decomposition) of computing it. Calculating the square root of a 2×2 matrix by the Cayley-Hamilton Theorem is highlighted, along with square roots of positive semidefinite matrices and general square roots using the Jordan Canonical Form.
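The 2×2 Cayley-Hamilton formula highlighted above can be written down directly. Since R² = A and Cayley-Hamilton gives R² - tr(R)R + det(R)I = 0, one gets R = (A + √(det A) I) / √(tr A + 2√(det A)), valid whenever the denominator is nonzero (e.g. for positive definite A):

```python
import numpy as np

def sqrtm_2x2(A):
    # Principal square root of a 2x2 matrix via Cayley-Hamilton:
    #   sqrt(A) = (A + sqrt(det A) * I) / sqrt(tr A + 2 * sqrt(det A))
    d = np.sqrt(np.linalg.det(A))
    t = np.sqrt(np.trace(A) + 2 * d)
    return (A + d * np.eye(2)) / t

A = np.array([[33., 24.],
              [48., 57.]])
R = sqrtm_2x2(A)
# R = [[5., 2.], [4., 7.]], and R @ R recovers A exactly.
```

For larger matrices this trick does not apply, which is where the Schur-decomposition method discussed in the thesis takes over.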
