111

DNA Microarray Data Analysis and Mining: Affymetrix Software Package and In-House Complementary Packages

Xu, Lizhe 19 December 2003 (has links)
Data management and analysis represent a major challenge for microarray studies. In this study, Affymetrix software was used to analyze an HIV-infection dataset. The analysis produced remarkably different results depending on the parameters chosen within the software, highlighting the need for a standardized analysis tool that incorporates biological information about the genes in order to interpret microarray studies more reliably. To address the data management problem, in-house programs, including scripts and a database, were designed. The in-house programs were also used to overcome problems and inconveniences encountered during data analysis, including the management of gene lists. The database provides rapid connections to many online public databases, as well as integration of the original microarray data, relevant publications, and other useful information. Together, the in-house programs allow investigators to process and analyze full Affymetrix microarray datasets quickly.
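As a rough illustration of the kind of gene-list management the abstract describes, a minimal Python sketch follows; the file names and probe-set lists are hypothetical, and the thesis's actual in-house scripts are not reproduced here:

```python
# Minimal sketch of gene-list management for Affymetrix results.
# File names and probe-set IDs are hypothetical illustrations only.

def load_gene_list(path):
    """Read one probe-set ID per line, ignoring blank lines."""
    with open(path) as fh:
        return {line.strip() for line in fh if line.strip()}

# Lists produced by two runs of the analysis under different parameters.
run_a = load_gene_list("hiv_vs_control_paramsA.txt")
run_b = load_gene_list("hiv_vs_control_paramsB.txt")

# Genes called under both parameter settings, and genes whose call
# depends on the parameters chosen -- the discrepancy noted above.
robust = run_a & run_b
parameter_sensitive = run_a ^ run_b
print(f"{len(robust)} robust calls, {len(parameter_sensitive)} parameter-sensitive calls")
```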
112

The effectiveness of missing data techniques in principal component analysis

Maartens, Huibrecht Elizabeth January 2015 (has links)
A dissertation submitted to the Faculty of Science, University of the Witwatersrand, Johannesburg, in fulfilment of the requirements for the degree of Master of Science. Johannesburg, 2015. / Exploratory data analysis (EDA) methods such as Principal Component Analysis (PCA) play an important role in statistical analysis. The analysis assumes that a complete dataset is observed. If the underlying data contain missing observations, the analysis cannot proceed until a method for handling those missing observations is applied. Missing data are a problem in every area of research, yet researchers tend to ignore the problem even though missing observations can lead to incorrect results and conclusions. Many methods exist in the statistical literature for handling missing data, and several apply in the context of PCA specifically, but few studies have compared these methods to determine which is most effective. In this study the effectiveness of the Expectation Maximisation (EM) algorithm and the iterative PCA (iPCA) algorithm is assessed and compared against the well-known yet flawed methods of case-wise deletion (CW) and mean imputation. Two techniques for applying the multiple imputation (MI) method of Markov Chain Monte Carlo (MCMC) with the EM algorithm in a PCA context are suggested, and their effectiveness is evaluated against the other methods. The analysis is based on a simulated dataset, and the effectiveness of the methods is assessed using the sum of squared deviations (SSD) and the RV coefficient, a measure of similarity between two datasets. The results show that the MI technique that applies PCA in the calculation of the final imputed values and the iPCA algorithm are the most effective techniques compared to the others in the analysis.
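The iterative PCA imputation idea evaluated here can be sketched in a few lines. The following is a minimal illustration assuming a truncated-SVD reconstruction step and a simple convergence tolerance; it is not the dissertation's exact algorithm:

```python
import numpy as np

def ipca_impute(X, n_components=2, n_iter=100, tol=1e-6):
    """Iterative PCA imputation: start from mean imputation, then
    alternate between fitting a rank-k model and re-imputing the
    missing cells from its reconstruction."""
    X = np.asarray(X, dtype=float)
    missing = np.isnan(X)
    X_filled = np.where(missing, np.nanmean(X, axis=0), X)
    for _ in range(n_iter):
        mu = X_filled.mean(axis=0)
        U, s, Vt = np.linalg.svd(X_filled - mu, full_matrices=False)
        # Rank-k reconstruction of the centred data, plus the mean.
        X_hat = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mu
        new_filled = np.where(missing, X_hat, X)
        delta = np.max(np.abs(new_filled - X_filled))
        X_filled = new_filled
        if delta < tol:
            break
    return X_filled
```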
113

Problemas inversos em física experimental: a secção de choque fotonuclear e radiação de aniquilação elétron-pósitron / Inverse problems in experimental physics: the photonuclear cross section and electron-positron annihilation radiation

Takiya, Carlos 27 June 2003 (has links)
Methods for solving inverse problems applied to experimental data (regularization, Bayesian, and maximum entropy) were reviewed. A least-squares procedure with minimum-variance regularization (LS-MVR) was developed for solving linear inverse problems and applied to: a) simulated one-dimensional spectra; b) determination of the ³⁴S(γ, xn) cross section from bremsstrahlung yield; c) analysis of electron-positron annihilation radiation in aluminum from a coincidence experiment with two semiconductor detectors. The results are compared with those obtained by other methods.
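The LS-MVR procedure itself is not given in the abstract. As a generic illustration of regularized least squares for a linear inverse problem y = Kx + noise, the following sketch uses the standard Tikhonov (ridge) form; the smearing matrix and regularization weight are chosen purely for demonstration:

```python
import numpy as np

def regularized_ls(K, y, lam):
    """Solve min ||K x - y||^2 + lam * ||x||^2 (Tikhonov/ridge form)."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ y)

# Toy linear inverse problem: a Gaussian smearing matrix K blurs a
# true spectrum x_true, and noisy data y are "unfolded" back to x.
rng = np.random.default_rng(0)
n = 50
x_true = np.exp(-0.5 * ((np.arange(n) - 20) / 4.0) ** 2)
i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
K = np.exp(-0.5 * ((i - j) / 3.0) ** 2)
y = K @ x_true + rng.normal(0.0, 0.05, n)
x_hat = regularized_ls(K, y, lam=1.0)   # lam trades bias against variance
```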
114

Análise de dados utilizando a medida de tempo de consenso em redes complexas / Data analysis using the consensus time measure in complex networks

Lopez, Jean Pierre Huertas 30 March 2011 (has links)
Networks are powerful representations for many complex systems, where vertices represent elements of the system and edges represent connections between them. Complex networks can be defined as large-scale graphs with a non-trivial distribution of connections. An important topic in complex networks is community detection. Although community detection has produced good results in clustering data with groups of diverse shapes, some difficulties remain in representing a data set as a network. Another recent topic is the characterisation of simplicity in complex networks. Few studies have been reported in this area; however, the topic is highly relevant, since it allows analysing the simplicity of the connection structure of a region of vertices, or of the entire network. Moreover, by analysing the simplicity of dynamic networks over time, it is possible to understand how the network evolves in terms of simplicity. Considering the network as a coupled dynamical system of agents, this work proposes a distance measure based on the consensus time in the presence of a leader in a coupled network. Using this distance measure, a community detection method for data clustering analysis and a method for simplicity analysis in complex networks are proposed. Furthermore, a technique for building sparse networks for data clustering is proposed. The methods have been tested on artificial and real data, with promising results.
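As an illustration of the consensus-time idea, the following sketch simulates standard Laplacian consensus dynamics with one node pinned as leader and reports the time until all followers agree with it. The dynamics, tolerance, and example graph are assumptions for demonstration, not the thesis's exact formulation:

```python
import numpy as np

def consensus_time(A, leader, dt=0.01, tol=1e-3, max_steps=1_000_000):
    """Simulate x' = -L x with node `leader` pinned at 1.0 and return
    the time until every node is within `tol` of the leader's state.
    `dt` must be small relative to the largest Laplacian eigenvalue."""
    L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
    x = np.zeros(A.shape[0])
    x[leader] = 1.0
    for step in range(max_steps):
        if np.all(np.abs(x - 1.0) < tol):
            return step * dt
        x = x - dt * (L @ x)
        x[leader] = 1.0                     # leader holds its state
    return float("inf")

# Path graph on 5 vertices: the far end takes longest to follow the leader.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
print(consensus_time(A, leader=0))
```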
115

The distinction of simulated failure data by the likelihood ratio test

Drayer, Darryl D January 2011 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
116

Detection of outliers in failure data

Gallup, Donald Robert January 2011 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
117

Data analysis techniques useful for the detection of B-mode polarisation of the Cosmic Microwave Background

Wallis, Christopher January 2016 (has links)
Asymmetric beams can create significant bias in estimates of the power spectra from cosmic microwave background (CMB) experiments. With the temperature power spectrum many orders of magnitude stronger than the B-mode power spectrum, any systematic error that couples the two must be carefully controlled and/or removed. In this thesis, I derive unbiased estimators for the CMB temperature and polarisation power spectra taking into account general beams and scan strategies. I test my correction algorithm on simulations of two temperature-only experiments and demonstrate that it is unbiased. I also develop a map-making algorithm that removes beam-asymmetry bias at the map level, and demonstrate its implementation using simulations. I present two new map-making algorithms that create polarisation maps clean of temperature-to-polarisation leakage systematics due to differential gain and pointing between a detector pair. Where a half-wave plate is used, I show that the spin-2 systematic due to differential ellipticity can also be removed using my algorithms. The first algorithm is designed for scan strategies with a good range of crossing angles for each map pixel, and the second for scan strategies with a limited range of crossing angles. I demonstrate both algorithms using simulations of time-ordered data with realistic scan strategies and instrumental noise. I investigate the role a scan strategy can play in mitigating certain common systematics by averaging systematic errors down over many crossing angles. I present approximate analytic forms for the error on the recovered B-mode power spectrum that would result from these systematic errors, and use these analytic predictions to search the parameter space of common satellite scan strategies to identify the features of a scan strategy that have the most impact in mitigating systematic effects.
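The thesis's beam-correcting map-makers are not reproduced here, but the baseline they improve on, the naive binned map estimate m = (A^T A)^{-1} A^T d for white, uniform noise, can be sketched as follows (the toy pointing and noise values are illustrative only):

```python
import numpy as np

def binned_map(pix, tod, npix):
    """Naive binned map-maker: average the TOD samples falling in each
    pixel, i.e. m = (A^T A)^{-1} A^T d for white, uniform noise."""
    hits = np.bincount(pix, minlength=npix)
    summed = np.bincount(pix, weights=tod, minlength=npix)
    m = np.full(npix, np.nan)               # unseen pixels stay NaN
    seen = hits > 0
    m[seen] = summed[seen] / hits[seen]
    return m

# Toy scan: 100k samples of a 768-pixel sky with white noise.
rng = np.random.default_rng(1)
npix = 768
sky = rng.normal(0.0, 100e-6, npix)         # sky signal, arbitrary units
pix = rng.integers(0, npix, 100_000)        # pointing: pixel hit per sample
tod = sky[pix] + rng.normal(0.0, 1e-3, pix.size)
m = binned_map(pix, tod, npix)
```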
119

Determinants and consequences of working capital management

Supatanakornkij, Sasithorn January 2015 (has links)
Well-managed working capital plays an important role in running a sound and successful business, as it has a direct influence on liquidity and profitability. Working capital management (WCM) has recently received increased focus from businesses and has been regarded as a key managerial intervention to maintain solvency, especially during the global financial crisis when external financing was less available (PwC, 2012). This thesis contains a comprehensive analysis of the determinants and consequences of WCM. For the determinants of WCM, the results suggest that the nature of a firm’s WCM is determined by a combination of firm characteristics, economic conditions, and country-level variables. Sources of financing, firm size, and levels of profitability and investment in long-term assets play a vital role in the management of working capital. The financial downturn has also put increased pressure on firms to operate with a lower level of working capital. In addition, country-level variables (i.e., legal environment and culture) have a significant influence on a firm’s WCM as well as its determinants. For the consequences of WCM, the findings highlight the importance of higher efficiency in WCM in terms of its potential contribution to enhancing profitability. In particular, firms operating with lower accounts receivable, inventory, and accounts payable periods are associated with higher profitability. Firms can also enhance their profitability further by ensuring a proper “fit” among these components of working capital. Finally, achieving higher efficiency in inventory management can be a source of profitability improvements during a financial crisis. Overall, the thesis contributes to the accounting and finance literature in two distinct ways: research design and new findings. A more extensive dataset (in terms of country coverage and time frame), a new estimation technique (dynamic panel generalised method of moments (GMM) estimation, producing more consistent and reliable results), and substantive robustness tests (conspicuous by their absence in prior studies) were applied, resulting in several new empirical findings. First, a firm’s WCM is influenced not only by internal factors but also by external factors such as country setting, legal environment, and culture. Second, a comprehensive measure of WCM (i.e., the cash conversion cycle (CCC)) does not represent a useful surrogate for the effects of WCM on corporate profitability; instead, an examination of the individual components of the CCC gives more pronounced and valid results. Third, by managing working capital correctly, firms can enhance their profitability even further, at different levels and through different components of profitability (including profit margin and asset productivity).
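For reference, the cash conversion cycle the thesis examines is conventionally decomposed as CCC = DIO + DSO - DPO. A minimal sketch of that standard definition follows; it illustrates the textbook formula, not the thesis's exact variable construction:

```python
def cash_conversion_cycle(inventory, cogs, receivables, sales,
                          payables, days=365):
    """Standard CCC = DIO + DSO - DPO, from period-end balances.
    DPO is computed against cost of goods sold, one common convention."""
    dio = days * inventory / cogs       # days inventory outstanding
    dso = days * receivables / sales    # days sales outstanding
    dpo = days * payables / cogs        # days payables outstanding
    return dio + dso - dpo

# Toy figures: roughly 91 + 46 - 60 = 77 days of cash tied up.
print(cash_conversion_cycle(inventory=250, cogs=1000,
                            receivables=150, sales=1200, payables=165))
```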
120

Comparing Event Detection Methods in Single-Channel Analysis Using Simulated Data

Dextraze, Mathieu Francis 16 October 2019 (has links)
With more states revealed and more reliable rates inferred, mechanistic schemes for ion channels have increased in complexity over the history of single-channel studies. At the forefront of single-channel studies we are faced with a temporal barrier delimiting the briefest event that can be detected in single-channel data. Despite improvements in single-channel data analysis, the use of existing methods remains sub-optimal. Because existing methods in single-channel data analysis are unquantified, optimal conditions for data analysis are unknown. Here we present a modular single-channel data simulator with two engines: a Hidden Markov Model (HMM) engine and a sampling engine. The simulator is a tool which provides the a priori information necessary to quantify and compare existing methods in order to optimize analytic conditions. We demonstrate the utility of our simulator by providing a preliminary comparison of two event detection methods in single-channel data analysis: Threshold Crossing and Segmental k-means with Hidden Markov Modelling (SKM-HMM).
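The two ingredients the abstract names, a Markov-model engine and threshold-crossing detection, can be illustrated with a minimal two-state sketch. The transition probabilities, amplitude, and noise level below are hypothetical, and the actual simulator is far more general:

```python
import numpy as np

def simulate_channel(n, p_open=0.02, p_close=0.05, amp=1.0,
                     noise=0.2, seed=0):
    """Two-state (closed/open) Markov chain sampled at fixed intervals,
    with Gaussian recording noise added to the ideal current."""
    rng = np.random.default_rng(seed)
    state = np.zeros(n, dtype=int)
    s = 0
    for t in range(n):
        if s == 0:
            s = int(rng.random() < p_open)      # closed -> open
        else:
            s = int(rng.random() >= p_close)    # open -> closed
        state[t] = s
    trace = amp * state + rng.normal(0.0, noise, n)
    return state, trace

def threshold_events(trace, thresh=0.5):
    """Half-amplitude threshold crossing: contiguous runs of samples
    above `thresh` are events, returned as (start, duration) pairs."""
    above = np.concatenate(([False], trace > thresh, [False]))
    d = np.diff(above.astype(int))
    starts = np.flatnonzero(d == 1)
    ends = np.flatnonzero(d == -1)
    return list(zip(starts, ends - starts))

state, trace = simulate_channel(10_000)
events = threshold_events(trace)   # compare against the true `state` runs
```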
