341

"Spaghetti "主成份分析之延伸-應用於時間相關之區間型台灣股價資料 / An extension of Spaghetti PCA for time dependent interval data

陳品達, Chen, Pin-Da Unknown Date (has links)
Abstract: Principal component analysis for interval-valued data is not yet mature in some application areas, such as stock prices, which are closely tied to time; this motivated the analysis of time-dependent interval data (Irpino, 2006. Pattern Recognition Letters 27, 504-513). This thesis builds on that analysis and studies time-dependent interval-valued Taiwan stock price data. The original "Spaghetti" PCA of Irpino (2006) considers only each week's opening and closing prices. To extract more information, we propose three methods. Method 1 adds each week's highest (or lowest) price, extending the two-point analysis to a three-point analysis. Method 2 uses the high and the low together, giving a four-point analysis. Both methods recover information that the original method cannot, namely the degree of stability of a company, and Method 2 does so more accurately. Method 3 follows a suggestion in Irpino (2006) and changes the within-interval distribution from uniform to beta; its results differ little from those of the original method. For the empirical analysis we collected 17 weeks of stock price data (September 1 to December 26, 2008) for 30 semiconductor companies and 47 constituent companies of the TSEC Taiwan 50 index in the Taiwan financial market. For the Taiwan 50, the analysis indicates favorable future prospects for Delta Electronics Inc. (No. 17) and Foxconn/Hon Hai Technology Group (No. 24), and unfavorable prospects for ITE (Integrated Technology Express, No. 10) and 7-ELEVEn (President Chain Store Corporation, No. 35); the price movements of these four companies over January 5-7, 2009 did in fact bear these predictions out. The results also show that financial-sector companies are more stable than electronics-sector companies. Keywords: Principal component analysis; Interval data; Time dependent
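The four-point representation in Method 2 can be illustrated with a toy computation. The sketch below is not from the thesis; the data, dimensions, and variable names are invented, and ordinary PCA on stacked weekly open/high/low/close values stands in for the interval-aware Spaghetti PCA:

```python
import numpy as np

# Hypothetical weekly price data: shape (n_stocks, n_weeks, 4), where the
# last axis holds the four weekly points used by Method 2
# (open, high, low, close). Synthetic random-walk values for illustration.
rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, (5, 17, 4)), axis=1) + 100

# Flatten each stock's trajectory into one row: a crude stand-in for the
# four-point interval representation analysed in the thesis.
X = prices.reshape(prices.shape[0], -1)

# Centre and run ordinary PCA via SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                      # stock coordinates on the components
explained = s**2 / np.sum(s**2)     # variance share per component
print(explained[:3])
```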
342

The classification patterns of bank financial ratios

Kordogly, Rima January 2010 (has links)
Financial ratios are key units of analysis in most quantitative financial research including bankruptcy prediction, performance and efficiency analysis, mergers and acquisitions, and credit ratings, amongst others. Since hundreds of ratios can be computed using available financial data, and given the substantial overlap in information provided by many of these ratios, choosing amongst ratios has been a significant issue facing practitioners and researchers. An important contribution of the present thesis is to show that ratios can be arranged into groups where each group describes a separate financial aspect or dimension of a given firm or industry. Then, by choosing representative ratios from each group, a small, yet comprehensive, set of ratios can be identified and used for further analysis. Whilst a substantial part of the financial ratio literature has focused on classifying financial ratios empirically and on assessing the stability of the ratio groups over different periods and industries, relatively little attention has been paid to the classification of financial ratios in the banking sector. This study aims to explore the classification patterns of 56 financial ratios for banks of different types, sizes and ages. Using data from the Uniform Bank Performance Report (UBPR), large samples of commercial, savings, and De Novo (newly chartered) commercial banks were obtained for the period between 2001 and 2005, inclusive. Principal Component Analysis (PCA) was performed on a yearly basis to classify the banks' ratios after applying the inverse sinh transformation to enhance the distributional properties of the data. The number of patterns was determined using Parallel Analysis. The study also uses various methods, including visual comparison, correlation, congruency, and transformation analysis, to assess the time-series stability and cross-sectional similarity of the identified ratio patterns. A sketch of this pipeline follows the abstract. The study identifies 13 or 14 ratio patterns for commercial banks and 10 or 11 ratio patterns for savings banks over the period on which the study is based. These patterns are generally stable over time; yet, some dissimilarity was found between the ratio patterns for the two types of banks, that is, the commercial and savings banks. A certain degree of dissimilarity was also found between the financial patterns for commercial banks belonging to different asset-size classes. Furthermore, four ratio patterns were consistently identified for the De Novo commercial banks in the first year of their operations. However, no evidence of convergence was found between the ratio patterns of the De Novo commercial banks and the ratio patterns of the incumbent (that is, long-established) commercial banks. The findings of this study offer useful insights, particularly to researchers who employ bank financial ratios in empirical analysis. Methodologically, this research pioneers the application of the inverse sinh transformation and parallel analysis in the ratio-classification literature. Also, it contributes to the use of transformation analysis as a factor-comparison technique by deriving a significance test for the outputs of this analysis. Moreover, this is the only large-scale study to be conducted on the classification patterns of bank financial ratios.
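The core of the pipeline described above can be sketched in a few lines. This is an illustration on synthetic data, not the thesis's code; the ratio matrix, sample size, and the 95th-percentile retention rule are placeholders. It applies the inverse sinh (arsinh) transform, standardises, and uses Horn's Parallel Analysis to choose the number of ratio patterns:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 56))  # 56 synthetic ratios

# Inverse sinh transform: behaves like a log for large values but is
# defined for zero and negative ratios as well.
Xt = np.arcsinh(X)

# Standardise, then take eigenvalues of the correlation matrix (PCA).
Z = (Xt - Xt.mean(0)) / Xt.std(0)
eig = np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]

# Horn's Parallel Analysis: retain components whose eigenvalues exceed
# those obtained from random data of the same shape.
n_iter = 100
rand_eigs = np.empty((n_iter, Z.shape[1]))
for i in range(n_iter):
    R = rng.standard_normal(Z.shape)
    rand_eigs[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
threshold = np.percentile(rand_eigs, 95, axis=0)
n_patterns = int(np.sum(eig > threshold))  # simple count variant
print(n_patterns)
```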
343

Time-of-flight secondary ion mass spectrometry - fundamental issues for quantitative measurements and multivariate data analysis

Lee, Joanna L. S. January 2011 (has links)
Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is a powerful technique for the analysis of organic surfaces and interfaces for many innovative technologies. However, despite recent developments, there are still many issues and challenges hindering the robust, validated use of ToF-SIMS for quantitative measurement. These include: the lack of metrology and fundamental understanding for the use of novel cluster primary ion beams such as C60^n+ and Ar2000^+; the need for validated and robust measurement protocols for difficult samples, such as those with significant micron-scale surface topography; the lack of guidance on novel data analysis methods, including multivariate analysis, which have the potential to simplify many time-consuming and labour-intensive analyses in industry; and the need to establish best practice to improve the accuracy of measurements. This thesis describes research undertaken to address the above challenges. Sample topography and field effects were evaluated experimentally using model conducting and insulating fibres and compared with computer simulations to provide recommendations for diagnosing and reducing these effects. Two popular multivariate methods, principal component analysis (PCA) and multivariate curve resolution (MCR), were explored using mixed organic systems consisting of a simple polymer blend and complex hair fibres treated with a multi-component formulation, in order to evaluate different multivariate and data-preprocessing methods for the optimal identification, localisation and quantification of the chemical components. Finally, cluster ion beams C60^n+ and Ar500-2500^+ were evaluated on an inorganic surface and an organic delta-layer reference material, respectively, to elucidate the fundamental metrology of cluster ion sputtering and pave the way for their use in organic depth profiling. These studies provide the essential metrological foundation to address frontier issues in surface and nanoanalysis and extend the measurement capabilities of ToF-SIMS.
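One concrete preprocessing choice often weighed before PCA on ToF-SIMS count data is Poisson scaling (after Keenan and Kotula), which evens out shot-noise variance. A minimal sketch on a synthetic pixels-by-peaks count table follows; the data and dimensions are invented, and the thesis's actual preprocessing comparison is considerably broader:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic ToF-SIMS peak table: pixels x peaks, Poisson-distributed counts.
counts = rng.poisson(lam=rng.uniform(1, 50, size=(200, 30)))

# Poisson scaling: divide by the square roots of the row and column means
# so that the shot-noise variance is roughly uniform before PCA.
g = np.sqrt(counts.mean(axis=1, keepdims=True))
h = np.sqrt(counts.mean(axis=0, keepdims=True))
D = counts / (g * h)

# PCA of the scaled, centred table via SVD.
Dc = D - D.mean(axis=0)
U, s, Vt = np.linalg.svd(Dc, full_matrices=False)
print(s[:5]**2 / np.sum(s**2))  # variance captured by the first components
```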
344

X-Ray Micro- and Nano-Diffraction Imaging on Human Mesenchymal Stem Cells and Differentiated Cells

Bernhardt, Marten 15 June 2016 (has links)
No description available.
345

Quantifying Adoption Intensity for Weed-Resistance Management Practices and Its Determinants among US Soybean, Corn, and Cotton Farmers

Dong, Fengxia, Mitchell, Paul D., Hurley, Terrance M., Frisvold, George B. 01 1900 (has links)
Using data envelopment analysis with principal components, we calculate an adoption-intensity index for herbicide-resistance best management practices (BMPs). Empirical results for over 1,100 farmers in twenty-two U.S. states suggest that many farmers could improve their herbicide-resistance BMP adoption. Two-limit truncated regression results show that higher yields and a greater proportion of acres planted with Roundup Ready® seeds motivate weed-BMP adoption. While soybean and corn farmers have lower adoption intensity than cotton farmers, farmer educational attainment and greater concern for herbicide effectiveness and for human and environmental safety are found to help increase the adoption of weed BMPs.
346

Strategies for Deriving a Single Measure of the Overall Burden of Antimicrobial Resistance in Hospitals

Orlando, Alessandro 11 May 2010 (has links)
Background: Antimicrobial-resistant infections result in hospital stays costing between $18,000 and $29,000. As of 2009, the Centers for Medicare and Medicaid Services no longer provides enhanced payment for hospital-acquired infections. Hospital epidemiologists monitor and document rates of individual resistant microbes in antibiogram reports, so overall summary measures capturing resistance within a hospital may be useful. Objectives: We applied four techniques (L1- and L2-principal component analysis (PCA), desirability functions, and a simple sum) to create summary measures of resistance and described the four summary measures with respect to reliability, proportion of variance explained, and clinical utility. Methods: We requested antibiograms from hospitals participating in the University HealthSystem Consortium for the years 2002–2008 (n=40). A clinical team selected organism-drug pairs (measured as resistant isolates per 1,000 patient-days) based on 1) virulence, 2) complicated or toxic therapies, 3) transmissibility, and 4) high incidence with increasing levels of resistance. Four methods were used to create summary scores: 1) L1- and L2-PCA, which derive multipliers so that the explained variance is maximized; 2) a desirability function, which transforms the resistance data to lie between 0 and 1; 3) a simple sum, in which the resistance rates are added and divided by the square root of the number of microbes summed. Simple correlation analyses between time and each summary score evaluated reliability. For each year, we calculated the proportion of explained variance by dividing each summary score's variance by the variance in the original data. Clinical utility was checked by comparing the trends in the individual microbes' resistance rates to the trends in the summary scores for each hospital. Results: The proportion of variance explained by L1-PCA, L2-PCA, and the simple sum was 0.61, 0.62, and 0.29, respectively. The simple-sum and the L1- and L2-PCA summary scores best followed the trends seen in the individual antimicrobial-resistance rates; trends in desirability-function scores deviated from the individual trends. The L1- and L2-PCA summary scores were more influenced by MRSA rates, and the simple-sum score less so. Pearson correlation coefficients revealed good reliability through time. Conclusion: Summary measures of antimicrobial resistance can be reliable over time and explain a high proportion of variance. Infection control practitioners and hospital epidemiologists may find a summary score of antimicrobial resistance beneficial for describing trends in overall resistance in their yearly antibiogram reports.
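The simple-sum score is stated explicitly enough to write down directly. A small sketch of that calculation, with hypothetical example rates:

```python
import numpy as np

def simple_sum_score(rates):
    """Summary resistance score as described in the abstract: the
    resistance rates (resistant isolates per 1,000 patient-days) are
    summed and divided by the square root of the number of
    organism-drug pairs included."""
    rates = np.asarray(rates, dtype=float)
    return rates.sum() / np.sqrt(rates.size)

# Hypothetical hospital antibiogram rates for four organism-drug pairs.
print(simple_sum_score([0.8, 0.3, 1.2, 0.1]))
```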
347

Ekologie společenstev z hlediska klasické a bayesovské statistiky / Community ecology from the perspective of classic and bayesian statistics

Klimeš, Adam January 2016 (has links)
Community ecology from the perspective of classic and Bayesian statistics. Author: Adam Klimeš. Supervisor: Mgr. Petr Keil, Ph.D. Abstract: Quantitative evaluation of evidence through statistics is a central part of present-day science. The Bayesian approach represents an emerging but rapidly developing enrichment of statistical analysis, whose foundations differ from those of classic methods. These differences, such as the different interpretation of probability, are often seen as obstacles to the acceptance of the Bayesian approach. In this thesis I outline ways to deal with the assumptions of the Bayesian approach and address the main objections against it. I present the Bayesian approach as a new way to handle data to answer scientific questions, from the standpoint of community ecology: I illustrate the novelty that the Bayesian approach brings to the analysis of typical community-ecology data, specifically multivariate datasets. I focus on principal component analysis, one of the typical and frequently used analytical techniques. I carry out Bayesian analyses analogous to classic principal component analysis and report the advantages of the Bayesian version, such as the possibility of working with...
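The abstract does not spell out the Bayesian implementation used. As background only, the closed-form maximum-likelihood solution of probabilistic PCA (Tipping and Bishop), the latent-variable model that fully Bayesian PCA variants place priors on, can be computed in a few lines on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 8)) @ rng.normal(size=(8, 8))  # synthetic data

# Closed-form maximum-likelihood probabilistic PCA (Tipping & Bishop):
# keep q components; the discarded eigenvalues estimate the noise level.
q = 3
Xc = X - X.mean(0)
S = np.cov(Xc, rowvar=False)
vals, vecs = np.linalg.eigh(S)
vals, vecs = vals[::-1], vecs[:, ::-1]        # sort descending
sigma2 = vals[q:].mean()                      # noise variance estimate
W = vecs[:, :q] * np.sqrt(vals[:q] - sigma2)  # ML loading matrix
print(sigma2, W.shape)
```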
348

Two Essays: “Does Corporate Governance Affect the Adjustment Speed towards Target Capital Structure?” and “Do Option Traders on REITs and Non-REITs React Differently to New Information?”

Liao, Li-Kai 18 May 2012 (has links)
The first chapter investigates how corporate governance influences firms' capital structure behavior. Based on the premise that the costs associated with deviations from the target capital structure increase with the extent of the deviation, we hypothesize that the initial deviation from the target will be smaller for a firm with good corporate governance than for a firm with poor corporate governance. We also hypothesize that the former group will adjust towards the target at a higher speed than the latter, primarily for the following reasons. First, a firm with a well-functioning governance system will adjust at a faster rate because the longer it stays deviated, the greater the loss of value it faces. Second, firms with better governance structures enjoy lower adjustment costs. We develop three sets of measures for the quality of corporate governance and analyze how they influence a firm's rebalancing behavior in the presence of relevant control variables. Our results are consistent with the hypotheses. The second chapter explores investors' reactions to new information in the REIT and non-REIT option markets. The real estate market can be fairly volatile; what remains unclear is whether price changes are excessively volatile relative to fundamentals. This study examines that question using the methodology of Stein (1989), which utilizes option data. The advantage of using option data rather than stock data to assess reactions to information is that option valuation is not affected by changes in the risk premium. Under volatility mean reversion, the changes in the implied volatilities of long-term options should be smaller than those of short-term options; if they are not, an excessive reaction is suggested. Specifically, the study compares the changes in implied volatilities of options on REITs and non-REITs. Because real estate transactions typically involve a great degree of leverage, reactions could be greater for REITs than for non-REITs; on the other hand, there are several reasons REITs may be subject to a lower degree of excessive reaction. Empirical results indicate that reactions to information are stronger in non-REITs than in REITs. Moreover, we find that down markets are associated with stronger reactions, which we argue might be due to a leverage effect.
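The abstract does not state its estimating equation; a standard partial-adjustment specification from the capital-structure literature, given here as an assumed illustration rather than the essay's exact model, makes the "speed of adjustment" concrete:

```latex
% Partial-adjustment model (assumed illustration, not the essay's exact model):
\[
  Lev_{i,t} - Lev_{i,t-1}
    = \lambda \left( Lev^{*}_{i,t} - Lev_{i,t-1} \right) + \varepsilon_{i,t}
\]
% Here $Lev^{*}_{i,t}$ is firm $i$'s estimated target leverage and
% $\lambda \in (0,1]$ is the speed of adjustment: each period the firm
% closes a fraction $\lambda$ of the gap to its target, so the hypothesis
% is that better-governed firms exhibit a larger $\lambda$.
```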
349

[en] NEURAL NETWORKS APPLIED TO PROXIES FOR RESERVOIR AND SURFACE INTEGRATED SIMULATION / [pt] REDES NEURAIS APLICADAS À CONSTRUÇÃO DE APROXIMADORES PARA SIMULAÇÃO INTEGRADA ENTRE RESERVATÓRIO E SISTEMA DE PRODUÇÃO

MANOELA RABELLO KOHLER 01 August 2014 (has links)
[en] The development of an oil reservoir consists in finding an alternative (configuration) of wells that maximizes the revenue obtained from the oil recovered from the reservoir. The search for this alternative is often based on optimization processes that use the project's net present value (NPV) as the evaluation function for the alternatives found during the search. Among other variables, the NPV calculation depends directly on the oil, gas and water production over the productive life of the reservoir, as well as on the development costs. Determining the number, location, type (producer or injector) and trajectory of the wells in a reservoir is a complex optimization problem that depends on a large number of variables, including reservoir properties (such as porosity and permeability) and economic criteria. Optimization processes applied to this type of problem have a high computational cost due to the continuous use of simulators that reproduce the conditions of the reservoir and the surface system. The simulators can be replaced by proxies; in this work, the proxies are built with artificial neural networks and replace the integrated reservoir, well and surface (production lines and riser) simulation, reducing the computational cost of a decision-support system. The samples for building the proxies are produced with the reservoir and surface simulators; to reduce the number of samples needed and the dimension of the problem, Latin Hypercube sampling and Principal Component Analysis are used. The proxies were tested on two oil reservoirs, one synthetic and one based on a real case. The results indicate that these proxies perform well as replacements for the simulators in the optimization process, with low errors and a substantial reduction in computational cost.
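The sampling-plus-proxy idea can be sketched compactly. Everything below is invented for illustration: the six input parameters, their bounds, and the toy function standing in for the expensive integrated reservoir/surface simulator. Latin Hypercube sampling generates the design, PCA compresses the inputs, and a small neural network serves as the proxy:

```python
import numpy as np
from scipy.stats import qmc
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

# Latin Hypercube design over 6 hypothetical well/reservoir parameters,
# scaled to placeholder bounds.
sampler = qmc.LatinHypercube(d=6, seed=0)
X = qmc.scale(sampler.random(n=300), l_bounds=[0.0] * 6, u_bounds=[1.0] * 6)

def fake_simulator(x):
    # Toy stand-in for the expensive integrated simulator.
    return np.sin(3 * x[:, 0]) + x[:, 1] * x[:, 2] - 0.5 * x[:, 3] ** 2

y = fake_simulator(X)

# PCA compresses the inputs before training the neural-network proxy.
Z = PCA(n_components=4).fit_transform(X)
proxy = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                     random_state=0).fit(Z, y)
print(proxy.score(Z, y))  # in-sample fit of the surrogate
```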
350

Caractérisation de paramètres cosmétologiques à partir d'images multispectrales de peau / Characterization of cosmetologic data from multispectral skin images

Corvo, Joris 01 December 2016 (has links)
Thanks to the spatial and spectral information it provides, multispectral imaging of the skin has become an essential tool in dermatology. This thesis assesses the value of this tool for cosmetology through three studies: the detection of a foundation make-up, age assessment, and roughness measurement. A database of multispectral skin images is built using a system of multiple optical filters, and a preprocessing step standardizes the images and enhances their texture before analysis. The covariance matrices of the acquisitions can be displayed in a multidimensional scaling space, which constitutes a novel way to visualize multivariate data sets; likewise, a new PCA-based dimensionality-reduction alternative is proposed in this thesis. A thorough study of the texture of the multispectral images is carried out: texture features from mathematical morphology, and from image analysis more generally, are extended to multivariate images. In this process, several spectral distances are tested, among them a new distance combining the LIP model with the Asplund metric. Statistical predictions generated from the texture data support conclusions about the effectiveness of the data processing and the relevance of multispectral imaging for the three cosmetological studies considered.
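Two of the building blocks mentioned above, reducing the spectral axis and measuring a distance between pixel spectra, can be illustrated simply. The sketch below uses a random stand-in image, plain PCA, and a Euclidean spectral distance; the thesis's own dimensionality-reduction alternative and its LIP/Asplund distance are richer than this:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical multispectral skin image: 64 x 64 pixels, 10 spectral bands.
img = rng.random((64, 64, 10))

# Reduce the spectral axis with PCA so that texture operators defined for
# scalar images can be applied component by component.
pixels = img.reshape(-1, img.shape[-1])
pc = pixels - pixels.mean(0)
U, s, Vt = np.linalg.svd(pc, full_matrices=False)
reduced = (pc @ Vt[:3].T).reshape(64, 64, 3)  # keep 3 components

def spectral_distance(p, q):
    # Simple Euclidean distance between two pixel spectra; the thesis
    # also studies richer distances (e.g. a LIP/Asplund combination).
    return np.linalg.norm(p - q)

print(spectral_distance(img[0, 0], img[10, 10]))
```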
