Where to Invest? : Choosing the optimal stock market for investing in a cross-listed Nordic firm. Fagerlund, Elias; Mashrukh, Talukder (January 2012)
The purpose of this study is to investigate whether the location of buying stocks in a Nordic cross-listed company matters in terms of 1) earning abnormal returns, or 2) optimizing the amount spent by buying the specific stock where it is cheap. Markets are becoming increasingly integrated, and if we believe in the efficient market hypothesis, the prices of the same class of stock of an MNC, paying the same annual dividend, must be the same irrespective of the stock exchange on which it is listed. Though the efficient market hypothesis holds in theory, market imperfection is a reality. All the Nordic (Swedish, Finnish, Norwegian, Danish and Icelandic) firms listed on foreign stock exchanges in addition to their home market have been included in the sample; in fact, this sample represents 100% of the population. The daily prices of cross-listed stocks have been analyzed and conclusions drawn from the mean returns and mean prices along with Wilcoxon Signed-Rank test statistics. The data cover the last ten years, capturing the recent economic cycle, and the whole period has also been divided into three sub-periods for comparison with the whole period. This paper reports that even though returns on cross-listed stocks are statistically the same over all periods, prices of the stocks vary according to the location of listing. That is, investors can buy from the stock exchange where the specific stock is underpriced, thereby decreasing the amount invested in absolute terms and optimizing the amount spent, if not the return. The returns and prices have been analyzed using both the local currency of the MNC's country of origin and Special Drawing Rights (SDRs); no considerable differences in the returns or in the pattern of price movements have been observed between the two currencies.
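The price comparison described above can be sketched with a paired Wilcoxon signed-rank test on same-day closing prices of one cross-listed stock on two exchanges. The series below are simulated placeholders (with an assumed small listing premium), not the thesis's data; the same call applied to daily returns illustrates the return comparison.

```python
# Sketch: does the same cross-listed stock trade at different prices on two
# exchanges? Paired Wilcoxon signed-rank test on daily closing prices.
# All numbers here are simulated stand-ins, not real market data.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
n_days = 250  # roughly one trading year

# Hypothetical prices on the home and foreign exchange, both already
# converted to a common currency (e.g. via SDRs).
home = 100.0 + np.cumsum(rng.normal(0.0, 0.5, n_days))
foreign = home + rng.normal(0.5, 0.8, n_days)  # assumed small premium

stat, p_value = wilcoxon(home, foreign)  # paired, non-parametric
print(f"price test: W = {stat:.1f}, p = {p_value:.4f}")

# Daily log returns can be compared the same way.
ret_home = np.diff(np.log(home))
ret_foreign = np.diff(np.log(foreign))
_, p_returns = wilcoxon(ret_home, ret_foreign)
```

A small per-day price premium accumulates enough paired evidence for the signed-rank test to flag it, which mirrors the thesis's finding that prices, unlike returns, can differ by listing location.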
Numerical Methods for Wilcoxon Fractal Image Compression. Jau, Pei-Hung (28 June 2007)
In the thesis, the Wilcoxon approach to linear regression problems is combined with fractal image compression to form a novel Wilcoxon fractal image compression. When the original image is corrupted by noise, we argue that the fractal image compression scheme should be insensitive to the outliers present in the corrupted image. This leads to the new concept of robust fractal image compression. The proposed Wilcoxon fractal image compression is the first attempt toward the design of robust fractal image compression. Four different numerical methods, i.e., steepest descent, line minimization based on quadratic interpolation, line minimization based on cubic interpolation, and least absolute deviation, are proposed to solve the associated linear Wilcoxon regression problem. The simulation results show that, compared with traditional fractal image compression, Wilcoxon fractal image compression has very good robustness against outliers caused by salt-and-pepper noise; however, it does not show a comparable improvement in robustness against outliers caused by Gaussian noise.
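One of the four numerical methods named above, steepest descent, can be sketched for plain linear Wilcoxon regression: minimize the rank-based dispersion D(b) = Σ a(R(e_i)) e_i with Wilcoxon scores a(i) = √12 (i/(n+1) − 1/2). The data, learning rate, and iteration count below are illustrative assumptions, and the outliers stand in for salt-and-pepper corruption.

```python
# Minimal sketch of rank-based (Wilcoxon) linear regression fitted by
# steepest descent on the Wilcoxon dispersion function.
import numpy as np

def wilcoxon_scores(ranks, n):
    # Wilcoxon score function a(i) = sqrt(12) * (i/(n+1) - 1/2)
    return np.sqrt(12.0) * (ranks / (n + 1) - 0.5)

def fit_wilcoxon_regression(X, y, lr=0.02, n_iter=4000):
    """Steepest descent on D(b) = sum_i a(R(e_i)) * e_i.

    The dispersion is invariant to a constant shift, so X carries no
    intercept column; the intercept is recovered as the median residual.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        e = y - X @ beta
        ranks = np.argsort(np.argsort(e)) + 1   # ranks of the residuals
        a = wilcoxon_scores(ranks, n)
        beta += lr * (X.T @ a) / n              # descend: grad D(b) = -X^T a
    intercept = np.median(y - X @ beta)
    return intercept, beta

# Toy regression with gross outliers (akin to salt-and-pepper corruption)
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 200)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.1, 200)
y[:10] += 50.0                                  # 5% gross outliers
intercept, slope = fit_wilcoxon_regression(x[:, None], y)
```

Because only the ranks of the residuals enter the score function, the ten corrupted responses have bounded influence and the fitted line stays near the clean one.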
Contributions to Imputation Methods Based on Ranks and to Treatment Selection Methods in Personalized Medicine. Matsouaka, Roland Albert (January 2012)
The chapters of this thesis focus on two different issues that arise in clinical trials and propose novel methods to address them. The first issue arises in the analysis of data with non-ignorable missing observations. The second concerns the development of methods that give physicians better tools to understand and treat diseases efficiently by using each patient's characteristics and personal biomedical profile. Inherent to most clinical trials is the issue of missing data, especially data that are missing because patients drop out of the study without further measurements. Proper handling of missing data is crucial in all statistical analyses because disregarding missing observations can lead to biased results. In the first two chapters of this thesis, we deal with the "worst-rank score" missing data imputation technique in pretest-posttest clinical trials. Subjects are randomly assigned to two treatments, and the response is recorded at baseline prior to treatment (pretest response) and after a pre-specified follow-up period (posttest response). The treatment effect is then assessed on the change in response from baseline to the end of follow-up. Subjects with a missing response at the end of follow-up are assigned values that are worse than any observed response (worst-rank scores). Data analysis is then conducted using the Wilcoxon-Mann-Whitney test. In the first chapter, we derive explicit closed-form formulas for power and sample size calculations using both tied and untied worst-rank score imputation, where the worst-rank scores are either a fixed value (tied score) or depend on the time of withdrawal (untied score). We use simulations to demonstrate the validity of these formulas. In addition, we examine and compare four different simplification approaches for estimating sample sizes, which depend on whether data from the literature or from a pilot study are available.
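The tied worst-rank imputation described above can be sketched in a few lines: every dropout receives one fixed score strictly worse than any observed change score, and the Wilcoxon-Mann-Whitney test is then run on the completed data. The change scores, dropout counts, and the worst-rank constant below are illustrative assumptions, not the thesis's clinical data.

```python
# Sketch of tied worst-rank score imputation followed by a
# Wilcoxon-Mann-Whitney test; np.nan marks subjects who dropped out.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)

# Hypothetical change-from-baseline scores for two arms.
treat = rng.normal(1.0, 1.0, 50)
ctrl = rng.normal(0.0, 1.0, 50)
treat[:3] = np.nan   # 3 dropouts in the treatment arm
ctrl[:8] = np.nan    # 8 dropouts in the control arm

def impute_worst_rank(*groups):
    # Tied version: every dropout gets the same fixed score, strictly
    # lower (worse) than any observed change score.
    observed_min = min(np.nanmin(g) for g in groups)
    worst = observed_min - 1.0
    return [np.where(np.isnan(g), worst, g) for g in groups]

treat_imp, ctrl_imp = impute_worst_rank(treat, ctrl)
u_stat, p_value = mannwhitneyu(treat_imp, ctrl_imp, alternative="two-sided")
```

The untied variant would instead grade the imputed scores by withdrawal time; the rank test downstream is unchanged.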
In the second chapter, we introduce the weighted Wilcoxon-Mann-Whitney test on the untied worst-rank score (composite) outcome. First, we demonstrate that the weighted test reduces exactly to the ordinary Wilcoxon-Mann-Whitney test when the weights are equal. Then, we derive optimal weights that maximize the power of the corresponding weighted Wilcoxon-Mann-Whitney test. We show, using simulations, that the weighted test is more powerful than the ordinary test. Furthermore, we propose two different step-wise procedures to analyze data using the weighted test and assess their performance through simulation studies. Finally, we illustrate the new approach using data from a recent randomized clinical trial of normobaric oxygen therapy in patients with acute ischemic stroke. The third and last chapter of this thesis concerns the development of robust methods for identifying treatment groups in personalized medicine. Physicians often have to use a trial-and-error approach to find the most effective medication for their patients. Personalized medicine methods aim at tailoring strategies for disease prevention, detection or treatment to each individual subject's personal characteristics and medical profile. This would result in (1) better diagnosis and earlier interventions, (2) maximum therapeutic benefit and reduced adverse events, (3) more effective therapy, and (4) more efficient drug development. Novel methods have been proposed to identify subgroups of patients who would benefit from a given treatment. In the last chapter of this thesis, we develop a robust method for treatment assignment for future patients based on the expected total outcome. In addition, we provide a method to assess the incremental value of new covariates in improving treatment assignment. We evaluate the accuracy of our methods through simulation studies and illustrate them with two examples using data from two HIV/AIDS clinical trials.
An Analysis of Fourier Transform Infrared Spectroscopy Data to Predict Herpes Simplex Virus 1 Infection. Champion, Patrick D (20 November 2008)
The purpose of this analysis is to evaluate the usefulness of Fourier Transform Infrared (FTIR) spectroscopy in detecting Herpes Simplex Virus 1 (hsv1) infection at an early stage. The raw absorption values were standardized to eliminate inter-sampling error. The Z score of the Wilcoxon-Mann-Whitney (WMW) statistic was calculated to select significant spectral regions. Partial least squares modeling was performed because of multicollinearity. The Kolmogorov-Smirnov statistic showed that models for healthy tissues from different time groups were not drawn from the same distribution. The additional 24-hour dataset was evaluated as follows: variables were selected by the WMW Z score, and a Difference of Composites statistic, DC, was created as a disease indicator and evaluated using the area under the ROC curve, specificities, and confidence intervals obtained with a bootstrap algorithm. The specificity of DC was high; however, the confidence intervals were large. Future studies with larger sample sizes are required to test this statistic's usefulness.
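The bootstrap evaluation of a disease-indicator score can be sketched as follows: compute the AUC in its Mann-Whitney form, then resample cases and controls to get a percentile confidence interval. The scores below are simulated stand-ins for the DC statistic, and the resample count is an arbitrary choice.

```python
# Sketch: AUC of a disease-indicator score with a percentile bootstrap
# confidence interval. Scores are simulated, not the study's FTIR data.
import numpy as np

def auc_mann_whitney(pos, neg):
    # AUC = P(score_pos > score_neg) + 0.5 * P(tie)  (Mann-Whitney form)
    pos, neg = np.asarray(pos), np.asarray(neg)
    diffs = pos[:, None] - neg[None, :]
    return (diffs > 0).mean() + 0.5 * (diffs == 0).mean()

rng = np.random.default_rng(3)
infected = rng.normal(1.2, 1.0, 40)   # hypothetical indicator scores
healthy = rng.normal(0.0, 1.0, 40)

auc = auc_mann_whitney(infected, healthy)

boot = []
for _ in range(2000):
    b_pos = rng.choice(infected, infected.size, replace=True)
    b_neg = rng.choice(healthy, healthy.size, replace=True)
    boot.append(auc_mann_whitney(b_pos, b_neg))
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
```

With only 40 subjects per group the interval is wide, which matches the abstract's caveat that the confidence intervals were large despite high specificity.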
The Impact of Midbrain Cauterize Size on Auditory and Visual Responses' Distribution. Zhang, Yan (20 April 2009)
This thesis presents several statistical analyses from a cooperative project with Dr. Pallas and Yuting Mao of the Biology Department of Georgia State University. The research assesses the impact of the size of cauterized lesions in animals' midbrains on auditory and visual responses in the brain. Besides commonly used statistical methods such as MANOVA and frequency tests, a combination of the Permutation Test, the Kolmogorov-Smirnov Test and the Wilcoxon Rank Sum Test is applied to our non-parametric data. Simulation results show that the Permutation Test we used has very good power and fits the needs of this study. The results statistically confirm part of the biologists' hypothesis and support a more complete understanding of the experiments and of their potential relevance to helping patients with Acquired Brain Injury.
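The permutation test in the battery above can be sketched generically: shuffle the group labels many times and recompute the statistic to build a null distribution. The choice of a mean-difference statistic and the simulated response values are our illustrative assumptions, not the study's recordings.

```python
# Minimal sketch of a two-sample permutation test: the null distribution
# is built by repeatedly shuffling group labels. Data are placeholders.
import numpy as np

def permutation_test(x, y, n_perm=5000, seed=0):
    """Two-sided permutation test on the difference in means."""
    rng = np.random.default_rng(seed)
    observed = np.mean(x) - np.mean(y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # relabel at random
        diff = np.mean(pooled[:len(x)]) - np.mean(pooled[len(x):])
        if abs(diff) >= abs(observed):
            count += 1
    return (count + 1) / (n_perm + 1)             # add-one correction

rng = np.random.default_rng(4)
small_lesion = rng.normal(0.0, 1.0, 30)  # hypothetical response measure
large_lesion = rng.normal(1.0, 1.0, 30)
p_value = permutation_test(small_lesion, large_lesion)
```

Because the test conditions only on the observed values, it needs no distributional assumption, which is why it suits the non-parametric data described above.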
Fúze obchodních korporací a jejich vliv na finanční situaci / Mergers of Business Corporations and Their Influence on Financial Situation. Kubáňová, Andrea (January 2016)
This thesis is concerned with domestic mergers of business corporations realized between 2009 and 2012 and their influence on the financial situation of the chosen subjects. The first part describes the basic theoretical aspects of mergers, including their classification, the motives leading to their realization, and their phases. The second chapter covers the commercial-law and accounting treatment of mergers. In the practical part, the main focus is on the legal framework in force from 31. 12. 2011 to 1. 1. 2012. The following part of the thesis illustrates the accounting treatment of a merger with an example, showing its immediate impact on the preparation of the opening balance sheet. The fourth part is concerned with financial analysis, covering its sources, users and relevant methods. All analyses and results are presented and discussed in the final part of the thesis.
Research on Robust Fuzzy Neural Networks. Wu, Hsu-Kun (19 November 2010)
In many practical applications, it is well known that collected data inevitably contain one or more anomalous outliers, that is, observations that are well separated from the majority or bulk of the data, or that in some fashion deviate from its general pattern. Outliers may be due to misplaced decimal points, recording errors, transmission errors, or equipment failure, and they can lead to erroneous parameter estimates that affect the correctness and accuracy of model inference. To address these problems, three robust fuzzy neural networks (FNNs) are proposed in this dissertation, providing alternative learning machines for general nonlinear learning problems. Our emphasis is placed particularly on the robustness of these learning machines against outliers. Though only FNNs are considered in this study, the extension of our approach to other neural networks, such as artificial neural networks and radial basis function networks, is straightforward.
In the first part of the dissertation, M-estimators, where M stands for maximum likelihood, frequently used in robust regression for linear parametric regression problems will be generalized to nonparametric Maximum Likelihood Fuzzy Neural Networks (MFNNs) for nonlinear regression problems. Simple weight updating rules based on gradient descent and iteratively reweighted least squares (IRLS) will be derived.
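The linear building block that the dissertation generalizes, M-estimation fitted by iteratively reweighted least squares, can be sketched as follows. The Huber weight function, the tuning constant 1.345, and the toy data are our illustrative assumptions; the dissertation's networks replace the linear predictor with a fuzzy neural network.

```python
# Sketch of IRLS for a linear M-estimator with Huber weights: residuals
# beyond the tuning constant are downweighted, limiting outlier influence.
import numpy as np

def huber_weights(r, c=1.345):
    # Weight 1 inside [-c, c], then c/|r| outside (bounded influence)
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def irls_m_estimate(X, y, n_iter=50):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start
    for _ in range(n_iter):
        r = y - X @ beta
        scale = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale
        w = huber_weights(r / max(scale, 1e-12))
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)       # weighted LS step
    return beta

rng = np.random.default_rng(5)
X = np.column_stack([np.ones(100), rng.uniform(-1.0, 1.0, 100)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(0.0, 0.2, 100)
y[:5] += 20.0                                            # gross outliers
beta_robust = irls_m_estimate(X, y)
```

Each IRLS pass is just a weighted least-squares solve, which is why the same reweighting idea transfers to gradient-trained networks as a per-sample loss weight.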
In the second part of the dissertation, least trimmed squares estimators, abbreviated as LTS-estimators, frequently used in robust (or resistant) regression for linear parametric regression problems will be generalized to nonparametric least trimmed squares fuzzy neural networks, abbreviated as LTS-FNNs, for nonlinear regression problems. Again, simple weight updating rules based on gradient descent and iteratively reweighted least squares (IRLS) algorithms will be provided.
In the last part of the dissertation, by combining the easy interpretability of the parametric models and the flexibility of the nonparametric models, semiparametric fuzzy neural networks (semiparametric FNNs) and semiparametric Wilcoxon fuzzy neural networks (semiparametric WFNNs) will be proposed. The corresponding learning rules are based on the backfitting procedure which is frequently used in semiparametric regression.
Familiaridad en melodías, un estudio introductorio a la memoria musical basándose en la medición del potencial evocado N400 / Familiarity in Melodies: An Introductory Study of Musical Memory Based on Measuring the N400 Evoked Potential. Quintero-Rincón, Antonio (January 2015)
Music cognition and perception is the scientific study of the mental and neural operations underlying listening to, creating, moving to, and composing music; its interdisciplinarity draws on methods from cognitive and sensory psychology, neuroscience, musicology, computer science, music theory, mathematics, signal processing, genetics, biology, and the sociocultural aspects of music. Music processing is a complex mental activity with a strong cognitive component involving several brain areas, and music is increasingly used as a way to investigate the functions of the mind and brain, touching on topics of cognitive psychology such as memory, attention, perceptual organization, categorization, and emotion. A powerful tool for this purpose is the event-related potential (ERP): a direct measurement of cortical activity generated in response to external stimuli, generally used to measure sensory, affective, and cognitive events. The ERP component most closely related to language, and the best studied, is the N400, a negative-going wave localized over the central and parietal regions, with a slightly larger amplitude over the right hemisphere than the left. The N400 is normally seen as a response to violations of semantic expectations; its first use dates to 1980, by M. Kutas and S. A. Hillyard, who created a paradigm consisting of reading sentences out of context. Their results showed that the more physically anomalous words elicited a positive deflection in the ERPs, whereas semantically incongruous words elicited a late negative N400 wave.
Later studies show that music and language can be compared even though they are represented in similar yet different ways; this has led to the conclusion that the two domains can convey physiological indices, information, and concepts, increasingly strengthening the link between them. The present thesis forms part of the first stage of a larger study that aims to determine whether musical memory is preserved in people with mild cognitive impairment (MCI) and Alzheimer's disease. The study poses the following hypothesis: knowing that the N400 evoked potential is a marker of cognitive impairment, one would expect the musical N400 to be unaffected in MCI and Alzheimer's disease; musical memory would therefore be preserved and measurable via the N400. This work is based on the research of Robbin Miranda, supervised by Michael T. Ullman at Georgetown University, on a double dissociation between rules and memory in music, an event-related potential study published in the journal NeuroImage. The results of that study showed a double dissociation, independent of musical training, between musical rules and memory. The authors further suggest that rule/memory dissociations extend neurocognitively from language to music, further strengthening the similarities between the two domains. That study developed a set of stimuli from melodies well known in the United States, under three conditions: the original melody (Control), the melody with an in-key violation (In-key), and the melody with an out-of-key violation (Out-of-key), in order to study the double dissociation and rule dependence in both music and language.
With all this in mind, the present thesis posed the following hypothesis: given that there are well-known melodies, one expects both an in-key and an out-of-key violation in them to be perceived by the listener; if the melody is unknown, the out-of-key violation is still expected to be perceived, but not the in-key violation. To that end, two objectives were set: a) adapting the stimuli to Argentine musical culture, building new stimuli where necessary; and b) determining the familiarity of the melodies in normal subjects. The thesis includes an extensive state of the art and is divided into the following chapters: two introductory chapters, necessary for an interdisciplinary topic such as this one, one on the brain, covering the brain's electrical activity, the neocortex, the electroencephalogram, the N400, and brain oscillations; and another on cognitive processing, covering mental state, common-sense psychology, consciousness, auditory perception, attention, emotion, and memory. A further chapter explores right-hemisphere specialization from the viewpoint of the psychology of art, a much-debated topic that remains open to this day. The Materials and Methods chapter then addresses the stimuli and consonance and dissonance, key sensory-cognitive attributes in music perception, together with the base study used in this work. Finally, an experiment is presented in the Results and Discussion, where the familiarity assessment made it possible to determine the melodies to use for the study and to validate the different conditions for future work.
/ This thesis is a multidisciplinary project spanning cognitive and sensory psychology, neuroscience, music, computer science, music theory, biology, statistics, and electroencephalographic signal processing. It had two supervisors: Dr. Marcelo Risk and Ing. José A. Rapallini. In addition to the text, a sample of the audio stimuli used is included.
Infrared Spectroscopy in Combination with Advanced Statistical Methods for Distinguishing Viral Infected Biological Cells. Tang, Tian (17 November 2008)
Fourier Transform Infrared (FTIR) microscopy is a sensitive method for detecting differences in the morphology of biological cells. In this study, FTIR spectra were obtained for uninfected cells and for cells infected with two different viruses. The spectra obtained are difficult to discriminate visually. Here we apply advanced statistical methods to the analysis of the spectra, to test whether such spectra are useful for diagnosing viral infections in cells. Logistic Regression (LR) and Partial Least Squares Regression (PLSR) were used to build models that diagnose whether spectral differences are related to the infection state of the cells. A three-fold, balanced cross-validation method was applied to estimate the shrinkage of the area under the receiver operating characteristic curve (AUC) and of the specificities at sensitivities of 95%, 90% and 80%. AUC, sensitivity and specificity were used to gauge the goodness of the discrimination methods. Our statistical results show that the spectra associated with different cellular states are discriminated very effectively. We also find that the overall performance of PLSR is better than that of LR, especially on new validation data. Our analysis supports the idea that FTIR microscopy is a useful tool for the detection of viral infections in biological cells.
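PLSR on spectra can be sketched with a hand-rolled single-response NIPALS fit scored by AUC; sklearn's PLSRegression would be the off-the-shelf alternative. The simulated "spectra" below (one absorption band plus noise), the component count, and the in-sample evaluation are all illustrative assumptions, not the study's data or its cross-validated protocol.

```python
# Hand-rolled sketch of single-response PLS regression (NIPALS) used to
# score infected vs. uninfected spectra. Spectra are simulated surrogates.
import numpy as np

def pls1_fit(X, y, n_comp=3):
    """Return regression coefficients for centered X and y (PLS1/NIPALS)."""
    X = X - X.mean(0)
    y = y - y.mean()
    W, P, q = [], [], []
    Xk, yk = X.copy(), y.copy()
    for _ in range(n_comp):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)          # weight vector
        t = Xk @ w                      # score vector
        tt = t @ t
        p = Xk.T @ t / tt               # X loading
        qk = yk @ t / tt                # y loading
        Xk = Xk - np.outer(t, p)        # deflate
        yk = yk - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)   # B = W (P^T W)^-1 q

rng = np.random.default_rng(6)
n, n_wavenumbers = 60, 200
labels = np.repeat([0, 1], n // 2)            # 0 = uninfected, 1 = infected
band = np.zeros(n_wavenumbers)
band[50:60] = 1.0                             # assumed absorption band
X = labels[:, None] * band + rng.normal(0.0, 0.5, (n, n_wavenumbers))

beta = pls1_fit(X, labels.astype(float))
scores = (X - X.mean(0)) @ beta

# Mann-Whitney form of the (in-sample) AUC for the fitted scores
pos, neg = scores[labels == 1], scores[labels == 0]
auc = (pos[:, None] > neg[None, :]).mean()
```

In practice the AUC must be estimated by cross-validation, as in the study's three-fold scheme, since in-sample AUC on 200 collinear wavenumbers is optimistically biased.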
Advanced Statistical Methodologies in Determining the Observation Time to Discriminate Viruses Using FTIR. Luo, Shan (13 July 2009)
Fourier transform infrared (FTIR) spectroscopy, which detects specific cellular molecular structures through their interaction with electromagnetic radiation, can be used to discriminate different types of cells. The objective is to find the minimum time (choosing among 2, 4 and 6 hours) for recording FTIR readings such that different viruses can be discriminated. A new method is adopted for the datasets. Briefly, inner differences are created as the control group, and the Wilcoxon Signed Rank Test is used as a first variable-selection procedure to prepare for the discrimination stage. In the second stage, we propose either the partial least squares (PLS) method or simply taking significant differences as the discriminator. Finally, k-fold cross-validation is used to estimate the shrinkage of the goodness measures, such as sensitivity, specificity and the area under the ROC curve (AUC). Six hours is clearly sufficient for discriminating mock from Hsv1 and Coxsackie viruses; Adeno virus is an exception.
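The two-stage pipeline above (rank-test variable selection, then a simple discriminator, validated by k-fold cross-validation) can be sketched as follows. Note that the selection step is repeated inside every training fold, so the cross-validated accuracy honestly reflects shrinkage. The simulated spectra, the rank-sum filter used in place of the thesis's signed-rank-on-inner-differences construction, and the mean-difference discriminator are all our illustrative assumptions.

```python
# Sketch: k-fold cross-validation with per-fold rank-based variable
# selection and a mean-difference discriminator. Data are simulated.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(7)
n, p = 60, 100
y = np.repeat([0, 1], n // 2)
X = rng.normal(0.0, 1.0, (n, p))
X[y == 1, :5] += 1.5                  # 5 informative wavenumber regions

def cv_accuracy(X, y, k=3, n_select=5, seed=8):
    rng_cv = np.random.default_rng(seed)
    n_samples, p_vars = X.shape
    folds = np.array_split(rng_cv.permutation(n_samples), k)
    correct = 0
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        Xtr, ytr = X[train], y[train]
        # Stage 1: rank-test selection on the training fold ONLY
        pvals = np.array([ranksums(Xtr[ytr == 1, j], Xtr[ytr == 0, j]).pvalue
                          for j in range(p_vars)])
        sel = np.argsort(pvals)[:n_select]
        # Stage 2: mean-difference discriminator on selected variables
        direction = (Xtr[ytr == 1][:, sel].mean(0)
                     - Xtr[ytr == 0][:, sel].mean(0))
        thresh = (Xtr[:, sel] @ direction).mean()
        pred = (X[test][:, sel] @ direction > thresh).astype(int)
        correct += int((pred == y[test]).sum())
    return correct / n_samples

acc = cv_accuracy(X, y)
```

Selecting variables on the full dataset before splitting would leak information into the test folds and overstate sensitivity and specificity; redoing it per fold is what makes the shrinkage estimate valid.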