281

NONINVASIVE MULTIMODAL DIFFUSE OPTICAL IMAGING OF VULNERABLE TISSUE HEMODYNAMICS

Zhao, Mingjun 01 January 2019 (has links)
Measurement of tissue hemodynamics provides vital information for the assessment of tissue viability. This thesis reports three noninvasive near-infrared diffuse optical systems for spectroscopic measurements and tomographic imaging of tissue hemodynamics in vulnerable tissues, with the goal of disease diagnosis and treatment monitoring. A hybrid near-infrared spectroscopy/diffuse correlation spectroscopy (NIRS/DCS) instrument with a contact fiber-optic probe was developed and used for simultaneous, continuous monitoring of blood flow (BF), blood oxygenation, and oxidative metabolism in exercising gastrocnemius muscle. Results measured by the hybrid NIRS/DCS instrument in 37 subjects (mean age: 67 ± 6) indicated that vitamin D supplementation plus aerobic training improved muscle metabolic function in the older population. To reduce the interference and potential infection risk that contact measurements pose to vulnerable tissues, a noncontact diffuse correlation spectroscopy/tomography (ncDCS/ncDCT) system was then developed. The ncDCS/ncDCT system employed optical lenses to project a limited number of sources and detectors onto the tissue surface. A motor-driven noncontact probe scanned over a region of interest to collect boundary data for three-dimensional (3D) tomographic imaging of blood flow distributions. The ncDCS was tested for BF measurements in mastectomy skin flaps. Nineteen (19) patients who underwent mastectomy and implant-based breast reconstruction were measured before and immediately after mastectomy. The BF index after mastectomy in each patient was normalized to its baseline value before surgery to obtain relative BF (rBF). Since rBF values in the patients with necrosis (n = 4) were significantly lower than in those without necrosis (n = 15), rBF levels can be used to predict mastectomy skin flap necrosis. The ncDCT was tested for 3D imaging of BF distributions in the chronic wounds of 5 patients.
Spatial variations in BF contrasts over the wounded tissues were observed, indicating the capability of ncDCT to detect tissue hemodynamic heterogeneities. To improve temporal/spatial resolution and avoid the motion artifacts caused by ncDCT's long mechanical scan, an electron-multiplying charge-coupled device based noncontact speckle contrast diffuse correlation tomography (scDCT) system was developed. The scDCT was validated by imaging both high and low BF contrasts in tissue-like phantoms and human forearms. In a wound imaging study using scDCT, significantly lower BF values were observed in the burned areas/volumes compared to the surrounding normal tissues in two patients with burns. One limitation of this study was the potential influence of other unknown tissue optical properties, such as the tissue absorption coefficient (µa), on BF measurements. A new algorithm was then developed to extract both µa and BF using light intensities and speckle contrasts measured by scDCT at multiple source-detector distances. The new algorithm was validated using tissue-like liquid phantoms with varied values of µa and BF index. In-vivo validation and application of the innovative scDCT technique with the new algorithm is the subject of future work.
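The relative blood flow (rBF) normalization described for the mastectomy study, where each patient's post-surgery BF index is divided by that patient's pre-surgery baseline, can be sketched as follows; all BF index values here are hypothetical, and the 0.5 flag threshold is illustrative, not a cutoff stated in the abstract:

```python
import numpy as np

def relative_blood_flow(bf_post, bf_baseline):
    """Normalize post-surgery BF indices to each patient's pre-surgery
    baseline; rBF < 1 means flow dropped below its baseline value."""
    return np.asarray(bf_post, dtype=float) / np.asarray(bf_baseline, dtype=float)

# Hypothetical BF indices (arbitrary units) for three patients
baseline = [1.8, 2.1, 1.5]
post_op = [0.9, 1.9, 0.6]

rbf = relative_blood_flow(post_op, baseline)
low_flow = rbf < 0.5   # flag flaps whose flow fell below half of baseline
```

Because rBF was significantly lower in the flaps that later developed necrosis, a threshold on rBF could serve as the predictor, though the abstract does not report a specific cutoff.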
282

Parciální a podmíněné korelační koeficienty / Partial correlation coefficients and their extension

Říha, Samuel January 2015 (has links)
No description available.
283

Estudo da influência de eventos sobre a estrutura do mercado brasileiro de ações a partir de redes ponderadas por correlações de Pearson, Spearman e Kendall / Weighted networks from Pearson, Spearman and Kendall correlations to characterize the influence of events on the Brazilian stock market structure

Letícia Aparecida Origuela 06 August 2018 (has links)
Neste trabalho foi analisada a influência de um evento sobre o mercado de ações brasileiro a partir das redes, e suas árvores geradoras mínimas, obtidas de medidas de dependência baseadas nas correlações de Pearson, de Spearman e de Kendall. O evento considerado foi a notícia da noite de 17 de maio de 2017 em que o dono da empresa brasileira JBS, Joesley Batista, gravou o então Presidente da República Michel Temer autorizando a compra do silêncio de um Deputado Federal. O dia seguinte a notícia, 18 de maio de 2017, foi definido como o dia do evento. Foram coletados dados de alta frequência de 58 ações do Ibovespa no período de 11 a 25 de maio de 2017. As alterações nas redes das ações do mercado foram analisadas comparando-se o período anterior e posterior ao evento em duas escalas de tempo: (1) Redes diárias: cinco pregões antes do evento, o dia do evento e, cinco pregões depois do evento, com cotações a cada 15 minutos; (2) Agrupadas em antes e depois: agrupando os dados dos 5 dias antes e dos 5 dias depois do evento. O estudo das redes diárias indicou mudança de tendência nas suas propriedades no decorrer do período que contém o evento, com cotações a cada 15 minutos. Isto sugeriu que análise do efeito médio contido nos dados agrupados antes de depois do evento poderiam tornar mais evidente as mudanças na estrutura de rede das ações. As redes antes e depois do evento apresentaram mudanças significativas nas suas métricas que ficaram mais evidenciadas nas árvores geradoras mínimas. As redes geradas pelas correlações de Kendall e Spearman apresentaram um número maior de agrupamentos antes e depois do evento e, após o evento, as árvores geradoras mínimas apresentaram uma redução do número de agrupamentos de ações para todos os tipos de correlação. As distribuições de grau ponderado após o evento indicam uma probabilidade maior de vértices com graus distante da média. 
As métricas das árvores geradoras mínimas por correlação de Spearman sofreram a maior variação, seguidas pelas de Kendall e Pearson, e também, indicaram que as redes após o evento ficaram mais robustas, ou seja, mais rígidas. A maior robustez das redes após o evento indica maior conectividade do mercado, tornando-o, como um todo, mais suscetível ao impacto de novos acontecimentos. / In this work, the influence of an event on the Brazilian stock market was analyzed through networks, and their minimum spanning trees, obtained from dependence measures based on the Pearson, Spearman and Kendall correlations. The event considered was the news on the evening of May 17, 2017, in which the owner of the Brazilian company JBS, Joesley Batista, recorded the Brazilian President Michel Temer authorizing the purchase of the silence of a congress member. The day after the news, May 18, 2017, was defined as the event day. High-frequency data on 58 Ibovespa shares were collected from May 11 to 25, 2017. Changes in the stock networks were analyzed by comparing the periods before and after the event on two time scales: (1) daily networks: five trading sessions before the event, the day of the event, and five trading sessions after the event, with prices every 15 minutes; (2) grouped before and after: grouping the data from the 5 days before and the 5 days after the event. The study of the daily networks indicated a change of trend in their properties during the period containing the event. This suggested that analyzing the mean effect in the data grouped before and after the event could highlight the changes in the network structure. The networks before and after the event showed significant changes in their metrics, which became more evident in the minimum spanning trees.
After the event, the minimum spanning trees for the grouped data had a smaller number of clusters for all kinds of correlations. The networks generated by the Kendall and Spearman correlations presented a larger number of clusters both before and after the event. The weighted degree distributions after the event suggest a power-law decay tail for all the correlations considered and indicate a higher probability of vertices with weighted degrees far from the mean weighted degree. The minimum spanning tree metrics generated by the Spearman correlation showed the greatest variation, followed by those of Kendall and Pearson, and their values indicate that after the event the networks became more robust, that is, more rigid. The increased robustness of the networks after the event indicates greater market connectivity, making the market, as a whole, more susceptible to the impact of new events.
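The pipeline this abstract describes (rank correlations between stock returns, converted to distances, then reduced to a minimum spanning tree) can be sketched as below. The returns are simulated, not Ibovespa quotes, and the four "stocks" are purely illustrative:

```python
import numpy as np
from scipy.stats import spearmanr
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)

# Simulated 15-minute returns for 4 stocks; stock 1 is driven partly
# by stock 0 so that exactly one pair is strongly dependent.
returns = rng.normal(size=(200, 4))
returns[:, 1] += 0.8 * returns[:, 0]

rho, _ = spearmanr(returns)            # 4x4 rank-correlation matrix
dist = np.sqrt(2.0 * (1.0 - rho))      # Mantegna's correlation distance
np.fill_diagonal(dist, 0.0)

# The MST keeps the n - 1 shortest links of the complete distance graph,
# so for 4 stocks it retains 3 edges.
mst = minimum_spanning_tree(dist).toarray()
n_edges = np.count_nonzero(mst)
```

Mantegna's metric maps strong positive correlation to short distance, so the tree keeps only the tightest links; comparing trees built on data grouped before and after an event exposes the changes in cluster counts the study reports. Substituting `spearmanr` with `kendalltau` or plain Pearson correlation changes only the `rho` step.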
284

Canonical Correlation and Clustering for High Dimensional Data

Ouyang, Qing January 2019 (has links)
Multi-view datasets arise naturally in statistical genetics when the genetic and trait profiles of an individual are portrayed by two feature vectors. A motivating problem concerning the Skin Intrinsic Fluorescence (SIF) study on the Diabetes Control and Complications Trial (DCCT) subjects is presented. A widely applied quantitative method to explore the correlation structure between the two domains of a multi-view dataset is Canonical Correlation Analysis (CCA), which seeks the canonical loading vectors such that the transformed canonical covariates are maximally correlated. In the high-dimensional case, regularization of the dataset is required before CCA can be applied. Furthermore, the nature of genetic research suggests that sparse output is more desirable. In this thesis, two regularized CCA (rCCA) methods and a sparse CCA (sCCA) method are presented. When correlation sub-structure exists, a stand-alone CCA model will not perform well. To tackle this limitation, a mixture of local CCA models can be employed. In this thesis, I review a correlation clustering algorithm proposed by Fern, Brodley and Friedl (2005), which seeks to group subjects into clusters such that features are identically correlated within each cluster. An evaluation study is performed to assess the effectiveness of the CCA and correlation clustering algorithms using artificial multi-view datasets. Both sCCA and sCCA-based correlation clustering exhibited superior performance compared to rCCA and rCCA-based correlation clustering. The sCCA and sCCA-based clustering were applied to the multi-view dataset consisting of PrediXcan-imputed gene expression and SIF measurements of DCCT subjects. The stand-alone sparse CCA method identified 193 of 11,538 genes as being correlated with SIF#7. Further investigation of these 193 genes with simple linear regression and t-tests revealed that only two genes, ENSG00000100281.9 and ENSG00000112787.8, were significant in association with SIF#7.
No plausible clustering scheme was detected by the sCCA-based correlation clustering method. / Thesis / Master of Science (MSc)
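Classical CCA, the starting point for the regularized and sparse variants in this thesis, can be computed from the SVD of the whitened cross-covariance. The sketch below uses simulated data with one shared latent signal; it is a low-dimensional illustration of the base method, not the thesis's rCCA/sCCA implementation:

```python
import numpy as np

def cca_correlations(X, Y):
    """Classical CCA: canonical correlations are the singular values of
    Sxx^{-1/2} Sxy Syy^{-1/2}, returned in decreasing order. Assumes the
    sample covariances are invertible (low-dimensional case)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1)
    Syy = Y.T @ Y / (n - 1)
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):
        # Symmetric inverse square root via eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(K, compute_uv=False)

# Two simulated 3-feature views sharing one latent signal z
rng = np.random.default_rng(1)
z = rng.normal(size=(500, 1))
X = np.hstack([z + 0.5 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])
Y = np.hstack([z + 0.5 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])

corrs = cca_correlations(X, Y)   # leading canonical correlation is large
```

In the high-dimensional genetic setting the covariance inverses above do not exist, which is exactly why the rCCA methods add ridge-type regularization and the sCCA method adds sparsity penalties.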
285

Risk aggregation and capital allocation using copulas / Martinette Venter

Venter, Martinette January 2014 (has links)
Banking is a risk and return business; in order to obtain the desired returns, banks are required to take on risks. Following the demise of Lehman Brothers in September 2008, the Basel III Accord proposed considerable increases in capital charges for banks. Whilst this ensures greater economic stability, banks now face an increasing risk of becoming capital inefficient. Furthermore, capital analysts are required to estimate capital requirements not only for individual business lines, but also for the organization as a whole. Copulas are a popular technique for modelling joint multi-dimensional problems, as they provide a mechanism for modelling relationships among multivariate distributions. Firstly, a review of the Basel Capital Accord will be provided. Secondly, well-known risk measures as proposed under the Basel Accord will be investigated. The penultimate chapter is dedicated to the theory of copulas as well as other measures of dependence. The final chapter presents a practical illustration of how business line losses can be simulated using the Gaussian, Cauchy, Student t and Clayton copulas in order to determine capital requirements under 95% VaR, 99% VaR, 95% ETL, 99% ETL and StressVaR. The resultant capital estimates will always be a function of the choice of copula, the choice of risk measure and the correlation inputs into the copula calibration algorithm. The choice of copula, the choice of risk measure and the conservativeness of the correlation inputs will be determined by the organization’s risk appetite. / Sc (Applied Mathematics), North-West University, Potchefstroom Campus, 2014
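A minimal sketch of the final chapter's idea: simulate correlated business-line losses through a copula, sum them, and read capital estimates off the risk measures. The sketch uses a Gaussian copula with hypothetical lognormal marginals and a hypothetical correlation of 0.5; the thesis's actual marginals and calibration are not reproduced here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_sim = 100_000

# Gaussian copula: correlate standard normals, then map to uniforms.
corr = np.array([[1.0, 0.5],
                 [0.5, 1.0]])
L = np.linalg.cholesky(corr)
z = rng.standard_normal((n_sim, 2)) @ L.T
u = stats.norm.cdf(z)                      # copula sample in (0, 1)^2

# Hypothetical lognormal loss marginals for two business lines
loss1 = stats.lognorm.ppf(u[:, 0], s=0.8, scale=100.0)
loss2 = stats.lognorm.ppf(u[:, 1], s=0.6, scale=80.0)
total = loss1 + loss2

var99 = np.quantile(total, 0.99)           # 99% Value-at-Risk
etl99 = total[total >= var99].mean()       # 99% Expected Tail Loss
```

Swapping the Gaussian copula for a Student t or Clayton copula changes only how `u` is constructed, which is precisely why the resulting VaR/ETL figures are a function of the copula choice, as the abstract notes.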
286

Assessing the suitability of regulatory asset correlations applied to South African loan losses / Hestia Jacomina Stoffberg

Stoffberg, Hestia Jacomina January 2015 (has links)
The Basel Committee on Banking Supervision (BCBS) designed the Internal Ratings Based (IRB) approach, which is based on a single risk factor model. This IRB approach was designed to determine banks’ regulatory capital for credit risk. The asymptotic single risk factor (ASRF) model it uses relies on prescribed asset correlations, which banks must apply in their credit risk regulatory capital in order to abide by the BCBS’s rules. Banks need to abide by these rules to reach an international standard of banking that promotes the health of the specific bank. To evaluate whether these correlations are as conservative as the BCBS intended, i.e. neither too onerous nor too lenient, empirical asset correlations embedded in gross loss data, spanning different economic milieus, were backed out of the regulatory credit risk model. A technique to extract these asset correlations from a Vasicek distribution of empirical loan losses was proposed and tested in international markets. This technique was used to extract the empirical asset correlations and then compare them with the prescribed correlations for a developed (US) and a developing (South Africa) economy over the total time period as well as over a rolling time period. For the first analysis, the BCBS’s asset correlation was conservative compared to South Africa and the US for all loan types. Comparing the empirical asset correlation over a seven-year rolling time period for South Africa with that of the BCBS, the specified asset correlation was found to be as conservative as the BCBS intended. Comparing the US empirical asset correlation for the same rolling period to that of the BCBS, it was found that for all loans the BCBS was conservative up until 2012. In 2012 the empirical asset correlation surpassed that of the BCBS, and thus the BCBS was not as conservative as originally intended. / MCom (Risk Management), North-West University, Potchefstroom Campus, 2015
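The extraction step this abstract describes, backing an implied asset correlation out of a Vasicek distribution of loan losses, can be sketched by moment matching: for a Vasicek loss rate with default probability p and asset correlation rho, Var(L) = Phi2(q, q; rho) - p^2 with q = Phi^-1(p). This is one standard approach, not necessarily the thesis's exact technique; the loss series below is simulated with a known correlation so recovery can be checked:

```python
import numpy as np
from scipy import stats, optimize

def implied_asset_correlation(loss_rates):
    """Back out the asset correlation implied by a series of portfolio
    loss rates, assuming a Vasicek distribution, by matching the sample
    variance to Phi2(q, q; rho) - p^2."""
    loss_rates = np.asarray(loss_rates, dtype=float)
    p = loss_rates.mean()
    v = loss_rates.var(ddof=1)
    q = stats.norm.ppf(p)

    def excess_var(rho):
        # Bivariate normal CDF at (q, q) with correlation rho
        phi2 = stats.multivariate_normal.cdf(
            [q, q], mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
        return phi2 - p * p - v

    # excess_var is negative near rho=0 and positive near rho=1
    return optimize.brentq(excess_var, 1e-6, 0.999)

# Simulate Vasicek loss rates with known parameters, then recover rho
rng = np.random.default_rng(7)
true_rho, pd_ = 0.15, 0.02
q = stats.norm.ppf(pd_)
z = rng.standard_normal(2000)
losses = stats.norm.cdf((q - np.sqrt(true_rho) * z) / np.sqrt(1 - true_rho))

rho_hat = implied_asset_correlation(losses)   # close to true_rho
```

Applying the same function to rolling seven-year windows of empirical gross loss data would reproduce the kind of rolling comparison against the prescribed BCBS correlations that the study performs.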
289

Productivity measurement and its relationship to quality in a South African Minting Company

Mtotywa, Matolwandile Mzuvukile January 2007 (has links)
The aim of this study was to investigate productivity measurement at the South African Minting Company and to evaluate the relationship between productivity and quality. Special emphasis was given to profit-linked total factor models as the measurement tool, motivated by their ability to separate productivity, profitability and price recovery. Three models were selected and evaluated: the American Productivity Center (APC) model, the “Profitability = productivity + price recovery” (PPP) model, and the multi-factor productivity measurement model (MFPMM). The APC model was chosen as the most suitable because of its simplicity, its ease of set-up, its ability to produce both financial and non-financial data, its support for root cause analysis with an expert system, and the additional insight it offers the manager through Microsoft Excel’s What-If analysis “Goal Seek”. The APC model was set up for four periods, from 1 April 2004 to 30 September 2007. The overall profitability results of the circulation coins profit center show a positive contribution. Price recovery broke even in the 2006 financial year (period 2). In the 2007 financial year (period 3) there was a negative contribution, which improved to almost break-even in the six-month period of the 2008 financial year (period 4). This means that inflation on input resources was not fully recovered in the price of goods sold. The individual input costs show that the negative price recovery stems from the material, labour and energy cost contributions. There are plausible explanations for material and labour, but not for energy: metal volatility is the underlying cause of the material price variation, and the labour variation was a company strategy to adjust employees to higher percentiles. Productivity was always positive, with the highest contribution in the current financial year (period 4).
This means that profitability at the SA Mint has been driven by productivity in the past two financial years. The questionnaire survey shows average scores for productivity and quality. It is noteworthy that the lowest mean score for productivity is for the statement “Products are produced in an error-free process”, which is a productivity-quality measure. In addition, the same variable shows an r² value of 0.42. The conclusion is that even though productivity and quality are highly correlated and show a strongly positive relationship, quality remains a concern in the company. A link can be made to price recovery: low price recovery becomes harder to correct when quality is not consistently good. A defective product is a cost because it does not reach the customer; if it is reworked it is still a cost, albeit lower, and more importantly the rework decreases the available capacity. This study was successful in setting up the APC model and producing data of value to the company and to the academic world. Finally, the study succeeded in its quest to establish the relationship between productivity and quality.
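The separation of profitability into productivity and price recovery that motivated the choice of the APC model can be sketched with a single output and a single input. All quantities and prices below are hypothetical; the model's defining move is valuing current-period quantities at base-period prices to isolate the productivity effect:

```python
# APC-style decomposition: profitability change =
#   productivity change x price-recovery change.

def index(curr, base):
    """Ratio of a current-period value to its base-period value."""
    return curr / base

# Hypothetical base (q0, p0) and current (q1, p1) quantities and prices
output = {"q0": 1000, "p0": 5.0, "q1": 1150, "p1": 5.2}
inputs = {"q0": 800, "p0": 4.0, "q1": 860, "p1": 4.5}

# Profitability: value of output over cost of input, each period at its
# own prices, expressed as a change from the base period.
profitability = (index(output["q1"] * output["p1"], output["q0"] * output["p0"])
                 / index(inputs["q1"] * inputs["p1"], inputs["q0"] * inputs["p0"]))

# Productivity: same ratio but with BOTH periods valued at base prices,
# so only quantity changes remain.
productivity = (index(output["q1"] * output["p0"], output["q0"] * output["p0"])
                / index(inputs["q1"] * inputs["p0"], inputs["q0"] * inputs["p0"]))

# Price recovery: the residual price effect.
price_recovery = profitability / productivity
```

In this hypothetical example productivity improves while price recovery falls below one because input prices inflated faster than output prices, which mirrors the situation the study reports: positive productivity contributions alongside negative price recovery driven by input cost inflation.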
290

A Time Correlated Approach to Adaptable Digital Filtering

Grossman, Hy, Pellarin, Steve 10 1900 (has links)
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California / Signal conditioning is a critical element in all data telemetry systems. Data from all sensors must be band limited prior to digitization and transmission to prevent the potentially disastrous effects of aliasing. While the 6th order analog low-pass Butterworth filter has long been the de facto standard for data channel filtering, advances in digital signal processing techniques now provide a potentially better alternative. This paper describes the challenges in developing a flexible approach to adaptable data channel filtering using DSP techniques. Factors such as anti-alias filter requirements, time correlated sampling, decimation and filter delays will be discussed. Also discussed will be the implementation and relative merits and drawbacks of various symmetrical FIR and IIR filters. The discussion will be presented from an intuitive and practical perspective as much as possible.
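The linear-phase property that makes symmetrical FIR filters attractive for time-correlated sampling (every data channel sees the same fixed delay of (numtaps - 1)/2 samples, unlike an analog Butterworth's frequency-dependent delay) can be illustrated with a windowed-design low-pass. The sample rate and cutoff below are illustrative, not taken from the paper:

```python
import numpy as np
from scipy import signal

fs = 10_000.0    # Hz, data-channel sample rate (illustrative)
cutoff = 1_000.0 # Hz, anti-alias band limit before decimation
numtaps = 101    # odd length -> type I linear-phase FIR

# Windowed-sinc design (Hamming window by default)
taps = signal.firwin(numtaps, cutoff, fs=fs)

# Symmetric taps give a constant group delay of (numtaps - 1)/2 samples,
# so all channels can be time-correlated by a single known offset.
group_delay_samples = (numtaps - 1) / 2

# Check the magnitude response: ~unity in the passband, strongly
# attenuated well into the stopband.
w, h = signal.freqz(taps, worN=2048, fs=fs)
passband_gain = np.abs(h[w < 500]).mean()
stopband_gain = np.abs(h[w > 2000]).max()
```

The fixed, integer-expressible delay is what makes the deterministic time correlation across channels possible; decimation after such a filter then trades sample rate for the anti-alias margin the paper discusses.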
