41

Biomarker discovery and statistical modeling with applications in childhood epilepsy and Angelman syndrome

Spencer, Elizabeth Rose Stevens 04 February 2022 (has links)
Biomarker discovery and statistical modeling reveal the brain activity that supports brain function and dysfunction. Detecting abnormal brain activity is critical for developing biomarkers of disease, elucidating disease mechanisms and evolution, and ultimately improving disease course. In this thesis, we develop statistical methodology to characterize neural activity in disease from noisy electrophysiological recordings. First, we develop a modification of a classic statistical modeling approach - multivariate Granger causality - to infer coordinated activity between brain regions. Assuming the signaling dependencies vary smoothly, we propose to write the history terms in autoregressive models of the signals using a lower-dimensional spline basis. This procedure requires fewer parameters than the standard approach, thus increasing the statistical power. We show that this procedure accurately estimates brain dynamics in simulations and in physiological recordings from a patient with pharmacoresistant epilepsy. This work provides a statistical framework to understand alterations in coordinated brain activity in disease. Second, we demonstrate that sleep spindles, thalamically-driven neural rhythms (9-15 Hz) associated with sleep-dependent learning, are a reliable biomarker for Rolandic epilepsy. Rolandic epilepsy is the most common form of childhood epilepsy and is characterized by nocturnal focal epileptic discharges as well as neurocognitive deficits. We show that sleep spindle rate is reduced regionally across the cortex and correlated with poor cognitive performance in epilepsy. These results provide evidence for a regional disruption to the thalamocortical circuit in Rolandic epilepsy, and a potential mechanistic explanation for the cognitive deficits observed. Finally, we develop a procedure to utilize delta rhythms (2-4 Hz), a sensitive biomarker for Angelman syndrome, as a non-invasive measure of treatment efficacy in clinical trials.
Angelman syndrome is a rare neurodevelopmental disorder caused by reduced expression of the UBE3A protein. Many disease-modifying treatments are being developed to reinstate UBE3A expression. To aid in clinical trials, we propose a procedure that detects therapeutic improvements in delta power outside of the natural variability over age by developing a longitudinal natural history model of delta power. These results demonstrate the utility of biomarker discovery and statistical modeling for elucidating disease course and mechanisms with the long-term goal of improving patient outcomes.
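The history-compression idea in the first abstract can be sketched in a few lines: instead of estimating one coefficient per lag in an autoregressive model, the lag kernel is expressed in a small smooth basis. The sketch below uses a low-order Legendre basis over the lag index as a stand-in for the spline basis the thesis describes; all data and parameter choices are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a signal y whose dependence on x's history is smooth in lag.
T, P = 2000, 20                                      # samples, model order (lags)
x = rng.standard_normal(T)
true_kernel = np.exp(-np.arange(1, P + 1) / 5.0)     # smooth history dependence
y = np.zeros(T)
for t in range(P, T):
    y[t] = true_kernel @ x[t - P:t][::-1] + 0.5 * rng.standard_normal()

# Lagged design matrix: column j holds x lagged by j+1 samples.
X = np.column_stack([x[P - j - 1:T - j - 1] for j in range(P)])
target = y[P:]

# Low-dimensional smooth basis over the lag index (K << P columns).
K = 5
lag = np.linspace(-1, 1, P)
B = np.column_stack(
    [np.polynomial.legendre.Legendre.basis(k)(lag) for k in range(K)]
)

# Fit in the reduced basis, then map back to per-lag coefficients.
beta_reduced, *_ = np.linalg.lstsq(X @ B, target, rcond=None)
kernel_hat = B @ beta_reduced        # P lag coefficients from only K parameters

print(K, "parameters instead of", P)
```

The estimated kernel tracks the true smooth kernel closely while using a quarter of the parameters, which is the source of the power gain the abstract mentions.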
42

Hodnocení komerčního rizika při exportu do Číny / Evaluation of the Commercial Risk at Exporting to China

Polák, Josef January 2013 (has links)
This PhD thesis focuses on current issues of commercial risk in international trade, particularly on the evaluation of commercial risk when exporting to China. It presents an initial theoretical framework for the problem, together with statistical results of primary research conducted among Czech exporters, both necessary to meet the objectives of the dissertation. The aim of this PhD thesis is to construct a model for assessing the commercial risks of exporting to China. The constructed model is probabilistic: the resulting commercial-risk rating is based on averaging the probable costs or losses caused by commercial risks that may affect an exporting business entity under an unsecured contract, and these losses may reach considerable values. The model can work with costs both in their absolute probable values and in relative values expressed as percentages of the contract value. The issue of trade with China is broad and encompasses several disciplines, which implies large potential for further research, aimed in particular at knowledge modeling and at extending the constructed probabilistic model into a knowledge model.
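The averaging approach described above reduces, at its core, to a probability-weighted expected loss over the commercial risks of an unsecured contract. The sketch below illustrates that arithmetic only; the risk categories, probabilities, and contract value are invented, not taken from the dissertation's research data.

```python
# Expected commercial-risk cost as a probability-weighted average.
# Risk names, probabilities, and loss percentages are illustrative.
risks = {
    # risk: (probability of occurrence, probable loss as % of contract value)
    "non-payment":        (0.10, 100.0),
    "late payment":       (0.25, 5.0),
    "contract rejection": (0.05, 60.0),
}
contract_value = 250_000.0  # illustrative contract value

# Relative rating: average probable loss as a percentage of contract value.
expected_loss_pct = sum(p * loss for p, loss in risks.values())
# Absolute rating: the same quantity expressed in currency units.
expected_loss_abs = contract_value * expected_loss_pct / 100.0

print(f"expected loss: {expected_loss_pct:.2f}% of contract "
      f"= {expected_loss_abs:,.0f}")
```

This also shows the duality the abstract notes: the same rating can be carried in relative (percentage) or absolute terms, differing only by the contract-value scaling.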
43

A Nonlinear Statistical Algorithm to Predict Daily Lightning in Mississippi

Thead, Erin Amanda 15 December 2012 (has links)
Recent improvements in numerical weather model resolution open the possibility of producing forecasts for lightning using indirect lightning threat indicators well in advance of an event. This research examines the feasibility of a statistical machine-learning algorithm known as a support vector machine (SVM) to provide a probabilistic lightning forecast for Mississippi at 9 km resolution up to one day in advance of a thunderstorm event. Although the results indicate that SVM forecasts are not consistently accurate with single-day lightning forecasts, the SVM performs skillfully on a data set consisting of many forecast days. It is plausible that errors by the numerical forecast model are responsible for the poorer performance of the SVM with individual forecasts. More research needs to be conducted into the possibility of using SVM for lightning prediction with input data sets from a variety of numerical weather models.
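A probabilistic SVM forecast of the kind described can be sketched with scikit-learn: an RBF-kernel SVM with Platt scaling turned on (`probability=True`) maps threat indicators to a lightning probability per grid cell. The feature set and data below are synthetic stand-ins, not the study's 9 km model output.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

# Synthetic stand-in for indirect lightning-threat indicators (e.g. CAPE,
# lifted index, precipitable water) -- features and labels are invented.
n = 500
X = rng.standard_normal((n, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.standard_normal(n) > 0).astype(int)

# Scale features, then fit an RBF SVM with probability calibration.
model = make_pipeline(StandardScaler(), SVC(probability=True, kernel="rbf"))
model.fit(X[:400], y[:400])

# Probabilistic forecast for held-out cases: P(lightning) per grid cell/day.
proba = model.predict_proba(X[400:])[:, 1]
print("mean predicted lightning probability:", proba.mean().round(3))
```

Evaluating such probabilities over many forecast days, rather than on a single day, mirrors the abstract's finding that the SVM is skillful in aggregate even when individual daily forecasts inherit errors from the driving weather model.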
44

Linkages between soil properties and phosphorus leaching from ground-based urban agriculture in Linköping, Sweden

Tai, Kara January 2022 (has links)
Cities have the potential to change the way resources and nutrients are utilized, as they are centers of consumption and waste production. Losses of nutrients such as nitrogen and phosphorus (P) to waterways drive eutrophication, a major water quality issue that marine ecosystems face (Bennett et al., 2001; Smith & Schindler, 2009). Urban agriculture (UA) provides a chance for some nutrient reuse within city boundaries, but there is a gap in knowledge regarding how soil properties influence P movement within UA contexts. To explore the relationships between P leachate and soil characteristics in urban gardens, I created generalized linear mixed models (GLMMs) using data from 8 gardens in Linköping, Sweden, collected over a period of 2 years. Though leachate data and soil traits varied between gardens, values from the urban gardens generally did not vary extensively compared to those from field studies or rural agriculture. As hypothesized, plant-available P from the ammonium lactate soil P test (P-AL) and the degree of P saturation (DPS) were both important, although why they were significant to their respective water quality variables was unclear. Moreover, spatial correlations were not as influential in P leaching as expected. Additionally, other important soil characteristics (pH, clay, plant-available iron (Fe-AL), and plant-available aluminum (Al-AL)) seemed to relate to P adsorption and release, indicating a need for future research in that direction.
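The modeling structure here - repeated leachate measurements nested within gardens, with soil properties as fixed effects - is the defining feature of a mixed model. As a minimal sketch, the code below fits a plain linear mixed model with a random intercept per garden using statsmodels; the thesis used zero-inflated beta GLMMs, and all variable names and data below are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Synthetic stand-in: 8 gardens, 10 observations each (mirroring the
# 8-garden design, but with invented soil and leachate values).
gardens = np.repeat(np.arange(8), 10)
pH = rng.normal(6.5, 0.5, 80)
clay = rng.normal(20.0, 5.0, 80)
garden_effect = rng.normal(0.0, 0.3, 8)[gardens]   # shared within a garden
leach_p = 0.4 * pH - 0.02 * clay + garden_effect + rng.normal(0, 0.2, 80)

df = pd.DataFrame({"garden": gardens, "pH": pH, "clay": clay,
                   "leach_p": leach_p})

# Random intercept per garden captures within-garden correlation that a
# plain regression would ignore.
fit = smf.mixedlm("leach_p ~ pH + clay", df, groups=df["garden"]).fit()
print(fit.params[["Intercept", "pH", "clay"]])
```

Grouping by garden is what lets the fixed-effect estimates for soil properties be assessed without being confounded by garden-to-garden baseline differences.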
45

The Maximization of the Logarithmic Entropy Function as a New Effective Tool in Statistical Modeling and Analytical Decision Making

Diab, Yosri 04 1900 (has links)
This thesis introduces a new, effective method for statistical modeling and probabilistic decision-making problems. The method is based on maximizing the Shannon logarithmic entropy function for information, subject to the given prior information serving as constraints, to generate a probability distribution. The method is known as the Maximum Entropy Principle or "Jaynes' Principle". Tribus used it earlier, but in a limited case, without general application to either statistical modeling or probabilistic decision making. In this thesis, a new method which generalizes the above principle is introduced, permitting practical applications, some of which are illustrated. / Thesis / Master of Engineering (ME)
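The principle the abstract names can be made concrete with Jaynes' own textbook example: given only that a die's long-run mean is 4.5 rather than 3.5, maximizing entropy subject to that mean constraint yields the Gibbs form p_i ∝ exp(λ·i), with the multiplier λ fixed by the constraint. A minimal numerical sketch (the example is standard, not taken from this thesis):

```python
import numpy as np
from scipy.optimize import brentq

# Die faces and the prior information: the observed mean is 4.5.
x = np.arange(1, 7)
target_mean = 4.5

def mean_gap(lam):
    """Mean of the Gibbs distribution p_i ∝ exp(lam * i), minus the target."""
    w = np.exp(lam * x)
    p = w / w.sum()
    return p @ x - target_mean

# Solve for the Lagrange multiplier enforcing the mean constraint.
lam = brentq(mean_gap, -5.0, 5.0)
p = np.exp(lam * x)
p /= p.sum()

print("maxent probabilities:", p.round(4))
print("check mean:", float(p @ x))
```

Among all distributions with mean 4.5, this one assumes nothing beyond the constraint: any other choice would encode information the prior data does not contain.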
46

Robust and Efficient Feature Selection for High-Dimensional Datasets

Mo, Dengyao 19 April 2011 (has links)
No description available.
47

A Model to Predict Ohio University Student Attrition from Admissions and Involvement Data

Roth, Sadie E. 05 August 2008 (has links)
No description available.
48

Volatility Modeling and Risk Measurement using Statistical Models based on the Multivariate Student's t Distribution

Banasaz, Mohammad Mahdi 01 April 2022 (has links)
An effective risk management program requires reliable risk measurement. Failure to assess inherent risks in mortgage-backed securities in the U.S. market contributed to the financial crisis of 2007–2008, which has prompted government regulators to pay greater attention to controlling risk in banks, investment funds, credit unions, and other financial institutions to prevent bankruptcy and financial crisis in the future. In order to calculate risk in a reliable manner, this thesis focuses on the statistical modeling of expected return and volatility. The primary aim of this study is to propose a framework, based on the probabilistic reduction approach, to reliably quantify market risk using statistical models and historical data. Particular emphasis is placed on the importance of the validity of the probabilistic assumptions in risk measurement by demonstrating how a statistically misspecified model will lead the evaluation of risk astray. The concept of market risk is explained by discussing the narrow definition of risk in a financial context and its evaluation and implications for financial management. After highlighting empirical evidence and discussing the limitations of the ARCH-GARCH-type volatility models using exchange rate and stock market data, we propose Student's t Autoregressive models to estimate expected return and volatility to measure risk, using Value at Risk (VaR) and Expected Shortfall (ES). The misspecification testing analysis shows that our proposed models can adequately capture the chance regularities in exchange rate and stock index data and give a reliable estimation of the regression and skedastic functions used in risk measurement. According to the empirical findings, the COVID-19 pandemic in the first quarter of 2020 posed an enormous risk to global financial markets. In 2021, after COVID-19 vaccine distribution began in developed countries, risk in financial markets returned to pre-pandemic levels.
/ Doctor of Philosophy / Reliable risk measurement is necessary for any effective risk management program. Hence, the primary purpose of this dissertation was to propose a framework to quantify market risk using statistical models and historical data, with a particular emphasis placed on checking the validity of probabilistic assumptions underlying models. After discussing the concept of market risk and its evaluation methods in financial management, we explored the empirical evidence in financial data and highlighted some limitations of other well-known modeling approaches. In order to ameliorate limitations, this study proposed Student's t Autoregressive models to estimate the conditional mean and the conditional variance of the financial variables and use them to measure risk via two popular methods: Value at Risk (VaR) and Expected Shortfall (ES). Further investigation shows that our proposed models can adequately model exchange rates and stock indexes data and give reliable estimations to use in risk measurement. We used our model to quantify risk in global financial markets in recent years. The results show that the COVID-19 pandemic posed an enormous risk to global financial markets in the first quarter of 2020. In 2021, the level of risk in financial markets returned to levels before the COVID-19 pandemic, after COVID-19 vaccine distribution started in developed countries.
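The two risk measures named in both abstracts can be sketched for a Student's t return distribution: VaR is a tail quantile of the loss distribution, and ES is the average loss beyond that quantile. The parameters below are illustrative placeholders, not estimates from the thesis' autoregressive specification, and ES is computed by Monte Carlo rather than the closed form.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Illustrative one-day return distribution: location, scale, and degrees
# of freedom (heavier tails than Gaussian for small nu).
mu, sigma, nu = 0.0005, 0.012, 5.0
alpha = 0.01                          # 99% confidence level

# Parametric VaR: negative of the alpha-quantile of returns.
q = stats.t.ppf(alpha, df=nu)
var_99 = -(mu + sigma * q)

# ES by Monte Carlo: average loss in the worst alpha tail.
r = mu + sigma * stats.t.rvs(df=nu, size=1_000_000, random_state=rng)
losses = -r
es_99 = losses[losses >= var_99].mean()

print(f"99% VaR: {var_99:.4f}   99% ES: {es_99:.4f}")
```

ES always exceeds VaR at the same level, and the gap widens as the degrees of freedom shrink, which is why tail-sensitive institutions prefer ES when returns are t-distributed rather than Gaussian.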
49

Análise de custo-eficácia dos pagamentos por serviços ambientais em paisagens fragmentadas: estudo de caso de São Paulo / Cost-effectiveness analysis of payments for environmental services in fragmented landscapes: case study in the State of São Paulo

Fendrich, Arthur Nicolaus 14 November 2017 (has links)
Mesmo com o crescimento da dependência da vida humana em relação aos serviços ecossistêmicos, a taxa de perda de diversidade genética no planeta tem alcançado níveis semelhantes à de grandes eventos de extinção, evidenciando a necessidade de ações para a conservação dos recursos naturais. Em adição aos tradicionais instrumentos de comando e controle para a conservação, os instrumentos econômicos têm tido crescente atenção no mundo nos últimos anos, com especial destaque para os Pagamentos por Serviços Ambientais (PSA). A abordagem de pagamentos de incentivos tem crescido na última década e, apesar das potencialidades que o PSA apresenta, muitos programas falham em incorporar o conhecimento científico em sua execução, sendo esse um dos aspectos que podem acarretar baixo desempenho ambiental e econômico. Neste contexto, o presente projeto buscou avaliar a custo-eficácia do PSA em paisagens fragmentadas. A área de estudo é o estado de São Paulo, cuja fragmentação historicamente ocorre pela expansão agropecuária e pelos diversos impactos decorrentes do grande crescimento populacional em seu território. Foram distribuídos questionários para a obtenção das preferências dos proprietários rurais paulistas em relação aos programas de PSA para restauração de vegetação nativa. Os dados coletados foram relacionados a características socioeconômicas e ambientais e um modelo beta inflacionado de zero misto dentro da classe GAMLSS foi utilizado. Em seguida, o modelo foi utilizado para predizer os resultados para os proprietários não entrevistados e a curva de investimento para diferentes retornos para conservação foi construída. Os resultados apontaram que o PSA é uma alternativa muito custosa frente aos orçamentos ambientais paulistas e que traz poucos benefícios para a restauração no estado de São Paulo. 
A pesquisa possui uma vertente teórica, pois contribui para a compreensão da adequabilidade do PSA em paisagens fragmentadas, e uma vertente prática, pois explicita a quantidade de recursos necessária para a execução dos programas analisados. / Although the dependence of human activities on ecosystem services has risen in the past decades, the current rate of genetic diversity loss has reached levels comparable to those of major extinction events, evidencing the need for action to conserve natural resources. In addition to the traditional command and control approach for the promotion of conservation, growing attention has been given to economic instruments, especially to Payments for Environmental Services (PES). Despite all the potentialities of the PES instrument, many programs fail to introduce scientific knowledge in their execution. Such a lack of foundation may result in low environmental and economic performance. The present research aims at evaluating the cost-effectiveness of PES in fragmented landscapes. The study area is the state of São Paulo, which has been fragmented by agricultural and pasture expansion, and by the impacts linked to large population growth. A survey with different PES programs was sent to rural landowners, and responses were analyzed and linked to socioeconomic and environmental characteristics through a zero-inflated beta mixed model, within the GAMLSS framework. The model was used to predict enrollment of non-respondents in different PES programs. Finally, the relationship between total area for restoration and the amount of resources needed for each program was compared to the environmental budget of the state of São Paulo. Results show that PES is a very costly alternative that provides only limited results for restoration. The present work has a theoretical orientation, as it contributes to the comprehension of the feasibility of PES programs in fragmented landscapes, and a practical orientation, as it quantifies the amount of resources required by the programs analyzed.
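The zero-inflated beta model named in the abstract has a simple two-part density: a point mass at zero (landowners who enroll no area at all) plus a beta density on (0, 1) for positive enrollment proportions. A minimal sketch of that density in the GAMLSS-style (mu, phi) parameterization, with invented parameter values:

```python
import numpy as np
from scipy import stats

def zib_pdf(y, mu=0.3, phi=5.0, nu=0.4):
    """Zero-inflated beta density: point mass nu at 0, else (1 - nu) times
    a Beta with mean mu and precision phi (illustrative parameter values)."""
    a, b = mu * phi, (1.0 - mu) * phi
    y = np.asarray(y, dtype=float)
    return np.where(y == 0.0, nu, (1.0 - nu) * stats.beta.pdf(y, a, b))

ys = np.array([0.0, 0.1, 0.5, 0.9])
print(zib_pdf(ys))
```

The mixture weight nu is itself modeled on covariates in a GAMLSS fit, which is what lets the model separate "whether a landowner enrolls at all" from "how much area they enroll".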
50

Maxent Estimation of Aquatic Escherichia Coli Stream Impairment

Gilfillan, Dennis, Joyner, Timothy Andrew, Scheuerman, Phillip R. 13 September 2018 (has links)
Background: The leading cause of surface water impairment in United States' rivers and streams is pathogen contamination. Although use of fecal indicators has reduced human health risk, current approaches to identify and reduce exposure can be improved. One important knowledge gap within exposure assessment is characterization of complex fate and transport processes of fecal pollution. Novel modeling processes can inform watershed decision-making to improve exposure assessment. Methods: We used the ecological model, Maxent, and the fecal indicator bacterium Escherichia coli to identify environmental factors associated with surface water impairment. Samples were collected in August, November, February, and May for 8 years on Sinking Creek in Northeast Tennessee and analyzed for 10 water quality parameters and E. coli concentrations. Univariate and multivariate models estimated probability of impairment given the water quality parameters. Model performance was assessed using area under the receiver operating characteristic curve (AUC) and prediction accuracy, defined as the model's ability to predict both true positives (impairment) and true negatives (compliance). Univariate models generated action values, or environmental thresholds, to indicate potential E. coli impairment based on a single parameter. Multivariate models predicted probability of impairment given a suite of environmental variables, and jack-knife sensitivity analysis removed unresponsive variables to elicit a set of the most responsive parameters. Results: Water temperature univariate models performed best as indicated by AUC, but alkalinity models were the most accurate at correctly classifying impairment. Sensitivity analysis revealed that models were most sensitive to removal of specific conductance. Other sensitive variables included water temperature, dissolved oxygen, discharge, and NO3.
The removal of dissolved oxygen improved model performance based on testing AUC, justifying development of two optimized multivariate models: a 5-variable model including all sensitive parameters, and a 4-variable model that excluded dissolved oxygen. Discussion: Results suggest that E. coli impairment in Sinking Creek is influenced by seasonality and agricultural run-off, stressing the need for multi-month sampling along a stream continuum. Although discharge was not predictive of E. coli impairment alone, its interactive effect stresses the importance of both flow dependent and independent processes associated with E. coli impairment. This research also highlights the interactions between nutrient and fecal pollution, a key consideration for watersheds with multiple synergistic impairments. Although one indicator cannot mimic the plethora of existing pathogens in water, incorporating modeling can fine tune an indicator's utility, providing information concerning fate, transport, and source of fecal pollution while prioritizing resources and increasing confidence in decision making.
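The AUC-based assessment of a single-parameter ("univariate") impairment model can be sketched as follows. Maxent itself is a presence-background method, so as a stand-in this sketch scores impairment with a simple logistic model and evaluates it with AUC; the water-temperature data and threshold are synthetic, not Sinking Creek measurements.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)

# Synthetic univariate predictor and impairment labels (invented:
# warmer water made more likely to exceed the E. coli criterion).
n = 300
water_temp = rng.normal(18.0, 5.0, n)                  # degrees C
impaired = (water_temp + rng.normal(0.0, 4.0, n) > 20.0).astype(int)

# Single-parameter probability-of-impairment model.
model = LogisticRegression().fit(water_temp.reshape(-1, 1), impaired)
scores = model.predict_proba(water_temp.reshape(-1, 1))[:, 1]

# AUC summarizes ranking skill across all possible action-value thresholds.
print("AUC:", round(roc_auc_score(impaired, scores), 3))
```

An "action value" in the abstract's sense corresponds to picking one threshold on this probability (or directly on temperature); AUC instead integrates performance over all thresholds, which is why a model can lead on AUC while another classifies best at its single chosen cutoff.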
