281

Newsvendor Models With Monte Carlo Sampling

Ekwegh, Ijeoma W 01 August 2016 (has links)
The newsvendor model is used in solving inventory problems in which demand is random. In this thesis, we will focus on a method of using Monte Carlo sampling to estimate the order quantity that will either maximize revenue or minimize cost, given that demand is uncertain. Given data, the Monte Carlo approach will be used to sample demand over scenarios and to estimate the probability density function. A bootstrapping process yields an empirical distribution for the order quantity that maximizes the expected profit. Finally, this method will be applied to a newsvendor example to show that it works in maximizing profit.
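As an illustration of the approach described in the abstract, the sketch below estimates the profit-maximizing order quantity from sampled demand scenarios and bootstraps the data to obtain an empirical distribution for that quantity. The unit price, unit cost, and Poisson stand-in demand data are hypothetical assumptions, not values from the thesis.

# Minimal sketch: Monte Carlo estimate of the order quantity that maximizes
# average profit, plus a bootstrap distribution for that quantity.
# Price, cost, and demand data are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(0)
price, cost = 10.0, 6.0                      # hypothetical unit price and cost
demand_data = rng.poisson(100, size=500)     # stand-in for observed demand data

def best_order_quantity(demand_samples, q_grid):
    """Return the q on the grid with the highest average simulated profit."""
    profits = [np.mean(price * np.minimum(q, demand_samples) - cost * q)
               for q in q_grid]
    return q_grid[int(np.argmax(profits))]

q_grid = np.arange(50, 151)
q_star = best_order_quantity(demand_data, q_grid)

# Bootstrap: resample the demand data to get an empirical distribution of q*.
boot_q = [best_order_quantity(rng.choice(demand_data, size=len(demand_data),
                                          replace=True), q_grid)
          for _ in range(200)]
print(q_star, np.percentile(boot_q, [2.5, 97.5]))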
282

Exploring the Effectiveness of Environmentally Sustainable Practices in Municipal Government: A Case Study of the City of Knoxville’s Department of Parks and Recreation

Brown, Anthony Michael 01 August 2011 (has links)
Sustainability practices produce programs and services that meet current needs while preserving the environment and natural resources for the future. City parks and recreation departments are facing budget shortfalls and increasing expectations from customers. Governments are now embracing sustainability practices to create financial savings while also fostering relations with customers. The purpose of this single case study was twofold: (1) to examine the effectiveness of one city department’s strategies in outsourcing its environmental sustainability program through a performance contract with Ameresco; and (2) to examine the perceptions of key department employees about the effectiveness of the sustainability initiative. A snowball sample of 14 employees, stratified by employee class (upper administrative, middle management, and line staff), was drawn from the City of Knoxville, Tennessee’s parks and recreation department. Qualitative data generated from semi-structured interviews were coded and thematized to analyze the sampled employees’ perceptions of the agency’s sustainability practices. Additionally, financial archival data from utility bills (N = 96) were analyzed over the implementation phase of the contract to determine whether cost savings were realized. Key findings of the study included: (1) financial savings were realized across key operating areas as a result of the contract with Ameresco; (2) employees expressed positive feelings towards investment in environmental initiatives; (3) sustainability can be achieved through the implementation of an environmental sustainability performance contract; and (4) sustainable practices can further increase the efficiency of facilities operation. The results of this study may be generalized to cities of similar size and governmental structure.
283

Disease Correlation Model: Application to Cataract Incidence in the Presence of Diabetes

dePillis-Lindheim, Lydia 01 April 2013 (has links)
Diabetes is a major risk factor for the development of cataract [3,14,20,22]. In this thesis, we create a model that allows us to understand the incidence of one disease in the context of another; in particular, cataract in the presence of diabetes. The World Health Organization's Vision 2020 blindness-prevention initiative administers surgeries to remove cataracts, the leading cause of blindness worldwide [24]. One of the geographic areas most impacted by cataract-related blindness is Sub-Saharan Africa. In order to plan the number of surgeries to administer, the World Health Organization uses data on cataract prevalence. However, an estimate of the incidence of cataract is more useful than prevalence data for the purpose of resource planning. In 2012, Dray and Williams developed a method for estimating incidence based on prevalence data [5]. Incidence estimates can be further refined by considering associated risk factors such as diabetes. We therefore extend the Dray and Williams model to include diabetes prevalence when calculating cataract incidence estimates. We explore two possible approaches to our model construction: one is a detailed extension, and the other is a simplification of that extension. We provide a discussion comparing the two approaches.
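To give a sense of why incidence can be recovered from prevalence data, the toy calculation below uses a standard back-of-envelope relation for an irreversible condition: the rise in prevalence between two age groups, relative to the fraction still unaffected at the younger age, approximates the annual incidence. This is not the Dray and Williams method or the extension developed in the thesis, and the prevalence figures are hypothetical.

# Back-of-envelope illustration only; NOT the Dray-Williams model or the
# thesis's extension. Prevalence values below are hypothetical.
def incidence_between_ages(p_young, p_old, years_apart):
    """Approximate annual incidence among those unaffected at the younger age."""
    return (p_old - p_young) / ((1.0 - p_young) * years_apart)

# Hypothetical prevalences: 10% in the younger age group, 25% ten years later.
print(incidence_between_ages(p_young=0.10, p_old=0.25, years_apart=10))  # about 0.017 per year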
284

An Assessment of the Performances of Several Univariate Tests of Normality

Adefisoye, James Olusegun 24 March 2015 (has links)
The importance of checking the normality assumption in most statistical procedures, especially parametric tests, cannot be overemphasized, as the validity of the inferences drawn from such procedures usually depends on the validity of this assumption. Numerous methods have been proposed by different authors over the years, some popular and frequently used, others less so. This study addresses the performance of eighteen of the available tests for different sample sizes, significance levels, and a number of symmetric and asymmetric distributions by conducting a Monte Carlo simulation. The results showed that considerable power is not achieved for symmetric distributions when the sample size is less than one hundred, and for such distributions the kurtosis test is most powerful, provided the distribution is leptokurtic or platykurtic. The Shapiro-Wilk test remains the most powerful test for asymmetric distributions. We conclude that different tests are suitable under different characteristics of alternative distributions.
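A minimal Monte Carlo sketch of the kind of power comparison reported above is shown below: it estimates the rejection rate of the Shapiro-Wilk test at the 5% level against an asymmetric (exponential) alternative and a symmetric, leptokurtic (t with 3 degrees of freedom) alternative. The sample sizes and replication counts are illustrative choices, not those used in the study.

# Minimal power simulation for one normality test (Shapiro-Wilk); sample sizes
# and replication counts are illustrative, not those of the thesis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def power(sampler, n, reps=2000, alpha=0.05):
    """Fraction of simulated samples for which Shapiro-Wilk rejects normality."""
    return np.mean([stats.shapiro(sampler(n)).pvalue < alpha for _ in range(reps)])

for n in (20, 50, 100):
    skewed = power(lambda k: rng.exponential(size=k), n)        # asymmetric alternative
    heavy = power(lambda k: rng.standard_t(df=3, size=k), n)    # symmetric, leptokurtic alternative
    print(f"n={n:4d}  power vs exponential={skewed:.2f}  vs t(3)={heavy:.2f}")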
285

GIS-integrated mathematical modeling of social phenomena at macro- and micro-levels—a multivariate geographically-weighted regression model for identifying locations vulnerable to hosting terrorist safe-houses: France as case study

Eisman, Elyktra 13 November 2015 (has links)
Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models to support the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat—it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative, and geospatial variables that differ in terms of scale, weight, and type. Though many of these variables are recognized by specialists in security studies, there remains controversy with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically-weighted regression analysis produced the most accurate results by accommodating non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism. This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality-of-life.
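A minimal sketch of the geographically-weighted regression idea referenced in the abstract is given below: a separate weighted least-squares fit is computed at each location, with observation weights that decay with distance under a Gaussian kernel. The coordinates, covariates, and bandwidth are synthetic placeholders, not the dissertation's data or variables.

# Minimal GWR sketch: local weighted least-squares fits with a Gaussian distance
# kernel. All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 3
coords = rng.uniform(0, 100, size=(n, 2))                 # spatial locations
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta_true + 0.01 * coords[:, 0] * X[:, 1] + rng.normal(size=n)  # spatially varying effect

def gwr_coefficients(x0, bandwidth=25.0):
    """Fit a weighted least-squares model centred on location x0."""
    d = np.linalg.norm(coords - x0, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)                      # Gaussian distance kernel
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Local coefficients differ between locations, reflecting non-stationarity.
print(gwr_coefficients(np.array([10.0, 10.0])))
print(gwr_coefficients(np.array([90.0, 90.0])))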
286

Langevinized Ensemble Kalman Filter for Large-Scale Dynamic Systems

Peiyi Zhang (11166777) 26 July 2021 (has links)
The Ensemble Kalman filter (EnKF) has achieved great success in data assimilation in the atmospheric and oceanic sciences, but its failure to converge to the right filtering distribution precludes its use for uncertainty quantification. Other existing methods, such as the particle filter or sequential importance sampler, do not scale well to the dimension of the system and the sample size of the datasets. In this dissertation, we address these difficulties in a coherent way.

In the first part of the dissertation, we reformulate the EnKF under the framework of Langevin dynamics, which leads to a new particle filtering algorithm, the so-called Langevinized EnKF (LEnKF). The LEnKF algorithm inherits the forecast-analysis procedure from the EnKF and the use of mini-batch data from stochastic gradient Langevin-type algorithms, which make it scalable with respect to both the dimension and the sample size. We prove that the LEnKF converges to the right filtering distribution in Wasserstein distance under the big-data scenario in which the dynamic system consists of a large number of stages and has a large number of samples observed at each stage, and thus it can be used for uncertainty quantification. We reformulate the Bayesian inverse problem as a dynamic state estimation problem based on the techniques of subsampling and the Langevin diffusion process. We illustrate the performance of the LEnKF using a variety of examples, including the Lorenz-96 model, high-dimensional variable selection, Bayesian deep learning, and Long Short-Term Memory (LSTM) network learning with dynamic data.

In the second part of the dissertation, we focus on two extensions of the LEnKF algorithm. Like the EnKF, the LEnKF algorithm was developed for Gaussian dynamic systems containing no unknown parameters. We propose the so-called stochastic approximation LEnKF (SA-LEnKF) for simultaneously estimating the states and parameters of dynamic systems, where the parameters are estimated on the fly based on the state variables simulated by the LEnKF under the framework of stochastic approximation. Under mild conditions, we prove the consistency of the resulting parameter estimator and the ergodicity of the SA-LEnKF. For non-Gaussian dynamic systems, we extend the LEnKF algorithm (Extended LEnKF) by introducing a latent Gaussian measurement variable to dynamic systems. These two extensions inherit the scalability of the LEnKF algorithm with respect to the dimension and sample size. The numerical results indicate that they outperform other existing methods in both state/parameter estimation and uncertainty quantification.
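For context, the sketch below implements the standard stochastic (perturbed-observation) EnKF forecast-analysis cycle that the LEnKF reformulates; it is not the Langevinized algorithm itself. The linear state-space model, noise levels, and dimensions are illustrative assumptions.

# Standard stochastic EnKF forecast-analysis cycle on a toy linear-Gaussian
# state-space model; NOT the LEnKF. Dimensions and noise levels are illustrative.
import numpy as np

rng = np.random.default_rng(3)
d, m, n_ens = 10, 4, 50                      # state dim, obs dim, ensemble size
F = 0.95 * np.eye(d)                          # state transition
H = rng.normal(size=(m, d))                   # observation operator
Q, R = 0.1 * np.eye(d), 0.5 * np.eye(m)       # process / observation noise

def enkf_step(ensemble, y_obs):
    """One forecast-analysis cycle on an (n_ens, d) ensemble."""
    # Forecast: propagate each member through the dynamics with process noise.
    forecast = ensemble @ F.T + rng.multivariate_normal(np.zeros(d), Q, n_ens)
    # Analysis: Kalman-type update with the ensemble covariance and perturbed observations.
    P = np.cov(forecast, rowvar=False)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    perturbed = y_obs + rng.multivariate_normal(np.zeros(m), R, n_ens)
    return forecast + (perturbed - forecast @ H.T) @ K.T

ensemble = rng.normal(size=(n_ens, d))
x_true = np.ones(d)
for _ in range(20):
    x_true = F @ x_true + rng.multivariate_normal(np.zeros(d), Q)
    y = H @ x_true + rng.multivariate_normal(np.zeros(m), R)
    ensemble = enkf_step(ensemble, y)
print(np.linalg.norm(ensemble.mean(axis=0) - x_true))   # error of the ensemble mean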
287

Determinants of Health Care Use Among Rural, Low-Income Mothers and Children: A Simultaneous Systems Approach to Negative Binomial Regression Modeling

Valluri, Swetha 01 January 2011 (has links) (PDF)
The determinants of health care use among rural, low-income mothers and their children were assessed using a multi-state, longitudinal data set, Rural Families Speak. The results indicate that rural mothers’ decisions regarding health care utilization for themselves and for their child can be best modeled using a simultaneous systems approach to negative binomial regression. Mothers’ visits to a health care provider increased with higher self-assessed depression scores, increased number of child’s doctor visits, greater numbers of total children in the household, greater numbers of chronic conditions, need for prenatal or post-partum care, development of a new medical condition, and having health insurance (Medicaid/equivalent and HMO/private). Child’s visits to a health care provider, on the other hand, increased with greater numbers of chronic conditions, development of a new medical condition, and increased mothers’ visits to a doctor. Child’s utilization of pediatric health care services decreased with higher levels of maternal depression, greater numbers of total children in the household, if the mother had HMO/private health care coverage, if the mother was pregnant, and if the mother was Latina/African American. Mother’s use of health care services decreased with her age, increased number of child’s chronic conditions, income as a percent of the federal poverty line, and if child had HMO/private health care insurance. The study expands the econometric techniques available for assessing maternal and pediatric health care use and the results contribute to an understanding of how rural, low-income mothers choose the level of health care services use for themselves and for their child. Additionally, the results would assist in formulating policies to reorient the type of health care services provided to this vulnerable population.
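As a simplified, single-equation illustration of the kind of count model described above, the sketch below fits a negative binomial regression for a hypothetical number-of-visits outcome using statsmodels. The thesis estimates the mother and child equations jointly in a simultaneous system, which this sketch does not attempt; all variable names and simulated data are hypothetical.

# Single-equation negative binomial regression sketch; the thesis's simultaneous
# systems approach is not implemented here. Data and variable names are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
depression = rng.normal(size=n)
chronic = rng.poisson(1.0, size=n)
insured = rng.integers(0, 2, size=n)
X = sm.add_constant(np.column_stack([depression, chronic, insured]))

# Simulate over-dispersed counts (gamma-Poisson mixture) for the illustration.
mu = np.exp(X @ np.array([0.2, 0.3, 0.4, 0.5]))
visits = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))

model = sm.GLM(visits, X, family=sm.families.NegativeBinomial(alpha=0.5))
print(model.fit().summary())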
288

Addressing Challenges in Graphical Models: MAP estimation, Evidence, Non-Normality, and Subject-Specific Inference

Sagar K N Ksheera (15295831) 17 April 2023 (has links)
Graphs are a natural choice for understanding the associations between variables, and assuming a probabilistic embedding for the graph structure leads to a variety of graphical models that enable us to understand these associations even further. In the realm of high-dimensional data, where the number of associations between interacting variables is far greater than the available number of data points, the goal is to infer a sparse graph. In this thesis, we make contributions in the domain of Bayesian graphical models, where our prior belief on the graph structure, encoded via uncertainty on the model parameters, enables the estimation of sparse graphs.

We begin with the Gaussian Graphical Model (GGM) in Chapter 2, one of the simplest and most famous graphical models, where the joint distribution of interacting variables is assumed to be Gaussian. In GGMs, the conditional independence among variables is encoded in the inverse of the covariance matrix, also known as the precision matrix. Under a Bayesian framework, we propose a novel prior-penalty dual, called the 'graphical horseshoe-like' prior and penalty, to estimate the precision matrix. We also establish the posterior convergence of the precision matrix estimate and the frequentist consistency of the maximum a posteriori (MAP) estimator.

In Chapter 3, we develop a general framework based on local linear approximation for MAP estimation of the precision matrix in GGMs. This general framework holds for any graphical prior whose element-wise priors can be written as a Laplace scale mixture. As an application of the framework, we perform MAP estimation of the precision matrix under the graphical horseshoe penalty.

In Chapter 4, we focus on graphical models where the joint distribution of interacting variables cannot be assumed Gaussian. Motivated by quantile graphical models, where the Gaussian likelihood assumption is relaxed, we draw inspiration from the domain of precision medicine, where personalized inference is crucial to tailor individual-specific treatment plans. With an aim to infer Directed Acyclic Graphs (DAGs), we propose a novel quantile DAG learning framework in which the DAGs depend on individual-specific covariates, making personalized inference possible. We demonstrate the potential of this framework in the regime of precision medicine by applying it to infer protein-protein interaction networks in lung adenocarcinoma and lung squamous cell carcinoma.

Finally, we conclude this thesis in Chapter 5 by developing a novel framework to compute the marginal likelihood in a GGM, addressing a longstanding open problem. Under this framework, we can compute the marginal likelihood for a broad class of priors on the precision matrix, where the element-wise priors on the diagonal entries can be written as gamma or scale mixtures of gamma random variables and those on the off-diagonal terms can be represented as normal or scale mixtures of normal. This result paves the way for model selection using Bayes factors and the tuning of prior hyperparameters.
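For a concrete point of reference, the sketch below estimates a sparse precision matrix in a GGM with the graphical lasso, a penalized-likelihood analogue of the Bayesian MAP estimation studied in the thesis; it does not implement the graphical horseshoe-like prior or any of the chapters' methods. The synthetic data are illustrative.

# Sparse precision-matrix estimation via the graphical lasso (penalized likelihood),
# shown only as an analogue of the Bayesian methods in the thesis. Synthetic data.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(5)
p = 10
# Build a sparse, banded true precision matrix and sample Gaussian data from it.
precision_true = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
cov_true = np.linalg.inv(precision_true)
X = rng.multivariate_normal(np.zeros(p), cov_true, size=400)

model = GraphicalLassoCV().fit(X)
# Off-diagonal zeros in the estimated precision matrix encode conditional independence.
print(np.round(model.precision_, 2))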
289

Towards structured planning and learning at the state fisheries agency scale

Aldridge, Caleb A 09 December 2022 (has links)
Inland recreational fisheries has grown philosophically and scientifically to consider economic and sociopolitical (non-biological) aspects in addition to the biological. However, integrating the biological and non-biological aspects of inland fisheries has been challenging. Thus, an opportunity exists to develop approaches and tools that operationalize planning and decision-making processes encompassing both the biological and non-biological aspects of a fishery. This dissertation expands the idea that a core set of goals and objectives is shared among and within inland fisheries agencies; that many routine operations of inland fisheries managers can be regimented or standardized; and the novel concept that current information and operations can be used to improve decision making through structured decision making and adaptive management approaches at the agency scale. In CHAPTER II, my results show that the goals of inland fisheries agencies tend to be more similar than different but have expanded and diversified since the 1970s. I suggest that changes in perspectives and communication technology, as well as provisions within nationwide funding mechanisms, have led to goals becoming more homogenous across the USA and more diverse within each bureau. In CHAPTER III, I found that standardized collection and careful curation of data has allowed one inland fisheries bureau to acquire a large fish and fisheries database and that managers use this database to summarize common fish population parameters and indices, craft objectives, and set targets. The regimentation of data management and analysis has helped managers within the inland fisheries bureau to assess fish populations and fisheries efficiently and effectively across waterbodies within their districts and state. In CHAPTER IV, I extend CHAPTERS II and III to show that biological and non-biological management objectives and their associated measurable attributes and management actions can be synthesized into a common set of decision elements. I demonstrate how common decision elements enable managers to easily structure decisions and help to address common problems at the agency scale. Using a subset of common decision elements, I demonstrate how existing agency operations (e.g., monitoring) can be used to expedite learning and improve decision making for a common problem faced by managers in multiple, similar systems.
290

Software Internationalization: A Framework Validated Against Industry Requirements for Computer Science and Software Engineering Programs

Vũ, John Huân 01 March 2010 (has links)
View John Huân Vũ's thesis presentation at http://youtu.be/y3bzNmkTr-c. In 2001, the ACM and IEEE Computing Curriculum stated that it was necessary to address "the need to develop implementation models that are international in scope and could be practiced in universities around the world." With increasing connectivity through the internet, the move towards a global economy, and the growing use of technology, software internationalization has become a more important concern for developers. However, there has been a "clear shortage in terms of numbers of trained persons applying for entry-level positions" in this area. Eric Brechner, Director of Microsoft Development Training, suggested five new courses to add to the computer science curriculum due to the growing "gap between what college graduates in any field are taught and what they need to know to work in industry." He concludes that "globalization and accessibility should be part of any course of introductory programming," stating: "A course on globalization and accessibility is long overdue on college campuses. It is embarrassing to take graduates from a college with a diverse student population and have to teach them how to write software for a diverse set of customers. This should be part of introductory software development. Anything less is insulting to students, their family, and the peoples of the world." There is very little research into how the subject of software internationalization should be taught to meet the major requirements of the industry. The research question of the thesis is thus, "Is there a framework for software internationalization that has been validated against industry requirements?" The answer is no. Such a framework "would promote communication between academia and industry ... that could serve as a common reference point in discussions." Since no such framework for software internationalization currently exists, one will be developed here. The contribution of this thesis includes a provisional framework to prepare graduates to internationalize software and a validation of the framework against industry requirements. The requirement of this framework is to provide a portable and standardized set of requirements for computer science and software engineering programs to teach future graduates.
