321

A Discrete Choice Mean Variance (EV) Cost Model to Measure Impact of Household Risk from Drinking Water Pipe Corrosion

Sarver, Eric Andrew 08 June 2017
In traditional investment decision making, one tool commonly used is the mean variance model, also known as an expected-value variance (EV) model, which evaluates the anticipated payout of different assets with respect to uncertainty, where portfolios with higher risk demand higher expected returns from an individual. This thesis adapts this framework to a cost setting in which decision makers evaluate alternative physical assets that carry lifetime cost uncertainty for maintenance. Specifically, this paper examines homeowner choices for their home plumbing systems in the event of a pinhole leak, a tiny pin-sized hole that forms in copper drinking-water pipes. These leaks can cause substantial damage and cost homeowners thousands of dollars in repairs. Since pinhole leaks are not related to the age of the pipe material, a homeowner is subject to the risk of additional costs if a pinhole leak occurs again despite their repair efforts. The EV cost model in this paper defines two discrete choices for the homeowner in the event of a leak: to apply a simple repair at lower cost and higher future cost uncertainty, or to replace their plumbing with new pipe material, usually plastic, at a higher upfront cost but lower likelihood of future expenses. The risk preferences of homeowners are demonstrated by their repair strategy selection, as well as by the level of cost they incur to reduce uncertainty. Risk-neutral individuals will select the repair strategy with the lowest lifetime expected cost and high variance, while risk-averse homeowners will prefer to replace their plumbing at higher cost but lower variance. Risk-averse individuals are also exposed to indirect costs: an additional, unobserved cost in the form of a risk premium the homeowner is willing to pay to remove all uncertainty of future pinhole leak expense. Expected costs and variances are also higher for regions in the U.S. that experience elevated leak incident rates, known as hotspots. Using this mean variance cost framework, indirect cost can be quantified for homeowners in hotspot regions and compared to the rest of the U.S. to evaluate the magnitude of pinhole leak risk. The EV cost model estimates risk premiums on pinhole leaks to be $442 for homeowners in hotspots and $305 for those in the rest of the U.S. Finally, this paper examines the impact of pinhole leak cost uncertainty on the U.S. economy. Of an estimated $692 million in annual pinhole leak costs to homeowners, this study estimates a lower-bound cost of $54 million per year (7.8% of the estimated national annual cost) in risk premium that homeowners would be willing to pay to avoid pinhole leak cost uncertainty. Information in this study on the role of risk in home plumbing decisions and on indirect costs would be helpful to policymakers and water utility managers as they deal with infrastructure management decisions. Furthermore, the EV cost methodology established in this paper demonstrates an effective use of mean variance modeling under cost uncertainty. / Master of Science
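The EV cost comparison described above can be sketched in a few lines. The snippet below is a minimal illustration assuming a quadratic mean-variance cost criterion; the dollar figures and the risk-aversion parameter are hypothetical and are not the thesis's estimates.

```python
# Illustrative mean-variance (EV) cost comparison for two repair strategies.
# All numbers and the risk-aversion parameter are hypothetical, chosen only
# to show the mechanics; they are not estimates from the thesis.

def mv_cost(expected_cost, cost_variance, risk_aversion):
    """Mean-variance 'certainty equivalent' of an uncertain lifetime cost.

    A risk-neutral decision maker (risk_aversion = 0) ranks strategies by
    expected cost alone; a risk-averse one penalizes variance as well.
    """
    return expected_cost + risk_aversion * cost_variance


# Strategy A: simple repair -- lower expected cost, higher variance.
# Strategy B: full repipe  -- higher expected cost, lower variance.
repair = {"mean": 1500.0, "var": 1.2e6}
replace = {"mean": 2600.0, "var": 1.0e4}

for lam in (0.0, 1e-3):            # risk-neutral vs. (hypothetical) risk-averse
    ce_repair = mv_cost(repair["mean"], repair["var"], lam)
    ce_replace = mv_cost(replace["mean"], replace["var"], lam)
    choice = "repair" if ce_repair < ce_replace else "replace"
    # Risk premium: extra certain cost the homeowner would accept to
    # eliminate the variance of the chosen strategy.
    premium = lam * (repair["var"] if choice == "repair" else replace["var"])
    print(f"lambda={lam:g}: choose {choice}, risk premium ~ ${premium:,.0f}")
```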
322

A comparison of a distributed control system’s graphical interfaces : a DoE approach to evaluate efficiency in automated process plants / En jämförelse av grafiska gränssnitt för ett distribuerat kontrollsystem : en försöksplaneringsstrategi för att utvärdera effektiviteten i fabriker med automatiserade processer

Maanja, Karen January 2024
Distributed control systems play a central role for critical processes within a plant that need to be monitored or controlled. They ensure high production availability and output while simultaneously ensuring the safety of personnel and the environment. However, 5% of global annual production is lost to unscheduled downtime, 80% of unscheduled shutdowns could have been prevented, and 40% of these are caused by human error. This study is conducted at ABB's Process Automation team in Umeå. The aim is to examine whether different human-machine interfaces affect operators' effectiveness in resolving errors and maintaining a high production level. Design of experiments (DoE) is the chosen approach for the study, which includes planning and conducting an experiment in which the two response variables are Effect and Time. The independent variables examined are Scenario, Graphic, and Operator, which are used as factors in a factorial design, each having two levels. The experiment showed that the design of the human-machine interface has no statistically significant effect on either response, i.e. on production in terms of operator effectiveness or production efficiency. Instead, the level of experience of the operators appears to be the main contributor to variance in production in the models used.
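A minimal sketch of this kind of two-level, three-factor (2^3) factorial analysis is shown below, using synthetic data. The factor names follow the thesis, but the response values and effect sizes are invented solely to make the example runnable.

```python
# Minimal 2^3 factorial ANOVA sketch with synthetic data. Factor names follow
# the thesis (Scenario, Graphic, Operator); the "Time" response is invented,
# with only Operator given a real effect.
import itertools

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
levels = [-1, 1]
rows = []
for scenario, graphic, operator in itertools.product(levels, levels, levels):
    for _ in range(3):  # three replicates per treatment combination
        time = 120 + 15 * operator + rng.normal(scale=8)
        rows.append({"Scenario": scenario, "Graphic": graphic,
                     "Operator": operator, "Time": time})
df = pd.DataFrame(rows)

# Full factorial model with all interactions; the ANOVA table shows which
# factors contribute significantly to the variance of the response.
model = smf.ols("Time ~ C(Scenario) * C(Graphic) * C(Operator)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```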
323

Discrete-ordinates cost optimization of weight-dependent variance reduction techniques for Monte Carlo neutral particle transport

Solomon, Clell J. Jr. January 1900
Doctor of Philosophy / Department of Mechanical and Nuclear Engineering / J. Kenneth Shultis / A method for deterministically calculating the population variances of Monte Carlo particle transport calculations involving weight-dependent variance reduction has been developed. This method solves a set of equations developed by Booth and Cashwell [1979], but extends them to consider the weight-window variance reduction technique. Furthermore, equations that calculate the duration of a single history in an MCNP5 (RSICC version 1.51) calculation have been developed as well. The calculation cost of a Monte Carlo calculation, defined as the inverse figure of merit, can then be minimized deterministically from calculations of the expected variance and the expected calculation time per history. The method has been applied to one- and two-dimensional multi-group and mixed-material problems for optimization of weight-window lower bounds. With the adjoint (importance) function as a basis for optimization, an optimization mesh is superimposed on the geometry. Regions of weight-window lower bounds contained within the same optimization mesh element are optimized together with a scaling parameter. Using this additional optimization mesh restricts the size of the optimization problem, thereby eliminating the need to optimize each individual weight-window lower bound. Application of the optimization method to a one-dimensional problem, designed to replicate the variance-reduction iron-window effect, obtains an efficiency gain of a factor of 2 over standard deterministically generated weight windows. The gain in two-dimensional problems varies. For a 2-D block problem and a 2-D two-legged duct problem, the efficiency gain is a factor of about 1.2. The top-hat problem sees an efficiency gain of 1.3, while a 2-D three-legged duct problem sees an efficiency gain of only 1.05. This work represents the first attempt at deterministic optimization of Monte Carlo calculations with weight-dependent variance reduction. However, the current work is limited in the size of problems that can be run by the amount of computer memory available in computational systems. This limitation results primarily from the added discretization of the Monte Carlo particle weight required to perform the weight-dependent analyses. Alternate discretization methods for the Monte Carlo weight should be a topic of future investigation. Furthermore, the accuracy with which the MCNP5 calculation times can be calculated deterministically merits further study.
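The cost criterion can be illustrated with a short sketch. Assuming cost is taken as the inverse figure of merit, R²·T (relative variance times time per history), the snippet below minimizes it over a single weight-window scaling parameter. The variance and time models here are placeholder functions; in the dissertation both quantities come from deterministic (discrete-ordinates) calculations, not from these formulas.

```python
# Minimal sketch of minimizing cost = R^2 * T (inverse figure of merit) over
# a weight-window scaling parameter. The variance and time-per-history models
# below are invented placeholders for illustration only.
import numpy as np


def relative_variance(scale):
    # Placeholder: tighter windows (larger scale) reduce variance ...
    return 1.0 / (1.0 + 4.0 * scale)


def time_per_history(scale):
    # ... but increase the work per history through extra splitting.
    return 1.0 + 2.5 * scale**2


scales = np.linspace(0.0, 2.0, 201)
cost = np.array([relative_variance(s) * time_per_history(s) for s in scales])

best = scales[np.argmin(cost)]
gain = cost[0] / cost.min()  # efficiency gain relative to no scaling
print(f"optimal scaling ~ {best:.2f}, efficiency gain ~ {gain:.2f}x")
```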
324

Reduced Complexity Viterbi Decoders for SOQPSK Signals over Multipath Channels

Kannappa, Sandeep Mavuduru 10 1900
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / High-data-rate communication between airborne vehicles and ground stations over the bandwidth-constrained aeronautical telemetry channel has been enabled by the development of bandwidth-efficient Advanced Range Telemetry (ARTM) waveforms. This communication takes place over a multipath channel consisting of two components: a line-of-sight path and one or more ground-reflected paths, which result in frequency-selective fading. We concentrate on the ARTM SOQPSK-TG transmit waveform suite and decode information bits using the reduced-complexity Viterbi algorithm. Two different methodologies are proposed to implement reduced-complexity Viterbi decoders in multipath channels. The first method jointly equalizes the channel and decodes the information bits using the reduced-complexity Viterbi algorithm, while the second method applies a minimum mean-square error (MMSE) equalizer before the Viterbi decoder. An extensive numerical study compares the performance of the two methodologies. We also demonstrate the performance gain offered by our reduced-complexity Viterbi decoders over the existing linear receiver. In the numerical study, both perfect and estimated channel state information are considered.
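The MMSE equalization front end of the second methodology can be sketched compactly. The example below applies a block linear MMSE estimate over a hypothetical two-ray channel, with QPSK symbols standing in for the SOQPSK-TG waveform; the channel taps and noise level are assumptions, and the reduced-complexity Viterbi decoders of the paper are not reproduced here.

```python
# Block linear MMSE equalization sketch for a two-ray multipath channel.
# Channel taps, noise variance and the QPSK stand-in symbols are illustrative.
import numpy as np

rng = np.random.default_rng(7)

# Line-of-sight tap plus one delayed, attenuated ground reflection.
h = np.array([1.0 + 0.0j, 0.0, 0.5 * np.exp(1j * np.pi / 4)])

N = 200                                   # block of transmitted symbols
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N) / np.sqrt(2)

# Received block: channel convolution plus white Gaussian noise.
noise_var = 0.05
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(N + len(h) - 1)
                                  + 1j * rng.standard_normal(N + len(h) - 1))
y = np.convolve(symbols, h) + noise

# Convolution (Toeplitz) matrix H so that y ~ H @ symbols + noise.
H = np.zeros((N + len(h) - 1, N), dtype=complex)
for k in range(N):
    H[k:k + len(h), k] = h

# Block linear MMSE estimate: x_hat = (H^H H + sigma^2 I)^{-1} H^H y.
x_hat = np.linalg.solve(H.conj().T @ H + noise_var * np.eye(N), H.conj().T @ y)

hard = (np.sign(x_hat.real) + 1j * np.sign(x_hat.imag)) / np.sqrt(2)
print("symbol error rate:", np.mean(hard != symbols))
```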
325

Making Community: Student Subculture and Cultural Variance in the Harvey Mudd College Inner Dormitories

Mabon, Kinzie T 01 January 2016
This thesis is a study of the student subculture, and the cultural variances within that subculture, represented by the dorm practices of the inner dorms on Harvey Mudd College’s campus. Using Dick Hebdige’s theory of subcultures and David Schneider’s theory of cultural variances, this work examines the ways that the four inner dorms support and reproduce the Harvey Mudd College student subculture, so that all students share values and behaviors that are unique to the Harvey Mudd student population. After first establishing the presence of a Harvey Mudd College student subculture, the thesis views the dorm practices of the North, South, East, and West dorms through the lens of four main values shared by Harvey Mudd students, and makes the case that each of the four inner dorms works to give students of all backgrounds the opportunity to be participating members of the Harvey Mudd student subculture.
326

An investigation of the market efficiency of the Nairobi Securities Exchange

Njuguna, Josephine M. 10 1900
This study tests the market efficiency of the Nairobi Securities Exchange (NSE) after the year 2000 to determine the effect of technological advancements on market efficiency. The data used are the NSE 20 Share Index over the period 2001 to 2015 and the NSE All Share Index (NSE ASI) from its initiation in 2008 to 2015. The Efficient Market Hypothesis (EMH) is rejected for the NSE by the serial correlation test, the unit root tests and the runs test; however, it cannot be rejected by the more robust variance ratio test. Overall, the results on market efficiency are mixed. The most significant finding is that the efficiency of the NSE has increased since the year 2000, which suggests that advancements in technology have contributed to the increase in the market efficiency of the NSE. / Business Management / M. Com. (Business Management)
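The variance ratio test referred to above can be sketched as follows, in its simple homoscedastic Lo-MacKinlay form and without finite-sample bias corrections. The simulated random-walk price series stands in for the NSE index data, so VR(q) should be close to 1.

```python
# Simple Lo-MacKinlay variance ratio sketch on a synthetic random-walk index.
import numpy as np

rng = np.random.default_rng(0)
log_prices = np.cumsum(rng.normal(0.0005, 0.01, size=2000))  # synthetic index


def variance_ratio(p, q):
    """Variance ratio VR(q) and its asymptotic z-statistic under iid returns."""
    T = len(p) - 1
    mu = (p[-1] - p[0]) / T
    r1 = np.diff(p)                       # 1-period log returns
    rq = p[q:] - p[:-q]                   # overlapping q-period log returns
    var1 = np.sum((r1 - mu) ** 2) / T
    varq = np.sum((rq - q * mu) ** 2) / (T * q)
    vr = varq / var1
    z = (vr - 1.0) / np.sqrt(2 * (2 * q - 1) * (q - 1) / (3 * q * T))
    return vr, z


for q in (2, 4, 8, 16):
    vr, z = variance_ratio(log_prices, q)
    print(f"q={q:2d}: VR={vr:.3f}, z={z:+.2f}")
```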
327

Efficient Exact Tests in Linear Mixed Models for Longitudinal Microbiome Studies

Zhai, Jing January 2016
The microbiome plays an important role in human health. The analysis of associations between the microbiome and clinical outcomes has become an active direction in biostatistics research. Testing the microbiome effect on clinical phenotypes directly using operational taxonomic unit (OTU) abundance data is a challenging problem due to the high dimensionality, non-normality and phylogenetic structure of the data. Most studies focus only on describing the changes in microbial populations that occur in patients with a specific clinical condition. Instead, a statistical strategy utilizing distance-based or similarity-based non-parametric testing, in which a distance or similarity measure is defined between any two microbiome samples, has been developed to assess the association between microbiome composition and outcomes of interest. Despite the improvements, this test is still not easily interpretable and cannot adjust for potential covariates. A more recent approach, the kernel-based semi-parametric regression framework, evaluates the association while controlling for covariates. The framework utilizes a kernel function, which is a measure of similarity between samples' microbiome compositions and characterizes the relationship between the microbiome and the outcome of interest. This kernel-based regression model, however, cannot be applied in longitudinal studies since it cannot model the correlation between repeated measurements. We propose microbiome association exact tests (MAETs) in a linear mixed model framework to handle longitudinal microbiome data. MAETs can test not only the effect of the overall microbiome but also the effect of specific clusters of OTUs while controlling for others, by introducing additional random effects into the model. Current methods for testing multiple variance components are based on either asymptotic distributions or the parametric bootstrap, which require large sample sizes or high computational cost. The exact (R)LRT, a computationally efficient and powerful testing methodology, was derived by Crainiceanu. Since the exact (R)LRT can only be used to test one variance component, we propose an approach that combines the exact (R)LRT with a strategy for simplifying a linear mixed model with multiple variance components to the single-component case. Monte Carlo simulation studies show correctly controlled type I error and superior power in testing the association between microbiome and outcomes in longitudinal studies. Finally, the MAETs were applied to longitudinal pulmonary microbiome datasets to demonstrate that microbiome composition is associated with lung function and immunological outcomes. We also found two genera of interest, Prevotella and Veillonella, which are associated with forced vital capacity.
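The kernel construction that such kernel-based tests build on can be sketched briefly. The snippet below forms a linear kernel from OTU relative abundances and computes a naive score-type quantity from a null model's residuals; it is an illustration only, with simulated data, and it does not reproduce the exact (R)LRT machinery of the MAETs described above.

```python
# Sketch of a sample-similarity kernel built from OTU relative abundances.
# The OTU table, outcome, and the simple score-type quantity Q are invented
# for illustration; they are not the thesis's MAET procedure.
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_otus = 40, 150

counts = rng.poisson(5.0, size=(n_samples, n_otus))          # toy OTU table
rel_abund = counts / counts.sum(axis=1, keepdims=True)        # compositions

Z = rel_abund - rel_abund.mean(axis=0)                        # center columns
K = Z @ Z.T                                                   # linear kernel

# Toy continuous outcome (e.g., a lung-function measure) with no true
# microbiome effect, plus a SKAT-style quantity Q = r^T K r computed from
# the residuals of an intercept-only null model.
y = rng.normal(size=n_samples)
resid = y - y.mean()
Q = float(resid @ K @ resid)
print(f"kernel shape: {K.shape}, score-type statistic Q = {Q:.3f}")
```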
328

Right Ventricle Segmentation Using Cardiac Magnetic Resonance Images

Rosado-Toro, Jose A. January 2016
The World Health Organization has identified cardiovascular disease as the leading cause of non-accidental deaths in the world. The heart is identified as diseased when it is not operating at peak efficiency. Early diagnosis of heart disease can impact treatment and improve a patient's outcome. An early sign of a diseased heart is a reduction in its pumping ability, which can be measured by performing functional evaluations. These are typically focused on the ability of the ventricles to pump blood to the lungs (right ventricle) or to the rest of the body (left ventricle). Non-invasive imaging modalities such as cardiac magnetic resonance have allowed the use of quantitative methods for ventricular functional evaluation. The evaluation still requires the tracing of the ventricles in the end-diastolic and end-systolic phases. Even though manual tracing is still considered the gold standard, it is prone to intra- and inter-observer variability and is time-consuming. Therefore, substantial research has focused on the development of semi- and fully automated ventricle segmentation algorithms. In 2009 a medical imaging conference issued a challenge for short-axis left ventricle segmentation. A semi-automated technique using polar dynamic programming generated results that were within human variability. This is because a path in a polar coordinate system yields a circular object in the Cartesian grid, and the left ventricle can be approximated as a circular object. In 2012 there was a right ventricle segmentation challenge, but no polar dynamic programming algorithms were proposed. One reason may be that polar dynamic programming can only segment circular shapes. To use polar dynamic programming for segmentation of the right ventricle, we first expanded the capability of the technique to segment non-circular shapes. We apply this new polar dynamic programming in a framework that uses user-selected landmarks to segment the right ventricle in the four-chamber view. We also explore the use of four-chamber right ventricular segmentation to segment short-axis views of the right ventricle.
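The underlying polar dynamic programming step can be sketched as follows, assuming the image has already been resampled around a seed point into a polar cost map (low cost on edges). The sketch uses a simple smoothness constraint on radius changes between neighbouring angles; it does not include closure handling or the non-circular extensions developed in the thesis, and the toy cost map is invented.

```python
# Minimal polar dynamic programming sketch: find one radius per angle that
# minimizes accumulated edge cost subject to a smoothness constraint.
import numpy as np


def polar_dp(cost, max_jump=2):
    n_theta, n_r = cost.shape
    total = np.full((n_theta, n_r), np.inf)
    back = np.zeros((n_theta, n_r), dtype=int)
    total[0] = cost[0]
    for t in range(1, n_theta):
        for r in range(n_r):
            lo, hi = max(0, r - max_jump), min(n_r, r + max_jump + 1)
            prev = int(np.argmin(total[t - 1, lo:hi])) + lo
            back[t, r] = prev
            total[t, r] = total[t - 1, prev] + cost[t, r]
    # Trace the minimal-cost radius at the last angle back to the first.
    path = np.empty(n_theta, dtype=int)
    path[-1] = int(np.argmin(total[-1]))
    for t in range(n_theta - 1, 0, -1):
        path[t - 1] = back[t, path[t]]
    return path  # one radius index per angle


# Toy cost map whose minimum lies near radius 20 with a slow modulation.
theta = np.linspace(0, 2 * np.pi, 90, endpoint=False)
radii = np.arange(40)
target = 20 + 3 * np.sin(2 * theta)
cost_map = (radii[None, :] - target[:, None]) ** 2 / 10.0
print(polar_dp(cost_map)[:10])
```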
329

Empirical Bayes estimation of the extreme value index in an ANOVA setting

Jordaan, Aletta Gertruida 04 1900
Thesis (MComm) -- Stellenbosch University, 2014. / ENGLISH ABSTRACT: Extreme value theory (EVT) involves the development of statistical models and techniques in order to describe and model extreme events. In order to make inferences about extreme quantiles, it is necessary to estimate the extreme value index (EVI). Numerous estimators of the EVI exist in the literature; however, these estimators are only applicable in the single-sample setting. The aim of this study is to obtain an improved estimator of the EVI that is applicable to an ANOVA setting. An ANOVA setting lends itself naturally to empirical Bayes (EB) estimators, which are the main estimators under consideration in this study. EB estimators have not received much attention in the literature. The study begins with a literature review, covering the areas of application of EVT, Bayesian theory and EB theory. Different estimation methods for the EVI are discussed, focusing also on possible methods of determining the optimal threshold; specifically, two adaptive methods of threshold selection are considered. A simulation study is carried out to compare the performance of different estimation methods, applied only in the single-sample setting. First-order and second-order estimation methods are considered. In the case of second-order estimation, possible methods of estimating the second-order parameter are also explored. To obtain an estimator that is applicable to an ANOVA setting, a first-order EB estimator and a second-order EB estimator of the EVI are derived. A case study of five insurance claims portfolios is used to examine whether the two EB estimators improve the accuracy of estimating the EVI, compared to viewing the portfolios in isolation. The results showed that the first-order EB estimator performed better than the Hill estimator. However, the second-order EB estimator did not perform better than the “benchmark” second-order estimator, namely fitting the perturbed Pareto distribution to all observations above a pre-determined threshold by means of maximum likelihood estimation.
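The Hill estimator used as a benchmark above can be sketched in a few lines. The sample below is simulated from a Pareto distribution with a known EVI of 0.5, and the number of upper order statistics k is fixed by hand rather than chosen adaptively as in the study.

```python
# Hill estimator sketch on simulated heavy-tailed data with true EVI = 0.5.
import numpy as np

rng = np.random.default_rng(5)
gamma_true = 0.5
x = rng.pareto(1.0 / gamma_true, size=2000) + 1.0   # classical Pareto, alpha = 2


def hill_estimator(sample, k):
    """Hill estimate of the EVI from the k largest observations."""
    order = np.sort(sample)
    top = order[-k:]                 # k largest values
    threshold = order[-k - 1]        # (k+1)-th largest as threshold
    return float(np.mean(np.log(top) - np.log(threshold)))


for k in (50, 100, 200):
    print(f"k={k:3d}: Hill EVI estimate = {hill_estimator(x, k):.3f}")
```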
330

Enough is Enough : Sufficient number of securities in an optimal portfolio

Barkino, Iliam, Rivera Öman, Marcus January 2016
This empirical study has shown that optimal portfolios need approximately 10 securities to diversify away the unsystematic risk. This challenges previous studies of randomly chosen portfolios, which state that at least 30 securities are needed. The result of this study sheds light on the difference in risk diversification between random portfolios and optimal portfolios and is a valuable contribution for investors. The study suggests that a major part of the unsystematic risk in a portfolio can be diversified away with fewer securities by using portfolio optimization. Individual investors especially, who usually have portfolios consisting of few securities, benefit from these results. There are today multiple user-friendly software applications that can perform portfolio optimization without the user having to know the mathematics behind the program; Microsoft Excel's Solver function is one widely used example. In this study, however, MATLAB was used to perform all the optimizations. The study was carried out on data for 140 stocks on NASDAQ Stockholm during 2000-2014. Multiple optimizations were run with varying inputs in order to yield a result that depends only on the investigated variable, namely how many different stocks are needed in order to diversify away the unsystematic risk in a portfolio.
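The diversification effect the study measures can be sketched with the closed-form global minimum-variance portfolio. The one-factor covariance model and its parameters below are invented, standing in for the NASDAQ Stockholm data, and the unconstrained analytic solution w = S⁻¹1 / (1ᵀS⁻¹1) stands in for the constrained optimization performed in MATLAB.

```python
# Minimum-variance portfolio standard deviation as more securities are added.
# One-factor covariance model with invented parameters; illustration only.
import numpy as np

rng = np.random.default_rng(42)
n_assets = 140
betas = rng.uniform(0.5, 1.5, n_assets)          # exposure to a common market factor
idio = rng.uniform(0.15, 0.35, n_assets) ** 2    # idiosyncratic (unsystematic) variance
market_var = 0.04

# Covariance: systematic part plus diagonal unsystematic part.
cov = market_var * np.outer(betas, betas) + np.diag(idio)


def min_var_std(cov_subset):
    """Std of the unconstrained global minimum-variance portfolio."""
    ones = np.ones(len(cov_subset))
    w = np.linalg.solve(cov_subset, ones)
    w /= w.sum()
    return float(np.sqrt(w @ cov_subset @ w))


for n in (1, 2, 5, 10, 20, 30, 50, 140):
    sub = cov[:n, :n]
    print(f"{n:3d} securities: min-variance portfolio std = {min_var_std(sub):.3f}")
```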
