261

RESIDENTIAL ELECTRICITY CONSUMPTION ANALYSIS: A CROSS-DOMAIN APPROACH TO EVALUATE THE IMPACT OF COVID-19 IN A RESIDENTIAL AREA IN INDIANA

Manuel Eduardo Mar Valencia (11256321) 10 August 2021 (has links)
The pandemic scenario caused by COVID-19 is an event with no precedent. It is therefore a phenomenon that can be studied to observe how electricity loads changed during the stay-at-home order weeks. The data collection process was done through online surveys and using publicly available data. This study focuses on analyzing household energy units such as appliances, HVAC, and lighting systems. However, collecting this data is expensive and time-consuming, since dwellings would have to be studied individually. As a solution, previous studies have shown success in characterizing residential electricity use from surveys with stochastic models. This characterized electricity consumption data allows researchers to generate a predictive model, perform a regression, and understand the data. In that way, the data collection process will not be as costly as installing measuring instruments or smart meters. The input data will be the behavioral characteristics of each participant, while the output of the analysis will be the estimated electricity consumption in kWh. After generating the kWh target, a sensitivity analysis will be done to observe electricity consumption through time and examine how people's loads evolved during and after the stay-at-home order. This research can help in understanding the change in electricity consumption of people who worked at home during the pandemic and in generating energy indicators and costs, such as home-office electricity cost in kWh/year. In addition, utilities and energy managers can benefit from a clear understanding of domestic consumers during emergency scenarios such as pandemics.
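
As a rough illustration of the bottom-up, survey-driven estimation the abstract describes, the following Python sketch converts self-reported daily usage hours into an estimated weekly kWh figure. The appliance categories, power ratings, and jitter term are assumptions for illustration, not values or models from the thesis.

```python
# Minimal bottom-up sketch of survey-based household kWh estimation.
# Appliance power ratings (kW) and usage hours are illustrative
# assumptions, not values from the study.
import random

APPLIANCE_POWER_KW = {
    "refrigerator": 0.15,        # runs continuously at low average power
    "hvac": 3.0,
    "lighting": 0.30,
    "laptop_home_office": 0.06,
}

def weekly_kwh(survey_response, weeks_at_home=1):
    """Estimate weekly consumption from self-reported daily usage hours."""
    total = 0.0
    for appliance, hours_per_day in survey_response.items():
        power = APPLIANCE_POWER_KW[appliance]
        # Small random jitter stands in for the stochastic usage model the
        # abstract refers to (real models are considerably richer).
        jitter = random.uniform(0.9, 1.1)
        total += power * hours_per_day * 7 * weeks_at_home * jitter
    return total

# Example: a respondent working from home during the stay-at-home order.
response = {"refrigerator": 24, "hvac": 6, "lighting": 5, "laptop_home_office": 9}
print(f"Estimated weekly consumption: {weekly_kwh(response):.1f} kWh")
```
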
262

Statistical Designs for Network A/B Testing

Pokhilko, Victoria V 01 January 2019 (has links)
A/B testing refers to the statistical procedure of experimental design and analysis used to compare two treatments, A and B, applied to different testing subjects. It is widely used by technology companies such as Facebook, LinkedIn, and Netflix to compare different algorithms, web designs, and other online products and services. The subjects participating in these online A/B testing experiments are users who are connected in social networks of different scales. Two connected subjects are similar in terms of their social behaviors, education and financial background, and other demographic aspects. Hence, it is only natural to assume that their reactions to online products and services are related to their network adjacency. In this research, we propose to use the conditional autoregressive (CAR) model to represent the network structure and to include network effects in the estimation and inference of the treatment effect. The following statistical designs are presented: a D-optimal design for network A/B testing, a re-randomization experimental design approach for network A/B testing, and a covariate-assisted Bayesian sequential design for network A/B testing. The effectiveness of the proposed methods is shown through numerical results with synthetic networks and real social networks.
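
A minimal sketch of estimating a treatment effect when errors follow a CAR structure on a synthetic network, using generalized least squares with a known spatial parameter. The network, rho, and noise level are assumptions; the dissertation's D-optimal, re-randomization, and sequential Bayesian designs are not reproduced here.

```python
# Sketch: treatment-effect estimation under a CAR error structure on a
# synthetic network. rho, sigma2, and the random network are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, rho, sigma2, tau_true = 60, 0.6, 1.0, 0.8

# Random undirected adjacency matrix.
W = (rng.random((n, n)) < 0.1).astype(float)
W = np.triu(W, 1); W = W + W.T
D = np.diag(W.sum(axis=1) + 1e-6)           # small ridge keeps D invertible

# CAR precision and covariance: Sigma = sigma2 * (D - rho*W)^{-1}
Q = (D - rho * W) / sigma2
Sigma = np.linalg.inv(Q)

# Simulate a response with network-correlated errors and a treatment effect.
z = rng.integers(0, 2, size=n)              # A/B assignment
e = rng.multivariate_normal(np.zeros(n), Sigma)
y = 1.0 + tau_true * z + e

# Generalized least squares using the known CAR precision.
X = np.column_stack([np.ones(n), z])
beta = np.linalg.solve(X.T @ Q @ X, X.T @ Q @ y)
print(f"GLS estimate of treatment effect: {beta[1]:.3f} (true {tau_true})")
```
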
263

Optimisation of food overloading at long distance flights

Eger, Karl-Heinz, Uranchimeg, Tudevdagva 22 August 2009 (has links)
This paper deals with the optimisation of food overloading on long-distance flights. It describes how, in the case of two offered meals and two distinct passenger groups, reserve meals should be distributed between the two meal types so that the probability that each passenger gets the meal of their choice is maximised. A statistical procedure is presented for estimating the required demand probabilities.
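
A minimal sketch of the allocation problem described above, assuming a simple binomial model of meal choice (an assumption on my part, not necessarily the paper's demand model): for each way of splitting the reserve meals, compute the probability that every passenger receives their preferred meal and keep the best split.

```python
# Sketch: split r reserve meals between two meal types so that the
# probability every passenger gets their preferred meal is maximised.
from scipy.stats import binom

def prob_all_served(n_passengers, base_a, base_b, extra_a, extra_b, p_choose_a):
    """P(demand for A <= stock of A and demand for B <= stock of B)."""
    stock_a, stock_b = base_a + extra_a, base_b + extra_b
    # Demand for A ~ Binomial(n, p_choose_a); everyone is served if
    # demand_A <= stock_a and (n - demand_A) <= stock_b.
    lo = max(0, n_passengers - stock_b)
    hi = min(n_passengers, stock_a)
    if lo > hi:
        return 0.0
    return binom.cdf(hi, n_passengers, p_choose_a) - binom.cdf(lo - 1, n_passengers, p_choose_a)

def best_split(n_passengers, base_a, base_b, reserves, p_choose_a):
    splits = [(prob_all_served(n_passengers, base_a, base_b, k, reserves - k, p_choose_a), k)
              for k in range(reserves + 1)]
    return max(splits)

prob, extra_a = best_split(n_passengers=180, base_a=90, base_b=90, reserves=12, p_choose_a=0.55)
print(f"Load {extra_a} extra A-meals: P(everyone served) = {prob:.3f}")
```
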
264

Information Acquisition in Engineering Design: Descriptive Models and Behavioral Experiments

Ashish Mortiram Chaudhari (9183002) 29 July 2020 (has links)
Engineering designers commonly make sequential information acquisition decisions such as selecting designs for performance evaluation, selecting information sources, deciding whom to communicate with in design teams, and deciding when to stop design exploration. There is significant literature on normative decision making for engineering design; however, there is a lack of descriptive modeling of how designers actually make information acquisition decisions. Such descriptive modeling is important for accurately modeling design decisions, identifying sources of inefficiencies, and improving the design process. To that end, the research objective of the dissertation is to understand how designers make sequential information acquisition decisions and to identify models that provide the best description of a designer's decision strategies. To gain this understanding, the research approach consists of a synthesis of descriptive theories from the psychological and cognitive sciences, along with empirical evidence from behavioral experiments under different design situations. Statistical Bayesian inference is used to determine how well alternative descriptive decision models describe the experimental data. This approach quantifies a designer's decision strategies through posterior parameter estimation and Bayesian model comparison.

Two research studies, presented in this dissertation, focus on assessing the effects of monetary incentives, a fixed budget, the type of design space exploration, and the availability of system-wide information on information acquisition decisions. The first study investigates information acquisition by an individual designer when multiple information sources are available and the total budget is limited. The results suggest that the student subjects' decisions are better represented by the heuristic-based models than by the expected utility (EU)-based models. While the EU-based models result in better net payoff, the heuristic models used by the subjects generate better design performance. The results also indicate the potential for nudging designers' decisions toward maximizing the net payoff by setting the fixed budget at low values and providing monetary incentives proportional to the saved budget.

The second study investigates information acquisition through communication. The focus is on designers' decisions about whom to communicate with, and how much to communicate, when there is interdependence between the subsystems being designed. This study analyzes the team communication of NASA engineers at a mission design laboratory (MDL) as well as of engineering students designing a simplified automotive engine in an undergraduate classroom environment. The results indicate that the rate of interactions increases in response to a reduction in system-level design performance in both settings. Additionally, the following factors seem to positively influence communication decisions: pairwise design interdependence, node-wise popularity (significant with the NASA MDL engineers due to the large team size), and pairwise reciprocity.

The dissertation increases knowledge about engineering design decision making in the following aspects. First, individuals make information acquisition decisions using simple heuristics based on in-situ information such as the available budget and present system performance. The proposed multi-discipline approach proves helpful for describing heuristics analytically and inferring context-specific decision strategies using statistical Bayesian inference. This work has potential application in developing decision support tools for engineering design. Second, the comparison of communication patterns between the student design teams and the NASA MDL teams reveals that the engine experiment preserves some but not all of the communication patterns of interest. We find that representativeness depends not on matching subjects, tasks, and context separately, but rather on the behavior that results from the interactions of these three dimensions. This work provides lessons for designing representative experiments in the future.
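
As a hedged illustration of the Bayesian model comparison used in this line of work, the sketch below compares two toy models of an information acquisition decision, a constant-probability heuristic and a budget-sensitive rule, via grid-based marginal likelihoods. The models, priors, and simulated data are invented for illustration and do not correspond to the heuristic or expected-utility models studied in the dissertation.

```python
# Sketch of Bayesian model comparison between two candidate decision
# models using grid-based marginal likelihoods (all data are simulated).
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: at each step a designer either acquires more
# information (1) or stops (0), given the fraction of budget remaining.
budget_left = rng.uniform(0, 1, size=100)
p_true = 1 / (1 + np.exp(-6 * (budget_left - 0.4)))   # budget-sensitive rule
choices = rng.binomial(1, p_true)

def log_lik_constant(theta):
    """Model 1: acquire information with a fixed probability theta."""
    return np.sum(choices * np.log(theta) + (1 - choices) * np.log(1 - theta))

def log_lik_budget(beta):
    """Model 2: acquisition probability rises with remaining budget."""
    p = 1 / (1 + np.exp(-beta * (budget_left - 0.4)))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return np.sum(choices * np.log(p) + (1 - choices) * np.log(1 - p))

def log_marginal(log_lik, grid):
    """Marginal likelihood under a uniform prior on the grid (simple quadrature)."""
    logs = np.array([log_lik(g) for g in grid])
    m = logs.max()
    return m + np.log(np.mean(np.exp(logs - m)))   # log-mean-exp

lm1 = log_marginal(log_lik_constant, np.linspace(0.01, 0.99, 200))
lm2 = log_marginal(log_lik_budget, np.linspace(0.1, 15, 200))
print(f"log Bayes factor (budget model vs constant): {lm2 - lm1:.2f}")
```
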
265

THE GAME CHANGER: ANALYTICAL METHODS FOR ENERGY DEMAND PREDICTION UNDER CLIMATE CHANGE

Debora Maia Silva (10688724) 22 April 2021 (has links)
Accurate prediction of electricity demand is a critical step in balancing the grid. Many factors influence electricity demand. Among these factors, climate variability has been the most pressing one in recent times, challenging the resilient operation of the grid, especially during climatic extremes. In this dissertation, fundamental challenges related to accurate characterization of the climate-energy nexus are presented in Chapters 2 through 4, as described below.

Chapter 2 explores the cost of neglecting the role of humidity in predicting summer-time residential electricity consumption. Analysis of electricity demand in the CONUS region demonstrates that even though surface temperature, the most widely used metric for characterising heat stress, is an important factor, it is not sufficient for accurately characterizing cooling demand. The chapter proceeds to show significant underestimations of the climate sensitivity of demand, both in the observational space and under climate change. Specifically, the analysis reveals underestimations as high as 10-15% across CONUS, especially in high energy-consuming states such as California and Texas.

Chapter 3 takes a critical look at one of the most widely used metrics, namely Cooling Degree Days (CDD), often calculated with an arbitrary set point temperature of 65 °F (18.3 °C), ignoring possible variations due to different patterns of electricity consumption across regions and climate zones. In this chapter, updated values are derived based on historical electricity consumption data across the country at the state level. The Chapter 3 analysis demonstrates significant variation, as high as ±25%, between the derived set point values and the conventional value of 65 °F. Moreover, the CDD calculation is extended to account for the role of humidity, in the light of lessons learnt in the previous chapter. Our results reveal that under climate change scenarios, the air-temperature-based CDD underestimates thermal comfort by as much as ~22%.

The predictive analytics conducted in Chapters 2 and 3 revealed a significant challenge in characterizing the climate-demand nexus: the ability to capture the variability at the upper tails. Chapter 4 explores this specific challenge, with the goal of developing an algorithm to increase prediction accuracy at the higher quantiles of the demand distributions. Specifically, Chapter 4 presents a data-centric approach at the utility level (as opposed to the state-level analyses in the previous chapters), focusing on the high energy-consuming states of California and Texas. The developed algorithm shows a general improvement of 7% in the mean prediction accuracy and an improvement of 15% for the 90th-quantile predictions.
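
A small sketch of the cooling-degree-day calculation discussed in Chapter 3, comparing the conventional 65 °F set point with an alternative value and a humidity-adjusted variant. The 72 °F value and the effective-temperature blend are illustrative assumptions, not the set points or humidity formulation derived in the dissertation.

```python
# Sketch: cooling degree days (CDD) with the conventional 65 F set point
# versus a data-derived one, plus a crude humidity-adjusted variant.
import numpy as np

def cdd(daily_mean_temp_f, set_point_f=65.0):
    """Sum of positive exceedances of the set point, in degree-days."""
    excess = np.asarray(daily_mean_temp_f, dtype=float) - set_point_f
    return float(np.sum(np.clip(excess, 0.0, None)))

def humidity_adjusted_cdd(temp_f, rel_humidity, set_point_f=65.0, weight=0.1):
    """Blend temperature with relative humidity into an 'effective temperature'
    before computing CDD (a placeholder for a proper heat-stress metric)."""
    temp_f = np.asarray(temp_f, dtype=float)
    effective = temp_f + weight * (np.asarray(rel_humidity, dtype=float) - 40.0)
    return cdd(effective, set_point_f)

# One illustrative summer week of daily means.
temps = [78, 83, 90, 95, 88, 72, 69]          # deg F
rh    = [55, 60, 70, 75, 65, 50, 45]          # percent

print("CDD at 65 F set point:       ", cdd(temps, 65.0))
print("CDD at a derived 72 F point: ", cdd(temps, 72.0))
print("Humidity-adjusted CDD (65 F):", humidity_adjusted_cdd(temps, rh, 65.0))
```
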
266

DEVELOPMENT OF IMAGE-BASED DENSITY DIAGNOSTICS WITH BACKGROUND-ORIENTED SCHLIEREN AND APPLICATION TO PLASMA INDUCED FLOW

Lalit Rajendran (8960978) 07 May 2021 (has links)
There is growing interest in the use of nanosecond surface dielectric barrier discharge (ns-SDBD) actuators for high-speed (supersonic/hypersonic) flow control. A plasma discharge is created using a nanosecond-duration pulse of several kilovolts, leading to a rapid heat release and a complex three-dimensional flow field. Past work has been limited to qualitative visualizations such as schlieren imaging, and detailed measurements of the induced flow are required to develop a mechanistic model of the actuator performance.

Background-Oriented Schlieren (BOS) is a quantitative variant of schlieren imaging that measures density gradients in a flow field by tracking the apparent distortion of a target dot pattern. The distortion is estimated by cross-correlation, and the density gradients can be integrated spatially to obtain the density field. Owing to its simple setup and ease of use, BOS has been applied widely and is becoming the preferred density measurement technique. However, there are several unaddressed limitations with potential for improvement, especially for application to complex flow fields such as those induced by plasma actuators.

This thesis presents a series of developments aimed at improving the various aspects of the BOS measurement chain to provide an overall improvement in accuracy, precision, spatial resolution, and dynamic range. A brief summary of the contributions is:

1) a synthetic image generation methodology to perform error and uncertainty analysis for PIV/BOS experiments;
2) an uncertainty quantification methodology to report local, instantaneous, a-posteriori uncertainty bounds on the density field, by propagating displacement uncertainties through the measurement chain;
3) an improved displacement uncertainty estimation method using a meta-uncertainty framework whereby uncertainties estimated by different methods are combined based on their sensitivities to image perturbations;
4) a weighted-least-squares-based density integration methodology to reduce the sensitivity of the density estimation procedure to measurement noise;
5) a tracking-based processing algorithm to improve the accuracy, precision, and spatial resolution of the measurements;
6) a theoretical model of the measurement process to demonstrate the effect of density gradients on the position uncertainty, and an uncertainty quantification methodology for tracking-based BOS.

These improvements to BOS are then applied to perform a detailed characterization of the flow induced by a filamentary surface plasma discharge and to develop a reduced-order model for the length and time scales of the induced flow. The measurements show that the induced flow consists of a hot gas kernel filled with vorticity in a vortex ring that expands and cools over time. A reduced-order model is developed to describe the induced flow, and applying the model to the experimental data reveals that the vortex ring's properties govern the time scale associated with the kernel dynamics. The model predictions for the actuator-induced flow length and time scales can guide the choice of filament spacing and pulse frequencies for practical multi-pulse ns-SDBD configurations.
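
A minimal sketch of the density-integration step in BOS: given noisy density-gradient fields, recover the density by least squares on a forward-difference operator. The synthetic Gaussian field is a stand-in for real data, and this unweighted formulation omits the weighting scheme the thesis develops.

```python
# Sketch: recover a 2D "density" field from its noisy gradients by
# unweighted least squares on forward differences (synthetic data only).
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

ny, nx = 40, 40
yy, xx = np.mgrid[0:ny, 0:nx]
rho_true = np.exp(-((xx - 20) ** 2 + (yy - 20) ** 2) / 60.0)   # synthetic density

rng = np.random.default_rng(2)
drho_dx = np.gradient(rho_true, axis=1) + 0.01 * rng.normal(size=(ny, nx))
drho_dy = np.gradient(rho_true, axis=0) + 0.01 * rng.normal(size=(ny, nx))

n = ny * nx
idx = lambda i, j: i * nx + j
A = lil_matrix((2 * n + 1, n))
b, r = [], 0
# Horizontal forward differences: rho[i, j+1] - rho[i, j] = drho_dx[i, j]
for i in range(ny):
    for j in range(nx - 1):
        A[r, idx(i, j + 1)] = 1.0
        A[r, idx(i, j)] = -1.0
        b.append(drho_dx[i, j]); r += 1
# Vertical forward differences: rho[i+1, j] - rho[i, j] = drho_dy[i, j]
for i in range(ny - 1):
    for j in range(nx):
        A[r, idx(i + 1, j)] = 1.0
        A[r, idx(i, j)] = -1.0
        b.append(drho_dy[i, j]); r += 1
# One reference equation fixes the integration constant.
A[r, idx(0, 0)] = 1.0
b.append(rho_true[0, 0]); r += 1

rho_est = lsqr(A.tocsr()[:r], np.array(b))[0].reshape(ny, nx)
print("RMS reconstruction error:", float(np.sqrt(np.mean((rho_est - rho_true) ** 2))))
```
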
267

A tale of two applications: closed-loop quality control for 3D printing, and multiple imputation and the bootstrap for the analysis of big data with missingness

Wenbin Zhu (12226001) 20 April 2022 (has links)
1. A Closed-Loop Machine Learning and Compensation Framework for Geometric Accuracy Control of 3D Printed Products

Additive manufacturing (AM) systems enable direct printing of three-dimensional (3D) physical products from computer-aided design (CAD) models. Despite the many advantages that AM systems have over traditional manufacturing, one of their significant limitations that impedes their wide adoption is geometric inaccuracies, or shape deviations between the printed product and the nominal CAD model. Machine learning for shape deviations can enable geometric accuracy control of 3D printed products via the generation of compensation plans, which are modifications of CAD models informed by the machine learning algorithm that reduce deviations in expectation. However, existing machine learning and compensation frameworks cannot accommodate deviations of fully 3D shapes with different geometries. The feasibility of existing frameworks for geometric accuracy control is further limited by resource constraints in AM systems that prevent the printing of multiple copies of new shapes.

We present a closed-loop machine learning and compensation framework that can improve geometric accuracy control of 3D shapes in AM systems. Our framework is based on a Bayesian extreme learning machine (BELM) architecture that leverages data and deviation models from previously printed products to transfer deviation models, and more accurately capture deviation patterns, for new 3D products. The closed-loop nature of compensation under our framework, in which past compensated products that do not adequately meet dimensional specifications are fed into the BELMs to re-learn the deviation model, enables the identification of effective compensation plans and satisfies resource constraints by printing only one new shape at a time. The power and cost-effectiveness of our framework are demonstrated with two validation experiments that involve different geometries for a Markforged Metal X AM machine printing 17-4 PH stainless steel products. As demonstrated in our case studies, our framework can reduce shape inaccuracies by 30% to 60% (depending on a shape's geometric complexity) in at most two iterations, with three training shapes and one or two test shapes for a specific geometry involved across the iterations. We also perform an additional validation experiment using a third geometry to establish the capabilities of our framework for prospective shape deviation prediction of 3D shapes that have never been printed before. This third experiment indicates that choosing one suitable class of past products for prospective prediction and model transfer, instead of including all past printed products with different geometries, could be sufficient for obtaining deviation models with good predictive performance. Ultimately, our closed-loop machine learning and compensation framework provides an important step towards accurate and cost-efficient deviation modeling and compensation for fully 3D printed products using a minimal number of printed training and test shapes, and thereby can advance AM as a high-quality manufacturing paradigm.

2. Multiple Imputation and the Bootstrap for the Analysis of Big Data with Missingness

Inference can be a challenging task for Big Data. Two significant issues are that Big Data frequently exhibit complicated missing data patterns, and that the complex statistical models and machine learning algorithms typically used to analyze Big Data do not have convenient quantification of uncertainties for estimators. These two difficulties have previously been addressed using multiple imputation and the bootstrap, respectively. However, it is not clear how multiple imputation and bootstrap procedures can be effectively combined to perform statistical inferences on Big Data with missing values. We investigate a practical framework for the combination of multiple imputation and bootstrap methods. Our framework is based on two principles: distribution of multiple imputation and bootstrap calculations across parallel computational cores, and the quantification of sources of variability involved in bootstrap procedures that use subsampling techniques via random effects or hierarchical models. This framework effectively extends the scope of existing methods for multiple imputation and the bootstrap to a broad range of Big Data settings. We perform simulation studies for linear and logistic regression across Big Data settings with different rates of missingness to characterize the frequentist properties and computational efficiencies of the combinations of multiple imputation and the bootstrap. We further illustrate how effective combinations of multiple imputation and the bootstrap for Big Data analyses can be identified in practice by means of both the simulation studies and a case study on COVID infection status data. Ultimately, our investigation demonstrates how the flexible combination of multiple imputation and the bootstrap under our framework can enable valid statistical inferences in an effective manner for Big Data with missingness.
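
A compact sketch of one common way to combine multiple imputation with the bootstrap for a regression slope: impute several times, bootstrap within each completed dataset, and pool with Rubin-style rules. The data, the stochastic regression imputation, and the pooling shown are illustrative assumptions; the parallelization and random-effects variance components of the framework above are not reproduced here.

```python
# Sketch: bootstrap within each of m multiple imputations, then pool.
import numpy as np

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)
x_obs = x.copy()
x_obs[rng.random(n) < 0.3] = np.nan            # 30% of x missing at random

def impute(x_missing, y_all):
    """Stochastic regression imputation: draw missing x from its regression on y."""
    out = x_missing.copy()
    miss = np.isnan(out)
    b, a = np.polyfit(y_all[~miss], out[~miss], 1)          # x ~ a + b*y
    resid_sd = np.std(out[~miss] - (a + b * y_all[~miss]), ddof=2)
    out[miss] = a + b * y_all[miss] + rng.normal(0, resid_sd, size=miss.sum())
    return out

def slope(xv, yv):
    return np.polyfit(xv, yv, 1)[0]

m_imputations, n_boot = 10, 200
estimates = []
for _ in range(m_imputations):
    x_imp = impute(x_obs, y)
    boot = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        boot.append(slope(x_imp[idx], y[idx]))
    estimates.append((np.mean(boot), np.var(boot, ddof=1)))

point = np.mean([e[0] for e in estimates])
within = np.mean([e[1] for e in estimates])                  # average bootstrap variance
between = np.var([e[0] for e in estimates], ddof=1)          # variance across imputations
total_var = within + (1 + 1 / m_imputations) * between       # Rubin-style pooling
print(f"Pooled slope: {point:.3f} +/- {np.sqrt(total_var):.3f} (true 1.5)")
```
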
268

Statistical practice in preclinical neurosciences: Implications for successful translation of research evidence from humans to animals

Hogue, Olivia 23 May 2022 (has links)
No description available.
269

The Structural Basis for the Interdependence of Drug Resistance in the HIV-1 Protease

Ragland, Debra A. 13 December 2016 (has links)
The human immunodeficiency virus type 1 (HIV-1) protease (PR) is a critical drug target, as it is responsible for virion maturation. Mutations within the active site (1°) of the PR directly interfere with inhibitor binding, while mutations distal to the active site (2°) act to restore enzymatic fitness. Increasing mutation number is not directly proportional to the severity of resistance, suggesting that resistance is not simply additive but interdependent. The interdependency of primary and secondary mutations in driving protease inhibitor (PI) resistance is grossly understudied. To structurally and dynamically characterize the direct role of secondary mutations in drug resistance, I selected a panel of single-site mutant protease crystal structures complexed with the PI darunavir (DRV). From these studies, I developed a network hypothesis that explains how mutations outside the active site are able to propagate changes to the active site of the protease and disrupt inhibitor binding. I then expanded the panel to include highly mutated multi-drug-resistant variants. To elucidate the interdependency between primary and secondary mutations, I used statistical and machine-learning techniques to determine which specific mutations underlie the perturbations of key inter-molecular interactions. From these studies, I determined that mutations distal to the active site are able to perturb the global PR hydrogen bonding patterns, while primary and secondary mutations cooperatively perturb hydrophobic contacts between the PR and DRV. Discerning and exploiting the mechanisms that underlie drug resistance in viral targets could proactively improve both current treatments and future inhibitor design for HIV-1.
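
As a loose illustration of the kind of statistical analysis described, the sketch below uses sparse (Lasso) regression to flag which mutations in a synthetic variant panel are associated with a drop in an interaction count such as protease-inhibitor hydrogen bonds. The mutation list, data, and effect sizes are invented for illustration and do not reflect the study's results or methods in detail.

```python
# Sketch: relate mutation presence to a measured interaction count with
# sparse regression. All variant data here are synthetic.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(5)
mutations = ["V32I", "I47V", "I50V", "I54L", "L76V", "V82F", "I84V", "L90M"]
n_variants = 120

# Binary design matrix: which mutations each variant carries.
X = rng.binomial(1, 0.3, size=(n_variants, len(mutations)))

# Synthetic response: baseline hydrogen-bond count minus contributions
# from a few "causal" mutations, plus noise.
true_effect = np.array([0, -1.2, -2.0, 0, 0, -0.8, -1.5, 0])
hbond_count = 12 + X @ true_effect + rng.normal(0, 0.5, size=n_variants)

model = LassoCV(cv=5, random_state=0).fit(X, hbond_count)
for name, coef in zip(mutations, model.coef_):
    if abs(coef) > 0.05:
        print(f"{name}: estimated change in H-bond count = {coef:+.2f}")
```
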
270

SVD-BAYES: A SINGULAR VALUE DECOMPOSITION-BASED APPROACH UNDER BAYESIAN FRAMEWORK FOR INDIRECT ESTIMATION OF AGE-SPECIFIC FERTILITY AND MORTALITY

Chu, Yue January 2020 (has links)
No description available.
