51

Validation of a DC-DC Boost Circuit Model and Control Algorithm

Zumberge, Jon T. 27 August 2015 (has links)
No description available.
52

Improving the accuracy of statistics used in de-identification and model validation (via the concordance statistic) pertaining to time-to-event data

Caetano, Samantha-Jo January 2020 (has links)
Time-to-event data is very common in medical research. Thus, clinicians and patients need analysis of this data to be accurate, as it is often used to interpret disease screening results, inform treatment decisions, and identify at-risk patient groups (i.e., by sex, race, gene expression, etc.). This thesis tackles three statistical issues pertaining to time-to-event data. The first issue arose from an Institute for Clinical and Evaluative Sciences lung cancer registry data set, which was de-identified by censoring patients at an earlier date. This resulted in an underestimate of the observed times of censored patients. Five methods were proposed to account for the underestimation incurred by de-identification. A subsequent simulation study was conducted to compare the effectiveness of each method in reducing bias and mean squared error, as well as improving coverage probabilities, of four different Kaplan-Meier (KM) estimates. The simulation results demonstrated that situations with relatively large numbers of censored patients required methodology with larger perturbation. In these scenarios, the fourth proposed method (which perturbed censored times such that they were censored in the final year of the study) yielded estimates with the smallest bias, smallest mean squared error, and largest coverage probability. Alternatively, when there were smaller numbers of censored patients, any manipulation of the altered data set worsened the accuracy of the estimates. The second issue arises when investigating model validation via the concordance (c) statistic. Specifically, the c-statistic is intended for measuring the accuracy of statistical models which assess the risk associated with a binary outcome: it estimates the proportion of patient pairs in which the patient with the higher predicted risk experienced the event. The definition of a c-statistic cannot be uniquely extended to time-to-event outcomes, so many proposals have been made. The second project developed a parametric c-statistic which assumes the true survival times are exponentially distributed in order to invoke the memoryless property. A simulation study was conducted which included a comparative analysis against two other time-to-event c-statistics. Three different definitions of concordance in the time-to-event setting were compared, as were three different c-statistics. The c-statistic developed by the authors yielded the smallest bias when censoring is present in the data, even when the exponential parametric assumptions do not hold, and thus appears to be the most robust to censored data. It is therefore recommended for validating prediction models applied to censored data. The third project in this thesis developed and assessed the appropriateness of an empirical time-to-event c-statistic that is derived by estimating the survival times of censored patients via the EM algorithm. A simulation study was conducted for various sample sizes, censoring levels, and correlation rates. A non-parametric bootstrap was employed, and the mean and standard error of the bias of four different time-to-event c-statistics were compared, including the empirical EM c-statistic developed by the authors. The newly developed c-statistic yielded the smallest mean bias and standard error in all simulated scenarios, and appears to be the most appropriate when estimating the concordance of a time-to-event model.
Thus, it is recommended to use this c-statistic to validate prediction models applied to censored data. / Thesis / Doctor of Philosophy (PhD)
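The pairwise notion of concordance described in this abstract can be illustrated with a short, self-contained sketch. This is not code from the thesis; it is a minimal implementation of Harrell's c-statistic for right-censored data, in which a pair is usable only when the shorter time corresponds to an observed event.

```python
import numpy as np

def harrell_c(time, event, risk):
    """Harrell's concordance index for right-censored data.

    time  : observed times (event or censoring)
    event : 1 if the event was observed, 0 if censored
    risk  : model-predicted risk scores (higher = riskier)
    """
    concordant, usable = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            # A pair is usable only if subject i has the earlier, observed event.
            if event[i] == 1 and time[i] < time[j]:
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5  # ties in predicted risk count as half
    return concordant / usable

# Toy data: higher predicted risk corresponds to earlier events, so c = 1.0.
time  = np.array([2.0, 5.0, 3.0, 8.0])
event = np.array([1, 0, 1, 1])
risk  = np.array([0.9, 0.2, 0.7, 0.1])
print(harrell_c(time, event, risk))
```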
53

Dynamic Model of a Small Autonomous Hydrofoil Vessel

Moon, Heejip 06 June 2024 (has links)
This thesis presents the development of a six-degree-of-freedom nonlinear dynamic model for a single-mast, fully submerged hydrofoil vehicle. The aim of the model is to aid in evaluating various model-based controllers for autonomous operation by simulating their performance before implementation in the field. Initially, a first-principles approach is employed to develop an approximate dynamic model of the vehicle. Predictions of the vehicle motion using the first-principles model are then compared with data from tow tank experiments to assess the accuracy of the assumptions made in estimating the hydrofoil performance. Additionally, the dynamic model is adjusted to reflect the hydrodynamic forces measured in the tow tank tests. Using the modified dynamic model to simulate the vehicle motion, an initial height controller is designed and tuned in field trials until a stable foiling state is achieved. We evaluate the field results and discuss the limitations of employing steady-state tow tank data in establishing the vehicle dynamic model. / Master of Science / This thesis presents the development of a model describing the motion of a hydrofoil vehicle. The craft uses hydrofoils, which act like conventional airplane wings that work in water instead of air, to lift the hull fully out of the water. In order to maintain a set height above the water and direction of travel, the vehicle needs some form of controller for autonomous operation. The purpose of the vehicle model is to aid in the development of these controllers by simulating and evaluating their performance before implementation in the field. Initially, forces acting on the vehicle are approximated using fundamental hydrodynamic theory. The theoretical model is then compared with experimental data to assist in characterization of the hydrofoils. Building upon the measured test data, we create a preliminary height controller in simulation and conduct field trials to achieve a stable foiling state.
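As an aside for readers new to ride-height control (this is not the controller developed in the thesis), such a loop is often prototyped as a simple PID acting on a flap or elevator command; the gains and the crude plant dynamics below are placeholder assumptions.

```python
# Minimal PID ride-height loop for a hydrofoil, as an illustration only.
# The gains, reference height, and toy plant dynamics are assumptions and
# do not correspond to the vehicle model identified in the thesis.

def simulate(kp=2.0, ki=0.5, kd=0.8, h_ref=0.3, dt=0.01, t_end=10.0):
    h, h_dot, integral, prev_err = 0.0, 0.0, 0.0, h_ref
    t = 0.0
    while t < t_end:
        err = h_ref - h
        integral += err * dt
        deriv = (err - prev_err) / dt
        flap = kp * err + ki * integral + kd * deriv  # commanded flap angle (rad)
        prev_err = err

        # Toy plant: heave acceleration proportional to flap deflection, with damping;
        # weight and static lift are assumed trimmed out.
        h_ddot = 4.0 * flap - 1.5 * h_dot
        h_dot += h_ddot * dt
        h += h_dot * dt
        t += dt
    return h

print(f"height after 10 s: {simulate():.3f} m (reference 0.3 m)")
```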
54

Model Validation for a Steel Deck Truss Bridge over the New River

Hickey, Lucas James 26 May 2008 (has links)
This thesis presents the methods utilized to model a steel deck truss bridge over the New River in Hillsville, Virginia. These methods were evaluated by comparing analytical results with data recorded from 14 members during live load testing. The research presented herein is part of a larger endeavor to understand the structural behavior and collapse mechanism of the erstwhile I-35W bridge in Minneapolis, MN. Objectives accomplished toward this end include investigation of lacing effects on built-up member strain detection, live load testing of a steel truss bridge, and evaluation of modeling techniques against recorded data. Before any live load testing could be performed, it was necessary to confirm an acceptable strain gage layout for measuring member strains. The effect of riveted lacing in built-up members was investigated by constructing a two-thirds mockup of a typical bridge member. The mockup was then instrumented with strain gages and subjected to known strains in order to determine the most effective strain gage arrangement. The testing analysis concluded that, for a built-up member consisting of laced channels, one strain gage installed at the middle of the extreme fiber of each channel's flanges was sufficient. Thus, laced members on the bridge were each instrumented with four strain gages. Live load data were obtained by loading two trucks to 25 tons each. The trucks were positioned at eight locations on the bridge in four different relative truck positions. Data were recorded continuously and reduced to member forces for model validation comparisons. Deflections at selected truss nodes were also recorded for model validation purposes. The model validation process began by developing four simple truss models, each reflecting different expected restraint conditions, in the hope of bracketing the recorded results. The models were then refined to frames, and subsequently to frames including floor beams and stringers, for greater accuracy. The final, most accurate model was selected and used for a failure analysis. This model showed where the minimum amount of load could be applied in order to learn about the bridge's failure behavior, for a test to be conducted at a later time. / Master of Science
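For context, the data-reduction step mentioned above (strain readings to member forces) typically amounts to averaging the gage readings and multiplying by the axial stiffness. The sketch below is an assumption about that step, not a procedure quoted from the thesis, and the section properties are placeholders.

```python
# Reducing the four gage readings on a laced built-up member to an axial force:
# average the strains, then F = E * A * strain_avg.
# E and A are placeholder section properties, not values from the bridge.

E = 29000.0  # ksi, modulus of steel
A = 20.5     # in^2, assumed area of the built-up section
gage_microstrain = [84.0, 79.0, 88.0, 81.0]  # one gage per channel flange

strain_avg = sum(gage_microstrain) / len(gage_microstrain) * 1e-6
axial_force = E * A * strain_avg  # kips
print(f"axial force ~ {axial_force:.1f} kips")
```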
55

Development and Validation of a DEM-based Model for Predicting Grain Damage

Zhengpu Chen (7036694) 20 May 2024 (has links)
<p dir="ltr">During agricultural production, grain damage is a persistent problem that reduces grain quality. The goal of this study is to develop mechanics-based models that can accurately predict grain damage caused by mechanical handling processes and validate the models with lab-scale and industrial-scale test systems.</p><p dir="ltr">A discrete element method (DEM) simulation was developed to predict the impact damage of corn kernels in a Stein breakage tester. The DEM model relied on an empirically generated, three-parameter Weibull distribution describing the damage probability of repeated impacts. It was found that the DEM model was able to give good predictions on the kernel damage fraction for different sample sizes and operating times. The root-mean-square deviation between the damage fractions acquired from the simulation and experiment is 0.05. A sensitivity analysis was performed to study the effects of material and interaction properties on damage fraction. It was found that damage resistance parameters, coefficients of restitution, and particle shape representation had a significant effect on damage fraction. The statistics of the number of contacts and impact velocity were collected in the simulation to interpret the results of sensitivity analysis at the contact level. The locations where the damage occurs on the particle and in the operating device were also predicted by the model.</p><p dir="ltr">In addition to impact damage, another major type of grain damage is compression damage caused by mechanical harvesting and handling processes. A mechanistic model was developed to predict the compression damage of corn kernels using the DEM. The critical model input parameters were determined using a combination of single kernel direct measurements and bulk kernel calibration tests. The Young's modulus was measured with a single kernel compression test and verified with a bulk kernel compression test. An innovative approach was proposed to calibrate the average failure stress using a bulk kernel compression test. After implementation of the model, a validation test was performed using a Victoria mill. Comparing the simulation and the experimental results demonstrated that the simulation gave a good prediction of the damage fraction and the location of the damage when the von Mises stress damage criterion with a variable damage threshold was used. A sensitivity analysis was conducted to study the effects of selected model input parameters, including particle shape, Young's modulus, particle-particle coefficient of friction, particle-boundary coefficient of friction, particle-boundary coefficient of restitution, and damage criterion.</p><p dir="ltr">An industrial-scale handling system was designed and built to validate the DEM-based grain impact damage model. The low moisture content corn and soybean samples were handled through the system at three impeller speed levels and two feed rate levels, and the amount of damage caused by handling was evaluated. DEM simulations with the impact damage model were constructed and run under the corresponding test conditions. The experimental results showed that grain damage increased with increasing impeller speed and decreasing feed rate, which aligned with the model predictions. The simulated damage fraction values were larger than the experimental measurements when the experimentally-measured DEM input parameters were used. 
The simulation predictions can be significantly improved by decreasing the particle-boundary coefficient of restitution (PB COR). The mean absolute error between the simulation and experimental results decreased from 0.14 to 0.02 for the corn tests and from 0.05 to 0.01 for the soybean tests after the reduction of PB COR.</p><p dir="ltr">The developed damage models can accurately predict the amount of grain damage and the locations where the damage occur within a grain handling system. The models are expected to be useful in providing guidance on designing and operating grain handling processes to minimize kernel damage and, thus, improve grain quality. To further improve the performance of the model, the methods that accurately and efficiently determine the model input parameters need to be explored. In addition, in this work, the models were only applied to corn and soybeans at specific conditions. The applicability of the model to other types of grain, such as rice, or other grain conditions, such as wet corn, should be investigated.</p>
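The impact-damage submodel described above can be pictured with a short sketch: a three-parameter Weibull curve maps impact severity to breakage probability, and breakage is sampled from it. The parameter values and the choice of impact velocity as the severity measure are placeholders, not the calibrated values from the study.

```python
import numpy as np

def damage_probability(x, threshold, scale, shape):
    """Three-parameter Weibull CDF used as a breakage-probability curve.

    x         : impact severity (here, impact velocity)
    threshold : location parameter below which no damage occurs
    scale     : characteristic severity
    shape     : Weibull shape parameter
    """
    x = np.asarray(x, dtype=float)
    return 1.0 - np.exp(-(np.maximum(x - threshold, 0.0) / scale) ** shape)

# Placeholder parameters; in the study these come from repeated-impact tests.
rng = np.random.default_rng(0)
impact_velocity = rng.uniform(5.0, 25.0, size=1000)  # m/s, assumed range
p_break = damage_probability(impact_velocity, threshold=8.0, scale=12.0, shape=2.0)
broken = rng.random(1000) < p_break
print("simulated damage fraction:", broken.mean())
```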
56

A parametric simulation on the effect of the rejected brine temperature on the performance of multieffect distillation with thermal vapour compression desalination process and its environmental impacts

Buabbas, Saleh K., Al-Obaidi, Mudhar A.A.R., Mujtaba, Iqbal 31 March 2022 (has links)
Multieffect distillation with thermal vapour compression (MED–TVC) is one of the most attractive thermal desalination technologies for the production of freshwater. Several mathematical models have been presented in the open literature to analyse the steady-state performance of such a process. However, these models have several limitations and assumptions, so there remains the challenge of having a reliable model to accurately predict the performance of the MED process. This research attempts to resolve this challenge by rectifying the shortcomings of the models found in the literature and creating a new one. The robustness of the developed model is evaluated against actual data from the Umm Al-Nar commercial plant situated in the UAE. In seawater desalination, a large amount of high-salinity stream (brine) is rejected back into the sea. This paper investigates the influence of the rejected (exit) brine temperature on the system performance parameters of the MED–TVC process, specifically total heat consumption, gain output ratio, freshwater production, heat transfer area, and performance ratio. The parameters particular to the TVC section, namely the entrainment ratio, compression ratio, and expansion ratio, are also addressed. Moreover, a critical evaluation of the influence of the rejected brine temperature on the surrounding seawater is also included.
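For orientation, two of the performance parameters named above, the gain output ratio and the performance ratio, are simple ratios of distillate produced to heating steam supplied. The sketch below uses assumed round-number flows, not data from the Umm Al-Nar validation.

```python
# Illustrative calculation of two MED-TVC performance parameters.
# Flow rates and latent heat are assumed values, not plant data.

distillate_flow = 240.0    # kg/s of freshwater produced
motive_steam_flow = 28.0   # kg/s of motive steam supplied to the thermo-compressor
latent_heat = 2330.0       # kJ/kg, assumed latent heat of the motive steam

gain_output_ratio = distillate_flow / motive_steam_flow
print(f"GOR ~ {gain_output_ratio:.2f} kg distillate per kg motive steam")

# Performance ratio is commonly quoted as kg of distillate per 2326 kJ of heat input.
heat_input = motive_steam_flow * latent_heat        # kJ/s
performance_ratio = distillate_flow * 2326.0 / heat_input
print(f"PR ~ {performance_ratio:.2f}")
```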
57

Machine Learning-Based Parameter Validation

Badayos, Noah Garcia 24 April 2014 (has links)
As power system grids continue to grow in order to support an increasing energy demand, the system's behavior evolves accordingly, continuing to challenge designs for maintaining security. It has become apparent in the past few years that accurate simulations are as critical as discovering vulnerabilities in the power network. This study explores a classification method for validating simulation models using disturbance measurements from phasor measurement units (PMUs). The technique employs the Random Forest learning algorithm to find a correlation between specific model parameter changes and variations in the dynamic response. The measurements used for building and evaluating the classifiers were characterized using Prony decomposition. The generator model, together with its exciter, governor, and their standard parameters, was validated using short-circuit faults. Single-error classifiers were tested first, where the accuracies of classifiers built using positive-, negative-, and zero-sequence measurements were compared. The negative-sequence measurements consistently produced the best classifiers, with the majority of parameter classes attaining F-measure accuracies greater than 90%. A multiple-parameter-error technique for validation was also developed and tested on standard generator parameters. Only a few target parameter classes had good accuracies in the presence of multiple parameter errors, but the results were enough to permit a sequential validation process, in which eliminating a highly detectable error improves the detectability of suspect errors that depend on its removal, continuing until all corrections are covered. / Ph. D.
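The classification step described above can be sketched with scikit-learn; the synthetic features (standing in for Prony-derived modal quantities of the PMU disturbance records) and the parameter-error labels below are illustrative, not the thesis data set.

```python
# Sketch of mapping disturbance-response features to a parameter-error class
# with a Random Forest. Features and labels are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(1)
n = 600
X = rng.normal(size=(n, 6))     # stand-in for Prony damping/frequency features
y = rng.integers(0, 4, size=n)  # stand-in for "which parameter is in error"
X[:, 0] += 0.8 * y              # inject a weak dependence so the demo learns something

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("macro F-measure:", f1_score(y_te, clf.predict(X_te), average="macro"))
```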
58

Methodology for VHDL performance model construction and validation

Vuppala, Srilekha 29 August 2008 (has links)
Hardware description languages (HDLs) are frequently used to construct performance models that represent systems early in the design process. This research study has resulted in the development of a methodology to construct VHDL performance models, which will help to significantly reduce the time from an initial conception to a working design. To further reduce development time, reuse of existing structural primitives is emphasized. Typical models of multi-processor architectures are very large and complex, and validation of these models is difficult and time consuming. This thesis therefore also develops a methodology for model validation. A seventeen-processor raceway architecture, developed as part of the ongoing RASSP (Rapid Prototyping of Application Specific Signal Processors) project, is used as a template to illustrate the new methodologies of performance model construction and model validation. The design consists of seventeen processors interconnected by multiple crossbar switches. Two software algorithms were mapped onto the architecture: a Synthetic Aperture Radar (SAR) Range Processing Algorithm and a SAR Multiswath Processing Algorithm. The methodologies developed in this thesis will considerably reduce the amount of time needed to construct and validate performance models of complex multi-processor architectures. / Master of Science
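For readers unfamiliar with token-level performance modeling, the sketch below reproduces the idea in Python with SimPy rather than VHDL: processors emit abstract messages that contend for a shared crossbar, and only latencies, not data values, are modeled. The processor count, message counts, and timing constants are assumptions for illustration, not parameters of the RASSP raceway model.

```python
# Token-level performance-model sketch in Python/SimPy (the thesis uses VHDL).
# Messages carry timing only; the crossbar is modeled as a contended resource.
import simpy

CROSSBAR_LATENCY = 0.5e-6  # s per message transfer, assumed
COMPUTE_TIME = 10e-6       # s per work unit, assumed

def processor(env, name, crossbar, n_messages, log):
    for i in range(n_messages):
        yield env.timeout(COMPUTE_TIME)          # abstract compute phase
        with crossbar.request() as req:          # contend for the switch
            yield req
            yield env.timeout(CROSSBAR_LATENCY)  # model transfer latency only
        log.append((env.now, name, i))

env = simpy.Environment()
crossbar = simpy.Resource(env, capacity=1)       # single shared crossbar port
log = []
for p in range(4):                               # small stand-in for 17 processors
    env.process(processor(env, f"proc{p}", crossbar, n_messages=5, log=log))
env.run()
print(f"completion time: {env.now * 1e6:.1f} us over {len(log)} messages")
```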
59

Experimental and Modeling of Pneumatic Tire Performance on Ice

Jimenez, Emilio 23 April 2018 (has links)
The tire-ice interaction is a highly complex phenomenon which has a direct influence on the overall performance of the pneumatic tire. From tire-terrain interaction dynamics, it is evident that icy road conditions and tire operational parameters play a vital role in determining the overall performance of the vehicle. With the reduction of traction available at the surface in icy conditions, the dynamics of the vehicle become more unpredictable, as the system can become unstable. In order to design an appropriate safety system, the tire-ice interaction must be closely investigated. Since the tire is the part of the vehicle that is in direct contact with the terrain during operation, it is critical to have an in-depth understanding of the contact mechanics at the contact patch. This study has led to the further development and validation of an existing tire-ice model to improve the understanding of the contact phenomena at the tire-ice interface. Experimental investigations led to a novel measurement technique used to validate the semi-empirical tire-ice contact model. The Advanced Tire-Ice Interface Model simulates the temperature rise at the contact patch based on the pressure distribution in the contact patch and the thermal properties of the tread compound and the ice surface. Since its initial development, the advanced model has become capable of simulating the thin water film created by melted ice, predicting tractive performance, estimating the viscous friction due to the water layer, and capturing the influence of braking operations, including the locked-wheel condition. Experimental studies, carried out at the Terramechanics, Multibody, and Vehicle Systems (TMVS) Laboratory, were performed on the Terramechanics Rig. The investigation included measuring the bulk temperature distribution at the contact patch in order to validate the temperature-rise simulations of the original Tire-Ice Model. The tractive performance of a P225/60R16 97S Standard Reference Test Tire and a 235/55R-19 Pirelli Scorpion Verde All-Season Plus XL was also investigated during this study. A design of experiments was prepared to capture the tire tractive performance under various controlled operating conditions. / Ph. D. / Icy road conditions and tire performance play a vital role in determining the overall performance of a vehicle. With the reduction of traction available at the surface in icy conditions, the vehicle becomes more unpredictable and can become uncontrollable. In order to design an appropriate safety system, the tire-ice interaction must be closely investigated. This research aims at enhancing the understanding of the tire-ice contact interaction at the contact patch through modeling and experimental studies for a pneumatic tire traversing solid ice. Prior work in the laboratory produced a Tire-Ice Model (TIM) with the purpose of estimating the friction at the tire-ice interface. The current work builds on that study, resulting in the Advanced Tire-Ice Interface Model (ATIIM). This model predicts the temperature rise at the tire-ice interface based on the measured pressure distribution and the thermal properties of the tire and of the ice surface. This model allows a more thorough investigation of the tire-ice interface, being capable of predicting the height of the thin water film created by melted ice, the tractive performance of the tire, the viscous friction due to the water layer at the contact interface, and the influence of braking operations, including the locked-wheel (skid) condition. Experimental studies were carried out at the Terramechanics, Multibody, and Vehicle Systems (TMVS) Laboratory on the Terramechanics Rig. The experimental investigation included measuring the temperature at various points at the tire-ice interface in order to compare against the temperature rise predicted by the ATIIM. Furthermore, the tractive performance of the tire was investigated by examining different conditions of vertical tire load, tire inflation pressure, and ice surface temperature, as well as various steering configurations set by the user. In addition to investigating the performance at the tire-ice interface, a vehicle model in which the front wheels are considered as one (and likewise for the rear wheels), often referred to as the bicycle model, is studied while traveling over smooth ice. To ensure the accuracy of the vehicle simulation, the tire model chosen must account for the actual conditions in which the model will operate. In this study, the ATIIM is incorporated into empirical tire models commonly used in industry and used in conjunction with a vehicle model to accurately predict the behavior of the vehicle when operating on smooth ice.
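To give a sense of the physics involved (a textbook-style sliding-contact estimate, not the ATIIM equations), frictional heat flux at the contact patch scales with the friction coefficient, contact pressure, and sliding speed, and the surface temperature rise follows from the thermal properties of the ice; all numbers below are assumed.

```python
# Back-of-the-envelope frictional heating estimate at a tire-ice contact.
# Textbook sliding-contact relations with assumed numbers, not the ATIIM model.
import math

mu = 0.15             # assumed friction coefficient on ice
p = 250e3             # Pa, assumed mean contact pressure
v_slide = 0.5         # m/s, assumed sliding (slip) speed
q = mu * p * v_slide  # W/m^2, frictional heat flux generated at the interface

# Surface temperature rise of a semi-infinite solid under constant flux for time t,
# assuming (crudely) that all heat flows into the ice:
#   dT = 2 * q * sqrt(t / (pi * rho * c * k))
rho, c, k = 917.0, 2100.0, 2.2  # ice density, specific heat, conductivity (SI units)
t_contact = 0.01                # s, assumed dwell time of a tread element
dT = 2.0 * q * math.sqrt(t_contact / (math.pi * rho * c * k))
print(f"heat flux ~ {q / 1e3:.1f} kW/m^2, surface temperature rise ~ {dT:.2f} K")
```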
60

Validation and Evaluation of Emergency Response Plans through Agent-Based Modeling and Simulation

Helsing, Joseph 05 1900 (has links)
Biological emergency response planning plays a critical role in protecting the public from the possibly devastating results of sudden disease outbreaks. These plans describe the distribution of medical countermeasures across a region using limited resources within a restricted time window. Thus, the ability to determine that such a plan will be feasible, i.e., will successfully provide service to affected populations within the time limit, is crucial. Many of the current efforts to validate plans are in the form of live drills and training, but those may not test plan activation at the appropriate scale or with sufficient numbers of participants. This necessitates the use of computational resources to aid emergency managers and planners in developing and evaluating plans before they must be used. Current emergency response plan generation software packages, such as RE-PLAN or RealOpt, provide rate-based validation analyses, but these types of analysis may neglect details of real-world traffic dynamics. Therefore, this dissertation presents Validating Emergency Response Plan Execution Through Simulation (VERPETS), a novel computational system for the agent-based simulation of biological emergency response plan activation. This system converts raw road network, population distribution, and emergency response plan data into a format suitable for simulation, and then performs these simulations using SUMO (Simulation of Urban Mobility) to capture realistic traffic dynamics. Additionally, high-performance computing methodologies were utilized to decrease the agent load on simulations and improve performance. Further strategies, such as agent scaling and a time limit on simulation execution, were also examined. Experimental results indicate that the time to plan completion, i.e., the time when all individuals of the population have received medication, as determined by VERPETS aligned well with current alternative methodologies. It was determined that the dynamics of traffic congestion at the point of dispensing (POD) itself were one of the major factors affecting the completion time of the plan, which in turn allowed for more rapid calculations of plan completion time. Thus, this system provides not only a novel methodology to validate emergency response plans, but also a validation of other current strategies of emergency response plan validation.
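As a concrete illustration of the simulation driver described here (the configuration file name and the stopping criterion are assumptions, not the VERPETS code), SUMO can be stepped through its TraCI interface until all agents have completed their trips, with the final simulation time read off as the plan completion time.

```python
# Sketch of driving SUMO through TraCI to estimate plan completion time.
# The scenario file name and the "no vehicles left" stopping rule are
# illustrative assumptions, not the VERPETS implementation.
import traci

traci.start(["sumo", "-c", "response_plan.sumocfg"])  # hypothetical scenario file

# Step until no vehicles remain loaded or running, i.e. every agent has
# traveled to its POD and finished its trip.
while traci.simulation.getMinExpectedNumber() > 0:
    traci.simulationStep()

completion_time = traci.simulation.getTime()  # seconds of simulated time
print(f"plan completion time: {completion_time / 3600.0:.2f} simulated hours")
traci.close()
```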
