431 |
Brachiating power line inspection robot: controller design and implementation / Shongwe, Lindokuhle, 29 September 2021
The prevalence of electrical transmission networks has led to an increase in productivity and prosperity. In 2014, estimates showed that the global electric power transmission network consisted of 5.5 million circuit kilometres (Ckm) of high-voltage transmission lines with a combined capacity of 17 million megavolt-amperes (MVA). The vastness of the global transmission grid presents a significant problem for infrastructure maintenance. The high maintenance costs, coupled with challenging terrain, provide an opportunity for autonomous inspection robots. The Brachiating Power Line Inspection Robot (BPLIR) with wheels [73] is a transmission line inspection robot. The BPLIR is the focus of this research, and this dissertation tackles the problems of state estimation, adaptive trajectory generation, and robust control for the BPLIR.

A kinematics-based Kalman filter state estimator was designed and implemented to determine the full system state. The instrumentation consisted of two Inertial Measurement Units (IMUs). The advantages of utilising IMUs are that they are less susceptible to drift, have no moving parts, and are not prone to misalignment errors. The use of IMUs in the design meant that absolute angles (link angles measured with respect to earth) could be estimated, enabling the BPLIR to navigate inclined slopes.

Quantitative Feedback Theory was employed to address the issue of parameter uncertainty during operation, since the operating environment of the BPLIR requires it to be robust to environmental factors such as wind disturbance and to uncertainty in joint friction over time. The resulting robust control system was able to compensate for uncertain system parameters and reject disturbances in simulation.

An online trajectory generator (OTG), inspired by Raibert-style reverse-time symmetry [10], fed into the control system to drive the end effector to the power line by brachiation. The OTG produced two trajectories: one reverse-time symmetrical, and another that minimised the perpendicular distance between the end gripper and the power line. Linear interpolation between the two trajectories ensured a smooth, bumpless trajectory for the BPLIR to follow.
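A minimal sketch of the trajectory-blending step just described (hypothetical function and variable names; the OTG's actual formulation is given in the dissertation). A blend weight ramps linearly from the reverse-time-symmetric trajectory to the line-approach trajectory, which is what keeps the commanded reference bumpless:

```python
import numpy as np

def blended_trajectory(q_rts, q_approach, t, t_blend_start, t_blend_end):
    """Linearly interpolate between two joint-space trajectories.

    q_rts, q_approach: callables mapping time -> joint angles (rad).
    The blend weight alpha ramps 0 -> 1 over [t_blend_start, t_blend_end],
    so the reference hands over smoothly (bumplessly) between trajectories.
    """
    alpha = np.clip((t - t_blend_start) / (t_blend_end - t_blend_start), 0.0, 1.0)
    return (1.0 - alpha) * q_rts(t) + alpha * q_approach(t)

# Illustrative use: blend a swing reference into a line-approach reference.
q_rts = lambda t: np.array([np.sin(t), 0.5 * np.cos(t)])  # placeholder swing motion
q_app = lambda t: np.array([0.8, 0.1])                    # placeholder approach pose
print(blended_trajectory(q_rts, q_app, t=1.5, t_blend_start=1.0, t_blend_end=2.0))
```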
|
432 |
Evaluation of the stress relaxation technique for measuring softening kinetics in aluminium alloys / George, Sarah, January 2006
Includes bibliographical references. / The development of the microstructure during thermomechanical processing (TMP) is critical in determining the final properties and quality of metal strip. In the particular case where aluminium sheet is used for lithographic applications, the surface appearance after electro-etching should be devoid of any streaking or inhomogeneous discolouration. Possible streaking effects can be related to poor microstructure development during TMP and often arise as a result of inadequate recrystallisation. To avoid these deleterious effects, it is important to implement the appropriate rolling conditions in order to control the processes of recovery and recrystallisation. The correct rolling conditions are usually established by extensive laboratory simulations and concomitant microstructural analysis. Because this approach is often tedious, the present study has investigated the use of the stress relaxation technique to provide rapid data on the recovery and recrystallisation kinetics of commercial purity aluminium under deformation conditions that closely simulate hot rolling operations. Stress relaxation (SR) curves have been generated for AA1200 aluminium, as well as for two magnesium-containing alloys, namely AA5182 (5 wt% Mg) and an experimental alloy (Al-1%Mg). Fully recrystallised microstructures were subjected to uniaxial compression in the temperature range of 300-400°C. Strains and strain rates were up to 0.7 and 1 s⁻¹, respectively. Stress relaxation was measured for intervals up to 15 minutes, and the evolved microstructures were examined after fixed intervals using polarised light microscopy and electron backscatter diffraction.
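For context, softening kinetics are commonly extracted from SR data by converting the relaxation curve into a softened fraction and fitting JMAK-type kinetics; the relations below are a standard formulation from the literature, not necessarily the exact analysis used in this thesis:

```latex
% Softened fraction from the SR curve, where \sigma_1(t) and \sigma_2(t) are the
% extrapolated early (recovery-only) and late (fully softened) linear stages of
% the stress vs. log(time) plot:
X(t) = \frac{\sigma_1(t) - \sigma(t)}{\sigma_1(t) - \sigma_2(t)}
% JMAK (Avrami) fit, giving the rate constant k and exponent n:
X(t) = 1 - \exp\!\left(-k\,t^{\,n}\right)
```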
|
433 |
AN IMPLEMENTATION STRATEGY FOR LEAN MANUFACTURING IN HIGH-MIX AND LOW-VOLUME (HMLV) ENVIRONMENT / Aljubiri, Abdullah, 29 August 2019
No description available.
|
434 |
Design of the Trunk and Torso of a Lower-Limb Exoskeleton / Paar, Maja, 01 September 2021
No description available.
|
435 |
On Building Blocks for Virtual Testing of Unidirectional Polymeric Composites / January 2019
This research summarizes the characterization of the constituent materials of a unidirectional composite for use in a finite element model, specifically the T800s-F3900 composite from Toray Composites, Seattle, WA. Testing was carried out on cured polymer matrix provided by the manufacturer and on single-fiber specimens. The material model chosen for the polymer matrix was MAT 187 (Semi-Analytical Model for Polymers), which allowed for input of the tension, compression, and shear load responses.

The matrix was tested in tension, compression, and shear and was assumed to be isotropic. Ultimate strengths of the matrix were found to be 10 580 psi in tension, 25 900 psi in compression, and 5 940 psi in shear. The calculated material properties suggest the resin is isotropic, with the moduli in tension and compression approximately equal (a 3% difference between the experimental values) and the shear modulus following the typical isotropic relation. Single-fiber properties were obtained for the T800s fiber in tension only, with the modulus approximately 40 500 ksi and the peak stress approximately 309 ksi.
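For reference, the typical isotropic relation alluded to above is the standard elastic identity (stated here for context; the thesis reports the measured values):

```latex
G = \frac{E}{2(1+\nu)} \qquad \text{($\nu$ is the resin's Poisson's ratio)}
```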
The material model predicts the behavior of the multi-element testing simulations in both deformation and failure in the direction of loading. / Dissertation/Thesis / Masters Thesis Engineering 2019
|
436 |
Development of a Framework for Charging Energy Storage for Large-Nonlinear Loads in a Tightly Coupled Microgrid Power System / Unknown Date
Tightly coupled microgrid power systems strike a balance between maintaining the desired power quality and ensuring a high utilization efficiency of large-nonlinear loads, which is of great interest to the US Navy. Application of large-nonlinear loads in microgrid power systems requires energy storage to serve as the power supply for the large-nonlinear load. Degradation of power quality caused by improperly charging a large-nonlinear load's energy storage will affect the normal operation of every electrical device in the microgrid. The objectives of the proposed generalized framework for charging energy storage for large-nonlinear loads in a tightly coupled microgrid are to 1) mitigate the power impact of energy storage charging in order to maintain the desired power quality in microgrid power systems, and 2) ensure a rapid charging speed of the energy storage. In this research, the aforementioned objectives were pursued using both hardware and software solutions. The hardware solution involved selecting an optimal distribution architecture that can ensure fast charging of energy storage without degradation of power quality. The software solution involved the development of a comprehensive control strategy that can coordinate power generation controls and energy storage charging. The goal of developing a generalized, microgrid-scale framework for charging the energy storage of a large-nonlinear load module was thus achieved. This achievement can contribute to the application of large-nonlinear loads in tightly coupled microgrid power systems. / A Dissertation submitted to the Department of Mechanical Engineering in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Spring Semester, 2015. / November 12, 2014. / generalized predictive control, large-nonlinear load, microgrid, power quality / Includes bibliographical references. / David A. Cartes, Professor Directing Thesis; Rodney Roberts, University Representative; Emmanuel G. Collins, Committee Member; Leon Van Dommelen, Committee Member; Sanjeev Srivastava, Committee Member.
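As a purely generic illustration of coordinating charging with power quality (a hypothetical sketch, not the dissertation's control strategy), a charging command can be ramp-rate limited and capped by the generation headroom left after the load and a power-quality reserve:

```python
def charging_command(p_prev, p_gen_capacity, p_load, p_pq_reserve, p_max, ramp):
    """One step of a rate-limited, headroom-capped charging power command (W).

    p_prev: previous charging power; ramp: max change per step;
    p_pq_reserve: margin held back to protect power quality (assumed policy,
    not a parameter from the dissertation).
    """
    headroom = max(0.0, p_gen_capacity - p_load - p_pq_reserve)
    target = min(p_max, headroom)  # charge as fast as the headroom allows
    # Clamp the step change to +/- ramp so charging does not shock the bus.
    return min(p_prev + ramp, max(p_prev - ramp, target))

# Illustrative step: 2 MW generation, 1.2 MW load, 0.3 MW reserve.
print(charging_command(p_prev=0.0, p_gen_capacity=2e6, p_load=1.2e6,
                       p_pq_reserve=3e5, p_max=1e6, ramp=1e5))
```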
|
437 |
3D reconstruction of porous structure and permeability simulation / Chen, Ao, 07 May 2020
There is an urgent need to improve the efficiency of heavy oil production, which requires a better knowledge of the correlation between microstructure characteristics and fluid properties. However, due to size- and time-scale limitations, it is hard to directly observe flow behavior and the effects of fluid properties on the recovery process. Repeated drilling at different sites to obtain raw rock samples is also expensive. This situation motivates an approach to reconstruct the porous structure from a few rock samples; with the reconstructed 3D structure, the correlation can be further determined and more experiments can be performed. Data received from the collaborating company, Aramco, were used to reconstruct the 3D porous structure, and two nanofabrication technologies, two-photon lithography and large-area projection micro-stereolithography, were adopted to rebuild the porous structure. Fluid tests were simulated, and the permeability of the rock sample was calculated based on Darcy's law and the Kozeny-Carman equation. From the simulated results, the pressure at every point in the sample can be obtained, and the permeabilities of the bare sample are 0.889 and 1.169 darcies when the fluid is set to water and base oil, respectively.
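A small sketch of the permeability arithmetic named above, using Darcy's law and the Kozeny-Carman relation for packed spheres; the numeric inputs are illustrative placeholders, not values from the thesis:

```python
import numpy as np

DARCY_M2 = 9.869233e-13  # 1 darcy expressed in m^2

def darcy_permeability(Q, mu, L, A, dP):
    """Permeability from Darcy's law, k = Q*mu*L / (A*dP), returned in darcies.

    Q: flow rate (m^3/s), mu: viscosity (Pa*s), L: sample length (m),
    A: cross-sectional area (m^2), dP: pressure drop (Pa).
    """
    return Q * mu * L / (A * dP) / DARCY_M2

def kozeny_carman(phi, d_grain):
    """Kozeny-Carman estimate for packed spheres, k = phi^3 d^2 / (180 (1-phi)^2),
    returned in darcies. phi: porosity (-), d_grain: grain diameter (m)."""
    return phi**3 * d_grain**2 / (180.0 * (1.0 - phi)**2) / DARCY_M2

# Illustrative numbers only (not from the thesis):
print(darcy_permeability(Q=1e-9, mu=1e-3, L=0.01, A=1e-6, dP=1e4))  # ~1 darcy
print(kozeny_carman(phi=0.25, d_grain=50e-6))                        # ~0.4 darcy
```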
|
438 |
Machine learning for effective predictions and prescriptions in health care / Xu, Tingting, 19 May 2020
Early detection of acute hospitalizations and enhancement of treatment efficiency are important for improving patients' long-term quality of life and reducing health care costs. This thesis develops data-driven methods to predict important health-related events and to optimize treatment options. Applications include predicting chronic-disease-related hospitalizations, predicting the effect of interventions such as In Vitro Fertilization (IVF), and learning and improving upon physicians' prescription policies.
For a binary hospitalization classification problem, to strike a balance between accuracy and interpretability of the prediction, a novel Alternating Clustering and Classification (ACC) method is proposed; it employs an alternating optimization approach that jointly identifies hidden patient clusters and adapts a classifier to each cluster. Convergence and out-of-sample guarantees for this algorithm are established. The algorithm is validated on large data sets from the Boston Medical Center, the largest safety-net hospital system in New England.
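A minimal sketch of the alternating idea (an assumed form for exposition; the thesis's ACC algorithm and its guarantees are the authoritative version). It alternates between fitting one classifier per cluster and reassigning each patient to the cluster whose classifier best explains their label:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

def acc_fit(X, y, K=3, iters=10, seed=0):
    """Alternate: fit per-cluster classifiers, then reassign samples to the
    cluster whose classifier gives their true label the highest likelihood."""
    labels = KMeans(n_clusters=K, n_init=10, random_state=seed).fit_predict(X)
    for _ in range(iters):
        # One classifier per cluster (assumes both classes appear in each cluster).
        clfs = [LogisticRegression(max_iter=1000).fit(X[labels == k], y[labels == k])
                for k in range(K)]
        # Log-likelihood of each sample's true label under every cluster model.
        ll = np.stack(
            [np.log(np.clip(
                c.predict_proba(X)[np.arange(len(y)), np.searchsorted(c.classes_, y)],
                1e-12, None)) for c in clfs],
            axis=1)
        new_labels = ll.argmax(axis=1)
        if np.array_equal(new_labels, labels):
            break  # assignments stable: converged
        labels = new_labels
    return labels, clfs
```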
For the IVF outcome prediction problem, and for women who have difficulty conceiving, several predictive models that estimate IVF success rate are designed. For predicted non-pregnant subjects, an algorithm further predicts whether no embryos were implanted (due to embryo abnormalities) or pregnancy did not occur despite implantation. Results are presented to assess the sensitivity of the models to specific predictive variables.
The third problem models a patient's disease progression as a Markov Decision Process (MDP) and seeks to estimate the physicians' prescription policy and the disease state transition probabilities. Two regularized maximum likelihood estimation algorithms, for learning the transition probability model and the policy respectively, are proposed. A sample complexity result guaranteeing low regret with a relatively small number of training samples is established. The theoretical results are illustrated using a healthcare example.
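A minimal sketch of one common regularized MLE for the transition model (additive, Dirichlet-style smoothing of empirical counts; the thesis's regularizer may differ):

```python
import numpy as np

def estimate_transitions(trajs, n_states, n_actions, lam=1.0):
    """Smoothed-count (regularized) MLE of P(s'|s,a) from (s, a, s') triples.

    lam is an additive regularizer: unvisited (s, a) pairs fall back to a
    uniform distribution instead of producing zero-count divisions.
    """
    counts = np.zeros((n_states, n_actions, n_states))
    for s, a, s_next in trajs:
        counts[s, a, s_next] += 1
    return (counts + lam) / (counts.sum(axis=2, keepdims=True) + lam * n_states)

# Illustrative use with toy trajectories:
trajs = [(0, 1, 1), (0, 1, 1), (0, 1, 0), (1, 0, 1)]
P = estimate_transitions(trajs, n_states=2, n_actions=2, lam=0.5)
print(P[0, 1])  # smoothed estimate of P(. | s=0, a=1) -> [0.375, 0.625]
```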
Finally, the thesis develops a framework for learning, and improving upon, the pharmacological therapy algorithm used by physicians to treat type 2 diabetes, based on prescription data. The proposed approach first predicts the outcomes of prescriptions using regression, and then synthesizes from data a policy consistent with physicians' prescriptions using a parametric multi-class classification method. By optimizing over the algorithm parameters in the prescription model, the approach achieves better glucose control.
|
439 |
Modeling, Design and Control of Vacuum Assisted Resin Transfer Molding (VARTM) for Thickness Variation Reduction / Unknown Date
In general, composite manufacturing processes have more variation than metal manufacturing processes because of larger raw material and process variations. Vacuum-assisted resin transfer molding (VARTM), one of the most commonly used composite manufacturing processes, is becoming more popular due to its low-cost tooling and environmentally friendly operating conditions. Currently, most commercial products manufactured by VARTM are developed based on the user's experience and involve repeated experiments. To optimize the process, reduce manufacturing costs, and maintain consistent part quality, knowledge of mold filling, especially flow through the thickness direction, is required.

This dissertation investigates the mechanism of thickness variation and quantifies the magnitudes of the thickness distribution. Typically, the thickness gradient and variations of VARTMed parts result from material variations and the infusion pressure gradient during the process. After infusion, a certain amount of pressure gradient is frozen into the preform, which primarily contributes to the thickness variation. This research investigates the mechanism by which the thickness variation changes dynamically during the infusion and curing/relaxing processes. A numerical model was developed to track the thickness change of the bagging film free surface. A time-dependent permeability model, a function of compaction pressure, was incorporated into an existing resin transfer molding code to obtain the initial conditions of the curing/relaxing process. Control volume (CV) and volume of fluid (VOF) methods were combined to solve the free surface problem.

In addition, this dissertation analyzes the sources of the uncertainties and quantifies their magnitudes by error propagation theory to characterize the statistical properties of the permeability values. Normal and Weibull distributions were utilized as the statistical models for the average permeability values and race-tracking effects, respectively. Factors related to part thickness variation were identified with the design of experiments method, and a better tooling design was obtained by configuring different flow media. With the help of the simulation program, a process model-based tooling design optimization was formulated. However, parameter uncertainty made the deterministic optimization unreliable. To address the issue of part-to-part thickness variation, a stochastic process simulation coupled with optimization was proposed and demonstrated. / A Dissertation Submitted to the Department of Industrial and Manufacturing Engineering in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy. / Summer Semester, 2006. / May 26, 2006. / Variation Reduction, Simulation, Stochastic Process, Sampling, VARTM / Includes bibliographical references. / Chuck Zhang, Professor Co-Directing Dissertation; Ben Wang, Professor Co-Directing Dissertation; Max Gunzburger, Outside Committee Member; Okenwa Okoli, Committee Member; Zhiyong Liang, Committee Member.
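A minimal sketch of the stochastic sampling ingredient described in this entry, under assumed placeholder parameters (the normal/Weibull pairing follows the abstract; the numbers do not come from the dissertation):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_permeabilities(n_runs, k_mean=2e-10, k_sd=2e-11,
                          weibull_shape=2.0, weibull_scale=5e-10):
    """Monte Carlo draws for a stochastic VARTM filling simulation (m^2).

    Bulk preform permeability ~ Normal(k_mean, k_sd); race-tracking channel
    permeability ~ Weibull(shape, scale). All parameters are illustrative.
    """
    k_bulk = rng.normal(k_mean, k_sd, size=n_runs)
    k_race = weibull_scale * rng.weibull(weibull_shape, size=n_runs)
    return np.clip(k_bulk, 1e-12, None), k_race

k_bulk, k_race = sample_permeabilities(1000)
# Each (k_bulk[i], k_race[i]) pair would feed one run of the filling code;
# the spread of predicted thickness over runs estimates part-to-part variation.
print(k_bulk.mean(), k_race.mean())
```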
|
440 |
Assessment of Response Surface Modeling Techniques for Parametric Analyses of Computationally Expensive Simulations / Unknown Date
Traditional experimental design techniques have long been employed in industrial, agricultural, and other physical settings to characterize and optimize systems and processes. Such physical systems are characterized by random error, resulting in inconsistent behavior under identical operating conditions. The issues associated with such a random error (or stochastic) component have been addressed in classical design of experiments (DOE), and methods for handling such error are well documented by Myers (2002).

In many modern applications, adequate experimentation on physical systems is too costly and/or time consuming, so Fang et al. (2006) suggest that computer simulations are becoming more prominent as computing power increases. Simulations serve as surrogates of physical systems and make use of uncertain or imprecise parameters, and insight about a physical system derived from a simulation should take this uncertainty into account. Sensitivity and uncertainty analyses are tools that support validation and verification and allow an experimenter to place confidence in simulation results. One means to this end is conducting and analyzing a computer experiment, defined by Currin (1988) as a collection of runs from a simulation in which a record of response variables is logged and examined.

Many computer simulations are deterministic; that is, identical operating conditions produce no variability in system performance. This poses a problem for traditional statistical modeling methods related to DOE, as these classical methods assume that the errors are independently and identically distributed. Models derived from ordinary least squares (OLS) regression assume that variance is fixed over the design space. However, Fang et al. (2006) note that a deterministic simulation does not have constant variance, as observed points in the design space are precisely known. Because the usual uncertainty measures derived from OLS residuals are not sensible in this case, Currin (1988), Denison et al. (2002), Fang et al. (2006), and Sacks (1989) suggest that a Bayesian approach to regression modeling is more appropriate than traditional techniques, because traditional, non-Bayesian approaches fail to take uncertainty of the parameters into account. Bayesian methods involve optimization over the unknown parameters and allow the modeler to make prior assumptions on the regression variance, and may therefore be better suited for predicting unobserved responses. The following study compares traditional methods to other techniques, including Bayesian regression. / A Thesis submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Spring Semester, 2009. / March 6, 2009. / Design of Experiments, Response Surface Methodology, Bayesian Regression, Computer Experiments, Simulations / Includes bibliographical references. / Joseph J. Pignatiello, Jr., Professor Directing Thesis; Arda Vanli, Committee Member; Thomas Baldwin, Committee Member; Mischa Steurer, Committee Member.
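To illustrate the modeling point in this entry with a toy example (a generic sketch, not the thesis's study): a noise-free Gaussian process surrogate interpolates a deterministic simulation exactly at the observed design points, while an OLS fit leaves nonzero residuals there:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.linear_model import LinearRegression

# Deterministic "simulation": the same input always gives the same output.
X = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
y = np.sin(6.0 * X).ravel()  # stand-in for an expensive simulator

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-10).fit(X, y)
ols = LinearRegression().fit(X, y)

# The GP interpolates the runs (residuals ~ 0); the OLS model does not.
print("max GP residual: ", np.abs(gp.predict(X) - y).max())
print("max OLS residual:", np.abs(ols.predict(X) - y).max())
```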
|