41

Combining Data-driven and Theory-guided Models in Ensemble Data Assimilation

Popov, Andrey Anatoliyevich 23 August 2022 (has links)
There once was a dream that data-driven models would replace their theory-guided counterparts. We have awoken from this dream. We now know that data cannot replace theory. Data-driven models still have their advantages, mainly in computational efficiency, but also in providing us with some special sauce that is unreachable by our current theories. This dissertation aims to provide a way in which both the accuracy of theory-guided models and the computational efficiency of data-driven models can be combined. This combination of theory-guided and data-driven models allows us to draw on ideas from a much broader set of disciplines, and can help pave the way for robust and fast methods. / Doctor of Philosophy / As an illustrative example, take the problem of predicting the weather. Typically a supercomputer will run a model several times to generate predictions a few days into the future. Sensors such as those on satellites will then pick up observations about a few points on the globe that are not representative of the whole atmosphere. These observations are combined, or ``assimilated,'' with the computer model predictions to create a better representation of our current understanding of the state of the earth. This predict-assimilate cycle is repeated every day, and is called (sequential) data assimilation. The prediction step was traditionally performed by a computer model based on rigorous mathematics. With the advent of big data, many have wondered if models based purely on data would take over. This has not happened. This thesis is concerned with taking traditional mathematical models and running them alongside data-driven models in the prediction step, then building a theory in which both can be used in data assimilation at the same time, so that accuracy does not drop while computational cost decreases.
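To make the predict-assimilate cycle concrete, here is a minimal sketch of one analysis step of a generic stochastic ensemble Kalman filter (EnKF), the textbook workhorse of ensemble data assimilation. It illustrates the general technique only, not the dissertation's specific combination of theory-guided and data-driven models; all names, sizes, and the toy data are illustrative assumptions.

```python
# Minimal stochastic EnKF analysis step, assuming a linear observation
# operator H and Gaussian observation noise with covariance R.
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(X, y, H, R):
    """X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observation;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) covariance."""
    n_state, n_ens = X.shape
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    Pf_Ht = A @ (H @ A).T / (n_ens - 1)          # cross-covariance P_f H^T
    S = (H @ A) @ (H @ A).T / (n_ens - 1) + R    # innovation covariance
    K = Pf_Ht @ np.linalg.inv(S)                 # Kalman gain
    # Perturbed observations keep the analysis spread statistically consistent.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)

# Toy usage: 3-state system, 100 ensemble members, observing the first state.
X = rng.normal(size=(3, 100))                    # forecast ensemble
H = np.array([[1.0, 0.0, 0.0]])
R = 0.1 * np.eye(1)
Xa = enkf_analysis(X, np.array([0.5]), H, R)     # analysis ensemble
```

In a full cycle, each analysis member `Xa[:, i]` would be propagated forward by the (theory-guided or data-driven) forecast model before the next assimilation step.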
42

Data-Driven Modeling and MPC-Based Control for Pathological Tremors

Samal, Subham Swastik 19 December 2024 (has links)
Pathological tremor is a common neuromuscular disorder that significantly affects the quality of life for patients worldwide. With recent developments in robotics, rehabilitation exoskeletons serve as one of the solutions to alleviate these tremors. For improved performance of such devices, we need to solve a few problems, which include developing a model for pathological tremors and a safe control system that can conveniently incorporate constraints on the wrist's range of motion and its input force/torque. Accurate predictive modeling of tremor signals can be used to provide alleviation of these tremors via various currently available solutions such as adaptive deep brain stimulation, electrical stimulation, and rehabilitation orthoses. Existing methods are either too general or too simplistic to accurately predict these tremors in the long term, motivating us to explore better modeling of tremors for long-term predictions and analysis. We explore the prediction of tremors with artificial neural networks driven by EMG signals, leveraging the 20-100 ms electromechanical delay. The kinematics and EMG data of a publicly available Parkinson's tremor dataset are first analyzed, confirming that the underlying EMGs have a frequency composition similar to the actual tremor. Two hybrid CNN-LSTM deep learning architectures are then proposed to predict the tremor kinematics ahead of time using EMG signals and tremor kinematics history, and the results are compared with baseline models. This is further extended by adding constraint-based losses in an attempt to improve the predictions. We then explore the application of model predictive control (MPC) for the full wrist exoskeleton designed in our lab for the alleviation of tremors. The main motivation for using MPC here lies in its ability to incorporate state and input constraints, which are crucial for the user's safety. We employ a linear MPC methodology, in which the forearm-exoskeleton model is successively linearized at each time sample to obtain a linear state-space model, which is then used to obtain the optimal input by minimizing a convex quadratic cost function. This is integrated with the tremor model developed via BMFLC and neural networks to provide tremor suppression. Simulation studies are provided to demonstrate the effectiveness of the control schemes. The numerical simulations suggest that the MPC framework is capable of accurate trajectory tracking while providing better tremor suppression than a PD controller even without using any tremor model, and that the neural network model outperforms the frequency-based BMFLC model. These findings lay the groundwork for devising physics-based neural networks for pathological tremor modeling and for experimentally evaluating the performance of the developed framework. / Master of Science / Pathological tremors are involuntary, rhythmic, oscillatory movements of the limbs that affect millions of people worldwide, making daily activities like writing, eating, and object manipulation challenging. In recent years, rehabilitation exoskeletons have been developed as non-invasive solutions for pathological tremor alleviation. The wrist is pivotal to human manipulation capabilities, and thus a wrist exoskeleton (TAWE) has been developed in our lab to provide tremor alleviation.
For improved performance of such devices, we need to solve a few problems, which include developing a model for pathological tremors and a safe control system that can conveniently incorporate constraints on the wrist's range of motion and its input force/torque. We propose a deep-learning-based method for accurate modeling of tremors, along with a model predictive control framework for tremor suppression. Simulations and analyses are performed to validate the tremor-modeling framework and the control framework of an exoskeleton for tremor alleviation, and to highlight shortcomings in current methods that call for further research and advancement.
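As a rough illustration of the hybrid architecture described above, the following PyTorch sketch maps a window of multichannel EMG to future tremor kinematics. The layer sizes, channel count, and prediction horizon are assumptions chosen for brevity, not the thesis's actual architecture.

```python
# Illustrative CNN-LSTM: 1-D convolutions extract short-time features from the
# EMG window; an LSTM models their temporal evolution; a linear head predicts
# the next `horizon` kinematic samples.
import torch
import torch.nn as nn

class CnnLstmTremorPredictor(nn.Module):
    def __init__(self, n_channels=8, horizon=10):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, horizon)

    def forward(self, x):             # x: (batch, channels, time)
        z = self.cnn(x)               # (batch, 64, time)
        z = z.transpose(1, 2)         # (batch, time, 64) for the LSTM
        out, _ = self.lstm(z)
        return self.head(out[:, -1])  # predict from the final time step

# Toy usage: batch of 4 windows, 8 EMG channels, 200 samples each.
model = CnnLstmTremorPredictor()
pred = model(torch.randn(4, 8, 200))  # -> (4, 10) future kinematic samples
```

The electromechanical delay mentioned in the abstract is what makes such prediction feasible: EMG activity precedes the mechanical tremor, so features of the current EMG window carry information about near-future motion.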
43

Fusing Modeling and Testing to Enhance Environmental Testing Approaches

Devine, Timothy Andrew 09 July 2019 (has links)
A proper understanding of the dynamics of a mechanical system is crucial to ensure the highest levels of performance. This understanding is frequently developed through modeling and testing of components. Modeling provides a cost-effective method for rapidly developing knowledge of the system; however, a model is incapable of accounting for fluctuations that occur in physical spaces. Testing, when performed properly, provides a near-exact understanding of how a part or assembly functions, but can be expensive both fiscally and temporally. Often, practitioners of the two disciplines work in parallel, never intersecting with the other group. Further advancement in ways to fuse modeling and testing can produce a more comprehensive understanding of dynamic systems while remaining inexpensive in terms of computation, financial cost, and time. The goal of the presented work is therefore to develop ways to merge the two branches and include test data in models of operational systems. This is done through a series of analytical and experimental tasks examining the boundary conditions of various systems. The first avenue explored was an attempt at modeling unknown boundary conditions from an operational environment by modeling the same system in known configurations in a controlled environment, such as a laboratory test. An analytical beam was studied under applied environmental loading with grounding stiffnesses added to simulate an operational condition, and an attempt was made to match the response with a beam having free boundaries and a reduced number of excitation points. Due to the properties of the inverse-problem approach taken, the responses of the two systems matched at control locations, but at non-control locations the responses showed a large degree of variation. From the mismatch in mechanical impedance, it is apparent that improperly representing boundary conditions can have drastic effects on the accuracy of models and recreated tests. With the progression now directed towards modeling and testing of boundary conditions, methods were explored to combine the two approaches in harmony. The second portion of this work focuses on modeling an unknown boundary connection using a collection of similar, testable boundary conditions to parametrically interpolate to the unknown configuration. This was done by using data-driven models of the known systems as the interpolating functions, with system boundary stiffness as the varied parameter. This approach yielded parametric model responses nearly identical to the original system response for analytical systems and showed early signs of promise for an experimental beam. After the two studies, the potential for extending the parametric data-driven model approach to other systems is discussed, along with improvements to the approach and the benefits it brings. / Master of Science / A proper understanding of the dynamics of a mechanical system in a severe environment is crucial to ensure the highest levels of performance. This understanding is frequently developed through modeling and testing of components. Modeling provides a cost-effective method for rapidly developing knowledge of the system; however, a model is incapable of accounting for fluctuations that occur in physical spaces.
Testing, when performed properly, provides a near-exact understanding of how a part or assembly functions; however, it can be expensive both fiscally and temporally. Often, practitioners of the two disciplines work in parallel, never intersecting with the other group and favoring one approach over the other for various reasons. Further advancement in ways to fuse modeling and testing can produce a more comprehensive understanding of dynamic systems subject to environmental excitation while remaining inexpensive in terms of computation, financial cost, and time. The presented work therefore aims to develop ways to merge the two branches and include test data in models of operational systems. This is done through a series of analytical and experimental tasks examining the boundary conditions of various systems, at first attempting to replicate the system response using inverse approaches. This is then followed by modeling boundary stiffnesses using data-driven and parametric modeling approaches. The validity and potential impact of these methods are also discussed.
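As a loose illustration of the parametric interpolation idea, data-driven models identified at known boundary stiffnesses can serve as interpolating functions across the stiffness parameter. The sketch below uses a single-degree-of-freedom response as a stand-in for the identified models; the response function, mass, damping, and stiffness values are all assumptions for illustration, not the thesis's models.

```python
# Fit/record a response curve at several *known* boundary stiffnesses, then
# interpolate across the stiffness axis to an untested configuration.
import numpy as np
from scipy.interpolate import interp1d

freqs = np.linspace(1.0, 100.0, 400)          # excitation frequencies (Hz)
known_k = np.array([1e4, 5e4, 1e5, 5e5])      # tested boundary stiffnesses (N/m)

def toy_frf(k, f):
    """Stand-in single-DOF frequency response magnitude (assumed 10 kg mass)."""
    wn = np.sqrt(k / 10.0)
    return 1.0 / np.abs(wn**2 - (2 * np.pi * f)**2 + 1j * 0.05 * wn * f)

# One "data-driven model" (here just its response curve) per known config.
responses = np.stack([toy_frf(k, freqs) for k in known_k])   # shape (4, 400)

# Parametric interpolation to an unseen boundary stiffness.
interpolator = interp1d(known_k, responses, axis=0, kind="linear")
predicted = interpolator(2e5)                 # predicted FRF at untested k
```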
44

Cross-Validation of Data-Driven Correction Reduced Order Modeling

Mou, Changhong 03 October 2018 (has links)
In this thesis, we develop a data-driven correction reduced order model (DDC-ROM) for the numerical simulation of fluid flows. The general DDC-ROM involves two stages: (1) we apply ROM filtering (such as ROM projection) to the full-order model (FOM) and construct the filtered ROM (F-ROM); (2) we use data-driven modeling to model the nonlinear interactions between resolved and unresolved modes, which solves the F-ROM's closure problem. In the DDC-ROM, a linear or quadratic ansatz is used in the data-driven modeling step. In this thesis, we propose a new cubic ansatz. To determine the unknown coefficients in our ansatz, we solve an optimization problem that minimizes the difference between the FOM data and the ansatz. We test the new DDC-ROM in the numerical simulation of the one-dimensional Burgers equation with a small diffusion coefficient. Furthermore, we perform a cross-validation of the DDC-ROM to investigate whether it can be successful in computational settings that differ from the training regime. / M.S. / Practical engineering and scientific problems often require the repeated simulation of unsteady fluid flows. In these applications, the computational cost of high-fidelity full-order models can be prohibitively high. Reduced order models (ROMs) represent efficient alternatives to brute-force computational approaches. In this thesis, we propose a data-driven correction ROM (DDC-ROM) in which available data and an optimization problem are used to model the nonlinear interactions between resolved and unresolved modes. In order to test the new DDC-ROM's predictive capability, we perform its cross-validation for the one-dimensional viscous Burgers equation and different training regimes.
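To illustrate the flavor of the data-driven correction step, the sketch below fits an ansatz for the closure term by least squares against snapshot data. A quadratic form (rather than the thesis's cubic ansatz), the array shapes, and the toy data are assumptions made for brevity.

```python
# Fit tau ≈ A a + (a ⊗ a) : B by least squares, given snapshots of the ROM
# coefficients a(t) and the exact closure term tau(t) computed from FOM data.
import numpy as np

def fit_quadratic_ansatz(a_snapshots, tau_snapshots):
    """a_snapshots: (n_time, r) ROM coefficients; tau_snapshots: (n_time, r)."""
    n_time, r = a_snapshots.shape
    # Regression basis: linear terms plus all pairwise quadratic products.
    quad = np.einsum("ti,tj->tij", a_snapshots, a_snapshots).reshape(n_time, r * r)
    basis = np.hstack([a_snapshots, quad])           # (n_time, r + r^2)
    coeffs, *_ = np.linalg.lstsq(basis, tau_snapshots, rcond=None)
    A = coeffs[:r].T                                 # (r, r) linear operator
    B = coeffs[r:].T.reshape(r, r, r)                # (r, r, r) quadratic tensor
    return A, B

# Toy usage: r = 3 retained modes, 500 snapshots of synthetic data.
rng = np.random.default_rng(1)
a = rng.normal(size=(500, 3))
tau = a @ rng.normal(size=(3, 3)) + 0.01 * rng.normal(size=(500, 3))
A, B = fit_quadratic_ansatz(a, tau)
```

A cubic ansatz extends the same pattern with triple products of the coefficients, at the cost of a larger (and more ill-conditioned) regression basis.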
45

Data-driven Methods in Mechanical Model Calibration and Prediction for Mesostructured Materials

Kim, Jee Yun 01 October 2018 (has links)
Mesoscale design involving control of the material distribution pattern can create a statistically heterogeneous material system, which has shown increased adaptability to complex mechanical environments involving highly non-uniform stress fields. Advances in multi-material additive manufacturing can aid in this mesoscale design by providing voxel-level control of material properties. This vast freedom in the design space also unlocks possibilities for optimizing the material distribution pattern. The optimization problem can be divided into a forward problem focusing on accurate prediction and an inverse problem focusing on efficient search for the optimal design. In the forward problem, the physical behavior of the material can be modeled based on fundamental mechanics laws and simulated through finite element analysis (FEA). A major limitation in modeling is the unknown parameters in the constitutive equations that describe the constituent materials; determining these parameters via conventional single-material testing has proven insufficient, which necessitates novel and effective approaches to calibration. A calibration framework based on Bayesian inference, which integrates data from simulations and physical experiments, has been applied to a study involving a mesostructured material fabricated by fused deposition modeling. Calibration results provide insight into the values these parameters converge to, as well as the material parameters on which the model output depends most strongly, while accounting for sources of uncertainty introduced during the modeling process. Additionally, this statistical formulation can provide quick predictions of the physical system by implementing a surrogate and a discrepancy model. The surrogate model is a statistical representation of the simulation results, circumventing issues arising from computational load, while the discrepancy model accounts for the difference between the simulation output and physical experiments. In this thesis, this Bayesian calibration framework is applied to a material bending problem, where in-situ mechanical characterization data and FEA simulations based on constitutive modeling are combined to produce updated values of the unknown material parameters with uncertainty. / Master of Science / A material system obtained by applying a pattern of multiple materials has proven its adaptability to complex practical conditions. The layer-by-layer process of additive manufacturing allows for this type of design because of its control over where material is deposited. This possibility raises the question of how a multi-material system can be optimized in its design for a given application. In this research, we focus mainly on the problem of accurately predicting the response of the material when subjected to stimuli. Conventionally, simulations aided by finite element analysis (FEA) have been relied upon for prediction; however, this approach presents issues such as long run times and uncertainty in context-specific inputs of the simulation. We have instead adopted a framework using advanced statistical methodology that combines both experimental and simulation data to significantly reduce run times and quantify the various uncertainties associated with running simulations.
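To give a sense of how Bayesian calibration works in practice, here is a minimal random-walk Metropolis sketch: a cheap surrogate stands in for the FEA model, and the posterior over an unknown material parameter is sampled against experimental data. The prior bounds, noise level, surrogate, and data are all assumptions for illustration; the thesis's framework (including its discrepancy model) is considerably richer.

```python
# Calibrate one unknown parameter theta against measurements, assuming a
# Gaussian likelihood and a uniform prior on (0, 10).
import numpy as np

rng = np.random.default_rng(2)
data = np.array([1.02, 0.98, 1.05, 0.97])      # e.g. measured bending responses

def surrogate(theta):
    return theta ** 2                           # stand-in for the FEA emulator

def log_posterior(theta, sigma=0.05):
    if not (0.0 < theta < 10.0):                # uniform prior support
        return -np.inf
    resid = data - surrogate(theta)
    return -0.5 * np.sum(resid**2) / sigma**2   # Gaussian log-likelihood

theta, samples = 1.5, []
lp = log_posterior(theta)
for _ in range(5000):
    prop = theta + 0.1 * rng.normal()           # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

posterior = np.array(samples[1000:])            # discard burn-in
print(posterior.mean(), posterior.std())        # calibrated value + uncertainty
```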
46

Transitions in Care: A Data-Driven Exploration of Patient Pathways in the Canadian Healthcare System

Taremi, Mohammadreza January 2024 (has links)
In the complex landscape of healthcare, patients navigate through various institutions from hospitals to long-term care facilities, and each step of their journey plays a crucial role in their disease progression and treatment plan. Traditional analyses often focus on individual transitions, offering limited insight into the broader picture of patient care and disease progression. This thesis aims to explore the entire sequence of patient transitions within the Canadian healthcare system to uncover meaningful patterns and commonalities. This research employs an innovative approach to leveraging the Canadian Institute for Health Information (CIHI) dataset, consisting of around 250,000 patient records after data cleaning and including approximately 10-11 variables. Extracting a diverse category of features, such as temporal, semantic, and clinical information, constructs a detailed profile for each patient journey. These profiles then undergo an parallel mini-batch average agglomerative hierarchical clustering process, grouping together patients with similar healthcare trajectories to identify prevailing pathways and transitions within the system. By understanding these patterns, healthcare providers and policymakers can gain insights into the patient experience, potentially revealing areas for improvement, optimization, and personalization of care. Key findings include uncovering transitions in the healthcare environment, identifying the most common pathways, and studying the alternate level of care length of stay for each scenario. Looking ahead, the research anticipates incorporating additional layers of data, such as specific interventions and medications, to enrich the analysis. This expansion aims to offer a more comprehensive view of patient journeys, further enhancing the ability to tailor healthcare services to meet individual needs effectively. / Thesis / Master of Computer Science (MCS)
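As a simple illustration of the clustering stage described above, the sketch below applies average-linkage agglomerative clustering to per-patient feature vectors with scikit-learn. The feature names, cluster count, and synthetic data are placeholders; the thesis's parallel mini-batch variant is not reproduced here.

```python
# Cluster patient-journey profiles and summarize cluster sizes.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(3)
# One row per patient journey, e.g. [n_transitions, total_los_days, n_facilities]
profiles = rng.normal(size=(1000, 3))

X = StandardScaler().fit_transform(profiles)    # put features on a common scale
labels = AgglomerativeClustering(n_clusters=8, linkage="average").fit_predict(X)

# Pathway analysis then proceeds per cluster, e.g. counting members:
for c in range(8):
    print(c, int(np.sum(labels == c)), "patients")
```

Average linkage merges the pair of clusters with the smallest mean pairwise distance, which tends to produce trajectory groups of comparable cohesion, a reasonable default when pathway lengths vary widely.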
47

Wavelet-based Dynamic Mode Decomposition in the Context of Extended Dynamic Mode Decomposition and Koopman Theory

Tilki, Cankat 17 June 2024 (has links)
Koopman theory is widely used for data-driven modeling of nonlinear dynamical systems. One of the well-known algorithms stemming from this approach is Extended Dynamic Mode Decomposition (EDMD), a data-driven algorithm for uncontrolled systems. In this thesis, we start by discussing the EDMD algorithm and how it encompasses Dynamic Mode Decomposition (DMD), a widely used data-driven algorithm. We then extend the discussion to input-output systems and identify ways to extend Koopman operator theory to them, showing how various algorithms can be identified as instances of this framework. Special care is given to Wavelet-based Dynamic Mode Decomposition (WDMD), a variant of DMD that uses only input and output data, which it accomplishes by generating auxiliary states acquired from the wavelet transform. We show how the action of the Koopman operator can be simplified by using the wavelet transform and how the WDMD algorithm can be motivated by this representation. We also introduce a slight modification to WDMD that makes it more robust to noise. / Master of Science / To analyze a real-world phenomenon, we first build a mathematical model to capture its behavior. Traditionally, to build a mathematical model, we isolate the phenomenon's principles and encode them in a function. However, when the phenomenon is not well understood, isolating these principles is not possible. Hence, rather than understanding its principles, we sample data from the phenomenon and build our mathematical model directly from this data using approximation techniques. In this thesis, we start by focusing on cases where we can fully observe the phenomenon and no external stimuli are present. We discuss how some algorithms originating from these approximation techniques can be identified as instances of the Extended Dynamic Mode Decomposition (EDMD) algorithm. For that, we review an alternative approach to mathematical modeling, called the Koopman approach, and explain how EDMD stems from it. We then focus on the case where external stimuli are present and we can only partially observe the phenomenon. We discuss generalizations of the Koopman approach for this case, and how various algorithms that model such systems can be identified as instances of the EDMD algorithm adapted to it. Special attention is given to the Wavelet-based Dynamic Mode Decomposition (WDMD) algorithm. WDMD builds a mathematical model from data by borrowing ideas from wavelet theory, which is used in signal processing. In this way, WDMD does not require sampling of the fully observed system, giving it the flexibility to be used in cases where we can only partially observe the phenomenon. While showing that WDMD is an instance of EDMD, we also show how wavelet theory can simplify the Koopman approach and thus pave the way for easier analysis.
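For readers unfamiliar with DMD, the special case of EDMD with identity observables, here is a minimal exact-DMD sketch: given snapshot matrices X (states at times 0..m-1) and Y (times 1..m), it computes a low-rank approximation of the linear operator mapping one to the other. The toy system and rank choice are illustrative; this is the standard algorithm, not the thesis's WDMD variant.

```python
import numpy as np

def dmd(X, Y, r=None):
    """Return DMD eigenvalues and modes from snapshot pairs (X, Y)."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if r is not None:                       # optional rank truncation
        U, s, Vh = U[:, :r], s[:r], Vh[:r]
    # Projected operator: A_tilde = U^* Y V S^{-1}
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W   # exact DMD modes
    return eigvals, modes

# Toy usage: a slightly damped rotation sampled over 100 steps.
rng = np.random.default_rng(4)
A = 0.99 * np.array([[np.cos(0.1), -np.sin(0.1)],
                     [np.sin(0.1),  np.cos(0.1)]])
states = [rng.normal(size=2)]
for _ in range(99):
    states.append(A @ states[-1])
Z = np.array(states).T                      # (2, 100) snapshot matrix
eigvals, modes = dmd(Z[:, :-1], Z[:, 1:])
print(np.abs(eigvals))                      # ≈ 0.99, the damping factor
```

EDMD replaces the raw states with a dictionary of nonlinear observables before the same regression, which is what lets a linear operator capture nonlinear dynamics.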
48

Perceptions of Teachers about Using and Analyzing Data to Inform Instruction

Harris, Lateasha Monique 01 January 2018 (has links)
Monitoring academic progress to guide instructional practices is an important role of teachers in a small rural school district in the Southern United States. Teachers in this region were experiencing difficulties using the approved school district model to implement data-driven instruction. The purpose of this qualitative case study was to identify elementary- and middle-level teachers' perceptions about using the Plan, Do, Study, Act (PDSA) model to analyze data in the classroom and use it to inform classroom instruction. Bambrick-Santoyo's principles for effective data-driven instruction formed the conceptual framework that guided this study. The research questions focused on teachers' perceptions of and experiences with the PDSA. Purposeful sampling was used to recruit 8 teachers from Grades 3-9, and their insights were captured through semistructured interviews, reflective journals, and document analyses of data walls. Emergent themes were identified through an open coding process, and trustworthiness was secured through triangulation and member checking. The themes concerned using data to assess students, create lessons, and collaborate with colleagues. The three findings revealed that elementary- and middle-level teachers acknowledge the PDSA as an effective tool for guiding student learning, that teachers rely on assessment data, and that teachers need ongoing collaborative engagement with their colleagues when using the PDSA. This study has implications for positive social change by providing a structure for improving classroom instructional practices and engaging teachers in more collaborative practices.
49

A novel approach for the improvement of error traceability and data-driven quality predictions in spindle units

Rangaraju, Adithya January 2021 (has links)
Research on the impact of component degradation on the surface quality produced by machine tool spindles is limited, and this gap is the primary motivation for this research. It is common in the manufacturing industry to replace components even if they still have some Remaining Useful Life (RUL), resulting in an ineffective maintenance strategy. The primary objective of this thesis is to design and construct an Exchangeable Spindle Unit (ESU) test stand that aims to capture the influence of the failure transition of components during machining and its effects on surface quality. Current machine tools cannot be tested with extreme component degradation, especially of the spindle, since the degrading elements can lead to permanent damage, and machine tools are expensive to repair. The ESU substitutes and decouples the machine tool spindle to investigate the influence of deteriorated components on the response, so that the machine tool spindle does not absorb the degrading effects. Data-driven quality control is another essential capability that many industries try to implement in their production lines. In a traditional manufacturing scenario, quality inspections are performed to check whether the measured parameters are within nominal standards at the end of a production line or between processes. A significant flaw in the traditional approach is its inability to map the degradation of components to quality. Condition monitoring techniques can resolve this problem and help identify defects early in production. This research focuses on two objectives. The first aims to capture component degradation by artificially inducing imbalance in the ESU shaft and capturing the excitation behavior during machining with an end mill tool. Imbalance effects are quantified by adding mass to the ESU spindle shaft; the varying effects of the mass are captured and characterized using vibration signals. The second objective is to establish a correlation between the surface quality of the machined part and the characterized vibration signals using Bagged Ensemble Tree (BET) machine learning models. The results show a good correlation between surface roughness and the accelerometer signals. A comparative study of a balanced and an imbalanced spindle, along with the resulting surface quality, is presented.
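As a sketch of the second objective, a bagged ensemble of decision trees can map vibration features extracted from accelerometer signals to measured surface roughness. The feature choices and synthetic dataset below are placeholders, not the thesis's experimental data.

```python
# Bagged Ensemble Tree (BET) regression from vibration features to roughness,
# evaluated by cross-validated R^2. Requires scikit-learn >= 1.2 for the
# `estimator` keyword (older versions use `base_estimator`).
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
# One row per machining pass, e.g. [RMS, peak, kurtosis, spectral centroid]
features = rng.normal(size=(200, 4))
roughness = features @ np.array([0.5, 0.2, 0.1, 0.3]) \
            + 0.05 * rng.normal(size=200)   # synthetic Ra values

bet = BaggingRegressor(estimator=DecisionTreeRegressor(max_depth=6),
                       n_estimators=100, random_state=0)
scores = cross_val_score(bet, features, roughness, cv=5, scoring="r2")
print(scores.mean())                        # cross-validated R^2
```

Bagging averages many trees trained on bootstrap resamples, which suits noisy vibration features: individual trees overfit, but the ensemble's variance is much lower.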
50

The Major Challenges in DDDM Implementation: A Single-Case Study: What Are the Main Challenges for Business-to-Business MNCs to Implement a Data-Driven Decision-Making Strategy?

Varvne, Matilda, Cederholm, Simon, Medbo, Anton January 2020 (has links)
Over the past years, the value of data and of data-driven decision-making (DDDM) has increased significantly as technological advancements have made it possible to store and analyze large amounts of data at a reasonable cost. This has resulted in completely new business models that have disrupted whole industries. DDDM allows businesses to base their decisions on data, as opposed to gut feeling. To date, the literature provides a general view of the major challenges corporations encounter when implementing a DDDM strategy. However, as the field is still rather new, the challenges identified remain very general, and many corporations, especially B2B MNCs selling consumer goods, seem to struggle with this implementation. Hence, a single-case study of such a corporation, referred to as Alpha, was carried out with the purpose of exploring its major challenges in this process. Semi-structured interviews revealed four major findings: two of them, execution and organizational culture, were supported in existing literature, while two additional findings, associated with organizational structure and consumer behavior data, were discovered in the case of Alpha. Based on this, the conclusion drawn was that B2B MNCs selling consumer goods face the challenges of identifying local markets as frontrunners for strategies such as becoming more data-driven, as well as of finding a way to retrieve consumer behavior data. These two main challenges can provide a starting point for managers implementing DDDM strategies in B2B MNCs selling consumer goods in the future.
