41

Fusing Modeling and Testing to Enhance Environmental Testing Approaches

Devine, Timothy Andrew 09 July 2019 (has links)
A proper understanding of the dynamics of a mechanical system is crucial to ensure the highest levels of performance. That understanding is frequently developed through modeling and testing of components. Modeling provides a cost-effective method for rapidly developing knowledge of the system; however, a model cannot account for fluctuations that occur in physical spaces. Testing, when performed properly, provides a near-exact understanding of how a part or assembly functions; however, it can be expensive both fiscally and temporally. Often, practitioners of the two disciplines work in parallel, never intersecting with the other group. Further advancement in ways to fuse modeling and testing can produce a more comprehensive understanding of dynamic systems while remaining inexpensive in terms of computation, financial cost, and time. The goal of the presented work is therefore to develop ways to merge the two branches so that test data can be included in models of operational systems. This is done through a series of analytical and experimental tasks examining the boundary conditions of various systems. The first avenue explored was an attempt to model unknown boundary conditions from an operational environment by modeling the same system in known configurations in a controlled environment, such as a laboratory test. An analytical beam was studied under applied environmental loading with grounding stiffnesses added to simulate an operational condition, and an attempt was made to match the response with a free-boundary beam excited at a reduced number of points. Due to the properties of the inverse-problem approach taken, the responses of the two systems matched at control locations; at non-control locations, however, the responses showed a large degree of variation. From the mismatch in mechanical impedance, it is apparent that improperly representing boundary conditions can have drastic effects on the accuracy of models and recreated tests. With the work now directed toward modeling and testing of boundary conditions, methods were explored to combine the two approaches in harmony. The second portion of this work focuses on modeling an unknown boundary connection by using a collection of similar, testable boundary conditions to parametrically interpolate to the unknown configuration. This was done by using data-driven models of the known systems as the interpolating functions, with system boundary stiffness as the varied parameter. This approach yielded parametric model responses nearly identical to the original system response for analytical systems and showed early signs of promise for an experimental beam. After the two studies, the potential for extending the parametric data-driven model approach to other systems is discussed, along with improvements to the approach and the benefits it brings. / Master of Science / A proper understanding of the dynamics of a mechanical system in a severe environment is crucial to ensure the highest levels of performance. That understanding is frequently developed through modeling and testing of components. Modeling provides a cost-effective method for rapidly developing knowledge of the system; however, a model cannot account for fluctuations that occur in physical spaces. Testing, when performed properly, provides a near-exact understanding of how a part or assembly functions; however, it can be expensive both fiscally and temporally. Often, practitioners of the two disciplines work in parallel, never intersecting with the other group and favoring one approach over the other for various reasons. Further advancement in ways to fuse modeling and testing can produce a more comprehensive understanding of dynamic systems subject to environmental excitation while remaining inexpensive in terms of computation, financial cost, and time. The presented work therefore aims to develop ways to merge the two branches so that test data can be included in models of operational systems. This is done through a series of analytical and experimental tasks examining the boundary conditions of various systems, first attempting to replicate the system response using inverse approaches and then modeling boundary stiffnesses using data-driven and parametric modeling approaches. The validity and potential impact of these methods are also discussed.
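The abstract above does not give implementation details, so the following is only a minimal sketch of the parametric-interpolation idea under stated assumptions: data-driven models (here, simple polynomial fits of a synthetic frequency response) are identified at a few known boundary stiffnesses, and their coefficients are interpolated to an untested stiffness. All data, model forms, and parameter values are placeholders, not results from the thesis.

```python
import numpy as np
from numpy.polynomial import Polynomial

# Illustrative parametric interpolation of data-driven models over a boundary
# stiffness parameter. The "measured" responses below are synthetic stand-ins.

freqs = np.linspace(10.0, 500.0, 200)          # excitation frequencies [Hz]
known_stiffness = np.array([1e4, 5e4, 1e5])    # boundary stiffnesses tested in the lab [N/m]

def measured_response(k, f):
    """Stand-in for test data: a single-mode FRF whose natural frequency shifts with k."""
    fn = 80.0 + 2.5e-4 * k                     # assumed stiffness-to-frequency trend
    zeta = 0.02
    r = f / fn
    return 1.0 / np.sqrt((1.0 - r**2) ** 2 + (2.0 * zeta * r) ** 2)

# Step 1: fit a simple data-driven model (a polynomial in frequency) at each known stiffness.
models = [Polynomial.fit(freqs, measured_response(k, freqs), deg=12) for k in known_stiffness]
coef_matrix = np.array([m.coef for m in models])   # all fits share the same scaled domain

# Step 2: interpolate each model coefficient across the stiffness parameter.
k_target = 7.5e4                                   # "untested" operational stiffness
interp_coef = np.array([np.interp(k_target, known_stiffness, coef_matrix[:, j])
                        for j in range(coef_matrix.shape[1])])
parametric_model = Polynomial(interp_coef, domain=models[0].domain, window=models[0].window)

# Step 3: evaluate the interpolated model at the untested stiffness and compare.
predicted = parametric_model(freqs)
reference = measured_response(k_target, freqs)
print("max relative error:", float(np.max(np.abs(predicted - reference) / reference)))
```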
42

Cross-Validation of Data-Driven Correction Reduced Order Modeling

Mou, Changhong 03 October 2018 (has links)
In this thesis, we develop a data-driven correction reduced order model (DDC-ROM) for numerical simulation of fluid flows. The general DDC-ROM involves two stages: (1) we apply ROM filtering (such as ROM projection) to the full order model (FOM) and construct the filtered ROM (F-ROM). (2) We use data-driven modeling to model the nonlinear interactions between resolved and unresolved modes, which solves the F-ROM's closure problem. In the DDC-ROM, a linear or quadratic ansatz is used in the data-driven modeling step. In this thesis, we propose a new cubic ansatz. To get the unknown coefficients in our ansatz, we solve an optimization problem that minimizes the difference between the FOM data and the ansatz. We test the new DDC-ROM in the numerical simulation of the one-dimensional Burgers equation with a small diffusion coefficient. Furthermore, we perform a cross-validation of the DDC-ROM to investigate whether it can be successful in computational settings that are different from the training regime. / M.S. / Practical engineering and scientific problems often require the repeated simulation of unsteady fluid flows. In these applications, the computational cost of high-fidelity full-order models can be prohibitively high. Reduced order models (ROMs) represent efficient alternatives to brute force computational approaches. In this thesis, we propose a data-driven correction ROM (DDC-ROM) in which available data and an optimization problem are used to model the nonlinear interactions between resolved and unresolved modes. In order to test the new DDC-ROM's predictability, we perform its cross-validation for the one-dimensional viscous Burgers equation and different training regimes.
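As a hedged illustration of the data-driven modeling step described above (fitting the ansatz coefficients by minimizing the mismatch with FOM data), the sketch below solves a least-squares problem for a cubic ansatz in the ROM coefficients. The snapshot data are random placeholders; only the structure of the fit is meant to be indicative.

```python
import numpy as np

# Minimal sketch of the data-driven correction step: fit the coefficients of a
# quadratic or cubic ansatz for the ROM closure term by least squares.
# The "FOM-derived" quantities below are random placeholders, not real flow data.

rng = np.random.default_rng(1)
r, n_snap = 4, 300                       # number of ROM modes, number of snapshots
a = rng.standard_normal((n_snap, r))     # ROM coefficients a_k(t_j) projected from FOM data
tau = rng.standard_normal((n_snap, r))   # exact closure term computed from the FOM snapshots

def features(a, degree=2):
    """Monomial features of the ROM coefficients up to the given degree."""
    cols = [a]                                           # linear terms
    if degree >= 2:
        cols.append(np.stack([a[:, i] * a[:, j]
                              for i in range(r) for j in range(i, r)], axis=1))
    if degree >= 3:
        cols.append(np.stack([a[:, i] * a[:, j] * a[:, k]
                              for i in range(r) for j in range(i, r) for k in range(j, r)],
                             axis=1))
    return np.hstack(cols)

Phi = features(a, degree=3)              # cubic ansatz
# Solve min_W || Phi W - tau ||_F^2 (one least-squares problem covering all ROM equations).
W, *_ = np.linalg.lstsq(Phi, tau, rcond=None)
tau_model = Phi @ W
print("relative training residual:",
      np.linalg.norm(tau_model - tau) / np.linalg.norm(tau))
```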
43

Data-driven Methods in Mechanical Model Calibration and Prediction for Mesostructured Materials

Kim, Jee Yun 01 October 2018 (has links)
Mesoscale design involving control of the material distribution pattern can create a statistically heterogeneous material system, which has shown increased adaptability to complex mechanical environments involving highly non-uniform stress fields. Advances in multi-material additive manufacturing can aid in this mesoscale design by providing voxel-level control of material properties. This vast freedom in design space also unlocks possibilities for optimization of the material distribution pattern. The optimization problem can be divided into a forward problem focusing on accurate prediction and an inverse problem focusing on efficient search of the optimal design. In the forward problem, the physical behavior of the material can be modeled based on fundamental mechanics laws and simulated through finite element analysis (FEA). A major limitation in modeling is the unknown parameters in the constitutive equations that describe the constituent materials; determining these parameters via conventional single-material testing has proven insufficient, which necessitates novel and effective approaches to calibration. A calibration framework based on Bayesian inference, which integrates data from simulations and physical experiments, has been applied to a study involving a mesostructured material fabricated by fused deposition modeling. Calibration results provide insights into the values these parameters converge to and the material parameters on which the model output depends most strongly, while accounting for sources of uncertainty introduced during the modeling process. Additionally, this statistical formulation can provide quick predictions of the physical system by implementing a surrogate and a discrepancy model. The surrogate model is a statistical representation of the simulation results, circumventing issues arising from computational load, while the discrepancy model accounts for the difference between the simulation output and physical experiments. In this thesis, this Bayesian calibration framework is applied to a material bending problem, where in-situ mechanical characterization data and FEA simulations based on constitutive modeling are combined to produce updated values of the unknown material parameters with uncertainty. / Master of Science / A material system obtained by applying a pattern of multiple materials has proven its adaptability to complex practical conditions. The layer-by-layer process of additive manufacturing allows for this type of design because of its control over where material is deposited. This possibility raises the question of how a multi-material system can be optimized in its design for a given application. In this research, we focus mainly on the problem of accurately predicting the response of the material when subjected to stimuli. Conventionally, simulations aided by finite element analysis (FEA) are relied upon for prediction; however, this approach presents issues such as long run times and uncertainty in context-specific inputs of the simulation. We instead adopt a framework using advanced statistical methodology that combines experimental and simulation data to significantly reduce run times and quantify the various uncertainties associated with running simulations.
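To make the calibration idea concrete, here is a heavily simplified sketch in the spirit of the framework described above: experimental observations are modeled as a cheap surrogate of the simulation plus a discrepancy term plus noise, and a random-walk Metropolis sampler draws from the posterior of the unknown parameter. The surrogate, discrepancy form, priors, and data are invented for illustration and are not the thesis's actual models.

```python
import numpy as np

# Simplified Bayesian calibration sketch: experimental data are modeled as
# surrogate(theta, x) + discrepancy(x) + noise, and the posterior over the
# unknown material parameter theta is sampled with random-walk Metropolis.

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 20)                       # load levels in a bending test (placeholder)
theta_true, noise_sd = 2.0, 0.05
y_exp = theta_true * np.sin(np.pi * x) + 0.1 * x + rng.normal(0, noise_sd, x.size)

def surrogate(theta, x):
    """Stand-in for a cheap statistical emulator of the FEA model."""
    return theta * np.sin(np.pi * x)

def discrepancy(x, delta):
    """Simple linear-in-x discrepancy with coefficient delta."""
    return delta * x

def log_post(params):
    theta, delta, log_sd = params
    sd = np.exp(log_sd)
    resid = y_exp - surrogate(theta, x) - discrepancy(x, delta)
    log_lik = -0.5 * np.sum((resid / sd) ** 2) - x.size * np.log(sd)
    log_prior = -0.5 * (theta / 10) ** 2 - 0.5 * delta ** 2 - 0.5 * (log_sd / 2) ** 2
    return log_lik + log_prior

# Random-walk Metropolis over (theta, discrepancy coefficient, log noise sd).
samples, current = [], np.array([1.0, 0.0, np.log(0.1)])
current_lp = log_post(current)
for _ in range(20000):
    prop = current + rng.normal(0, [0.05, 0.05, 0.05])
    prop_lp = log_post(prop)
    if np.log(rng.random()) < prop_lp - current_lp:
        current, current_lp = prop, prop_lp
    samples.append(current.copy())

post = np.array(samples[5000:])
print("posterior mean of theta:", post[:, 0].mean(), "+/-", post[:, 0].std())
```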
44

Combining Data-driven and Theory-guided Models in Ensemble Data Assimilation

Popov, Andrey Anatoliyevich 23 August 2022 (has links)
There once was a dream that data-driven models would replace their theory-guided counterparts. We have awoken from this dream. We now know that data cannot replace theory. Data-driven models still have their advantages, mainly in computational efficiency but also in providing us with some special sauce that is unreachable by our current theories. This dissertation aims to provide a way in which both the accuracy of theory-guided models and the computational efficiency of data-driven models can be combined. This combination of theory-guided and data-driven models allows us to draw on ideas from a much broader set of disciplines and can help pave the way for robust and fast methods. / Doctor of Philosophy / As an illustrative example, take the problem of predicting the weather. Typically a supercomputer will run a model several times to generate predictions a few days into the future. Sensors such as those on satellites will then pick up observations at a few points on the globe that are not representative of the whole atmosphere. These observations are combined, "assimilated," with the computer model predictions to create a better representation of our current understanding of the state of the earth. This predict-assimilate cycle is repeated every day and is called (sequential) data assimilation. The prediction step was traditionally performed by a computer model based on rigorous mathematics. With the advent of big data, many have wondered whether models based purely on data would take over. This has not happened. This thesis is concerned with taking traditional mathematical models and running them alongside data-driven models in the prediction step, then building a theory in which both can be used in data assimilation at the same time, so that accuracy does not drop while computational cost decreases.
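The general-audience summary describes the predict-assimilate cycle rather than the dissertation's specific multifidelity algorithms, so the sketch below shows only a generic stochastic ensemble Kalman filter cycle on a toy linear model; the model, observation operator, and noise levels are assumptions for illustration.

```python
import numpy as np

# Toy sequential data assimilation cycle with a stochastic ensemble Kalman filter.
# The forecast model, observation operator, and noise levels are placeholders.

rng = np.random.default_rng(3)
n, m, n_ens = 3, 2, 50                      # state dim, obs dim, ensemble size
M = np.array([[0.9, 0.1, 0.0],              # stand-in "theory-guided" forecast operator
              [0.0, 0.95, 0.05],
              [0.05, 0.0, 0.9]])
H = np.array([[1.0, 0.0, 0.0],              # only two of three state variables are observed
              [0.0, 1.0, 0.0]])
R = 0.1 * np.eye(m)                         # observation-error covariance

truth = np.ones(n)
ens = truth[:, None] + rng.standard_normal((n, n_ens))

for step in range(10):
    # Forecast: propagate truth and every ensemble member (model error as additive noise).
    truth = M @ truth + 0.05 * rng.standard_normal(n)
    ens = M @ ens + 0.05 * rng.standard_normal((n, n_ens))

    # Synthetic observations of the truth.
    y = H @ truth + rng.multivariate_normal(np.zeros(m), R)

    # Analysis: Kalman gain from ensemble statistics, perturbed-observation update.
    A = ens - ens.mean(axis=1, keepdims=True)
    Pf = A @ A.T / (n_ens - 1)
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
    y_pert = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=n_ens).T
    ens = ens + K @ (y_pert - H @ ens)

    print(f"step {step}: analysis error {np.linalg.norm(ens.mean(axis=1) - truth):.3f}")
```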
45

Perceptions of Teachers about Using and Analyzing Data to Inform Instruction

Harris, Lateasha Monique 01 January 2018 (has links)
Monitoring academic progress to guide instructional practices is an important role of teachers in a small rural school district in the Southern United States. Teachers in this region were experiencing difficulties using the approved school district model to implement data-driven instruction. The purpose of this qualitative case study was to identify elementary- and middle-level teachers' perceptions about using the Plan, Do, Study, Act (PDSA) model to analyze data in the classroom and use it to inform classroom instruction. Bambrick-Santoyo's principles for effective data-driven instruction formed the conceptual framework that guided this study. The research questions focused on teachers' perceptions of and experiences with the PDSA. Purposeful sampling was used to recruit 8 teachers from Grades 3-9, and their insights were captured through semistructured interviews, reflective journals, and document analyses of data walls. Emergent themes were identified through an open coding process, and trustworthiness was secured through triangulation and member checking. The themes concerned using data to assess students, creating lessons, and collaborating with colleagues. The three findings revealed that elementary- and middle-level teachers acknowledge the PDSA as an effective tool for guiding student learning, that teachers rely on assessment data, and that teachers need ongoing collaborative engagement with their colleagues when using the PDSA. This study has implications for positive social change by providing a structure for improving classroom instructional practices and engaging teachers in more collaborative practices.
46

A novel approach for the improvement of error traceability and data-driven quality predictions in spindle units

Rangaraju, Adithya January 2021 (has links)
Research on the impact of component degradation in machine tool spindles on surface quality is limited, and this gap is the primary motivation for this work. It is common in the manufacturing industry to replace components even if they still have some Remaining Useful Life (RUL), resulting in an ineffective maintenance strategy. The primary objective of this thesis is to design and construct an Exchangeable Spindle Unit (ESU) test stand that aims at capturing the influence of the failure transition of components during machining and its effects on surface quality. Current machine tools cannot be tested with extreme component degradation, especially of the spindle, since the degrading elements can lead to permanent damage and machine tools are expensive to repair. The ESU substitutes and decouples the machine tool spindle to investigate the influence of deteriorated components on the response, so that the machine tool spindle itself is not exposed to the degrading effects. Data-driven quality control is another essential factor that many industries try to implement in their production lines. In a traditional manufacturing scenario, quality inspections are performed to check whether the measured parameters are within nominal standards at the end of a production line or between processes. A significant flaw in the traditional approach is its inability to map the degradation of components to quality. Condition monitoring techniques can resolve this problem and help identify defects early in production. This research focuses on two objectives. The first aims at capturing component degradation by artificially inducing imbalance in the ESU shaft and capturing the excitation behavior during machining with an end mill tool. Imbalance effects are quantified by adding mass onto the ESU spindle shaft; the varying effects of the mass are captured and characterized using vibration signals. The second objective is to establish a correlation between the surface quality of the machined part and the characterized vibration signals using Bagged Ensemble Tree (BET) machine learning models. The results show a good correlation between the surface roughness and the accelerometer signals. A comparative study between a balanced and an imbalanced spindle, along with the resultant surface quality, is presented in this research.
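As an illustration of the regression step mentioned in the abstract (relating vibration features to surface quality with a Bagged Ensemble Tree model), the sketch below uses scikit-learn's BaggingRegressor, whose default base learner is a decision tree. The feature names and synthetic data are assumptions, not the thesis's measurements; in practice the features would be computed from the accelerometer time series recorded on the ESU during machining.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import train_test_split

# Illustrative Bagged Ensemble Tree (BET) regression: predict surface roughness (Ra)
# from vibration features. The features and data are synthetic placeholders.

rng = np.random.default_rng(4)
n = 400
rms = rng.uniform(0.5, 3.0, n)          # RMS of accelerometer signal
kurtosis = rng.uniform(2.5, 6.0, n)     # kurtosis of accelerometer signal
peak_1x = rng.uniform(0.1, 2.0, n)      # amplitude at 1x spindle speed (imbalance indicator)
X = np.column_stack([rms, kurtosis, peak_1x])
ra = 0.4 + 0.3 * rms + 0.5 * peak_1x + 0.05 * rng.standard_normal(n)   # assumed trend

X_train, X_test, y_train, y_test = train_test_split(X, ra, test_size=0.25, random_state=0)

bet = BaggingRegressor(n_estimators=100, random_state=0)   # default base learner is a decision tree
bet.fit(X_train, y_train)
print("R^2 on held-out data:", bet.score(X_test, y_test))
```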
47

The Major Challenges in DDDM Implementation: A Single-Case Study: What Are the Main Challenges for Business-to-Business MNCs to Implement a Data-Driven Decision-Making Strategy?

Varvne, Matilda, Cederholm, Simon, Medbo, Anton January 2020 (has links)
Over the past years, the value of data and DDDM has increased significantly as technological advancements have made it possible to store and analyze large amounts of data at a reasonable cost. This has resulted in completely new business models that have disrupted whole industries. DDDM allows businesses to base their decisions on data, as opposed to gut feeling. Up until this point, the literature has been able to provide a general view of the major challenges corporations encounter when implementing a DDDM strategy. However, as the field is still rather new, the challenges identified remain very general, and many corporations, especially B2B MNCs selling consumer goods, seem to struggle with this implementation. Hence, a single-case study of such a corporation, named Alpha, was carried out with the purpose of exploring its major challenges in this process. Semi-structured interviews revealed evidence of four major findings: execution and organizational culture were supported in existing literature, while two additional findings associated with organizational structure and consumer behavior data were discovered in the case of Alpha. Based on this, the conclusions drawn were that B2B MNCs selling consumer goods face the challenges of identifying local markets as frontrunners for strategies such as becoming more data-driven, as well as the need to find a way to retrieve consumer behavior data. These two main challenges can provide a starting point for managers implementing DDDM strategies in B2B MNCs selling consumer goods in the future.
48

A Multi-Site Case Study: Acculturating Middle Schools to Use Data-Driven Instruction for Improved Student Achievement

James, Rebecca C. 05 January 2011 (has links)
In the modern era of high-stakes accountability, test data have become much more than a simple comparison (Schmoker, 2006; Payne & Miller, 2009). The information provided in modern data reports has become an invaluable tool to drive instruction in classrooms. However, there is a lack of good training for educators in evaluating data and translating findings into solid practices that can improve student learning (Blair, 2006; Dynarski, 2008; Light, Wexler, & Heinze, 2005; Payne & Miller, 2009). Some schools are good at collecting data but often fall short in knowing what to do next. It is the role of the principal to serve as an instructional leader and guide teachers to answer the recurring question of "now what?" The purpose of this study was to investigate ways in which principals build successful data-driven instructional systems within their schools, using a qualitative multi-site case study method. This research utilized a triangulation approach with structured interviews, on-site visits, and document reviews from various middle school supervisors, principals, and teachers. The findings are presented in four common themes and patterns identified as essential components administrators used to implement data-driven instructional systems to improve student achievement. The themes are 1) administrators must clearly define the vision and set the expectation of using data to improve student achievement, 2) administrators must take an active role in the data-driven process, 3) data must be easily accessible to stakeholders, and 4) stakeholders must devote time on a regular basis to the data-driven process. The four themes led to the identification of ten common steps administrators can use to acculturate their school or school division to the data-driven instruction process. / Ed. D.
49

Data driven modelling for environmental water management

Syed, Mofazzal January 2007 (has links)
Management of water quality is generally based on physically-based equations or hypotheses describing the behaviour of water bodies. In recent years, models built on the basis of larger amounts of collected data have been gaining popularity; this modelling approach can be called data-driven modelling. Observational data represent specific knowledge, whereas a hypothesis represents a generalization of this knowledge that implies and characterizes all such observational data. Traditionally, deterministic numerical models have been used for predicting flow and water quality processes in inland and coastal basins. These models generally take a long time to run and cannot be used as on-line decision support tools that would enable imminent threats, such as public health risks and flooding, to be predicted. Data-driven models, in contrast, are data intensive, and there are some limitations to this approach. The extrapolation capability of data-driven methods is a matter of conjecture. Furthermore, gathering the extensive data required for building a data-driven model can be time- and resource-consuming, and when predicting the impact of a future development, the data are unlikely to exist. The main objective of the study was to develop an integrated approach for rapid prediction of bathing water quality in estuarine and coastal waters. Faecal Coliforms (FC) were used as a water quality indicator, and two of the most popular data mining techniques, namely Genetic Programming (GP) and Artificial Neural Networks (ANNs), were used to predict the FC levels in a pilot basin. In order to provide enough data for training and testing the neural networks, a calibrated hydrodynamic and water quality model was used to generate input data for the neural networks. A novel non-linear data analysis technique, called the Gamma Test, was used to determine the data noise level and the number of data points required for developing smooth neural network models. Details are given of the data-driven models, the numerical models and the Gamma Test. Details are also given of a series of experiments undertaken to test data-driven model performance for different numbers of input parameters and time lags. The response time of the receiving water quality to the input boundary conditions obtained from the hydrodynamic model has been shown to be useful knowledge for developing accurate and efficient neural networks. It is known that a natural phenomenon like bacterial decay is affected by a whole host of parameters which cannot be captured accurately using deterministic models alone. Therefore, the data-driven approach has been investigated using field survey data collected in Cardiff Bay to investigate the relationship between bacterial decay and other parameters. Both the GP and ANN models gave similar, if not better, predictions of the field data in comparison with the deterministic model, with the added benefit of almost instant prediction of the bacterial levels for this recreational water body. The models have also been investigated using idealised and controlled laboratory data for the velocity distributions along compound channel reaches, with idealised rods located on the floodplain to replicate large vegetation (such as mangrove trees).
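A minimal sketch of the neural-network component described above, predicting faecal coliform levels from time-lagged boundary inputs, is shown below using scikit-learn's MLPRegressor. The choice of input variables, the lag, and the synthetic data are illustrative assumptions rather than the study's actual Cardiff Bay dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Illustrative data-driven model: predict faecal coliform (FC) concentration from
# time-lagged boundary inputs (e.g., river flow, tide level, solar radiation).
# The variables, lag, and data below are synthetic placeholders.

rng = np.random.default_rng(5)
n = 1000
flow = rng.uniform(5, 50, n)            # upstream river flow
tide = np.sin(np.linspace(0, 60, n))    # tidal level proxy
solar = rng.uniform(0, 800, n)          # solar radiation (drives bacterial die-off)

lag = 3                                 # assumed response time (in time steps)
X = np.column_stack([flow[:-lag], tide[:-lag], solar[:-lag]])
log_fc = (2.0 + 0.03 * flow[:-lag] - 0.002 * solar[:-lag] + 0.3 * tide[:-lag]
          + 0.1 * rng.standard_normal(n - lag))        # assumed log10 FC response

X_train, X_test, y_train, y_test = train_test_split(X, log_fc, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_train)

ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X_train), y_train)
print("R^2 on held-out data:", ann.score(scaler.transform(X_test), y_test))
```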
50

DDD Methodology-Based Design Tool's Code Generator Development and Research

Valinčius, Kęstutis 13 August 2010 (has links)
Data Driven Design (DDD) methodology is widely used in various software systems. The aim of this methodology is to separate and parallelize the work of software developers and scenario designers: core functionality is implemented through interfaces, while system dynamics are implemented through scenarios. This introduces a level of abstraction that makes the software product more flexible and easier to maintain and improve, and these activities can be performed in parallel. The aim of this work was to create an automatic code generator that transforms a graphically modelled scenario into program code. Generating code automatically greatly reduces the probability of syntactic and logical errors; the result depends only on the modelled scenario. Code is generated very quickly and requires no intervention from the developer. This aim was achieved by moving business-logic design into the scenario-design process and implementing the code-generation subsystem as a web service. Using a plug-in (cartridge) based system, code is generated without being tied to a specific architecture, technology, or application domain. A scenario is modelled in the graphical scenario-design tool and then transformed into a metalanguage, from which the final program code is generated; the metalanguage is an XML-based language defined by specific rules. The experimental system was implemented without major problems, and modelling a new system with the design tool sped up the development process by a factor of seven, demonstrating the modelling tool's advantage over manual programming.
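The abstract does not specify the metalanguage schema, so the following is a hypothetical miniature of the idea: a scenario serialized in a rule-governed XML dialect is transformed into source code by a generator. The element and attribute names, and the generated Python target, are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical miniature of scenario-to-code generation: a graphically modelled
# scenario, serialized as a rule-governed XML metalanguage, is turned into code.
# The XML schema and the generated target are invented for illustration.

scenario_xml = """
<scenario name="OrderProcessing">
  <step id="1" action="validate_order" on_fail="reject"/>
  <step id="2" action="charge_payment" on_fail="rollback"/>
  <step id="3" action="ship_order"/>
</scenario>
"""

def generate_code(xml_text: str) -> str:
    """Generate a Python function skeleton from the scenario metalanguage."""
    root = ET.fromstring(xml_text)
    lines = [f"def run_{root.get('name').lower()}(context):"]
    for step in root.findall("step"):
        action, on_fail = step.get("action"), step.get("on_fail")
        if on_fail:
            lines += [f"    if not {action}(context):",
                      f"        return {on_fail}(context)"]
        else:
            lines += [f"    {action}(context)"]
    lines += ["    return context"]
    return "\n".join(lines)

print(generate_code(scenario_xml))
```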
