51

Fusing Modeling and Testing to Enhance Environmental Testing Approaches

Devine, Timothy Andrew 09 July 2019 (has links)
A proper understanding of the dynamics of a mechanical system is crucial to ensure the highest levels of performance. The understanding is frequently determined through modeling and testing of components. Modeling provides a cost-effective method for rapidly developing knowledge of the system; however, the model is incapable of accounting for fluctuations that occur in physical spaces. Testing, when performed properly, provides a near-exact understanding of how a part or assembly functions; however, it can be expensive both fiscally and temporally. Often, practitioners of the two disciplines work in parallel, never intersecting with the other group. Further advances in fusing modeling and testing can produce a more comprehensive understanding of dynamic systems while remaining inexpensive in terms of computation, financial cost, and time. Accordingly, the goal of the presented work is to develop ways to merge the two branches and include test data in models of operational systems. This is done through a series of analytical and experimental tasks examining the boundary conditions of various systems. The first avenue explored was an attempt at modeling unknown boundary conditions from an operational environment by modeling the same system in known configurations in a controlled environment, such as a laboratory test. An analytical beam was studied under applied environmental loading, with grounding stiffnesses added to simulate an operational condition; an attempt was then made to match the response with a free-boundary beam driven at a reduced number of excitation points. Owing to the properties of the inverse-problem approach taken, the responses of the two systems matched at the control locations; at non-control locations, however, the responses showed a large degree of variation. From the mismatch in mechanical impedance, it is apparent that improperly representing boundary conditions can have drastic effects on the accuracy of models and recreated tests. With the focus now directed toward modeling and testing of boundary conditions, methods were explored to combine the two approaches in harmony. The second portion of this work focuses on modeling an unknown boundary connection using a collection of similar, testable boundary conditions to parametrically interpolate to the unknown configuration. This was done by using data-driven models of the known systems as the interpolating functions, with system boundary stiffness as the varied parameter. This approach yielded parametric model responses nearly identical to the original system responses for analytical systems and showed early signs of promise for an experimental beam. Following these two studies, the potential for extending the parametric data-driven model approach to other systems is discussed, along with improvements to the approach and the benefits it brings. / Master of Science / A proper understanding of the dynamics of a mechanical system in a severe environment is crucial to ensure the highest levels of performance. The understanding is frequently determined through modeling and testing of components. Modeling provides a cost-effective method for rapidly developing knowledge of the system; however, the model is incapable of accounting for fluctuations that occur in physical spaces.
Testing, when performed properly, provides a near-exact understanding of how a part or assembly functions; however, it can be expensive both fiscally and temporally. Often, practitioners of the two disciplines work in parallel, never intersecting with the other group and favoring one approach over the other for various reasons. Further advances in fusing modeling and testing can produce a more comprehensive understanding of dynamic systems subject to environmental excitation while remaining inexpensive in terms of computation, financial cost, and time. Accordingly, the presented work aims to develop ways to merge the two branches and include test data in models of operational systems. This is done through a series of analytical and experimental tasks examining the boundary conditions of various systems, first attempting to replicate the system response using inverse approaches. This is then followed by modeling boundary stiffnesses using data-driven and parametric modeling approaches. The validity and potential impact of these methods are also discussed.
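The control-location match and non-control mismatch described above can be reproduced with a few lines of linear algebra. Below is a minimal, hypothetical sketch (not taken from the thesis): a response generated by many excitation points is fitted in a least-squares sense with a reduced excitation set, so the fit is exact at the control sensors but degrades elsewhere. All matrices, dimensions, and values are invented for illustration.

```python
# Sketch: why an inverse-problem fit can match responses at control locations
# yet diverge elsewhere. All names and values here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

n_forces_true = 6      # excitation points acting on the "operational" system
n_forces_fit = 2       # reduced excitation set available in the lab recreation
n_control = 2          # sensor locations used to fit the forces
n_check = 4            # non-control locations used to check the match

# Hypothetical FRF matrices (response per unit force) at one frequency line.
H_true = rng.normal(size=(n_control + n_check, n_forces_true))
H_fit = H_true[:, :n_forces_fit]           # lab beam driven at fewer points

f_true = rng.normal(size=n_forces_true)
x = H_true @ f_true                        # operational response everywhere

# Fit reduced forces so the response matches at the control sensors only.
f_hat, *_ = np.linalg.lstsq(H_fit[:n_control], x[:n_control], rcond=None)

x_hat = H_fit @ f_hat
print("control-location error:", np.abs(x_hat[:n_control] - x[:n_control]).max())
print("non-control error:     ", np.abs(x_hat[n_control:] - x[n_control:]).max())
```

Running the sketch shows near-zero error at the control sensors and large error at the check locations, mirroring the mismatch in mechanical impedance discussed above.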
52

Combining Data-driven and Theory-guided Models in Ensemble Data Assimilation

Popov, Andrey Anatoliyevich 23 August 2022 (has links)
There once was a dream that data-driven models would replace their theory-guided counterparts. We have awoken from this dream. We now know that data cannot replace theory. Data-driven models still have their advantages, mainly in computational efficiency, but also in providing us with some special sauce that is unreachable by our current theories. This dissertation aims to provide a way in which both the accuracy of theory-guided models and the computational efficiency of data-driven models can be combined. This combination of theory-guided and data-driven modeling allows us to draw on ideas from a much broader set of disciplines and can help pave the way for robust and fast methods. / Doctor of Philosophy / As an illustrative example, take the problem of predicting the weather. Typically a supercomputer will run a model several times to generate predictions a few days into the future. Sensors such as those on satellites will then pick up observations at a few points on the globe, which are not representative of the whole atmosphere. These observations are combined, or "assimilated," with the computer model predictions to create a better representation of our current understanding of the state of the earth. This predict-assimilate cycle is repeated every day and is called (sequential) data assimilation. The prediction step was traditionally performed by a computer model based on rigorous mathematics. With the advent of big data, many have wondered if models based purely on data would take over. This has not happened. This thesis is concerned with running traditional mathematical models alongside data-driven models in the prediction step, then building a theory in which both can be used in data assimilation at the same time, decreasing computational cost without a drop in accuracy.
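To make the predict-assimilate cycle concrete, here is a minimal sketch of one analysis step of a stochastic ensemble Kalman filter, a standard workhorse of ensemble data assimilation. It is illustrative only: the dimensions, the linear observation operator, and the random stand-ins for model forecasts and observations are assumptions, not details from the dissertation.

```python
# Minimal stochastic EnKF analysis step: combine a forecast ensemble with
# sparse observations. Dimensions and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

n_state, n_obs, n_ens = 40, 10, 25
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(0, n_state, n_state // n_obs)] = 1.0  # observe every 4th variable
R = 0.5 * np.eye(n_obs)                     # observation-error covariance

X_f = rng.normal(size=(n_state, n_ens))     # forecast ensemble (stand-in for model runs)
y = rng.normal(size=n_obs)                  # observations

# Sample covariance of the ensemble and the Kalman gain.
A = X_f - X_f.mean(axis=1, keepdims=True)
P_f = A @ A.T / (n_ens - 1)
K = P_f @ H.T @ np.linalg.inv(H @ P_f @ H.T + R)

# Perturbed-observations update: each member assimilates a noisy copy of y.
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
X_a = X_f + K @ (Y - H @ X_f)
print("analysis ensemble mean shape:", X_a.mean(axis=1).shape)
```

In the hybrid setting the dissertation describes, some ensemble members would come from the theory-guided model and others from a cheap data-driven surrogate, with the same analysis step fusing both with the observations.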
53

A Physically Informed Data-Driven Approach to Analyze Human Induced Vibration in Civil Structures

Kessler, Ellis Carl 24 June 2021 (has links)
With the rise of the Internet of Things (IoT) and smart buildings, new algorithms are being developed to understand how occupants interact with buildings via structural vibration measurements. These vibration-based occupant inference (VBOI) algorithms have been developed to localize footsteps within a building, to classify occupants, and to monitor occupant health. This dissertation presents a three-stage journey proposing a path forward for VBOI research based on physically informed data-driven models of structural dynamical systems. The first part of this dissertation presents a method for extracting temporal gait parameters via underfloor accelerometers. The time between an occupant's consecutive steps can be measured with only structural vibration measurements, with accuracy similar to current gait analysis tools such as force plates and in-shoe pressure sensors. The benefit of this, and other VBOI gait analysis algorithms, is their ease of use. Gait analysis is currently limited to clinical settings with specialized measurement systems; VBOI gait analysis provides the ability to bring gait analysis to any building. VBOI algorithms often make simplifying assumptions about the dynamics of the building in which they operate. Through a calibration procedure, many VBOI algorithms can learn some system parameters. However, as demonstrated in the second part of this dissertation, some commonly made assumptions oversimplify phenomena present in civil structures such as attenuation, reflections, and dispersion. A series of experimental and theoretical investigations shows that three common assumptions made in VBOI algorithms are each unable to account for at least one of these phenomena, leading to algorithms that are accurate only under certain conditions. The final part of this dissertation introduces a physically informed data-driven modelling technique which could be used in VBOI to create a more complete model of a building. Continuous residue interpolation (CRI) takes frequency response function (FRF) measurements at a discrete number of testing locations and creates a predictive model with continuous spatial resolution. The fitted CRI model can be used to simulate the response at any location to an input at any other location. An example of using CRI for VBOI localization is shown. / Doctor of Philosophy / Vibration-based occupant inference (VBOI) algorithms are an emerging area of research in smart buildings instrumented with vibration sensors. These algorithms use vibration measurements of the building's structure to learn something about the occupants inside the building. For example, the vibration of a floor in response to a person's footstep could be used to estimate where that person is, without the need for any line-of-sight sensors like cameras or motion sensors. The storyline of this dissertation makes three stops. The first is the demonstration of a VBOI algorithm for monitoring occupant health. The second is an investigation of some assumptions commonly made while developing VBOI algorithms, seeking to shed light on when they lead to accurate results and when they should be used with caution. The third, and final, is the development of a data-driven modelling method which uses knowledge about how systems vibrate to build as detailed a model of the system as possible. Current VBOI algorithms have demonstrated the ability to accurately infer a range of information about occupants through vibration measurements.
This is shown in a varied literature of localization algorithms, as well as a growing number of algorithms for performing gait analysis. Gait analysis is the study of how people walk and its correlation to their health. The vibration-based gait analysis procedure in this work demonstrates extracting distributions of temporal gait parameters, like the time between steps. However, many current VBOI algorithms make significant simplifying assumptions about the dynamics of civil structures. Experimental and theoretical investigations of some of these assumptions show that while each assumption is accurate in certain situations, the dynamics of civil structures are too complex to be completely captured by these simplified models. The proposed path forward for VBOI algorithms is to employ more sophisticated data-driven modelling techniques. Data-driven models use measurements from the system to build a model of how the system would respond to new inputs. The final part of this dissertation is the development of a novel data-driven modelling technique that could be useful for VBOI. The new method, continuous residue interpolation (CRI), uses knowledge of how systems vibrate to build a model of a vibrating system, not only at the locations which were measured but over the whole system. This allows a relatively small amount of testing to be used to create a model of the entire system, which can in turn be used for VBOI algorithms.
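As a concrete illustration of the gait-parameter idea, the sketch below detects footstep impacts in a synthetic floor-vibration signal and differences their arrival times to estimate inter-step intervals. The signal model, sampling rate, and peak-detection thresholds are assumptions for illustration; the dissertation's actual procedure extracts full distributions of gait parameters from real underfloor accelerometers.

```python
# Sketch: estimate time-between-steps from a (synthetic) floor vibration
# signal via peak detection. All parameters are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

fs = 1000.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Synthetic footstep train: decaying 40 Hz bursts every ~0.55 s plus noise.
signal = 0.02 * np.random.default_rng(2).normal(size=t.size)
for t0 in np.arange(0.5, 9.5, 0.55):
    idx = t >= t0
    signal[idx] += np.exp(-20 * (t[idx] - t0)) * np.sin(2 * np.pi * 40 * (t[idx] - t0))

# Detect footstep impacts and difference their arrival times.
peaks, _ = find_peaks(np.abs(signal), height=0.3, distance=int(0.3 * fs))
step_times = t[peaks]
inter_step_intervals = np.diff(step_times)
print("mean inter-step time: %.3f s" % inter_step_intervals.mean())
```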
54

A Case Study of Crestwood Primary School: Organizational Routines Implemented for Data-Driven Decision Making

Williams, Kimberly Graybeal 30 October 2014 (has links)
The research study investigated how organizational routines influenced classroom and intervention instruction in a primary school. Educators have used student data for decades, but they continue to struggle with the best way to use data to influence instruction. The historical overview of the research highlighted the context of data use from the Effective Schools movement through the No Child Left Behind Act, noting the progression of emphasis placed on student data results. While numerous research studies have focused on the use of data, the National Center for Education Evaluation and Regional Assistance (2009) reported that existing research on the use of data to make instructional decisions does not yet provide conclusive evidence of what practices work to improve student achievement. A descriptive case study methodology was employed to investigate the educational phenomenon of organizational routines implemented for data-driven decision making to influence classroom and intervention instruction. The case study examined a school that faced the macrolevel pressures of school improvement. The study triangulated data from surveys, interviews, and document analysis in an effort to reveal common themes about organizational routines for data-driven decision making. The study participants identified 14 organizational routines as influencing instruction. The interview questions focused on the common themes of (a) curriculum alignment, (b) common assessments, (c) guided reading levels, (d) professional learning communities, and (e) acceleration plans. The survey respondents and interview participants explained how the organizational routines facilitated the use of data by providing (a) focus and direction, (b) student-centered instruction, (c) focus on student growth, (d) collaboration and teamwork, (e) flexible grouping of students, and (f) teacher reflection and ownership of all students. Challenges and unexpected outcomes of the organizational routines for data-driven decision making were also discussed. The challenges with the most references included (a) time, (b) too much data, (c) data with conflicting information, (d) the pacing guide, and (e) changing teacher attitudes and practices. Ultimately, a data-driven culture was cultivated within the school that facilitated instructional adjustments, resulting in increased academic achievement. / Ed. D.
55

Organ Viability Assessment in Transplantation based on Data-driven Modeling

Lan, Qing 03 March 2020 (has links)
Organ transplantation is one of the most important and effective solutions for saving end-stage patients, who have one or more critical organ failures. However, the supply of donor organs is inadequate to meet demand. Even worse, the lack of accurate non-invasive assessment methods wastes 20% of donor organs every year. Currently, the most frequently used organ assessment methods are visual inspection and biopsy. Yet both methods are subjective: the assessment accuracy depends on the evaluator's experience. Moreover, repeated biopsies will potentially damage the organs. To reduce the waste of donor organs, online non-invasive and quantitative organ assessment methods are in great need. Organ viability assessment is a challenging issue for four reasons: 1) there are no universally accepted guidelines or procedures for surgeons to quantitatively assess organ viability; 2) there are no easily deployed, non-invasive in situ biological data to correlate with organ viability; 3) organ viability is difficult to model because of heterogeneity among organs; 4) both visual inspection and biopsy can be applied only at the present time, and how to forecast the viability of similar-but-non-identical organs at a future time remains unresolved. Motivated by these challenges, the overall objective of this dissertation is to develop online non-invasive and quantitative assessment methods to predict and forecast organ viability. To this end, four data-driven modeling research tasks are investigated: 1) Quantitative and qualitative models are used to jointly predict the number of dead cells and liver viability based on features extracted from biopsy images. This method can quantitatively assess organ viability and could be used to validate the biopsy results from pathologists to increase evaluation accuracy. 2) A multitask learning logistic regression model is applied to assess liver viability, using principal component analysis to extract infrared image features and quantify the correlation between liver viability and spatial infrared imaging data. This non-invasive online assessment method can evaluate organ viability without physical contact, reducing the risk of damaging the organs. 3) A spatial-temporal smooth variable selection method is developed to improve liver viability prediction accuracy by considering both spatial and temporal effects from the infrared images without feature engineering. In addition, it provides medical interpretation based on variable selection by highlighting the regions of the liver most responsible for viability loss. 4) A multitask general path model is implemented to forecast heterogeneous kidney viability from limited historical data by learning the viability loss path of each kidney during preservation. The generality of this method is validated by forecasting tissue deformation in the needle biopsy process, with the potential to improve biopsy accuracy. In summary, the proposed data-driven methods can predict and forecast organ viability without damaging the organ. As a result, the increased utilization rate of donor organs will benefit more end-stage patients by dramatically extending their life spans. / Doctor of Philosophy / Organ transplantation is the ultimate solution for saving end-stage patients with one or more organ failures. However, the supply of donor organs is inadequate to meet demand.
Even worse, the lack of accurate, non-invasive viability assessment methods wastes 20% of donor organs every year. Currently, the most frequently used organ assessment methods are visual inspection and biopsy. Yet both methods are subjective: the assessment accuracy depends on the evaluator's personal experience. Moreover, repeated biopsies will potentially damage the organs. As a result, online non-invasive and quantitative organ assessment methods are in great need. They are extremely important because such methods will increase the organ utilization rate by saving more discarded organs with transplantation potential. The overall objective of this dissertation is to advance knowledge on modeling organ viability by developing online non-invasive and quantitative methods to predict and forecast the viability of heterogeneous organs in transplantation. After an introduction in Chapter 1, four research tasks are investigated. In Chapter 2, quantitative and qualitative models jointly predicting porcine liver viability are proposed based on features from biopsy images to validate the biopsy results. In Chapter 3, a multitask learning logistic regression model is proposed to assess cross-liver viability by correlating liver viability with spatial infrared data, validated on porcine livers. In Chapter 4, a spatial-temporal smooth variable selection method is proposed to predict liver viability by considering both spatial and temporal correlations in modeling without feature engineering, also validated on porcine livers. In addition, the variable selection results provide medical interpretation by capturing the significant regions of the liver in predicting viability. In Chapter 5, a multitask general path model is proposed to forecast kidney viability, validated on porcine kidneys. This forecasting method is generalized to a needle biopsy tissue deformation case study with the objective of improving needle insertion accuracy. Finally, I summarize the research contributions and discuss future research directions in Chapter 6. The proposed data-driven methods can predict and forecast organ viability without damaging the organ. As a result, the increased utilization rate of donor organs will benefit more patients by dramatically extending their life spans and bringing them back to normal daily activities.
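A rough sketch of the idea behind the second task follows: compress spatial infrared frames with principal component analysis and feed the scores to a logistic regression classifier of viability. For brevity it uses a single-task classifier and synthetic data; the multitask formulation, image sizes, and labels in the dissertation differ, so treat every name and number here as an assumption.

```python
# Sketch: PCA features from (synthetic) infrared frames -> logistic
# regression viability classifier. Single-task stand-in for the
# dissertation's multitask formulation; all data are invented.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)

n_frames, h, w = 120, 32, 32                 # hypothetical infrared image stack
X = rng.normal(size=(n_frames, h * w))       # flattened frames
y = (X[:, :50].mean(axis=1) > 0).astype(int) # surrogate viable / non-viable label

model = make_pipeline(PCA(n_components=10), LogisticRegression(max_iter=1000))
model.fit(X[:100], y[:100])
print("held-out accuracy:", model.score(X[100:], y[100:]))
```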
56

Cross-Validation of Data-Driven Correction Reduced Order Modeling

Mou, Changhong 03 October 2018 (has links)
In this thesis, we develop a data-driven correction reduced order model (DDC-ROM) for the numerical simulation of fluid flows. The general DDC-ROM involves two stages: (1) we apply ROM filtering (such as ROM projection) to the full order model (FOM) and construct the filtered ROM (F-ROM); (2) we use data-driven modeling to model the nonlinear interactions between resolved and unresolved modes, which solves the F-ROM's closure problem. In the standard DDC-ROM, a linear or quadratic ansatz is used in the data-driven modeling step. In this thesis, we propose a new cubic ansatz. To obtain the unknown coefficients in our ansatz, we solve an optimization problem that minimizes the difference between the FOM data and the ansatz. We test the new DDC-ROM in the numerical simulation of the one-dimensional Burgers equation with a small diffusion coefficient. Furthermore, we perform a cross-validation of the DDC-ROM to investigate whether it can be successful in computational settings that are different from the training regime. / M.S. / Practical engineering and scientific problems often require the repeated simulation of unsteady fluid flows. In these applications, the computational cost of high-fidelity full-order models can be prohibitively high. Reduced order models (ROMs) represent efficient alternatives to brute-force computational approaches. In this thesis, we propose a data-driven correction ROM (DDC-ROM) in which available data and an optimization problem are used to model the nonlinear interactions between resolved and unresolved modes. In order to test the new DDC-ROM's predictive capabilities, we perform a cross-validation for the one-dimensional viscous Burgers equation and different training regimes.
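The coefficient-fitting step can be sketched as an ordinary least-squares problem. In the toy code below, synthetic stand-ins for the resolved ROM coefficients and the FOM correction term are regressed onto linear and quadratic features (a cubic ansatz would simply add third-order products). All data and dimensions are invented for illustration.

```python
# Sketch: fit ansatz coefficients by least squares so the ansatz matches the
# FOM correction term. Synthetic data; linear-plus-quadratic ansatz only.
import numpy as np

rng = np.random.default_rng(4)
n_snap, r = 200, 3                           # snapshots, ROM dimension

a = rng.normal(size=(n_snap, r))             # resolved ROM coefficients
tau = rng.normal(size=(n_snap, r))           # correction term from filtered FOM data

# Feature matrix: linear terms plus unique quadratic products of the a's.
quad = np.column_stack([a[:, i] * a[:, j] for i in range(r) for j in range(i, r)])
Phi = np.hstack([a, quad])

# Least-squares fit tau ~ Phi @ C, one column of coefficients per ROM mode.
C, *_ = np.linalg.lstsq(Phi, tau, rcond=None)
print("fitted coefficient matrix shape:", C.shape)   # (r + r(r+1)/2, r)
```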
57

Data-driven Methods in Mechanical Model Calibration and Prediction for Mesostructured Materials

Kim, Jee Yun 01 October 2018 (has links)
Mesoscale design involving control of the material distribution pattern can create a statistically heterogeneous material system, which has shown increased adaptability to complex mechanical environments involving highly non-uniform stress fields. Advances in multi-material additive manufacturing can aid in this mesoscale design, providing voxel-level control of material properties. This vast freedom in design space also unlocks possibilities in the optimization of the material distribution pattern. The optimization problem can be divided into a forward problem focusing on accurate prediction and an inverse problem focusing on efficient search of the optimal design. In the forward problem, the physical behavior of the material can be modeled based on fundamental mechanics laws and simulated through finite element analysis (FEA). A major limitation in modeling is the unknown parameters in the constitutive equations that describe the constituent materials; determining these parameters via conventional single-material testing has proven insufficient, which necessitates novel and effective approaches to calibration. A calibration framework based on Bayesian inference, which integrates data from simulations and physical experiments, has been applied to a study involving a mesostructured material fabricated by fused deposition modeling. Calibration results provide insights into the values to which these parameters converge, as well as which material parameters the model output depends on most strongly, while accounting for sources of uncertainty introduced during the modeling process. Additionally, this statistical formulation can provide quick predictions of the physical system by implementing a surrogate model and a discrepancy model. The surrogate model is a statistical representation of the simulation results, circumventing issues arising from computational load, while the discrepancy model accounts for the difference between the simulation output and physical experiments. In this thesis, this Bayesian calibration framework is applied to a material bending problem, where in-situ mechanical characterization data and FEA simulations based on constitutive modeling are combined to produce updated values of the unknown material parameters with uncertainty. / Master of Science / A material system obtained by applying a pattern of multiple materials has proven its adaptability to complex practical conditions. The layer-by-layer manufacturing process of additive manufacturing allows for this type of design because of its control over where material is deposited. This possibility then raises the question of how a multi-material system can be optimized in its design for a given application. In this research, we focus mainly on the problem of accurately predicting the response of the material when subjected to stimuli. Conventionally, simulations aided by finite element analysis (FEA) were relied upon for prediction; however, this approach presents issues such as long run times and uncertainty in context-specific inputs of the simulation. We have instead adopted a framework using advanced statistical methodology able to combine both experimental and simulation data to significantly reduce run times as well as quantify the various uncertainties associated with running simulations.
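The flavor of such a calibration can be conveyed in a few lines: a random-walk Metropolis sampler draws from the posterior of one unknown parameter given noisy "experimental" data and a cheap surrogate. The surrogate function, prior bounds, and noise level below are assumptions, and the discrepancy model used in the thesis is omitted to keep the sketch short.

```python
# Sketch: Bayesian calibration of one material parameter via random-walk
# Metropolis against a cheap surrogate. All functions and values assumed.
import numpy as np

rng = np.random.default_rng(5)

def surrogate(theta, x):
    # Stand-in for an FEA emulator: bending response vs. load location x.
    return theta * x * (1 - x)

x_obs = np.linspace(0.1, 0.9, 9)
theta_true, sigma = 2.5, 0.05
y_obs = surrogate(theta_true, x_obs) + rng.normal(0, sigma, x_obs.size)

def log_post(theta):
    if not 0 < theta < 10:                   # uniform prior on (0, 10)
        return -np.inf
    resid = y_obs - surrogate(theta, x_obs)
    return -0.5 * np.sum(resid**2) / sigma**2

samples, theta = [], 1.0
for _ in range(5000):
    prop = theta + 0.1 * rng.normal()        # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

print("posterior mean: %.3f (true %.1f)" % (np.mean(samples[1000:]), theta_true))
```

The spread of the retained samples is the calibrated parameter's uncertainty, which is exactly what the abstract means by "updated values of the unknown material parameters with uncertainty."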
58

Perceptions of Teachers about Using and Analyzing Data to Inform Instruction

Harris, Lateasha Monique 01 January 2018 (has links)
Monitoring academic progress to guide instructional practices is an important role of teachers in a small rural school district in the Southern United States. Teachers in this region were experiencing difficulties using the approved school district model to implement data-driven instruction. The purpose of this qualitative case study was to identify elementary- and middle-level teachers' perceptions about using the Plan, Do, Study, Act (PDSA) model to analyze data in the classroom and use it to inform classroom instruction. Bambrick-Santoyo's principles for effective data-driven instruction formed the conceptual framework that guided this study. The research questions focused on teachers' perceptions of and experiences with the PDSA. Purposeful sampling was used to recruit 8 teachers from Grades 3-9, and their insights were captured through semistructured interviews, reflective journals, and document analyses of data walls. Emergent themes were identified through an open coding process, and trustworthiness was secured through triangulation and member checking. The themes concerned using data to assess students, creating lessons, and collaborating with colleagues. The three findings revealed that elementary- and middle-level teachers acknowledge the PDSA as an effective tool for guiding student learning, that teachers rely on assessment data, and that teachers need ongoing collaborative engagement with their colleagues when using the PDSA. This study has implications for positive social change by providing a structure for improving classroom instructional practices and engaging teachers in more collaborative practices.
59

Data-Driven Decision-Making in Urban Schools That Transitioned From Focus or Priority to Good Standing

Ware, Danielle 01 January 2018 (has links)
Despite the importance an urban school district places on data-driven decision-making (DDDM) to drive instruction, implementation remains a challenge. The purpose of this study was to investigate how support systems affected the implementation of DDDM to drive instructional practices in three urban schools that recently transitioned from priority or focus to good standing on the State Accountability Report. The study aligned with the organizational-supports conceptual framework, with an emphasis on data accessibility, collection methods, reliability and validity, the use of coaches and data teams, professional development, and data-driven leaders. Qualitative data were collected through one-on-one interviews; the research questions addressed the perspectives of three school leaders and nine teachers on data culture and data-driven instructional practices. The data were triangulated to generate a thematic illustration of content that was coded and analyzed to identify recurring patterns and themes. Findings suggest that leaders create a data-driven school culture by establishing a school-wide vision, developing a DDDM cycle, creating a collaborative DDDM support system, communicating data as a school community, and changing the way technology is used in DDDM initiatives. Based on the findings, a project in the form of a white paper was developed, drawing on research to show that when data are regularly used to hone student skills, a positive shift in overall teacher practices occurs. This shift provides the potential for positive social change when students have opportunities to attain academic goals, resulting in increased student achievement and higher graduation rates.
60

A novel approach for the improvement of error traceability and data-driven quality predictions in spindle units

Rangaraju, Adithya January 2021 (has links)
Research on the impact of component degradation on the surface quality produced by machine tool spindles is limited, and this gap is the primary motivation for this work. It is common in the manufacturing industry to replace components even if they still have some Remaining Useful Life (RUL), resulting in an ineffective maintenance strategy. The primary objective of this thesis is to design and construct an Exchangeable Spindle Unit (ESU) test stand that captures the influence of the failure transition of components during machining and its effects on surface quality. Current machine tools cannot be tested under extreme component degradation, especially of the spindle, since the degrading elements can lead to permanent damage and machine tools are expensive to repair. The ESU substitutes for and decouples the machine tool spindle to investigate the influence of deteriorated components on the response, so that the machine tool spindle itself is not subjected to the degrading effects. Data-driven quality control is another essential factor that many industries try to implement in their production lines. In a traditional manufacturing scenario, quality inspections are performed to check whether the measured parameters are within nominal standards at the end of a production line or between processes. A significant flaw in the traditional approach is its inability to map the degradation of components to quality. Condition monitoring techniques can resolve this problem and help identify defects early in production. This research focuses on two objectives. The first aims at capturing component degradation by artificially inducing imbalance in the ESU shaft and capturing the excitation behavior during machining with an end mill tool. Imbalance effects are quantified by adding mass to the ESU spindle shaft; the varying effects of the mass are captured and characterized using vibration signals. The second objective is to establish a correlation between the surface quality of the machined part and the characterized vibration signals using Bagged Ensemble Tree (BET) machine learning models. The results show a good correlation between surface roughness and the accelerometer signals. A comparison study between a balanced and an imbalanced spindle, along with the resulting surface quality, is presented in this research.
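To illustrate the modeling step, the sketch below trains a bagged ensemble of regression trees on hypothetical vibration features to predict surface roughness. The feature set, the synthetic relationship, and all hyperparameters are assumptions in the spirit of the BET models described above, not the thesis's actual pipeline.

```python
# Sketch: bagged ensemble of regression trees mapping vibration features to
# surface roughness. Synthetic data; hyperparameters are illustrative.
# (scikit-learn >= 1.2 uses the `estimator` keyword for BaggingRegressor.)
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)

# Hypothetical per-cut features: e.g., RMS, kurtosis, and peak level of the
# accelerometer signal plus the added imbalance mass; target is Ra roughness.
X = rng.normal(size=(150, 4))
Ra = 1.0 + 0.8 * X[:, 0] + 0.3 * X[:, 3] + 0.1 * rng.normal(size=150)

X_tr, X_te, y_tr, y_te = train_test_split(X, Ra, random_state=0)
bet = BaggingRegressor(estimator=DecisionTreeRegressor(), n_estimators=50,
                       random_state=0)
bet.fit(X_tr, y_tr)
print("R^2 on held-out cuts: %.2f" % bet.score(X_te, y_te))
```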
