131. Online boiler convective heat exchanger monitoring: a comparison of soft sensing and data-driven approaches. Prinsloo, Gerto. 07 May 2019.
Online monitoring supports plant reliability and performance management by providing real-time information about the condition of equipment. However, the intricate geometries and harsh operating environment of coal-fired power plant boilers inhibit online measurement of all process-related variables. A low-cost alternative lies in using knowledge about boiler operation to extract information about its condition from standard online process measurements. This approach is evaluated with the aim of enhancing online condition monitoring of a boiler’s convective pass heat exchanger network by using, respectively, a soft sensor and a data-driven method. The soft sensor approach is based on a one-dimensional thermofluid process model which takes measurements as inputs and calculates unmeasured variables as outputs. The model is calibrated using design information. The data-driven method, developed specifically in this study, identifies unique fault signatures in measurement data to detect and quantify changes in unmeasured variables. The fault signatures are initially constructed using the calibrated one-dimensional thermofluid process model. The benefits and limitations of these methods are compared using a case study boiler, which has five convective heat exchanger stages, each composed of four separate legs. The data-driven method estimates the average conduction thermal resistance of individual heat exchanger legs and the flue gas temperature at the inlet to the convective pass. In addition, the soft sensor estimates the average fluid variables for individual legs throughout the convective pass and therefore provides information better suited for condition prognosis. The methods are tested using real plant measurements recorded during a period which contained load changes and on-load heat exchanger cleaning events.
The cleaning events provide some basis for validating the results, because the qualitative changes expected in some unmeasured monitored variables during these events are known. The relative changes detected by both methods are closely correlated. The data-driven method is computationally less expensive and easily implementable across different software platforms once the fault signatures have been obtained; the fault signatures are easily trainable once the model has been developed. The soft sensor requires the continuous use of the modelling software and will therefore be subject to licensing constraints. Both methods offer the possibility of enhancing the monitoring resolution of modern boilers without installing any additional instrumentation. Implementation of these monitoring frameworks can provide a simple and low-cost contribution to optimized boiler performance and reliability management.
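The fault-signature idea described in this abstract can be illustrated with a minimal sketch: assuming a signature matrix is available (the numbers below are invented stand-ins for patterns that would come from the calibrated one-dimensional thermofluid model), changes in unmeasured variables can be estimated from measurement deviations by least squares. All names and values are hypothetical, not taken from the thesis.

```python
import numpy as np

# Hypothetical illustration: each column of S is a "fault signature" -- the
# pattern of deviations in online measurements caused by a unit change in one
# unmeasured variable (e.g. the conduction resistance of one heat-exchanger
# leg). In the thesis the signatures come from the calibrated 1-D model.
S = np.array([
    [0.8, 0.1],
    [0.1, 0.9],
    [0.5, 0.4],
])  # 3 measurements x 2 unmeasured variables

def estimate_fault_magnitudes(S, residual):
    """Least-squares estimate of the changes in unmeasured variables from the
    deviation between measured and expected (clean-condition) values."""
    x, *_ = np.linalg.lstsq(S, residual, rcond=None)
    return x

# Simulate a deviation caused by a change of +2.0 in the first variable only.
true_change = np.array([2.0, 0.0])
residual = S @ true_change
estimate = estimate_fault_magnitudes(S, residual)
print(estimate)  # recovers the injected change in this noise-free case
```

With noisy plant measurements, the same least-squares step would return an approximate rather than exact estimate, which is consistent with the thesis's finding that the detected relative changes correlate with those of the soft sensor.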
132. SunDown: Model-driven Per-Panel Solar Anomaly Detection for Residential Arrays. Feng, Menghong. 15 July 2020.
There has been significant growth in both utility-scale and residential-scale solar installations in recent years, driven by rapid technology improvements and falling prices. Unlike utility-scale solar farms that are professionally managed and maintained, smaller residential-scale installations often lack sensing and instrumentation for performance monitoring and fault detection. As a result, faults may go undetected for long periods of time, resulting in generation and revenue losses for the homeowner. In this thesis, we present SunDown, a sensorless approach designed to detect per-panel faults in residential solar arrays. SunDown does not require any new sensors for its fault detection and instead uses a model-driven approach that leverages correlations between the power produced by adjacent panels to detect deviations from expected behavior. SunDown can handle concurrent faults in multiple panels and perform anomaly classification to determine probable causes. Using two years of solar generation data from a real home and a manually generated dataset of multiple solar faults, we show that our approach has a MAPE of 2.98% when predicting per-panel output. Our results also show that SunDown is able to detect and classify faults, including from snow cover, leaves and debris, and electrical failures with 99.13% accuracy, and can detect multiple concurrent faults with 97.2% accuracy.
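The neighbour-correlation idea can be sketched roughly as follows. This is an illustrative toy, not SunDown's actual model: the weights, thresholds, and panel values are all invented, and the real system additionally classifies anomaly causes.

```python
import numpy as np

# Sketch: predict one panel's output from its neighbours with a linear model
# fitted on healthy history, then flag large deviations as possible faults.
rng = np.random.default_rng(0)
neighbours = rng.uniform(50, 250, size=(200, 2))          # watts from 2 adjacent panels
panel = 0.6 * neighbours[:, 0] + 0.4 * neighbours[:, 1]   # healthy panel tracks neighbours

# Fit least-squares weights (with intercept) on the healthy history.
X = np.column_stack([neighbours, np.ones(len(neighbours))])
w, *_ = np.linalg.lstsq(X, panel, rcond=None)

def is_anomalous(neigh_now, panel_now, rel_tol=0.2):
    """Flag a reading that deviates from the neighbour-based prediction by
    more than rel_tol (an invented threshold)."""
    expected = np.dot(w[:2], neigh_now) + w[2]
    return abs(panel_now - expected) > rel_tol * expected

print(is_anomalous([100.0, 100.0], 100.0))  # healthy reading
print(is_anomalous([100.0, 100.0], 30.0))   # e.g. a snow-covered panel
```

In this toy the deviation test is a fixed relative threshold; the thesis's reported MAPE and accuracy figures come from its own, richer model.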
133. Physics-Guided Data-Driven Production Forecasting in Shales. Saputra, Wardana. 11 1900.
In the early 21st century, oil and gas production in the U.S. was conjectured to be in terminal, irreversible decline. But, thanks to the advancement of hydraulic fracturing technologies over the last decade, operators are now able to produce two-thirds of U.S. oil and gas output from almost impermeable shale formations. Despite the enormous success of the ‘shale revolution’, there are still debates about how long shale production will last and whether there will be enough to subsidize a meaningful transition to ‘greener’ power sources. Most official pronouncements of shale oil and gas reserves are based on purely empirical curve-fitting approaches or geological volumetric calculations that tend to largely overestimate the actual reserves. As an alternative to these industry-standard forecasting methods, we propose a more reliable, ‘transparent’, physics-guided and data-driven approach to estimating future production rates of oil and gas in shales. Our physics-based scaling method captures all essential physics of hydrocarbon production and hydrofracture geometry, yet it is as simple as the industry-favored Decline Curve Analysis (DCA), so that most engineers can adopt it. We also demonstrate that our method is as accurate as other analytical methods and has the same predictive power as commercial reservoir simulators, but with less data required and significantly faster computational time. To capture the uncertainties of play-wide production, we combine physical scaling with Generalized Extreme Value (GEV) statistics. So far, we have applied this method to nearly half a million wells from all major U.S. shale plays. Since the results of our analyses are not subject to bias, policy-makers ought not to assume that the shale production boom will last for centuries.
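For contrast with the physics-based scaling the thesis proposes, the industry-standard DCA baseline it mentions can be sketched for the simplest Arps case, exponential decline. The well data below is synthetic and the fit is a plain log-linear least squares; this is not the authors' method.

```python
import numpy as np

# DCA baseline sketch: Arps exponential decline q(t) = qi * exp(-d * t).
# On noise-free data the parameters are recovered by a log-linear fit.
t = np.arange(1, 61, dtype=float)        # 60 months on production
q_obs = 1000.0 * np.exp(-0.05 * t)       # toy well: qi = 1000, d = 0.05/month

slope, intercept = np.polyfit(t, np.log(q_obs), 1)
d_fit, qi_fit = -slope, np.exp(intercept)
print(qi_fit, d_fit)                     # recovers qi ~ 1000, d ~ 0.05

# For exponential decline, estimated ultimate recovery (EUR) is qi / d.
eur = qi_fit / d_fit
```

Extrapolating such fits far beyond the observed history is exactly where, per the abstract, purely empirical curve fitting tends to overestimate reserves, which motivates constraining the forecast with production physics.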
134. On the Characteristics of a Data-driven Multi-scale Frame Convergence Algorithm. Grunden, Beverly K. 01 June 2021.
No description available.
135. Lawful, reasonable and fair decision-making in disciplinary cases in secondary schools. Herselman, Lodewikus Stephanus. January 2014.
Section 16A(2)(d), (e) and (f) of the South African Schools Act, Act 84 of 1996 assumes that a school principal has specialised knowledge in interpreting legislation, dealing with disciplinary matters pertaining to learners, educators and support staff, and making disciplinary decisions. The legal framework of the Promotion of Administrative Justice Act, Act 3 of 2000, as well as section 33 of the Constitution of the Republic of South Africa, Act 108 of 1996, affects disciplinary decision making in education. The need to understand how legislation affects disciplinary decision making is important, because section 16A of the South African Schools Act, Act 84 of 1996 assumes that education managers have the requisite knowledge and understanding of the law when dealing with disciplinary decision making. Disciplinary decisions taken by education managers fall in the domain of administrative law. The Promotion of Administrative Justice Act, Act 3 of 2000, forms the foundation for administrative action that is lawful, reasonable and fair. Since this Act is relatively new, and education managers generally lack knowledge of education law, it can be argued that principals might struggle to take disciplinary decisions that are lawful, reasonable and fair. Thus, there is a need to answer the following question: What are the legal requirements that should be considered in taking disciplinary decisions that are lawful, reasonable and fair, and how can these disciplinary decisions be made more effectively?
The purpose of the study was to understand the context and content of Section 33 of the Constitution of the Republic of South Africa, Act 108 of 1996, the Promotion of Administrative Justice Act, Act 3 of 2000, and Section 16A of the Schools Act, Act 84 of 1996, and how they would positively influence disciplinary decision making in South African education. The main research question was: What are the legal requirements that should be considered in taking disciplinary decisions that are lawful, reasonable and fair, and how can these disciplinary decisions be made more effectively?
Chapter 2 answered the research question of which decision-making processes could assist the education manager to take disciplinary decisions that are lawful, reasonable and fair. It was established that principals make frequent use of the rational model for decision making. However, the more comprehensive data-driven decision-making model was proposed. This not only focuses on a single disciplinary decision, but on the cause and trends of all transgressions that exist in a school. This model enables a principal to draw up a plan of action to deal with the cause of the problem.
After analysing the applicable legal framework, the concepts of lawful, reasonable, and fair were defined and interpreted in Chapter 3. An administrative action is lawful when an administrator is duly authorised by law to exercise power. Reasonableness has two elements, namely rationality and proportionality. Rationality means that evidence and information should support the decision an administrator takes, while the purpose of proportionality is to avoid an imbalance between the adverse and beneficial effects of a decision. The approach to fairness has changed since the pre-democratic era. The main components that are linked to procedural fairness are the common-law principles of audi alteram partem and nemo iudex in sua causa.
The qualitative approach was followed in this study to shed light on the perceptions of the participants on the meaning of the legal concepts of lawful, reasonable, and fair in disciplinary decision making, and their understanding of the legal framework of this study. Furthermore, this study sought answers to which decision-making processes could assist the education manager, as well as to the advantages of having a disciplinary coordinator to assist education managers in making lawful, reasonable and fair disciplinary decisions.
Convenience and purposive sampling were used. Four secondary school principals in Cape Town were chosen because their schools were conveniently located, as well as two officials from the Western Cape Department of Education. The sampling was also purposive in that two of the four selected schools had to have a discipline coordinator. Semi-structured interviews were held with the abovementioned principals and officials to answer the main research question.
The following information emerged from the semi-structured interviews and was incorporated in the data-driven decision-making model of school improvement. Some of the findings were: i. Animosity exists between some school principals and the Western Cape Education Department (WCED). There is a lack of communication between the WCED and principals, as well as a lack of training on disciplinary decision making.
ii. It was also established that principals made common mistakes related to the interpretation of legislation or applicable regulations.
iii. A good practice identified in the study is the keeping of a paper trail of all interventions by schools.
iv. Principals tend to use only the South African Schools Act as a legal framework for disciplinary decision making.
v. Principals need to focus on strategies to address the link between bad behaviour and poor academic performance.
vi. A discipline coordinator can assist the principal in maintaining discipline, investigating transgressions, organising disciplinary hearings, and in disciplinary decision making.
Decision making, lawfulness, reasonableness, and fairness were combined in this research to establish the legal requirements that should be considered in taking disciplinary decisions that are lawful, reasonable and fair, and how these disciplinary decisions can be made more effective for the sole purpose of school improvement. / Thesis (PhD)--University of Pretoria, 2014. / Education Management and Policy Studies
136. Data-driven persona development for a knowledge management system. Baldi, Annika. January 2021.
Generating personas based entirely on data has gained popularity. Personas describe the characteristics of a user group in a human-like format. This project presents the persona creation process from raw data to evaluated personas for Zapiens’ knowledge management system. The objective of the personas is to learn about customer behavior and aid in customer communication. In the described methodology, platform log data was clustered to group the users. The quantitative approach is thereby fast, updatable, and scalable. The analysis was split across two different features of the Zapiens platform: an attempt was made to create persona sets for both the training component and the chatbot component. The group characteristics were then enriched with data from user surveys. This approach proved successful only for the training analysis. The collected data is presented in a web-based persona template to make the personas easily accessible and sharable. The finished training persona set was evaluated using the Persona Perception Scale, and the results showed three personas of satisfying quality. The project aims to provide a complete overview of the data-driven persona development process.
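The quantitative step of such a pipeline, clustering per-user log features and treating each cluster as the skeleton of a persona, can be sketched as below. The feature names and values are invented for illustration; the thesis does not disclose Zapiens' actual features or clustering algorithm, so a minimal k-means stands in here.

```python
import numpy as np

# Invented per-user log features:
# columns: sessions_per_week, avg_session_minutes, chatbot_queries_per_week
logs = np.array([
    [1, 5, 0], [2, 4, 1], [1, 6, 0],        # light, occasional users
    [10, 30, 8], [12, 25, 9], [11, 28, 7],  # heavy, daily users
], dtype=float)

def kmeans(X, k, iters=20):
    """Minimal Lloyd's k-means with a deterministic far-apart init (k=2)."""
    first = X[0]
    farthest = X[np.argmax(((X - first) ** 2).sum(axis=1))]
    centroids = np.stack([first, farthest]) if k == 2 else X[:k].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = X[labels == c].mean(axis=0)
    return labels, centroids

labels, centroids = kmeans(logs, 2)
# Each centroid summarizes one candidate persona's behaviour; survey data
# would then be layered on top, as the project describes.
for c, centre in enumerate(centroids):
    print(f"candidate persona {c}: centroid {centre.round(1)}")
```

Because the clusters are recomputed from logs, the persona set stays updatable as new usage data arrives, which is the scalability advantage the abstract claims for the quantitative approach.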
137. Data-Driven Decision Support Systems for Product Development - A Data Exploration Study Using Machine Learning. Aeddula, Omsri. January 2021.
Modern product development is a complex chain of events and decisions. The ongoing digital transformation of society and increasing demand for innovative solutions put pressure on organizations to maintain or increase competitiveness. As a consequence, a major challenge in product development is the search for information, its analysis, and the building of knowledge. This is even more challenging when the design element comprises a complex structural hierarchy and limited data generation capabilities, and more pronounced still in the conceptual stage of product development, where information is scarce, vague, and potentially conflicting. The ability to explore high-level useful information using a machine learning approach in the conceptual design stage would hence be of importance in supporting design decision-makers, since the decisions made at this stage impact the success of the overall product development process. The thesis aims to investigate the conceptual stage of product development, proposing methods and tools to support the decision-making process through the building of data-driven decision support systems. The study highlights how data can be utilized and visualized to extract useful information in design exploration studies at the conceptual stage of product development. The ability to build data-driven decision support systems in the early phases facilitates more informed decisions. The thesis presents initial descriptive study findings from the empirical studies, showing the capabilities of machine learning approaches in extracting useful information and building data-driven decision support systems. The thesis initially describes how a linear regression model and artificial neural networks extract useful information in design exploration, providing support for decision-makers to understand the consequences of design choices through cause-and-effect relationships at a detailed level.
Furthermore, the presented approach also provides input to a novel visualization construct intended to enhance comprehensibility within cross-functional design teams. The thesis further studies how the data can be augmented and analyzed to extract the necessary information from an existing design element to support the decision-making process in an oral healthcare context.
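The simplest of the tools mentioned above, a linear regression linking design parameters to a response, can be sketched as follows. The parameter names and the toy relationship are invented; the point is only that the fitted coefficients expose cause-and-effect magnitudes that a design team can read directly.

```python
import numpy as np

# Hypothetical design-exploration data: two design parameters and a response.
rng = np.random.default_rng(0)
wall_thickness = rng.uniform(1.0, 3.0, 50)      # invented design parameter
material_density = rng.uniform(0.5, 1.5, 50)    # invented design parameter
stiffness = 4.0 * wall_thickness + 2.0 * material_density  # toy ground truth

# Least-squares fit with an intercept column.
X = np.column_stack([wall_thickness, material_density, np.ones(50)])
coef, *_ = np.linalg.lstsq(X, stiffness, rcond=None)
# coef[0] and coef[1] quantify the effect of each design choice on the
# response -- the cause-and-effect information handed to the decision-maker.
print(coef)
```

Where the response surface is nonlinear, the thesis turns to artificial neural networks for the same purpose; the regression case simply makes the cause-and-effect reading most transparent.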
138. Efficient simulation tools for real-time monitoring and control using model order reduction and data-driven techniques. Quaranta, Giacomo. 02 September 2019.
Numerical simulation, the use of computers to run a program which implements a mathematical model of a physical system, is an important part of today's technological world. It is required in many scientific and engineering fields to study the behavior of systems whose mathematical models are too complex to provide analytical solutions, and it makes virtual evaluation of system responses possible (virtual twins). This drastically reduces the number of experimental tests needed for accurate design of the real system that the numerical model represents. However, these virtual twins, based on classical methods which make use of a rich representation of the system (e.g. the finite element method), rarely allow real-time feedback, even when considering high-performance computing operating on powerful platforms. In these circumstances, the real-time performance required in some applications is compromised. Indeed, the virtual twins are static: they are used in the design of complex systems and their components, but they are not expected to accommodate or assimilate data so as to define dynamic data-driven application systems. Moreover, significant deviations between the observed response and the one predicted by the model are usually noticed, due to inaccuracy in the employed models, in the determination of the model parameters, or in their time evolution. In this thesis we propose different methods to overcome these handicaps in order to perform real-time monitoring and control. 
In the first part, Model Order Reduction (MOR) techniques are used to accommodate real-time constraints; they compute a good approximation of the solution by simplifying the solution procedure instead of the model. The accuracy of the predicted solution is not compromised, and efficient simulations can be performed (digital twins). In the second part, data-driven modeling is employed to fill the gap between the parametric solution, computed using non-intrusive MOR techniques, and the measured fields, in order to make dynamic data-driven application systems possible (hybrid twins).
139. Using Available Archival and Secondary Data to Drive Cutting Edge Research. Duncan, James M., PhD, CFLE, DAV; Ferraro, Anthony J., PhD; Pippert, Hilary, MS; Reed-Fitzke, Kayla, PhD. 04 April 2020.
This presentation will cover primary data collection techniques that use archival data to identify participants, and how to leverage existing datasets to conduct secondary data analyses. New professionals and students can often find it difficult to access data, or may be unaware of the pros and cons of either research technique. Data from three different studies will be presented, including Long Term Care in Arkansas, Co-Parenting Across Households, and Identifying At-Risk Early Career Servicemembers. The results discussed will provide detailed comparisons of collected samples to target populations. The presentation aims to assist students and new professionals in better understanding data-driven research and to provide tools for future use of both secondary and primary data.
140. Omnichannel path to purchase: Viability of Bayesian Network as Market Attribution Models. Dikshit, Anubhav. January 2020.
Market attribution is the problem of interpreting the influence of advertisements on the user's decision process. Market attribution is a hard problem, and it happens to be a significant source of Google's revenue. There are broadly two types of attribution models: data-driven and heuristic. This thesis focuses on the data-driven attribution model, explores the viability of using a Bayesian Network as a market attribution model, and benchmarks its performance against a logistic regression. The data used in this thesis was preprocessed using an undersampling technique. Furthermore, multiple techniques and algorithms to learn and train Bayesian Networks are explored and evaluated. For the given dataset, it was found that a Bayesian Network can be used for market attribution modeling and that its performance is better than the baseline logistic model. Keywords: Market Attribution Model, Bayesian Network, Logistic Regression.
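The undersampling preprocessing step mentioned in this abstract can be sketched as below. The features, class ratio, and function name are invented for illustration; the thesis's own pipeline then trains the Bayesian Network and the logistic baseline on the balanced data.

```python
import numpy as np

def undersample(X, y, seed=0):
    """Randomly drop majority-class rows until all classes are equal-sized,
    balancing rare conversions against the mass of non-converting journeys."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    return X[keep], y[keep]

# Toy journey data: 3 invented touchpoint features, ~5% conversions.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
y = (rng.random(1000) < 0.05).astype(int)

Xb, yb = undersample(X, y)
print(np.bincount(yb))  # both classes now equal in size
```

Balancing matters here because with a ~5% conversion rate an attribution classifier can reach high accuracy by always predicting "no conversion", which would make the benchmark against the logistic baseline meaningless.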