1 |
Artificial neural networks for fault diagnosis, modelling and control of diesel engines. Mesbahi, Ehsan. January 2000.
No description available.
|
2 |
Algorithms and architectures for real-time control of water treatment plant. Böhme, Thomas Jürgen. January 2000.
No description available.
|
3 |
A Sub-Grouping Methodology and Non-Parametric Sequential Ratio Test for Signal Validation. Yu, Chenggang. 11 June 2002.
No description available.
|
4 |
Sensor Validation Using Linear Parametric Models, Artificial Neural Networks and CUSUM / Sensorvalidering medelst linjära konfektionsmodeller, artificiella neurala nätverk och CUSUM. Norman, Gustaf. January 2015.
Siemens gas turbines are monitored and controlled by a large number of sensors and actuators. Process information is stored in a database and used for offline calculations and analyses. Before the sensor readings are stored, a compression algorithm checks the signal and discards values that show no significant change; compression of 90 % is not unusual. Since data from the database are used for analyses, and decisions are made based on the results of those analyses, it is important to have a system for validating the data in the database: decisions made on false information can result in large economic losses. When this project was initiated, no sensor validation system was available. In this thesis the uncertainties in the measurement chains are examined, methods for fault detection are investigated, and the most promising methods are put to the test. Linear relationships between redundant sensors are derived, and the residuals form an influence structure that allows the faulty sensor to be isolated. Where redundant sensors are not available, a gas turbine model is used to establish the input-output relationships so that estimates of the sensor outputs can be formed. Linear parametric models and an ANN (Artificial Neural Network) are developed to produce the estimates. Two techniques for the linear parametric models are evaluated: prediction and simulation. The residuals are also evaluated in two ways: direct evaluation against a threshold, and evaluation with the CUSUM (CUmulative SUM) algorithm. The results show that sensor validation using compressed data is feasible; faults as small as 1 % of the measuring range can be detected in many cases.
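The abstract names the CUSUM algorithm as one of the two ways the residuals are evaluated but gives no implementation detail. The following is a minimal sketch of a generic two-sided tabular CUSUM in Python, not the thesis code; the drift allowance, decision threshold, and the simulated fault are illustrative assumptions.

```python
import numpy as np

def cusum_alarm(residuals, drift=0.5, threshold=5.0):
    """Two-sided tabular CUSUM over a residual sequence.

    residuals : model-minus-measurement errors, ideally near zero for a
                healthy sensor (assumed already normalised).
    drift     : allowance subtracted at each sample so small noise does
                not accumulate.
    threshold : decision limit; an alarm is raised when either the
                positive or negative cumulative sum exceeds it.
    Returns the index of the first alarm, or None if no alarm is raised.
    """
    g_pos, g_neg = 0.0, 0.0
    for i, r in enumerate(residuals):
        g_pos = max(0.0, g_pos + r - drift)   # accumulate positive deviations
        g_neg = max(0.0, g_neg - r - drift)   # accumulate negative deviations
        if g_pos > threshold or g_neg > threshold:
            return i                          # first sample at which a fault is flagged
    return None

# Illustrative use: a healthy sensor for 200 samples, then a constant bias fault
rng = np.random.default_rng(0)
residuals = rng.normal(0.0, 0.2, 400)
residuals[200:] += 1.0                        # simulated sensor fault
print(cusum_alarm(residuals))                 # alarm shortly after sample 200
```

A direct threshold on each residual would need the bias to exceed the noise level in a single sample, whereas the cumulative sums let a smaller, persistent deviation build up until it crosses the decision limit.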
|
5 |
LSTM Networks for Detection and Classification of Anomalies in Raw Sensor Data. Verner, Alexander. 01 January 2019.
In order to ensure the validity of sensor data, the data must be thoroughly analyzed for various types of anomalies. Traditional machine learning methods for anomaly detection in sensor data are based on domain-specific feature engineering. A typical approach is to use domain knowledge to analyze the sensor data and manually create statistics-based features, which are then used to train machine learning models to detect and classify the anomalies. Although this methodology is used in practice, it has a significant drawback: feature extraction is usually labor-intensive and requires considerable effort from domain experts.
An alternative approach is to use deep learning algorithms. Research has shown that modern deep neural networks are very effective at automatically extracting abstract features from raw data in classification tasks. Long short-term memory networks, or LSTMs for short, are a special kind of recurrent neural network capable of learning long-term dependencies. These networks have proved especially effective for classifying raw time-series data in various domains. This dissertation systematically investigates the effectiveness of the LSTM model for anomaly detection and classification in raw time-series sensor data.
As a proof of concept, this work used time-series data from sensors that measure blood glucose levels. A large number of time-series sequences were created based on a genuine medical diabetes dataset. Anomalous series were constructed by six methods that interspersed patterns of common anomaly types in the data. An LSTM network was trained with k-fold cross-validation on both anomalous and valid series to classify raw time-series sequences into one of seven classes: non-anomalous, plus one class for each of the six anomaly types.
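The abstract does not describe the network architecture. Purely as an illustration, the sketch below is a minimal Keras LSTM classifier for raw univariate sequences; the sequence length, layer width, training settings, and the random placeholder data standing in for the diabetes dataset are assumptions, not details from the dissertation.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical shapes: 1000 sequences of 288 glucose readings each,
# labelled 0 (non-anomalous) or 1-6 (one of six anomaly types).
SEQ_LEN, N_CLASSES = 288, 7
x = np.random.rand(1000, SEQ_LEN, 1).astype("float32")
y = np.random.randint(0, N_CLASSES, size=1000)

model = tf.keras.Sequential([
    layers.Input(shape=(SEQ_LEN, 1)),           # raw univariate time series, no hand-made features
    layers.LSTM(64),                            # final hidden state summarises the whole sequence
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=32, validation_split=0.2)
```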
As a control, the detection and classification accuracy of the LSTM was compared with that of four traditional machine learning classifiers: support vector machines, random forests, naive Bayes, and shallow neural networks. The performance of all classifiers was evaluated on nine metrics: precision, recall, and the F1-score, each computed with micro, macro, and weighted averaging.
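For reference, the nine reported metrics can be reproduced with scikit-learn's averaging options; the labels below are toy values and this is not the dissertation's evaluation code.

```python
from sklearn.metrics import precision_recall_fscore_support

# y_true and y_pred are integer class labels (0 = non-anomalous, 1-6 = anomaly types)
y_true = [0, 0, 1, 2, 3, 4, 5, 6, 6]
y_pred = [0, 1, 1, 2, 3, 4, 5, 6, 0]

for avg in ("micro", "macro", "weighted"):
    p, r, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average=avg, zero_division=0)
    print(f"{avg:>8}: precision={p:.2f} recall={r:.2f} f1={f1:.2f}")
```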
While the traditional models were trained on feature vectors derived from the raw data using knowledge of common anomaly sources, the LSTM was trained directly on the raw time series. Experimental results indicate that the performance of the LSTM was comparable to that of the best traditional classifiers, achieving 99 % in all nine metrics. The model requires no labor-intensive feature engineering, and the fine-tuning of its architecture and hyper-parameters can be done in a fully automated way. This study therefore finds LSTM networks to be an effective solution for anomaly detection and classification in sensor data.
|
6 |
Development and Integration of Hardware and Software for Active-Sensors in Structural Health Monitoring. Overly, Timothy G. S. 03 July 2007.
No description available.
|
7 |
Capturing Three-Dimensional Clavicle Kinematics During Arm Elevation: Describing the Contribution of Clavicle Motion and Associated Scapulothoracic Muscle Activation to Total Shoulder Complex Motion. Szucs, Kimberly A. 02 September 2010.
No description available.
|
8 |
Desenvolvimento e validação de sistemas de monitoramento de baixo custo de temperatura e umidade relativa do ar / Development and Validation of Low-Cost Monitoring Systems for Air Temperature and Relative Humidity. Rêgo, Márlison de Sá. 11 January 2016.
With a view to optimizing the planning, handling and management of water resources in the national context, hydrometeorological monitoring systems have been undergoing a process of technological innovation. The increased monitoring potential brought by lower costs and the improved quality of the monitored data motivate this field of research. Given the importance of air temperature and relative humidity among the monitored variables, this work proposes the development of three hardware and software systems, based on an open platform, that measure these quantities accurately using three low-cost sensors, RHT01, RHT02 and RHT03, available on the market since 2005. These sensors belong to recently developed lines of polymer and semiconductor-oxide sensors. The technological base of each system is the Arduino Rev. 3 microcontroller, which manages an electronic circuit capable of collecting the hydrometeorological variables and storing the data on a flash memory device (SD card). To verify operational reliability, each system was then subjected to three data-collection tests at three different sites, operating simultaneously with the conventional systems used by the Micrometeorology Laboratory of UFSM (LMMET/UFSM). The test sites are located in the municipality of Santa Maria, in the state of Rio Grande do Sul, Brazil, and microclimate conditions differed among the tests because of the locations. The first test was carried out on the terrace of the INPE-UFSM facilities, the second in the Micrometeorology Laboratory of UFSM, and the third in a rural area of the UFSM central campus, characterized by the Pampa biome, where a monitoring tower of the micro-sulfux network is located. The collections took place from April to June, covering the final stage of autumn and the beginning of winter in 2015. The first test lasted eleven days, the second thirteen, and the third eleven. The temporal discretization of the collections was one minute. With the collected data, a comparative analysis of the low-cost and conventional systems was carried out using graphical analysis, descriptive statistics, and statistical tests such as the Pearson linear correlation, analysis of variance (ANOVA) and the Tukey test. With coefficients of determination (R²) above 0.90 in all tests and no significant difference between the low-cost sensors analyzed and the conventional systems, with only a single exception, the results indicated the feasibility of using these sensors to generate mean air temperature and relative humidity data at generation intervals of 1 minute, 15 minutes, 30 minutes and 1 hour. The results also indicated the possibility of further quantitative research into the influence of the sensor shelter on the measured data.
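The comparative statistics named in the abstract (Pearson correlation and R², ANOVA, and the Tukey test) can be computed with standard Python libraries. The sketch below uses synthetic placeholder series rather than the thesis data, and the two-group comparison is only illustrative of the procedure.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical 1-minute temperature series from a low-cost RHT sensor and
# the reference station over the same day (synthetic placeholder data).
rng = np.random.default_rng(1)
reference = 20 + 5 * np.sin(np.linspace(0, 2 * np.pi, 1440))
low_cost = reference + rng.normal(0.0, 0.3, reference.size)

r, _ = stats.pearsonr(low_cost, reference)
print(f"Pearson r = {r:.3f}, R^2 = {r**2:.3f}")   # the thesis reports R^2 > 0.90

# One-way ANOVA followed by Tukey's HSD over the grouped observations
values = np.concatenate([low_cost, reference])
groups = ["low_cost"] * low_cost.size + ["reference"] * reference.size
print(stats.f_oneway(low_cost, reference))
print(pairwise_tukeyhsd(values, groups))
```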
|