71

Estudo e implementação de métodos de validação de modelos matemáticos aplicados no desenvolvimento de sistemas de controle de processos industriais. / Research and implementation of mathematical model validation methods applied in the development of industrial process control systems.

Alvarado, Christiam Segundo Morales 22 June 2017 (has links)
Validation of linear models is an important step in a system identification project: correctly choosing the model that represents most of the process dynamics, from a finite set of identification techniques and around an operating point, is what makes the development of predictive and robust controllers successful. For this reason, the main objective of this thesis is the development of a linear model validation method whose assessment tools are statistical methods, dynamic evaluations, and analysis of model robustness. The main component of the proposed linear model validation system is a fuzzy system that analyzes the results produced by the tools used in the validation stage. The system identification project is based on real operating data from a pH neutralization pilot plant located in the Industrial Process Control Laboratory of the Escola Politécnica of the University of São Paulo, Brazil. To verify the validation results, every model is tested in a QDMC (Quadratic Dynamic Matrix Control) predictive controller following a reference trajectory. The criteria used to evaluate the QDMC controller's performance for each model were the controller's response speed and the minimum-variance index of the process variable. The results show that the reliability of the designed validation system, measured against the performance indices obtained by the QDMC controller, was 85.71% for loops with low nonlinearity and 50% for loops with high nonlinearity in a real process.
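As an aside, one common statistical ingredient in this kind of validation step is a normalized fit index comparing measured plant output with the identified model's simulated output. The sketch below is not taken from the thesis; the data and the percent-scaled FIT metric are illustrative assumptions:

```python
import numpy as np

def fit_index(y_measured, y_model):
    """Normalized fit in percent (100% = perfect match), a standard
    score for comparing a model's simulated output with plant data."""
    num = np.linalg.norm(y_measured - y_model)
    den = np.linalg.norm(y_measured - np.mean(y_measured))
    return 100.0 * (1.0 - num / den)

# Illustrative data: step responses of a "true" first-order process
# and of an identified model with a slightly wrong time constant.
t = np.arange(200)
y_true = 1.0 - np.exp(-t / 25.0)
y_hat = 1.0 - np.exp(-t / 28.0)
print(f"FIT = {fit_index(y_true, y_hat):.1f}%")
```

The thesis goes further, feeding several such statistical, dynamic, and robustness indicators into a fuzzy decision system rather than relying on any single score.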
72

Confidence-based model validation for reliability assessment and its integration with reliability-based design optimization

Moon, Min-Yeong 01 August 2017 (has links)
Conventional reliability analysis methods assume that a simulation model is able to represent the real physics accurately. However, this assumption may not always hold, as the simulation model could be biased due to simplifications and idealizations. Simulation models are approximate mathematical representations of real-world systems and thus cannot exactly imitate them. The accuracy of a simulation model is especially critical when it is used for reliability calculation. Therefore, a simulation model should be validated using prototype testing results for reliability analysis. However, in practical engineering situations, experimental output data for model validation are limited due to the significant cost of large numbers of physical tests. Thus, the model validation needs to be carried out to account for the uncertainty induced by insufficient experimental output data as well as the inherent variability existing in the physical system, and hence in the experimental test results. Therefore, in this study, a confidence-based model validation method has been developed that captures the variability and the uncertainty, and that corrects model bias at a user-specified target confidence level. Reliability assessment using confidence-based model validation can provide a conservative estimate of the reliability of a system, with confidence, when only insufficient experimental output data are available. Without confidence-based model validation, a design obtained using the conventional reliability-based design optimization (RBDO) process could either fail to satisfy the target reliability or be overly conservative. Therefore, simulation model validation is necessary to obtain a reliable optimum product using the RBDO process. In this study, the developed confidence-based model validation is integrated into the RBDO process to provide a truly confident RBDO optimum design at the target confidence level. However, it is challenging to obtain steady convergence in the RBDO process with confidence-based model validation because the feasible domain changes as the design moves (i.e., a moving-target problem). To resolve this issue, a practical optimization procedure is proposed that terminates the RBDO process once the target reliability is satisfied. In addition, efficiency is achieved by carrying out deterministic design optimization (DDO) and RBDO without model validation, followed by RBDO with confidence-based model validation. Numerical examples demonstrate that the proposed RBDO approach obtains a conservative and practical optimum design that satisfies the target reliability of the designed product given a limited number of experimental output data. Thus far, while the simulation model might be biased, it has been assumed that the distribution models for input variables and parameters are correct. However, in real applications, only limited test data are available (parameter uncertainty) for modeling the input distributions of material properties, manufacturing tolerances, operational loads, etc. Also, as before, only a limited number of output test data is available. Therefore, reliability needs to be estimated by considering parameter uncertainty as well as the biased simulation model. Computational methods and a process are developed to obtain confidence-based reliability assessment.
The insufficient input and output test data induce uncertainties in the input distribution models and output distributions, respectively. These uncertainties, which arise from lack of knowledge (the insufficient test data), are different from the inherent input distributions and corresponding output variabilities, which are the natural randomness of the physical system.
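To illustrate the flavor of "conservative estimation with confidence" from limited output data, here is a minimal sketch using a Beta posterior on the failure probability. This is a generic textbook construction, not the dissertation's method, and all numbers are invented:

```python
from scipy import stats

def conservative_reliability(n_tests, n_failures, confidence=0.95):
    """Lower confidence bound on reliability from limited test data,
    using a Beta(1, 1) (uniform) prior on the failure probability:
    the fewer the tests, the more conservative the estimate."""
    posterior = stats.beta(1 + n_failures, 1 + n_tests - n_failures)
    p_upper = posterior.ppf(confidence)  # confidence-level bound on failure prob.
    return 1.0 - p_upper

print(conservative_reliability(n_tests=10, n_failures=0))    # scarce data: ~0.76
print(conservative_reliability(n_tests=1000, n_failures=0))  # more data: ~0.997
```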
73

Enabling Timing Analysis of Complex Embedded Software Systems

Kraft, Johan January 2010 (has links)
Cars, trains, trucks, telecom networks and industrial robots are examples of products relying on complex embedded software systems, running on embedded computers. Such systems may consist of millions of lines of program code developed by hundreds of engineers over many years, often decades. Over the long life-cycle of such systems, the main part of the product development cost is typically not the initial development but the software maintenance, i.e., improvements and corrections of defects, over the years. Of the maintenance costs, a major part is the verification of the system after changes have been applied, which often requires a huge amount of testing. However, today's techniques are not sufficient, as defects are often found post-release, by customers. This area is therefore of high relevance for industry. Complex embedded systems often control machinery where timing is crucial for accuracy and safety. Such systems therefore have important timing requirements, such as maximum response times. However, when maintaining complex embedded software systems, it is difficult to predict how changes may impact the system's run-time behavior and timing, e.g., response times. Analytical and formal methods for timing analysis exist, but are often hard to apply in practice to complex embedded systems, for several reasons. As a result, the industrial practice in deciding the suitability of a proposed change, with respect to its run-time impact, is to rely on the subjective judgment of experienced developers and architects. This is a risky and inefficient trial-and-error approach, which may waste large amounts of person-hours on implementing unsuitable software designs with potential timing or performance problems. Such problems can generally not be detected until late stages of testing, when the updated software system can be tested at system level, under realistic conditions. Even then, it is easy to miss them. If products are released containing software with latent timing errors, this may cause huge costs, such as car recalls, or even accidents. Even when such problems are found through testing, they necessitate design changes late in the development project, which causes delays and increases costs. This thesis presents an approach for impact analysis with respect to run-time behavior, such as timing and performance, for complex embedded systems. The impact analysis is performed through optimizing simulation, where the simulation models are automatically generated from the system implementation. This approach allows the consequences of proposed designs, for new or modified features, to be predicted by prototyping the change in the simulation model at a high level of abstraction, e.g., by increasing the execution time of a particular task. Thereby, designs leading to timing, performance, or resource-usage problems can be identified early, before implementation, and late redesigns are avoided, which improves development efficiency and predictability as well as software quality. The contributions presented in this thesis are within four areas related to simulation-based analysis of complex embedded systems: (1) simulation and simulation optimization techniques, (2) automated extraction of simulation models from source code, (3) methods for validation of such simulation models, and (4) run-time recording techniques for model extraction, impact analysis and model validation purposes.
Several tools have been developed during this work, two of which are being commercialized by the spin-off company Percepio AB. Note that the Katana approach, in area (2), is the subject of a recent patent application (patent pending).
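For a sense of what "impact analysis with respect to timing" means in miniature, the sketch below uses the classical fixed-priority response-time recurrence. Note that the thesis argues precisely that such analytical methods are hard to apply to systems of this complexity, which motivates its simulation-based approach; the task parameters here are invented:

```python
import math

def response_time(C, T, i, max_iter=100):
    """Worst-case response-time recurrence for fixed-priority,
    preemptive tasks (index 0 = highest priority):
        R_i = C_i + sum_{j<i} ceil(R_i / T_j) * C_j
    Returns None if the response time exceeds the task's period."""
    R = C[i]
    for _ in range(max_iter):
        R_next = C[i] + sum(math.ceil(R / T[j]) * C[j] for j in range(i))
        if R_next == R:
            return R
        if R_next > T[i]:
            return None
        R = R_next
    return None

C = [1, 2, 3]        # execution times (ms)
T = [5, 10, 20]      # periods (ms)
print([response_time(C, T, i) for i in range(3)])   # baseline response times
C[1] += 2            # prototype a proposed change: task 1 gets heavier
print([response_time(C, T, i) for i in range(3)])   # impact on response times
```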
74

Model Validation and Discovery for Complex Stochastic Systems

Jha, Sumit Kumar 02 July 2010 (has links)
In this thesis, we study two fundamental problems that arise in the modeling of stochastic systems: (i) Validation of stochastic models against behavioral specifications such as temporal logics, and (ii) Discovery of kinetic parameters of stochastic biochemical models from behavioral specifications. We present a new Bayesian algorithm for Statistical Model Checking of stochastic systems based on a sequential version of Jeffreys’ Bayes Factor test. We argue that the Bayesian approach is more suited for application domains like systems biology modeling, where distributions on nuisance parameters and priors may be known. We prove that our Bayesian Statistical Model Checking algorithm terminates for a large subclass of prior probabilities. We also characterize the Type I/II errors associated with our algorithm. We experimentally demonstrate that this algorithm is suitable for the analysis of complex biochemical models like those written in the BioNetGen language. We then argue that i.i.d. sampling based Statistical Model Checking algorithms are not an effective way to study rare behaviors of stochastic models and present another Bayesian Statistical Model Checking algorithm that can incorporate non-i.i.d. sampling strategies. We also present algorithms for synthesis of chemical kinetic parameters of stochastic biochemical models from high level behavioral specifications. We consider the setting where a modeler knows facts that must hold on the stochastic model but is not confident about some of the kinetic parameters in her model. We suggest algorithms for discovering these kinetic parameters from facts stated in appropriate formal probabilistic specification languages. Our algorithms are based on our theoretical results characterizing the probability of a specification being true on a stochastic biochemical model. We have applied this algorithm to discover kinetic parameters for biochemical models with as many as six unknown parameters.
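A minimal sketch of the sequential Bayes-factor idea follows; it tests H0: p >= theta for the probability p that a property holds on a sampled trace, with a Beta prior. The hypotheses, prior, threshold, and toy model here are illustrative assumptions and may differ from the algorithm as developed in the thesis:

```python
from scipy import stats
import random

def bayes_factor(successes, n, theta, a=1.0, b=1.0):
    """Bayes factor for H0: p >= theta vs H1: p < theta, given
    `successes` property-satisfying traces out of n, with a
    Beta(a, b) prior on the unknown probability p."""
    prior_h0 = 1.0 - stats.beta(a, b).cdf(theta)
    post_h0 = 1.0 - stats.beta(a + successes, b + n - successes).cdf(theta)
    if post_h0 >= 1.0:
        return float("inf")
    prior_odds = prior_h0 / (1.0 - prior_h0)
    post_odds = post_h0 / (1.0 - post_h0)
    return post_odds / prior_odds

def sequential_check(sample_trace, theta=0.9, threshold=100.0, max_n=10000):
    """Sample i.i.d. traces until the Bayes factor is decisive."""
    successes = 0
    for n in range(1, max_n + 1):
        successes += sample_trace()
        B = bayes_factor(successes, n, theta)
        if B > threshold:
            return ("accept H0", n)
        if B < 1.0 / threshold:
            return ("reject H0", n)
    return ("undecided", max_n)

# Toy stochastic model: the property holds on ~95% of sampled traces.
random.seed(0)
print(sequential_check(lambda: random.random() < 0.95))
```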
75

Stochastic Modelling of Random Variables with an Application in Financial Risk Management.

Moldovan, Max January 2003 (has links)
The problem of determining whether or not a theoretical model is an accurate representation of an empirically observed phenomenon is one of the most challenging in empirical scientific investigation. The following study explores the problem of stochastic model validation. Special attention is devoted to the unusual two-peaked shape of the empirically observed distributions of financial returns conditional on realised volatility. The application of statistical hypothesis testing and simulation techniques leads to the conclusion that returns conditional on realised volatility follow a specific, previously undocumented distribution. The probability density that represents this distribution is derived, characterised and applied to the validation of the financial model.
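The generic pattern of statistical validation of a distributional model can be illustrated with a Kolmogorov-Smirnov test. This sketch uses synthetic heavy-tailed data standing in for returns, not the thesis's actual data or its two-peaked density:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = rng.standard_t(df=4, size=2000)   # heavy-tailed stand-in for returns

# Validate the (deliberately wrong) hypothesis of standard normality.
ks_stat, p_value = stats.kstest(data, "norm")
print(f"KS statistic = {ks_stat:.4f}, p = {p_value:.4g}")
# A small p-value rejects the model: the proposed distribution is not
# an accurate representation of the observed sample.
```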
76

An assessment of recent changes in catchment sediment sources and sinks, central Queensland, Australia

Hughes, Andrew Owen, Physical, Environmental & Mathematical Sciences, Australian Defence Force Academy, UNSW January 2009 (has links)
Spatial and temporal information on catchment sediment sources and sinks can provide an improved understanding of catchment response to human-induced disturbances. This is essential for the implementation of well-targeted catchment-management decisions. This thesis investigates the nature and timing of catchment response to human activities by examining changes in sediment sources and sinks in a dry-tropical subcatchment of the Great Barrier Reef (GBR) catchment area, in northeastern Australia. Changes in catchment sediment sources, both in terms of spatial provenance and erosion type, are determined using sediment tracing techniques. Results indicate that changes in sediment source contributions over the last 250 years can be linked directly to changes in catchment land use. Sheetwash and rill erosion from cultivated land (40-60%) and channel erosion from grazed areas (30-80%) currently contribute most sediment to the river system. Channel erosion, on a basin-wide scale, appears to be more important than previously considered in this region of Australia. Optically stimulated luminescence and 137Cs dating are used to determine pre- and post-European settlement (ca. 1850) alluvial sedimentation rates. The limitations of using 137Cs as a floodplain sediment dating tool in a low-fallout environment, dominated by sediment derived from channel and cultivation sources, are identified. Low-magnitude increases in post-disturbance floodplain sedimentation rates (3 to 4 times) are attributed to the naturally high sediment loads in the dry tropics. These low increases suggest that previous predictions reflecting order-of-magnitude increases in post-disturbance sediment yields are likely to be overestimates. In-channel bench deposits, formed since European settlement, are common features that appear to be important stores of recently eroded material. The spatially distributed erosion/sediment yield model SedNet is applied, with both generic input parameters and locally derived data. Outputs are evaluated against available empirically derived data. The results suggest that previous model estimates using generic input parameters overestimate post-disturbance and underestimate pre-disturbance sediment yields, exaggerating the impact of European catchment disturbance. This is likely to have important implications for both local-scale and catchment-wide management scenarios in the GBR region. Suggestions are made for future study and for the collection of important empirical data to enable more accurate model performance.
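The sediment tracing idea can be sketched as a small linear unmixing problem: given tracer signatures for candidate sources, estimate the source proportions that best reproduce the downstream mixture. All concentrations below are hypothetical, and real fingerprinting studies use many tracers, uncertainty analysis, and corrections omitted here:

```python
import numpy as np

# Hypothetical tracer signatures (three tracers) for two candidate
# sources and for a downstream sediment mixture.
sources = np.array([[12.0, 3.5, 40.0],   # source A: cultivated topsoil
                    [ 4.0, 1.2, 95.0]])  # source B: channel/subsoil
mixture = np.array([7.0, 2.1, 74.0])

# Solve mixture ~= f_A * A + f_B * B with f_A + f_B = 1 by appending
# the sum-to-one constraint as an extra least-squares row.
A = np.vstack([sources.T, np.ones(2)])
b = np.concatenate([mixture, [1.0]])
fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
print(dict(zip(["cultivated", "channel"], fractions.round(2))))
```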
77

Modeling and Simulation of Brake Squeal in Disc Brake Assembly / Modellering och simulering av bromsskrik i skivbromsar

Nilman, Jenny January 2018 (has links)
Brake squeal is an old and well-known problem in the vehicle industry and a frequent source of customer complaints. Although brake squeal does not usually affect the performance of the brakes, it is still important to address the problem and to predict the brakes' tendency to squeal at an early stage in the design process. Brake squeal is usually defined as a sustained, high-frequency vibration of the brake components due to the braking action. By using finite element (FE) simulation it should be possible to predict at what frequencies the brakes tend to emit sound. The method chosen for the analysis was the complex eigenvalue analysis (CEA) method, since it is a well-known tool for predicting unstable modes in FE analysis. The results from the CEA were evaluated against measured data from an earlier study. Even though four main mechanisms have been formulated to explain the onset of squeal, the main focus in this project was modal coupling, since it is the main mechanism in the CEA. A validation of the key components in the model was performed before the analysis, in order to achieve better correlation between the FE model and reality. A parametric study was conducted with the CEA to investigate how material properties and operating parameters affected the brakes' tendency to squeal. The following parameters were included in the analysis: coefficient of friction, brake force, damping, rotational velocity, and Young's modulus of the different components. The results from the CEA did not exactly reproduce the noise frequencies captured in experimental tests. The discrepancy is believed to be mainly due to problems in the calibration process of the components in the model. The results did, however, show that the most effective way to reduce the brakes' tendency to squeal was to lower the coefficient of friction. The effect of varying the Young's modulus of different components showed inconsistent results on the tendency to squeal. By adding damping, one of the main disadvantages of the CEA, the over-prediction of the number of unstable modes, was minimized.
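A toy version of the CEA mechanism: friction couples the degrees of freedom asymmetrically, and complex eigenvalues with positive real parts flag squeal-prone (unstable) modes. This two-degree-of-freedom model and its parameter values are illustrative assumptions, far simpler than the thesis's FE model:

```python
import numpy as np

# Two-DOF model in which friction makes the stiffness matrix
# asymmetric; CEA looks for eigenvalues with positive real parts,
# which indicate unstable (squeal-prone) modes via modal coupling.
m, k, kc, mu = 1.0, 100.0, 20.0, 0.6   # mass, stiffness, coupling, friction
M = np.diag([m, m])
K = np.array([[k,       -mu * kc],
              [mu * kc,  k      ]])    # asymmetry grows with friction mu
C = 0.0 * M                            # damping omitted in this sketch

# State-space form: x' = A x, with x = [displacements, velocities].
Z = np.zeros((2, 2))
A = np.block([[Z,                      np.eye(2)],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
for lam in np.linalg.eigvals(A):
    status = "UNSTABLE (squeal-prone)" if lam.real > 1e-9 else "stable"
    print(f"lambda = {lam.real:+.3f} {lam.imag:+.3f}j  ->  {status}")
```

Adding a physically motivated damping matrix C shifts these eigenvalues leftward, which is why the CEA's over-prediction of unstable modes shrinks when damping is modeled.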
79

HdSC: modelagem de alto nível para simulação nativa de plataformas com suporte ao desenvolvimento de HdS / HdSC: high-level modeling for native platform simulation with support for HdS development

Prado, Bruno Otávio Piedade 08 1900 (has links)
Recent advances in computer systems have enabled the rise of innovative devices, such as modern touch-sensitive cell phones and tablets. To properly manage their various interfaces, hardware-dependent software (HdS) is required, which is responsible for controlling and accessing these devices. Besides this complex arrangement of components, to meet the growing demand for more integrated features, the multiprocessing paradigm has been adopted to increase platform performance. Due to the system design gap, both industry and academia have been researching more efficient processes to build and simulate systems of this increasing complexity. The premise of state-of-the-art work is to use models at a high level of abstraction and precision that allow the designer to evaluate the system quickly, without having to rely on slow and complex models based on instruction set simulators (ISS). This work defines a set of constructors for modeling processor-based platforms, with support for HdS development and component reusability, techniques for static estimation of simulated time in a native simulation environment, and support for multiprocessor platforms. Experiments were carried out with I/O-intensive, compute-intensive, and multiprocessed applications, yielding an average performance speed-up of about 1,000 times and timing estimates with an average error below 3%, compared with a reference platform based on an ISS.
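The static timing-estimation idea behind native simulation can be sketched as follows: functional code runs at host speed while each block charges a statically estimated cycle cost to a simulated target clock. This Python stand-in, with invented cycle costs, only illustrates the concept; the thesis's constructors target actual platform models:

```python
# Conceptual sketch of timing annotation in native simulation: the
# functional code runs at host speed, while statically estimated
# cycle costs (invented numbers here) are charged to a simulated
# target clock, avoiding a slow instruction set simulator (ISS).
sim_cycles = 0

def charge(cycles):
    """Advance the simulated target clock by a static block estimate."""
    global sim_cycles
    sim_cycles += cycles

def checksum(data):
    charge(12)                # estimated cost: prologue and setup
    total = 0
    for byte in data:
        charge(7)             # estimated cost: one loop iteration
        total = (total + byte) & 0xFF
    charge(4)                 # estimated cost: epilogue and return
    return total

print(checksum(bytes(range(100))), "computed in", sim_cycles, "simulated cycles")
```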
80

Contribution à l’estimation robuste de modèles dynamiques : Application à la commande de systèmes dynamiques complexes. / Contribution to the robust estimation of dynamic models: application to the control of complex dynamic systems.

Corbier, Christophe 29 November 2012 (has links)
Identification of complex dynamic systems remains a concern when the prediction errors contain innovation outliers. These outliers degrade the estimated model if the estimation criterion is badly chosen and poorly adapted: they contaminate the distribution of the errors, which then exhibits heavy tails and departs from the normal distribution. To address this problem, there is a class of so-called robust estimators, less sensitive to outliers, which handle the transition between residuals of very different magnitudes more smoothly. Huber's M-estimators belong to this class. They are associated with a mixture of the L2 and L1 norms, linked to a perturbed Gaussian distribution model known as the gross error model. Within this formal framework, this thesis proposes a set of tools for the estimation and validation of black-box linear and pseudo-linear parametric models, with an extension of the noise interval to small values of the tuning constant of the Huber norm. We present the convergence properties of the estimation criterion and of the robust estimator. We show that extending the noise interval reduces the sensitivity of the estimator's bias and improves robustness to leverage points. For one type of pseudo-linear model, a new framework called L-FTE is presented, together with a new method for determining L, in order to derive linearizations of the gradient and Hessian of the estimation criterion, as well as of the asymptotic covariance matrix of the estimator.
From these relations, a robust version of the FPE validation criterion is established, and we propose a new decision-support tool for choosing the estimated model. Experiments on simulated and real processes are presented and analyzed.
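The Huber M-estimation idea at the core of this work is easy to demonstrate on a contaminated regression problem. The sketch below compares ordinary least squares with a Huber loss, here via scipy's generic robust least-squares, with an invented dataset and tuning constant, not the thesis's L-FTE framework or extended noise interval:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 100)
y = 2.0 * t + 1.0 + rng.normal(0.0, 0.3, t.size)
y[::10] += 8.0                       # inject innovation outliers

def residuals(theta):
    """Prediction errors of the linear model y_hat = theta0 * t + theta1."""
    return theta[0] * t + theta[1] - y

# Pure L2 loss vs. Huber's mixed L2/L1 loss; f_scale plays the role of
# the tuning constant where the norm switches from quadratic to linear.
ols = least_squares(residuals, x0=[0.0, 0.0])
huber = least_squares(residuals, x0=[0.0, 0.0], loss="huber", f_scale=0.5)
print("OLS  :", ols.x.round(3))      # pulled away by the outliers
print("Huber:", huber.x.round(3))    # close to the true [2.0, 1.0]
```

Lowering the tuning constant enlarges the linearly penalized (L1-like) region, which is the direction the thesis explores.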
