11. Development of an Integrated Gaussian Process Metamodeling Application for Engineering Design. Baukol, Collin R, 01 June 2009.
As engineering technologies continue to grow and improve, the complexities of the engineering models which utilize these technologies also increase. This seemingly endless cycle of increasing computational power and demand has sparked the need to create representative models, or metamodels, which accurately reflect these complex design spaces in a computationally efficient manner. As research into metamodeling and advanced metamodeling techniques continues, it is important to remember the design engineers who need to use these advancements. Even experienced engineers may not be well versed in the material and mathematical background that is currently required to generate and fully comprehend advanced complex metamodels. A metamodeling environment which utilizes an advanced metamodeling technique known as Gaussian Process is being developed to help bridge the gap that is currently growing between the research community and design engineers. This tool allows users to easily create, modify, query, and visually/numerically assess the quality of metamodels for a broad spectrum of design challenges.
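As a rough illustration of the workflow such an environment wraps, the sketch below builds a Gaussian Process metamodel from a handful of simulated responses, queries it, and scores its accuracy. The library (scikit-learn), the toy response function and the sample sizes are assumptions made for this example, not details of the thesis software.

```python
# Minimal sketch: fit, query and assess a Gaussian Process metamodel of an
# "expensive" simulation. All names and numbers here are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulation(x):
    # Stand-in for a costly engineering analysis code.
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(20, 2))   # sampled design points
y_train = expensive_simulation(X_train)         # "expensive" responses

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# Query the metamodel: cheap predictions plus an uncertainty estimate.
X_query = rng.uniform(0.0, 1.0, size=(5, 2))
y_pred, y_std = gp.predict(X_query, return_std=True)

# Numerically assess quality against a few fresh simulation runs.
X_test = rng.uniform(0.0, 1.0, size=(50, 2))
print("R^2 on held-out points:", gp.score(X_test, expensive_simulation(X_test)))
```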
12. Augmented conversation and cognitive apprenticeship metamodel based intelligent learning activity builder system. Adenowo, Adetokunbo, January 2012.
This research focused on a formal (theory-based) approach to designing an Intelligent Tutoring System (ITS) authoring tool involving two specific pedagogical theories: Conversation Theory (CT) and Cognitive Apprenticeship (CA). The research conceptualised an Augmented Conversation and Cognitive Apprenticeship Metamodel (ACCAM) based on a priori theoretical knowledge and the assumptions of its underlying theories. ACCAM was implemented in an Intelligent Learning Activity Builder System (ILABS), an ITS authoring tool. ACCAM's implementation aims to facilitate formally designed tutoring systems; hence ILABS, the practical implementation of ACCAM, constructs metamodels for Intelligent Learning Activity Tools (ILATs) in a numerical problem-solving context, focusing on the construction of procedural knowledge in applied numerical disciplines. In addition, an Intelligent Learning Activity Management System (ILAMS), although not the focus of this research, was developed as a launchpad for the constructed ILATs and to administer learning activities. ACCAM and ILABS therefore constitute, respectively, the conceptual and practical contributions of this research. ACCAM's implementation was tested through the evaluation of ILABS and ILATs within an applied numerical domain, the accounting domain. The evaluation focused on the key constructs of ACCAM, cognitive visibility and conversation, implemented through a tutoring strategy employing Process Monitoring (PM). PM augments conversation within a cognitive apprenticeship framework; it aims to improve the visibility of a learner's cognitive process and to provide the basis for intelligence in tutoring systems. PM was implemented via an interface that attempts to bring the learner's thought process to the surface. This approach contrasts with previous studies that adopted standard Artificial Intelligence (AI) based inference techniques, and the interface-based PM extends the existing CT and CA work. The strategy makes available a new tutoring approach aimed at fine-grained (step-wise) feedback, unlike the goal-oriented feedback of model tracing. The impact of PM, both as a preventive intervention and as an aid to diagnosing learners' cognitive processes, was investigated in relation to other constructs from the literature, such as detection of misconceptions, feedback generation and perceived learning effectiveness. Thus, the conceptualisation and implementation of PM via an interface also contribute to knowledge and practice. The evaluation of the ACCAM-based design approach and the investigation of the above-mentioned constructs were undertaken through users' reactions to, and perceptions of, ILABS and ILATs, principally via a quantitative approach complemented by a qualitative one to gain deeper insight. Findings from the evaluation support the formal (theory-based) design approach, that is, the design of ILABS through interaction with ACCAM. Empirical data revealed the presence of the conversation and cognitive visibility constructs in ILATs, determined through their behaviour during the learning process. The research also identified other theoretical elements (e.g. motivation, reflection, remediation and evaluation) that may play out in a learning process, clarifying key conceptual variables that should be considered when constructing tutoring systems for applied numerical disciplines (e.g. accounting, engineering).
The research further revealed that PM enhances the detection of learners' misconceptions and the generation of feedback. Nevertheless, qualitative data showed that the frequent feedback resulting from PM could obstruct the thought process at advanced stages of learning. PM implementations should therefore also include delayed diagnosis, especially for advanced learners who prefer to receive it on request; the current implementation allows users to turn PM off and follow an alternative learning route. Overall, the research revealed that the implementation of interface-based PM (i.e. conversation and cognitive visibility) improved the visibility of the learner's cognitive process, which in turn enhanced perceived learning.
13. Modeling and simulation of flows over and through fibrous porous media. Luminari, Nicola, 19 March 2018.
Any natural surface is in essence non-smooth, consisting of more or less regular roughness and/or mobile structures of different scales. From a fluid mechanics point of view, these natural surfaces offer better aerodynamic performance when they cover moving bodies, in terms of drag reduction, lift enhancement or control of boundary layer separation; this has been shown for boundary layer or wake flows around thick bodies. The numerical simulation of microscopic flows around "natural" surfaces is still out of reach today. Therefore, the goal of this thesis is to study the modeling of the apparent flow slip occurring on this kind of surface, modeled as a porous medium, applying Whitaker's volume averaging theory. This mathematical model makes it possible to capture details of the microstructure while preserving a satisfactory description of the physical phenomena that occur. The first chapter of this manuscript provides an overview of previous efforts to model these surfaces, detailing the most important results from the literature. The second chapter presents the mathematical derivation of the volume-averaged Navier-Stokes (VANS) equations in a porous medium. In the third chapter, the stability of the flow at the interface between a free fluid and a porous medium formed by a series of rigid cylinders is studied. The presence of this porous layer is treated by including a drag term in the fluid equations. It is shown that this term reduces the amplification rates of the Kelvin-Helmholtz instability over the whole range of wavenumbers, thus increasing the wavelength of the most amplified mode. In this same context, the difference between an isotropic model and a tensorial approach for the drag term has been evaluated, to determine the most consistent approach for studying these flow instabilities. This has led to the conclusion that the model using the apparent permeability tensor is the most relevant one. In the following chapter, building on this result, the apparent permeability tensor is identified for a three-dimensional porous medium consisting of rigid cylinders, on the basis of over one hundred direct numerical simulations carried out over microscopic unit cells, with the closure problem addressed within the VANS framework. In these configurations the tensor varies according to four parameters: the Reynolds number, the porosity, and the direction of the average pressure gradient, defined by two Euler angles. This parameterization makes it possible to capture local three-dimensional effects. The resulting database has been used to build, through a kriging-type approach, a behavioral metamodel for estimating all the components of the apparent permeability tensor. In the fifth chapter, simulations of the VANS equations are carried out at the macroscopic scale after implementation of the metamodel, which allows reasonable computing times. The validation of the macroscopic approach is performed on a closed cavity flow covered with a porous layer; a comparison with the results of a very accurate DNS, homogenized a posteriori, shows very good agreement and demonstrates the relevance of the approach.
The next step has been the study, with the same macroscopic VANS approach, of the passive control of flow separation past a hump placed on a porous wall. Finally, general conclusions and possible directions of research in the field are presented in the last chapter.
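A minimal sketch of the kriging step described above is given below, assuming scikit-learn's Gaussian Process regressor as a stand-in for the kriging metamodel; the random placeholder data replaces the actual database of roughly one hundred unit-cell simulations, and the parameter ranges are purely illustrative.

```python
# Sketch (not the author's code): regress components of an apparent
# permeability tensor on the Reynolds number, porosity and two Euler angles.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(1)
# Inputs: [Re, porosity, theta, phi] for ~100 microscopic unit-cell runs.
X = np.column_stack([
    rng.uniform(1, 100, 100),        # Reynolds number
    rng.uniform(0.4, 0.95, 100),     # porosity
    rng.uniform(0, np.pi, 100),      # Euler angle 1
    rng.uniform(0, 2 * np.pi, 100),  # Euler angle 2
])
H = rng.normal(size=(100, 9))        # placeholder: 9 tensor components per run

krig = GaussianProcessRegressor(kernel=Matern(length_scale=np.ones(4), nu=2.5),
                                normalize_y=True).fit(X, H)

# A macroscopic VANS solver could call something like this at every cell,
# instead of running a new microscopic simulation.
H_hat = krig.predict(np.array([[50.0, 0.8, 0.3, 1.2]])).reshape(3, 3)
print(H_hat)
```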
14. Simulation-Based Design Under Uncertainty for Compliant Microelectromechanical Systems. Wittwer, Jonathan W., 11 March 2005.
The high cost of experimentation and product development in the field of microelectromechanical systems (MEMS) has led to a greater emphasis on simulation-based design for increasing first-pass design success and reliability. The use of compliant or flexible mechanisms can help eliminate friction, wear, and backlash, but compliant MEMS are sensitive to variations in material properties and geometry. This dissertation proposes approaches for design-stage uncertainty analysis, model validation, and robust optimization of nonlinear compliant MEMS to account for critical process uncertainties including residual stress, layer thicknesses, edge bias, and material stiffness. Methods for simulating and mitigating the effects of non-idealities such as joint clearances, semi-rigid supports, non-ideal loading, and asymmetry are also presented. Approaches are demonstrated and experimentally validated using bistable micromechanisms and thermal microactuators as examples.
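As a hedged illustration of design-stage uncertainty analysis of the kind described, the sketch below propagates assumed process variations (layer thickness, beam width after edge bias, material stiffness) through a textbook fixed-guided beam stiffness formula by Monte Carlo sampling. The formula, nominal values and spreads are illustrative assumptions, not the dissertation's models or data.

```python
# Monte Carlo propagation of assumed process variations through a simple
# flexure-stiffness model. Everything numeric here is illustrative.
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

t = rng.normal(3.5e-6, 0.1e-6, N)    # layer thickness [m] (assumed spread)
w = rng.normal(2.0e-6, 0.05e-6, N)   # beam width after edge bias [m]
E = rng.normal(160e9, 5e9, N)        # Young's modulus [Pa]
L = 200e-6                           # beam length [m], taken as deterministic

# Fixed-guided beam stiffness k = 12*E*I/L^3 with I = w*t^3/12 (illustrative).
k = E * w * t**3 / L**3

print(f"mean stiffness: {k.mean():.3e} N/m")
print(f"std deviation:  {k.std():.3e} N/m")
print(f"95% interval:   [{np.percentile(k, 2.5):.3e}, {np.percentile(k, 97.5):.3e}]")
```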
15. Functional Ontologies and Their Application to Hydrologic Modeling: Development of an Integrated Semantic and Procedural Knowledge Model and Reasoning Engine. Byrd, Aaron R., 01 August 2013.
This dissertation presents the research and development of new concepts and techniques for modeling the knowledge that we as hydrologists must understand, so that we can execute models that operate in terms of conceptual abstractions and have those abstractions translate to the data, tools, and models we use every day. This hydrologic knowledge includes conceptual (i.e. semantic) knowledge, such as the concepts and relationships of the hydrologic cycle, as well as functional (i.e. procedural) knowledge, such as how to compute the area of a watershed polygon, the average basin slope or the topographic wetness index. This dissertation is presented as three papers and a reference manual for the software created. Because hydrologic knowledge includes both semantic and procedural aspects, we have developed, in the first paper, a new form of reasoning engine and knowledge base that extends the general-purpose analysis and problem-solving capability of reasoning engines by incorporating procedural knowledge, represented as computer source code, into the knowledge base. The reasoning engine is able to compile the code and then, if need be, execute the procedural code as part of a query. The potential advantage of this approach is that it simplifies the description of procedural knowledge in a form that can be readily utilized by the reasoning engine to answer a query. Further, since the procedural knowledge is represented as source code, it has the full capabilities of the underlying language. We use the term "functional ontology" to refer to the new semantic and procedural knowledge models. The first paper applies the new knowledge model to describing and analyzing polygons. The second and third papers address the application of the new functional ontology reasoning engine and knowledge model to hydrologic applications. The second paper models concepts and procedures, including running external software, related to watershed delineation. The third paper models a project scenario that includes integrating several models. A key advance demonstrated in this paper is the use of functional ontologies to apply metamodeling concepts in a manner that both abstracts and fully utilizes computational models and data sets as part of the project modeling process.
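The toy sketch below illustrates, with invented names and a deliberately simplified design, the central idea of a functional ontology: a knowledge base that stores procedural knowledge as source code, compiles it, and executes it when a query needs a derived value such as a polygon's area. It is a conceptual illustration only, not the reasoning engine developed in the dissertation.

```python
# Toy "functional ontology": semantic facts plus procedural knowledge kept
# as compilable source code that is run on demand to answer a query.
class FunctionalOntology:
    def __init__(self):
        self.facts = {}        # semantic knowledge: (subject, predicate) -> value
        self.procedures = {}   # procedural knowledge: predicate -> callable

    def assert_fact(self, subject, predicate, value):
        self.facts[(subject, predicate)] = value

    def define_procedure(self, name, source):
        # Procedural knowledge is stored as source code and compiled here,
        # loosely mirroring the reasoning engine described above.
        namespace = {}
        exec(compile(source, name, "exec"), namespace)
        self.procedures[name] = namespace[name]

    def query(self, subject, predicate):
        # Prefer a stored fact; otherwise execute a procedure to derive it.
        if (subject, predicate) in self.facts:
            return self.facts[(subject, predicate)]
        proc = self.procedures.get(predicate)
        return proc(self.facts, subject) if proc else None

kb = FunctionalOntology()
kb.assert_fact("watershed_A", "vertices", [(0, 0), (4, 0), (4, 3), (0, 3)])
kb.define_procedure("area", """
def area(facts, subject):
    # Shoelace formula over the polygon's vertices.
    v = facts[(subject, "vertices")]
    return 0.5 * abs(sum(x1*y2 - x2*y1 for (x1, y1), (x2, y2) in zip(v, v[1:] + v[:1])))
""")
print(kb.query("watershed_A", "area"))   # -> 12.0
```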
16. An Efficient Robust Concept Exploration Method and Sequential Exploratory Experimental Design. Lin, Yao, 31 August 2004.
Experimentation and approximation are essential for efficiency and effectiveness in concurrent engineering analyses of large-scale complex systems. The approximation-based design strategy is not fully utilized in industrial applications, in which designers have to deal with multi-disciplinary, multi-variable, multi-response, and multi-objective analyses using very complicated and expensive-to-run computer analysis codes or physical experiments. With current experimental design and metamodeling techniques, it is difficult for engineers to develop acceptable metamodels for irregular responses and achieve good design solutions in large design spaces at low cost. To circumvent this problem, engineers tend either to adopt low-fidelity simulations or models, with which important response properties may be lost, or to restrict the study to very small design spaces. Information from expensive physical or computer experiments is often used as a validation in late design stages instead of as an analysis tool in early-stage design. This increases the likelihood of expensive re-design and lengthens time-to-market.
In this dissertation, two methods, the Sequential Exploratory Experimental Design (SEED) and the Efficient Robust Concept Exploration Method (E-RCEM) are developed to address these problems. The SEED and E-RCEM methods help develop acceptable metamodels for irregular responses with expensive experiments and achieve satisficing design solutions in large design spaces with limited computational or monetary resources. It is verified that more accurate metamodels are developed and better design solutions are achieved with SEED and E-RCEM than with traditional approximation-based design methods. SEED and E-RCEM facilitate the full utility of the simulation-and-approximation-based design strategy in engineering and scientific applications.
Several preliminary approaches for metamodel validation with additional validation points are proposed in this dissertation, after verifying that the most-widely-used method of leave-one-out cross-validation is theoretically inappropriate in testing the accuracy of metamodels. A comparison of the performance of kriging and MARS metamodels is done in this dissertation. Then a sequential metamodeling approach is proposed to utilize different types of metamodels along the design timeline.
Several single-variable or two-variable examples and two engineering examples, the design of pressure vessels and the design of unit cells for linear cellular alloys, are used in this dissertation to facilitate our studies.
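A minimal sketch of the sequential-sampling idea behind an exploratory experimental design is shown below: fit a metamodel to a small initial design, place the next expensive experiment where the metamodel is most uncertain, and repeat. The one-dimensional test response, the Gaussian Process surrogate and the fixed budget are illustrative assumptions rather than the SEED algorithm itself.

```python
# Sequential augmentation of an experimental design driven by metamodel
# uncertainty. Response, candidate pool and stopping rule are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_response(x):
    return np.sin(10.0 * x) * x           # irregular 1-D stand-in response

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (4, 1))             # small initial design
y = expensive_response(X).ravel()
candidates = np.linspace(0, 1, 201).reshape(-1, 1)

for _ in range(10):                       # sequential augmentation budget
    gp = GaussianProcessRegressor(kernel=RBF(0.1), normalize_y=True).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[[np.argmax(std)]]  # most uncertain candidate point
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_response(x_new).ravel())

print(f"final design size: {len(X)} points")
```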
17. Metamodeling Complex Systems Using Linear and Nonlinear Regression Methods. Kartal, Elcin, 01 September 2007.
Metamodeling is a very popular approach for the approximation of complex systems. Metamodeling techniques can be categorized, according to the type of regression method employed, into linear and nonlinear models. The Response Surface Methodology (RSM) is an example of linear regression. In classical RSM metamodels, parameters are estimated using the Least Squares (LS) method. Robust regression techniques, such as Least Absolute Deviation (LAD) and M-regression, are also considered in this study because of outliers in the data sets. Artificial Neural Networks (ANN) and Multivariate Adaptive Regression Splines (MARS) are examples of nonlinear regression techniques. In this thesis these two nonlinear metamodeling techniques are constructed and their performance is compared with that of the linear models.
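The sketch below shows the flavour of such a comparison, fitting a linear metamodel (a second-order response surface estimated by least squares) and a nonlinear one (a small neural network) to the same simulated data. The test function and library choices are assumptions for illustration and do not reproduce the thesis experiments.

```python
# Compare a least-squares response surface with a neural-network metamodel
# on the same simulated data set (illustrative only).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.model_selection import train_test_split

def simulator(X):
    return np.sin(3 * X[:, 0]) * np.exp(-X[:, 1]) + 0.3 * X[:, 0] * X[:, 1]

rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, (300, 2))
y = simulator(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X_tr, y_tr)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                                 random_state=0)).fit(X_tr, y_tr)

print("RSM R^2:", round(rsm.score(X_te, y_te), 3))
print("ANN R^2:", round(ann.score(X_te, y_te), 3))
```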
18. HLA FOM Development with Model Transformations. Dinc, Ali Cem, 01 May 2010.
There has been a recent interest in the model-based development approach in the modeling and simulation community. The Model-Driven Architecture (MDA) of OMG envisions a fully model-based development process where models are created for capturing not only requirements, but also designs and implementations. Domain-specific metamodels and model transformations constitute the cornerstones of this approach. We have developed transformations from the data part of Field Artillery (FA) domain models to High Level Architecture (HLA) Object Model Template (OMT) models, honoring the MDA philosophy. In MDA terminology, the former corresponds to the CIM (Computation-Independent Model) or, arguably, the PIM (Platform-Independent Model), and the latter corresponds to the PSM (Platform-Specific Model), where the platform is HLA. As a case study for the source metamodel, we have developed a metamodel for the data model part of the (observed) fire techniques of the FA domain. All of the entities in the metamodel are derived from elements of NATO's Command and Control Information Exchange Data Model (C2IEDM).
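A heavily simplified, hypothetical sketch of a model-to-model transformation in this spirit is given below: elements of an invented FA-like source data metamodel are mapped onto a toy rendition of HLA OMT object classes. The class names, attribute types and mapping rule are made up for illustration and are not the metamodels or transformations developed in the thesis.

```python
# Toy PIM-to-PSM transformation: invented source entities mapped onto a
# simplified stand-in for HLA OMT object classes.
from dataclasses import dataclass, field

@dataclass
class SourceEntity:                 # element of a hypothetical FA data metamodel
    name: str
    attributes: dict[str, str]      # attribute name -> domain type

@dataclass
class OmtObjectClass:               # element of a toy HLA OMT-like metamodel
    name: str
    attributes: list[tuple[str, str]] = field(default_factory=list)

TYPE_MAP = {"Coordinate": "HLAfloat64BE", "Identifier": "HLAunicodeString"}

def transform(entity: SourceEntity) -> OmtObjectClass:
    # One transformation rule: each source entity becomes an object class,
    # and each attribute is mapped onto an assumed HLA data representation.
    target = OmtObjectClass(name=entity.name)
    for attr, domain_type in entity.attributes.items():
        target.attributes.append((attr, TYPE_MAP.get(domain_type, "HLAopaqueData")))
    return target

fire_mission = SourceEntity("FireMission",
                            {"targetLocation": "Coordinate", "missionId": "Identifier"})
print(transform(fire_mission))
```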
19. A metamodeling approach for approximation of multivariate, stochastic and dynamic simulations. Hernandez Moreno, Andres Felipe, 04 April 2012.
This thesis describes the implementation of metamodeling approaches as a solution to approximate multivariate, stochastic and dynamic simulations. In the area of statistics, metamodeling (or "model of a model") refers to the scenario where an empirical model is built from simulated data. In this thesis, this idea is exploited by using pre-recorded dynamic simulations as a source of simulated dynamic data. Based on this simulated dynamic data, an empirical model is trained to map the dynamic evolution of the system from the current discrete time step to the next discrete time step. It is therefore possible to approximate the dynamics of the complex dynamic simulation by iteratively applying the trained empirical model. The rationale for creating such an approximate dynamic representation is that the empirical models, or metamodels, are much more affordable to compute than the original dynamic simulation, while having an acceptable prediction error.
The successful implementation of metamodeling approaches as approximations of complex dynamic simulations requires an understanding of the propagation of error during the iterative process. Prediction errors made by the empirical model at earlier times of the iterative process propagate into future predictions of the model. This propagation of error means that the trained empirical model will deviate from the expensive dynamic simulation because of its own errors. Based on this idea, a Gaussian process model is chosen as the metamodeling approach for the approximation of expensive dynamic simulations in this thesis. This empirical model was selected not only for its flexibility and error estimation properties, but also because it can illustrate relevant issues to be considered if other metamodeling approaches were used for this purpose.
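A minimal sketch of this one-step-ahead metamodeling scheme follows: a Gaussian Process is trained on state pairs taken from a pre-recorded trajectory and then rolled forward iteratively, so that its own prediction errors accumulate exactly as discussed above. The damped-oscillator dynamics, step size and library choice are stand-ins for the expensive simulations used in the thesis.

```python
# Train a GP to map the state at step t to step t+1, then roll it forward
# iteratively as a cheap surrogate of the "expensive" dynamics.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulate(x, dt=0.05, steps=400):
    # "Expensive" reference dynamics: damped harmonic oscillator, Euler steps.
    traj = [x]
    for _ in range(steps):
        pos, vel = traj[-1]
        traj.append(np.array([pos + dt * vel, vel + dt * (-pos - 0.1 * vel)]))
    return np.array(traj)

traj = simulate(np.array([1.0, 0.0]))       # pre-recorded trajectory
X_t, X_next = traj[:-1], traj[1:]           # one-step training pairs

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-8).fit(X_t, X_next)

# Iterative rollout: each prediction feeds the next, so errors accumulate.
state = traj[0]
rollout = [state]
for _ in range(400):
    state = gp.predict(state.reshape(1, -1))[0]
    rollout.append(state)

err = np.linalg.norm(np.array(rollout) - traj, axis=1)
print("max rollout error over the trajectory:", err.max())
```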
20. Implementing object model transformations. Abdrachimovas, Ruslanas, 31 May 2004.
The presented work covers one of the most important areas of OMG's Model-Driven Architecture (MDA): object model transformations. Based on a review of OMG specifications and other sources, the author analyzes the transformation process and argues the importance of modeling and metamodeling for the design of UML-like modeling languages. The work describes metamodels designed for experimental modeling languages: an "Entity-process" language, a Java metamodel and a relational metamodel. The author gives a short overview of model editors for these languages, created using EMF (Eclipse Modeling Framework) tools. Based on this analysis, the author describes a flexible architecture for implementing model transformations, built on the pipes-and-filters architectural pattern. This architecture makes the transformation implementation flexible and allows a transformation to be decomposed into separate stages in a straightforward way. The designed pipes-and-filters architecture was used for an experimental transformation implementation, and the work presents qualitative and quantitative results of the experimental transformations.
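The sketch below illustrates the pipes-and-filters idea with invented, toy model elements: each filter is an independent transformation stage and the pipeline simply composes them, so stages can be added, removed or swapped without touching the others. It illustrates the architectural pattern only, not the EMF-based implementation described in the abstract.

```python
# Pipes-and-filters style model transformation: each filter is a stage,
# the pipeline composes them. Model elements here are invented toys.
from functools import reduce

# Toy source "model": a dictionary standing in for an EMF model instance.
entity_process_model = {
    "entities": [{"name": "Customer", "fields": ["id", "name"]}],
    "processes": [{"name": "RegisterCustomer"}],
}

def to_java_model(model):
    # Filter 1: map each entity of the source model to a Java class element.
    return {"classes": [{"name": e["name"], "fields": e["fields"]}
                        for e in model["entities"]]}

def add_accessors(java_model):
    # Filter 2: enrich every class with getter names for its fields.
    for cls in java_model["classes"]:
        cls["methods"] = [f"get{f.capitalize()}" for f in cls["fields"]]
    return java_model

def render(java_model):
    # Filter 3: serialize the target model to (very rough) Java source text.
    return "\n".join(f"class {c['name']} {{ /* {', '.join(c['methods'])} */ }}"
                     for c in java_model["classes"])

pipeline = [to_java_model, add_accessors, render]
result = reduce(lambda model, stage: stage(model), pipeline, entity_process_model)
print(result)
```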