31

Méthodes de simulation adaptative pour l’évaluation des risques de systèmes complexes / Adaptive simulation methods for risk assessment of complex systems

Turati, Pietro 16 May 2017
L’évaluation de risques est conditionnée par les connaissances et les informations disponibles au moment où l’analyse est faite. La modélisation et la simulation sont des moyens d’explorer et de comprendre le comportement du système, d’identifier des scénarios critiques et d’éviter des surprises. Un certain nombre de simulations du modèle sont exécutées avec des conditions initiales et opérationnelles différentes pour identifier les scénarios conduisant à des conséquences critiques et pour estimer leurs probabilités d’occurrence. Pour les systèmes complexes, les modèles de simulation peuvent être : i) de haute dimension ; ii) boîte noire ; iii) dynamiques ; iv) coûteux en termes de calcul, ce qui empêche l’analyste d’exécuter toutes les simulations pour les multiples conditions qu’il faut considérer. La présente thèse introduit des cadres avancés d’évaluation des risques basés sur les simulations. Les méthodes développées au sein de ces cadres visent à limiter les coûts de calcul requis par l’analyse, afin de rester applicables à des systèmes complexes. En particulier, toutes les méthodes proposées partagent l’idée prometteuse de focaliser automatiquement et de conduire de manière adaptative les simulations vers les conditions d’intérêt pour l’analyse, c’est-à-dire vers des informations utiles pour l’évaluation des risques. Les avantages des méthodes proposées ont été montrés sur différentes applications comprenant, entre autres, un sous-réseau de transmission de gaz, un réseau électrique et l’Advanced Lead Fast Reactor European Demonstrator (ALFRED). / Risk assessment is conditioned on the knowledge and information available at the moment of the analysis. Modeling and simulation are ways to explore and understand system behavior, to identify critical scenarios and to avoid surprises. A number of simulations of the model are run with different initial and operational conditions to identify scenarios leading to critical consequences and to estimate their probabilities of occurrence. For complex systems, the simulation models can be: i) high-dimensional; ii) black-box; iii) dynamic; and iv) computationally expensive to run, which prevents the analyst from running the simulations for the multiple conditions that need to be considered. The present thesis introduces advanced frameworks for simulation-based risk assessment. The methods developed within these frameworks are designed to limit the computational cost required by the analysis, in order to keep them scalable to complex systems. In particular, all the proposed methods share the powerful idea of automatically focusing and adaptively driving the simulations towards the conditions that are of interest for the analysis, i.e., towards risk-relevant information. The advantages of the proposed methods have been shown on different applications including, among others, a gas transmission subnetwork, a power network and the Advanced Lead Fast Reactor European Demonstrator (ALFRED).
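A minimal sketch, not taken from the thesis, of the general idea of adaptively steering simulation runs towards critical conditions: a toy stand-in for an expensive simulator is sampled, and each new batch of runs is concentrated near the inputs whose outputs came closest to an assumed critical threshold. The model, threshold, budget and sampling rule are illustrative assumptions; the thesis itself uses far more sophisticated adaptive schemes.

```python
import random

def black_box_model(x):
    # Toy stand-in for an expensive simulator: maps an input condition to a consequence.
    return x ** 2 + random.gauss(0.0, 0.05)

CRITICAL = 0.8            # consequence level treated as critical (assumed)
ROUNDS, BUDGET = 10, 20   # adaptive rounds and simulations per round (assumed)

# Crude initial exploration of the input space [0, 1].
samples = [(x, black_box_model(x)) for x in (random.random() for _ in range(BUDGET))]

for _ in range(ROUNDS):
    # Rank previous runs by how close they came to the critical level...
    samples.sort(key=lambda s: abs(s[1] - CRITICAL))
    seeds = [x for x, _ in samples[:5]]
    # ...and spend the new simulation budget near the most informative inputs.
    new_inputs = [min(1.0, max(0.0, random.choice(seeds) + random.gauss(0.0, 0.1)))
                  for _ in range(BUDGET)]
    samples += [(x, black_box_model(x)) for x in new_inputs]

n_critical = sum(1 for _, y in samples if y >= CRITICAL)
print(f"{n_critical} of {len(samples)} runs fell in the critical region")
```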
32

Model Transformation in the context of Driver Assistance Systems: Meta-model based transformation for Simulink and Scicos

Kappattanavar, Abhishek Mallikarjuna 02 June 2016
Embedded systems today form a major part of everyday life: almost every device contains an electronic chip, and in the automotive domain these electronic devices are multiplying. This has led to innovative development methods, among which Model Based Development has become very popular and a standard way of building embedded systems. Most embedded systems, especially automotive ones, are now developed using modeling tools such as Simulink. In the design and development of Driver Assistance Systems, Model Based Design (MBD) plays an important role from system design and simulation to code generation, and Matlab/Simulink is among the most popular tools. Due to the proprietary nature of Simulink and challenges in the requirement elicitation phase, the industry is looking towards an open-source alternative such as Scicos. Since most OEMs still use Simulink, there is a need for interoperability between Simulink and Scicos. The present work proposes metamodels for Simulink and Scicos, and a model transformation based on these metamodels to achieve this interoperability. The metamodels for Simulink and Scicos were developed using EMF Ecore and conform to the OMG’s MOF standards. They were used to develop the transformation definition in the language QVTo. First, a simple model was built and used to apply and verify the transformation rules. Then a Simulink subsystem of a crosswind assistance system was subjected to the forward transformation; the outputs of the model before and after transformation were compared and found to be identical, verifying the transformation definition. Reverse transformation was also attempted: a Scicos subsystem was transformed and an intermediate model conforming to the Simulink metamodel was obtained. This shows that interoperability between Scicos and Simulink can be achieved.
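A minimal sketch of the mapping idea behind a metamodel-based transformation, not the thesis’s EMF Ecore/QVTo implementation: blocks conforming to a toy "Simulink-like" metamodel are rewritten into instances of a toy "Scicos-like" metamodel through an explicit rule and type map. The class names, attributes and block-type mapping are illustrative assumptions.

```python
from dataclasses import dataclass

# Toy source and target metamodels (the thesis uses EMF Ecore classes and QVTo rules).
@dataclass
class SimulinkBlock:
    name: str
    block_type: str          # e.g. "Gain", "Sum"
    params: dict

@dataclass
class ScicosBlock:
    label: str
    interface_function: str  # Scicos-style block identifier (names here are assumed)
    params: dict

# Illustrative type mapping; a real transformation definition covers many more blocks.
TYPE_MAP = {"Gain": "GAINBLK_f", "Sum": "SUMMATION"}

def transform(block: SimulinkBlock) -> ScicosBlock:
    # One transformation rule: copy the identity, translate the type, keep the parameters.
    return ScicosBlock(label=block.name,
                       interface_function=TYPE_MAP[block.block_type],
                       params=dict(block.params))

source = [SimulinkBlock("K1", "Gain", {"gain": 2.0}),
          SimulinkBlock("S1", "Sum", {"inputs": "++"})]
target = [transform(b) for b in source]
print(target)
```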
33

Maturity integrated in a meta model of knowledge to help decision making in preliminary collaborative design of mechanical systems / La maturité intégrée dans un méta modèle de connaissances pour aider à la prise de décision en conception préliminaire collaborative de systèmes mécaniques

Dremont, Nicolas 26 November 2013
La conception de systèmes mécaniques, de par son aspect pluridisciplinaire et technologique, fait intervenir et interagir différentes personnes qui travaillent et prennent des décisions ensemble et participent ensemble à l’élaboration du produit. Elles travaillent de manière collaborative ; cependant, elles ne se connaissent pas obligatoirement, ne se situent pas forcément géographiquement sur un site commun, n’ont peut-être pas la même culture et n’appartiennent pas systématiquement à la même entreprise. La conception préliminaire représente les premières phases du cycle de conception où le produit est en cours de définition. Le nombre d’incertitudes sur les paramètres et les informations produit est très important. Il y a un manque de connaissances important à cette étape du processus de conception qui doit être considéré afin d’améliorer et d’aider les prises de décisions dans les phases amont. C’est ce manque de connaissances que je me propose de qualifier et caractériser en apportant une réponse à la question résultante : comment prendre en compte le manque de connaissances pour prendre des décisions durant la conception préliminaire collaborative ? Pour ce faire, nous proposons un méta-modèle de connaissances permettant de structurer les informations du produit et les connaissances en intégrant la maturité du produit. Cette maturité est définie par une métrique et permet d’identifier le niveau de connaissances des concepteurs sur le produit et d’orienter la prise de décision grâce à l’utilisation d’une approche mixte, à la fois qualitative et quantitative. Enfin, nous évaluerons la capacité de ce méta-modèle à générer différents modèles produit, puis sa pertinence avec l’implémentation sur un cas industriel. / The design of mechanical systems, due to its multi-disciplinary and technological aspects, involves different people who work and make decisions together and jointly participate in the development of the product. They work in a collaborative manner; however, they may not know each other, may not be located on the same site, may not share the same culture and may not belong to the same company. Preliminary design covers the early stages of the design cycle, where the product is still being defined. The number of uncertainties on the parameters and product information is very high. There is a significant lack of knowledge at this stage of the design process that must be taken into account in order to improve and support decision making in the early phases. It is this lack of knowledge that I propose to qualify and characterise, answering the resulting question: how does one take the lack of knowledge into account when making decisions during collaborative preliminary design? To do so, we propose a knowledge meta-model for structuring product information and knowledge by integrating product maturity. This maturity is defined by a metric and makes it possible to identify the designers’ level of knowledge about the product and to guide decision making, thanks to a mixed approach that is both qualitative and quantitative. Finally, we evaluate the ability of this meta-model to generate different product models, and then its relevance through an implementation on an industrial case.
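A minimal illustrative sketch, not the thesis’s metric, of how a product-maturity indicator might combine a quantitative measure (completeness of the data attached to each product parameter) with a qualitative one (the designer’s confidence rating). The attributes, scales and equal weighting are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ProductParameter:
    name: str
    completeness: float   # quantitative: share of required data already defined (0..1)
    confidence: int       # qualitative: designer's confidence rating (1 = low .. 5 = high)

def maturity(params, w_completeness=0.5, w_confidence=0.5):
    # Aggregate a simple weighted score over all parameters; 1.0 means fully mature.
    score = 0.0
    for p in params:
        score += w_completeness * p.completeness + w_confidence * (p.confidence / 5.0)
    return score / len(params)

shaft = [ProductParameter("diameter", 0.9, 4),
         ProductParameter("material", 0.5, 2),
         ProductParameter("tolerance", 0.2, 1)]
print(f"maturity = {maturity(shaft):.2f}")  # a low value flags decisions taken on weak knowledge
```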
34

Amortissement virtuel pour la conception vibroacoustique des lanceurs futurs / Virtual damping for the vibroacoustic design of future launchers

Krifa, Mohamed 19 May 2017
Dans le dimensionnement des lanceurs spatiaux, la maîtrise de l'amortissement est une problématique majeure. Faute d'essais sur structure réelle, très coûteux avant la phase finale de qualification, la modélisation de l'amortissement peut conduire à un sur-dimensionnement de la structure alors que le but recherché est de diminuer le coût du lancement d'une fusée tout en garantissant le confort vibratoire de la charge utile. Nos contributions sont les suivantes. Premièrement, une méthode de prédiction par le calcul des niveaux vibratoires dans les structures de lanceurs, utilisant une stratégie d'essais virtuels qui permet de prédire les amortissements en basses fréquences, est proposée. Cette méthode est basée sur l'utilisation de méta-modèles construits à partir de plans d'expériences numériques à l'aide de modèles détaillés des liaisons. Ces méta-modèles peuvent être obtenus grâce à des calculs spécifiques utilisant une résolution 3D par éléments finis avec prise en compte du contact. En utilisant ces méta-modèles, l'amortissement modal dans un cycle de vibration peut être calculé comme étant le ratio entre l'énergie dissipée et l'énergie de déformation. L'approche utilisée donne une approximation précise et peu coûteuse de la solution. Le calcul non-linéaire global, qui est inaccessible pour les structures complexes, est rendu accessible en utilisant l'approche virtuelle basée sur les abaques. Deuxièmement, une validation des essais virtuels sur la structure du lanceur Ariane 5 a été élaborée en tenant compte des liaisons boulonnées entre les étages afin d'illustrer l'approche proposée. Lorsque la matrice d'amortissement généralisé n'est pas diagonale (en raison de dissipations localisées), ces méthodes modales ne permettent pas de calculer ou d'estimer les termes d'amortissement généralisé extra-diagonaux. La problématique posée est alors la quantification de l'erreur commise lorsque l'on néglige les termes extra-diagonaux dans le calcul des niveaux vibratoires, avec un bon ratio précision / coût de calcul. Troisièmement, la validité de l'hypothèse de diagonalité de la matrice d'amortissement généralisée a été examinée et une méthode très peu coûteuse de quantification a posteriori de l'erreur d'estimation de l'amortissement modal par la méthode des perturbations a été proposée. Finalement, la dernière contribution de cette thèse est la proposition d'un outil d'aide à la décision qui permet de quantifier l'impact des méconnaissances sur l'amortissement dans les liaisons sur le comportement global des lanceurs via l'utilisation de la méthode info-gap. / In the dimensioning of space launchers, controlling damping is a major issue. Since tests on the real structure are very expensive and cannot be carried out before the final qualification phase, damping modeling can lead to over-sizing of the structure, whereas the aim is to reduce the cost of launching a rocket while guaranteeing the vibratory comfort of the payload.[...]
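The French summary states that the modal damping over one vibration cycle is computed from the ratio between the dissipated energy and the strain energy, both of which would come from the joint meta-models. A minimal sketch of that energy ratio is given below, using the common equivalent-viscous-damping convention; the 1/(4π) normalisation and the numerical values are assumptions, since the abstract only mentions the ratio itself.

```python
import math

def modal_damping_ratio(dissipated_energy, strain_energy):
    """Equivalent viscous damping ratio from per-cycle energies.

    Uses the usual convention zeta = E_d / (4 * pi * E_s); the abstract only
    states "ratio between dissipated energy and strain energy", so this
    normalisation is an assumption.
    """
    return dissipated_energy / (4.0 * math.pi * strain_energy)

# Example: energies a joint meta-model might return for one mode (made-up numbers).
print(modal_damping_ratio(dissipated_energy=0.8, strain_energy=12.0))  # ~0.0053, i.e. 0.53 %
```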
35

Regression models to assess the thermal performance of Brazilian low-cost houses: consideration of solar incidence and shading devices

Anchieta, Camila Chagas 01 February 2016
Building performance simulation (BPS) tools are valuable during all design stages, especially the early ones. However, there are obstacles to the full implementation and use of such tools, which keep them from becoming an effective part of the design process. To overcome this barrier, this research presents regression models (meta-models) that make it possible to predict discomfort by heat and/or by cold in a Brazilian low-cost house (LCH) in three distinct bioclimatic zones in Brazil, represented by the cities of Curitiba/PR, São Paulo/SP and Manaus/AM. The focus of this work was to analyze the impact of solar incidence and shading devices on thermal comfort by applying the meta-models. The method consisted of a) collecting data from projects of the aforementioned building type to support the creation of the base model; b) defining the key parameters and the ranges over which they vary; c) running EnergyPlus simulations using the Monte Carlo method to randomly create parameter combinations within the defined ranges; d) performing regression analysis and building the meta-models, followed by their validation with reliability tests; and lastly, e) a case study, applying the meta-models to a standard LCH to verify the impact of shading devices on the thermal comfort of a unit and their potential as a support tool in the design process. In general, all R² values for the meta-models were above 0.95, except for those for São Paulo and Curitiba for discomfort by heat, 0.74 and 0.61, respectively. Regarding the case study, the meta-models predicted a decrease of approximately 50% in discomfort by heat for Manaus when a given combination of orientation, quantity and size of the devices was used. For the remaining locations, the meta-models predicting discomfort by heat and by cold require further investigation to properly assess some unexpected predictions and the meta-models’ sensitivity to the parameters related to shading devices. / Ferramentas de simulação computacional são importantes e úteis durante todas as etapas de projeto, especialmente durante as iniciais. No entanto, há obstáculos para a completa implementação e uso de tais ferramentas, fazendo com que não sejam uma parte efetiva do processo de projeto. Para superar esta barreira, esta pesquisa é apresentada, com a criação de modelos de regressão (meta-modelos) que permitem a predição do desconforto por frio e/ou por calor em uma habitação de interesse social (HIS) no Brasil em três zonas bioclimáticas, representadas pelas cidades de Curitiba/PR, São Paulo/SP e Manaus/AM. O foco deste trabalho foi analisar o impacto da incidência solar e das proteções solares no conforto térmico utilizando os meta-modelos. O método consistiu em a) coletar dados referentes ao tipo de edifício mencionado para auxiliar na criação do modelo de base; b) a definição dos parâmetros chave e suas faixas de variação; c) simulações no EnergyPlus usando o método de Monte Carlo para aleatoriamente combinar valores de parâmetros dentro de suas faixas; d) análise de regressão e elaboração dos meta-modelos, seguida da validação dos mesmos por testes de confiabilidade; e por fim, e) um estudo de caso, consistindo na aplicação dos meta-modelos a uma HIS padrão para verificar o impacto das proteções solares em uma unidade em relação ao conforto térmico da mesma, assim como o potencial dos meta-modelos em serem utilizados como uma ferramenta de auxílio nas fases iniciais de projeto. No geral, todos os valores de R² foram acima de 0.95, exceto para os meta-modelos de São Paulo e Curitiba para desconforto por calor, com 0.74 e 0.61, respectivamente. Em relação ao estudo de caso, os meta-modelos previram uma queda de aproximadamente 50% no desconforto por calor para Manaus, dada uma combinação entre orientação, quantidade e dimensão das proteções. Para as demais localidades, os meta-modelos prevendo desconforto por frio e por calor requerem maiores estudos para avaliar predições inesperadas e a sensibilidade dos meta-modelos em relação aos parâmetros de proteções solares.
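A minimal sketch of the meta-modelling step described above, not the thesis’s actual EnergyPlus workflow: parameter combinations are drawn at random, a toy function stands in for the building simulation, and an ordinary least-squares fit with its R² plays the role of the regression meta-model. The parameter names, ranges and the synthetic response are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo sampling of two illustrative design parameters within assumed ranges.
n = 200
window_to_wall = rng.uniform(0.1, 0.6, n)   # window-to-wall ratio
shading_depth = rng.uniform(0.0, 1.0, n)    # shading device depth [m]

# Stand-in for the EnergyPlus runs: discomfort-by-heat hours as an unknown "true" response.
discomfort = 300 + 900 * window_to_wall - 400 * shading_depth + rng.normal(0, 20, n)

# Linear regression meta-model fitted by ordinary least squares.
X = np.column_stack([np.ones(n), window_to_wall, shading_depth])
coef, *_ = np.linalg.lstsq(X, discomfort, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((discomfort - pred) ** 2) / np.sum((discomfort - discomfort.mean()) ** 2)
print(f"coefficients = {coef.round(1)}, R^2 = {r2:.3f}")
```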
36

"Programação neurolinguística: transformação e persuasão no metamodelo" / Neuro-Linguistic Programming: transformation and persuasion in meta-model.

Azevedo, Regina Maria 19 April 2006
Neste estudo apresentamos as origens da Programação Neurolingüística (PNL), seus principais fundamentos, pressupostos teóricos e objetivos; analisamos o “metamodelo”, sua relação com a linguagem e sua exploração por meio do processo de “modelagem”, a partir do enfoque presente na obra A estrutura da magia I: um livro sobre linguagem e terapia, de Richard Bandler e John Grinder, idealizadores da PNL. Examinamos as transformações obtidas mediante o processo de derivação, com base na Gramática Gerativo-Transformacional de Noam Chomsky, objetivando verificar sua relação com o “metamodelo”. Explorando o discurso do Sujeito submetido ao processo de “modelagem”, verificamos em que medida os novos conteúdos semânticos revelados pelas transformações poderiam influenciá-lo, a ponto de mudar sua visão de mundo. Para esta análise, investigamos ainda as teorias clássicas da Argumentação, em especial os conceitos de convicção e persuasão, constatando que a “modelagem” oferece ao Sujeito recursos para ampliar seu repertório lingüístico, apreender novos significados a partir de seus próprios enunciados e, por meio da deliberação consigo mesmo, convencer-se e persuadir-se. / This study presents the origins of Neuro-Linguistic Programming (NLP), its main ideas, theoretical presuppositions and goals. It analyzes the meta-model, its relationship with language and its exploitation through the modeling process, based on the book The structure of magic I: a book about language and therapy, by Richard Bandler and John Grinder, the founders of NLP. It also examines the transformations obtained through the derivation process, based on Noam Chomsky’s transformational-generative grammar, in order to verify their relationship with the meta-model. By exploring the discourse of a subject submitted to the modeling process, it verifies to what extent the new semantic contents revealed by the transformations could influence that subject to the point of changing his view of the world. For this analysis, the classic theories of Argumentation are also investigated, especially the concepts of conviction and persuasion, showing that modeling offers the subject resources to enlarge his linguistic repertoire, to grasp new meanings from his own utterances and, by deliberating with himself, to convince and persuade himself.
38

Modell und Optimierungsansatz für Open Source Softwareentwicklungsprozesse / Model and optimization approach for open source software development processes

Dietze, Stefan January 2004
Gerade in den letzten Jahren erfuhr Open Source Software (OSS) eine zunehmende Verbreitung und Popularität und hat sich in verschiedenen Anwendungsdomänen etabliert. Die Prozesse, welche sich im Kontext der OSS-Entwicklung (auch: OSSD – Open Source Software-Development) evolutionär herausgebildet haben, weisen in den verschiedenen OSS-Entwicklungsprojekten z.T. ähnliche Eigenschaften und Strukturen auf und auch die involvierten Entitäten, wie z.B. Artefakte, Rollen oder Software-Werkzeuge, sind weitgehend miteinander vergleichbar. Dies motiviert den Gedanken, ein verallgemeinerbares Modell zu entwickeln, welches die generalisierbaren Entwicklungsprozesse im Kontext von OSS zu einem übertragbaren Modell abstrahiert. Auch in der Wissenschaftsdisziplin des Software Engineering (SE) wurde bereits erkannt, dass sich der OSSD-Ansatz in verschiedenen Aspekten erheblich von klassischen (proprietären) Modellen des SE unterscheidet und daher diese Methoden einer eigenen wissenschaftlichen Betrachtung bedürfen. In verschiedenen Publikationen wurden zwar bereits einzelne Aspekte der OSS-Entwicklung analysiert und Theorien über die zugrundeliegenden Entwicklungsmethoden formuliert, aber es existiert noch keine umfassende Beschreibung der typischen Prozesse der OSSD-Methodik, die auf einer empirischen Untersuchung existierender OSS-Entwicklungsprojekte basiert. Da dies eine Voraussetzung für die weitere wissenschaftliche Auseinandersetzung mit OSSD-Prozessen darstellt, wird im Rahmen dieser Arbeit auf der Basis vergleichender Fallstudien ein deskriptives Modell der OSSD-Prozesse hergeleitet und mit Modellierungselementen der UML formalisiert beschrieben. Das Modell generalisiert die identifizierten Prozesse, Prozessentitäten und Software-Infrastrukturen der untersuchten OSSD-Projekte. Es basiert auf einem eigens entwickelten Metamodell, welches die zu analysierenden Entitäten identifiziert und die Modellierungssichten und -elemente beschreibt, die zur UML-basierten Beschreibung der Entwicklungsprozesse verwendet werden. In einem weiteren Arbeitsschritt wird eine weiterführende Analyse des identifizierten Modells durchgeführt, um Implikationen und Optimierungspotentiale aufzuzeigen. Diese umfassen beispielsweise die ungenügende Plan- und Terminierbarkeit von Prozessen oder die beobachtete Tendenz von OSSD-Akteuren, verschiedene Aktivitäten mit unterschiedlicher Intensität entsprechend der subjektiv wahrgenommenen Anreize auszuüben, was zur Vernachlässigung einiger Prozesse führt. Anschließend werden Optimierungszielstellungen dargestellt, die diese Unzulänglichkeiten adressieren, und ein Optimierungsansatz zur Verbesserung des OSSD-Modells wird beschrieben. Dieser Ansatz umfasst die Erweiterung der identifizierten Rollen, die Einführung neuer oder die Erweiterung bereits identifizierter Prozesse und die Modifikation oder Erweiterung der Artefakte des generalisierten OSS-Entwicklungsmodells. Die vorgestellten Modellerweiterungen dienen vor allem einer gesteigerten Qualitätssicherung und der Kompensation von vernachlässigten Prozessen, um sowohl die entwickelte Software- als auch die Prozessqualität im OSSD-Kontext zu verbessern. Des Weiteren werden Softwarefunktionalitäten beschrieben, welche die identifizierte bestehende Software-Infrastruktur erweitern und eine gesamtheitlichere, softwaretechnische Unterstützung der OSSD-Prozesse ermöglichen sollen. Abschließend werden verschiedene Anwendungsszenarien der Methoden des OSS-Entwicklungsmodells, u.a. auch im kommerziellen SE, identifiziert und ein Implementierungsansatz basierend auf der OSS GENESIS vorgestellt, der zur Implementierung und Unterstützung des OSSD-Modells verwendet werden kann. / In recent years Open Source Software (OSS) has become more widespread and its popularity has grown so that it is now established in various application domains. The processes which have emerged evolutionarily within the context of OSS development (OSSD – Open Source Software Development) display, to some extent, similar properties and structures in the various OSSD projects. The involved entities, e.g., artifacts, roles or software tools, are also widely comparable. This leads to the idea of developing a generalizable model which abstracts the generalizable development processes within the context of OSS to a transferable model. Even the scientific discipline of Software Engineering (SE) has recognized that the OSSD approach is, in various aspects, considerably different from traditional (proprietary) models of SE, and that these methods therefore require their own scientific consideration. Numerous publications have already analyzed individual aspects of OSSD and formulated theories about the fundamental development methods, but to date there is still no comprehensive description of the typical processes of OSSD methodology based on an empirical study of existing OSSD projects. Since this is a precondition for the further scientific examination of OSSD processes, a descriptive model of OSSD processes is obtained on the basis of comparative case studies and described in a formalized manner with UML modeling elements within the context of this dissertation. The model generalizes the identified processes, process entities and software infrastructures of the analyzed OSSD projects. It is based on a specially developed meta model which identifies the entities to be analyzed and describes the modeling viewpoints and elements which are used for the UML-based description of the development processes. A further step is the analysis of the identified model in order to point out implications and potential for optimization. For example, these encompass the insufficient planning and scheduling capability of processes or the observed tendency of OSSD actors to carry out various activities at different intensities depending on the subjectively perceived incentives, which leads to some processes being neglected. Subsequently, the optimization targets which address these inadequacies are displayed, and an optimization approach for the improvement of the OSSD model is described. The approach incorporates the expansion of the identified roles, the introduction of new or the expansion of already identified processes and the modification or expansion of artifacts of the generalized OSSD model. The presented model enhancements serve, above all, to increase quality assurance and to compensate for neglected processes in order to improve developed software quality as well as process quality in the context of OSSD. Furthermore, software functionalities are described which expand the existing identified software infrastructure and should enable an overall, software-technical support of OSSD processes. Finally, various application scenarios of the OSSD model methods, including commercial SE, are identified and an implementation approach based on the OSS GENESIS is presented which can be used to implement and support the OSSD model.
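A minimal sketch, not the dissertation’s UML meta-model, of the kind of entities such a process meta-model distinguishes (processes, roles, artifacts and supporting tools), expressed as plain data classes; the attribute choices and the example values are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Role:
    name: str                  # e.g. "maintainer", "contributor"

@dataclass
class Artifact:
    name: str                  # e.g. "patch", "bug report", "release"

@dataclass
class Tool:
    name: str                  # e.g. "version control system", "issue tracker"

@dataclass
class Process:
    name: str
    performed_by: List[Role]
    consumes: List[Artifact] = field(default_factory=list)
    produces: List[Artifact] = field(default_factory=list)
    supported_by: List[Tool] = field(default_factory=list)

review = Process(
    name="peer review of patches",
    performed_by=[Role("maintainer")],
    consumes=[Artifact("patch")],
    produces=[Artifact("review comment")],
    supported_by=[Tool("mailing list")],
)
print(review.name, "involves", [r.name for r in review.performed_by])
```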
39

Conceptual design methodology of distributed intelligence large scale systems

Nairouz, Bassem R. 20 September 2013
Distributed intelligence systems are starting to gain dominance in the field of large-scale complex systems. These systems are characterized by nonlinear behavior patterns that can only be predicted through simulation-based engineering. In addition, the autonomy, intelligence, and reconfiguration capabilities required by certain systems introduce obstacles that add another layer of complexity. However, there exists no standard process for the design of such systems. This research presents a design methodology focusing on distributed control architectures while concurrently considering the systems design process. The methodology has two major components. First, it introduces a hybrid design process, based on the infusion of the control architecture and conceptual system design processes. The second component is the development of a control architecture metamodel that draws a distinction between control configuration and control methods. This enables a standard representation of a wide spectrum of control architecture frameworks.
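A minimal sketch, not the thesis’s metamodel, of the stated distinction between a control configuration (which controller nodes exist and how they communicate) and control methods (the algorithm each node runs); the class names, fields and example values are assumptions.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ControlMethod:
    name: str           # e.g. "PID", "rule-based supervisor"
    parameters: dict

@dataclass
class ControllerNode:
    node_id: str
    method: ControlMethod                # what this node computes

@dataclass
class ControlConfiguration:
    nodes: List[ControllerNode]          # which controllers exist
    links: List[Tuple[str, str]]         # (sender, receiver): who talks to whom

config = ControlConfiguration(
    nodes=[ControllerNode("local_1", ControlMethod("PID", {"kp": 1.2, "ki": 0.1, "kd": 0.0})),
           ControllerNode("supervisor", ControlMethod("rule-based", {}))],
    links=[("local_1", "supervisor")],
)
print(len(config.nodes), "controllers,", len(config.links), "communication links")
```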
