181

Development of Physics-based Models and Design Optimization of Power Electronic Conversion Systems

Nejadpak, Arash 21 March 2013 (has links)
The main objective of physics-based modeling of power converter components is to design the whole converter with respect to physical and operational constraints. All the elements and components of the energy conversion system are therefore modeled numerically and combined to obtain a behavioral model of the whole system. Previously proposed high-frequency (HF) models of power converters are circuit models that capture only the parasitic parameters internal to the power devices and the connections between components. This dissertation aims to obtain physics-based models for power conversion systems that not only represent the steady-state behavior of the components but also predict their high-frequency characteristics. The developed physics-based model represents the physical device with a high level of accuracy in predicting its operating condition. The proposed physics-based model enables the accurate design of elements such as effective EMI filters, switching algorithms and circuit topologies [7]. One application of the developed modeling technique is the design of new topologies for high-frequency, high-efficiency converters for variable-speed drives. The main advantage of the modeling method presented in this dissertation is the practical design of an inverter for high-power applications that can overcome the blocking-voltage limitations of available power semiconductor devices. Another advantage is the selection of the best-matching topology with an inherent reduction of switching losses, which can be exploited to improve overall efficiency. The physics-based modeling approach in this dissertation makes it possible to design any power electronic conversion system to meet electromagnetic standards and design constraints. This includes physical characteristics such as reduced package size and weight, optimized interactions with neighboring components and higher power density. In addition, the electromagnetic behaviors and signatures can be evaluated, including the study of conducted and radiated EMI interactions as well as the design of attenuation measures and enclosures.
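As a rough illustration of what a physics-based high-frequency component model adds over an ideal one, the sketch below computes the impedance of a filter capacitor including its parasitic ESR and ESL (a series R-L-C branch). The component values and the simple branch topology are illustrative assumptions, not parameters taken from the dissertation.

```python
import numpy as np

def capacitor_impedance(f, C=1e-6, esr=20e-3, esl=15e-9):
    """Impedance magnitude of a capacitor modeled as a series R-L-C branch.

    C   -- nominal capacitance in farads
    esr -- equivalent series resistance in ohms (parasitic, assumed value)
    esl -- equivalent series inductance in henries (parasitic, assumed value)
    """
    w = 2 * np.pi * f
    z = esr + 1j * (w * esl - 1.0 / (w * C))
    return np.abs(z)

freqs = np.logspace(3, 8, 6)  # 1 kHz .. 100 MHz
for f, z in zip(freqs, capacitor_impedance(freqs)):
    print(f"{f:12.0f} Hz : {z:10.4f} ohm")
# Above the self-resonant frequency 1/(2*pi*sqrt(ESL*C)) the branch turns
# inductive, which is what limits a filter's attenuation of conducted EMI.
```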
182

Advanced physical modelling of step graded Gunn Diode for high power TeraHertz sources

Amir, Faisal January 2011 (has links)
The mm-wave frequency range is being increasingly researched to close the gap between 100 and 1000 GHz, the least explored region of the electromagnetic spectrum, often termed the 'THz Gap'. The ever-increasing demand for compact, portable and reliable THz (terahertz) devices and the huge market potential for THz systems have led to an enormous amount of research and development in the area for a number of years. The Gunn diode is expected to play a significant role in the development of low-cost solid-state oscillators, which will form an essential part of these THz systems. Gunn and mixer diodes will 'power' future THz systems. The THz frequency generation methodology is based on a two-stage module. The initial frequency source is provided by a high-frequency Gunn diode and is the main focus of this work. The output from this diode is then coupled into a multiplier module. The multiplier provides higher frequencies by generating harmonics of the input signal by means of a non-linear element, such as a Schottky diode varactor. A realistic Schottky diode model developed in SILVACO is presented in this work. This thesis describes the work done to develop predictive models for Gunn diode devices using SILVACO. These physically-based simulations provide the opportunity to increase understanding of the effects of changes to the device's physical structure, theoretical concepts and its general operation. A thorough understanding of device physics was achieved to develop a reliable Gunn diode model. The model development included building the device's physical structure, specifying material properties, defining the physical models and applying appropriate biasing conditions. The initial goal of the work was to develop a 2D model for a Gunn diode commercially manufactured by e2v Technologies Plc. for use in second-harmonic-mode 77 GHz Intelligent Adaptive Cruise Control (ACC) systems for automobiles. This particular device was chosen as its operation is well understood and a wealth of data is available for validation of the developed physical model. The comparisons of modelled device results with measured results of a manufactured device are discussed in detail. Both the modelled and measured devices yielded similar I-V characteristics and so validated the choice of the physical models selected for the simulations. During the course of this research, 2D, 3D rectangular and 3D cylindrical device structures were modelled and compared to measured results. The injector doping spike concentration was varied to study its influence on the electric field in the transit region, and the results were compared with published and measured data. Simulated DC characteristics were also compared with measured results for higher-frequency devices. The devices mostly correspond to material previously grown for experimental studies in the development of D-band GaAs Gunn devices. Ambient temperature variations were also included in both simulated and measured data. Transient solutions were used to obtain time-dependent responses, such as determining the device's oscillating frequency under bias. These solutions provided modelled device time-domain responses. The time-domain simulations of the higher-frequency devices developed with this modelling approach are discussed; the studied devices include 77 GHz (2nd harmonic), 125 GHz (2nd harmonic) and 100 GHz fundamental devices. During the course of this research, twelve research papers were disseminated.
The results obtained showed that the modelling techniques used provide predictive models for novel Transferred Electron Devices (TEDs) operating above 100 GHz.
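The physical basis of Gunn-domain oscillation is the negative differential mobility of GaAs: above a critical field, electrons transfer to a low-mobility satellite valley and the drift velocity falls as the field rises. The sketch below evaluates a commonly used empirical velocity-field expression; the functional form and parameter values are textbook-style assumptions, not the calibrated SILVACO models from this thesis.

```python
import numpy as np

def gaas_drift_velocity(E, mu0=8000.0, v_sat=1.0e7, E_c=4.0e3):
    """Empirical GaAs velocity-field curve (transferred-electron model).

    E     -- electric field in V/cm
    mu0   -- low-field mobility in cm^2/(V*s)   (illustrative value)
    v_sat -- high-field saturation velocity in cm/s
    E_c   -- critical field in V/cm near which the velocity peak occurs
    """
    return (mu0 * E + v_sat * (E / E_c) ** 4) / (1.0 + (E / E_c) ** 4)

for E in [1e3, 3e3, 4e3, 6e3, 10e3, 20e3]:
    print(f"E = {E / 1e3:5.1f} kV/cm -> v = {gaas_drift_velocity(E):.2e} cm/s")
# The velocity peaks near E_c and then falls toward v_sat: this negative
# differential mobility region is what sustains Gunn-domain oscillations.
```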
183

Multiagentní simulace - státní zásahy do trhu s nájemními byty / Multiagent simulation - State interventions into rental housing market

Janovský, Lukáš January 2011 (has links)
The thesis focuses on the use of multiagent systems to model the rental housing market. The primary aim was to create a simulation that would bring a new perspective on the development of the entire market. For this purpose I selected a relatively young methodology called Agentology, which was itself subjected to criticism once the model was finished; that critique was the secondary objective of this thesis. The work is divided into two parts. The first, theoretical part describes the rental housing market and discusses the most important factors affecting its state. It also describes the most significant state interventions into the market, as known from official housing policies. The reader is then introduced to the basic principles of multiagent modeling, and this chapter also gives an overview of selected methodologies for multiagent systems, one of which is chosen and applied in the later phases of the work. The second part presents the multiagent model itself. The market model is assembled using the Agentology methodology, which accompanies all stages of model development, from task formulation through the conceptual and technological levels to the final evaluation. The work strictly adheres to the methodology and all its recommendations. The result is a model whose functionality has been verified by analyzing the output data. Finally, the thesis offers a critique of the Agentology methodology, based on the experience gained during the development. It evaluates concrete steps as well as the methodology as a whole in terms of admittance, integrity and practicality.
184

Modélisation du système complexe de la publication scientifique / Modeling the complex system of scientific publication

Kovanis, Michail 02 October 2017 (has links)
Le système d’évaluation par les pairs est le gold standard de la publication scientifique. Ce système a deux objectifs: d’une part filtrer les articles scientifiques erronés ou non pertinents et d’autre part améliorer la qualité de ceux jugés dignes de publication. Le rôle des revues scientifiques et des rédacteurs en chef est de veiller à ce que des connaissances scientifiques valides soient diffusées auprès des scientifiques concernés et du public. Cependant, le système d’évaluation par les pairs a récemment été critiqué comme étant intenable sur le long terme, inefficace et cause de délais de publication des résultats scientifiques. Dans ce projet de doctorat, j’ai utilisé une modélisation par systèmes complexes pour étudier le comportement macroscopique des systèmes de publication et d’évaluation par les pairs. Dans un premier projet, j’ai modélisé des données empiriques provenant de diverses sources comme Pubmed et Publons pour évaluer la viabilité du système. Je montre que l’offre dépasse de 15% à 249% la demande d’évaluation par les pairs et, par conséquent, le système est durable en termes de volume. Cependant, 20% des chercheurs effectuent 69% à 94% des revues d’articles, ce qui souligne un déséquilibre significatif en termes d’efforts de la communauté scientifique. Les résultats ont permis de réfuter la croyance largement répandue selon laquelle la demande d’évaluation par les pairs dépasse largement l’offre mais ont montré que la majorité des chercheurs ne contribue pas réellement au processus. Dans mon deuxième projet, j’ai développé un modèle par agents à grande échelle qui imite le comportement du système classique d’évaluation par les pairs, et que j’ai calibré avec des données empiriques du domaine biomédical. En utilisant ce modèle comme base pour mon troisième projet, j’ai modélisé cinq systèmes alternatifs d’évaluation par les pairs et évalué leurs performances par rapport au système conventionnel en termes d’efficacité de la revue, de temps passé à évaluer des manuscrits et de diffusion de l’information scientifique. Dans mes simulations, les deux systèmes alternatifs dans lesquels les scientifiques partagent les commentaires sur leurs manuscrits rejetés avec les éditeurs du prochain journal auquel ils les soumettent ont des performances similaires au système classique en termes d’efficacité de la revue. Le temps total consacré par la communauté scientifique à l’évaluation des articles est cependant réduit d’environ 63%. En ce qui concerne la dissémination scientifique, le temps total de la première soumission jusqu’à la publication est diminué d’environ 47% et ces systèmes permettent de diffuser entre 10% et 36% plus d’informations scientifiques que le système conventionnel. Enfin, le modèle par agents développé peut être utilisé pour simuler d’autres systèmes d’évaluation par les pairs ou des interventions, pour ainsi déterminer les interventions ou modifications les plus prometteuses qui pourraient être ensuite testées par des études expérimentales en vie réelle. / The peer-review system is undoubtedly the gold standard of scientific publication. Peer review serves a two-fold purpose; to screen out of publication articles containing incorrect or irrelevant science and to improve the quality of the ones deemed suitable for publication. Moreover, the role of the scientific journals and editors is to ensure that valid scientific knowledge is disseminated to the appropriate target group of scientists and to the public. 
However, the peer-review system has recently been criticized as unsustainable, inefficient and slow to publish scientific results. In this PhD thesis, I used complex-systems modeling to study the macroscopic behavior of the scientific publication and peer-review systems. In my first project, I modeled empirical data from various sources, such as PubMed and Publons, to assess the sustainability of the system. I showed that the potential supply has been exceeding the demand for peer review by 15% to 249%, and thus the system is sustainable in terms of volume. However, 20% of researchers have been performing 69% to 94% of the annual reviews, which emphasizes a significant imbalance in terms of effort by the scientific community. The results provided evidence contrary to the widely adopted but untested belief that the demand for peer review far exceeds the supply, and they indicated that the majority of researchers do not contribute to the process. In my second project, I developed a large-scale agent-based model, which mimicked the behavior of the conventional peer-review system. This model was calibrated with empirical data from the biomedical domain. Using this model as a base for my third project, I developed and assessed the performance of five alternative peer-review systems by measuring peer-review efficiency, reviewer effort and scientific dissemination as compared to the conventional system. In my simulations, two alternative systems, in which scientists shared past reviews of their rejected manuscripts with the editors of the next journal to which they submitted, performed equally well or sometimes better in terms of peer-review efficiency. They also each reduced the overall reviewer effort by ~63%. In terms of scientific dissemination, they decreased the median time from first submission until publication by ~47% and diffused on average 10% to 36% more scientific information (i.e., manuscript intrinsic quality x journal impact factor) than the conventional system. Finally, my agent-based model may be used to simulate alternative peer-review systems (or interventions), find those that are the most promising and aid decisions about which systems may be introduced into real-world trials.
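To make the review-sharing mechanism concrete, here is a toy simulation contrasting a conventional journal ladder (two fresh reviews at every resubmission) with a sharing variant in which later editors reuse earlier reports. It is a deliberately minimal illustration of why sharing reduces reviewer effort; the acceptance thresholds, noise model and review counts are assumptions, not the calibrated agent-based model from the thesis.

```python
import random

random.seed(1)

def submit(quality, journals, share_reviews=False):
    """Submit one manuscript down a journal ladder; return reviews performed.

    quality       -- intrinsic quality in [0, 1]
    journals      -- list of acceptance thresholds, most selective first
    share_reviews -- if True, later editors reuse the earlier reports
                     (one fresh review instead of two per resubmission)
    """
    reviews_done = 0
    for i, threshold in enumerate(journals):
        reviews_done += 2 if (i == 0 or not share_reviews) else 1
        noisy_score = quality + random.gauss(0, 0.1)   # imperfect reviewing
        if noisy_score >= threshold:
            return reviews_done, True
    return reviews_done, False

journals = [0.8, 0.6, 0.4, 0.2]   # illustrative acceptance thresholds

for mode in (False, True):
    total = 0
    for _ in range(10_000):
        reviews, _accepted = submit(random.random(), journals, share_reviews=mode)
        total += reviews
    label = "shared reviews" if mode else "conventional  "
    print(f"{label}: {total / 10_000:.2f} reviews per manuscript")
```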
185

Espaces virtuels pour l’éducation et l’illustration scientifiques : contribution à l’appréhension de la Théorie de la Relativité Restreinte par la réalité virtuelle / Virtual spaces for scientific exploration and education : contribution to the apprehension of the Theory of Special Relativity through virtual reality

Doat, Tony 20 September 2012 (has links)
La Théorie de la Relativité (TRR), est une théorie particulièrement contre-intuitive dont les implications sont inaccessibles à l'expérience sensible humaine ; ce qui pose un certain nombre de difficultés de compréhension aux étudiants. Cependant, la Réalité Virtuelle (RV) offre une approche intéressante en permettant à un utilisateur d'être immergé et d'interagir dans un monde virtuel où la vitesse de la lumière est ramenée à 1 m/s. Les phénomènes relativistes deviennent ainsi directement accessibles par ses sens. Cette caractéristique, point de départ de nos travaux, permet alors d'appréhender les phénomènes relativistes par une expérience « par la pratique ». L'enjeu de notre travail porte plus précisément sur la définition de moyens et de méthodes intégrés dans une plate-forme immersive permettant d'appréhender les phénomènes relativistes. Dans ce contexte, nous proposons, tout d'abord, des méthodes novatrices pour simuler les phénomènes relativistes sur un nombre quelconque d'objets en mouvement arbitraire et tenant compte de la dynamique relativiste des objets dans la scène, notamment durant leurs interactions. Nous nous focalisons sur les effets qui déforment les objets vus par l'observateur, à savoir le délai de propagation des photons, la relativité des longueurs et l'effet d'aberration. Nous définissons ensuite des méthodes pour intégrer une simulation relativiste dans un environnement immersif basé intrinsèquement sur un monde newtonien. Nous proposons également une plate-forme expérimentale dans laquelle sont intégrées des méthodes d'interaction utilisées pour mettre en scène un « jeu sérieux », ici un billard relativiste. Enfin, nous démontrons la portée de notre outil expérimental par deux voies : l'une concerne l'utilisation de l'application dans des évaluations de didactique et l'autre concerne un exemple d'extension de l'outil pour mettre en lumière un autre aspect de la Physique relativiste : la relation entre vitesse et énergie. / The Theory of Special Relativity (TSR) is a particularly counterintuitive theory. Its implications are, by nature, out of reach of human experience. Therefore we cannot perceive its effects directly, which raises problems of comprehension for the students confronted with it. However, Virtual Reality (VR) enables us to overcome this limitation by immersing a user into a world where the velocity of light is reduced to 1 m/s. As a result, the relativistic phenomena become directly perceivable through our senses. This possibility, which is the cornerstone of our work, brings a unique way to apprehend relativistic phenomena through a "hands-on" experiment. In this context, we propose, first, innovative methods to include relativistic effects in a simulation containing any number of objects moving with arbitrary directions and velocities, taking into account the relativistic dynamics of the objects, including object-to-object interaction. We focused on the relativistic phenomena involved in the deformation of objects: the delay of propagation of the photons from the light source to the observer, as well as the relativity of length and the aberration of light. We describe, second, methods to integrate the simulation techniques previously introduced into an immersive environment intrinsically based on Newtonian physics. We also provide interaction methods and a concrete application in a serious-game framework: a relativistic carom billiard.
Finally, we demonstrate the possibilities of our platform in two ways: one tackles its usage in the context of learning evaluation and the other is an extension of the tool to access new pieces of information relevant to TSR, such as the force profile used to launch an object with a relativistic velocity.
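For readers who want to see what "c = 1 m/s" does to everyday objects, the short sketch below evaluates two of the standard effects mentioned above, length contraction and the aberration of light, for a ball moving at 0.9 m/s. It uses only the textbook formulas; it is not the thesis's rendering pipeline, and the photon propagation delay is omitted.

```python
import math

C = 1.0  # speed of light reduced to 1 m/s, as in the virtual environment

def gamma(v):
    """Lorentz factor for speed v (requires |v| < C)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def contracted_length(rest_length, v):
    """Length of an object moving at speed v, measured along its motion."""
    return rest_length / gamma(v)

def aberrated_angle(theta, v):
    """Apparent arrival direction (rad) of a light ray at angle theta,
    for an observer moving at speed v toward theta = 0."""
    beta = v / C
    return math.acos((math.cos(theta) + beta) / (1.0 + beta * math.cos(theta)))

v = 0.9  # a billiard ball at 0.9 m/s, i.e. 0.9 c in this world
print(f"gamma             : {gamma(v):.3f}")
print(f"1 m cue stick     : {contracted_length(1.0, v):.3f} m along the motion")
print(f"ray at 90 degrees : seen at {math.degrees(aberrated_angle(math.pi / 2, v)):.1f} deg")
```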
186

Simulation of Dengue Outbreak in Thailand

Meesumrarn, Thiraphat 08 1900 (has links)
The dengue virus has become widespread worldwide in recent decades. It has no specific treatment and affects more than 40% of the entire population in the world. In Thailand, dengue has been a health concern for more than half a century. The highest number of cases in one year was 174,285 in 1987, leading to 1,007 deaths. In the present day, dengue is distributed throughout the entire country. Therefore, dengue has become a major challenge for public health in terms of both prevention and control of outbreaks. Different methodologies and ways of dealing with dengue outbreaks have been put forward by researchers. Computational models and simulations play an important role, as they have the ability to help researchers and officers in public health gain a greater understanding of the virus's epidemic activities. In this context, this dissertation presents a new framework, Modified Agent-Based Modeling (mABM), a hybrid platform between a mathematical model and a computational model, to simulate a dengue outbreak in human and mosquito populations. This framework improves on the realism of former models by utilizing the reported data from several Thai government organizations, such as the Thai Ministry of Public Health (MoPH), the National Statistical Office, and others. Additionally, its implementation takes into account the geography of Thailand, as well as synthetic mosquito and synthetic human populations. mABM can be used to represent human behavior in a large population across variant distances by specifying demographic factors and assigning mobility patterns for weekdays, weekends, and holidays for the synthetic human population. The mosquito dynamic population model (MDP), which is a component of the mABM framework, is used for representing the synthetic mosquito population dynamic and their ecology by integrating the regional model to capture the effect of dengue outbreak. The two synthetic populations can be linked to each other for the purpose of presenting their interactions, and the Local Stochastic Contact Model for Dengue (LSCM-DEN) is utilized. For validation, the number of cases from the experiment is compared to reported cases from the Thailand Vector Borne Disease Bureau for the selected years. This framework facilitates model configuration for sensitivity analysis by changing parameters, such as travel routes and seasonal temperatures. The effects of these parameters were studied and analyzed for an improved understanding of dengue outbreak dynamics.
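As a point of orientation for readers unfamiliar with vector-borne transmission dynamics, the compartmental sketch below couples a human SIR model to a mosquito SI model in the spirit of Ross-Macdonald. It is a deterministic toy, not the stochastic, geography-aware mABM/MDP/LSCM-DEN framework described above, and every rate in it is an illustrative assumption.

```python
import numpy as np

def vector_host_sir(days=200, dt=0.1,
                    beta_hv=0.30,   # mosquito-to-human transmission rate per day (assumed)
                    beta_vh=0.25,   # human-to-mosquito transmission rate per day (assumed)
                    gamma=1 / 7,    # human recovery rate (1/days)
                    mu_v=1 / 14,    # mosquito turnover rate (1/days)
                    m=2.0):         # mosquitoes per human
    """Deterministic Ross-Macdonald-style host-vector model (population fractions)."""
    S_h, I_h, R_h = 0.999, 0.001, 0.0   # human compartments
    S_v, I_v = 1.0, 0.0                 # mosquito compartments
    history = []
    for _ in range(int(days / dt)):
        new_h = beta_hv * m * I_v * S_h   # humans infected by infectious mosquitoes
        new_v = beta_vh * I_h * S_v       # mosquitoes infected by infectious humans
        rec = gamma * I_h
        S_h += dt * (-new_h)
        I_h += dt * (new_h - rec)
        R_h += dt * rec
        S_v += dt * (mu_v * (1.0 - S_v) - new_v)   # births replace dying mosquitoes
        I_v += dt * (new_v - mu_v * I_v)
        history.append(I_h)
    return np.array(history)

curve = vector_host_sir()
print(f"peak human prevalence {curve.max():.3f} around day {curve.argmax() * 0.1:.0f}")
```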
187

Augmented Reality Assistenzsystem mit graphenbasierter Zustandsanalyse für Produkte im Internet der Dinge / Augmented reality assistance system with graph-based state analysis for products in the Internet of Things

Neges, Matthias, Wolf, Mario, Abramovici, Michael January 2016 (has links)
From the introduction: "Through the networking of products in the Internet of Things (IoT) and the resulting availability of data, products can not only act, react and trigger actions autonomously; external recipients can also evaluate the data they deliver and use it for additional services (Eisenhauer 2007, Abramovici et al. 2014). Among other things, this holds enormous potential for the maintenance of technical plants (Wohlgemut 2007). Such plants or production facilities are, as a rule, complex systems made up of a heterogeneous landscape of subsystems. Without prior knowledge of a machine, analyzing or inspecting such systems is difficult to impossible. Furthermore, the technical documentation and maintenance histories are often not fully available on site during such work, or only in paper form, while the current status of the plant cannot be overlaid with the available information. ..."
188

Die Rolle der Ökonomik in der Wissenschaftsphilosophie: Eine kritische Würdigung aus Sicht der Economics of Scientific Knowledge und eine Agentenbasierte Modellierung zur Konsensbildung mit eingeschränkt rationalen, adaptiv handelnden heterogenen Akteuren / The role of economics in the philosophy of science: a critical appraisal from the perspective of the Economics of Scientific Knowledge and an agent-based model of consensus formation among boundedly rational, adaptively acting heterogeneous actors

Baier, Melanie 19 December 2016 (has links)
The dissertation addresses, in particular, the role of economics at the meta-level of argumentation in the philosophy of science. The aim is to clarify what explanatory power economic instruments can have within the philosophy of science. Since the mid-1990s, the Economics of Scientific Knowledge (ESK) has emerged as a strand of literature pursuing exactly this goal, namely investigating scientific coordination as an object of inquiry with the various methods and instruments of economics. It is shown that the analytical models of the ESK carry some inherent problems which can, in principle, be solved by new methods and instruments. Agent-based modeling (ABM) is identified as a suitable candidate, since it allows a more realistic representation of the actors and an open-ended modeling of their decisions and of the coordination process. Following the analysis of analytical and agent-based models attributable to the ESK, the second part of the dissertation presents the programming of an own ABM, Continuous Opinions of Satisficing Agents and Discrete Actions (COSDA), using the multi-agent programming language NetLogo. In the heuristic ABM COSDA, central premises of the philosophy of science and of economics, identified as problem areas in the first part of the thesis, are abandoned. By modeling heterogeneous agent types which, equipped with different preferences and behavioral heuristics, interact with one another, a possible micro-specification for the emergence of a macro-phenomenon is generated. The macro-phenomenon, i.e. the differing outcomes of the scientific coordination process, can be explained by the self-reinforcing effects of interaction but cannot be predicted. The micro-specification can be interpreted as a relevant possibility, formulated as a coherent fiction, which, unlike the analytical models of the ESK, does not presuppose a rational decision calculus on the part of the agents.
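To give a flavor of how consensus or fragmentation can emerge from boundedly rational, heterogeneous agents, here is a generic bounded-confidence opinion-dynamics sketch (Deffuant-style pairwise adjustment). It is not a reimplementation of COSDA or of its NetLogo code; agent counts, tolerance ranges and adaptation rates are arbitrary illustrative choices.

```python
import random

random.seed(42)

def simulate(n_agents=100, steps=20_000):
    """Bounded-confidence opinion dynamics with heterogeneous agents.

    Each agent holds a continuous opinion in [0, 1], an individual tolerance
    (how different an opinion it is still willing to consider) and an
    individual adaptation rate -- a crude stand-in for boundedly rational,
    adaptively acting heterogeneous actors.
    """
    opinions  = [random.random() for _ in range(n_agents)]
    tolerance = [random.uniform(0.05, 0.35) for _ in range(n_agents)]
    adapt     = [random.uniform(0.10, 0.50) for _ in range(n_agents)]

    for _ in range(steps):
        i, j = random.sample(range(n_agents), 2)
        gap = opinions[j] - opinions[i]
        if abs(gap) < tolerance[i]:      # i only listens within its tolerance
            opinions[i] += adapt[i] * gap
        if abs(gap) < tolerance[j]:
            opinions[j] -= adapt[j] * gap

    # count emergent opinion clusters (coarse binning)
    clusters = {round(o, 1) for o in opinions}
    return sorted(clusters)

print("surviving opinion clusters:", simulate())
```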
189

A Family of Role-Based Languages

Kühn, Thomas 24 March 2017 (has links)
Role-based modeling was proposed in 1977 by Charles W. Bachman as a means to model complex and dynamic domains, because roles are able to capture both context-dependent and collaborative behavior of objects. Consequently, they were introduced in various fields of research, ranging from data modeling via conceptual modeling through to programming languages. More importantly, because current software systems are characterized by increased complexity and context-dependence, there is a strong demand for new concepts beyond object-oriented design. Although mainstream modeling languages, e.g., the Entity-Relationship Model and the Unified Modeling Language, are good at capturing a system's structure, they lack ways to model the system's behavior as it dynamically emerges through collaborating objects. In turn, roles are a natural concept for capturing the behavior of participants in a collaboration. Moreover, roles permit the specification of interactions independent from the interacting objects. Similarly, more recent approaches use roles to capture context-dependent properties of objects. The notion of roles can help to tame the increased complexity and context-dependence. Despite all that, these years of research have had almost no influence on current software development practice. To make things worse, until now there has been no common understanding of roles in the research community, and no approach fully incorporates both the context-dependent and the relational nature of roles. In this thesis, I will devise a formal model for a family of role-based modeling languages to capture the various notions of roles. Together with a software product line of Role Modeling Editors, this, in turn, enables the generation of a role-based language family for Role-based Software Infrastructures (RoSI).
Table of contents:
Part I: Review of Contemporary Role-based Languages
1 Introduction: 1.1 Background; 1.2 Motivation; 1.3 Problem Definition; 1.4 Outline
2 Nature of Roles: 2.1 Running Example; 2.2 Behavioral Nature; 2.3 Relational Nature; 2.4 Context-Dependent Nature; 2.5 Constraints in Role-Based Languages; 2.6 Classification of Roles
3 Systematic Literature Review: 3.1 Method; 3.2 Results; 3.3 Discussion
4 Contemporary Role-Based Modeling Languages: 4.1 Behavioral and Relational Modeling Languages (4.1.1 Lodwick; 4.1.2 The Generic Role Model; 4.1.3 Role-Based Metamodeling Language (RBML); 4.1.4 Role-Based Pattern Specification; 4.1.5 Object-Role Modeling (ORM) 2; 4.1.6 OntoUML); 4.2 Context-Dependent Modeling Languages (4.2.1 Metamodel for Roles; 4.2.2 E-CARGO Model; 4.2.3 Data Context Interaction (DCI)); 4.3 Combined Modeling Languages (4.3.1 Taming Agents and Objects (TAO); 4.3.2 Information Networking Model (INM); 4.3.3 Helena Approach)
5 Contemporary Role-based Programming Languages: 5.1 Behavioral Programming Languages (5.1.1 Chameleon; 5.1.2 Java with Roles (JAWIRO); 5.1.3 Rava; 5.1.4 JavaStage); 5.2 Relational Programming Languages (5.2.1 Rumer; 5.2.2 First Class Relationships; 5.2.3 Relations); 5.3 Context-Dependent Programming Languages (5.3.1 EpsilonJ and NextEJ; 5.3.2 Role/Interaction/Communicative Action (RICA); 5.3.3 ObjectTeams/Java; 5.3.4 PowerJava; 5.3.5 Scala Roles)
6 Comparison of Role-based Languages: 6.1 Comparison of Role-Based Modeling Languages; 6.2 Comparison of Role-Based Programming Languages; 6.3 Results and Findings
Part II: Family of Role-Based Modeling Languages
7 Foundations of Role-Based Modeling Languages: 7.1 Ontological Foundation (7.1.1 Metaproperties; 7.1.2 Classifying Modeling Concepts); 7.2 Graphical Notation (7.2.1 Model Level Notation; 7.2.2 Graphical Modeling Constraints; 7.2.3 Instance Level Notation); 7.3 Formalization of Roles (7.3.1 Model Level; 7.3.2 Instance Level; 7.3.3 Constraint Level); 7.4 Reintroducing Inheritance (7.4.1 Extending the Banking Application; 7.4.2 Model Level Extensions; 7.4.3 Instance Level Extensions; 7.4.4 Constraint Level Extensions); 7.5 Reference Implementation (7.5.1 Translation of Logical Formulae; 7.5.2 Structure of the Reference Implementation; 7.5.3 Specifying and Verifying Role Models); 7.6 Full-Fledged Role Modeling Editor (7.6.1 Software Architecture; 7.6.2 Illustrative Example; 7.6.3 Additional Tool Support)
8 Family of Role-Based Modeling Languages: 8.1 Family of Metamodels for Role-Based Modeling Languages (8.1.1 Feature Model for Role-Based Languages; 8.1.2 Feature Minimal Metamodel; 8.1.3 Feature Complete Metamodel; 8.1.4 Mapping Features to Variation Points; 8.1.5 Implementation of the Metamodel Generator); 8.2 First Family of Role Modeling Editors (8.2.1 Dynamic Feature Configuration; 8.2.2 Architecture of the Dynamic Software Product Line; 8.2.3 Applicability of the Language Family Within RoSI)
9 Conclusion: 9.1 Summary; 9.2 Contributions; 9.3 Comparison with Contemporary Role-Based Modeling Languages; 9.4 Future Research
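For readers unfamiliar with the role concept, the fragment below sketches the usual player/role/compartment split in plain Python: a context-independent object (Person) plays a context-dependent role (Customer) inside a compartment (Bank). It is only a generic illustration of the idea, not the formal model, graphical notation, or RoSI tooling developed in the thesis.

```python
class Person:
    """A plain object ('player') with context-independent state."""
    def __init__(self, name):
        self.name = name

class Customer:
    """A role: context-dependent state and behavior attached to a player."""
    def __init__(self, player, account_no):
        self.player = player
        self.account_no = account_no

    def greet(self):
        return f"{self.player.name} acts as customer {self.account_no}"

class Bank:
    """A compartment/context in which roles exist and collaborate."""
    def __init__(self):
        self._roles = {}

    def enroll(self, person, account_no):
        self._roles[person] = Customer(person, account_no)

    def customers(self):
        return list(self._roles.values())

alice = Person("Alice")
bank = Bank()
bank.enroll(alice, "DE-001")   # Alice plays the Customer role *in* this bank
print(bank.customers()[0].greet())
```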
190

A Metamodel Family for Role-Based Modeling and Programming Languages

Kühn, Thomas, Leuthäuser, Max, Götz, Sebastian, Seidl, Christoph, Aßmann, Uwe 05 July 2021 (has links)
Role-based modeling was proposed almost 40 years ago as a means to model complex and dynamic domains, because roles are able to capture both context-dependent and collaborative behavior of objects. Unfortunately, while several researchers have introduced the notion of roles into modeling and programming languages, only a few have captured both the relational and the context-dependent nature of roles. In this work, we classify various proposals since 2000 and show the discontinuity and fragmentation of the whole research field. To overcome this discontinuity, we propose a family of metamodels for role-based modeling languages. Each family member corresponds to a design decision captured in a feature model. In this way, it becomes feasible to generate a metamodel for each role-based approach. This allows for the combination and improvement of the different role-based modeling and programming languages and paves the way to reconciling the research field.
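As a rough sketch of the generation idea, the snippet below maps a feature selection onto the metamodel elements it implies, echoing the notion of deriving one metamodel per configuration of design decisions. The feature and element names are hypothetical placeholders, not the actual feature model or metamodel from the paper.

```python
# Hypothetical feature names; a real feature model would also encode
# constraints between features (requires/excludes).
ALL_FEATURES = {
    "role_types", "compartment_types", "relationship_types",
    "role_inheritance", "occurrence_constraints",
}

def generate_metamodel(selected):
    """Map a feature selection to the metamodel elements it implies."""
    unknown = selected - ALL_FEATURES
    if unknown:
        raise ValueError(f"unknown features: {unknown}")
    elements = {"NaturalType", "RoleType"}          # minimal core
    if "compartment_types" in selected:
        elements.add("CompartmentType")
    if "relationship_types" in selected:
        elements.add("RelationshipType")
    if "role_inheritance" in selected:
        elements.add("RoleInheritance")
    if "occurrence_constraints" in selected:
        elements.add("OccurrenceConstraint")
    return elements

# One 'family member': a language with compartments and relationships
print(generate_metamodel({"compartment_types", "relationship_types"}))
```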
