  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Behavioural Model Fusion

Nejati, Shiva 19 January 2009 (has links)
In large-scale model-based development, developers periodically need to combine collections of interrelated models. These models may capture different features of a system, describe alternative perspectives on a single feature, or express ways in which different features alter one another's structure or behaviour. We refer to the process of combining a set of interrelated models as "model fusion". A number of factors make model fusion complicated. Models may overlap, in that they refer to the same concepts, but these concepts may be presented differently in each model, and the models may contradict one another. Models may describe independent system components, but the components may interact, potentially causing undesirable side effects. Finally, models may cross-cut, modifying one another in ways that violate their syntactic or semantic properties. In this thesis, we study three instances of the fusion problem for "behavioural models", motivated by real-world applications. The first problem is combining "partial" models of a single feature with the goal of creating a more complete description of that feature. The second problem is maintenance of "variant" specifications of individual features. The goal here is to combine the variants while preserving their points of difference (i.e., variabilities). The third problem is analysis of interactions between models describing "different" features. Specifically, given a set of features, the goal is to construct a composition such that undesirable interactions are absent. We provide an automated tool-supported solution to each of these problems and evaluate our solutions. The main novelties of the techniques presented in this thesis are (1) preservation of semantics during the fusion process, and (2) applicability to large and evolving collections of models. 
These are made possible by explicit modelling of partiality, variability and regularity in behavioural models, and providing semantic-preserving notions for relating these models.
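As an illustration only — not the thesis's actual algorithm — the first fusion problem, combining two "partial" descriptions of the same feature, might be sketched as set operations on transition relations, where behaviour described by both models is treated as agreed and behaviour described by only one model is kept but left unconfirmed (all names and the data layout here are our assumptions):

```python
# Hypothetical sketch of fusing two partial state-machine models of one
# feature. A model is a set of (state, event, next_state) transitions.
def fuse(model_a, model_b):
    """Return (agreed, unconfirmed) transition sets for two partial models."""
    agreed = model_a & model_b       # behaviour both models describe
    unconfirmed = model_a ^ model_b  # behaviour only one model describes
    return agreed, unconfirmed

a = {("idle", "start", "run"), ("run", "stop", "idle")}
b = {("idle", "start", "run"), ("run", "pause", "held")}
agreed, unconfirmed = fuse(a, b)
```

A real semantics-preserving merge, as the thesis stresses, must also reconcile differently named but overlapping concepts and detect contradictions, which this toy union cannot do.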
3

JigCell Model Connector: Building Large Molecular Network Models from Components

Jones, Thomas Carroll Jr. 28 June 2017 (has links)
The ever-growing size and complexity of molecular network models make them difficult to construct and understand. Modifying a model that consists of tens of reactions is no easy task; attempting the same on a model containing hundreds of reactions can seem nearly impossible. We present the JigCell Model Connector, a software tool that supports large-scale molecular network modeling. Our approach to developing large models is to combine smaller models, making the result easier to comprehend. At the base, the smaller models (called modules) are defined by small collections of reactions. Modules connect to form larger modules through clearly defined interfaces, called ports. In this work, we enhance the port concept by defining different types of ports: not all modules connect in the same way, so multiple connection options need to exist. / Master of Science
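The module-and-port idea can be sketched roughly as follows (class and port names are ours, invented for illustration, and do not reflect the JigCell Model Connector's actual API): modules expose typed ports, and only compatible ports may be wired together.

```python
# Illustrative sketch of modules connected via typed ports.
class Module:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = dict(inputs)    # port name -> port type
        self.outputs = dict(outputs)  # port name -> port type

def connect(src, out_port, dst, in_port):
    """Wire an output port to an input port, checking type compatibility."""
    if src.outputs[out_port] != dst.inputs[in_port]:
        raise TypeError("incompatible port types")
    return (src.name, out_port, dst.name, in_port)

# Hypothetical example: a cell-cycle module feeding a checkpoint module.
clock = Module("cell_cycle_clock", {}, {"cyclin": "species"})
checkpoint = Module("dna_checkpoint", {"cyclin": "species"}, {})
wire = connect(clock, "cyclin", checkpoint, "cyclin")
```

The type check stands in for the different port kinds the abstract mentions: distinguishing port types is what lets a tool reject connections that would not make biological sense.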
4

Mathematical Framework for Early System Design Validation Using Multidisciplinary System Models

Larson, Bradley Jared 09 March 2012 (has links) (PDF)
A significant challenge in the design of multidisciplinary systems (e.g., airplanes, robots, cell phones) is to predict the effects of design decisions at the time these decisions are being made early in the design process. These predictions are used to choose among design options and to validate design decisions. System behavioral models, which predict a system's response to stimulus, provide an analytical method for evaluating a system's behavior. Because multidisciplinary systems contain many different types of components that have diverse interactions, system behavioral models are difficult to develop early in system design and are challenging to maintain as designs are refined. This research develops methods to create, verify, and maintain multidisciplinary system models developed from models that are already part of system design. First, this research introduces a system model formulation that enables virtually any existing engineering model to become part of a large, trusted population of component models from which system behavioral models can be developed. Second, it creates a new algorithm to efficiently quantify the feasible domain over which the system model can be used. Finally, it quantifies system model accuracy early in system design, before system measurements are available, so that system models can be used to validate system design decisions. The results of this research enable system designers to evaluate the effects of design decisions early in system design, improve the predictability of the system design process, and allow exploration of system designs that differ greatly from existing solutions.
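One toy reading of the "feasible domain" idea — our simplification, not the thesis's algorithm — is that each component model declares the input range over which it is valid, and the composed system model is usable only on the intersection of those ranges:

```python
# Hypothetical sketch: the system model's feasible domain as the
# intersection of its component models' validity intervals.
def feasible_domain(intervals):
    """Intersect validity intervals [(lo, hi), ...]; None if empty."""
    lo = max(l for l, _ in intervals)
    hi = min(h for _, h in intervals)
    return (lo, hi) if lo <= hi else None

# e.g. a motor model valid for 0-100 W combined with a gearbox model
# valid for 20-150 W (made-up numbers)
dom = feasible_domain([(0.0, 100.0), (20.0, 150.0)])
```

The research quantifies this domain for far richer, multidimensional models, but the intersection intuition is the same: the system model is only trustworthy where every component model is.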
5

Model Composition and Aggregation in Macromolecular Regulatory Networks

Randhawa, Ranjit 14 May 2008 (has links)
Mathematical models of regulatory networks become more difficult to construct and understand as they grow in size and complexity. Large regulatory network models can be built up from smaller models, representing subsets of reactions within the larger network. This dissertation focuses on novel model construction techniques that extend the ability of biological modelers to construct larger models by supplying them with tools for decomposing models and using the resulting components to construct larger models. Over the last 20 years, molecular biologists have amassed a great deal of information about the genes and proteins that carry out fundamental biological processes within living cells --- processes such as growth and reproduction, movement, signal reception and response, and programmed cell death. The full complexity of these macromolecular regulatory networks is too great to tackle mathematically at the present time. Nonetheless, modelers have had success building dynamical models of restricted parts of the network. Systems biologists need tools now to support composing "submodels" into more comprehensive models of integrated regulatory networks. We have identified and developed four novel processes (fusion, composition, flattening, and aggregation) whose purpose is to support the construction of larger models. Model Fusion combines two or more models in an irreversible manner. In fusion, the identities of the original (sub)models are lost. Beyond some size, fused models will become too complex to grasp and manage as single entities. In this case, it may be more useful to represent large models as compositions of distinct components. In Model Composition one thinks of models not as monolithic entities but rather as collections of smaller components (submodels) joined together. A composed model is built from two or more submodels by describing their redundancies and interactions. 
While it is appealing in the short term to build larger models from pre-existing models, each developed independently for their own purposes, we believe that ultimately it will become necessary to build large models from components that have been designed for the purpose of combining them. We define Model Aggregation as a restricted form of composition that represents a collection of model elements as a single entity (a "module"). A module contains a definition of pre-determined input and output ports. The process of aggregation (connecting modules via their interface ports) allows modelers to create larger models in a controlled manner. Model Flattening converts a composed or aggregated model with some hierarchy or connections to one without such connections. The relationships used to describe the interactions among the submodels are lost, as the composed or aggregated model is converted into a single large (flat) model. Flattening allows us to use existing simulation tools, which have no support for composition or aggregation. / Ph. D.
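Of the four operations, flattening is the most mechanical and can be sketched directly (the dictionary layout and naming scheme here are invented for illustration; the dissertation's tools operate on richer model representations): a composed or aggregated model is a tree of submodels, and flattening collects every reaction into one flat, namespaced list that an ordinary simulator can run.

```python
# Hypothetical sketch of Model Flattening: collapse a hierarchy of
# submodels into a single flat reaction list with namespaced names.
def flatten(model, prefix=""):
    """Collect all reactions of a nested model into a single flat list."""
    name = prefix + model["name"]
    flat = [name + "." + r for r in model.get("reactions", [])]
    for sub in model.get("submodels", []):
        flat.extend(flatten(sub, name + "."))
    return flat

cell = {"name": "cell", "reactions": ["growth"],
        "submodels": [{"name": "cycle", "reactions": ["cdk_activation"]}]}
flat = flatten(cell)
```

As the abstract notes, the cost of this convenience is that the relationships among submodels are lost: the flat model no longer records which module each reaction came from or how the modules were wired.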
6

Spatial Growth Regressions: Model Specification, Estimation and Interpretation

LeSage, James P., Fischer, Manfred M. 04 1900 (has links) (PDF)
This paper uses Bayesian model comparison methods to simultaneously specify both the spatial weight structure and explanatory variables for a spatial growth regression involving 255 NUTS 2 regions across 25 European countries. In addition, a correct interpretation of the spatial regression parameter estimates that takes into account the simultaneous feedback nature of the spatial autoregressive model is provided. Our findings indicate that incorporating model uncertainty in conjunction with appropriate parameter interpretation decreased the importance of explanatory variables traditionally thought to exert an important influence on regional income growth rates. (authors' abstract)
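The "simultaneous feedback" point can be made concrete. In the spatial autoregressive model y = ρWy + Xβ + ε, the marginal effect of a covariate is not β but (I − ρW)⁻¹β, so a change in one region spills over to its neighbours; the usual summaries are the average diagonal (direct effect), average row sum (total effect), and their difference (indirect effect). A small sketch with made-up numbers (ρ, W, and β here are illustrative, not the paper's estimates):

```python
import numpy as np

rho, beta_k = 0.5, 2.0
W = np.array([[0.0, 1.0, 0.0],    # row-standardised spatial weights
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])

# Effects matrix S = (I - rho*W)^(-1) * beta_k
S = np.linalg.inv(np.eye(3) - rho * W) * beta_k

direct = np.mean(np.diag(S))      # average own-region effect
total = np.mean(S.sum(axis=1))    # average total effect
indirect = total - direct         # average spillover to other regions
```

Note that the total effect (here 4.0, since row-standardisation makes the row sums equal β/(1 − ρ)) is twice the naive coefficient β = 2, which is exactly why the paper insists on interpreting the estimates through the feedback matrix rather than reading β directly.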
7

Modelização e pensée sauvage na prática composicional / Modeling and pensée sauvage in compositional practice

Kimizuka, Yuri Sizuo 13 June 2019 (has links)
This dissertation is divided into three parts, each addressing a distinct aspect developed over the course of the research, and is completed by Annex I, a set of scores. The first part deals with modeling, its principles, and its applications in computer-assisted composition (CAC) — here, OpenMusic — in which the patches were developed and the concept of model composition implemented. Ring modulation was chosen as the starting point for generating patterns and is approached in several senses, among them harmonic, timbral, and algorithmic. The concept of Diamorphosis, which concerns the transition between materials, closes this part devoted to computation. Téchné refers to compositional craftsmanship, specifically that inherited from the composers who preceded this study. The subtitle "L'artisan merveilleux" alludes to "L'artisan furieux" (Le Marteau sans Maître). More than a pun, the "marvellous craftsman" proposes the composer as the artisan of creation, the artist who knows how to sculpt time to reveal the music. The aim here is to observe how composers solved particular problems or made use of resources such as ring modulation, among other compositional processes, highlighting above all the contributions of Olivier Messiaen, Claude Vivier, and Tristan Murail.
The second part proposes the idea of the pensée sauvage as a structuring element in composition. This concept works as a pendant to the first part: a way of thinking about creation based on what the anthropologist Claude Lévi-Strauss identified in various cultures, called pensée sauvage here in reference to the title of Lévi-Strauss's book. Through this concept it becomes possible to look sideways — to other cultures — and beyond, to a very remote, prehistoric past. This connection arose initially from Olivier Messiaen's proposition that birds are remnants of prehistoric animals, and more recently from research pointing to the relationship between bioacoustics and art. In this sense, mythical thought in composition is also a way of creating a system of relations that allows birdsong to be manipulated from observation through modeling to rewriting. The third part presents the compositional memorials of four pieces written during the research; these memorials analyze the creative processes and their theoretical and practical implications in relation to what was developed in the two previous parts. Because this work brings into musical thought elements from mathematics, physics, and the study of myth, I opted for a didactic presentation of them, even though such elements are constantly present in writings on music — for example in the texts of spectral composers such as Dufourt, Murail, and Grisey, in the writing on myth and music by François-Bernard Mâche, and in the references to birds in Messiaen. I believe that this resumption of general concepts and notions should proceed alongside the new contributions, within the scope of a dissertation that problematizes theoretical confluences from the point of view of composition.
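Ring modulation, the abstract's starting point for pattern generation, has a simple mathematical core: multiplying a carrier by a modulator yields the sum and difference frequencies, via the identity sin(a)·sin(b) = (cos(a − b) − cos(a + b))/2. A minimal sketch (the frequencies are arbitrary examples, not taken from the scores):

```python
import math

def ring_modulate(f_carrier, f_modulator, t):
    """Sample sin(2*pi*f_c*t) * sin(2*pi*f_m*t) at time t (seconds)."""
    return (math.sin(2 * math.pi * f_carrier * t)
            * math.sin(2 * math.pi * f_modulator * t))

def sidebands(f_carrier, f_modulator):
    """The two partials the product contains: |f_c - f_m| and f_c + f_m."""
    return (abs(f_carrier - f_modulator), f_carrier + f_modulator)

# e.g. ring-modulating 440 Hz with 110 Hz produces partials at 330 and 550 Hz
partials = sidebands(440, 110)
```

It is this sum-and-difference structure that makes ring modulation usable harmonically and timbrally at once, which is presumably why it lends itself to the algorithmic treatment in OpenMusic described above.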
8

Knowledge composition methodology for effective analysis problem formulation in simulation-based design

Bajaj, Manas 17 November 2008 (has links)
In simulation-based design, a key challenge is to formulate and solve analysis problems efficiently in order to evaluate a large variety of design alternatives. The solution of analysis problems has benefited from advances in commercial off-the-shelf math solvers and computational capabilities; their formulation, however, is often a costly and laborious process. Traditional simulation templates used for representing analysis problems are typically brittle with respect to variations in artifact topology and the idealization decisions taken by analysts. These templates often require manual updates and "re-wiring" of the analysis knowledge embodied in them, which makes them ineffective for multi-disciplinary design and optimization problems. Based on these issues, this dissertation defines a special class of problems, known as variable topology multi-body (VTMB) problems, that characterizes the types of variation seen in design-analysis interoperability. This research thus primarily answers the following question: how can we improve the effectiveness of the analysis problem formulation process for VTMB problems? The knowledge composition methodology (KCM) presented in this dissertation answers this question by addressing the following research gaps: (1) the lack of formalization of the knowledge used by analysts in formulating simulation templates, and (2) the inability to leverage this knowledge to define model composition methods for formulating simulation templates. KCM overcomes these gaps by providing: (1) formal representation of analysis knowledge as modular, reusable, analyst-intelligible building blocks; (2) graph-transformation-based methods to automatically compose simulation templates from these building blocks based on analyst idealization decisions; and (3) meta-models for representing advanced simulation templates, VTMB design models, analysis models, and the idealization relationships between them.
Applications of the KCM to the thermo-mechanical analysis of multi-stratum printed wiring boards and multi-component chip packages demonstrate its effectiveness in handling VTMB and idealization variations, with significantly enhanced formulation efficiency (from several hours with existing methods to a few minutes). In addition to enhancing the effectiveness of analysis problem formulation, KCM is envisioned to provide a foundational approach to model formulation for generalized variable topology problems.
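A loose sketch of the composition idea (block names and the function-chaining structure are invented for illustration; KCM's actual representation is graph-based and far richer): reusable building blocks, each encoding one piece of analysis knowledge, are selected according to the analyst's idealization decisions and chained into a single template.

```python
# Hypothetical building blocks of analysis knowledge, each a transformation
# applied to a growing analysis-model description.
BLOCKS = {
    "extrude_idealization": lambda m: {**m, "geometry": "2D"},
    "thermal_load": lambda m: {**m, "loads": m.get("loads", []) + ["thermal"]},
    "mesh": lambda m: {**m, "meshed": True},
}

def compose_template(block_names):
    """Chain selected building blocks into one analysis-template function."""
    def template(design_model):
        for name in block_names:
            design_model = BLOCKS[name](design_model)
        return design_model
    return template

# Idealization decisions pick the blocks; the composed template then applies
# to any compatible design model without manual re-wiring.
analysis = compose_template(["extrude_idealization", "thermal_load", "mesh"])
result = analysis({"part": "pwb_stratum"})
```

The point of the sketch is the separation: when the artifact topology changes, only the block selection changes, not the blocks themselves, which is the brittleness that traditional monolithic templates lack.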
9

Reúso de frameworks transversais com apoio de modelos

Gottardi, Thiago 04 July 2012 (has links)
Aspect-oriented programming was created to modularize the so-called crosscutting concerns, which have properties that cannot be fully modularized under the object-oriented paradigm. Aspect-oriented frameworks were subsequently created to make the reuse of different concern codes easier. Among these, crosscutting frameworks are aspect-oriented frameworks created specifically to modularize crosscutting-concern code, for instance persistence, distribution, concurrency, and business rules. Currently, these frameworks are usually distributed as source code and must be reused by extending classes, aspects, and methods. Reusing them at code level forces application developers to deal with implementation details, which affects the understandability, productivity, and quality of the final software. The objective of this thesis is to raise the abstraction level by applying a new model-driven approach to crosscutting-framework reuse, one that also allows reuse to begin in the development phases that precede implementation. Experiments were conducted to compare the productivity of the proposed process with the conventional reuse technique, which is based on editing source code. The proposed process was found to reduce the time needed to reuse the frameworks; however, no advantages were detected when maintaining an application coupled to a crosscutting framework.
10

End-User Development of Web-based Decision Support Systems

Tschudnowsky, Alexey 29 June 2017 (has links)
Recent innovations in information technology and computing devices have magnified the volume of available information. Today's decision makers face the challenge of analyzing ever more data in shorter timeframes, and demand for technology that can efficiently assist systematic data analysis is constantly growing. The development of dedicated information systems is, however, difficult from both an organizational and a technological point of view. First, traditional software production is a complex and time-consuming process that cannot be performed under time pressure. Second, changing business conditions and evolving stakeholder needs require solutions that can be efficiently tailored over time. Finally, the costs of custom software development are high, so not all use cases and scenarios can be covered sufficiently. This thesis proposes a holistic approach to address these challenges and to enable efficient development of decision support software. The main idea is to empower end users, i.e., decision makers, to construct their own case-specific solutions. The proposed approach, called Web-Composition for End-User Development, consists of a systematic process for the development and evolution of decision support systems, assistance mechanisms that compensate for decision makers' lack of programming skills, and evolution facilities that enable cost- and time-efficient extensibility of user-produced solutions. The thesis describes the implementation of the devised principles and ideas in the context of several open-source projects and application scenarios. The applicability and usability of the concepts are demonstrated in user studies with the respective target groups. Based on the outcome analysis, the thesis concludes that end users can and should actively participate in the construction of decision support software.
