321
A metamodel of operational control for discrete event logistics systems
Sprock, Timothy A. 27 May 2016 (has links)
Discrete Event Logistics Systems (DELS) are a class of dynamic systems defined by the transformation of discrete flows through a network of interconnected subsystems. The DELS domain includes systems such as supply chains, manufacturing systems, transportation networks, warehouses, and health care delivery systems. Advancements in computer-integrated manufacturing and intelligent devices have spurred a revolution in manufacturing. These smart manufacturing systems use technical interoperability and plant-wide integration at the device level to drive production agility and efficiency. Extending these successes to enterprise-wide integration and decision-making will require the definitions of control and device to be extended and supported at the operations management and business planning levels as well. In the future, smart operational control mechanisms must not only integrate real-time data from system operations, but also formulate and solve a wide variety of optimization analyses quickly and efficiently, and then translate the results into executable commands. In contemporary DELS practice, however, these optimization analyses, and analyses in general, are often purpose-built to answer specific questions, with an implicit system model and many possible analysis implementations depending on the question, the instance data, and the solver. Moreover, because of the semantic gap between operations research analysis models, such as job-shop scheduling algorithms, and IT-based models, such as manufacturing execution systems (MES), there is little integration between control analysis methods and control execution tools. Automated and cost-effective access to multiple analyses from a single conceptual model of the target system would broaden the usage and implementation of analysis-based decision support and system optimization.
The fundamental contribution of this dissertation concerns interoperability and bridging the gap between operations research analysis models and practical applications of their results. The dissertation closes this gap by constructing a standard domain-specific language, standard problem definitions, and a standard analysis methodology to answer the control questions and execute the prescribed control actions. The domain-specific language meets a broader requirement for facilitating interoperability for DELS, including system integration, plug-and-play analysis methods and tools, and system design methodologies. It formalizes a recurring product, process, resource, and facility description of the DELS domain and provides a common language for discussing these systems: the questions we want to ask about them, the problems we need to solve to answer those questions, and the mechanisms to deploy the solutions.
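As a hedged illustration of that recurring product, process, resource, and facility description, here is a minimal sketch in Python; the class names and fields are assumptions for exposition, not the dissertation's actual metamodel.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative rendering of the product/process/resource/facility vocabulary;
# names and attributes are assumed, not taken from the dissertation.
@dataclass
class Product:
    name: str

@dataclass
class Process:
    name: str
    inputs: List[Product] = field(default_factory=list)
    outputs: List[Product] = field(default_factory=list)

@dataclass
class Resource:
    name: str
    capabilities: List[Process] = field(default_factory=list)  # processes it can execute

@dataclass
class Facility:
    name: str
    resources: List[Resource] = field(default_factory=list)

# A warehouse stated in the shared vocabulary: a pallet (product) is moved
# (process) by a forklift (resource) inside a distribution center (facility).
pallet = Product("pallet")
move = Process("move", inputs=[pallet], outputs=[pallet])
forklift = Resource("forklift", capabilities=[move])
dc = Facility("distribution center", resources=[forklift])
```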
A canonical set of control questions defines a comprehensive functional specification of all the decision-making mechanisms that a controller needs to provide, i.e., a model of analysis models, or a metamodel of operational control. These questions refine the interoperability mechanism between system and analysis models by mapping classes of control analysis models to implementation and execution mechanisms in the system model. A standard representation of each class of control problems, however, is only a partial solution to fully addressing operational control. The final contribution of this dissertation is a round-trip analysis methodology that completes the bridge between operations research analysis models and deployable control mechanisms. This contribution formalizes an analysis pathway, from formulating an analysis model to executing a control action, grounded in a more fundamental insight into how analysis methods are executed to support operational control decision-making.
322
Étude et décontamination du transcriptome de novo du nématode doré Globodera rostochiensis / Study and decontamination of the de novo transcriptome of the golden nematode Globodera rostochiensis
Lafond Lapalme, Joël January 2016 (has links)
The golden nematode, Globodera rostochiensis, is a plant-parasitic nematode that can infect agricultural crops such as potato, tomato, and eggplant. Because of the considerable yield losses associated with this organism, it is subject to quarantine in several countries, including Canada. The cysts of the golden nematode protect the eggs they contain, allowing them to survive (in a dormant state) for up to 20 years in the soil. The eggs hatch only in the presence of root exudates from a compatible host plant nearby. Unfortunately, very little is known about the molecular mechanisms involved in this key step of the golden nematode's life cycle.
In this work, we used RNA-seq to sequence all the mRNAs in a sample of golden nematode cysts in order to assemble a de novo (reference-free) transcriptome and to identify genes involved in survival and hatching mechanisms. This approach allowed us to observe that the hatching and parasitism processes are closely related: several effectors involved in movement toward the host plant and in root penetration are induced as soon as the cyst is hydrated, even before hatching is triggered.
Using the golden nematode's reference genome, we found that the majority of the transcripts in the transcriptome did not come from the golden nematode. Indeed, cysts sampled in the field can carry contaminants (bacteria, fungi, etc.) on their walls and even inside the cyst. These contaminants are sequenced and assembled along with the de novo transcriptome, inflating its size and introducing errors into post-assembly analyses. Current decontamination methods use alignments against databases of known organisms to identify sequences derived from contaminants. These methods are effective when the contaminants are known (i.e., have a reference genome), as with human contamination, but when the contaminants are unknown they become insufficient to produce a high-quality decontaminated transcriptome.
We therefore designed a method that uses a hierarchical sequence-clustering algorithm. The method recursively produces subgroups of homogeneous sequences based on frequent patterns present in the sequences. Once the groups are formed, each is labeled as contaminant or not according to the alignment results of its members. Ambiguous sequences, those with no alignment or with several conflicting alignments, are then easily classified according to the label of their group. Our method effectively decontaminated the golden nematode transcriptome as well as other contamination cases. It works for decontaminating an assembled transcriptome, but we also showed that it has the potential to decontaminate short raw reads directly, which would be the optimal approach since it would minimize assembly errors.
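To make the recursive grouping-and-labeling idea concrete, here is a minimal Python sketch; it is a hypothetical illustration rather than the thesis implementation, and the 4-mer signature, the thresholds, and the use of scikit-learn's KMeans are all assumptions.

```python
# Hypothetical sketch: sequences are grouped by 4-mer composition, each
# subgroup is labeled by the majority of its alignment hits, and unaligned
# or ambiguous sequences inherit the label of their group.
from collections import Counter
from itertools import product

import numpy as np
from sklearn.cluster import KMeans

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]

def kmer_vector(seq):
    """Normalized 4-mer composition: the 'frequent pattern' signature."""
    counts = Counter(seq[i:i + 4] for i in range(len(seq) - 3))
    total = max(sum(counts.values()), 1)
    return np.array([counts.get(k, 0) / total for k in KMERS])

def classify(seq_ids, features, hits, min_size=20, purity=0.9):
    """Recursively split a group until its alignment evidence is homogeneous.

    hits maps a sequence id to 'target', 'contaminant', or None (no alignment
    or conflicting alignments). Returns a label for every sequence.
    """
    evidence = [hits[s] for s in seq_ids if hits[s] is not None]
    if evidence:
        majority = max(set(evidence), key=evidence.count)
        frac = evidence.count(majority) / len(evidence)
    else:
        majority, frac = "contaminant", 1.0   # no evidence at all: treat as suspect
    # Stop when the group is small or its aligned members agree strongly.
    if len(seq_ids) <= min_size or frac >= purity:
        return {s: majority for s in seq_ids}
    # Otherwise split in two by k-mer composition and recurse on each half.
    assign = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    if len(set(assign)) < 2:                  # degenerate split: stop here
        return {s: majority for s in seq_ids}
    result = {}
    for c in (0, 1):
        idx = [i for i, a in enumerate(assign) if a == c]
        result.update(classify([seq_ids[i] for i in idx], features[idx],
                               hits, min_size, purity))
    return result
```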
323
Perceptions of Model-Based Systems Engineering As the Foundation for Cost Estimation and Its Implications to Earned Value Management
Balram, Sara January 2012 (has links)
Model-based systems engineering (MBSE) is an emerging systems engineering methodology that, in replacing traditional, document-centric systems engineering methods, has the potential to reduce project costs, time, effort, and risk. The potential benefits of applying MBSE on a project are widely discussed but largely anecdotal. Throughout the systems engineering and project management industries, there is a strong desire to quantify these benefits, particularly within organizations looking to apply MBSE to their complex, system-of-systems projects. The objective of this thesis was to quantify the benefits that model-based systems engineering presents, particularly in terms of project cost estimates. To quantify this qualitative data, statistical analysis was conducted on perceptions collected from industry experts and professionals. The results of this work identify future research that should be completed to make MBSE an industry-wide standard for the development and estimation of projects.
324
用範文演示法教授高中英文作文之成效 / A Case Study of Model-based Writing Instruction in Senior High School English Class
林淑惠, Lin, Shu-huei Unknown Date (has links)
In the history of English writing pedagogy, there has long been a dispute over the use of models. Some researchers propose that learners can acquire English organizational structures through the analysis and imitation of the organization and logical arrangement of model paragraphs or essays. Process-oriented researchers, on the other hand, argue that writing is a complex, non-linear process of planning, drafting, revising, and rewriting, and that the use of models prevents students from thinking on their own. An integrated approach was thus proposed to incorporate the strengths of the two approaches: models were used to familiarize students with the rhetorical organization of English written discourse, while peer editing, revision, and teacher feedback were used to facilitate the writing process.
The present study examines the effects of model-based instruction in organization on the writing of senior high school students. Thirty-nine third-year students at a senior high school in Taipei County participated in the study. A pretest writing task was used before the instruction to group the subjects into proficiency levels according to the criteria set by the College Entrance Examination Center (CEEC); two groups emerged among the subjects, a middle proficiency group (MPG) and a low proficiency group (LPG). A writing-habit questionnaire investigated the subjects' conceptions of and habits in writing before the instruction. A ten-week model-based writing course was then conducted with all participants, covering two genres: description and cause-effect writing. After the instruction, a posttest writing task was compared with the pretest to examine whether students had made progress, and a response questionnaire and three in-depth interviews further explored students' perceptions of and attitudes toward the model-based writing instruction.
The results of the study reveal that the instruction helped students organize ideas more effectively and logically. The significant increase in posttest scores on overall writing quality also shows that the model-based writing instruction yielded higher quality in students' overall writing. In addition, subjects in the LPG appeared to make more progress than those in the MPG, and the subjects reported more confidence in writing after the treatment.
325
Applied Adaptive Optimal Design and Novel Optimization Algorithms for Practical Use
Strömberg, Eric January 2016 (has links)
The costs of developing new pharmaceuticals have increased dramatically during the past decades. Contributing to these increased expenses are the increasingly extensive and more complex clinical trials required to generate sufficient evidence regarding the safety and efficacy of the drugs. It is therefore of great importance to improve the effectiveness of the clinical phases by increasing the information gained throughout the process, so that the correct decision may be made as early as possible. Optimal Design (OD) methodology using the Fisher Information Matrix (FIM) based on Nonlinear Mixed Effect Models (NLMEM) has proven to be a useful tool for making more informed decisions throughout the clinical investigation. The calculation of the FIM for NLMEM, however, lacks an analytic solution and is commonly approximated by linearization of the NLMEM. Furthermore, two structural assumptions of the FIM are available: a full FIM, and a block-diagonal FIM which assumes that the fixed effects are independent of the random effects in the NLMEM. Once the FIM has been derived, it can be transformed into a scalar optimality criterion for comparing designs. The optimality criterion may be considered local, if it is based on single point values of the parameters, or global (robust), if it is formed over a prior distribution of the parameters. Regardless of design criterion, FIM approximation, or structural assumption, the design will be based on prior information regarding the model and parameters, and is thus sensitive to misspecification in the design stage. Model-based adaptive optimal design (MBAOD), however, has been shown to be less sensitive to misspecification in the design stage. The aim of this thesis is to further the understanding and practicality of standard OD and MBAOD. This is achieved by: (i) investigating how two common FIM approximations and the structural assumptions may affect the optimized design; (ii) reducing the runtimes of complex design optimizations by implementing a low-level parallelization of the FIM calculation; (iii) further developing and demonstrating a framework for performing MBAOD; and (iv) investigating the potential advantages of using a global optimality criterion in the already robust MBAOD.
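For readers unfamiliar with the criteria mentioned above, the following block states them in standard optimal-design notation; it is an illustrative sketch, not quoted from the thesis.

```latex
% Local D-optimality and the robust ELD criterion over a prior p(theta):
\xi^{*}_{D}   = \arg\max_{\xi}\ \det \mathrm{FIM}(\xi,\theta), \qquad
\xi^{*}_{ELD} = \arg\max_{\xi}\ \mathbb{E}_{p(\theta)}\!\left[\,\log\det \mathrm{FIM}(\xi,\theta)\,\right]
% Full FIM versus the block-diagonal assumption, with fixed-effect block A,
% random-effect (variance) block B, and cross terms C set to zero:
\mathrm{FIM}_{\mathrm{full}} =
\begin{pmatrix} A & C \\ C^{\mathsf{T}} & B \end{pmatrix}
\quad\longrightarrow\quad
\mathrm{FIM}_{\mathrm{block}} =
\begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix}
```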
326
Dynamic Temperature Mapping - Real-time Strategies and Model-based Reconstructions
Zhang, Zhongshuai 14 December 2016 (has links)
No description available.
327
Development of Advanced Acquisition and Reconstruction Techniques for Real-Time Perfusion MRI
Roeloffs, Volkert Brar 16 June 2016 (has links)
This doctoral thesis addresses the methodological development of acquisition and reconstruction techniques for applying real-time imaging to dynamic contrast-enhanced magnetic resonance imaging. To suppress unwanted image artifacts, a new spoiling technique is proposed that is based on randomized phases of the radiofrequency excitation. This technique permits fast, artifact-free acquisition of T1-weighted raw data with radial sampling. The reconstruction of quantitative parameter maps from such raw data can be cast as a nonlinear inverse problem. In this work, a model-based reconstruction technique for quantitative T1 mapping is developed that solves this inverse problem by means of the iteratively regularized Gauss-Newton method with parameter-specific regularization. The accuracy and precision of the new method are assessed in simulations as well as in vitro and in vivo studies, and the method is applied directly in in vitro first-pass perfusion experiments. These experiments use a commercially available phantom that mimics in vivo perfusion while allowing full control over the prevailing exchange rates.
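In standard notation, the iteratively regularized Gauss-Newton update underlying such a model-based reconstruction reads as follows; this is an illustrative sketch rather than an excerpt from the thesis, and W denotes an assumed parameter-specific regularization weighting.

```latex
% IRGNM step: F maps the parameter maps x (e.g., spin density and T1) to the
% acquired radial k-space data y; alpha_n decays geometrically toward zero.
x_{n+1} = x_n + \left( \mathrm{D}F(x_n)^{H}\,\mathrm{D}F(x_n) + \alpha_n W \right)^{-1}
\left( \mathrm{D}F(x_n)^{H}\bigl(y - F(x_n)\bigr) + \alpha_n W\,(x_0 - x_n) \right),
\qquad \alpha_n = \alpha_0\, q^{\,n},\ 0 < q < 1
```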
328
Modeling Mortality Rates In The WikiLeaks Afghanistan War Logs
Rusch, Thomas, Hofmarcher, Paul, Hatzinger, Reinhold, Hornik, Kurt 09 1900 (has links) (PDF)
The WikiLeaks Afghanistan war logs contain more than 76,000 reports about fatalities and their circumstances in the US-led Afghanistan war, covering the period from January 2004 to December 2009. In this paper we use those reports to build statistical models to help us understand the mortality rates associated with specific circumstances. We choose an approach that combines Latent Dirichlet Allocation (LDA) with negative-binomial-based recursive partitioning. LDA is used to process the natural-language information contained in each report summary: we estimate latent topics and assign each report to one of them. These topics, in addition to other variables in the data set, subsequently serve as explanatory variables for modeling the number of fatalities of the civilian population, ISAF Forces, Anti-Coalition Forces, and the Afghan National Police or military, as well as the combined number of fatalities. Modeling is carried out with manifest mixtures of negative binomial distributions estimated with model-based recursive partitioning. For each group of fatalities, we identify segments with different mortality rates that correspond to a small number of topics and other explanatory variables as well as their interactions. Furthermore, we carve out the similarities between segments and connect them to stories that have been covered in the media. This provides an unprecedented description of the war in Afghanistan as covered by the war logs. Additionally, our approach can serve as an example of how modern statistical methods may lead to extra insight when applied to problems of data journalism. (author's abstract) / Series: Research Report Series / Department of Statistics and Mathematics
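A compressed Python sketch of the two-stage pipeline follows. It is hypothetical code for illustration only: the function names are invented, the partitioner searches a single split exhaustively, and the paper's fluctuation-test machinery is replaced by a likelihood comparison.

```python
# Stage 1: LDA topic assignment for report summaries. Stage 2: one step of
# recursive partitioning with intercept-only negative binomial models, so
# each segment is summarized by its own mortality rate.
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

def dominant_topics(summaries, n_topics=10, seed=0):
    """Estimate latent topics and assign each report to its most probable one."""
    X = CountVectorizer(stop_words="english", min_df=2).fit_transform(summaries)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=seed)
    return lda.fit_transform(X).argmax(axis=1)

def nb_loglik(y):
    """Log-likelihood of an intercept-only negative binomial model for counts y."""
    return sm.NegativeBinomial(y, np.ones((len(y), 1))).fit(disp=0).llf

def best_split(y, Z, min_size=30):
    """Find the binary split of any candidate column of Z that most improves
    the summed negative binomial log-likelihood over the unsplit model."""
    base, best = nb_loglik(y), (None, None, 0.0)
    for j in range(Z.shape[1]):
        for cut in np.unique(Z[:, j])[:-1]:
            left = Z[:, j] <= cut
            if left.sum() < min_size or (~left).sum() < min_size:
                continue  # enforce a minimum segment size
            gain = nb_loglik(y[left]) + nb_loglik(y[~left]) - base
            if gain > best[2]:
                best = (j, cut, gain)
    return best  # (column index, cut point, log-likelihood gain)
```

Recursing `best_split` on each resulting segment until no worthwhile split remains would yield the kind of segment tree the abstract describes.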
329
Gaining Insight with Recursive Partitioning of Generalized Linear Models
Rusch, Thomas, Zeileis, Achim January 2013 (has links) (PDF)
Recursive partitioning algorithms separate a feature space into a set of disjoint rectangles in which, usually, a constant is fitted. While this is a simple and intuitive approach, it may lack interpretability as to how a specific relationship between dependent and independent variables looks. Or a certain model may be assumed or of interest, with a number of candidate variables that may non-linearly give rise to different model parameter values. We present an approach that combines generalized linear models with recursive partitioning, offering enhanced interpretability over classical trees as well as an explorative way to assess a candidate variable's influence on a parametric model. The method conducts recursive partitioning of a generalized linear model by (1) fitting the model to the data set, (2) testing for parameter instability over a set of partitioning variables, and (3) splitting the data set with respect to the variable associated with the highest instability. The outcome is a tree in which each terminal node is associated with a generalized linear model. We show the method's versatility and suitability for gaining additional insight into the relationship of dependent and independent variables with two examples, modelling voting behaviour and a failure model for debt amortization, and compare it to alternative approaches.
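The three-step loop can be sketched in Python as follows. This is an assumed, simplified illustration: the parameter-instability test is replaced here by a likelihood-ratio comparison at the median of each partitioning variable, whereas the authors' method (implemented, for example, in R's partykit) uses score-based fluctuation tests.

```python
# Simplified sketch of GLM-based recursive partitioning:
# (1) fit a GLM, (2) measure per-variable instability, (3) split and recurse.
import numpy as np
import statsmodels.api as sm

def fit_glm(y, X):
    return sm.GLM(y, sm.add_constant(X), family=sm.families.Binomial()).fit()

def instability(y, X, z, min_size=20):
    """Likelihood gain from letting the GLM parameters differ across a median
    split of partitioning variable z (a crude stand-in for a fluctuation test)."""
    left = z <= np.median(z)
    if left.sum() < min_size or (~left).sum() < min_size:
        return -np.inf
    full = fit_glm(y, X).llf
    return fit_glm(y[left], X[left]).llf + fit_glm(y[~left], X[~left]).llf - full

def mob(y, X, Z, depth=0, max_depth=3, min_gain=5.0):
    """Return a nested dict tree; each terminal node holds a fitted GLM."""
    gains = [instability(y, X, Z[:, j]) for j in range(Z.shape[1])]
    j = int(np.argmax(gains))
    if depth >= max_depth or gains[j] < min_gain:
        return {"model": fit_glm(y, X)}            # terminal node: one GLM
    left = Z[:, j] <= np.median(Z[:, j])           # split on most unstable variable
    return {"split_var": j,
            "left":  mob(y[left],  X[left],  Z[left],  depth + 1, max_depth, min_gain),
            "right": mob(y[~left], X[~left], Z[~left], depth + 1, max_depth, min_gain)}
```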
330
Análise de cobertura de critérios de teste estruturais a partir de conjuntos derivados de especificações formais: um estudo comparativo no contexto de aplicações espaciais / Structural coverage analysis of test sets derived from formal specifications: a comparative study in the space applications context
Herculano, Paula Fernanda Ramos 24 April 2007 (has links)
Testing techniques can be divided, at the first level, into those based on the code (white box) and those based on the specification (black box, or functional). Neither is complete, since they aim to identify different kinds of defects, and using them together can raise the confidence level of applications. Studies that contribute to a better understanding of the relationship between functional and structural techniques, how they complement each other, and how they can be used together are therefore important. This work was developed in the context of the PLAVIS project (PLAtform of software Validation & Integration on Space systems), and its objective is a comparative study between functional test-case generation techniques (based on formal specifications) and structural criteria based on control flow and data flow, applied to the implementations. In a specific context, the study provides data on how these two techniques (functional and structural) relate to each other, supporting their combined use. In the broader context of the PLAVIS project, it aims to establish a testing strategy based on functional and structural criteria which, together with the tools that support them, can compose a testing environment available for use in space applications at INPE.
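As a toy illustration of the complementarity at issue (an invented example, not drawn from the dissertation): a functional test derived from a specification can pass while structural criteria reveal unexercised code.

```python
# Invented example: the specification says "orders of 100 or more units get a
# 10% discount". The black-box test below passes, yet it exercises only the
# False outcome of each branch; branch coverage (a structural criterion)
# immediately exposes the untested discount and error paths.
def total_price(quantity: int, unit_price: float) -> float:
    if quantity < 0:
        raise ValueError("quantity must be non-negative")
    subtotal = quantity * unit_price
    if quantity >= 100:
        subtotal *= 0.9
    return subtotal

def test_from_specification():
    assert total_price(2, 5.0) == 10.0

# Running the test under coverage.py (`coverage run -m pytest`, then
# `coverage report -m`) would list the missed branches, pointing to test
# cases a purely specification-based suite failed to derive.
```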