41

Analyse pire cas exact du réseau AFDX / Exact worst-case communication delay analysis of AFDX network

Adnan, Muhammad 21 November 2013 (has links)
The main objective of this thesis is to provide methodologies for finding the exact worst-case end-to-end communication delays of an AFDX network. Presently, only pessimistic upper bounds on these delays can be calculated, using the Network Calculus or Trajectory approaches. Different existing tools and approaches were analyzed in the context of this thesis, and this analysis highlighted the need for new approaches and algorithms. First, model checking was explored, using timed automata and verification tools proven in the real-time domain. Then, an exhaustive simulation technique was used to obtain the exact worst-case communication delays: sequence-reduction methods were defined and a supporting tool was developed. This work was applied to a real-life AFDX configuration, allowing us to validate it on an industrial-scale network such as the one on board the Airbus A380 aircraft.
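The thesis's exact method is not reproduced here; as a toy illustration of the exhaustive-simulation idea, the sketch below enumerates discretised release offsets of competing periodic flows on a single FIFO output port and records the worst queuing delay observed. The flow set, link rate and offset granularity are invented for illustration.

```python
import itertools

# Hypothetical flows sharing one 100 Mbit/s output port:
# (name, period_us, frame_bytes).  All values are illustrative.
FLOWS = [("v1", 4000, 500), ("v2", 8000, 1000), ("v3", 16000, 1500)]
LINK_BYTES_PER_US = 12.5          # 100 Mbit/s = 12.5 bytes per microsecond
HYPERPERIOD = 16000               # lcm of the periods above

def worst_delay(offsets):
    """FIFO port: worst (finish - arrival) over one hyperperiod."""
    arrivals = sorted((t, size)
                      for (name, period, size), off in zip(FLOWS, offsets)
                      for t in range(off, HYPERPERIOD, period))
    busy_until = worst = 0.0
    for t, size in arrivals:
        busy_until = max(busy_until, t) + size / LINK_BYTES_PER_US
        worst = max(worst, busy_until - t)
    return worst

# Exhaustive search over discretised release offsets (one grid per flow).
grids = [range(0, period, 500) for _, period, _ in FLOWS]
exact_wc = max(worst_delay(offs) for offs in itertools.product(*grids))
print(f"worst-case delay found: {exact_wc:.1f} us")
```

A real AFDX analysis must cover switched multi-hop paths and finer offset grids, which is what makes sequence-reduction techniques necessary; this sketch only conveys why naive enumeration explodes combinatorially.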
42

Comparação entre diferentes estratégias de melhoria visando à redução do lead time / Comparison of different improvement strategies aimed at lead time reduction

Utiyama, Marcel Heimar Ribeiro 18 February 2016 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / The focus on time is the cornerstone of modern manufacturing management approaches, among them Time-Based Competition (TBC) and Quick Response Manufacturing (QRM). Lead time reduction brings significant gains, which are obtained by means of improvements in shop floor variables. To improve these variables, managers need to choose the best way to invest their limited financial resources. This work looks for the best strategy regarding improvement of time to repair, time between failures and setup time: to perform improvements focused on the mean, on variability, or on eliminating the worst cases. The main focus is the worst-case strategy, that is, identifying the situations in which it produces an effect on lead time superior or similar to the mean and variability strategies. To perform this comparison, a modeling/simulation study was conducted, and the three variables were modeled using normal and lognormal probability distributions. Results show that for situations with moderate and high variability the worst-case strategy is the best improvement option; it can bring significant benefits and is probably less costly and easier to implement. For low-variability situations, the mean strategy is the best option. Even then, for two of the variables (time to repair and setup time) the worst-case strategy is a good alternative when it is not possible to reduce the mean. For time between failures, the only good option remains the mean strategy.
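The thesis's full shop-floor model is not reproduced here; as a minimal sketch of the comparison idea, the code below applies three hypothetical interventions to a lognormally distributed setup time (reducing the mean, reducing the variability, and truncating the worst cases) and compares the resulting means and tail quantiles. All percentages and distribution parameters are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical lognormal setup times (minutes); mu and sigma are invented.
mu, sigma, n = np.log(30.0), 0.8, 100_000
base = rng.lognormal(mu, sigma, n)

strategies = {
    "baseline":          base,
    "mean -20%":         rng.lognormal(mu + np.log(0.8), sigma, n),
    "variability -20%":  rng.lognormal(mu, 0.8 * sigma, n),
    # worst-case strategy: cap realisations at the baseline 95th percentile
    "worst cases cut":   np.minimum(base, np.quantile(base, 0.95)),
}

for name, x in strategies.items():
    print(f"{name:17s} mean={x.mean():6.1f}  p95={np.quantile(x, 0.95):6.1f}"
          f"  p99={np.quantile(x, 0.99):6.1f}")
```

With the heavier lognormal tail, cutting the worst cases shrinks the upper quantiles sharply while barely moving the mean, which is the intuition behind the thesis's finding for moderate and high variability.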
43

Estudo do pior caso na validação de limpeza de equipamentos de produção de radiofármacos de reagentes liofilizados. Validação de metodologia de carbono orgânico total / Worst-case study for cleaning validation of equipment in the radiopharmaceutical production of lyophilized reagents. Methodology validation of total organic carbon.

Luciana Valeria Ferrari Machado Porto 18 December 2015 (has links)
Radiopharmaceuticals are defined as pharmaceutical preparations containing a radionuclide in their composition; most are administered intravenously, and therefore compliance with the principles of Good Manufacturing Practices (GMP) is essential and indispensable. Cleaning validation is a requirement of current GMP and consists of documented evidence demonstrating that cleaning procedures remove residues to pre-determined acceptance levels, ensuring that no cross-contamination occurs. A simplification of cleaning process validation is accepted: it consists of choosing one product, called the "worst case", to represent the cleaning of all equipment in the same production area. One of the steps of cleaning validation is the establishment and validation of the analytical method used to quantify the residue. The aim of this study was to establish the worst case for cleaning validation of equipment in the radiopharmaceutical production of lyophilized reagents (LR) for labeling with 99mTc, to evaluate the use of total organic carbon (TOC) content as an indicator of the cleanliness of the equipment used in LR manufacture, to validate the method for determination of non-purgeable organic carbon (NPOC), and to perform recovery tests with the product chosen as the worst case. The worst-case product choice was based on the calculation of an index called the "Worst Case Index" (WCI), using information about drug solubility, the difficulty of cleaning the equipment, and the occupancy rate of the products in the production line. The product indicated as the worst case among the LR was MIBI-TEC. The method validation assays were performed using a TOC-Vwp carbon analyser coupled to an ASI-V autosampler, both from Shimadzu®, controlled by TOC Control-V software. The direct method for NPOC quantification was used. The parameters evaluated in the method validation were: system suitability, robustness, linearity, detection limit (DL) and quantification limit (QL), precision (repeatability and intermediate precision), and accuracy (recovery). They were defined as follows: 4% acidifying reagent, 2.5 mL oxidizing reagent, 4.5-minute curve integration time, 3.0-minute sparge time, and linearity in the 40-1000 μg L-1 range, with correlation coefficient (r) and residual sum of least squares (r2) greater than 0.99. DL and QL for NPOC were 14.25 ppb and 47.52 ppb, respectively; repeatability was between 0.11 and 4.47%; intermediate precision between 0.59 and 3.80%; and accuracy between 97.05 and 102.90%. The analytical curve for MIBI was linear in the 100-800 μg L-1 range, with r and r2 greater than 0.99, presenting parameters similar to those of the NPOC analytical curves. The results obtained in this study demonstrate that the worst-case approach to cleaning validation is a simple and effective way to reduce the complexity and slowness of the validation process, as well as the costs involved in these activities. All results obtained in the NPOC method validation assays met the requirements and specifications of ANVISA Resolution RE 899/2003, allowing the method to be considered validated.
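The abstract does not give the WCI formula, so the sketch below is purely hypothetical: it combines the three cited factors (solubility, cleaning difficulty, line occupancy) as a weighted score on [0, 1] and ranks the products. Weights, factor scores and product data are all invented for illustration.

```python
# Hypothetical Worst Case Index: the real formula is not given in the
# abstract; this sketch just combines the three factors it cites.
# Each factor is scored on [0, 1], where 1 = harder to clean / worse.
WEIGHTS = {"insolubility": 0.4, "cleaning_difficulty": 0.4, "occupancy": 0.2}

products = {  # invented example data
    "MIBI": {"insolubility": 0.9, "cleaning_difficulty": 0.8, "occupancy": 0.6},
    "DTPA": {"insolubility": 0.2, "cleaning_difficulty": 0.3, "occupancy": 0.5},
    "MDP":  {"insolubility": 0.4, "cleaning_difficulty": 0.5, "occupancy": 0.7},
}

def wci(scores):
    """Weighted sum of the three factors (hypothetical formula)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

for name, s in sorted(products.items(), key=lambda kv: -wci(kv[1])):
    print(f"{name:5s} WCI = {wci(s):.2f}")
# The product with the highest WCI would be selected as the worst case.
```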
44

Online algoritmy pro varianty bin packingu / Online algorithms for variants of bin packing

Veselý, Pavel January 2014 (has links)
An online algorithm must make decisions immediately and irrevocably, based only on a part of the input and without any knowledge of its future part. We introduce the competitive analysis of online algorithms, a standard worst-case analysis, and present the main results of this analysis for the problem of online Bin Packing and some of its variants. In Bin Packing, a sequence of items of size up to 1 arrives to be packed into the minimal number of unit-capacity bins. Mainly, we focus on Colored Bin Packing, in which items also have a color and we cannot pack two items of the same color adjacently in a bin. For Colored Bin Packing, we improve some previous results on the problem with two colors and present the first results for arbitrarily many colors. Most notably, in the important case when all items have size zero, we give an optimal 1.5-competitive algorithm. For items of arbitrary size we present a lower bound of 2.5 and a 3.5-competitive algorithm.
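As a minimal illustration of the color-adjacency constraint (not the thesis's optimal 1.5-competitive algorithm), the sketch below packs zero-size colored items online: each bin is a stack, and an item may only be placed on a bin whose top item has a different color, with a new bin opened when none qualifies.

```python
def pack_zero_size(colors):
    """Online packing of zero-size colored items: no two items of the
    same color may be adjacent in a bin.  Greedy toy heuristic: reuse
    the first bin whose top color differs; otherwise open a new bin.
    This is NOT the optimal algorithm from the thesis."""
    bins = []  # each bin is a list (stack) of colors
    for c in colors:
        for b in bins:
            if b[-1] != c:   # adjacency constraint: top color must differ
                b.append(c)
                break
        else:
            bins.append([c])
    return bins

demo = list("AABABBBAAB")
for i, b in enumerate(pack_zero_size(demo), 1):
    print(f"bin {i}: {''.join(b)}")
```

Competitive analysis compares the number of bins such an online strategy opens against the offline optimum over worst-case sequences; adversarial color runs are exactly what force extra bins here.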
45

Étude de l'application de la théorie des valeurs extrêmes pour l'estimation fiable et robuste du pire temps d'exécution probabiliste / Study of the extreme value theory applicability for reliable and robust probabilistic worst-case execution time estimates

Guet, Fabrice 13 December 2017 (has links)
Software tasks are time-constrained in real-time computing systems. To ensure the safety of the critical system controlled by the real-time system, it is of paramount importance to safely estimate the worst-case execution time of each task. The optimisation components of modern commercial processors reduce average task execution times but make the worst-case execution time hard to determine. Many approaches to worst-case execution time estimation exist, but they are usually segregated and hard to generalise, or come at the price of very complex models. Existing measurement-based probabilistic timing analysis approaches are regarded as fast and easy to apply, but suffer from a lack of systematism and of confidence in the estimates they provide. This thesis studies the conditions under which extreme value theory can be applied to a sequence of execution time measurements to estimate the probabilistic worst-case execution time, leading to the development of the diagxtrm tool. The capabilities and limits of the tool were studied using a large panel of measurement sequences from different real-time systems. Finally, methods are proposed for determining the measurement conditions that foster the application of the theory and give more confidence in the estimates.
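diagxtrm itself is not reproduced here; the sketch below shows the generic peaks-over-threshold recipe that measurement-based probabilistic WCET methods build on: fit a generalized Pareto distribution to the execution-time exceedances above a high threshold and extrapolate a quantile at a very low exceedance probability. The data, threshold choice and target probability are illustrative assumptions, and a real analysis must first check the applicability conditions (e.g. independence and stationarity of the measurements), which is precisely what the thesis studies.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
# Stand-in for measured execution times (us); real data would come
# from instrumented runs of the task under analysis.
samples = 50.0 + rng.gamma(shape=2.0, scale=3.0, size=20_000)

threshold = np.quantile(samples, 0.95)        # peaks-over-threshold
excesses = samples[samples > threshold] - threshold

# Fit a generalized Pareto distribution to the tail (location fixed at 0).
shape, loc, scale = genpareto.fit(excesses, floc=0)

# Extrapolate the pWCET at a target exceedance probability:
# P(X > x) = P(X > u) * (1 - GPD_cdf(x - u)), solved for x.
p_u = excesses.size / samples.size            # empirical P(X > threshold)
target = 1e-9
pwcet = threshold + genpareto.ppf(1 - target / p_u, shape, loc=0, scale=scale)
print(f"threshold={threshold:.1f} us  pWCET(1e-9)={pwcet:.1f} us")
```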
46

Análise de sistemas operacionais de tempo real para aplicações de robótica e automação / Analysis of real time operating systems for robotics and automation applications

Aroca, Rafael Vidal 31 October 2008 (has links)
This work presents a study of real-time operating systems (RTOS) used to implement the digital control infrastructure of mechatronic systems; it also serves as a reference for other systems with critical timing constraints. Parameters such as worst-case response time and interrupt-handling latency were measured and analyzed for each operating system. The project also covers the implementation and use of RTOS in practical situations, as well as the construction of a general research platform for robotics and real-time research that will serve as a basis for future work in the Mechatronics Laboratory. The systems analyzed were VxWorks, QNX, Linux, RTAI, Windows XP, Windows CE and µC/OS-II. Another product of this work is a Live CD to aid the implementation and teaching of real-time systems and concepts.
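As a minimal sketch of the kind of measurement such a study relies on (not the thesis's actual benchmark suite), the code below estimates timer/scheduling latency on a general-purpose OS by repeatedly requesting a fixed sleep and recording the overshoot; the worst observed overshoot is a crude proxy for worst-case response time.

```python
import time

def sleep_overshoot(period_s=0.001, iterations=5_000):
    """Request a fixed sleep and measure how late we actually wake up.
    The maximum overshoot is a rough, measurement-based proxy for the
    scheduling latency of the host OS (a real RTOS study would use
    instrumented interrupts and hardware timestamping instead)."""
    worst = total = 0.0
    for _ in range(iterations):
        t0 = time.perf_counter()
        time.sleep(period_s)
        overshoot = (time.perf_counter() - t0) - period_s
        worst = max(worst, overshoot)
        total += overshoot
    return total / iterations, worst

avg, worst = sleep_overshoot()
print(f"avg overshoot = {avg*1e6:.0f} us, worst observed = {worst*1e6:.0f} us")
```

On a desktop OS the worst case typically dwarfs the average, which is exactly the gap that separates general-purpose systems from the hard-real-time RTOS compared in the thesis.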
47

Analyse de Flux de Trames AFDX en Réception et Méthode d’Optimisation Mémoire / AFDX Frame Flow Analysis in Reception and Memory Optimization Method

Baga, Yohan 03 May 2018 (has links)
The rise of AFDX networks as the communication infrastructure between the on-board equipment of civil aircraft motivates much research into reducing communication delays while guaranteeing a high level of determinism and quality of service. This thesis deals with the effect of back-to-back frame reception on the receiving End System, in particular on its internal buffer, in order to guarantee that no frame is lost and that memory is optimally dimensioned. A worst-case model of the frame flow is first built with a pessimistic method based on periodic frame flows; a more optimistic method is then presented, based on reception intervals and iterative frame placement. A probabilistic study uses Gaussian distributions to evaluate the occurrence probabilities of the worst-case back-to-back frame sequences, shedding light that opens a discussion on the relevance of considering only the worst-case model when sizing the reception buffer. An additional memory gain can be achieved by implementing LZW lossless compression.
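As a minimal sketch of the pessimistic periodic model (assumed parameters, not the thesis's exact formulation), the code below releases one frame of every flow at the same instant — the classical worst-case alignment — and simulates the buffer backlog of a receiving End System that drains at a fixed rate, reporting the peak backlog as a buffer-size bound.

```python
import math

# Hypothetical virtual links feeding one End System:
# (period_us, frame_bytes).  Values are illustrative only.
VLS = [(2000, 1500), (4000, 1000), (8000, 500), (8000, 1500)]
DRAIN_BYTES_PER_US = 6.25      # e.g. the ES empties its buffer at 50 Mbit/s
HORIZON = math.lcm(*(p for p, _ in VLS))

def peak_backlog():
    """Buffer fill when all flows release frames at t=0 (worst-case
    alignment for this periodic model), drained at a constant rate."""
    events = sorted((t, size) for p, size in VLS
                    for t in range(0, HORIZON, p))
    backlog = peak = 0.0
    last_t = 0
    for t, size in events:
        backlog = max(0.0, backlog - (t - last_t) * DRAIN_BYTES_PER_US)
        backlog += size
        peak = max(peak, backlog)
        last_t = t
    return peak

print(f"buffer bound under synchronous release: {peak_backlog():.0f} bytes")
```

The thesis's probabilistic angle then asks how likely such perfect alignments actually are, which is what motivates sizing the buffer below this pessimistic bound.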
48

Aspect Analyzer: Ett verktyg för automatiserad exekveringstidsanalys av komponenter och aspekter / Aspect Analyzer: A Tool for Automated WCET Analysis of Aspects and Components

Uhlin, Pernilla January 2002 (has links)
The increasing complexity in the development of configurable real-time systems has given rise to new software engineering techniques, such as aspect-oriented and component-based software development. These techniques allow encapsulation of the system's crosscutting concerns and increase the modularity of the software. The properties of a component that influence the system's performance or semantics are specified separately in entities called aspects, while the basic functionality remains in the component.

When building a real-time system, different sets of aspects and components can be combined, resulting in different configurations of the system. The temporal behavior of the system changes accordingly, and a way to ensure its predictability is needed.

This thesis presents a tool for aspect-level worst-case execution time analysis, which gives a priori information about the temporal behavior of the system before aspects are composed with components.
49

Verifikation av verktyget aspect analyzer / Aspect analyzer tool verification

Bodin, Joakim January 2003 (has links)
Rising complexity in the development of real-time systems has made it crucial to have reusable components and a more flexible way of configuring these components into a coherent system. Aspect-oriented system development (AOSD) is a technique that allows one to put a system's crosscutting concerns into "modules" called aspects. Applying AOSD in real-time and embedded system development, one can expect reductions in the complexity of system design and development.

A problem with AOSD in its current form is that it does not support predictability in the time domain. Hence, in order to use AOSD in real-time system development, we need ways of analyzing the temporal behavior of aspects, components, and the resulting system (made by weaving aspects and components). Aspect analyzer is a tool that computes the worst-case execution time (WCET) for a set of components and aspects, thus enabling support for predictability in the time domain of aspect-oriented real-time software.

A limitation of the aspect analyzer, until now, was that no verification had been made of whether it produces WCET values close to the measured or computed (with another WCET analysis technique) WCET of an aspect-oriented real-time system. Therefore, in this thesis we verify the correctness of the aspect analyzer using a number of different methods for WCET analysis. These investigations gave confidence in the automated WCET analysis. In addition, performing this verification led to the identification of the steps necessary to compute the WCET of a piece of a program when using a third-party tool, which makes it possible to write accurate input files for the aspect analyzer.
50

Towards Aspectual Component-Based Real-Time System Development

Tešanović, Aleksandra January 2003 (has links)
Increasing complexity of real-time systems and demands for enabling their configurability and tailorability are strong motivations for applying new software engineering principles, such as aspect-oriented and component-based software development. The integration of these two techniques into real-time systems development would enable: (i) efficient system configuration from the components in the component library based on the system requirements, (ii) easy tailoring of components and/or a system for a specific application by changing the behavior (code) of the component through aspect weaving, and (iii) enhanced flexibility of real-time and embedded software through the notion of system configurability and component tailorability.

In this thesis we focus on applying aspect-oriented and component-based software development to real-time system development. We propose a novel concept of aspectual component-based real-time system development (ACCORD). ACCORD introduces the following into real-time system development: (i) a design method that assumes the decomposition of the real-time system into a set of components and a set of aspects, (ii) a real-time component model, denoted RTCOM, that supports aspect weaving while enforcing information hiding, (iii) a method and a tool for performing worst-case execution time analysis of different configurations of aspects and components, and (iv) a new approach to modelling real-time policies as aspects.

We present a case study of the development of a configurable real-time database system, called COMET, using ACCORD principles. In the COMET example we show that applying ACCORD does have an impact on real-time system development by providing efficient configuration of the real-time system. Thus, it could be a way to improve the reusability and flexibility of real-time software and the modularization of crosscutting concerns.

In connection with the development of ACCORD, we identify criteria that a design method for component-based real-time systems needs to address. The criteria include a well-defined component model for real-time systems, aspect separation, support for system configuration, and analysis of the composed real-time system. Using the identified set of criteria, we provide an evaluation of ACCORD. In comparison with other approaches, ACCORD provides a distinct classification of crosscutting concerns in the real-time domain into different types of aspects, and provides a real-time component model that supports weaving of aspects into the code of a component, as well as a tool for temporal analysis of the weaved system. / Report code: LiU-TEK-LIC-2003:23.
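The RTCOM analysis itself is not reproduced here; as a minimal sketch of the aspect-level WCET idea shared by the three theses above, the code below composes a configuration's WCET bound from per-component WCETs plus the WCETs of the advice woven at each join point. The additive model, names and numbers are illustrative assumptions; a real analysis must account for how weaving changes execution paths and hardware state.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    base_wcet_us: int                                 # WCET of the unwoven code
    join_points: list = field(default_factory=list)   # names of weave sites

@dataclass
class Aspect:
    name: str
    advice_wcet_us: dict = field(default_factory=dict)  # join point -> WCET

def configuration_wcet(components, aspects):
    """Additive aspect-level WCET bound: base component WCETs plus the
    advice WCET at every join point each aspect is woven into.  A
    simplified illustration, not the RTCOM analysis itself."""
    total = sum(c.base_wcet_us for c in components)
    for a in aspects:
        for c in components:
            for jp in c.join_points:
                total += a.advice_wcet_us.get(jp, 0)
    return total

# Invented example configuration:
comps = [Component("lock_mgr", 120, ["enter", "exit"]),
         Component("index",    300, ["lookup"])]
logging = Aspect("logging", {"enter": 15, "exit": 10, "lookup": 25})
print(f"WCET bound: {configuration_wcet(comps, [logging])} us")
```

Swapping the aspect set changes the bound without re-analyzing the components, which is the configurability benefit these theses aim at.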
