  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Detecção de adulteração de combustíveis com sensores poliméricos eletrodepositados e redes neurais artificiais. / Fuel adulteration detection using electrodeposited polymer sensors and artificial neural networks.

Sérgio Tonzar Ristori Ozaki 11 June 2010 (has links)
Fuel adulteration is a major concern in Brazil. The national regulatory agency (ANP) detects adulteration in 1 to 3% of the samples it collects each year, a high rate considering the size of the Brazilian market. Because the possibilities for adulteration are vast and highly dynamic, sensor arrays based on the global selectivity concept appear to be the most suitable approach for detecting fuel fraud. The global selectivity concept combines the cross-sensitivity of non-specific chemical sensors with multivariate data analysis methods to provide fingerprints for samples of different chemical composition. Chemical sensors can employ different types of sensoactive materials, whose electrical responses depend on the physicochemical characteristics of the media they contact. Conducting polymers are particularly suitable sensoactive materials, since their electrical conductivity is highly sensitive to environmental conditions and they can be easily processed as thin films by several techniques. In the present work, films of poly(3-methylthiophene) (PMTh) and poly(3-hexylthiophene) (PHTh) are deposited by chronopotentiometry and chronoamperometry onto interdigitated microelectrodes and characterized by impedance spectroscopy. The data are analyzed with multilayer perceptron neural networks, yielding very good performance in gasoline adulteration detection. The same approach was also applied to the detection of fuel-ethanol adulteration, with somewhat lower performance.
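The classification step described above can be sketched with a small multilayer perceptron trained on synthetic sensor features. This is only an illustration under stated assumptions: the data is invented (two well-separated Gaussian classes standing in for impedance readings at eight frequencies), and the network architecture is not the one used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "impedance" features at 8 frequencies for two classes
# (made-up stand-in for real sensor data; not the thesis's dataset).
n = 100
X = np.vstack([
    rng.normal(1.0, 0.1, size=(n, 8)),   # unadulterated fuel (label 0)
    rng.normal(1.4, 0.1, size=(n, 8)),   # adulterated fuel   (label 1)
])
y = np.array([0.0] * n + [1.0] * n).reshape(-1, 1)

# One-hidden-layer perceptron trained by plain gradient descent.
W1 = rng.normal(0, 0.5, size=(8, 6)); b1 = np.zeros(6)
W2 = rng.normal(0, 0.5, size=(6, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)       # hidden layer activations
    p = sigmoid(h @ W2 + b2)       # predicted probability of adulteration
    # Backpropagate the mean cross-entropy gradient.
    d2 = (p - y) / len(X)
    d1 = (d2 @ W2.T) * (1 - h ** 2)
    W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)

accuracy = ((p > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On well-separated toy classes like these the network separates the two fuels easily; the point of the sketch is only the pipeline shape (features in, class probability out), not the numbers.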
12

Kontrola identifikátorů vozidel / Checking of Vehicle Identifiers

Motúz, Ondřej January 2015 (has links)
This diploma thesis focuses on the identifiers of vehicles and their parts. It gives an overview of the identifiers found on vehicles, then deals with the potential forgery and alteration of identifiers, and shows possible ways to detect such manipulation. The practical part of the thesis includes a proposed methodology for checking vehicles based on their identifiers. A catalogue of Škoda vehicles was also created, giving an overview of the identifiers for selected vehicles of that brand.
13

Vetenskap eller pseudovetenskap? : En utvärdering av giltigheten i Poppers kritik gentemot Freuds psykoanalytiska teori på basis av demarkationskriteriet / Science or pseudoscience? : An evaluation of the validity of Popper's criticism of Freud's psychoanalytic theory on the basis of the demarcation criterion

Isfåle, Linda January 2008 (has links)
In this essay I evaluate the validity of Karl Popper's criticism of psychoanalysis, namely his claim that this theory of Freud's is pseudoscientific. Popper's criticism is based on his theory of demarcation, in which he states that an empirical theory must be testable by observation and, most importantly, hypothetically falsifiable on the basis of other empirical statements, often in the form of newly discovered facts that contradict the original statement or theory. In order to assess Popper's criticism I perform a modified idea analysis, based on a book by Evert Vedung (1977). Drawing on both Popper and spokespersons for psychoanalysis, I structure the arguments for and against Popper's criticism and then weigh these arguments against each other. My main conclusion is that psychoanalysis, regardless of Popper's criticism, is in fact an empirical theory, since it can be internally validated through the observations made by a psychoanalyst. According to the theory of demarcation, however, psychoanalysis cannot be tested by observation, probably because by "observations" Popper meant only those that can be made and validated by independent scientists.
14

Falsification of the LOOK

Veas, Rodrigo Andres 01 July 2015 (has links) (PDF)
The LOOK is a viewing time measure that seeks to assess sexual interest patterns and is currently in development at Brigham Young University. The instrument is intended to aid current efforts to prevent child sexual abuse by identifying deviant sexual interests. A recently presented study on a similar viewing time measure raised concerns about individuals' ability, on average, to falsify sexual interest patterns. This study extends that falsification research to the LOOK in order to assess whether falsification of this measure is possible by means of speed or pretense. Participants were exclusively heterosexual, non-pedophilic males and females; sexual interest patterns for 151 females and 150 males were used. These individuals were distributed into either a control group or one of four possible falsification conditions for each gender. The study used Fischer's chi-square scoring procedure to examine the significance of differences between the averaged interest patterns obtained from the falsification groups and the expected patterns of the control groups. Four of the eight falsification groups were able to significantly falsify sexual interest patterns on average, and it appears that, on average, everyone in the pretense groups was capable of falsifying results. Men and women were able to emulate the response patterns of the opposite gender regardless of whether they were given information about the basic mechanism of visual response time instruments. It is concluded that while the LOOK seems to possess a degree of sensitivity toward falsification efforts, improvements are still needed to increase its ability to detect test-takers' efforts to falsify results on average.
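The abstract's "Fischer's chi-square scoring procedure" is not specified, but the underlying comparison can be illustrated with a plain chi-square goodness-of-fit test between an averaged pattern and the control group's expected pattern. The numbers below are invented for illustration only.

```python
# Generic stand-in (not the study's exact procedure): compare an observed
# averaged viewing-time pattern to the control group's expected pattern.
observed = [40, 25, 20, 15]   # falsification group's pattern (illustrative)
expected = [35, 30, 20, 15]   # control group's expected pattern (illustrative)

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
critical = 7.815  # chi-square critical value for df = 3, alpha = 0.05
significant = chi2 > critical
print(round(chi2, 2), significant)  # 1.55 False
```

With these toy numbers the difference is not significant; a group that successfully mimics the expected pattern would similarly fail to exceed the critical value.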
15

Enjeux politiques du rationalisme critique chez Karl Popper. / Political stakes of critical rationalism in Karl Popper's works

Abessolo Metogo, Christel-Donald 27 June 2013 (has links)
Humanity's interest in knowledge plays out on two fronts: the reduction of ignorance, and action, both individual and collective. The way we acquire knowledge is therefore essential, because it shapes both our perception of the world and our awareness of ourselves and of society. For if, with reason as an ally, human beings discover seemingly unlimited potential in themselves, we would nevertheless be wrong to overlook a hard reality: our infinite ignorance, and ultimately our inability to grasp with any certainty this complex, constantly evolving world that hosts us. This is why, for Karl Popper, any genuine rationality must be critical, that is, pluralist and open to debate. Only such rationality can objectively take the measure of the gap that separates us from the truth and, consequently, lead us to act with caution and discernment, in the interest of science as much as in that of the community.
16

Formalisation d'un environnement d'analyse des données basé sur la détection d'anomalies pour l'évaluation de risques : Application à la connaissance de la situation maritime / Formalisation of a data analysis environment based on anomaly detection for risk assessment : Application to Maritime Domain Awareness

Iphar, Clément 22 November 2017 (has links)
Various systems enable vessels at sea to be located and tracked, supporting navigation and securing maritime traffic; on shore, these systems give coastal states a picture of the maritime traffic and serve shore-based surveillance centres as monitoring and decision-support tools. The Automatic Identification System (AIS), deployed by the International Maritime Organization, is the most widely used vessel-location system today, yet it is weakly secured. This vulnerability is illustrated by real, detected cases such as identity theft and voluntary vessel disappearances, which pose risks to ships, to offshore and coastal infrastructure, and to the environment.
This thesis proposes a methodological approach for analysing and assessing AIS messages based on data quality dimensions, with integrity regarded as the most important of those dimensions. Because the structure of AIS data is complex, a list of integrity indicators was established to assess the integrity of the data, its conformity with the system's technical specifications, and the consistency of the message fields with one another, both within a single message and across messages. The approach also relies on additional information, such as geographic data and vessel registers, to assess the truthfulness and authenticity of an AIS message and its sender. Finally, an assessment of the associated risks is proposed, allowing a better understanding of the maritime situation and the establishment of causal links between the system's vulnerabilities and the risks bearing on the safety and security of maritime navigation.
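Conformity checks of the kind described above can be sketched as simple range tests against the AIS technical specification. This is an illustration, not the thesis's actual indicator list; the field names of the decoded message are assumptions for this example.

```python
# Illustrative integrity checks on a decoded AIS position report, flagging
# values outside the system's technical specifications. Field names are
# assumed for this sketch.
def ais_integrity_flags(msg: dict) -> list[str]:
    flags = []
    # MMSI identifiers are nine-digit numbers.
    if not (100_000_000 <= msg.get("mmsi", 0) <= 999_999_999):
        flags.append("invalid_mmsi")
    # Latitude/longitude outside physical bounds (91 and 181 degrees are
    # the AIS "not available" codes).
    if not -90.0 <= msg.get("lat", 91.0) <= 90.0:
        flags.append("invalid_latitude")
    if not -180.0 <= msg.get("lon", 181.0) <= 180.0:
        flags.append("invalid_longitude")
    # Speed over ground above 102.2 knots is reserved in the AIS encoding.
    if msg.get("sog", 0.0) > 102.2:
        flags.append("implausible_speed")
    return flags

print(ais_integrity_flags({"mmsi": 227006760, "lat": 48.1, "lon": -5.3, "sog": 12.0}))  # []
```

A real indicator list would go much further (cross-field and cross-message consistency, register lookups), but each indicator reduces to a check of this shape.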
17

Vérification de la pléiotropie en randomisation mendélienne : évaluation méthodologique et application à l'estimation de l'effet causal de l'adiposité sur la pression artérielle / Assessing pleiotropy in Mendelian randomization: methodological evaluation and application to estimating the causal effect of adiposity on blood pressure

Mbutiwi, Fiston Ikwa Ndol 07 1900 (has links)
Introduction: Mendelian randomization (MR) is an increasingly popular technique in observational studies that uses genetic variants (usually single-nucleotide polymorphisms, SNPs) associated with an exposure (assumption 1, relevance) as instruments to estimate the causal effect of that exposure on an outcome, assuming no confounding between the instrument and the outcome (assumption 2, independence) and no effect of the instrument on the outcome other than through the exposure (assumption 3, exclusion restriction). The validity of MR results is, however, challenged by pleiotropy, the biological phenomenon whereby a SNP affects the exposure and the outcome separately; pleiotropy is one of the leading causes of violation of the exclusion restriction assumption. This thesis examines practical methodological challenges of MR related to assessing the exclusion restriction and the validity of MR results, through three main objectives: 1) to map how MR researchers prevent, detect and/or control for, and discuss, potential violations of the exclusion restriction due in particular to pleiotropy; 2) to evaluate the performance of the leveraging positive confounding (LPC) method, which compares the MR and conventional-regression point estimates of the exposure-outcome effect, in detecting invalid instruments in several practical MR settings; and 3) to examine how the methods commonly used to account for antihypertensive medication in MR studies modeling blood pressure (BP) affect the estimation of the causal effect and the detection of potential exclusion restriction violations.
Methods: For objective 1, a literature review of 128 MR studies that used at least one SNP in the fat mass and obesity-associated (FTO) gene as an instrument for body mass index (BMI) examined how the authors prevent, detect or control for, and discuss, potential violations of the exclusion restriction, especially those due to pleiotropy. For objective 2, a simulation study was performed covering MR settings that use a single SNP or a genetic risk score (GRS) as the instrument, with a continuous or binary outcome, in scenarios evaluating the impact of sample size and of the type of pleiotropy (indirect versus direct). The performance of the LPC method was measured as the percentage of simulated datasets in which it detected invalid instruments. For objective 3, an MR study of the association between BMI and systolic BP (SBP) was performed. The methods examined for accounting for antihypertensive medication were: (i) no adjustment; (ii) including medication in the models as an adjustment covariate; (iii) excluding subjects treated with antihypertensive medication from the analysis; (iv) adding a constant of 15 mm Hg to the measured SBP of treated subjects; and (v) using a binary indicator of hypertension as the outcome.
Results: A plethora of methods is used in MR studies, some of which may be suboptimal for preventing, detecting, or controlling for bias due to the inclusion of pleiotropic SNPs. The simulations show that in MR using a single SNP as the instrument, the LPC method detects instrument invalidity better when the pleiotropy is direct rather than indirect, regardless of the outcome type, and its performance improves with increasing sample size. In contrast, the method performs less well when the instrument is a GRS, although its performance increases with the proportion of invalid SNPs included in the GRS. Finally, MR estimates vary greatly depending on the chosen strategy for handling antihypertensive medication, whereas the detection of exclusion restriction violations is unaffected by that choice.
Conclusion: This thesis highlights methodological challenges in MR applications and the importance of triangulating several methods when assessing the MR assumptions. The MR field is booming and new methods are frequently proposed; it is important not only to evaluate them but also to detail their use and underlying assumptions so that they can optimally complement existing methods.
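For a single valid instrument, the MR point estimate reduces to the Wald ratio: the SNP-outcome association divided by the SNP-exposure association. A tiny illustration, with made-up coefficients rather than the thesis's results:

```python
# Illustrative Wald-ratio Mendelian randomization estimate.
# The coefficients below are invented for this sketch.
beta_gx = 0.35  # per-allele SNP -> BMI association (instrument -> exposure)
beta_gy = 0.14  # per-allele SNP -> SBP association (instrument -> outcome)

wald_ratio = beta_gy / beta_gx  # estimated causal effect of BMI on SBP
print(round(wald_ratio, 2))  # 0.4
```

Pleiotropy biases this ratio because a direct SNP-outcome path inflates beta_gy beyond what the exposure pathway alone would produce, which is why verifying the exclusion restriction matters.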
18

Il ruolo e la funzione del falso nella storia della shoah : storici, affaires e opinione pubblica / Le rôle et la fonction du faux dans l’histoire de la shoah : historiens, affaires et opinion publique / The role and function of false in the Holocaust history : historians, affaires and public opinion

Bertolini, Frida 14 January 2013 (has links)
The problem of forgery is one that specialists of every historical period have had to confront, but it has been accelerated and sharpened in the history of the present time, partly because of the simultaneous presence of the protagonists, which has complicated a historical and commemorative scene deeply marked by the relationship between historians and witnesses and by the particular articulation of public and private memory. The event that has suffered most acutely from the problem of forgery in the contemporary era is certainly the genocide of the Jews committed by the Nazis during the Second World War, for it is precisely at the heart of the genocidal enterprise that the greatest falsification took place, one that has fed every subsequent revisionist discourse. The denial of the extermination, together with the Nazis' attempt to conceal and destroy the evidence of their guilt, is indeed consubstantial with the unfolding of the events themselves, operating on two levels: at the origin, through the systematic suppression of traces and potential witnesses; and later, at the various stages of the historiographical operation. The negationist sophism according to which the murderous reality of the gas chambers could be proven only by those who saw it with their own eyes, that is, by those who lost their lives there, calls into question not only the historical reality of the event but also, consequently, the memory of the survivors, who have been forced to confront the falsification of their experience ever since the time of Nazi persecution. The historian has thus become the protagonist of a present in which history and memory are often inextricably linked.
19

The design of a defence mechanism to mitigate the spectrum sensing data falsification attack in cognitive radio ad hoc networks

Ngomane, Issah January 2018 (has links)
Thesis (M.Sc. (Computer Science)) -- University of Limpopo, 2018 / Dynamic spectrum access enabled by cognitive radio networks is envisioned to address the spectrum demands of ever-expanding wireless technology. This innovative technology increases spectrum utilisation by allowing unlicensed devices to use the unoccupied spectrum bands of licensed devices opportunistically. The unlicensed devices, referred to as secondary users (SUs), constantly sense the spectrum band to avoid interfering with the transmissions of the licensed devices, known as primary users (PUs). Because environmental challenges can interfere with effective spectrum sensing, the SUs have to cooperate in sensing the spectrum band. However, cooperative spectrum sensing is susceptible to the spectrum sensing data falsification (SSDF) attack, in which selfish radios falsify their spectrum reports. Hence, there is a need for a defence scheme that counters the SSDF attack and guarantees a correct final transmission decision. In this study we proposed integrating a reputation-based system with the q-out-of-m rule. The reputation-based system was used to determine the trustworthiness of the SUs. In the q-out-of-m rule, m sensing reports are selected from nodes with good reputations and q is the threshold for the final decision; this combination was used to isolate malicious nodes and reach the correct final transmission decision. The proposed scheme was implemented in a cognitive radio ad hoc network (CRAHN), where the services of a data fusion centre (FC) are not required: each SU conducts its own data fusion and makes its own final transmission decision based on its sensing report and those of its neighbouring nodes. Matlab was used to implement and simulate the proposed scheme, which we compared with the multifusion-based distributed spectrum sensing and density-based system schemes. The metrics used were success probability, missed detection probability, and false alarm probability; the proposed scheme outperformed the other schemes on all three metrics. / CSIR, NRF and University of Limpopo research office
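The combined reputation and q-out-of-m fusion described above can be sketched in a few lines. The node identifiers, reputation scores, and thresholds below are illustrative, not values from the thesis.

```python
# Hedged sketch of reputation-weighted q-out-of-m fusion at a single SU:
# pick the m most reputable neighbours' reports, and declare the primary
# user present if at least q of them say so.
def q_out_of_m(reports, reputations, m, q):
    """reports/reputations: dicts keyed by node id; report 1 = 'PU present'."""
    trusted = sorted(reports, key=lambda node: reputations[node], reverse=True)[:m]
    votes = sum(reports[node] for node in trusted)
    return votes >= q  # final transmission decision: True = channel busy

reports = {"a": 1, "b": 1, "c": 0, "d": 1, "e": 0}
reputations = {"a": 0.9, "b": 0.8, "c": 0.2, "d": 0.7, "e": 0.1}  # c, e untrusted
print(q_out_of_m(reports, reputations, m=3, q=2))  # True
```

Here the low-reputation nodes "c" and "e" are excluded before voting, so falsified reports from malicious radios never reach the final decision.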
20

Instrumentation and Coverage Analysis of Cyber Physical System Models

January 2016 (has links)
A cyber-physical system consists of a computer monitoring and controlling physical processes, usually in a feedback loop. These systems are increasingly part of our daily lives, ranging from smart buildings to medical devices to automobiles. The controller comprises discrete software that may operate in one of many possible operating modes while interacting with a changing physical environment in a feedback loop. Systems with such a mix of discrete and continuous dynamics are usually termed hybrid systems. These systems are generally safety-critical, so their correct operation must be verified. Model-based design (MBD) languages such as Simulink are used extensively for the design and analysis of hybrid systems because they ease system design, provide automatic code generation, and allow testing and verification before deployment. One of the main challenges in verifying these systems is to test all the operating modes of the control software while reducing the amount of user intervention. This research provides an automated framework for the structural analysis and instrumentation of hybrid system models developed in Simulink: the behavior of the components introducing discontinuities in the model is automatically extracted in the form of state transition graphs, and the framework is integrated into the S-TaLiRo toolbox to demonstrate the improvement in mode coverage. / Dissertation/Thesis / Masters Thesis Computer Science 2016
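The mode-coverage idea can be illustrated with a toy state transition graph and a simulated trace. The mode names, transitions, and trace below are invented for this sketch; this is not the S-TaLiRo API.

```python
# Toy sketch: measure what fraction of a component's operating modes a
# simulation trace visited, given its extracted state transition graph.
transitions = {("low", "high"), ("high", "low"), ("high", "sat")}
modes = {mode for edge in transitions for mode in edge}  # {"low", "high", "sat"}

trace = ["low", "high", "low", "high"]  # mode observed at each simulation step
covered = set(trace)
mode_coverage = len(covered & modes) / len(modes)
print(round(mode_coverage, 2))  # 0.67: "sat" was never reached
```

A coverage report like this tells the test generator which operating modes (here the saturated mode) still need an input signal that drives the model into them.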
