201 |
A description and programming language for industrial process control systems
Pleyber, Joël 10 March 1978
|
202 |
Optimized compilation of UML models
Charfi Smaoui, Asma 12 December 2011
This thesis is set within the application of model-driven engineering (MDE) to the development of embedded systems. Since these systems generally have limited resources (memory and/or computing power), the generated code must be as optimized as possible. The goal of this thesis is to produce, from a model specified in UML, assembly code that is more compact than the assembly produced by source-code compilers. Despite the steady progress of optimizing compilers, the most widespread compilers, such as GCC (the GNU Compiler Collection), are unable to perform certain kinds of optimizations that can be carried out at a higher level of abstraction, in a pre-code-generation phase. Indeed, some information (tied to the execution semantics of the UML language) is lost during code generation. This information, useful for high-level optimizations, is invisible to the compiler, since the compiler obtains all its information about the modeled system from the generated code. We therefore propose a new model-driven approach to the development of resource-constrained systems that eliminates the code generation step, replacing it with direct compilation of the models. We have developed the first UML model compiler (GUML: the UML front end for the GCC compiler), which generates assembly directly from UML models, without going through a programming language. GUML compiles UML classes, activities and state machines. For certain state machines, it generates assembly code more compact than that produced by GCC. Two GCC optimizations are improved: dead code elimination and the elimination of redundant expressions.
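The dead code elimination mentioned above rests on a standard liveness argument. A minimal sketch on a toy three-address form (the representation and names are illustrative, not GUML's or GCC's internals):

# Toy dead-code elimination over a straight-line, three-address program.
# Each instruction is (target, operands); live_out marks externally
# visible values. Illustrative only, not GUML's actual representation.

def eliminate_dead_code(instrs, live_out):
    live = set(live_out)          # variables still needed after the block
    kept = []
    for target, operands in reversed(instrs):
        if target in live:        # result is needed: keep, update liveness
            live.discard(target)
            live.update(operands)
            kept.append((target, operands))
        # else: the assignment is dead and is dropped
    return list(reversed(kept))

program = [
    ("a", ("x", "y")),   # a = x op y
    ("b", ("a", "z")),   # b = a op z   -- dead: b is never used
    ("c", ("a", "x")),   # c = a op x
]
print(eliminate_dead_code(program, live_out={"c"}))
# -> [('a', ('x', 'y')), ('c', ('a', 'x'))]

The point of GUML is that model-level semantics (e.g., unreachable state-machine transitions) can mark more code as dead than a C-level liveness analysis can see.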
|
203 |
Continuation-Passing C: program transformations for compiling concurrency in an imperative language
Kerneis, Gabriel 09 November 2012
Most computer programs are concurrent: they must carry out several tasks at the same time. Threads and events are the two usual techniques for implementing concurrency. Events are generally lighter and more efficient than threads, but also harder to use. Moreover, they are often too limited; it then becomes necessary to write hybrid code, more complex still, that mixes preemptively scheduled threads with cooperatively scheduled events. We show in this thesis that concurrent programs written in a threaded style can be translated automatically into equivalent, efficient event-driven programs through a sequence of proven source-to-source transformations. We first propose Continuation-Passing C, an extension of the C language for writing concurrent systems that provides very lightweight, unified (cooperative and preemptive) threads. CPC programs are processed by the CPC translator to produce efficient sequentialized event-loop code, using native threads for the preemptive parts. We then define these transformations, in particular lambda lifting and CPS conversion, for an imperative language, and prove their correctness. Finally, we validate the design and implementation of CPC by comparing it to other thread libraries and by presenting Hekate, our BitTorrent seeder. We also justify our choice of lambda lifting by implementing eCPC, a variant of CPC using environments, and comparing its performance to CPC's.
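A minimal sketch of the CPS idea at the heart of the CPC translator, written in Python rather than C for brevity (the function names are invented for illustration; CPC itself transforms annotated C):

# Direct style: the call stack holds the rest of the computation.
def add_direct(x, y):
    return x + y

# Continuation-passing style: the "rest of the computation" is an explicit
# function k, so a scheduler can suspend and resume it at will -- the key
# property that lets thread-style code run on an event loop.
def add_cps(x, y, k):
    return k(x + y)

def square_cps(x, k):
    return k(x * x)

# (x + y)^2, written as a chain of continuations instead of nested calls.
def sum_then_square_cps(x, y, k):
    return add_cps(x, y, lambda s: square_cps(s, k))

sum_then_square_cps(3, 4, print)   # prints 49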
|
204 |
Completeness of Fact Extractors and a New Approach to Extraction with Emphasis on the Refers-to Relation
Lin, Yuan 07 August 2008
This thesis deals with fact extraction, which analyzes source code (and sometimes related artifacts) to produce extracted facts about the code. These facts may, for example, record where in the code variables are declared and where they are used, as well as related information. These extracted facts are typically used in software reverse engineering to reconstruct the design of the program.
This thesis has two main parts, each of which deals with a formal approach to fact extraction. Part 1 of the thesis deals with the question: How can we demonstrate that a fact extractor actually does its job? That is, does the extractor produce the facts that it is supposed to produce? This thesis builds on the concept of semantic completeness of a fact extractor, as defined by Tom Dean et al., and further defines source, syntax and compiler completeness. One of the contributions of this thesis is to show that, in certain important cases (when the extractor is deterministic and its front end is idempotent), there is an efficient algorithm to determine whether the extractor is compiler complete. This result is surprising, considering that it is in general undecidable whether two programs are semantically equivalent, and source code and its corresponding extracted facts are each essentially programs that must be proved equivalent, or at least sufficiently similar.
The larger part of the thesis, Part 2, presents Algebraic Refers-to Analysis (ARA), a new approach to fact extraction with emphasis on the Refers-to relation. ARA provides a framework for specifying fact extraction, based on a three-step pipeline: (1) basic (lexical and syntactic) extraction, (2) a normalization step and (3) a binding step.
For practical programming languages, these three steps are repeated, in stages and phases, until the Refers-to relation is computed. During the writing of this thesis, ARA pipelines for C, Java, C++, Fortran, Pascal and Ada have been designed. A prototype fact extractor for the C language has been created.
Validating ARA means demonstrating that ARA pipelines satisfy programming language standards such as the ISO C++ standard. In other words, we show that the ARA phases (stages and formulas) are correctly transcribed from the rules in the language standard.
Compared with existing approaches such as Attribute Grammars, ARA has the following advantages. First, ARA formulas are concise, elegant and, more importantly, insightful; as a result, we make some interesting discoveries about the programming languages themselves. Second, ARA is validated on the basis of set theory and relational algebra, which is more reliable than exhaustive testing. Finally, ARA formulas are supported by existing software tools such as database management systems and relational calculators.
Overall, the contributions of this thesis include 1) the concept of a hierarchy of completeness and the automatic testing of completeness, 2) the use of the relational data model in fact extraction, 3) the invention of Algebraic Refers-to Analysis (ARA) and 4) the discovery of some interesting facts about programming languages.
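As a hedged illustration of the relational style underlying ARA (the relations DECLARES, USES and PARENT below are invented for the example, not the thesis's actual stages and formulas), a Refers-to relation for a block-structured language can be computed with plain set operations:

# Toy "refers-to" computation in a relational style, using Python sets.
DECLARES = {("global", "x"), ("f", "x"), ("f", "y")}   # (scope, name)
USES     = {("f", "x"), ("f", "y"), ("g", "x")}        # (scope, name)
PARENT   = {"f": "global", "g": "global"}              # enclosing scope

def visible_scopes(scope):
    """Scopes searched for a name: innermost first, then enclosing ones."""
    while scope is not None:
        yield scope
        scope = PARENT.get(scope)

def refers_to():
    """Bind each use to the innermost enclosing declaration of that name."""
    binding = set()
    for use_scope, name in USES:
        for s in visible_scopes(use_scope):
            if (s, name) in DECLARES:
                binding.add(((use_scope, name), (s, name)))
                break
    return binding

for use, decl in sorted(refers_to()):
    print(use, "->", decl)
# ('f','x') -> ('f','x');  ('f','y') -> ('f','y');  ('g','x') -> ('global','x')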
|
205 |
Certified compilation of SCADE/LUSTRE
Auger, Cédric 07 February 2013
Synchronous languages appeared around the 1980s in response to the need for a simple mathematical model in which to implement safety-critical real-time systems. In this model, time is divided into discrete instants during which every component of the system receives and produces a value. This modeling allows much simpler reasoning, since the computation time of each operation need not be taken into account. In the world of critical software, the reliability of the hardware and of its behavior is paramount, and one accepts running slower in exchange for greater safety. To increase this reliability, rather than designing the whole system by hand, tools are used that automatically synthesize the desired system from as concise a description as possible. For software, this mechanism is called compilation, and it avoids errors inadvertently introduced by humans. It does not, however, guarantee that the produced system matches the given description. Recent work by an INRIA team led by Xavier Leroy resulted in 2008 in CompCert, a compiler from a large subset of C to PowerPC assembly, for which it has been proved in the Coq proof assistant that the produced assembly code indeed corresponds to the C description of the source program. Such a compiler offers strong guarantees that the synthesized system matches the given description. Moreover, in compilers used for critical real-time systems, most optimizations are disabled in order to avoid the errors they may introduce; CompCert provides optimizations that are themselves proved correct, which could allow these passes in the production of critical real-time systems without compromising reliability. The goal of this thesis is to take a similar approach, but specific to a synchronous language, which is better suited than C to describing critical real-time systems. A synchronous dataflow language similar to Lustre, named Ls, and an imperative language similar to C, named Obc, are proposed, together with their formal semantics and a compilation chain with proofs of semantics preservation along the chain.
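A minimal sketch of the compilation scheme such a chain formalizes, assuming a toy counter node n = 0 -> pre n + 1 (paraphrased here in Python; Ls and Obc are the thesis's actual source and target languages, and the generated-code shape shown is an assumption for illustration):

# A Lustre-like node  "n = 0 -> pre n + 1"  (a counter) compiled, in the
# Ls -> Obc spirit, to a reset function plus a step function over explicit
# state, one call per discrete instant.

class Counter:
    def reset(self):
        self.init = True      # are we at the first instant?
        self.pre_n = 0        # memory cell for "pre n"

    def step(self):
        n = 0 if self.init else self.pre_n + 1
        self.init = False
        self.pre_n = n        # update the memory for the next instant
        return n

c = Counter()
c.reset()
print([c.step() for _ in range(5)])   # [0, 1, 2, 3, 4]

The correctness statement of such a chain is that the stream of values returned by repeated step calls equals the stream denoted by the dataflow equation.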
|
207 |
Automatic generation of datapaths for microprocessor-type VLSI circuits
Jamier, Robert; Courtois, Bernard January 2008
Reproduction of: docteur-ingénieur thesis: computer science: Grenoble, INPG: 1986. / Title taken from the title screen. Bibliography pp. 233-241.
|
208 |
Dynamics, Processes and Characterization in Classical and Quantum Optics
Gamel, Omar 09 January 2014
We pursue topics in optics that follow three major themes: time-averaged dynamics with the associated Effective Hamiltonian theory, quantification and transformation of polarization, and periodicity within quantum circuits.
Within the first theme, we develop a technique for finding the dynamical evolution in time of a time-averaged density matrix. The result is an equation of evolution that includes an Effective Hamiltonian, as well as decoherence terms that sometimes manifest in a Lindblad-like form. We also apply the theory to the examples of the AC Stark shift and three-level Raman transitions.
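As a hedged numerical illustration of the AC Stark shift example (standard two-level, rotating-frame physics, not the thesis's actual derivation), the exact dressed energy can be compared with the perturbative shift an effective Hamiltonian would contain:

import numpy as np

# Two-level atom in the rotating frame: H = [[0, W/2], [W/2, -D]]
# (W = Rabi frequency, D = detuning, hbar = 1). In the far-detuned limit
# the ground level is shifted by the AC Stark shift W**2 / (4*D).
# Textbook physics used as an illustration, not the thesis's model.

W, D = 0.1, 1.0
H = np.array([[0.0, W / 2], [W / 2, -D]])

exact_shift = np.linalg.eigvalsh(H).max()     # dressed ground-level energy
perturbative_shift = W**2 / (4 * D)

print(exact_shift, perturbative_shift)        # ~0.0024937 vs 0.0025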
In the theme of polarization, the most general physical transformation on the polarization state has been represented as an ensemble of Jones matrix transformations, equivalent to a completely positive map on the polarization matrix. This has been directly assumed without proof by most authors. We follow a novel approach to derive this expression from simple physical principles, basic coherence optics and the matrix theory of positive maps.
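A minimal numerical sketch of that ensemble representation (the particular ensemble below is invented for illustration): an ensemble {(p_i, J_i)} acts on the 2x2 polarization matrix rho as rho -> sum_i p_i J_i rho J_i^dagger, a completely positive map:

import numpy as np

# Ensemble-of-Jones-matrices action on the 2x2 polarization (coherency)
# matrix: rho -> sum_i p_i J_i rho J_i^dagger.
def apply_ensemble(rho, ensemble):
    return sum(p * J @ rho @ J.conj().T for p, J in ensemble)

rho_h = np.array([[1, 0], [0, 0]], dtype=complex)   # horizontally polarized

identity = np.eye(2, dtype=complex)
rot90 = np.array([[0, -1], [1, 0]], dtype=complex)  # 90-degree rotator

# A 50/50 mix of "do nothing" and "rotate by 90 degrees" depolarizes H light:
ensemble = [(0.5, identity), (0.5, rot90)]
print(apply_ensemble(rho_h, ensemble))              # 0.5 * identity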
Addressing polarization measurement, we first establish the equivalence of classical polarization and quantum purity, which leads to the identical structure of the Poincaré and Bloch spheres. We analyze and compare various measures of polarization/purity for general dimensionality proposed in the literature, with a focus on the three-dimensional case.
In pursuit of the final theme of periodic quantum circuits, we introduce a procedure that synthesizes the circuit for the simplest periodic function of a given period p that is one-to-one within a single period. Applying this procedure, we synthesize these circuits for p of up to five bits. We conjecture that such a circuit needs at most n Toffoli gates, where p is an n-bit number.
Moreover, we apply our circuit synthesis to compiled versions of Shor's algorithm, showing that it can create more efficient circuits than ones previously proposed. We provide some new compiled circuits for experimentalists to use in the near future. A layer of "classical compilation" is pointed out as a method to further simplify circuits. Periodic and compiled circuits should be helpful for creating experimental milestones, and for the purposes of validation.
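A hedged sketch of the classical function behind those circuits (choosing f(x) = x mod p is an assumption for illustration; the thesis synthesizes the reversible, Toffoli-based circuits realizing such functions):

# The simplest function of period p that is one-to-one within one period:
# f(x) = x mod p on n-bit inputs. The choice of "x mod p" here is an
# illustrative assumption about the function family, not the synthesis step.

def f(x, p):
    return x % p

def check(p, n_bits):
    xs = range(2 ** n_bits)
    periodic   = all(f(x, p) == f(x + p, p) for x in xs if x + p < 2 ** n_bits)
    one_to_one = len({f(x, p) for x in range(p)}) == p
    return periodic and one_to_one

print(check(p=5, n_bits=4))   # True: period 5, injective on {0, ..., 4}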
|
210 |
Certified Compilation and Worst-Case Execution Time Estimation
Oliveira Maroneze, André 17 June 2014
Safety-critical systems - such as electronic flight control systems and nuclear reactor controls - must satisfy strict safety requirements. We are interested here in the application of formal methods - built upon solid mathematical bases - to verify the behavior of safety-critical systems. More specifically, we formally specify our algorithms and then prove them correct using the Coq proof assistant - a program capable of mechanically checking the correctness of our proofs, providing a very high degree of confidence. In this thesis, we apply formal methods to obtain safe Worst-Case Execution Time (WCET) estimations for C programs. The WCET is an important property related to the safety of critical systems, but its estimation requires sophisticated techniques. To guarantee the absence of errors during WCET estimation, we have formally verified a WCET estimation technique based on the combination of two main methods: loop bound estimation and WCET estimation via the Implicit Path Enumeration Technique (IPET). The loop bound estimation itself is decomposed into three stages: program slicing, value analysis based on abstract interpretation, and loop bound calculation. Each stage has a chapter dedicated to its formal verification. The entire development has been integrated into the formally verified C compiler CompCert. We prove that the final estimation is correct and we evaluate its performance on a set of reference benchmarks. The contributions of this thesis include (a) the formalization of the techniques used to estimate the WCET, (b) the estimation tool itself (obtained from the formalization), and (c) the experimental evaluation. We conclude that our formally verified development obtains interesting results in terms of precision, but it requires special precautions to ensure the proof effort remains manageable. The parallel development of specifications and proofs is essential to this end. Future work includes the formalization of hardware cost models, as well as the development of more sophisticated analyses to improve the precision of the estimated WCET.
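A brute-force sketch of what the IPET stage computes, on a toy control-flow graph with invented block costs and a given loop bound (real IPET phrases this as an integer linear program over block and edge execution counts; the exhaustive search below is only to make the objective concrete):

from itertools import product

# Blocks of a tiny CFG: entry -> loop body (run k times, k <= BOUND) -> exit,
# with an if/else inside the body. Costs are per-execution cycle counts,
# invented for the example.
COST = {"entry": 5, "test": 2, "then": 7, "else": 3, "exit": 4}
BOUND = 10   # loop bound, e.g. obtained from the value-analysis stage

def wcet():
    best = 0
    for k in range(BOUND + 1):                   # number of loop iterations
        for taken in product([0, 1], repeat=k):  # branch outcome per iteration
            cost = COST["entry"] + COST["exit"] + k * COST["test"]
            cost += sum(COST["then"] if t else COST["else"] for t in taken)
            best = max(best, cost)
    return best

print(wcet())   # 5 + 4 + 10*(2 + 7) = 99: always take the expensive branch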
|