71 |
PUBLICIDADE E O POLITICAMENTE CORRETO: INTERDISCURSIVIDADES NA CONSTRUÇÃO SOCIAL DO SENTIDO / Advertising and the political correctness: interdiscursivities on the social construction of meaning. Zanini, Gustavo Moreira, 27 February 2015
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / We observe that paradigms hitherto in force have come to be regarded as singular and dictatorial. In contrast, a new frame of meaning is being established whose guiding principle is politically correct thinking. Understanding advertising as a socio-cultural product, this research begins as a bibliographic study aimed at conceptualizing and analyzing the issues inherent to its theme. With this context properly apprehended, a pragmatic discourse analysis was performed on a corpus of advertisements aired between 2009 and 2014 on Brazilian free-to-air television. Our investigation focused on the ways in which elements postulated by politically correct thinking are being incorporated into the advertising genre in constructions of meaning. We observed a tendency among certain groups to rebuke advertising content that touches on very specific themes, with interpretations marked by a high degree of subjectivity; and what is often sought, through an empowerment permitted by our present context, is in fact the suppression of certain themes within advertising communication. / Ao que se percebe, paradigmas até então vigentes passam a ser considerados singulares e ditatoriais. Em contrapartida, estabelece-se uma nova acepção, cujo norte é o pensar politicamente correto. Entendendo a publicidade como um produto sociocultural, essa pesquisa inicialmente é bibliográfica visando à conceituação e análise de questões inerentes ao seu tema. Com este contexto devidamente apreendido, uma análise pragmática do discurso foi realizada em um corpus de anúncios publicitários veiculados entre 2009 e 2014, no meio televisivo brasileiro de formato aberto. Nossa investigação se concentrou nos modos com que os elementos postulados pelo pensamento politicamente correto vêm sendo incorporados ao gênero publicitário em construções de sentido. Pudemos observar uma tendência de repreensão de determinados grupos a conteúdos publicitários que tocam em temáticas muito específicas, com interpretações marcadas por um alto grau de subjetividade; e o que se busca muitas vezes, através de um empoderamento permitido por nosso contexto atual, é mesmo a supressão de determinadas temáticas dentro da comunicação publicitária.
|
72 |
Variants of acceptance specifications for modular system design / Variantes de spécifications à ensemble d'acceptation pour la conception modulaire de systèmes. Verdier, Guillaume, 29 March 2016
Les programmes informatiques prennent une place de plus en plus importante dans nos vies. Certains de ces programmes, comme par exemple les systèmes de contrôle de centrales électriques, d'avions ou de systèmes médicaux sont critiques : une panne ou un dysfonctionnement pourraient causer la perte de vies humaines ou des dommages matériels ou environnementaux importants. Les méthodes formelles visent à offrir des moyens de concevoir et vérifier de tels systèmes afin de garantir qu'ils fonctionneront comme prévu. Au fil du temps, ces systèmes deviennent de plus en plus évolués et complexes, ce qui est source de nouveaux défis pour leur vérification. Il devient nécessaire de développer ces systèmes de manière modulaire afin de pouvoir distribuer la tâche d'implémentation à différentes équipes d'ingénieurs. De plus, il est important de pouvoir réutiliser des éléments certifiés et les adapter pour répondre à de nouveaux besoins. Aussi les méthodes formelles doivent évoluer afin de s'adapter à la conception et à la vérification de ces systèmes modulaires de taille toujours croissante. Nous travaillons sur une approche algébrique pour la conception de systèmes corrects par construction. Elle définit un formalisme pour exprimer des spécifications de haut niveau et permet de les raffiner de manière incrémentale en des spécifications plus concrètes tout en préservant leurs propriétés, jusqu'à ce qu'une implémentation soit atteinte. Elle définit également plusieurs opérations permettant de construire des systèmes complexes à partir de composants plus simples en fusionnant différents points de vue d'un même système ou en composant plusieurs sous-systèmes ensemble, ainsi que de décomposer une spécification complexe afin de réutiliser des composants existants et de simplifier la tâche d'implémentation. Le formalisme de spécification que nous utilisons est basé sur des spécifications modales. Intuitivement, une spécification modale est un automate doté de deux types de transitions permettant d'exprimer des comportements optionnels ou obligatoires. Raffiner une spécification modale revient à décider si les parties optionnelles devraient être supprimées ou rendues obligatoires. Cette thèse contient deux principales contributions théoriques basées sur une extension des spécifications modales appelée " spécifications à ensembles d'acceptation ". La première contribution est l'identification d'une sous-classe des spécifications à ensembles d'acceptation, appelée " spécifications à ensembles d'acceptation convexes ", qui permet de définir des opérations bien plus efficaces tout en gardant un haut niveau d'expressivité. La seconde contribution est la définition d'un nouveau formalisme, appelé " spécifications à ensembles d'acceptation marquées ", qui permet d'exprimer des propriétés d'atteignabilité. Ceci peut, par exemple, être utilisé pour s'assurer qu'un système termine ou exprimer une propriété de vivacité dans un système réactif. Les opérations usuelles sont définies sur ce nouveau formalisme et elles garantissent la préservation des propriétés d'atteignabilité. Cette thèse présente également des résultats d'ordre plus pratique. Tous les résultats théoriques sur les spécifications à ensembles d'acceptation convexes ont été prouvés en utilisant l'assistant de preuves Coq. L'outil MAccS a été développé pour implémenter les formalismes et opérations présentés dans cette thèse. Il permet de les tester aisément sur des exemples, ainsi que d'étudier leur efficacité sur des cas concrets. 
/ Software programs are taking a more and more important place in our lives. Some of these programs, like the control systems of power plants, aircraft, or medical devices for instance, are critical: a failure or malfunction could cause loss of human lives, damage to equipment, or environmental harm. Formal methods aim at offering means to design and verify such systems in order to guarantee that they will work as expected. As time passes, these systems grow in scope and size, yielding new challenges. It becomes necessary to develop these systems in a modular fashion to be able to distribute the implementation task to engineering teams. Moreover, being able to reuse some trustworthy parts of the systems and extend them to answer new needs in functionalities is increasingly required. As a consequence, formal methods also have to evolve in order to accommodate both the design and the verification of these larger, modular systems and thus address their scalability challenge. We promote an algebraic approach for the design of correct-by-construction systems. It defines a formalism for expressing high-level specifications of systems and makes it possible to incrementally refine these specifications into more concrete ones while preserving their properties, until an implementation is reached. It also defines several operations for assembling complex systems from simpler components, by merging several viewpoints of the same system or composing several subsystems together, as well as for decomposing a complex specification in order to reuse existing components and ease the implementation task. The specification formalism we use is based on modal specifications. In essence, a modal specification is an automaton with two kinds of transitions, expressing mandatory and optional behaviors. Refining a modal specification amounts to deciding whether some optional parts should be removed or made mandatory. This thesis contains two main theoretical contributions, based on an extension of modal specifications called acceptance specifications. The first contribution is the identification of a subclass of acceptance specifications, called convex acceptance specifications, which admits much more efficient operations while maintaining a high level of expressiveness. The second contribution is the definition of a new formalism, called marked acceptance specifications, which can express reachability properties. This can be used, for example, to ensure that a system terminates or to express a liveness property for a reactive system. The usual operations are defined on this new formalism and guarantee the preservation of the reachability properties as well as independent implementability. This thesis also describes some more practical results. All the theoretical results on convex acceptance specifications have been proved using the Coq proof assistant. The tool MAccS has been developed to implement the formalisms and operations presented in this thesis. It makes it easy to try them out on examples, as well as to run experiments and benchmarks.
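To make the notion of a modal specification concrete, the sketch below encodes an automaton whose transitions carry a May or Must modality and checks a deliberately simplified refinement condition. It is an illustration added to this listing, not the thesis's formalism or the MAccS tool's API: for brevity it assumes both specifications share the same state names, whereas the general definition relies on a simulation relation.

```ocaml
(* Minimal sketch, assuming both specifications range over the same state
   names; the thesis's definition uses a simulation relation instead. *)
type modality = May | Must

type modal_spec = {
  initial : string;
  trans   : (string * string * modality * string) list;  (* src, label, modality, dst *)
}

(* A spec allows a transition if it lists it with either modality. *)
let allows spec (src, lbl, dst) =
  List.exists (fun (s, l, _, d) -> s = src && l = lbl && d = dst) spec.trans

(* It requires a transition if it lists it as Must. *)
let requires spec (src, lbl, dst) =
  List.exists (fun (s, l, m, d) -> s = src && l = lbl && d = dst && m = Must) spec.trans

(* concrete refines abstract: every mandatory behavior is kept mandatory,
   and nothing is done that the abstract spec did not at least allow. *)
let refines concrete abstract =
  List.for_all
    (fun (s, l, m, d) -> m <> Must || requires concrete (s, l, d))
    abstract.trans
  && List.for_all (fun (s, l, _, d) -> allows abstract (s, l, d)) concrete.trans

(* Example: a vending machine that must accept a coin and may then serve
   coffee or tea; the implementation commits to serving coffee only. *)
let spec = { initial = "idle";
             trans = [ ("idle", "coin",   Must, "paid");
                       ("paid", "coffee", May,  "idle");
                       ("paid", "tea",    May,  "idle") ] }
let impl = { initial = "idle";
             trans = [ ("idle", "coin",   Must, "paid");
                       ("paid", "coffee", Must, "idle") ] }
let () = assert (refines impl spec)
```

Refinement in this simplified sense mirrors the abstract's phrasing: each optional behavior is either dropped or promoted to mandatory, and nothing outside the allowed behavior is added.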
|
73 |
Meze svobody: Cenzura, regulace a politická korektnost v literatuře po roce 1989 / Censorship in ČR after 1989. SEGI, Stefan, January 2016
The dissertation examines Czech literary censorship after 1989. It presents a polemical addition to a monograph published one year earlier, V obecném zájmu [In the General Interest], which covered the same period. The main methodological resource is the work of the British theatre scholar Helen Freshwater, who based her inclusive model of censorship on crossing the border between hard and soft censorship and who shifted the emphasis to discourse as the main indicator of what counts as a censor's intervention at a particular historical moment. The core of the thesis consists of four chapters which, on the basis of original research, examine typical cases of censorship and the related discourse. These chapters are set within a broader frame of the changing notions of censorship and political correctness in the period under discussion. The chapter devoted to the banned skinhead band Braník is based on an examination of the respective court files and an analysis of the changing notion of freedom of speech at the beginning of the 1990s. The chapter on the censorship of literature for children and youth compares various editions of books by Bohumil Říha and examines the conditions under which the interventions in these new editions were identified as censorship. Censorship on the internet is treated in the chapter devoted to the regulation of virtual (literary) child pornography, while the chapter devoted to political correctness focuses on the texts and paratexts of splatterpunk literature. The doctoral thesis thus aims to offer a comprehensive picture of the changing forms of censorship, and of thinking about censorship, after 1989.
|
74 |
Analýza učebnic českého jazyka pro střední školy z genderového hlediska / Analysis of Czech Language Secondary School Text Books from the Gender Point of View. Dvořáková, Jana, January 2018
The thesis examines the issue of gender in textbooks. Its aim is to analyse three sets of Czech language secondary school textbooks and ascertain whether or not they contribute to the propagation of gender stereotypes in society. The thesis is divided into a theoretical section and an empirical section. The theoretical section first explicates the fundamental concepts of gender, then focuses on the discrimination of women through language and on other topics from the field of gender linguistics. The last chapter of the theoretical section examines how gender stereotypes influence schooling. The empirical section comprises the aforementioned analysis of the selected Czech language textbook sets. The analysis investigates the manner in which men and women are represented in the curriculum, how men and women are portrayed in the illustrations, whether the linguistic means used are gender-balanced, and whether the textbooks include any topics from gender linguistics. The results show that in all textbook sets men are the dominant gender in the illustrations as well as among the significant figures and the authors of literary excerpts. In terms of the roles, occupations and character traits attached to individual characters, the sets of...
|
75 |
Etická vhodnost a správnost projektu Adopce na dálku / The Ethical Suitability and Rightness of the Project of Remote Adoption. DUBA, Vladimír, January 2010
The project deals with the remote adoption of children in India and assesses the rightness of the process from an ethical perspective. The first part introduces present-day India from a political, cultural, and religious perspective. The second part presents the remote adoption process and evaluates the suitability and purposefulness of the individual parts of the project. The third part deals with the rightness and suitability of such adoption from a Catholic perspective, where the ethical justification of the process is also found. This justification is then elaborated in the description of Caritas within the threefold service of the church, followed by the incorporation of the human being into the social teaching of the church and into situational ethics. The project then turns to the concepts of solidarity, subsidiarity, the universal relationship between people, and charity. The last part is a practical excursion into the life of a community center in the Indian town of Moobidri, which the author visited in person.
|
76 |
Politická korektnost - předpoklady, formy, důsledky / Political correctness - prerequisites, forms, consequences. Rada, Tomáš, January 2011
The diploma thesis focuses on the phenomenon of political correctness. The topic is studied at three levels, which correspond to the three chapters. The first chapter analyses the development of the term political correctness and the debate that took place in the USA. The second chapter presents the main philosophical and ethical foundations of political correctness, and the third chapter describes its specific manifestations.
|
77 |
Zuverlässige numerische Berechnungen mit dem Spigot-Ansatz / Reliable Numerical Computations with the Spigot Approach. Do, Dang-Khoa, 20 September 2005
Der Spigot-Ansatz ist eine elegante Vorgehensweise, spezielle numerische Werte zuverlässig, effizient und mit beliebiger Genauigkeit zu berechnen. Die Stärke des Ansatzes ist seine Effizienz, seine totale Korrektheit und seine mathematisch exakt begründete Sicherstellung einer gewünschten absoluten Genauigkeit. Seine Schwäche ist möglicherweise die eingeschränkte Anwendbarkeit. Es gibt in der Literatur Spigot-Berechnung für e und pi. Wurzelberechnung und Logarithmenberechnung gehören zu den Hauptergebnissen der Dissertation. In Kombination mit den Methoden der Reihentransformation von Zeilberger und Wilf bzw. von Gosper ist der Einsatz zur Berechnung von hypergeometrischen Reihen sehr Erfolg versprechend. Eine interessante offene Frage ist die Berechnung der Feigenbaumkonstanten mit dem Ansatz. 'Spigot' bedeutet 'sukzessive Extraktion von Wertanteilen': die Wertanteile werden extrahiert, als ob sie durch einen Hahn (englisch: spigot) gepumpt werden. Es ist dabei besonders interessant, dass in bestimmten Fällen ein Wert-Anteil mit einer Ziffer der Kodierung des Ergebnisses versehen werden kann. Der Spigot-Ansatz steht damit im Gegensatz zu den konventionellen Iterationsverfahren: in einem Schritt des Spigot-Ansatzes wird jeweils ein Wert-Anteil 'extrahiert' und das gesamte Ergebnis ist die Summe der Wert-Anteile; während ein Schritt in einem Iterationsverfahren die Berechnung eines besseren gesamten Ergebnisses aus dem des vorigen Schritt beinhaltet. Das Grundschema der Berechnung mit dem Spigot-Ansatz sieht folgendermaßen aus: zuerst wird für den zu berechnenden numerischen Wert eine gut konvergierende Reihe mit rationalen Gliedern durch symbolisch-algebraische Methoden hergeleitet; dann wird für eine gewünschte Genauigkeit eine Teilsumme ausgewählt; anschließend werden aus der Teilsumme Wert-Anteile iterativ extrahiert. Die Extraktion von Wert-Anteilen aus der Teilsumme geschieht mit dem Spigot-Algorithmus, der auf Sale zurück geht, nur Integer-Arithmetik benötigt und sich als eine verallgemeinerte Form der Basis-Konvertierung dadurch auffassen lässt, dass die Teilsumme als die Kodierung des Wertes in einer inhomogenen Basis interpretiert wird. Die Spigot-Idee findet auch in der Überführung einer Reihe in eine besser konvergierende Reihe auf der Art und Weise Anwendung, dass Wert-Anteile aus der Reihe extrahiert werden, um die originale Reihe werttreu zur Reihe der Wert-Anteile zu transformieren. Dies geschieht mit den Methoden der Reihentransformation von Gosper bzw. Wilf. Die Dissertation umfasst im Wesentlichen die Formalisierung des Spigot-Algorithmus und der Gosperschen Reihentransformation, eine systematische Darstellung der Ansätze, Methoden und Techniken der Reihenentwiclung und Reihentransformation (die Herleitung von Reihen mit Hilfe charakteristischer Funktionalgleichungen; Methoden der Reihentransformation von Euler, Kummer, Markoff, Gosper, Zeilberger und Wilf) sowie die Methoden zur Berechnung von Wurzeln und Logarithmen mit dem Spigot-Ansatz. Es ist interessant zu sehen, wie sich die Grundideen des Spigot-Algorithmus, der Wurzelberechnung und der Logarithmenberechnung jeweils im Wesentlichen durch eine Gleichung ausdrücken lassen. Es ist auch interessant zu sehen, wie sich verschiedene Methoden der Reihentransformation auf einige einfache Grundideen zurückführen lassen. Beispiele für den Beweis von totalen Korrektheit (bei iterativer Berechnung von Wurzeln) könnte auch von starkem Interesse sein. 
Um die Zuverlässigkeit anderer Methoden zur Berechnung von natürlichen Logarithmen zu überprüfen, scheint der Vergleich der Ergebnisse mit denen des Spigot-Ansatzes die beste Methode zu sein. Anders als bei der Wurzelberechnung kann hierbei zur Überprüfung die inverse Berechnung nicht angewandt werden. / Keywords: spigot, total correctness, acceleration of series, computation of roots, computation of logarithms. / Reliable numerical computations with the spigot approach: The spigot approach is an elegant way to compute special numerical values reliably, efficiently, and with arbitrary accuracy. The advantages of this approach are its efficiency and its total correctness, including the bounding of the absolute error. Its disadvantage is perhaps its restricted applicability. Spigot computations for e and pi exist in the literature. The computation of roots and logarithms is among the main results of this thesis. In combination with the methods for the acceleration of series by Gosper and by Zeilberger and Wilf, the use of the approach for the numerical summation of hypergeometric series is very promising. An interesting open question is the computation of the Feigenbaum constant in this way. 'Spigot' means 'successive extraction of portions of value': the portions of value are 'extracted' as if they were pumped through a spigot. It is particularly interesting that in certain cases these portions can be interpreted as digits of the result. In this respect the spigot approach is the opposite of the iterative approach, in which each step computes a new, better overall result from the result of the previous step. The scheme of the spigot approach is as follows: first a series for the value to be computed is derived, then a partial sum of the series is chosen with respect to a desired accuracy, and afterwards the portions of value are extracted from the sum. The extraction of portions of value is carried out by the spigot algorithm, which is due to Sale and requires only integer arithmetic. The spigot algorithm can be understood as a generalisation of radix conversion if the sum is interpreted as an encoding of the result in a mixed-radix (inhomogeneous) system. The spigot idea is also applied in transforming a series into a better-converging series: portions of value are extracted successively from the original series in order to form the series of extracted portions, which has the same value as the original series. This transformation is carried out with the methods for acceleration of series by Gosper and Wilf. The thesis essentially comprises the formalisation of the spigot algorithm and of Gosper's method for the acceleration of series, a systematisation of approaches, methods and techniques for the derivation and acceleration of series (derivation of series for functions using their characteristic functional equations; methods for acceleration of series by Euler, Kummer, Markov, Gosper, Zeilberger and Wilf), as well as the methods for computing roots and logarithms with the spigot approach. It is interesting to see how the basic ideas of the spigot algorithm, of root computation and of logarithm computation can each be expressed essentially by a single equation. It is also interesting to see how various methods for the acceleration of series can be built from a few simple basic ideas. The examples of proofs of total correctness (for the iterative computation of roots) may also be of interest.
Comparing results with those produced by the spigot approach is possibly the best way to verify the reliability of other methods for computing natural logarithms, because (as opposed to root computation) verification by inverse computation is inapplicable.
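As a concrete illustration of the mixed-radix extraction sketched in the abstract, the following program prints decimal digits of e using only integer arithmetic. It is an illustrative reconstruction in the spirit of Sale's algorithm as described above, not the thesis's own code; the number of series terms and the digit-carry handling are simplified assumptions.

```ocaml
(* Minimal spigot sketch for e, assuming e - 2 = 1/2! + 1/3! + ... is stored
   as mixed-radix digits a.(i) of weight 1/i!.  Each round multiplies the
   fractional part by 10 and normalises right to left; the carry that leaves
   position 2 is the next decimal digit. *)
let e_digits n =
  let m = n + 10 in                       (* generous number of series terms *)
  let a = Array.make (m + 1) 1 in         (* a.(2) .. a.(m), all initially 1 *)
  let buf = Buffer.create (n + 2) in
  Buffer.add_string buf "2.";
  for _ = 1 to n do
    let carry = ref 0 in
    for i = m downto 2 do
      let x = a.(i) * 10 + !carry in
      a.(i) <- x mod i;                   (* keep the digit at position i below i *)
      carry := x / i
    done;
    (* A fully rigorous version would buffer digits to absorb a possible
       carry into already emitted digits, as the thesis's emphasis on total
       correctness demands; this sketch prints the carry directly. *)
    Buffer.add_string buf (string_of_int !carry)
  done;
  Buffer.contents buf

let () = print_endline (e_digits 30)
(* expected output: 2.718281828459045235360287471352 *)
```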
|
78 |
“Budskapet går ofta fram” : En enkätundersökning om gymnasieelevers attityder till språkriktighet. / “The message mostly reaches out” : A study of high school students' attitudes to language correctness. Öberg, Felicia, January 2021
Syftet med undersökningen är att undersöka hur viktigt gymnasieelever anser att språkriktighet är i olika kategorier samt deras attityder till enskilda språkriktighetsfrågor. De enskilda språkriktighetsfrågorna är avgränsade till formord och specifikt valet mellan de och dem, han och honom i objektsposition, sina och deras, jämförelseuttryck, före och innan och var och vart samt konstruktioner med dels och eftersom att. Resultatet analyseras för att se hur enhetliga elevernas svar är samt i vilken utsträckning kön, gymnasieprogram samt årskurs har betydelse för attityderna till språkriktighet. För att undersökningens syfte ska kunna besvaras har en enkätundersökning genomförts. I undersökningen deltog 217 gymnasieelever från tre olika gymnasieprogram samt från alla tre årskurser. Resultatet visar att eleverna anser att språkriktighet är viktigt i formella texter som information från myndigheter samt att det är oviktigt i informella texter som SMS. Resultatet visar också att eleverna accepterar både de och dem samt sina och deras oavsett satsledsfunktion. De accepterar också var och vart oavsett om det syftar på befintlighet eller riktning. De accepterar inte konstruktioner med han i objektposition utan föredrar honom. För jämförelseuttryck accepteras formuleringen än mig i större utsträckning än formuleringen än jag. För konstruktioner med dels accepteras det till viss del samt att eleverna accepterar konstruktioner med både eftersom att och eftersom. Elevernas svar är relativt enhetliga där det visar sig att attityderna i liten utsträckning påverkas av kön, gymnasieprogram samt årskurs. / The purpose of the survey is to investigate how important high school students consider language correctness to be in different categories, and their attitudes to individual language correctness issues. The individual language correctness issues are limited to pronouns, prepositions, conjunctions and subjunctions, specifically the choice between de and dem, han and honom in object position, sina and deras, jag and mig in comparative expressions, före and innan, and var and vart, as well as constructions with dels and eftersom att. The results are analyzed to see how uniform the students' answers are and to what extent gender, high school program and year group are important for attitudes to language correctness. In order to address the purpose of the study, a questionnaire survey has been conducted. The survey involved 217 high school students from three different high school programs and from all three year groups. The results show that the students believe that language correctness is important in formal texts such as information from authorities and that it is unimportant in informal texts such as SMS. The results also show that the students accept both de and dem as well as sina and deras regardless of clause function. They also accept var and vart regardless of whether it refers to existence or direction. They do not accept constructions with han in object position but prefer honom. For comparative expressions, the formulation än mig is accepted to a greater extent than the formulation än jag. Constructions with dels are accepted to some extent, and the students accept constructions with both eftersom att and eftersom. The students' answers are relatively uniform, and attitudes turn out to be affected only to a small extent by gender, high school program and year group.
|
79 |
Analyse de dépendances ML pour les évaluateurs de logiciels critiques. / ML Dependency Analysis for Critical-Software Assessors. Benayoun, Vincent, 16 May 2014
Les logiciels critiques nécessitent l’obtention d’une évaluation de conformité aux normes en vigueur avant leur mise en service. Cette évaluation est obtenue après un long travail d’analyse effectué par les évaluateurs de logiciels critiques. Ces derniers peuvent être aidés par des outils utilisés de manière interactive pour construire des modèles, en faisant appel à des analyses de flots d’information. Des outils comme SPARK-Ada existent pour des sous-ensembles du langage Ada utilisés pour le développement de logiciels critiques. Cependant, des langages émergents comme ceux de la famille ML ne disposent pas de tels outils adaptés. La construction d’outils similaires pour les langages ML demande une attention particulière sur certaines spécificités comme les fonctions d’ordre supérieur ou le filtrage par motifs. Ce travail présente une analyse de flot d’information pour de tels langages, spécialement conçue pour répondre aux besoins des évaluateurs. Cette analyse statique prend la forme d’une interprétation abstraite de la sémantique opérationnelle préalablement enrichie par des informations de dépendances. Elle est prouvée correcte vis-à-vis d’une définition formelle de la notion de dépendance, à l’aide de l’assistant à la preuve Coq. Ce travail constitue une base théorique solide utilisable pour construire un outil efficace pour l’analyse de tolérance aux pannes. / Critical software needs to obtain an assessment before commissioning in order to ensure compliance with standards. This assessment is given after a long task of software analysis performed by assessors. They may be helped by tools, used interactively, to build models using information-flow analysis. Tools like SPARK-Ada exist for Ada subsets used for critical software. But some emergent languages such as those of the ML family lack such adapted tools. Providing similar tools for ML languages requires special attention to specific features such as higher-order functions and pattern-matching. This work presents an information-flow analysis for such languages, specifically designed according to the needs of assessors. This analysis is built as an abstract interpretation of the operational semantics enriched with dependency information. It is proved correct with respect to a formal definition of the notion of dependency, using the Coq proof assistant. This work gives a strong theoretical basis for building an efficient tool for fault-tolerance analysis.
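To give a flavour of what an operational semantics enriched with dependency information can mean, the sketch below evaluates a toy expression language while tagging every value with the set of named inputs it depends on, including control dependencies introduced by conditionals. This is only an illustration under simplifying assumptions, not the thesis's analysis, which handles a full ML-like language with higher-order functions and pattern matching and is proved correct in Coq.

```ocaml
(* Minimal sketch: evaluating a tiny expression language while tracking,
   for each value, the set of named inputs it depends on. *)
module S = Set.Make (String)

type expr =
  | Input of string                  (* a named external input *)
  | Const of int
  | Add of expr * expr
  | If of expr * expr * expr         (* condition: 0 means false *)

(* A value together with the inputs it directly or indirectly depends on. *)
type tagged = { value : int; deps : S.t }

let rec eval env = function
  | Input x -> { value = List.assoc x env; deps = S.singleton x }
  | Const n -> { value = n; deps = S.empty }
  | Add (a, b) ->
      let va = eval env a and vb = eval env b in
      { value = va.value + vb.value; deps = S.union va.deps vb.deps }
  | If (c, t, e) ->
      let vc = eval env c in
      (* control dependency: the result also depends on the condition's inputs *)
      let branch = eval env (if vc.value <> 0 then t else e) in
      { branch with deps = S.union vc.deps branch.deps }

let () =
  let r = eval [ ("secret", 1); ("public", 2) ]
      (If (Input "secret", Add (Input "public", Const 1), Const 0)) in
  Printf.printf "value = %d, depends on: %s\n" r.value
    (String.concat ", " (S.elements r.deps))
  (* prints: value = 3, depends on: public, secret *)
```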
|
80 |
Secure, fast and verified cryptographic applications : a scalable approach / Implémentations cryptographiques sures, performantes et vérifiées : une approche passant à l'échelle. Zinzindohoué-Marsaudon, Jean-Karim, 03 July 2018
La sécurité des applications sur le web est totalement dépendante de leur design et de la robustesse de l'implémentation des algorithmes et protocoles cryptographiques sur lesquels elles s'appuient. Cette thèse présente une nouvelle approche, applicable à de larges projets, pour vérifier l'état de l'art des algorithmes de calculs sur les grands nombres, tels que rencontrés dans les implémentations de référence. Le code et les preuves sont réalisés en F*, un langage orienté preuve et qui offre un système de types riche et expressif. L'implémentation et la vérification dans un langage d'ordre supérieur permettent de maximiser le partage de code mais nuisent aux performances. Nous proposons donc un nouveau langage, Low*, qui encapsule un sous-ensemble de C en F* et qui compile vers C de façon sûre. Low* conserve toute l'expressivité de F* pour les spécifications et les preuves et nous l'utilisons pour implémenter de la cryptographie, en y intégrant les optimisations des implémentations de référence. Nous vérifions ce code en termes de sûreté mémoire, de correction fonctionnelle et d'indépendance des traces d'exécution vis-à-vis des données sensibles. Ainsi, nous présentons HACL*, une bibliothèque cryptographique autonome et entièrement vérifiée, dont les performances sont comparables sinon meilleures que celles du code C de référence. Plusieurs algorithmes de HACL* font maintenant partie de la bibliothèque NSS de Mozilla, utilisée notamment dans Firefox et dans RedHat. Nous appliquons les mêmes concepts sur miTLS, une implémentation de TLS vérifiée et montrons comment étendre cette méthodologie à des preuves cryptographiques, du parsing de message et une machine à état. / The security of Internet applications relies crucially on the secure design and robust implementations of cryptographic algorithms and protocols. This thesis presents a new, scalable and extensible approach for verifying state-of-the-art bignum algorithms, found in popular cryptographic implementations. Our code and proofs are written in F∗, a proof-oriented language which offers a very rich and expressive type system. The natural way of writing and verifying higher-order functional code in F∗ prioritizes code sharing and proof composition, but this results in low performance for cryptographic code. We propose a new language, Low∗, a fragment of F∗ which can be seen as a shallow embedding of C in F∗ and safely compiled to C code. Nonetheless, Low∗ retains the full expressiveness and verification power of the F∗ system, at the specification and proof level. We use Low∗ to implement cryptographic code, incorporating state-of-the-art optimizations from existing C libraries. We use F∗ to verify this code for functional correctness, memory safety and secret independence. We present HACL∗, a full-fledged and fully verified cryptographic library whose performance is on par with, if not better than, that of the reference C code. Several algorithms from HACL∗ are now part of NSS, Mozilla’s cryptographic library, notably used in the Firefox web browser and the Red Hat operating system. Finally, we apply our techniques to miTLS, a verified implementation of the Transport Layer Security protocol. We show how they extend to cryptographic proofs, state-machine implementations and message-parsing verification.
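As a small illustration of the secret-independence discipline mentioned above, the sketch below compares two byte strings (for instance two MAC tags) without branching on secret data, so that its running time does not reveal where they differ. The function name and the plain OCaml setting are assumptions made for this listing; HACL∗'s actual code is written in Low∗, verified in F∗ and extracted to C.

```ocaml
(* Minimal sketch of a constant-time ("secret independent") comparison:
   differences are accumulated with bitwise xor/or, and the only
   data-dependent decision is taken after the whole scan. *)
let constant_time_equal (a : Bytes.t) (b : Bytes.t) : bool =
  assert (Bytes.length a = Bytes.length b);   (* lengths are public data *)
  let diff = ref 0 in
  for i = 0 to Bytes.length a - 1 do
    diff := !diff lor (Char.code (Bytes.get a i) lxor Char.code (Bytes.get b i))
  done;
  !diff = 0

let () =
  assert (constant_time_equal (Bytes.of_string "tag-1234") (Bytes.of_string "tag-1234"));
  assert (not (constant_time_equal (Bytes.of_string "tag-1234") (Bytes.of_string "tag-1235")))
```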
|