131 |
Applying Dynamic Software Updates to Computationally-Intensive Applications. Kim, Dong Kwan. 22 July 2009
Dynamic software updates change the code of a computer program while it runs, thus saving the programmer's time and using computing resources more productively. This dissertation establishes the value of and recommends practices for applying dynamic software updates to computationally-intensive applications—a computing domain characterized by long-running computations, expensive computing resources, and a tedious deployment process. This dissertation argues that updating computationally-intensive applications dynamically can reduce their time-to-discovery metrics—the total time it takes from posing a problem to arriving at a solution—and, as such, should become an intrinsic part of their software lifecycle. To support this claim, this dissertation presents the following technical contributions: (1) a distributed consistency algorithm for synchronizing dynamic software updates in a parallel HPC application, (2) an implementation of the Proxy design pattern that is more efficient than the existing implementations, and (3) a dynamic update approach for Java Virtual Machine (JVM)-based applications using the Proxy pattern to offer flexibility and efficiency advantages, making it suitable for computationally-intensive applications. The contributions of this dissertation are validated through performance benchmarks and case studies involving computationally-intensive applications from the bioinformatics and molecular dynamics simulation domains. / Ph. D.
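The proxy-based update mechanism named in contributions (2) and (3) can be sketched in a few lines. This is an illustrative Python toy, not the dissertation's JVM implementation; all class and method names are invented. The point is only the indirection: clients hold a stable proxy, and the updater swaps the delegate behind it while the program keeps running.

```python
# Minimal sketch of the Proxy idea behind dynamic updates (illustrative only;
# the dissertation targets the JVM and is far more involved). Clients keep a
# reference to the proxy; the update swaps the delegate mid-run.

class SimulationV1:
    def step(self, x):
        return x + 1          # original behaviour

class SimulationV2:
    def step(self, x):
        return x + 2          # updated behaviour installed at runtime

class Proxy:
    def __init__(self, delegate):
        self._delegate = delegate
    def update(self, new_delegate):
        self._delegate = new_delegate          # dynamic update point
    def __getattr__(self, name):
        return getattr(self._delegate, name)   # forward all other calls

sim = Proxy(SimulationV1())
before = sim.step(10)         # dispatched to V1
sim.update(SimulationV2())    # no restart, client handle unchanged
after = sim.step(10)          # dispatched to V2
print(before, after)
```

Because the long-running computation never loses its handle on `sim`, no state is lost and no redeployment is needed, which is exactly the property that shortens time-to-discovery for long-running jobs.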
|
132 |
Towards a Framework for Proving Termination of Maude Programs. Alarcón Jiménez, Beatriz. 10 June 2011
Maude is a declarative programming language, based on rewriting logic, that incorporates many features which make it very powerful. However, these same features raise difficulties when it comes to proving certain computational properties. Proving the termination of rewrite systems is a hard task in itself, and it becomes even harder for real programming languages because of their inherent features. As a consequence, methods for proving the termination of such programs require specific techniques and a careful analysis. Several works have attempted to prove the termination of (a subset of) Maude programs. However, all of them follow a transformational approach, in which the original program is transformed until it becomes a rewrite system that can be handled by existing termination techniques and tools. In practice, transforming the original system tends to complicate the termination proof, since the transformation introduces new symbols and rules into the system. In this thesis, we address the problem of proving the termination of (a subset of) Maude programs by means of direct methods.
On the one hand, we focus on Maude's evaluation strategy. Maude is an eager language, in which the arguments of a function are always evaluated before the application of the function that uses them. This strategy (known as call by value) can lead to nontermination if programs are not written carefully. For this reason, Maude in particular provides mechanisms to control the execution of programs, such as the syntactic annotations associated with the arguments of symbols. In rewriting terms, this strategy is known as innermost context-sensitive rewriting.
On the other hand, Maude also allows attributes to be declared. / Alarcón Jiménez, B. (2011). Towards a Framework for Proving Termination of Maude Programs [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/11003
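The interaction between eager evaluation and nontermination described in the abstract can be made concrete with a toy example (plain Python, not Maude; the rule and names are invented for illustration). Under call-by-value, a rule like inf(x) -> cons(x, inf(s(x))) loops forever, while an annotation that freezes the second argument of cons, simulated here with a thunk, restores termination on demand:

```python
# Toy illustration of call-by-value nontermination vs. a context-sensitive
# strategy annotation. Terms are nested tuples like ('s', '0').

def inf_eager(x, depth=0, limit=50):
    # call by value: the recursive call is evaluated before cons is built,
    # so this never returns (the depth bound stands in for divergence)
    if depth > limit:
        raise RecursionError("eager evaluation of inf(x) diverges")
    return ('cons', x, inf_eager(('s', x), depth + 1, limit))

def inf_frozen(x):
    # as if cons carried the annotation (1 0): argument 2 may not be rewritten,
    # so the recursive call is suspended in a thunk
    return ('cons', x, lambda: inf_frozen(('s', x)))

def take(n, stream):
    # forces only as much of the frozen stream as is actually needed
    _, head, tail = stream
    return [] if n == 0 else [head] + take(n - 1, tail())

try:
    inf_eager('0')
    diverged = False
except RecursionError:
    diverged = True

prefix = take(3, inf_frozen('0'))
print(diverged, prefix)
```

This is the phenomenon that makes direct termination proofs for Maude programs require the innermost context-sensitive rewriting analysis the thesis develops.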
|
133 |
Rewriting-based Verification and Debugging of Web Systems. Romero, Daniel Omar. 02 November 2011
The increasing complexity of Web systems has led to the development of sophisticated formal methodologies for verifying and correcting Web data and Web programs.
In general, establishing whether a Web system behaves correctly with respect to the original intention of the programmer, or checking its internal consistency,
is a non-trivial task, as witnessed by many studies in the literature.
In this dissertation, we face two challenging problems related to the verification of Web systems.
Firstly, we extend a previous Web verification framework based on partial rewriting by providing a semi-automatic technique for repairing Web systems.
We propose a basic repairing methodology that is endowed with several strategies for optimizing the number of repair actions that must be executed in order to fix a given Web site.
Also, we develop an improvement of the Web verification framework that is based on abstract interpretation and greatly enhances both efficiency and scalability of the original technique.
Secondly, we formalize a framework for the specification and model-checking of dynamic Web applications that is based on Rewriting Logic.
Our framework allows one to simulate
user navigation and the evaluation of Web scripts within a Web application, and also to check important related properties such as reachability and consistency.
When a property is refuted, a counter-example with the erroneous trace is delivered.
Such information can be analyzed in order to debug the Web application under examination by means of a novel backward trace slicing technique that we formulated for this purpose.
This technique consists in tracing back, along an execution trace, all the relevant symbols of the term (or state) that we are interested in observing. / Romero, D. O. (2011). Rewriting-based Verification and Debugging of Web Systems [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/12496
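The reachability checking and counter-example delivery described above can be sketched on a toy navigation model (an invented example, not the thesis's Rewriting Logic framework): pages are states, links are transitions, and refuting the property "the admin page is unreachable" yields the erroneous trace to debug.

```python
from collections import deque

# Hypothetical Web navigation model: page -> pages reachable by one click.
# The deliberate bug: the checkout page links to the admin page.
NAV = {
    'home':     ['login', 'catalog'],
    'login':    ['account', 'home'],
    'catalog':  ['item'],
    'item':     ['checkout'],
    'account':  [],
    'checkout': ['admin'],
    'admin':    [],
}

def find_trace(start, forbidden):
    """BFS: shortest navigation trace from start to a forbidden page, or None."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == forbidden:
            return path                       # the counter-example trace
        for nxt in NAV.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None                               # property holds

trace = find_trace('home', 'admin')
print(trace)
```

In the thesis the states and transitions are rewrite-logic terms and rules rather than a hand-written dictionary, and the delivered trace is then narrowed by backward trace slicing to just the symbols relevant to the error.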
|
134 |
Terminaison en temps moyen fini de systèmes de règles probabilistes / Termination within a finite mean time of probabilistic rules based systems. Garnier, Florent. 17 September 2007
In this thesis we set out to define a simple formalism for modelling systems in which non-deterministic phenomena and random behaviour are combined. We chose to extend the rewriting formalism, which naturally expresses non-deterministic behaviour, so that it can also express probabilistic phenomena, and we then studied the termination of this model within a finite mean time, that is, within a finite mean number of rewrite steps. We also introduce a notion of strategy to control the application of probabilistic rewrite rules, and we present general criteria that identify classes of strategies under which probabilistic rewrite systems terminate within a finite mean time. To illustrate the formalism and these proof methods, we model a network of WiFi stations and show that every station succeeds in emitting its messages within a finite mean time.
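The notion of termination within a finite mean time can be illustrated with a toy probabilistic rewrite system (an invented example, not taken from the thesis): a state n decreases by one with probability 0.7 and increases by one with probability 0.3, so the drift toward 0 gives an expected number of rewrite steps of roughly n / (0.7 - 0.3) = 2.5 * n.

```python
import random

# One probabilistic rule on states n >= 0:
#   n -> n - 1  with probability 0.7
#   n -> n + 1  with probability 0.3
# Termination = reaching 0; we estimate the mean number of rewrite steps.

def run(n, rng):
    steps = 0
    while n > 0:
        n = n - 1 if rng.random() < 0.7 else n + 1
        steps += 1
    return steps

def mean_steps(n, trials=20_000, seed=42):
    rng = random.Random(seed)
    return sum(run(n, rng) for _ in range(trials)) / trials

est = mean_steps(10)
print(round(est, 1))   # close to 2.5 * 10 = 25
```

Individual runs can be arbitrarily long (the walk may climb before it falls), which is exactly why the meaningful guarantee is a finite *mean* time rather than a bound on every execution.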
|
135 |
Définitions par réécriture dans le lambda-calcul : confluence, réductibilité et typage / Definitions by rewriting in the lambda-calculus : confluence, reducibility and typing. Riba, Colin. 14 December 2007
This thesis is concerned with the combination of the lambda-calculus and rewriting, of which we mainly study two properties: confluence and strong normalisation. We begin by studying under which conditions the combination of a confluent conditional rewrite relation with the lambda-calculus yields a confluent relation. Next, we turn to strong normalisation proofs of typed lambda-calculi that use the reducibility technique. Our main contribution is a comparison of several variants of this technique, using as a yardstick the way these variants extend to rewriting and the way they handle union types and implicit existential types.
Finally, we present a criterion, based on a constrained type system, for the strong normalisation of conditional rewriting combined with the lambda-calculus. Our approach extends existing termination criteria that use sized types. To our knowledge, it is the first termination criterion for conditional rewriting with higher-order right-hand sides that takes into account, in the termination argument, information derived from the satisfaction of the conditions of the rewrite rules.
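Confluence, the first property the thesis studies, means that all reduction orders lead to the same normal form. A first-order toy stands in for the lambda-calculus-plus-rewriting setting: the sketch below (invented example) exhaustively explores every reduction sequence of a term under the orthogonal, hence confluent, rules plus(0, y) -> y and plus(s(x), y) -> s(plus(x, y)) and confirms a single normal form.

```python
# Terms are nested tuples such as ('plus', ('s', '0'), '0').

def reducts(t):
    """All one-step reducts of t under the two plus rules, at any position."""
    out = []
    if isinstance(t, tuple):
        if t[0] == 'plus':
            a, b = t[1], t[2]
            if a == '0':
                out.append(b)                               # plus(0,y) -> y
            elif isinstance(a, tuple) and a[0] == 's':
                out.append(('s', ('plus', a[1], b)))        # plus(s(x),y) -> s(plus(x,y))
        for i in range(1, len(t)):                          # rewrite inside subterms
            for r in reducts(t[i]):
                out.append(t[:i] + (r,) + t[i + 1:])
    return out

def normal_forms(t):
    """Set of normal forms reachable from t along every reduction sequence."""
    rs = reducts(t)
    return {t} if not rs else set().union(*(normal_forms(r) for r in rs))

two, one = ('s', ('s', '0')), ('s', '0')
nfs = normal_forms(('plus', two, one))
print(nfs)   # a single normal form: s(s(s(0)))
```

For non-orthogonal or conditional rules, and a fortiori once beta-reduction is added, this uniqueness is no longer automatic, which is precisely the question the thesis's confluence results address.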
|
136 |
RACR: A Scheme Library for Reference Attribute Grammar Controlled Rewriting. Bürger, Christoff. 07 February 2013
This report presents RACR, a reference attribute grammar library for the programming language Scheme.
RACR supports incremental attribute evaluation in the presence of abstract syntax tree rewrites. It provides a set of functions that can be used to specify abstract syntax tree schemes and their attribution, construct respective trees, query their attributes and node information, and annotate and rewrite them. Thereby, both reference attribute grammars and rewriting are seamlessly integrated, such that rewrites can reuse attributes and attribute values change depending on performed rewrites – a technique we call Reference Attribute Grammar Controlled Rewriting. To reevaluate attributes influenced by abstract syntax tree rewrites, a demand-driven, incremental evaluation strategy is used, which incorporates the actual execution paths selected at runtime for control-flows within attribute equations. To realize this strategy, a dynamic attribute dependency graph is constructed throughout attribute evaluation – a technique we call Dynamic Attribute Dependency Analyses.
The report illustrates RACR's motivation, features, instantiation and usage. In particular, its application programming interface is documented and exemplified. The report is a reference manual for RACR developers. Further, it presents RACR's complete implementation and therefore provides a good foundation for readers interested in the details of reference attribute grammar controlled rewriting and dynamic attribute dependency analyses.
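The core mechanism, demand-driven caching plus a dynamically recorded dependency graph, can be sketched compactly. This is an illustrative Python miniature (RACR itself is a Scheme library); all names are invented. Each attribute caches its value, records exactly the fields and attributes it read while evaluating, and a rewrite flushes precisely the caches that transitively depended on the changed information.

```python
class Node:
    def __init__(self, **fields):
        self.fields = fields     # syntax-tree information
        self.cache = {}          # attribute name -> cached value
        self.deps = {}           # attribute name -> set of keys it read

class Graph:
    def __init__(self):
        self.nodes, self.stack, self.evals = [], [], 0

    def node(self, **fields):
        n = Node(**fields)
        self.nodes.append(n)
        return n

    def _record(self, key):
        # register key as a dependency of the attribute currently evaluating
        if self.stack:
            _, holder, attr = self.stack[-1]
            holder.deps.setdefault(attr, set()).add(key)

    def read(self, node, field):
        self._record(('field', node, field))
        return node.fields[field]

    def attr(self, node, name, equation):
        self._record(('attr', node, name))
        if name not in node.cache:               # evaluate on demand only
            self.stack.append(('attr', node, name))
            try:
                self.evals += 1
                node.cache[name] = equation(self, node)
            finally:
                self.stack.pop()
        return node.cache[name]

    def rewrite(self, node, field, value):
        node.fields[field] = value
        self._flush(('field', node, field))

    def _flush(self, key):
        # invalidate every cached attribute that depended on key, transitively
        for n in self.nodes:
            for a, reads in n.deps.items():
                if key in reads and a in n.cache:
                    del n.cache[a]
                    self._flush(('attr', n, a))

def leaf_val(g, n):                  # attribute equation on the leaf
    return g.read(n, 'value')

def doubled(g, n):                   # depends on the leaf's attribute
    return 2 * g.attr(g.read(n, 'child'), 'val', leaf_val)

g = Graph()
leaf = g.node(value=1)
root = g.node(child=leaf)
a = g.attr(root, 'double', doubled)  # computes both attributes
b = g.attr(root, 'double', doubled)  # pure cache hit, no reevaluation
g.rewrite(leaf, 'value', 5)          # flushes leaf.val and root.double
c = g.attr(root, 'double', doubled)  # recomputed on demand
print(a, b, c, g.evals)
```

Because dependencies are recorded from the paths actually taken at runtime rather than approximated statically, only attributes genuinely influenced by a rewrite are reevaluated, which is the essence of RACR's incremental strategy.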
|
137 |
Die C# Schnittstelle der Referenzattributgrammatik-gesteuerten Graphersetzungsbibliothek RACR: Übersicht, Anwendung und Implementierung: Entwicklerhandbuch / The C# Interface of the Reference Attribute Grammar Controlled Graph Rewriting Library RACR: Overview, Application, and Implementation: Developer Manual. Langner, Daniel; Bürger, Christoff. 04 July 2018
This report presents RACR-NET, a C# interface for the reference attribute grammar controlled graph rewriting library RACR.
RACR-NET makes the declarative, dynamic language-specification, instantiation, and evaluation mechanisms of the RACR Scheme library available to object-oriented programming. In particular, this includes the automatic incremental evaluation of attribute-based semantic analyses and hence the automatic caching of parameterised methods. Graph rewrites correspond to state changes of object instances and to the invalidation of derived computations.
The focus of this report is the object-oriented programming interface of RACR-NET, its practical application, and its implementation. The report is a reference manual for RACR-NET users and developers.
Contents:
1. Introduction
1.1. Problem statement
1.2. Structure of the report
2. Conceptual and technical prerequisites
2.1. Overview of RAG-controlled rewriting
2.2. Scheme
2.3. The RACR Scheme library
2.4. The .NET Framework and the Common Language Infrastructure
2.5. IronScheme
3. RACR-NET implementation: procedural interface
3.1. Scheme in C#
3.2. RACR in C#
3.3. Requirements analysis
3.4. Implementation of the procedural interface
4. RACR-NET implementation: object-oriented interface
4.1. Overview of the object-oriented interface
4.2. Application example
4.3. Implementation challenges
4.4. Implementation
5. Evaluation
5.1. Testing the interface
5.2. Performance measurements and comparisons
6. Summary and outlook
6.1. An object-oriented library for RAG-controlled rewriting
6.2. Future work
A. Bibliography
B. MIT license
|
138 |
The critical figure : negativity in selected works by Proust, Joyce and Beckett / William David Watson. Watson, William David. January 2000
This dissertation represents an interpretation of the different forms of negativity in the modernist work that can be understood in terms of that which is unsaid, unsayable, or any other means of refusing to give an affirmative proposition regarding the world the work describes. It explores this negativity as both a representation of that which cannot be represented, and as an operational negativity, or negation, that takes part in the unmaking of the work's figures. The function of this negativity, as interpreted in Marcel Proust's Remembrance of Things Past (1913-1927), James Joyce's Ulysses (1922) and Krapp's Last Tape (1959) by Samuel Beckett, is to rewrite the representations of the work. Negativity is then also understood as a transformation and conditioning of elements already present in the literary work, that lead to ambivalent and problematic representations in the work. In this sense, negativity can be understood as a form of rewriting of the work's representations. The interpretations of the works of Proust, Joyce and Beckett are guided by this understanding, as given in the introduction, of negativity. In the analysis of Proust's novel, in "The Unmaking of Proust: Negation and Errors in Remembrance of Things Past", this form of negativity is situated in relation to Proust's handling of epistemological questions and mimetic references to reality in his work. The analysis of Joyce's work in "The Wandering of Language in James Joyce's Ulysses" discusses his treatment of language and the origins of language as being characterized by a negation that increases the difficulty of the language, and attempts to negate its origins. Finally, in the analysis of Beckett's "Krapp's Last Tape", in "Beckett, Proust, and the End of Literature", it is shown that negativity conditions both the reception of the influence of Proust by Beckett, and the play's attempt to suggest the end of writing. 
In conclusion, the dissertation returns to the idea of negativity as a form of rewriting, and briefly indicates that the function of negativity in these works can be understood as a form of invention. / Thesis (M.A.)--Potchefstroom University for Christian Higher Education, 2000.
|
140 |
Normalisation & equivalence in proof theory & type theory. Lengrand, Stéphane J. E. January 2006
At the heart of the connections between Proof Theory and Type Theory, the Curry-Howard correspondence provides proof-terms with computational features and equational theories, i.e. notions of normalisation and equivalence. This dissertation contributes to extend its framework in the directions of proof-theoretic formalisms (such as sequent calculus) that are appealing for logical purposes like proof-search, powerful systems beyond propositional logic such as type theories, and classical (rather than intuitionistic) reasoning. Part I is entitled Proof-terms for Intuitionistic Implicational Logic. Its contributions use rewriting techniques on proof-terms for natural deduction (Lambda-calculus) and sequent calculus, and investigate normalisation and cut-elimination, with call-by-name and call-by-value semantics. In particular, it introduces proof-term calculi for multiplicative natural deduction and for the depth-bounded sequent calculus G4. The former gives rise to the calculus Lambdalxr with explicit substitutions, weakenings and contractions that refines the Lambda-calculus and Beta-reduction, and preserves strong normalisation with a full notion of composition of substitutions. The latter gives new insight into cut-elimination in G4. Part II, entitled Type Theory in Sequent Calculus, develops a theory of Pure Type Sequent Calculi (PTSC), which are sequent calculi that are equivalent (with respect to provability and normalisation) to Pure Type Systems but better suited for proof-search, in connection with proof-assistant tactics and proof-term enumeration algorithms. Part III, entitled Towards Classical Logic, presents some approaches to classical type theory. In particular it develops a sequent calculus for a classical version of System F_omega.
Beyond such a type theory, the notion of equivalence of classical proofs becomes crucial and, with such a notion based on parallel rewriting in the Calculus of Structures, we compute canonical representatives of equivalent proofs.
|