1 |
Uma Abordagem para Construção das Etapas de Análise de um Compilador / An Approach to the Construction of the Analysis Stages of a Compiler. MÉLO, Daniel Gondim Ernesto de. 07 February 2018 (has links)
Previous issue date: 2014-06-20 / Capes / Compilers are programs that translate code written in one language, known as the source language, into a semantically equivalent program in another language, known as the target language. There are compilers that translate code between high-level languages; in general, however, the most widely used target language is machine language, or machine code. Several languages and tools have been proposed within this scope, e.g. Xtext, Stratego, CUP, and ANTLR. Despite their number, the existing frameworks for building compilers are difficult to understand and do not expose to the programmer several important structures, such as the symbol table and the syntax tree. Additionally, there are many details specific to each platform designed for this purpose. Moreover, in most cases each framework focuses on and provides services for only one module of a compiler; to cover more than one stage, it is often necessary to resort to general-purpose languages, which increases the degree of complexity for the compiler designer. In this context, we propose UCL (Unified Compiler Language), a domain-specific language for developing the analysis stages of a compiler in a unified, platform-independent way. With UCL, the compiler designer can specify design decisions such as the choice of algorithms and the type of scanner, among other features. This work was evaluated through two surveys conducted with students of the compilers course at the Federal University of Campina Grande, during the course project, which consists of developing a compiler.
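The analysis stages the abstract refers to can be made concrete with a small example. The sketch below is a generic, hand-rolled scanner in Python (UCL's actual syntax is not shown in this abstract, so none is reproduced here); it illustrates the lexical-analysis stage together with the kind of symbol table that, per the abstract, existing frameworks keep hidden from the programmer. The toy token set and expression language are illustrative assumptions.

```python
import re

# Token classes for a tiny illustrative expression language.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def scan(source):
    """Yield (kind, lexeme) pairs and build a symbol table as a side effect."""
    symbols = {}
    tokens = []
    pos = 0
    while pos < len(source):
        m = MASTER.match(source, pos)
        if not m:
            raise SyntaxError(f"illegal character {source[pos]!r} at {pos}")
        pos = m.end()
        kind = m.lastgroup
        if kind == "SKIP":
            continue
        if kind == "IDENT":
            # intern each identifier: the symbol table maps names to slots
            symbols.setdefault(m.group(), len(symbols))
        tokens.append((kind, m.group()))
    return tokens, symbols
```

Calling `scan("x = y + 42")` produces the token stream for the statement plus a symbol table mapping `x` and `y` to their slots; the parsing stage would then consume that stream.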
|
2 |
Algoritmos para Interoperabilidade entre Ontologias / Algorithms for Ontology Interoperability. Hinz, Verlani Timm. 06 August 2008 (has links)
Previous issue date: 2008-08-06 / Interest in ontologies has grown steadily in recent years, since they aim to capture consensual knowledge shared between people and applications, allowing information to be reused and shared. For this purpose, mechanisms are needed to guarantee semantic interoperability, that is, the identification and compatibility of information. The present study proposes algorithms for interoperability among ontologies, using relational operators, with the aim of obtaining more semantic flexibility and better levels of description of the information that characterizes the context of a computational environment. Initially, a basic but thorough study was carried out on the forms of interoperability among ontologies and their application in different environments. Specification formalisms such as graph grammars and adjacency matrices for representing ontologies were also studied, as was the use of parsers to convert ontologies in OWL format into a matrix representation. The Dublin Core and FOAF standards were studied and used to address the reconciliation of vocabularies. To verify the applicability of the measure, two examples of dynamic interoperability of ontologies applied to specific domains were demonstrated.
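The adjacency-matrix representation mentioned in this abstract can be sketched in a few lines. The Python fragment below is an illustrative assumption, not the thesis's actual algorithms: it builds an adjacency matrix from an ontology given as labelled edges, and uses a plain Jaccard overlap of the two concept vocabularies as a crude stand-in for an interoperability measure (a real reconciliation step would first normalize labels via mappings such as Dublin Core or FOAF).

```python
def adjacency(concepts, edges):
    """Adjacency-matrix view of an ontology given as (source, target) edges."""
    index = {c: i for i, c in enumerate(concepts)}
    n = len(concepts)
    matrix = [[0] * n for _ in range(n)]
    for src, dst in edges:
        matrix[index[src]][index[dst]] = 1
    return index, matrix

def concept_overlap(concepts_a, concepts_b):
    """Jaccard similarity of two concept vocabularies: |A & B| / |A | B|."""
    a, b = set(concepts_a), set(concepts_b)
    return len(a & b) / len(a | b)
```

For example, two ontologies that share `Person` and `Document` out of four distinct concepts overall score an overlap of 0.5.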
|
3 |
Une étude de l'écosystème TLS / A study of the TLS ecosystem. Levillain, Olivier. 23 September 2016 (has links)
SSL/TLS, a security protocol dating back to 1995, has become a major component in securing network communications, from e-commerce and social network sites to virtual private networks (VPNs), e-mail protocols, and virtually every other protocol. In recent years, SSL/TLS has received a great deal of attention, leading to the discovery of many security vulnerabilities and to protocol improvements. In this thesis, we first explore the SSL/TLS ecosystem at large by enumerating HTTPS servers across the IPv4 address space, proposing collection and analysis methodologies to obtain reproducible and comparable results across different measurement campaigns. Beyond these observations, we focus on two key aspects of TLS security: how to mitigate attacks on the Record Protocol, and how to write safe and efficient parsers. Finally, building on the numerous implementation flaws that have affected almost all TLS stacks in recent years, we draw some lessons about the challenges of writing a trustworthy TLS library.
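The concern about safe and efficient parsers can be illustrated with a defensive record parser. The sketch below is a minimal Python illustration, not code from the thesis or from any real TLS stack: it parses the 5-byte TLS record header (content type, version, length) and validates every declared length before using it, the discipline whose absence has led to buffer over-reads in deployed implementations.

```python
import struct

MAX_RECORD = 2 ** 14  # RFC 5246 caps a record's payload at 16384 bytes

def parse_record(buf):
    """Parse one TLS record defensively; return (type, version, payload, rest)."""
    if len(buf) < 5:
        raise ValueError("truncated record header")
    ctype, major, minor, length = struct.unpack("!BBBH", buf[:5])
    if ctype not in (20, 21, 22, 23):  # CCS, alert, handshake, application data
        raise ValueError(f"unknown content type {ctype}")
    if length > MAX_RECORD:
        raise ValueError("record length exceeds protocol maximum")
    if len(buf) < 5 + length:
        raise ValueError("declared length overruns the buffer")
    # only the declared payload is handed out; trailing bytes stay untouched
    return ctype, (major, minor), buf[5:5 + length], buf[5 + length:]
```

Every branch rejects malformed input before any slice is taken, so the function can never read past the declared record, whatever the attacker sends.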
|
4 |
A Left-to-Right Parsing Algorithm for THIS Programming Language. Hooker, David P. 05 1900 (has links)
The subject of this investigation is a specific class of parsers known as LR parsers. Of primary interest is an LR parsing method developed by DeRemer, which specifies a translation method that can be defined by a Deterministic Push-Down Automaton (DPDA). The method of investigation was to apply DeRemer's parsing technique to a specific language known as THIS Programming Language (TPL). The syntax of TPL was redefined as state diagrams, and these state diagrams were in turn encoded into two tables, a State-Action table and a Transition table. The tables were then incorporated into a PL/I adaptation of DeRemer's algorithm and tested against various TPL statements.
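The table-driven organization described above can be sketched with a toy grammar. The fragment below is an illustrative Python reconstruction, not DeRemer's original PL/I code or the TPL tables: the `ACTION` dictionary plays the role of the State-Action table and `GOTO` the role of the Transition table, hand-encoded for the grammar with rules (1) `E -> E + n` and (2) `E -> n`.

```python
# Hand-encoded tables for the toy grammar (1) E -> E + n, (2) E -> n.
ACTION = {
    (0, "n"): ("shift", 1),
    (1, "+"): ("reduce", 2), (1, "$"): ("reduce", 2),
    (2, "+"): ("shift", 3),  (2, "$"): ("accept", None),
    (3, "n"): ("shift", 4),
    (4, "+"): ("reduce", 1), (4, "$"): ("reduce", 1),
}
GOTO = {(0, "E"): 2}
RULES = {1: ("E", 3), 2: ("E", 1)}  # rule number -> (lhs, rhs length)

def parse(tokens):
    """Drive the tables over the input; return the list of rule numbers
    applied (a rightmost derivation in reverse), or raise SyntaxError."""
    tokens = list(tokens) + ["$"]
    states, output, i = [0], [], 0
    while True:
        act = ACTION.get((states[-1], tokens[i]))
        if act is None:
            raise SyntaxError(f"unexpected {tokens[i]!r}")
        kind, arg = act
        if kind == "shift":
            states.append(arg)
            i += 1
        elif kind == "reduce":
            lhs, size = RULES[arg]
            del states[-size:]               # pop the handle
            states.append(GOTO[(states[-1], lhs)])  # consult the Transition table
            output.append(arg)
        else:  # accept
            return output
```

Parsing `n + n` yields the reduction sequence `[2, 1]`: first `E -> n`, then `E -> E + n`, exactly the trace a DPDA-based LR parser records.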
|
5 |
A Performance Based Comparative Study of Different APIs Used for Reading and Writing XML FilesGujarathi, Neha 08 October 2012 (has links)
No description available.
|
6 |
Syntaktická analýza založená na automatech s hlubokými zásobníky / Parsing Based on Automata with Deep Pushdowns. Rusek, David. January 2016 (has links)
This thesis addresses the design and implementation of syntactic analysis for context-sensitive languages, that is, for grammars containing constructs that cannot be analyzed by standard parsers based on context-free grammars. More specifically, it deals with adding context-sensitive support to classic LL analysis by replacing the standard pushdown automaton (PDA) with the deep pushdown automaton, introduced and published by Prof. Alexander Meduna.
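What distinguishes a deep pushdown automaton is that an expansion may rewrite a nonterminal below the top of the stack, which lets it handle non-context-free languages such as {a^n b^n c^n}. The Python sketch below is an illustrative simulation under assumed rules (this rule set is not taken from the thesis): each rule names the depth of the nonterminal it expands, and a depth-2 automaton recognizes a^n b^n c^n for n >= 1.

```python
# Deep pushdown rules: (state, depth, nonterminal, replacement, next_state).
# The depth says which nonterminal, counted from the top of the stack,
# the rule expands; depth 1 is ordinary LL-like expansion.
RULES = [
    ("q0", 1, "A", "aAb", "q1"),  # pump one 'a' and one 'b'
    ("q1", 2, "B", "Bc", "q0"),   # pump the matching 'c' on the deeper B
    ("q0", 1, "A", "ab", "q3"),   # close off the a/b part
    ("q3", 1, "B", "c", "qf"),    # close off the c part
]

def accepts(word, start_state="q0", start_stack=("A", "B"), final="qf"):
    """Nondeterministic simulation of a depth-2 deep pushdown automaton."""
    seen = set()

    def step(state, pos, stack):
        key = (state, pos, stack)
        if key in seen:
            return False
        seen.add(key)
        # prune: terminals already on the stack must all fit in the input
        if sum(s.islower() for s in stack) > len(word) - pos:
            return False
        if not stack:
            return state == final and pos == len(word)
        # pop move: a terminal on top must match the next input symbol
        if stack[0].islower():
            if pos < len(word) and word[pos] == stack[0] and \
                    step(state, pos + 1, stack[1:]):
                return True
        # expansion moves: rewrite the depth-th nonterminal from the top
        for st, depth, nt, repl, nxt in RULES:
            if st != state:
                continue
            nts = [i for i, s in enumerate(stack) if s.isupper()]
            if len(nts) >= depth and stack[nts[depth - 1]] == nt:
                i = nts[depth - 1]
                if step(nxt, pos, stack[:i] + tuple(repl) + stack[i + 1:]):
                    return True
        return False

    return step(start_state, 0, tuple(start_stack))
```

A standard PDA cannot accept this language, since the a/b balance and the c count must be tracked at once; here the state alternation between `q0` and `q1` forces every pumping of `aAb` to be matched by one `c` on the deeper nonterminal.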
|