101 |
Making scope explorable in Software Development Environments to reduce defects and support program understanding
von Oldenburg, Tim January 2014 (has links)
Programming language tools help software developers to understand a program and to recognize possible pitfalls. Used with the right knowledge, they can be instrumented to achieve better software quality. However, creating language tools that integrate well into the development environment and workflow is challenging. This thesis utilizes a user-centered design process to identify the needs of professional developers through in-depth interviews, address those needs through a concept, and finally implement and evaluate the concept. Taking 'scope' as an exemplary source of misconceptions in programming, a "Scope Inspector" plug-in for the Atom IDE—targeting experienced JavaScript developers in the open source community—is implemented.
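The thesis itself targets JavaScript developers in Atom, but the kind of defect a scope inspector surfaces is easy to illustrate in any lexically scoped language. The sketch below is not from the thesis; it is a hedged Go example with invented names, showing a shadowing bug in which an inner declaration silently hides the variable the programmer meant to assign:

```go
// A minimal sketch of the kind of scope misconception a "Scope Inspector"
// could surface: the short variable declaration inside the if-block shadows
// the outer err, so the caller never sees the failure.
package main

import (
	"errors"
	"fmt"
)

func load(strict bool) error {
	var err error
	if strict {
		// ":=" introduces a NEW err scoped to this block, shadowing the outer one.
		err := errors.New("strict validation failed")
		fmt.Println("inner scope sees:", err)
	}
	return err // still nil: the assignment above never reached this variable
}

func main() {
	fmt.Println("outer scope returns:", load(true)) // prints <nil>
}
```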
|
102 |
Program analysis for quantitative-reachability properties
Liu, Jiawen 06 September 2024 (has links)
Program analysis studies the execution behaviors of computer programs, including a program's safety behavior, privacy behavior, resource usage, etc. Program analysis of a program's safety behavior involves analyzing whether a particular line of code leaks a secret and how much secret is leaked by that line. When studying the resource usage of a program, certain program analyses mainly focus on whether a piece of code consumes a certain resource and how much of that resource it uses. Yet another kind of program analysis studies a program's privacy behavior by analyzing whether a specific piece of private data depends on other data and how many times they are dependent during multiple executions. We notice that when studying the aforementioned behaviors, there are two dominant kinds of program properties being analyzed, "How Much" and "Whether": quantitative properties and reachability properties. In other words, we are analyzing a kind of program property that contains two sub-properties, quantitative and reachability. A property is a hyperproperty if it has two or more sub-properties. For the class of properties that has quantitative and reachability sub-properties, I refer to them as quantitative-reachability hyperproperties. Most existing program analysis methods can analyze only one sub-property of a program's quantitative-reachability hyperproperty. For example, reachability analysis methods only tell us whether some code pieces are executed, whether confidential data is leaked, whether certain data relies on other data, etc., which are only the reachability sub-properties. These analysis methods do not address how many times or for how long these properties hold with respect to some particular code or data. Quantitative analysis methods, such as program complexity analysis, resource cost analysis, execution time estimation, etc., only tell us an upper bound on the overall quantity, i.e., the quantitative sub-property. However, these quantities are not associated with a specific piece of code, program location, private data, etc., which are related to the reachability sub-properties. This thesis presents a new program analysis methodology for analyzing two representative quantitative-reachability properties. The new methodology mitigates the limitations of both reachability analysis methods and quantitative analysis methods and helps to control a program's execution behaviors at a finer granularity. The effectiveness of the new analysis method is validated through prototype implementations and experimental evaluations.
The first noteworthy quantitative-reachability property I look into is the adaptivity of programs that implement adaptive data analyses. Data analyses are usually designed to identify some properties of the population from which the data are drawn, generalizing beyond the specific data sample. For this reason, data analyses are often designed in a way that guarantees that they produce a low generalization error. An adaptive data analysis can be seen as a process composed of multiple queries interrogating some data, where the choice of which query to run next may rely on the results of previous queries. The generalization error of each individual query/analysis can be controlled by using an array of well-established statistical techniques. However, when queries are arbitrarily composed, the different errors can propagate through the chain of queries and result in a high generalization error. To address this issue, data analysts have designed several techniques that not only guarantee bounds on the generalization errors of single queries, but also guarantee bounds on the generalization error of the composed analyses. The choice of which of these techniques to use often depends on the chain of queries that an adaptive data analysis can generate, intuitively, the adaptivity level of the analysis. To help analysts identify which technique to use to control their generalization error, we consider adaptive data analyses implemented as while-like programs, and we design a program analysis framework. In this framework, we first formalize the intuitive notion of adaptivity as a quantitative-reachability property, which is a key measure for choosing the appropriate technique. Then we design a program analysis algorithm that estimates a sound upper bound on the adaptivity of the program that implements an adaptive data analysis. We also implement this program analysis and show that it can help to analyze the adaptivity of several concrete data analyses with different adaptivity structures.
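The notion of adaptivity described above, where the choice of the next query depends on the answers to earlier queries, can be made concrete with a small sketch. The following is not the thesis's formal while-language or its analysis; it is a hedged Go example with invented names and toy data, in which the rounds of an analysis form a dependency chain of length k:

```go
// A minimal sketch of an adaptive data analysis: the threshold used in round
// i+1 is computed from the answer of round i, so the rounds form a dependency
// chain of length k -- the intuitive "adaptivity" the thesis bounds statically.
package main

import "fmt"

// query returns the fraction of the sample that exceeds the threshold t.
func query(sample []float64, t float64) float64 {
	count := 0
	for _, x := range sample {
		if x > t {
			count++
		}
	}
	return float64(count) / float64(len(sample))
}

// adaptiveAnalysis runs k rounds; each round's query depends on the previous answer.
func adaptiveAnalysis(sample []float64, k int) float64 {
	t := 0.5
	for round := 0; round < k; round++ {
		ans := query(sample, t) // answer of this round ...
		t = (t + ans) / 2       // ... determines the next round's query
	}
	return t
}

func main() {
	sample := []float64{0.2, 0.9, 0.7, 0.4}
	fmt.Println("threshold after 3 adaptive rounds:", adaptiveAnalysis(sample, 3))
}
```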
As a continuation of the previous work, to get a more precise bound on a program's adaptivity level, I look at another quantitative-reachability hyperproperty: the number of times a given location inside a procedure is visited during program execution. The upper bound on this hyperproperty is referred to as the reachability-bound. It can help to improve program analysis results when studying other program features. For example, the reachability-bound on each program location can be used by some resource cost analysis techniques to compute a precise bound on a program's worst-case resource consumption. When we analyze the adaptivity of an adaptive data analysis program as discussed above, the accuracy of my program analysis result can also be improved through a tight reachability-bound on every program location. Some existing program complexity analysis methods can be repurposed to analyze and estimate the reachability-bound. However, these methods focus only on the overall quantity and ignore the path sensitivity in the program. For this reason, the reachability-bounds of locations in different sub-procedures are usually over-approximated. As far as we know, there is no general analysis algorithm that computes the reachability-bound for every program location directly and path-sensitively. To this end, I present a path-sensitive reachability-bound algorithm, which exploits path sensitivity to compute a precise reachability-bound for every program location. We implement this path-sensitive reachability-bound algorithm in a prototype and report on an experimental comparison with state-of-the-art tools over four different sets of benchmarks.
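To make the reachability-bound concrete, the hedged sketch below (an invented example, not taken from the thesis or its benchmarks) annotates the labelled locations of a small loop with an upper bound on how many times each can be visited. A path-insensitive analysis would charge both branch locations the full loop bound n, whereas a path-sensitive one can see that the reset branch runs at most once:

```go
// Comments give, for each labelled location, an upper bound on how many times
// it can be visited over a whole run with input n.
package main

import "fmt"

func work(n int) int {
	total := 0
	reset := false
	for i := 0; i < n; i++ { // loop header: visited at most n times
		if !reset && total > 10 {
			total = 0 // L2: visited at most 1 time (reset flips below)
			reset = true
		} else {
			total += i // L3: visited at most n times
		}
	}
	return total
}

func main() {
	fmt.Println(work(8))
}
```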
|
103 |
LF : a language for reliable embedded systems
Van Riet, F. A. 11 1900 (has links)
Thesis (MSc)--University of Stellenbosch, 2001. / ENGLISH ABSTRACT: Computer-aided verification techniques, such as model checking, are often considered essential to produce highly reliable software systems. Modern model checkers generally require models to be written in CSP-like notations. Unfortunately, such systems are usually implemented using conventional imperative programming languages. Translating the one paradigm into the other is a difficult and error-prone process. If one were to program in a process-oriented language from the outset, the chasm between implementation and model could be bridged more readily. This would lead to more accurate models and ultimately more reliable software. This thesis covers the definition of a process-oriented language targeted specifically towards embedded systems and the implementation of a suitable compiler and run-time system. The language, LF, is for the most part an extension of the language Joyce, which was defined by Brinch Hansen. Both LF and Joyce have features which I believe make them easier to use than other CSP-based languages such as occam. An example of this is a selective communication primitive which allows for both input and output guards, which is not supported in occam. The efficiency of the implementation is important. The language was therefore designed to be expressive, but constructs which are expensive to implement were avoided. Security, however, was the overriding consideration in the design of the language and runtime system. The compiler produces native code. Most other CSP-derived languages are either interpreted or execute as tasks on host operating systems. Arguably this is because most implementations of CSP and derivations thereof are for academic purposes only. LF is intended to be an implementation language. The performance of the implementation is evaluated in terms of practical metrics such as the time needed to complete communication operations and the average time needed to service an interrupt. / AFRIKAANSE OPSOMMING: Rekenaar ondersteunde verifikasietegnieke soos programmodellering, is onontbeerlik in die ontwikkeling van hoogs betroubare programmatuur. In die algemeen, aanvaar programme wat modelle toets CSP-agtige notasie as toevoer. Die meeste programme word egter in meer konvensionele imperatiewe programmeertale ontwikkel. Die vertaling vanuit die een paradigma na die ander is 'n moeilike proses, wat baie ruimte laat vir foute. Indien daar uit die staanspoor in 'n proses gebaseerde taal geprogrammeer word, sou die verwydering tussen model en program makliker oorbrug kon word. Dit lei tot akkurater modelle en uiteindelik tot betroubaarder programmatuur. Die tesis ondersoek die definisie van 'n proses gebaseerde taal, wat gemik is op ingebedde programmatuur. Verder word die implementasie van 'n toepaslike vertaler en looptyd omgewing ook bespreek. Die taal, LF, is grotendeels gebaseer op Joyce, wat deur Brinch Hansen ontwikkel is. Joyce en op sy beurt LF, is verbeterings op ander CSP-verwante tale soos occam. 'n Voorbeeld hiervan is 'n selektiewe kommunikasieprimitief wat die gebruik van beide toevoer- en afvoerwagte ondersteun. Omdat 'n effektiewe implementasie nagestreef word, is die taal ontwerp om so nadruklik moontlik te wees, sonder om strukture in te sluit wat oneffektief is om te implementeer. Sekuriteit was egter die oorheersende oorweging in die ontwerp van die taal en looptyd omgewing. Die vertaler lewer masjienkode, terwyl die meeste ander implementasies van CSP-agtige tale geïnterpreteer word of ondersteun word as prosesse op 'n geskikte bedryfstelsel - die meeste CSP-agtige tale word slegs vir akademiese doeleindes aangewend. LF is by uitstek ontwerp as implementasietaal. Die evaluasie van die stelsel se werkverrigting is gedoen aan die hand van praktiese maatstawwe soos die tyd wat benodig word vir kommunikasie, sowel as die gemiddelde tyd benodig vir die hantering van onderbrekings.
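The abstract's key language-design point, a selective communication primitive that admits both input and output guards, can be illustrated with Go's select statement. The sketch below is not LF syntax (which this listing does not show); it only demonstrates the idea of committing to whichever of a receive (input) guard or a send (output) guard becomes ready first, the combination that occam's ALT does not offer:

```go
// Selective communication with both an input guard (receive from requests)
// and an output guard (send on results) in one choice: the select commits to
// whichever communication partner is ready first.
package main

import "fmt"

func server(requests <-chan int, results chan<- int) {
	pending := 0
	for i := 0; i < 4; i++ {
		select {
		case r := <-requests: // input guard: accept a new request
			pending = r * r
		case results <- pending: // output guard: hand out the last result
			pending = 0
		}
	}
}

func main() {
	requests := make(chan int)
	results := make(chan int)
	go server(requests, results)

	requests <- 3
	fmt.Println(<-results) // 9
	requests <- 5
	fmt.Println(<-results) // 25
}
```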
|
104 |
A semantics for aspects by compositional translation
Sanjabi, Sam Bakhtiar January 2008 (has links)
We analyse the semantics of aspect-oriented extensions to functional languages by presenting compositional translations of these primitives into languages with traditional notions of state and control. As a first step, we examine an existing semantic description of aspects which allows the labelling of program points. We show that a restriction of these semantics to aspects which do not preempt the execution of code can be fully abstractly translated into a functional calculus with higher order references, but that removing this restriction requires a notion of exception handling to be added to the target language in order to yield a sound semantics. Next, we proceed to show that abandoning the labelling technique, and consequently relaxing the so-called "obliviousness" property of aspectual languages, allows preemptive aspects to be included in the general references model without the need for exceptions. This means that the game model of general references is inherited by the aspect calculus. The net result is a clean semantic description of aspect-orientation, which mirrors recently published techniques for their implementation, and thereby provides theoretical justification for these systems. The practical validity of our semantics is demonstrated by implementing extensions to the basic calculus in Standard ML, and showing how a number of useful aspect-oriented features can be expressed using general references alone. Our theoretical methodology closely follows the proof structure that often appears in the game semantics literature, and therefore provides an operational perspective on notions such as "bad variables" and factorisation theorems.
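The translation idea at the heart of this abstract, that aspects can be encoded with higher-order (general) references, admits a compact illustration. The sketch below is in Go rather than the Standard ML used in the thesis, and all names are invented; it shows before/after advice and a preempting aspect installed purely by overwriting a function-valued variable:

```go
// Advice is installed by overwriting a function-valued reference with a
// wrapper that runs extra code around -- or instead of -- the original body,
// which is what a general reference to a function lets you express without
// any dedicated aspect primitive.
package main

import "fmt"

// greet is the join point: callers always invoke it through this variable.
var greet = func(name string) string {
	return "hello, " + name
}

// installLoggingAspect wraps the current body with before/after advice.
func installLoggingAspect() {
	proceed := greet // capture the previous body, like a proceed() call
	greet = func(name string) string {
		fmt.Println("before advice: calling greet with", name)
		out := proceed(name)
		fmt.Println("after advice: greet returned", out)
		return out
	}
}

// installPreemptingAspect replaces the body outright: the original never runs.
func installPreemptingAspect() {
	greet = func(name string) string {
		return "access denied for " + name
	}
}

func main() {
	fmt.Println(greet("ada"))
	installLoggingAspect()
	fmt.Println(greet("ada"))
	installPreemptingAspect()
	fmt.Println(greet("ada"))
}
```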
|
105 |
The Development and Validation of a Computer-Aided Instructional Program in Mathematics for Business and Economics Majors
McCool, Kenneth Bland, 1942- 08 1900 (has links)
The problem with which this study is concerned is that of comparing the results of teaching community college students enrolled in a transferable mathematics sequence for business and economics majors by a computer-aided instructional program and by the traditional lecture method. In order to effectively resolve this problem, an A Programming Language System 360 (APL/360)-aided instructional program was developed and an experimental study was conducted. The APL/360-aided instructional program consisted of three sets of materials: a manuscript on APL/360, a list of APL programs defining operators relevant to a computer-aided study of calculus, and a collection of problems based on these programs and calculus concepts. The subjects for the experiment were forty-four students enrolled in three sections of Mathematics 112 at Mountain View College of the Dallas County Community College District. The control group, students taught by the traditional lecture method, consisted of twenty-one students. The experimental group, students taught by the APL/360-aided instructional program, consisted of twenty-three students. The same instructor taught all students. The essential difference in the two teaching methods was the use of the computer as a teaching-learning aid in the computer-aided instructional program. The computer was a course supplement to classroom instruction and aided students in obtaining insight into the nature of mathematical concepts as well as serving as a computational aid.
|
106 |
Design and Implementation of a TRAC Processor for Fairchild F24 Computer
Chi, Ping Ray 08 1900 (links)
TRAC is a text-processing language for use with a reactive typewriter. The thesis describes the design and implementation of a TRAC processor for the Fairchild F24 computer. Chapter I introduces some text processing concepts, the TRAC operations, and the implementation procedures. Chapter II examines the history and characteristics of the TRAC language. The next chapter specifies the TRAC syntax and primitive functions. Chapter IV covers the algorithms used by the processor. The last chapter discusses the design experience from programming the processor, examines the reactive action caused by the processor, and suggests adding external storage primitive functions for a future version of the processor.
|
107 |
Desenvolvimento de hardware e software para viabilizar a operação de um microdensitômetro / Development of hardware and software to operate a microdensitometer
Marques, Márcio Alexandre 22 September 1992 (has links)
O presente trabalho foi desenvolvido para viabilizar a operação do microdensitômetro Optronics P-1000 através de um microcomputador tipo IBM-PC. Assim, desenvolveu-se uma interface (hardware), bem como todo o software necessário para operar o equipamento e fazer a aquisição dos dados digitalizados. Este software permite, também, a visualização interativa das imagens, usada para definir regiões de interesse no filme. / The present work was developed to enable the operation of the Optronics P-1000 microdensitometer through an IBM-PC compatible microcomputer. Therefore, a hardware interface was developed, as well as all the software needed to operate the equipment and acquire the digitized data. This software also provides interactive visualization of the images, used to define regions of interest on the film.
|
108 |
EXTRACT: Extensible Transformation and Compiler Technology
Calnan, III, Paul W. 29 April 2003 (links)
Code transformation is widely used in programming. Most developers are familiar with using a preprocessor to perform syntactic transformations (symbol substitution and macro expansion). However, it is often necessary to perform more complex transformations using semantic information contained in the source code. In this thesis, we developed EXTRACT, a general-purpose code transformation language. Using EXTRACT, it is possible to specify, in a modular and extensible manner, a variety of transformations on Java code such as insertion, removal, and restructuring. In support of this, we also developed JPath, a path language for identifying portions of Java source code. Combined, these two technologies make it possible to identify source code that is to be transformed and then specify how that code is to be transformed. We evaluate our technology using three case studies: a type name qualifier which transforms Java class names into fully-qualified class names; a contract checker which enforces pre- and post-conditions across behavioral subtypes; and a code obfuscator which mangles the names of a class's methods and fields such that they cannot be understood by a human, without breaking the semantic content of the class.
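EXTRACT and JPath target Java source code, and this listing does not show their syntax. As an analogous, hedged sketch of the same kind of semantics-aware transformation, in the spirit of the obfuscator case study that mangles member names, the Go program below uses Go's standard AST packages to rename a function and all of its call sites; the input source and the replacement name "m0" are made up for illustration:

```go
// Parse a small program, rename the function "secret" (declaration and call
// sites) to "m0" by rewriting identifiers in the AST, and print the result.
package main

import (
	"go/ast"
	"go/format"
	"go/parser"
	"go/token"
	"os"
)

const src = `package demo

func secret(x int) int { return x * 7 }

func use() int { return secret(6) }
`

func main() {
	fset := token.NewFileSet()
	file, err := parser.ParseFile(fset, "demo.go", src, 0)
	if err != nil {
		panic(err)
	}

	// Walk the tree and rewrite every identifier named "secret".
	ast.Inspect(file, func(n ast.Node) bool {
		if id, ok := n.(*ast.Ident); ok && id.Name == "secret" {
			id.Name = "m0"
		}
		return true
	})

	// Print the transformed source.
	if err := format.Node(os.Stdout, fset, file); err != nil {
		panic(err)
	}
}
```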
|
109 |
G#, a graphical approach to functional programming
Wall, Karl January 2019 (links)
The purpose of this bachelor thesis was to present a solution for how a graphical programming language with a functional mindset can be constructed, drawing on inspiration from functional programming. An artefact was created using design science as the research methodology and went through several iterations of development and testing to ensure that the outcome works as intended. The graphical programming language that was developed, G#, allows the programmer to code in depth: functions are written as blocks inside blocks, and the language is compact. A code structure in a JSON format was developed alongside G#, allowing it to be represented in a textual data format. The purpose of the JSON format was to enable execution of G# and transformation to a text-based programming language. This study offers lessons on how a graphical programming language can be built from scratch and which difficulties need to be tackled to get a result that works as intended. Due to time limitations, G# was developed to support only a few selected and commonly used programming functions. The final version of G# in this study can do calculations, handle lists and perform simpler algorithms. During the development of G#'s graphical functions, the free tool Draw.io was used. Unknown at the start, Draw.io's limitations meant it did not perform as expected. It is therefore recommended to investigate which available tools are best suited for constructing the graphical functions before development begins. / Syftet med examensarbetet var att med hjälp av inspiration från funktionell programmering, presentera en lösning på hur ett nytt grafiskt programmeringsspråk med ett funktionellt djup och tänk kan konstrueras. Med hjälp av designbaserad forskning har en artefakt tagits fram som genomgått iterationer av utveckling och testning för att uppnå ett resultat som fungerar som tänkt. Det grafiska språket som tagits fram kallas för G# och är ett språk som tillåter programmeraren att programmera på djupet, funktioner skrivs som block inuti block och språket blir kompakt. Tillhörande G# så konstruerades en struktur i ett JSON-format för hur funktioner i G# kan representeras i ett textat dataformat. Syftet med JSON-formatet var att möjliggöra exekvering av G# och transformering till ett textbaserat programmeringsspråk. Av denna studie kan det dras lärdomar kring hur ett programmeringsspråk kan byggas upp från grunden och vilka svårigheter som måste tacklas för att få ett resultat som fungerar. På grund av begränsning av tid så utvecklades endast vanligt förekommande funktioner inom programmering till G#, språkets slutversion kan göra beräkningar, sköta hantering av listor och utföra enklare algoritmer. Vid utvecklingen av G# användes verktyget Draw.io för att konstruera funktioner. Verktyget var begränsat i funktionalitet och presterade inte enligt förväntan. Därför borde en undersökning som svarar på vilket typ av verktyg som är mest lämpligt för utvecklingen av de grafiska funktionerna göras innan utvecklingen börjar.
|
110 |
LinDCQ : uma linguagem para descrição de circuitos quânticos que possibilita o cálculo das operações na GPU utilizando JOCL
GOMES, Mouglas Eugênio Nasário 27 July 2015 (links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / This work presents the LinDCQ tool, a language for describing and programming quantum circuits, which enables the creation of quantum circuits whose operations are computed in parallel on the GPU using JOCL. The tool can also render the circuit graphically. It uses grammars as the mechanism for generating languages, and automata as the mechanism for recognizing languages and regular expressions. In this context, a discussion of compiler phases and of quantum computing is presented, as well as an overview of the main technologies used for the development of quantum circuits. The LinDCQ tool consists of: a grammar in BNF (Backus-Naur Form); a compiler that checks for errors in the code to be executed; a graphical interface with features that ease programming and allow the circuit to be built graphically; and parallel JOCL algorithms that run the most computationally expensive operations on the GPU. Finally, an experiment is carried out to assess the usability of the tool, in order to ensure a higher level of user acceptance and make it easier for users to interact with the tool developed in this work. / Este trabalho apresenta a ferramenta LinDCQ - uma linguagem de descrição e programação de circuitos quânticos — a qual possibilita a criação de circuitos quânticos com cálculo das operações realizados de forma paralela na GPU, utilizando JOCL. A ferramenta também permite a geração do circuito de forma gráfica. Utiliza gramáticas como mecanismo na geração de linguagens e autômatos como mecanismo reconhecedor de linguagens e de expressões regulares. Nesse contexto é apresentada uma discussão sobre as fases dos compiladores e sobre a computação quântica, assim como uma explanação sobre as principais tecnologias utilizadas para o desenvolvimento de circuitos quânticos. A ferramenta LinDCQ é composta de: gramática no formato BNF (Backus-Naur-Form), compilador que verifica a incidência de erros no código a ser executado, de uma interface gráfica com características facilitadoras à programação que permite a construção do circuito de forma gráfica e de algoritmos paralelos em JOCL para executar as operações que requerem maior custo computacional na GPU. Ao final é realizado um experimento com o intuito de aferir a usabilidade da ferramenta, para, deste modo, garantir um maior nível de aceitação do usuário, facilitando a interação do mesmo com a ferramenta desenvolvida nesta dissertação.
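The operations LinDCQ offloads to the GPU are the linear-algebra steps of simulating a circuit. As a hedged illustration of what such a step computes (the real tool uses Java and JOCL kernels; this plain, sequential Go sketch is not its code), applying a single-qubit Hadamard gate means mixing every pair of amplitudes whose indices differ only in the target qubit's bit, a loop that parallelizes naturally on a GPU:

```go
// Apply a Hadamard gate to one qubit of an n-qubit state vector. Each
// amplitude pair (i, i+stride) is independent of every other pair, which is
// why this loop is a good fit for a GPU kernel.
package main

import (
	"fmt"
	"math"
	"math/cmplx"
)

// applyHadamard applies H to qubit `target` of the state vector in place.
func applyHadamard(state []complex128, target uint) {
	h := complex(1/math.Sqrt(2), 0)
	stride := 1 << target
	for i := 0; i < len(state); i++ {
		if i&stride == 0 { // visit each (i, i+stride) pair once
			a, b := state[i], state[i+stride]
			state[i] = h * (a + b)
			state[i+stride] = h * (a - b)
		}
	}
}

func main() {
	// Two-qubit register initialised to |00>.
	state := []complex128{1, 0, 0, 0}
	applyHadamard(state, 0) // qubit 0 goes into superposition: (|00>+|01>)/sqrt(2)
	for i, amp := range state {
		fmt.Printf("|%02b>: %.3f\n", i, cmplx.Abs(amp))
	}
}
```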
|