311 | Efficient specification-based testing using incremental techniques. Uzuncaova, Engin, 10 October 2012 (has links)
As software systems grow in complexity, the need for efficient automated techniques for design, testing and verification becomes increasingly critical. Specification-based testing provides an effective approach for checking the correctness of software in general. Constraint-based analysis using specifications enables checking various rich properties by automating the generation of test inputs. However, as specifications get more complex, existing analyses face a scalability problem due to state explosion. This dissertation introduces a novel approach to analyze declarative specifications incrementally; presents a constraint prioritization and partitioning methodology to enable efficient incremental analyses; defines a suite of optimizations to improve the analyses further; introduces a novel approach for testing software product lines; and provides an experimental evaluation that shows the feasibility and scalability of the approach. The key insight behind the incremental technique is declarative slicing, a new class of optimizations. These optimizations are inspired by traditional program slicing for imperative languages but are applicable to analyzable declarative languages in general, and to Alloy in particular. We introduce a novel algorithm for slicing declarative models. Given an Alloy model, our fully automatic tool, Kato, partitions the model into a base slice and a derived slice using constraint prioritization. As opposed to the conventional use of the Alloy Analyzer, where models are analyzed as a whole, we perform analysis incrementally, i.e., in several steps. A satisfying solution to the base slice is systematically extended to generate a solution for the entire model, while unsatisfiability of the base implies unsatisfiability of the entire model. We show how our incremental technique enables different analysis tools and solvers to be used in synergy to further optimize our approach.
Compared to the conventional use of the Alloy Analyzer, this yields further overall performance improvements for solving declarative models. Incremental analyses have a natural application in the software product line domain. A product line is a family of programs built from features that are increments in program functionality. Given properties of features as first-order logic formulas, we automatically generate test inputs for each product in a product line. We show how to map a formula that specifies a feature into a transformation that defines incremental refinement of test suites. Our experiments using different data structure product lines show that our approach can provide an order of magnitude speed-up over conventional techniques.
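The base-slice/derived-slice idea described in this abstract can be illustrated with a deliberately tiny sketch. This is not Kato or the Alloy Analyzer; it is a hypothetical brute-force solver over a small finite domain, showing only the shape of the incremental scheme: solve the prioritized base slice first, then try to extend each base solution over the remaining constraints, with base unsatisfiability implying unsatisfiability of the whole model.

```python
from itertools import product

DOMAIN = range(4)  # small finite scope, in the spirit of bounded Alloy analysis

# Hypothetical constraints over variables x, y, z, split by prioritization.
base_constraints = [lambda s: s["x"] < s["y"]]            # base slice
derived_constraints = [lambda s: s["x"] + s["y"] == s["z"]]  # derived slice

def solve_base():
    """Yield assignments to the base variables satisfying the base slice."""
    for x, y in product(DOMAIN, repeat=2):
        s = {"x": x, "y": y}
        if all(c(s) for c in base_constraints):
            yield s

def extend(base_sol):
    """Extend one base solution over the remaining variable z, if possible."""
    for z in DOMAIN:
        s = dict(base_sol, z=z)
        if all(c(s) for c in derived_constraints):
            return s
    return None  # this particular base solution cannot be extended

def solve_incrementally():
    """Incremental analysis: base first, then extension to the full model."""
    for base_sol in solve_base():
        full = extend(base_sol)
        if full is not None:
            return full
    return None  # unsatisfiable base => unsatisfiable model

print(solve_incrementally())
```

In the real setting each stage is a SAT call rather than an enumeration, which is what lets different solvers be used in synergy across the stages.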
|
312 | Formal specification and verification of a JVM and its bytecode verifier. Liu, Hanbing, 28 August 2008 (has links)
Not available
|
313 | Logo programavimas mokykloje / Logo programming in school. Narkevičius, Darjušas, 02 August 2013 (has links)
This work is about Logo programming at school. It analyses S. Papert's philosophy, which is well captured by the phrase "learning by making". Logo's features and different teaching styles are reviewed, and Logo's construction-and-programming capabilities are analysed.
Two programming languages used at school, Logo and Pascal, are compared, and their suitability for teaching according to the pupil's age is assessed. Nine programming languages are reviewed and their suitability for teaching investigated.
The work reviews entries from the Lithuanian national Logo olympiad, analysing the complexity of the programming tasks set for the oldest pupils and the creativity pupils showed in solving them.
A study with different groups of pupils was carried out; it found that starting instruction with procedural programming encourages pupils' independence and develops constructive programming skills.
Construction-based learning is well described by the phrase "learning by making", and information and communication technologies help to realise this type of learning.
Its pioneer, Seymour Papert (born February 29, 1928), is a noted mathematician, computer scientist and educationalist, and one of the founders of artificial intelligence theory and of Logo programming. Even before publishing "Mindstorms", Papert's laboratory at the Massachusetts Institute of Technology was investigating what software would best suit children working with computers; the first programs expressing his Logo ideas were created in the late 1960s (1967–1968)... [see full text]
|
314 | Coercions effaçables : une approche unifiée des systèmes de types [Erasable coercions: a unified approach to type systems]. Cretin, Julien, 30 January 2014 (has links) (PDF)
Functional programming languages, such as OCaml or Haskell, rest on the lambda calculus as their core language. Although they have different reduction strategies and different type-system features, their proofs of soundness and of normalization (in the absence of recursion) should be factorizable. This thesis establishes such a factorization for theoretical type systems featuring recursive types, subtyping, bounded polymorphism, and constrained polymorphism. Interestingly, soundness and normalization under strong reduction imply soundness and normalization for all the usual strategies. Our work shows that a generalization of current coercions makes it possible to describe all of the type-system features cited above in an erasable and composable way. We illustrate this by presenting two concrete type systems: first, an explicit type system with a restricted form of coercion abstraction to express subtyping and bounded polymorphism; then an implicit type system without restriction on coercion abstraction, which extends the explicit system with recursive types and constrained polymorphism, but without the type-preservation property. A side result is the adaptation of step-indexing techniques for soundness proofs to calculi under strong reduction.
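The central property claimed in this abstract, that coercions are erasable, can be pictured with a toy sketch. This is hypothetical syntax, not the thesis's calculus: a tiny lambda-term AST where `Coerce` nodes merely decorate a term, and an `erase` function that strips them, so coercions cannot influence reduction of the erased term.

```python
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class Lam:
    param: str
    body: object

@dataclass
class App:
    fn: object
    arg: object

@dataclass
class Coerce:
    coercion: str   # e.g. the name of a subtyping witness (hypothetical)
    term: object

def erase(t):
    """Strip every coercion node, leaving a plain lambda term."""
    if isinstance(t, Coerce):
        return erase(t.term)
    if isinstance(t, Lam):
        return Lam(t.param, erase(t.body))
    if isinstance(t, App):
        return App(erase(t.fn), erase(t.arg))
    return t

# (c |> (\x. x)) y erases to (\x. x) y: the coercion cannot block reduction.
term = App(Coerce("c", Lam("x", Var("x"))), Var("y"))
print(erase(term) == App(Lam("x", Var("x")), Var("y")))
```

Erasability is what lets the soundness and normalization arguments be carried out once on the coercion-free calculus and transferred to each decorated type system.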
|
315 | ICC and Probabilistic Classes. Parisen Toldin, Paolo, 08 April 2013 (has links) (PDF)
The thesis applies ICC (Implicit Computational Complexity) techniques to the probabilistic polynomial complexity classes in order to obtain an implicit characterization of them. The main contribution lies in the implicit characterization of the class PP (Probabilistic Polynomial Time): a syntactical characterization of PP and a static complexity analyser able to recognise whether an imperative program computes in probabilistic polynomial time. The thesis is divided into two parts. The first part addresses the problem by creating a prototype functional language (a probabilistic variant of the lambda calculus with bounded recursion) that is sound and complete with respect to Probabilistic Polynomial Time. The second part reverses the problem and develops a feasible way to verify whether a program, written in a prototype imperative programming language, runs in probabilistic polynomial time. This thesis can be seen as one of the first steps for Implicit Computational Complexity over probabilistic classes. Hard open problems remain to investigate, and many theoretical aspects are strongly connected with these topics; I expect that in the future there will be wide attention to ICC and probabilistic classes.
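The two ingredients the abstract combines, probabilistic choice and bounded recursion, can be sketched as follows. This is a toy (not the thesis's calculus): a probabilistic recursion that carries an explicit "fuel" budget fixed in advance as a polynomial in the input size, so every run is guaranteed to halt within polynomially many steps; the hypothetical bound `n*n + 1` and the walk itself are illustrative choices.

```python
import random

def poly_bound(n):
    """Hypothetical resource polynomial fixing the recursion depth up front."""
    return n * n + 1

def prob_walk(n, fuel, rng):
    """One probabilistic run: step down by 1 or 2 within a fixed fuel budget."""
    if n == 0:
        return True                  # landed exactly on 0: accept
    if n < 0 or fuel == 0:
        return False                 # overshot, or fuel exhausted: reject
    step = rng.choice([1, 2])        # the probabilistic choice primitive
    return prob_walk(n - step, fuel - 1, rng)

def accept_probability(n, trials=10_000, seed=0):
    """Estimate the acceptance probability over many independent runs."""
    rng = random.Random(seed)
    hits = sum(prob_walk(n, poly_bound(n), rng) for _ in range(trials))
    return hits / trials

print(accept_probability(6))
```

Because the fuel is computed before the recursion starts, a static analyser only has to certify the bound, not the dynamics of the coin flips, which is the flavour of guarantee an ICC-style characterization provides.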
|
316 | Effects of levels of abstractness of icons used to represent programming language constructs. Garcia, Mariano, January 1993 (has links)
No description available.
|
317 | Developing practical program analyses for programs with pointers. Liang, Donglin, January 2002 (has links)
No description available.
|
318 | The design, implementation, and use of a concurrent lisp programming system for distributed computing environments. Pearson, Mark Philip, 05 1900 (has links)
No description available.
|
319 | A compiler for the LMT music transcription language. Adler, Stuart Philip, January 1974 (has links)
No description available.
|
320 | On the efficiency of meta-level inference. Harmelen, Frank van, January 1989 (has links)
In this thesis we will be concerned with a particular type of architecture for reasoning systems, known as meta-level architectures. After presenting the arguments for such architectures (chapter 1), we discuss a number of systems in the literature that provide an explicit meta-level architecture (chapter 2), and these systems are compared on the basis of a number of distinguishing characteristics. This leads to a classification of meta-level architectures (chapter 3). Within this classification we compare the different types of architectures, and argue that one of these types, called bilingual meta-level inference systems, has a number of advantages over the other types. We study the general structure of bilingual meta-level inference architectures (chapter 4), and we discuss the details of a system that we implemented which has this architecture (chapter 5). One of the problems that this type of system suffers from is the overhead that is incurred by the meta-level effort. We give a theoretical model of this problem, and we perform measurements which show that this problem is indeed a significant one (chapter 6). Chapter 7 discusses partial evaluation, the main technique available in the literature to reduce the meta-level overhead. This technique, although useful, suffers from a number of serious problems. We propose two further techniques, partial reflection and many-sorted logic (chapters 8 and 9), which can be used to reduce the problem of meta-level overhead without suffering from these problems.
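Partial evaluation, the overhead-reduction technique this abstract discusses, can be sketched with a toy propositional rule interpreter (a hypothetical rule format, not the thesis's bilingual system). The meta-level interpreter consults the rule set on every call; specializing it with respect to a fixed rule set and goal does all rule lookup once, leaving a direct function of the remaining dynamic input, the facts.

```python
# Rules map a conclusion to alternative bodies of premises (acyclic here).
RULES = {
    "wet": [["rain"]],
    "slippery": [["wet"]],
}

def interpret(goal, facts, rules):
    """Meta-level interpreter: re-consults the rule set on every call."""
    if goal in facts:
        return True
    return any(all(interpret(p, facts, rules) for p in body)
               for body in rules.get(goal, []))

def specialize(goal, rules):
    """Partially evaluate the interpreter w.r.t. fixed goal and rules,
    returning a function of the dynamic input (the facts) alone."""
    if not rules.get(goal):
        return lambda facts, g=goal: g in facts
    subs = [[specialize(p, rules) for p in body] for body in rules[goal]]
    return lambda facts, g=goal, subs=subs: (
        g in facts or any(all(s(facts) for s in body) for body in subs))

check_slippery = specialize("slippery", RULES)  # rule lookup happens once
print(interpret("slippery", {"rain"}, RULES), check_slippery({"rain"}))
```

The residual function avoids the repeated dictionary lookups and dispatch of the interpreter, which is precisely the meta-level overhead measured in chapter 6; the sketch also hints at why partial evaluation alone is fragile, since it assumes the rule set is fixed and acyclic.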
|