231

Verifying Higher-Order Imperative Programs with Higher-Order Separation Logic

Krishnaswami, Neelakantan R. 01 June 2012 (has links)
In this thesis I show that it is possible to give modular correctness proofs of interesting higher-order imperative programs using higher-order separation logic. To do this, I develop a model higher-order imperative programming language together with a program logic for it. I demonstrate the power of the logic by verifying a series of examples, including both realistic patterns of higher-order imperative programming, such as the subject-observer pattern, and examples demonstrating the use of higher-order logic to reason modularly about highly aliased data structures, such as the union-find disjoint set algorithm.
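For readers unfamiliar with the last example the abstract names, the following is a plain, unverified sketch of the union-find (disjoint set) structure with path compression and union by rank; it is an ordinary Python rendering of the well-known algorithm, not the verified program from the thesis.

```python
class UnionFind:
    """Disjoint set forest with path compression and union by rank."""

    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.rank = [0] * n

    def find(self, x):
        # Path compression: every node on the path ends up at the root,
        # which is exactly the aliasing that makes verification hard.
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        # Union by rank: attach the shallower tree under the deeper one.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

uf = UnionFind(5)
uf.union(0, 1)
uf.union(1, 2)
print(uf.find(0) == uf.find(2))  # True: 0 and 2 share a representative
print(uf.find(0) == uf.find(4))  # False: 4 is still in its own set
```

The shared, mutated `parent` array is the "highly aliased" state the abstract refers to: many elements reach the same root through overlapping paths.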
232

Semantics, implementation and pragmatics of Clear, a program specification language

Sannella, Donald Theodore January 1982 (has links)
Specifications are necessary for communicating decisions and intentions and for documenting results at many stages of the program development process. Informal specifications are typically used today, but they are imprecise and often ambiguous. Formal specifications are precise and exact but are more difficult to write and understand. We present work aimed toward enabling the practical use of formal specifications in program development, concentrating on the Clear language for structured algebraic specification. Two different but equivalent denotational semantics for Clear are given. One is a version of a semantics due to Burstall and Goguen with a few corrections, in which the category-theoretic notion of a colimit is used to define Clear's structuring operations independently of the underlying 'institution' (logical formalism). The other semantics defines the same operations by means of straightforward set-theoretic constructions; it is not institution-independent but it can be modified to handle all institutions of apparent interest. Both versions of the semantics have been implemented. The set-theoretic implementation is by far the more useful of the two, and includes a parser and typechecker. An implementation is useful for detecting syntax and type errors in specifications, and can be used as a front end for systems which manipulate specifications. Several large specifications which have been processed by the set-theoretic implementation are presented. A semi-automatic theorem prover for Clear, built on top of the Edinburgh LCF system, is described. It takes advantage of the structure of Clear specifications to restrict the available information to that which seems relevant to proving the theorem at hand. If the system is unable to prove a theorem automatically, the user can attempt the proof interactively using the high-level primitives and inference rules provided.
We lay a theoretical foundation for the use of Clear in systematic program development by investigating a new notion of the implementation of a specification by a lower-level specification. This notion extends to handle parameterised specifications. We show that this implementation relation is transitive and commutes with Clear's structuring operations under certain conditions. This means that a large specification can be refined to a program in a gradual and modular fashion, where the correctness of the individual refinements guarantees the correctness of the resulting program.
233

Process Control Methods for Operation of Superconducting Cavities at the LEP Accelerator at CERN

Magnuson, Martin January 1992 (has links)
The aim of this thesis is to analyse the cryogenic process for cooling superconducting radio frequency accelerator test cavities in the LEP accelerator at CERN. A liquefaction cryoplant is analysed, including the production of liquid helium at 4.5 K, the systems for distribution and regulation of liquid helium, and the radio frequency field used for accelerating particles. After discussing regulation problems and modifications planned for a new cavity installation in 1992, different techniques for specifying the control programs for the new installation are evaluated. Various diagramming techniques, standards and methodologies, and Computer Aided Software Engineering (CASE) tools are compared as to their practical usefulness in this kind of process control. Finally, in accordance with anticipated requirements, possible ways of making high- and low-level control program specifications are suggested.
234

A formal framework for the specification of interactive systems

Butterworth, Richard J. January 1997 (has links)
We are primarily concerned with interactive systems whose behaviour is highly reliant on end user activity. A framework for describing and synthesising such systems is developed. This consists of a functional description of the capabilities of a system together with a means of expressing its desired 'usability'. Previous work in this area has concentrated on capturing 'usability properties' in discrete mathematical models. We propose notations for describing systems in a 'requirements' style and a 'specification' style. The requirements style is based on a simple temporal logic and the specification style is based on Lamport's Temporal Logic of Actions (TLA) [74]. System functionality is specified as a collection of 'reactions', the temporal composition of which defines the behaviour of the system. By observing and analysing interactions it is possible to determine how 'well' a user performs a given task. We argue that a 'usable' system is one that encourages users to perform their tasks efficiently (i.e. to consistently perform their tasks well); hence a system in which users perform their tasks well in a consistent manner is likely to be a usable system. The use of a given functionality linked with different user interfaces then gives a means by which interfaces (and other aspects) can be compared, and suggests how they might be harnessed to bias system use so as to encourage the desired user behaviour. Normalising across different users and different tasks moves us away from the discrete nature of reactions; hence, to describe the use of a system comfortably, we employ probabilistic rather than discrete mathematics. We illustrate the framework with worked examples and propose an agenda for further work.
235

Molecular and cellular differentiation during the early shell field development in Lymnaea stagnalis

Hohagen, Jennifer 16 December 2013 (has links)
No description available.
236

An Investigation of the Optimal Sample Size, Relationship between Existing Tests and Performance, and New Recommended Specifications for Flexible Base Courses in Texas

Hewes, Bailey 03 October 2013 (has links)
The purpose of this study was to improve flexible base course performance within the state of Texas while reducing TxDOT’s testing burden. The focus of this study was to revise the current specification with the intent of providing a “performance related” specification while optimizing sample sizes and testing frequencies based on material variability. A literature review yielded information on base course variability within and outside the state of Texas, and on what tests other states, and Canada, are currently using to characterize flexible base performance. A sampling and testing program was conducted at Texas A&M University to define current variability information, and to conduct performance related tests including resilient modulus and permanent deformation. In addition to these data being more current, they are more representative of short-term variability than data obtained from the literature. This “short-term” variability is considered more realistic for what typically occurs during construction operations. A statistical sensitivity analysis (based on the 80th percentile standard deviation) of these data was conducted to determine minimum sample sizes for contractors to qualify for the proposed quality monitoring program (QMP). The required sample sizes for contractors to qualify for the QMP are 20 for gradation, compressive strength, and moisture-density tests, 15 for Atterberg Limits, and 10 for Web Ball Mill. These sample sizes are based on a minimum 25,000 ton stockpile, or “lot”. After qualifying for the program, if contractors can prove their variability is better than the 80th percentile, they can reduce their testing frequencies. The sample size for TxDOT’s verification testing is 5 samples per lot and will remain at that number regardless of reduced variability. Once qualified for the QMP, a contractor may continue to send material to TxDOT projects until a failing sample disqualifies the contractor from the program. 
TxDOT does not currently require washed gradations for flexible base. Dry and washed sieve analyses were performed during this study to investigate the need for washed gradations. Statistical comparisons of these data yielded strong evidence that TxDOT should always use a washed method. Significant differences between the washed and dry methods were found for the percentage of material passing the No. 40 and No. 200 sieves. Since TxDOT already specifies limits on the fraction of material passing the No. 40 sieve, and since this study yielded evidence that this size fraction is related to resilient modulus (performance), it would be beneficial to use a washed sieve analysis and thereby obtain a more accurate reading for that specification. Furthermore, it is suggested that TxDOT require contractors to set "target" test values, and to place 90 percent within limits (90PWL) bands around those targets to control material variability.
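As a rough illustration of the percent-within-limits idea the abstract closes with: assuming test results are approximately normal, the share of production falling inside a specification band can be estimated from the sample mean and standard deviation. The target value and limits below are illustrative numbers, not TxDOT's actual specification.

```python
from statistics import NormalDist

def percent_within_limits(mean, std, lower, upper):
    """Estimated percentage of production falling inside [lower, upper],
    under a normality assumption for the test results."""
    d = NormalDist(mean, std)
    return 100.0 * (d.cdf(upper) - d.cdf(lower))

# Hypothetical example: percent passing the No. 40 sieve,
# target 30 with a +/- 5 point band around it.
pwl = percent_within_limits(mean=31.0, std=2.5, lower=25.0, upper=35.0)
print(round(pwl, 1))  # → 93.7, i.e. this lot would miss a 90PWL requirement's 95% tier but pass 90%
```

A contractor whose process drifts off target or whose variability grows sees the estimated PWL fall, which is the mechanism a 90PWL band uses to reward consistent production.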
237

Smooth transitions in macroeconomic relationships

Eliasson, Ann-Charlotte January 1999 (has links)
The purpose of this thesis is to explore the possibilities and advantages of describing macroeconomic relationships with a certain well-defined class of parametric nonlinear econometric models, called smooth transition regressions (STR). An STR model is a flexible nonlinear specification with a continuum of regimes. It is locally linear; transitions from one extreme regime to another are determined by a function of a continuous variable, the transition variable. The thesis consists of four essays, and the macroeconomic relationships considered are consumption, money demand and the Phillips curve. The essays of this dissertation emphasise the importance of allowing for a flexible functional form when dealing with macroeconomic relationships. / Diss. Stockholm : Handelshögsk.
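The "continuum of regimes" can be sketched concretely. In the common logistic STR form, fitted values mix two locally linear regimes through a smooth transition function G(s; γ, c) = 1 / (1 + exp(−γ(s − c))), where γ controls how abrupt the transition is and c locates its midpoint. The regression coefficients and data below are made-up numbers for illustration, not estimates from the thesis.

```python
import numpy as np

def logistic_transition(s, gamma, c):
    """G(s; gamma, c) = 1 / (1 + exp(-gamma * (s - c))).
    gamma sets the smoothness of the regime change; c its midpoint."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def str_fitted(x, s, beta_low, beta_high, gamma, c):
    """Fitted values of a two-regime logistic STR model: a continuum of
    regimes mixing x @ beta_low and x @ beta_high via G."""
    g = logistic_transition(s, gamma, c)
    return (1.0 - g) * (x @ beta_low) + g * (x @ beta_high)

# Hypothetical: an intercept plus one regressor; the transition variable s
# is far below / far above the midpoint c = 0, so each observation sits
# almost entirely in one extreme regime.
x = np.array([[1.0, 2.0], [1.0, 4.0]])
s = np.array([-10.0, 10.0])
y = str_fitted(x, s, np.array([1.0, 0.5]), np.array([0.0, 2.0]),
               gamma=1.0, c=0.0)
print(y)  # near the pure low-regime (2.0) and pure high-regime (8.0) predictions
```

For intermediate values of s the model is a genuine blend of the two regimes, which is what distinguishes STR from a discrete threshold model.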
238

Capacity evaluation and retrofitting of timber bridge girders

Wilkinson, Kym January 2008 (has links)
Bridges form a vital link in the physical infrastructure and must be maintained in a "safe working order" at all times. It is estimated that there are currently 20,000 timber road bridges in service throughout Australia. Increasing demands on these bridges due to heavier and faster moving loads, together with deterioration, are placing these aging structures at a higher risk of collapse. Unfortunately, many local governments and government departments have neglected the benefits of preventative maintenance and have opted for "just in time" repairs. This is especially true for timber bridges. This past neglect has placed bridge stock in a poor state that is only now being recognised as a significant problem. A key component of this research is to develop improvements to this current situation. This research thesis generates detailed knowledge on the load carrying capacities of timber bridges and new non-destructive testing techniques that can be substituted for conventional testing procedures. For the first time, guidelines have been developed for undertaking capacity assessment of timber bridges by specifying intervention levels for notched timber and limiting maximum allowable strains in timber members. This newly acquired knowledge will enable asset managers to more accurately determine the capacity of sniped timber bridge girders, enabling appropriate retrofitting and maintenance while also allowing the safe movement of heavy vehicles. The knowledge generated through destructive testing of timber girders, and the analysis of the vast amount of experimental data, has for the first time enabled the development of specifications for replacement girders. These specifications detail both functional and performance related targets for three different types of replacement girders. Testing of these replacement girders also demonstrates that, with some minor modifications, the specification targets can be met.
The outcomes of this thesis provide an innovative approach to assessing the condition and capacity of timber girders and to increasing the safety and life of timber bridges in Queensland. By using new techniques such as non-destructive testing, species identification and limiting maximum allowable strains, as described in this thesis, the road transport network can be safely used by heavy and permit vehicles. It is only through the effective management of timber bridge maintenance and rehabilitation that Australia can have an efficiently running road transportation network.
239

Definição e especificação formal do jogo diferencial Lobos e Cordeiro / Definition and formal specification of the differential game wolves and lamb

Sulzbach, Sirlei Ines January 2005 (has links)
In this work, common questions in differential games are presented, in which the players involved have different objectives: while one player tries "to run away", the other tries "to catch". Moreover, a specification model for the differential game "wolves and lamb" is defined, with Petri nets chosen as the specification formalism for the proposed game. The objective is thus to establish efficient strategies for the wolves-and-lamb game so that a study of the complexity of the questions presented for this game can be carried out, taking into consideration the formal specification given for it.
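To make the specification formalism concrete, here is a minimal place/transition Petri net sketch in the spirit of the approach: a marking assigns tokens to places, and a transition fires by consuming input tokens and producing output tokens. The places and the single move below are illustrative; they do not reproduce the thesis's actual wolves-and-lamb net.

```python
class PetriNet:
    """A minimal place/transition net: marking maps places to token counts."""

    def __init__(self, marking):
        self.marking = dict(marking)
        self.transitions = {}  # name -> (inputs, outputs), each a dict place->count

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        # A transition is enabled when every input place holds enough tokens.
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Hypothetical two-square board: a token marks the lamb's position,
# and one transition models the lamb moving from square a to square b.
net = PetriNet({"lamb@a": 1, "lamb@b": 0})
net.add_transition("lamb_a_to_b", {"lamb@a": 1}, {"lamb@b": 1})
net.fire("lamb_a_to_b")
print(net.marking)  # the token has moved: lamb@a empty, lamb@b marked
```

Encoding each player's legal moves as transitions makes the game's reachable states the net's reachable markings, which is what allows complexity questions about strategies to be posed over the specification.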
240

Especificação formal de uma ferramenta de reutilização de especificações de requisitos / Formal specification of a requirements reuse tool

Pimenta, Alexandre January 1998 (has links)
Over the last decades, Software Engineering has attracted increasing attention from the scientific community. The ability to build high-quality software, on time and within budget, that meets market demand is one of the goals pursued by research in this area. In this context, formal methods and software reuse have been used to increase software quality and productivity. Formal methods enable software development and verification through a rigorous mathematical notation. This mathematical rigour allows aspects such as consistency, completeness and correctness to be assessed systematically: to be consistent, facts declared in one part of a specification must not be contradicted elsewhere, and consistency is guaranteed mathematically by proving that facts can be formally derived from the initial facts (using inference rules). Software reuse is a technique for exploiting information produced in previous software developments, with the goal of reducing the effort needed to develop a new system. The basic premise of reuse is to produce systems of higher quality and reliability in a more productive way: productivity increases as solutions to earlier problems are applied to new ones.

There is a tendency to exploit reuse in the early phases of software development, justified by the great importance attached to these phases. However, the products of the early phases are strongly tied to the application domain, so specifications, requirements models and design strategies can only be reused successfully across applications of the same family, that is, applications that share requirements and constraints. The notion of domain is therefore fundamental to reuse in the early phases. An important work in this area is Maiden's, which investigated analogy as a paradigm for the reuse of requirements specifications and identified 22 abstract domains represented by a set of predicates.

This work is part of the PROSOFT project, a software development environment built by the Information Systems Group at CPGCC/UFRGS under the coordination of Prof. Dr. Daltro José Nunes, whose goal is to support the software engineer from requirements analysis through to program implementation using formal methods. Software development environments that aim to help the software engineer cannot ignore the potential of reuse, since this technique makes it possible to produce higher-quality software more productively. In its current stage of development, PROSOFT has no reuse model: if a user wishes to build a new system, the environment offers no support for locating and retrieving previously implemented requirements specifications that could be adapted to a new solution. This work therefore presents and formally specifies a requirements reuse tool for the PROSOFT environment, using Maiden's requirements reuse model, "analogical requirements specification reuse", as its reference.
