241 |
USIMPL: An Extension of Isabelle/UTP with Simpl-like Control Flow
Bockenek, Joshua A., 21 December 2017
Writing bug-free code is fraught with difficulty, and existing tools for the formal verification of programs do not scale well to large, complicated codebases such as those of systems software. This thesis presents USIMPL, a component of the Orca project for formal verification that builds on Foster’s Isabelle/UTP with features of Schirmer’s Simpl in order to achieve a modular, scalable framework for deductive proofs of program correctness utilizing Hoare logic and Hoare-style algebraic laws of programming. / Master of Science / Writing bug-free code is fraught with difficulty, and existing tools for the formal verification of programs do not scale well to large, complicated codebases such as those of systems software (OSes, compilers, and similar programs that have a high level of complexity but work on a lower level than typical user applications such as text editors, image viewers, and the like). This thesis presents USIMPL, a component of the Orca project for formal verification that builds on an existing framework for computer-aided, deductive mathematical proofs (Foster’s Isabelle/UTP) with features inspired by a simple but featureful language used for verification (Schirmer’s Simpl) in order to achieve a modular, scalable framework for proofs of program correctness utilizing the rule-based mathematical representation of program behavior known as Hoare logic and Hoare-style algebraic laws of programming, which provide a formal methodology for transforming programs into equivalent formulations.
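For orientation, the Hoare logic mentioned above is built from triples {P} C {Q}: if precondition P holds before command C runs, postcondition Q holds afterwards. Two of the standard laws (generic Hoare logic, not USIMPL-specific notation) are the assignment axiom and the sequencing rule:

```latex
\{P\}\,C\,\{Q\}
\qquad\qquad
\frac{}{\;\{Q[e/x]\}\ x := e\ \{Q\}\;}\;\text{(assignment)}
\qquad\qquad
\frac{\{P\}\,C_1\,\{R\}\qquad \{R\}\,C_2\,\{Q\}}{\{P\}\,C_1 ;\, C_2\,\{Q\}}\;\text{(sequencing)}
```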
|
242 |
Abstraction Guided Semi-formal Verification
Parikh, Ankur, 28 June 2007
Abstraction-guided simulation is a promising semi-formal framework for design validation in which an abstract model of the design is used to guide a logic simulator towards a target property. However, key issues still need to be addressed before this framework can truly deliver on its promise. Concretizing, or finding a real trace corresponding to an abstract trace, remains a hard problem. Abstract traces are often spurious, meaning that no corresponding real trace exists. This is a direct consequence of the abstraction being an over-approximation of the real design. Further, the way in which the abstract model is constructed is an open-ended problem that has a great impact on the performance of the simulator.
In this work, we propose novel approaches to address these issues. First, we present a genetic algorithm to select sets of state variables directly from the gate-level netlist of the design that are highly correlated to the target property. The sets of selected variables are used to build the Partition Navigation Tracks (PNTs). PNTs capture the behavior of expanded portions of the state space as they relate to the target property. Moreover, the computation and storage costs of the PNTs are small, making them scale well to large designs.
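As a rough, hedged illustration of this kind of genetic-algorithm variable selection (the bitmask encoding, fitness function, and all parameters below are illustrative assumptions, not the thesis's actual algorithm):

```python
import random

NUM_VARS = 64          # state variables in the netlist (illustrative)
POP_SIZE = 40
GENERATIONS = 50
MUTATION_RATE = 0.02
SUBSET_PENALTY = 0.5   # discourages large subsets so selection stays focused

def fitness(selection, correlation):
    """Score a subset: summed correlation to the target property minus a
    size penalty (a stand-in for the thesis's correlation measure)."""
    score = sum(c for bit, c in zip(selection, correlation) if bit)
    return score - SUBSET_PENALTY * sum(selection)

def crossover(a, b):
    # Single-point crossover of two bitmask individuals.
    point = random.randrange(1, NUM_VARS)
    return a[:point] + b[point:]

def mutate(ind):
    # Flip each bit with small probability.
    return [bit ^ (random.random() < MUTATION_RATE) for bit in ind]

def select_variables(correlation):
    pop = [[random.randint(0, 1) for _ in range(NUM_VARS)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=lambda ind: fitness(ind, correlation), reverse=True)
        elite = pop[: POP_SIZE // 2]                 # keep the best half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(POP_SIZE - len(elite))]
        pop = elite + children
    return max(pop, key=lambda ind: fitness(ind, correlation))

# Toy correlations between each state variable and the target property.
corr = [random.random() for _ in range(NUM_VARS)]
best = select_variables(corr)
print("selected variables:", [i for i, bit in enumerate(best) if bit])
```

Each individual is a bitmask over the netlist's state variables; in the thesis the fitness would be derived from simulation data rather than the toy correlations used here.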
Our experiments show that we are able to reach many more hard-to-reach states using our proposed techniques, compared to state-of-the-art methods.
Next, we propose a novel abstraction strengthening technique, where the abstract design is constrained to make it more closely resemble the concrete design. Abstraction strengthening greatly reduces the need to refine the abstract model for hard-to-reach properties. To achieve this, we efficiently identify sequentially unreachable partial states in the concrete design via intelligent partitioning, resolution and cube enlargement. Then, these partial states are added as constraints in the abstract model. Our experiments show that the cost to compute these constraints is low and the abstract traces obtained from the strengthened abstract model are far easier to concretize. / Master of Science
|
243 |
Supporting Open Source Investigative Journalism with Crowdsourced Image Geolocation
Kohler, Rachel, 10 August 2017
Journalists rely on image and video verification to support their investigations and often utilize open source tools to verify user-generated content, but current practice requires experts be involved in every step of the process. Additionally, lacking custom tools to support verification efforts, experts are often limited to existing, openly available tools, which may or may not support the interactions and information gathering they require. We aim to support the process of geolocating images and videos through crowdsourcing. By enabling crowd workers to participate in the geolocation process, we can provide investigative journalists with efficient and complete verification of image locations. Parallelizing the search speeds up the verification process and provides a more extensive search, all while allowing the expert to follow up on other leads or investigative work. We produced a software prototype called GroundTruth which enables crowd workers to support investigative journalists in the geolocation of visual media quickly and accurately. Additionally, this work contributes experimental results demonstrating how the crowd can be utilized to support complex sensemaking tasks. / Master of Science / Journalists rely on image and video verification to support their investigations and often utilize freely available tools to verify online and digital content. Currently, experts are involved in every step of this verification process, researching and confirming or refuting the claims of images or videos. Since experts often do not have access to custom tools, they rely on already existing tools, which do not always meet their needs. One type of image verification is geolocation, in which investigators work to identify the location where a photo or video was taken. We aim to support this process through crowdsourcing. By enabling a large number of people, most with no experience or prior training, to help find the location, we can provide investigative journalists with efficient and complete verification of image locations. Multiple people searching at the same time speeds up the verification process and provides a more extensive search, all while allowing the expert to follow up on other leads or investigative work. We produced a software prototype called GroundTruth which enables novice individuals to support investigative journalists in determining the location of images and videos quickly and accurately. Additionally, this work contributes experimental results demonstrating how these individuals can collectively support complex sensemaking tasks.
|
244 |
Using data mining to increase controllability and observability in functional verification
Farkash, Monica C., 10 February 2015
Hardware verification currently consumes more than 50% of the whole hardware development time. There is a sustained effort to improve the efficiency of the verification process, which in the past helped deliver a large variety of supporting tools. The past years, though, did not see any major technology change that would bring the improvements the process really needs (H. Foster 2013; Wilson Research Group 2012). The existing approach to verification no longer provides that type of qualitative jump. This work introduces a new tactic, providing a modern alternative to the existing approach to the verification problem. The novel approach I use in this research has the potential to significantly improve the process, well beyond incremental changes. It starts by acknowledging the huge amount of data that accompanies the hardware development process from inception to final product, and by considering that data not as a quantitative by-product but as a qualitative supply of information on which a smarter verification can be developed. The approach is based on data already generated throughout the process, currently used by verification engineers to zoom into the details of different verification aspects. By using existing machine learning approaches we can zoom out and use the same data to extract information, to gain knowledge that we can use to guide the verification process. This approach tolerates the apparent lack of accuracy introduced by data discovery in order to achieve the overall goal. The latest advancements in machine learning and data mining offer the basis for a new understanding and usage of the data that is being passed through the process. This work takes several practical problems for which the classical verification process reached a roadblock, and shows how the new approach can provide a jump in productivity and efficiency of the verification process. It focuses on four different aspects of verification to prove the power of this new approach: reducing effort redundancy, guiding verification to areas that need it first, decreasing time to diagnose, and designing tests for coverage efficiency.
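As one hedged illustration of the first aspect, reducing effort redundancy (a toy sketch, not the dissertation's actual models): past coverage data can be mined to spot regression tests that exercise largely overlapping behavior.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy data: each row is a test's coverage signature (1 = coverage point hit).
# In a real flow these vectors would come from the simulator's coverage database.
rng = np.random.default_rng(0)
coverage = (rng.random((200, 500)) < 0.1).astype(int)   # 200 tests, 500 points

# Cluster tests with similar coverage signatures; tests in the same cluster
# exercise overlapping behavior, so one representative per cluster may suffice.
kmeans = KMeans(n_clusters=20, n_init=10, random_state=0).fit(coverage)

representatives = []
for cluster in range(20):
    members = np.where(kmeans.labels_ == cluster)[0]
    # Keep the member hitting the most coverage points as the representative.
    representatives.append(members[coverage[members].sum(axis=1).argmax()])

print(f"kept {len(representatives)} of {coverage.shape[0]} tests")
```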
|
245 |
BVM: Reformulação da metodologia de verificação funcional VeriSC / BVM: Reconstruction of VeriSC functional verification methodology
Oliveira, Helder Fernando de Araújo, 27 August 2018
The development process of a complex digital circuit can consist of several stages. One of them is functional verification. This stage can be considered one of the most important, because it aims to demonstrate that the functionality of the circuit to be produced is in accordance with its specification. However, besides being a stage with a large consumption of resources, the complexity of functional verification grows with the complexity of the hardware to be verified. Thus, an effective functional verification methodology and tools that help the functional verification engineer are of great value. Within this context, this work reformulates the functional verification methodology VeriSC, resulting in a new methodology called BVM (Brazil-IP Verification Methodology). VeriSC is implemented in SystemC and uses the SCV (SystemC Verification) and BVE (Brazil-IP Verification Extensions) libraries, while BVM is implemented in SystemVerilog and based on the concepts and library of OVM (Open Verification Methodology). Furthermore, this work adapts the functional verification support tool eTBc (Easy Testbench Creator) to support BVM. Case studies within the Brazil-IP project show that BVM increases the productivity of the verification engineer in carrying out functional verification, in comparison to VeriSC.
|
246 |
Authenticating turbocharger performance utilizing ASME performance test code correction methods
Shultz, Jacque, January 1900
Master of Science / Department of Mechanical and Nuclear Engineering / Kirby S. Chapman / Continued regulatory pressure necessitates the use of precisely designed turbochargers to create the design trapped equivalence ratio within large-bore stationary engines used in the natural gas transmission industry. The upgraded turbochargers scavenge the exhaust gases from the cylinder, and create the air manifold pressure and back pressure on the engine necessary to achieve a specific trapped mass. This combination serves to achieve the emissions reduction required by regulatory agencies.
Many engine owner/operators request that an upgraded turbocharger be tested and verified prior to re-installation on the engine. Verification of the mechanical integrity and airflow performance prior to engine installation is necessary to prevent field hardware iterations. Confirming the as-built turbocharger design specification prior to transporting it to the field can decrease downtime and installation costs. There are, however, technical challenges to overcome when comparing test-cell data to field conditions.
This thesis discusses the required corrections and testing methodology to verify turbocharger on-site performance from data collected in a precisely designed testing apparatus. As the litmus test of the testing system, test performance data is corrected to site conditions per the design air specification. Prior to field installation, the turbocharger is fitted with instrumentation to collect field operating data to authenticate the turbocharger testing system and correction methods. The correction method utilized herein is ASME Performance Test Code 10 (PTC 10), Compressors and Exhausters, 1997 version.
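PTC 10 prescribes a detailed correction procedure; at its heart, though, are the familiar similarity corrections that refer test data to site inlet conditions. In the usual notation, with θ = T_in/T_ref and δ = p_in/p_ref, corrected speed and mass flow are (a simplified view; the full code also accounts for gas properties and Reynolds-number effects):

```latex
N_c = \frac{N}{\sqrt{\theta}},
\qquad
\dot{m}_c = \frac{\dot{m}\,\sqrt{\theta}}{\delta},
\qquad
\theta = \frac{T_{\mathrm{in}}}{T_{\mathrm{ref}}},\quad
\delta = \frac{p_{\mathrm{in}}}{p_{\mathrm{ref}}}
```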
|
247 |
Measurement and verification of industrial DSM projects / Walter Booysen
Booysen, Walter, January 2014
Energy cost-reduction projects implemented on complex industrial systems present several challenges. The involvement of multiple project stakeholders associated with programmes such as demand side management (DSM) further increases potential risks. The process of determining project impacts is especially important due to the direct financial impact on stakeholders. A good understanding of the independent measurement and verification (M&V) process is therefore vital to ensure an unbiased process.
A review of existing M&V frameworks and guidelines found that M&V protocols and templates are well developed and widely implemented. Unfortunately, the official literature provides little guidance on the practical M&V of industrial DSM projects. This prompted a detailed literature analysis of numerous publications to ascertain the industry norm. The diverse results obtained are categorised, normalised and graphically presented to highlight shortcomings in present M&V processes.
This thesis develops several practical methodologies and guidelines to address the needs highlighted by the literature analysis. Three chapters are dedicated to the development and verification of these solutions. The first entails the evaluation of data quality with the aim of producing an accurate and error-free dataset. The second develops, evaluates and ultimately selects a baseline model representative of normal system operations. The final chapter presents project performance and uses existing methods to monitor system changes and project performance over the long term.
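As a minimal sketch of the baseline-model step just described, assuming a generic least-squares energy-versus-production baseline (the thesis develops, evaluates and selects among richer candidate models):

```python
import numpy as np

# Toy pre-implementation data: daily production (tonnes) and energy use (kWh).
production = np.array([820, 910, 760, 980, 860, 905, 790], dtype=float)
energy     = np.array([41000, 45200, 38500, 48900, 43100, 45000, 39800], dtype=float)

# Fit a linear baseline: energy ≈ a * production + b (ordinary least squares).
a, b = np.polyfit(production, energy, deg=1)

# Post-implementation period: the baseline predicts what energy use would
# have been without the project; savings are baseline minus actual.
post_production = np.array([840, 900, 870], dtype=float)
post_energy     = np.array([39200, 41800, 40500], dtype=float)

baseline = a * post_production + b
savings = baseline - post_energy
print(f"estimated savings: {savings.sum():.0f} kWh over the reporting period")
```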
The new methodologies are designed to simplify the practical implementation of different processes. Results are graphically presented, thereby enabling quick and intuitive evaluation whilst adhering to present M&V requirements. This makes the M&V process accessible to all stakeholders and enables the transparent development and improvement of all processes.
The practical application of the new methodologies is verified by using 25 industrial case studies. The results obtained are validated using data obtained from independent third parties. This proves the functionality of the methodologies and highlights trends that can be evaluated in future studies.
The new methodologies improve the accuracy and efficiency of the evaluation process. The potential annual impacts amount to R27 million for DSM stakeholders and R19 million for M&V teams. The extrapolation of these results indicates a massive potential impact on international projects. These results, albeit estimates, confirm the significant contribution of the new methodologies.

I would like to officially thank Prof. Eddie Mathews and Prof. Marius Kleingeld for granting me the opportunity to work under their guidance. Thanks to all the staff at the Centre for Research and Continued Engineering Development Pretoria, who created the ideal environment for working and learning. I would also like to thank TEMM International (Pty) Ltd for the bursary without which my studies would not be possible. Finally, I would like to thank my fellow students as well as all the industry professionals whom I had the privilege of working with.

On a personal note, I would like to thank God for making all things possible. Thank you my dearest family, friends and all who had a profound impact on my life. For you I quote Paulo Coelho’s Alchemist: “And, when you want something, all the universe conspires in helping you to achieve it.” Thank you for your love, sacrifice, support and being part of my universe. I dedicate this work to you. / PhD (Electrical Engineering), North-West University, Potchefstroom Campus, 2014
|
249 |
Continuous Authentication using Stylometry
Brocardo, Marcelo Luiz, 30 April 2015
Static authentication, where user identity is checked once at login time, can be circumvented no matter how strong the authentication mechanism is. Through attacks such as man-in-the-middle and its variants, an authenticated session can be hijacked after the initial login process has been completed. In the last decade, continuous authentication (CA) using biometrics has emerged as a possible remedy against session hijacking. CA consists of testing the authenticity of the user repeatedly throughout the authenticated session as data becomes available. CA is expected to be carried out unobtrusively, due to its repetitive nature, which means that the authentication information must be collectible without any active involvement of the user and without using any special-purpose hardware devices (e.g. biometric readers). Stylometry analysis, which consists of checking whether or not a target document was written by a specific individual, could potentially be used for CA. Although stylometric techniques can achieve high accuracy rates for long documents, it is still challenging to identify an author for short documents, in particular when dealing with large author populations.
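As a toy illustration of stylometric verification on short texts (character n-gram features with a generic linear classifier; the feature set, block sizes, and architectures used in this dissertation are more elaborate):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy training data: texts by the claimed author (1) and by other people (0).
texts = [
    "meeting moved to tuesday, please confirm asap",
    "pls see attached doc and revert with comments",
    "the quarterly numbers look off, can you recheck",
    "lol that game last night was unreal, cya later",
    "new phone who dis? anyway hit me up tonight",
    "omg did u see the finale?? no spoilers plz",
]
labels = [1, 1, 1, 0, 0, 0]

# Character n-grams capture writing style (punctuation habits, contractions,
# spelling quirks) better than word features on very short messages.
vectorizer = TfidfVectorizer(analyzer="char", ngram_range=(2, 4))
X = vectorizer.fit_transform(texts)

clf = LogisticRegression(max_iter=1000).fit(X, labels)

# Verification: accept the claimed identity if the score clears a threshold
# (in practice the threshold is tuned on held-out data, e.g. to the EER point).
probe = vectorizer.transform(["please confirm the tuesday meeting time"])
score = clf.predict_proba(probe)[0, 1]
print("accept" if score > 0.5 else "reject", f"(score={score:.2f})")
```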
In this dissertation, we propose a new framework for continuous authentication using authorship verification based on writing style. Authorship verification can be performed with stylometric techniques through the analysis of the linguistic styles and writing characteristics of authors. Different from traditional authorship verification, which focuses on long texts, we tackle the use of short messages. A shorter authentication delay (i.e. a smaller data sample) is essential to reduce the window size of the re-authentication period in CA. We validate our method using different block sizes, including 140, 280, and 500 characters, and investigate shallow and deep learning architectures for machine learning classification. Experimental evaluation of the proposed authorship verification approach yields an Equal Error Rate (EER) of 8.21% on the Enron email dataset with 76 authors and an EER of 10.08% on a Twitter dataset with 100 authors. Evaluation of the approach using relatively smaller forgery samples with 10 authors yields an EER of 5.48%. / Graduate
|
250 |
IRIG 106 CHAPTER 10 RECORDER VALIDATION
Ferrill, Paul; Golackson, Michael, October 2007
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / The most recent version of IRIG 118, Test Methods for Telemetry Systems and Subsystems, was released in 1999 and does not include any guidance for testing IRIG 106 Chapter 10 recorder/reproducers. This paper will describe the methodology and tools used to perform a thorough testing process to ensure compliance with the IRIG 106-07 standard.
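One elementary check such a validation tool might perform is walking a Chapter 10 file packet by packet. The sketch below assumes the 24-byte packet header and 0xEB25 sync pattern of the public IRIG 106 Chapter 10 standard; the field names and the checks themselves are illustrative, not the paper's actual tooling.

```python
import struct
import sys

SYNC = 0xEB25  # Chapter 10 packet sync pattern
HEADER = struct.Struct("<HHIIBBBB6sH")  # 24-byte packet header, little-endian

def walk_packets(path):
    """Walk a Chapter 10 file packet by packet, checking basic sanity:
    sync pattern present and declared packet length consistent."""
    with open(path, "rb") as f:
        data = f.read()
    offset, count = 0, 0
    while offset + HEADER.size <= len(data):
        (sync, channel_id, pkt_len, data_len,
         version, seq, flags, dtype, rtc, checksum) = HEADER.unpack_from(data, offset)
        if sync != SYNC:
            print(f"bad sync at offset {offset:#x}")
            return count
        if pkt_len < HEADER.size or offset + pkt_len > len(data):
            print(f"bad packet length {pkt_len} at offset {offset:#x}")
            return count
        count += 1
        offset += pkt_len  # declared packet length includes the header
    return count

if __name__ == "__main__":
    print(f"{walk_packets(sys.argv[1])} packets parsed")
```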
|