251 |
Runtime detection and prevention for Structure Query Language injection attacks. Shafie, Emad, January 2013 (has links)
The use of Internet services and web applications has grown rapidly because of user demand. At the same time, the number of web application vulnerabilities has increased as a result of mistakes during development, where some developers gave security a lower priority than aspects such as application usability. An SQL (Structured Query Language) injection is a common vulnerability in web applications, as it allows a hacker or illegitimate user to gain access to the web application's database and thereby damage the data or change the information held in the database. This thesis proposes a new framework for the detection and prevention of new and common types of SQL injection attacks. The programme of research is divided into several work packages, starting by addressing the problem of web application security in general and SQL injection in particular and discussing existing approaches; the remaining work packages follow a constructive research approach. The framework considers both existing and new SQL injection attacks and consists of three checking components: the first checks the user input for existing attacks, the second checks for new types of attacks, and the last blocks unexpected responses from the database engine. Additionally, the framework keeps track of an ongoing attack by recording and investigating user behaviour. The framework is based on the Anatempura tool, a runtime verification tool for Interval Temporal Logic properties. Existing attacks and good/bad user behaviours are specified using Interval Temporal Logic, and the detection of new SQL injection attacks is done using the database observer component. Moreover, this thesis discusses a case study in which various types of user behaviour are specified in Interval Temporal Logic and shows how these can be detected. The implementation of each component is provided and explained in detail, showing the input, the output and the process of each component. Finally, the functionality of each checking component is evaluated using a case study, and the user behaviour component is evaluated using sample attacks and normal user inputs. The conclusion chapter summarises the thesis and discusses future work and limitations. This research has made the following contributions: • A new framework for the detection and prevention of SQL injection attacks. • Runtime detection: a runtime verification technique based on Interval Temporal Logic to detect various types of SQL injection attacks. • Database observer: detection of possible new injection attacks by monitoring database transactions. • User behaviour: investigation of related SQL injection attacks using user input, providing early warning against SQL injection attacks.
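The first checking component described above, screening user input against known attack signatures before the query reaches the database engine, can be sketched as follows. This is a minimal illustration in Python, assuming a small hand-written pattern list; the framework itself specifies attacks as Interval Temporal Logic properties checked at runtime by Anatempura, which is not reproduced here.

```python
import re

# Illustrative signatures for well-known SQL injection forms (tautology,
# piggy-backed query, comment truncation, union-based extraction). A real
# deployment would use a far richer specification.
KNOWN_ATTACK_PATTERNS = [
    r"'\s*or\s+'1'\s*=\s*'1",              # tautology: ' OR '1'='1
    r";\s*(drop|delete|insert|update)\b",  # piggy-backed statement
    r"--",                                 # comment truncating the rest of the query
    r"union\s+select",                     # union-based data extraction
]

def check_input(user_input: str) -> bool:
    """Return True if the input matches a known attack signature."""
    lowered = user_input.lower()
    return any(re.search(pattern, lowered) for pattern in KNOWN_ATTACK_PATTERNS)

if __name__ == "__main__":
    print(check_input("alice"))                     # False: benign input
    print(check_input("' OR '1'='1' --"))           # True: tautology attack
    print(check_input("x'; DROP TABLE users; --"))  # True: piggy-backed query
```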
|
252 |
Study on Telemetry Data Authentication Protocol in Arms Control Verification. Qiang, Huang, Fan, Yang, 10 1900 (links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Arms control verification activities are carried out between countries, so various telemetry data are remote-transmitted over public channels and can easily be tampered with. In order to ensure the authenticity and integrity of these data, this paper establishes a Multi-layer Data Authentication Protocol (MDAP) in which the key cryptographic technologies are digital signatures and authentication. An overall evaluation of MDAP is also presented, and we prove that MDAP is secure.
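The mechanism MDAP builds on, digitally signing each telemetry frame so the receiving party can detect tampering on the public channel, can be sketched as below. The choice of Ed25519 and the frame format are assumptions for illustration only; the paper's actual multi-layer protocol and algorithms are not given in the abstract.

```python
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# The sender (monitored party) signs each telemetry frame before transmission;
# the receiver (verifying party) checks the signature so that any tampering on
# the public channel is detected.
sender_key = ed25519.Ed25519PrivateKey.generate()
receiver_public_key = sender_key.public_key()

frame = b"test-event|1999-10-25T12:00:00Z|altitude=1042m|velocity=310m/s"
signature = sender_key.sign(frame)

try:
    receiver_public_key.verify(signature, frame)
    print("frame authentic")
except InvalidSignature:
    print("frame rejected: tampered or forged")

# A tampered frame fails verification.
try:
    receiver_public_key.verify(signature, frame.replace(b"1042", b"9999"))
except InvalidSignature:
    print("tampered frame rejected")
```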
|
253 |
Exploring challenges in a verification process - when adapting production processes to new environmental requirements. Ahvenlampi Svensson, Amanda, January 2016 (links)
The requirements on products and production processes within the manufacturing industry are continuously increasing in line with environmental standards. The new requirements stem from a growing awareness of what our planet can provide, for example through the global challenge of climate change, and industry needs to reduce energy consumption and waste to meet them. One of the processes with the highest environmental impact in discrete manufacturing is the paint shop, and surface treatment is also of great importance for maintaining a high-quality product. In the scientific literature, technological risk is identified as one of the barriers to implementing environmentally conscious manufacturing. The area of sustainable operations management therefore needs to build bridges with other functions and disciplines, such as economics, strategy and the behavioural sciences, in order to manage these transitions. The competence around paint shops is today usually supplied by suppliers and other sources within the industry, and making this collaboration work is essential. In this collaboration with external sources, substantial measurement is required to maintain the desired quality. In order to ensure that product and process quality can be verified when switching technology at a pre-treatment line, this report sets out to explore which challenges need to be taken into consideration when assuring product and process quality. To answer this question, a multiple case study was conducted during spring 2016, where the phenomenon studied is the change process and the unit of analysis is the challenges that can be faced during the verification process. The cases studied are automotive companies located in Sweden that produce components for heavy-duty vehicles. Data collection was performed through document studies, participatory observations and semi-structured interviews. The results give academia insight into the challenges that occur during the verification process of implementing new and cleaner technologies. The conclusions are drawn from the literature and the empirical results. The managerial implication is to increase awareness of potential barriers in the verification process in order to be prepared for managing the technological change process.
|
254 |
Automated program generation: bridging the gap between model and implementation. Bezuidenhout, Johannes Abraham, 02 1900 (links)
Thesis (MSc)--University of Stellenbosch, 2007. / ENGLISH ABSTRACT: The general goal of this thesis is the investigation of a technique that allows model checking to be directly integrated into the software development process, preserving the benefits of model checking while addressing some of its limitations. A technique was developed that allows a complete executable implementation to be generated from an enhanced model specification. This included the development of a program, the Generator, that completely automates the generation process. In addition, it is illustrated how structuring the specification as a transition system formally separates the control flow from the details of manipulating data. This simplifies the verification process, which is focused on checking control flow in detail. By combining this structuring approach with automated implementation generation we ensure that the verified system behaviour is preserved in the actual implementation. An additional benefit is that data manipulation, which is generally not suited to model checking, is restricted to separate, independent code fragments that can be verified using verification techniques for sequential programs. These data manipulation code segments can also be optimised for the implementation without affecting the verification of the control structure. This technique was used to develop a reactive system, an FTP server, and this experiment illustrated that efficient code can be automatically generated while preserving the benefits of model checking. / AFRIKAANSE OPSOMMING: This thesis investigates a technique that makes model checking part of the software development process, thereby improving reliability while addressing certain shortcomings of the traditional model checking process. The technique developed makes it possible to generate a complete executable implementation from a specialised model specification. To fully automate the implementation-generation step, a program, the Generator, was developed. It is also shown how the control flow can be formally separated from data manipulation by using a transition-system structuring approach. This simplifies the verification process, which focuses on control flow. By combining this structuring approach with automatic implementation generation, it is ensured that the behaviour of the verified system is preserved in the final implementation. An additional benefit is that data manipulation, which is usually not suited to model checking, is restricted to separate, independent code segments that can be verified using verification techniques for sequential programs. These data manipulation code segments can also be optimised for the implementation without affecting the verification of the control structure. This technique is used to develop a reactive system, an FTP server, and the experiment shows that efficient code can be generated automatically while the benefits of model checking are preserved.
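The structuring idea described in both abstracts, control flow expressed as a transition system with data manipulation confined to independent code fragments, can be sketched as follows. This is an illustrative toy in Python, not the thesis's Generator or its FTP server; every name in it is an assumption.

```python
# Data-manipulation fragments: independent, sequential, individually verifiable.
def read_credentials(state, event):
    state["user"] = event["user"]

def store_file(state, event):
    state["files"].append(event["name"])

# Control flow as a transition system, the part suited to model checking:
# (location, event kind) -> (data action, next location)
TRANSITIONS = {
    ("idle",      "login"): (read_credentials, "logged_in"),
    ("logged_in", "put"):   (store_file,       "logged_in"),
    ("logged_in", "quit"):  (None,             "idle"),
}

def controller(events):
    """Executable controller derived directly from the transition table."""
    location, state = "idle", {"user": None, "files": []}
    for event in events:
        action, location = TRANSITIONS[(location, event["kind"])]
        if action is not None:
            action(state, event)
    return location, state

print(controller([
    {"kind": "login", "user": "anna"},
    {"kind": "put", "name": "report.txt"},
    {"kind": "quit"},
]))
# -> ('idle', {'user': 'anna', 'files': ['report.txt']})
```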
|
255 |
Scaling scope bounded checking using incremental approaches. Gopinath, Divya, 28 October 2010 (links)
Bounded verification is an effective technique for finding subtle bugs in object-oriented programs. Given a program, its correctness specification and bounds on the input domain size, scope bounded checking translates bounded code segments into formulas in boolean logic and uses off-the-shelf satisfiability solvers to search for correctness violations. However, scalability is a key issue for the technique, since for non-trivial programs the formulas are often complex and can choke the solvers. This thesis describes approaches that aim to scale scope bounded checking by utilizing syntactic and semantic information from the code to split a program into sub-programs that can be checked incrementally. It presents a thorough evaluation of the approaches and compares their performance with existing bounded verification techniques. Novel ideas for future work, specifically a specification-slicing-driven splitting approach, are proposed to further improve the scalability of bounded verification.
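The translation step described above can be illustrated with a minimal sketch. As an assumption for readability it uses the Z3 SMT solver through its Python API rather than a boolean encoding fed to a plain SAT solver: a small program fragment and its specification are encoded symbolically, the input domain is bounded, and the solver is asked for a counterexample.

```python
from z3 import Int, Solver, If, And, Or, Not, sat

# Program fragment under test, encoded symbolically: a (buggy) max of two ints
# that mishandles equal inputs by returning y - 1.
def max_symbolic(x, y):
    return If(x > y, x, If(x < y, y, y - 1))

x, y = Int("x"), Int("y")
result = max_symbolic(x, y)

# Correctness specification: the result is >= both inputs and equals one of them.
spec = And(result >= x, result >= y, Or(result == x, result == y))

# Bound on the input domain size, as in scope bounded checking.
scope = And(-3 <= x, x <= 3, -3 <= y, y <= 3)

solver = Solver()
solver.add(scope, Not(spec))          # search for a violation within the scope
if solver.check() == sat:
    print("counterexample:", solver.model())   # e.g. x = 0, y = 0
else:
    print("no violation within the given scope")
```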
|
256 |
Algorithmic Analysis of Complex Semantics for Timed and Hybrid Automata. Doyen, Laurent, 13 June 2006 (links)
In the field of formal verification of real-time systems, major developments have been recorded in the last fifteen years, covering logics, automata, process algebras, programming languages, and more. From the beginning, one formalism has played an important role: timed automata and their natural extension, hybrid automata. These models allow the definition of real-time constraints using real-valued clocks or, more generally, analog variables whose evolution is governed by differential equations. They generalize finite automata in that their semantics defines timed words, where each symbol is associated with an occurrence timestamp.
The decidability and algorithmic analysis of timed and hybrid automata have been intensively studied in the literature. The central result for timed automata is that they are positively decidable. This is not the case for hybrid automata, but semi-algorithmic methods are known when the dynamics is relatively simple, namely a linear relation between the derivatives of the variables.
With the increasing complexity of today's systems, however, these models in their classical semantics are limited for modelling realistic implementations or dynamical systems.
In this thesis, we study the algorithmics of complex semantics for timed and hybrid automata.
On the one hand, we propose implementable semantics for timed automata and study their computational properties: in contrast with other work, we identify a semantics that is both implementable and has decidable properties.
On the other hand, we give new algorithmic approaches to the analysis of hybrid automata whose dynamics is given by an affine function of its variables.
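The notions of clock guards and timed words used above can be made concrete with a small sketch. The light-switch automaton below is a generic textbook-style example rather than one from the thesis, and the Python encoding is an illustrative assumption.

```python
from dataclasses import dataclass

# A minimal timed automaton: one clock, locations "off", "on" and "bright".
# Pressing the switch in "off" resets the clock; pressing again in "on" within
# 2 time units yields "bright", otherwise the light goes back to "off".
@dataclass
class Edge:
    source: str
    label: str
    guard: callable     # clock value -> bool
    reset: bool
    target: str

EDGES = [
    Edge("off", "press", lambda c: True,   True,  "on"),
    Edge("on",  "press", lambda c: c <= 2, False, "bright"),
    Edge("on",  "press", lambda c: c > 2,  False, "off"),
]

def run(timed_word):
    """Execute a timed word [(delay, action), ...]; return the final location or None."""
    location, clock = "off", 0.0
    for delay, action in timed_word:
        clock += delay                       # time elapses before the action occurs
        for e in EDGES:
            if e.source == location and e.label == action and e.guard(clock):
                location = e.target
                if e.reset:
                    clock = 0.0
                break
        else:
            return None                      # no enabled edge: the timed word is rejected
    return location

print(run([(5.0, "press"), (1.5, "press")]))   # -> 'bright' (second press within 2 time units)
print(run([(5.0, "press"), (3.0, "press")]))   # -> 'off'    (second press too late)
```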
|
257 |
Small model theorems for data independent systems in Alloy. Momtahan, Lee, January 2007 (links)
A system is data independent in a type T if the only operations allowed on variables of type T are input, output, assignment and equality testing. This property can be exploited to give procedures for the automatic verification of such systems independently of the instance of the type T. Alloy is an extension of first-order logic for modelling software systems. Alloy has a fully automatic analyzer which attempts to refute Alloy formulas by searching for counterexamples within a finite scope. However, failure to find a counterexample does not prove the formula correct. A small model theorem is a theorem which shows that if a formula has a model then it has a model within some finite scope. The contribution of this thesis is to give a small model theorem which applies when modelling data-independent systems in Alloy. The theorem allows one to detect automatically whether an Alloy formula is data independent in some type T and then calculate a threshold scope for T, thereby completing the analysis of the automatic analyzer with respect to the type T. We derive the small model theorem using a model-theoretic approach. We build on the standard semantics of the Alloy language and introduce a more abstract interpretation of formulas, by way of a Galois insertion. This more abstract interpretation gives the same truth value as the original interpretation for many formulas. Indeed we show that this property holds for any formula built with a limited set of language constructors which we call data-independent constructors. The more abstract interpretation is designed so that it often lies within a finite scope and we can calculate whether this is the case and exactly how big the finite scope need be from the types of the free variables in the formula. In this way we can show that if a formula has any instance or counterexample at all then it has one within a threshold scope, the size of which we can calculate.
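Data independence in a type T has a simple operational reading: since a data-independent system only copies and equality-tests values of T, consistently renaming those values cannot change its observable behaviour, which is why a small threshold scope suffices. The brute-force check below illustrates that reading on a toy process; it is a sketch, not the thesis's Galois-insertion construction or Alloy's analyzer.

```python
from itertools import permutations

def duplicate_filter(trace):
    """A data-independent process over type T: it only stores values of T,
    compares them for equality, and outputs each value the first time it is seen."""
    seen, out = [], []
    for v in trace:
        if v not in seen:       # equality testing is the only operation on T
            seen.append(v)
            out.append(v)
    return out

def renamed(trace, mapping):
    return [mapping[v] for v in trace]

# Data independence implies: for every bijection f on T,
#   process(f(trace)) == f(process(trace)).
atoms = ["a", "b", "c"]
trace = ["a", "b", "a", "c", "b"]
for perm in permutations(atoms):
    f = dict(zip(atoms, perm))
    assert duplicate_filter(renamed(trace, f)) == renamed(duplicate_filter(trace), f)
print("duplicate_filter is invariant under renaming of T:", duplicate_filter(trace))
```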
|
258 |
Delivery, installation, on-sky verification of the Hobby Eberly Telescope wide field corrector. Lee, Hanshin, Hill, Gary J., Good, John M., Vattiat, Brian L., Shetrone, Matthew, Kriel, Herman, Martin, Jerry, Schroeder, Emily, Oh, Chang Jin, Frater, Eric, Smith, Bryan, Burge, James H., 08 August 2016 (links)
The Hobby-Eberly Telescope (HET), located in West Texas at the McDonald Observatory, operates with a fixed segmented primary (M1) and has a tracker, which moves the prime-focus corrector and instrument package to track the sidereal and non-sidereal motions of objects. We have completed a major multi-year upgrade of the HET that has substantially increased the pupil size to 10 meters and the field of view to 22 arcminutes by deploying the new Wide Field Corrector (WFC), new tracker system, and new Prime Focus Instrument Package (PFIP). The focus of this paper is on the delivery, installation, and on-sky verification of the WFC. We summarize the technical challenges encountered during the construction of the system and the resolutions adopted to overcome them. We then detail the transportation from Tucson to the HET, on-site ground verification test results, post-installation static alignment among the WFC, PFIP, and M1, and on-sky verification of alignment and image quality by deploying multiple wavefront sensors across the 22 arcminute field of view. The new wide field HET will feed the revolutionary new integral field spectrograph called VIRUS, in support of the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), a new low resolution spectrograph (LRS2), an upgraded high resolution spectrograph (HRS2), and later the Habitable Zone Planet Finder (HPF).
|
259 |
JWST science instrument pupil alignment measurements. Kubalak, Dave, Sullivan, Joe, Ohl, Ray, Antonille, Scott, Beaton, Alexander, Coulter, Phillip, Hartig, George, Kelly, Doug, Lee, David, Maszkiewicz, Michael, Schweiger, Paul, Telfer, Randal, Te Plate, Maurice, Wells, Martyn, 27 September 2016 (links)
NASA's James Webb Space Telescope (JWST) is a 6.5 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (~40 K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI), including a guider. OSIM is a full-field, cryogenic, optical simulator of the JWST OTE. It is the "Master Tool" for verifying the cryogenic alignment and optical performance of ISIM by providing simulated point source/star images to each of the four science instruments in ISIM. Included in OSIM is a Pupil Imaging Module (PIM), a large-format CCD used for measuring pupil alignment. Located at a virtual stop location within OSIM, the PIM records superimposed shadow images of pupil alignment reference (PAR) targets located in the OSIM and SI pupils. The OSIM Pupil Imaging Module was described by Brent Bos et al. at SPIE in 2011, prior to ISIM testing. We have recently completed the third and final ISIM cryogenic performance verification test before ISIM was integrated with the OTE. In this paper, we describe PIM implementation, performance, and measurement results.
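The measurement principle mentioned above, comparing the positions of superimposed PAR shadow images recorded at a common pupil plane, can be sketched numerically. The centroiding approach and the shear-as-fraction-of-pupil-diameter metric below are illustrative assumptions, not the actual JWST pupil metrology pipeline.

```python
import numpy as np

def centroid(image):
    """Intensity-weighted centroid (row, col) of a shadow image."""
    rows, cols = np.indices(image.shape)
    total = image.sum()
    return np.array([(rows * image).sum() / total, (cols * image).sum() / total])

def pupil_shear(osim_par_image, si_par_image, pupil_diameter_px):
    """Pupil shear of the SI pupil relative to the OSIM pupil, expressed as a
    fraction of the pupil diameter, estimated from the offset of the two PAR shadows."""
    offset = centroid(si_par_image) - centroid(osim_par_image)
    return np.linalg.norm(offset) / pupil_diameter_px

# Toy example: two synthetic circular shadows, the SI one displaced by 3 pixels.
yy, xx = np.indices((200, 200))
osim_shadow = ((yy - 100) ** 2 + (xx - 100) ** 2 < 20 ** 2).astype(float)
si_shadow = ((yy - 100) ** 2 + (xx - 103) ** 2 < 20 ** 2).astype(float)
print(f"pupil shear: {pupil_shear(osim_shadow, si_shadow, pupil_diameter_px=160):.2%}")
```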
|
260 |
The development of automatic and solar imaging techniques for the accurate detection, merging, verification and tracking of solar filaments. Atoum, Ibrahim Ali Ahmad, January 2012 (links)
Based on a study of existing solar filament detection and tracking methods, a fully automated solar filament detection and tracking method is presented. An adaptive thresholding technique is used in a segmentation phase to identify candidate filament pixels. This phase is followed by retrieving the actual filament area from a region-grown filament using statistical parameters and morphological operations. This detection technique gives the opportunity to develop an accurate spine extraction algorithm. Features including separation distance, orientation and average intensities are extracted and fed to a Neural Network (NN) classifier to merge broken filament components. Finally, the results for two consecutive images are compared to detect filament disappearance events, taking advantage of the maps resulting from converting solar images to Heliographic Carrington co-ordinates. The study has demonstrated the novelty of the algorithms developed in that they are all fully automated; significantly, and unlike previous techniques, the algorithms do not require any empirical values. This combination of features gives these methods the opportunity to work in real time. Comparisons with other researchers' work show that the present algorithms represent the filaments more accurately and run faster computationally, which could lead to more precise tracking in real time. An additional phase developed in this dissertation in the process of detecting solar filaments is the detection of filament disappearances. Some filaments and prominences end their life with eruptions; when this occurs, they disappear from the surface of the Sun within a few hours. Such events are known as disappearing filaments and are thought to be associated with coronal mass ejections (CMEs). Filament disappearances are generally monitored by observing and analysing successive solar H-alpha images. After filament regions are obtained from individual H-alpha images, an NN classifier is used to categorize the detected filaments as Disappeared Filaments (DFs) or Miss-Detected Filaments (MDFs). Features such as Area, Length, Mean, Standard Deviation, Skewness and Kurtosis are extracted and fed to this neural network, which achieves a confidence level of at least 80%. Comparing the results with those of other researchers shows high divergence between the results; the NN method shows better convergence with the results of the National Geophysical Data Centre (NGDC) than the results of the other researchers.
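The segmentation phase can be illustrated with a minimal local adaptive-thresholding sketch, assuming (purely for illustration) a mean-minus-k-standard-deviations rule over a sliding window; the thesis's actual statistical parameters and morphological post-processing are not reproduced here.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def candidate_filament_pixels(image, window=51, k=1.5):
    """Flag pixels significantly darker than their local neighbourhood.
    Filaments appear as dark, elongated absorption features in H-alpha images,
    so a pixel is a candidate when it falls below (local mean - k * local std)."""
    local_mean = uniform_filter(image, size=window)
    local_sq_mean = uniform_filter(image ** 2, size=window)
    local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 0.0))
    return image < (local_mean - k * local_std)

# Toy example: a bright disc with a dark, filament-like streak across it.
yy, xx = np.indices((256, 256))
disc = 1000.0 * ((yy - 128) ** 2 + (xx - 128) ** 2 < 120 ** 2)
disc[120:124, 60:200] *= 0.6          # darken a thin horizontal band
mask = candidate_filament_pixels(disc, window=31, k=1.0)
print("candidate pixels found:", int(mask.sum()))
```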
|