421. Towards Light-Weight Probabilistic Model Checking
Konur, Savas, 03 June 2014 (has links)
Yes / Model checking has been extensively used to verify various systems. However, this has usually been done by experts who have a good understanding of model checking and who are familiar with the syntax of both the modelling and property specification languages. Unfortunately, it is not easy for nonexperts to learn description languages for modelling and formal logics/languages for property specification. In particular, property specification is very daunting and error-prone for nonexperts. In this paper, we present a methodology to facilitate probabilistic model checking for nonexperts. The methodology helps nonexpert users model their systems and express their requirements without any knowledge of the modelling and property specification languages.
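The abstract does not include code; purely as a hedged illustration of the kind of template-based property generation such a methodology relies on, the sketch below maps a structured natural-language requirement to a PCTL-style probabilistic property string. The pattern name, fields and the `to_pctl` helper are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not from the paper): turning a structured natural-language
# requirement into a PCTL-like probabilistic property string.
# The template name and fields below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ProbabilisticResponse:
    """'If <trigger> occurs, then <response> follows within <bound> steps
    with probability at least <p>.'"""
    trigger: str      # atomic proposition, e.g. "request"
    response: str     # atomic proposition, e.g. "grant"
    bound: int        # step bound
    p: float          # probability threshold

    def to_pctl(self) -> str:
        # Bounded probabilistic response, roughly in PRISM/PCTL style:
        #   "trigger" => P>=p [ F<=bound "response" ]
        return f'"{self.trigger}" => P>={self.p} [ F<={self.bound} "{self.response}" ]'


if __name__ == "__main__":
    req = ProbabilisticResponse(trigger="request", response="grant", bound=10, p=0.95)
    print(req.to_pctl())   # "request" => P>=0.95 [ F<=10 "grant" ]
```

The point of such a mapping is that a nonexpert only fills in the template slots; the formal property syntax is generated behind the scenes.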
422. kPWorkbench: A software suit for membrane systems
Konur, Savas, Mierla, L.M., Ipate, F., Gheorghe, Marian, 29 January 2020 (links)
Yes / Membrane computing is a new natural computing paradigm inspired by the functioning and structure of biological cells, and has been successfully applied to many different areas, from biology to engineering. In this paper, we present kPWorkbench, a software framework developed to support membrane computing and its applications. kPWorkbench offers unique features, including modelling, simulation, agent-based high-performance simulation and verification, which allow the modelling and computational analysis of membrane systems. The kPWorkbench formal verification component provides the opportunity to analyse the behaviour of a model and to validate that important system requirements are met and certain behaviours are observed. The platform also features a property language based on natural language statements to facilitate property specification. / EPSRC
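kPWorkbench itself is a full framework; purely as an illustrative sketch of the computational idea such tools automate (membranes holding multisets of objects, rewritten by rules that may carry guards), the code below is a toy multiset-rewriting loop. It is not kPWorkbench code; the class names, the sequential rule semantics and the example rules are assumptions for illustration only.

```python
# Minimal sketch of multiset rewriting, the computational core of membrane
# (P) systems. NOT kPWorkbench code; names and semantics are simplified.
from collections import Counter

class Rule:
    def __init__(self, lhs, rhs, guard=None):
        self.lhs = Counter(lhs)          # objects consumed, e.g. "aa"
        self.rhs = Counter(rhs)          # objects produced, e.g. "c"
        self.guard = guard               # optional predicate over membrane contents

    def applicable(self, contents):
        enough = all(contents[o] >= n for o, n in self.lhs.items())
        return enough and (self.guard is None or self.guard(contents))

    def apply(self, contents):
        contents.subtract(self.lhs)
        contents.update(self.rhs)

def step(contents, rules):
    """Apply the first applicable rule once (sequential semantics for brevity;
    real P systems typically apply rules in a maximally parallel way)."""
    for rule in rules:
        if rule.applicable(contents):
            rule.apply(contents)
            return True
    return False

membrane = Counter("aaab")
rules = [
    Rule("aa", "c"),
    Rule("cb", "d", guard=lambda m: m["a"] < 2),   # guarded rule: fires only when few 'a's remain
]
while step(membrane, rules):
    pass
print(dict(+membrane))   # {'a': 1, 'd': 1}
```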
423. Improving Processes Using Static Analysis Techniques
Chen, Bin, 01 February 2011 (links)
Real-world processes often undergo improvements to meet certain goals, such as coping with changed requirements, eliminating defects, improving the quality of the products, and reducing costs. Identifying and evaluating the defects or errors in the process, identifying the causes of such defects, and validating proposed improvements all require careful analysis of the process.

Human-intensive processes, where human contributions require considerable domain expertise and have a significant impact on the success or failure of the overall mission, are of particular concern because they can be extremely complex and may be used in critical, including life-critical, situations. To date, analysis support for such processes is very limited. If done at all, analysis is usually performed manually and can be extremely time-consuming, costly and error-prone.

There has been considerable success lately in using static analysis techniques to analyze hardware systems, software systems, and manufacturing processes. This thesis explores how such analysis techniques can be automated and employed to effectively analyze life-critical, human-intensive processes. In this thesis, we investigated two static analysis techniques: Finite-State Verification (FSV) and Fault Tree Analysis (FTA). We proposed a process analysis framework that is capable of performing both FSV and FTA on rigorously defined processes. Although evaluated for processes specified in the Little-JIL process definition language, this is a general framework independent of the process definition language. For FSV, we developed a translation-based approach that is able to take advantage of existing FSV tools. The process definition and the property to be evaluated are translated into the input model and property representation accepted by the selected FSV tool. The FSV tool is then executed to verify the model against the property representation. For FTA, we developed a template-based approach to automatically derive fault trees from the process definition. In addition to showing the feasibility of applying these two techniques to processes, much effort has been put into improving the scalability and usability of the framework so that it can be easily used to analyze complex real-world processes. To scale the analysis, we investigated several optimizations that dramatically reduce the translated models for the FSV tools and speed up verification. We also developed several optimizations for the fault tree derivation that make the generated fault trees much more compact and easier to understand and analyze. To improve usability, we provided several approaches that make analysis results easier to understand. We evaluated this framework, based on the Little-JIL process definition language, and employed it to analyze two real-world, human-intensive processes: an in-patient blood transfusion process and a chemotherapy process. The results show that the framework can be used effectively to detect defects in such real-world, human-intensive processes.
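The thesis describes its framework only at a conceptual level here; as a hedged illustration of what fault tree analysis computes, the sketch below (not the thesis tooling, and not Little-JIL-specific) builds a tiny fault tree from AND/OR gates over basic events and enumerates its minimal cut sets by brute force. The event names are invented for the example.

```python
# Illustrative sketch of fault tree analysis (FTA): a small AND/OR fault tree
# and brute-force enumeration of its minimal cut sets.
# Not the thesis framework; event names are invented.
from itertools import combinations

class Event:
    def __init__(self, name):
        self.name = name
    def occurs(self, failed):
        return self.name in failed
    def basic_events(self):
        return {self.name}

class Gate:
    def __init__(self, kind, children):   # kind: "AND" or "OR"
        self.kind, self.children = kind, children
    def occurs(self, failed):
        results = [c.occurs(failed) for c in self.children]
        return all(results) if self.kind == "AND" else any(results)
    def basic_events(self):
        return set().union(*(c.basic_events() for c in self.children))

def minimal_cut_sets(top):
    """Smallest sets of basic events whose joint failure triggers the top
    event (brute force; fine for small trees)."""
    events = sorted(top.basic_events())
    cuts = []
    for size in range(1, len(events) + 1):
        for combo in combinations(events, size):
            s = set(combo)
            if top.occurs(s) and not any(c <= s for c in cuts):
                cuts.append(s)
    return cuts

# Hypothetical top event: "wrong blood administered" requires the bedside
# check to be skipped AND either the label misread or the ID band missing.
top = Gate("AND", [Event("check_skipped"),
                   Gate("OR", [Event("label_misread"), Event("id_band_missing")])])
print(minimal_cut_sets(top))
# Two minimal cut sets: {check_skipped, id_band_missing} and
# {check_skipped, label_misread}.
```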
424. Symboleo: Specification and Verification of Legal Contracts
Parvizimosaed, Alireza, 21 October 2022 (links)
Contracts are legally binding and enforceable agreements among two or more parties that govern social interactions. They have been used for millennia, including in commercial transactions, employment relationships and intellectual property generation. Each contract determines obligations and powers of contracting parties. The execution of a contract needs to be continuously monitored to ensure compliance with its terms and conditions. Smart contracts are software systems that monitor and control the execution of contracts to ensure compliance. But for such software systems to become possible, contracts need to be specified precisely to eliminate ambiguities, contradictions, and missing clauses. This thesis proposes a formal specification language for contracts named Symboleo. The ontology of Symboleo is founded on the legal concepts of obligation (a kind of duty) and power (a kind of right), complemented with the concepts of event and situation that are suitable for conceptualizing monitoring tasks. The formal semantics of legal concepts is defined in terms of state machines that describe the lifetimes of contracts, obligations, and powers, as well as axioms that precisely describe state transitions. The language supports execution-time operations that enable subcontracting, assignment of rights and substitution of performance to a third party during the execution of a contract. Symboleo has been applied to the formalization of contracts from three different domains as a preliminary evaluation of its expressiveness. Formal specifications can be algorithmically analyzed to ensure that they satisfy desired properties. Towards this end, the thesis presents two implemented analysis tools. One is a conformance checking tool (SymboleoPC) that ensures that a specification is consistent with the expectations of contracting parties. Expectations are defined for this tool in terms of scenarios (sequences of events) and the expected final outcome (i.e., successful/unsuccessful execution). The other tool (SymboleoPC), which builds on top of an existing model checker (nuXmv), can prove/disprove desired properties of a contract, expressed in temporal logic. These tools have been used for assessing different business contracts. SymboleoPC is also assessed in terms of performance and scalability, with positive results. Symboleo, together with its associated tools, is envisioned as an enabler for the formal verification of contracts to address requirements-level issues at design time.
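Symboleo's formal semantics is given as state machines for contracts, obligations and powers. As a rough, non-authoritative sketch of that idea, the code below models an obligation lifecycle and a few triggering events; the state and event names are simplified assumptions and do not reproduce the thesis's exact axiomatization.

```python
# Rough sketch of an obligation lifecycle state machine, in the spirit of
# Symboleo's semantics. States and transitions are simplified assumptions.
from enum import Enum, auto

class ObligationState(Enum):
    CREATED = auto()
    IN_EFFECT = auto()
    FULFILLED = auto()
    VIOLATED = auto()
    DISCHARGED = auto()

# (current state, event) -> next state
TRANSITIONS = {
    (ObligationState.CREATED,   "triggered"):  ObligationState.IN_EFFECT,
    (ObligationState.CREATED,   "discharged"): ObligationState.DISCHARGED,
    (ObligationState.IN_EFFECT, "fulfilled"):  ObligationState.FULFILLED,
    (ObligationState.IN_EFFECT, "violated"):   ObligationState.VIOLATED,
}

class Obligation:
    def __init__(self, debtor, creditor, consequent):
        self.debtor, self.creditor, self.consequent = debtor, creditor, consequent
        self.state = ObligationState.CREATED

    def on_event(self, event):
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is None:
            raise ValueError(f"event '{event}' not allowed in state {self.state.name}")
        self.state = nxt
        return self.state

# Example: a delivery obligation that is triggered and then fulfilled.
ob = Obligation(debtor="Seller", creditor="Buyer",
                consequent="deliver goods within 30 days")
ob.on_event("triggered")
print(ob.on_event("fulfilled"))   # ObligationState.FULFILLED
```

A monitor built on such machines rejects out-of-order events (e.g. "fulfilled" before "triggered"), which is the kind of compliance checking the thesis formalizes.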
425. The development of automatic and solar imaging techniques for the accurate detection, merging, verification and tracking of solar filaments
Atoum, Ibrahim A.A., January 2012 (links)
Based on a study of existing solar filament detection and tracking methods, a fully automated solar filament detection and tracking method is presented. An adaptive thresholding technique is used in a segmentation phase to identify candidate filament pixels. This phase is followed by retrieving the actual filament area from a region-grown filament using statistical parameters and morphological operations. This detection technique gives the opportunity to develop an accurate spine extraction algorithm. Features including separation distance, orientation and average intensities are extracted and fed to a Neural Network (NN) classifier to merge broken filament components. Finally, the results for two consecutive images are compared to detect filament disappearance events, taking advantage of the maps resulting from converting solar images to Heliographic Carrington co-ordinates.

The study has demonstrated the novelty of the algorithms developed in terms of them all being fully automated; significantly, unlike previous techniques, the algorithms do not require any empirical values to be used whatsoever. This combination of features gives these methods the opportunity to work in real time. Comparisons with other researchers' methods show that the present algorithms represent the filaments more accurately and are computationally faster to evaluate, which could lead to a more precise tracking practice in real time.

An additional development in this dissertation, within the process of detecting solar filaments, is the detection of filament disappearances. Some filaments and prominences end their life with eruptions. When this occurs, they disappear from the surface of the Sun within a few hours. Such events are known as disappearing filaments and it is thought that they are associated with coronal mass ejections (CMEs). Filament disappearances are generally monitored by observing and analysing successive solar H-alpha images. After filament regions are obtained from individual H-alpha images, an NN classifier is used to categorize the detected filaments as Disappeared Filaments (DFs) or Miss-Detected Filaments (MDFs). Features such as Area, Length, Mean, Standard Deviation, Skewness and Kurtosis are extracted and fed to this neural network, which achieves a confidence level of at least 80%. Comparing the results with those of other researchers shows high divergence between the results. The NN method shows better convergence with the results of the National Geophysical Data Centre (NGDC) than with the results of the other researchers.
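The dissertation describes its image-processing pipeline in prose only. As a hedged sketch of the first stage it mentions (adaptive, locally computed thresholding of an H-alpha image followed by a morphological clean-up), the code below uses generic numpy/scipy operations; it is an illustration under assumed window and offset parameters, not the author's algorithm.

```python
# Illustrative sketch of adaptive thresholding + morphological clean-up for
# candidate filament pixels in an H-alpha image. Not the dissertation's code;
# the window size and offset are assumed values for demonstration.
import numpy as np
from scipy import ndimage

def candidate_filament_mask(image, window=51, offset=0.05):
    """Mark pixels significantly darker than their local neighbourhood
    (filaments appear as dark, elongated features in H-alpha images)."""
    local_mean = ndimage.uniform_filter(image.astype(float), size=window)
    mask = image < local_mean * (1.0 - offset)          # locally dark pixels
    # Morphological clean-up: drop isolated pixels, then bridge small gaps.
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    mask = ndimage.binary_closing(mask, structure=np.ones((3, 3)))
    return mask

if __name__ == "__main__":
    # Synthetic stand-in for a full-disk H-alpha image: a bright disk region
    # with one dark, elongated "filament".
    img = np.full((256, 256), 0.8)
    img[100:104, 60:200] = 0.3
    mask = candidate_filament_mask(img)
    labels, n = ndimage.label(mask)
    print(f"{n} candidate filament region(s) detected")   # expects 1
```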
426. Testing and Integration of Machine Learning Components for Image Classification : Testning och integration av machine learning komponenter för bildklassificering
Hanash, Ahmad, January 2023 (links)
As ML (Machine Learning) and deep neural networks are used in more and more systems, the need to understand and test such systems becomes more pressing. When designing a new system that contains ML models, the safety of this system inevitably becomes important. This raises the need to discuss a strategy for dealing with the potential problems and weaknesses in such a system. This thesis provides findings from the literature and illustrates the potential strategies in the area of image recognition in a comprehensive way. Lastly, the result presented in this thesis shows that using an ML component in a complex software system with high safety requirements requires adopting software methodologies, such as MLOps (Machine Learning Operations), to monitor such a system, and gives suggestions on how to test and verify an ML model integrated into a larger software system.
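The thesis is a literature-based study and does not prescribe specific test code. Purely as a hedged illustration of one testing strategy commonly discussed in this area, the sketch below shows a metamorphic robustness check on an image classifier, runnable against any component exposing a `predict(image)` function; the `StubClassifier` and all other names are assumptions standing in for a real integrated model.

```python
# Hedged sketch of a metamorphic robustness test for an image-classification
# component: the predicted class should not change under a small brightness
# shift. The StubClassifier stands in for whatever model the system integrates.
import numpy as np

class StubClassifier:
    """Placeholder for a real ML component with a predict(image) -> label API."""
    def predict(self, image: np.ndarray) -> int:
        # Toy rule: class 1 if the image is on average "bright", else class 0.
        return int(image.mean() > 0.5)

def check_stable_under_small_brightness_shift(model, image, delta=0.02):
    """Metamorphic relation: a small, perceptually irrelevant brightness change
    should leave the predicted label unchanged."""
    original = model.predict(image)
    perturbed = np.clip(image + delta, 0.0, 1.0)
    assert model.predict(perturbed) == original, "prediction changed under tiny perturbation"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = StubClassifier()
    dark = rng.random((32, 32)) * 0.3            # clearly dark synthetic image
    bright = 0.7 + rng.random((32, 32)) * 0.3    # clearly bright synthetic image
    for img in (dark, bright):
        check_stable_under_small_brightness_shift(model, img)
    print("metamorphic robustness checks passed")
```

In an MLOps setting, checks of this kind would run in the integration pipeline alongside conventional unit tests, and on monitored production samples.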
427. Discriminating Triggers for Mandatory Declarations
Pearson, Graham S., 09 1900 (links)
Yes
428. The Strengthened BTWC Protocol: An Integrated Regime
Pearson, Graham S., 07 1900 (links)
Yes
429. Non-Compliance Concern Investigations: Initiation Procedures
Kenyon, I.R., 10 1900 (links)
Yes
430. Article VII Measures: Optimizing the Benefits
Pearson, Graham S., January 1999 (links)
Yes