431

Using Live Sequence Chart Specifications for Formal Verification

Kumar, Rahul 11 July 2008 (has links) (PDF)
Formal methods play an important part in both the development and testing stages of software and hardware systems. A significant and often overlooked part of the process is the development of specifications and correctness requirements for the system under test. Traditionally, English has been used as the specification language, which has resulted in verbose and difficult-to-use specification documents that are usually abandoned during product development. This research investigates the use of Live Sequence Charts (LSCs), a graphical and intuitive language directly suited to expressing the communication behaviors of a system, as the specification language for a system under test. The research presents two methods for using LSCs as a specification language: first, translating LSCs to temporal logic, and second, translating LSCs to an automaton structure directly suited for formal verification of systems. The research presents the translation for each method and then identifies the pros and cons of each verification method.
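One common informal reading of a universal LSC is "whenever the prechart is observed, the main chart must eventually follow", which maps naturally onto temporal logic. The sketch below renders that reading as a textual LTL-style formula; it illustrates the general idea only and is not the thesis's actual translation (the message names are invented):

```python
def lsc_to_ltl(prechart, main_chart):
    """Render the informal 'prechart implies main chart' reading of a
    universal LSC as a nested LTL-style formula string."""
    def seq(msgs):
        # A message sequence m1, m2, ... becomes F(m1 & F(m2 & ...)).
        if not msgs:
            return "true"
        head, *rest = msgs
        return f"F({head} & {seq(rest)})" if rest else f"F({head})"
    return f"G({' & '.join(prechart)} -> {seq(main_chart)})"

formula = lsc_to_ltl(["req"], ["grant", "ack"])
# formula == "G(req -> F(grant & F(ack)))"
```

A real translation must also handle hot/cold temperatures, simultaneity, and forbidden messages, which is where the automaton-based encoding mentioned above becomes attractive.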
432

Developing Guidelines for Including Mobility-Based Performance Specifications in Highway Construction Contracts

Larson, Shawn J. 17 December 2013 (has links) (PDF)
Construction zones can greatly affect the traffic flow on roadways, especially when lane closures are required. Traditionally, the Utah Department of Transportation (UDOT) has used traffic management specifications that only allow lane closures and road work during predetermined hours, or specifications that require a certain number of lanes to be open at all times. Recently, mobility-based work-zone traffic flow maintenance has been considered. This method requires continuous monitoring of mobility-based performance data and a mechanism to send alerts to the contractors when the mobility data does not meet the standards set by the specifications. UDOT recently tested mobility-based performance specifications at an urban arterial work zone and studied issues related to their implementation. Parallel to this experiment, UDOT funded a study to develop guidelines for implementing mobility-based performance specifications to manage traffic flow in work zones. Dynamically collecting mobility-based data such as travel time and speed is now feasible using technologies such as Bluetooth and microwave sensors. The core benefit of using mobility-based performance specifications is that they give the contractor more flexibility in construction work scheduling while maintaining an acceptable level of traffic flow. If the level of traffic flow is not maintained, the contractor is assessed a financial penalty, determined by the amount of time during which flow is not maintained at a predetermined condition. To discuss issues and develop guidelines, a task force was formed consisting of UDOT representatives, several representatives from the construction industry, and researchers from Brigham Young University. Through three task force meetings, a set of 12 guidelines was developed, including guidelines about when mobility-based performance specifications should be used and which mobility data should be used.
Some of the issues were difficult for the task force members to agree on, so a decision-making method, the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), was used to find the best approaches to the difficult issues associated with implementing mobility-based performance specifications in highway construction contracts. These guidelines should be reviewed as appropriate in the future as UDOT accumulates experience in using these types of specifications.
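TOPSIS itself is a standard multi-criteria ranking procedure: normalise the decision matrix, weight it, and score each alternative by its relative closeness to the ideal and anti-ideal solutions. A minimal generic implementation, not the task force's actual model or data, might look like:

```python
from math import sqrt

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix  : rows = alternatives, columns = criteria
    weights : one weight per criterion (assumed to sum to 1)
    benefit : True if higher is better for that criterion, else False
    Returns closeness scores in [0, 1]; higher = closer to ideal."""
    n_crit = len(weights)
    # Vector-normalise each column, then apply the weights.
    norms = [sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[w * x / nrm for x, w, nrm in zip(row, weights, norms)]
         for row in matrix]
    # Ideal best and worst value per criterion.
    best = [max(col) if b else min(col) for col, b in zip(zip(*v), benefit)]
    worst = [min(col) if b else max(col) for col, b in zip(zip(*v), benefit)]
    scores = []
    for row in v:
        d_best = sqrt(sum((x - b) ** 2 for x, b in zip(row, best)))
        d_worst = sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_worst / (d_best + d_worst))
    return scores
```

For example, `topsis([[1, 1], [2, 2]], [0.5, 0.5], [True, True])` scores the dominating second alternative at 1.0 and the first at 0.0. The sketch assumes no criterion column is all zeros and that the ideal and anti-ideal differ.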
433

A verified and optimized Stream X-Machine testing method, with application to cloud service certification

Simons, A.J.H., Lefticaru, Raluca 15 January 2020 (has links)
The Stream X-Machine (SXM) testing method provides strong and repeatable guarantees of functional correctness, up to a specification. These qualities make the method attractive for software certification, especially in the domain of brokered cloud services, where arbitrage seeks to substitute functionally equivalent services from alternative providers. However, practical obstacles include: the difficulty in providing a correct specification, the translation of abstract paths into feasible concrete tests, and the large size of generated test suites. We describe a novel SXM verification and testing method, which automatically checks specifications for completeness and determinism, prior to generating complete test suites with full grounding information. Three optimisation steps achieve up to a ten-fold reduction in the size of the test suite, removing infeasible and redundant tests. The method is backed by a set of tools to validate and verify the SXM specification, generate technology-agnostic test suites and ground these in SOAP, REST or rich-client service implementations. The method was initially validated using seven specifications, three cloud platforms and five grounding strategies. / European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 328392, the Broker@Cloud project [11].
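The completeness and determinism checks mentioned above can be illustrated on a plain transition table: a specification is complete when every (state, input) pair has a transition and deterministic when no pair has two. This is a generic sketch, not the authors' tooling (a real SXM also carries memory and processing functions); all names are invented:

```python
def check_spec(states, inputs, transitions):
    """Return (missing, duplicated) (state, input) pairs.
    transitions: iterable of ((state, input), next_state) entries.
    missing     -> completeness violations (no transition defined)
    duplicated  -> determinism violations (more than one transition)"""
    seen = {}
    duplicated = set()
    for (state, inp), _target in transitions:
        if (state, inp) in seen:
            duplicated.add((state, inp))
        seen[(state, inp)] = True
    missing = {(s, i) for s in states for i in inputs} - set(seen)
    return missing, duplicated
```

On a toy machine with states `idle`/`busy` and inputs `start`/`stop` that only defines `idle --start--> busy` and `busy --stop--> idle`, the check reports the two undefined pairs as completeness gaps and no determinism violations.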
434

Multi agent system for web database processing, on data extraction from online social networks.

Abdulrahman, Ruqayya January 2012 (has links)
In recent years, there has been a flood of continuously changing information from a variety of web resources such as web databases, web sites, web services and programs. Online Social Networks (OSNs) represent such a field, where huge amounts of information are being posted online over time. Due to the nature of OSNs, which offer a productive source for qualitative and quantitative personal information, researchers from various disciplines contribute to developing methods for extracting data from OSNs. However, there is limited research which addresses extracting data automatically. To the best of the author's knowledge, there is no research which focuses on tracking the real-time changes of information retrieved from OSN profiles over time, and this motivated the present work. This thesis presents different approaches for automated Data Extraction (DE) from OSNs: crawler, parser, Multi Agent System (MAS) and Application Programming Interface (API). Initially, a parser was implemented as a centralized system to traverse the OSN graph and extract the profile's attributes and list of friends from Myspace, the top OSN at that time, by parsing the Myspace profiles and extracting the relevant tokens from the parsed HTML source files. A Breadth First Search (BFS) algorithm was used to travel across the generated OSN friendship graph in order to select the next profile for parsing. The approach was implemented and tested on two types of friends: top friends and all friends. In the case of top friends, 500 seed profiles were visited; 298 public profiles were parsed to get 2197 top friends' profiles and 2747 friendship edges, while in the case of all friends, 250 public profiles were parsed to extract 10,196 friends' profiles and 17,223 friendship edges. This approach has two main limitations. The system is designed as a centralized system that controlled and retrieved information of each user's profile just once.
This means that the extraction process will stop if the system fails to process one of the profiles, either the seed profile (the first profile to be crawled) or its friends. To overcome this problem, an Online Social Network Retrieval System (OSNRS) is proposed to decentralize the DE process from OSNs through using MAS. The novelty of OSNRS is its ability to monitor profiles continuously over time. The second challenge is that the parser had to be modified to cope with changes in the profiles' structure. To overcome this problem, the proposed OSNRS is improved through use of an API tool to enable OSNRS agents to obtain the required fields of an OSN profile despite modifications in the representation of the profile's source web pages. The experimental work shows that using API and MAS simplifies and speeds up the process of tracking a profile's history. It also helps security personnel, parents, guardians, social workers and marketers in understanding the dynamic behaviour of OSN users. This thesis proposes solutions for web database processing on data extraction from OSNs by the use of parser and MAS and discusses the limitations and improvements. / Taibah University
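The breadth-first crawl described above can be sketched as a standard BFS over a friendship graph. The toy graph and the `fetch_friends` stand-in below are invented for illustration; the real system parsed Myspace profile pages or called an API at this point:

```python
from collections import deque

# Toy friendship graph standing in for live profile fetches.
TOY_GRAPH = {
    "seed": ["a", "b"],
    "a": ["seed", "c"],
    "b": ["c"],
    "c": [],
}

def bfs_crawl(seed, fetch_friends, limit=100):
    """Visit profiles breadth-first, collecting friendship edges.
    fetch_friends(profile) -> list of friend profile ids."""
    visited, edges = set(), []
    queue = deque([seed])
    while queue and len(visited) < limit:
        profile = queue.popleft()
        if profile in visited:
            continue                      # already parsed this profile
        visited.add(profile)
        for friend in fetch_friends(profile):
            edges.append((profile, friend))
            if friend not in visited:
                queue.append(friend)      # schedule friend for parsing
    return visited, edges

visited, edges = bfs_crawl("seed", TOY_GRAPH.get)
```

The centralized-system limitation the abstract notes is visible here: if `fetch_friends` raises on any profile, the whole loop stops, which is what motivates the decentralized multi-agent OSNRS design.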
435

Advancements in Dependability Analysis of Safety-Critical Systems : Addressing Specification Formulation and Verification Challenges / Framsteg inom tillförlitlighetsanalys av säkerhetskritiska system : Utmaningar inom specifikationsformulering och verifiering

Yu, Zelin January 2023 (has links)
Safety-critical systems have garnered increasing attention, particularly regarding their dependability analysis. Modern systems comprise numerous components, so it is crucial to verify that if the lower-level components adhere to their specifications, the overall system complies with its top-level specification. However, two issues arise in this verification process. Firstly, many industrial applications lack lower-level natural-language specifications for their components, relying solely on top-level specifications. Secondly, many current verification algorithms must explore the continuous-time evolution of the behavioural combinations of these components, and the number of combinations to be explored rises exponentially with the number of components. To address these challenges, this thesis makes two contributions. Firstly, it introduces a novel method that leverages the structure of redundancy systems to create natural-language specifications for components derived from a top-level specification. This approach facilitates a more efficient decomposition of the top-level specification, allowing greater ease in handling component behaviours. Secondly, the proposed method is successfully applied to Scania's brake system, leading to the decomposition of its top-level specification. To verify this decomposition, an existing verification algorithm is selected, and the method effectively addresses the exponential growth in component behaviour combinations noted above: for the Scania brake system, the number of combinations is reduced from 27 to 13, showcasing the improvement achieved with the new method.
436

Model Based System Consistency Checking Using Event-B

Xu, Hao 04 1900 (has links)
Formal methods such as Event-B are a widely used approach for developing critical systems. This thesis demonstrates that creating models and proving their consistency at the requirements level during software (system) development is an effective way to reduce the occurrence of faults and errors in a practical application. An insulin infusion pump (IIP) is a complicated and time-critical system. This thesis uses Event-B to specify models for an IIP, based on a draft requirements document developed by the US Food and Drug Administration (FDA). Consequently, it demonstrates that Event-B can be used effectively to detect missing properties, missing quantities, faults and errors at the requirements level of a system development. The IIP is an active and reactive time-control system. To handle timing issues in the IIP system, we extended an existing time pattern specified using Event-B to enrich the semantics of the Event-B language. We created several sets to model the activation times of different events; the union of these time sets defines a global time activation set. The tick of global time is specified as a progress tick event. All the actions in an event are triggered only when the global time in the time tick event matches the time specified in the event. When an event is triggered, the time point is deleted from the corresponding time set, but not from the global time set. A time point is deleted from the global time set only when there are no pending actions for that time point. Through discharging proof obligations using Event-B, we achieved our goal of improving the requirements document. / Master of Computer Science (MCS)
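The tick discipline described above (per-event time sets whose union forms the global activation set, with a time point leaving the global set only once no event is still pending at it) can be sketched outside Event-B as ordinary set operations. The event names and the Python rendering below are illustrative assumptions, not the thesis's Event-B model:

```python
# Each event owns a set of activation time points (names invented).
event_times = {"deliver_basal": {1, 2, 3}, "check_reservoir": {2, 4}}

def global_times(event_times):
    """The global time activation set is the union of all event time sets."""
    out = set()
    for times in event_times.values():
        out |= times
    return out

def fire(event_times, event, now):
    """Trigger an event at time 'now': remove 'now' from that event's
    own time set only, mirroring the per-event deletion in the pattern."""
    event_times[event].discard(now)

def tick(event_times, now):
    """The progress tick may advance past 'now' only when no event
    still has a pending action at that time point."""
    return all(now not in times for times in event_times.values())
```

With the sets above, the tick past time 2 is blocked until both `deliver_basal` and `check_reservoir` have fired at 2, matching the rule that a time point leaves the global set only when nothing is pending there.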
437

Formalising non-functional requirements embedded in user requirements notation (URN) models

Dongmo, Cyrille 11 1900 (has links)
The growing need for computer software in different sectors of activity (health, agriculture, industry, education, aeronautics, science and telecommunications), together with society's increasing reliance on information technology, is placing a heavy and fast-growing demand on complex, high-quality software systems. In this regard, the anticipation has been on non-functional requirements (NFRs) engineering and formal methods. Despite their common objective, these techniques have in most cases evolved separately. NFRs engineering proceeds, firstly, by deriving measures to evaluate the quality of the constructed software (product-oriented approach), and secondly, by improving the engineering process (process-oriented approach). With the ability to combine the analysis of both functional and non-functional requirements, Goal-Oriented Requirements Engineering (GORE) approaches have become the de facto leading requirements engineering methods. They propose, through refinement/operationalisation, means to satisfy NFRs encoded in softgoals at an early phase of software development. On the other side, formal methods have so far kept their promise to eliminate errors in software artefacts and produce high-quality software products, and are therefore particularly solicited for safety- and mission-critical systems, for which a single error may cause great loss, including human life. This thesis introduces the concept of Complementary Non-functional action (CNF-action) to extend the analysis and development of NFRs beyond the traditional goals/softgoals analysis based on refinement/operationalisation, and to propagate the influence of NFRs to other software construction phases.
Mechanisms are also developed to integrate the formal technique Z/Object-Z into the standardised User Requirements Notation (URN) to formalise GRL models describing functional and non-functional requirements, to propagate CNF-actions of the formalised NFRs to UCM maps, and to facilitate the URN construction process and improve the quality of URN models. / School of Computing / D. Phil (Computer Science)
438

Evaluating the expressiveness of specification languages : for stochastic safety-critical systems

Jamil, Fahad Rami January 2024 (has links)
This thesis investigates the expressiveness of specification languages for stochastic safety-critical systems, addressing the need for expressiveness in describing system behaviour formally. Through a case study and specification language enhancements, the research explores the impact of different frameworks on a set of specifications. The results highlight the importance of continuous development in the specification languages to meet the complex behaviours of systems with probabilistic properties. The findings emphasise the need for extending the chosen specification languages more formally, to ensure that the languages can capture the complexity of the systems they describe.  The research contributes valuable insights into improving the expressiveness of specification languages for ensuring system safety and operational reliability.
439

A Risk Based Approach to Module Tolerance Specification

Shahtaheri, Yasaman 22 April 2014 (has links)
This research investigates tolerance strategies for modular systems on a project-specific basis. The objective is to form a guideline for optimizing construction costs/risks with the aim of developing an optimal design for resilient modular systems. The procedures for achieving the research objective included: (a) development of 3D structural analysis models of the modules, (b) strength/stability investigation of the structure, (c) developing the fabrication cost function, (d) checking elastic and inelastic distortion, and (e) constructing the site-fit risk functions. The total site-fit risk function minimizes the cost/risk associated with fabrication, transportation, alignment, rework, and safety, while maximizing stiffness in terms of story drift values for site re-alignment and fitting alternatives. The fabrication cost function was developed by collecting 61 data points for the investigated module chassis using the SAP2000 software while progressively reducing the initial section sizes, recording the fabrication cost at each of the 61 steps. As structural reinforcement is reduced, story drift values increase, producing larger distortion in the module. This generic module design procedure models a trade-off between the amount of reinforcement and the expected need for significant field alterations. Structural design software packages such as SAP2000, AutoCAD, and Autodesk were used to model and test the module chassis. This research hypothesizes that the influential factors in the site-fit risk functions are, in order: fabrication, transportation, alignment, safety, and rework costs/risks. In addition, the site-fit risk function provides a theoretical range of possible solutions for the construction industry. The maximum allowable modular out-of-tolerance value, which requires the minimum cost with respect to the defined function, can be determined using this methodology.
This research concludes that neither over-reinforced nor lightly-reinforced designs are the best solution for mitigating risks and reducing costs. For this reason, the site-fit risk function provides a range of Pareto-optimal building solutions with respect to the fabrication, transportation, safety, alignment, and rework costs/risks.
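The trade-off the abstract describes (fabrication cost falls as reinforcement is removed, while drift-driven site-fit risk rises) can be sketched as a one-variable cost minimisation. All coefficients and function shapes below are invented assumptions for illustration, not the thesis's calibrated functions:

```python
def total_cost(reinforcement,
               fab_rate=100.0,    # assumed fabrication cost per unit reinforcement
               drift_base=50.0):  # assumed scale of drift-driven site-fit risk
    """Toy total cost: fabrication grows with reinforcement, while the
    site-fit risk (rework, alignment, safety) is modelled as inversely
    proportional to reinforcement, since drift rises as stiffness drops."""
    fabrication = fab_rate * reinforcement
    site_fit_risk = drift_base / reinforcement
    return fabrication + site_fit_risk

def best_reinforcement(levels):
    """Pick the reinforcement level minimising total cost over a grid."""
    return min(levels, key=total_cost)
```

Even this toy model reproduces the qualitative conclusion above: on a grid such as `[0.5, 0.7, 1.0, 1.5]`, the interior level 0.7 wins, i.e. neither the lightest nor the most heavily reinforced design minimises total cost.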
440

Public-private partnership in the provision of secondary education in the Gaborone city area of Botswana

Sedisa, Kitso Nkaiwa 30 June 2008 (has links)
Public sector organisations are established in order to promote the quality of citizens' lives through the provision of public services. However, the demands for public services often outstrip the limited resources at the disposal of the public sector for the delivery of such services. Public-private partnerships (PPPs) are emerging as an important tool of public policy to deliver public infrastructure and the attendant services. The main aim of this study is to establish the extent to which PPPs can be used to improve the quality of the delivery of secondary education in the Gaborone City area in Botswana. The study includes a conceptual analysis of the nature of public services in general and, in particular, the nature and the provision of secondary education in Botswana, with specific reference to the Gaborone City area. The study also includes a conceptual analysis of PPPs as gleaned from published literature. Various dimensions of PPPs are analysed, including but not limited to definitions, benefits, models and the antecedents for the successful implementation of PPPs. Among the various models analysed in the study, the design, build, operate and finance (DBOF) model is preferred for improving the quality of the delivery of secondary education in the Gaborone City area in Botswana. In addition to the conceptual analysis, an empirical research study is undertaken in which secondary school heads are the respondents to a structured questionnaire. The results of the empirical research support the conceptual analysis to the extent that, in both cases, it is possible to improve the quality of the delivery of secondary education through PPPs. More secondary schools can be built and more facilities made available to schools. Through the use of PPPs, most if not all learners can receive the entire secondary education programme, from junior to senior secondary education. Existing secondary schools can be modernised through PPPs.
Ancillary services can be delivered by the organisations that have the necessary expertise, provided certain antecedents for the successful implementation of PPPs are in place. Through PPPs, secondary schools can be made attractive and intellectually stimulating. / Public Administration / (D.Litt. et Phil. (Public Administration))
