21

Solving Quantified Boolean Formulas

Samulowitz, Horst Cornelius 28 July 2008 (has links)
Many real-world problems do not have a simple algorithmic solution, and casting these problems as search problems is often not only the simplest way of casting them, but also the most efficient way of solving them. In this thesis we will present several techniques to advance search-based algorithms in the context of solving quantified Boolean formulas (QBF). QBF enables complex real-world problems including planning, two-player games and verification to be captured in a compact and quite natural fashion. We will discuss techniques ranging from straightforward pre-processing methods utilizing strong rules of inference to more sophisticated online approaches such as dynamic partitioning. Furthermore, we will show that all of the presented techniques achieve a substantial improvement of the search process when solving QBF. At the same time, the empirical results also reveal the orthogonality of the different techniques with respect to performance: generally speaking, each approach performs well on a particular subset of benchmarks but poorly on others. Consequently, an adaptive employment of the available techniques that maximizes performance would result in further improvements. We will demonstrate that such an adaptive approach can pay off in practice by presenting the results of using machine learning methods to dynamically select the best variable-ordering heuristics during search.
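To make the problem concrete, here is a minimal sketch of what deciding a QBF means; it expands the quantifier prefix exhaustively and is purely illustrative, not the search-based solving techniques the thesis develops.

```python
def eval_qbf(prefix, matrix, assignment=None):
    """Naive QBF evaluation by full expansion of the quantifier prefix.
    prefix: list of (quantifier, variable) pairs, e.g. ('A', 1) or ('E', 2).
    matrix: CNF as a list of clauses; a literal is +v or -v.
    Purely illustrative -- real QBF solvers use search with learning,
    pruning and (as in this thesis) partitioning instead of full expansion."""
    assignment = assignment or {}
    if not prefix:  # prefix exhausted: evaluate the CNF matrix
        return all(any(assignment[abs(l)] == (l > 0) for l in clause)
                   for clause in matrix)
    quant, var = prefix[0]
    branches = (eval_qbf(prefix[1:], matrix, {**assignment, var: val})
                for val in (False, True))
    return any(branches) if quant == 'E' else all(branches)

# forall x exists y . (x or y) and (not x or not y)   -- true, with y = not x
print(eval_qbf([('A', 1), ('E', 2)], [[1, 2], [-1, -2]]))   # True
```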
22

On the dynamics of active documents for distributed data management

Bourhis, Pierre 11 February 2011 (has links) (PDF)
One of the major issues faced by Web applications is the management of evolving data. In this thesis, we consider this problem and in particular the evolution of active documents. Active documents are a formalism describing the evolution of XML documents by activating Web service calls included in the document; it has already been used in the context of the management of distributed data \cite{axml}. The main contributions of this thesis are theoretical studies motivated by two systems for managing, respectively, stream applications and workflow applications. In a first contribution, we study the problem of view maintenance over active documents. The results served as the basis for an implementation of stream processors based on active documents called Axlog widgets. In a second contribution, we view active documents as the core of data-centric workflows and consider various ways of expressing constraints on the evolution of documents. The implementation, called Axart, validated the approach of a data-centric workflow system based on active documents. The hidden Web (also known as the deep or invisible Web), that is, the part of the Web not directly accessible through hyperlinks but through HTML forms or Web services, is of great value but difficult to exploit. We discuss a process for the fully automatic discovery, syntactic and semantic analysis, and querying of hidden-Web services. We first propose a general architecture that relies on a semi-structured warehouse of imprecise (probabilistic) content. We provide a detailed complexity analysis of the underlying probabilistic tree model. We describe how we can use a combination of heuristics and probing to understand the structure of an HTML form. We present an original use of a supervised machine-learning method, namely conditional random fields, in an unsupervised manner, on an automatic, imperfect, and imprecise annotation based on domain knowledge, in order to extract relevant information from HTML result pages. To obtain semantic relations between inputs and outputs of a hidden-Web service, we investigate the complexity of deriving a schema mapping between database instances, relying solely on the presence of constants in the two instances. We finally describe a model for the semantic representation and intensional indexing of hidden-Web sources, and discuss how to process a user's high-level query using such descriptions.
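As a rough illustration of the active-document idea (the element name <sc> and the stub service registry below are hypothetical, not the AXML syntax), activating an embedded call replaces it with the data the service returns:

```python
import xml.etree.ElementTree as ET

def make_item(text):
    e = ET.Element("item")
    e.text = text
    return e

# Hypothetical local service registry; real active documents call remote Web services.
services = {"newArrivals": lambda: [make_item("journal")]}

def activate(root):
    """Replace each embedded <sc> service-call element with its results."""
    for parent in root.iter():
        for child in list(parent):
            if child.tag == "sc":
                parent.remove(child)
                parent.extend(services[child.get("service")]())

doc = ET.fromstring("<catalog><item>book</item><sc service='newArrivals'/></catalog>")
activate(doc)
print(ET.tostring(doc, encoding="unicode"))
# <catalog><item>book</item><item>journal</item></catalog>
```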
23

SMT-Based Reasoning and Planning in TAL

Hallin, Magnus January 2010 (has links)
Automated planning as a satisfiability problem is a method developed in the early nineties. It has some known disadvantages, such as its inefficient encoding of numbers. The field of Satisfiability Modulo Theories tries to connect already established solvers for, e.g., linear constraints to SAT solvers in order to make reasoning about numerical values more efficient. This thesis combines planning as satisfiability and SMT to perform efficient reasoning about actions that occupy realistic time in Temporal Action Logic, a formalism developed at Linköping University for reasoning about action and change.
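A minimal sketch of the planning-as-SMT idea (the action names, durations, and use of the z3 Python bindings are illustrative assumptions, not the TAL encoding itself): Boolean action variables are combined with integer time variables handled by the arithmetic theory, instead of encoding numbers bit by bit in pure SAT.

```python
from z3 import Bool, Int, Implies, Solver, sat  # pip install z3-solver

s = Solver()
move, load = Bool("move"), Bool("load")
t_move, t_load, deadline = Int("t_move"), Int("t_load"), Int("deadline")

s.add(move, load)                            # both actions must occur in the plan
s.add(Implies(move, t_move >= 0))            # move starts at time 0 or later
s.add(Implies(load, t_load >= t_move + 5))   # loading starts after the 5-unit move
s.add(t_load + 3 <= deadline, deadline == 10)

if s.check() == sat:
    print(s.model())                         # e.g. t_move = 0, t_load = 5
```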
24

Novel Value Ordering Heuristics Using Non-Linear Optimization In Boolean Satisfiability

Pisanov, Vladimir January 2012 (has links)
Boolean Satisfiability (SAT) is a fundamental NP-complete problem of determining whether there exists an assignment of variables which makes a Boolean formula evaluate to True. SAT is a convenient representation for many naturally occurring optimization and decision problems such as planning and circuit verification. SAT is most commonly solved by a form of backtracking search which systematically explores the space of possible variable assignments. We show that the order in which variable polarities are assigned can have a significant impact on the performance of backtracking algorithms. We present several ways of transforming SAT instances into non-linear objective functions and describe three value-ordering methods based on iterative optimization techniques. We implement and test these heuristics in the widely recognized MiniSAT framework. The first approach determines polarities by applying Newton's Method to a sparse system of non-linear objective functions whose roots correspond to the satisfying assignments of the propositional formula. The second approach determines polarities by minimizing an objective function corresponding to the number of clauses conflicting with each assignment. The third approach determines preferred polarities by performing stochastic gradient descent on objective functions sampled from a family of continuous potentials. The heuristics are evaluated on a set of standard benchmarks including random, crafted and industrial problems. We compare our results to five existing heuristics, and show that MiniSAT equipped with our heuristics often outperforms state-of-the-art SAT solvers.
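To give a flavour of the optimization-based view (a rough sketch only; the actual potentials and update rules used in the thesis differ), one can relax each Boolean variable to a value in [0, 1], penalize unsatisfied clauses with a smooth objective, run gradient descent, and read preferred polarities off the result:

```python
import random

def preferred_polarities(clauses, n_vars, steps=200, lr=0.1):
    """Sketch: relax variables to [0, 1], minimize a smooth clause penalty,
    and return True where the relaxed value suggests assigning positive first.
    Illustrative only -- not the heuristics implemented in MiniSAT here."""
    x = [random.random() for _ in range(n_vars + 1)]            # 1-indexed
    for _ in range(steps):
        grad = [0.0] * (n_vars + 1)
        for clause in clauses:
            # clause penalty: product of (1 - value(literal)); zero once satisfied
            vals = [x[abs(l)] if l > 0 else 1.0 - x[abs(l)] for l in clause]
            penalty = 1.0
            for v in vals:
                penalty *= (1.0 - v)
            for l, v in zip(clause, vals):
                # derivative of the penalty w.r.t. this variable's relaxed value
                partial = penalty / (1.0 - v) if v < 1.0 else 0.0
                grad[abs(l)] += -partial if l > 0 else partial
        x = [min(1.0, max(0.0, xi - lr * g)) for xi, g in zip(x, grad)]
    return {v: x[v] > 0.5 for v in range(1, n_vars + 1)}

print(preferred_polarities([[1, -2], [-1, -2], [2, 3]], 3))
```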
25

Deriving Framework Usages Based on Behavioral Models

SAEKI, Motoshi, KOBAYASHI, Takashi, ZENMYO, Teruyoshi 01 April 2010 (has links)
No description available.
26

Verifying Web Application Vulnerabilities by Model Checking

Hung, Chun-Chieh 20 August 2009 (has links)
Due to the continued development of Internet technology, more and more people are willing to take advantage of highly interactive and diverse web applications for commercial, knowledge-sharing, and social activities. However, as web applications come to affect our society more deeply, hackers have started exploiting web application vulnerabilities to attack innocent end users and back-end databases, and therefore pose a significant threat to information security. In light of this situation, this paper proposes a detection mechanism based on Model Checking to detect web application vulnerabilities. We reduce the question of whether vulnerabilities exist to an SMT (Satisfiability Modulo Theories) problem, and analyze all traces of tainted data flow in web applications to find possible vulnerabilities with an SMT solver. The experimental results show that the proposed method can identify SQL injection and XSS vulnerabilities effectively, and demonstrate that it is a feasible way to find web application vulnerabilities.
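A minimal sketch of reducing a vulnerability check to an SMT query (the query template, injection pattern, and use of z3's string theory are illustrative assumptions, not the paper's actual encoding): model one tainted data flow symbolically and ask the solver whether some user input produces the attack pattern.

```python
from z3 import String, StringVal, Concat, Contains, Solver, sat  # pip install z3-solver

user_input = String("user_input")
# One tainted trace: user input concatenated directly into a SQL query string.
query = Concat(StringVal("SELECT * FROM users WHERE name='"),
               user_input,
               StringVal("'"))

s = Solver()
s.add(Contains(query, StringVal("' OR '1'='1")))   # is the injection pattern reachable?

if s.check() == sat:
    print("potentially vulnerable, witness input:", s.model()[user_input])
```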
27

Efficient, mechanically-verified validation of satisfiability solvers

Wetzler, Nathan David 04 September 2015 (has links)
Satisfiability (SAT) solvers are commonly used for a variety of applications, including hardware verification, software verification, theorem proving, debugging, and hard combinatorial problems. These applications rely on the efficiency and correctness of SAT solvers. When a problem is determined to be unsatisfiable, how can one be confident that a SAT solver has fully exhausted the search space? Traditionally, unsatisfiability results have been expressed using resolution or clausal proof systems. Resolution-based proofs contain perfect reconstruction information, but these proofs are extremely large and difficult to emit from a solver. Clausal proofs rely on rediscovery of inferences using a limited number of techniques, which typically takes several orders of magnitude longer than the solving time. Moreover, neither of these proof systems has been able to express contemporary solving techniques such as bounded variable addition. This combination of issues has left SAT solver authors unmotivated to produce proofs of unsatisfiability. The work in this dissertation focuses on validating satisfiability solver output in the unsatisfiability case. We developed a new clausal proof format called DRAT that facilitates compact proofs that are easier to emit and capable of expressing all contemporary solving and preprocessing techniques. Furthermore, we implemented a validation utility called DRAT-trim that is able to validate proofs in time similar to the discovery time. The DRAT format has seen widespread adoption in the SAT community and the DRAT-trim utility was used to validate the results of the 2014 SAT Competition. DRAT-trim uses many advanced techniques to realize its performance gains, so why should the results of DRAT-trim be trusted? Mechanical verification enables users to model programs and algorithms and then prove their correctness with a proof assistant, such as ACL2. We designed a new modeling technique for ACL2 that combines efficient model execution with an agile and convenient theory. Finally, we used this new technique to construct a fast, mechanically-verified validation tool for proofs of unsatisfiability. This research allows SAT solver authors and users to have greater confidence in their results and applications by ensuring the validity of unsatisfiability results.
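A toy sketch of the core idea behind clausal-proof checking (reverse unit propagation only; DRAT-trim additionally checks the RAT property, handles clause deletion, and uses backward checking for speed): each proof clause must yield a conflict under unit propagation when its negation is assumed.

```python
def unit_propagate(clauses, assignment):
    """Propagate unit clauses; return None on conflict, else the assignment."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(l in assignment for l in clause):
                continue                          # clause already satisfied
            unassigned = [l for l in clause
                          if -l not in assignment and l not in assignment]
            if not unassigned:
                return None                       # conflict: all literals falsified
            if len(unassigned) == 1:
                assignment.add(unassigned[0])     # forced literal
                changed = True
    return assignment

def check_rup(formula, proof):
    """Check each proof clause by reverse unit propagation, then add it."""
    clauses = list(formula)
    for lemma in proof:
        if unit_propagate(clauses, {-l for l in lemma}) is not None:
            return False
        clauses.append(lemma)
    return True

# (p or q), (not p or q), (p or not q), (not p or not q) is unsatisfiable
formula = [[1, 2], [-1, 2], [1, -2], [-1, -2]]
proof = [[1], []]                 # derive p, then the empty clause
print(check_rup(formula, proof))  # True
```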
28

Exploiting Problem Structure in QBF Solving

Goultiaeva, Alexandra 27 March 2014 (has links)
Deciding the truth of a Quantified Boolean Formula (QBF) is a canonical PSPACE-complete problem. It provides a powerful framework for encoding problems that lie in PSPACE. These include many problems in automatic verification, and problems with discrete uncertainty or non-determinism. Two-person adversarial games are another type of problem that is naturally encoded in QBF. It is standard practice to use Conjunctive Normal Form (CNF) when representing QBFs. Any propositional formula can be efficiently translated to CNF via the addition of new variables, and solvers can be implemented more efficiently due to the structural simplicity of CNF. However, the translation to CNF involves a loss of some structural information. This thesis shows that this structural information is important for efficient QBF solving, and shows how this structural information can be utilized to improve state-of-the-art QBF solving. First, a non-CNF circuit-based solver is presented. It makes use of information not present in CNF to improve its performance. We present techniques that allow it to exploit the duality between solutions and conflicts that is lost when working with CNF. This duality can also be utilized in the production of certificates, allowing both true and false formulas to have easy-to-verify certificates of the same form. Then, it is shown that most modern CNF-based solvers can benefit from the additional information derived from duality using only minor modifications. Furthermore, even partial duality information can be helpful. We show that for standard methods for conversion to CNF, some of the required information can be reconstructed from the CNF and greatly benefit the solver.
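To illustrate what "loss of structural information" means (an illustrative Tseitin-style encoding, not the solver's internal representation): converting a small circuit matrix to CNF introduces fresh gate variables and flattens gate types and polarities into undifferentiated clauses.

```python
def tseitin_and(out, a, b):
    """Clauses for out <-> (a and b)."""
    return [[-out, a], [-out, b], [out, -a, -b]]

def tseitin_or(out, a, b):
    """Clauses for out <-> (a or b)."""
    return [[out, -a], [out, -b], [-out, a, b]]

# Matrix of  forall x exists y . (x and y) or (not x and not y)
# Variables: x = 1, y = 2; fresh gate variables g1 = 3, g2 = 4, g3 = 5
cnf = (tseitin_and(3, 1, 2)        # g1 <-> (x and y)
       + tseitin_and(4, -1, -2)    # g2 <-> (not x and not y)
       + tseitin_or(5, 3, 4)       # g3 <-> (g1 or g2)
       + [[5]])                    # assert the circuit output
print(len(cnf), "clauses over 5 variables for a 3-gate circuit over 2 inputs")
```

In the QBF setting the fresh variables are typically placed, existentially quantified, in the innermost block, and the clause set no longer records which gate each clause came from; recovering part of that information from the CNF is the kind of reconstruction the final contribution above exploits.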
29

Application of Logic Synthesis Toward the Inference and Control of Gene Regulatory Networks

Lin, Pey Chang K 16 December 2013 (has links)
In the quest to understand cell behavior and cure genetic diseases such as cancer, the fundamental approach being taken is undergoing a gradual change. It is becoming more acceptable to view these diseases as an engineering problem, and systems engineering approaches are being deployed to tackle genetic diseases. In this light, we believe that logic synthesis techniques can play a very important role. Several techniques from the field of logic synthesis can be adapted to assist in the arguably huge effort of modeling cell behavior, inferring biological networks, and controlling genetic diseases. Genes interact with other genes in a Gene Regulatory Network (GRN) and can be modeled as a Boolean Network (BN) or, equivalently, as a Finite State Machine (FSM). As the expression of genes determines cell behavior, important problems include (i) inferring the GRN from gene expression data observed in biological measurements, and (ii) using the inferred GRN to explain how genetic diseases occur and to determine the "best" therapy towards treatment of disease. We report results on the application of logic synthesis techniques that we have developed to address both these problems. In the first technique, we present Boolean Satisfiability (SAT) based approaches to infer the predictor (logical support) of each gene that regulates melanoma, using gene expression data from patients who are suffering from the disease. From the output of such a tool, biologists can construct targeted experiments to understand the logic functions that regulate a particular target gene. Our second technique builds upon the first: we use a logic synthesis technique, implemented using SAT, to determine gene-regulating functions from predictors and gene expression data. This technique determines a BN (or family of BNs) describing the GRN and is validated on a synthetic network and the p53 network. The first two techniques assume binary-valued gene expression data. In the third technique, we utilize continuous (analog) expression data and present an algorithm to infer and rank predictors using modified Zhegalkin polynomials. We demonstrate our method by ranking predictors for genes in the mutated mammalian and melanoma networks. The final technique assumes that the GRN is known and uses weighted partial Max-SAT (WPMS) towards cancer therapy. Cancer is modeled using a stuck-at fault model, and ATPG techniques are used to characterize genes leading to cancer and to select drugs to treat it. To steer the GRN state towards a desirable healthy state, the optimal selection of drugs is formulated using WPMS. Our techniques can be used to find a set of drugs with the fewest side effects, and are demonstrated in the context of growth factor pathways for colon cancer.
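For intuition about the Boolean Network view (a small made-up network, not the melanoma or p53 networks studied in the thesis): each gene's next value is given by its predictor, a Boolean function of the current expression of other genes, and synchronous update makes the network behave as a finite state machine.

```python
# Each gene's predictor maps the current state to its next Boolean value.
predictors = {
    "A": lambda s: s["B"] and not s["C"],
    "B": lambda s: s["A"] or s["C"],
    "C": lambda s: not s["A"],
}

def step(state):
    """Synchronously update every gene, giving one FSM transition."""
    return {gene: f(state) for gene, f in predictors.items()}

state = {"A": True, "B": False, "C": False}
for _ in range(4):                     # walk toward an attractor of the network
    state = step(state)
    print(state)
```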
30

Quantum Speed-ups for Boolean Satisfiability and Derivative-Free Optimization

Arunachalam, Srinivasan January 2014 (has links)
In this thesis, we consider two important problems, Boolean satisfiability (SAT) and derivative-free optimization, in the context of large-scale quantum computers. In the first part, we survey well-known classical techniques for solving satisfiability. We compute the approximate time it would take to solve SAT instances using quantum techniques and compare it with the state-of-the-art classical heuristics employed annually in SAT competitions. In the second part of the thesis, we consider a few well-known classical algorithms for derivative-free optimization which are ubiquitously employed in engineering problems. We propose a quantum speedup to these classical algorithms by using techniques from the quantum minimum-finding algorithm. In the third part of the thesis, we consider practical applications in the fields of bioinformatics, petroleum refineries and civil engineering which involve solving either satisfiability or derivative-free optimization. We investigate whether using known quantum techniques to speed up these algorithms translates directly into a benefit for the industries that invest in technology to solve these problems. In the last section, we propose a few open problems which we feel are immediate hurdles, from either an algorithmic or an architectural perspective, to obtaining a convincing speedup for the practical problems considered.
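The kind of back-of-the-envelope comparison involved in the first part can be sketched as follows (a simplification that assumes unstructured Grover search with a single satisfying assignment and ignores gate times, error correction, and the structure that real SAT instances have):

```python
from math import pi, sqrt

def grover_iterations(n_vars):
    """Optimal number of Grover iterations for unstructured search over
    2**n assignments with one satisfying assignment: ~ (pi/4) * sqrt(2**n).
    Classical solvers exploit structure, so they often beat the 2**n bound."""
    return round((pi / 4) * sqrt(2 ** n_vars))

for n in (20, 40, 60):
    print(f"n = {n:2d}: classical worst case {2**n:.2e} assignments, "
          f"Grover ~ {grover_iterations(n):.2e} iterations")
```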
