  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Computing Markov bases, Gröbner bases, and extreme rays

Malkin, Peter, 25 June 2007
In this thesis, we address problems from two topics of applied mathematics: linear integer programming and polyhedral computation. Linear integer programming concerns solving optimisation problems that maximise a linear cost function over the set of integer points in a polyhedron. Polyhedral computation is concerned with algorithms for computing different properties of convex polyhedra. First, we explore the theory and computation of Gröbner bases and Markov bases for linear integer programming. Second, we investigate and improve an algorithm from polyhedral computation that converts between different representations of cones and polyhedra.

A Markov basis is a set of integer vectors such that we can move between any two feasible solutions of an integer program by adding or subtracting vectors in the Markov basis while never leaving the set of feasible solutions. Markov bases are mainly used in algebraic statistics for sampling from a set of feasible solutions. The major contribution of this thesis is a fast algorithm for computing Markov bases, which we used to solve a previously intractable computational challenge.

Gröbner basis methods are exact local search approaches for solving integer programs. We present a Gröbner basis approach that can exploit the structure of an integer program in order to solve it more efficiently. These methods are chiefly of theoretical interest, but they may also provide insight into why some classes of integer programs are difficult to solve with standard techniques, and they may someday be able to solve such difficult problems.

Computing the properties of convex polyhedra is useful for solving problems in different areas of mathematics, such as linear programming, integer programming, combinatorial optimisation, and computational geometry.
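To make the Markov basis idea concrete, here is a small illustrative sketch (not taken from the thesis): for 2×2 contingency tables with fixed row and column sums, the Markov basis is known to consist of a single move (and its negative). Adding or subtracting that move, while rejecting any step that would make an entry negative, walks between all feasible tables with the same margins.

```python
import random

# The unique Markov basis move for 2x2 tables with fixed margins,
# flattened row-major as (a, b, c, d): it changes no row or column sum.
MOVE = [1, -1, -1, 1]

def random_walk(table, steps, seed=0):
    """Random walk over all non-negative 2x2 tables sharing `table`'s margins."""
    rng = random.Random(seed)
    t = list(table)
    for _ in range(steps):
        sign = rng.choice([1, -1])
        candidate = [x + sign * m for x, m in zip(t, MOVE)]
        if all(x >= 0 for x in candidate):  # never leave the feasible set
            t = candidate
    return t

start = [2, 1, 1, 2]        # row sums (3, 3), column sums (3, 3)
end = random_walk(start, 100)
# Every accepted move preserves the margins:
assert end[0] + end[1] == 3 and end[2] + end[3] == 3
assert end[0] + end[2] == 3 and end[1] + end[3] == 3
```

This is the sampling use-case from algebraic statistics mentioned above; computing a Markov basis for general matrices, the subject of the thesis, is much harder than applying one.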
We investigate and improve an algorithm for converting between the generator representation and the constraint representation of a cone or polyhedron. This algorithm can be extended to compute the circuits of a matrix, which are used in computational biology for metabolic pathway analysis.
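The two representations can be illustrated with a toy two-dimensional case (a sketch of what the conversion produces, not the general algorithm, which is considerably more involved): given the two extreme rays of a 2D cone, each inward-pointing normal perpendicular to one ray yields a constraint of the form n·x ≥ 0.

```python
def cone_constraints_2d(r1, r2):
    """Convert a 2D cone from generator form (two extreme rays)
    to constraint form (two inward-pointing normals n with n.x >= 0)."""
    def inward_normal(ray, other):
        n = (-ray[1], ray[0])                 # perpendicular to `ray`
        if n[0] * other[0] + n[1] * other[1] < 0:
            n = (-n[0], -n[1])                # flip so the other ray is feasible
        return n
    return [inward_normal(r1, r2), inward_normal(r2, r1)]

# Cone generated by the rays (1, 0) and (1, 2):
A = cone_constraints_2d((1, 0), (1, 2))
# Both generators satisfy both constraints...
for n in A:
    assert n[0] * 1 + n[1] * 0 >= 0 and n[0] * 1 + n[1] * 2 >= 0
# ...while a point outside the cone violates at least one:
assert any(n[0] * 0 + n[1] * (-1) < 0 for n in A)
```

In higher dimensions this conversion is the classical representation-conversion problem (solved, for example, by the double description method), and its cost can grow dramatically with dimension, which is why improved algorithms matter.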
2

Analysing the behaviour of neural networks

Breutel, Stephan Werner, January 2004
A new method is developed to determine a set of informative and refined interface assertions satisfied by functions that are represented by feed-forward neural networks. Neural networks have often been criticized for their low degree of comprehensibility. It is difficult to have confidence in software components that have no clear and valid interface description. Precise and understandable interface assertions for a neural-network-based software component are required for safety-critical applications and for integration into larger software systems. The interface assertions we consider are of the form "if the input x of the neural network is in a region α of the input space, then the output f(x) of the neural network will be in the region β of the output space", and vice versa. We are interested in computing refined interface assertions, which can be viewed as computing the strongest pre- and postconditions that a feed-forward neural network fulfills. Unions of polyhedra (polyhedra are the generalization of convex polygons to higher-dimensional spaces) are well suited for describing arbitrary regions of higher-dimensional vector spaces. Additionally, polyhedra are closed under affine transformations. Given a feed-forward neural network, our method produces an annotated neural network, where each layer is annotated with a set of valid linear inequality predicates. The main challenges in computing these assertions are solving a non-linear optimization problem and projecting a polyhedron onto a lower-dimensional subspace.
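The flavour of such "input region α implies output region β" assertions can be sketched with a much coarser abstraction than the polyhedra used in the thesis: propagating an axis-aligned box through affine layers followed by ReLU activations (interval bound propagation). This is a simplified illustration, not the thesis's method, and the boxes it produces are sound but generally looser than polyhedral bounds.

```python
def propagate_box(lo, hi, layers):
    """Push an input box [lo, hi] through affine+ReLU layers,
    returning a sound (possibly loose) box containing all outputs."""
    for W, b in layers:
        new_lo, new_hi = [], []
        for row, bias in zip(W, b):
            # Lower bound: pair positive weights with lo, negative with hi.
            l = bias + sum(w * (lo[j] if w >= 0 else hi[j]) for j, w in enumerate(row))
            u = bias + sum(w * (hi[j] if w >= 0 else lo[j]) for j, w in enumerate(row))
            new_lo.append(max(l, 0.0))  # ReLU is monotone, so clamp both ends
            new_hi.append(max(u, 0.0))
        lo, hi = new_lo, new_hi
    return lo, hi

# Toy network: f(x) = relu(x1 - x2), two inputs, one output.
layers = [([[1.0, -1.0]], [0.0])]
lo, hi = propagate_box([0.0, 0.0], [1.0, 1.0], layers)
# Assertion obtained: if x in [0,1]^2 then f(x) in [0, 1].
assert lo == [0.0] and hi == [1.0]
```

The thesis's harder problems, non-linear optimization and polyhedron projection, arise precisely because polyhedra (unlike boxes) do not stay simple under the sigmoid-like non-linearities and layer compositions of a real network.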
