11 |
Two problems in extremal set theory / Brown Kramer, Joshua. January 1900
Thesis (Ph.D.)--University of Nebraska-Lincoln, 2007. / Title from title screen (site viewed July 9, 2007). PDF text: 126 p. : ill. UMI publication number: AAT 3252840. Includes bibliographical references. Also available in microfilm and microfiche formats.
|
12 |
On the Hadamard theorem and maximal determinants in combinatorial investigations / Unknown date
The Hadamard Theorem plays a brief but important role in the study of maximal determinants in chapter II. The proof which is presented follows closely the paper by Everett and Ryser, but an attempt has been made to clarify and expand their work wherever possible. / Typescript. / "June, 1960." / "Submitted to the Graduate Council of Florida State University in partial fulfillment of the requirements for the degree of Master of Science." / Advisor: Marion F. Tinsley, Professor Directing Paper. / Includes bibliographical references (leaf 23).
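For context (not part of the thesis), the maximal-determinant question the abstract refers to can be brute-forced at tiny orders: Hadamard's inequality bounds |det A| by n^(n/2) for an n x n matrix with entries of absolute value at most 1. A minimal sketch using the exact permutation (Leibniz) expansion of the determinant:

```python
from itertools import permutations, product

def det(m):
    """Determinant via the permutation (Leibniz) expansion; exact on integers."""
    n = len(m)
    total = 0
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        term = sign
        for i in range(n):
            term *= m[i][perm[i]]
        total += term
    return total

def max_det_pm1(n):
    """Brute-force the maximal |det| over n x n matrices with entries +-1.
    Hadamard's inequality bounds this by n**(n/2)."""
    best = 0
    for entries in product((-1, 1), repeat=n * n):
        m = [entries[i * n:(i + 1) * n] for i in range(n)]
        best = max(best, abs(det(m)))
    return best
```

For order 3 this search recovers the known maximal value 4, well below the Hadamard bound of 3^(3/2) ≈ 5.2, which is attainable only at Hadamard orders.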
|
13 |
Combinatorial displays / Hajdu, Peter, 1946-. January 1974
No description available.
|
14 |
Enumerative methods in combinatorial analysis / Abramson, Morton. January 1966
No description available.
|
15 |
Jump numbers, hyperrectangles and Carlitz compositions / Cheng, Bo. January 1999
Thesis (Ph.D.)--University of the Witwatersrand, Faculty of Science, Johannesburg, 1998. / A thesis submitted in fulfilment of the requirements for the degree of Doctor of Philosophy. / Let A = (a_ij) be an m x n matrix. There is a natural way to associate a poset P_A with A. A jump in a linear extension of P_A is a pair of consecutive elements which are incomparable in P_A. The jump number of A is the minimum number of jumps in any linear extension of P_A. The maximum jump number over the class of n x n matrices of zeros and ones with constant row and column sum k, M(n, k), is investigated in Chapters 2 and 3. Chapter 2 deals with extremization problems concerning M(n, k). In Chapter 3, we obtain exact values of M(n, k) in several cases, including M(n, n-3) and M(n, n-4).
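The jump number defined in the abstract can be brute-forced for small posets. A minimal sketch (not from the thesis; the matrix-to-poset construction P_A is omitted, so the poset is given here by an explicit order relation):

```python
from itertools import permutations

def jump_number(elements, less_than):
    """Brute-force the jump number of a finite poset: the minimum,
    over all linear extensions, of the number of consecutive pairs
    that are incomparable."""
    best = None
    for perm in permutations(elements):
        # a linear extension never places y before x when x < y
        if any(less_than(perm[j], perm[i])
               for i in range(len(perm))
               for j in range(i + 1, len(perm))):
            continue
        jumps = sum(1 for a, b in zip(perm, perm[1:])
                    if not less_than(a, b) and not less_than(b, a))
        best = jumps if best is None else min(best, jumps)
    return best

# Example: two disjoint 2-chains, 0 < 1 and 2 < 3.
rel = {(0, 1), (2, 3)}
lt = lambda x, y: (x, y) in rel
```

Here jump_number([0, 1, 2, 3], lt) is 1: any linear extension must cross between the two chains at least once, and the extension 0, 1, 2, 3 has exactly one incomparable consecutive pair.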
The concept of a frequency hyperrectangle generalizes that of a Latin square. In Chapter 4 we derive a bound for the maximum number of mutually orthogonal frequency hyperrectangles. Chapter 5 gives two algorithms to construct mutually orthogonal frequency hyperrectangles. Chapter 6 is devoted to some enumerative results about Carlitz compositions (compositions in which adjacent parts differ).
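Carlitz compositions are simple to enumerate directly; a small illustrative sketch (not part of the thesis):

```python
def carlitz_compositions(n, last=0):
    """Yield the compositions of n in which adjacent parts differ
    (Carlitz compositions). `last` tracks the previous part; the
    sentinel 0 never equals a positive part, so the first part is free."""
    if n == 0:
        yield ()
        return
    for first in range(1, n + 1):
        if first != last:
            for rest in carlitz_compositions(n - first, first):
                yield (first,) + rest
```

For example, n = 4 admits exactly four such compositions: 4, 1+3, 3+1 and 1+2+1 (2+2 and anything with a repeated 1 are excluded).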
|
16 |
The Maximum Size of Combinatorial Geometries Excluding Wheels and Whirls as Minors / Hipp, James W. (James William), 1956-. 08 1900
We show that the maximum size of a geometry of rank n excluding the (q + 2)-point line, the 3-wheel W_3, and the 3-whirl W^3 as minors is (n - 1)q + 1, and that geometries of maximum size are parallel connections of (q + 1)-point lines. We show that the maximum size of a geometry of rank n excluding the 5-point line, the 4-wheel W_4, and the 4-whirl W^4 as minors is 6n - 5, for n ≥ 3. Examples of geometries having rank n and size 6n - 5 include parallel connections of the geometries V_19 and PG(2,3).
|
17 |
Additive structure, rich lines, and exponential set-expansion / Borenstein, Evan. January 2009
Thesis (Ph.D.)--Mathematics, Georgia Institute of Technology, 2009. / Committee Chair: Croot, Ernie; Committee Member: Costello, Kevin; Committee Member: Lyall, Neil; Committee Member: Tetali, Prasad; Committee Member: Yu, XingXing. Part of the SMARTech Electronic Thesis and Dissertation Collection.
|
18 |
The structure of the solution space and its relation to execution time of evolutionary algorithms with applications / Ghannadian, Farzad. 12 1900
No description available.
|
19 |
Maximal (0,1,2,...,t)-cliques of some association schemes / Choi, Sul-young. January 1985
No description available.
|
20 |
Active Learning with Combinatorial Coverage / Katragadda, Sai Prathyush. 04 August 2022
Active learning is a practical field of machine learning, since labeling data, or even deciding which data to label, can be a time-consuming and inefficient task. Active learning automates the selection of which data to label, but current methods are heavily model-reliant. This has led to sampled data failing to transfer to new models, as well as to issues with sampling bias. Both issues are of crucial concern in machine learning deployment. We propose active learning methods utilizing Combinatorial Coverage to overcome these issues.
The proposed methods are data-centric, and through our experiments we show that including coverage in active learning leads to sampled data that tends to transfer best to different models, with competitive sampling bias compared to benchmark methods. / Master of Science / Machine learning (ML) models are used frequently in a variety of applications. For a model to learn, data is required, and processing this data is often one of the most time-consuming aspects of using ML. One especially burdensome step is data labeling: determining which real-world class each data point corresponds to, for example, whether an image contains a plane or a bird, so that the ML model can learn from the data.
Active learning is a sub-field of machine learning which aims to ease this burden by letting the model select which data would be most beneficial to label, so that the entire dataset does not need to be labeled. The issue with current active learning methods is that they are highly model-dependent. In deployment the model may change while the data stays the same, so this dependency can cause points labeled with respect to one model to be far from ideal for another. Model dependency has also led to sampling bias: the points chosen for labeling may all be similar, or unrepresentative of the full dataset, leaving the ML model less knowledgeable than it could be.
Relevant work has focused on the sampling bias issue, and several methods have been proposed to combat it, though few are applicable to arbitrary ML models. The issue of sampled points not generalizing to different models has been studied, but no solutions have been proposed.
In this work we present active learning methods using Combinatorial Coverage, a statistical technique from the field of Design of Experiments that has commonly been used to design test sets. Its extension to ML is newer and provides a way to focus on the data. We show that this data-focused approach to active learning performs better when the sampled data is used for a different model, and that it achieves competitive sampling bias compared to benchmark methods.
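The coverage notion can be made concrete. A hedged sketch of a 2-way (pairwise) coverage metric over a categorical dataset, in the spirit of combinatorial coverage from Design of Experiments but not necessarily the thesis's exact metric:

```python
from itertools import combinations, product

def pairwise_coverage(rows):
    """Fraction of all 2-way value combinations (over the values
    observed in each column) that appear together in some row.
    An illustrative stand-in for combinatorial coverage."""
    ncols = len(rows[0])
    values = [sorted({r[c] for r in rows}) for c in range(ncols)]
    total = covered = 0
    for c1, c2 in combinations(range(ncols), 2):
        seen = {(r[c1], r[c2]) for r in rows}
        for pair in product(values[c1], values[c2]):
            total += 1
            covered += pair in seen
    return covered / total
```

For instance, the two rows (0, 0, 0) and (1, 1, 1) realize only 2 of the 4 possible value pairs in each of the 3 column pairs, giving coverage 0.5; an active learner built on this idea would prefer samples that raise such coverage.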
|