211

The Use Of Filters In Topology

Dasser, Abdellatif 01 January 2004 (has links)
Sequences are sufficient to describe topological properties in metric spaces or, more generally, topological spaces having a countable base for the topology. However, filters or nets are needed in more abstract spaces. Nets are a more natural extension of sequences but are generally less friendly to work with, since quite often two nets have distinct directed sets for domains. Operations involving filters are set theoretic and generally pertain to filters on the same set. The concept of a filter was introduced by H. Cartan in 1937, and an excellent treatment of the subject can be found in N. Bourbaki (1940).
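For reference, a minimal statement of the notion this abstract turns on (a standard definition, not quoted from the thesis): a filter on a set X is a collection of subsets satisfying the three axioms below, and a filter converges to a point x when every neighborhood of x belongs to it.

```latex
% Standard definition of a (proper) filter on a set X.
\mathcal{F} \subseteq \mathcal{P}(X) \text{ is a filter if: }
(1)\ \emptyset \notin \mathcal{F},\ X \in \mathcal{F}; \quad
(2)\ A, B \in \mathcal{F} \implies A \cap B \in \mathcal{F}; \quad
(3)\ A \in \mathcal{F},\ A \subseteq B \subseteq X \implies B \in \mathcal{F}.
```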
212

Privacy, Surveillance And The State: A Comparison Of U.S. And British Privacy Rights

Lander, Angelina 01 January 2009 (has links)
This study investigates the effects of institutional structure on the privacy rights regimes in the United States and the United Kingdom, from 2000-2006. The goal of this research is to analyze how variation in the institutional arrangements across these two countries allowed for more or less protection of privacy rights for citizens. Domestic terrorist attacks during the time period represent a catalyst for changes in police and government surveillance activities. Veto points literature provides the framework for institutional comparison. The first part of the research provides a discussion of the historical evolution of privacy rights in both states, focusing on government and police surveillance and investigations. The second part of the research, based on veto points theory, compares the institutional arrangements of the United States and the United Kingdom, and suggests that the number of veto points and the ideological proximity of veto players have had an effect on the formulation of policy. Laws governing surveillance, investigations and privacy in the year 2000 provide a benchmark for analyzing how policies change over time.
213

The Sum of Two Integer Cubes - Restricted

Jonsson, Kenny January 2022 (has links)
We study the size of sets containing sums of two integer cubes such that their representation is unique and they also fit between two consecutive integer cubes. We will try to write algorithms that efficiently calculate the size of these sets and also implement these algorithms in Python™. Although we will fail to find a non-iterative algorithm, we will find different ways of approximating the size of these sets. We will also find that techniques used in our failed algorithms can be used to calculate the number of integer lattice points inside a circle.
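As a rough illustration of the kind of computation described, here is a brute-force Python sketch; the thesis's exact definition of the restricted set, and the bounds placed on a and b, are assumptions here.

```python
from collections import Counter

def unique_two_cube_sums_between(n):
    """Count integers m with n**3 < m < (n+1)**3 that can be written as
    a**3 + b**3 with 1 <= a <= b in exactly one way.

    Brute-force sketch under assumed conventions; the thesis's precise
    restriction may differ.
    """
    lo, hi = n ** 3, (n + 1) ** 3
    counts = Counter()
    a = 1
    while a ** 3 < hi:
        b = a
        while a ** 3 + b ** 3 < hi:
            s = a ** 3 + b ** 3
            if s > lo:
                counts[s] += 1
            b += 1
        a += 1
    return sum(1 for c in counts.values() if c == 1)

for n in range(1, 8):
    print(n, unique_two_cube_sums_between(n))
```

The same style of double loop, with a² + b² ≤ r² in place of the cube condition, counts integer lattice points inside a circle of radius r, which is presumably the connection the abstract alludes to.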
214

Inventory Management and Inbound Logistics Optimization for a Food Processing Company

Zhou, Heng January 2013 (has links)
No description available.
215

Points and Lines in the Plane

Smith, Justin Wesley 06 December 2010 (has links)
No description available.
216

POINTS OF REFERENCE: PROJECTILE POINTS, HUNTING AND IDENTITY AT THE NEOLITHIC ÇATALHÖYÜK, TURKEY

Dogiama, Triantafyllia Eirini January 2017 (has links)
This thesis examines the practice of hunting within the Neolithic community of Çatalhöyük by focusing on its stone projectile points. Wild faunal remains indicate that hunting remained in practice, even though domesticated animals and plants comprised the staple diet of the Neolithic people of Çatalhöyük. Hunting and the “wild” are venerated in the site's iconography (wall paintings, wild bull skull mural installations, zoomorphic figurines), while obsidian projectile points—the dominant hunting weapons—were carefully executed artifacts that seem to have served more than one purpose. In studying the projectile points I consider the role of hunting at Çatalhöyük and its significance in shaping personal and communal identities. By employing an attribute analysis I examined the projectiles’ deposition in varied contexts (caches, burials, building infills, middens, etc.). Differences in use-wear traces as well as in technological and morphological traits suggest that the Çatalhöyük point assemblage consists of two groups that were used and treated in very distinct ways. The first group comprises projectile weapons that were used in hunting, exhibiting clear signs of actual use, whereas the second group consists of bifaces that were in all likelihood reserved for ceremonial purposes given their pristine condition and special deposition. This thesis argues that hunting was not merely an alternate subsistence strategy but an arena where symbolic expression and social identities could be performed and negotiated. / Thesis / Doctor of Philosophy (PhD) / This thesis investigates the role of hunting in one of the earliest agricultural societies with livestock, Çatalhöyük in Turkey (7th-6th millennium cal BCE). For this purpose I studied the stone tips of projectile weapons, i.e., arrows and spears. The results of this work show that apart from the actual hunting weapons, the people of Çatalhöyük also had weapons reserved for ceremonial purposes. Indeed, other evidence also shows that hunting and the “wild” were revered at the site: wall paintings, wall decorations with wild animal skulls, large feasting events, and animal figurines. In this thesis I argue that hunting was not only a strategy for the farmers of Çatalhöyük to acquire extra food resources but that it also played a significant role in their symbolic and religious life, which is why this tradition persisted even after the domestication of animals.
217

Schur-class of finitely connected planar domains: the test-function approach

Guerra Huaman, Moises Daniel 12 May 2011 (has links)
We study the structure of the set of extreme points of the compact convex set of matrix-valued holomorphic functions with positive real part on a finitely-connected planar domain 𝐑, normalized to have value equal to the identity matrix at some prescribed point t₀ ∈ 𝐑. This leads to an integral representation for such functions that is more general than what would be expected from the result for the scalar-valued case. After Cayley transformation, this leads to an integral Agler decomposition for the matrix Schur class over 𝐑 (holomorphic contractive matrix-valued functions over 𝐑). Application of a general theory of the abstract Schur class generated by a collection of test functions leads to a transfer-function realization for the matrix Schur class over 𝐑, extending results known up to now only for the scalar case. We also explain how these results provide a new perspective on the dilation theory for Hilbert space operators having 𝐑 as a spectral set. / Ph. D.
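For orientation, the Cayley transformation mentioned above is the standard map (not quoted from the dissertation) carrying a matrix-valued function F with positive real part on 𝐑, normalized by F(t₀) = I, into the Schur class:

```latex
% Cayley transform from the positive-real-part class to the Schur class
S(z) \;=\; \bigl(F(z) - I\bigr)\bigl(F(z) + I\bigr)^{-1},
\qquad \operatorname{Re} F(z) \succeq 0 \ \Longrightarrow\ \|S(z)\| \le 1,
\quad S(t_0) = 0 .
```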
218

Efficient computer experiment designs for Gaussian process surrogates

Cole, David Austin 28 June 2021 (has links)
Due to advancements in supercomputing and algorithms for finite element analysis, today's computer simulation models often contain complex calculations that can result in a wealth of knowledge. Gaussian processes (GPs) are highly desirable models for computer experiments for their predictive accuracy and uncertainty quantification. This dissertation addresses GP modeling when data abounds as well as GP adaptive design when simulator expense severely limits the amount of collected data. For data-rich problems, I introduce a localized sparse covariance GP that preserves the flexibility and predictive accuracy of a GP's predictive surface while saving computational time. This locally induced Gaussian process (LIGP) incorporates latent design points, inducing points, with a local Gaussian process built from a subset of the data. Various methods are introduced for the design of the inducing points. LIGP is then extended to adapt to stochastic data with replicates, estimating noise while relying upon the unique design locations for computation. Through entropy-based adaptive design, I also address the goal of identifying a contour when data collection resources are limited. Unlike existing methods, the entropy-based contour locator (ECL) adaptive design promotes exploration in the design space, performing well in higher dimensions and when the contour corresponds to a high/low quantile. ECL adaptive design can be combined with importance sampling to reduce uncertainty in reliability estimation. / Doctor of Philosophy / Due to advancements in supercomputing and physics-based algorithms, today's computer simulation models often contain complex calculations that can produce larger amounts of data than physical experiments. Computer experiments conducted with simulation models are sought-after ways to gather knowledge about physical problems but come with design and modeling challenges. In this dissertation, I address both data size extremes: building prediction models with large data sets and designing computer experiments when scarce resources limit the amount of data. For the former, I introduce a strategy of constructing a series of models including small subsets of observed data along with a set of unobserved data locations (inducing points). This methodology also contains the ability to perform calculations with only unique data locations when replicates exist in the data. The locally induced model produces accurate predictions while saving computing time. Various methods are introduced to decide the locations of these inducing points. The focus then shifts to designing an experiment for the purpose of accurate prediction around a particular output quantity of interest (contour). An experimental design approach is detailed that selects new sample locations one at a time through a function to maximize the amount of information gain in the contour region for the overall model. This work is combined with an existing method to estimate the true volume of the contour.
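To make the inducing-point idea concrete, here is a generic sparse GP predictive-mean sketch in Python (a subset-of-regressors style approximation with numpy). It is not the dissertation's LIGP method; the kernel, inducing grid, and noise level are illustrative assumptions.

```python
import numpy as np

def sq_exp_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sparse_gp_mean(X, y, Z, Xstar, noise=1e-2):
    """Predictive mean of a sparse GP built on inducing points Z.

    Generic subset-of-regressors sketch, not LIGP; Z stands in for the
    latent inducing-point design discussed above.
    """
    Kzz = sq_exp_kernel(Z, Z) + 1e-8 * np.eye(len(Z))  # jitter for stability
    Kxz = sq_exp_kernel(X, Z)
    Ksz = sq_exp_kernel(Xstar, Z)
    A = Kxz.T @ Kxz + noise * Kzz                       # m x m linear system
    return Ksz @ np.linalg.solve(A, Kxz.T @ y)

# Toy usage: noisy sine data, 10 inducing points on a grid.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, 200)
y = np.sin(X) + 0.1 * rng.standard_normal(200)
Z = np.linspace(0.0, 10.0, 10)
Xstar = np.linspace(0.0, 10.0, 50)
print(sparse_gp_mean(X, y, Z, Xstar)[:5])
```

The computational payoff is that the linear solve is against an m x m matrix (m = number of inducing points) rather than the full n x n data covariance.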
219

Determination of the characteristics of heat transfer from a horizontal silver surface to boiling mixtures of ethanol and benzene

Watkins, W. B. January 1950 (has links)
Heat transfer to boiling liquids is of primary industrial importance. Surprisingly enough, organized study of the variables which affect heat transfer to boiling liquids has been far less than proportional to the industrial usage of this type of heat transmission. Perhaps the least investigated phenomenon involved in heat transfer to boiling liquids is the so-called critical state, or maximum, in the rate of heat transfer-thermal driving force relationship. This maximum or peak is believed to be caused by a change in the type or method of heat transfer from the heating surface to the boiling liquid. The system is said to pass from a state of nucleate boiling through the maximum to a state of film boiling. The critical point is known to vary for various liquids, yet no adequate correlations were found which would provide a prediction of the characteristics of heat transfer for binary mixtures of liquids. Inasmuch as ethanol and benzene are used extensively in industry, they were selected for use in this investigation. The purpose of this investigation was to determine the characteristics of heat transfer from a horizontal silver surface to boiling mixtures of ethanol and benzene. A horizontal plate evaporator, with the necessary accessory equipment for measurement and control, was designed and constructed. The test liquids were prepared for concentrations of ethanol in benzene from 0 to 100 per cent in 10 volume per cent increments. These test liquids were charged to the evaporator and the characteristics of heat transfer determined by a series of steady state conditions of heat transfer. The rate of heat transfer was evaluated from the wattage input to the electrical heating unit. The temperature gradient between the heating surface and the main body of liquid was determined by evaluating the readings of thermocouples placed in the liquid space and the heater plate. The maximum rate of heat transfer was considered equivalent to the heat flux which caused the boiling system to shift through the maximum in the heat flux-temperature gradient relationship. The critical temperature gradient was obtained by an extrapolation of the heat flux-temperature gradient curve to maximum heat flux. The general conditions for the tests were: heat transfer surface, silver; cold liquid height in evaporator, 4-1/2 inches; evaporator diameter, 1-31/32 inches. The tests were made at normal atmospheric pressure, which varied from 710.0 to 718.0 millimeters of mercury. Steady state conditions of heat transfer were established before data were taken. / Master of Science
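A hypothetical back-of-the-envelope version of the flux calculation described above (the wattage figure is invented purely for illustration; only the 1-31/32 inch evaporator diameter comes from the abstract):

```python
import math

# Heat flux = electrical power input / heating surface area.
diameter_in = 1 + 31 / 32                         # evaporator diameter, inches (from abstract)
area_ft2 = math.pi * (diameter_in / 12) ** 2 / 4  # plate area, square feet
power_w = 500.0                                   # assumed heater input, watts (illustrative)
power_btu_hr = power_w * 3.412142                 # watts -> BTU/hr
flux = power_btu_hr / area_ft2                    # heat flux, BTU/(hr*ft^2)
print(f"heat flux ~ {flux:,.0f} BTU/(hr*ft^2)")
```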
220

Mining Rare Features in Fingerprints using Core points and Triplet-based Features

Munagani, Indira Priya Darshini 04 January 2014 (has links)
A fingerprint matching algorithm with a novel set of matching parameters based on core points and triangular descriptors is proposed to discover rarity in fingerprints. The algorithm uses a mathematical and statistical approach to discover rare features in fingerprints, which provides scientific validation for both ten-print and latent fingerprint evidence. A feature is considered rare if it is statistically uncommon; that is, the rare feature should be unique among N (N > 100) randomly sampled prints. A rare feature in a fingerprint has higher discriminatory power when it is identified in a print (latent or otherwise). In the case of latent fingerprint matching, the enhanced discriminatory power from the rare features can help in delivering a confident court judgment. In addition to mining the rare features, a parallel algorithm for fingerprint matching on GPUs is also proposed to reduce the run-time of fingerprint matching on larger databases. Results show that 1) the matching algorithm is useful in eliminating false matches; 2) each of the 30 fingerprints randomly selected to mine rare features has a small set of highly distinctive, statistically rare features, some of which occur in only one in 1000 fingerprints; and 3) the parallel algorithm implemented on GPUs for larger databases is around 40 times faster than the sequential algorithm. / Master of Science
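A minimal Python sketch of the triplet-based feature idea: the side lengths of the triangle formed by three minutiae are invariant to rotation and translation, so their sorted tuple makes a simple comparable descriptor. This is a generic illustration, not the matching parameters actually used in the thesis.

```python
import numpy as np
from itertools import combinations

def triplet_descriptors(minutiae):
    """Rotation/translation-invariant descriptor per minutiae triplet:
    the sorted side lengths of the triangle the three points form.

    `minutiae` is an (n, 2) array of (x, y) positions; a generic sketch
    of triangle-based features, not the thesis's exact parameter set.
    """
    descriptors = []
    for i, j, k in combinations(range(len(minutiae)), 3):
        a = np.linalg.norm(minutiae[i] - minutiae[j])
        b = np.linalg.norm(minutiae[j] - minutiae[k])
        c = np.linalg.norm(minutiae[k] - minutiae[i])
        descriptors.append(tuple(sorted((a, b, c))))
    return descriptors

# Toy usage with four synthetic minutiae points.
pts = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [2.0, 2.0]])
print(triplet_descriptors(pts))
```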
