101

'n Geografiese analise van Midrand as nywerheidsgebied / A geographical analysis of Midrand as an industrial area

Smit, Magdalena Johanna 24 April 2014 (has links)
M.A. (Geography) / Please refer to full text to view abstract
102

Interní audit / Internal audit

Šilhán, Josef January 2008 (has links)
Keywords: system description, risk, risk evaluation
103

Practical uniform interpolation for expressive description logics

Koopmann, Patrick January 2015 (has links)
The thesis investigates methods for uniform interpolation in expressive description logics. Description logics are formalisms commonly used to model ontologies. Ontologies store terminological information and are used in a wide range of applications, such as the semantic web, medicine, bioinformatics, software development, databases and language processing. Uniform interpolation eliminates terms from an ontology such that logical entailments in the remaining language are preserved. The result, the uniform interpolant, is a restricted view of the ontology that can be used for a variety of tasks such as ontology analysis, ontology reuse, ontology evolution and information hiding. Uniform interpolation for description logics has only gained interest in the research community in recent years, and theoretical results show that it is a hard problem requiring specialised reasoning approaches. We present a range of uniform interpolation methods that can deal with expressive description logics such as ALC and many of its extensions. For all these logics, these are the first methods that are able to compute uniform interpolants for all inputs. The methods are based on a new family of saturation-based reasoning methods, which make it possible to eliminate symbols in a goal-oriented manner. The practicality of this approach is shown by an evaluation on realistic ontologies.
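As a toy illustration of what eliminating a term means (this example is not taken from the thesis): given the TBox { A ⊑ ∃r.B, B ⊑ C } and the target signature {A, C, r}, forgetting B yields the uniform interpolant { A ⊑ ∃r.C }. Every consequence of the original ontology that mentions only A, C and r, such as A ⊑ ∃r.C itself, already follows from this single axiom, so the restricted view preserves exactly the entailments expressible without B.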
104

Images of early British Columbia : landscape photography, 1858-1888

Schwartz, Joan Marsha January 1977 (has links)
With their cumbersome equipment and refractory technology, professional photographers recorded pioneering development in British Columbia almost from the beginning of white settlement. This study examines landscape photographs taken during the thirty-year period between the beginning of the Fraser River gold rush and the completion of the Canadian Pacific Railway. It comments on the nature and meaning of the photographs and suggests their relevance to an understanding of land and life in early British Columbia. Photography in early British Columbia was almost exclusively the work of professionals whose success rested upon their sensitivity to the market. For this reason, their work is to some extent a mirror of early British Columbians' sense of themselves in a new place. Nineteenth-century landscape photographers focused on the wagon road and later the railroad, on gold mining and on settlement. The early forest industry attracted far less attention than the gold rush, and though fishing and farming had begun, they were seldom photographed. Picnics, regattas and other leisure activities were recorded, particularly in Victoria and New Westminster and more frequently in the 1880s. Spectacular physical landscapes dominate the photographic record of wilderness; microscale nature studies are singularly absent. The attention to extreme symbols of progress in the photographic record of early British Columbia is understandable. The recency and rapidity of development had made material advance a common and concrete reality, and British Columbians wanted a record of their achievement. Some colonials brought with them conservative ideas of home and society based upon British traditions and Victorian taste. Photographs of elegant surroundings and genteel pastimes confirmed that they had created a civilized society and an ordered landscape in an isolated corner of Empire. In the wider context, the photographic record of early British Columbia shares elements with other areas of frontier development and British colonization. However, it exhibits a distinctiveness which is attributable as much to the mix of landscape images in the British Columbia setting as to the images themselves. / Arts, Faculty of / Geography, Department of / Graduate
105

Marker Detection in Aerial Images

Alharbi, Yazeed 09 April 2017 (has links)
The problem addressed in the thesis is the detection of small markers in high-resolution aerial images. Given a high-resolution image, the goal is to return the pixel coordinates corresponding to the center of the marker in the image. The marker has the shape of two triangles sharing a vertex in the middle, and it occupies no more than 0.01% of the image size. An improvement on the Histogram of Oriented Gradients (HOG) is proposed, eliminating the majority of baseline HOG false positives for marker detection. The improvement is guided by the observation that standard HOG description struggles to separate markers from negative patches containing an X shape. The proposed method alters intensities with the aim of altering gradients. The intensity-dependent gradient alteration leads to more separation between filled and unfilled shapes. The improvement is used in a two-stage algorithm to achieve high recall and high precision in detection of markers in aerial images. In the first stage, two classifiers are used: one to quickly eliminate most of the uninteresting parts of the image, and one to carefully select the marker among the remaining interesting regions. Interesting regions are selected by scanning the image with a fast classifier trained on the HOG features of markers in all rotations and scales. The next classifier is more precise and uses our method to eliminate the majority of the false positives of standard HOG. In the second stage, detected markers are tracked forward and backward in time. Tracking is needed to detect extremely blurred or distorted markers that are missed by the previous stage. The algorithm achieves 94% recall with minimal user guidance. An average of 30 guesses is given per image; the user verifies for each whether it is a marker or not. The brute force approach would return 100,000 guesses per image.
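A minimal Python sketch of the two-stage scan described in the abstract, assuming pre-trained classifiers; the names fast_clf and precise_clf, the window size and the step are invented for illustration, and plain scikit-image HOG features stand in for the thesis' modified descriptor:

from skimage.feature import hog

def scan_for_markers(image, fast_clf, precise_clf, win=64, step=16):
    """Two-stage sliding-window scan: a cheap classifier flags candidate
    windows, then a more careful classifier re-scores the survivors."""
    def features(patch):
        return hog(patch, orientations=9,
                   pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    h, w = image.shape
    # Stage 1: quickly discard most of the image.
    candidates = []
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            patch = image[y:y + win, x:x + win]
            if fast_clf.predict([features(patch)])[0] == 1:
                candidates.append((y, x, patch))
    # Stage 2: carefully confirm the remaining regions.
    detections = [(y + win // 2, x + win // 2)
                  for (y, x, patch) in candidates
                  if precise_clf.predict([features(patch)])[0] == 1]
    return detections  # pixel coordinates of likely marker centers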
106

New Advances in Joint Source-Channel and Multiple Description Coding

Wang, Xiaohan 01 1900 (has links)
This thesis launches some new inquiries and makes significant progress in the active research areas of joint source-channel coding and multiple description coding. Two interesting but previously untreated problems are investigated and partially settled: 1) can index assignment of source codewords be optimized with respect to a given joint source-channel decoding scheme, and if so, how? 2) can joint source-channel coding be optimized with respect to a given multiple description code, and if so, how?

The first problem is formulated as one of quadratic assignment. Although quadratic assignment is NP-hard in general, we are able to develop a near-optimum index assignment algorithm for joint source-channel (JSC) maximum a posteriori (MAP) decoding, if the input is a Gaussian Markov sequence of high correlation. For general cases, good heuristic solutions are proposed. Convincing empirical evidence is presented to demonstrate the performance improvement of the index assignments optimized for MAP decoding over those designed for hard-decision decoding.

The second problem is motivated by applications of signal communication and estimation in resource-constrained lossy networks. To keep the encoder complexity at a minimum, a signal is coded by a multiple description quantizer (MDQ) without channel coding. The code diversity of MDQ and the path diversity of the network are exploited by decoders to combat transmission errors. A key design objective is resource scalability: powerful nodes in the network can perform JSC-MD estimation under the criteria of maximum a posteriori probability or minimum mean-square error (MMSE), while primitive nodes resort to simpler multiple description (MD) decoding, all working with the same MDQ code. The application of JSC-MD to distributed estimation of hidden Markov models in a sensor network is demonstrated. The proposed JSC-MD MAP estimator is a longest-path algorithm in a weighted directed acyclic graph, while the JSC-MD MMSE decoder is an extension of the well-known forward-backward algorithm to multiple descriptions. They outperform the existing hard-decision MDQ decoders by large margins. / Thesis / Doctor of Philosophy (PhD)
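To make the quadratic-assignment view of the first problem concrete, here is a small illustrative Python sketch (not the thesis' algorithm): it scores every index assignment of a toy 2-bit scalar quantizer over a binary symmetric channel under the classical hard-decision expected-distortion cost, which the thesis generalizes to MAP decoding; the codebook, probabilities and crossover probability are invented for the example, and exhaustive search is used only because the codebook is tiny.

import itertools

def bsc_prob(i, j, bits, eps):
    # Probability that transmitted index i is received as index j over a
    # binary symmetric channel with crossover probability eps.
    d = bin(i ^ j).count("1")
    return (eps ** d) * ((1 - eps) ** (bits - d))

def expected_distortion(perm, codebook, p, bits, eps):
    # Expected squared error when codeword k is sent on channel index perm[k]
    # and the receiver decodes a received index by inverting the assignment.
    inv = {perm[k]: k for k in range(len(codebook))}
    return sum(p[k] * bsc_prob(perm[k], j, bits, eps)
               * (codebook[k] - codebook[inv[j]]) ** 2
               for k in range(len(codebook))
               for j in range(len(codebook)))

codebook = [-1.5, -0.5, 0.5, 1.5]   # toy 2-bit scalar quantizer outputs
p = [0.1, 0.4, 0.4, 0.1]            # toy codeword probabilities
best = min(itertools.permutations(range(4)),
           key=lambda perm: expected_distortion(perm, codebook, p, 2, 0.05))
print(best, expected_distortion(best, codebook, p, 2, 0.05))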
107

Enhancing Description Logics For Rules Coverage

Carral Martinez, David 14 September 2012 (has links)
No description available.
108

Comparison of interpersonal and presentational description in Russian oral proficiency testing

Mikhailova, Julia V. 14 July 2005 (has links)
No description available.
109

Graph-Based Solution for Two Scalar Quantization Problems in Network Systems

Zheng, Qixue January 2018 (has links)
This thesis addresses optimal scalar quantizer design for two problems: the two-stage Wyner-Ziv coding problem and the multiple description coding problem for finite-alphabet sources. The optimization problems are formulated as the minimization of a weighted sum of distortions and rates. The proposed solutions are globally optimal when the cells in each partition are contiguous. The solution algorithms are both based on solving the single-source or the all-pairs minimum-weight path (MWP) problem in certain weighted directed acyclic graphs (WDAG). When the conventional dynamic programming technique is used to solve the underlying MWP problems, the time complexity achieved is O(N^3) for both problems, where N is the size of the source alphabet. We first present the optimal design of a two-stage Wyner-Ziv scalar quantizer with forwardly or reversely degraded side information (SI) for finite-alphabet sources and SI. We assume that binning is performed optimally and address the design of the quantizer partitions. A solution based on dynamic programming is proposed with O(N^3) time complexity. Further, a so-called partial Monge property is introduced and a faster solution algorithm exploiting this property is proposed. Experimental results assess the practical performance of the proposed scheme. Then we present the optimal design of an improved modified multiple-description scalar quantizer (MMDSQ). The improvement is achieved by optimizing all the scalar quantizers. The optimization is based on solving the single-source MWP problem in a coupled quantizer graph and the all-pairs MWP problem in a WDAG. Another variant design with the same optimization, but enhanced with a better decoding process, is also presented to decrease the gap to the theoretical bounds. Experiments show that both designs for the second problem perform close to, or even better than, schemes in the literature. / Thesis / Master of Applied Science (MASc)
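A minimal sketch of the minimum-weight-path formulation, for the simpler case of a single entropy-constrained scalar quantizer with contiguous cells (not the two-stage Wyner-Ziv or MMDSQ designs of the thesis); the toy source and the Lagrangian cell cost are invented for the example:

import math

def cell_cost(p, x, i, j, lam):
    # Lagrangian cost of one cell covering source letters i..j-1: probability-
    # weighted squared error around the cell centroid plus lam times the cell
    # probability times the ideal codeword length -log2 P(cell).
    mass = sum(p[i:j])
    if mass == 0.0:
        return 0.0
    centroid = sum(p[k] * x[k] for k in range(i, j)) / mass
    dist = sum(p[k] * (x[k] - centroid) ** 2 for k in range(i, j))
    return dist + lam * mass * (-math.log2(mass))

def optimal_partition(p, x, lam):
    # Nodes 0..N are cell boundaries; edge (i, j) has weight cell_cost(i, j).
    # A minimum-weight path from node 0 to node N is an optimal partition.
    n = len(x)
    best = [math.inf] * (n + 1)
    prev = [0] * (n + 1)
    best[0] = 0.0
    for j in range(1, n + 1):
        for i in range(j):
            c = best[i] + cell_cost(p, x, i, j, lam)
            if c < best[j]:
                best[j], prev[j] = c, i
    bounds, j = [n], n
    while j > 0:
        j = prev[j]
        bounds.append(j)
    return best[n], bounds[::-1]   # total cost and cell boundaries

x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]        # toy source letters
p = [0.05, 0.1, 0.2, 0.15, 0.15, 0.2, 0.1, 0.05]    # toy probabilities
print(optimal_partition(p, x, lam=0.5))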
110

Adding Threshold Concepts to the Description Logic EL

Fernández Gil, Oliver 14 June 2016 (has links) (PDF)
We introduce a family of logics extending the lightweight Description Logic EL that allows us to define concepts in an approximate way. The main idea is to use a graded membership function m, which for each individual and concept yields a number in the interval [0,1] expressing the degree to which the individual belongs to the concept. Threshold concepts C~t for ~ in {<,<=,>,>=} then collect all the individuals that belong to C with degree ~t. We further study this framework in two particular directions. First, we define a specific graded membership function deg and investigate the complexity of reasoning in the resulting Description Logic tEL(deg) w.r.t. both the empty terminology and acyclic TBoxes. Second, we show how to turn concept similarity measures into membership degree functions. It turns out that under certain conditions such functions are well-defined, and therefore induce a wide range of threshold logics. Finally, we present preliminary results on the computational complexity landscape of reasoning in this large family of threshold logics.
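A toy Python sketch of the threshold idea, invented for illustration; the thesis' graded membership function deg is defined over full EL concept descriptions including existential restrictions, whereas this sketch only handles plain conjunctions of concept names:

import operator

def toy_degree(individual_concepts, conjuncts):
    # Fraction of the conjuncts that the individual satisfies; equals 1.0
    # exactly when the individual is a classical instance of the conjunction.
    if not conjuncts:
        return 1.0
    return sum(1 for c in conjuncts if c in individual_concepts) / len(conjuncts)

def in_threshold_concept(individual_concepts, conjuncts, op, t):
    # Membership in the threshold concept C~t for ~ in {<, <=, >, >=}.
    ops = {"<": operator.lt, "<=": operator.le, ">": operator.gt, ">=": operator.ge}
    return ops[op](toy_degree(individual_concepts, conjuncts), t)

# An individual satisfying 2 of 3 conjuncts belongs to C>=0.6 but not to C>=0.8.
ind = {"Tall", "Rich"}
C = ["Tall", "Rich", "Famous"]
print(in_threshold_concept(ind, C, ">=", 0.6), in_threshold_concept(ind, C, ">=", 0.8))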
