  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

BELIEF PROPAGATION DECODING OF FINITE-LENGTH POLAR CODES

RAJAIE, TARANNOM 01 February 2012 (has links)
Polar codes, recently invented by Arikan, are the first class of codes known to achieve the symmetric capacity for a large class of channels. The symmetric capacity is the highest rate achievable subject to using the binary input letters of the channel with equal probability. Polar code construction is based on a phenomenon called channel polarization. Both the encoding and the decoding of polar codes can be implemented with O(N log N) complexity, where N is the blocklength of the code. In this work, we study factor graph representations of finite-length polar codes and their effect on belief propagation (BP) decoding over the Binary Erasure Channel (BEC). In particular, we study the parity-check-based (H-based) as well as the generator-based (G-based) factor graphs of polar codes. As these factor graphs are not unique for a given code, we study and compare the performance of BP decoders on a number of well-known graphs. Error rates and complexities are reported for a number of cases. Comparisons are also made with the successive cancellation (SC) decoder. High error rates are related to the so-called stopping sets of the underlying graphs. We discuss the pros and cons of the BP decoder relative to the SC decoder for various code lengths. / Thesis (Master, Electrical & Computer Engineering) -- Queen's University, 2012-01-31 17:10:59.955

The network structure of courses in Alberta's provincial education system

Fuite, Jim Unknown Date
No description available.

Visual control of human gait during locomotor pointing

Popescu, Adrian Unknown Date
No description available.

Fast and Accurate Visibility Preprocessing

Nirenstein, Shaun 01 October 2003 (has links)
Visibility culling is a means of accelerating the graphical rendering of geometric models. Invisible objects are efficiently culled to prevent their submission to the standard graphics pipeline. It is advantageous to preprocess scenes in order to determine invisible objects from all possible camera views. This information is typically saved to disk and may then be reused until the model geometry changes. Such preprocessing algorithms are therefore used for scenes that are primarily static. Currently, the standard approach to visibility preprocessing algorithms is to use a form of approximate solution, known as conservative culling. Such algorithms over-estimate the set of visible polygons. This compromise has been considered necessary in order to perform visibility preprocessing quickly. These algorithms attempt to satisfy the goals of both rapid preprocessing and rapid run-time rendering. We observe, however, that there is a need for algorithms with superior performance in preprocessing, as well as for algorithms that are more accurate. For most applications these features are not required simultaneously. In this thesis we present two novel visibility preprocessing algorithms, each of which is strongly biased toward one of these requirements. The first algorithm has the advantage of performance. It executes quickly by exploiting graphics hardware. The algorithm also has the features of output sensitivity (to what is visible), and a logarithmic dependency on the size of the camera-space partition. These advantages come at the cost of image error. We present a heuristic-guided adaptive sampling methodology that minimises this error. We further show how this algorithm may be parallelised and also present a natural extension of the algorithm to five dimensions for accelerating generalised ray shooting. The second algorithm has the advantage of accuracy. No over-estimation is performed, nor are any sacrifices made in terms of image quality. 
The cost is primarily that of time. Despite the relatively long computation, the algorithm is still tractable and on average scales slightly superlinearly with the input size. This algorithm also has the advantage of output sensitivity. This is the first known tractable exact solution to the general 3D from-region visibility problem. In order to solve the exact from-region visibility problem, we had to first solve a more general form of the standard stabbing problem. An efficient solution to this problem is presented independently.
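The sampling-based approach described above can be illustrated in miniature: sample viewpoints inside a view cell, cast rays, and take the union of nearest hits as an approximate Potentially Visible Set (PVS). A toy 2D sketch under assumed conventions (the thesis itself works in 3D/5D on graphics hardware; all names and the scene setup here are hypothetical):

```python
import math

def ray_segment(px, py, dx, dy, seg):
    """Return ray parameter t of the hit with segment (x1,y1,x2,y2), or None."""
    x1, y1, x2, y2 = seg
    ex, ey = x2 - x1, y2 - y1
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:
        return None  # ray parallel to segment
    # Solve p + t*d = s1 + s*e for t (along ray) and s (along segment).
    t = ((x1 - px) * ey - (y1 - py) * ex) / denom
    s = ((x1 - px) * dy - (y1 - py) * dx) / denom
    if t > 1e-9 and 0.0 <= s <= 1.0:
        return t
    return None

def sampled_pvs(cell_points, objects, n_dirs=256):
    """Approximate a from-region PVS by point sampling.

    objects: list of (id, segment) wall segments in 2D. From each sampled
    viewpoint, cast rays in n_dirs directions and record the nearest object
    hit. Like any sampled method, this can only under-estimate visibility,
    which is exactly the image-error/speed trade-off discussed above.
    """
    pvs = set()
    for px, py in cell_points:
        for k in range(n_dirs):
            a = 2 * math.pi * k / n_dirs
            dx, dy = math.cos(a), math.sin(a)
            best, best_t = None, float("inf")
            for oid, seg in objects:
                t = ray_segment(px, py, dx, dy, seg)
                if t is not None and t < best_t:
                    best, best_t = oid, t
            if best is not None:
                pvs.add(best)
    return pvs
```

Denser direction and viewpoint sampling shrinks the error, which is where an adaptive, heuristic-guided refinement (as in the thesis's first algorithm) pays off over the uniform sampling shown here.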

Rūšiavimo algoritmų vizualizavimas ir sudėtingumo analizė / Visualization and Complexity Analysis of Sorting Algorithms

Saročka, Gediminas 02 July 2012 (has links)
Complexity analyses of sorting algorithms are easy to find, so the main goal of this work was to create a visualization of sorting algorithms. The work implements visualizations of three simple sorting algorithms (insertion sort, bubble sort, and selection sort) and two faster sorting algorithms (Shell sort and merge sort). The program can also measure the running time of each sorting algorithm, supporting further complexity analysis.
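Counting comparisons is the usual starting point for the kind of complexity analysis the abstract mentions. A small sketch (not the thesis's visualization code) instrumenting one simple and one fast algorithm from its list:

```python
def insertion_sort(a):
    """Insertion sort; returns (sorted copy, number of key comparisons)."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comps += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comps

def merge_sort(a):
    """Top-down merge sort; returns (sorted copy, number of key comparisons)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comps, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]  # append whichever side remains
    return merged, comps
```

On a reversed input of length n, insertion sort performs exactly n(n-1)/2 comparisons, while merge sort stays near n log2 n, which is the O(n^2) versus O(n log n) gap a visualization makes vivid.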

Parallel algorithms for free and associative-commutative unification

Hains, Gaétan January 1989 (has links)
A survey of algorithms for free unification is given, followed by an overview of the computability and complexity of unification problems. Second-order unification is known to be undecidable, and a proof is given that the first-order problem is also undecidable under an arbitrary set of axioms. A new systolic algorithm is introduced for term minimisation, or term compaction. This is a general-purpose tool for systems using structure sharing. Apart from time and space savings, its use allows subterms to be tested for equality in constant time. The use of compact terms greatly simplifies free term matching and gives rise to a linear-time algorithm with lower processing overheads than the Paterson-Wegman unification algorithm. A sublinear-time solution to the same problem is also given, assuming preloaded data. No existing algorithm for free unification has a sublinear-time implementation, and this is related to the notion of a sparse P-complete problem. The complexity of restricted associative-commutative term matching is analysed. Contrary to an earlier conjecture, the problem is NP-complete if variables occur at most twice but their number is unrestricted. Parallel methods are suggested as efficient solutions for the tractable linear and 1-variable versions of the problem. The results presented here should be useful in the implementation of fast symbolic manipulation systems.
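For readers unfamiliar with the baseline problem, first-order free unification can be sketched with a standard Robinson-style unifier. The term encoding below is an illustrative convention, and this naive version is what the thesis's compact-term and systolic techniques improve upon, not the thesis's own algorithm:

```python
def is_var(t):
    """Variables are strings starting with '?' (an assumed convention)."""
    return isinstance(t, str) and t.startswith('?')

def walk(t, subst):
    """Follow variable bindings until a non-bound term is reached."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear inside term t?"""
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t[1:])

def bind(v, t, subst):
    if occurs(v, t, subst):
        return None  # ?x cannot unify with f(?x)
    s = dict(subst)
    s[v] = t
    return s

def unify(t1, t2, subst=None):
    """Unify two terms; compound terms are tuples (functor, arg1, ...).

    Returns a substitution dict, or None if no unifier exists. Without
    structure sharing this can blow up exponentially on repeated subterms,
    which is precisely what compact-term representations avoid.
    """
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return bind(t1, t2, subst)
    if is_var(t2):
        return bind(t2, t1, subst)
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None  # constant or functor clash
```

For example, f(?x, b) unifies with f(a, ?y) under {?x -> a, ?y -> b}, while ?x and f(?x) fail the occurs check.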

The Influence of Music Congruence and Message Complexity on the Response of Consumers to Advertisements

Seneviratne, Buddhakoralalage Leelanga Dananjaya January 2015 (has links)
The overall aim of this study was to examine how the characteristics of two salient stimuli of an audio advertisement, music and message, influence the psychological state of consumers, and how that state subsequently determines their cognitive and affective responses to the advertisement. In achieving this aim, the study was guided by a combination of two cognitive resource utilisation theories: the Limited Capacity Model of Motivated Mediated Message Processing (Lang, 2000) and the Resource-Matching Hypothesis (Anand & Sternthal, 1989). In particular, building upon inconsistency and load theories, this study proposed that certain stimulus characteristics prompt certain states of a consumer's cognition. These two stimulus characteristics were the congruence of the musical stimulus and the complexity of the message stimulus. The model then predicted the potential effect of these characteristics on certain psychological states (Psychological Discomfort and Cognitive Load), leading to affective (Attitude towards Advertisement) and cognitive (encoding, storage, and retention) responses. To examine this model empirically, an online experiment (a 2 × 2 between-subjects × 2 within-subjects mixed design) was conducted, in which a mixed sample of 284 subjects was exposed to a set of audio advertisements designed especially for this study. Unfamiliar music was paired with a fictitious brand, and the exposure level was kept low. ANCOVA, MANCOVA, two-stage hierarchical regression analysis, and repeated-measures MANCOVA were administered to test the hypotheses presented in the conceptual model. Among the major findings were that the multiple informational structures in a complex message positively influenced cognitive load, while congruent music was capable of attenuating that load. 
Incongruent music, on the other hand, was capable of generating a dissonance state experienced as psychological discomfort, which in turn increased cognitive load as listeners tried to resolve that state. Both dissonance and cognitive load negatively influenced attitudes towards advertisements, and the affect-primacy account of attitude formation appeared to be the more applicable. Though high cognitive load clearly undermined encoding, storage, and retrieval processes, no evidence was found to support the Resource-Matching Hypothesis. Furthermore, the findings suggested that the cognitive load offset by congruent music would increase advertisement effectiveness by enabling its message to carry more information and by generating more favourable attitudes.

PRODUCTION, EXCHANGE AND SOCIAL INTERACTION IN THE GREEN RIVER REGION OF WESTERN KENTUCKY: A MULTISCALAR APPROACH TO THE ANALYSIS OF TWO SHELL MIDDEN SITES

Moore, Christopher R. 01 January 2011 (has links)
The Green River region of western Kentucky has been a focus of Archaic period research since 1915. Currently, the region is playing an important role in discussions of Archaic hunter-gatherer cultural complexity. Unfortunately, many of the larger Green River sites contain several archaeological components ranging from the Early to Late Archaic periods. Understanding culture change requires that these multiple components somehow be sorted and addressed individually. Detailed re-analyses of Works Progress Administration (WPA) era artifact collections from two archaeological sites in the Green River region – the Baker (15Mu12) and Chiggerville (15Oh1) shell middens – indicate that these sites are relatively isolated Middle and Late Archaic components, respectively. The relatively unmixed character of Baker and Chiggerville makes these sites excellent candidates for evaluating aspects of complexity during the Archaic. After developing a theoretical basis for evaluating the relative complexity of the social organization of the Baker and Chiggerville site inhabitants on the basis of the material record they left behind, I employ detailed analyses of the bone, antler, and stone tools from these two sites to examine six microscalar aspects of complexity – technological organization, subsistence, specialization, leadership, communication networks, and exchange. These microscalar aspects of complexity all can be linked materially to the archaeological record of the Green River region and can be evaluated as proxies for changes in social organization among the hunter-gatherers who inhabited this region during the Middle and Late Archaic periods. Although the Baker assemblage indicated greater complexity in communication networks and certain proxies for leadership and technological organization, most indicators suggest that the Chiggerville site inhabitants were the more complexly organized group and were in the process of developing a tribal-like social formation. 
This research, therefore, tentatively supports the hypothesis of increasing complexity through time during the Archaic. However, marked differences in the technological strategies utilized by the Baker and Chiggerville site inhabitants indicate that these groups may not have been historically related, thereby violating one of the primary assumptions of the project. If this alternative hypothesis is confirmed through additional research, then no conclusions concerning change through time can be derived from this study.

Finite Memory Policies for Partially Observable Markov Decision Processes

Lusena, Christopher 01 January 2001 (has links)
This dissertation makes contributions to two areas of research on planning with POMDPs: complexity-theoretic results and heuristic techniques. The most important contributions are probably the complexity of approximating the optimal history-dependent finite-horizon policy for a POMDP, and the idea of heuristic search over the space of FFTs.
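A finite-memory policy for a POMDP can be represented as a finite-state controller whose value satisfies a linear fixed-point equation over (controller node, hidden state) pairs. A minimal evaluation sketch (illustrative conventions, not code or notation from the dissertation):

```python
def evaluate_fsc(T, O, R, action, eta, gamma=0.95, iters=500):
    """Evaluate a finite-state controller on a POMDP by fixed-point iteration.

    T[s][a][s2]: transition probabilities; O[s2][a][o]: observation
    probabilities; R[s][a]: rewards. The controller assigns action[n] to
    node n and moves to node eta[n][o] on observation o. Returns V[n][s],
    the expected discounted value of node n in hidden state s; the value
    at belief b starting in node n0 is sum_s b[s] * V[n0][s].
    """
    N, S = len(action), len(T)
    V = [[0.0] * S for _ in range(N)]
    for _ in range(iters):
        newV = [[0.0] * S for _ in range(N)]
        for n in range(N):
            a = action[n]
            for s in range(S):
                v = R[s][a]
                for s2 in range(S):
                    p = T[s][a][s2]
                    if p == 0.0:
                        continue
                    for o in range(len(O[s2][a])):
                        # Successor node depends only on the observation.
                        v += gamma * p * O[s2][a][o] * V[eta[n][o]][s2]
                newV[n][s] = v
        V = newV
    return V
```

Because the node set is finite, this evaluation is a plain linear system; searching over controller structures (which actions and transitions to use) is the hard part that heuristic search addresses.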

THE INFLUENCE OF WIDOWED STATUS AND TASK COMPLEXITY ON DECISION MAKING

Ortz, Courtney L. 01 January 2013 (has links)
Widowhood is a stressful life event that can impact an individual's everyday life, including her decision-making abilities. The complexity of a decision is also likely to influence the decision-making abilities of widows. The purpose of this dissertation was to better understand widows' decision-making processes, their preferences for collaboration when making decisions, and their satisfaction with the decision outcomes. Data analysis consisted of a series of 3 (widowed status) x 2 (task complexity) ANOVAs and ANCOVAs, which found that both complexity and widowed status influence decision-making processes. Higher complexity led to less overall satisfaction, but none of the other satisfaction variables yielded significant results. In addition, there were no significant findings with regard to preferences for collaboration. Multiple linear regressions were conducted to better understand the influence of individual-difference variables on decision processing. Restoration-orientation coping, loss-orientation coping, and task complexity were found to be significant for decision processing and satisfaction measures. Future studies should aim to develop decision aids for this particular population so that they are able to make better decisions.
