
A Curriculum Guide for Integrating Literary Theory into Twelfth Grade Florida English Language Arts

Philpot, Helen 01 January 2007 (has links)
The goal of this thesis was to provide high school students with a course of study for becoming competent, thorough, lifelong independent readers of complex texts. This is accomplished by integrating literary theory that looks beyond the typical level of analysis emphasized in many Florida classrooms. If adopted and successful, this curriculum guide will help Florida teachers give their students a new level of ability to analyze literature. Prior work on integrating critical theory into high school classrooms was analyzed and synthesized in order to create a larger course of critical theory study to be completed during the senior year of high school in the state of Florida. The curriculum guide acts as a starting point, providing teachers with all the tools necessary to bring literary theory into the high school classroom while maintaining their individual teaching styles. The guide is broken into four distinct units that follow the most common course of Florida twelfth-grade study, the English canon, with each chapter addressing two literary theories. The literary theories used are New Criticism, New Historicism, Feminism, Marxism, Reader Response, Psychoanalysis, Structuralism, and Deconstruction.

On the Influence of Structure and Complexity in Perceived Duration

Zeigler, Derek E. January 2013 (has links)
No description available.

From Normality to Pathology: In Defense of Continuity

Petrolini, Valentina, M.A. January 2017 (has links)
No description available.

Neural Decoding of Categorical Features in Naturalistic Social Interactions

Kim, Eunbin 19 December 2018 (has links)
No description available.

Dimensionality Reduction with Non-Gaussian Mixtures

Tang, Yang 11 1900 (has links)
Broadly speaking, cluster analysis is the organization of a data set into meaningful groups, and mixture model-based clustering has recently received wide interest in statistics. Historically, the Gaussian mixture model has dominated the model-based clustering literature. When model-based clustering is performed on a large number of observed variables, it is well known that Gaussian mixture models can represent an over-parameterized solution. To this end, this thesis focuses on the development of novel non-Gaussian mixture models for high-dimensional continuous and categorical data. We developed a mixture of joint generalized hyperbolic models (JGHM), which exhibits different marginal amounts of tail-weight. Moreover, it takes into account the cluster-specific subspace and therefore limits the number of parameters to estimate. This is a novel approach, applicable to high, and potentially very high, dimensional spaces with arbitrary correlation between dimensions. Three different mixture models are developed using forms of the mixture of latent trait models to realize model-based clustering of high-dimensional binary data. A family of mixtures of latent trait models with common slope parameters is developed to reduce the number of parameters to be estimated; this approach facilitates a low-dimensional visual representation of the clusters. We further developed penalized latent trait models for ultra-high-dimensional binary data, which also perform automatic variable selection. For all models and families of models developed in this thesis, the algorithms used for model-fitting and parameter estimation are presented. Real and simulated data sets are used to assess the clustering ability of the models. / Thesis / Doctor of Philosophy (PhD)
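As a rough illustration of the over-parameterization problem this abstract refers to (a back-of-the-envelope sketch, not code from the thesis): the free-parameter count of an unconstrained G-component Gaussian mixture grows quadratically in the dimension d, which is why constrained or non-Gaussian alternatives become attractive in high dimensions.

```python
def gaussian_mixture_params(G: int, d: int) -> int:
    """Free parameters in an unconstrained d-dimensional Gaussian mixture:
    (G - 1) mixing weights + G*d means + G*d(d+1)/2 covariance entries."""
    weights = G - 1
    means = G * d
    covariances = G * d * (d + 1) // 2
    return weights + means + covariances

for d in (2, 10, 100):
    print(d, gaussian_mixture_params(3, d))
# 2 17
# 10 197
# 100 15452
```

Even at a modest d = 100 with three components, the unconstrained model already needs over fifteen thousand parameters, dominated by the covariance terms.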

Reconstruction results for first-order theories

Han, Jesse January 2018 (has links)
In this thesis, we study problems related to the reconstruction (up to bi-interpretability) of first-order theories from various functorial invariants: automorphism groups, endomorphism monoids, (categories of) countable models, and (ultra)categories of models. / Thesis / Master of Science (MSc)

A stochastic process model for transient trace data

Mathur, Anup 05 October 2007 (has links)
Creation of sufficiently accurate workload models of computer systems is a key step in evaluating and tuning these systems. Workload models for an observable system can be built from traces collected by observing the system. This dissertation presents a novel technique to construct non-executable, artificial workload models fitting transient trace data. The trace can be a categorical or numerical time series. The trace is considered a sample realization of a non-stationary stochastic process, {X_t}, such that the random variables X_t follow different probability distributions. To estimate the parameters for the model, a Rate Evolution Graph (REG) is built from the trace data. The REG is a two-dimensional Cartesian graph which plots the number of occurrences of each unique state in the trace on the ordinate and time on the abscissa. The REG contains one path for all instances of each unique state in the trace. The derivative of a REG path at time t is used as an estimate of the probability of occurrence of the corresponding state at t. We use piecewise linear regression to fit straight line segments to each REG path. The slopes of the line segments that fit a REG path estimate the time-dependent probability of occurrence of the corresponding state. The estimates of occurrence probabilities of all unique states in the trace are used to construct a time-dependent joint probability mass function. The joint probability mass function is the representation of the Piecewise Independent Stochastic Process model for the trace. Two methods that assist in compacting the model, while retaining accuracy, are also discussed. / Ph. D.
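The REG construction described above can be sketched in a few lines. This is a simplified illustration, not the dissertation's implementation: it assumes time is just the trace index, and it estimates a path's slope over a fixed window instead of fitting segments by piecewise linear regression.

```python
from collections import defaultdict

def rate_evolution_graph(trace):
    """One REG path per unique state: the cumulative occurrence count
    of that state after each time step (the abscissa is the trace index)."""
    states = sorted(set(trace))
    paths = {s: [] for s in states}
    counts = defaultdict(int)
    for symbol in trace:
        counts[symbol] += 1
        for s in states:
            paths[s].append(counts[s])
    return paths

def window_probability(path, a, b):
    """Slope of a REG path between steps a and b-1: an estimate of the
    state's time-dependent occurrence probability over that window."""
    return (path[b - 1] - path[a]) / (b - 1 - a)

paths = rate_evolution_graph("AABABBBB")
print(paths["B"])                            # [0, 0, 1, 1, 2, 3, 4, 5]
print(window_probability(paths["B"], 4, 8))  # 1.0 -- only B occurs late
```

The toy trace shows the idea: state B's path is nearly flat early on and has slope 1.0 in the second half, recovering the non-stationary shift in its occurrence probability.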

Sémantique algébrique des ressources pour la logique classique / Algebraic resource semantics for classical logic

Novakovic, Novak 08 November 2011 (has links)
The general theme of this thesis is the exploitation of the fruitful interaction between denotational semantics and syntax. Satisfying semantics have been discovered for proofs in intuitionistic and certain linear logics, but for the classical case, solving the problem is notoriously difficult. This work begins with investigations of concrete interpretations of classical proofs in the category of posets and bimodules, resulting in the definition of meaningful invariants of proofs. Then, generalizing this concrete semantics, classical proofs are interpreted in a free symmetric compact closed category where each object is endowed with the structure of a Frobenius algebra. The generalization paves the way for a theory of proof nets for classical proofs. Correctness, cut elimination, and the issue of full completeness are addressed through natural order enrichments defined on the Frobenius category, yielding a category with cut elimination and a concept of resources in classical logic. Revisiting our initial concrete semantics, we show we have a faithful representation of the Frobenius category in the category of posets and bimodules.

The algebra of entanglement and the geometry of composition

Hadzihasanovic, Amar January 2017 (has links)
String diagrams turn algebraic equations into topological moves that have recurring shapes, involving the sliding of one diagram past another. We individuate, at the root of this fact, the dual nature of polygraphs as presentations of higher algebraic theories, and as combinatorial descriptions of "directed spaces". Operations of polygraphs modelled on operations of topological spaces are used as the foundation of a compositional universal algebra, where sliding moves arise from tensor products of polygraphs. We reconstruct several higher algebraic theories in this framework. In this regard, the standard formalism of polygraphs has some technical problems. We propose a notion of regular polygraph, barring cell boundaries that are not homeomorphic to a disk of the appropriate dimension. We define a category of non-degenerate shapes, and show how to calculate their tensor products. Then, we introduce a notion of weak unit to recover weakly degenerate boundaries in low dimensions, and prove that the existence of weak units is equivalent to a representability property. We then turn to applications of diagrammatic algebra to quantum theory. We re-evaluate the category of Hilbert spaces from the perspective of categorical universal algebra, which leads to a bicategorical refinement. Then, we focus on the axiomatics of fragments of quantum theory, and present the ZW calculus, the first complete diagrammatic axiomatisation of the theory of qubits. The ZW calculus has several advantages over ZX calculi, including a computationally meaningful normal form, and a fragment whose diagrams can be read as setups of fermionic oscillators. Moreover, its generators reflect an operational classification of entangled states of 3 qubits. We conclude with generalisations of the ZW calculus to higher-dimensional systems, including the definition of a universal set of generators in each dimension.
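For context on the 3-qubit classification mentioned at the end of this abstract (an illustrative sketch, not code from the thesis): the GHZ and W states are the standard representatives of the two operationally inequivalent classes of genuinely entangled 3-qubit states.

```python
import numpy as np

def ket(bits: str) -> np.ndarray:
    """Computational basis state |bits> of three qubits as a length-8 vector."""
    v = np.zeros(8)
    v[int(bits, 2)] = 1.0
    return v

# Representatives of the two inequivalent classes of genuine tripartite
# entanglement (Dur-Vidal-Cirac): no local operations and classical
# communication can convert one class into the other.
ghz = (ket("000") + ket("111")) / np.sqrt(2)
w = (ket("001") + ket("010") + ket("100")) / np.sqrt(3)

print(np.linalg.norm(ghz), np.linalg.norm(w))  # both ~ 1.0
print(ghz @ w)  # 0.0 -- disjoint basis support, so the states are orthogonal
```

Losing one qubit of a GHZ state leaves a separable mixture, while a W state retains bipartite entanglement; it is this operational distinction that the ZW calculus generators are said to reflect.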

Modern design of experiments methods for screening and experimentation with mixture and qualitative variables

Chantarat, Navara 06 November 2003 (has links)
No description available.
