401

Bayesian methods for knowledge transfer and policy search in reinforcement learning

Wilson, Aaron (Aaron Creighton) 28 July 2012 (has links)
How can an agent generalize its knowledge to new circumstances? To learn effectively an agent acting in a sequential decision problem must make intelligent action selection choices based on its available knowledge. This dissertation focuses on Bayesian methods of representing learned knowledge and develops novel algorithms that exploit the represented knowledge when selecting actions. Our first contribution introduces the multi-task Reinforcement Learning setting in which an agent solves a sequence of tasks. An agent equipped with knowledge of the relationship between tasks can transfer knowledge between them. We propose the transfer of two distinct types of knowledge: knowledge of domain models and knowledge of policies. To represent the transferable knowledge, we propose hierarchical Bayesian priors on domain models and policies respectively. To transfer domain model knowledge, we introduce a new algorithm for model-based Bayesian Reinforcement Learning in the multi-task setting which exploits the learned hierarchical Bayesian model to improve exploration in related tasks. To transfer policy knowledge, we introduce a new policy search algorithm that accepts a policy prior as input and uses the prior to bias policy search. A specific implementation of this algorithm is developed that accepts a hierarchical policy prior. The algorithm learns the hierarchical structure and reuses components of the structure in related tasks. Our second contribution addresses the basic problem of generalizing knowledge gained from previously-executed policies. Bayesian Optimization is a method of exploiting a prior model of an objective function to quickly identify the point maximizing the modeled objective. Successful use of Bayesian Optimization in Reinforcement Learning requires a model relating policies and their performance. Given such a model, Bayesian Optimization can be applied to search for an optimal policy. 
Early work using Bayesian Optimization in the Reinforcement Learning setting ignored the sequential nature of the underlying decision problem. The work presented in this thesis explicitly addresses this problem. We construct new Bayesian models that take advantage of sequence information to better generalize knowledge across policies. We empirically evaluate the value of this approach in a variety of Reinforcement Learning benchmark problems. Experiments show that our method significantly reduces the amount of exploration required to identify the optimal policy. Our final contribution is a new framework for learning parametric policies from queries presented to an expert. In many domains it is difficult to provide expert demonstrations of desired policies. However, it may still be a simple matter for an expert to identify good and bad performance. To take advantage of this limited expert knowledge, our agent presents experts with pairs of demonstrations and asks which of the demonstrations best represents a latent target behavior. The goal is to use a small number of queries to elicit the latent behavior from the expert. We formulate a Bayesian model of the querying process, an inference procedure that estimates the posterior distribution over the latent policy space, and an active procedure for selecting new queries for presentation to the expert. We show, in multiple domains, that the algorithm successfully learns the target policy and that the active learning strategy generally improves the speed of learning. / Graduation date: 2013
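The Bayesian Optimization loop this abstract describes can be sketched with a Gaussian-process surrogate over policy parameters and an expected-improvement acquisition rule. This is an illustrative sketch under simplifying assumptions (a one-dimensional policy parameter and a made-up `policy_return` function standing in for a noisy rollout), not the thesis's algorithm, which additionally exploits sequence information from executed trajectories.

```python
import numpy as np
from math import erf, sqrt

def rbf_kernel(A, B, length=0.3):
    """Squared-exponential covariance between 1-D parameter values."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / length**2)

def gp_posterior(X, y, Xs, noise=1e-4):
    """GP posterior mean/std at test points Xs given data (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v**2, axis=0)          # prior variance is 1 on the diagonal
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    """EI acquisition: expected gain over the best observed return."""
    z = (mu - best) / sd
    Phi = 0.5 * (1 + np.array([erf(zi / sqrt(2)) for zi in z]))
    phi = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    return (mu - best) * Phi + sd * phi

def policy_return(theta):
    # Hypothetical stand-in for the return of a rollout under policy theta.
    return -(theta - 0.6) ** 2 + 0.1 * np.sin(8 * theta)

grid = np.linspace(0, 1, 200)
X = np.array([0.1, 0.9])                      # initial policies tried
y = policy_return(X)
for _ in range(10):                           # BO loop: model, acquire, evaluate
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sd, y.max()))]
    X = np.append(X, x_next)
    y = np.append(y, policy_return(x_next))

best_theta = X[np.argmax(y)]
```

The key design point the abstract hinges on is the surrogate: with a model relating policies to performance, each rollout updates the posterior everywhere, so far fewer policy evaluations are needed than with model-free search.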
402

Data augmentation for latent variables in marketing

Kao, Ling-Jing, January 2006 (has links)
Thesis (Ph. D.)--Ohio State University, 2006. / Title from first page of PDF file. Includes bibliographical references (p. 215-219).
403

Sigma-Point Kalman Filters for Probabilistic Inference in Dynamic State-Space Models

Van der Merwe, Rudolph 04 1900 (has links) (PDF)
Ph.D. / Electrical and Computer Engineering / Probabilistic inference is the problem of estimating the hidden variables (states or parameters) of a system in an optimal and consistent fashion as a set of noisy or incomplete observations of the system becomes available online. The optimal solution to this problem is given by the recursive Bayesian estimation algorithm which recursively updates the posterior density of the system state as new observations arrive. This posterior density constitutes the complete solution to the probabilistic inference problem, and allows us to calculate any "optimal" estimate of the state. Unfortunately, for most real-world problems, the optimal Bayesian recursion is intractable and approximate solutions must be used. Within the space of approximate solutions, the extended Kalman filter (EKF) has become one of the most widely used algorithms with applications in state, parameter and dual estimation. Unfortunately, the EKF is based on a sub-optimal implementation of the recursive Bayesian estimation framework applied to Gaussian random variables. This can seriously affect the accuracy or even lead to divergence of any inference system that is based on the EKF or that uses the EKF as a component part. Recently a number of related novel, more accurate and theoretically better motivated algorithmic alternatives to the EKF have surfaced in the literature, with specific application to state estimation for automatic control. We have extended these algorithms, all based on derivativeless deterministic sampling based approximations of the relevant Gaussian statistics, to a family of algorithms called Sigma-Point Kalman Filters (SPKF). Furthermore, we successfully expanded the use of this group of algorithms (SPKFs) within the general field of probabilistic inference and machine learning, both as stand-alone filters and as subcomponents of more powerful sequential Monte Carlo methods (particle filters). 
We have consistently shown that there are large performance benefits to be gained by applying Sigma-Point Kalman filters to areas where EKFs have been used as the de facto standard in the past, as well as in new areas where the use of the EKF is impossible.
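The sigma-point idea can be illustrated with the unscented transform, the core building block of SPKF-type filters: a small, deterministically chosen set of points captures the Gaussian statistics and is pushed through the nonlinearity, with no derivatives required. This is a minimal sketch of the transform only, not the full filter family; the scaling-parameter defaults are chosen for simplicity rather than taken from the thesis.

```python
import numpy as np

def sigma_points(mean, cov, alpha=1.0, beta=2.0, kappa=0.0):
    """Deterministic sigma points and weights for an n-dim Gaussian."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)        # matrix square root
    pts = [mean] + [mean + S[:, i] for i in range(n)] \
                 + [mean - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))  # mean weights
    wc = wm.copy()                                  # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    return np.array(pts), wm, wc

def unscented_transform(f, mean, cov):
    """Propagate (mean, cov) through a nonlinearity f via sigma points."""
    pts, wm, wc = sigma_points(mean, cov)
    ys = np.array([f(p) for p in pts])
    y_mean = wm @ ys
    diff = ys - y_mean
    y_cov = (wc[:, None] * diff).T @ diff
    return y_mean, y_cov

mean = np.array([1.0, 2.0])
cov = np.array([[0.5, 0.1], [0.1, 0.3]])
g = lambda x: np.array([np.sin(x[0]), x[0] * x[1]])  # example nonlinearity
y_mean, y_cov = unscented_transform(g, mean, cov)
```

Unlike the EKF's first-order linearization, the transform is exact for linear maps and matches the true posterior statistics to at least second order for smooth nonlinearities, which is the accuracy argument behind the SPKF family.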
404

Den bångstyriga verkligheten : Har det svenska systemskiftet haft någon betydelse för arbetet med elever i behov av stöd? / The unruly reality : Have the changes in the Swedish political and economic system had a significant effect on the organization of assistance for pupils with special needs?

Löfquist, Staffan January 1999 (has links)
This study compares the manner in which local schools organize to address the policy problem of support for pupils with special needs. Since the end of the 1970s, it has been a central aim to decentralise basic comprehensive education in Sweden. Several reforms have totally remapped the formal organization structures and how educational resources are allocated from the state to municipalities. The role of central and regional state administration has shifted from being highly involved in regulating state grants to evaluating implementation of state goals. The implementation structure approach used in research relies primarily upon semi-structured interviews with members of schools to identify who is involved in the tasks of defining needs, deciding priorities among needs, mobilising resources to alleviate needs and evaluating the work. From the teachers selected as the point of entry into schools, the interviewing proceeded to other members of the municipal educational system, within or outside the local school district. In each municipality one local school district was selected for study, and in each school district the study focuses on the upper level of the comprehensive school. Care has been taken to select schools of a certain size and to find schools with non-selective school populations. The same schools were studied in 1986 and 1995. One conclusion from the interviews in 1986 was that political intentions and the special SR-grant had minor or non-existent implications for the work being done in schools. It was of no interest where resources came from. This same conclusion can still be made in 1995. The review of how municipal education committees organized allocation of the SR-grant in 1986 did not indicate that they had acted to develop areas or criteria to direct more actively the use of resources in schools. This study argues that the capacity of municipal education committees to actively participate in the work has actually deteriorated.
Decentralization of formal powers, combined with declining resources, actually worsened the situation in this respect between 1986 and 1995. Variation between schools in total resources has declined, but to a level below the expected total amount of resources in 1986. A comparison of how schools actually allocated resources to different educational purposes shows that the variation is immense both in 1986 and in 1995, though in different ways. Some schools gave priority to lowering average group size, while others gave priority to special education teachers. In neither case is there evidence that pupil needs had anything to do with these priorities. The lack of evaluation of support for pupils with special needs is the foremost problem, both for those responsible for schools and for policy at higher levels.
405

Information and communication in public affairs management: an integration experiment on the third-person effect

Yang, Yung-Ho 26 July 2007 (has links)
Within the integrated public affairs management framework, mass communication media not only transmit information but also have the characteristics of an economic industry. People therefore need a critical, neutral understanding of media-driven issues so that their judgments of fact are not unduly influenced. Research on the third-person effect has shown that people hold cognitive biases about the influence of communicated information, which makes them easy targets of manipulation during elections. To understand how subjects perceive and judge such information, we designed an Information Integration Theory study with three major factors and two news scenarios, negative and positive, concerning THSR (Taiwan High-Speed Railway). We also compared respondents from southern and northern Taiwan to examine regional imbalances in mass-media development and population characteristics. The findings are as follows: 1. Under the negative-news scenario, northern respondents with different levels of issue involvement showed a stronger third-person effect. 2. Under the negative-news scenario, northern respondents' third-person effect was negatively correlated with the perceived reliability of the information. 3. Media exposure was positively correlated with the third-person effect. 4. Respondents' information integration followed the adding rule. 5. Northern respondents showed more intention to support media regulation through public opinion and action. Finally, we offer several suggestions: 1. Regional imbalances in Taiwan extend beyond the north-south divide, and more resources should be invested in studying and documenting them. 2. This study covers only a small part of the public affairs management framework; future researchers could go deeper and wider. 3. The statistical tools of Information Integration Theory should be updated to simplify data entry.
406

Contributions to Bayesian wavelet shrinkage

Remenyi, Norbert 07 November 2012 (has links)
This thesis provides contributions to research in Bayesian modeling and shrinkage in the wavelet domain. Wavelets are a powerful tool to describe phenomena rapidly changing in time, and wavelet-based modeling has become a standard technique in many areas of statistics, and more broadly, in sciences and engineering. Bayesian modeling and estimation in the wavelet domain have found useful applications in nonparametric regression, image denoising, and many other areas. In this thesis, we build on the existing techniques and propose new methods for applications in nonparametric regression, image denoising, and partially linear models. The thesis consists of an overview chapter and four main topics. In Chapter 1, we provide an overview of recent developments and the current status of Bayesian wavelet shrinkage research. The chapter contains an extensive literature review consisting of almost 100 references. The main focus of the overview chapter is on nonparametric regression, where the observations come from an unknown function contaminated with Gaussian noise. We present many methods which employ model-based and adaptive shrinkage of the wavelet coefficients through Bayes rules. These include new developments such as dependence models, complex wavelets, and Markov chain Monte Carlo (MCMC) strategies. Some applications of Bayesian wavelet shrinkage, such as curve classification, are discussed. In Chapter 2, we propose the Gibbs Sampling Wavelet Smoother (GSWS), an adaptive wavelet denoising methodology. We use the traditional mixture prior on the wavelet coefficients, but also formulate a fully Bayesian hierarchical model in the wavelet domain accounting for the uncertainty of the prior parameters by placing hyperpriors on them. Since a closed-form solution to the Bayes estimator does not exist, the procedure is computational: the posterior mean is computed via MCMC simulation.
We show how to efficiently develop a Gibbs sampling algorithm for the proposed model. The developed procedure is fully Bayesian, is adaptive to the underlying signal, and provides good denoising performance compared to state-of-the-art methods. Application of the method is illustrated on a real data set arising from the analysis of metabolic pathways, where an iterative shrinkage procedure is developed to preserve the mass balance of the metabolites in the system. We also show how the methodology can be extended to complex wavelet bases. In Chapter 3, we propose a wavelet-based denoising methodology based on a Bayesian hierarchical model using a double Weibull prior. The interesting feature is that in contrast to the mixture priors traditionally used by some state-of-the-art methods, the wavelet coefficients are modeled by a single density. Two estimators are developed, one based on the posterior mean and the other based on the larger posterior mode; and we show how to calculate these estimators efficiently. The methodology provides good denoising performance, comparable even to state-of-the-art methods that use a mixture prior and an empirical Bayes setting of hyperparameters; this is demonstrated by simulations on standard test functions. An application to a real-world data set is also considered. In Chapter 4, we propose a wavelet shrinkage method based on a neighborhood of wavelet coefficients, which includes two neighboring coefficients and a parental coefficient. The methodology is called Lambda-neighborhood wavelet shrinkage, motivated by the shape of the considered neighborhood. We propose a Bayesian hierarchical model using a contaminated exponential prior on the total mean energy in the Lambda-neighborhood. The hyperparameters in the model are estimated by the empirical Bayes method, and the posterior mean, median, and Bayes factor are obtained and used in the estimation of the total mean energy.
Shrinkage of the neighboring coefficients is based on the ratio of the estimated and observed energy. The proposed methodology is comparable and often superior to several established wavelet denoising methods that utilize neighboring information, which is demonstrated by extensive simulations. An application to a real-world data set from inductance plethysmography is considered, and an extension to image denoising is discussed. In Chapter 5, we propose a wavelet-based methodology for estimation and variable selection in partially linear models. The inference is conducted in the wavelet domain, which provides a sparse and localized decomposition appropriate for nonparametric components with various degrees of smoothness. A hierarchical Bayes model is formulated on the parameters of this representation, where the estimation and variable selection is performed by a Gibbs sampling procedure. For both the parametric and nonparametric part of the model we are using point-mass-at-zero contamination priors with a double exponential spread distribution. In this sense we extend the model of Chapter 2 to partially linear models. Only a few papers in the area of partially linear wavelet models exist, and we show that the proposed methodology is often superior to the existing methods with respect to the task of estimating model parameters. Moreover, the method is able to perform Bayesian variable selection by a stochastic search for the parametric part of the model.
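The transform-shrink-invert pattern underlying all of these estimators can be illustrated with a deliberately simple, non-Bayesian stand-in: a one-level Haar transform with soft thresholding at the universal threshold (classical VisuShrink). The thesis's methods replace this fixed rule with adaptive Bayes rules under hierarchical priors; the sketch shows only the shared mechanics.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform (len(x) must be even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    """Invert one level of the Haar transform."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft_threshold(d, t):
    """Shrink coefficients toward zero; kill those below the threshold."""
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

def denoise(y, sigma):
    a, d = haar_dwt(y)
    t = sigma * np.sqrt(2 * np.log(len(y)))   # universal threshold
    return haar_idwt(a, soft_threshold(d, t))

rng = np.random.default_rng(0)
n = 512
x = np.sin(2 * np.pi * np.arange(n) / n)      # smooth true signal
y = x + 0.3 * rng.standard_normal(n)          # noisy observations
xhat = denoise(y, 0.3)
```

Because the transform is orthonormal, i.i.d. Gaussian noise stays i.i.d. in the wavelet domain while a smooth signal concentrates in few large coefficients; shrinkage therefore removes noise with little bias, and the Bayesian rules in the thesis can be read as principled, data-adaptive replacements for the `soft_threshold` step.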
407

A brief discourse on human conduct in economics

Hayes, Ethan 06 July 2006
Since the transformation from Political Economy to Economics, and from Classical to Neoclassical theory, in the late nineteenth century, a theory of human behavior has constituted the foundation upon which all economic theory is based and developed. Two theories of human behavior, developed by William Stanley Jevons and Carl Menger, are generally credited with ushering in this Marginalist Revolution. Jevons's marginal utility theory, popularized by Alfred Marshall, is still extensively used today, while Menger's Austrian approach was effectively removed from academic discussion in the 1930s, mainly as a result of the annexation of Austria and the dissolution of the Austrian School of Economics. Given the inability of economists to fully operationalize marginal utility theory and to realistically explain and resolve a broad range of behavioral anomalies using Neoclassical and Post-Neoclassical economics, this thesis examines the most fundamental issues of human behavior in economics. It argues that utility theory and modern Neoclassical and Post-Neoclassical economics are flawed, and that a realistic theory of human behavior, developed from the scholarly work of the early Austrian economists, can be used to develop the basis of a scientific economics derived from observation, one that holds the potential to expand the scope of economic understanding, redirect the focus of the discipline, and possibly unify the many disparate theories in the field.
408

Bayesian model class selection on regression problems

Mu, He Qing January 2010 (has links)
University of Macau / Faculty of Science and Technology / Department of Civil and Environmental Engineering
409

Essays on Labor Supply Dynamics, Home Production, and Case-based Preferences

Naaman, Michael 24 July 2013 (has links)
In this paper we examine models that incorporate case-based decision theory (CBDT). In the first chapter, we examine CBDT in detail, including a reinterpretation of the standard labor supply problem under a wage tax in a partial equilibrium model where preferences exhibit characteristics of CBDT. In the second chapter, we extend the labor supply decision under a wage tax by incorporating a household production function. Utility maximization by repeated substitution is applied as a novel approach to solving dynamic optimization problems; this approach allows us to find labor supply elasticities that evolve over the life cycle. In the third chapter, CBDT is explored in more depth, focusing on its applicability to representing people's preferences over movie rentals in the Netflix competition. Building on the theoretical model introduced in Chapter 1, this chapter expresses, among other things, the rating of any customer-movie pair using the ratings of similar movies rated by that customer and the ratings of the movie in question by similar customers. We also explore in detail the econometric model used in the Netflix competition, which utilizes machine learning and spatial regression to estimate customers' preferences.
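The case-based idea of scoring a customer-movie pair by the ratings of similar customers can be sketched as a similarity-weighted average. This is a minimal memory-based collaborative-filtering sketch on invented toy data, not the thesis's econometric model; the ratings matrix and the Pearson-style similarity are assumptions made for illustration.

```python
import numpy as np

# Toy ratings matrix: rows = customers, columns = movies, 0 = unrated.
# The data are invented for illustration only.
R = np.array([
    [5., 4., 0., 1.],
    [4., 5., 1., 1.],
    [1., 1., 5., 4.],
    [1., 2., 4., 4.],
])

def user_mean(u):
    """Average rating a customer gives, over rated movies only."""
    return u[u > 0].mean()

def similarity(u, v):
    """Pearson-style similarity over movies rated by both customers."""
    both = (u > 0) & (v > 0)
    if both.sum() < 2:
        return 0.0
    a = u[both] - user_mean(u)
    b = v[both] - user_mean(v)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def predict(R, cust, movie):
    """Similarity-weighted average of other customers' centred ratings."""
    num = den = 0.0
    for other in range(R.shape[0]):
        if other == cust or R[other, movie] == 0:
            continue
        s = similarity(R[cust], R[other])
        num += s * (R[other, movie] - user_mean(R[other]))
        den += abs(s)
    base = user_mean(R[cust])
    return base + num / den if den else base

pred = predict(R, cust=0, movie=2)   # customer 0 has not seen movie 2
```

Here the "cases" are past customer-movie experiences, and the similarity weights play the role of CBDT's similarity function: customer 0's predicted rating is pulled down because the most similar customer rated movie 2 poorly while dissimilar customers rated it highly.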
410

Decisions that make things work better: an analysis of the quality concept

Camps Lorente, Oriol 06 July 2012 (has links)
The present thesis is aimed at analyzing the concept of quality and at discussing, in a unified manner, its role not only in operations management but also in strategic thinking. It criticizes the widespread view that quality is meeting the client’s needs and expectations in such a way that the gap between perceptions and expectations is minimized. Essentially it develops a systematic proposal in order to understand the concept on the intuitive basis that quality is tantamount to how well something works for a given purpose. The analysis is based on the fact that any time the quality concept is used there are actors that carry out an action with the help of a means; thus quality is a particular sort of means-ends fitness. Roughly speaking, the quality of a means is its capability to improve the expected consequences of the action. The analysis shows under which conditions this conclusion can be understood in terms of multi-attribute preference orderings under uncertainty; some ideas from decision theory, which are required in order to do that, are presented. In short, whether the expected consequences improve or not depends on an assumed preference ordering that has to be correct given the actors’ circumstances and purposes, but it may be distinct from the actual preferences of many individual actors. Quality is neither subjective (it does not change depending on the psychological processes of any particular individual) nor objective (in a sense, it depends on action and cognition), but it is relative to a given set of reference preferences. Some conditions apply to what counts as a means, how it relates to an end, which attributes are relevant to assess consequences, or which reference preferences are well-formed. In particular, I discuss to which kind of means the concept is properly applied. As a complement, a basic model of means-ends relationships (built on several properties of Boolean functions) is presented. 
At a slightly more technical level, it shows relevant insights, but strictly speaking it is not required in order to understand the rest of the thesis. Quality appears in management under two interrelated forms: (a) organizations’ interventions in the quality of what they use and provide and (b) organizations’ initiatives to compete through quality. (a) The basic structure behind quality management is examined under the lens of the quality concept’s analysis. The following issues are discussed: setting quality criteria, product design, process design, onsite planning, onsite control, standardization, product improvement, process improvement and rethinking reference preferences. (b) The role of quality in competitive advantage and sustainable profitability depends on how quality relates to entry barriers. I show different ways in which quality can interact (if it does) with product differentiation, experience effects, scope economies, reputation, capital requirements, access to distribution channels, switching costs, legal barriers or scale economies. The strength of quality as a driver of profitability is discussed; the conclusion is that it is not easy to build sustainable competitive advantages on the basis of quality alone and that its complementary role in competition has several aspects that need to be taken into account. Finally, the particular example of how the quality concept works in relation to information flows is treated in some detail. Information flows are processes that move information from the firm’s inner or outer environment to actions. Decisive factors of their quality (basically, information asymmetries and coherence) are discussed. Two actual case studies are presented. The example of information flows is aimed at showing the motivation for a general analysis of the quality concept beyond slogan-like statements about clients, products, perceptions and expectations.
