11

A Novel Concept and Context-Based Approach for Web Information Retrieval

Zakos, John January 2005 (has links)
Web information retrieval is a relatively new research area that has attracted significant interest from researchers around the world since the emergence of the World Wide Web in the early 1990s. The challenges facing successful web information retrieval combine those of traditional information retrieval with those characteristic of the World Wide Web itself. The goal of any information retrieval system is to satisfy a user's information need. In a web setting, this means retrieving as many relevant web documents as possible in response to a query that typically contains only a few terms expressing that need. This thesis first reviews pertinent literature on various aspects of web information retrieval research and then proposes and investigates a novel concept- and context-based approach. The approach consists of techniques that can be used together or independently and aims to improve retrieval accuracy over other approaches. A novel concept-based term weighting technique is proposed as a new method of deriving query term significance from ontologies. A technique that dynamically determines the significance of terms occurring in documents, based on the matching of contexts, is also proposed. Other contributions of this research include techniques for combining document and query term weights when ranking retrieved documents. All techniques were implemented and tested on benchmark data, providing a basis for comparison with previous top-performing web information retrieval systems. High retrieval accuracy is reported as a result of the proposed approach, supported by comprehensive experimental evidence and favourable comparisons against previously published results.
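As a baseline for the term-weighting and rank-combination ideas described above, a minimal TF-IDF ranking sketch — not the thesis's concept- and context-based technique, and with all names illustrative — might look like:

```python
import math
from collections import Counter

def tfidf_rank(query_terms, documents):
    """Rank documents against a query by TF-IDF weighted overlap.

    A generic baseline: the thesis instead derives query-term weights
    from ontologies and document-term weights from context matching,
    neither of which is reproduced here.
    """
    n = len(documents)
    df = Counter()                      # document frequency per term
    for doc in documents:
        df.update(set(doc))
    ranked = []
    for idx, doc in enumerate(documents):
        tf = Counter(doc)               # term frequency in this document
        # Sum TF * smoothed IDF over the query terms (0 if term absent).
        score = sum(tf[t] * math.log((n + 1) / (df[t] + 1)) for t in query_terms)
        ranked.append((score, idx))
    ranked.sort(reverse=True)
    return [idx for _, idx in ranked]

docs = [["web", "retrieval", "query"], ["ontology", "concept"], ["web", "query"]]
print(tfidf_rank(["web", "query"], docs))  # document indices, best first
```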
12

Robust Controllers Design by Loop Shaping Approach

Li, Chien-Te 03 September 2001 (has links)
This thesis proposes a new method for designing H∞ loop-shaping robust controllers through the choice of weighting functions. The author first introduces the concept of SISO loop-shaping design, which uses the small gain theorem to achieve robust stability and develops the relationship between the open-loop transfer function L and the robust performance and robust stability of the system. These concepts extend to the H∞ loop-shaping method. For that method, the author first introduces the robust stability problem under the coprime-factor framework and the theory of H∞ loop shaping, and then discusses the relationship between the stability margin and systems with different pole-zero configurations. Generally speaking, loop-shaping control theory is mainly used to trade off the stability and performance of the system. Because the system can meet its performance requirements through the choice of weighting function, the author proposes a new method for designing H∞ loop-shaping controllers for MIMO systems by choosing weighting functions within the H∞ loop-shaping framework. Finally, the author compares the results of the new method with those reported in the literature.
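For orientation, the standard results the abstract leans on can be stated as follows (the textbook small gain and coprime-factor loop-shaping formulations, not this thesis's specific weighting-function design):

```latex
% Small gain theorem: the feedback interconnection of stable systems
% M and \Delta is internally stable if
\| M \|_\infty \, \| \Delta \|_\infty < 1 .

% Coprime-factor loop shaping: with G = \tilde{M}^{-1} \tilde{N} a
% normalized left coprime factorization, the stability margin of a
% stabilizing controller K is
b(G, K) = \left\| \begin{bmatrix} K \\ I \end{bmatrix}
          (I - G K)^{-1} \tilde{M}^{-1} \right\|_\infty^{-1} ,
% and K stabilizes every perturbed plant
% G_\Delta = (\tilde{M} + \Delta_M)^{-1} (\tilde{N} + \Delta_N)
% with \| [\, \Delta_N \;\; \Delta_M \,] \|_\infty < \epsilon
% if and only if b(G, K) \ge \epsilon .
```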
13

Application of Discrete Choice Models to Samples Based on Complex Endogenous Sampling

KITAMURA, Ryuichi; SAKAI, Hiroshi; YAMAMOTO, Toshiyuki 01 1900 (has links)
No description available.
14

Valuation of environmental impacts and its use in environmental systems analysis tools

Ahlroth, Sofia January 2009 (has links)
Valuation of environmental impacts in monetary terms is both a difficult and a controversial undertaking. However, the need to highlight the value of ecosystem services in policy decisions has become more and more evident in the face of climate change and diminishing biodiversity in the sea and other ecosystems. Valuing non-market goods and services, like ecosystem services, is a lively research field within environmental economics, and valuation methods have been considerably elaborated over the last ten years. In practical policy analyses, there is often a need for readily available valuations of different impacts. This thesis explores and develops several ways to include valuation of environmental impacts in different policy tools, such as cost-benefit analysis, environmental accounting and life-cycle analysis. The first paper in this thesis is part of the Swedish effort to construct and calculate an environmentally adjusted NDP (net national product). This work involved putting a price on non-marketed environmental goods and assets, and the valuation methods used in paper I cover many of the available methods for valuing non-marketed goods and services. Valuation of environmental impacts and/or environmental pressures is used in a number of environmental systems analysis tools besides environmental accounting; examples are cost-benefit analysis, life cycle assessment, life cycle cost analysis, strategic environmental assessment and environmental management systems. These tools have been developed in different contexts and for different purposes, and the way valuation is used also differs. Paper II explores the current use of values and weights in these tools, the usefulness of a common valuation/weighting scheme, and the necessary qualities of such a scheme. In the third paper, a set of generic weights meeting these criteria is developed. Some of the generic values in the weighting set are taken directly from other studies, while others are calculated by applying a benefit transfer method called structural benefit transfer to results from selected valuation studies. The method is tested on a number of valuation studies in the fourth paper. Climate change will have a significant impact on Sweden during this century, both positive and negative. The fifth paper presents a rough estimate of the impacts on man-made capital and human health; the study is an example of an impact assessment that includes only marketed assets valued at market prices. The last paper discusses the economics of sustainable energy use: what is a sustainable energy price, and how might growth be affected if energy use is limited to a sustainable level? The discussion is based on two different models of thought: a back-casting study describing what a sustainable future society might look like, and economic scenarios projected with general equilibrium models.
15

A detailed, stochastic population balance model for twin-screw wet granulation

McGuire, Andrew Douglas January 2018 (has links)
This thesis concerns the construction of a detailed, compartmental population balance model for twin-screw granulation using the stochastic weighted particle method. A number of new particle mechanisms are introduced and existing ones augmented, including immersion nucleation, coagulation, breakage, consolidation, liquid penetration, primary particle layering and transport. The model's predictive power is assessed over a range of liquid-solid mass feed ratios using existing experimental data, and the model is shown to qualitatively capture key experimental trends in the physical characteristics of the granular product. As part of the model development process, a number of numerical techniques for the stochastic weighted particle method are constructed in order to solve the population balance model efficiently. These include a new stochastic implementation of the immersion nucleation mechanism and a variable weighted inception algorithm that dramatically reduces the number of computational particles (and hence the computational power) required to solve the model. Optimum operating values for the free numerical parameters and the general convergence properties of the complete simulation algorithm are investigated in depth. The model is further refined through the use of distinct primary particle and aggregate population balances, which are coupled to simulate the complete granular system. The nature of this coupling permits the inclusion of otherwise computationally prohibitive mechanisms, such as primary particle layering, in the process description. A new methodology for assigning representative residence times to simulation compartments, based on screw geometry, is presented. This residence time methodology is used in conjunction with the coupled population balance framework to model twin-screw systems with a number of different screw configurations. The refined model is shown to capture key trends attributed to screw element geometry, in particular the ability of kneading elements to distribute liquid across the granular mass.
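To illustrate the stochastic weighted particle idea the abstract builds on — computational particles carry statistical weights so that far fewer of them can represent a large physical population — here is a generic single coagulation event. This is a sketch under simplified assumptions, not the thesis's algorithm; the weight-transfer rule shown is only one of several used in the literature:

```python
import random

def coagulation_event(particles, kernel):
    """Perform one coagulation jump on a weighted particle ensemble.

    particles: list of [size, weight] pairs, where the weight is the
    number of physical particles the computational particle represents.
    A candidate pair (i, j) is accepted with probability proportional
    to kernel(x_i, x_j) * w_j (rejection sampling against a majorant);
    particle i then absorbs j's size while keeping its own weight,
    and j is left unchanged.
    """
    n = len(particles)
    # Majorant over all pairs for the rejection step.
    majorant = max(kernel(p[0], q[0]) * q[1]
                   for p in particles for q in particles)
    while True:
        i, j = random.sample(range(n), 2)
        rate = kernel(particles[i][0], particles[j][0]) * particles[j][1]
        if random.random() < rate / majorant:
            break
    particles[i][0] += particles[j][0]  # i absorbs j's size; weights unchanged
    return i, j

# Example: one event with the additive kernel K(x, y) = x + y
ensemble = [[1.0, 10.0], [2.0, 5.0], [4.0, 2.0]]
coagulation_event(ensemble, kernel=lambda x, y: x + y)
print(ensemble)
```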
16

Kriging radio environment map construction

Lundqvist, Erik January 2022 (has links)
With the massive increase in the usage of some parts of the electromagnetic spectrum over the last decades, the ability to create real-time maps of signal coverage is now more important than ever. This Master's project tests two different methods of generating such maps under a one-second limit on processing time. The interpolation methods under consideration are inverse distance weighting and kriging. Several different variants of kriging are considered and compared, some of which were implemented specifically for the project and one of which was designed by a third party. The data used is acquired from an antenna array inside a laboratory room at LTU rather than being simulated, and is collected with the transmitter at several different positions in the room to make sure the interpolation works consistently. The results show only small differences in both the mean and the median of the absolute error when comparing inverse distance weighting and kriging, and the variation between transmitter positions is significant enough that no single variant is consistently best by that metric. At a resolution with 25 cm² pixels, both methods stay well under the one-second time limit; if the resolution is increased to 1 cm² pixels, neither method can consistently update the map at the required pace. Kriging, however, can generate values outside the range of observed values, which could make the extra implementation effort worthwhile, since that characteristic might be very useful for finding the transmitter.
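As a point of reference, here is a minimal sketch of inverse distance weighting, the simpler of the two methods compared (the kriging variants are more involved and not reproduced here); the array names and example values are illustrative:

```python
import numpy as np

def idw(known_xy, known_values, query_xy, power=2.0, eps=1e-12):
    """Inverse distance weighting: each query point takes a weighted
    average of the observations, with weights proportional to 1/d^power.

    known_xy: (n, 2) sensor positions; known_values: (n,) measurements;
    query_xy: (m, 2) map pixels to estimate. Returns (m,) estimates.
    """
    # Pairwise distances between query points and sensors: (m, n)
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power          # eps avoids division by zero
    w /= w.sum(axis=1, keepdims=True)     # normalize rows to sum to 1
    return w @ known_values

# Example: interpolate received signal strength (dBm) at two map pixels
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
rssi = np.array([-40.0, -55.0, -60.0])
grid = np.array([[0.5, 0.5], [0.9, 0.1]])
print(idw(sensors, rssi, grid))
```

Because IDW estimates are convex combinations of the observations, they can never fall outside the observed range — precisely the limitation the abstract notes kriging avoids.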
17

Does the Future Look Bright? Visual Imagery Perspective Moderates the Impact of Trait Biases in Expectations

Niese, Zachary Adolph 09 June 2015 (has links)
No description available.
18

On the Relation between Valence Weighting and Self-Regulation

Granados Samayoa, Javier Andre 12 October 2017 (has links)
No description available.
19

A comparative analysis of areal interpolation methods

Hawley, Kevin J. January 2005 (has links)
No description available.
20

Applications of Information Theory to Machine Learning

Bensadon, Jérémy 02 February 2016 (has links)
We study two different topics, using insight from information theory in both cases. 1) Context Tree Weighting is a text compression algorithm that efficiently computes the Bayesian combination of all visible Markov models: we build a "context tree", with deeper nodes corresponding to more complex models, and the mixture is computed recursively, starting from the leaves. We extend this idea to a more general setting that also encompasses density estimation and regression, and we investigate the benefits of replacing regular Bayesian inference with switch distributions, which put a prior on sequences of models instead of single models. 2) Information Geometric Optimization (IGO) is a general framework for black-box optimization that recovers several state-of-the-art algorithms, such as CMA-ES and xNES. The initial problem is transferred to a Riemannian manifold, yielding a parametrization-invariant first-order differential equation. However, since time is discretized in practice, this invariance only holds up to first order. We introduce the Geodesic IGO (GIGO) update, which uses this Riemannian manifold structure to define a fully parametrization-invariant algorithm. Thanks to Noether's theorem, we obtain a first-order differential equation satisfied by the geodesics of the statistical manifold of Gaussians, allowing us to compute the corresponding GIGO update. Finally, we show that while GIGO and xNES differ in general, it is possible to define a new "almost parametrization-invariant" algorithm, Blockwise GIGO, that recovers xNES from abstract principles.
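For orientation, the core Context Tree Weighting recursion that the first part of the thesis generalizes (the standard formulation, not the extension to density estimation and regression developed here):

```latex
% Krichevsky--Trofimov estimate at context node s, after a zeros and b ones:
P_e^{s}(a, b) = \frac{\prod_{i=0}^{a-1}\bigl(i + \tfrac12\bigr)\,
                      \prod_{j=0}^{b-1}\bigl(j + \tfrac12\bigr)}
                     {\prod_{k=0}^{a+b-1}(k + 1)}

% Weighted probability: a leaf uses P_e directly; an internal node
% mixes "stop here" with "split on one more symbol of context":
P_w^{s} = \tfrac12\, P_e^{s} + \tfrac12\, P_w^{0s}\, P_w^{1s}
```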
