  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

The effects of working memory and speech rate on lexical ambiguity resolution /

Kadulina, Yara. January 2006 (has links)
No description available.
42

Visual encoding in short-term memory.

Hiles, David Roger January 1973 (has links)
No description available.
43

Short-long-term memory interaction with underlearned long term storage.

Fergenson, P. Everett 01 January 1965 (has links) (PDF)
No description available.
44

The role of working memory during concept attainment : maintaining hypotheses and managing feedback

Sadesky, Gregory S. (Gregory Steven) January 1994 (has links)
No description available.
45

A Comparison of Statistical Filtering Methods for Automatic Term Extraction for Domain Analysis

Tilley, Jason W. 13 May 2009 (has links)
Fourteen word frequency metrics were tested to evaluate their effectiveness in identifying vocabulary in a domain. Fifteen domain engineering projects were examined to measure how closely the vocabularies selected by the fourteen metrics matched the vocabularies produced by domain engineers. Six filtering mechanisms were also evaluated to measure their impact on selecting proper vocabulary terms. The results of the experiment show that stemming and stop-word removal do improve overlap scores and that term frequency is a valuable contributor to overlap. Variations on term frequency do not always significantly improve overlap. / Master of Science
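The pipeline the abstract describes — stop-word removal, stemming, frequency ranking, then an overlap score against an engineer-built vocabulary — can be sketched as follows. The corpus, stop-word list, and crude suffix stripper below are illustrative stand-ins, not the thesis's actual data, metrics, or stemmer.

```python
# A minimal sketch of term-frequency filtering for domain vocabulary
# extraction. STOP_WORDS and crude_stem are hypothetical simplifications;
# a real system would use a full stop list and e.g. a Porter stemmer.
from collections import Counter

STOP_WORDS = {"the", "a", "of", "and", "to", "in", "is", "for"}

def crude_stem(word: str) -> str:
    """Very naive suffix stripping, standing in for a real stemmer."""
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def top_terms(corpus: list[str], k: int = 5) -> list[str]:
    """Rank candidate vocabulary terms by raw term frequency."""
    counts = Counter()
    for doc in corpus:
        for tok in doc.lower().split():
            tok = tok.strip(".,;:")
            if tok and tok not in STOP_WORDS:
                counts[crude_stem(tok)] += 1
    return [t for t, _ in counts.most_common(k)]

def overlap(selected: list[str], reference: set[str]) -> float:
    """Fraction of the reference vocabulary recovered by the filter."""
    return len(set(selected) & reference) / len(reference)
```

An overlap score of this shape is what lets the fourteen metrics be compared against the engineer-produced vocabularies on a common scale.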
46

Application of Optimal Approach in Load Forecasting and Unit Commitment Problems

Liao, Gwo-Ching 25 October 2005 (has links)
An integrated Chaos Search Genetic Algorithm/Fuzzy System (CGAFS), Tabu Search (TS), and Neural Fuzzy Network (NFN) method for load forecasting is presented. Fuzzy Hyper-Rectangular Composite Neural Networks (FHRCNNs) were used for the initial load forecast. CGAFS and TS were then used, instead of Back-Propagation (BP), to find the optimal parameters of the FHRCNNs: the CGAFS first generates a set of feasible parameter solutions and then passes them to the TS. The CGAFS has good global search capabilities but poor local search capabilities, while the TS has good local search capabilities; combining the two methods captures both advantages and eliminates the drawbacks of traditional ANN training by BP. This thesis also presents a hybrid Chaos Search Immune Algorithm (IA)/Genetic Algorithm (GA) and Fuzzy System (FS) method (CIGAFS) for solving short-term thermal unit commitment (UC) problems. The UC problem involves determining the start-up and shutdown schedules for generating units to meet the forecasted demand at minimum cost, while satisfying other constraints such as per-unit generating limits and reserve requirements. We combined IA and GA, added chaos search and a fuzzy system approach, and used the resulting hybrid to solve UC. Numerical simulations were carried out using four cases, including thermal power systems of ten, twenty, and thirty units over a 24-hour period.
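The UC objective stated in the abstract — choose on/off states so that committed capacity covers demand at minimum cost — can be shown in miniature. The unit data and the exhaustive enumeration below are illustrative stand-ins; the thesis's chaos/immune/genetic hybrid exists precisely because enumeration does not scale to real systems with start-up costs and multi-hour schedules.

```python
# Toy unit commitment for a single hour: brute-force the cheapest on/off
# schedule whose committed capacity covers demand. UNITS is hypothetical:
# (capacity in MW, running cost per hour).
from itertools import product

UNITS = [(100, 500.0), (80, 350.0), (50, 200.0)]

def cheapest_commitment(demand: float):
    """Return (on/off states, cost) of the minimum-cost feasible schedule,
    or None if no combination of units can cover the demand."""
    best = None
    for states in product((0, 1), repeat=len(UNITS)):
        cap = sum(u[0] for u, s in zip(UNITS, states) if s)
        cost = sum(u[1] for u, s in zip(UNITS, states) if s)
        if cap >= demand and (best is None or cost < best[1]):
            best = (states, cost)
    return best
```

With n units and T hours the search space is 2^(nT), which is why metaheuristics such as the CIGAFS hybrid are used instead of enumeration.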
47

Short-term and long-term reliability studies in the deregulated power systems

Li, Yishan 12 April 2006 (has links)
The electric power industry is undergoing a restructuring process. The major goals of the change in industry structure are to motivate competition, reduce costs, and improve service quality for consumers. At the same time, it is also important for the new structure to maintain system reliability. Power system reliability comprises two basic components, adequacy and security. In terms of time frame, power system reliability can mean short-term reliability or long-term reliability. Short-term reliability is more a security issue, while long-term reliability focuses more on the issue of adequacy. This dissertation presents techniques to address some security issues associated with short-term reliability and some adequacy issues related to long-term reliability in deregulated power systems. Short-term reliability is for operational purposes and is mainly concerned with security. Thus the way energy is dispatched, and the actions the system operator takes to remedy an insecure system state such as transmission congestion, are important to short-term reliability. Our studies on short-term reliability are therefore focused on these two aspects. We first investigate the formulation of auction-based dispatch by the law of supply and demand. Then we develop efficient algorithms to solve the auction-based dispatch with different types of bidding functions. Finally, we propose a new Optimal Power Flow (OPF) method based on sensitivity factors and the technique of aggregation to manage the congestion that results from auction-based dispatch. The algorithms and the new OPF method proposed here are much faster and more efficient than the conventional algorithms and methods. With regard to long-term reliability, the major issues are adequacy and its improvement. Our research is thus focused on these two aspects.
First, we develop a probabilistic methodology to assess composite power system long-term reliability with both adequacy and security included by using the sequential Monte Carlo simulation method. We then investigate new ways to improve composite power system adequacy in the long-term. Specifically, we propose to use Flexible AC Transmission Systems (FACTS) such as Thyristor Controlled Series Capacitor (TCSC), Static Var Compensator (SVC) and Thyristor Controlled Phase Angle Regulator (TCPAR) to enhance reliability.
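The core of auction-based dispatch by the law of supply and demand can be sketched as a merit-order stack: supply offers are sorted by price and accepted, cheapest first, until forecast demand is met. The offers below are hypothetical, and this toy ignores the bidding-function variety and transmission congestion that the dissertation actually treats.

```python
# A minimal merit-order dispatch sketch. offers: list of
# (price $/MWh, quantity MW); demand: forecast load in MW.
def merit_order_dispatch(offers, demand):
    """Accept offers cheapest-first until demand is covered.
    Returns (accepted offers, marginal clearing price)."""
    dispatch, remaining, price = [], demand, None
    for p, q in sorted(offers):          # cheapest offers first
        if remaining <= 0:
            break
        take = min(q, remaining)
        dispatch.append((p, take))
        remaining -= take
        price = p                        # last accepted offer sets the price
    return dispatch, price
```

When the accepted schedule overloads a transmission line, the operator must deviate from this pure merit order, which is the congestion-management problem the proposed OPF method addresses.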
48

Memory for color over brief intervals : one capacity or two? /

Morales, Dawn A. January 2003 (has links)
Thesis (Ph. D.)--University of California, San Diego, 2003. / Vita. Includes bibliographical references (leaves 149-160).
49

Exploring the Relationships Between Children's Working Memory and Long-Term Memory

2015 November 1900 (has links)
Working memory and long-term memory are two types of memory associated with children’s learning and academic performance. A number of memory models have suggested there is a relationship between working memory and long-term memory; however, there is a lack of empirical research measuring this relationship using standardized assessment tools. Further, there are currently no studies measuring this relationship in children. The purpose of this study was to investigate the relationship between children’s working memory (i.e., verbal working memory, visual-spatial working memory, verbal short-term memory, visual-spatial short-term memory, and the central executive) and long-term memory, using standardized assessment tools. The Automated Working Memory Assessment was used to measure working memory and the Woodcock-Johnson Tests of Cognitive Abilities – Third Edition was used to measure long-term memory. This study utilized secondary data from a larger SSHRC-funded study. Participants included 41 children between grades 1 and 8. The majority of parents who volunteered to have their children participate identified them as having a disability (e.g., speech/language difficulty; learning disability). Kendall’s tau-b revealed statistically significant correlations between four areas of working memory (i.e., verbal working memory, visual-spatial working memory, visual-spatial short-term memory, and the central executive) and long-term memory. Mann-Whitney tests revealed that children with higher working memory abilities differed significantly from children with lower working memory abilities on measures of long-term memory. The findings from this study may have implications for both theory and practice. The relationship observed between working memory and long-term memory appears to align with widely accepted memory models (e.g., Baddeley, 2000; Dehn, 2008).
The findings also suggest interventions designed to improve children’s working memory may have the potential to enhance long-term memory abilities.
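Kendall's tau-b, the rank correlation the study used, counts concordant and discordant pairs of (working-memory score, long-term-memory score) with a correction for tied ranks. In practice one would call `scipy.stats.kendalltau`; the hand-rolled sketch below just shows the statistic itself, on hypothetical data.

```python
# Kendall's tau-b from first principles: concordant minus discordant pairs,
# normalized with the tie correction that distinguishes tau-b from tau-a.
from itertools import combinations
from collections import Counter
from math import sqrt

def kendall_tau_b(x, y):
    concordant = discordant = 0
    for (xi, yi), (xj, yj) in combinations(zip(x, y), 2):
        s = (xi - xj) * (yi - yj)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
        # s == 0: a tied pair, handled by the denominator correction
    n = len(x)
    n0 = n * (n - 1) // 2
    ties_x = sum(c * (c - 1) // 2 for c in Counter(x).values())
    ties_y = sum(c * (c - 1) // 2 for c in Counter(y).values())
    return (concordant - discordant) / sqrt((n0 - ties_x) * (n0 - ties_y))
```

Tau-b is well suited to the study's small sample (41 children) because, unlike Pearson's r, it makes no normality assumption about the score distributions.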
50

An Assessment of Econometric Methods Used in the Estimation of Affine Term Structure Models

Juneja, Januj January 2010 (has links)
The first essay empirically evaluates recently developed techniques that have been proposed to improve the estimation of affine term structure models. The evaluation is performed on two dimensions. On the first dimension, I find that invariant transformations and rotations can be used to reduce the number of free parameters needed to estimate the model and thereby improve the empirical performance of affine term structure models. The second dimension compares estimating an affine term structure model using the model-free method against the inversion method. Using daily LIBOR rate and swap rate quotes from June 1996 to July 2008 to extract a panel of 3,034 time-series observations and 14 cross sections, this essay shows that a term structure model estimated using the model-free method does not perform significantly better in fitting yields, at any horizon, than the more traditional methods available in the literature.

The second essay explores implications of using principal components analysis in the estimation of affine term structure models. Early work employing principal components analysis focused on portfolio formation and trading strategies. Recent work, however, has moved principal components analysis into more formal applications, such as the direct involvement of principal-component-based factors within an affine term structure model. This usage in formal model settings warrants a study of the potential econometric implications of its application to term structure modeling. Serial correlation in interest rate data, for example, has been documented by several authors, and the majority of the literature attributes this phenomenon to strong persistence in the state variables. In this essay, I take yields as given and document the effects of whitening on the model-implied state-dependent factors, subsequently estimated by the principal-component-based model-free method. The results imply that pre-whitening the data plays a critical role in model estimation, and they are robust to Monte Carlo simulations. Empirical results are obtained from daily LIBOR rate and swap rate quotes from June 1996 to July 2008, used to extract a panel of zero-coupon yields consisting of 3,034 time-series observations and 14 cross sections.

The third essay examines the extent to which estimation risk in numerical integration creates bias, inefficiency, and inaccuracy in the widely used class of affine term structure models. In its most general form, this class of models relies on the solution of a system of non-linear Riccati equations to back out the state-factor coefficients. Only in certain cases does the class admit explicit, and thus analytically tractable, solutions for the state-factor coefficients; generally, and for more economically plausible scenarios, explicit closed-form solutions do not exist and Runge-Kutta methods must be employed to obtain numerical estimates of the coefficients. Using a panel of 3,034 yields and 14 cross sections, this essay examines what perils, if any, exist in this trade-off of analytical tractability against economic flexibility; robustness checks via Monte Carlo simulations are provided. Specifically, while analytical methods need less computational time, numerical methods can estimate a broader set of economic scenarios. Regardless of the data-generating process, the generalized Gaussian process seems to dominate the Vasicek model in terms of bias and efficiency. However, when the data are generated from a Vasicek model, the Vasicek model performs better than the generalized Gaussian process at fitting the yield curve. These results impart new and important information about the trade-off between using analytical methods and numerical methods to estimate affine term structure models.
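The third essay's trade-off can be shown in miniature with the Vasicek model, where the Riccati-type loading equation has a closed form, so a Runge-Kutta solution can be checked against it; in the general affine case only the numerical route is available. The parameter value below is hypothetical, and the sketch integrates only the slope loading B(tau) from dB/dtau = 1 - a*B, B(0) = 0.

```python
# Vasicek bond-price loading B(tau): closed form vs. classical fourth-order
# Runge-Kutta. The mean-reversion speed `a` is a hypothetical parameter.
from math import exp

a = 0.5  # hypothetical mean-reversion speed

def B_closed(tau):
    """Closed-form Vasicek loading: B(tau) = (1 - e^(-a*tau)) / a."""
    return (1.0 - exp(-a * tau)) / a

def rk4_B(tau, steps=1000):
    """Integrate dB/dtau = 1 - a*B with RK4, as one would for affine
    models whose loadings have no closed form."""
    h, B = tau / steps, 0.0
    f = lambda B: 1.0 - a * B
    for _ in range(steps):
        k1 = f(B)
        k2 = f(B + 0.5 * h * k1)
        k3 = f(B + 0.5 * h * k2)
        k4 = f(B + h * k3)
        B += (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return B
```

The two agree to high precision here, but each RK4 evaluation costs hundreds of function calls per maturity, which is the computational-time side of the trade-off the essay quantifies.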
