  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
501

The optimal linear arrangement problem : algorithms and approximation

Horton, Steven Bradish 05 1900 (has links)
No description available.
502

Efficient implementations of the primal-dual method

Osiakwan, Constantine N. K. January 1984 (has links)
No description available.
503

Numerical treatment of non-linear singular perturbation problems

Shikongo, Albert. January 2007 (has links)
This thesis deals with the design and implementation of some novel numerical methods for non-linear singular perturbation problems (NSPPs). It provides a survey of asymptotic and numerical methods for some NSPPs from the past decade. By considering two test problems, rigorous asymptotic analysis is carried out. Based on this analysis, suitable numerical methods are designed, analyzed and implemented in order to obtain relevant results of physical importance. Since the asymptotic analysis provides only qualitative information, the focus is more on the numerical analysis of the problem, which provides the quantitative information.
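The abstract does not specify which schemes the thesis designs. As generic background only, singularly perturbed problems are commonly solved on layer-adapted (Shishkin) meshes that cluster points inside the boundary layers. The sketch below is a hypothetical illustration of that idea on a standard linear reaction-diffusion test problem, not one of the thesis's own methods; the problem, mesh parameters, and tolerances are all invented for illustration.

```python
import numpy as np

# Toy test problem:  -eps*u'' + u = 1,  u(0) = u(1) = 0,
# whose solution has boundary layers of width O(sqrt(eps)) at both ends.
eps = 1e-4
N = 64                                   # number of mesh intervals

# Shishkin mesh: place half of the points inside the two layer regions
tau = min(0.25, 2.0 * np.sqrt(eps) * np.log(N))
x = np.concatenate([
    np.linspace(0.0, tau, N // 4 + 1),
    np.linspace(tau, 1.0 - tau, N // 2 + 1)[1:],
    np.linspace(1.0 - tau, 1.0, N // 4 + 1)[1:],
])

# Central differences on the non-uniform mesh
h = np.diff(x)
A = np.zeros((N + 1, N + 1))
b = np.ones(N + 1)
A[0, 0] = A[N, N] = 1.0                  # Dirichlet boundary conditions
b[0] = b[N] = 0.0
for i in range(1, N):
    hl, hr = h[i - 1], h[i]
    A[i, i - 1] = -eps * 2.0 / (hl * (hl + hr))
    A[i, i + 1] = -eps * 2.0 / (hr * (hl + hr))
    A[i, i] = eps * 2.0 / (hl * hr) + 1.0    # -eps*u'' + u

u = np.linalg.solve(A, b)

# Exact solution of the toy problem, for comparison
s = np.sqrt(eps)
exact = 1.0 - np.cosh((x - 0.5) / s) / np.cosh(0.5 / s)
max_err = np.max(np.abs(u - exact))
```

On a uniform mesh with the same number of points the layers would be badly under-resolved; the fitted mesh keeps the error small uniformly in `eps`.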
504

High speed high accuracy signal processing with parallel analog circuits

Thomsen, Axel 12 1900 (has links)
No description available.
505

Algorithms for singular systems

Beauchamp, Gerson 05 1900 (has links)
No description available.
506

Interpretation of results from simplified principal components

Uddin, Mudassir January 1999 (has links)
Linear multivariate statistical methods are widely used for analysing data sets which consist of a large number of variables. These techniques, which include principal component analysis, factor analysis, canonical correlation analysis, redundancy analysis and discriminant analysis, all produce a set of new variables, commonly called 'factors', according to some criterion which differs for different techniques. Among these techniques, principal component analysis is one of the most popular techniques used for reducing the dimensions of the multivariate data set. In many applications, when Principal Component Analysis (PCA) is performed on a large number of variables, the interpretation of the results is not simple. The derived eigenvectors of the sample covariance or correlation matrix are not necessarily in a simple form, with all coefficients either 'large' or 'negligible'. To aid interpretation, it is fairly common practice to rotate the retained set of components, often using orthogonal rotation. The purpose of rotation is to simplify structure, and thus to make it easier to interpret the low-dimensional space represented by the retained set of components. Thus, quantification of simplicity is a two-step process. The first step involves extracting features from the data, called components, while the second stage uses a rotation method to simplify the structure. One of the two main purposes of this thesis is to combine into one step these two separate stages of dimension reduction (finding the components) and simplification (rotation). This goal is achieved by combining these two objectives in the form of a single function leading to what we call Simplified Components (SCs). Another objective is to discover which of the many possible criteria suggested in factor analysis can be adopted in the proposed procedure of SCs.
Thus, a simplified one-step procedure of SCs is proposed, using four measures of simplicity, namely varimax, quartimax, orthomax and equamax indices.
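As a rough illustration of the two separate stages this abstract combines, the sketch below applies a standard varimax rotation to PCA loadings in NumPy. It is a generic sketch of the conventional two-step procedure (components, then rotation), not the thesis's one-step SC method; the data, dimensions, and iteration limits are invented for illustration.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-8):
    """Orthogonal varimax rotation of a p x k loadings matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # Gradient of the varimax criterion (variance of squared loadings)
        G = loadings.T @ (L**3 - L @ np.diag(np.sum(L**2, axis=0)) / p)
        U, s, Vt = np.linalg.svd(G)
        R = U @ Vt                       # nearest orthogonal rotation
        var_new = s.sum()
        if var_new < var_old * (1.0 + tol):
            break
        var_old = var_new
    return loadings @ R

# Step 1: extract principal components from (invented) centred data
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
X -= X.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
components = Vt[:2].T                    # retain two components (6 x 2)

# Step 2: rotate the retained components toward a simpler structure
rotated = varimax(components)
```

Because the rotation matrix is orthogonal, the rotated loadings still span the same two-dimensional subspace; only their interpretability changes.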
507

Censored regression and the Pearson system of distributions : an estimation method and application to demand analysis

Izadi, Hooshang January 1989 (has links)
No description available.
508

Bayesian inference about outputs of computationally expensive algorithms with uncertainty on the inputs

Haylock, Richard George Edward January 1997 (has links)
In the field of radiation protection, complex computationally expensive algorithms are used to predict radiation doses to organs in the human body from exposure to internally deposited radionuclides. These algorithms contain many inputs, the true values of which are uncertain. Current methods for assessing the effects of the input uncertainties on the output of the algorithms are based on Monte Carlo analyses, i.e. sampling from subjective prior distributions that represent the uncertainty on each input, evaluating the output of the model and calculating sample statistics. For complex computationally expensive algorithms, it is often not possible to obtain a large enough sample for a meaningful uncertainty analysis. This thesis presents an alternative general theory for uncertainty analysis, based on the use of stochastic process models in a Bayesian context. The measures provided by the Monte Carlo analysis are obtained, plus additional, more informative measures, using a far smaller sample. The theory is initially developed in a general form and then specifically for algorithms with inputs whose uncertainty can be characterised by independent normal distributions. The Monte Carlo and Bayesian methodologies are then compared using two practical examples. The first example is based on a simple model developed to calculate doses due to radioactive iodine. This model has two normally distributed uncertain parameters and, due to its simplicity, an independent measurement of the true uncertainty on the output is available for comparison. This exercise appears to show that the Bayesian methodology is superior in this simple case. The purpose of the second example is to determine whether the methodology is practical in a 'real-life' situation and to compare it with a Monte Carlo analysis. A model for calculating doses due to plutonium contamination is used. This model is computationally expensive and has fourteen uncertain inputs.
The Bayesian analysis compared favourably to the Monte Carlo analysis, indicating that it has the potential to provide more accurate uncertainty analyses for the parameters of computationally expensive algorithms.
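The stochastic-process idea outlined above is commonly realised as a Gaussian-process emulator: fit a GP to a small number of runs of the expensive model, then propagate the input uncertainty through the cheap emulator instead of the model itself. The sketch below is a minimal, hypothetical illustration of that idea with a toy one-input "model", an RBF-kernel GP mean, and invented numbers throughout; it is not the thesis's actual methodology.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(x):
    # Stand-in for a costly dose calculation (hypothetical toy function)
    return np.sin(2.0 * x) + 0.5 * x**2

def rbf(a, b, length=0.6):
    # Squared-exponential kernel between two 1-D point sets
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

# Emulator: fit a GP mean to a small design of expensive model runs
X_train = np.linspace(-2.5, 2.5, 16)
y_train = expensive_model(X_train)
K = rbf(X_train, X_train) + 1e-6 * np.eye(len(X_train))   # jitter for stability
alpha = np.linalg.solve(K, y_train)

def gp_mean(x_new):
    return rbf(x_new, X_train) @ alpha

# Uncertainty analysis: push the input prior through the cheap emulator
inputs = rng.normal(0.0, 0.7, size=5000)    # uncertain input ~ N(0, 0.7^2)
emulated = gp_mean(inputs)                  # 16 expensive runs needed
direct = expensive_model(inputs)            # 5000 runs a full MC would need
```

The point of the comparison: the emulator-based estimate of the output distribution requires only the 16 training runs, whereas the direct Monte Carlo estimate needs one expensive model evaluation per sample.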
509

Diffractive optical elements : design and fabrication issues

Blair, Paul January 1995 (has links)
No description available.
510

Field and flow effects on tethered polymer chains

Bown, Gavin John January 1999 (has links)
Solvated brush layers formed by linear polystyrenes have been investigated under quiescent and solvent shear-flow conditions using neutron reflectometry. Cyclohexane and toluene were used as solvents, and the polystyrene chains were tethered by one end to macroscopically flat silicon substrates via short poly-4-vinylpyridine end-groups. The brush systems were studied using a purpose-built flow reflectometry cell. The brush height was found to increase with improving thermodynamic quality of the solvent due to increasingly strong repulsive excluded volume interactions between chain segments. Model fitting of the reflectivity data revealed that the polymer volume fraction profile was well described by parabola-like functions, in agreement with the predictions of self-consistent field theory. No changes in the reflectivity profiles were observed upon exposure of the brush layers to solvent flow-induced area average shear rates of up to 147,000 s⁻¹. This observation has been rationalised through a number of considerations, including comparison with recent theoretical predictions. Aqueous micellar dispersions of diblock copolymers of styrene and ethylene oxide have been studied using small angle neutron scattering in the concentration range 0.05 to 6.5 weight % copolymer. The micelles formed were found to be spherically symmetric, consisting of a polystyrene core surrounded by a corona of highly stretched solvated polyethylene oxide chains. At higher concentrations, the intermicellar interaction has been interpreted using an analytic structure factor originally developed to describe macroion solutions. On subjecting the dispersions to Couette shear, no evidence for long-range ordering of the micelles was observed in the diffraction patterns.
