About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Semianalytical satellite theory and sequential estimation

Taylor, Stephen Paul January 1982
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1982. Microfiche copy available in Archives and Engineering. Includes bibliographical references.
52

Algorithms for sequence alignment

Powell, David Richard, 1973- January 2001
Abstract not available
53

Topics in sequence analysis

Ma, Jinyong 12 November 2012
This thesis studies two topics in sequence analysis. In the first part, we investigate the large deviations of the shape of the random RSK Young diagrams associated with a random word of size n whose letters are independently drawn from an alphabet of size m = m(n). When the letters are drawn uniformly and when both n and m converge together to infinity, with m not growing too fast with respect to n, the large deviations of the shape of the Young diagrams are shown to be the same as those of the spectrum of the traceless GUE. Since the length of the top row of the Young diagrams is the length of the longest (weakly) increasing subsequence of the random word, the corresponding large deviations follow. When the letters are drawn with non-uniform probability, a control of the two highest probabilities ensures that the length of the top row of the diagrams satisfies a large deviation principle. In either case, both speeds and rate functions are identified. To complete the study, non-asymptotic concentration bounds for the length of the top row of the diagrams are obtained for both models. In the second part, we investigate the order of the r-th, 1 ≤ r < ∞, central moment of the length of the longest common subsequence of two independent random words of size n whose letters are identically distributed and independently drawn from a finite alphabet. When all but one of the letters are drawn with small probabilities, which depend on the size of the alphabet, the r-th central moment is shown to be of order n^{r/2}. In particular, when r = 2, we obtain the order of the variance of the longest common subsequence.
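The object studied in the second part, the longest common subsequence (LCS) length, is computed by a classic dynamic program, and its central moments can be estimated by simulation. The following Python sketch is purely illustrative: it uses a small uniform alphabet, whereas the thesis treats skewed letter distributions.

    import random

    def lcs_length(u, v):
        # Classic O(|u|*|v|) dynamic program for the LCS length.
        m, n = len(u), len(v)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if u[i - 1] == v[j - 1]:
                    dp[i][j] = dp[i - 1][j - 1] + 1
                else:
                    dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
        return dp[m][n]

    def central_moment(r, n=100, trials=200, alphabet="ab"):
        # Monte Carlo estimate of the r-th central moment of the LCS length
        # of two independent random words of size n.
        word = lambda: [random.choice(alphabet) for _ in range(n)]
        samples = [lcs_length(word(), word()) for _ in range(trials)]
        mean = sum(samples) / trials
        return sum((s - mean) ** r for s in samples) / trials

    # For r = 2 this estimates the variance; in the skewed-letter regime the
    # thesis shows the r-th central moment is of order n^{r/2}.
    print(central_moment(2))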
54

Analytic solutions to small scale two level programs with applications to the United States Department of Agriculture grain commodities programs

Peterson, Henry Howard January 1986
Binder's title on spine: United States Department of Agriculture grain commodities programs. Typescript. Thesis (Ph. D.)--University of Hawaii at Manoa, 1986. Bibliography: leaves 100-102. Photocopy. x, 102 leaves, 29 cm.
55

Computerized achievement tests: sequential and fixed-length tests

Wiberg, Marie H. January 2003
The aim of this dissertation is to describe how a computerized achievement test can be constructed and used in practice. Throughout, the focus is on classifying examinees into masters and non-masters depending on their ability; no attempt is made to estimate that ability.

In paper I, a criterion-referenced computerized test with a fixed number of items is expressed as a statistical inference problem. The theory of optimal design is used to find the test with the greatest power. A formal proof shows that all items should have the same item characteristics, viz. high discrimination, low guessing and difficulty near the cutoff score, in order to yield the most powerful statistical test. An efficiency study shows how many more non-optimal items are needed to achieve the same power as a test built from optimal items.

In paper II, a computerized mastery sequential test is examined using sequential analysis. The focus is on the sequential probability ratio test and on minimizing the number of items in a test, i.e. minimizing the average sample number (ASN) function. Conditions under which the ASN function decreases are examined. Further, it is shown that the optimal values of item discrimination and item guessing are the same as for fixed-length tests, but that the optimal item difficulty differs.

Paper III presents three simulation studies of sequential computerized mastery tests. Three cases are considered, viz. the examinees' responses are identically distributed, not identically distributed, or not identically distributed together with estimation errors in the item characteristics. The simulations indicate that the observed results from the operating characteristic function differ significantly from the theoretical results. The mean number of items in a test, the distribution of test length and the variance depend on whether the true values of the item characteristics are known and on whether the responses are iid.

In paper IV, computerized tests containing both pretested items with known item parameters and try-out items with unknown item parameters are considered. The aim is to study how the item parameters of try-out items can be estimated within a computerized test. Although the unknown abilities of the examinees act as nuisance parameters, the asymptotic variance of the item parameter estimators can be calculated. Examples show that a more reliable variance estimator yields much larger estimates of the variance than commonly used variance estimators.
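The optimal-design result of paper I can be checked numerically with Fisher information under an item response model. The sketch below assumes the standard three-parameter logistic (3PL) model, which may differ in detail from the model used in the dissertation; it shows that information at the cutoff score grows with discrimination and shrinks with guessing or with difficulty away from the cutoff.

    import math

    def p_3pl(theta, a, b, c):
        # 3PL item response function: P(correct | ability theta)
        # with discrimination a, difficulty b and guessing c.
        return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

    def item_information(theta, a, b, c):
        # Fisher information of a 3PL item at ability level theta.
        p = p_3pl(theta, a, b, c)
        return a ** 2 * ((p - c) / (1 - c)) ** 2 * (1 - p) / p

    cutoff = 0.0
    for a, b, c in [(2.0, 0.0, 0.0),   # high a, b at cutoff, c = 0: most informative
                    (2.0, 0.0, 0.2),   # guessing lowers information
                    (2.0, 1.0, 0.0),   # difficulty away from cutoff lowers it
                    (1.0, 0.0, 0.0)]:  # lower discrimination lowers it
        print((a, b, c), round(item_information(cutoff, a, b, c), 3))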
56

Contribution to knee osteoarthritis modelling: application to the study of an unloading brace

Langlois, Karine 21 December 2016
The context of this work is femorotibial knee osteoarthritis and its treatment with an unloading brace. Because the evaluated brace has a specific kinematic design, a dedicated protocol was set up to investigate the brace's mechanisms of action, relying on tools used in clinical routine (the EOS® and Vicon® systems). Sixteen symptomatic subjects participated in the study. The main objective was to validate and apply personalization methods for the biomechanical models in this specific context, in order to improve the accuracy of the computed kinematic and kinetic parameters. Indeed, the state of the art shows that the dynamic indicator commonly used in knee osteoarthritis studies, the external knee adduction moment, yields controversial results. The secondary objectives were to characterize the pathology using the computed indicators and to study the mechanism of action of the unloading brace in depth. Two main methods were investigated. The first is the sequential kinematic analysis of the femorotibial joint using the EOS® system. This analysis requires registering a 3D object onto 2D biplanar views. The reliability of this manual registration was quantified by evaluating the accuracy of the method on in vitro data and its repeatability with 3 operators and 6 asymptomatic subjects, whose acquisitions provided views of the knee in several positions (extension and 20°, 40° and 90° flexion). The second method fuses data from the two acquisition environments (EOS® and Vicon®) to quantify the external knee joint moments, the point at which these moments are computed being defined from the 3D model of the femur. This model is registered in the Vicon® environment through external markers detectable in both the EOS® and Vicon® acquisitions. The main results concern: 1/ the reliability of the registration of 3D bone models onto 2D views, of about 0.3° and 1.6 mm in the sagittal plane and 2.1° and 1.8 mm in the transverse plane; 2/ the quantification of 3D positional angles of the femur and tibia of the symptomatic lower limbs, showing that the tibial articular surface (tibial plateaus) tends to remain horizontal, in contrast to the more pronounced inclination of the femur; 3/ a variation of the external joint moments depending on whether an internal model is included in the computation of this parameter; 4/ the quantification of the sequential kinematics of the brace and of the knee. The reliability of the methods developed in this work has been estimated, opening the way to their clinical application and development. Keywords: knee osteoarthritis; brace; sequential analysis; registration.
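The fusion step, bringing the 3D femur model into the Vicon® frame via markers visible in both systems, is at heart a rigid point-set alignment. A minimal sketch of one standard solution, the Kabsch algorithm, is given below on hypothetical marker coordinates; it is not the thesis's actual registration pipeline.

    import numpy as np

    def rigid_register(src, dst):
        # Kabsch algorithm: rotation R and translation t minimizing
        # sum_i ||R @ src[i] + t - dst[i]||^2 over paired markers.
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = dst_c - R @ src_c
        return R, t

    # Hypothetical positions (mm) of four markers in the EOS frame.
    rng = np.random.default_rng(0)
    markers_eos = rng.uniform(-100, 100, size=(4, 3))
    angle = np.pi / 6
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    markers_vicon = markers_eos @ R_true.T + np.array([10.0, -5.0, 2.0])

    R, t = rigid_register(markers_eos, markers_vicon)
    print(np.allclose(markers_eos @ R.T + t, markers_vicon))  # True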
57

Sequential land cover classification

Ackermann, Etienne Rudolph 05 August 2011
Land cover classification using remotely sensed data is a critical first step in large-scale environmental monitoring, resource management and regional planning. The classification task is made difficult by severe atmospheric scattering and absorption, seasonal variation, spatial dependence, complex surface dynamics and geometries, and large intra-class variability. Most of the recent research effort in land cover classification has gone into the development of increasingly robust and accurate (and also increasingly complex) classifiers, constructing, often in an ad hoc manner, multispectral, multitemporal, multisource classifiers using modern machine learning techniques such as artificial neural networks, fuzzy sets and expert systems. However, the focus has almost exclusively been on increasing the classification accuracy of newly developed classifiers. We would of course like to perform land cover classification (i) as accurately as possible, but also (ii) as quickly as possible. Unfortunately, there is a tradeoff between these two requirements: the faster we must make a decision, the lower we expect our classification accuracy to be, and conversely, a higher classification accuracy typically requires that we observe more samples (i.e., wait longer for a decision). Sequential analysis provides an attractive (indeed an optimal) way of handling this tradeoff between classification accuracy and detection delay, and it is the aim of this study to apply sequential analysis to the land cover classification task. Furthermore, this study deals exclusively with the binary classification of coarse-resolution MODIS time series data in the Gauteng region in South Africa; more specifically, the task of discriminating between residential areas and vegetation is considered. Dissertation (MEng)--University of Pretoria, 2011. Electrical, Electronic and Computer Engineering.
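The optimal sequential procedure alluded to above is, in its simplest binary form, Wald's sequential probability ratio test (SPRT). The sketch below assumes Gaussian class-conditional distributions for a pixel's successive observations, a stand-in for the dissertation's actual MODIS feature models.

    import math
    import random

    def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
        # Wald's SPRT: decide class 0 (e.g. vegetation) or class 1
        # (e.g. residential) as soon as the evidence suffices.
        lower = math.log(beta / (1 - alpha))
        upper = math.log((1 - beta) / alpha)
        llr, n = 0.0, 0
        for n, x in enumerate(samples, start=1):
            # Log-likelihood ratio increment for N(mu1, s) vs N(mu0, s).
            llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
            if llr >= upper:
                return 1, n        # class 1 after n observations
            if llr <= lower:
                return 0, n        # class 0 after n observations
        return None, n             # stream ended without a decision

    random.seed(1)
    stream = (random.gauss(0.6, 0.1) for _ in range(100))  # hypothetical index values
    print(sprt(stream, mu0=0.3, mu1=0.6, sigma=0.1))       # e.g. (1, 2)

Tightening alpha and beta pushes the thresholds apart, trading a longer expected observation window for fewer classification errors — exactly the accuracy/delay tradeoff described above.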
58

Sequential A/B testing using pre-experiment data

Stenberg, Erik January 2019
This thesis bridges the gap between two popular methods of achieving more efficient online experiments: sequential tests and variance reduction with pre-experiment data. Through simulations, it is shown that there is efficiency to be gained in using control variates sequentially along with the popular mixture Sequential Probability Ratio Test. More efficient tests lead to faster decisions and smaller required sample sizes. The proposed technique is also tested on empirical data on users of the music streaming service Spotify. An R package which includes the main tests applied in this thesis is also presented.
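The pre-experiment-data half of this combination is the control-variates (CUPED-style) adjustment: subtract from each user's in-experiment metric the part that is linearly predictable from that user's pre-experiment data. A minimal simulated sketch follows; the thesis's Spotify data and R package are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000
    pre = rng.normal(10.0, 3.0, n)               # pre-experiment metric per user
    post = 0.8 * pre + rng.normal(0.0, 1.0, n)   # in-experiment metric

    # Optimal control-variate coefficient and adjusted metric.
    theta = np.cov(post, pre)[0, 1] / np.var(pre)
    adjusted = post - theta * (pre - pre.mean())

    # Same mean, much smaller variance: a sequential test fed with the
    # adjusted metric reaches its decision boundary with fewer users.
    print(round(np.var(post), 2), round(np.var(adjusted), 2))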
59

Data-Driven Methods for Modeling and Predicting Multivariate Time Series using Surrogates

Chakraborty, Prithwish 05 July 2016
Modeling and predicting multivariate time series data has been of prime interest to researchers for many decades. Traditionally, time series prediction models have focused on finding attributes that have consistent correlations with the target variable(s). However, diverse surrogate signals, such as news data and Twitter chatter, are increasingly available and can provide real-time information, albeit with inconsistent correlations. Intelligent use of such sources can lead to early and real-time warning systems such as Google Flu Trends. Furthermore, the target variables of interest, such as public health surveillance signals, can be noisy. Models built for such data sources should therefore be flexible as well as adaptable to changing correlation patterns. In this thesis we explore various methods of using surrogates to generate more reliable and timely forecasts for noisy target signals. We primarily investigate three key components of the forecasting problem, viz. (i) short-term forecasting, where surrogates can be employed in a now-casting framework; (ii) long-term forecasting, where surrogates act as forcing parameters to model system dynamics; and (iii) robust drift models that detect and exploit 'changepoints' in the surrogate-target relationship to produce robust models. We explore various 'physical' and 'social' surrogate sources to study these sub-problems, primarily to generate real-time forecasts for endemic diseases. On the modeling side, we employed matrix factorization and generalized linear models to detect short-term trends, and explored various Bayesian sequential analysis methods to model long-term effects. Our research indicates that, in general, a combination of surrogates can lead to more robust models. Interestingly, our findings indicate that under specific scenarios particular surrogates can decrease overall forecasting accuracy, providing an argument for 'good data' over 'big data'. Ph. D.
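A simple concrete instance of a drift-aware surrogate model is recursive least squares with a forgetting factor, which keeps tracking a linear surrogate-target relationship through a changepoint. This sketch is an illustration on simulated data, not the matrix-factorization or Bayesian machinery developed in the thesis.

    import numpy as np

    def rls_nowcast(surrogate, target, lam=0.95):
        # Recursive least squares with forgetting factor lam: forecast
        # target[t] from surrogate[t], then update the coefficient.
        # Smaller lam forgets faster and adapts sooner to changepoints.
        w, p = 0.0, 1e3
        preds = []
        for x, y in zip(surrogate, target):
            preds.append(w * x)                # one-step-ahead nowcast
            k = p * x / (lam + x * p * x)      # gain
            w += k * (y - w * x)               # coefficient update
            p = (p - k * x * p) / lam
        return np.array(preds)

    # Simulated surrogate whose link to the target drifts at t = 100.
    rng = np.random.default_rng(0)
    x = rng.normal(1.0, 0.2, 200)
    beta = np.where(np.arange(200) < 100, 2.0, 0.5)
    y = beta * x + rng.normal(0.0, 0.1, 200)

    err = np.abs(y - rls_nowcast(x, y))
    # Error spikes just after the changepoint, then settles again.
    print(err[50:100].mean(), err[100:110].mean(), err[150:].mean())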
60

Multi-stage multi-product lotsize sequencing of operations

Nnaji, Bartholomew O. January 1982
In this thesis, a multiple-stage, multi-product sequence selection problem was mathematically modeled and analyzed. A Fortran program was developed that calculates the machine requirements for each machine station and each product type, and searches for the optimal product sequence combination. Four cases of the sequence selection problem were analyzed in detail. Homogeneous machine stations and product lines were compared analytically. Results of an example problem solved for the above types of systems are presented. Computational results for different numbers of sequence combinations (ranging from 8 to 125) are discussed. Master of Science.
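The kind of search the program performs can be illustrated by brute-force enumeration over product sequences. Every number below (processing times, demands, sequence-dependent changeover hours, machine capacity) is an invented stand-in for the thesis's data.

    import itertools
    import math

    proc_time = {"A": [0.5, 1.2, 0.8],   # hours per unit at each of 3 stages
                 "B": [0.9, 0.4, 1.1],
                 "C": [0.7, 0.7, 0.6]}
    demand = {"A": 400, "B": 250, "C": 300}          # units per period
    changeover = {("A", "B"): 2.0, ("B", "A"): 3.0,  # hours, order-dependent
                  ("A", "C"): 1.0, ("C", "A"): 4.0,
                  ("B", "C"): 2.5, ("C", "B"): 1.5}
    capacity = 160.0                                 # machine-hours per period

    def machines_needed(seq):
        # Size each stage for its processing plus changeover load.
        total = 0
        for stage in range(3):
            load = sum(proc_time[p][stage] * demand[p] for p in seq)
            load += sum(changeover[pair] for pair in zip(seq, seq[1:]))
            total += math.ceil(load / capacity)
        return total

    best = min(itertools.permutations("ABC"), key=machines_needed)
    print(best, machines_needed(best))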
