  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

On the generalization of the distribution of the significant digits under computation /

Wong, James Teng. January 1969 (has links)
Thesis (Ph. D.)--Oregon State University, 1969. / Typescript. Includes bibliographical references (leaf 43). Also available on the World Wide Web.
2

METACOGNITION IN LEARNING ELEMENTARY PROBABILITY AND STATISTICS

RYSZ, TERI January 2004 (has links)
No description available.
3

Introduction to STATISTICS in a Biological Context

Seier, Edith, Joplin, Karl H. 01 January 2011 (has links)
This is a textbook written for undergraduate students in biology or health sciences in an introductory statistics course.
4

Specification and Verification of Systems Using Model Checking and Markov Reward Models

Lifson, Farrel 01 May 2004 (has links)
The importance of service level management has come to the fore in recent years as computing power becomes more and more of a commodity. In order to offer a consistently high quality of service, systems must be rigorously analysed, even before implementation, and monitored to ensure these goals can be achieved. The tools and algorithms of performability analysis offer a potentially ideal way to formally specify and analyse performance and reliability models. This thesis examines Markov reward models, a formalism based on continuous time Markov chains, and their use in the specification and analysis of service levels. The particular solution technique we employ in this thesis is model checking, using Continuous Reward Logic as a means to specify requirements and constraints on the model. We survey the current tools that allow model checking to be performed on Markov reward models. Specifically, we extended the Erlangen-Twente Markov Chain Checker to solve Markov reward models by taking advantage of the duality theorem of Continuous Stochastic Reward Logic, of which Continuous Reward Logic is a sub-logic. We are also concerned with the specification techniques available for Markov reward models, which have in the past merely been extensions of the specification techniques for continuous time Markov chains. We implement a production rule system in Ruby, a high-level language, and show the advantages gained by using its native interpreter and language features to cut down on implementation time and code size. The limitations inherent in Markov reward models are discussed, and we focus on the issue of zero reward states. Previous algorithms used to remove zero reward states, while preserving the numerical properties of the model, could potentially alter its logical properties.
We propose algorithms based on analysing the Continuous Reward Logic requirement beforehand to determine whether a zero reward state can be removed safely, as well as an approach based on substitution of zero reward states. We also investigate limitations on multiple reward structures and the ability to solve for both time and reward. Finally, we perform a case study on a Beowulf parallel computing cluster using Markov reward models and the ETMCC tool, demonstrating their usefulness in the implementation of performability analysis and the determination of the service levels the cluster can offer its users.
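To make the reward-model idea concrete, here is a minimal, illustrative sketch (not taken from the thesis) of the simplest kind of Markov reward model: a two-state availability CTMC with a reward rate attached to each state, whose long-run expected reward rate follows from the steady-state probabilities. All rates and reward values are hypothetical demonstration numbers.

```python
# Hypothetical two-state availability model: a server is "up" (it fails
# with rate lam) or "down" (it is repaired with rate mu).  Attaching a
# reward rate to each state turns the CTMC into a Markov reward model.
def steady_state_reward(lam, mu, r_up, r_down):
    # Steady-state probabilities of the two-state CTMC (balance equation
    # pi_up * lam = pi_down * mu, with pi_up + pi_down = 1).
    pi_up = mu / (lam + mu)
    pi_down = lam / (lam + mu)
    # Long-run expected reward rate = sum over states of pi(s) * r(s).
    return pi_up * r_up + pi_down * r_down

# A zero-reward state (r_down = 0) contributes nothing to the long-run
# reward, which is why such states are candidates for elimination.
print(steady_state_reward(lam=0.01, mu=1.0, r_up=100.0, r_down=0.0))
```

The transient analysis performed by tools such as ETMCC is considerably more involved (it solves for reward accumulated up to a time bound), but the state/reward structure is the same.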
5

A Comparison of Statistical and Geometric Reconstruction Techniques: Guidelines for Correcting Fossil Hominin Crania

Neeser, Rudolph 01 January 2007 (has links)
The study of human evolution centres, to a large extent, on the study of fossil morphology, including the comparison and interpretation of these remains within the context of what is known about morphological variation within living species. However, many fossils suffer from environmentally caused damage (taphonomic distortion) which hinders any such interpretation: fossil material may be broken and fragmented, while the weight and motion of overlying sediments can cause plastic distortion. To date, a number of studies have focused on the reconstruction of such taphonomically damaged specimens. These studies have used myriad approaches to reconstruction, including thin plate spline methods, mirroring, and regression-based approaches. The efficacy of these techniques remains to be demonstrated, and it is not clear how different parameters (e.g., sample sizes, landmark density, etc.) might affect their accuracy. In order to partly address this issue, this thesis examines three techniques used in the virtual reconstruction of fossil remains by statistical or geometric means: mean substitution, thin plate spline warping (TPS), and multiple linear regression. These methods are compared by reconstructing the same sample of individuals using each technique. Samples drawn from Homo sapiens, Pan troglodytes, Gorilla gorilla, and various hominin fossils are reconstructed by iteratively removing and then estimating landmarks. The testing determines each method's behaviour in relation to the extent of landmark loss (i.e., the amount of damage), reference sample size (this being the data used to guide the reconstructions), and the species of the population from which the reference sample is drawn (which may differ from the species of the damaged fossil). Given a large enough reference sample, the regression-based method is shown to produce the most accurate reconstructions.
Various factors affect this: when using small reference samples drawn from a population of the same species as the damaged specimen, thin plate spline warping is the better method, but only as long as there is little damage. As the damage becomes severe (30% or more of the landmarks missing), mean substitution should be used instead: thin plate splines are shown to have rapid error growth in relation to the amount of damage. When the species of the damaged specimen is unknown, or it is the only known individual of its species, the smallest reconstruction errors are obtained with a regression-based approach using a large reference sample drawn from a living species. Testing shows that reference sample size (combined with the use of multiple linear regression) is more important than morphological similarity between the reference individuals and the damaged specimen. The main contributions of this work are recommendations to the researcher on which of the three methods to use, based on the amount of damage, the number of reference individuals, and the species of the reference individuals.
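The core of the regression-based approach can be illustrated with a deliberately tiny sketch: fit a linear model on a reference sample relating a surviving measurement to a missing one, then apply it to the damaged specimen. The data and variable names below are invented for illustration and are not from the thesis, which regresses many landmark coordinates jointly.

```python
# Toy sketch of regression-based reconstruction: estimate one missing
# landmark coordinate from a surviving one, using a reference sample
# of undamaged individuals.  All values are illustrative.
def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x.
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Reference sample: (surviving coordinate, coordinate to be estimated)
# for four undamaged individuals.
reference = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]
a, b = fit_line([p for p, _ in reference], [q for _, q in reference])

# Reconstruct the damaged specimen's missing coordinate from its
# surviving one (here 2.5).
print(a + b * 2.5)
```

The thesis's comparison then boils down to swapping this estimator for mean substitution (predict the reference mean regardless of the surviving data) or TPS warping, and measuring reconstruction error as landmarks are progressively deleted.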
6

In Search of Computer Music Analysis: Music Information Retrieval, Optimization, and Machine Learning from 2000-2016

Persaud, Felicia Nafeeza 21 August 2018 (has links)
My thesis aims to critically examine three methods in the current state of Computer Music Analysis. I will concentrate on Music Information Retrieval, Optimization, and Machine Learning. My goal is to describe and critically analyze each method, then examine the intersection of all three. I will start by looking at David Temperley’s The Cognition of Basic Musical Structures (2001) which offers an outline of major accomplishments before the turn of the 21st century. This outline will provide a method of organization for a large portion of the thesis. I will conclude by explaining the most recent developments in terms of the three methods cited. Following trends in these developments, I can hypothesize the direction of the field.
7

Tahová pevnost vláknitých svazků a kompozitů / Tensile strength of fibrous yarns and composites

Rypl, Rostislav Unknown Date (has links)
Technical textiles play a highly important role in today's materials engineering. In fibrous composites, which are applied in a number of industrial branches ranging from aviation to civil engineering, technical textiles are used as the reinforcing or toughening constituent. With a growing number of production facilities for fibrous materials, the need for standardized and reproducible quality control procedures becomes urgent. The present thesis addresses the issue of the tensile strength of high-modulus multifilament yarns from both the theoretical and the experimental point of view, and novel approaches are introduced in both aspects. Regarding the theoretical strength of fibrous yarns, a model for length-dependent tensile strength is formulated which distinguishes three asymptotes of the mean strength size effect curve. The transition between the model of independent parallel fibers, applicable at shorter gauge lengths, and the chain-of-bundles model, applicable at longer gauge lengths, is emphasized in particular. It is found that the transition depends on the stress transfer or anchorage length of the filaments and can be identified experimentally by means of standard tensile tests at different gauge lengths. In the experimental part of the thesis, the issue of stress concentration in the clamping is addressed. High-modulus yarns with brittle filaments are very sensitive to stress concentrations when loaded in tension, making the use of traditional tensile test methods difficult. A novel clamp adapter for the Statimat 4U yarn tensile test machine (producer: Textechno GmbH) has been developed and a prototype built. A test series comparing yarn strengths measured with the clamp adapter and with commonly used test methods has been performed, and the results are discussed. Furthermore, they are compared with theoretical values using Daniels' statistical fiber-bundle model.
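Daniels' classical fiber-bundle result, which the thesis uses for its theoretical comparison, says that for a bundle of many parallel fibers with i.i.d. strengths of CDF F, the mean bundle strength per fiber approaches the maximum over stress x of x·(1 − F(x)). The sketch below evaluates this for an illustrative Weibull strength distribution; the shape and scale parameters are arbitrary demonstration values, not measured yarn data.

```python
import math

# Illustrative Weibull CDF for single-filament strength.  Parameters
# are demonstration values, not fitted to any experiment.
def weibull_cdf(x, shape=5.0, scale=1.0):
    return 1.0 - math.exp(-((x / scale) ** shape))

def daniels_strength(cdf, x_max=3.0, steps=10000):
    # Daniels' asymptotic mean bundle strength per fiber:
    # max over x of x * (1 - F(x)), found here by a simple grid search.
    best = 0.0
    for i in range(1, steps + 1):
        x = x_max * i / steps
        best = max(best, x * (1.0 - cdf(x)))
    return best

print(daniels_strength(weibull_cdf))
```

For a Weibull with shape m and scale 1, the maximum has the closed form (1/m)^(1/m)·e^(−1/m), so the grid search above mainly serves to show the structure of the argument: each surviving fraction of fibers carries the applied stress, and the bundle fails at the stress that maximises the surviving load.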
8

Link prediction and link detection in sequences of large social networks using temporal and local metrics

Cooke, Richard J. E. 01 November 2006 (has links)
This dissertation builds upon the ideas introduced by Liben-Nowell and Kleinberg in The Link Prediction Problem for Social Networks [42]. Link prediction is the problem of predicting between which unconnected nodes in a graph a link will form next, based on the current structure of the graph. The following research contributions are made:
• Highlighting the difference between the link prediction and link detection problems, which have been implicitly regarded as identical in current research. Although hidden links and forming links have very significantly different metric values, an initial experiment showed that a machine learning system using traditional metrics could not distinguish them. They could, however, be distinguished from each other in a "simple" network (one where traditional metrics can be used for prediction successfully) using a combination of new graph analysis approaches.
• Defining temporal metric statistics by combining traditional statistical measures with measures commonly employed in financial analysis and traditional social network analysis. These metrics are calculated over time for a sequence of sociograms. It is shown that some of the temporal extensions of traditional metrics increase the accuracy of link prediction.
• Defining traditional metrics using different radii to those at which they are normally calculated. It is shown that this approach can increase the individual prediction accuracy of certain metrics, marginally increase the accuracy of a group of metrics, and greatly increase metric computation speed without sacrificing information content by computing metrics using smaller radii. It also solves the "distance-three task" (common-neighbour metrics cannot predict links between nodes at a distance greater than three).
• Showing that the combination of local and temporal approaches to link prediction can lead to very high prediction accuracies. Furthermore, in "complex" networks (ones where traditional metrics cannot be used for prediction successfully), local and temporal metrics become even more useful.
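The kind of local metric being extended here can be shown with a minimal sketch of the classic common-neighbours predictor: score each unconnected pair by the number of neighbours they share, and predict the highest-scoring pair as the next link. The toy adjacency data below is invented for illustration, not taken from the dissertation.

```python
# Minimal common-neighbours link predictor on a toy undirected graph,
# stored as an adjacency dict of neighbour sets.
def common_neighbours(graph, u, v):
    return len(graph[u] & graph[v])

graph = {
    "a": {"b", "c"},
    "b": {"a", "c", "d"},
    "c": {"a", "b", "d"},
    "d": {"b", "c", "e"},
    "e": {"d"},
}

# Score every unconnected pair; the highest score is the predicted link.
pairs = [(u, v) for u in graph for v in graph
         if u < v and v not in graph[u]]
ranked = sorted(pairs, key=lambda p: common_neighbours(graph, *p),
                reverse=True)
print(ranked[0])  # the pair predicted to form a link next
```

Note that the pair ("a", "e") is at distance three and scores zero, which is exactly the "distance-three task" the dissertation's variable-radius metrics address; temporal metrics would additionally track how such scores evolve across a sequence of sociograms.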
9

Neural mechanisms of information processing and transmission

Leugering, Johannes 05 November 2021 (has links)
This (cumulative) dissertation is concerned with mechanisms and models of information processing and transmission by individual neurons and small neural assemblies. In this document, I first provide historical context for these ideas and highlight similarities and differences to related concepts from machine learning and neuromorphic engineering. With this background, I then discuss the four main themes of my work, namely dendritic filtering and delays, homeostatic plasticity and adaptation, rate-coding with spiking neurons, and spike-timing based alternatives to rate-coding. The content of this discussion is in large part derived from several of my own publications included in Appendix C, but it has been extended and revised to provide a more accessible and broad explanation of the main ideas, as well as to show their inherent connections. I conclude that fundamental differences remain between our understanding of information processing and transmission in machine learning on the one hand and theoretical neuroscience on the other, which should provide a strong incentive for further interdisciplinary work on the domain boundaries between neuroscience, machine learning and neuromorphic engineering.
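Of the four themes listed, rate-coding with spiking neurons is the most readily illustrated: a leaky integrate-and-fire neuron converts the strength of its input into an output spike rate. The simulation below is a generic textbook sketch with arbitrary parameter values, not a model from the dissertation.

```python
# Illustrative leaky integrate-and-fire neuron showing rate-coding:
# a stronger constant input current yields a higher output spike rate.
# All parameters are arbitrary demonstration values.
def spike_count(current, t_sim=1.0, dt=0.001, tau=0.02, threshold=1.0):
    v, spikes = 0.0, 0
    for _ in range(int(t_sim / dt)):
        v += dt * (-v + current) / tau  # leaky integration (Euler step)
        if v >= threshold:
            spikes += 1
            v = 0.0  # reset membrane potential after a spike
    return spikes

low, high = spike_count(1.5), spike_count(3.0)
print(low, high)  # the stronger input produces more spikes per second
```

Spike-timing based codes, by contrast, carry information in when these threshold crossings occur rather than in how often, which is one of the distinctions the dissertation develops.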
10

An Analysis of the Effect of Early Season Winning Percentage on Final Regular Season Winning Percentage

Emily, Martin M. January 2019 (has links)
No description available.
