11

On the Value of Online Learning for Cognitive Radar Waveform Selection

Thornton III, Charles Ethridge 16 May 2023 (has links)
Modern radar systems must operate in a wide variety of time-varying conditions. These include various types of interference from neighboring systems, self-interference or clutter, and targets with fluctuating responses. It has been well-established that the quality and nature of radar measurements depend heavily on the choice of signal transmitted by the radar. In this dissertation, we discuss techniques which may be used to adapt the radar's waveform on-the-fly while making very few a priori assumptions about the physical environment. By employing tools from reinforcement learning and online learning, we present a variety of algorithms which handle practical issues of the waveform selection problem that have been left open by previous works. In general, we focus on two key challenges inherent to the waveform selection problem, sample-efficiency and universality. Sample-efficiency corresponds to the number of experiences a learning algorithm requires to achieve desirable performance. Universality refers to the learning algorithm's ability to achieve desirable performance across a wide range of physical environments. Specifically, we develop a contextual bandit-based approach to vastly improve the sample-efficiency of learning compared to previous works. We then improve the generalization performance of this model by developing a Bayesian meta-learning technique. To handle the problem of universality, we develop a learning algorithm which is asymptotically optimal in any Markov environment having finite memory length. Finally, we compare the performance of learning-based waveform selection to fixed rule-based waveform selection strategies for the scenarios of dynamic spectrum access and multiple-target tracking. We draw conclusions as to when learning-based approaches are expected to significantly outperform rule-based strategies, as well as the converse. / Doctor of Philosophy / Modern radar systems must operate in a wide variety of time-varying conditions. These include various types of interference from neighboring systems, self-interference or clutter, and targets with fluctuating responses. It has been well-established that the quality and nature of radar measurements depend heavily on the choice of signal transmitted by the radar. In this dissertation, we discuss techniques which may be used to adapt the radar's waveform on-the-fly while making very few explicit assumptions about the physical environment. By employing tools from reinforcement learning and online learning, we present a variety of algorithms which handle practical and theoretical issues of the waveform selection problem that have been left open by previous works. We begin by asking the questions "What is cognitive radar?" and "When should cognitive radar be used?" in order to develop a broad mathematical framework for the signal selection problem. The latter chapters then deal with the role of intelligent real-time decision-making algorithms which select favorable signals for target tracking and interference mitigation. We conclude by discussing the possible roles of cognitive radar within future wireless networks and larger autonomous systems.
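[Editor's note] As an illustration of the contextual-bandit approach described in the abstract above, here is a minimal LinUCB-style sketch. The context features (e.g., per-sub-band interference estimates) and the reward (e.g., measured SINR) are hypothetical stand-ins; the dissertation's actual state and reward design is not reproduced here.

```python
import numpy as np

# Minimal LinUCB-style contextual bandit for waveform selection.
# Context and reward are illustrative placeholders, not the
# dissertation's actual formulation.

class LinUCB:
    def __init__(self, n_waveforms, dim, alpha=1.0):
        self.alpha = alpha
        self.A = [np.eye(dim) for _ in range(n_waveforms)]    # per-arm Gram matrix
        self.b = [np.zeros(dim) for _ in range(n_waveforms)]  # per-arm reward vector

    def select(self, context):
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b
            # Optimistic score: predicted reward + exploration bonus.
            scores.append(theta @ context +
                          self.alpha * np.sqrt(context @ A_inv @ context))
        return int(np.argmax(scores))

    def update(self, arm, context, reward):
        self.A[arm] += np.outer(context, context)
        self.b[arm] += reward * context

# Toy usage: 4 candidate waveforms, 8-dimensional interference context.
rng = np.random.default_rng(0)
bandit = LinUCB(n_waveforms=4, dim=8)
for t in range(1000):
    ctx = rng.normal(size=8)
    arm = bandit.select(ctx)
    reward = rng.normal()          # stand-in for the measured SINR
    bandit.update(arm, ctx, reward)
```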
12

Exponenciální třídy a jejich význam pro statistickou inferenci / Exponential families in statistical inference

Moneer Borham Abdel-Maksoud, Sally January 2011 (has links)
This diploma thesis provides an evaluation of exponential families of distributions, which have a special position in mathematical statistics. The thesis introduces the basic concepts and facts associated with distributions of exponential type, focusing in particular on the advantages of exponential families in classical parametric statistics, that is, in the theory of estimation and hypothesis testing. Emphasis is placed on one-parameter and multi-parameter systems.
13

Exponenciální třídy a jejich význam pro statistickou inferenci / Exponential families in statistical inference

Moneer Borham Abdel-Maksoud, Sally January 2011 (has links)
Title: Exponential families in statistical inference Author: Sally Abdel-Maksoud Department: Department of Probability and Mathematical Statistics Supervisor: doc. RNDr. Daniel Hlubinka, Ph.D. Supervisor's e-mail address: Daniel.Hlubinka@mff.cuni.cz Abstract: This diploma thesis provides an evaluation of exponential families of distributions, which have a special position in mathematical statistics, including appropriate properties for the estimation of population parameters, hypothesis testing and other inference problems. The thesis introduces the basic concepts and facts associated with distributions of exponential type, focusing in particular on the advantages of exponential families in classical parametric statistics, that is, in the theory of estimation and hypothesis testing. Emphasis is placed on one-parameter and multi-parameter systems. It also exposes important concepts concerning the curvature of a statistical problem, including curvature in exponential families. We define a quantity that measures how nearly "exponential" a family is; this quantity is called the statistical curvature of the family. We show that families with small curvature enjoy the good properties of exponential families. Moreover, the properties of the curvature, hypothesis testing and some...
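[Editor's note] The two objects both abstracts above refer to can be stated compactly. The notation below follows the standard (Efron-style) development rather than the thesis's own text.

```latex
% One-parameter exponential family in natural form, and Efron's
% statistical curvature for a general family with log-likelihood
% \ell_\theta, score \dot\ell_\theta, and Fisher information
% i_\theta = \operatorname{Var}_\theta(\dot\ell_\theta):
\[
  f_\eta(x) = h(x)\,\exp\bigl\{\eta\,T(x) - A(\eta)\bigr\},
\]
\[
  \gamma_\theta^2 =
    \frac{\operatorname{Var}_\theta(\ddot\ell_\theta)\, i_\theta
          - \operatorname{Cov}_\theta(\dot\ell_\theta,\ddot\ell_\theta)^2}
         {i_\theta^{3}} .
\]
% Exponential families have \gamma_\theta = 0; the abstract's claim is
% that families with small \gamma_\theta retain many of their
% inferential advantages.
```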
14

Drivers of Dengue Within-Host Dynamics and Virulence Evolution

Ben-Shachar, Rotem January 2016 (has links)
Dengue is an important vector-borne virus that infects on the order of 400 million individuals per year. Infection with one of the virus's four serotypes (denoted DENV-1 to 4) may be silent, result in symptomatic dengue 'breakbone' fever, or develop into the more severe dengue hemorrhagic fever/dengue shock syndrome (DHF/DSS). Extensive research has therefore focused on identifying factors that influence dengue infection outcomes. It has been well-documented through epidemiological studies that DHF is most likely to result from a secondary heterologous infection, and that individuals experiencing a DENV-2 or DENV-3 infection are typically more likely to present with severe dengue disease than individuals experiencing a DENV-1 or DENV-4 infection. However, a mechanistic understanding of how these risk factors affect disease outcomes, and further, how the virus's ability to evolve these mechanisms will affect disease severity patterns over time, is lacking. In the second chapter of my dissertation, I formulate mechanistic mathematical models of primary and secondary dengue infections that describe how the dengue virus interacts with the immune response and the effects of this interaction on the risk of developing severe dengue disease. I show that only the innate immune response is needed to reproduce characteristic features of a primary infection whereas the adaptive immune response is needed to reproduce characteristic features of a secondary dengue infection. I then add to these models a quantitative measure of disease severity that assumes immunopathology, and analyze the effectiveness of virological indicators of disease severity. In the third chapter of my dissertation, I then statistically fit these mathematical models to viral load data of dengue patients to understand the mechanisms that drive variation in viral load. I specifically consider the roles that immune status, clinical disease manifestation, and serotype may play in explaining viral load variation observed across the patients. With this analysis, I show that there is statistical support for the theory of antibody dependent enhancement in the development of severe disease in secondary dengue infections and that there is statistical support for serotype-specific differences in viral infectivity rates, with infectivity rates of DENV-2 and DENV-3 exceeding those of DENV-1. In the fourth chapter of my dissertation, I integrate these within-host models with a vector-borne epidemiological model to understand the potential for virulence evolution in dengue. Critically, I show that dengue is expected to evolve towards intermediate virulence, and that the optimal virulence of the virus depends strongly on the number of serotypes that co-circulate. Together, these dissertation chapters show that dengue viral load dynamics provide insight into the within-host mechanisms driving differences in dengue disease patterns and that these mechanisms have important implications for dengue virulence evolution. / Dissertation
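[Editor's note] For readers unfamiliar with this model class, a generic target-cell-limited viral dynamics model can be sketched as below. This is the standard baseline from the within-host literature, not the author's models (which additionally include innate and adaptive immune compartments); all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic target-cell-limited viral dynamics:
#   dT/dt = -beta*T*V        (susceptible target cells)
#   dI/dt =  beta*T*V - d*I  (infected cells)
#   dV/dt =  p*I - c*V       (free virus)

def within_host(t, y, beta, d, p, c):
    T, I, V = y
    return [-beta * T * V,
            beta * T * V - d * I,
            p * I - c * V]

params = dict(beta=3e-9, d=1.0, p=1e4, c=5.0)   # illustrative values only
y0 = [1e7, 0.0, 1.0]   # initial target cells, infected cells, virions
sol = solve_ivp(within_host, (0, 14), y0, args=tuple(params.values()),
                dense_output=True)
# sol.sol(t) interpolates the trajectory, e.g. the viral load curve
# whose peak and clearance time would be compared against patient data.
```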
15

Geometric context from single and multiple views

Flint, Alexander John January 2012 (has links)
In order for computers to interact with and understand the visual world, they must be equipped with reasoning systems that include high-level quantities such as objects, actions, and scenes. This thesis is concerned with extracting such representations of the world from visual input. The first part of this thesis describes an approach to scene understanding in which texture characteristics of the visual world are used to infer scene categories. We show that in the context of a moving camera, it is common to observe images containing very few individually salient image regions, yet overall texture structure often allows our system to derive powerful contextual cues about the environment. Our approach builds on ideas from texture recognition, and we show that our algorithm outperforms the well-known Gist descriptor on several classification tasks. In the second part of this thesis we are interested in scene understanding in the context of multiple calibrated views of a scene, as might be obtained from a Structure-from-Motion or Simultaneous Localization and Mapping (SLAM) system. Though such systems are capable of localizing the camera robustly and efficiently, the maps produced are typically sparse point clouds that are difficult to interpret and of little use for higher-level reasoning tasks such as scene understanding or human-machine interaction. In this thesis we begin to address this deficiency, presenting progress towards modeling scenes using semantically meaningful primitives such as floor, wall, and ceiling planes. To this end we adopt the indoor Manhattan representation, which was recently proposed for single-view reconstruction. This thesis presents the first in-depth description and analysis of this model in the literature. We describe a probabilistic model relating photometric features, stereo photo-consistencies, and 3D point clouds to Manhattan scene structure in a Bayesian framework. We then present a fast dynamic programming algorithm that solves exact MAP inference in this model in time linear in image size. We show detailed comparisons with the state of the art in both the single- and multiple-view contexts. Finally, we present a framework for learning within the indoor Manhattan hypothesis class. Our system is capable of extrapolating from labelled training examples to predict scene structure for unseen images. We cast learning as a structured prediction problem and show how to optimize with respect to two realistic loss functions. We present experiments in which we learn to recover scene structure from both single and multiple views; from the perspective of our learning algorithm these problems differ only by a change of feature space. This work constitutes one of the most complicated output spaces (in terms of internal constraints) yet considered within a structured prediction framework.
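[Editor's note] The flavor of the dynamic programming step can be conveyed with a much-simplified sketch: assign each image column one of a few structure labels, pay a per-column data cost, and penalize label changes between neighbours. Exact MAP inference is then a Viterbi pass that is linear in image width. This toy omits the geometric constraints of the actual indoor Manhattan model.

```python
import numpy as np

def map_columns(unary, switch_penalty):
    """Viterbi-style exact MAP over per-column labels.

    unary[c, k]    -- negative log-likelihood of label k at column c
    switch_penalty -- cost added whenever adjacent columns disagree
    Runs in O(columns * labels^2), i.e. linear in image width.
    """
    n_cols, n_labels = unary.shape
    cost = unary[0].copy()
    back = np.zeros((n_cols, n_labels), dtype=int)
    for c in range(1, n_cols):
        # Cost of arriving at label k from the best previous label j.
        trans = cost[:, None] + switch_penalty * (1 - np.eye(n_labels))
        back[c] = trans.argmin(axis=0)
        cost = trans.min(axis=0) + unary[c]
    # Trace the optimal labelling back from the last column.
    labels = [int(cost.argmin())]
    for c in range(n_cols - 1, 0, -1):
        labels.append(int(back[c, labels[-1]]))
    return labels[::-1]

rng = np.random.default_rng(1)
print(map_columns(rng.random((640, 3)), switch_penalty=2.0)[:20])
```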
16

DISTRICT HEAT PRICE MODEL ANALYSIS: A risk assessment of Mälarenergi's new district heat price model

Landelius, Erik, Åström, Magnus January 2019 (has links)
Energy efficiency measures in buildings and alternative heating methods have led to a decreased demand for district heating (DH). Furthermore, due to a recent increase in extreme weather events, it is harder for DH providers to maintain a steady production, leading to increased costs. These issues have led DH companies to change their price models. This thesis investigated such a price model change, made by Mälarenergi (ME) on the 1st of August 2018. The aim was to compare the old price model (PM1) with the new price model (PM2) by investigating the choice of base and peak loads a customer can make for the upcoming year, and whether they should let ME choose for them. A prediction method, based on predicting the hourly DH demand, was chosen after a literature study, and several method comparisons were made using weather parameters as independent variables. Consumption data from Mälarenergi for nine customers of different sizes were gathered, and eight weather parameters from 2014 to 2018 were implemented to build the prediction model. The method comparison results from Unscrambler showed that multilinear regression was the most accurate statistical modelling method, and it was later used for all predictions. These predictions from Unscrambler were then used in MATLAB to estimate the total annual cost for each customer and outcome. For PM1, the results showed that the flexible cost for the nine customers accounts for 76 to 85 % of the total cost, with the remaining cost as fixed fees. For PM2, the flexible cost for the nine customers accounts for 46 to 61 % of the total cost, with the remainder as fixed cost. Regarding the total cost, PM2 is on average 7.5 % cheaper than PM1 for smaller customers, 8.6 % cheaper for medium customers and 15.9 % cheaper for larger customers. By finding the lowest-cost case for each customer, their optimal base and peak loads were found, and with the use of a statistical inference method (bootstrapping) a 95 % confidence interval for the base load and the total yearly cost could be established. The conclusion regarding choices is that the customer should always choose their own base load within the recommended confidence interval, with ME's choice seen as a recommendation. Moreover, ME should always make the peak-load choice, because ME is then willing to pay the excess fee that the customer would otherwise have to pay themselves.
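[Editor's note] The bootstrapping step can be sketched as follows; the tariff function and demand series are synthetic stand-ins, since the thesis's Unscrambler/MATLAB pipeline and Mälarenergi's actual price components are not reproduced here.

```python
import numpy as np

# Percentile-bootstrap 95 % confidence interval for the annual cost at a
# given base load. `annual_cost` is a toy tariff, not ME's price model.

def annual_cost(hourly_demand, base_load):
    base = np.minimum(hourly_demand, base_load)
    peak = hourly_demand - base
    return 0.30 * base.sum() + 0.55 * peak.sum() + 120 * base_load

def bootstrap_ci(hourly_demand, base_load, n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(hourly_demand)
    costs = np.array([
        annual_cost(hourly_demand[rng.integers(0, n, n)], base_load)
        for _ in range(n_boot)
    ])
    return np.percentile(costs, [2.5, 97.5])

demand = np.random.default_rng(2).gamma(2.0, 50.0, size=8760)  # synthetic kW series
print(bootstrap_ci(demand, base_load=80.0))
```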
17

Molecular evolution of biological sequences

Vázquez García, Ignacio January 2018 (has links)
Evolution is a ubiquitous feature of living systems. The genetic composition of a population changes in response to the primary evolutionary forces: mutation, selection and genetic drift. Organisms undergoing rapid adaptation acquire multiple mutations that are physically linked in the genome, so their fates are mutually dependent and selection only acts on these loci in their entirety. This aspect has been largely overlooked in the study of asexual or somatic evolution and plays a major role in the evolution of bacterial and viral infections and cancer. In this thesis, we put forward a theoretical description for a minimal model of evolutionary dynamics to identify driver mutations, which carry a large positive fitness effect, among passenger mutations that hitchhike on successful genomes. We examine the effect this mode of selection has on genomic patterns of variation to infer the location of driver mutations and estimate their selection coefficient from time series of mutation frequencies. We then present a probabilistic model to reconstruct genotypically distinct lineages in mixed cell populations from DNA sequencing. This method uses Hidden Markov Models for the deconvolution of genetically diverse populations and can be applied to clonal admixtures of genomes in any asexual population, from evolving pathogens to the somatic evolution of cancer. To understand the effects of selection on rapidly adapting populations, we constructed sequence ensembles in a recombinant library of budding yeast (S. cerevisiae). Using DNA sequencing, we characterised the directed evolution of these populations under selective inhibition of rate-limiting steps of the cell cycle. We observed recurrent patterns of adaptive mutations and characterised common mutational processes, but the spectrum of mutations at the molecular level remained stochastic. Finally, we investigated the effect of genetic variation on the fate of new mutations, which gives rise to complex evolutionary dynamics. We demonstrate that the fitness variance of the population can set a selective threshold on new mutations, setting a limit to the efficiency of selection. In summary, we combined statistical analyses of genomic sequences, mathematical models of evolutionary dynamics and experiments in molecular evolution to advance our understanding of rapid adaptation. Our results open new avenues in our understanding of population dynamics that can be translated to a range of biological systems.
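[Editor's note] The HMM-based deconvolution can be illustrated with a minimal forward-algorithm sketch over loci. The two hidden states, the binomial emission model, and all numbers below are hypothetical simplifications, not the thesis's actual parameterization.

```python
import numpy as np
from scipy.stats import binom

def forward_loglik(alt, depth, freqs=(0.05, 0.60), stay=0.95):
    """Forward algorithm: log P(read counts | two-state lineage HMM).

    Hidden states are toy lineage assignments with expected variant
    allele frequencies `freqs`; emissions are binomial read counts.
    """
    freqs = np.asarray(freqs)
    n_states = freqs.size
    trans = np.full((n_states, n_states), (1.0 - stay) / (n_states - 1))
    np.fill_diagonal(trans, stay)
    log_trans = np.log(trans)
    # Uniform initial state distribution plus the first emission.
    log_alpha = -np.log(n_states) + binom.logpmf(alt[0], depth[0], freqs)
    for a, d in zip(alt[1:], depth[1:]):
        log_alpha = binom.logpmf(a, d, freqs) + \
            np.logaddexp.reduce(log_alpha[:, None] + log_trans, axis=0)
    return float(np.logaddexp.reduce(log_alpha))

rng = np.random.default_rng(3)
depth = rng.integers(30, 100, size=200)      # synthetic sequencing depths
alt = rng.binomial(depth, 0.05)              # synthetic variant read counts
print(forward_loglik(alt, depth))
```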
18

Rich Linguistic Structure from Large-Scale Web Data

Yamangil, Elif 18 October 2013 (has links)
The past two decades have shown an unexpected effectiveness of Web-scale data in natural language processing. Even the simplest models, when paired with unprecedented amounts of unstructured and unlabeled Web data, have been shown to outperform sophisticated ones. It has been argued that the effectiveness of Web-scale data has undermined the necessity of sophisticated modeling or laborious data set curation. In this thesis, we argue for and illustrate an alternative view, that Web-scale data not only serves to improve the performance of simple models, but also can allow the use of qualitatively more sophisticated models that would not be deployable otherwise, leading to even further performance gains. / Engineering and Applied Sciences
19

Learning 3-D Models of Object Structure from Images

Schlecht, Joseph January 2010 (has links)
Recognizing objects in images is an effortless task for most people. Automating this task with computers, however, presents a difficult challenge attributable to large variations in object appearance, shape, and pose. The problem is further compounded by ambiguity from projecting 3-D objects into a 2-D image. In this thesis we present an approach to resolve these issues by modeling object structure with a collection of connected 3-D geometric primitives and a separate model for the camera. From sets of images we simultaneously learn a generative, statistical model for the object representation and parameters of the imaging system. By learning 3-D structure models we are going beyond recognition towards quantifying object shape and understanding its variation. We explore our approach in the context of microscopic images of biological structure and single-view images of man-made objects composed of block-like parts, such as furniture. We express detected features from both domains as statistically generated by an image likelihood conditioned on models for the object structure and imaging system. Our representation of biological structure focuses on Alternaria, a genus of fungus comprising ellipsoid- and cylinder-shaped substructures. In the case of man-made furniture objects, we represent structure with spatially contiguous assemblages of blocks arbitrarily constructed according to a small set of design constraints. We learn the models with Bayesian statistical inference over structure and camera parameters per image, and for man-made objects, across categories, such as chairs. We develop a reversible-jump MCMC sampling algorithm to explore topology hypotheses, and a hybrid of Metropolis-Hastings and stochastic dynamics to search within topologies. Our results demonstrate that we can infer both 3-D object and camera parameters simultaneously from images, and that doing so improves understanding of structure in images. We further show how 3-D structure models can be inferred from single-view images, and that learned category parameters capture structure variation that is useful for recognition.
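[Editor's note] The within-topology search can be illustrated with a bare random-walk Metropolis-Hastings sketch. The toy log-posterior below stands in for the image likelihood over structure and camera parameters, and the reversible-jump moves across topologies are omitted.

```python
import numpy as np

# Random-walk Metropolis-Hastings over a continuous parameter vector
# (think: primitive sizes and camera pose). log_post is a placeholder
# for the image likelihood times the structure/camera priors.

def log_post(theta):
    return -0.5 * np.sum((theta - 1.0) ** 2)   # toy unimodal posterior

def metropolis_hastings(theta0, n_steps=5000, step=0.3, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, float)
    lp = log_post(theta)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal(size=theta.shape)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

chain = metropolis_hastings(np.zeros(6))
print(chain.mean(axis=0))   # should approach the posterior mean of 1.0
```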
20

Delay estimation in computer networks

Johnson, Nicholas Alexander January 2010 (has links)
Computer networks are becoming increasingly large and complex; more so with the recent penetration of the internet into all walks of life. It is essential to be able to monitor and to analyse networks in a timely and efficient manner; to extract important metrics and measurements and to do so in a way which does not unduly disturb or affect the performance of the network under test. Network tomography is one possible method to accomplish these aims. Drawing upon the principles of statistical inference, it is often possible to determine the statistical properties of either the links or the paths of the network, whichever is desired, by measuring at the most convenient points, thus reducing the effort required. In particular, bottleneck-link detection methods, in which estimates of the delay distributions on network links are inferred from measurements made at end-points on network paths, are examined as a means to determine which links of the network are experiencing the highest delay. Initially, two published methods, one based upon a single Gaussian distribution and the other based upon the method-of-moments, are examined by comparing their performance using three metrics: robustness to scaling, bottleneck detection accuracy and computational complexity. Whilst there are many published algorithms, there is little literature in which said algorithms are objectively compared. In this thesis, two network topologies are considered, each with three configurations, in order to determine performance in six scenarios. Two new estimation methods are then introduced, both based on Gaussian mixture models, which are believed to offer an advantage over existing methods in certain scenarios. Computationally, a mixture model algorithm is much more complex than a simple parametric algorithm but the flexibility in modelling an arbitrary distribution is vastly increased. Better model accuracy potentially leads to more accurate estimation and detection of the bottleneck. The concept of increasing flexibility is again considered by using a Pearson type-1 distribution as an alternative to the single Gaussian distribution. This increases the flexibility but with a reduced complexity when compared with mixture model approaches, which necessitate the use of iterative approximation methods. A hybrid approach is also considered, where the method-of-moments is combined with the Pearson type-1 method in order to circumvent problems with the output stage of the former. This algorithm has a higher variance than the method-of-moments but the output stage is more convenient for manipulation. Also considered is a new approach to detection algorithms which is not dependent on any a priori parameter selection and makes use of the Kullback-Leibler divergence. The results show that it accomplishes its aim but is not robust enough to replace the current algorithms. Delay estimation is then cast in a different role, as an integral part of an algorithm to correlate input and output streams in an anonymising network such as The Onion Router (TOR). TOR is used by users in an attempt to conceal network traffic from observation. Breaking the encryption protocols used is not possible without significant effort, but by correlating the un-encrypted input and output streams from the TOR network, it is possible to provide a degree of certainty about the ownership of traffic streams. The delay model is essential as the network is treated as providing a pseudo-random delay to each packet; having an accurate model allows the algorithm to better correlate the streams.
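[Editor's note] The method-of-moments idea behind one of the compared estimators can be shown on the simplest shared-path topology: two paths that traverse one common link. If link delays are mutually independent, the covariance of the two end-to-end delays equals the variance of the shared link's delay, so internal behaviour is identified from edge measurements alone. A minimal sketch under those independence assumptions:

```python
import numpy as np

# Two-leaf tree: path1 = shared + branch1, path2 = shared + branch2,
# with all link delays mutually independent. Then
#   Cov(path1, path2) = Var(shared),
# so end-to-end probes identify the shared link's delay variance
# without measuring that link directly.

rng = np.random.default_rng(4)
shared  = rng.gamma(4.0, 2.0, size=20000)   # shared-link delay (ms)
branch1 = rng.gamma(2.0, 1.0, size=20000)
branch2 = rng.gamma(2.0, 1.5, size=20000)
path1, path2 = shared + branch1, shared + branch2

est_var = np.cov(path1, path2)[0, 1]
print(f"estimated Var(shared) = {est_var:.2f}, true = {shared.var():.2f}")
```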
