  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Bayesian topics relating to the exponential family

Gutierrez-Pena, Eduardo Arturo January 1995 (has links)
No description available.
82

A Metadata Inference Framework to Provide Operational Information Support for Fault Detection and Diagnosis Applications in Secondary HVAC Systems

Gao, Jingkun 01 December 2017 (has links)
As the cost of hardware decreases and software technology advances, building automation systems (BAS) have been widely deployed in new buildings or as part of retrofits replacing old control systems. Though they are becoming more prevalent and promise important benefits to society, such as improved energy efficiency and occupant comfort, many of these benefits remain unrealized. Research suggests that this is because of the heterogeneous, fragmented and non-standardized nature of existing BASs. One of the purported benefits of these systems is the ability to reduce energy consumption through automated approaches such as fault detection and diagnosis (FDD) algorithms. Savings of up to 0.16 quadrillion BTUs per year could be obtained in the US alone through these approaches, which are just software applications running on BAS hardware. However, deploying these applications in buildings remains a challenge due to the non-trivial effort of organizing, managing and extracting the sensor metadata they require (e.g., information about sensor type, function, etc.). One reason for this problem is that varying conventions, acronyms, and standards are used to define this metadata. Though standards and government-mandated policies may lift these obstacles and enable these software-based improvements to our building stock, that effort could take years to come to fruition, and there are alternative technical solutions, such as automated metadata inference techniques, that could help rein in the non-standardized nature of today's BASs. This thesis sheds light on the viability of this alternative approach by answering three key questions, which are validated using data from more than 400 buildings in the US: (a) What is the specific operational information required by FDD approaches for secondary heating, ventilation, and air conditioning (HVAC) systems found in the existing literature?
(b) How is the performance of existing metadata inference approaches affected by changes in building characteristics, weather conditions, building usage patterns, and geographical locations? (c) What approach can provide physical interpretations when incorrect metadata is inferred? We find that: (a) the BAS points required by more than 30% of FDD approaches include six AHU sensors monitoring supply air temperature, outside air temperature, chilled water valve position, return air temperature, supply air flow rate, and mixed air temperature; (b) the average accuracy of existing inference approaches is similar across building sites, though with significant variance, and the expected accuracy of classifying the point types required by a particular FDD application for a new, unseen building is, on average, 75%; (c) a new approach based on physical models, developed and validated on both simulation and real-world data, infers the point types confused by data-driven models with an accuracy ranging from 73% to 100% and can provide physical interpretations in the case of incorrect inference. Our results provide a foundation and starting point for inferring the metadata required by FDD approaches and minimizing the cost of deploying FDD applications across multiple buildings.
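The metadata problem the abstract describes can be made concrete with a toy sketch (not the thesis's method): a rule-based mapper from vendor-specific BAS point names to the six standard point types listed above. The tag dictionary and point names here are invented for illustration.

```python
# Toy rule-based inference of BAS point types from raw point names.
# The tags and the example names are assumptions for illustration only;
# real BAS deployments use far messier, site-specific conventions.

RULES = {
    "SAT": "supply_air_temperature",
    "OAT": "outside_air_temperature",
    "RAT": "return_air_temperature",
    "MAT": "mixed_air_temperature",
    "CHWV": "chilled_water_valve_position",
    "SAF": "supply_air_flow_rate",
}

def infer_type(point_name):
    """Return the first standard type whose tag appears in the raw name."""
    name = point_name.upper()
    for tag, ptype in RULES.items():
        if tag in name:
            return ptype
    return "unknown"

print(infer_type("AHU2.SAT-SP"))  # supply_air_temperature
print(infer_type("ZN-CO2"))       # unknown
```

Such hand-written rules break across sites, which is exactly why the thesis studies learned inference and its transferability to unseen buildings.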
83

Novel Methods for Drug-Target Interaction Prediction using Graph Mining

Ba Alawi, Wail 31 August 2016 (has links)
The problem of developing drugs to cure diseases is important and requires a careful approach. Since pursuing the wrong candidate drug for a particular disease can be very costly in terms of time and money, there is a strong interest in minimizing such risks. Drug repositioning has become a hot topic of research, as it reduces these risks significantly at the early stages of drug development by reusing an approved drug to treat a different disease. Still, finding a new use for a drug is non-trivial, as strong supporting evidence is needed that the proposed new use is plausible. Many computational approaches have been developed to narrow the list of candidate drug-target interactions (DTIs) before any experiments are done. However, many of these approaches suffer from unacceptable levels of false positives. We developed two novel methods based on graph mining of networks of drugs and targets. The first method (DASPfind) finds all non-cyclic paths that connect a drug and a target and, using a function that we define, calculates a score from all the paths. This score describes our confidence that the DTI is correct. We show that DASPfind significantly outperforms other state-of-the-art methods in predicting the top-ranked target for each drug. We demonstrate the utility of DASPfind by predicting 15 novel DTIs over a set of ion channel proteins and confirming 12 of these 15 DTIs through experimental evidence reported in the literature and online drug databases. The second method (DASPfind+) modifies DASPfind to increase the confidence and reliability of the resulting predictions. Based on the structure of the DTI networks, we introduce an optimization scheme that incrementally alters the network structure locally for each drug to achieve more robust top-ranked predictions.
Moreover, we explore the effects of several similarity measures between targets on prediction accuracy and propose an enhanced strategy for DTI prediction. Our results show significant improvements in the accuracy of top-ranked DTI predictions over current state-of-the-art methods.
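The path-based scoring idea behind DASPfind can be sketched as follows. This is a hedged illustration, not the thesis's actual scoring function: it sums edge-weight products over all simple (non-cyclic) paths between a drug and a target, damped by path length; the damping factor and the toy network are assumptions.

```python
# Illustrative path-based DTI scoring: score(drug, target) is the sum,
# over all simple paths, of the product of edge weights on the path,
# damped geometrically by path length. The damping scheme is an assumption;
# DASPfind's exact function is defined in the thesis.

def simple_paths(graph, start, end, path=None):
    """Enumerate all simple (non-cyclic) paths from start to end."""
    path = (path or []) + [start]
    if start == end:
        yield path
        return
    for nbr in graph.get(start, {}):
        if nbr not in path:
            yield from simple_paths(graph, nbr, end, path)

def dti_score(graph, drug, target, damping=0.5):
    """Sum damped edge-weight products over all simple drug-target paths."""
    score = 0.0
    for path in simple_paths(graph, drug, target):
        w = 1.0
        for u, v in zip(path, path[1:]):
            w *= graph[u][v]
        score += w * damping ** (len(path) - 1)
    return score

# Toy heterogeneous network: drug-drug and target-target similarity edges
# plus known drug-target interaction edges (all weights invented).
net = {
    "drugA": {"drugB": 0.8, "target1": 1.0},
    "drugB": {"drugA": 0.8, "target2": 1.0},
    "target1": {"drugA": 1.0, "target2": 0.6},
    "target2": {"drugB": 1.0, "target1": 0.6},
}

print(dti_score(net, "drugA", "target2"))  # 0.35
```

The intuition matches the abstract: many strong, short paths through similar drugs and similar targets raise confidence that an unobserved DTI is real.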
84

The Effects of Sample Size on Measures of Subjective Correlation

Gilkey, Justin Michael 30 July 2008 (has links)
No description available.
85

Inference of gene networks from time series expression data and application to type 1 Diabetes

Lopes, Miguel 04 September 2015 (has links)
The inference of gene regulatory networks (GRN) is of great importance to medical research, as causal mechanisms responsible for phenotypes are unravelled and potential therapeutic targets identified. In type 1 diabetes, insulin-producing pancreatic beta cells are the target of an auto-immune attack leading to apoptosis (cell suicide). Although key genes and regulations have been identified, a precise characterization of the process leading to beta-cell apoptosis has not yet been achieved. The inference of relevant molecular pathways in type 1 diabetes is thus a crucial research topic. GRN inference from gene expression data (obtained with microarray and RNA-seq technology) is a causal inference problem which may be tackled with well-established statistical and machine learning concepts. In particular, the use of time series facilitates the identification of the causal direction in cause-effect gene pairs. However, inference from gene expression data is very challenging due to the large number of genes (in humans, over twenty thousand) and the typically low number of samples in gene expression datasets. In this context, it is important to correctly assess the accuracy of network inference methods. The contributions of this thesis concern three distinct aspects. The first is inference assessment using precision-recall curves, in particular the area under the curve (AUPRC). The typical approach to assessing AUPRC significance uses Monte Carlo simulation; a parametric alternative is proposed. It consists of deriving the mean and variance of the null AUPRC and then using these parameters to fit a beta distribution approximating the true distribution. The second contribution is an investigation of network inference from time series. Several state-of-the-art strategies are experimentally assessed and novel heuristics are proposed. One is a fast approximation of first-order Granger causality scores, suited for GRN inference in the large-variable case.
Another identifies co-regulated genes (i.e., genes regulated by the same genes). Both are experimentally validated using microarray and simulated time series. The third contribution of this thesis, in the context of type 1 diabetes, is a study of beta-cell gene expression after exposure to cytokines, emulating the mechanisms leading to apoptosis. Eight datasets of beta-cell gene expression were used to identify genes differentially expressed before and after 24h, which were functionally characterized using bioinformatics tools. The two most differentially expressed genes, previously unknown in the type 1 diabetes literature (RIPK2 and ELF3), were found to modulate cytokine-induced apoptosis. A regulatory network was then inferred using a dynamic adaptation of a state-of-the-art network inference method. Three out of four predicted regulations (involving RIPK2 and ELF3) were experimentally confirmed, providing a proof of concept for the adopted approach. / Doctorat en Sciences
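The parametric AUPRC assessment described above reduces to moment-matching a beta distribution. A minimal sketch, assuming the null mean and variance are already in hand (the thesis derives them analytically; the numbers below are placeholders):

```python
# Fit a beta(alpha, beta) distribution to the null AUPRC by matching its
# first two moments. The mean/variance values used here are placeholders,
# not the thesis's derived quantities.

def beta_from_moments(mean, var):
    """Return (alpha, beta) of the beta distribution with the given moments."""
    if not 0 < mean < 1 or not 0 < var < mean * (1 - mean):
        raise ValueError("moments incompatible with a beta distribution")
    common = mean * (1 - mean) / var - 1
    return mean * common, (1 - mean) * common

# e.g. a null AUPRC with mean 0.1 and variance 0.002
a, b = beta_from_moments(0.1, 0.002)
print(a, b)  # the fitted beta(a, b) approximates the null AUPRC distribution
```

Significance of an observed AUPRC then follows from the beta tail probability, avoiding the cost of Monte Carlo resampling.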
86

Emotion and predictive processing : emotions as perceptions?

Araya, Jose Manuel January 2018 (has links)
In this thesis, I systematize, clarify, and expand the current theory of emotion based on the principles of predictive processing (the interoceptive inference view of emotion) so as to show the following: (1) as it stands, this view is problematic; (2) once expanded, the view can deal with its most pressing problems, and it compares favourably to competing accounts. Thus, the interoceptive inference view of emotion stands out as a plausible theory of emotion. According to the predictive processing (PP) framework, all the brain does, in all its functions, is minimize its precision-weighted prediction error (PE) (Clark, 2013, 2016; Hohwy, 2013). Roughly, PE consists of the difference between the sensory signals expected (and generated) from the top down and the actual, incoming sensory signals. Now, in the PP framework, visual percepts are formed by minimizing visual PE in a specific manner: via visual perceptual inference. That is, the brain forms visual percepts in a top-down fashion by predicting its incoming lower-level sensory signals from higher-level models of the likely (hidden) causes of those visual signals. Such models can be seen as putting forward content-specifying hypotheses about the object or event responsible for triggering incoming sensory activity. A contentful percept is formed once a certain hypothesis succeeds in matching, and thus suppressing, current lower-level sensory signals. In the interoceptive inference approach to interoception (Seth, 2013, 2015), the principles of PP have been extended to account for interoception, i.e., the perception of our homeostatic, physiological condition. Just as perception in the visual domain arises via visual perceptual inference, the interoceptive inference approach holds that perception of the inner, physiological milieu arises via interoceptive perceptual inference.
Now, what might be called the interoceptive inference theory of valence (ITV) holds that the interoceptive inference approach can account for subjective feeling states in general, i.e., mental states that feel good or bad (valenced mental states). According to ITV, affective valence arises by way of interoceptive perceptual inference. On the other hand, what might be called the interoceptive inference view of emotion (IIE) holds that the interoceptive inference approach can account for emotions per se (e.g., fear, anger, joy). More precisely, IIE holds that, in direct analogy to the way visual percepts are formed, emotions arise from interoceptive predictions of the causes of current interoceptive afferents. In other words, emotions per se amount to interoceptive percepts formed via higher-level, content-specifying emotion hypotheses. In this thesis, I aim to systematize, clarify, and expand the interoceptive inference approach to interoception in order to show that: (1) contrary to non-sensory theories of affective valence, valence is indeed constituted by interoceptive perceptions, and interoceptive percepts do arise via interoceptive perceptual inference; therefore, ITV holds. (2) Since IIE rests on problematic assumptions, it should be amended. In this respect, I argue that emotions do not arise via interoceptive perceptual inference (as IIE claims), since this assumes that there must be regularities pertaining to emotion in the physiological domain. I suggest that emotions instead arise by minimizing interoceptive PE in another fashion: via external interoceptive active inference, that is, by sampling and modifying the external environment in order to change an already formed interoceptive percept (itself formed via interoceptive perceptual inference). That is, emotions are specific strategies for regulating affective valence.
More precisely, I defend the view that a certain emotion E amounts to a specific strategy for minimizing interoceptive PE by way of a specific set of stored knowledge of the counterfactual relations that obtain between (possible) actions and their prospective interoceptive, sensory consequences ("if I act in this manner, interoceptive signals should evolve in such-and-such a way"). An emotion arises when such knowledge is applied in order to regulate valence.
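The core PP mechanism the abstract relies on, minimization of precision-weighted prediction error, can be sketched in a few lines. This toy update is illustrative only (the learning rate, precision, and signal values are assumptions, not from the thesis): a latent estimate is nudged toward the incoming signal in proportion to the precision-weighted error.

```python
# Toy precision-weighted prediction-error minimization: a single top-down
# estimate mu predicts an incoming signal, and each update corrects mu by
# the prediction error weighted by its precision (inverse variance).
# All numeric values are illustrative.

def update(mu, observation, precision, lr=0.1):
    """One gradient step reducing precision-weighted squared PE."""
    error = observation - mu               # prediction error (PE)
    return mu + lr * precision * error     # precision-weighted correction

mu = 0.0                                   # initial prediction
for _ in range(50):
    mu = update(mu, observation=1.0, precision=2.0)
print(round(mu, 3))                        # mu converges toward the signal
```

High precision makes the error count for more, so the estimate tracks the signal quickly; low precision lets the prior prediction dominate, which is the weighting role the abstract assigns to precision.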
87

Essays on Inference in Linear Mixed Models

Kramlinger, Peter 28 April 2020 (has links)
No description available.
88

Mixtures of triangular densities with applications to Bayesian mode regressions

Ho, Chi-San 22 September 2014 (has links)
The main focus of this thesis is to develop full parametric and semiparametric Bayesian inference for data arising from triangular distributions. A natural consequence of working with such distributions is that they allow one to consider regression models where the response variable is the mode of the data distribution. A new family of nonparametric prior distributions is developed for a class of convex densities of particular relevance to mode regressions. Triangular distributions arise in several contexts, such as geosciences, econometrics, finance, health care management, sociology, reliability engineering, and decision and risk analysis. In many fields, experts typically have a reasonable idea of the range and most likely values that define a data distribution. Eliciting these quantities is thus generally easier than eliciting moments of other commonly known distributions. Using simulated and actual data, applications of triangular distributions, with and without mode regressions, in some of the aforementioned areas are tackled.
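Why the mode is a natural target for triangular data can be illustrated with a small frequentist toy (the thesis itself is Bayesian; the data and grid here are invented): the triangular density on [a, b] with mode c, and a grid-search maximum-likelihood estimate of c.

```python
import math

def tri_pdf(x, a, c, b):
    """Triangular density on [a, b] with mode c (a < c < b)."""
    if x < a or x > b:
        return 0.0
    if x <= c:
        return 2 * (x - a) / ((b - a) * (c - a))
    return 2 * (b - x) / ((b - a) * (b - c))

def log_lik(data, a, c, b):
    """Log-likelihood of the sample for a given mode c."""
    return sum(math.log(tri_pdf(x, a, c, b)) for x in data)

data = [0.2, 0.3, 0.35, 0.4, 0.6]        # invented sample on [0, 1]
grid = [i / 100 for i in range(5, 96)]   # candidate modes strictly inside (0, 1)
best = max(grid, key=lambda c: log_lik(data, 0.0, c, 1.0))
print(best)
```

With the range [a, b] fixed, the mode is the single shape parameter left to infer, which is exactly the quantity the Bayesian mode regressions above place structure on.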
89

A microcanonical cascade formalism for multifractal systems and its application to data inference and forecasting

Pont, Oriol 24 April 2009 (has links) (PDF)
Many complex systems in Nature are multifractal, a feature closely related to scale invariance. Multifractality is ubiquitous and so it can be found in systems as diverse as marine turbulence, econometric series, heartbeat dynamics and the solar magnetic field. In recent years, there has been growing interest in modelling the multifractal structure in these systems. This has improved our understanding of certain phenomena and has opened the way for applications such as reduction of coding redundancy, reconstruction of data gaps and forecasting of multifractal variables. Exhaustive multifractal characterization of experimental data is needed for tuning parameters of the models. The design of appropriate algorithms to achieve this purpose remains a major challenge, since discretization, gaps, noise and long-range correlations require advanced processing, especially since multifractal signals are not smooth: due to scale invariance, they are intrinsically uneven and intermittent. In the present study, we introduce a formalism for multifractal data based on microcanonical cascades. We show that with appropriate selection of the representation basis, we greatly improve inference capabilities in a robust fashion. In addition, we show two applications of microcanonical cascades: first, forecasting of stock market series; and second, detection of interscale heat transfer in the ocean.
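The cascade idea can be illustrated with the canonical binomial multiplicative cascade, a simplified relative of the microcanonical cascades the abstract describes (the weight, depth, and random seeding are illustrative assumptions):

```python
# Binomial multiplicative cascade: starting from unit mass, each cell is
# recursively split in two, passing fractions p and 1-p of its mass to the
# children. The resulting measure is intermittent and scale-invariant,
# the hallmark of multifractality. Parameters are illustrative.

import random

def binomial_cascade(depth, w=0.7, rng=random):
    """Generate 2**depth cascade cells by recursive mass splitting."""
    measure = [1.0]
    for _ in range(depth):
        nxt = []
        for m in measure:
            p = w if rng.random() < 0.5 else 1 - w  # randomize left/right
            nxt.extend([m * p, m * (1 - p)])
        measure = nxt
    return measure

random.seed(0)
series = binomial_cascade(depth=8)
print(len(series), sum(series))  # 256 cells; total mass conserved at every level
```

Because each split conserves mass exactly (m goes to m*p plus m*(1-p)), the construction is measure-preserving, which is the property microcanonical cascades enforce locally at every scale.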
90

Time series classification

Rajan, Jebu Jacob January 1994 (has links)
No description available.
