About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Radio frequency interference modeling and mitigation in wireless receivers

Gulati, Kapil 21 October 2011 (has links)
In wireless communication systems, receivers have generally been designed under the assumption that the additive noise in the system is Gaussian. Wireless receivers, however, are affected by radio frequency interference (RFI) generated from various sources such as other wireless users, switching electronics, and computational platforms. RFI is well modeled using non-Gaussian impulsive statistics and can severely degrade the communication performance of wireless receivers designed under the assumption of additive Gaussian noise. Methods to avoid, cancel, or reduce RFI have been an active area of research over the past three decades. In practice, RFI cannot be completely avoided or canceled at the receiver. This dissertation derives the statistics of the residual RFI and utilizes them to analyze and improve the communication performance of wireless receivers. The primary contributions of this dissertation are to (i) derive instantaneous statistics of co-channel interference in a field of Poisson and Poisson-Poisson clustered interferers, (ii) characterize throughput, delay, and reliability of decentralized wireless networks with temporal correlation, and (iii) design pre-filters to mitigate RFI in wireless receivers.
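The statistics of interference from a Poisson field of interferers, as studied above, can be explored by simulation. The sketch below is not from the dissertation; the density, guard distance, path-loss exponent, and fading model are assumptions chosen only to illustrate the impulsive, heavy-tailed behavior that distinguishes such interference from Gaussian noise:

```python
import numpy as np

def interference_samples(n_samples, density, radius, alpha=4.0, seed=0):
    """Aggregate interference amplitude at the origin from interferers scattered
    as a 2-D Poisson process in a disc of the given radius (1 m guard zone)."""
    rng = np.random.default_rng(seed)
    area = np.pi * radius**2
    out = np.empty(n_samples)
    for i in range(n_samples):
        k = rng.poisson(density * area)                   # number of active interferers
        r = np.sqrt(rng.uniform(1.0, radius**2, size=k))  # distances, uniform in area
        fading = rng.rayleigh(size=k)                     # per-interferer amplitude fading
        out[i] = np.sum(fading * r**(-alpha / 2))         # power-law path-loss attenuation
    return out

I = interference_samples(5000, density=1e-3, radius=100.0, alpha=4.0)
# Impulsive interference shows large positive excess kurtosis; Gaussian noise is ~0.
excess_kurtosis = float(((I - I.mean())**4).mean() / I.var()**2 - 3.0)
```

Occasional interferers near the receiver dominate the sum, which is what produces the impulsive outliers a Gaussian-noise receiver design handles poorly.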
22

Multivariate Modeling in Chemical Toner Manufacturing Process

Khorami, Hassan January 2013 (has links)
Process control and monitoring is a common problem in high-value-added chemical manufacturing industries, where batch processes are used to produce a wide range of products on the same piece of equipment. This results in frequent adjustments to control and monitoring schemes. A chemical toner manufacturing process is used in this thesis as a representative industrial case. Process control and monitoring of batch processes has been researched, mostly through simulation, and published in the past. However, applying the subject to a chemical toner manufacturing process, or using a single indicator for multiple pieces of equipment, has not been visited previously. In the case study of this research, many different factors may affect the final quality of the products, including reactor batch temperature, jacket temperature, impeller speed, rate of material addition to the reactor, and process variables associated with the pre-weigh tank. One of the challenging tasks for engineers is to monitor these process variables, make necessary adjustments during the progression of a batch, and change the control strategy of future batches upon completion of an existing batch. Another objective of the proposed research is to establish operational boundaries for monitoring the process, using the process trajectories of past successful batches. In this research, process measurements and product quality values of past successful batches were collected and arranged into a matrix of data, then preprocessed through time alignment, centering, and scaling. The preprocessed data was projected into lower dimensions to produce latent variables and their trajectories during successful batches. Following the identification of latent variables, an empirical model representing the operation of a successful batch was built through 4-fold cross-validation.
The behavior of two abnormal batches, batches 517 and 629, was then compared to the model by testing its statistical properties. Once the abnormal batches were flagged, their data sets were folded back to the original dimensions to localize the time of the abnormality and the process variables that contributed to it. In each case the process measurements were used to establish operational boundaries in the latent variable space.
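Latent-variable batch monitoring of this kind is commonly implemented with principal component analysis and a Hotelling T² statistic. The thesis's actual model is not reproduced here; the following is a minimal illustrative sketch in which the data, number of components, and variable names are all assumptions:

```python
import numpy as np

def fit_pca_monitor(X, n_components=2):
    """Fit a PCA monitoring model on historical batch data (rows = batches)."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sigma                         # centering and scaling
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                      # loadings (latent directions)
    scores = Z @ P
    var = scores.var(axis=0, ddof=1)             # variance of each latent variable
    return {"mu": mu, "sigma": sigma, "P": P, "var": var}

def hotelling_t2(model, x):
    """Hotelling T^2 of a new batch in the latent-variable space."""
    z = (x - model["mu"]) / model["sigma"]
    t = z @ model["P"]
    return float(np.sum(t**2 / model["var"]))

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))                     # 30 historical batches, 5 variables
model = fit_pca_monitor(X)
t2_normal = hotelling_t2(model, X[0])
t2_abnormal = hotelling_t2(model, X[0] + 10.0)   # grossly shifted (abnormal) batch
```

A batch whose T² exceeds a control limit derived from the historical distribution would be flagged, after which its contribution per original variable can be inspected, analogous to the fold-back localization described above.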
23

NOVEL COMPUTATIONAL METHODS FOR TRANSCRIPT RECONSTRUCTION AND QUANTIFICATION USING RNA-SEQ DATA

Huang, Yan 01 January 2015 (has links)
The advent of RNA-seq technologies provides an unprecedented opportunity to precisely profile the mRNA transcriptome of a specific cell population. It helps reveal the characteristics of the cell under a particular condition such as a disease. It is now possible to discover mRNA transcripts not cataloged in existing databases, in addition to assessing the identities and quantities of known transcripts in a given sample or cell. However, the sequence reads obtained from an RNA-seq experiment are only short fragments of the original transcripts. How to recapitulate the mRNA transcriptome from short RNA-seq reads remains a challenging problem. We have proposed two methods directly addressing this challenge. First, we developed a novel method, MultiSplice, to accurately estimate the abundance of well-annotated transcripts. Driven by the desire to detect novel isoforms, a max-flow-min-cost algorithm named Astroid was designed to simultaneously discover the presence and quantities of all possible transcripts in the transcriptome. We further extended an ab initio pipeline of transcriptome analysis to large-scale datasets that may contain hundreds of samples. The effectiveness of the proposed methods has been supported by a series of simulation studies, and their application on real datasets suggests a promising opportunity for reconstructing the mRNA transcriptome, which is critical for revealing variations among cells (e.g. disease vs. normal).
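The Astroid algorithm itself is a max-flow-min-cost network formulation not reproduced here. As a rough illustration of the underlying idea, selecting a cheapest set of candidate transcripts that explains all observed evidence, the following brute-force toy works on invented data (the isoform names, junctions, and costs are not from the dissertation):

```python
from itertools import combinations

# Toy data: each candidate isoform covers a set of observed exon junctions,
# with a cost penalizing model complexity (all values invented).
candidates = {
    "iso_A": ({"j1", "j2"}, 1.0),
    "iso_B": ({"j2", "j3"}, 1.0),
    "iso_C": ({"j1", "j2", "j3"}, 1.8),
}
observed = {"j1", "j2", "j3"}

def cheapest_explanation(candidates, observed):
    """Exhaustively pick the minimum-cost subset of isoforms covering all junctions."""
    best, best_cost = None, float("inf")
    names = list(candidates)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            covered = set().union(*(candidates[n][0] for n in subset))
            cost = sum(candidates[n][1] for n in subset)
            if covered >= observed and cost < best_cost:
                best, best_cost = set(subset), cost
    return best, best_cost

best, cost = cheapest_explanation(candidates, observed)
# Here the single isoform iso_C (cost 1.8) beats the pair iso_A + iso_B (cost 2.0).
```

A real solver replaces this exponential search with a polynomial min-cost-flow computation, which is what makes enumerating "all possible transcripts" tractable.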
24

Credibility modeling with applications

Khapaeva, Tatiana 16 May 2014 (has links)
The purpose of this thesis is to show how the theory and practice of credibility can benefit statistical modeling. The task was, fundamentally, to derive models that could provide the best estimate of the losses for any given class and also to assess the variability of the losses, both from a class perspective as well as from an aggregate perspective. The model fitting and diagnostic tests will be carried out using standard statistical packages. A case study that predicts the number of deaths due to cancer is considered, utilizing data furnished by the Colorado Department of Public Health and Environment. Several credibility models are used, including Bayesian, Bühlmann and Bühlmann-Straub approaches, which are useful in a wide range of actuarial applications.
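The Bühlmann approach mentioned above blends a class's own experience with the collective mean through a credibility factor Z = n/(n + k), where k is the ratio of the expected process variance to the variance of the hypothetical means. A minimal sketch with invented numbers (not the thesis's cancer data):

```python
def buhlmann_estimate(class_mean, collective_mean, n, epv, vhm):
    """Buhlmann credibility estimate: Z * class mean + (1 - Z) * collective mean."""
    k = epv / vhm              # expected process variance / variance of hypothetical means
    z = n / (n + k)            # credibility factor, in [0, 1)
    return z * class_mean + (1 - z) * collective_mean, z

# Invented example: 5 years of class experience averaging 120 against a
# collective mean of 100, with EPV = 50 and VHM = 10 (so k = 5, Z = 0.5).
estimate, z = buhlmann_estimate(class_mean=120.0, collective_mean=100.0,
                                n=5, epv=50.0, vhm=10.0)
# estimate = 110.0: halfway between class and collective means.
```

As n grows relative to k, Z approaches 1 and the class's own experience dominates, which is exactly the "best estimate per class vs. aggregate" trade-off the abstract describes.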
26

EVALUATION OF STATISTICAL METHODS FOR MODELING HISTORICAL RESOURCE PRODUCTION AND FORECASTING

Nanzad, Bolorchimeg 01 August 2017 (has links)
This master’s thesis project consists of two parts. Part I compares modeling of historical resource production and forecasting of future production trends using the logit/probit transform advocated by Rutledge (2011) with conventional Hubbert curve fitting, using global coal production as a case study. The conventional Hubbert/Gaussian method fits a curve to historical production data, whereas the logit/probit transform uses a linear fit to a subset of transformed production data. Within the errors and limitations inherent in this type of statistical modeling, these methods provide comparable results. That is, despite the apparent goodness-of-fit achievable using the logit/probit methodology, neither approach provides a significant advantage over the other, either in explaining the observed data or in making future projections. For mature production regions, those that have already substantially passed peak production, results obtained by either method are closely comparable and reasonable, and estimates of ultimately recoverable resources obtained by either method are consistent with geologically estimated reserves. In contrast, for immature regions, estimates of ultimately recoverable resources generated by either method are unstable and thus need to be used with caution. Although the logit/probit transform generates a high quality of fit to historical production data, it provides no new information compared to conventional Gaussian or Hubbert-type models and may mask noise and/or instability in the data and the derived fits. In particular, production forecasts for immature or marginally mature production systems based on either method need to be regarded with considerable caution. Part II of the project investigates the utility of a novel alternative method for multicyclic Hubbert modeling, tentatively termed "cycle-jumping", wherein the overlap of multiple cycles is limited.
The model is designed so that each cycle is described by the same three parameters as the conventional multicyclic Hubbert model, and every two adjacent cycles are connected by a transition. The transition marks the shift from one cycle to the next and is described as a weighted coaddition of the two neighboring cycles, determined by three parameters: the transition year, the transition width, and a γ weighting parameter. The cycle-jumping method provides a better model than the conventional multicyclic Hubbert approach and reflects historical production behavior more reasonably and practically: by explicitly considering the form of the transitions between production cycles, it better captures the effects of technological transitions and socioeconomic factors on historical resource production.
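The logit transform discussed in Part I exploits the fact that, for logistic (Hubbert-type) growth, logit(Q/URR) is linear in time, so a straight-line fit to transformed cumulative production recovers the growth rate. A sketch on synthetic data (the URR, rate, and years below are invented, not the thesis's coal figures):

```python
import numpy as np

def logit_fit(t, Q, urr):
    """Linear fit to logit-transformed cumulative production: for logistic
    growth, log((Q/URR) / (1 - Q/URR)) = r * (t - t0), linear in t."""
    f = Q / urr
    y = np.log(f / (1.0 - f))          # logit transform of cumulative fraction
    slope, intercept = np.polyfit(t, y, 1)
    return slope, intercept

# Synthetic logistic cumulative production (all values invented)
URR, r, t0 = 1000.0, 0.1, 1990.0
t = np.arange(1950, 2031, dtype=float)
Q = URR / (1.0 + np.exp(-r * (t - t0)))
slope, intercept = logit_fit(t, Q, URR)   # slope recovers r = 0.1
```

The catch the thesis identifies is visible here: the transform requires assuming URR up front, and for immature regions small errors in that assumption produce an apparently excellent linear fit while the implied URR remains unstable.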
27

Estimativa da irradiação solar global pelo método de Angstrom-Prescott e técnicas de aprendizado de máquinas / Estimation of global solar irradiation by Angstrom-Prescott method and machinelearning techniques

Silva, Maurício Bruno Prado da [UNESP] 22 February 2016 (has links)
This work describes a comparative study of methods for estimating global solar irradiation (HG) at the daily (HGd) and monthly (HGm) partitions, generated by the Angstrom-Prescott (A-P) technique and two machine learning (ML) techniques, Support Vector Machines (SVM) and Artificial Neural Networks (ANN). The database used was measured from 1996 to 2011 at the solarimetric station in Botucatu. Through regression between atmospheric transmissivity (HG/HO) and insolation ratio (n/N), the statistical (A-P) model was determined, yielding linear equations that estimate HG with high coefficients of determination. The SVM and ANN techniques were trained on the same architecture as A-P (model 1), and further trained on three more models adding, one by one, the variables air temperature, rainfall, and relative humidity (models 2, 3, and 4).
The models were validated using a database of two years, termed typical and atypical, through correlations between estimated and measured values and the statistical indicators rMBE, MBE, rRMSE, RMSE, and Willmott's d. The correlation coefficients (r) showed that the (A-P) model can estimate HG with high coefficients of determination under both validation conditions, and the rMBE, MBE, rRMSE, RMSE, and Willmott's d indicators show that the (A-P) model can be used to estimate HGd with accuracy and precision. The statistical indicators obtained by the four models of the SVMd and ANNd (daily) and SVMm and ANNm (monthly) techniques show that they can be used to estimate HGd with high correlation, precision, and accuracy. Among the models, SVM4d and ANN4d (daily) and SVM1m and ANN1m (monthly) were selected by comparing the statistical indicators. Comparison of the rMBE, MBE, rRMSE, RMSE, Willmott's d, r, and R2 indicators obtained in validation among the (A-P), SVM, and ANN models showed that the SVM technique outperformed the statistical (A-P) model, that SVM also outperformed the ANN, and that the statistical (A-P) model performed better overall than the ANN.
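The Angstrom-Prescott regression at the core of the statistical model is a linear fit of transmissivity against sunshine fraction, HG/HO = a + b·(n/N). The sketch below uses invented coefficients and data, not the Botucatu measurements:

```python
import numpy as np

def fit_angstrom_prescott(n_over_N, HG_over_HO):
    """Least-squares fit of the Angstrom-Prescott relation HG/HO = a + b * (n/N)."""
    b, a = np.polyfit(n_over_N, HG_over_HO, 1)
    return a, b

def estimate_HG(a, b, n_over_N, HO):
    """Estimate global irradiation from sunshine fraction and extraterrestrial HO."""
    return (a + b * n_over_N) * HO

# Invented data following a known relation HG/HO = 0.25 + 0.50 * (n/N)
rng = np.random.default_rng(1)
nN = rng.uniform(0.1, 0.9, size=200)   # daily insolation ratios
kt = 0.25 + 0.50 * nN                  # atmospheric transmissivity
a, b = fit_angstrom_prescott(nN, kt)
```

The ML models in the thesis keep this same input (model 1) and then stack additional predictors (temperature, rainfall, humidity) onto it in models 2-4.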
28

Identificação do grau de comprometimento neurossensório-motor utilizando tempo de resposta / Identifying the degree of neurosensory-motor impairment using response time

Klafke, Marcelo 16 January 2017 (has links)
This work proposes the development of an evolution scale for neurosensory-motor pathology through a statistical approach and the use of response time tests. Reference data are collected in two groups: i) individuals affected and ii) individuals not affected by Multiple Sclerosis. A portable measuring device was developed, with the differential of measuring simple and choice response times for upper and lower limbs using auditory and visual stimuli. A method is proposed to evaluate the effectiveness of the rehabilitation process of patients with neurosensory-motor impairment through standardized quantitative measures that aid clinical decisions and treatment. The result of the proposed method is a multidimensional graph consisting of concurrent axes formed by scales normalized between 0% and 100% that indicate the level of neurosensory-motor impairment assessed from the response time data.
The data distribution is analyzed using the maximum likelihood method and descriptive statistics, and the use of the proposed scale and graph model is demonstrated and compared with the Kurtzke Expanded Disability Status Scale in Multiple Sclerosis patients.
29

Development, evaluation and application of inference-based decision support methods to meet the rising wood demands of the growing bio-economy sector

Husmann, Kai 16 October 2017 (has links)
No description available.
30

Assessing The Probability Of Fluid Migration Caused By Hydraulic Fracturing; And Investigating Flow And Transport In Porous Media Using MRI

Montague, James 01 January 2017 (has links)
Hydraulic fracturing is used to extract oil and natural gas from low-permeability formations. The potential of fluids migrating from depth through adjacent wellbores and through the production wellbore was investigated using statistical modeling and predictive classifiers. The probability of a hydraulic fracturing well becoming hydraulically connected to an adjacent well in the Marcellus shale of New York was determined to be between 0.00% and 3.45% at the time of the study. This means that the chance of an induced fracture from hydraulic fracturing intersecting an existing well is highly dependent on the area of increased permeability caused by fracturing. The chance of intersecting an existing well does not mean that fluid will flow upwards; for upward migration to occur, a pathway must exist and a pressure gradient is required to drive flow, with the exception of gas flow caused by buoyancy. Predictive classifiers were employed on a dataset of wells in Alberta, Canada, to identify the well characteristics most associated with fluid migration along the production well. The models, specifically a random forest, were able to identify pathways better than random guessing, with 78% of wells in the dataset identified correctly. Magnetic resonance imaging (MRI) was used to visualize and quantify contaminant transport in a soil column using a full-body scanner. T1 quantification was used to determine the concentration of a contaminant surrogate in the form of Magnevist, an MRI contrast agent. Imaging showed a strong impact from density-driven convection when the density difference between the two fluids was small (0.3%). MRI also identified a buildup of contrast agent concentration at the interface between a low-permeability ground silica and a higher-permeability AFS 50-70 testing sand when density-driven convection was eliminated.
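A random forest classifier of the kind used on the Alberta well data can be sketched generically; the thesis's actual features and dataset are not available here, so the example below trains on synthetic stand-in data (feature meanings and all numbers are invented):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the well dataset: two informative features loosely
# separate migrating (1) from non-migrating (0) wells.
rng = np.random.default_rng(42)
n = 400
X = rng.normal(size=(n, 4))            # e.g. well depth, age, deviation, density (invented)
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)       # fraction of held-out wells classified correctly
# clf.feature_importances_ then ranks which characteristics drive the prediction,
# analogous to identifying characteristics most associated with migration.
```

The held-out accuracy is the analogue of the 78% correct-classification figure reported above, and the feature importances are what link the classifier back to physically interpretable well characteristics.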
