About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Improvements to Clause Weighting Local Search for Propositional Satisfiability

Ferreira Junior, Valnir, N/A January 2007 (has links)
The propositional satisfiability (SAT) problem is of considerable theoretical and practical relevance to the artificial intelligence (AI) community and has been used to model many pervasive AI tasks such as default reasoning, diagnosis, planning, image interpretation, and constraint satisfaction. Computational methods for SAT have historically fallen into two broad categories: complete search and local search. Within the local search category, clause weighting methods are amongst the best alternatives for SAT, becoming particularly attractive on problems where a complete search is impractical or where there is a need to find good candidate solutions within a short time. The thesis is concerned with the study of improvements to clause weighting local search methods for SAT. The main contributions are:

- A component-based framework for the functional analysis of local search methods.
- A clause weighting local search heuristic that exploits longer-term memory arising from clause weight manipulations. The approach first learns which clauses are globally hardest to satisfy and then uses this information to treat these clauses differentially during weight manipulation [Ferreira Jr and Thornton, 2004].
- A study of heuristic tie breaking in the domain of additive clause weighting local search methods, and the introduction of a competitive method that uses heuristic tie breaking instead of the random tie breaking approach used in most existing methods [Ferreira Jr and Thornton, 2005].
- An evaluation of backbone guidance for clause weighting local search, and the introduction of backbone guidance to three state-of-the-art clause weighting local search methods [Ferreira Jr, 2006].
- A new clause weighting local search method for SAT that successfully exploits synergies between the longer-term memory and tie breaking heuristics developed in the thesis to significantly improve on the performance of current state-of-the-art local search methods for SAT-encoded instances containing identifiable CSP structure.

Portions of this thesis have appeared in the following refereed publications:

- Longer-term memory in clause weighting local search for SAT. In Proceedings of the 17th Australian Joint Conference on Artificial Intelligence, volume 3339 of Lecture Notes in Artificial Intelligence, pages 730–741, Cairns, Australia, 2004.
- Tie breaking in clause weighting local search for SAT. In Proceedings of the 18th Australian Joint Conference on Artificial Intelligence, volume 3809 of Lecture Notes in Artificial Intelligence, pages 70–81, Sydney, Australia, 2005.
- Backbone guided dynamic local search for propositional satisfiability. In Proceedings of the Ninth International Symposium on Artificial Intelligence and Mathematics, AI&M, Fort Lauderdale, Florida, 2006.
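The clause weighting idea underlying these contributions can be sketched in a few lines: maintain a weight per clause, greedily flip the variable that most reduces the total weight of unsatisfied clauses, and additively increase the weights of unsatisfied clauses whenever the search reaches a local minimum. The sketch below is a didactic reconstruction in the spirit of additive methods such as DLM and PAWS, not the thesis's own heuristic; the function and parameter names are invented.

```python
import random

def clause_weighting_search(clauses, n_vars, max_flips=10_000, seed=0):
    """Additive clause-weighting local search (didactic sketch).

    `clauses` is a list of tuples of non-zero ints, DIMACS-style:
    a positive literal v is satisfied when variable v is True,
    a negative literal -v when variable v is False.
    """
    rng = random.Random(seed)
    # Index 0 is padding so that assign[v] addresses variable v directly.
    assign = [False] + [rng.choice([False, True]) for _ in range(n_vars)]
    weights = [1] * len(clauses)

    def satisfied(clause):
        return any((lit > 0) == assign[abs(lit)] for lit in clause)

    def weighted_cost():
        return sum(w for w, c in zip(weights, clauses) if not satisfied(c))

    for _ in range(max_flips):
        base = weighted_cost()
        if base == 0:
            return assign  # every clause satisfied
        best_var, best_cost = None, base
        for v in range(1, n_vars + 1):
            assign[v] = not assign[v]
            cost = weighted_cost()
            assign[v] = not assign[v]
            if cost < best_cost:
                best_var, best_cost = v, cost
        if best_var is not None:
            assign[best_var] = not assign[best_var]   # strictly improving flip
        else:
            for i, c in enumerate(clauses):           # local minimum: reweight
                if not satisfied(c):
                    weights[i] += 1
    return None  # no model found within the flip budget
```

The reweighting step is what lets the search escape local minima without random restarts: clauses that repeatedly remain unsatisfied accumulate weight until flipping in their favour becomes the greedy choice.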
22

Computational methods for the analysis of HIV drug resistance dynamics

Al Mazari, Ali January 2007 (has links)
Doctor of Philosophy (PhD) / Despite the extensive quantitative and qualitative knowledge about therapeutic regimens and the molecular biology of HIV/AIDS, the eradication of HIV infection cannot be achieved with available antiretroviral regimens. HIV drug resistance remains the most challenging factor in the application of approved antiretroviral agents. Previous investigations and existing HIV/AIDS models and algorithms have not enabled the development of long-lasting and preventive drug agents. Therefore, the analysis of the dynamics of drug resistance and the development of sophisticated HIV/AIDS analytical algorithms and models are critical for the development of new, potent antiviral agents and for a greater understanding of the evolutionary behaviours of HIV. This study presents novel computational methods for the analysis of drug-resistance dynamics, including viral sequences, phenotypic resistance, immunological and virological responses, and key clinical data from HIV-infected patients at Royal Prince Alfred Hospital in Sydney. The lability of immunological and virological responses is analysed in the context of the evolution of antiretroviral drug-resistance mutations. A novel Bayesian algorithm is developed for the detection and classification of neutral and adaptive mutational patterns associated with HIV drug resistance. To simplify and provide insights into the multifactorial interactions between viral populations, immune-system cells, drug resistance and treatment parameters, a Bayesian graphical model of drug-resistance dynamics is developed; the model supports the exploration of the interdependent associations among these dynamics.
23

Implementation of B-splines in a Conventional Finite Element Framework

Owens, Brian C. 16 January 2010 (has links)
The use of B-spline interpolation functions in the finite element method (FEM) is not a new subject. B-splines have been utilized in finite elements for many reasons. One reason is the higher continuity of derivatives and smoothness of B-splines. Another is the possibility of reducing the required number of degrees of freedom compared to a conventional finite element analysis. Furthermore, if B-splines are utilized to represent the geometry of a finite element model, interfacing a finite element analysis program with existing computer-aided design programs (which make extensive use of B-splines) is possible. While B-splines have been used in finite element analysis for the aforementioned reasons, it is difficult to find resources that describe the process of implementing B-splines in an existing finite element framework. Therefore, it is necessary to document this methodology. One goal is to implement B-spline interpolation functions in a finite element framework such that the implementation conforms to the structure of conventional finite elements, requires exceptions in methodology only where absolutely necessary, and is easily understandable by those with a finite element background. The use of B-spline functions in finite element analysis has been studied for advantages and disadvantages. Two-dimensional B-spline and standard FEM have been compared, addressing the accuracy as well as the computational efficiency of B-spline FEM. Results show that for a given number of degrees of freedom, B-spline FEM can produce solutions with lower error than standard FEM. Furthermore, for a given solution time and total analysis time, B-spline FEM will typically produce solutions with lower error than standard FEM. However, due to a more coupled system of equations and a larger elemental stiffness matrix, B-spline FEM requires longer solution and assembly times per degree of freedom than standard FEM. Three-dimensional B-spline FEM has also been validated by comparing a three-dimensional model with plane-strain boundary conditions to an equivalent two-dimensional model using plane-strain conditions.
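As a concrete illustration of the interpolation functions involved, the following is the standard Cox-de Boor recursion for evaluating B-spline basis functions. This is textbook material, not code from the thesis; in a FEM setting these basis functions play the role of the element shape functions.

```python
def bspline_basis(i, p, u, knots):
    """Evaluate the i-th B-spline basis function of degree p at parameter u,
    using the Cox-de Boor recursion over the given non-decreasing knot vector.
    Convention: the half-open support [knots[i], knots[i+1]) at degree 0."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:  # skip 0/0 terms from repeated knots
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right
```

On the open knot vector [0, 0, 0, 1, 1, 1] the three quadratic basis functions reduce to the Bernstein polynomials, and at any interior u they sum to one (partition of unity), the same property conventional FEM shape functions satisfy.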
24

On Computational Methods for the Valuation of Credit Derivatives

Zhang, Wanhe 02 September 2010 (has links)
A credit derivative is a financial instrument whose value depends on the credit risk of an underlying asset or assets. Credit risk is the possibility that the obligor fails to honor any payment obligation. This thesis proposes four new computational methods for the valuation of credit derivatives. Compared with synthetic collateralized debt obligations (CDOs) or basket default swaps (BDS), the value of which depends on the defaults of a prescribed underlying portfolio, a forward-starting CDO or BDS has a random underlying portfolio, as some "names" may default before the CDO or BDS starts. We develop an approach to convert a forward product to an equivalent standard one. Therefore, we avoid having to consider the default combinations in the period between the start of the forward contract and the start of the associated CDO or BDS. In addition, we propose a hybrid method combining Monte Carlo simulation with an analytical method to obtain an effective method for pricing forward-starting BDS. Current factor copula models are static and fail to calibrate consistently against market quotes. To overcome this deficiency, we develop a novel chaining technique to build a multi-period factor copula model from several one-period factor copula models. This allows the default correlations to be time-dependent, thereby allowing the model to fit market quotes consistently. Previously developed multi-period factor copula models require multi-dimensional integration, usually computed by Monte Carlo simulation, which makes the calibration extremely time consuming. Our chaining method, on the other hand, possesses the Markov property. This allows us to compute the portfolio loss distribution of a completely homogeneous pool analytically. The multi-period factor copula is a discrete-time dynamic model.
As a first step towards developing a continuous-time dynamic model, we model the default of an underlying by the first hitting time of a Wiener process, which starts from a random initial state. We find an explicit relation between the default distribution and the initial state distribution of the Wiener process. Furthermore, conditions on the existence of such a relation are discussed. This approach allows us to match market quotes consistently.
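The first-passage construction in the last paragraph has a well-known closed form for a driftless Wiener process: by the reflection principle, a path started at x > 0 reaches zero by time t with probability 2Φ(−x/√t), equivalently erfc(x/√(2t)). The sketch below shows how a random initial state then induces the default distribution by averaging over the initial-state law; the exponential distribution used here is an arbitrary illustrative choice, not the one derived in the thesis.

```python
import math
import random

def hit_prob(x, t):
    """P(a standard Wiener process started at x > 0 hits 0 by time t).
    Reflection principle: 2 * Phi(-x / sqrt(t)) = erfc(x / sqrt(2 t))."""
    return math.erfc(x / math.sqrt(2.0 * t))

def default_cdf(t, initial_states):
    """Default distribution induced by a random initial state: average the
    hitting probability over a sample drawn from the initial-state law."""
    return sum(hit_prob(x, t) for x in initial_states) / len(initial_states)

# Illustrative initial-state law (exponential with unit rate).
rng = random.Random(42)
states = [rng.expovariate(1.0) for _ in range(10_000)]
p_1y = default_cdf(1.0, states)
p_2y = default_cdf(2.0, states)  # default probability grows with the horizon
```

Inverting this relation, i.e. finding the initial-state distribution that reproduces a given default curve, is the direction the thesis pursues; the forward direction shown here is the easy half.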
27

A Computational Tool for the Prediction of Small Non-coding RNAs in Genome Sequences

Yu, Ning 01 December 2009 (has links)
Research into bacterial gene expression aims to control and prevent the diseases caused by bacteria. Researchers have recently discovered that small non-coding RNAs (ncRNAs/sRNAs) perform a variety of critical regulatory functions in bacteria. Genome-wide searching for sRNAs, especially by computational methods, has become an effective way to predict them because sRNAs share consistent sequence characteristics. This article proposes a hybrid computational approach, HybridRNA, for the prediction of small non-coding RNAs, which integrates three critical techniques: secondary-structure analysis, thermodynamic stability analysis, and sequence conservation prediction. Relying on these computational techniques, our approach was used to search for sRNAs in Streptococcus pyogenes, a bacterium of major importance to human health. This search yielded the five strongest sRNA candidates, predicted to be key components of known regulatory pathways in S. pyogenes.
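To make the hybrid idea concrete, here is a heavily simplified sliding-window scorer that combines GC content (a crude proxy for thermodynamic stability, since GC-rich RNA forms more stable structures) with sequence identity against an aligned homolog (a crude proxy for conservation analysis). This is an invented illustration, not HybridRNA itself, and it omits secondary-structure prediction entirely.

```python
def gc_score(seq):
    """Fraction of G/C bases in the window."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def conservation_score(window, homolog_window):
    """Fraction of identical positions against an aligned homolog."""
    return sum(a == b for a, b in zip(window, homolog_window)) / len(window)

def scan(genome, homolog, width=50, step=10, w_gc=0.5, w_cons=0.5):
    """Slide a window along the genome, combine the two scores linearly,
    and return the five highest-scoring candidate regions (start, score)."""
    hits = []
    for i in range(0, len(genome) - width + 1, step):
        score = (w_gc * gc_score(genome[i:i + width])
                 + w_cons * conservation_score(genome[i:i + width],
                                               homolog[i:i + width]))
        hits.append((i, score))
    hits.sort(key=lambda h: h[1], reverse=True)
    return hits[:5]
```

Returning the top five candidates mirrors the abstract's outcome of five strongest candidates, but the window width, step, and weights here are arbitrary placeholders.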
28

Efficiency in inverse problems: generalization of the simulated annealing algorithm and of the regularization function, applied to electrical impedance tomography and to the X-ray spectrum

Olavo Henrique Menin 08 December 2014 (has links)
The modeling of processes in physics and engineering frequently yields inverse problems. These problems are usually difficult to solve because they are ill-posed. Solving them as optimization problems requires the minimization of an objective function, which measures the discrepancy between the experimental data and the data produced by the theoretical model, added to a regularization function. For most practical inverse problems, this objective function is non-convex and requires a stochastic optimization method. Among these is the simulated annealing algorithm, which rests on three pillars: i) the visitation distribution over the solution space; ii) the acceptance criterion; and iii) the control of the stochasticity of the process. Here, we propose a new generalization of the simulated annealing algorithm and of the regularization function. In the optimization algorithm, we generalize both the cooling schedule, which is usually algebraic or logarithmic, and the Metropolis acceptance criterion. Regarding the regularization function, we unify the most commonly used versions in a single formula, whose control parameter allows a continuous transition between the Tikhonov and entropic regularizations. Through numerical experiments, we apply our algorithm to two important inverse problems in medical physics: the determination of the spectrum of an X-ray beam from its attenuation curve, and image reconstruction in electrical impedance tomography. The results show that the proposed optimization algorithm is efficient and exhibits an optimal regime of parameters, related to the divergence of the second moment of the visitation distribution.
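A minimal sketch of the ingredients described above: simulated annealing minimizing a data misfit plus a one-parameter regularizer that bridges a Tikhonov-like penalty (q = 2, sum of squares) and an entropic one (q = 1, sum of x log x). The geometric cooling schedule and classical Metropolis rule below are the textbook versions; the generalized visitation and acceptance distributions of the thesis are not reproduced, and all names and parameter values are illustrative.

```python
import math
import random

def regularizer(x, q):
    """Hypothetical one-parameter family: q = 2 gives a Tikhonov-like
    sum-of-squares penalty, q = 1 an entropic sum of x*log(x). A sketch
    of the unification idea, not the thesis's actual formula."""
    if q == 1.0:
        return sum(xi * math.log(xi) for xi in x if xi > 0.0)
    return sum(xi ** q for xi in x)

def objective(x, data, model, lam, q):
    """Data misfit plus weighted regularization, as described in the text."""
    misfit = sum((m - d) ** 2 for m, d in zip(model(x), data))
    return misfit + lam * regularizer(x, q)

def anneal(x0, data, model, lam=1e-3, q=2.0, t0=1.0, alpha=0.95,
           sigma=0.1, steps=5000, seed=1):
    """Simulated annealing with geometric cooling and the classical
    Metropolis acceptance rule."""
    rng = random.Random(seed)
    x = list(x0)
    f = objective(x, data, model, lam, q)
    t = t0
    for _ in range(steps):
        # Gaussian proposal, clamped positive for the entropic penalty.
        cand = [max(1e-9, xi + rng.gauss(0.0, sigma)) for xi in x]
        fc = objective(cand, data, model, lam, q)
        if fc < f or rng.random() < math.exp(-(fc - f) / t):
            x, f = cand, fc  # Metropolis: always accept downhill,
        t *= alpha           # sometimes uphill; then cool geometrically
    return x, f
```

Swapping the cooling line or the acceptance test for generalized (e.g. algebraic or Tsallis-style) variants is exactly the kind of modification the abstract describes.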
29

Datsina Damro: a study of marriage among the Xavante of Marãiwatsédé

Ramires, Marcos de Miranda 10 August 2015 (has links)
CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior) / This work focuses on the marriage alliance among the Xavante people living in the Marãiwatsédé Indigenous Land, located in eastern Mato Grosso, about 1,200 km from Cuiabá. This collective speaks a Jê language and inhabits a transition area between the Cerrado and Amazon biomes. I seek here to apply a method which, starting from the assumptions of Lévi-Strauss's kinship theory and articulated with the basic notions of graph theory, performs an exhaustive scan of the empirical alliance network. The method also postulates that categories, norms and practices, the analytical planes traditionally used in kinship studies, be taken as levels of repercussion of the subordinating term of kinship: exchange.
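The graph-theoretic scan of an alliance network can be illustrated with a toy cycle enumeration over a directed wife-giver to wife-taker graph, where, in a Lévi-Straussian reading, 2-cycles correspond to restricted (symmetric) exchange and longer cycles to generalized exchange. This is an invented sketch, not the dissertation's method.

```python
from collections import defaultdict

def exchange_cycles(alliances):
    """Enumerate simple directed cycles in a wife-giver -> wife-taker graph.

    `alliances` is a list of (giver, taker) pairs between descent groups.
    Returns each cycle once, as a tuple rotated to start at its smallest node.
    """
    graph = defaultdict(set)
    for giver, taker in alliances:
        graph[giver].add(taker)

    cycles = set()

    def dfs(start, node, path):
        for nxt in graph[node]:
            if nxt == start and len(path) > 1:
                k = path.index(min(path))          # canonical rotation
                cycles.add(tuple(path[k:] + path[:k]))
            elif nxt not in path:
                dfs(start, nxt, path + [nxt])

    for v in list(graph):
        dfs(v, v, [v])
    return cycles
```

On a toy network where group A gives wives to B, B to C, and C back to A, while D and E exchange directly, the function reports one 3-cycle (generalized exchange) and one 2-cycle (restricted exchange).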
30

Principal points, principal curves and principal surfaces

Ganey, Raeesa January 2015 (has links)
The idea of approximating a distribution is a prominent problem in statistics. This dissertation explores the theory of principal points and principal curves as approximation methods for a distribution. Principal points of a distribution were first introduced by Flury (1990), who tackled the problem of optimal grouping in multivariate data. In essence, principal points are the theoretical counterparts of the cluster means obtained by the k-means algorithm. Principal curves, defined by Hastie (1984), are smooth one-dimensional curves that pass through the middle of a p-dimensional data set, providing a nonlinear summary of the data. In this dissertation, details on the usefulness of principal points and principal curves are reviewed. The application of principal points and principal curves is then extended beyond their original purpose to well-known computational methods such as Support Vector Machines in machine learning.
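The connection between principal points and k-means can be made concrete: for k = 2, the principal points of a standard normal distribution are known to be ±√(2/π) ≈ ±0.798, and Lloyd's algorithm applied to a large sample recovers them. A minimal 1-D sketch (illustrative, not code from the dissertation):

```python
import random

def k_means(points, k, iters=100, seed=0):
    """Lloyd's algorithm on 1-D data. As the sample grows, the converged
    cluster means estimate the distribution's k principal points in
    Flury's (1990) sense."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # initialize from the sample
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                     # assign to nearest center
            nearest = min(range(k), key=lambda j: (p - centers[j]) ** 2)
            clusters[nearest].append(p)
        centers = [sum(cl) / len(cl) if cl else centers[j]  # update means
                   for j, cl in enumerate(clusters)]
    return sorted(centers)
```

Running this on a large N(0, 1) sample gives two centers close to −0.798 and +0.798, the conditional means of the two half-normal pieces split at zero.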
