91

Resource Allocation under Uncertainty : Applications in Mobile Communications

Johansson, Mathias January 2004 (has links)
This thesis is concerned with scheduling the use of resources, or allocating resources, so as to meet future demands for the entities produced by the resources. We consider applications in mobile communications such as scheduling users' transmissions so that the amount of transmitted information is maximized, and scenarios in the manufacturing industry where the task is to distribute work among production units so as to minimize the number of missed orders.

The allocation decisions are complicated by a lack of information concerning the future demand and possibly also about the capacities of the available resources. We therefore resort to using probability theory and the maximum entropy principle as a means for making rational decisions under uncertainty.

By using probabilities interpreted as a reasonable degree of belief, we find optimum decision rules for the manufacturing problem, bidding under uncertainty in a certain type of auctions, scheduling users in communications with uncertain channel qualities and uncertain arrival rates, quantization of channel information, partitioning bandwidth between interfering and non-interfering areas in cellular networks, hand-overs and admission control. Moreover, a new method for making optimum approximate Bayesian inference is introduced.

We further discuss reasonable optimization criteria for the mentioned applications, and provide an introduction to the topic of probability theory as an extension to two-valued logic. It is argued that this view unifies a wide range of resource-allocation problems, and we discuss various directions for further research.
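The maximum-entropy machinery this abstract invokes can be illustrated with a small numerical sketch (not taken from the thesis itself): given only a mean constraint on a discrete quantity — Jaynes' classic dice example — the maximum-entropy distribution has the form p_i ∝ exp(λ·x_i), and the Lagrange multiplier λ can be found by one-dimensional root-finding.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative sketch: maximum-entropy pmf on {1,...,6} subject to a
# single mean constraint E[X] = 4.5 (Jaynes' dice example).  The
# solution has the form p_i ∝ exp(lam * x_i); we solve for the
# Lagrange multiplier lam that reproduces the target mean.
x = np.arange(1, 7)
target_mean = 4.5

def mean_given(lam):
    w = np.exp(lam * x)
    p = w / w.sum()
    return p @ x

lam = brentq(lambda l: mean_given(l) - target_mean, -5.0, 5.0)
p = np.exp(lam * x)
p /= p.sum()
# p now satisfies the mean constraint up to the root-finder tolerance,
# and (since 4.5 > 3.5) places increasing weight on larger faces
```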
92

A probabilistic framework and algorithms for modeling and analyzing multi-instance data

Behmardi, Behrouz 28 November 2012 (has links)
Multi-instance data, in which each object (e.g., a document) is a collection of instances (e.g., words), are widespread in machine learning, signal processing, computer vision, bioinformatics, music, and the social sciences. Existing probabilistic models, e.g., latent Dirichlet allocation (LDA), probabilistic latent semantic indexing (pLSI), and discrete component analysis (DCA), have been developed for modeling and analyzing multi-instance data. Such models introduce a generative process for multi-instance data which includes a low-dimensional latent structure. While such models offer great freedom in capturing the natural structure in the data, their inference may present challenges. For example, sensitivity to the choice of hyper-parameters in such models requires careful selection (e.g., through cross-validation), which results in large computational complexity. Inference for fully Bayesian models, which contain no hyper-parameters, often involves slowly converging sampling methods. In this work, we develop approaches for addressing such challenges and further enhancing the utility of such models. This dissertation demonstrates a unified convex framework for probabilistic modeling of multi-instance data. The three main aspects of the proposed framework are as follows. First, joint regularization is incorporated into multiple density estimation to simultaneously learn the structure of the distribution space and infer each distribution. Second, a novel confidence-constraints framework is used to facilitate a tuning-free approach to control the amount of regularization required for joint multiple density estimation, with theoretical guarantees on correct structure recovery. Third, we formulate the problem in a convex framework and propose efficient optimization algorithms to solve it. This work addresses the unique challenges associated with both discrete and continuous domains.
In the discrete domain, we propose confidence-constrained rank minimization (CRM) to recover the exact number of topics in topic models, with theoretical guarantees on the recovery probability and the mean squared error of the estimate. We provide a computationally efficient optimization algorithm for the problem to further the applicability of the proposed framework to large real-world datasets. In the continuous domain, we propose to use the maximum entropy (MaxEnt) framework for multi-instance datasets. In this approach, bags of instances are represented as distributions using the principle of MaxEnt. We learn basis functions which span the space of distributions for jointly regularized density estimation. The basis functions are analogous to topics in a topic model. We validate the efficiency of the proposed framework in the discrete and continuous domains through an extensive set of experiments on synthetic datasets as well as on real-world image and text datasets, and compare the results with state-of-the-art algorithms. / Graduation date: 2013
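The continuous-domain idea above — representing a bag of instances as a MaxEnt distribution matching feature expectations — can be sketched in a toy case. This is an illustration under assumed features f(x) = (x, x²), for which the MaxEnt density is the Gaussian matching the bag's empirical moments; it is not the dissertation's jointly regularized estimator.

```python
import numpy as np

# Toy sketch: summarize one bag of instances by the maximum-entropy
# density matching its empirical feature expectations.  With assumed
# features f(x) = (x, x^2), the MaxEnt density is the Gaussian whose
# mean and variance equal the bag's empirical moments.
rng = np.random.default_rng(0)
bag = rng.normal(loc=2.0, scale=0.5, size=200)   # one bag of instances

mu, var = bag.mean(), bag.var()

def maxent_density(t):
    # Gaussian: the unique MaxEnt density with fixed mean and variance
    return np.exp(-(t - mu) ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

# sanity check: the density integrates to ~1 over a wide grid
grid = np.linspace(mu - 8 * np.sqrt(var), mu + 8 * np.sqrt(var), 20001)
d = maxent_density(grid)
mass = np.sum((d[1:] + d[:-1]) / 2.0 * np.diff(grid))  # trapezoid rule
```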
94

Impacts Of Policy Changes On Turkish Agriculture: An Optimization Model With Maximum Entropy

Eruygur, Hakki Ozan 01 October 2006 (has links) (PDF)
Turkey has been moving towards integration with the EU since 1963. Membership will involve full liberalization of trade in agricultural products with the EU. The impact of liberalization depends on the path of agricultural policies in Turkey and the EU. On the other hand, agricultural protection continues to be the most controversial issue in the global trade negotiations of the World Trade Organization (WTO). To evaluate the impacts of policy scenarios, an economic modeling approach based on non-linear mathematical programming is appropriate. This thesis analyzes the impacts of economic integration with the EU and the potential effects of the application of a new WTO agreement in 2015 on Turkish agriculture, using an agricultural sector model. The basic approach is the Maximum Entropy based Positive Mathematical Programming of Heckelei and Britz (1999). The model is based on a static optimization algorithm. Following economic integration with the EU, the net export of crops declines and cannot compensate for the boom in net imports of livestock products. The overall welfare effect is small. Consumers benefit from declining prices. Common Agricultural Policy (CAP) supports are decisive for the welfare of producers. The WTO simulation shows that a 15 percent reduction in Turkey's binding WTO tariff commitments will increase net meat imports by USD 250 million.
95

Analytical Methods to Support Risk Identification and Analysis in Healthcare Systems

Cure Vellojin, Laila Nadime 01 January 2011 (has links)
Healthcare systems require continuous monitoring of risk to prevent adverse events. Risk analysis is a time-consuming activity that depends on the background of the analysts and on the available data, and patient safety data are often incomplete and biased. This research proposes systematic approaches to monitoring risk in healthcare using available patient safety data. The methodologies combine traditional healthcare risk analysis methods with safety theory concepts, in an innovative manner, to allocate available evidence to potential risk sources throughout the system. We propose the use of data mining to analyze near-miss reports and guide the identification of risk sources. In addition, we propose a Maximum-Entropy based approach to monitor risk sources and prioritize investigation efforts accordingly. The products of this research are intended to facilitate risk analysis and allow for timely identification of risks to prevent harm to patients.
96

Approximations, simulation, and accuracy of multivariate discrete probability distributions in decision analysis

Montiel Cendejas, Luis Vicente 17 July 2012 (has links)
Many important decisions must be made without full information. For example, a woman may need to make a treatment decision regarding breast cancer without full knowledge of important uncertainties, such as how well she might respond to treatment. In the financial domain, in the wake of the housing crisis, the government may need to monitor the credit market and decide whether to intervene. A key input in this case would be a model to describe the chance that one person (or company) will default given that others have defaulted. However, such a model requires addressing the lack of knowledge regarding the correlation between groups or individuals. How to model and make decisions in cases where only partial information is available is a significant challenge. In the past, researchers have made arbitrary assumptions regarding the missing information. In this research, we developed a modeling procedure that can be used to analyze many possible scenarios subject to strict conditions. Specifically, we developed a new Monte Carlo simulation procedure to create a collection of joint probability distributions, all of which match whatever information we have. Using this collection of distributions, we analyzed the accuracy of different approximations such as maximum entropy or copula-models. In addition, we proposed several new approximations that outperform previous methods. The objective of this research is four-fold. First, provide a new framework for approximation models. In particular, we presented four new models to approximate joint probability distributions based on geometric attributes and compared their performance to existing methods. Second, develop a new joint distribution simulation procedure (JDSIM) to sample joint distributions from the set of all possible distributions that match available information. This procedure can then be applied to different scenarios to analyze the sensitivity of a decision or to test the accuracy of an approximation method. 
Third, test the accuracy of seven approximation methods under a variety of circumstances. Specifically, we addressed the following questions within the context of multivariate discrete distributions: Are there new approximations that should be considered? Which approximation is the most accurate, according to different measures? How accurate are the approximations as the number of random variables increases? How accurate are they as we change the underlying dependence structure? How does accuracy improve as we add lower-order assessments? What are the implications of these findings for decision analysis practice and research? While the above questions are easy to pose, they are challenging to answer. For decision analysis, the answers open a new avenue for addressing partial information, which brings us to the last contribution. Fourth, propose a new approach to decision making with partial information. The exploration of old and new approximations and the capability of creating large collections of joint distributions that match expert assessments provide new tools that extend the field of decision analysis. In particular, we presented two sample cases that illustrate the scope of this work and its impact on decision making under uncertainty.
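The core idea of sampling joint distributions consistent with partial information can be sketched with simple rejection sampling; the dissertation's JDSIM procedure is more sophisticated than this hypothetical illustration, and the assessed marginals below are assumptions for the example.

```python
import numpy as np

# Hypothetical sketch: draw 2x2 joint pmfs uniformly from the probability
# simplex (Dirichlet(1,1,1,1)) and keep those whose marginals are close
# to assessed values P(A) = 0.3 and P(B) = 0.6.  The retained set shows
# how many distinct joint distributions match the same partial information.
rng = np.random.default_rng(1)
pA, pB, tol = 0.3, 0.6, 0.02

P = rng.dirichlet(np.ones(4), size=200_000).reshape(-1, 2, 2)  # P[i, a, b]
mA = P[:, 1, :].sum(axis=1)   # P(A = 1) for each sampled joint pmf
mB = P[:, :, 1].sum(axis=1)   # P(B = 1) for each sampled joint pmf
kept = P[(np.abs(mA - pA) < tol) & (np.abs(mB - pB) < tol)]

# the retained pmfs share (nearly) the same marginals but differ widely
# in dependence structure, e.g. in the covariance of the two indicators:
covs = kept[:, 1, 1] - kept[:, 1, :].sum(axis=1) * kept[:, :, 1].sum(axis=1)
```

An approximation method (MaxEnt, a copula model, ...) picks a single member of this set; running a decision model over the whole retained collection instead reveals how sensitive the decision is to the unspecified dependence.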
97

Deslizamentos de terra no município de Nova Friburgo: histórico e interpretação estatística de dados / Landslides in municipal area of Nova Friburgo: historical and data statistical interpretation

Celso Narcizo Volotão 29 June 2006 (has links)
A statistical model based on the concept "risk = probability × consequence" and on the principle of maximum entropy is presented in this Dissertation, aiming to establish realistic predictions of loss of life in the municipal area of Nova Friburgo, State of Rio de Janeiro, to help the local public authorities minimize the consequences of flooding and landslides. The results show that the principle of maximum entropy models the distribution of mortality in landslide disasters very well. Comparisons of landslide mortality in Nova Friburgo, Petrópolis, the Metropolitan Area of Recife and the City of Rio de Janeiro are also presented. The results indicate that the risk of loss of life in Nova Friburgo is about one third of the risk in Petrópolis and about one fourth of the risk in the City of Rio de Janeiro. A statistical study of the landslides recorded on Highway RJ-116, in the mountain stretch between the cities of Nova Friburgo and Cachoeiras de Macacu, also shows a good fit of the historical data to the principle of maximum entropy, despite the short observation period of only 21 months. Predictions based on the principle of maximum entropy indicate that the municipal Civil Defense should be prepared to act in a planned manner in an extreme event with as many as 27 fatalities, in the case of exceptionally high rainfall.
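The combination of "risk = probability × consequence" with maximum entropy can be sketched with illustrative numbers (not taken from the Nova Friburgo data): if only the mean fatality count per deadly landslide were known, the maximum-entropy distribution on {0, 1, 2, ...} is geometric, and the tail probability of an extreme event follows in closed form.

```python
# Illustrative sketch with assumed numbers, not the Nova Friburgo data:
# given only a mean fatality count per deadly landslide, the
# maximum-entropy distribution on {0, 1, 2, ...} is geometric.
mean_fatalities = 3.0                 # assumed mean, for illustration only
p = 1.0 / (1.0 + mean_fatalities)     # geometric parameter; mean = (1-p)/p
k = 27                                # extreme-event size from the abstract
prob_at_least_k = (1.0 - p) ** k      # P(X >= k) under the MaxEnt model
```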
98

Efeito potencial de mudanças climáticas sobre a distribuição de Phyllomedusa centralis (Anura : Hylidae)

Arruda, Luana Aparecida Gomes de 02 December 2014 (has links)
Field surveys aimed at the conservation of species are necessary, but are often limited, costly and laborious. A tool that can be used to minimize these problems is Predictive Modeling of Potential Species Distribution (PMPSD), which allows estimating current and future areas of potential occurrence and has received attention in conservation studies because it can be used with rare or threatened species. In the present study, PMPSD was used to determine the distribution and predict the potential effect of climate change on the distribution of the tree frog Phyllomedusa centralis, whose known distribution is restricted to a few localities in the state of Mato Grosso, Brazil. The models were developed with the maximum entropy method using the MaxEnt program, with current and future bioclimatic variables. The prediction models for the current distribution and for future scenarios obtained high values of the area under the receiver-operating curve, and validation indicated high accuracy of the models. In all models, the environmental variable with the highest percentage contribution was precipitation. All models predicted an increase in the areas of potential distribution compared with the current distribution model. Despite its limitations, the potential distribution model is relevant and can help in the planning and management of reserves, the discovery of new populations, species management, the identification of priority areas for conservation, and the definition of regions where habitat restoration is needed.
99

Caracterização estatística de processos sísmicos via Distribuição Generalizada de Pareto. Estudo de caso: João Câmara-RN / Statistical characterization of seismic processes via the Generalized Pareto Distribution. Case study: João Câmara-RN

Silva, Raimundo Nonato Castro da 05 December 2008 (has links)
This work presents a brief discussion of methods for estimating the parameters of the Generalized Pareto distribution (GPD). The following techniques are addressed: Moments (moments), Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Mean Power Density Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-fit (MGF) and the Maximum Entropy (POME) technique, the focus of this manuscript. By way of illustration, Generalized Pareto fits were made for a sequence of intraplate earthquakes that occurred in the city of João Câmara, in the northeastern region of Brazil, which was monitored continuously for two years (1987 and 1988). It was found that MLE and POME were the most efficient methods, giving basically the same mean squared errors. Based on a threshold of magnitude 1.5, the seismic risk for the city was estimated, along with the return levels for earthquakes of magnitude 1.5, 2.0, 2.5 and 3.0 and for the most intense earthquake ever registered in the city, which occurred in November 1986 with a magnitude of about 5.2.
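Fitting a GPD to exceedance data by maximum likelihood — one of the estimation methods this thesis compares — can be sketched as follows. The data here are simulated, not the João Câmara catalog, and the shape/scale values are assumptions for the example.

```python
import numpy as np
from scipy.stats import genpareto

# Sketch: maximum-likelihood fit of a Generalized Pareto distribution to
# simulated exceedances (illustrative parameters, not the thesis data).
rng = np.random.default_rng(42)
c_true, scale_true = 0.2, 0.5          # assumed GPD shape and scale
exceedances = genpareto.rvs(c_true, scale=scale_true, size=5000,
                            random_state=rng)

# fit with the location (threshold) fixed at 0, as usual for exceedances
c_hat, loc_hat, scale_hat = genpareto.fit(exceedances, floc=0)

# return level: the value exceeded on average once every m observations
m = 100
return_level = genpareto.ppf(1 - 1 / m, c_hat, loc=loc_hat, scale=scale_hat)
```

Alternative estimators (PWM, POME, MDPD, ...) replace the `fit` step; comparing their mean squared errors over repeated simulations mirrors the comparison carried out in the thesis.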
