561 |
Spam Analysis and Detection for User Generated Content in Online Social Networks. Tan, Enhua, 23 July 2013 (has links)
No description available.
|
562 |
Topics in Analytic Number Theory. Powell, Kevin James, 31 March 2009 (has links) (PDF)
The thesis is in two parts. The first part is the paper “The Distribution of k-free integers” that my advisor, Dr. Roger Baker, and I submitted in February 2009. The reader will note that I have inserted additional commentary and explanations, which appear in smaller text. Dr. Baker and I improved the asymptotic formula for the number of k-free integers less than x by taking advantage of exponential sum techniques developed since the 1980s. Both of us made substantial contributions to the paper. I discovered the exponent in the error term for the cases k = 3, 4, and worked the case k = 3 completely. Dr. Baker corrected my work for k = 4 and proved the result for k = 5. He then generalized our work into the paper as it now stands. We also discussed, and both contributed to, parts of Section 3 on bounds for exponential sums. The second part represents my own work, guided by my advisor. I study the zeros of derivatives of Dirichlet L-functions. The first theorem gives an analog of a result of Speiser on the zeros of ζ'(s). He proved that RH is equivalent to the hypothesis that ζ'(s) has no zeros with real part strictly between 0 and ½. The last two theorems discuss zero-free regions to the left and right for L^{(k)}(s,χ).
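For context, the two classical statements the abstract builds on are written out below in LaTeX; the notation Q_k(x) and the elementary error bound shown are illustrative assumptions, and the improved exponents obtained in the thesis are not reproduced here.

```latex
% Classical count of k-free integers up to x; the thesis sharpens the error exponent
% (the O(x^{1/k}) bound shown here is the elementary one, not the improved result).
\[
  Q_k(x) \;=\; \#\{\, n \le x : p^k \nmid n \ \text{for every prime } p \,\}
        \;=\; \frac{x}{\zeta(k)} \;+\; O\!\left(x^{1/k}\right).
\]
% Speiser's theorem, of which the second part proves an analog for Dirichlet L-functions:
% the Riemann Hypothesis is equivalent to
\[
  \zeta'(s) \neq 0 \quad \text{for all } s \ \text{with} \ 0 < \operatorname{Re}(s) < \tfrac12 .
\]
```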
|
563 |
Machine learning multicriteria optimization in radiation therapy treatment planning / Flermålsoptimering med maskininlärning inom strålterapiplanering. Zhang, Tianfang, January 2019 (has links)
In radiation therapy treatment planning, recent works have used machine learning based on historically delivered plans to automate the process of producing clinically acceptable plans. Compared to traditional approaches such as repeated weighted-sum optimization or multicriteria optimization (MCO), automated planning methods have, in general, the benefits of low computational times and minimal user interaction, but on the other hand lack the flexibility associated with general-purpose frameworks such as MCO. Machine learning approaches can be especially sensitive to deviations in their dose prediction due to certain properties of the optimization functions usually used for dose mimicking and, moreover, suffer from the fact that there exists no general causality between prediction accuracy and optimized plan quality.

In this thesis, we present a means of unifying ideas from machine learning planning methods with the well-established MCO framework. More precisely, given prior knowledge in the form of either a previously optimized plan or a set of historically delivered clinical plans, we are able to automatically generate Pareto optimal plans spanning a dose region corresponding to plans which are achievable as well as clinically acceptable. For the former case, this is achieved by introducing dose--volume constraints; for the latter case, this is achieved by fitting a weighted-data Gaussian mixture model on pre-defined dose statistics using the expectation--maximization algorithm, modifying it with exponential tilting and using specially developed optimization functions to take into account prediction uncertainties.

Numerical results for conceptual demonstration are obtained for a prostate cancer case with treatment delivered by a volumetric-modulated arc therapy technique, where it is shown that the methods developed in the thesis are successful in automatically generating Pareto optimal plans of satisfactory quality and diversity, while excluding clinically irrelevant dose regions. For the case of using historical plans as prior knowledge, the computational times are significantly shorter than those typical of conventional MCO.

/ Inom strålterapiplanering har den senaste forskningen använt maskininlärning baserat på historiskt levererade planer för att automatisera den process i vilken kliniskt acceptabla planer produceras. Jämfört med traditionella angreppssätt, såsom upprepad optimering av en viktad målfunktion eller flermålsoptimering (MCO), har automatiska planeringsmetoder generellt sett fördelarna av lägre beräkningstider och minimal användarinteraktion, men saknar däremot flexibiliteten hos allmänna ramverk som exempelvis MCO. Maskininlärningsmetoder kan vara speciellt känsliga för avvikelser i dosprediktionssteget på grund av särskilda egenskaper hos de optimeringsfunktioner som vanligtvis används för att återskapa dosfördelningar, och lider dessutom av problemet att det inte finns något allmängiltigt orsakssamband mellan prediktionsnoggrannhet och kvalitet hos optimerad plan.

I detta arbete presenterar vi ett sätt att förena idéer från maskininlärningsbaserade planeringsmetoder med det väletablerade MCO-ramverket. Mer precist kan vi, givet förkunskaper i form av antingen en tidigare optimerad plan eller en uppsättning av historiskt levererade kliniska planer, automatiskt generera Paretooptimala planer som täcker en dosregion motsvarande uppnåeliga såväl som kliniskt acceptabla planer. I det förra fallet görs detta genom att introducera dos--volym-bivillkor; i det senare fallet görs detta genom att anpassa en gaussisk blandningsmodell med viktade data med förväntning--maximering-algoritmen, modifiera den med exponentiell lutning och sedan använda speciellt utvecklade optimeringsfunktioner för att ta hänsyn till prediktionsosäkerheter.

Numeriska resultat för konceptuell demonstration erhålls för ett fall av prostatacancer varvid behandlingen levererades med volymetriskt modulerad bågterapi, där det visas att metoderna utvecklade i detta arbete är framgångsrika i att automatiskt generera Paretooptimala planer med tillfredsställande kvalitet och variation medan kliniskt irrelevanta dosregioner utesluts. I fallet då historiska planer används som förkunskap är beräkningstiderna markant kortare än för konventionell MCO.
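As a concrete illustration of the exponential-tilting step mentioned above, here is a minimal Python sketch for a one-dimensional Gaussian mixture over a single dose statistic. The function and variable names, the example numbers, and the choice of the tilting parameter theta are all assumptions for illustration, not the thesis's actual construction.

```python
import numpy as np

def tilt_gaussian_mixture(weights, means, variances, theta):
    """Exponentially tilt a 1-D Gaussian mixture by the factor exp(theta * x).

    For a component N(mu, s2), multiplying the density by exp(theta * x) and
    renormalising gives N(mu + theta * s2, s2), and the component weight is
    scaled by its moment generating function exp(theta * mu + 0.5 * theta**2 * s2).
    """
    weights = np.asarray(weights, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)

    # Rescale the mixture weights by each component's MGF at theta (stabilised).
    log_mgf = theta * means + 0.5 * theta**2 * variances
    new_weights = weights * np.exp(log_mgf - log_mgf.max())
    new_weights /= new_weights.sum()

    # Each tilted component is again Gaussian, with shifted mean and same variance.
    new_means = means + theta * variances
    return new_weights, new_means, variances

# Hypothetical example: a two-component mixture over one dose statistic (Gy).
w, m, v = tilt_gaussian_mixture([0.7, 0.3], [60.0, 70.0], [4.0, 9.0], theta=0.2)
```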
|
564 |
Entropy Maximisation and Queues With or Without Balking. An investigation into the impact of generalised maximum entropy solutions on the study of queues with or without arrival balking and their applications to congestion management in communication networks. Shah, Neelkamal P., January 2014 (has links)
Keywords: Queues, Balking, Maximum Entropy (ME) Principle, Global Balance (GB), Queue Length Distribution (QLD), Generalised Geometric (GGeo), Generalised Exponential (GE), Generalised Discrete Half Normal (GdHN), Congestion Management, Packet Dropping Policy (PDP)
Generalisations of links between discrete least-biased (i.e., maximum entropy (ME)) distribution inferences and Markov chains are conjectured towards the performance modelling, analysis and prediction of general single-server queues with or without arrival balking. New ME solutions, namely the generalised discrete Half Normal (GdHN) and truncated GdHN (GdHNT) distributions, are characterised, subject to appropriate mean value constraints, for inferences of stationary discrete state probability distributions. Moreover, a closed-form global balance (GB) solution is derived for the queue length distribution (QLD) of the M/GE/1/K queue subject to extended Morse balking, characterised by a Poisson prospective arrival process, i.i.d. generalised exponential (GE) service times and finite capacity K. In this context, based on comprehensive numerical experimentation, the latter GB solution is conjectured to be a special case of the GdHNT ME distribution.
Owing to the appropriate operational properties of the M/GE/1/K queue subject to extended Morse balking, this queueing system is applied as an ME performance model of Internet Protocol (IP)-based communication network nodes featuring static or dynamic packet dropping congestion management schemes. A performance evaluation study in terms of the model’s delay is carried out. Subsequently, the QLDs of the GE/GE/1/K censored queue subject to extended Morse balking under three different composite batch balking and batch blocking policies are solved via the technique of GB. Following comprehensive numerical experimentation, the latter QLDs are also conjectured to be special cases of the GdHNT. Limitations of this work and open problems which have arisen are included after the conclusions.
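For readers unfamiliar with the generalised exponential (GE) distribution used throughout this abstract, here is a minimal Python sketch of one common two-moment parameterisation (a probabilistic mix of a zero value and an exponential), assuming a given mean 1/mu and squared coefficient of variation cs2 of at least 1; it is an illustration only, not the thesis's derivations.

```python
import numpy as np

def sample_ge(n, mu, cs2, seed=None):
    """Draw n samples from a GE distribution with mean 1/mu and SCV cs2 (cs2 >= 1).

    Under this parameterisation the value is 0 with probability 1 - tau and
    Exp(tau * mu) with probability tau, where tau = 2 / (cs2 + 1); this choice
    matches the first two moments.
    """
    rng = np.random.default_rng(seed)
    tau = 2.0 / (cs2 + 1.0)
    is_positive = rng.random(n) < tau
    return np.where(is_positive, rng.exponential(1.0 / (tau * mu), n), 0.0)

# Hypothetical example: GE service times with mean 2.0 and SCV 4.0.
x = sample_ge(100_000, mu=0.5, cs2=4.0)
print(x.mean(), x.var() / x.mean() ** 2)  # approximately 2.0 and 4.0
```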
|
565 |
Some Inferential Results for One-Shot Device Testing Data Analysis. So, Hon Yiu, January 2016 (has links)
In this thesis, we develop some inferential results for one-shot device testing data analysis. These extend and generalize existing methods in the literature.
First, a competing-risk model is introduced for one-shot device testing data under accelerated life-tests. One-shot devices are products which will be destroyed immediately after use. Therefore, we can observe only a binary status, success or failure, of such products instead of their lifetimes. Many one-shot devices contain multiple components, and failure of any one of them will lead to the failure of the device. Failed devices are inspected to identify the specific cause of failure. Since the exact lifetime is not observed, the EM algorithm becomes a natural tool to obtain the maximum likelihood estimates of the model parameters. Here, we develop the EM algorithm for the competing exponential and Weibull cases.
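To make the role of the EM algorithm concrete, here is a minimal Python sketch for the simplest setting: a single stress level, a single exponential failure mode, and all devices inspected at the same time tau. The competing-risk, multi-stress models developed in the thesis are more involved, and the function and variable names below are illustrative assumptions.

```python
import numpy as np

def em_exponential_one_shot(n_failed, n_survived, tau, n_iter=200):
    """EM estimate of the exponential rate lambda from one-shot data.

    Each device is only known to have failed (T <= tau) or survived (T > tau)
    by the common inspection time tau.  The E-step imputes the expected
    lifetime of each device given its binary status; the M-step is the
    complete-data MLE lambda = n / sum(expected lifetimes).
    """
    n = n_failed + n_survived
    lam = 1.0 / tau  # crude starting value
    for _ in range(n_iter):
        # E-step: conditional means of an Exp(lam) lifetime given the outcome.
        p_fail = 1.0 - np.exp(-lam * tau)
        e_t_failed = 1.0 / lam - tau * np.exp(-lam * tau) / p_fail
        e_t_survived = tau + 1.0 / lam  # memorylessness of the exponential
        total = n_failed * e_t_failed + n_survived * e_t_survived
        # M-step: complete-data maximum likelihood update of the rate.
        lam = n / total
    return lam

# Hypothetical example: 30 of 100 devices had failed by inspection time 5.0.
print(em_exponential_one_shot(n_failed=30, n_survived=70, tau=5.0))
```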
Second, a semi-parametric approach is developed for simple one-shot device testing data. A semi-parametric model consists of parametric and non-parametric components. Here, we assume only that the hazards at different stress levels are proportional to each other; no distributional assumption is made on the lifetimes. This provides greater flexibility in model fitting and enables us to examine the relationship between the reliability of devices and the stress factors.
Third, Bayesian inference is developed for one-shot device testing data under the exponential distribution and the Weibull distribution with non-constant shape parameters for competing risks. The Bayesian framework provides statistical inference from another perspective: it treats the model parameters as random and improves the inference by incorporating expert experience as prior information. This method is shown to be very useful when failure observations are limited, in which case the maximum likelihood estimator may not exist.
The thesis proceeds as follows. In Chapter 2, we assume the one-shot devices to have two components, with lifetimes following exponential distributions with multiple stress factors. We then develop an EM algorithm for likelihood inference on the model parameters as well as some useful reliability characteristics. In Chapter 3, we generalize to the situation when lifetimes follow a Weibull distribution with non-constant shape parameters. In Chapter 4, we propose a semi-parametric model for simple one-shot device test data based on the proportional hazards model and develop associated inferential results. In Chapter 5, we consider the competing-risk model with exponential lifetimes and develop inference by adopting the Bayesian approach. In Chapter 6, we generalize these results on Bayesian inference to the situation when the lifetimes have a Weibull distribution. Finally, we provide some concluding remarks and indicate some future research directions in Chapter 7. / Thesis / Doctor of Philosophy (PhD)
|
566 |
Short-term forecasting of salinity intrusion in Ham Luong river, Ben Tre province using Simple Exponential Smoothing method. Tran, Thai Thanh; Ngo, Quang Xuan; Ha, Hieu Hoang; Nguyen, Nhan Phan, 13 May 2020 (has links)
Salinity intrusion in a river may have an adverse effect on the quality of life and can be perceived as a modern-day curse. Therefore, it is important to find technical ways to monitor and forecast salinity intrusion. In this paper, we designed a forecasting model using the Simple Exponential Smoothing (SES) method, which performs weekly salinity intrusion forecasts for the Ham Luong river (HLR), Ben Tre province, based on historical data obtained from the Center for Hydro-meteorological Forecasting of Ben Tre province. The results showed that the SES method provides an adequate predictive model for forecasting salinity intrusion at An Thuan, Son Doc, and Phu Khanh. However, the SES forecasts at My Hoa, An Hiep, and Vam Mon could be improved upon by another forecasting technique. This study suggests that the SES model is an easy-to-use modeling tool for water resource managers to obtain a quick preliminary assessment of salinity intrusion. / Xâm nhập mặn có thể gây tác động xấu đến đời sống con người, tuy nhiên nó hoàn toàn có thể dự báo được. Cho nên, một điều quan trọng là tìm được phương pháp kỹ thuật phù hợp để dự báo và giám sát xâm nhập mặn trên sông. Trong bài báo này, chúng tôi sử dụng phương pháp Simple Exponential Smoothing để dự báo xâm nhập mặn trên sông Hàm Luông, tỉnh Bến Tre. Kết quả cho thấy mô hình dự báo phù hợp cho các vị trí An Thuận, Sơn Đốc, và Phú Khánh. Tuy nhiên, các vị trí Mỹ Hóa, An Hiệp, và Vàm Mơn có thể tìm các phương pháp khác phù hợp hơn. Phương pháp Simple Exponential Smoothing rất dễ ứng dụng trong quản lý nguồn nước dựa vào việc cảnh báo xâm nhập mặn.
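As a brief illustration of the method itself, the following is a minimal Python sketch of simple exponential smoothing with a fixed smoothing constant alpha; the salinity values and the choice of alpha are hypothetical, and the paper's actual model fitting (for example, how alpha was selected) is not reproduced here.

```python
def simple_exponential_smoothing(series, alpha):
    """Return the smoothed levels and the one-step-ahead forecast.

    The level is updated as  s_t = alpha * x_t + (1 - alpha) * s_{t-1},
    and the forecast for the next period is simply the latest level.
    """
    level = series[0]          # initialise with the first observation
    levels = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        levels.append(level)
    return levels, level       # the last level is the forecast

# Hypothetical weekly salinity readings (g/L) at one monitoring station.
salinity = [4.0, 4.3, 5.1, 6.0, 5.7, 6.4, 7.2]
levels, forecast = simple_exponential_smoothing(salinity, alpha=0.4)
print(forecast)  # forecast for the following week
```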
|
567 |
Performance Analysis of Virtualisation in a Cloud Computing Platform. An application driven investigation into modelling and analysis of performance vs security trade-offs for virtualisation in OpenStack infrastructure as a service (IaaS) cloud computing platform architectures. Maiyama, Kabiru M., January 2019 (has links)
Virtualisation is one of the underlying technologies that led to the success of cloud computing platforms (CCPs). The technology, along with other features such as multi-tenancy, allows computing resources to be delivered as services through efficient sharing of physical resources. As these resources are provided through virtualisation, a robust agreement on both the quantity and the quality of service (QoS) is outlined in a service level agreement (SLA) document. QoS is one of the essential components of an SLA, and performance is one of its primary aspects. As the technology progressively matures and receives massive acceptance, researchers from industry and academia continue to carry out novel theoretical and practical studies of various essential aspects of CCPs with significant levels of success.
This thesis starts with an assessment of the current level of knowledge in the literature on cloud computing in general and CCPs in particular. In this context, a substantive literature review was carried out, focusing on performance modelling, testing, analysis and evaluation methodologies for Infrastructure as a Service (IaaS).
To this end, a systematic mapping study (SMS) of the literature was conducted, which guided the choice and direction of this research.
The SMS was followed by the development of a novel open queueing network model (QNM) at equilibrium for the performance modelling and analysis of an OpenStack IaaS CCP. It was assumed that the external arrival pattern is Poisson and that the queueing stations provide exponentially distributed service times. Based on Jackson’s theorem, the model was exactly decomposed into individual M/M/c (c ≥ 1) stations. Each of these queueing stations was analysed in isolation, and closed-form expressions for key performance metrics, such as mean response time, throughput, server (resource) utilisation and the bottleneck device, were determined.
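To illustrate the kind of closed-form station metrics referred to above, here is a minimal Python sketch for a single M/M/c station (per-server utilisation, Erlang-C waiting probability and mean response time); the numbers in the example are hypothetical and the sketch is not the thesis's OpenStack model.

```python
from math import factorial

def mmc_metrics(arrival_rate, service_rate, c):
    """Steady-state metrics for an M/M/c queue (requires utilisation < 1).

    Returns (per-server utilisation, probability of waiting, mean response time).
    """
    a = arrival_rate / service_rate          # offered load in Erlangs
    rho = a / c                              # per-server utilisation
    if rho >= 1.0:
        raise ValueError("queue is unstable: utilisation must be below 1")
    # Erlang-C probability that an arriving request has to wait.
    summation = sum(a**k / factorial(k) for k in range(c))
    last = a**c / (factorial(c) * (1.0 - rho))
    p_wait = last / (summation + last)
    mean_wait = p_wait / (c * service_rate - arrival_rate)
    mean_response = mean_wait + 1.0 / service_rate
    return rho, p_wait, mean_response

# Hypothetical station: 8 servers, 50 requests/s arriving, 8 requests/s served each.
print(mmc_metrics(arrival_rate=50.0, service_rate=8.0, c=8))
```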
Moreover, the research was extended with a proposed open QNM with a bursty external arrival pattern represented by a Compound Poisson Process (CPP) with geometrically distributed batches or, equivalently, by variable Generalised Exponential (GE) interarrival and service times. Each queueing station had c (c ≥ 1) GE-type servers. Based on a generic maximum entropy (ME) product-form approximation, the proposed open GE-type QNM was decomposed into individual GE/GE/c queueing stations with GE-type interarrival and service times. Performance metrics and a bottleneck analysis of the QNM were then determined, providing vital insights for the capacity planning of existing CCP architectures as well as the design and development of new ones. The results also revealed that the burstiness of the interarrival and service time processes has a significant impact on performance, leading to worst-case performance bound scenarios.
Finally, an investigation was carried out into modelling and analysis of performance and security trade-offs for a CCP architecture, based on a proposed generalised stochastic Petri net (GSPN) model with a security-detection control model (SDCM). In this context, ‘optimal’ combined performance and security metrics were defined with either M-type or GE-type arrival and service times, and the impact of security incidents on performance was assessed. Typical numerical experiments on the GSPN model were conducted and implemented using the Möbius package, and ‘optimal’ trade-offs between performance and security were determined, which are crucial in the SLAs of cloud computing services. / Petroleum Technology Development Fund (PTDF) of the government of Nigeria
Usmanu Danfodiyo University, Sokoto
|
568 |
Solid-State NMR Characterization of Polymeric and Inorganic Materials. Baughman, Jessi Alan, 19 May 2015 (links)
No description available.
|
569 |
On the effect of asymmetry and dimension on computational geometric problems. Sridhar, Vijay, 07 November 2018 (has links)
No description available.
|
570 |
Neighbor Effects: The Influence of Colony-level Social Structure on Within-group Dynamics in a Social Fish. Hellmann, Jennifer K., 26 October 2016 (has links)
No description available.
|