351

Comparing Resource Abundance And Intake At The Reda And Wisla River Estuaries

Zahid, Saman January 2021 (has links)
Migratory birds stop at various stopover sites during migration, and the resources available at these sites are essential for the birds to regain energy. This thesis compares resource abundance and intake at two stopover sites, the Reda and Wisla river estuaries. How a bird's mass changes during its stay at an estuary is used as a proxy for a site's resource abundance. The comparison is made on several subsets, including those with incomplete data, i.e. where the next capture is not exactly one day after the previous one. Multiple linear regression, generalized additive models, and linear mixed-effects models are used for the analysis. Expectation maximization and an iterative predictive process are implemented to deal with the incomplete data. We found that the Reda estuary has higher resource abundance and intake than the Wisla estuary.
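The iterative predictive process mentioned above can be sketched as a simple fit-and-refill loop. Everything below (the linear mass trend, the function name, the toy capture data) is an illustrative assumption, not the thesis's actual procedure:

```python
import numpy as np

def iterative_impute(days, mass, n_iter=20):
    """EM-flavoured fit-and-refill loop: predict the missing masses from
    the current linear fit, then refit on the completed data."""
    mass = mass.astype(float).copy()
    missing = np.isnan(mass)
    mass[missing] = np.nanmean(mass)                       # crude initial fill
    for _ in range(n_iter):
        slope, intercept = np.polyfit(days, mass, 1)       # refit trend (M-step)
        mass[missing] = slope * days[missing] + intercept  # re-predict (E-step)
    return mass, slope

# Toy capture history with two unobserved days (masses in grams):
days = np.array([0, 1, 2, 3, 4], dtype=float)
mass = np.array([20.0, np.nan, 21.0, np.nan, 22.0])
filled, daily_gain = iterative_impute(days, mass)
```

On this toy series the loop converges to the straight line through the observed masses, so the estimated daily gain settles at 0.5 g/day.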
352

Correlation and portfolio analysis of financial contagion and capital flight

NAKMAI, SIWAT 29 November 2018 (has links)
This dissertation mainly studies correlation and then portfolio analysis of financial contagion and capital flight, focusing on currency co-movements around the political uncertainty due to the Brexit referendum on 23 June 2016. The correlation, mean, and covariance computations in the analysis are both time-unconditional and time-conditional, and the generalized autoregressive conditional heteroskedasticity (GARCH) and exponentially weighted moving average (EWMA) methods are applied. The correlation analysis in this dissertation (Chapter 1) extends the previous literature on contagion testing based on a single global factor model, bivariate correlation analysis, and heteroskedasticity bias correction. Chapter 1 proposes an alternative extended framework, assuming that intensification of financial correlations in a state of distress could coincide with rising global-factor-loading variability; provides simple tests to verify the assumptions of the literature and of the extended framework; and considers capital flight in addition to financial contagion. The outcomes show that, compared to the literature, the extended framework is better supported in the Brexit case. Empirically, with the UK being the shock-originating economy and sterling plummeting against the US dollar, there exist contagions to some other major currencies as well as a flight to quality, particularly to the yen, probably suggesting diversification benefits. When the correlation coefficients are time-conditional, i.e. depend more heavily on recent data, the evidence shows fewer contagions and flights, since the political uncertainty in question disappeared gradually over time. After relevant interest rates were partialled out, some previously significant contagion and flight occurrences became less significant or even insignificant, possibly due to the significant impacts of the interest rates on the corresponding currency correlations.
The portfolio analysis in this dissertation (Chapter 2) examines financial contagion and capital flight implied by portfolio reallocations through mean-variance portfolio analysis, and builds on the correlation analysis in Chapter 1. In the correlation analysis, correlations are bivariate, whereas in the portfolio analysis they are multivariate and the risk-return tradeoff is also vitally involved. Portfolio risk minimization and reward-to-risk maximization are the two analytical cases of portfolio optimality taken into consideration. Robust portfolio optimizations, using shrinkage estimations and newly proposed risk-based weight constraints, are also applied. The evidence demonstrates that the portfolio analysis outcomes regarding currency contagions and flights, implying diversification benefits, vary and are noticeably dissimilar from the correlation analysis outcomes of Chapter 1. Subsequently, it could be inferred that the diversification benefits deduced from the portfolio and correlation analyses differ owing to the dominance, during market uncertainty, of the behaviors of the means and (co)variances of all the shock-originating and shock-receiving returns, over the behaviors of just bivariate correlations between the shock-originating and shock-receiving returns. Moreover, corrections of the heteroskedasticity bias inherent in the shock-originating returns, overall, do not have an effect on currency portfolio rebalancing. Additionally, hedging demands could be implied from detected structural portfolio reallocations, probably as a result of variance-covariance shocks rising from Brexit.
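As a rough sketch of the time-conditional machinery described above, an EWMA correlation can be computed recursively. The decay factor 0.94 and the return series are illustrative (RiskMetrics-style) assumptions, not the dissertation's exact specification:

```python
import numpy as np

def ewma_corr(x, y, lam=0.94):
    """Time-conditional correlation via exponentially weighted moving
    averages: with decay factor lam, recent returns carry more weight
    than distant ones."""
    cov, var_x, var_y = x[0] * y[0], x[0] ** 2, y[0] ** 2
    for t in range(1, len(x)):
        cov = lam * cov + (1 - lam) * x[t] * y[t]
        var_x = lam * var_x + (1 - lam) * x[t] ** 2
        var_y = lam * var_y + (1 - lam) * y[t] ** 2
    return cov / np.sqrt(var_x * var_y)

# Hypothetical daily returns for two currencies:
gbp = np.array([0.010, -0.031, 0.004, -0.012])
jpy = np.array([0.003, 0.015, -0.002, 0.008])
rho = ewma_corr(gbp, jpy)   # lies in [-1, 1]
```

Raising lam toward 1 moves the estimate toward the time-unconditional correlation; lowering it makes the estimate react faster to an event such as the referendum.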
353

Application of stochastic processes to real-time bidding and diffusion processes on networks

Lemonnier, Rémi 22 November 2016 (has links)
In this thesis, we study two applications of stochastic processes in internet marketing. The first chapter focuses on internet-user scoring for real-time bidding. This problem consists in finding the probability that a given user performs an action of interest, called a conversion, in the few days following the display of an advertising banner. We show that Hawkes processes are well suited to modeling this phenomenon, but that state-of-the-art algorithms do not scale to the dataset sizes typical of industrial applications. We therefore develop two new algorithms able to perform nonparametric multivariate Hawkes process inference orders of magnitude faster than previous methods. We show empirically that the first outperforms state-of-the-art competitors, and that the second scales to even larger datasets while retaining very high predictive power. The resulting algorithms have been running in production with very good performance for several years at 1000mercis, the pioneering marketing agency that is the industrial partner of this CIFRE PhD, where they have become an important business asset. The second chapter focuses on diffusion processes on graphs, an important tool for modeling the spread of a viral marketing operation over social networks. We establish the first theoretical bounds on the total number of nodes reached by a contagion for general graphs and diffusion dynamics, and show the existence of two distinct regimes: the sub-critical regime, where at most $O(\sqrt{n})$ nodes are infected ($n$ being the size of the network), and the super-critical regime, where $O(n)$ nodes can be infected. We also study the behavior with respect to the observation time $T$ and reveal the existence of critical times below which a diffusion, even one that is super-critical in the long term, behaves sub-critically. Finally, we extend our work to percolation and epidemiology, where we improve on existing results.
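As a generic illustration of the model class discussed above, a univariate Hawkes process with exponential kernel can be simulated by Ogata's thinning method. This is a textbook sketch, not one of the thesis's fast inference algorithms:

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Simulate a univariate Hawkes process by Ogata's thinning method.
    Conditional intensity: lambda(t) = mu + sum_{t_i < t} alpha*exp(-beta*(t - t_i)).
    The process is stable (sub-critical) when the branching ratio alpha/beta < 1."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while t < t_max:
        # The intensity at the current time bounds the (decaying) intensity
        # until the next event, so it is a valid thinning envelope.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)          # candidate inter-event time
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if t < t_max and rng.random() <= lam_t / lam_bar:
            events.append(t)                   # accept candidate event
    return events
```

In the conversion-scoring setting sketched above, each event would be a user interaction whose self-exciting term raises the short-term intensity of further actions.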
354

Interactions of Connected Electric Vehicles with Modern Power Grids in Smart Cities

Alghamdi, Turki 10 August 2021 (has links)
In a smart city, it is vital to provide a clean and green environment by curbing air pollution and greenhouse gas (GHG) emissions from transportation. As a recent action by many governments aiming to minimize transportation's impact on the climate, plans have been announced around the world to ban cars with gas engines. It is therefore anticipated that the number of electric vehicles (EVs) will grow very fast globally. Consequently, the need to establish electric vehicle supply equipment (EVSE) in the smart city through public charging stations grows year by year. However, the EV charging process via EVSE, which is primarily connected to the power grid, will put high pressure on the centralized grid, especially during peak demand periods, and increasing the grid's power production increases its environmental impact. It is therefore fundamental for the smart city to be equipped with a modern power grid that overcomes the traditional power grid's drawbacks. In this thesis, we conduct an in-depth analysis of the problem of EVs' interaction with the modern power grid in a smart city to manage and control EV charging and discharging processes. We also present various approaches and mechanisms for identifying and investigating these challenges and the requirements for managing power demand. We propose a novel solution, namely Decentralized-EVSE (D-EVSE), for EVs' charging and discharging processes based on renewable energy sources (RESs) and an energy storage system. We present two algorithms to manage the interaction between EVs and D-EVSE while maximizing EV drivers' satisfaction, in terms of reducing the waiting time for charging or discharging services, and minimizing the stress placed on the D-EVSE. We propose an optimization model based on game theory (GT) to manage the interaction between EVs and D-EVSE, which we name the decentralized-GT (D-GT) model. This model aims to find the optimal solution for EVs and D-EVSE based on a win-win concept. We design a decentralized profit-maximization algorithm that helps the D-EVSE profit from electricity price variation during the day, selling electricity to EVs and buying it either from the grid or from EVs via discharging. We implement different scenarios for these models and show, through analytical and simulation results, that our proposed models help to minimize the D-EVSE stress level, increase D-EVSE sustainability, maximize D-EVSE profit, maximize EV drivers' satisfaction, and reduce EVs' waiting time.
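The profit-from-price-variation idea can be caricatured as a greedy schedule. The function below is a hypothetical illustration (the names, inputs, and greedy rule are ours), not the thesis's D-GT model or its profit-maximization algorithm:

```python
def schedule_storage(prices, capacity_kwh, rate_kwh):
    """Buy (charge) during the cheapest hours and sell (discharge) during
    the dearest ones, within the storage limit. Deliberately ignores
    charge-before-discharge ordering and efficiency losses."""
    hours = sorted(range(len(prices)), key=lambda h: prices[h])
    n_slots = int(capacity_kwh // rate_kwh)     # how many hours we can charge
    buy, sell = hours[:n_slots], hours[-n_slots:]
    profit = rate_kwh * (sum(prices[h] for h in sell) - sum(prices[h] for h in buy))
    return buy, sell, profit

# Four hourly prices (currency units per kWh), 2 kWh storage, 1 kWh per hour:
buy, sell, profit = schedule_storage([10, 50, 20, 60], 2, 1)
```

On this toy day the storage charges in hours 0 and 2 and discharges in hours 1 and 3, for a profit of 80; a real D-EVSE would additionally respect temporal ordering and EV arrival constraints.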
355

Knowledge Discovery in Multimedia Databases

Jurčák, Petr January 2009 (has links)
This master's thesis is dedicated to knowledge discovery in multimedia databases, especially the basic classification and prediction methods used for data mining. Another part describes the extraction of low-level features from video data and images, and summarizes content-based search in multimedia content and the indexing of this type of data. The final part covers the implementation of a Gaussian mixture model for classification and compares its results with the SVM method.
356

Context Sensitive Civic Duty : An Experimental Study of how Corruption Affects both a Duty to Vote and a Duty to Abstain

Engström, Simon January 2021 (has links)
In this thesis I explore a novel context-sensitive conceptualisation of civic duty according to which the conduct (or misconduct) of elected officials affects whether eligible voters feel either a duty to vote (DTV) or a duty to abstain (DTA). Specifically, I argue that under conditions of corruption the norm of electoral accountability may override people's sense of DTV, in which case they instead feel a DTA. This context-sensitive account is contrasted with a Kantian account of civic duty according to which eligible voters feel a duty to always vote, regardless of contextual factors. The empirical results provide tentative support for the claim that corruption not only decreases eligible voters' sense of DTV but also increases their sense of DTA. This thesis thus contributes not only to the conceptualisation of civic duty in relation to voter turnout; its results also have important implications for how the rational choice perspective approaches the cost/benefit analysis commonly associated with the voting decision. In the latter case, these results indicate that abstainers too may act out of duty and can therefore be assumed to gain positive utility from their abstention. However, the possibility that abstention (just as voting) yields unique costs and benefits has, to my knowledge, never been acknowledged in the rational choice literature on voter turnout. I therefore conclude by presenting a novel suggestion of how the potential costs and benefits of abstention can be incorporated into the calculus of voting.
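The cost/benefit analysis referred to above is commonly formalized as the Riker and Ordeshook calculus of voting. The sketch below of how a duty to abstain might enter that calculus uses our own notation (the $D_A$ term and the comparison), not the thesis's formal model:

```latex
% Standard calculus of voting (Riker & Ordeshook):
%   p   probability of casting the pivotal vote
%   B   benefit if the preferred candidate wins
%   C   cost of voting
%   D_V duty-to-vote term
R_{\text{vote}} = pB - C + D_V
% Context-sensitive extension sketched from the thesis's argument:
% under corruption, abstention may carry its own duty term D_A,
% so the citizen compares
R_{\text{vote}} = pB - C + D_V
\qquad \text{versus} \qquad
R_{\text{abstain}} = D_A
```

On this reading, a corrupt incumbent can raise $D_A$ (or lower $D_V$) enough that abstention yields the higher utility even when $pB - C$ is unchanged.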
357

Reproducibility and Applicability of a Fuzzy-based Routing Algorithm in Wireless Sensor Networks

Rönningen, Hannes, Olofsson, Erik January 2023 (has links)
Wireless sensor networks are a broad subject with many applications and interesting research areas, such as optimization of connectivity and energy efficiency. One problem is that most published articles in this field use customized simulation environments and do not provide the source code of their implementations. By omitting implementation details, it becomes difficult to determine how the results are achieved, which calls into question the validity and reliability of the works. This thesis aims to reproduce one of these researched methods, an algorithm that balances battery life with efficient routing within a network using fuzzy logic, with the goal of increasing the reliability of the methodology within its field. The research question built on these premises is thus: "Is reproducibility satisfactory in a research work on a multi-objective routing algorithm, using fuzzy logic, in wireless sensor networks? A case study of Minhas et al." Two additional research questions emerge from the first one: "How does the reproduced algorithm perform in comparison to a selection of different routing algorithms?" and "Is the reproduced algorithm, as is, applicable to a less idealistic environment?" To answer the research questions, a computer simulation method is used to build, execute, and analyze the output of the algorithms. The results show that the implemented algorithm performs noticeably better in both lifetime and ratio to the shortest path than the original implementation, hinting that the implementation and reproducibility deviate from expected results. The reproduced algorithm is also compared to two other algorithms in a different simulation environment, where it performs better in lifetime and packet delivery rate while performing slightly worse in energy efficiency and total energy consumption. Due to the significant differences in performance against the reproduced article's implementation, the study concludes that the reproducibility is not satisfactory. Lastly, it concludes that the algorithm does not perform well in a less idealistic simulation environment, making it less applicable.
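Fuzzy-logic routing of the kind reproduced above combines fuzzy memberships of link metrics into a single score per candidate next hop. The sketch below is a generic Mamdani-style illustration; the membership shapes, variable names, and the single rule are assumptions of ours, not the rules of the algorithm by Minhas et al.:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def next_hop_score(battery_level, extra_hops):
    """Mamdani-style AND (minimum) of two fuzzy inputs: a candidate hop is
    'good' when its battery is high AND the resulting path stays short."""
    high_battery = tri(battery_level, 0.3, 1.0, 1.7)  # peaks at a full battery
    short_path = tri(extra_hops, -5.0, 0.0, 5.0)      # peaks at zero extra hops
    return min(high_battery, short_path)
```

A node would compute this score for every neighbour and forward to the best-scoring one, which is how the lifetime-versus-shortest-path balance in the results arises.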
358

Case Histories and Analyses of Synthetic Economies: Implications for Experiments, Game Design, Monetization, and Revenue Maximization.

Wolf, Christopher Alexander 10 May 2013 (has links)
No description available.
359

Particle-based Stochastic Volatility in Mean model

Kövamees, Gustav January 2019 (has links)
This thesis presents a Stochastic Volatility in Mean (SVM) model estimated using sequential Monte Carlo methods. The SVM model was first introduced by Koopman and provides an opportunity to study the intertemporal relationship between stock returns and their volatility by including volatility itself as an explanatory variable in the mean equation. Using sequential Monte Carlo methods allows us to consider a non-linear estimation procedure at the cost of extra computational complexity. The recently developed PaRIS algorithm, introduced by Olsson and Westerborn, drastically decreases the computational complexity of smoothing relative to previous algorithms and allows for efficient estimation of parameters. The main purpose of this thesis is to investigate the volatility feedback effect, i.e. the relation between expected return and unexpected volatility, in an empirical study. The results show that unanticipated shocks to the return process do not explain expected returns.
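The sequential Monte Carlo machinery referenced above can be sketched as a bootstrap particle filter for a plain stochastic volatility model (without the in-mean term). The parameter names and toy setup are ours; the thesis's SVM model and PaRIS-based smoothing are considerably more involved:

```python
import numpy as np

def bootstrap_filter(y, phi, sigma, beta, n_part=500, seed=1):
    """Bootstrap particle filter for a basic stochastic volatility model:
        x_t = phi*x_{t-1} + sigma*v_t,   y_t = beta*exp(x_t/2)*e_t,
    with v_t, e_t standard normal. Returns the log-likelihood estimate of y."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma / np.sqrt(1 - phi ** 2), n_part)  # stationary start
    loglik = 0.0
    for obs in y:
        x = phi * x + sigma * rng.normal(size=n_part)           # propagate particles
        sd = beta * np.exp(x / 2)
        w = np.exp(-0.5 * (obs / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        loglik += np.log(w.mean())                              # likelihood increment
        x = rng.choice(x, size=n_part, p=w / w.sum())           # multinomial resampling
    return loglik
```

Plugging such a filter into a likelihood optimizer gives parameter estimates; smoothing (and hence the PaRIS algorithm) is needed once lagged volatility also enters the mean equation.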
360

Machine learning multicriteria optimization in radiation therapy treatment planning

Zhang, Tianfang January 2019 (has links)
In radiation therapy treatment planning, recent works have used machine learning based on historically delivered plans to automate the process of producing clinically acceptable plans. Compared to traditional approaches such as repeated weighted-sum optimization or multicriteria optimization (MCO), automated planning methods generally have the benefits of low computational times and minimal user interaction, but on the other hand lack the flexibility of general-purpose frameworks such as MCO. Machine learning approaches can be especially sensitive to deviations in their dose prediction due to certain properties of the optimization functions usually used for dose mimicking and, moreover, suffer from the fact that there is no general causality between prediction accuracy and optimized plan quality. In this thesis, we present a means of unifying ideas from machine learning planning methods with the well-established MCO framework. More precisely, given prior knowledge in the form of either a previously optimized plan or a set of historically delivered clinical plans, we are able to automatically generate Pareto optimal plans spanning a dose region corresponding to plans that are achievable as well as clinically acceptable. In the former case, this is achieved by introducing dose-volume constraints; in the latter case, by fitting a weighted-data Gaussian mixture model to pre-defined dose statistics using the expectation-maximization algorithm, modifying it with exponential tilting, and using specially developed optimization functions to account for prediction uncertainties. Numerical results for conceptual demonstration are obtained for a prostate cancer case treated with a volumetric-modulated arc therapy technique, where it is shown that the methods developed in the thesis successfully generate Pareto optimal plans of satisfactory quality and diversity automatically, while excluding clinically irrelevant dose regions. For the case of using historical plans as prior knowledge, the computational times are significantly shorter than those typical of conventional MCO.
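The weighted-data Gaussian mixture fitting described above can be sketched in one dimension. The thesis fits multivariate dose statistics and adds exponential tilting, so the function below is only an illustrative assumption of the weighted EM step:

```python
import numpy as np

def weighted_gmm_em(x, w, k=2, n_iter=50):
    """EM for a one-dimensional Gaussian mixture where each sample carries
    a data weight w (entering the E-step responsibilities)."""
    mu = np.linspace(x.min(), x.max(), k)   # spread the initial means
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibilities, scaled by the data weights
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True) * w[:, None]
        # M-step: weighted updates of mixture weights, means, and variances
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / nk.sum()
    return pi, mu, var
```

With all weights equal this reduces to ordinary EM; unequal weights let, for example, more recent or more trusted historical plans dominate the fitted dose-statistic distribution.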
