Portfolio optimization in presence of a self-exciting jump process: from theory to practice

Veronese, Andrea 27 April 2022 (has links)
We aim at generalizing the celebrated portfolio optimization problem "à la Merton" to the case where the asset evolution is steered by a self-exciting jump-diffusion process. We first define the rigorous mathematical framework needed to introduce the stochastic optimal control problem we are interested in. Then, we provide a proof of a specific version of the Dynamic Programming Principle (DPP) with respect to the general class of self-exciting processes under study. Afterwards, we state the Hamilton-Jacobi-Bellman (HJB) equation, whose solution gives the value function of the corresponding optimal control problem. The resulting HJB equation takes the form of a Partial Integro-Differential Equation (PIDE), for which we prove existence and uniqueness of the solution in the viscosity sense. We further derive a suitable numerical scheme to solve the HJB equation corresponding to the portfolio optimization problem. To this end, we also provide a detailed study of the solution's dependence on the parameters of the problem. The analysis is performed by calibrating the model on ENI asset levels during the worldwide COVID-19 outbreak. In particular, the calibration routine is based on a sophisticated Sequential Monte Carlo algorithm.

Directional relationships between BOLD activity and autonomic nervous system fluctuations revealed by fast fMRI acquisition

Iacovella, Vittorio January 2012 (has links)
The relationship between brain function, characterized by functional magnetic resonance imaging (fMRI), and physiological fluctuations, measured by means of cardiac and respiratory oscillations, has been one of the most debated topics of the last decade. In recent literature, a great number of studies focus on both practical and conceptual aspects of this topic. In this work, we start by reviewing two distinct approaches to considering physiology-related signals with respect to functional magnetic resonance imaging: one treating physiology-related fluctuations as generators of noise, the other considering them as carriers of cognitively relevant information. In chapter 2 – "Physiology-related effects in the BOLD signal at rest at 4T" – we consider physiological quantities as generators of noise, and discuss conceptual flaws researchers have to face when dealing with data de-noising procedures. We point out that it can be difficult to show that such a procedure has achieved its stated aim, i.e. to remove only physiology-related components from the data. As a practical solution, we present a benchmark, based on the principle of permutation testing, for assessing whether correction for physiological noise has achieved its stated aim. In chapter 3 – "Directional relationships between BOLD activity and autonomic nervous system fluctuations revealed by fast fMRI acquisition" – we instead consider autonomic indices derived from physiological time series as meaningful components of the BOLD signal. There, we describe an fMRI experiment building on this view, whose goal was to localize brain areas whose activity is directionally related to autonomic activity, in a top-down modulation fashion. In chapter 4 we recap the conclusions drawn from the two approaches and summarize the general contributions of our findings. Bringing together the distinct approaches we reviewed led us to two main contributions.
On the one hand, we re-examined the validity of well-established procedures in fMRI resting-state pre-processing pipelines. On the other, we were able to say something new about the general relationship between BOLD and autonomic activity, resting-state fluctuations and deactivation theory.
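The permutation-testing idea behind the chapter 2 benchmark can be illustrated on synthetic data. The sketch below is only a schematic analogue of the approach, not the thesis's actual pipeline: it builds a null distribution for the correlation between a voxel time series and a physiology-derived regressor by permuting the regressor in time (with real fMRI data one would prefer circular shifts that preserve autocorrelation):

```python
import numpy as np

def permutation_pvalue(signal, regressor, n_perm=2000, rng=None):
    """Permutation p-value for the correlation between a voxel time
    series and a physiology-derived regressor: the null distribution
    is built by randomly permuting the regressor in time."""
    if rng is None:
        rng = np.random.default_rng(0)
    obs = abs(np.corrcoef(signal, regressor)[0, 1])
    null = np.array([abs(np.corrcoef(signal, rng.permutation(regressor))[0, 1])
                     for _ in range(n_perm)])
    return (1 + np.sum(null >= obs)) / (1 + n_perm)

rng = np.random.default_rng(1)
phys = np.sin(np.linspace(0, 20, 200))         # toy cardiac/respiratory regressor
raw = phys + 0.5 * rng.standard_normal(200)    # voxel series contaminated by physiology
beta = (raw @ phys) / (phys @ phys)
cleaned = raw - beta * phys                    # regress the physiological component out
print(permutation_pvalue(raw, phys), permutation_pvalue(cleaned, phys))
```

Before cleaning, the correlation with the regressor is far outside the null distribution (tiny p-value); after regressing the component out, it is indistinguishable from chance, which is exactly what a successful correction should show.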

SDEs and MFGs towards Machine Learning applications

Garbelli, Matteo 04 December 2023 (has links)
We present results that span three interconnected domains. Initially, our analysis is centred on Backward Stochastic Differential Equations (BSDEs) featuring time-delayed generators. Subsequently, we direct our interest towards Mean Field Games (MFGs) incorporating absorption aspects, with a focus on the corresponding Master Equation within a confined domain under the imposition of Dirichlet boundary conditions. The investigation culminates in exploring pertinent Machine Learning methodologies applied to financial and economic decision-making processes.
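As a toy illustration of the first of these domains, the sketch below implements the plain least-squares Monte Carlo backward scheme for a BSDE without delay in the generator (the time-delayed case treated in the thesis is substantially harder); approximating the conditional expectation by polynomial regression, and every parameter choice here, are illustrative assumptions:

```python
import numpy as np

def bsde_y0(g, f, T=1.0, n_steps=20, n_paths=50_000, deg=4, rng=None):
    """Least-squares Monte Carlo backward scheme for the BSDE
        Y_t = g(W_T) + int_t^T f(Y_s) ds - int_t^T Z_s dW_s,
    driven by a 1-d Brownian motion W; conditional expectations are
    approximated by polynomial regression on the current value of W."""
    if rng is None:
        rng = np.random.default_rng(0)
    dt = T / n_steps
    W = np.cumsum(rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt), axis=1)
    Y = g(W[:, -1])                                # terminal condition at t = T
    for i in range(n_steps - 2, -1, -1):
        coeffs = np.polyfit(W[:, i], Y, deg)       # regress Y_{i+1} on W_{t_i}
        cond_exp = np.polyval(coeffs, W[:, i])     # ~ E[Y_{i+1} | W_{t_i}]
        Y = cond_exp + f(cond_exp) * dt            # explicit backward Euler step
    return np.mean(Y + f(Y) * dt)                  # last step down to t = 0

# Sanity check: with f = 0 and g(x) = x**2, the exact value is E[W_T^2] = T.
y0 = bsde_y0(g=lambda x: x**2, f=lambda y: 0.0, T=1.0)
print(y0)
```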

Novel data-driven analysis methods for real-time fMRI and simultaneous EEG-fMRI neuroimaging

Soldati, Nicola January 2012 (has links)
Real-time neuroscience can be described as the use of neuroimaging techniques to extract and evaluate brain activations during their ongoing development. The possibility to track these activations opens the door to new research modalities as well as practical applications in both clinical and everyday life. Moreover, the combination of different neuroimaging techniques, i.e. multimodality, may reduce several limitations present in each single technique. Due to the intrinsic difficulties of real-time experiments, advanced signal processing algorithms are needed in order to fully exploit their potential. In particular, since brain activations are free to evolve in an unpredictable way, data-driven algorithms have the potential to be more suitable than model-driven ones. For example, in neurofeedback experiments brain activation tends to change its properties due to training or task effects, thus evidencing the need for adaptive algorithms. Blind Source Separation (BSS) methods, and in particular Independent Component Analysis (ICA) algorithms, are naturally suited to such conditions. Nonetheless, their applicability in this framework needs further investigation. The goals of the present thesis are: i) to develop a working real-time set-up for performing experiments; ii) to investigate different state-of-the-art ICA algorithms, with the aim of identifying the most suitable ones (along with their optimal parameters) to be adopted in a real-time MRI environment; iii) to investigate novel ICA-based methods for performing real-time MRI neuroimaging; iv) to investigate novel methods to perform data fusion between EEG and fMRI data acquired simultaneously. The core of this thesis is organized around four "experiments", each one addressing one of these specific aims. The main results can be summarized as follows. Experiment 1: a data analysis software was implemented along with the hardware acquisition set-up for performing real-time fMRI.
The set-up was developed with the aim of having a framework in which it would be possible to test and run the novel methods proposed to perform real-time fMRI. Experiment 2: to select the most suitable ICA algorithm to be implemented in the system, we investigated theoretically and compared empirically the performance of 14 different ICA algorithms, systematically sampling different growing window lengths, model orders, and a priori conditions (none, spatial or temporal). Performance was evaluated by computing the spatial and temporal correlation to a target component of brain activation, as well as computation time. Four algorithms were identified as best performing without prior information (constrained ICA, fastICA, jade-opac and evd), with their corresponding parameter choices. Both spatial and temporal priors were found to almost double the similarity to the target at no additional computational cost for the constrained ICA method. Experiment 3: the results and the suggested parameter choices from experiment 2 were implemented to monitor ongoing activity in a sliding-window approach, to investigate different ways in which ICA-derived a priori information could be used to monitor a target independent component: i) back-projection of constant spatial information derived from a functional localizer, ii) dynamic use of temporal, iii) spatial, or iv) both spatial-temporal ICA-constrained data. The methods were evaluated based on spatial and/or temporal correlation with the monitored target IC, computation time and intrinsic stochastic variability of the algorithms. The results show that the back-projection method offers the highest performance both in terms of time-course reconstruction and speed. This method is very fast and effective as long as the monitored IC has a strong and well-defined behavior, since it relies on an accurate description of the spatial behavior. The dynamic methods offer comparable performance at the cost of higher computation time.
In particular, the spatio-temporal method performs comparably to back-projection in terms of computation time, while offering more variable performance in terms of reconstruction of spatial maps and time courses. Experiment 4: finally, a Higher Order Partial Least Squares based method combined with ICA is proposed and investigated to integrate EEG-fMRI data acquired simultaneously. This method proved promising, although more experiments are needed.
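The back-projection method that performed best in experiment 3 can be sketched on synthetic data: a fixed spatial map obtained from a "localizer" run is projected onto each newly acquired volume to recover the component's time course. This is a schematic illustration under invented dimensions and noise levels, not the thesis's implementation:

```python
import numpy as np

def back_project(X, spatial_map):
    """Recover the time course of a component from data X (time x voxels)
    by projecting each volume onto a fixed, pre-computed spatial map."""
    m = spatial_map / np.linalg.norm(spatial_map)   # unit-norm spatial map
    return X @ m                                    # one dot product per volume

rng = np.random.default_rng(0)
n_t, n_vox = 100, 500
true_map = rng.standard_normal(n_vox)               # map from a localizer run
true_tc = np.sin(np.linspace(0, 8 * np.pi, n_t))    # ground-truth time course
X = np.outer(true_tc, true_map) + 0.5 * rng.standard_normal((n_t, n_vox))
est_tc = back_project(X, true_map)
corr = np.corrcoef(est_tc, true_tc)[0, 1]
print(round(corr, 3))
```

Because each incoming volume requires only a single dot product with the stored map, the method is fast enough for real-time use; its accuracy, as the abstract notes, hinges on the spatial map remaining an accurate description of the component.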

Financial risk sources and optimal strategies in jump-diffusion frameworks

Prezioso, Luca 25 March 2020 (has links)
We first consider an optimal dividend problem with investment opportunities that takes into account a source of strategic risk, as well as the effect of market frictions on the decision process of financial entities. It concerns the problem of determining an optimal control of the dividend under debt constraints and investment opportunities in an economy with business cycles. It is assumed that the company is allowed to accept or reject investment opportunities arriving at random times with random sizes, by changing its outstanding indebtedness, which impacts its capital structure and risk profile. This work mainly focuses on the strategic risk faced by companies; in particular, it focuses on the manager's problem of setting appropriate priorities to deploy the limited resources available. This component is taken into account by introducing frictions in the capital structure modification process. The problem is formulated as a bi-dimensional singular control problem under regime switching in the presence of jumps. An explicit condition is obtained that ensures that the value function is finite. A viscosity solution approach is used to obtain qualitative descriptions of the solution. Moreover, a lending scheme for a system of interconnected banks with probabilistic constraints of failure is considered. The problem arises from the fact that financial institutions cannot possibly carry enough capital to withstand counterparty failures or systemic risk. In such situations, the central bank or the government effectively becomes the risk manager of last resort or, in extreme cases, the lender of last resort. If, on the one hand, the health of the whole financial system depends on government intervention, on the other hand, guaranteeing a high probability of salvage may increase the moral hazard of the banks in the financial network.
A closed-form solution for an optimal control problem related to interbank lending schemes is derived, subject to terminal probability constraints on the failure of banks that are interconnected through a financial network. The derived solution applies to real bank networks, since a general solution is obtained when the aforementioned probability constraints are assumed for all the banks. We also present a direct method to compute the systemic relevance parameter for each bank within the network. Finally, a possible computation technique for the Default Risk Charge under regulatory risk measurement processes is considered. We focus on the Default Risk Charge measure as an effective alternative to the Incremental Risk Charge, proposing its implementation by a quasi-exhaustive heuristic algorithm to determine the minimum capital required of a bank facing the market risk associated with portfolios based on assets issued by several financial agents. While most banks use the Monte Carlo simulation approach and the empirical quantile to estimate this risk measure, we provide new computational approaches, exhaustive or heuristic, that are now becoming feasible thanks to both new regulation and the high-speed, low-cost technology available nowadays.
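The Monte Carlo plus empirical-quantile approach that, as noted above, most banks use for the Default Risk Charge can be sketched in a few lines. The one-factor Gaussian copula for default correlation and all portfolio numbers below are illustrative assumptions, not the thesis's model:

```python
import numpy as np
from statistics import NormalDist

def drc_empirical_quantile(pd, exposure, lgd, rho=0.3, n_sims=100_000,
                           q=0.999, rng=None):
    """Monte Carlo estimate of a Default Risk Charge-style measure: the
    empirical 99.9% quantile of one-year portfolio default losses, with
    default correlation driven by a one-factor Gaussian copula."""
    if rng is None:
        rng = np.random.default_rng(0)
    thresholds = np.array([NormalDist().inv_cdf(p) for p in pd])
    systemic = rng.standard_normal((n_sims, 1))       # common market factor
    idio = rng.standard_normal((n_sims, len(pd)))     # issuer-specific factors
    asset = np.sqrt(rho) * systemic + np.sqrt(1 - rho) * idio
    defaults = asset < thresholds                     # default indicator per issuer
    losses = defaults @ (np.asarray(exposure) * np.asarray(lgd))
    return np.quantile(losses, q)

# Toy portfolio: three issuers with 1%/2%/5% one-year default probabilities.
drc = drc_empirical_quantile(pd=[0.01, 0.02, 0.05],
                             exposure=[100.0, 80.0, 50.0],
                             lgd=[0.6, 0.6, 0.6])
print(drc)
```

The exhaustive alternative mentioned in the abstract would instead enumerate default scenarios directly; for a small number of issuers the 2^n loss outcomes can be listed and the quantile read off exactly.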

Semantic Enrichment of Mobile Phone Data Records Exploiting Background Knowledge

Dashdorj, Zolzaya January 2015 (has links)
Every day, billions of mobile network log records (commonly defined as Call Detail Records, or CDRs) are generated by cell phone operators. These data provide inspiring insights into human actions and behaviors, which are essential in the development of context-aware applications and services. This potential demand has fostered major research activities in a variety of domains such as social and economic development, urban planning, and health prevention. The major challenge of this thesis is to interpret CDRs for human activity recognition, in the light of background knowledge of the CDR data context. Indeed, each entry of a CDR is associated with a context, which describes the temporal and spatial location of the user when a particular network record was generated by his/her mobile device. By leveraging available Web 2.0 data sources (e.g., OpenStreetMap), this research thesis proposes to develop a novel model, combining logical and statistical reasoning standpoints, for enabling human activity inference in qualitative terms. The results aim at compiling human behavior predictions into sets of classification tasks on the CDRs. Our research results show that Points of Interest (POIs) are a good proxy for predicting the content of human activities in an area. The model is thus proven effective for predicting the context of human activity, whose overall level can be efficiently observed from cell phone data records.
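The idea of POIs as a proxy for human activity can be caricatured in a few lines: score each candidate activity by the weighted presence of POI categories around a cell tower and pick the best-scoring one. The categories and weights below are invented for illustration and are not the thesis's model:

```python
def predict_activity(poi_counts, activity_weights):
    """Score each human activity by the weighted counts of POI categories
    observed around a cell tower, and return the best-scoring activity."""
    scores = {act: sum(w.get(cat, 0) * n for cat, n in poi_counts.items())
              for act, w in activity_weights.items()}
    return max(scores, key=scores.get)

weights = {  # hypothetical links between OpenStreetMap categories and activities
    "working":  {"office": 1.0, "industrial": 0.8},
    "shopping": {"mall": 1.0, "supermarket": 0.9},
    "eating":   {"restaurant": 1.0, "cafe": 0.7},
}
area = {"office": 12, "cafe": 3, "supermarket": 1}   # POIs near one tower
print(predict_activity(area, weights))  # offices dominate this area
```

In the thesis the weights come from combining logical background knowledge with statistics learned from the CDRs, rather than being hand-set as here.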
