1. Weak Solutions to a Fractional Fokker-Planck Equation via Splitting and Wasserstein Gradient Flow

Bowles, Malcolm, 22 August 2014
In this thesis, we study a linear fractional Fokker-Planck equation that models non-local ('fractional') diffusion in the presence of a potential field. The non-locality is due to the appearance of the fractional Laplacian in the corresponding PDE, in place of the classical Laplacian which distinguishes the case of regular (Gaussian) diffusion. Motivated by the observation that, in contrast to the classical Fokker-Planck equation (describing regular diffusion in the presence of a potential field), there is no natural gradient flow formulation for its fractional counterpart, we prove existence of weak solutions to this fractional Fokker-Planck equation by combining a splitting technique with a Wasserstein gradient flow formulation. An explicit iterative construction is given, which we prove converges weakly to a weak solution of this PDE.
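The splitting idea in this abstract can be illustrated on a toy periodic problem: alternate a fractional-diffusion step with a potential-transport step. The sketch below is purely illustrative and is not the thesis's construction (which works at the level of Wasserstein gradient flows, not finite differences); the grid, potential V, step size tau, and exponent alpha are all hypothetical choices. The fractional-heat step is solved exactly in Fourier space via the multiplier exp(-tau |k|^alpha), and the transport step along -grad V uses a mass-conservative upwind discretization.

```python
import numpy as np

def fractional_heat_step(rho, tau, alpha, L):
    """One step of d(rho)/dt = -(-Laplacian)^{alpha/2} rho on a periodic grid,
    solved exactly via the Fourier multiplier exp(-tau * |k|^alpha)."""
    n = rho.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wavenumbers
    return np.real(np.fft.ifft(np.exp(-tau * np.abs(k) ** alpha) * np.fft.fft(rho)))

def transport_step(rho, tau, V, L):
    """Explicit upwind step of d(rho)/dt = div(rho * grad V); conserves mass."""
    n = rho.size
    dx = L / n
    gradV = (np.roll(V, -1) - np.roll(V, 1)) / (2 * dx)
    flux = rho * (-gradV)                        # velocity field is -grad V
    fplus = np.where(flux > 0, flux, 0.0)        # upwind interface fluxes
    fminus = np.where(flux < 0, flux, 0.0)
    div = (fplus - np.roll(fplus, 1)) / dx + (np.roll(fminus, -1) - fminus) / dx
    return rho - tau * div

L_dom, n, tau, alpha = 2 * np.pi, 256, 1e-3, 1.5
x = np.linspace(0, L_dom, n, endpoint=False)
V = np.cos(x)                                    # toy confining potential
rho = np.exp(-(x - np.pi) ** 2)
rho /= rho.sum() * (L_dom / n)                   # normalize to unit mass

for _ in range(100):                             # alternate the two half-dynamics
    rho = fractional_heat_step(rho, tau, alpha, L_dom)
    rho = transport_step(rho, tau, V, L_dom)

print(rho.sum() * (L_dom / n))                   # total mass after splitting
```

Because the k = 0 Fourier multiplier equals one and the upwind divergence telescopes to zero over the periodic grid, both substeps conserve total mass, mirroring the mass-preservation one expects of a Fokker-Planck evolution.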
2. Energy-Dissipative Methods in Numerical Analysis, Optimization and Deep Neural Networks for Gradient Flows and Wasserstein Gradient Flows

Shiheng Zhang, 05 December 2023
This thesis delves into the development and integration of energy-dissipative methods, with applications spanning numerical analysis, optimization, and deep neural networks, primarily targeting gradient flows and porous medium equations. In the realm of optimization, we introduce the element-wise relaxed scalar auxiliary variable (E-RSAV) algorithm, showcasing its robustness and convergence through extensive numerical experiments. Complementing this, we design an Energy-Dissipative Evolutionary Deep Operator Neural Network (DeepONet) to numerically address a suite of partial differential equations. By employing a dual-subnetwork structure and utilizing the Scalar Auxiliary Variable (SAV) method, the network achieves accurate approximations of operators while upholding the Energy Dissipation Law, even when the training data comprises only the initial state. Lastly, we formulate first-order schemes tailored for Wasserstein gradient flows. Our schemes demonstrate remarkable properties, including mass conservation, unique solvability, positivity preservation, and unconditional energy dissipation. Collectively, the innovations presented here offer promising pathways for efficient and accurate numerical solutions for both gradient flows and Wasserstein gradient flows, bridging the gap between traditional optimization techniques and modern neural network methodologies.
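The Scalar Auxiliary Variable (SAV) method mentioned in this abstract can be sketched on a toy ODE gradient flow x' = -grad E(x) with E(x) = 0.5 x·Ax + E1(x). This is a minimal illustration of the standard first-order SAV scheme, not the thesis's E-RSAV algorithm or DeepONet construction; the matrix A, the double-well E1, and the constant C are hypothetical choices. The scheme treats the linear part implicitly, evaluates the nonlinear term through the auxiliary variable r ≈ sqrt(E1 + C), and dissipates the modified energy 0.5 x·Ax + r² unconditionally.

```python
import numpy as np

A = np.diag([1.0, 2.0, 3.0])              # symmetric positive semidefinite linear part
C = 1.0                                   # constant keeping E1 + C > 0

def E1(x):                                # toy nonlinear energy (double well)
    return 0.25 * np.sum((x**2 - 1.0) ** 2)

def gradE1(x):
    return x * (x**2 - 1.0)

def sav_step(x, r, tau):
    """One first-order SAV step: implicit in A, explicit b^n, r^{n+1} linear."""
    b = gradE1(x) / np.sqrt(E1(x) + C)    # b^n = grad E1(x^n) / sqrt(E1(x^n) + C)
    Minv = np.linalg.inv(np.eye(x.size) + tau * A)
    p = Minv @ x                          # x^{n+1} = p - tau * r^{n+1} * q
    q = Minv @ b
    # r^{n+1} = r^n + 0.5 * b . (x^{n+1} - x^n), a linear relation in r^{n+1}
    r_new = (r + 0.5 * b @ (p - x)) / (1.0 + 0.5 * tau * (b @ q))
    return p - tau * r_new * q, r_new

x = np.array([1.5, -0.5, 0.2])
r = np.sqrt(E1(x) + C)                    # initialize auxiliary variable exactly
energies = []
for _ in range(200):
    energies.append(0.5 * x @ A @ x + r**2)   # modified energy, provably decaying
    x, r = sav_step(x, r, tau=0.05)

print(energies[0], energies[-1])          # modified energy decreases monotonically
```

Solving the coupled update reduces to one linear solve plus a scalar equation for r^{n+1}, which is the source of SAV's efficiency; the monotone decay of the modified energy holds for any step size tau, matching the "unconditional energy dissipation" the abstract highlights.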
