Advances In Numerical Methods for Partial Differential Equations and Optimization

Xinyu Liu (19020419), 10 July 2024
This thesis presents advances in numerical methods for partial differential equations (PDEs) and optimization problems, with a focus on improving efficiency, stability, and accuracy across various applications. We begin by addressing 3D Poisson-type equations, developing a GPU-accelerated spectral-element method that exploits the tensor-product structure to achieve extremely fast performance: problems with over one billion degrees of freedom can be solved in less than one second on modern GPUs, with demonstrated applications to the Schrödinger and Cahn–Hilliard equations. Next, we focus on parabolic PDEs, specifically the Cahn–Hilliard equation with dynamical boundary conditions. We propose an efficient, energy-stable numerical scheme within a unified framework that handles both Allen–Cahn and Cahn–Hilliard type boundary conditions. The scheme employs a scalar auxiliary variable (SAV) approach to achieve a linear, second-order, unconditionally energy-stable discretization. Shifting to a machine-learning perspective on PDEs, we introduce an unsupervised learning-based numerical method for solving elliptic PDEs. This approach uses deep neural networks to approximate PDE solutions and employs least-squares functionals as loss functions, with a focus on first-order system least-squares formulations. In the realm of optimization, we present an efficient and robust SAV-based algorithm for discrete gradient systems. This method modifies the standard SAV approach and incorporates relaxation and adaptive strategies to achieve fast convergence for minimization problems while maintaining unconditional energy stability. Finally, we address optimization in the context of machine learning by developing a structure-guided Gauss–Newton method for shallow ReLU neural network optimization. This approach exploits both the least-squares structure and the neural network structure to create an efficient iterative solver, demonstrating superior performance on challenging function-approximation problems. Throughout the thesis, we provide theoretical analysis, efficient numerical implementations, and extensive computational experiments to validate the proposed methods.
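The tensor-product structure behind the fast Poisson solves can be illustrated with the classical fast-diagonalization idea: in 2D, the discrete Poisson system reduces to two small 1D eigendecompositions plus dense matrix products. The sketch below is a minimal NumPy illustration under simplifying assumptions (symmetric 1D stiffness matrices, identity/lumped mass), not the thesis's actual GPU implementation; all names are illustrative.

```python
import numpy as np

def fast_diagonalization_solve(A1, A2, F):
    """Solve the Sylvester-form Poisson system A1 @ U + U @ A2.T = F.

    A1, A2: symmetric 1D stiffness matrices (identity mass assumed);
    F: right-hand side sampled on the tensor-product grid.
    """
    lam1, V1 = np.linalg.eigh(A1)   # 1D eigendecompositions, done once
    lam2, V2 = np.linalg.eigh(A2)
    G = V1.T @ F @ V2                           # forward transform of the RHS
    G = G / (lam1[:, None] + lam2[None, :])     # pointwise diagonal solve
    return V1 @ G @ V2.T                        # inverse transform to nodal values
```

The solve costs only a handful of dense matrix-matrix products (BLAS-3 operations), which is one reason such tensor-product solvers map so well onto GPUs; in 3D the same idea applies one transform per direction.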

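The SAV idea for minimization can be conveyed with a first-order sketch: an auxiliary scalar r tracks sqrt(E + C), its update is linear and non-increasing regardless of the step size, and a relaxation step re-anchors r to the true energy. This is a generic illustration of the SAV-with-relaxation mechanism, not the thesis's actual adaptive algorithm; the scheme details here are assumptions.

```python
import numpy as np

def sav_gradient_descent(E, gradE, x0, dt=0.1, C=1.0, steps=200):
    """Minimize E via an SAV-stabilized gradient flow (first-order sketch)."""
    x = np.asarray(x0, dtype=float)
    r = np.sqrt(E(x) + C)                  # auxiliary variable r ~ sqrt(E + C)
    for _ in range(steps):
        g = gradE(x)
        s = np.sqrt(E(x) + C)
        # Linear implicit update for r: guarantees r decreases for any dt.
        r = r / (1.0 + dt * np.dot(g, g) / (2.0 * s * s))
        # Gradient step scaled by r/s, so the step is damped, never unstable.
        x = x - dt * (r / s) * g
        # Relaxation: re-anchor r to the true energy without increasing it.
        r = min(r, np.sqrt(E(x) + C))
    return x
```

On a quadratic such as E(x) = |x|^2/2 the scaling r/s stays close to 1, so the iteration behaves like plain gradient descent while the modified energy r^2 is provably non-increasing.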
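The structural observation that motivates Gauss–Newton-type solvers for shallow ReLU networks is that, for fixed hidden parameters, the model is linear in the outer weights, so that block can be solved exactly by linear least squares. The snippet below demonstrates only this one structural ingredient, with illustrative names; it is not the thesis's structure-guided Gauss–Newton solver.

```python
import numpy as np

def fit_outer_weights(W, b, X, y):
    """Least-squares-optimal outer weights of u(x) = c @ relu(W @ x + b).

    W: (m, d) hidden weights, b: (m,) hidden biases, X: (N, d) samples,
    y: (N,) targets. For fixed (W, b) the model is linear in c, so the
    optimal c comes from a single linear solve.
    """
    H = np.maximum(W @ X.T + b[:, None], 0.0)    # hidden activations, (m, N)
    c, *_ = np.linalg.lstsq(H.T, y, rcond=None)  # linear least squares in c
    return c, H.T @ c                            # outer weights, fitted values
```

Alternating such exact linear solves with (Gauss–)Newton updates on the nonlinear hidden parameters is the general pattern such structure-exploiting methods follow.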