
Energy-Dissipative Methods in Numerical Analysis, Optimization and Deep Neural Networks for Gradient Flows and Wasserstein Gradient Flows

This thesis develops and integrates energy-dissipative methods, with applications spanning numerical analysis, optimization, and deep neural networks, primarily targeting gradient flows and porous medium equations. In optimization, we introduce the element-wise relaxed scalar auxiliary variable (E-RSAV) algorithm and establish its robustness and convergence through extensive numerical experiments. Complementing this, we design an Energy-Dissipative Evolutionary Deep Operator Neural Network (DeepONet) to numerically solve a suite of partial differential equations. By employing a dual-subnetwork structure together with the Scalar Auxiliary Variable (SAV) method, the network achieves accurate approximations of operators while upholding the Energy Dissipation Law, even when the training data comprise only the initial state. Lastly, we formulate first-order schemes tailored to Wasserstein gradient flows; these schemes are mass-conserving, uniquely solvable, positivity-preserving, and unconditionally energy-dissipative. Collectively, the innovations presented here offer promising pathways to efficient and accurate numerical solutions of both gradient flows and Wasserstein gradient flows, bridging the gap between traditional optimization techniques and modern neural-network methodologies.
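For orientation, the SAV approach named in the abstract replaces the nonlinear part of the energy E₁(u) with a scalar r(t) = √(E₁(u) + C), yielding schemes that are linear at each step yet unconditionally dissipate a modified energy. The sketch below is a minimal, standard first-order SAV scheme for a 1D Allen–Cahn-type gradient flow on a periodic domain; it is an illustration under assumed parameters (grid, time step, double-well potential, spectral solver), not the E-RSAV algorithm or the DeepONet developed in the thesis.

```python
import numpy as np

# Minimal sketch (assumed setup): first-order SAV scheme for the gradient flow
#   u_t = u_xx - F'(u),  F(u) = (1/4)(u^2 - 1)^2,  periodic 1D domain.
N, Lx, dt, C = 256, 2 * np.pi, 1e-2, 1.0          # grid size, domain, step, SAV shift
x = np.linspace(0, Lx, N, endpoint=False)
k = np.fft.fftfreq(N, d=Lx / N) * 2 * np.pi       # angular wavenumbers
dx = Lx / N

F  = lambda u: 0.25 * (u**2 - 1)**2               # double-well potential
dF = lambda u: u**3 - u

def E1(u):                                        # nonlinear part of the energy
    return np.sum(F(u)) * dx

def solve_A(f):                                   # apply (I - dt * d_xx)^{-1} via FFT
    return np.real(np.fft.ifft(np.fft.fft(f) / (1 + dt * k**2)))

u = 0.1 * np.cos(x)                               # illustrative initial state
r = np.sqrt(E1(u) + C)                            # scalar auxiliary variable

for n in range(1000):
    b  = dF(u) / np.sqrt(E1(u) + C)               # b^n = F'(u^n)/sqrt(E1(u^n)+C)
    u1 = solve_A(u)                               # A^{-1} u^n
    u2 = solve_A(b)                               # A^{-1} b^n
    # Eliminating u^{n+1} from the coupled linear system leaves one scalar
    # equation for r^{n+1}; the denominator is positive since A is SPD,
    # which gives unique solvability for any dt.
    rhs = r + 0.5 * np.sum(b * (u1 - u)) * dx
    r   = rhs / (1 + 0.5 * dt * np.sum(b * u2) * dx)
    u   = u1 - dt * r * u2                        # linear update for u^{n+1}
    # Standard SAV result: the modified energy 0.5*||u_x||^2 + r^2 - C
    # decays monotonically at every step, with no restriction on dt.
```

The appeal of the construction is visible in the loop: each step costs only two constant-coefficient solves plus a scalar update, yet the modified energy is dissipated unconditionally. Thesis-specific variants such as E-RSAV refine this basic template.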

Identifiers: 10.25394/pgs.24716472.v1, oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/24716472
Date: 05 December 2023
Creators: Shiheng Zhang (17540328)
Source Sets: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/thesis/Energy-Dissipative_Methods_in_Numerical_Analysis_Optimization_and_Deep_Neural_Networks_for_Gradient_Flows_and_Wasserstein_Gradient_Flows/24716472
