
Geometric Extensions of Neural Processes

Carr, Andrew Newberry, 18 May 2020
Neural Processes (NPs) are a class of regression models that learn a map from a set of input-output pairs to a distribution over functions. NPs are computationally tractable and offer a number of benefits over traditional nonlinear regression models. Despite these benefits, there are two main domains where NPs fail, and this thesis presents extensions of the Neural Process to both. The first is the extension of Neural Processes to graph and network data, which we call Graph Neural Processes (GNPs). A Graph Neural Process is a Neural Process that operates on graph data: it takes spectral information from the graph Laplacian as input and outputs a distribution over values. We demonstrate Graph Neural Processes on edge value imputation and discuss the benefits and drawbacks of the method for other application areas. The second extension concerns the fundamental training mechanism. NPs are traditionally trained by maximum likelihood, a probabilistic technique. We show that there are desirable classes of problems where NPs trained this way fail to learn, and that this drawback is resolved by using approximations of the Wasserstein distance. We give experimental justification for our method and demonstrate its performance. These Wasserstein Neural Processes (WNPs) maintain the benefits of traditional NPs while being able to approximate new classes of function mappings.
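To make the first idea concrete, here is a minimal sketch of the spectral inputs a Graph Neural Process might consume. The function names, the use of the symmetric normalized Laplacian, and the choice of k = 8 eigenvectors are illustrative assumptions, not the thesis's exact construction.

```python
import numpy as np

def laplacian_spectral_features(adj, k=8):
    """First k non-trivial eigenvectors of the symmetric normalized
    Laplacian, used here as per-node spectral inputs (illustrative)."""
    deg = adj.sum(axis=1)
    with np.errstate(divide="ignore"):
        d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    # L_sym = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(lap)          # eigenvalues in ascending order
    return eigvecs[:, 1:k + 1]                # drop the trivial 0-eigenvector

def edge_input_pairs(adj, k=8):
    """Inputs for edge value imputation: concatenated spectral
    features of the two endpoints of each edge."""
    feats = laplacian_spectral_features(adj, k)
    edges = np.argwhere(np.triu(adj, 1) > 0)  # undirected edges (i < j)
    return np.concatenate([feats[edges[:, 0]], feats[edges[:, 1]]], axis=1)
```

For the second extension, one common, cheap approximation of the Wasserstein distance is the sliced Wasserstein distance, which reduces the problem to sorted one-dimensional matchings along random projections; whether the thesis uses this particular approximation is an assumption of this sketch.

```python
def sliced_wasserstein(x, y, n_proj=50, seed=0):
    """Monte-Carlo approximation of the sliced Wasserstein-1 distance
    between two equal-sized sample sets x, y of shape (n, d)."""
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n_proj, x.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    px = np.sort(x @ dirs.T, axis=0)          # project, then sort per direction
    py = np.sort(y @ dirs.T, axis=0)          # 1-D OT reduces to sorted matching
    return np.abs(px - py).mean()
```

A loss of this form is differentiable almost everywhere in the samples, which is what makes it usable as a drop-in replacement for a likelihood objective during training.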

Quantifying implicit and explicit constraints on physics-informed neural processes

Haoyang Zheng, 30 April 2021
Due to strong interactions among the phases, and between the phases and the fluid motion, multiphase flows (MPFs) are complex enough that predicting the sequential patterns of their phases and motions takes considerable effort. The present work takes the physical constraints inherent in MPFs and enforces them on a physics-informed neural network (PINN) model either explicitly or implicitly, depending on the type of constraint. To predict the unobserved order parameters (OPs), which locate the phases, in future steps, conditional neural processes (CNPs) combined with long short-term memory (LSTM), together denoted CNP-LSTM, are applied to quickly infer the dynamics of the phases after encoding only a few observations. The multiphase consistent and conservative boundedness mapping (MCBOM) algorithm is then applied to correct the OPs predicted by CNP-LSTM so that mass conservation, the summation of the phase volume fractions to unity, the consistency of reduction, and the boundedness of the OPs are strictly satisfied. Next, the density of the fluid mixture is computed from the corrected OPs. The observed velocity and the density of the fluid mixture are then encoded in a physics-informed conditional neural process with long short-term memory (PICNP-LSTM), whose loss function includes the constraint of momentum conservation. Finally, the unobserved velocity in future steps is predicted by PICNP-LSTM. The proposed physics-informed neural processes (PINPs) model (CNP-LSTM-MCBOM-PICNP-LSTM) for MPFs avoids unphysical behavior of the OPs, accelerates convergence, and requires less data. The model successfully predicts several canonical MPF problems, i.e., the horizontal shear layer (HSL) and dam break (DB) problems, and its performance is validated.
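The correction step at the heart of this pipeline is straightforward to sketch. The snippet below is a deliberately simplified stand-in for MCBOM, assuming the OPs are represented as per-cell volume fractions in [0, 1]; the function names and the NumPy formulation are illustrative, and the full algorithm's mass-conservation and consistency-of-reduction machinery is omitted.

```python
import numpy as np

def bound_and_normalize(order_params, eps=1e-12):
    """Simplified stand-in for the MCBOM correction: clip the
    predicted order parameters to [0, 1] for boundedness, then
    rescale each cell so the volume fractions sum to unity. The full
    MCBOM algorithm additionally enforces mass conservation and the
    consistency of reduction, which this sketch omits."""
    op = np.clip(order_params, 0.0, 1.0)          # boundedness
    total = op.sum(axis=-1, keepdims=True)
    return op / np.maximum(total, eps)            # summation to unity

def mixture_density(volume_fractions, phase_densities):
    """Density of the fluid mixture: volume-fraction-weighted sum of
    the per-phase densities."""
    return volume_fractions @ phase_densities
```

The corrected fractions feed directly into the mixture-density computation that, per the abstract, supplies the density input to PICNP-LSTM.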
