Transformer neural networks have made a significant impact on natural language processing. The Transformer's self-attention mechanism effectively addresses the vanishing gradient problem that limits a network's learning capability, especially as time series grow longer or networks grow deeper. This dissertation examines the use of the Transformer model for time-series forecasting and customizes it for simultaneous multistep-ahead prediction in a surrogate model predictive control (MPC) application. The proposed method demonstrates enhanced control performance and computational efficiency compared to long short-term memory (LSTM)-based MPC and to one-step-ahead prediction model structures for both LSTM and Transformer networks. In addition to the Transformer, this research investigates hybrid machine-learning modeling. Machine learning models are known for superior function approximation capability given sufficient data; however, data of the quantity and quality needed to ensure prediction precision are often not readily available. The physics-informed neural network (PINN) is a hybrid modeling method that uses dynamic physics-based equations in training a standard machine learning model as a form of multi-objective optimization. This research applies the PINN approach to the Transformer, a state-of-the-art time-series neural network, providing a standard procedure for developing the Physics-Informed Transformer (PIT) and validating it with various case studies. This research also investigates the benefits of nonlinear model-based control and estimation algorithms for managed pressure drilling (MPD), presenting a new real-time high-fidelity flow model (RT-HFM) for bottom-hole pressure (BHP) regulation in MPD operations. Lastly, this dissertation presents details of an Arduino microcontroller temperature control lab as a benchmark for modeling and control methods.
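The PINN idea described above can be illustrated with a minimal sketch: a network is trained against a weighted sum of a data-fit loss and a physics-residual loss obtained by automatic differentiation. The example below is a generic toy problem (the ODE dy/dt = -y, not a model from the dissertation), and the architecture, weights, and hyperparameters are illustrative assumptions only.

```python
import torch

# Minimal PINN sketch (illustrative toy problem, not the dissertation's PIT):
# fit y(t) for the ODE dy/dt = -y by combining a data-fit loss with a
# physics-residual loss in one multi-objective training target.
torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

# Sparse "measurements" of the true solution y = exp(-t)
t_data = torch.tensor([[0.0], [1.0], [2.0]])
y_data = torch.exp(-t_data)

# Collocation points where the ODE residual is enforced
t_phys = torch.linspace(0.0, 2.0, 25).reshape(-1, 1).requires_grad_(True)

opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    data_loss = ((net(t_data) - y_data) ** 2).mean()
    y = net(t_phys)
    # dy/dt at the collocation points via automatic differentiation
    dy_dt = torch.autograd.grad(y.sum(), t_phys, create_graph=True)[0]
    phys_loss = ((dy_dt + y) ** 2).mean()   # residual of dy/dt = -y
    loss = data_loss + 1.0 * phys_loss      # physics weight is a tuning choice
    loss.backward()
    opt.step()
```

The physics term regularizes the fit between the sparse measurements, which is the mechanism PINNs use to compensate for limited data.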
Standard benchmarks are essential for comparing competing models and control methods, especially when a new method is proposed. A physical benchmark captures real process characteristics such as the requirement to meet a cycle time, discrete sampling intervals, communication overhead with the process, and model mismatch. Novel contributions of this work are (1) a new MPC system built on a Transformer time-series architecture, (2) a training method for time-series machine learning models that enables multistep-ahead prediction, (3) verification of a 15-fold solution-time improvement of the Transformer MPC over LSTM-based MPC, (4) physics-informed machine learning to improve extrapolation potential, and (5) two case studies that demonstrate hybrid modeling and benchmark performance criteria.
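Contribution (2), simultaneous multistep-ahead prediction, can be sketched in miniature: instead of predicting one step and feeding it back recursively over the horizon, the model maps a history window directly to all future steps in a single forward pass. The toy model and data below are assumptions for illustration (a small MLP on a sine series), not the dissertation's Transformer.

```python
import torch

# Sketch of simultaneous multistep-ahead training: a history window of
# length W maps directly to the next H steps, so all H steps are
# supervised at once and inference needs one forward pass per horizon.
torch.manual_seed(0)
W, H = 16, 8                       # history window, prediction horizon

series = torch.sin(torch.linspace(0.0, 12.0, 400))

# Build (window -> horizon) training pairs from the series
n = len(series) - W - H
X = torch.stack([series[i:i + W] for i in range(n)])
Y = torch.stack([series[i + W:i + W + H] for i in range(n)])

model = torch.nn.Sequential(
    torch.nn.Linear(W, 64), torch.nn.ReLU(), torch.nn.Linear(64, H)
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(1000):
    opt.zero_grad()
    loss = ((model(X) - Y) ** 2).mean()   # all H future steps at once
    loss.backward()
    opt.step()
```

For a surrogate MPC, a single-pass horizon prediction like this avoids the repeated model evaluations of recursive one-step forecasting, which is the source of the solution-time advantage claimed in contribution (3).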
Identifier | oai:union.ndltd.org:BGMYU2/oai:scholarsarchive.byu.edu:etd-10768 |
Date | 01 December 2022 |
Creators | Park, Junho |
Publisher | BYU ScholarsArchive |
Source Sets | Brigham Young University |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Theses and Dissertations |
Rights | https://lib.byu.edu/about/copyright/ |