  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Maximum entropy regularization for calibrating a time-dependent volatility function

Hofmann, Bernd, Krämer, Romy 26 August 2004 (has links) (PDF)
We investigate the applicability of the method of maximum entropy regularization (MER), including convergence and convergence rates of regularized solutions, to the specific inverse problem (SIP) of calibrating a purely time-dependent volatility function. In this context, we extend the results of [16] and [17] in some detail. Due to the explicit structure of the forward operator, based on a generalized Black-Scholes formula, the ill-posed character of the nonlinear inverse problem (SIP) can be verified. Numerical case studies illustrate the chances and limitations of (MER) versus Tikhonov regularization (TR) for smooth solutions and solutions with a sharp peak.
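The abstract above contrasts two penalty terms without giving an implementation. As a toy sketch of the distinction (the operator, data, and parameters below are illustrative placeholders, not the thesis's Black-Scholes forward operator): Tikhonov regularization penalizes the squared norm and has a closed-form solution, while maximum entropy regularization penalizes relative entropy against a reference element and keeps the solution in the positive orthant.

```python
import numpy as np

def tikhonov(A, b, alpha):
    # Closed-form solution of  min ||Ax - b||^2 + alpha * ||x||^2
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

def max_entropy(A, b, alpha, x0, steps=5000, lr=1e-3):
    # Gradient descent on  min ||Ax - b||^2 + alpha * sum_i x_i log(x_i / x0_i),
    # clipped to the positive orthant (the entropy term is only defined for x > 0).
    x = x0.copy()
    for _ in range(steps):
        grad = 2 * A.T @ (A @ x - b) + alpha * (np.log(x / x0) + 1)
        x = np.clip(x - lr * grad, 1e-10, None)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.abs(rng.standard_normal(10))      # positive "volatility-like" ground truth
b = A @ x_true + 0.01 * rng.standard_normal(20)

x_tik = tikhonov(A, b, alpha=0.1)             # may go negative
x_mer = max_entropy(A, b, alpha=0.1, x0=np.ones(10))  # positive by construction
```

A practical consequence visible even in this sketch: the MER reconstruction respects positivity (natural for a volatility function), which plain Tikhonov regularization does not enforce.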
2

Theories of Optimal Control and Transport with Entropy Regularization

Ito, Kaito 26 September 2022 (has links)
Kyoto University / Doctoral program (new system) / Doctor of Informatics / 甲第24263号 / 情博第807号 / 新制||情||136 (University Library) / Department of Applied Mathematics and Physics, Graduate School of Informatics, Kyoto University / (Examiners) Associate Professor Kenji Kashima, Professor Yoshito Ohta, Professor Nobuo Yamashita / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Informatics / Kyoto University / DGAM
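The record above carries only degree metadata, but the standard computational tool for the entropy-regularized optimal transport named in the title is the Sinkhorn algorithm. A minimal sketch under that assumption (the cost matrix and marginals are made up for illustration; the thesis itself may treat different formulations):

```python
import numpy as np

def sinkhorn(a, b, C, eps, iters=500):
    # Entropy-regularized optimal transport:
    #   min_P <P, C> + eps * sum_ij P_ij (log P_ij - 1)
    #   s.t.  P @ 1 = a,  P.T @ 1 = b
    # solved by alternating diagonal scalings of the Gibbs kernel.
    K = np.exp(-C / eps)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Uniform marginals on 4 points with a squared-distance cost
x = np.linspace(0.0, 1.0, 4)
C = (x[:, None] - x[None, :]) ** 2
a = b = np.full(4, 0.25)
P = sinkhorn(a, b, C, eps=0.05)   # transport plan satisfying both marginals
```

Smaller `eps` approaches the unregularized transport plan but slows convergence; larger `eps` blurs the plan toward the product of the marginals.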
3

Regularized Fine-tuning Strategies for Neural Language Models: Application of entropy regularization on GPT-2

Hong, Jae Eun January 2022 (has links)
Deep neural language models like GPT-2 are undoubtedly strong at text generation, but they often require special decoding strategies to prevent degenerate output, namely repetition. The maximum-likelihood training objective yields a peaked probability distribution, leading to over-confident neural networks. In this thesis, we explore entropy regularization for a neural language model, which can smooth the peaked output distribution during fine-tuning of GPT-2. We first define the models in three ways: (1) an out-of-the-box model without fine-tuning, (2) a fine-tuned model without entropy regularization, and (3) a fine-tuned model with entropy regularization. To investigate the effect of domain, we also split the data in three ways: (1) fine-tuned on a heterogeneous dataset and tested on a heterogeneous dataset, (2) fine-tuned on a homogeneous dataset and tested on a homogeneous dataset, and (3) fine-tuned on a heterogeneous dataset and tested on a homogeneous dataset. For entropy regularization, we experiment with the entropy strength parameter (𝛽) over the values {0.5, 1.0, 2.0, 4.0, 6.0} and with annealing the parameter during fine-tuning. Our findings show that entropy-based regularization during fine-tuning improves text generation models by significantly reducing the repetition rate without tuning the decoding strategies. Comparing the probabilities of human-generated sentence tokens, we observed that entropy regularization compensates for the shortcomings of deterministic decoding (beam search), which mostly selects a few high-probability words. Various studies have explored entropy regularization in the cold-start training of neural networks, but few cover its effect in the fine-tuning stage of text generation with large-scale pre-trained language models.
Our findings present strong evidence that entropy regularization, a highly cost-effective approach, can achieve significant improvement in text generation during fine-tuning.
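The objective described above (maximum likelihood penalized toward higher output entropy) can be sketched in a few lines of NumPy. This is an illustrative stand-in, not the thesis's GPT-2 training code; the function name and example logits are hypothetical.

```python
import numpy as np

def entropy_regularized_loss(logits, target, beta):
    """Negative log-likelihood minus beta times the output distribution's entropy.

    Subtracting the entropy rewards flatter distributions, counteracting the
    over-confidence that maximum-likelihood training induces. `beta` plays the
    role of the entropy strength parameter; annealing would decay it over steps.
    """
    z = logits - logits.max()              # numerically stable softmax
    p = np.exp(z) / np.exp(z).sum()
    nll = -np.log(p[target])               # standard maximum-likelihood term
    entropy = -(p * np.log(p)).sum()       # Shannon entropy of the prediction
    return nll - beta * entropy

peaked_logits = np.array([5.0, 1.0, 0.5, 0.2])   # an over-confident prediction
loss_plain = entropy_regularized_loss(peaked_logits, target=0, beta=0.0)
loss_reg = entropy_regularized_loss(peaked_logits, target=0, beta=2.0)
```

Because the entropy of any non-degenerate distribution is positive, a larger `beta` lowers the loss for flatter outputs, so gradient descent is pushed away from peaked, repetition-prone predictions.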
5

Magnetic field effects in chemical systems

Rodgers, Christopher T. January 2007 (has links)
Magnetic fields influence the rate and/or yield of chemical reactions that proceed via spin-correlated radical pair intermediates. The field of spin chemistry centres around the study of such magnetic field effects (MFEs). This thesis is particularly concerned with the effects of weak magnetic fields (B₀ ~ 1 mT) relevant to the ongoing debates on the mechanism by which animals sense the geomagnetic field and on the putative health effects of environmental electromagnetic fields. Relatively few previous studies have dealt with such weak magnetic fields. This thesis presents several new theoretical tools and applies them to interpret experimental measurements. Chapter 1 surveys the development and theory of spin chemistry. Chapter 2 introduces the use of Tikhonov and Maximum Entropy Regularisation methods as a new means of analysing MARY field effect data. These are applied to recover details of the diffusive motion of reacting pyrene and N,N-dimethylaniline radicals. Chapter 3 gives a fresh derivation and appraisal of an approximate, semiclassical approach to MFEs. Monte Carlo calculations allow the elucidation of several "rules of thumb" for interpreting MFE data. Chapter 4 discusses recent optically detected zero-field EPR measurements, adapting the gamma-COMPUTE algorithm from solid-state NMR for their interpretation. Chapter 5 explores the role of RF polarisation in producing MFEs; the breakdown in weak fields of the familiar rotating-frame approximation is analysed. Chapter 6 reviews current knowledge and landmark experiments in the area of animal magnetoreception, with particular attention to the origins of the sensitivity of European robins Erithacus rubecula to the Earth's magnetic field. In Chapter 7, Schulten and Ritz's hypothesis that avian magnetoreception is founded on a radical pair mechanism (RPM) reaction is appraised through calculations in model systems. Chapter 8 introduces quantitative methods of analysing anisotropic magnetic field effects using spherical harmonics. Chapter 9 considers recent observations that European robins may sometimes be disoriented by minuscule RF fields; these are shown to be consistent with magnetoreception via a radical pair with no (effective) magnetic nuclei in one of the radicals.
