1

On the metastability of the Standard Model

Baum, Sebastian. January 2015
With the discovery of a particle consistent with the Standard Model (SM) Higgs at the Large Hadron Collider (LHC) at CERN in 2012, the final ingredient of the SM has been found. The SM provides us with a powerful description of the physics of fundamental particles, holding up at all energy scales we can probe with accelerator-based experiments. However, astrophysics and cosmology show us that the SM is not the final answer: it fails, for example, to describe dark matter and massive neutrinos. Like any non-trivial quantum field theory, the SM must be subjected to a so-called renormalization procedure in order to extrapolate the model between different energy scales. In this context, new problems of a more theoretical nature arise, such as the famous hierarchy problem of the Higgs mass. Renormalization also leads to what is known as the metastability problem of the SM: assuming the particle found at the LHC is the SM Higgs boson, the potential develops, at energy scales below the Planck scale, a second minimum deeper than the electroweak one in which we live. Absolute stability all the way up to the Planck scale is excluded at a confidence level of about 98 %. For the central experimental values of the SM parameters, the instability occurs at scales larger than ~ 10¹⁰ GeV. One can take two viewpoints regarding this instability: if the SM is assumed to be valid all the way up to the Planck scale, the problem does not necessarily lead to an inconsistency with our existence. If we assume our universe to have ended up in the electroweak minimum after the Big Bang, the probability that it would have transitioned to its true minimum during the lifetime of the universe is spectacularly small. If, on the other hand, we demand absolute stability, new physics must modify the SM at or below the instability scale of ~ 10¹⁰ GeV, and we can explore what hints the instability might provide about this new physics. In this work, the metastability problem of the SM and its possible implications are revisited. We give an introduction to the technique of renormalization and apply it to the SM. We then discuss the stability of the SM potential and the hints it might provide about new physics at large scales. / The Standard Model of particle physics is our best description of the physics of elementary particles. In 2012, a new scalar boson compatible with being the Higgs boson, the last missing piece of the Standard Model, was found at the Large Hadron Collider (LHC) at CERN. But even though the Standard Model gives us a very precise description of all the physics we see in particle accelerators, we know from astroparticle physics and cosmology that it cannot be the whole answer. For example, the Standard Model describes neither dark matter nor the neutrino masses. Like any quantum field theory, the Standard Model must be renormalized to obtain a description that works at different energy scales. When the Standard Model is renormalized, new and more theoretical problems appear, such as the well-known hierarchy problem of the Higgs mass. Renormalization also leads to what is called the metastability problem: at higher energy scales, the Higgs potential develops a minimum deeper than the electroweak minimum in which we live. If we assume that the particle found at CERN is the Standard Model Higgs boson, absolute stability is excluded at 98 % confidence. For the central experimental measurements of the Standard Model parameters, the instability arises at scales above ~ 10¹⁰ GeV.
There are two ways to interpret the stability problem: if we assume that the Standard Model is the correct theory all the way up to the Planck scale, we can in fact still exist. If we assume that the universe ended up in the electroweak minimum after the Big Bang, the probability that it has transitioned to its true minimum within the lifetime of the universe is practically zero. In other words, we may be living in a metastable universe. If, on the other hand, we require the potential to be absolutely stable, some new physics must modify the Standard Model at or below the instability scale ~ 10¹⁰ GeV. In that case, we can ask what hints the stability problem might give us about the new physics. This thesis describes the metastability problem of the Standard Model. We give an introduction to renormalization and apply the technique to the Standard Model. We then discuss the stability of the Standard Model potential and what hints the problem may give us about new physics.
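To make the quoted numbers more concrete, here is a schematic sketch of the mechanism, stated in the common convention V = m²|H|² + λ|H|⁴ and not taken from the thesis itself: at large field values h, the effective potential is approximately V_eff(h) ≈ λ(μ = h) h⁴/4, so stability reduces to whether the running quartic coupling λ stays positive. Keeping only the dominant top-Yukawa (y_t) and electroweak gauge (g, g′) contributions, the one-loop running is roughly

\[
  16\pi^2\,\frac{d\lambda}{d\ln\mu} \simeq 24\lambda^2 + 12\lambda y_t^2 - 6 y_t^4 - 3\lambda\left(3g^2 + g'^2\right) + \frac{3}{8}\left[2g^4 + \left(g^2 + g'^2\right)^2\right].
\]

The large negative term proportional to y_t⁴ from the heavy top quark drives λ negative at high scales; the scale Λ_I at which λ(Λ_I) = 0 is the instability scale, and for central measured values of the top and Higgs masses it comes out near the ~ 10¹⁰ GeV quoted in the abstract.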
2

Efficient and Online Deep Learning through Model Plasticity and Stability

January 2020
abstract: The rapid advancement of Deep Neural Networks (DNNs), computing, and sensing technology has enabled many new applications, such as self-driving vehicles, surveillance drones, and robotic systems. Compared to conventional edge devices (e.g., cell phones or smart home devices), these emerging devices must deal with much more complicated and dynamic situations in real time with bounded computation resources. This raises several challenges, including but not limited to efficiency, real-time adaptation, model stability, and automation of architecture design. To tackle these challenges, model plasticity and stability are leveraged to achieve efficient and online deep learning, especially in the scenario of learning from streaming data at the edge. First, a dynamic training scheme named Continuous Growth and Pruning (CGaP) is proposed to compress DNNs by growing important parameters and pruning unimportant ones, achieving up to a 98.1% reduction in the number of parameters. Second, this dissertation presents Progressive Segmented Training (PST), which targets the catastrophic forgetting problem in continual learning through importance sampling, model segmentation, and memory-assisted balancing. PST achieves state-of-the-art accuracy with a 1.5X FLOPs reduction over the complete inference path. Third, to facilitate online learning in real applications, acquisitive learning (AL) is further proposed to emphasize both knowledge inheritance and acquisition: the majority of the knowledge is first pre-trained in the inherited model and then adapted to acquire new knowledge. The stability of the inherited model is monitored via noise injection and the landscape of the loss function, while acquisition is realized by importance sampling and model segmentation. Compared to a conventional scheme, AL reduces the accuracy drop by more than 10X on the CIFAR-100 dataset, with a 5X reduction in latency per training image and a 150X reduction in training FLOPs. Finally, this dissertation presents evolutionary neural architecture search in light of model stability (ENAS-S). ENAS-S uses a novel fitness score, which accounts for not only accuracy but also model stability, to search for an optimal inherited model for continual learning. ENAS-S outperforms hand-designed DNNs when learning from a data stream at the edge. In summary, this dissertation presents several algorithms exploiting model plasticity and model stability to improve the efficiency and accuracy of deep neural networks, especially in the continual learning scenario. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2020
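As a rough illustration of the grow-and-prune idea behind CGaP, the following is a minimal sketch, not the algorithm from the dissertation: the function name prune_and_grow, the prune_frac/grow_frac fractions, and the small re-initialisation value are all illustrative choices. It performs one step of magnitude-based pruning plus gradient-guided regrowth on a single weight matrix.

    import numpy as np

    def prune_and_grow(weights, grads, prune_frac=0.1, grow_frac=0.1):
        """One illustrative grow-and-prune step on a dense weight matrix.

        Pruning: zero out the smallest-magnitude active weights.
        Growth: re-activate zeroed positions where the gradient magnitude
        is largest, shifting capacity toward important connections.
        """
        w = np.array(weights, dtype=float, copy=True)
        flat = w.ravel()                       # view into w
        g = np.asarray(grads, dtype=float).ravel()

        # Prune: drop the smallest-|w| fraction of currently active weights.
        active = np.flatnonzero(flat != 0.0)
        n_prune = int(prune_frac * active.size)
        if n_prune > 0:
            smallest = active[np.argsort(np.abs(flat[active]))[:n_prune]]
            flat[smallest] = 0.0

        # Grow: revive inactive positions with the largest |gradient|.
        inactive = np.flatnonzero(flat == 0.0)
        n_grow = min(int(grow_frac * flat.size), inactive.size)
        if n_grow > 0:
            revived = inactive[np.argsort(-np.abs(g[inactive]))[:n_grow]]
            flat[revived] = 1e-3               # small re-initialisation

        return w

    # Toy usage on a random 8x8 layer.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(8, 8))
    G = rng.normal(size=(8, 8))
    W_next = prune_and_grow(W, G)

Repeating such steps during training shifts capacity toward connections that matter, which is the sense in which important parameters are grown and unimportant ones pruned.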
