
Machine learning multi-stage classification and regression in the search for vector-like quarks and the Neyman construction in signal searches

<p> A search for vector-like quarks (VLQs) decaying to a Z boson using multi-stage machine learning was compared to a search using a standard square-cuts strategy. VLQs are predicted by several theories beyond the Standard Model. The searches used 20.3 inverse femtobarns of proton-proton collisions at a center-of-mass energy of 8 TeV collected with the ATLAS detector in 2012 at the CERN Large Hadron Collider. CLs upper limits on the production cross sections of vector-like top and bottom quarks were computed for VLQs produced singly or in pairs, <i>T<sub>single</sub>, B<sub>single</sub>, T<sub>pair</sub>,</i> and <i>B<sub>pair</sub>.</i> The two-stage machine learning classification search strategy did not provide any improvement over the standard square-cuts strategy, but for <i>T<sub>pair</sub>, B<sub>pair</sub>,</i> and <i>T<sub>single</sub>,</i> a third stage of machine learning regression was able to lower the upper limits at high signal masses by as much as 50%. Additionally, new test statistics were developed for use in the Neyman construction of confidence regions in order to address deficiencies in current frequentist methods, such as the generation of empty-set confidence intervals. A new method for treating nuisance parameters was also developed that may provide better coverage properties than current methods used in particle searches. Finally, significance ratio functions were derived that allow a more nuanced interpretation of the evidence provided by measurements than is given by confidence intervals alone.</p>
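
The Neyman construction referenced above proceeds by building, for each hypothesized parameter value, an acceptance region for the observable and then collecting every parameter value whose acceptance region contains the observation. The sketch below illustrates this general procedure for a simple Poisson counting experiment with a known background; the function names, the central-interval ordering rule, and the scan range are illustrative assumptions and do not reproduce the test statistics or nuisance-parameter treatment developed in the dissertation.

```python
# Minimal sketch of a Neyman construction for a Poisson counting experiment
# with a known background (illustrative only; not the dissertation's method).
import numpy as np
from scipy.stats import poisson

def acceptance_interval(mu, alpha=0.05):
    """Central acceptance interval [n_lo, n_hi] for a Poisson mean mu,
    cut at the alpha/2 and 1 - alpha/2 quantiles so that the interval
    holds probability of at least 1 - alpha."""
    n_lo = int(poisson.ppf(alpha / 2.0, mu))
    n_hi = int(poisson.ppf(1.0 - alpha / 2.0, mu))
    return n_lo, n_hi

def neyman_interval(n_obs, background, alpha=0.05, s_max=50.0, steps=5001):
    """Scan signal strengths s and keep those whose acceptance interval
    contains the observed count; the kept set is the confidence interval."""
    accepted = []
    for s in np.linspace(0.0, s_max, steps):
        n_lo, n_hi = acceptance_interval(s + background, alpha)
        if n_lo <= n_obs <= n_hi:
            accepted.append(s)
    if not accepted:
        # With some orderings the construction can return an empty set,
        # one of the deficiencies the abstract refers to.
        return None
    return min(accepted), max(accepted)

# Example: 3 events observed on an expected background of 2.5
print(neyman_interval(n_obs=3, background=2.5))
```

A one-sided variant that orders the acceptance region from below yields upper limits of the kind quoted for the cross sections; the choice of ordering rule is one place where alternative test statistics, such as those developed in the dissertation, can change the resulting intervals.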

Identifier: oai:union.ndltd.org:PROQUEST/oai:pqdtoai.proquest.com:10247665
Date: 16 December 2016
Creators: Leone, Robert Matthew
Publisher: The University of Arizona
Source Sets: ProQuest.com
Language: English
Detected Language: English
Type: thesis
