121 |
Study of Triple-GEM detectors for the CMS muon spectrometer upgrade at LHC and study of the forward-backward charge asymmetry for the search of extra neutral gauge bosons
Zenoni, Florian, 27 April 2016
Cette thèse de doctorat a pour cadre l’expérience CMS auprès du grand collisionneur de protons du CERN, le LHC. Le LHC, qui a permis la découverte en 2012 du boson de Brout-Englert-Higgs, est destiné à fonctionner pour encore 20 ans, avec une luminosité qui croîtra progressivement pour atteindre d’ici 2025 la valeur de 7.5 x 10^34 cm^-2 s^-1, c’est-à-dire environ cinq fois la valeur initialement prévue. Ceci a pour conséquence que les expériences doivent s’adapter et mettre à niveau une série de leurs composants et détecteurs. Une des prochaines mises à niveau de l’expérience CMS concerne les détecteurs Triple Gas Electron Multiplier (GEM) qui sont actuellement en développement pour la partie avant du spectromètre à muons de l’expérience. Ces détecteurs seront installés dans CMS durant le deuxième long arrêt du LHC, en 2018-2019, appelé LS2. Cette mise à niveau a pour but de contrôler les taux de déclenchement d’événements pour la détection de muons, grâce à la haute performance de ces détecteurs Triple GEM en présence de taux de particules extrêmement élevés (>1 kHz/cm^2). De plus, grâce à sa très bonne résolution spatiale (~250 um), la technologie GEM peut améliorer la reconstruction des traces de muons et la capacité d’identification du détecteur avant. Le but de mon travail de recherche est d’estimer la sensibilité des Triple GEMs à l’environnement de radiation hostile dans CMS, essentiellement composé de neutrons et de photons produits lors des interactions entre les particules et les détecteurs constituant l’expérience CMS. L’estimation précise de cette sensibilité est très importante, car une sous-estimation pourrait avoir des effets désastreux pour l’efficacité des Triple GEMs une fois installés dans CMS. Pour valider mes simulations, j’ai également reproduit des résultats expérimentaux obtenus avec d’autres détecteurs similaires déjà installés dans CMS, tels que les Resistive Plate Chambers (RPC). La deuxième partie de mon travail concerne l’étude de la capacité de l’expérience CMS à discerner différents modèles de nouvelle physique prédisant l’existence de bosons vecteurs, appelés Z'. Ces modèles font partie des extensions plausibles du Modèle Standard. En particulier, l’analyse se concentre sur des simulations dans lesquelles le Z' se désintègre en deux muons, et sur l’impact que les mises à niveau avec les détecteurs Triple GEM apporteront à ces mesures tout au long de la phase de haute intensité du LHC. Mes simulations montrent que plus de 20% des événements simulés comptent au moins un muon dans la région en pseudo-rapidité (eta) de CMS couverte par les détecteurs Triple GEM. Les résultats préliminaires démontrent que, dans le cas de modèles à 3 TeV/c^2, il sera possible dès la fin de la Phase I de distinguer un Z'I d'un Z'SSM avec une signification supérieure à 3 sigma. / This PhD thesis takes place in the CMS experiment at CERN's Large Hadron Collider (LHC). The LHC allowed the discovery of the Brout-Englert-Higgs boson in 2012, and is designed to run for another 20 years, with an increasing luminosity that will reach by 2025 a value of 7.5 x 10^34 cm^-2 s^-1, that is, about five times the value initially foreseen. As a consequence, the experiments must adapt and upgrade many of their components and particle detectors. One of the foreseen upgrades of the CMS experiment concerns the Triple Gas Electron Multiplier (GEM) detectors, currently in development for the forward muon spectrometer.
These detectors will be installed in CMS during the second long LHC shutdown (LS2), in 2018-2019. The aim of this upgrade is to better control the event trigger rate at Level 1 for muon detection, thanks to the high performance of these Triple GEM detectors in the presence of very high particle rates (>1 kHz/cm^2). Moreover, thanks to its excellent spatial resolution (~250 um), the GEM technology can improve the muon track reconstruction and the identification capability of the forward detector. The goal of my research is to estimate the sensitivity of Triple GEMs to the hostile background radiation in CMS, essentially made of neutrons and photons generated by the interactions between the particles and the CMS detectors. The accurate evaluation of this sensitivity is very important, as an underestimation could have ruinous effects on the Triple GEM efficiency once they are installed in CMS. To validate my simulations, I have reproduced experimental results obtained with similar detectors already installed in CMS, such as the Resistive Plate Chambers (RPC). The second part of my work regards the study of the capability of the CMS experiment to discriminate between different models of new physics predicting the existence of neutral vector bosons called Z'. These models belong to plausible extensions of the Standard Model. In particular, the analysis is focused on simulated samples in which the Z' decays into two muons, and on the impact that the Triple GEM detector upgrades will bring to these measurements during the high-luminosity phase of the LHC, called Phase II. My simulations show that more than 20% of the simulated events contain at least one muon in the CMS pseudo-rapidity (eta) region covered by the Triple GEM detectors. Preliminary results show that, in the case of 3 TeV/c^2 models, it will be possible already at the end of Phase I to discriminate a Z'I from a Z'SSM with a significance greater than 3 sigma. / Doctorat en Sciences / info:eu-repo/semantics/nonPublished
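As an illustration of the kind of observable behind the Z' discrimination quoted above, the sketch below computes a forward-backward charge asymmetry and its statistical uncertainty from dimuon event counts. It is a generic example with invented numbers, not code or results from the thesis.

```python
import math

def forward_backward_asymmetry(n_forward, n_backward):
    """A_FB = (N_F - N_B) / (N_F + N_B) with a binomial statistical uncertainty."""
    total = n_forward + n_backward
    a_fb = (n_forward - n_backward) / total
    sigma = math.sqrt((1.0 - a_fb ** 2) / total)
    return a_fb, sigma

# Hypothetical dimuon counts in a mass window around a 3 TeV/c^2 resonance
a_fb, sigma = forward_backward_asymmetry(n_forward=620, n_backward=480)
print(f"A_FB = {a_fb:.3f} +/- {sigma:.3f}")

# Separation between two hypothetical model predictions, in units of sigma
a_fb_model_a, a_fb_model_b = 0.05, 0.15  # invented benchmark values
print(f"separation: {abs(a_fb_model_a - a_fb_model_b) / sigma:.1f} sigma")
```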
|
122 |
Model-independent searches for New Physics using Machine Learning at the ATLAS experiment / Recherche de Nouvelle Physique indépendante d'un modèle en utilisant l’apprentissage automatique sur l’expérience ATLAS
Jimenez, Fabricio, 16 September 2019
Nous abordons le problème de la recherche indépendante du modèle pour la Nouvelle Physique (NP) au Grand Collisionneur de Hadrons (LHC) en utilisant le détecteur ATLAS. Une attention particulière est accordée au développement et à la mise à l'essai de nouvelles techniques d'apprentissage automatique à cette fin. Ce travail présente trois résultats principaux. Tout d'abord, nous avons mis en place un système de surveillance automatique des signatures génériques au sein de TADA, un outil logiciel d'ATLAS. Nous avons exploré plus de 30 signatures au cours de la période de collecte des données de 2017 et aucune anomalie particulière n'a été observée par rapport aux simulations des processus du modèle standard. Deuxièmement, nous proposons une méthode collective de détection des anomalies pour les recherches de NP indépendantes du modèle au LHC. Nous proposons une approche paramétrique qui utilise un algorithme d'apprentissage semi-supervisé. Cette approche utilise une vraisemblance pénalisée et est capable d'effectuer simultanément une sélection appropriée des variables et de détecter un possible comportement anormal collectif dans les données par rapport à un échantillon de fond donné. Troisièmement, nous présentons des études préliminaires sur la modélisation du bruit de fond et la détection de signaux génériques dans des spectres de masse invariante à l'aide de processus gaussiens (GPs) sans information préalable sur la moyenne. Deux méthodes ont été testées dans deux ensembles de données : une procédure en deux étapes dans un ensemble de données tiré des simulations du modèle standard utilisé pour l'ATLAS General Search, dans le canal contenant deux jets à l'état final, et une procédure en trois étapes dans un ensemble de données simulées pour le signal (Z′) et le fond (modèle standard) dans la recherche de résonances dans le spectre de masse invariante de la paire de quarks top. Notre étude est une première étape vers une méthode qui utilise les GPs comme outil de modélisation pouvant être appliqué à plusieurs signatures dans une configuration plus indépendante du modèle. / We address the problem of model-independent searches for New Physics (NP) at the Large Hadron Collider (LHC) using the ATLAS detector. Particular attention is paid to the development and testing of novel Machine Learning techniques for that purpose. The present work presents three main results. Firstly, we put in place a system for automatic generic signature monitoring within TADA, a software tool from ATLAS. We explored over 30 signatures in the data-taking period of 2017, and no particular discrepancy was observed with respect to simulations of Standard Model processes. Secondly, we propose a collective anomaly detection method for model-independent searches for NP at the LHC. We propose a parametric approach that uses a semi-supervised learning algorithm. This approach uses a penalized likelihood and is able to simultaneously perform appropriate variable selection and detect possible collective anomalous behavior in data with respect to a given background sample. Thirdly, we present preliminary studies on modeling the background and detecting generic signals in invariant mass spectra using Gaussian processes (GPs) with no prior information on the mean.
Two methods were tested on two datasets: a two-step procedure on a dataset drawn from the Standard Model simulations used for the ATLAS General Search, in the channel containing two jets in the final state, and a three-step procedure on a dataset simulated for signal (Z′) and background (Standard Model) in the search for resonances in the top-pair invariant mass spectrum. Our study is a first step towards a method that takes advantage of GPs as a modeling tool that can be applied to several signatures in a more model-independent setup.
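The Gaussian-process background idea mentioned above can be sketched with a generic fit of a smooth, falling mass spectrum. The snippet below uses scikit-learn and a toy spectrum; the kernel choice and all numbers are assumptions for illustration, not the configuration used in the thesis.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(0)

# Toy falling invariant-mass spectrum (counts per bin), purely illustrative
mass = np.linspace(200.0, 2000.0, 60)            # GeV
expected = 5.0e4 * np.exp(-mass / 300.0)
counts = rng.poisson(expected).astype(float)

# Smooth RBF kernel for the background shape plus a white-noise term
kernel = ConstantKernel(1.0) * RBF(length_scale=300.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(mass.reshape(-1, 1), counts)

prediction, std = gp.predict(mass.reshape(-1, 1), return_std=True)

# Bins deviating from the smooth GP background by more than 3 standard deviations
pulls = (counts - prediction) / np.maximum(std, 1e-9)
print("bins with |pull| > 3:", mass[np.abs(pulls) > 3.0])
```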
|
123 |
Search for dark matter produced in association with a Z boson in the ATLAS detector at the Large Hadron Collider
McLean, Kayla Dawn, 01 March 2021
This dissertation presents a search for dark matter particles produced in association with a Z boson in proton-proton collisions. The dataset consists of 139 fb^{-1} of collision events with a centre-of-mass energy of 13 TeV, collected by the ATLAS detector from 2015 to 2018 at the Large Hadron Collider. Signal region events are required to contain a Z boson that decays leptonically to either e^+e^- or μ^+μ^-, and a significant amount of missing transverse momentum, which indicates the presence of undetected particles. Two types of dark matter models are studied: (1) simplified models with an s-channel axial-vector or vector mediator that couples to dark matter Dirac fermions, and (2) two-Higgs-doublet models with an additional pseudo-scalar that couples to dark matter Dirac fermions. The main Standard Model background sources are ZZ, WZ, non-resonant l^+l^-, and Z+jets processes, which are estimated using a combination of data and/or simulation. A new reweighting technique is developed for estimating the Z+jets background using γ+jets events in data; the resulting estimate significantly improves on the statistical and systematic errors compared to the estimate obtained from simulation. The observed data in the signal region are compared to the Standard Model prediction using a transverse mass discriminant distribution. No significant excess in data is observed for the simplified models and two-Higgs-doublet models studied. A statistical analysis is performed and several exclusion limits are set on the parameters of the dark matter models. Results are compared to direct detection experiments, the CMS experiment, and other ATLAS searches. Prospects and improvements for future iterations of the search are also presented. / Graduate
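The γ+jets reweighting mentioned above can be illustrated generically as deriving per-bin transfer weights from the ratio of Z(→ll)+jets to γ+jets yields in a kinematic variable such as the boson transverse momentum. The histograms and binning below are invented for illustration; they are not the analysis implementation or its numbers.

```python
import numpy as np

# Hypothetical boson-pT histograms (event yields per bin) from control samples
pt_edges = np.array([100.0, 150.0, 200.0, 300.0, 500.0, 1000.0])     # GeV
n_zjets = np.array([12000.0, 5400.0, 3100.0, 900.0, 120.0])          # Z(->ll)+jets
n_gamma = np.array([480000.0, 230000.0, 140000.0, 45000.0, 7000.0])  # gamma+jets

# Per-bin transfer weights: w(pT) = N_Z(pT) / N_gamma(pT)
weights = n_zjets / n_gamma

def zjets_weight(boson_pt):
    """Transfer weight applied to a gamma+jets event with the given boson pT."""
    index = np.clip(np.digitize(boson_pt, pt_edges) - 1, 0, len(weights) - 1)
    return weights[index]

# Each weighted gamma+jets event then enters the Z+jets background estimate
print(zjets_weight(np.array([120.0, 260.0, 410.0, 750.0])))
```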
|
124 |
[pt] ESTUDOS DE SENSIBILIDADE PARA VIOLAÇÃO DE CARGA-PARIDADE NOS DECAIMENTOS D+ -> K-K+Π+ E D+ -> Π-Π+Π+ NO EXPERIMENTO LHCB / [en] SENSITIVITY STUDIES FOR CHARGE-PARITY VIOLATION IN THE DECAYS D+ -> K-K+Π+ AND D+ -> Π-Π+Π+ IN THE LHCB EXPERIMENT
LUCAS NICHOLAS FALCAO FERREIRA, 11 January 2022
[pt] Na última década, desde o início do funcionamento do LHC, o Modelo Padrão de física de partículas vem sendo posto à prova com precisão sem precedentes, com enorme êxito. Um de seus experimentos é o LHCb, dedicado à física dos hádrons contendo os quarks beauty e charm. Um dos importantes temas de pesquisa do LHCb é o estudo de efeitos de assimetria partícula-antipartícula em processos de decaimento, devido à chamada violação de Carga-Paridade (CP). A violação de CP é prevista pelo Modelo Padrão e, em decaimentos envolvendo o quark charm, pode ocorrer em certos processos chamados de suprimidos por Cabibbo. No entanto, este efeito é muito pequeno, da ordem de 0.1 porcento. Esta pequenez faz com que o ambiente de decaimentos charmosos seja atraente para a busca por física além do Modelo Padrão. O objetivo deste trabalho é o estudo de sensibilidade para violação de CP nos canais D+ -> K-K+π+ e D+ -> π-π+π+ no run II do LHCb. Através de uma representação do espaço de fase desses decaimentos, chamada de Dalitz Plot, e da utilização do método de Mirandizing, que se baseia em procurar significâncias locais na diferença da distribuição de eventos nos Dalitz Plots de partícula e antipartícula, pode-se buscar por assimetrias de carga que indicariam efeitos de violação de CP nestes decaimentos. Baseando-se nas estatísticas dos dados tomados entre 2016 e 2018 no LHCb, foram desenvolvidos pseudoexperimentos, via método de Monte Carlo, visando reproduzir a dinâmica dos dados reais e inserindo pequenos efeitos de violação de CP. Verificamos que há sensibilidade para a violação de CP com efeitos da ordem de 10^-3 em algumas situações, o que condiz com as expectativas do Modelo Padrão e indica a possibilidade de observação de violação de CP nos dados reais do run II. / [en] In the last decade, since the beginning of the operation of the LHC, the Standard Model of particle physics has been tested with unprecedented precision, with enormous success. One of its experiments is LHCb, dedicated to the physics of hadrons containing the beauty and charm quarks. One of the important research topics of LHCb is the study of the effects of particle-antiparticle asymmetry in decay processes, due to the so-called charge-parity (CP) violation. CP violation is predicted by the Standard Model and, in decays involving the charm quark, it can occur in certain processes called Cabibbo-suppressed. However, this effect is very small, of the order of 0.1 percent. This smallness makes the environment of charm decays attractive for searches for physics beyond the Standard Model. The aim of this work is the study of the sensitivity to CP violation in the channels D+ -> K-K+π+ and D+ -> π-π+π+ in run II of LHCb. Through a representation of the phase space of these decays, called the Dalitz plot, and the use of the Mirandizing method, which is based on looking for local significances in the difference between the distributions of events in the particle and antiparticle Dalitz plots, one can search for charge asymmetries that would indicate effects of CP violation in these decays. Based on the data statistics taken between 2016 and 2018 at LHCb, pseudo-experiments were performed, using the Monte Carlo method, aiming to reproduce the dynamics of the real data and inserting small effects of CP violation. We found that there is sensitivity to CP violation with effects of the order of 10^-3 in some situations, which is consistent with the expectations of the Standard Model, indicating a possibility of observing CP violation in the real data from run II.
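The per-bin significance at the heart of the Mirandizing technique described above has a standard form; the sketch below applies it to toy Dalitz-plot bin contents. The counts and binning are invented, so this only illustrates the method, not the thesis results.

```python
import numpy as np
from scipy.stats import chi2

def s_cp(n_plus, n_minus):
    """Per-bin significance of the D+ vs D- difference (Miranda method):
    S_CP_i = (N+_i - alpha*N-_i) / sqrt(N+_i + alpha^2*N-_i), alpha = sum(N+)/sum(N-)."""
    alpha = n_plus.sum() / n_minus.sum()
    return (n_plus - alpha * n_minus) / np.sqrt(n_plus + alpha ** 2 * n_minus)

# Toy Dalitz-plot bin contents for D+ and D- candidates, purely illustrative
rng = np.random.default_rng(1)
expected = rng.uniform(200.0, 2000.0, size=100)
n_plus = rng.poisson(expected).astype(float)
n_minus = rng.poisson(expected).astype(float)

s = s_cp(n_plus, n_minus)
# With no CP violation, sum(S_CP^2) follows a chi-square distribution
chi2_value = float(np.sum(s ** 2))
ndf = len(s) - 1
print(f"chi2/ndf = {chi2_value:.1f}/{ndf}, p-value = {chi2.sf(chi2_value, ndf):.3f}")
```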
|
125 |
Search for the Higgs boson decaying to a pair of muons with the CMS experiment at the Large Hadron Collider
Dmitry Kondratyev, 08 December 2022
The CERN Large Hadron Collider (LHC) offers a unique opportunity to test the Standard Model of particle physics. The Standard Model predicts the existence of a Higgs boson and provides accurate estimates for the strength of the interactions of the Higgs boson with other particles. After the discovery of the Higgs boson, the measurement of its properties, such as its couplings to other particles, is of paramount importance.
The projects described in this thesis explore different aspects of one such measurement – the search for the Higgs boson decay into a pair of muons (H→μμ), conducted by the CMS experiment at the LHC. This decay plays an important role in elementary particle physics, as it provides a direct way to measure the coupling of the Higgs boson to the muon. The first evidence of the H→μμ decay was reported in 2020 as a result of an elaborate statistical analysis of the dataset collected by the CMS experiment during Run 2 of the LHC (2016–2018). The observed (expected) upper limit on the signal strength modifier for this decay at 95% confidence level was found to be 1.93 (0.81), constituting the most precise measurement to date.
The details of this analysis, along with studies to establish possible directions for the development of the next iteration of the H→μμ analysis using Run 3 data, are discussed in this thesis. In addition, a novel machine learning-based algorithm for the muon high-level trigger is presented, which ultimately improves the data-taking efficiency of the CMS experiment and hence helps to increase the sensitivity of future H→μμ searches. Finally, projections of the H→μμ search sensitivity to the data-taking conditions at the High-Luminosity Large Hadron Collider are presented, estimating the achievable precision for future measurements of the Higgs boson properties.
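For orientation, the scale of statements such as "evidence" or a quoted significance can be sketched with the standard asymptotic counting-experiment formula. The yields below are invented, and the real analysis relies on a much more detailed statistical model.

```python
import math

def asimov_significance(signal, background):
    """Median expected discovery significance for a single counting experiment,
    Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s)) (asymptotic approximation)."""
    s, b = float(signal), float(background)
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Invented H->mumu-like yields in a narrow dimuon mass window
print(f"{asimov_significance(signal=50.0, background=900.0):.2f} sigma")  # ~1.7, illustrative
```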
|
126 |
Measuring physical properties of the W boson in 7 TeV proton-proton collisions
Killewald, Phillip, 22 October 2010
No description available.
|
127 |
Laser-induced surface structuring for electron cloud mitigation in particle accelerators
Bez, Elena, 11 December 2024
Die Bildung von Elektronenwolken durch die Multiplikation von Sekundärelektronen im Strahlrohr von Teilchenbeschleunigern kann während des Betriebs zu einer verringerten Leistung führen. Durch Ultrakurzpuls-Laserstrukturierung kann die Sekundärelektronenemission einer Oberfläche effizient reduziert werden. Im Rahmen dieser Arbeit wurde eine Lösung zur Unterdrückung der Elektronenvervielfachung durch Laserstrukturierung der aus Kupfer bestehenden Innenwände der Strahlrohre des Large Hadron Colliders (LHC) entwickelt, die technische Einschränkungen und Anforderungen an die Oberflächeneigenschaften und Vakuumkompatibilität erfüllt. Dafür wurden fundamentale Abhängigkeiten zwischen den Laserbearbeitungsparametern und den Oberflächeneigenschaften wie z.B. der Abtragstiefe, den oberflächenchemischen Eigenschaften, der Wiederablagerung von Partikeln und letztendlich der Sekundärelektronenausbeute (SEY) im Labormaßstab untersucht. Für die Behandlung der Rohrinnenflächen der Vakuumkammern wurde ein spezieller Aufbau verwendet, der aus einer Pikosekunden-Laserquelle, einem Strahlkopplungssystem, einer 15 m langen Hohlkernfaser und einem Roboter besteht, der sich im Inneren des Strahlrohrs bewegt. Dieses System wurde im Rahmen dieser Arbeit in Betrieb genommen und kalibriert. Eine Behandlung bei niedriger akkumulierter Fluenz in Stickstofffluss resultierte in „optimalen Oberflächeneigenschaften", d.h. einer geringen Abtragstiefe (ca. 15 μm), geringer Partikelbedeckung, einer Cu2O-dominierten Oberfläche und einem SEY-Maximum von 1.4 nach der Reinigung, welches sich während der elektroneninduzierten Konditionierung zu 1 reduziert. Die 10 m langen Strahlrohre, die in den kryogen gekühlten Magnetaufbauten im LHC installiert sind, sollen mit einem Longitudinal-Scanning-Verfahren selektiv bearbeitet werden. Ein 3.1 m langes, laserbearbeitetes Strahlrohr wurde im LHC installiert, um die Methode bezüglich möglicher Partikelablösungseffekte zu prüfen und um die Laserbehandlung als Oberflächentechnologie für Teilchenbeschleunigervakuumsysteme zu validieren. / The formation of electron clouds by secondary electron multiplication in the beam pipes of particle accelerators can lead to reduced performance during operation. Surface roughening using ultrashort pulse lasers efficiently reduces the secondary electron yield (SEY) of a surface. In this study, a solution for the suppression of electron clouds by laser structuring the inner copper walls of the Large Hadron Collider (LHC) beam tubes was developed, fulfilling the technical constraints and surface property requirements. For this purpose, fundamental dependencies between the laser processing parameters and the surface properties, such as the modification depth, the surface chemical composition, the particle redeposition, and finally the SEY, were investigated on a laboratory scale. A dedicated setup able to perform the modification treatment in situ, directly in the beam pipe hosted by the LHC magnet, was commissioned and the operation parameters were optimized. The device consists of a picosecond laser source, a beam coupling system, a 15 m long hollow-core fiber, and a robot that travels inside the beam tube. Treatment at low accumulated laser fluence in nitrogen flux resulted in "optimal surface properties", specifically a low modification depth (≈ 15 μm), low particle redeposition, a Cu2O-dominated surface and a SEY maximum of 1.4 after cleaning, which reduces to 1 upon electron irradiation at both room and cryogenic temperatures. A selective longitudinal scan scheme was developed to process the 10 m long beam pipes installed in the cryogenic magnet assemblies of the LHC with the highest effectiveness. A 3.1 m long laser-processed vacuum chamber was installed in the LHC to validate the method with respect to particle detachment.
Table of contents:
1. Motivation
2. Context of the study
3. Sample preparation and characterization methods
4. Fundamental dependencies of the surface properties on the laser parameters
5. Robot-assisted laser processing of curved surfaces
6. Large-scale treatments of beam screens
7. Conclusions and outlook
A. Appendix
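As context for the quoted SEY maximum of 1.4, the energy dependence of the secondary electron yield is commonly approximated by a universal curve that peaks at δ_max for a primary energy E_max. The sketch below uses that common parametrisation with assumed parameter values; it is not taken from the thesis measurements.

```python
import numpy as np

def sey_curve(energy_ev, delta_max=1.4, e_max=250.0, s=1.35):
    """Commonly used SEY parametrisation: delta(E) = delta_max * s*x / (s - 1 + x**s),
    with x = E / E_max.  All parameter values here are assumptions for illustration."""
    x = np.asarray(energy_ev, dtype=float) / e_max
    return delta_max * s * x / (s - 1.0 + x ** s)

energies = np.array([50.0, 150.0, 250.0, 500.0, 1000.0])  # primary electron energy (eV)
print(np.round(sey_curve(energies), 2))  # the curve peaks at delta_max near E_max
```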
|
128 |
A Search for Displaced Leptons from Long-Lived Particle Decays with the ATLAS Detector
Smith, Andrew Caldon, January 2024
This thesis presents a search for leptons displaced from the primary vertex with the ATLAS detector at the Large Hadron Collider at CERN. The search includes the full proton–proton collision dataset collected during Run 2 from 2015-2018 at √𝑠 = 13 TeV and a partial dataset collected during Run 3 in 2022-2023 at √𝑠 = 13.6 TeV, corresponding to integrated luminosities of 140 fb⁻¹ and 56.3 fb⁻¹, respectively.
The search is the first performed using proton-proton collision data collected by the ATLAS detector during Run 3, at the highest collision energy ever achieved at a collider. Final states with displaced electrons or muons are considered, and novel triggers introduced in Run 3 are employed that use large impact parameter tracking to reconstruct displaced tracks with low momentum. In addition, multivariate techniques and timing information from the ATLAS electromagnetic calorimeter are employed to broaden the sensitivity to channels with large background rates or highly displaced electrons.
The results are consistent with the Standard Model background expectations and are used to set model-independent limits on the production of displaced electrons and muons. The analysis is also interpreted in the context of a gauge-mediated supersymmetry breaking model with pair-produced long-lived sleptons. The results include 95% CL exclusions of selectrons with lifetimes from 4 ps to 60 ns and a mass of 150 GeV, and exclusions of selectrons, smuons, and staus with a lifetime of 0.3 ns for masses up to 740 GeV, 830 GeV, and 440 GeV, respectively.
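The sensitivity to lifetimes from picoseconds to tens of nanoseconds is driven by the fraction of long-lived particles that decay inside the instrumented volume. The sketch below evaluates that fraction for an exponential decay with an assumed boost and assumed radii; it is not the acceptance model of the analysis.

```python
import math

C_MM_PER_NS = 299.792458  # speed of light in mm/ns

def decay_fraction(lifetime_ns, betagamma, r_min_mm, r_max_mm):
    """Fraction of particles decaying between r_min and r_max, assuming exponential
    decay along a straight path with mean decay length betagamma * c * tau."""
    decay_length = betagamma * C_MM_PER_NS * lifetime_ns
    return math.exp(-r_min_mm / decay_length) - math.exp(-r_max_mm / decay_length)

# Hypothetical slepton with tau = 1 ns and betagamma ~ 2 decaying inside the tracker
print(f"{decay_fraction(lifetime_ns=1.0, betagamma=2.0, r_min_mm=2.0, r_max_mm=300.0):.2f}")
```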
|
129 |
Interaction Region Design for a 100 TeV Proton-Proton Collider
Martin, Roman, 20 September 2018
Mit der Entdeckung des Higgs-Bosons hat ein Messprogramm begonnen, bei dem die Eigenschaften dieses neuen Teilchens mit der höchstmöglichen Präzision untersucht werden sollen, um die Gültigkeit des Standardmodells der Teilchenphysik zu prüfen und nach neuer Physik jenseits des Standardmodells zu suchen. Für dieses Ziel werden der Large Hadron Collider (LHC) und sein Upgrade, der High Luminosity-LHC, bis etwa zum Jahr 2035 laufen und Daten produzieren.
Um an der Spitze der Teilchenphysik zu bleiben, hat die “European Strategy Group for Particle Physics” empfohlen, ambitionierte Nachfolgeprojekte für die Zeit nach dem LHC zu entwickeln. Entsprechend dieser Empfehlung hat das CERN die “Future Circular Collider” (FCC) -Studie gestartet, die die Machbarkeit neuer Speicherringe für Teilchenkollisionen (Collider) untersucht. In dieser Arbeit wird die Entwicklung der Wechselwirkungszonen für FCC-hh, einem Proton-Proton-Speicherring mit einer Schwerpunktsenergie von 100 TeV und einem Umfang von 100 km, beschrieben.
Die Wechselwirkungszone ist das Herzstück eines Colliders, da sie die erreichbare Luminosität bestimmt. Es ist daher entscheidend, schon früh im Entwicklungsprozess eine möglichst hohe Kollisionsrate anzustreben. Ausgehend von der optischen Struktur der Wechselwirkungszonen des LHC und des geplanten High Luminosity-LHC (HL-LHC) werden Strategien zur Skalierung hergeleitet, um der höheren Strahlenergie gerecht zu werden. Bereits früh im Entwicklungsprozess wird die Strahlungsbelastung durch Teilchentrümmer vom Wechselwirkungspunkt als entscheidender Faktor für das Layout der Wechselwirkungszone identifiziert und eine allgemeine Design-Strategie, die den Schutz der supraleitenden Endfokussierungsmagnete mit einer hohen Luminosität verbindet, wird formuliert und implementiert. Aufgrund des deutlichen Spielraums in Bezug auf beta* wurde die resultierende Magnetstruktur zum Referenzdesign für das FCC-hh-Projekt. / The discovery of the Higgs boson is the start of a measurement program that aims to study the properties of this new particle with the highest possible precision in order to test the validity of the Standard Model of particle physics and to search for new physics beyond the Standard Model. For that purpose, the Large Hadron Collider (LHC) and its upgrade, the High Luminosity-LHC, will operate and produce data until 2035.
Following the recommendations of the European Strategy Group for Particle Physics, CERN launched the Future Circular Collider (FCC) study to design large scale particle colliders for high energy physics research in the post-LHC era. This thesis presents the development of the interaction region for FCC-hh, a proton-proton collider operating at 100 TeV center-of-mass energy.
The interaction region is the centerpiece of a collider as it determines the achievable luminosity. It is therefore crucial to aim for maximum production rates from the beginning of the design process. Starting from the lattices of the LHC and its proposed upgrade, the High Luminosity LHC (HL-LHC), scaling strategies are derived to account for the increased beam rigidity. After identifying energy deposition from debris of the collision events as a driving factor for the layout, a general design strategy is drafted and implemented, unifying protection of the superconducting final focus magnets from radiation with high luminosity performance. The resulting FCC-hh lattice has significant margins with respect to the performance goals in terms of beta*.
Protecting the final focus magnets from radiation with thick shielding limits the minimum beta* and therefore the luminosity. An alternative strategy to increase the magnet lifetime by distributing the radiation load more evenly is developed. A proof of principle of this method, the so-called Q1 split, is provided. In order to demonstrate the feasibility of the derived interaction region lattices, first dynamic aperture studies are conducted.
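The connection between beta* and luminosity that drives the interaction-region design can be sketched with the standard round-beam, head-on luminosity formula. The parameter values below are rough, FCC-hh-like assumptions chosen only for illustration; they are not the official design values, and crossing-angle and hourglass reductions are neglected.

```python
import math

def luminosity_cm2_s(n_protons, n_bunches, f_rev_hz, eps_n_m, beta_star_m, gamma):
    """L = f_rev * n_b * N^2 / (4*pi*sigma*^2) with sigma*^2 = eps_n * beta* / gamma
    (round beams, relativistic beta ~ 1, no geometric reduction factors)."""
    sigma_star_sq = eps_n_m * beta_star_m / gamma                        # m^2
    lumi_m2_s = f_rev_hz * n_bunches * n_protons ** 2 / (4.0 * math.pi * sigma_star_sq)
    return lumi_m2_s * 1.0e-4                                            # m^-2 s^-1 -> cm^-2 s^-1

# Assumed, roughly FCC-hh-like parameters (illustrative only)
print(f"{luminosity_cm2_s(1.0e11, 10000, 3000.0, 2.2e-6, 1.1, 53000.0):.2e} cm^-2 s^-1")
```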
|
130 |
Calorimeter-Based Triggers at the ATLAS Detector for Searches for Supersymmetry in Zero-Lepton Final States / Kalorimeterbasierte Trigger am ATLAS-Detektor für Suchen nach Supersymmetrie in Null-Lepton-Endzuständen
Mann, Alexander, 16 February 2012
No description available.
|