1

Search for the Higgs boson decay into a pair of taus in ATLAS

Hariri, Faten 30 October 2015 (has links)
In the LHC project, one of the major goals was the search for the last missing piece of the Standard Model (SM), namely the Higgs boson (H). The quest was successful during the Run I data taking in 2012 with the discovery of a new scalar of mass ~126 GeV, compatible with the SM Higgs boson and decaying to two bosons (either two photons or two electroweak vector bosons, ZZ or W+W-). To complete the picture, one needed to establish the couplings of the new particle to fermions. This motivated the search for the decay mode into two tau leptons, predicted with a high branching ratio. Inside the ATLAS collaboration, the analysis was divided into three channels according to the decay modes of the tau pair. The work reported in this Ph.D. describes the "lepton-hadron" analysis, where one tau lepton decays leptonically into an electron or a muon and the other decays hadronically. Common features of all three analyses are the identification of the tau lepton and the presence of large missing transverse energy (MET) due to the neutrinos escaping from the tau decays.
An important contribution reported in this dissertation concerns the improvement brought by a new MET determination. By using charged tracks to estimate the contribution of the soft energy component produced in the proton-proton collision, the sensitivity to overlaid events ("pile-up"), unavoidable in a high-luminosity hadron collider, is greatly reduced. The systematic uncertainties associated with this soft component were estimated, and their dependence on physics modeling and pile-up conditions was studied for various track-based MET definitions. This will contribute to an improved H→tau+ tau- analysis with future data. In the lepton-hadron analysis, the dominant background comes from events where a hadronic jet is misidentified as a hadronically decaying tau ("fake tau"). The work reports in detail how this fake-tau background has been estimated in the two event configurations most sensitive to the H signal, i.e. events where the H boson is highly boosted or produced by vector boson fusion (VBF); VBF events are characterized by two forward and backward jets in addition to the H decay products.
Finally, the thesis reports on a last contribution performed with the Higgs Effective Field Theory (HEFT) to study the H couplings and probe new physics beyond the SM in a model-independent way. The work consisted in testing and validating the "TauDecay" model in association with the Higgs characterization framework in Madgraph5_aMC@NLO. After implementing a tool to merge H production and decay in a single step (especially useful with NLO requirements), the validation was done in three different ways: direct matrix-element generation, with the implemented merging tool, and using MadSpin to decay the taus. The combined package is ready for use in the LHC Run II context.
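The track-based soft-term idea summarized in this abstract can be illustrated with a toy calculation: hard objects enter the MET sum as calibrated quantities, while the soft component is rebuilt from charged tracks matched to the primary vertex, which suppresses pile-up contamination. This is a sketch under invented assumptions (the event fields and the `from_pv` flag are illustrative), not the ATLAS algorithm.

```python
import math

def met_track_soft(hard_px, hard_py, tracks):
    """Toy MET with a track-based soft term.

    hard_px, hard_py: summed px, py of calibrated hard objects (GeV).
    tracks: dicts with 'pt', 'phi' and a 'from_pv' flag; only tracks
    matched to the primary vertex contribute, so pile-up tracks drop out.
    """
    soft_px = sum(t["pt"] * math.cos(t["phi"]) for t in tracks if t["from_pv"])
    soft_py = sum(t["pt"] * math.sin(t["phi"]) for t in tracks if t["from_pv"])
    # MET is minus the vector sum of everything reconstructed in the event.
    mex = -(hard_px + soft_px)
    mey = -(hard_py + soft_py)
    return math.hypot(mex, mey)
```

In this toy, a calorimeter-based soft term would also pick up the pile-up track, whereas the vertex-matching cut removes it; that is the sensitivity reduction the abstract refers to.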
2

Lietuvos muzikologija [Lithuanian musicology] Vol. 1, Vilnius 2000 [Summary]

15 June 2017 (has links)
Summaries of the 12 articles from issue 1 of Lietuvos muzikologija
3

Scheduling parallel applications: from clusters to grids

Jacinto, Daniele Santini 24 August 2007 (has links)
Different algorithms provide efficient scheduling of parallel applications on distributed and heterogeneous computational platforms, such as computational grids. Most scheduling algorithms for such environments require an application model represented by a directed acyclic graph (DAG), selecting tasks for execution according to their processing and communication characteristics. Obtaining DAGs for real applications, however, is not simple. The knowledge required about the application tasks and the communication among them, including existing transmission cycles, complicates the construction of appropriate graphs. In particular, MPI programs, which represent a significant portion of existing parallel applications, usually present a cyclic communication pattern between the master and the processing nodes. This behavior prevents most scheduling algorithms from being employed, as they recursively traverse the graph to prioritize tasks. This work therefore presents a mechanism for the automatic creation of DAGs for real MPI applications originally developed for homogeneous clusters. Applications go through a monitored execution on a cluster, and the collected data are used to build an appropriate DAG: data dependencies are identified and cycles among the tasks are eliminated. The HEFT scheduling algorithm is used to evaluate the application model, and the resulting schedule is then automatically converted into an RSL (Resource Specification Language) file for execution on a grid with Globus. Results from running real applications and from simulations show that using the grid can be advantageous.
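The two-step pipeline this abstract describes (rank the tasks of a DAG, then place each task on the processor giving the earliest finish time) can be sketched with a simplified HEFT-style list scheduler. Unlike full HEFT, execution costs here are processor-independent, and the DAG, cost, and communication structures are invented for illustration.

```python
def upward_rank(task, succ, cost, comm, memo):
    """Length of the critical path from `task` to the exit task
    (execution cost plus the most expensive downstream chain)."""
    if task in memo:
        return memo[task]
    rank = cost[task] + max(
        (comm.get((task, s), 0.0) + upward_rank(s, succ, cost, comm, memo)
         for s in succ.get(task, [])), default=0.0)
    memo[task] = rank
    return rank

def heft(tasks, succ, pred, cost, comm, n_proc):
    """List-schedule a DAG: highest upward rank first, each task on the
    processor that minimizes its earliest finish time."""
    memo = {}
    order = sorted(tasks, key=lambda t: -upward_rank(t, succ, cost, comm, memo))
    proc_free = [0.0] * n_proc          # when each processor becomes idle
    finish, placed = {}, {}
    for t in order:
        best = None
        for p in range(n_proc):
            # Ready when all predecessors are done; pay the communication
            # cost only if a predecessor ran on a different processor.
            ready = max((finish[q] + (0.0 if placed[q] == p
                                      else comm.get((q, t), 0.0))
                         for q in pred.get(t, [])), default=0.0)
            start = max(ready, proc_free[p])
            if best is None or start + cost[t] < best[0]:
                best = (start + cost[t], p)
        finish[t], placed[t] = best
        proc_free[best[1]] = best[0]
    return finish, placed
```

Decreasing upward rank guarantees every predecessor is scheduled before its successors, which is why `finish[q]` and `placed[q]` are always available when a task is placed.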
4

Quality Assessment and Quality Improvement for UML Models

Jalbani, Akhtar Ali 28 February 2011 (has links)
No description available.
