1

First Search for Heavy Neutral Leptons with IceCube DeepCore

Fischer, Leander 20 August 2024 (has links)
The observation of neutrino oscillations has established that neutrinos have non-zero masses. This phenomenon is not explained by the Standard Model of particle physics, but one viable explanation for this dilemma is the existence of heavy neutral leptons in the form of right-handed neutrinos. Depending on their mass and coupling to Standard Model neutrinos, these particles could also play an important role in solving additional unexplained observations such as dark matter and the baryon asymmetry of the universe. This work presents the first search for heavy neutral leptons with the IceCube Neutrino Observatory. The standard three-flavor neutrino model is extended by adding a fourth, GeV-scale mass state and allowing mixing with the tau neutrino through the parameter $|U_{\tau 4}|^2$. Three heavy neutral lepton mass values, $m_4$, of 0.3 GeV, 0.6 GeV, and 1.0 GeV are tested using ten years of data collected between 2011 and 2021. No significant signal of heavy neutral leptons is observed for any of the tested masses. The resulting constraints on the mixing parameter are $|U_{\tau 4}|^2 < 0.19$ ($m_4 = 0.3$ GeV), $|U_{\tau 4}|^2 < 0.36$ ($m_4 = 0.6$ GeV), and $|U_{\tau 4}|^2 < 0.40$ ($m_4 = 1.0$ GeV) at the 90% confidence level. This first analysis lays the groundwork for future searches for heavy neutral leptons in IceCube.
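For orientation, the 3+1 scheme referred to above can be written in the conventional notation; the abstract itself only names the mixing parameter, so the explicit decomposition below is the standard textbook form, not a quotation from the thesis:
\[
\nu_\tau = \sum_{i=1}^{3} U_{\tau i}\,\nu_i + U_{\tau 4}\,\nu_4 , \qquad m_4 \gg m_{1,2,3},
\]
with the search constraining $|U_{\tau 4}|^2$ for each assumed value of $m_4$.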
2

PLURISOGGETTIVITA' EVENTUALE E REATI ECONOMICI. PROFILI PROBLEMATICI DEL CONCORSO DI PERSONE IN CONTESTI DI COMPLESSITA' ORGANIZATIVA / Complicity and Economic Crime. A study of certain problematic aspects of complicity within complex organizations.

VENTURATO, BENEDETTA 24 May 2017 (has links)
The study analyses the major issues raised by applying the complicity doctrine to white-collar crimes committed within complex business organizations, as these issues emerge from practice, and frames them conceptually, with a view to identifying, through inductive reasoning, the main theoretical questions that must be addressed to develop a consistent approach to the topic. The research starts from the acknowledgement that within big corporations, decision-making processes usually entail the involvement of different corporate functions, the exercise of supervisory duties, and the resort to collective decisions, resulting in an extremely fragmented system based on information asymmetry, allocation of competences, and trust. Against this background, which is dealt with by building on the legacy of some key sociological and organizational studies, a legal analysis is conducted to identify criteria for selecting the conducts worth punishing, based on the identification of a causal contribution to the commission of the crime. Psychological influence, causation by omission, and the relevance of so-called "neutral behaviours" (Alltagshandlungen) are also explored. The theoretical results of the analysis are then applied to selected cases to show how the elaborated framework can lead to solutions more respectful of the constitutional principles governing criminal responsibility.
3

PREDICTING CRASHES AND MANAGING PORTFOLIO IN CRISIS PERIOD

MADONNA, MICHELE MARIA 06 March 2015 (has links)
Events like the recent financial crisis are among the principal causes of unexpected losses on investments. During a financial crisis, stock return volatility increases under internal and external shocks, the probability of reaching the expected return falls, and the probability of loss rises. To face crisis effects, international investors should consider signals present in the market about possible crashes or declining periods, set their investment strategies in advance, and manage them properly to limit the shock effects. On this basis, this study analyses the predictive power of two economic/financial variables (the BSEYD and the term yield spread) for the principal European stock markets (France, Germany, and Spain) and defines a suitable investment strategy for financial crisis periods by building a portfolio model that is neutral to international market shocks. The study covers two periods: 1994-2013 (prediction analysis) and 2003-2014 (shock-neutral portfolio model). Results show that the variables have different predictive power across the markets analysed, and that the shock-neutral portfolio outperforms the market-neutral portfolio strategy in declining market periods.
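For readers unfamiliar with the first predictor: the bond-stock earnings yield differential is conventionally defined as below. The abstract does not give the formula, so this is the standard definition from the BSEYD literature, not a quotation from the thesis:
\[
\mathrm{BSEYD}(t) = r_{\mathrm{bond}}(t) - \frac{E(t)}{P(t)},
\]
where $r_{\mathrm{bond}}$ is a long-term government bond yield and $E/P$ is the earnings yield of the stock index; a crash signal is typically raised when the measure exceeds an upper confidence bound of its recent values.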
4

Modelling of cosmic ray modulation in the heliosphere by stochastic processes / Roelf du Toit Strauss

Strauss, Roelf du Toit January 2013 (has links)
The transport of cosmic rays in the heliosphere is studied by making use of a newly developed modulation model. This model employs stochastic differential equations to numerically solve the relevant transport equation, exploiting this approach's numerical advantages as well as the opportunity to extract additional information regarding cosmic ray transport and the processes responsible for it. The propagation times and energy losses of galactic electrons and protons are calculated for different drift cycles. It is confirmed that protons and electrons lose the same amount of rigidity when they experience the same transport processes. These particles spend more time in the heliosphere, and also lose more energy, in the drift cycle where they drift towards Earth mainly along the heliospheric current sheet. The propagation times of galactic protons from the heliopause to Earth are calculated for increasing heliospheric tilt angles, and it is found that current sheet drift becomes less effective with increasing solar activity. Comparing calculated propagation times of Jovian electrons with observations, the transport parameters are constrained, leading to the finding that 50% of 6 MeV electrons measured at Earth are of Jovian origin. Charge-sign dependent modulation is modelled by simulating the proton to anti-proton ratio at Earth and comparing the results to recent PAMELA observations. A hybrid cosmic ray modulation model is constructed by coupling the numerical modulation model to the heliospheric environment as simulated by a magneto-hydrodynamic model. Using this model, it is shown that cosmic ray modulation persists beyond the heliopause. The level of modulation in this region is found to exhibit solar cycle related changes and, more importantly, is independent of the magnitude of the individual diffusion coefficients, being determined instead by the ratio of parallel to perpendicular diffusion. / PhD (Space Physics), North-West University, Potchefstroom Campus, 2013
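The stochastic approach named in the abstract amounts to tracing pseudo-particles whose stochastic differential equations are equivalent to the transport equation. Below is a minimal one-dimensional Euler-Maruyama toy in Python; the convection speed, diffusion coefficient, and boundary radii are invented placeholders, not the coefficients of the thesis model:

    import numpy as np

    # Toy 1-D pseudo-particle transport: dr = v dt + sqrt(2*kappa) dW.
    # v, kappa, and the boundary radii are illustrative placeholders only.
    v = 400.0                        # outward convection speed (toy units)
    kappa = 1.0e4                    # spatial diffusion coefficient (toy units)
    r_inner, r_outer = 1.0, 120.0    # "Earth" and "heliopause" radii (AU)
    dt = 1.0e-3
    rng = np.random.default_rng(1)

    def mean_propagation_time(n=200):
        """Average time for pseudo-particles started at the inner boundary
        to reach the outer one -- the kind of quantity the SDE method
        yields directly, unlike grid-based solvers."""
        times = []
        for _ in range(n):
            r, t = r_inner, 0.0
            while r < r_outer:
                r += v * dt + np.sqrt(2.0 * kappa) * rng.normal(0.0, np.sqrt(dt))
                r = max(r, r_inner)   # reflect at the inner boundary
                t += dt
            times.append(t)
        return sum(times) / n

    print(f"mean propagation time: {mean_propagation_time():.2f} (toy units)")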
5

Search for heavy resonances decaying into the fully hadronic di-tau final state with the ATLAS detector

Morgenstern, Marcus Matthias 11 April 2014 (has links) (PDF)
The discovery of a heavy neutral particle would be a direct hint of new physics beyond the Standard Model. In this thesis, searches for new heavy neutral particles decaying into two tau leptons, which further decay into hadrons, are presented. They cover neutral Higgs bosons in the context of the minimal supersymmetric extension of the Standard Model (MSSM) as well as Z′ bosons, predicted by various theories with an extended gauge sector. Both analyses are based on the full 2012 proton-proton collision dataset taken by the ATLAS experiment at the Large Hadron Collider (LHC). The extended Higgs sector of the MSSM suggests additional heavy neutral Higgs bosons which decay into tau leptons about 10% of the time. Given that the dominant final state, $\phi \to b\bar{b}$, suffers from tremendous QCD-initiated backgrounds, the decay into two tau leptons is the most promising final state in which to discover such new resonances. The fully hadronic final state is the dominant one, with a branching fraction of about 42%. It governs the sensitivity, in particular at high transverse momentum, where the QCD multijet background becomes small. Other theoretical extensions of the Standard Model, mainly driven by the concept of gauge unification, predict additional heavy particles arising from an extended underlying gauge group. Some of them further predict an enhanced coupling to fermions of the third generation. This motivates the search for Z′ bosons in the fully hadronic di-tau final state. One major challenge in physics analyses involving tau leptons is achieving outstanding performance of the trigger and identification algorithms, which must select real tau leptons with high efficiency while rejecting fake taus originating from quark- or gluon-initiated jets. In this work, a new tau trigger concept based on multivariate classifiers was developed and became the default tau trigger algorithm in 2012 data-taking. An updated tau identification technique based on the log-likelihood approach was provided for 2011 data-taking. Furthermore, a new framework was developed to tune the tau identification algorithm and was exploited for the optimisation for 2012 data-taking. The search for new heavy neutral Higgs bosons in the context of the MSSM was performed exploiting the full 2012 dataset, corresponding to an integrated luminosity of 19.5 fb−1 taken at a centre-of-mass energy of √s = 8 TeV. Updated event selection criteria and novel data-driven background estimation techniques were developed, increasing the sensitivity of the analysis significantly. No deviations from the Standard Model prediction are observed, and thus 95% C.L. exclusion limits on the production cross section times branching ratio, σ(pp → φ) × BR(φ → ττ), are derived using the CLs method. The exclusion ranges from 13.0 pb at 150 GeV to 7.0 fb at 1 TeV for Higgs boson production in association with b-quarks, and from 23.6 pb at 150 GeV to 7.5 fb at 1 TeV for Higgs bosons produced via gluon-gluon fusion. The obtained exclusion limit on σ(pp → φ) × BR(φ → ττ) can be related to an exclusion of the MSSM parameter space in the $M_A$-$\tan\beta$ plane. Various benchmark scenarios are considered. The "standard candle" is the $m_h^{\mathrm{max}}$ scenario, for which tan β values between 13.3 and 55 can be excluded at 95% C.L. in the considered mass range. Updated benchmark scenarios designed to incorporate the recently discovered SM-like Higgs boson were suggested and analysed as well. In the $m_h^{\mathrm{mod+}}$ ($m_h^{\mathrm{mod-}}$) scenario, tan β values between 13.5 (13.3) and 55 (52) can be excluded. Finally, a search for heavy neutral resonances in the context of Z′ bosons was performed. As in the search for new Higgs bosons, no deviation from the Standard Model prediction is observed, and hence exclusion limits on the production cross section times branching ratio, σ(pp → Z′) × BR(Z′ → ττ), and on the Z′ boson mass are derived using a Bayesian approach. Z′ bosons with $M_{Z'} <$ 1.9 TeV can be excluded at 95% credibility, marking the strongest exclusion limit obtained in the di-tau final state by any collider experiment so far.
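The CLs criterion used for the Higgs limits is the standard ratio of p-values; the abstract names the method without spelling it out, so the definition below is the conventional one, not a quotation from the thesis:
\[
\mathrm{CL}_s = \frac{\mathrm{CL}_{s+b}}{\mathrm{CL}_b} = \frac{P(q \ge q_{\mathrm{obs}} \mid s+b)}{P(q \ge q_{\mathrm{obs}} \mid b)},
\]
where $q$ is the test statistic; a signal hypothesis is excluded at 95% C.L. when $\mathrm{CL}_s \le 0.05$.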