61

Literature at the Dawn of Trauma Consciousness

Wolfsdorf, Adam January 2018 (has links)
We are living in the age of the trigger warning, in educational cultures that threaten English teachers' ability to present psychologically upsetting literature to students who may lack the necessary resilience to tolerate highly charged literary encounters with complex issues such as rape, violence, racism, or political strife. And yet literature is filled with conflict, with artistic representations of the precise traumas that certain members of our student populations may not be able to tolerate. In order to safeguard trauma survivors from potential reactivation of traumatic stress, a handful of educational institutions promote the use of trigger warnings. But are trigger warnings effective, and, if they are, what do they teach English teachers about what happens to individuals who have endured trauma and are therefore susceptible to being triggered? The purpose of this research, which consisted of interviews and an intensive focus group with seven veteran English teachers teaching at seven distinct schools throughout the world, was to offer insights and pedagogical awareness to English teachers, so that they can better anticipate, conceptualize, and decide for themselves how to respond to students who get triggered by emotionally complex literature. In addition to the qualitative research methods used with the seven English teacher participants, this study utilizes the work and thinking of trauma expert Bessel van der Kolk in an attempt to illustrate the neurological impacts of trauma through a comprehensive overview of PET scans of trauma survivors studied in van der Kolk's lab in Brookline, Massachusetts. Each PET scan presents key features of what can happen to the brains of survivors, and may provide significant clues into what happens among our students when they get psychologically triggered in the classroom. The dissertation concludes with a one-on-one interview with Harvard psychiatrist Bessel van der Kolk, and offers his insights, wisdom, and conceptualizations for this highly complex and nuanced problem.
62

Sélection des électrons et recherche du boson de Higgs se désintégrant en paires de leptons tau avec l'expérience CMS au LHC / Electron selection and search for the Higgs boson decaying into tau leptons pairs with the CMS detector at the LHC

Daci, Nadir 30 October 2013 (has links)
Cette thèse s'inscrit dans le contexte des premières années d'exploitation du Large Hadron Collider (LHC). Cet appareil monumental a été construit dans le but d'explorer la physique de l'infiniment petit à l'échelle du TeV. Un des objectifs majeurs du LHC est la recherche du boson de Higgs. Sa découverte validerait le mécanisme de brisure de symétrie électrofaible, au travers duquel les bosons W et Z acquièrent leur masse. L'expérience Compact Muon Solenoid (CMS) analyse les collisions de protons du LHC. Leur fréquence élevée (20 MHz) permet d'observer des phénomènes rares, comme la production et la désintégration d'un boson de Higgs, mais elle nécessite alors une sélection rapide des collisions intéressantes, par un système de déclenchement. Les ressources informatiques disponibles pour le stockage et l'analyse des données imposent une limite au taux de déclenchement : la bande passante, répartie entre les différents signaux physiques, doit donc être optimisée. Dans un premier temps, j'ai étudié le déclenchement sur les électrons : ils constituent une signature claire dans l'environnement hadronique intense du LHC et permettent à la fois des mesures de haute précision et la recherche de signaux rares. Ils font partie des états finaux étudiés par un grand nombre d'analyses (Higgs, électrofaible, etc.). Dès les premières collisions en 2010, la présence de signaux anormaux dans l'électronique de lecture du calorimètre électromagnétique (ECAL) constituait une source d'augmentation incontrôlée du taux de déclenchement. En effet, leur taux de production augmentait avec l'énergie et l'intensité des collisions : ils étaient susceptibles de saturer la bande passante dès 2011, affectant gravement les performances de physique de CMS. J'ai optimisé l'algorithme d'élimination de ces signaux en conservant une excellente efficacité de déclenchement sur les électrons, pour les prises de données en 2011. D'autre part, l'intensité croissante des collisions au LHC fait perdre leur transparence aux cristaux du ECAL, induisant une inefficacité de déclenchement. La mise en place de corrections hebdomadaires de l'étalonnage du système de déclenchement a permis de compenser cette inefficacité. Dans un second temps, j'ai participé à la recherche du boson de Higgs dans son mode de désintégration en deux leptons tau. Cette analyse est la seule qui puisse actuellement vérifier le couplage du boson de Higgs aux leptons. Le lepton tau se désintégrant soit en lepton plus léger (électron ou muon), soit en hadrons, six états finaux sont possibles. Je me suis concentré sur les états finaux semi-leptoniques (électron/muon et hadrons), où la signification statistique du signal est maximale. Les algorithmes de déclenchement dédiés à cette analyse sélectionnent un lepton (électron ou muon) et un « tau hadronique » d'impulsions transverses élevées. Cependant, cette sélection élimine la moitié du signal, ce qui a motivé la mise en place d'algorithmes sélectionnant des leptons de basse impulsion, incluant une coupure sur l'énergie transverse manquante. Celle-ci limite le taux de déclenchement et sélectionne des évènements contenant des neutrinos, caractéristiques des désintégrations du lepton tau. Les distributions de masse invariante des processus de bruit de fond et de signal permettent de quantifier la compatibilité entre les données et la présence ou l'absence du signal. La combinaison de l'ensemble des états finaux conduit à l'observation d'un excès d'évènements sur un large intervalle de masse. 
Sa signification statistique vaut 3,2 déviations standard à 125 GeV ; la masse du boson mesurée dans ce canal vaut 122 ± 7 GeV. Cette mesure constitue la toute première évidence d'un couplage entre le boson de Higgs et le lepton tau. / This thesis is set in the first operating years of the Large Hadron Collider (LHC). This monumental machine was built to explore the infinitesimal structure of matter at the multi-TeV scale. The LHC aimed primarily at searching for the Higgs boson, the discovery of which would confirm the electroweak symmetry breaking model. This mechanism, which provides the W and Z bosons with a mass, describes the transition from a unified electroweak interaction to a weak interaction (short range) and an electromagnetic interaction (infinite range). The LHC's proton collisions, delivered with a 50 ns period, are analysed by 4 large detectors, including the Compact Muon Solenoid (CMS). This short period makes it possible to observe very rare phenomena, such as Higgs boson production and decay, but it requires a fast online selection of the interesting collisions: the trigger system. The computing resources available for data storage and analysis set a limit on the trigger rate; the bandwidth, which is shared among several physics signals, must therefore be optimised. Firstly, I studied the electron trigger: electrons are a clear signature in the intense hadronic environment of the LHC and allow both high-precision measurements and searches for rare signals. In addition, they are part of the final states investigated by a large number of analyses (Higgs, electroweak, etc.). From the first collisions in 2010, anomalous signals in the CMS electromagnetic calorimeter (ECAL) were a source of uncontrolled trigger-rate increase. Indeed, their production rate increased along with the collisions' energy and intensity: they were likely to saturate the bandwidth as early as 2011, drastically degrading the CMS physics performance. I optimised the anomalous-signal rejection algorithm for the data collected in 2011, while preserving an excellent electron trigger efficiency. Moreover, the increasing intensity of the LHC collisions causes a loss of transparency in the ECAL crystals. The setting-up of weekly corrections to the ECAL trigger calibration helped make up for the inefficiency caused by this loss of transparency. Secondly, I contributed to the search for the Higgs boson decaying to two tau leptons. So far, this analysis has proved to be the only available way to probe the coupling of the Higgs boson to leptons. The tau lepton decays either into lighter leptons (electron or muon) or into hadrons: hence the study of six final states. I focused on the semileptonic final states, in which the expected signal is the most statistically significant. The trigger algorithms dedicated to this analysis select a lepton and a hadronic tau with high transverse momenta. However, this selection removes half of the signal, which motivated the development of new algorithms selecting low-momentum leptons, complemented by a cut on the missing transverse energy. This cut helps control the trigger rate and selects events containing neutrinos, which are a distinguishing feature of tau lepton decays. The invariant mass distributions for all background and signal processes make it possible to quantify the compatibility between the acquired data and the presence of a signal. The combination of all final states leads to the observation of an excess of events over a large mass range. Its statistical significance is 3.2 standard deviations at 125 GeV; the boson mass measured in this channel is 122 ± 7 GeV. This measurement is the first evidence for a coupling between the Higgs boson and the tau lepton.
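To make the significance figure above concrete, here is a minimal Python sketch of how the significance of an excess can be estimated in a single counting bin, using the standard asymptotic formula for a Poisson signal on top of a background; the event yields are invented placeholders, not numbers taken from this analysis.

```python
import math

def asymptotic_significance(n_signal: float, n_background: float) -> float:
    """Median discovery significance of a Poisson counting experiment
    (asymptotic formula), in standard deviations."""
    s, b = n_signal, n_background
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Invented yields in one di-tau mass window, for illustration only.
print(asymptotic_significance(n_signal=50.0, n_background=400.0))  # about 2.4 sigma
```

In the analysis itself, the invariant-mass distributions of all six final states are combined, which is what yields the 3.2 standard deviations quoted above.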
63

Recherche de leptoquarks dans la topologie à jets et énergie transverse manquante avec le détecteur D0 au TeVatron / Search for leptoquarks in the jets and missing transverse energy topology with the D0 detector at the Tevatron

Zabi, Alexandre 28 October 2004 (has links) (PDF)
The D0 experiment takes place at the Fermilab laboratory in the United States. It studies proton-antiproton collisions at a centre-of-mass energy of 1.96 TeV delivered by the Tevatron accelerator. Data acquisition in the D0 detector relies on a sophisticated trigger system that selects the collisions with interesting physics potential. The L2STT electronic processor makes it possible to trigger on the presence of long-lived particles in the final state, as in, for example, the decay of the Higgs boson into a pair of b quarks. Its design benefits from recent advances in high technology. This system is now fully installed and will very soon allow a further optimisation of the experiment's trigger strategy. Leptoquarks are particles responsible for a hypothetical interaction between the quarks and the leptons of the Standard Model. The observation of such a particle would be interpreted as a sign of new physics. This manuscript presents a direct search in the jets and missing transverse energy topology. To carry out this search, a trigger method first had to be developed, together with a precise tool to determine its efficiency. The analysis of events with an acoplanar-jet topology was performed on a data set corresponding to an integrated luminosity of 85 pb⁻¹. This analysis excluded a leptoquark mass range from 85 GeV/c² to 109 GeV/c² at the 95% confidence level.
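As a rough, hedged illustration of the counting argument behind such a 95% confidence-level exclusion (not the actual statistical machinery used at D0), the following Python sketch computes a simple frequentist 95% CL upper limit on the number of signal events for a Poisson process with a known background; the event counts are invented for the example.

```python
from scipy.stats import poisson

def upper_limit_95(n_obs: int, background: float, step: float = 0.01) -> float:
    """Smallest signal expectation s such that observing <= n_obs events
    has probability <= 5% when the true mean is s + background."""
    s = 0.0
    while poisson.cdf(n_obs, s + background) > 0.05:
        s += step
    return s

# Invented counts, for illustration only.
print(f"95% CL upper limit on signal events: {upper_limit_95(n_obs=4, background=3.5):.1f}")
```

Comparing such a limit with the signal yield predicted at each leptoquark mass is what turns it into an excluded mass range.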
64

Conception et réalisation de l'électronique frontale du détecteur de pied de gerbe et de l'unité de décision du système du premier niveau de déclenchement de l'expérience LHCb / Design and realisation of the front-end electronics of the preshower detector and of the decision unit of the first-level trigger system of the LHCb experiment

Cornat, Rémi 11 October 2002 (has links) (PDF)
The particle physics experiments at the LHC collider will have to process one collision every 25 ns over several hundred thousand measurement channels. The amount of data produced is considerable.
In the LHCb experiment, a decision unit performs a first selection of these data. We propose a solution for its implementation: digital electronics pipelined at 40 MHz, built from programmable devices with LVDS interfaces. A first version of the test bench is presented, capable of generating stimuli at a rate of 40 MHz for words of up to 512 bits.
The preshower is part of the LHCb calorimeter system and comprises 6000 measurement channels. The physics signal is first shaped by analogue integrations over 25 ns without dead time (ASIC). After 20 m of cable, the integral values (voltages) are digitised and then processed on about a hundred front-end boards.
Prototypes of the data-processing part made it possible to compare a programmable technology with an ASIC technology (AMS 0.35 µm), and to take into account the strong constraints on the number of measurement channels per board (128 half-channels) and on radiation tolerance.
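As a toy illustration of the dead-time-free 25 ns integration mentioned above, here is a short Python sketch in which two integrators alternate, so that one accumulates the signal while the other is read out and reset; the interleaving scheme and the numbers are assumptions made for the example, not a description of the actual ASIC.

```python
def interleaved_integration(samples_per_slot: int, signal: list[float]) -> list[float]:
    """Toy model of dead-time-free integration: two integrators take turns,
    so the signal is always being accumulated by one of them (assumed scheme)."""
    integrals = []
    acc = [0.0, 0.0]              # two alternating integrators
    current = 0
    for i, sample in enumerate(signal):
        acc[current] += sample    # the active integrator accumulates the signal
        if (i + 1) % samples_per_slot == 0:
            integrals.append(acc[current])  # read out one full 25 ns integral
            current ^= 1                    # hand over to the other integrator
            acc[current] = 0.0              # reset it before it starts accumulating
    return integrals

# Example: 5 samples per 25 ns slot, constant amplitude 0.25 per sample.
print(interleaved_integration(5, [0.25] * 20))  # [1.25, 1.25, 1.25, 1.25]
```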
65

Generators, Calorimeter Trigger and J/ψ production at LHCb

Robbe, P. 12 March 2012 (has links) (PDF)
This document presents results related to the preparation of the physics programme of the LHCb experiment: development of a generator software package, commissioning of the calorimeter trigger, and measurement of J/psi production. A detailed simulation is mandatory for developing the analysis tools needed to carry out this programme, and a detailed generator package has been implemented. It describes, for example, B mixing and CP violation in B decays in the hadronic environment of LHCb. For hadronic decays, the trigger system of the experiment is based on the calorimeters, in particular the hadronic calorimeter. The large production cross-section at the LHC makes it possible, with the first data recorded by the experiment, to measure the differential J/psi cross-section and to compare it with theoretical models in order to test QCD in the heavy-quark sector.
66

An FPGA implementation of neutrino track detection for the IceCube telescope

Wernhoff, Carl January 2010 (has links)
The IceCube telescope is built within the ice at the geographic South Pole, in the middle of the Antarctic continent. The purpose of the telescope is to detect muon neutrinos, the muon neutrino being an elementary particle of minuscule mass coming from space.
The detector consists of some 5000 DOMs (digital optical modules) registering photon hits (light). A muon neutrino traveling through the detector might give rise to a track of photons making up a straight line, and by analyzing the hit output of the DOMs, looking for tracks, neutrinos and their direction can be detected.
When processing the output, triggers are used. Triggers are computationally efficient algorithms used to tell whether the hits seem to make up a track; if that is the case, all hits are processed more carefully to find the direction and other properties of the track.
The Track Engine is an additional trigger, specialized to trigger on low-energy events (few track hits), which are particularly difficult to detect. Low-energy events are of special interest in the search for dark matter.
An algorithm for triggering on low-energy events has been suggested. Its main idea is to divide time into overlapping time windows, find all possible pairs of hits in each time window, calculate the spherical coordinates θ and ϕ of the vectors defined by the hit pairs, histogram the angles, and look for peaks in the resulting 2D histogram. Such peaks would indicate a straight line of hits and, hence, a track.
A software implementation of the algorithm is not believed to be fast enough, so this Master's thesis project has had the aim of developing an FPGA implementation of the algorithm.
Such an FPGA implementation has been developed. Extensive tests of the design have yielded positive results showing that it is fully functional. The design can be synthesized to about 180 MHz, making it possible to handle an incoming hit rate of about 6 MHz, a margin of more than a factor of two over the expected average hit rate of 2.6 MHz.
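The pair-and-histogram algorithm described above is easy to express in a few lines of software, even though the thesis targets an FPGA for throughput. The sketch below is a simplified single-window Python illustration; the binning, the peak threshold and the interpretation of the angles as the direction of the vector connecting the two hits of a pair are placeholder assumptions, not the parameters of the actual Track Engine.

```python
import math
from itertools import combinations
from collections import Counter

N_THETA, N_PHI = 16, 32    # angular histogram binning (placeholder)
PEAK_THRESHOLD = 10        # pairs in one bin needed to claim a track (placeholder)

def pair_direction_histogram(hits):
    """Histogram the (theta, phi) direction of every hit pair in one time window.
    Each hit is a tuple (t, x, y, z)."""
    hist = Counter()
    for (t1, x1, y1, z1), (t2, x2, y2, z2) in combinations(hits, 2):
        dx, dy, dz = x2 - x1, y2 - y1, z2 - z1
        r = math.sqrt(dx * dx + dy * dy + dz * dz)
        if r == 0.0:
            continue
        theta = math.acos(max(-1.0, min(1.0, dz / r)))   # polar angle, 0..pi
        phi = math.atan2(dy, dx) % (2.0 * math.pi)       # azimuth, 0..2pi
        bin_t = min(int(theta / math.pi * N_THETA), N_THETA - 1)
        bin_p = min(int(phi / (2.0 * math.pi) * N_PHI), N_PHI - 1)
        hist[(bin_t, bin_p)] += 1
    return hist

def window_has_track(hits):
    """Trigger decision for one time window: a peak in the 2D histogram means
    many hit pairs point in the same direction, i.e. a straight track."""
    hist = pair_direction_histogram(hits)
    return bool(hist) and max(hist.values()) >= PEAK_THRESHOLD
```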
67

Switching Behaviour within the Telecommunication Business : A qualitative study of former TeliaSonera customers

Göransson, Katrin, Frenzel, Felix January 2009 (has links)
The telecommunication business in Sweden has changed in recent years. From being a monopoly, the market has turned into a more competitive one, with more competitors offering more services. TeliaSonera is one of the largest telecommunication providers in Sweden. TeliaSonera is a co-operation between Telia, which was one of the leading telecommunication companies in Sweden, and its Finnish counterpart Sonera. At the time of this thesis, they provide their customers with services such as broadband, TV, fixed-line telephony and mobile telephony. These services are provided to both residential and business customers.
The aim of this research project is to understand the switching behaviour of former TeliaSonera customers by investigating the background of the customers' motivation to switch. By analysing the findings, the researchers are able to make assumptions about customer switching processes.
The research has been conducted with an explorative research approach and qualitative telephone interviews with 22 former TeliaSonera customers. The questions were related to their behaviour before, during and after the switch from TeliaSonera to a competing provider. From the interviews, the researchers seek a better understanding of which triggers sway customers to switch. Additionally, it is equally important to understand the switching process customers go through.
The theoretical framework is based on prior research on customer behaviour and customer relationship management in the field of service management and marketing. Theories on triggers, on active and passive customers, and suggestions such as unconscious decision-making are discussed. A trigger is the point at which the customer begins to be aware of a possible switch of services. An active customer searches for information on their own, whereas a passive customer is often influenced by a third party. The theory of unconscious decision-making asks whether the human subconscious can make decisions for customers before they are even aware of it. This theory is applied to the collected data.
The results of the research show two different switching paths among the interviewed customers, derived from the collected customer stories: a reactional and situational switching path, and an influenced switching path.
68

Ribosome Associated Factors Recruited for Protein Export and Folding

Raine, Amanda January 2005 (has links)
Protein folding and export to the membrane are crucial events in the cell. Both processes may be initiated already at the ribosome, assisted by factors that bind to the polypeptide as it emerges from the ribosome. The signal recognition particle (SRP) scans the ribosome for nascent peptides destined for membrane insertion and targets these ribosomes to the site of translocation in the membrane. Trigger factor (TF) is a folding chaperone that interacts with nascent chains to promote their correct folding and to prevent misfolding and aggregation.
In this thesis, we first investigated membrane targeting and insertion of two heterologous membrane proteins in E. coli using in vitro translation, membrane targeting and cross-linking. We found that these proteins depend on SRP for targeting and that they initially interact with translocon components in the same way as native nascent membrane proteins.
Moreover, we have characterised the SRP and TF interactions with the ribosome both with cross-linking experiments and with quantitative binding experiments. Both SRP and TF bind to ribosomal protein L23, close to the nascent-peptide exit site, where they are strategically placed for binding to the nascent polypeptide.
Quantitative analysis of TF and SRP binding determined their respective KD values for binding to non-translating ribosomes and revealed that they bind simultaneously to the ribosome, and thus have separate binding sites on L23.
Finally, binding studies on ribosome-nascent-chain complexes add clues as to how TF functions as a chaperone.
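To give a sense of what the measured KD values imply, here is a small Python sketch (with invented numbers, not the values determined in the thesis) of the single-site equilibrium binding model commonly used to interpret such quantitative binding experiments.

```python
def fraction_bound(factor_conc_nM: float, kd_nM: float) -> float:
    """Single-site equilibrium occupancy, [F] / (Kd + [F]), assuming the free
    factor concentration is not significantly depleted by binding."""
    return factor_conc_nM / (kd_nM + factor_conc_nM)

# Invented example: a factor with Kd = 100 nM at three free concentrations.
for conc in (10.0, 100.0, 1000.0):
    print(f"[factor] = {conc:6.1f} nM -> fraction of ribosomes bound: {fraction_bound(conc, 100.0):.2f}")
```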
69

The Anticoincidence System of the PAMELA Satellite Experiment : Design of the data acquisition system and performance studies

Lundquist, Johan January 2005 (has links)
PAMELA is a satellite-borne cosmic ray experiment. Its primary scientific objective is to study the antiproton and positron components of the cosmic radiation. This will be done with unprecedented statistics over a wide energy range (~10 MeV to ~100 GeV). The PAMELA experiment consists of a permanent magnetic spectrometer, an electromagnetic calorimeter, a Time-of-Flight system, a neutron detector and a shower tail catcher. An anticoincidence (AC) system surrounds the spectrometer to detect particles which do not pass cleanly through the acceptance of the spectrometer. PAMELA will be mounted on a Russian Earth-observation satellite, and the launch is scheduled for 2006. The anticoincidence system for PAMELA has been developed by KTH, and consists of plastic scintillator detectors with photomultiplier tube read-out. Extensive testing has been performed during the development phase. Results are presented for environmental tests and for tests with cosmic rays and particle beams. The design of the digital part of the AC electronics has been realised on an FPGA (Field Programmable Gate Array) and a DSP (Digital Signal Processor). It records signals from the 16 AC photomultipliers and from various sensors for over-current and temperature. It also provides functionality for setting the photomultiplier discrimination thresholds, system testing, issuing alarms and communication with the PAMELA main data acquisition system. The design philosophy and functionality need to be reliable and suitable for use in a space environment. To evaluate the performance of the AC detectors, a test utilizing cosmic rays has been performed. The primary aim of the test was to calibrate the individual channels to gain knowledge of suitable discriminator levels for flight. A secondary aim was to estimate the AC detector efficiency. A lower limit of (99.89±0.04)% was obtained. An in-orbit simulation study was performed using protons to estimate trigger rates and investigate the AC system performance in a second-level trigger. The average orbital trigger rate was estimated to be (8.4±0.6) Hz, consisting of (2.0±0.2) Hz good triggers and (6.4±0.5) Hz background. Inclusion of the AC system in the trigger condition to reduce background (for the purpose of data handling capacity) leads to losses of good triggers due to backscattering from the calorimeter (90% loss for 300 GeV electrons and 25% for 100 GeV protons). A method, using the calorimeter, for identifying backscattering events was investigated and found to reduce the loss of good events to below 1% (300 GeV electrons) and 5% (100 GeV protons), while maintaining a background reduction of 70%.
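As an illustration of how an efficiency lower limit such as the (99.89±0.04)% quoted above can be derived from counting data, the following Python sketch computes a one-sided Clopper-Pearson lower bound on a binomial efficiency; the particle counts are invented for the example, and the statistical treatment in the thesis may differ.

```python
from scipy.stats import beta

def efficiency_lower_limit(n_detected: int, n_total: int, cl: float = 0.95) -> float:
    """One-sided Clopper-Pearson lower bound on a binomial efficiency."""
    if n_detected == 0:
        return 0.0
    return beta.ppf(1.0 - cl, n_detected, n_total - n_detected + 1)

# Invented counts: 9990 of 10000 through-going particles seen by the scintillator.
print(f"efficiency > {efficiency_lower_limit(9990, 10000):.4f} at 95% CL")
```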
70

ORDO AB CHAO : Den politiska historien om biodrivmedel i den Europeiska Unionen – Aktörer, nätverk och strategier / ORDO AB CHAO : The political history of biofuels in the European Union – Actors, networks and strategies

Nordangård, Jacob January 2012 (has links)
Biodrivmedel blev efter millennieskiftet en alltmer prioriterad energikälla för EU och ansågs kunna stävja både klimathot och energissäkerhetsproblem samtidigt som drivmedelsproduktionen skulle gynna sysselsättningen i jordbruket. EUkommissionen formulerade 2007 ett mål om att ersätta 10 % av transportenergin till biodrivmedel. Snabbt uppkom dock en strid mellan en grupp av aktörer (miljörörelse och livsmedelsindustri) som såg biodrivmedelssatsningen som ett hot mot både miljön och livsmedelssäkerheten medan en annan grupp bestående av företrädesvis biodrivmedelsintressenter såg det som viktigt att behålla och utveckla EU:s mål för att rädda både klimat och miljö. Motsättningarna som uppkommit väcker frågor kring vilka logiker som legat bakom detta. Avhandlingens syfte är att analysera EU:s biodrivmedelspolicy, vilka aktörer och nätverk som har format denna process, vilka problem och lösningar som dessa aktörer och nätverk argumenterat för i processen, samt hur de har agerat för att mobilisera stöd för sina ståndpunkter. Detta har kopplats till teorier om nätverksstyrning, förekomsten av utlösande händelser i policyprocessen, resursberoende i nätverksmodellen samt på vilket sätt managementteori utövat inflytande. Metoden har varit att utifrån dokumentstudier rekonstruera det historiska förloppet och de aktörer som medverkat i processen. Avhandlingens visar att en förhållandevis liten grupp aktörer har haft ett stort inflytande över policyprocessen från det att problemen som biodrivmedel var satta att lösa definierades i slutet av 80-talet till det att hållbarhetsstandarder utvecklades och implementerades. Dessa aktörer har funnits i policynätverkens kärna och har som ett av sina centrala mål velat utarbeta globala regelverk för råvaruhandeln. De miljöorganisationer som medverkat i processen har genom resursberoenden till stor del varit underordnade denna grupp. Processerna har innehållit ett stort inslag av strategisk planläggning men även utlösande händelser som klimat- och livsmedelskriser har varit viktiga för att motivera politiska beslut. / Biofuels became a prioritized energy source for the EU in the new millennium. It was believed that biofuels would suppress both climate change and problems with energy security, and would simultaneously benefit agricultural employment. The EU Commission decided in 2007 that 10 % of the energy used in transportation would be replaced by biofuels. This was, however, soon criticized by a group of actors (environmental associations and the food industry) that saw the biofuels initiative as a threat to both the environment and food security. The biofuels proponents, on the other hand, argued that it was important to maintain and develop the EU’s biofuels objectives to save both the climate and the environment. These contradictions raised my interest to understand and analyze the logics that lie behind these different perspectives on the same issue. The aim of this thesis is to analyze the EU's biofuels policy, which actors and networks shaped this process, which problems and solutions these actors and networks put forward in the process, and how they have acted to mobilize support for their positions. Theoretically, I have applied theories on policy networks, the occurrence of triggering events in the policy process, resource dependence between actors and networks, and how management theory can be used to understand how policy develops. The main results are that a relatively small group of actors has had a strong influence on the policy process. 
These actors have been at the core of the policy community. The environmental organizations involved in the process have been subordinate to this policy community through resource dependencies. One actor network was formed that wanted to increase the amount of biofuels, while another was formed to protect the forest and soil from heavy exploitation. It took over 20 years before these contradicting efforts collided. This thesis concludes that the process contained large elements of strategic planning and that triggering events such as climate and food crises have been important to justify political decisions.
