  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
13491

Photoluminescence Mapping of Erbium-Doped Lithium Niobate

Xie, Chuanshi January 2023 (has links)
Lithium niobate (LiNbO3) is a man-made crystal that is widely used in modern photonics because of its useful properties. In recent years there has been significant progress in lithium-niobate-on-insulator (LNOI) technology, which has enabled fully functional photonic integrated circuits thanks to the material's capabilities in both electro-optics and second-order optical nonlinearity. Because traditional laser sources are difficult to integrate on chip, rare-earth-ion doping has been considered as a way to bring laser sources and amplifiers onto photonic integrated chips. Among the rare-earth ions, erbium provides laser emission at 1550 nm, which meets the requirements of the high-speed telecommunication band. In this thesis, we investigated a recently fabricated erbium-implanted LNOI sample by setting up a home-built microscope system. The erbium-implanted area was excited with a 980 nm continuous-wave laser, and the emission around 1550 nm was collected and measured. By extending the setup with automatic scanning code, the photoluminescence response of the implanted area on the sample was mapped. The investigation of the implanted samples demonstrated that the erbium-ion emitters emit a measurable photoluminescence signal around 1550 nm in spite of background noise, and notably at room temperature. These results demonstrate the capability of the designed microscope, although detector sensitivity and the signal-to-noise ratio still need to be improved. They also show that implantation of active Er ions into LNOI chips is promising, paving the way for the fabrication of Er photon sources on thin-film lithium niobate as a next step.
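A photoluminescence map of this kind is typically acquired by raster-scanning the sample (or the laser spot) and recording the filtered ~1550 nm signal at each position. The following is a minimal illustrative sketch in Python; the move_stage and read_detector stubs are hypothetical stand-ins for the actual instrument drivers, which the record does not specify.

```python
# Minimal raster-scan PL mapping sketch. The stage and detector interfaces
# below are hypothetical stubs, not the thesis's actual instrument drivers.
import numpy as np

rng = np.random.default_rng(0)

def move_stage(x_um, y_um):
    """Stub: move the scanning stage (or steer the beam) to (x, y) in microns."""
    pass

def read_detector(integration_s):
    """Stub: integrate the filtered ~1550 nm signal; simulated as Poisson counts."""
    return rng.poisson(100 * integration_s)

def pl_map(x_points, y_points, integration_s=0.1):
    """Acquire a 2D photoluminescence map over the given grid of positions."""
    image = np.zeros((len(y_points), len(x_points)))
    for iy, y in enumerate(y_points):
        for ix, x in enumerate(x_points):
            move_stage(x, y)
            image[iy, ix] = read_detector(integration_s)
    return image

# Example: a 26 x 26 grid over a 50 x 50 micron implanted region.
image = pl_map(np.linspace(0, 50, 26), np.linspace(0, 50, 26))
```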
13492

Topological interactions in a multi-layered flocking system

Liedholm, Malin January 2023 (has links)
With the multi-layered flocking system it is possible to simulate flocks containing different types of agents of various sizes (variations in bounding radius and height). In the original implementation, the multi-layered flocking system uses a metric distance to find the nearest neighbours of agents. However, results from real-life field studies suggest that animals in a flock interact with each other using a topological distance. The goal of this thesis is therefore to implement a version of the multi-layered flocking system that uses a topological distance for interaction between agents. This is done by adapting two methods used to find the k-nearest neighbours (kNN), namely the original spatial partitioning method (OSP method) and the enhanced spatial partitioning method (ESP method), to work with the multi-layered flocking system. The aim is to compare the performance of these methods in terms of query time for four flocking scenarios (standard, obstacle, follow and steer away). The implementation contains two types of agents of two different sizes. In the standard scenario all agents move together as a flock. The obstacle scenario is similar to the standard scenario, with the addition that the simulation space contains stationary obstacles. In the follow scenario the smaller agents follow the bigger agents, and in the steer-away scenario the smaller agents steer away from the bigger agents. How different numbers of kNN affect the collective motion of the flock (polarization, extension and frequency of collisions) is also evaluated in the four scenarios. The evaluation was performed by implementing the multi-layered flocking system in the Unity game engine and running simulations with flocks of different sizes (125-3125 agents) and different numbers of interacting kNN (k=5,10,15,20) for each scenario. The results show that the ESP method is on average at least twice as fast as the OSP method in all four flocking scenarios, and the improvement in query time did not differ much between the scenarios. Moreover, k=10 was shown to be a good compromise between fast kNN query times for the ESP method and flocks of agents that still move in a collective manner.
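For readers unfamiliar with spatial partitioning, the core idea behind both the OSP and ESP methods is to bin agents into a uniform grid so that a kNN query only inspects nearby cells, expanding outward ring by ring. A simplified Python sketch of this idea follows; it is not the thesis's Unity/C# implementation, and the cell size and termination check are generic choices.

```python
# Simplified grid-based k-nearest-neighbour query (a generic sketch of the
# spatial-partitioning idea; not the thesis's OSP/ESP Unity implementation).
import math
from collections import defaultdict

def build_grid(positions, cell):
    """Bin agent indices into square cells of side `cell`."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[(int(x // cell), int(y // cell))].append(i)
    return grid

def knn(query_idx, positions, grid, cell, k):
    """Return the k nearest neighbours of agent `query_idx` (topological:
    the count k is fixed, regardless of how far the neighbours are)."""
    assert len(positions) > k, "need at least k other agents"
    qx, qy = positions[query_idx]
    cx, cy = int(qx // cell), int(qy // cell)
    candidates, ring = [], 0
    while True:
        # Add agents from the square ring of cells at Chebyshev distance `ring`.
        for gx in range(cx - ring, cx + ring + 1):
            for gy in range(cy - ring, cy + ring + 1):
                if max(abs(gx - cx), abs(gy - cy)) == ring:
                    candidates.extend(grid.get((gx, gy), []))
        others = [i for i in candidates if i != query_idx]
        if len(others) >= k:
            others.sort(key=lambda i: math.dist(positions[i], (qx, qy)))
            # Any unvisited agent is at least ring*cell away, so we can stop
            # once the k-th candidate is closer than that bound.
            if math.dist(positions[others[k - 1]], (qx, qy)) <= ring * cell:
                return others[:k]
        ring += 1
```

Both methods refine this basic scheme in how candidate cells are enumerated and pruned; the thesis compares them by the time such queries take across the four scenarios.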
13493

Overcoming the security quagmire: behavioural science and modern technology hold the key to solving the complex issue of law firm cyber security

O'Donovan, David, Marshakova, Alexandra 14 May 2018 (has links)
While all industries that handle valuable data have been subject to increasing levels of cyber attack, there is a set of inter-related factors in the law firm cyber security ecosystem that makes such firms more susceptible to attack and also serves to prevent them from taking action to counteract attack vulnerability. As a result of the inter-related external and internal factors affecting law firm cyber security, the human element of firm security infrastructure has been neglected, thereby making humans, at once law firms' greatest asset, their main cyber security weakness. There has been some movement of late, and regulators and clients alike are right to demand law firms do more to improve their cyber security posture. However, much of the scrutiny to which their conduct has been subjected has tended to overlook the complexities of the law firm cyber security quagmire, and unless these issues are addressed in the context of a potential solution, meaningful change is not likely. Part 1 of this paper outlines the current threat landscape and details the integral role of human error in successful cyber breaches before turning to discuss recent cyber security incidents involving law firms. In Part 2, we analyse elements of law firm short-termism and the underregulation of law firm cyber security conduct and how these, when combined, play a key role in shaping law firm cyber security posture. Finally, in Part 3 we outline a realistic solution, incorporating principles from behavioural science and modern technological developments.
13494

Power reduction in digital circuits

Láník, Jan 16 June 2016 (has links)
The topic of this thesis is methods for power reduction in digital circuits that work by reducing the average switching activity at the transistor level. These methods are structural, in the sense that they are not related to tuning the physical properties of the circuitry but to the internal structure of the implemented logic, and they are therefore independent of the particular technology. We developed two novel methods. The first is based on optimizing the structure of the combinational part of a circuit during synthesis. The second method focuses on the sequential part of the circuit: it looks for clock-gating conditions that can be used to disable idle parts of a circuit, and it uses formal methods to prove that the function of the circuit will not be altered.
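The correctness obligation behind the second method can be stated compactly: a register may be clock-gated by a condition g if, whenever g is false, the register would have reloaded its current value anyway. The toy Python check below illustrates this obligation by exhaustive enumeration over a small state space; the thesis relies on formal methods (for realistic circuits the enumeration would typically be replaced by a SAT or model-checking query), so this is only a sketch of the property being proved.

```python
# Toy check (not the thesis's formal flow): gating a register's clock with
# condition g is safe if, whenever g is 0, next_state == current state.
from itertools import product

def gating_is_safe(next_state, gate, n_inputs, n_state_bits):
    """Exhaustively verify the clock-gating condition on all input/state bits."""
    for bits in product([0, 1], repeat=n_inputs + n_state_bits):
        inputs, state = bits[:n_inputs], bits[n_inputs:]
        if not gate(inputs, state) and next_state(inputs, state) != state:
            return False
    return True

# Example: a 1-bit register that loads `d` only when `en` is high;
# gating the clock with `en` is then provably safe.
next_state = lambda inp, st: (inp[1],) if inp[0] else st  # inp = (en, d)
gate = lambda inp, st: inp[0]
print(gating_is_safe(next_state, gate, n_inputs=2, n_state_bits=1))  # True
```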
13495

Selective blockade of the sigma-1 receptor for the treatment of pain of different aetiology: Preclinical studies

Gris Ormo, Georgia 04 December 2015 (has links)
Tesi realitzada als Laboratoris Farmacèutics Esteve / The present Doctoral Thesis focuses on the study of the Sigma-1 receptor (σ1R) in the field of pain. This research has been a part of the preclinical σ1R project focusing on drug discovery of σ1R ligands for the treatment of pain of different aetiologies at the pharmaceutical company ESTEVE. The main goal of this Doctoral Thesis was to explore the therapeutic interest of σ1R blockade in the pharmacological management of neuropathic, inflammatory and postoperative pain. Neuropathic pain was the main indication at the beginning of this Doctoral Thesis, but inflammatory and postoperative pain had never been explored. The efficacy of the selective σ1R antagonist S1RA (E-58262) in these different types of pain was evaluated, and its potency and efficacy was compared to other marketed analgesic drugs. To this end, two species (rat and mouse), different pain-related behavioural endpoints (hind paw withdrawal response to thermal and mechanical stimulation), and different pharmacological strategies (systemic acute and repeated E-52862 administration), were evaluated. σ1R knockout mice were also used to study the in vivo specificity of E-52862 and the involvement of σ1R in the spinal modulation of several pain-related molecular markers in order to ascertain the mechanism of action of σ1R. Taken together, the results of this Doctoral Thesis provide new knowledge about σ1R and support the clinical development of selective σ1R antagonists as a suitable therapeutic intervention to achieve analgesia in pain conditions of different aetiology. / La presente Tesis Doctoral se centra en el estudio del receptor sigma-1 (σ1) en el campo del dolor. Esta investigación ha sido parte de un proyecto de la empresa farmacéutica ESTEVE centrado en el descubrimiento de fármacos con afinidad por el receptor σ1 para el tratamiento de dolor de diferente etiología. El objetivo principal de esta Tesis fue explorar el interés terapéutico del bloqueo del receptor σ1 para el manejo farmacológico del dolor neuropático, inflamatorio y postoperatorio. Se evaluó la potencia y eficacia del antagonista selectivo del receptor σ1, S1RA (E-52862) en estos diferentes tipos de dolor, y se comparó con otros fármacos analgésicos comercializados. Con este fin, se emplearon dos especies (rata y ratón), diferentes evaluaciones comportamentales relacionadas con el dolor (respuesta de retirada de la pata trasera a la estimulación térmica y mecánica), y diferentes estrategias farmacológicas (administración sistémica aguda y repetida del antagonista E-52862). También se utilizaron ratones knockout por el receptor σ1 para estudiar la especificidad in vivo del E-52862 y la participación del receptor σ1 en la modulación espinal de varios marcadores moleculares relacionados con el dolor con el fin de determinar el mecanismo de acción del receptor. En resumen, los resultados de esta Tesis Doctoral proporcionan nuevos conocimientos sobre el receptor σ1 y apoyan el desarrollo clínico de antagonistas selectivos por este receptor como una intervención terapéutica adecuada para lograr analgesia en condiciones de dolor de diferente etiología.
13496

Synchronization costs in parallel programs and concurrent data structures

Aksenov, Vitalii 26 September 2018 (has links)
To use the computational power of modern computing machines, we have to deal with concurrent programs. Writing efficient concurrent programs is notoriously difficult, primarily due to the need to harness synchronization costs. In this thesis, we focus on synchronization costs in parallel programs and concurrent data structures. First, we present a novel granularity-control technique for parallel programs designed for the dynamic multithreading environment. Then, in the context of concurrent data structures, we consider the notion of concurrency-optimality and propose the first implementation of a concurrency-optimal binary search tree which, intuitively, accepts a concurrent schedule if and only if the schedule is correct. We also propose parallel combining, a technique that enables efficient implementations of concurrent data structures from their parallel batched counterparts. We validate the proposed techniques via experimental evaluations showing superior or comparable performance with respect to state-of-the-art algorithms. From a more formal perspective, we consider the phenomenon of helping in concurrent data structures. Intuitively, helping is observed when the order of some operation in a linearization is fixed by a step of another process. We show that no wait-free linearizable implementation of a stack using read, write, compare&swap and fetch&add primitives can be help-free, correcting a mistake in an earlier proof by Censor-Hillel et al. Finally, we propose a simple way to analytically predict the throughput of data structures based on coarse-grained locking.
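The last point is easy to picture: under a single coarse-grained lock, critical sections execute one at a time, so throughput grows with the thread count only until the lock saturates. A back-of-the-envelope model in Python follows; it is a generic illustration of the intuition, not the thesis's actual prediction method or its parameters.

```python
# Back-of-the-envelope throughput model for a data structure protected by one
# coarse-grained lock (generic sketch, not the thesis's analytical model).
def predicted_throughput(critical_s, parallel_s, threads):
    """Ops/sec: the lock serializes critical sections, so total throughput
    scales with the thread count until it saturates at 1/critical_s."""
    per_thread = 1.0 / (critical_s + parallel_s)      # uncontended rate
    return min(threads * per_thread, 1.0 / critical_s)

# Example: 200 ns inside the lock, 800 ns outside; saturates at 5M ops/s.
for t in (1, 2, 4, 8, 16):
    print(t, round(predicted_throughput(2e-7, 8e-7, t)))
```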
13497

Can intangibles lead to superior returns? : Global evidence on the relationship between employee satisfaction and abnormal equity returns.

Ballout, Rami, Nygård, Fredrik January 2013 (has links)
Subject background and discussion: In recent decades, issues of human rights, labor and environmental change have been hot topics worldwide, which has also influenced the financial market. More and more investors use socially responsible investing (SRI) screens when constructing their portfolios. One form of SRI screen is to choose companies that have satisfied employees. Existing theory says that employee satisfaction is an intangible asset of the firm that will positively affect the firm's performance in the future. Intangible assets often go unrecognized by the market and are thereby not incorporated into the stock price. The efficient market hypothesis (EMH) has been studied and debated for several decades. Proponents of the EMH argue that all available information is incorporated in the stock price, so it is not possible to systematically beat the market. However, the EMH is controversial, since research has shown differing results regarding the possibility of earning abnormal returns from various investing strategies. Research question: Is it possible to earn abnormal returns by investing in a portfolio of worldwide firms with top scores on the SRI screen employee satisfaction? Purpose: The main purpose of this study is to examine investors' possibility of earning abnormal returns, with controls for multiple risk factors, by investing in worldwide firms with top scores in employee satisfaction. One sub-purpose is to examine how the market values intangibles depending on the degree of market efficiency. Another sub-purpose is to test two different portfolio weighting methodologies, equally and value weighted, and to observe the differences between them. Theory: This study deals with the efficient market hypothesis and the concepts of SRI, employee satisfaction, intangible assets and several risk-adjusted measurements. Method: We have chosen to perform a quantitative study with a deductive approach to answer our research question. We used a sample of 696 firms based on "Great Place to Work" lists of companies with high employee satisfaction to construct six portfolios with different holding periods and strategies. These portfolios have been explored and tested for significance with both equally and value weighted methods. Result/Analysis: The study finds significant evidence of an average annual abnormal return over the market of 3.66% and 2.43% for our main portfolio, equally and value weighted respectively, using the three-factor model. When adjusting for momentum, thus employing the four-factor model, all the predictive variables still identify strong, statistically significant persistence in the abnormal return. Conclusion: The results show that it was possible to earn abnormal returns during the observed time period regardless of the weighting methodology, although the equally weighted portfolio earned higher abnormal returns. Thus, market efficiency appears to be of weak form, and the market does not fully value intangibles.
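In practice, the abnormal return in such a study is the intercept (alpha) of a time-series regression of the portfolio's excess returns on the risk factors. The sketch below estimates a Carhart four-factor alpha on synthetic monthly data; the column names and all numbers are illustrative assumptions, not the thesis's dataset or results.

```python
# Hedged sketch: estimating a four-factor (Carhart) alpha by OLS on synthetic
# monthly data. Factor names and magnitudes are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 120  # months
factors = pd.DataFrame(rng.normal(0, 0.04, (n, 4)),
                       columns=["MKT_RF", "SMB", "HML", "MOM"])
# Simulated portfolio excess returns with a built-in 0.3% monthly alpha.
excess_ret = (0.003
              + factors.to_numpy() @ np.array([1.0, 0.2, 0.1, 0.05])
              + rng.normal(0, 0.01, n))

X = sm.add_constant(factors)            # adds the 'const' column for alpha
fit = sm.OLS(excess_ret, X).fit()
print(fit.params["const"] * 12)         # annualized alpha estimate
print(fit.pvalues["const"])             # its statistical significance
```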
13498

Enterprise System Post-Implementation: A Practice of System Evaluation Issues in Health Care Organization : A case study of Jönköping County Council

Zhang, Yiping, Yu, Xinyi, Gilles, Sintset January 2011 (has links)
Introduction: As information technology (IT) becomes more and more advanced, enterprise systems (ES) have started to attract researchers' attention. Given the high failure rate of IT projects, it is important to evaluate them properly. This paper conducts a case study in the health care area, with Jönköping County Council's ROS system as the target system. Based on an established linkage between theory and a real-world organization, an enterprise system evaluation is conducted using Uwizeyemungu et al.'s existing Enterprise System Effects Evaluation Model (ESEM). The research questions are as follows: What are the enterprise system effects that impact the business processes? To what extent do the ES effects impact the business processes? Purpose: The study is an exploratory study that aims at identifying the ES effects that impact the business processes and at assessing the importance and the actual degree of these effects. The first question is answered by analyzing documents and interview records, and its results form the basis for the second question. Method: This research adopted a combined approach because of the nature of the research questions. Data were collected through face-to-face interviews, a survey and organizational documents. Secondary data were also used in the analysis. Both qualitative and quantitative data were used to reach a reliable conclusion. Conclusions: The enterprise system effects can be categorized into automational effects, informational effects and transformational effects. The relationships between such effects and performance indicators are very important. By determining the importance and degree of impact of these relationships, the evaluation results can be explicitly calculated and understood.
13499

Essays in labor and public economics

Béland, Louis-Philippe 03 1900 (has links)
In my thesis, I use compelling research designs to address important public policy issues. My first chapter estimates the causal impact of the party allegiance (Republican or Democratic) of U.S. governors on labor market outcomes. I match gubernatorial elections with March CPS data for income years 1977 to 2008. Using a regression discontinuity design, I find that Democratic governors are associated with lower average individual earnings. I provide evidence that this is driven by a change in workforce composition following an expansion in employment of workers with low and medium earnings. I also find that Democratic governors cause a reduction in the racial earnings gap between black and white workers through an increase in the annual hours worked by blacks relative to whites. My second chapter analyzes how shootings in high schools affect schools and students, using data from shooting databases, school report cards, and the Common Core of Data. The chapter is co-written with Dongwoo Kim. We examine schools' test scores, enrollment, and number of teachers, as well as graduation, attendance, and suspension rates at schools that experienced a shooting, employing a difference-in-differences strategy that uses other high schools in the same district as the comparison group. Our findings suggest that homicidal shootings significantly decrease the enrollment of students in Grade 9 and reduce test scores in math and English. We find no statistically significant effect of suicidal shootings on any outcome variables of interest. Using student-level data from California, we confirm that some of the effects on student performance occur as a result of students remaining enrolled, and not only due to changes in student body composition. My third chapter investigates the impact of school mobile phone policies on student performance. The chapter is co-written with Richard Murphy. Combining a unique dataset on autonomous mobile phone policies from a survey of schools in four cities in England with administrative data, we investigate the impact of imposing a mobile phone ban on student performance. Our results indicate an improvement in student results after a school bans the use of mobile phones; this suggests that mobile phones distract from learning and that imposing a ban limits this problem.
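The first chapter's identification strategy can be illustrated in a few lines: in a close-election regression discontinuity, one compares outcomes just above and below the 50% vote-share cutoff, fitting separate local linear trends on each side. The Python sketch below uses synthetic data with a built-in effect of -0.4; all numbers are illustrative, not estimates from the CPS sample.

```python
# Illustrative RDD sketch on synthetic close-election data (not the thesis's
# sample): local linear regression on each side of the 50% cutoff.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
margin = rng.uniform(-0.5, 0.5, 5000)            # vote share minus 0.5
outcome = 10 + 2 * margin - 0.4 * (margin > 0) + rng.normal(0, 1, 5000)

h = 0.1                                          # bandwidth around the cutoff
w = np.abs(margin) <= h
dem = margin[w] > 0                              # Democrat wins indicator
X = sm.add_constant(np.column_stack([margin[w], dem, margin[w] * dem]))
fit = sm.OLS(outcome[w], X).fit()
print(fit.params[2])   # the jump at the cutoff: the RD estimate (about -0.4)
```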
13500

Simulation of the Measurement of the Inclusive Jet Cross Sections in Z(→e+e−/→µ+µ−)+jets Events in pp Collisions at 14 TeV with the ATLAS experiment

Segura i Solé, Ester 19 June 2009 (has links)
After 20 years of preparation, the Large Hadron Collider is going to be switched on in late 2009, smashing protons together at an energy of 14 (10) TeV to recreate the first moments after the Big Bang. Particles will whizz around a circular tunnel 27 km in circumference at near light speed. The tunnel, built near Geneva, and its experiments constitute one of the largest coordinated efforts ever made to study the fundamental structure of nature. It is expected that at the energies reached in proton-proton collisions at the LHC, unknown physical phenomena will occur and be observable. Among the particle debris may lie evidence for extra dimensions, the mysterious dark matter that pervades the universe, or the Higgs boson, which gives mass to elementary particles. ATLAS is one of the LHC experiments. Besides the search for new phenomena in its physics program, there is the understanding of already known physics, and a better understanding of perturbative Quantum Chromodynamics is one of the aims of ATLAS. Quantum Chromodynamics (QCD) is the field theory that describes the strong interaction between quarks and gluons. It remains an "unsolved" theory, since no single approximation method can be applied at all length scales. Perturbative QCD naturally describes a large set of high-energy, large-momentum-transfer cross sections, and its formalism has provided an invaluable tool in the study of the strong interactions. The most prominent signature of QCD at hadron colliders is the production of collimated jets of hadrons. The measurement of the production of such jets in association with a vector boson, W or Z, provides a stringent test of perturbative QCD (pQCD) calculations. Furthermore, some new-physics processes at hadron colliders, such as the production of Higgs bosons and supersymmetric particles, can be mimicked by the production of vector bosons in association with jets, which constitute irreducible backgrounds to these searches. This PhD thesis presents the measurement of the inclusive jet cross section in Z → e+e− and Z → μ+μ− events, comparing theory predictions with "real data", i.e. fully reconstructed Monte Carlo events, for the first 1 fb−1 of data at the ATLAS detector. Reconstructed, corrected data are compared to next-to-LO (NLO) and LO pQCD predictions. The perturbative predictions are corrected for the contributions of non-perturbative processes, such as the underlying event and the fragmentation of the partons into jets of hadrons; these processes are not described by perturbation theory and must be estimated using phenomenological models. Two different sets of reconstructed data are used, PYTHIA and ALPGEN Monte Carlo samples, and comparisons of the two generators' predictions are studied. Background processes are estimated by proposing different data-driven methods to be applied to real data. The ATLAS Cone 0.4 algorithm is used to look for jets in the events after identifying the presence of a Z boson through the reconstruction of its decay (to electrons or muons). Reconstructed data are corrected for detector effects using independent factors. As this work was carried out before the "physics-data" start of the LHC, the presented studies are based on Monte Carlo simulations. During the preparation of a high-energy collider experiment such as ATLAS, these simulations are important for developing efficient strategies for data analysis and for the reconstruction of the physics objects observed with the detectors.
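The "independent factors" used to correct reconstructed data for detector effects are typically bin-by-bin correction factors taken from the simulation: in each jet bin, the ratio of the particle-level to detector-level Monte Carlo spectrum. A minimal numerical sketch follows; the counts are made-up illustrations, not values from the ALPGEN or PYTHIA samples.

```python
# Minimal sketch of bin-by-bin detector correction (illustrative counts only;
# the analysis derives its factors from the ALPGEN/PYTHIA Monte Carlo).
import numpy as np

mc_truth = np.array([1200., 480., 190., 70.])   # particle-level jet counts
mc_reco  = np.array([1100., 500., 205., 80.])   # detector-level jet counts
correction = mc_truth / mc_reco                 # per-bin correction factor

data_reco = np.array([1050., 470., 210., 75.])  # "measured" spectrum
data_corrected = data_reco * correction         # estimate of the true spectrum
print(data_corrected)
```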
