11 |
Versatile low-energy electron source at the PHIL accelerator to characterise Micromegas with integrated Timepix CMOS readout and study dE/dx for low-energy electrons. Krylov, Vladyslav. 12 June 2018 (has links)
Within this thesis, the design, construction and commissioning of the new test beam facility LEETECH were carried out, and its performance, including a low-multiplicity operation mode, was demonstrated. Development of new detector technologies for future high-energy physics collider experiments calls for versatile test beam facilities that allow beam parameters such as particle type, energy and intensity to be chosen or adjusted; such facilities are indispensable for the characterisation and testing of new instruments. Major applications comprise generic detector R&D, conceptual design and choice of detector technologies, technical design, prototype and full-scale detector construction and tests, and detector calibration and commissioning. LEETECH (Low Energy Electron TECHnique) was designed, constructed and commissioned at LAL (Orsay) as an extension of the existing PHIL accelerator. Providing electron bunches of adjustable energy (up to 3.5 MeV), intensity (down to a few particles per bunch) and bunch duration (down to 20 ps), LEETECH fills the gap between high-cost high-energy test beam facilities and radioactive sources. Since it covers the minimum-ionizing particle region (electrons with energies above 1.6 MeV), LEETECH offers tracking detectors conditions similar to those at high-energy facilities. Several types of detectors were investigated using LEETECH as an electron source, both to study their performance and applications and to characterise the facility itself: a large-area diamond sensor, a Micromegas/InGrid module and a quartz bar detector.
First studies of the facility were performed with a plastic scintillator coupled to a micro-channel-plate photomultiplier. The low-multiplicity mode was investigated using the large-area diamond sensor, which at the same time demonstrated its ability to resolve bunches consisting of a few particles. In the framework of Time Projection Chamber development for the ILC project, a test session was dedicated to a large-area Micromegas/InGrid module: for the first time, electron energy losses in a helium-based gas mixture were measured at energies of a few MeV, the dE/dx resolution was obtained, and a track reconstruction algorithm was developed. Finally, a preliminary characterisation was performed of a quartz bar read out by an MCP-PMT, a candidate for the time-of-flight detector of the BESIII upgrade and of the future HIEPA tau-charm factory, aimed at charged-hadron identification. A time resolution of 50 ps was obtained for the detector module, a promising result for further studies of this technology.
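As a back-of-the-envelope aside (my own illustration, not from the thesis; standard relativistic kinematics only, helper names hypothetical), the sketch below shows why electrons above roughly 1.6 MeV already behave as minimum-ionizing particles: dE/dx reaches its broad minimum near βγ ≈ 3 and varies only slowly above it.

```python
import math

ELECTRON_MASS_MEV = 0.511  # electron rest mass [MeV/c^2]

def beta_gamma(kinetic_energy_mev: float) -> float:
    """Relativistic beta*gamma for an electron of given kinetic energy."""
    total_energy = kinetic_energy_mev + ELECTRON_MASS_MEV          # E = T + m
    momentum = math.sqrt(total_energy**2 - ELECTRON_MASS_MEV**2)   # p c = sqrt(E^2 - m^2)
    return momentum / ELECTRON_MASS_MEV                            # beta*gamma = p c / (m c^2)

# Electrons around the LEETECH energy range quoted above:
for t in (0.5, 1.0, 1.6, 3.5):
    print(f"T = {t:4.1f} MeV  ->  beta*gamma = {beta_gamma(t):6.2f}")
# A 1.6 MeV electron already has beta*gamma ~ 4, i.e. it sits at the broad
# ionization minimum, so its dE/dx resembles that of a MIP in a tracker.
```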
|
12 |
Particle identification with nuclear emulsions in OPERA. Manai, Kais. 31 October 2007 (has links) (PDF)
The OPERA experiment aims to establish neutrino oscillations through the appearance of tau neutrinos (ν_τ) in a pure muon-neutrino (ν_μ) beam. The beam is produced at CERN and directed onto the detector located 732 km away. The OPERA detector is composed of two muon spectrometers and a target formed of walls of bricks, each brick an alternating stack of lead and emulsion sheets. This structure allows the kink topology of the tau decay to be reconstructed with high spatial resolution. The great challenge of OPERA is to establish the ν_τ interactions with as little uncertainty as possible, by identifying every background event that does not contain a tau. It is here that my work makes a useful contribution, by offering the possibility of further reducing the background. My main analysis contribution concerns the development of the selection, reconstruction and identification of low-energy muons using the nuclear emulsions. This work rests on correlating variables sensitive both to energy loss and to multiple scattering, whereas previously only the energy loss was used in such particle-separation analyses. My study doubled the identification efficiency for low-energy muons, which will increase the power to reject background events and reduce the contamination by 30%. I also studied the power of the emulsions to identify and separate charged particles, through the analysis of a test exposure performed by the Nagoya group in Japan containing protons and pions of various energies. I showed that the European scanning system gives results comparable to those obtained with the Japanese scanning system.
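To illustrate how multiple scattering complements energy loss for particle separation in emulsion, here is a small sketch (my own illustration, not the thesis analysis; plate thickness and radiation length are assumed "OPERA-like" values) using the standard Highland parameterisation of the RMS projected scattering angle, θ0 = (13.6 MeV / βcp) √(x/X0) [1 + 0.038 ln(x/X0)]. At equal momentum a proton is slower than a pion (larger 1/β), so it scatters more, and correlating this with its larger dE/dx sharpens the separation.

```python
import math

def highland_theta0_mrad(p_mev: float, mass_mev: float, x_over_x0: float) -> float:
    """RMS projected multiple-scattering angle (Highland formula), in mrad.

    p_mev:      particle momentum [MeV/c]
    mass_mev:   particle rest mass [MeV/c^2]
    x_over_x0:  traversed thickness in radiation lengths
    """
    energy = math.sqrt(p_mev**2 + mass_mev**2)
    beta = p_mev / energy
    theta0 = (13.6 / (beta * p_mev)) * math.sqrt(x_over_x0) \
             * (1.0 + 0.038 * math.log(x_over_x0))
    return theta0 * 1e3  # rad -> mrad

# Assumed: one ~1 mm lead plate, X0(lead) ~ 5.6 mm -> x/X0 ~ 0.18.
X_OVER_X0 = 1.0 / 5.6
for name, mass in (("pion", 139.6), ("proton", 938.3)):
    theta0 = highland_theta0_mrad(500.0, mass, X_OVER_X0)
    print(f"{name:6s} at 500 MeV/c: theta0 ~ {theta0:5.1f} mrad")
# The slower proton scatters roughly twice as much as the pion at the same
# momentum, which, combined with its higher grain density (dE/dx), separates them.
```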
|
13 |
A study of deep levels of AlGaAs/GaAs heterojunction bipolar transistors. Huang, Chun-ta. 10 July 1992 (has links)
Deep levels in the emitter region of a heterojunction bipolar transistor are investigated using deep level transient spectroscopy (DLTS), deep level admittance spectroscopy (DLAS), thermally stimulated capacitance (TSCAP), and capacitance-voltage (C-V) profiling. The DX center, with an activation energy of 0.45 eV, is the only deep level detected. By varying the DLTS rate window and filling pulse widths, DX is found to be composed of two closely spaced DX centers, denoted DX1 and DX2. A positive peak observed in the DLTS spectra is attributed to electron capture, not minority carrier emission, and is thus an experimental artifact. Finally, the reduction of current gain (β) at low collector current and the effect of the DX center on the switching characteristics of HBTs are briefly discussed. / Graduation date: 1993
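As context for how a DLTS rate window singles out a level with Ea = 0.45 eV, the sketch below (illustrative only; the emission prefactor and capture cross-section are assumed values, not taken from the thesis) evaluates the classic thermal emission rate e_n(T) = γ σ T² exp(−Ea/kT) and finds the temperature at which it matches a chosen rate window, i.e. where the DLTS peak appears.

```python
import math

K_B = 8.617e-5   # Boltzmann constant [eV/K]
GAMMA = 2.3e20   # emission prefactor [cm^-2 s^-1 K^-2] (assumed, n-type GaAs-like)
SIGMA = 1e-14    # capture cross-section [cm^2] (assumed)
E_A = 0.45       # activation energy of the DX level [eV]

def emission_rate(temp_k: float) -> float:
    """Thermal electron emission rate e_n(T) = gamma * sigma * T^2 * exp(-Ea/kT) [1/s]."""
    return GAMMA * SIGMA * temp_k**2 * math.exp(-E_A / (K_B * temp_k))

def peak_temperature(rate_window_hz: float) -> float:
    """Temperature at which e_n(T) equals the rate window (bisection search)."""
    lo, hi = 50.0, 500.0          # e_n is monotonically increasing on this range
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if emission_rate(mid) < rate_window_hz:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Faster rate windows push the DLTS peak to higher temperature:
for window in (20.0, 80.0, 320.0):   # rate windows [1/s]
    print(f"rate window {window:5.0f}/s -> peak near {peak_temperature(window):5.1f} K")
```

Plotting ln(e_n/T²) against 1/kT for the measured peak positions is the standard Arrhenius construction whose slope gives the 0.45 eV activation energy quoted above.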
|
14 |
Uncontrollable and unpredictable stress with a reminder experience induces long-lasting effects on physiology and behavior: A novel approach to modeling post-traumatic stress disorder in rats. Zoladz, Phillip R. 01 June 2006 (has links)
People who endure horrific, life-threatening experiences are at risk for developing post-traumatic stress disorder (PTSD). However, only about 25% of all individuals who experience trauma develop PTSD. Recent research indicates that the presence of certain physiological conditions, such as reduced cortisol and parasympathetic inhibition, during trauma may increase one's susceptibility to developing PTSD. Thus, I attempted to develop a novel animal model of PTSD and test the hypothesis that reduced adrenal and parasympathetic activity during stress would exacerbate its long-term effects on behavior. In Experiment One, adult male rats were exposed to two stress sessions, each involving one hour of immobilization plus cat exposure. Before each session, rats were injected with vehicle, metyrapone, AF-DX 116, or both drugs. The second session occurred 10 days after the first and served to model a traumatic flashback. Stressed rats endured unstable housing conditions throughout the experiment to add an element of daily anxiety. Three weeks after the second session, all rats underwent a battery of tests to examine the lasting effects of stress on physiology and behavior. The results indicated that stressed rats exhibited heightened anxiety on the elevated plus maze, an exaggerated startle response, and greater blood pressure, relative to controls. Moreover, metyrapone, when combined with stress, led to significant short- and long-term spatial memory impairments. Experiment Two assessed the effects of the same stress paradigm on rats' sensitivity to yohimbine, an alpha-2 adrenergic receptor antagonist. Yohimbine induces flashbacks and panic attacks in patients with PTSD; thus, I hypothesized that stressed rats would react abnormally to this agent. Stressed and unstressed rats were administered vehicle or yohimbine (1 mg/kg) 30 min prior to behavioral testing. The results indicated that stressed rats were hyperresponsive to yohimbine, as evidenced by a greater suppression of rearing, greater avoidance of the center of the open field, and a greater suppression of activity on the elevated plus maze, relative to controls. Collectively, the findings of these studies indicate that uncontrollable and unpredictable psychological stress produces lasting changes in the physiology and behavior of rats that resemble symptoms commonly observed in people with PTSD.
|
15 |
Optically detected magnetic resonance investigations of defects with negative-U properties: DX centers in AlGaAs and the oxygen defect in GaAs. Linde, Matthias. Unknown Date (has links)
Universität Paderborn, Diss., 1995.
|
16 |
Optical, thermal and economic optimisation of a linear Fresnel collector. Moghimi Ardekani, Mohammad. January 2017 (has links)
Solar energy is one of very few low-carbon energy technologies with enormous potential to grow to a large scale. Currently, solar power is generated via photovoltaic (PV) and concentrating solar power (CSP) technologies. The ability of CSP to scale up renewable energy at the utility level, and to store energy for electrical power generation even when the sun is not available (after sunset or on a cloudy day), makes this technology an attractive option for sustainable clean energy. The levelised electricity cost (LEC) of CSP with thermal storage was about 0.16-0.196 Euro/kWh in 2013 (Kost et al., 2013). Lowering the LEC and harvesting more solar energy from CSP plants therefore motivates researchers to work towards the optimisation of such plants, and encourages individuals and governments to invest more in this clean source of energy, shifting their societies' energy consumption from fossil fuels to solar.
Usually, researchers concentrate only on optimising the technical aspects of CSP plants (thermal and/or optical). However, technical optimisation that disregards economic goals cannot produce a fruitful design; in some cases it may even increase the cost of the plant, and with it the price of the generated electricity.
The study focused on a comprehensive optimisation of one of the main CSP technology types, the linear Fresnel collector (LFC). The entire LFC solar domain was considered in an optimisation process to maximise the harvested solar heat flux throughout an imaginary summer day (optical goal), and to minimise both cavity receiver heat losses (thermal goal) and the manufacturing cost of the plant (economic goal). To illustrate the optimisation process, an LFC was considered with 12 design parameters influencing the three objectives, and a unique combination of the parameters was found which optimised the performance. Different engineering tools and approaches were introduced in the study: for the thermal goal, Computational Fluid Dynamics (CFD) and view-area approaches were suggested, and for the optical goal, CFD and Monte Carlo ray-tracing approaches were introduced. The applicability of these methods to the optimisation process was discussed through case-study simulations. The study showed that for the intensive optimisation of an LFC plant, using Monte Carlo ray tracing as the high-fidelity approach for the optical objective and the view-area method as the low-fidelity approach for the thermal objective made the most sense, saving computational cost without sacrificing accuracy in comparison with the other combinations of the suggested approaches.
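To make the Monte Carlo ray-tracing idea concrete, here is a deliberately simplified 2D sketch (my own illustration under stated assumptions, not the thesis code; all geometry values are assumed): flat mirror strips are tilted so that each strip's centre reflects the incoming sun ray towards the receiver aperture, rays are sampled uniformly across each strip with a small Gaussian slope error, and the intercept factor is the fraction of reflected rays that reach the aperture.

```python
import math
import random

random.seed(42)

H = 4.0            # receiver height above the mirror field [m] (assumed)
APERTURE = 0.15    # receiver aperture half-width [m] (assumed)
STRIP_HW = 0.30    # mirror strip half-width [m] (assumed)
SLOPE_ERR = 2e-3   # RMS mirror slope error [rad] (assumed)

def intercept_factor(strip_centers, sun_angle, n_rays=20_000):
    """Fraction of reflected rays entering the aperture (2D Monte Carlo sketch)."""
    d = (math.sin(sun_angle), -math.cos(sun_angle))  # incoming sun-ray direction
    hits = 0
    for _ in range(n_rays):
        xc = random.choice(strip_centers)
        # Tilt each flat strip so its centre reflects the sun ray to (0, H);
        # the mirror normal bisects the towards-sun and towards-receiver directions.
        phi_recv = math.atan2(-xc, H)
        tilt = 0.5 * (-sun_angle + phi_recv)
        # Sample a point along the tilted strip:
        u = random.uniform(-STRIP_HW, STRIP_HW)
        x, y = xc + u * math.cos(tilt), -u * math.sin(tilt)
        # Reflect about the normal, perturbed by a random slope error:
        phi_n = tilt + random.gauss(0.0, SLOPE_ERR)
        nx, ny = math.sin(phi_n), math.cos(phi_n)
        dot = d[0] * nx + d[1] * ny
        rx, ry = d[0] - 2 * dot * nx, d[1] - 2 * dot * ny
        x_hit = x + rx * (H - y) / ry                # propagate to receiver plane
        hits += abs(x_hit) <= APERTURE
    return hits / n_rays

mirrors = [-1.5, -0.75, 0.75, 1.5]                   # strip centre x-positions [m]
for deg in (0, 15, 30):
    f = intercept_factor(mirrors, math.radians(deg))
    print(f"sun {deg:2d} deg off zenith: intercept factor ~ {f:.3f}")
```

A high-fidelity code additionally models sun shape, shading and blocking between strips, and spectral reflectivity, but the structure — sample rays, apply optical errors, count what reaches the receiver — is the same, which is why its cost scales with the required statistical accuracy.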
With some modification, the approaches of this study can be extended to the optimisation of other CSP technologies. The techniques give future researchers alternative options for choosing the best approach to the optimisation of a CSP plant, considering the nature of the optimisation, its computational cost and its accuracy. / Thesis (PhD)--University of Pretoria, 2017. / Mechanical and Aeronautical Engineering / PhD / Unrestricted
|
17 |
Communicating Results of New Genomic Tests to Physicians. Jin, Jing. 07 May 2009 (has links)
Background: New genomic tests are being developed to predict an individual’s risk of cancer recurrence by analyzing the expression of multiple genes. However, it is unclear how to report the test results so that they would be most useful to clinicians. A mail-out questionnaire has the potential to help a) describe physicians’ attitudes towards the clinical use of new genomic tests, b) determine what information physicians prefer to have included in the test reports, and c) explore how physicians think the test results would impact their treatment recommendations.
Objectives: To design a questionnaire that could be used in an eventual large-scale survey, and to ensure that the questionnaire a) is comprehensible, b) has face validity, c) appears interesting to, and d) does not place undue response burden on, the target population.
Methods: A first draft, based on a specific genomic test for breast cancer recurrence (Oncotype DX) and on two case scenarios, was created. Cognitive interviews with practicing oncologists were conducted to identify problems in the questionnaire. The evaluation involved face-to-face interviews with Kingston oncologists who treat breast cancer, followed by telephone interviews with medical oncologists treating breast cancer elsewhere in Ontario. Three to four oncologists were included in each round of interviewing, after which the questionnaire was revised based on that round's recommendations. Additional rounds of interviews were conducted until an entire round raised no new problems or issues.
Results: A medium-length questionnaire was drafted. Four rounds of interviews were conducted, with no new problems or issues raised in the fourth round. Most of the problems identified related to comprehensibility, followed by logical issues, which revealed fundamental problems in the questionnaire design. There was no evidence of fatigue or disinterest among participants, and they deemed the response burden reasonable.
Conclusion: The results suggest that the proposed questionnaire is comprehensible and has face validity. Additionally, it appears interesting to, and would not place undue burden on, the target population. The questionnaire is thus ready for field administration. / Thesis (Master, Community Health & Epidemiology) -- Queen's University, 2009-05-05 17:23:10.551
|
18 |
Evaluation of the Effectiveness of Implementing a UI Library in FinTech Applications. Hallberg, Emil. January 2021 (has links)
Implementing new technology in a complex software development environment comes with many challenges in terms of code, user interface design, and developer experience. The pressing demands of security and regulation within financial technology make it even more essential to implement new technologies carefully and with minimised risk. This work aims to determine the effectiveness of such implementations. Specifically, it evaluates the effectiveness of implementing a UI library in a FinTech application in order to find the most suitable approach. In this context, a UI library is defined as a robust set of user interface components available in one place, and a FinTech application as an application in financial technology with a complex development infrastructure. To evaluate the effectiveness of implementing a UI library in a FinTech application, a thorough literature survey was first performed to identify decisive factors relating to code quality, user interface, and developer experience. In a case study with a FinTech company serving as an example, a solution consisting of the company's product and the UI library was developed. The solution was tested by collecting data from code evaluation, questionnaires, and interviews. The results show that the solution has higher code quality, fulfils the FinTech UI requirements, and was perceived as an improvement to the development infrastructure. On this basis, the methodology and the factors recognised in this work should be taken into account when identifying the most suitable approach to implementing a UI library in a FinTech application.
|
19 |
Search for squarks and gluinos with the DELPHI experiment at LEP. Verdier, Patrice. 23 April 2001 (has links) (PDF)
The increase in energy and luminosity of LEP considerably extended the reach of searches for new physics at e+e− colliders. Supersymmetry solves several problems of the Standard Model by introducing a symmetry between fermions and bosons. The stop (t̃₁) and sbottom (b̃₁) squarks, supersymmetric partners of the third-generation quarks, occupy a special place: they could be among the lightest supersymmetric particles. Squarks were first searched for in the data collected by DELPHI from 1998 to 2000 at centre-of-mass energies from 189 to 208 GeV. When R-parity is conserved, the lightest supersymmetric particle (LSP) is the neutralino (χ̃⁰₁), which interacts only very weakly with matter. The squark decay channels are then t̃₁ → c χ̃⁰₁ and b̃₁ → b χ̃⁰₁, so the sought-after events are characterised by two jets and missing energy. Particular attention was paid to the modelling of stop hadronisation and to the study of photon-photon interactions producing hadrons. Limits on the squark masses were set. However, new models predict that the LSP is the gluino (g̃), calling the missing-energy signature of the LSP into question. The gluino-LSP scenario was therefore developed and searched for in DELPHI. The data recorded in 1994 at the Z0 resonance allowed a limit on the mass of a stable gluino to be set for the first time. The LEP2 data were analysed for the decay channels t̃₁ → c g̃ and b̃₁ → b g̃, and limits on the squark masses in this scenario were obtained; they reinforce the limits obtained at LEP1.
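As a minimal illustration of the missing-energy signature exploited here (my own sketch, not the thesis analysis code; the energy and jet four-momenta are illustrative assumptions): in an e+e− collision the initial four-momentum is fully known, so the momentum carried away by undetected LSPs is simply the difference between the initial state and the sum of the visible objects.

```python
import math

SQRT_S = 200.0  # assumed LEP2 centre-of-mass energy [GeV]

def missing_four_momentum(visible, sqrt_s=SQRT_S):
    """Missing (E, px, py, pz) in an e+e- event, beams along z with zero net momentum.

    visible: list of (E, px, py, pz) tuples for reconstructed objects [GeV].
    """
    e  = sqrt_s - sum(v[0] for v in visible)
    px = -sum(v[1] for v in visible)
    py = -sum(v[2] for v in visible)
    pz = -sum(v[3] for v in visible)
    return e, px, py, pz

# Two acoplanar jets, illustrative numbers only:
jets = [(45.0, 30.0, 20.0, 25.0),
        (55.0, -40.0, -10.0, -30.0)]
e, px, py, pz = missing_four_momentum(jets)
print(f"missing energy = {e:.1f} GeV, missing pT = {math.hypot(px, py):.1f} GeV")
```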
|
20 |
Influence of Oncotype DX® on chemotherapy prescribing in early stage breast cancer patients: a claims-based evaluation of utilization in the real world. Kennedy, Kenneth Neil. 01 January 2012
The decision for adjuvant therapy in women with early stage breast cancer (ESBC) has historically been guided by the presence or absence of specific biological markers (hormone and HER2 receptors), age, and extent of nodal involvement. Oncotype DX® is a validated assay that quantifies the expression of a panel of genes to predict the risk of cancer recurrence. This study evaluates whether the use of Oncotype DX® impacts chemotherapy prescribing in ESBC. This retrospective cohort study identified patients with ESBC from a large commercially insured population from January 2007 through June 2009. Patients were identified as having ESBC using procedure and diagnosis codes indicating that a sentinel lymph node biopsy had been performed. Hormone receptor status was verified by receipt of at least one month of hormonal therapy (tamoxifen, anastrozole, letrozole, or exemestane). Exclusion criteria were age under 18 years, procedure codes indicating axillary lymph node dissection, or charges for trastuzumab. The administration of Oncotype DX® was not found to significantly affect a physician's decision to prescribe chemotherapy; however, there were significant regional differences in Oncotype DX® utilization. Future studies should be conducted at a population level to determine the effects of Oncotype DX®.
|