81

An electronic model of the ATLAS Phase-1 Upgrade Hadronic Endcap Calorimeter Front End Crate Baseplane

Porter, Ryan 07 August 2015 (has links)
This thesis presents an electrical model of two pairs of interconnects of the ATLAS Phase-1 Upgrade Hadronic Endcap Calorimeter (HEC) Front End Crate prototype baseplane. Stripline transmission lines of the baseplane are modeled using the 3D electromagnetic (finite element method) simulator in Keysight Technologies' Electromagnetic Professional (EMPro), and the connectors are modeled using built-in models in Keysight Technologies' Advanced Design System (ADS). The model is compared in both the time and frequency domains to measured Time Domain Reflectometer (TDR) traces and S-parameters. The S-parameters of the model are found to be within 5% of the measured S-parameters for transmission and reflection, and to range from 25% below to 100% above the measurements for forward and backward crosstalk. To make these comparisons, the cables used to connect the prototype HEC baseplane to the measurement system had to be included in the model. Plots of the S-parameters of a model without these cables are presented for one pair of interconnects whose crosstalk is expected to be higher than that of most other interconnects of the baseplane. / Graduate / 0605 / 0798 / rdporter@uvic.ca
82

Duomenų vientisumo apribojimų realizavimo strategijos didelėje įmonėje tyrimas / Investigation of strategies for implementing data integrity constraints in a large enterprise

Preibys, Justas 28 January 2008 (has links)
This thesis surveys the types of data integrity constraints and the ways of implementing them, and analyses the advantages and disadvantages of each. For every constraint type, the most effective implementation approach was selected according to its execution time and implementation complexity, and the hypotheses raised during the analysis were confirmed experimentally. Based on the results of the experimental study, a methodology for choosing an implementation strategy for data integrity constraints was developed and described. It could help large enterprises implement data integrity constraints effectively, spending as little time, effort and system resources as possible while still ensuring data integrity and correctness.
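The declarative end of the strategy spectrum, where the database engine itself enforces a rule with no application code, can be sketched with a minimal example (this uses Python's built-in sqlite3 module; the table and column names are invented for illustration, not taken from the thesis):

```python
import sqlite3

# Declarative CHECK constraint: the engine enforces the rule on every
# write, so no procedural validation code is needed in the application.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        id       INTEGER PRIMARY KEY,
        quantity INTEGER NOT NULL CHECK (quantity > 0)
    )
""")

conn.execute("INSERT INTO orders (quantity) VALUES (5)")  # accepted

try:
    conn.execute("INSERT INTO orders (quantity) VALUES (-1)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True  # the engine blocked the invalid row

print(rejected)  # True
```

The procedural alternatives the thesis weighs (triggers, stored procedures, application-level checks) trade this simplicity for flexibility, at the cost of extra execution time and implementation effort.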
83

Vientisumo apribojimų įgyvendinimo duomenų bazėse metodika / A methodology for realizing integrity constraints in databases

Eidukevičius, Edvinas 16 January 2006 (has links)
This work discusses complex rule systems and the issues involved in implementing integrity constraints. It describes the integrity constraints and their types, shows their representation both in UML concepts and in Oracle PL/SQL code, and analyses the possibilities for realizing integrity constraints. The constraints were realized in a database system, and the testing process for them was carried out in that database. The work introduces a methodology to help users with implementation issues, possibilities and methods, and a system for managing integrity constraints through templates; to some extent, the program generates PL/SQL code from user-defined variables.
84

SURFACE ELECTROMYOGRAPHY CHARACTERIZATION OF THE LOCAL TWITCH RESPONSE ELICITED BY TRIGGER POINT INJECTION AND SNAPPING PALPATION IN MYOFASCIAL PAIN PATIENTS

Lim, Pei Feng 01 January 2004 (has links)
Local twitch responses (LTRs) can be elicited by snapping palpation of myofascial trigger points (TrPs) or by TrP injections. Objective: To characterize, using surface electromyography (sEMG), the LTR elicited by TrP injection and snapping palpation in 14 female subjects with myofascial pain. Methods: Surface EMG electrodes were placed around the TrP and a control site on the trapezius muscle. The following protocol was then carried out: tension and contraction of the ipsilateral trapezius muscle, baseline resting activity (five minutes), snapping palpation of the TrP and the control sites, TrP injection, and final resting activity (five minutes). The following data were recorded: pain ratings, areas of referred pain, presence of LTRs, and sEMG recordings. Results: During the TrP injection, the investigator observed LTRs in only 36% of the subjects, while 64% of the subjects reported feeling the LTR; the sEMG recorded only one LTR, in one subject. Despite the low percentage of LTRs elicited clinically (36%), a large proportion of subjects (71%) reported more than 50% immediate reduction in pain intensity after the TrP injection. Conclusion: The sEMG is unable to register the LTR elicited by snapping palpation and TrP injection.
85

Mortgage Default in Southern California: Examining Distressed Borrowers' Decision Making and Market Contagion

Wilkerson, Michael 01 January 2012 (has links)
This dissertation focuses on mortgage defaults in Southern California during the housing bubble of the 2000s. The rapid decline in the housing market that precipitated the recent recession was accompanied by an unprecedented number of loan defaults and foreclosures. Recent studies have identified two major theories of default: the "double trigger" hypothesis, under which negative equity and an income shock are jointly necessary conditions for default, and "strategic default", under which negative equity alone is a sufficient condition. This dissertation adds to the default literature by including short sale as another possible outcome of mortgage default. The primary goal is to analyze the determinants of mortgage default in order to understand the conditions under which strategic behavior in home sales is most likely to occur. Data on every closed sale in Los Angeles County from 2007 to 2010 were analyzed and coded into three possible sales outcomes: 1) organic sale, 2) short sale, and 3) real estate owned (REO). A multinomial probit model was used to model homeowner decision-making based on the sale outcome. The model rejected the "double trigger" hypothesis, as income shocks were found to have no significant effect on the predicted probability of a distressed sale. Education levels, home sales prices, credit card debt, and market price reductions were found to be significant variables in determining distressed-sale outcomes, thereby supporting the strategic default hypothesis. The next section studied the spatial association of short sales and REO to see whether any contagion effects were present; both short sales and REO were found to form clusters of hot and cold spots. Social stigma is believed to affect consumer behavior, and this theory was supported by the findings of contagion and spatial lag. The final section constructed a hedonic price model to capture the price effects that distressed sales have on neighborhood pricing. Foreclosures were found to have three times the negative impact on neighborhood pricing compared to short sales.
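The three-way outcome coding described above can be sketched as a simple classification rule. This is an illustrative sketch only; the field names, flags and decision rule are invented, not the dissertation's actual coding scheme:

```python
# Each closed sale is coded into one of three outcomes before fitting
# a multinomial choice model. Field names are hypothetical.
def code_sale(record):
    if record["bank_owned"]:
        return "REO"            # lender repossessed the home, then sold it
    if record["sale_price"] < record["loan_balance"]:
        return "Short Sale"     # sold for less than the outstanding debt
    return "Organic"            # ordinary arm's-length sale

sales = [
    {"sale_price": 500_000, "loan_balance": 300_000, "bank_owned": False},
    {"sale_price": 250_000, "loan_balance": 400_000, "bank_owned": False},
    {"sale_price": 200_000, "loan_balance": 350_000, "bank_owned": True},
]

print([code_sale(s) for s in sales])  # ['Organic', 'Short Sale', 'REO']
```

The coded outcome then serves as the categorical dependent variable in the multinomial probit, with education, price, debt and market variables as regressors.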
86

Masking problematic channels in the liquid argon calorimeter for the high-level trigger of ATLAS

Taylor, Ryan Paul 02 June 2009 (has links)
Read-out channels in the liquid argon (LAr) calorimeter of the ATLAS detector are susceptible to various kinds of faults, which can impair the selection of events made by the trigger system. General-purpose software tools have been developed for dealing with problematic calorimeter channels. In order to give High-Level Trigger (HLT) algorithms robustness against detector problems, these tools have been applied in the HLT calorimeter data preparation code to mask problematic channels in the LAr calorimeter. Timing measurements and optimizations have been conducted to assess and minimize the impact of these operations on the execution speed of HLT algorithms. The efficacy of the bad-channel masking has been demonstrated using cosmic-ray data.
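The masking operation itself is conceptually simple: energies from channels flagged as problematic are replaced before the trigger algorithms see them. A minimal sketch (the channel numbering, bad-channel set and energy values are illustrative, not the actual ATLAS tool interface):

```python
# A flagged channel's energy is replaced with zero so that downstream
# trigger algorithms never see spurious values from faulty cells.
BAD_CHANNELS = {3, 7}  # hypothetical IDs of known-problematic channels

def mask_channels(energies, bad=BAD_CHANNELS):
    """Return the energy list with entries for known-bad channels zeroed."""
    return [0.0 if ch in bad else e for ch, e in enumerate(energies)]

# Channel 3 reads absurdly high and channel 7 negative, say: both masked.
readout = [12.5, 0.3, 4.1, 999.0, 2.2, 0.0, 1.7, -55.0]
print(mask_channels(readout))
```

In practice the lookup must be fast enough not to degrade trigger latency, which is why the thesis pairs the masking with timing measurements and optimizations.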
87

Studies of leaching, recovery and recycling of heavy metals

Askari, Hallo Mustafa January 2008 (has links)
The leachability of cadmium, cobalt, copper, lead, nickel and zinc metals and their oxides, sulfides and carbonates by water, 0.5 mol dm-3 CH3COOH, 0.1 mol dm-3 HCl/NaCl (1:1 mixture) and 2 mol dm-3 HNO3 is reported. The concentrations of the leached heavy metals are compared with the trigger levels set by the World Health Organisation (WHO). Three leaching solutions (nitric, sulfuric and hydrochloric acids) were used to extract copper, zinc, cobalt, nickel, iron and lead from spent catalysts prior to the application of separation technologies. Leaching experiments were conducted using both traditional methods and a microwave-assisted extraction technique. Data are provided on the effects of leaching temperature, leaching time, solid-to-liquid ratio and acid concentration on the extraction of the different metals. The use of 2 mol dm-3 sulfuric acid at 50°C for 60 minutes and at a solid/liquid (S/L) ratio of 1:25 achieved more than 90% extraction for all the metals studied. A comparison of the results from the traditional and microwave extraction techniques demonstrates that microwave heating reduced the time required to obtain maximum metal extraction. The kinetics of the traditional extraction procedure showed that diffusion was the rate-controlling process, but it was not possible to conclusively establish the rate-controlling process for the microwave leaching. The feasibility of using an electrodialysis process to separate metal ions, such as copper from zinc, was examined. A laboratory-scale three-compartment membrane system was designed, constructed, used and optimised for the separation process. The separation of copper from zinc in the electrodialysis process exploited the greater stability of the Cu-EDTA complex compared with the Zn-EDTA complex. It was observed that Zn2+ ions migrated through the cation-exchange membrane from the central compartment to the catholyte while, simultaneously, the negative Cu-EDTA complex crossed the anion-exchange membrane into the anolyte compartment. The technique was successfully used to separate mixtures of Cu:Cd and Zn:Ni; it could not, however, be used for the separation of Zn from Cd. An adsorption process was used to prepare copper, iron, nickel and zinc oxide catalysts on a γ-Al2O3 support. The materials prepared were used in a fixed-bed reactor to assess the catalytic oxidation of volatile organic compounds (methane and ethane) in air. Cu/γ-Al2O3 was found to be the most promising catalyst for the complete oxidation of methane and ethane, at temperatures of 575°C and 525°C respectively. Increasing the calcination temperature in the drying and pre-treatment of the catalysts resulted in a decrease in catalytic activity.
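A diffusion-controlled conclusion of the kind reached for the traditional leaching is typically obtained by linearizing the conversion data against a shrinking-core rate law. A sketch of that check, assuming the Ginstling-Brounshtein form for product-layer diffusion and an invented rate constant (the thesis does not state which specific model or constant was used):

```python
def g_diffusion(x):
    """Ginstling-Brounshtein function for product-layer diffusion control.

    If diffusion is the rate-controlling step, g(x) = k*t, i.e. g of the
    fractional conversion x is linear in leaching time t.
    """
    return 1 - (2 * x) / 3 - (1 - x) ** (2 / 3)

k = 0.002  # assumed apparent rate constant (1/min), purely illustrative

# Times at which this model would predict 30%, 60% and 90% extraction:
for x in (0.30, 0.60, 0.90):
    t = g_diffusion(x) / k
    print(f"x = {x:.0%} reached at t = {t:.1f} min")
```

Plotting g(x) against t for the measured data and checking for a straight line through the origin is the usual diagnostic; curvature would point to a different rate-controlling step, as was apparently the case for the microwave leaching.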
88

FPGA-based Instrumentation for Advanced Physics Experiments

Hidvégi, Attila January 2011 (has links)
Modern physics experiments often demand advanced instrumentation that builds on the latest technology. This work describes four instrumentation-physics projects based on modern, high-capacity Field-Programmable Gate Arrays (FPGAs), making use of their versatility, programmability, high-bandwidth communication interfaces and signal-processing capabilities. In the first project, a jet-finding algorithm for the ATLAS detector at the LHC experiment at CERN was developed and implemented, and different verification methods were created to validate its functionality and reliability. The experiment uses a three-level trigger system, whose first level uses custom FPGA-based hardware to analyse collision events in real time. The second project was an advanced timing and trigger distribution system for the new European X-Ray Free Electron Laser (XFEL) facility at DESY in Hamburg. XFEL will enable scientists to study nanostructures on the atomic scale. Its laser pulses will have the strongest peak power in the world, with extremely short duration and a high repetition rate, which will even allow the filming of chemical reactions. The timing system uses modern FPGAs to distribute high-speed signals over optical fibers and to deliver clocks and triggers with high accuracy. The third project was a new data acquisition board based on high-speed ADCs combined with high-performance FPGAs, to process data from segmented Ge detectors in real time. The aim was to improve system performance by greatly oversampling and filtering the analog signals to achieve greater effective resolution. Finally, an innovative solution was developed to replace an aging system used at CERN and Stockholm University to test vital electronics in the Tile Calorimeter of the ATLAS detector system. The new system is entirely based on a commercial FPGA development board, on which all necessary custom communication protocols were implemented in firmware to emulate the obsolete hardware.
/ In the field of instrumentation physics, research and development is carried out on the advanced instruments used in modern physics experiments. This dissertation describes four projects in which programmable logic devices (FPGAs) play key roles in solving demanding instrumentation tasks. The first project covers the development and implementation of an algorithm for detecting particle jets following particle collisions in the ATLAS detector of the LHC experiment. The experiment generates 40 million events per second, which must be analysed in real time using fast parallel algorithms; the result determines which events are interesting enough for further, more careful analysis. The second project describes the development of a system that distributes clock and trigger signals with extreme precision across a 3-kilometre experimental area in the new X-ray free-electron laser XFEL at DESY in Hamburg, where nanostructures will be explored and even the chemical reactions of molecules filmed. The third project covers a high-speed data acquisition system for segmented Ge detectors: by oversampling the signal at high speed, a better measurement accuracy can be achieved than the ADC's native resolution allows, which in turn improves system performance. Finally, an innovative solution is described for a test system for the electronics that Stockholm University delivered to the ATLAS detector. The new system replaces its predecessor, which was based on obsolete components no longer available, and is also cheaper because it is built on a standard FPGA development board. / ATLAS experiment of the Large Hadron Collider experiment / European X-ray Free Electron Laser
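The jet-finding idea in the first project, summing calorimeter energy in a small window slid across the eta-phi grid and keeping local maxima above threshold, can be sketched in software. This is a simplified illustration, not the ATLAS firmware algorithm; the grid size, 2x2 window, threshold and tie-breaking rule are all invented:

```python
# Sliding-window jet finder sketch: sum energy in each 2x2 window of an
# eta-phi grid and declare a jet where the window sum is a local maximum
# above threshold.
def find_jets(grid, threshold=10.0):
    n_eta, n_phi = len(grid), len(grid[0])
    sums = [[grid[i][j] + grid[i + 1][j] + grid[i][j + 1] + grid[i + 1][j + 1]
             for j in range(n_phi - 1)] for i in range(n_eta - 1)]
    jets = []
    for i in range(len(sums)):
        for j in range(len(sums[0])):
            s = sums[i][j]
            if s < threshold:
                continue
            # Local-maximum test with a lexicographic tie-break so that
            # equal overlapping windows produce a single jet, not two.
            is_max = True
            for a in range(max(0, i - 1), min(len(sums), i + 2)):
                for b in range(max(0, j - 1), min(len(sums[0]), j + 2)):
                    if (a, b) == (i, j):
                        continue
                    if (a, b) > (i, j):
                        is_max = is_max and s > sums[a][b]
                    else:
                        is_max = is_max and s >= sums[a][b]
            if is_max:
                jets.append((i, j, s))
    return jets

grid = [[0.0] * 8 for _ in range(8)]
grid[3][4] = 9.0
grid[3][5] = 6.0  # one energetic cluster
print(find_jets(grid))  # [(3, 4, 15.0)]
```

The FPGA implementation evaluates all windows in parallel each bunch crossing; the sequential loops here only convey the logic.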
89

The design and construction of the beam scintillation counter for CMS

Bell, Alan James January 2008 (has links)
This thesis presents the design, qualification and construction of the Beam Scintillation Counter (BSC) for the CMS Collaboration at CERN in 2007-2008. The BSC detector is designed to aid in the commissioning of the Compact Muon Solenoid (CMS) during its first two years of operation and to provide technical triggering for beam-halo and minimum-bias events. Using plastic scintillator tiles mounted at both ends of CMS, it will detect minimum-ionizing particles through the low-to-mid luminosity phases of Large Hadron Collider (LHC) commissioning. During these early phases, the BSC will provide some of the most interesting and widely used data of any CMS sub-detector, and will be employed in the track-based alignment procedure of the central tracker and in the commissioning of the Forward Hadron Calorimeter.
90

Runtime and jitter of a laser triggered gas switch

Hutsel, Brian T.; Kovaleski, Scott D. January 2008 (has links)
The entire thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file; a non-technical public abstract appears in the public.pdf file. Title from PDF of title page (University of Missouri--Columbia, viewed on September 24, 2009). Thesis advisor: Dr. Scott Kovaleski. Includes bibliographical references.
