531

[pt] CONTAGEM DE FÓTONS NO INFRAVERMELHO PRÓXIMO E MÉDIO VIA CONVERSÃO DE FREQÜÊNCIAS APLICADA A COMUNICAÇÕES QUÂNTICAS / [en] SINGLE PHOTON COUNTING IN THE NEAR- AND MID-INFRARED VIA FREQUENCY UP-CONVERSION APPLIED TO QUANTUM COMMUNICATIONS

06 September 2007 (has links)
[en] Two single-photon counting devices, operating at near- and mid-infrared wavelengths, are introduced and experimentally investigated. Both use a two-stage technique comprising an initial frequency up-conversion step inside a nonlinear crystal followed by detection with a silicon avalanche photodiode. Whereas the first project addresses the detection of single photons at 1.55 μm for fiber-optic quantum communications, using a cavity-enhanced process, the second project envisions the development of a single-photon counter operating at 4.65 μm for free-space systems. In this case, a feasibility study for a practical quantum key distribution system operating at a mid-infrared wavelength is performed. The results show that, using present-day technology, such a system can be constructed, although its usefulness is restricted to operation under very specific weather conditions.
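
As background to the up-conversion scheme above, the output wavelength of sum-frequency generation follows from photon energy conservation. A minimal worked example, assuming a 1.064 μm pump purely for illustration (the pump wavelength actually used in the thesis is not given in this abstract):

    \[
    \frac{1}{\lambda_{\mathrm{out}}}
      = \frac{1}{\lambda_{\mathrm{signal}}} + \frac{1}{\lambda_{\mathrm{pump}}}
      \;\Longrightarrow\;
    \lambda_{\mathrm{out}}
      \approx \left(\frac{1}{1.55\,\mu\mathrm{m}} + \frac{1}{1.064\,\mu\mathrm{m}}\right)^{-1}
      \approx 0.63\,\mu\mathrm{m}
    \]

The up-converted photon at roughly 0.63 μm lands in the high-efficiency region of a silicon avalanche photodiode, which is what makes the two-stage detection scheme practical.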
532

Efficient and Scalable Subgraph Statistics using Regenerative Markov Chain Monte Carlo

Mayank Kakodkar (12463929) 26 April 2022 (has links)
In recent years there has been growing interest, within data mining and graph machine learning, in techniques that can obtain frequencies of k-node Connected Induced Subgraphs (k-CIS) contained in large real-world graphs. While recent work has shown that 5-CISs can be counted exactly, no exact polynomial-time algorithms are known that solve this task for k > 5. In the past, sampling-based algorithms that work well in moderately sized graphs for k ≤ 8 have been proposed. In this thesis I push this boundary up to k ≤ 16 for graphs containing up to 120M edges, and to k ≤ 25 for smaller graphs containing between 1M and 20M edges. I do so by re-imagining two older but elegant and memory-efficient algorithms, FANMOD and PSRW, which have large estimation errors by modern standards: FANMOD produces highly correlated k-CIS samples, and sampling the PSRW Markov chain becomes prohibitively expensive for k > 8.

In this thesis, I introduce:

(a) RTS: a novel regenerative Markov chain Monte Carlo (MCMC) sampling procedure on the tree generated on the fly by the FANMOD algorithm. RTS runs on multiple cores and multiple machines (embarrassingly parallel) and computes confidence intervals for its estimates, all while preserving the memory-efficient nature of FANMOD. RTS is thus able to estimate subgraph statistics for k ≤ 16 on larger graphs containing up to 120M edges, and for k ≤ 25 on smaller graphs containing between 1M and 20M edges.

(b) R-PSRW: which scales the PSRW algorithm to larger CIS sizes using a rejection sampling procedure to efficiently sample transitions from the PSRW Markov chain. R-PSRW matches RTS in terms of scaling to larger CIS sizes.

(c) Ripple: which achieves unprecedented scalability by stratifying the R-PSRW Markov chain state space into ordered strata via a new technique that I call sequential stratified regeneration. I show that the Ripple estimator is consistent, highly parallelizable, and scales well. Ripple is able to count CISs of size up to k ≤ 12 in real-world graphs containing up to 120M edges.

My empirical results show that the proposed methods offer a considerable improvement over the state of the art. Moreover, my methods are able to run at a scale that has been considered unreachable until now, not only by prior MCMC-based methods but also by other sampling approaches.

Optimization of Restricted Boltzmann Machines. In addition, I propose a regenerative transformation of MCMC samplers for Restricted Boltzmann Machines (RBMs). My approach, Markov Chain Las Vegas (MCLV), gives statistical guarantees in exchange for random running times. MCLV uses a stopping set built from the training data and a maximum Markov chain step count K (referred to as MCLV-K). I present an MCLV-K gradient estimator (LVS-K) for RBMs and explore the correspondence and differences between LVS-K and Contrastive Divergence (CD-K). LVS-K significantly outperforms CD-K in the task of training RBMs on the MNIST dataset, indicating that MCLV is a promising direction for learning generative models.
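
The regenerative-MCMC idea underlying RTS and Ripple can be illustrated independently of subgraph sampling. The sketch below uses an assumed toy chain (a reflecting random walk) to show tour-based estimation: every return to a designated regeneration state closes an i.i.d. tour, so the ratio estimator over tours comes with a CLT-based confidence interval. This is not the thesis code; the chain, target function, and tour count are placeholders.

    import math
    import random

    def regenerative_estimate(next_state, f, regen_state, num_tours=2000, seed=0):
        """Tour-based (regenerative) estimate of the stationary mean of f:
        each return to regen_state ends an i.i.d. tour, so tour-level sums
        give both a ratio estimator and a confidence interval."""
        rng = random.Random(seed)
        tour_sums, tour_lens = [], []
        for _ in range(num_tours):
            x, s, n = regen_state, 0.0, 0
            while True:
                s += f(x)
                n += 1
                x = next_state(x, rng)
                if x == regen_state:          # regeneration: tour ends
                    break
            tour_sums.append(s)
            tour_lens.append(n)
        mu = sum(tour_sums) / sum(tour_lens)   # ratio estimator
        nbar = sum(tour_lens) / len(tour_lens)
        resid = [(s - mu * n) ** 2 for s, n in zip(tour_sums, tour_lens)]
        se = math.sqrt(sum(resid) / len(resid)) / (nbar * math.sqrt(len(resid)))
        return mu, (mu - 1.96 * se, mu + 1.96 * se)

    def reflecting_walk(x, rng):
        """Toy chain: lazy reflecting random walk on {0, ..., 9}."""
        return max(0, min(9, x + rng.choice([-1, 1])))

    est, ci = regenerative_estimate(reflecting_walk, lambda s: s, regen_state=0)
    print(est, ci)   # the stationary mean of this walk is 4.5

Because tours are independent, they can be farmed out across cores or machines and their sums merged afterwards, which is the sense in which such estimators are embarrassingly parallel.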
533

Slow and Stopped Light with Many Atoms, the Anisotropic Rabi Model and Photon Counting Experiment on a Dissipative Optical Lattice

Thurtell, Tyler 10 August 2018 (has links)
No description available.
534

Evaluation of the Impact of X-ray Tube Voltage and Filter Thickness on the Performance of Spectral Photon-Counting Detectors / Utvärdering av inverkan av röntgenrörsspänning och filtertjocklek på prestanda för spektrala fotonräknande detektorer

Mannila, Cassandra, Larsson, Marcus January 2021 (has links)
During the past few years, photon-counting detectors (PCDs) have emerged as an alternative to conventional energy-integrating detectors and may significantly improve the standard of care for computed tomography (CT). There are two main candidate detector materials: cadmium telluride (CdTe) and silicon (Si). The X-ray tube settings and the applied filters need to be evaluated and optimized for the new detector technology. In this report, Monte Carlo simulations are used to determine how image quality is affected by different X-ray tube voltages and filter thicknesses. The image-quality indicators chosen for evaluation are detective quantum efficiency (DQE) for material-quantification tasks, and both DQE and dose-normalized signal-difference-to-noise ratio (SDNR) for detection tasks. Overall, silicon-based detectors performed better than cadmium-based detectors on quantification imaging tasks for all object thicknesses, while cadmium-based detectors were superior on detection imaging tasks in larger patients. For both silicon- and cadmium-based detectors, the dose-normalized image quality was largely independent of filter thickness, while the X-ray tube voltage had a more distinct impact on the result, with low voltages being optimal.
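
For reference, the figures of merit named in this abstract have the following standard textbook forms; the thesis' task-based definitions may differ in detail, and the dose normalization shown is one common convention rather than necessarily the one used there.

    \[
    \mathrm{DQE} = \frac{\mathrm{SNR}_{\mathrm{out}}^{2}}{\mathrm{SNR}_{\mathrm{in}}^{2}},
    \qquad
    \mathrm{SDNR} = \frac{\lvert \bar{S}_{A} - \bar{S}_{B} \rvert}{\sigma},
    \qquad
    \mathrm{SDNR}_{\mathrm{dose}} = \frac{\mathrm{SDNR}}{\sqrt{D}}
    \]

Here S_A and S_B are the mean signals in the two materials or regions to be separated, σ is the image noise, and D is the dose; since SDNR grows roughly as the square root of dose, dividing by √D lets tube-voltage and filter settings be compared at equal dose.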
535

Primordial nuclides and low-level counting at Felsenkeller

Turkat, Steffen 14 November 2023 (has links)
Within cosmology, there are two entirely independent pillars which can jointly drive the field towards precision: astronomical observations of primordial element abundances and the detailed surveying of the cosmic microwave background. However, the comparatively large uncertainty stemming from the nuclear-physics input, in particular from the 2H(p,γ)3He reaction, is currently still hindering this effort. An accurate understanding of this reaction is required for precision data on primordial nucleosynthesis and for an independent determination of the cosmological baryon density. Elsewhere, our Sun is an exceptional object for studying stellar physics in general. While we are now able to measure solar neutrinos live on Earth, theoretical predictions of solar neutrino fluxes are again limited in precision by nuclear reactions, in particular the 3He(α,γ)7Be reaction. This thesis sheds light on these two nuclear reactions, both of which limit our understanding of the universe. While the investigation of the 2H(p,γ)3He reaction focuses on the determination of its cross section in the vicinity of the Gamow window for Big Bang nucleosynthesis, the main aim for the 3He(α,γ)7Be reaction is a measurement of its γ-ray angular distribution at astrophysically relevant energies. In addition, the installation of an ultra-low-background counting setup is reported, which further enables the investigation of the physics of rare events. This is essential for modern nuclear astrophysics, but also relevant for double-beta-decay physics and the search for dark matter. The presented setup is now the most sensitive in Germany and among the most sensitive worldwide.
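
For orientation, the Gamow window mentioned above is the energy region around the Gamow peak; the standard textbook relations are sketched below as general expressions, not values computed in the thesis.

    \[
    E_{G} = 2\mu c^{2}\,(\pi \alpha Z_{1} Z_{2})^{2},
    \qquad
    E_{0} = \left(\frac{E_{G}\,(k_{B}T)^{2}}{4}\right)^{1/3},
    \qquad
    \Delta \approx \frac{4}{\sqrt{3}}\sqrt{E_{0}\,k_{B}T}
    \]

Here μ is the reduced mass of the reacting nuclei, Z_1 and Z_2 their charge numbers, α the fine-structure constant, and T the plasma temperature; E_0 is the energy of the Gamow peak and Δ its approximate width, which together set the energy range where the cross section must be known.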
536

Molecular Spectroscopy Experiment to Measure Temperature-Dependent Radiative Lifetime of the SODIUM MOLECULE 6sΣ𝑔(𝑣 = 9, 𝐽 = 31) State

Kashem, Md Shakil Bin 17 July 2023 (has links)
No description available.
537

Privacy-preserving Building Occupancy Estimation via Low-Resolution Infrared Thermal Cameras

Zhu, Shuai January 2021 (has links)
Building occupancy estimation has become an important topic for sustainable buildings and has attracted more attention during the pandemic. Estimating building occupancy is a challenging computer-vision problem, and although computer vision has achieved breakthroughs in recent years, machine-learning algorithms for computer vision demand large datasets that may contain users' private information in order to train reliable models. Since privacy issues pose a severe challenge in the field of machine learning, this work aims to develop a privacy-preserving, machine-learning-based method for people counting using a low-resolution thermal camera with 32 × 24 pixels. The method is applicable to counting people in different scenarios: counting people in spaces smaller than the field of view (FoV) of the camera, as well as in large spaces exceeding the camera's FoV. In the first scenario, counting people in small spaces, we directly count people within the FoV of the camera using Multiple Object Detection (MOD) techniques. Our MOD method achieves up to 56.8% mean average precision (mAP). In the second scenario, we use Multiple Object Tracking (MOT) techniques to track people entering and exiting the space. We record the number of people who entered and exited, and then calculate the occupancy from the tracking results. The MOT method reaches 47.4% multiple object tracking accuracy (MOTA), 78.2% multiple object tracking precision (MOTP), and 59.6% identification F-score (IDF1). In addition to the method, we create a novel thermal-image dataset containing 1770 thermal images with proper annotation.
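
As a reference for the tracking metrics quoted above, a minimal sketch of the CLEAR-MOT accuracy and precision computations is given below. The per-frame counts are hypothetical, MOTP conventions differ between toolkits (distance-based versus IoU-based), and this is not the evaluation code used in the thesis.

    def mota(frames):
        """CLEAR-MOT accuracy from per-frame counts.
        frames: list of dicts with keys 'gt' (ground-truth objects),
        'fn' (misses), 'fp' (false positives), 'idsw' (identity switches)."""
        errors = sum(f["fn"] + f["fp"] + f["idsw"] for f in frames)
        ground_truth = sum(f["gt"] for f in frames)
        return 1.0 - errors / ground_truth

    def motp(frames):
        """CLEAR-MOT precision: mean matching distance (or 1 - IoU) over all
        matched prediction/ground-truth pairs; conventions vary by toolkit.
        frames: list of dicts with 'dist_sum' and 'matches'."""
        return sum(f["dist_sum"] for f in frames) / sum(f["matches"] for f in frames)

    # Hypothetical two-frame example (not data from the thesis):
    frames = [
        {"gt": 3, "fn": 1, "fp": 0, "idsw": 0, "dist_sum": 0.40, "matches": 2},
        {"gt": 3, "fn": 0, "fp": 1, "idsw": 1, "dist_sum": 0.45, "matches": 3},
    ]
    print(mota(frames), motp(frames))   # 0.5 and 0.17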
538

Calculating Minimum Detectable Activity for a moving scintillator detector using real-time speed measurement : Implementing a monitoring system to improve accuracy of surface contamination measurement systems / Beräkning av minsta detekterbara aktivitet för en mobil scintillatordetektor med hastighetsmätning i realtid : Implementation av ett övervakande system som förbättrar mätsäkerheten vid detektion av radioaktiv ytkontamination

Amcoff, Artur, Persson, Oscar January 2021 (has links)
Surface contamination occurs in nuclear facilities and must be detected easily and efficiently. Today's methods for detecting radioactive surface contamination can suffer from inconsistencies, since the human operator alone is trusted to keep the detector at the correct distance and to move it at the correct speed. This thesis project aims to address the problem of inconsistent measurements inherent in the current measurement methods. A system is designed to monitor the measurement process with respect to detector velocity and height. The system triggers a warning when the minimum detectable activity becomes too high, as this would lead to unreliable results. The system consists of a cart-mounted setup with a scintillation detector and one or more velocity-measurement devices. Software uses the measurement data to implement the aforementioned monitoring. The system aims to comply with international standards such as ISO 11929 and ISO 7503, and therefore makes use of them. The evaluation of each individual component showed a large inaccuracy in the Inertial Measurement Units (IMUs); hence, the robotic wheels were chosen as the main method of measuring speed for this project. The robotic wheels and the detector were shown to be sufficiently accurate for the desired measurements. The on-board computer, a Raspberry Pi 4 Model B, was also shown to be well suited for the project in terms of performance and properties. This project showed that there is a theoretical way to incorporate the speed of a moving detector rig into the Minimum Detectable Activity (MDA) formula. However, the implementation investigated in this project suggests that full compatibility with ISO 7503 was not achievable.
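
A rough sketch of how detector speed can enter an MDA estimate, using the common Currie approximation for the detection limit and treating the dwell time over a point on the surface as detector length divided by speed. The numerical parameters are placeholders, and the surface-efficiency handling is only indicative of, not identical to, the ISO 7503 treatment.

    import math

    def mda_moving(background_cps, efficiency, detector_length_m, speed_mps,
                   probe_area_cm2, surface_efficiency=0.5):
        """Rough Currie-style minimum detectable activity (Bq/cm^2) for a
        detector moving over a surface: the dwell time over a point is taken
        as detector length divided by speed. Placeholder model, not ISO-exact."""
        dwell_s = detector_length_m / speed_mps
        background_counts = background_cps * dwell_s
        ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)   # Currie detection limit
        activity_bq = ld_counts / (efficiency * surface_efficiency * dwell_s)
        return activity_bq / probe_area_cm2

    # Hypothetical parameters: 20 cps background, 30 % instrument efficiency,
    # a 10 cm probe over a 100 cm^2 window, moved at 5 cm/s and at 15 cm/s.
    for speed in (0.05, 0.15):
        print(speed, mda_moving(20.0, 0.30, 0.10, speed, 100.0))

In this form the MDA grows as the cart speeds up (shorter dwell time means fewer counts to work with), which is exactly the quantity a speed-aware warning threshold would monitor.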
539

Effect of Secondary Motor and Cognitive Tasks on Timed Up and Go Test in Older Adults

Mukherjee, Anuradha January 2013 (has links)
No description available.
540

[pt] AVALIAÇÃO DE ANALISADOR HEMATOLÓGICO BASEADO EM MEDIÇÕES ÓPTICAS DIRETAS USANDO LED AZUL PARA CONTAGEM DIFERENCIAL DE LEUCÓCITOS / [en] EVALUATION OF HEMATOLOGY ANALYZER BASED ON DIRECT OPTICAL MEASUREMENTS USING BLUE LED FOR DIFFERENTIAL LEUKOCYTE COUNTING

VICENTE MONTEIRO LORENZON 14 January 2020 (has links)
[en] Hematology analyzers are measurement systems that identify blood elements, including the differential count of the five types of leukocytes found in peripheral blood: neutrophils, lymphocytes, monocytes, eosinophils, and basophils. Because of the lack of certified reference materials or reference methods for differential leukocyte analysis, it has been customary to evaluate the performance of new analyzers by comparing them with traditionally available and well-established systems. In 2015, a new low-cost, portable technology was introduced, the Beckman Coulter DxH500 model, based on axial light loss measured with a blue LED emitter, combined with impedance measurement. The present work compares the performance of the DxH500 with a model widely used for large-scale laboratory testing since its launch in 2001, the Beckman Coulter LH750, based on the combination of impedance, conductivity, and light scatter (VCS). In the study, we examined 310 paired samples. The comparative analysis of the differential leukocyte results provided by each device indicated a good correlation for the characterization of neutrophils, lymphocytes, monocytes, and eosinophils. Despite the low correlation observed for basophil counts, this result is not clinically relevant, since the values obtained for the evaluated samples were very low, below the reference limits. Although the comparative analysis pointed to equivalent performance across technologies with different measurement principles, the adequate evaluation of hematology analyzers requires the development of certified reference materials, a fundamental demand for ensuring the reliability of differential leukocyte quantification.
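
A minimal sketch of the kind of paired method-comparison statistics such an evaluation typically relies on: Pearson correlation plus Bland-Altman bias and 95% limits of agreement. The data below are synthetic placeholders, and the thesis' exact statistical protocol is not reproduced here.

    import numpy as np

    def method_comparison(x, y):
        """Paired comparison of two analyzers: Pearson correlation plus
        Bland-Altman bias and 95% limits of agreement.
        x, y: 1-D arrays of paired measurements of the same length."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        r = np.corrcoef(x, y)[0, 1]
        diff = y - x
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        return {"pearson_r": r, "bias": bias,
                "loa_low": bias - loa, "loa_high": bias + loa}

    # Synthetic example (not the thesis data): neutrophil percentages from
    # two hypothetical analyzers on 310 paired samples.
    rng = np.random.default_rng(1)
    reference = rng.normal(60, 10, 310)                 # reference analyzer
    candidate = reference + rng.normal(0.5, 2.0, 310)   # candidate with small bias
    print(method_comparison(reference, candidate))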
