  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
271

The continued influence of misinformation following a delayed correction

Rich, Patrick Russell 04 August 2016 (has links)
No description available.
272

Improving Design Strategies for Composite Pavement Overlay: Multi-layered Elastic Approach and Reliability Based Models

Sigdel, Pawan January 2016 (has links)
No description available.
273

Detection and Correction of Global Positioning System Carrier Phase Measurement Anomalies

Achanta, Raghavendra 14 July 2004 (has links)
No description available.
274

Mean curvature mapping: application in laser refractive surgery

Tang, Maolong 12 October 2004 (has links)
No description available.
275

MRI-Based Attenuation Correction for PET Reconstruction

Steinberg, Jeffrey 12 September 2008 (has links)
No description available.
276

Manipulation Of Cognitive Biases And Rumination: An Examination Of Single And Combined Correction Conditions

Adler, Abby Danielle 12 September 2008 (has links)
No description available.
277

The Effects of the Copy, Cover, and Compare Strategy on the Acquisition, Maintenance, and Generalization of Spelling Sight Words for Elementary Students with Disabilities

Moser, Lauren Ashley 16 September 2009 (has links)
No description available.
278

Characterization of Center-of-Mass and Rebinning in Positron Emission Tomography with Motion / Karaktärisering av masscentrum och händelseuppdatering i positronemissionstomografi med rörelse

Hugo, Linder January 2021 (has links)
Medical molecular imaging with positron emission tomography (PET) is sensitive to patient motion since PET scans last several minutes. Despite advancements in PET, such as improved photon-pair time-of-flight (TOF) difference resolution, motion deformations limit image resolution and quantification. Previous research on head motion tracking has produced the data-driven centroid-of-distribution (COD) algorithm. COD generates a 3D center-of-mass (COM) over time via raw list-mode PET data, which can guide motion correction such as gating and event rebinning in non-TOF PET. Knowledge gaps: COD could potentially benefit from sinogram corrections used in image reconstruction, while rebinning has not extended to TOF PET. Methods: This study develops COD with event mass (incorporating random correction and line-of-response (LOR) normalization) and a simplistic TOF rebinner. In scans of phantoms and moving heads with 18F-fluorodeoxyglucose (FDG) tracer, COD alternatives are evaluated with a signal-to-noise ratio (SNR) via linear fit to image COM, while rebinning is evaluated with mean squared error (MSE). Results: COD SNR did not benefit from a corrected event mass. The prototype TOF rebinning reduced MSE, although there were discretization errors and event loss at extreme bins for LOR and TOF due to the simplistic design, which introduced image artifacts. In conclusion, corrected event mass in COD is not promising, while TOF rebinning appears viable if techniques from state-of-the-art LOR rebinning are incorporated.
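The COD idea described above — a per-time-window center of mass computed directly from list-mode event coordinates — can be sketched roughly as follows. This is an illustrative reconstruction, not the thesis's implementation; the function name, the `(t, x, y, z)` event layout, and the use of unit event mass (the thesis additionally explores randoms- and normalization-based weighting) are all assumptions.

```python
import numpy as np

def com_trace(events, t_window=1.0):
    """Center-of-mass trace from list-mode PET events.

    `events` is an array with one (t, x, y, z) row per coincidence
    event.  Events are grouped into fixed-length time windows and an
    unweighted 3D centroid is computed per window, yielding a motion
    signal over time (e.g. for gating)."""
    t = events[:, 0]
    bins = np.floor((t - t.min()) / t_window).astype(int)
    coms = []
    for b in range(bins.max() + 1):
        sel = events[bins == b, 1:4]
        if len(sel):
            # Unit mass per event; a corrected "event mass" would
            # weight each row here instead of averaging uniformly.
            coms.append(sel.mean(axis=0))
    return np.array(coms)
```

A step change in the resulting trace would then indicate head motion between time windows.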
279

Attenuation Correction in Positron Emission Tomography Using Single Photon Transmission Measurement

Dekemp, Robert A. 09 1900 (has links)
Accurate attenuation correction is essential for quantitative positron emission tomography. Typically, this correction is based on a coincidence transmission measurement using an external source of positron emitter, which is positioned close to the detectors. This technique suffers from poor statistical quality and high dead time losses, especially with a high transmission source strength. We have proposed and tested the use of single photon transmission measurement with a rotating rod source, to measure the attenuation correction factors (ACFs). The singles projections are resampled into the coincidence geometry using the detector positions and the rod source location. A nonparalyzable dead time correction algorithm was developed for the block detectors used in the McMaster PET scanner. Transaxial resolution is approximately 6 mm, which is comparable to emission scanning performance. Axial resolution is about 25 mm, with only crude source collimation. ACFs are underestimated by approximately 10% due to increased crossplane scatter, compared to coincidence transmission scanning. Effective source collimation is necessary to obtain suitable axial resolution and improved accuracy. The response of the correction factors to object density is linear to within 15%, when comparing singles transmission measurement to current coincidence transmission measurement. The major advantage of using singles transmission measurement is a dramatically increased count rate. A factor of seven increase in count rate over coincidence scanning is possible with a 2 mCi transmission rod source. There are no randoms counted in singles transmission scans, which makes the measured count rate nearly linearly proportional to source activity. Singles detector dead time is approximately 6% in the detectors opposite a 2 mCi rod source. Present hardware and software preclude the application of this technique in a clinical environment.
We anticipate that real time acquisition of detector singles can reduce the transmission scanning time to under 2 minutes, and produce attenuation coefficient images with under 2% noise. This is a significant improvement compared to the current coincidence transmission technique. / Thesis / Master of Science (MS)
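The two corrections named in this abstract have standard textbook forms that can be sketched briefly. The nonparalyzable dead-time model relates measured rate m to true rate n by m = n / (1 + n·tau), and an ACF per line of response is the ratio of unattenuated (blank-scan) counts to transmission counts. The function names and the blank/transmission interface below are illustrative assumptions, not the thesis's actual block-detector implementation.

```python
import numpy as np

def nonparalyzable_correct(measured_rate, tau):
    """Invert the nonparalyzable dead-time model m = n / (1 + n*tau)
    to recover the true event rate: n = m / (1 - m*tau)."""
    return measured_rate / (1.0 - measured_rate * tau)

def attenuation_correction_factors(blank, transmission):
    """ACF per line of response: ratio of unattenuated (blank-scan)
    counts I0 to transmission counts I through the object,
    ACF = I0 / I = exp(line integral of mu along the LOR)."""
    return np.asarray(blank, float) / np.asarray(transmission, float)
```

The dead-time inversion is exact for the nonparalyzable model; in practice tau is fitted per detector block from count-rate calibration data.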
280

Energy-efficient custom integrated circuit design of universal decoders using noise-centric GRAND algorithms

Riaz, Arslan 24 May 2024 (has links)
Whenever data is stored or transmitted, it inevitably encounters noise that can lead to harmful corruption. Communication technologies rely on decoding the data using Error Correcting Codes (ECC) that enable the rectification of noise to retrieve the original message. Maximum Likelihood (ML) decoding has proven to be optimally accurate, but it has not been adopted due to the lack of a feasible implementation arising from its computational complexity. It has been established that ML decoding of arbitrary linear codes is a Nondeterministic Polynomial-time (NP) hard problem. As a result, many code-specific decoders have been developed as approximations of an ML decoder. This code-centric decoding approach leads to a hardware implementation that is tightly coupled to a specific code structure. The recently proposed Guessing Random Additive Noise Decoding (GRAND) offers a solution by establishing a noise-centric decoding approach, thereby making it a universal ML decoder. Both the soft-detection and hard-detection variants of GRAND have been shown to be capacity-achieving for any arbitrary code of moderate redundancy. This thesis claims that GRAND can be efficiently implemented in hardware with low complexity while offering significantly higher energy efficiency than state-of-the-art code-centric decoders. In addition to being hardware-friendly, GRAND offers high parallelizability that can be chosen according to the throughput requirement, making it flexible for a wide range of applications. To support this claim, this thesis presents custom-designed energy-efficient integrated circuits and hardware architectures for the family of GRAND algorithms. The universality of the algorithm is demonstrated through measurements across various codebooks for different channel conditions. Furthermore, we employ the noise recycling technique in both hard-detection and soft-detection scenarios to improve the decoding by exploiting temporal noise correlations.
Using the fabricated chips, we demonstrate that employing noise recycling with GRAND significantly reduces energy and latency, while providing additional gains in decoding performance. Efficient integrated architectures of GRAND will significantly reduce the hardware complexity while future-proofing a device so that it can decode any forthcoming code. The noise-centric decoding approach overcomes the need for code standardization, making it adaptable for a wide range of applications. A single GRAND chip can replace all existing decoders, offering competitive decoding performance while also providing significantly higher energy and area efficiency. / 2026-05-23T00:00:00Z
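The noise-centric idea behind hard-detection GRAND described above — guess putative noise patterns in order of decreasing likelihood and stop at the first one that maps the received word onto a codeword — can be sketched as follows. This is a minimal software illustration, not the thesis's hardware architecture; the weight-ordered guessing schedule (valid for a binary symmetric channel with error probability below 0.5), the guess budget, and the (7,4) Hamming code used in the example are all assumptions.

```python
import itertools
import numpy as np

def grand_decode(y, H, max_weight=3):
    """Hard-detection GRAND sketch: test binary noise patterns e in
    order of increasing Hamming weight (most likely first on a BSC)
    until y XOR e satisfies every parity check, i.e. H @ c = 0 mod 2."""
    n = len(y)
    for w in range(max_weight + 1):
        for flips in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(flips)] = 1
            c = y ^ e
            if not ((H @ c) % 2).any():   # c is a codeword
                return c
    return None  # abandon: no codeword within the guess budget

# Illustrative (7,4) Hamming parity-check matrix; the all-zero word
# is a codeword of any linear code.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
received = np.zeros(7, dtype=int)
received[2] ^= 1   # single-bit channel error
decoded = grand_decode(received, H)
```

Note that only the parity-check matrix H varies between codes, which is the sense in which the decoder is universal; the guessing loop itself never changes, and in hardware the candidate patterns can be tested in parallel.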
