11

Understanding the Advantages Gamers Bring to the Workforce and What Their Skillset Means for the Future of Handheld Scanning Technology in Large Industrial Organizations

Burch, Reuben Flournoy 17 May 2014
Two of the biggest issues facing large organizations today are knowledge transfer from the retiring Baby Boomers to their younger replacements, the Gamers, and the retention of those younger employees. Retirees are replaced by people 34 years old or younger who think, learn, believe, respond, and work differently, further widening the cultural gap that must be traversed in order to transfer knowledge successfully. This younger demographic was raised on technology and may not remember a time before computers, video games, mobile devices, and the Internet. Large organizations aspiring to stay relevant must learn to take advantage of these unique traits. For organizations that rely on repetitive work processes involving ruggedized handheld computing tools, both of these issues can be mitigated through the adoption of modern technology. Some ruggedized handheld device manufacturers, however, have been hesitant to embrace consumer-driven design choices such as removing all physical keys in favor of touchscreen-only input. Using Baby Boomer- and Gamer-aged workers from a large transportation company experienced with ruggedized handheld devices, a time-and-error evaluation was performed to determine which input type works best for each generation. This study found that moving from physically keyed devices to touchscreen-only ruggedized handhelds is a productive move for an industrial workforce, but it is the Boomers who stand to benefit most from this change, not the Gamers. The study also identified near-future requirements for the next iteration of ruggedized handheld devices based on the expectations of members of the current and future workforce. Results showed that participants from all generations selected a device that followed the touchscreen-only model for data input, and experienced users from all generations preferred a smaller device with a larger screen. Lastly, Lean and Six Sigma were combined and their benefits explored in an effort to implement manufacturing quality tools in a global, service-based logistics organization. These tools and principles were used to improve the quality and timeliness of selecting and implementing a new ruggedized handheld device for line-level workers on a global scale.
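The evaluation design lends itself to a simple grouped comparison. Below is a minimal, hypothetical sketch of the kind of generation-by-input-type summary a time-and-error study like this produces; the column names, groups, and numbers are illustrative and not taken from the study.

```python
# Hypothetical sketch: mean task time and error rate grouped by
# generation and input type. All data below is made up for illustration.
import pandas as pd

# Illustrative trial log: one row per scanning task.
trials = pd.DataFrame({
    "generation": ["Boomer", "Boomer", "Gamer", "Gamer"] * 2,
    "input_type": ["physical_keys"] * 4 + ["touchscreen_only"] * 4,
    "task_time_s": [14.2, 15.1, 11.8, 12.0, 11.9, 12.4, 11.5, 11.7],
    "errors":      [1, 2, 0, 1, 0, 1, 0, 0],
})

summary = (trials
           .groupby(["generation", "input_type"])
           .agg(mean_time_s=("task_time_s", "mean"),
                error_rate=("errors", "mean")))
print(summary)
# A larger gain for Boomers when moving to touchscreen-only input would
# show up as a bigger drop in mean_time_s and error_rate for that group.
```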
12

Improving the Robustness of Over-the-Air Synchronization for 5G Networks in a Multipath Environment

Erninger, Anders January 2023
Synchronization between base stations is a fundamental part of any operating telecommunication system. With 5G and future generations of mobile networks, data speeds are increasing, which creates the need for fast and accurate synchronization. In wireless systems, transmitted signals are affected by the environment: both moving and stationary objects can scatter or reflect a transmitted signal, causing the receiver to receive multiple instances of one signal. If a synchronization signal is transmitted from one base station and received in multiple instances by another, it is hard for the receiving base station to know which of the received instances should be used for calculating the synchronization error between the base stations. In this thesis, several algorithms for selecting the synchronization signal pair between two base stations to be used for calculating the time alignment error have been tested. The results have been evaluated based on the accuracy with which a correctly matched signal pair is selected. It is shown that the proposed algorithms all perform significantly better than the method currently in use. Further, the advantages and disadvantages of each new algorithm are discussed, and finally new concepts for future studies are suggested.
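The selection problem can be made concrete with a small example. The sketch below assumes a simple first-arrival heuristic — pick the earliest cross-correlation peak above a noise-relative threshold, favoring the direct path over later reflections. It is one illustrative selection rule, not one of the thesis's algorithms; the threshold ratio and signal model are assumptions.

```python
# A minimal sketch of first-arrival selection among multipath
# correlation peaks. The heuristic and parameters are assumptions.
import numpy as np

def estimate_sync_lag(rx, ref, threshold_ratio=0.5):
    """Return the lag (in samples) of the earliest sufficiently strong
    correlation peak between a received signal and a reference sync signal."""
    corr = np.abs(np.correlate(rx, ref, mode="full"))
    lags = np.arange(-len(ref) + 1, len(rx))
    threshold = threshold_ratio * corr.max()
    # Earliest lag whose correlation clears the threshold: in a multipath
    # channel this favors the direct path over later reflections.
    first = np.argmax(corr >= threshold)
    return lags[first]

# Example: a reference burst received twice (direct path + reflection).
rng = np.random.default_rng(0)
ref = rng.standard_normal(64)
rx = np.zeros(400)
rx[100:164] += ref          # direct path, lag 100
rx[130:194] += 0.8 * ref    # reflected path, lag 130
rx += 0.1 * rng.standard_normal(400)
print(estimate_sync_lag(rx, ref))  # ~100, the direct-path lag
```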
13

Abnormal Group Delay and Detection Latency in the Presence of Noise for Communication Systems

Kayili, Levent 06 April 2010
Although it has been well established that abnormal group delay is a real physical phenomenon and does not violate Einstein causality, there has been little investigation into whether such abnormal behaviour can be used to reduce signal latency in practical communication systems in the presence of noise. In this thesis, we use the time-varying probability of error to determine whether abnormal group delay “channels” can offer reduced signal latency. Since the detection system plays a critical role in the analysis, three important detection systems are considered: the correlation, matched-filter, and envelope detection systems. Our analysis shows that for both spatially negligible microelectronic systems and spatially extended microwave systems, negative group delay “channels” offer reduced signal latency compared to conventional “channels”. The results presented in this thesis can be used to design a new generation of electronic and microwave interconnects with reduced or eliminated signal latency.
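The time-varying probability-of-error criterion can be illustrated for the matched-filter case. The sketch below assumes antipodal signaling in additive white Gaussian noise, for which standard detection theory gives Pe(t) = Q(sqrt(2 E(t) / N0)), with E(t) the signal energy accumulated up to observation time t; detection latency is then the earliest t at which Pe(t) falls below a target. The pulse shape, noise level, and target are illustrative assumptions, not the thesis's system model.

```python
# A minimal sketch of detection latency under a time-varying Pe for a
# matched-filter receiver with antipodal signaling in AWGN.
import numpy as np
from math import erfc, sqrt

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / sqrt(2.0))

def detection_latency(signal, dt, n0, target_pe=1e-6):
    """First time at which the matched filter's Pe drops below target_pe."""
    energy = np.cumsum(signal**2) * dt        # E(t), partial signal energy
    pe = np.array([q_func(sqrt(2.0 * e / n0)) for e in energy])
    below = np.nonzero(pe <= target_pe)[0]
    return below[0] * dt if below.size else None

# Example: a Gaussian pulse. A channel with negative group delay would
# advance the pulse envelope, so energy accumulates earlier and the
# latency returned here shrinks.
t = np.arange(0, 10e-9, 1e-12)
pulse = np.exp(-((t - 5e-9) / 1e-9) ** 2)
print(detection_latency(pulse, dt=1e-12, n0=1e-10))
```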
15

The Systematic Design and Application of Robust DNA Barcodes

Buschmann, Tilo 02 September 2016
High-throughput sequencing technologies are improving in quality, capacity, and cost, providing versatile applications in DNA and RNA research. For small genomes or fractions of larger genomes, DNA samples can be mixed and loaded together on the same sequencing track. This so-called multiplexing approach relies on a specific DNA tag, index, or barcode that is attached to the sequencing or amplification primer and hence accompanies every read. After sequencing, each read is assigned to its sample on the basis of the respective barcode sequence. Alterations of DNA barcodes during synthesis, primer ligation, DNA amplification, or sequencing may lead to incorrect sample identification unless the error is revealed and corrected, which can be accomplished by implementing error-correcting algorithms and codes. Such a barcoding strategy increases the total number of correctly identified samples, thus improving overall sequencing efficiency. Two popular families of error-correcting codes are Hamming codes and codes based on the Levenshtein distance. Levenshtein-based codes operate only on words of known length; since a DNA sequence with an embedded barcode is essentially one continuous long word, applying the classical Levenshtein algorithm is problematic. In this thesis we demonstrate the decreased error-correction capability of Levenshtein-based codes in a DNA context and propose an adaptation of Levenshtein-based codes that provably corrects nucleotide errors in DNA sequences efficiently. Our adaptation takes the DNA context into account and imposes stricter rules for the selection of barcode sets. In simulations we show the superior error-correction capability of the new method compared to traditional Levenshtein- and Hamming-based codes in the presence of multiple errors. We present an adaptation of Levenshtein-based codes to DNA contexts capable of guaranteed correction of a predefined number of insertion, deletion, and substitution mutations; on average, it additionally corrects more random mutations than traditional Levenshtein-based or Hamming codes. As part of this work we prepared software for the flexible generation of DNA codes based on our new approach. To adapt codes to specific experimental conditions, the user can customize sequence filtering, the number of correctable mutations, and the barcode length for highest performance. However, not every platform is susceptible to large numbers of both indel and substitution errors. The Illumina “Sequencing by Synthesis” platform shows a very large number of substitution errors as well as a very specific shift of the read that results in inserted and deleted bases at the 5’-end and the 3’-end (which we call phaseshifts). We argue that in this scenario the application of Sequence-Levenshtein-based codes is inefficient, because it targets a category of errors that barely occurs on this platform and thereby reduces the code size needlessly. As a solution, we propose the “Phaseshift distance”, which exclusively supports the correction of substitutions and phaseshifts, including arbitrary combinations of substitution and phaseshift errors, thus addressing the lopsided ratio of substitutions to phaseshifts on the Illumina platform. To compare codes based on the Phaseshift distance to Hamming codes as well as codes based on the Sequence-Levenshtein distance, we simulated an experimental scenario based on the error pattern we identified on the Illumina platform.
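The embedded-barcode difficulty described above can be made concrete. The sketch below scores each candidate barcode against the best-matching prefix of the read rather than a fixed-length prefix, which captures the idea behind the Sequence-Levenshtein adaptation: an indel inside the barcode shifts the barcode/insert boundary, so a fixed-length comparison misjudges the distance. It is an illustration under assumed inputs, not the thesis's exact definition or implementation.

```python
# A minimal sketch of decoding a barcode embedded at the start of a read:
# score each barcode against its best-matching read prefix.

def prefix_edit_distance(barcode, read):
    """Minimum Levenshtein distance between `barcode` and any prefix of `read`."""
    m = len(barcode)
    old = list(range(m + 1))            # distances against the empty read prefix
    best = old[m]
    for j, r in enumerate(read, start=1):
        new = [j] + [0] * m
        for i, b in enumerate(barcode, start=1):
            new[i] = min(old[i] + 1,             # skip a read character
                         new[i - 1] + 1,         # skip a barcode character
                         old[i - 1] + (b != r))  # match / substitute
        best = min(best, new[m])
        old = new
    return best

def decode(read, barcode_set):
    """Assign the read to the barcode with the smallest prefix distance."""
    return min(barcode_set, key=lambda b: prefix_edit_distance(b, read))

# An inserted G inside the barcode ("ACGGT...") shifts the boundary; the
# prefix-based distance still recovers the intended barcode "ACGT".
print(decode("ACGGTCCAATT", {"ACGT", "TTAG", "CCAA"}))  # -> "ACGT"
```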
Furthermore, we generated a large number of different sets of DNA barcodes using the Phaseshift distance and compared codes of different lengths and error-correction capabilities. We found that codes based on the Phaseshift distance can correct a number of errors comparable to codes based on the Sequence-Levenshtein distance while offering a number of DNA barcodes comparable to Hamming codes; they are thus more efficient in the targeted scenario. In some cases (e.g., with PacBio SMRT in Continuous Long Read mode), the position of the barcode within the read is not well defined: many reads start inside the genomic insert, so adjacent primers might be missed. The matter is further complicated by coincidental similarities between barcode sequences and the reference DNA. A robust strategy is therefore required to detect barcoded reads while avoiding large numbers of false positives or negatives. For mass inference problems such as this one, false discovery rate (FDR) methods are powerful and balanced solutions; since existing FDR methods cannot be applied to this particular problem, we present an adapted FDR method suitable for the detection of barcoded reads and suggest possible improvements.
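Barcode-set generation under a minimum pairwise distance, as in the code-generation software mentioned above, can be sketched with a greedy loop: a set whose members are pairwise at distance at least 2k + 1 under the chosen metric guarantees correction of k errors. The sketch below uses plain Hamming distance and a simple GC-content filter as stand-ins; the thesis's Sequence-Levenshtein or Phaseshift distance would take the place of the distance function, and the filter thresholds are illustrative assumptions.

```python
# A minimal sketch of greedy barcode-set construction under a minimum
# pairwise distance. Hamming distance and the GC filter are stand-ins.
from itertools import product

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def greedy_code(length, min_dist, alphabet="ACGT", gc_filter=True):
    """Greedily collect barcodes of `length` that are pairwise >= min_dist
    apart, optionally keeping only balanced-GC candidates."""
    code = []
    for cand in product(alphabet, repeat=length):
        word = "".join(cand)
        gc = (word.count("G") + word.count("C")) / length
        if gc_filter and not 0.25 <= gc <= 0.75:
            continue
        if all(hamming(word, c) >= min_dist for c in code):
            code.append(word)
    return code

# min_dist = 3 corrects k = 1 substitution per barcode (3 >= 2*1 + 1).
barcodes = greedy_code(length=6, min_dist=3)
print(len(barcodes), barcodes[:4])
```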
