51

Program pro demonstraci kanálového kódování / Programme for channel coding demonstration

Závorka, Radek January 2020 (has links)
The main subject of this thesis is the creation of a programme for demonstrating channel coding, intended for teaching purposes. The programme contains various codes, from simple ones to codes that approach the Shannon channel capacity limit: specifically the Hamming code, a cyclic code, a convolutional code and an LDPC code. These functions are based on the theoretical background described in this thesis and have been programmed in Matlab. The practical output of the thesis is a user interface in which the user can enter an information word, simulate transmission through the channel, and observe the encoding and decoding of each code. The thesis also compares the individual codes in terms of bit-error rate as a function of SNR and various parameters, and includes a computer lab with theoretical background, an assignment and worksheets for convenient completion of each task.
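For a concrete picture of the simplest of these codes, the following minimal sketch (written in Python rather than the thesis' Matlab, with an assumed systematic generator matrix) encodes four information bits with a (7,4) Hamming code and corrects a single channel bit error by syndrome decoding; it is an illustration only, not the thesis' implementation.

import numpy as np

# Systematic generator and parity-check matrices of a (7,4) Hamming code
# (one common choice, assumed here for illustration).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def hamming_encode(info_bits):
    # Multiply the 4 information bits by G over GF(2).
    return np.mod(np.array(info_bits) @ G, 2)

def hamming_decode(received):
    # The syndrome of a single-error word equals the column of H at the error position.
    rx = np.array(received).copy()
    syndrome = np.mod(H @ rx, 2)
    if syndrome.any():
        error_pos = int(np.where((H.T == syndrome).all(axis=1))[0][0])
        rx[error_pos] ^= 1
    return rx[:4]                    # systematic code: the first 4 bits are the message

codeword = hamming_encode([1, 0, 1, 1])
noisy = codeword.copy()
noisy[5] ^= 1                        # one bit error on the channel
assert list(hamming_decode(noisy)) == [1, 0, 1, 1]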
52

Protichybové zabezpečení v digitálních komunikačních systémech / Forward Error Correction in Digital Communication Systems

Kostrhoun, Jan January 2013 (has links)
This work deals with forward error correction. Basic error-correction methods and algorithms are described, and Matlab programs were created to present the encoding and decoding processes of the Hamming code, the Reed-Muller code, the Fire code, the Reed-Solomon code and trellis-coded modulation.
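As a small illustration of one of the listed codes, the sketch below (Python, not the thesis' Matlab; the generator construction and the brute-force minimum-distance decoder are assumptions chosen for brevity) encodes and decodes the first-order Reed-Muller code RM(1,3), an (8,4) code whose minimum distance of 4 allows single-error correction.

import numpy as np
from itertools import product

# Points of GF(2)^3 and the generator matrix of RM(1,3): rows 1, x1, x2, x3.
points = np.array(list(product([0, 1], repeat=3)))
G = np.vstack([np.ones(8, dtype=int), points.T])
messages = list(product([0, 1], repeat=4))
codebook = np.mod(np.array(messages) @ G, 2)        # all 16 codewords

def rm13_decode(received):
    # Minimum-distance (brute-force ML) decoding, feasible at this code size.
    distances = np.sum(codebook != np.array(received), axis=1)
    return messages[int(np.argmin(distances))]

codeword = np.mod(np.array([1, 0, 1, 1]) @ G, 2)
codeword[6] ^= 1                                     # single channel error
assert rm13_decode(codeword) == (1, 0, 1, 1)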
53

Improving Dependability of Space-Cloud Payload Processor by Storage System

Said, Hassan, Johansson, Stephanie Liza January 2023 (has links)
Due to the usage of complicated platforms and current high-performance space computing technology, onboard processing in small satellites is expanding. Space-cloud payload processors with Commercial Off-The-Shelf (COTS) components, which are required to be radiation-tolerant, are used to perform the onboard processing. In this thesis, the research aims to increase the dependability of a generic space-cloud payload processor through its Solid State Drive (SSD) storage unit. To achieve this, a more dependable NAND-flash-based SSD Redundant Array of Independent Disks (RAID) storage system is designed and tested. NAND-flash-based SSDs can suffer wear-out due to increased Program/Erase (P/E) cycles, making them more prone to radiation effects. These radiation effects are considered non-destructive events in the form of bit errors (both single and multiple bit-flips). Therefore, making the storage system more dependable involves increasing its reliability against non-destructive events and developing analytical models that account for the considered dynamics of the SSD RAID. The challenge that comes with achieving the aim of this thesis is twofold. First, to explore different RAID levels such that a combination of RAID levels can be incorporated into one SSD for better reliability than a RAID-1 setup. Hence, in this thesis, a RAID array of several SSDs is not considered. Furthermore, the combinations of RAID levels need to account for mixed-critical data. Second, to demonstrate, via simulation and analytical models, the impact on the reliability of the storage system. A comparison study is also undertaken, both because of the support that the Fourth Extended (Ext4) file system or the Zettabyte File System (ZFS) may give to enhance the storage system, and because little research exists that compares the two file systems across the relevant feature categories. The solution is a RAID-5 + 6 storage system that is Error Detection And Correction (EDAC) protected by Hamming codes and Reed-Solomon (RS) codes. Low-critical data is stored using RAID-5, whereas high-critical data is stored using RAID-6. The simulation of the storage system shows that low-critical stripes of data achieve single-fault tolerance, whereas high-critical stripes of data tolerate burst errors of up to 5 bits. In parallel, several Continuous Time Markov Chain (CTMC) models are analysed, which show that the proposed solution is indeed highly reliable. The comparison study is carried out in a systematic way, and the findings are established as substantial, i.e., ZFS provides greater storage system support. In summary, the results of creating the storage system and analysing it suggest that incorporating RAID-5 and RAID-6 offers better SSD RAID reliability than RAID-1. / Användningen av komplicerade plattformar och aktuell högpresterande rymdberäkningsteknik expanderar onboard-processing i små satelliter. Space-Cloud lösningar med kommersiellt tillgängliga komponenter som är toleranta mot strålningar i rymden används för att utföra onboard-processing. I detta examensarbete syftar forskningen till att förbättra tillförlitligheten hos en generisk rymd dator genom dess SSD-lagringsenhet. För att uppnå detta har ett mer tillförlitligt lagringssystem bestående av NAND-flash och RAID designats och testats. Tillförlitligheten hos NAND-flash-baserade SSD:er kan försämras då dessa kan drabbas av slitage på grund av ökade P/E cykler, vilket gör dem mer benägna för strålningseffekter.
Dessa strålningseffekter anses vara icke-destruktiva i form av bit-fel (både enskilda bit-flippar och flera bit-flippar). Med denna anledning görs lagringssystemet mer tillförlitligt för att tolerera icke-destruktiva händelser. Utöver detta, utvecklas analytiska modeller som tar hänsyn till den betraktade dynamiken i SSD RAID. Utmaningen som följer med att uppnå syftet med denna avhandling är tvådelad. För det första, för att utforska olika RAID-nivåer så att en kombination av RAID-nivåer kan inkorporeras i en SSD för bättre tillförlitlighet än RAID-1. Således övervägs inte en RAID-array av flera SSD:er i denna avhandling. Dessutom måste kombinationerna av RAID-nivåer ta hänsyn till data av olika kritikalitet. För det andra, för att genom simulering och analytiska modeller indikera påverkan på lagringssystemets tillförlitlighet. En jämförelsestudie genomförs också på grund av stödet som filsystemen Ext4 eller ZFS kan ge för att förbättra lagringssystemet och eftersom det finns lite forskning som jämför filsystemen i några funktionella kategorier. Lösningen baseras på ett RAID-5+6 lagringssystem som är skyddat av Hamming-koder och RS koder för att upptäcka fel och korrigera dem. Lågkritisk data lagras med RAID-5 medan högkritisk data lagras med RAID-6. Simuleringen av lagringssystemet visar att lågkritiska datasektioner uppnår en fel tolerans mot enskilda bit-flippar medan högkritiska datasektioner kan tåla maximalt 5 bit-flippar. Samtidigt analyseras flera CTMC modeller som visar att den föreslagna lösningen verkligen är mycket tillförlitlig. Jämförelsestudien utförs på ett systematiskt sätt och resultaten fastställs som betydande, det vill säga att ZFS ger större stöd för lagringssystemet. Sammanfattningsvis antyder resultaten av att skapa lagringssystemet och analysera det att inkorporering av RAID-5 och RAID-6 erbjuder bättre tillförlitlighet för SSD RAID än RAID-1.
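The single-fault tolerance claimed for the low-critical RAID-5 stripes rests on XOR parity. The minimal sketch below (Python, illustrative only; block sizes, stripe widths and the Hamming/Reed-Solomon EDAC layer of the thesis are not modelled) shows how any one lost block in a stripe is rebuilt from the surviving blocks and the parity block.

def xor_parity(blocks):
    # Byte-wise XOR across all blocks in a stripe.
    parity = bytes(len(blocks[0]))
    for block in blocks:
        parity = bytes(a ^ b for a, b in zip(parity, block))
    return parity

stripe = [b"LOW-", b"CRIT", b"DATA"]           # three equally sized data blocks
parity = xor_parity(stripe)

lost = 1                                       # any single block may be lost
survivors = [b for i, b in enumerate(stripe) if i != lost]
rebuilt = xor_parity(survivors + [parity])
assert rebuilt == stripe[lost]                 # the stripe tolerates one failure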
54

On the evaluation of regional climate model simulations over South America

Lange, Stefan 28 October 2015 (has links)
Diese Dissertation beschäftigt sich mit regionaler Klimamodellierung über Südamerika, der Analyse von Modellsensitivitäten bezüglich Wolkenparametrisierungen und der Entwicklung neuer Methoden zur Modellevaluierung mithilfe von Klimanetzwerken. Im ersten Teil untersuchen wir Simulationen mit dem COnsortium for Small scale MOdeling model in CLimate Mode (COSMO-CLM) und stellen die erste umfassende Evaluierung dieses dynamischen regionalen Klimamodells über Südamerika vor. Dabei untersuchen wir insbesondere die Abhängigkeit simulierter tropischer Niederschläge von Parametrisierungen subgitterskaliger cumuliformer und stratiformer Wolken und finden starke Sensitivitäten bezüglich beider Wolkenparametrisierungen über Land. Durch einen simultanen Austausch der entsprechenden Schemata gelingt uns eine beträchtliche Reduzierung von Fehlern in klimatologischen Niederschlags- und Strahlungsmitteln, die das COSMO-CLM über tropischen Regionen für lange Zeit charakterisierten. Im zweiten Teil führen wir neue Metriken für die Evaluierung von Klimamodellen bezüglich räumlicher Kovariabilitäten ein. Im Kern bestehen diese Metriken aus Unähnlichkeitsmaßen für den Vergleich von simulierten mit beobachteten Klimanetzwerken. Wir entwickeln lokale und globale Unähnlichkeitsmaße zum Zwecke der Darstellung lokaler Unähnlichkeiten in Form von Fehlerkarten sowie der Rangordnung von Modellen durch Zusammenfassung lokaler zu globalen Unähnlichkeiten. Die neuen Maße werden dann für eine vergleichende Evaluierung regionaler Klimasimulationen mit COSMO-CLM und dem Statistical Analogue Resampling Scheme über Südamerika verwendet. Dabei vergleichen wir die sich ergebenden Modellrangfolgen mit solchen basierend auf mittleren quadratischen Abweichungen klimatologischer Mittelwerte und Varianzen und untersuchen die Abhängigkeit dieser Rangfolgen von der betrachteten Jahreszeit, Variable, dem verwendeten Referenzdatensatz und Klimanetzwerktyp. / This dissertation is about regional climate modeling over South America, the analysis of model sensitivities to cloud parameterizations, and the development of novel model evaluation techniques based on climate networks. In the first part we examine simulations with the COnsortium for Small scale MOdeling weather prediction model in CLimate Mode (COSMO-CLM) and provide the first thorough evaluation of this dynamical regional climate model over South America. We focus our analysis on the sensitivity of simulated tropical precipitation to the parameterizations of subgrid-scale cumuliform and stratiform clouds. It is shown that COSMO-CLM is strongly sensitive to both cloud parameterizations over tropical land. Using nondefault cumulus and stratus parameterization schemes we are able to considerably reduce long-standing precipitation and radiation biases that have plagued COSMO-CLM across tropical domains. In the second part we introduce new performance metrics for climate model evaluation with respect to spatial covariabilities. In essence, these metrics consist of dissimilarity measures for climate networks constructed from simulations and observations. We develop both local and global dissimilarity measures to facilitate the depiction of local dissimilarities in the form of bias maps as well as the aggregation of those local to global dissimilarities for the purposes of climate model intercomparison and ranking. 
The new measures are then applied for a comparative evaluation of regional climate simulations with COSMO-CLM and the STatistical Analogue Resampling Scheme (STARS) over South America. We compare model rankings obtained with our new performance metrics to those obtained with conventional root-mean-square errors of climatological mean values and variances, and analyze how these rankings depend on season, variable, reference data set, and climate network type.
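To make the idea of network-based evaluation concrete, the sketch below (Python; the correlation-threshold network construction and the link-difference dissimilarity are illustrative assumptions, not the measures developed in the dissertation) builds a climate network from each data set and compares them link by link, yielding a local score per grid point and a global score by averaging.

import numpy as np

def climate_network(series, threshold=0.5):
    # series: array of shape (n_gridpoints, n_times); links where |correlation| > threshold.
    corr = np.corrcoef(series)
    adjacency = (np.abs(corr) > threshold).astype(int)
    np.fill_diagonal(adjacency, 0)
    return adjacency

def local_dissimilarity(adj_sim, adj_obs):
    # Fraction of links that differ at each grid point (a simple "bias map").
    return np.mean(adj_sim != adj_obs, axis=1)

rng = np.random.default_rng(0)
obs = rng.standard_normal((20, 120))              # 20 grid points, 120 time steps
sim = obs + 0.3 * rng.standard_normal((20, 120))  # imperfect "model"
local = local_dissimilarity(climate_network(sim), climate_network(obs))
print(local.mean())                               # aggregate into a global score for ranking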
55

Classes de Steinitz, codes cycliques de Hamming et classes galoisiennes réalisables d'extensions non abéliennes de degré p³ / Steinitz classes, cyclic Hamming codes and realizable Galois module classes of nonabelian extensions of degree p³

Khalil, Maya 21 June 2016 (has links)
Le résumé n'est pas disponible. / The abstract is not available.
56

Bezdrátová čidla pro měření hladiny vody / Wireless water level sensors

Pospíšil, Jakub January 2010 (has links)
The thesis deals with the design and implementation of a water-level measuring apparatus that sends its data wirelessly to a station 500 m away. Possible solutions are examined step by step and a final design is proposed; the implementation is then described in detail in the following section. Ultrasonic sensors are used for the level measurement, the controlling element is an ATmega162 microcontroller, and the data are transmitted by an RC1280HP transceiver. The apparatus is designed for the lowest possible power consumption, since it will be supplied only from an accumulator. The design of the receiving station is not part of the thesis. A functional, tested sample is presented in the implementation section.
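The measurement principle behind the ultrasonic sensor is a time-of-flight calculation. The short sketch below (Python, illustrative only; the sensor, trigger logic and RC1280HP link are not modelled) converts an echo round-trip time into a water level below a sensor mounted at a known height.

SPEED_OF_SOUND = 343.0            # m/s in air at roughly 20 degrees C

def distance_m(echo_round_trip_s):
    # The echo travels to the water surface and back, hence the division by two.
    return SPEED_OF_SOUND * echo_round_trip_s / 2

def water_level_m(sensor_height_m, echo_round_trip_s):
    return sensor_height_m - distance_m(echo_round_trip_s)

print(water_level_m(2.0, 0.006))  # about 0.97 m of water below a sensor mounted at 2 m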
57

Protichybové systémy s prokládáním / Antierror systems with interleaving

Pacher, Jakub January 2010 (has links)
This work deals with error-control coding systems with interleaving. First, an overview of frequently used error-correction codes is given. Two basic interleaving techniques are then described and compared. The following text focuses on a survey and the characteristics of codes that meet the assignment. After the optimal system is selected, its function is verified in the MATLAB environment. The final step is the creation of a working application in C++, which serves for the error-protected transmission of BMP pictures.
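The basic mechanism compared in the thesis, block interleaving, can be sketched in a few lines (Python, illustrative parameters only): data are written into a matrix by rows and read out by columns, so a burst of consecutive channel errors is spread over several codewords after de-interleaving.

def interleave(symbols, rows, cols):
    # Write row by row, read column by column.
    assert len(symbols) == rows * cols
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(symbols, rows, cols):
    # Inverse mapping: the symbol from position (r, c) was sent at index c * rows + r.
    return [symbols[c * rows + r] for r in range(rows) for c in range(cols)]

data = list(range(12))                       # stand-in for three 4-symbol codewords
sent = interleave(data, rows=3, cols=4)
assert deinterleave(sent, rows=3, cols=4) == data
# A burst of 3 consecutive channel errors in `sent` lands on 3 different rows after
# de-interleaving, i.e. at most one error per codeword, which the code can correct.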
58

Zpracování snímků duhovky pro biometrické aplikace / Processing of iris images for biometric applications

Osičková, Kristýna January 2015 (has links)
Biometrics is the recognition of a person's identity based on biological characteristics that are unique to each individual. Methods of biometric identification are currently becoming increasingly widespread in various sectors. This work focuses on the identification of a person from iris images. The introductory section describes the principles of well-known methods used in biometric applications, and the following part describes the design of the proposed method and its implementation in Matlab. In the practical part, the fast radial symmetry method is used to detect the pupil, from which further image processing is derived; a two-dimensional discrete wavelet transform is used here. The proposed algorithm is tested on the CASIA-Iris-Interval and IITD databases.
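For orientation, a minimal sketch of a single-level two-dimensional discrete wavelet transform is given below (Python with the PyWavelets package; the wavelet family, the input size and the surrounding pipeline are assumptions for illustration, not the thesis' Matlab implementation).

import numpy as np
import pywt   # PyWavelets

iris_region = np.random.rand(64, 256)          # stand-in for an unwrapped iris region
cA, (cH, cV, cD) = pywt.dwt2(iris_region, "haar")
# cA holds the coarse approximation; cH, cV and cD hold horizontal, vertical and
# diagonal detail coefficients that can be thresholded into a binary feature code.
print(cA.shape, cH.shape)                      # (32, 128) for a 64x256 input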
59

A smart sound fingerprinting system for monitoring elderly people living alone

El Hassan, Salem January 2021 (has links)
There is a sharp increase in the number of old people living alone throughout the world. More often than not, such people require continuous and immediate care and attention in their everyday lives, hence the need for round-the-clock monitoring, albeit in a respectful, dignified and non-intrusive way. For example, continuous care is required when they become frail and less active, and immediate attention is required when they fall or remain in the same position for a long time. To this end, various monitoring technologies have been developed, yet there are major improvements still to be realised. Current technologies include indoor positioning systems (IPSs) and health monitoring systems. The former rely on defined configurations of various sensors to capture a person's position within a given space in real time. The functionality of the sensors varies depending on receiving appropriate data using WiFi, radio frequency identification (RFID), ultra-wideband (UWB), dead reckoning (DR), infrared indoor (IR), Bluetooth (BLE), acoustic signal, visible light detection, and sound signal monitoring. The systems use various algorithms to capture proximity, location detection, time of arrival, time difference of arrival, angle of arrival, and received signal strength data. Health monitoring technologies capture important health data using accelerometer and gyroscope sensors. In some studies, audio fingerprinting has been used to detect indoor environment sound variation and has largely been based on recognising TV sound and songs. This has been achieved using various staging methods, including pre-processing, framing, windowing, time/frequency domain feature extraction, and post-processing. Time/frequency domain feature extraction tools used include Fourier Transforms (FTs), the Modified Discrete Cosine Transform (MDCT), Principal Component Analysis (PCA), Mel-Frequency Cepstrum Coefficients (MFCCs), the Constant Q Transform (CQT), the Local Energy Centroid (LEC), and the Wavelet transform. Artificial intelligence (AI) and probabilistic algorithms have also been used in IPSs to classify and predict different activities, with interesting applications in healthcare monitoring. Several tools have been applied in IPSs and audio fingerprinting. They include the Radial Basis Function kernel (RBF), Support Vector Machines (SVMs), Decision Trees (DTs), Hidden Markov Models (HMMs), Naïve Bayes (NB), Gaussian Mixture Modelling (GMM), clustering algorithms, Artificial Neural Networks (ANNs), and Deep Learning (DL). Despite all these attempts, there is still a major gap for a completely non-intrusive system capable of monitoring what an elderly person living alone is doing, where and for how long, and providing a quick traffic-light-like risk score prompting immediate action or otherwise. In this thesis, a cost-effective and completely non-intrusive indoor positioning and activity-monitoring system for elderly people living alone has been developed, tested and validated in a typical residential living space. The proposed system works based on five phases: (1) Set-up phase that defines the typical activities of daily living (TADLs). (2) Configuration phase that optimises the implementation of the required sensors in exemplar flat No.1. (3) Learning phase whereby sound and position data of the TADLs are collected and stored in a fingerprint reference data set. (4) Listening phase whereby real-time data are collected and compared against the reference data set to provide information as to what a person is doing, when, and for how long.
(5) Alert phase whereby a health frailty score varying between 0 (unwell) and 10 (healthy) is generated in real time. Two typical but different residential flats (referred to here as Flats No.1 and No.2) are used in the study. The system is implemented in the bathroom, living room, and bedroom of flat No.1, which includes various floor types (carpet, tiles, laminate) to distinguish between the various sounds generated upon walking on such floors. The data captured during the learning phase yield the reference data set and include position and sound fingerprints. The latter are generated from test recordings of a specific TADL, thus providing time- and frequency-based extracted features: frequency peak magnitude (FPM), Zero-Crossing Rate (ZCR), and Root Mean Square Error (RMSE). The former are generated from distance measurement. The sampling rate of the recorded sound is 44.1 kHz. A Fast Fourier Transform (FFT) is applied to 0.1-second intervals of the recorded sound, with spectral leakage minimised using the Hamming window. The frequency peaks are detected from the spectrogram matrices to get the most appropriate FPM match between the reference and sample data. The position detection of the monitored person is based on the distance between that captured during the learning and listening phases of the system in real time. A typical furnished one-bedroom flat (flat No.2) is used to validate the system. The topologies and floorings of flats No.1 and No.2 are different. The validation is based on "happy" and "unusual" but typical behaviours. Happy ones include typical TADLs of a healthy elderly person living alone, with a risk metric higher than 8. Unusual ones mimic acute or chronic activities (or lack thereof), for example falling and remaining on the floor, or staying in bed for long periods, i.e. scenarios in which an elderly person may be in a compromised situation, which is detected by a sudden drop of the risk metric (lower than 4) in real time. Machine learning classification algorithms are used to identify the location, activity, and time interval in real time, with a promising early performance of 94% in detecting the right activity and the right room at the right time.
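A minimal sketch of the per-frame feature extraction described above is given below (Python; the exact framing, peak matching and scoring of the thesis are not reproduced, and a plain RMS energy value stands in for the abstract's RMSE): a 0.1 s frame of 44.1 kHz audio is Hamming-windowed, its FFT peak magnitude (FPM) is located, and the zero-crossing rate and RMS value are computed.

import numpy as np

FS = 44_100
FRAME = int(0.1 * FS)                 # 0.1 s analysis interval

def frame_features(frame):
    windowed = frame * np.hamming(len(frame))          # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1 / FS)
    peak_bin = int(np.argmax(spectrum))
    fpm = spectrum[peak_bin]                            # frequency peak magnitude
    peak_freq = freqs[peak_bin]
    zcr = np.mean(np.abs(np.diff(np.sign(frame))) > 0)  # zero-crossing rate
    rms = np.sqrt(np.mean(frame ** 2))                  # frame energy
    return peak_freq, fpm, zcr, rms

# Example: a 1 kHz test tone occupying one analysis frame.
t = np.arange(FRAME) / FS
print(frame_features(np.sin(2 * np.pi * 1000 * t)))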
60

Viterbi Decoded Linear Block Codes for Narrowband and Wideband Wireless Communication Over Mobile Fading Channels

Staphorst, Leonard 08 August 2005 (has links)
Since the frantic race towards the Shannon bound [1] commenced in the early 1950s, linear block codes have become integral components of most digital communication systems. Both binary and non-binary linear block codes have proven themselves as formidable adversaries against the impediments presented by wireless communication channels. However, prior to the landmark 1974 paper [2] by Bahl et al. on the optimal Maximum a-Posteriori Probability (MAP) trellis decoding of linear block codes, practical linear block code decoding schemes were not only based on suboptimal hard decision algorithms, but also code-specific in most instances. In 1978 Wolf expedited the work of Bahl et al. by demonstrating the applicability of a block-wise Viterbi Algorithm (VA) to Bahl-Cocke-Jelinek-Raviv (BCJR) trellis structures as a generic optimal soft decision Maximum-Likelihood (ML) trellis decoding solution for linear block codes [3]. This study, largely motivated by code implementers' ongoing search for generic linear block code decoding algorithms, builds on the foundations established by Bahl, Wolf and other contributing researchers by thoroughly evaluating the VA decoding of popular binary and non-binary linear block codes on realistic narrowband and wideband digital communication platforms in lifelike mobile environments. Ideally, generic linear block code decoding algorithms must not only be modest in terms of computational complexity, but they must also be channel aware. Such universal algorithms will undoubtedly be integrated into most channel coding subsystems that adapt to changing mobile channel conditions, such as the adaptive channel coding schemes of current Enhanced Data Rates for GSM Evolution (EDGE), 3rd Generation (3G) and Beyond 3G (B3G) systems, as well as future 4th Generation (4G) systems. In this study, classic BCJR linear block code trellis construction is annotated and applied to contemporary binary and non-binary linear block codes. Since BCJR trellis structures are inherently sizable and intricate, rudimentary trellis complexity calculation and reduction algorithms are also presented and demonstrated. The block-wise VA for BCJR trellis structures, initially introduced by Wolf in [3], is revisited and improved to incorporate Channel State Information (CSI) during its ML decoding efforts. In order to accurately appraise the Bit-Error-Rate (BER) performances of VA decoded linear block codes in authentic wireless communication environments, Additive White Gaussian Noise (AWGN), flat fading and multi-user multipath fading simulation platforms were constructed. Included in this task was the development of baseband complex flat and multipath fading channel simulator models, capable of reproducing the physical attributes of realistic mobile fading channels. Furthermore, a complex Quadrature Phase Shift Keying (QPSK) system was employed as the narrowband communication link of choice for the AWGN and flat fading channel performance evaluation platforms. The versatile B3G multi-user multipath fading simulation platform, however, was constructed using a wideband RAKE receiver-based complex Direct Sequence Spread Spectrum Multiple Access (DS/SSMA) communication system that supports unfiltered and filtered Complex Spreading Sequences (CSS).
This wideband platform is not only capable of analysing the influence of frequency selective fading on the BER performances of VA decoded linear block codes, but also the influence of the Multi-User Interference (MUI) created by other users active in the Code Division Multiple Access (CDMA) system. CSS families considered during this study include Zadoff-Chu (ZC) [4, 5], Quadriphase (QPH) [6], Double Sideband (DSB) Constant Envelope Linearly Interpolated Root-of-Unity (CE-LI-RU) filtered Generalised Chirp-like (GCL) [4, 7-9] and Analytical Bandlimited Complex (ABC) [7, 10] sequences. Numerous simulated BER performance curves, obtained using the AWGN, flat fading and multi-user multipath fading channel performance evaluation platforms, are presented in this study for various important binary and non-binary linear block code classes, all decoded using the VA. Binary linear block codes examined include Hamming and Bose-Chaudhuri-Hocquenghem (BCH) codes, whereas popular burst error correcting non-binary Reed-Solomon (RS) codes receive special attention. Furthermore, a simple cyclic binary linear block code is used to validate the viability of employing the reduced trellis structures produced by the proposed trellis complexity reduction algorithm. The simulated BER performance results shed light on the error correction capabilities of these VA decoded linear block codes when influenced by detrimental channel effects, including AWGN, Doppler spreading, diminished Line-of-Sight (LOS) signal strength, multipath propagation and MUI. The study also investigates the impact of other pertinent communication system configuration alternatives, including channel interleaving, code puncturing, the quality of the CSI available during VA decoding, RAKE diversity combining approaches and CSS correlation characteristics. From these simulated results it can not only be gathered that the VA is an effective generic optimal soft input ML decoder for both binary and non-binary linear block codes, but also that the inclusion of CSI during VA metric calculations can fortify the BER performances of such codes beyond that attainable by classic ML decoding algorithms. / Dissertation (MEng(Electronic))--University of Pretoria, 2006. / Electrical, Electronic and Computer Engineering / unrestricted
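To make the decoding approach concrete, the sketch below (Python, illustrative only; it uses a (7,4) Hamming code and a plain squared-Euclidean branch metric rather than the study's CSI-weighted metrics) runs the block-wise Viterbi algorithm over the Wolf syndrome trellis of a binary linear block code and returns the maximum-likelihood codeword for soft-decision inputs.

import numpy as np

# Parity-check matrix of a (7,4) Hamming code, used here only as a small worked example.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def viterbi_block_decode(received, H):
    # Soft-decision ML decoding on the Wolf (syndrome) trellis of a binary linear
    # block code: states are partial syndromes, branches are the two bit choices.
    m, n = H.shape
    num_states = 1 << m
    col = [int("".join(str(b) for b in H[:, i]), 2) for i in range(n)]
    INF = float("inf")
    metric = [0.0] + [INF] * (num_states - 1)          # start in the zero syndrome
    paths = [[] for _ in range(num_states)]
    for i in range(n):
        new_metric = [INF] * num_states
        new_paths = [None] * num_states
        for state in range(num_states):
            if metric[state] == INF:
                continue
            for bit in (0, 1):
                nxt = state ^ col[i] if bit else state
                # Branch metric: squared distance to the BPSK symbol (bit 0 -> +1, bit 1 -> -1).
                cand = metric[state] + (received[i] - (1 - 2 * bit)) ** 2
                if cand < new_metric[nxt]:
                    new_metric[nxt] = cand
                    new_paths[nxt] = paths[state] + [bit]
        metric, paths = new_metric, new_paths
    return paths[0]                                    # valid codewords end in the zero syndrome

# The all-zero codeword is sent over an AWGN-like channel; one sample is badly corrupted.
rx = np.array([0.9, 1.1, -0.2, 0.8, 1.0, 0.7, 1.2])
print(viterbi_block_decode(rx, H))                     # -> [0, 0, 0, 0, 0, 0, 0]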
