31

Matlab Implementation of a Tornado Forward Error Correction Code

Noriega, Alexandra 05 1900 (has links)
This research discusses how a Tornado forward error correcting channel code (FEC) is designed to deliver digital data streams to the receiver. The complete design is based on the Tornado channel code with binary phase shift keying (BPSK) modulation over an additive white Gaussian noise (AWGN) channel. The communication link was simulated in Matlab to show the theoretical efficiency of the system, with the data stream used as input to the simulated communication system. The purpose of this paper is to introduce a simulation technique that has been successfully used to determine how well an FEC can be expected to perform when transferring digital data streams, and to show how FEC improves a digital data stream to yield a better digital communications system. The results conclude with comparisons of different possible configurations of the Tornado FEC code.
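The comparison methodology described in this abstract — simulate a coded and an uncoded BPSK link over AWGN and compare bit error rates — can be illustrated with a minimal Python sketch. This is not the thesis's Tornado code: a simple rate-1/3 repetition code stands in for the FEC, and the function and parameter names (bpsk_awgn_ber, repeat) are illustrative assumptions.

```python
import numpy as np

def bpsk_awgn_ber(ebn0_db, n_bits=200_000, repeat=1, rng=np.random.default_rng(0)):
    """Estimate BER of BPSK over AWGN, optionally with a simple repetition code."""
    bits = rng.integers(0, 2, n_bits)
    coded = np.repeat(bits, repeat)                 # trivial rate-1/repeat FEC (stand-in only)
    symbols = 1.0 - 2.0 * coded                     # BPSK mapping: 0 -> +1, 1 -> -1
    ebn0 = 10 ** (ebn0_db / 10)
    rate = 1.0 / repeat
    noise_std = np.sqrt(1.0 / (2.0 * ebn0 * rate))  # Es/N0 = Eb/N0 * code rate for unit-energy symbols
    received = symbols + noise_std * rng.standard_normal(symbols.size)
    decisions = received.reshape(-1, repeat).sum(axis=1) < 0   # soft-combine, then hard decision
    return np.mean(decisions != bits)

for ebn0_db in range(0, 9, 2):
    uncoded = bpsk_awgn_ber(ebn0_db, repeat=1)
    coded = bpsk_awgn_ber(ebn0_db, repeat=3)
    print(f"Eb/N0 = {ebn0_db} dB: uncoded BER = {uncoded:.2e}, rate-1/3 repetition BER = {coded:.2e}")
```

A Tornado code is far more powerful than the repetition code used here; the sketch only shows the shape of the BER-vs-Eb/N0 comparison the abstract describes.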
32

Wavelet based MIMO-multicarrier system using forward error correction and beam forming

Asif, Rameez, Ali, N.T., Migdadi, Hassan S.O., Abd-Alhameed, Raed, Hussaini, Abubakar S., Ghazaany, Tahereh S., Naveed, S., Noras, James M., Excell, Peter S., Rodriguez, Jonathan January 2013 (has links)
Wavelet-based multicarrier systems have attracted researchers' attention over the past few years as a replacement for conventional OFDM in next-generation communication systems. In this paper we investigate the performance of such wavelet-based systems using forward error correction with convolutional coding and interleaving, first in a wavelet SISO system and then in a wavelet multicarrier modulation (WMCM) multiple-input multiple-output (MIMO) system with convolutional coding and beamforming, in order to reduce the source bit rate and the overall system error while increasing the data rate. Results show outstanding bit error rate versus signal-to-noise ratio performance. Beyond the improved performance, the proposed systems keep the computational burden off the receiver, which has tighter cost and power constraints.
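As a rough illustration of the "convolutional coding and interleaving" stage this abstract refers to, the sketch below implements a standard rate-1/2 convolutional encoder (generator polynomials 7 and 5 octal, constraint length 3) followed by a simple block interleaver. The generator choice, interleaver depth, and the names conv_encode and block_interleave are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Rate-1/2 convolutional encoder with constraint length k (zero-terminated)."""
    state = 0
    out = []
    for b in list(bits) + [0] * (k - 1):            # flush the encoder with zeros
        state = ((state << 1) | int(b)) & ((1 << k) - 1)
        out.append(bin(state & g1).count("1") % 2)  # parity bit from generator g1
        out.append(bin(state & g2).count("1") % 2)  # parity bit from generator g2
    return np.array(out, dtype=int)

def block_interleave(bits, rows=4):
    """Write row-wise into a rows x cols block, read column-wise (zero-padded)."""
    cols = int(np.ceil(len(bits) / rows))
    padded = np.concatenate([bits, np.zeros(rows * cols - len(bits), dtype=int)])
    return padded.reshape(rows, cols).T.reshape(-1)

info = np.random.default_rng(1).integers(0, 2, 16)
coded = conv_encode(info)
interleaved = block_interleave(coded)
print("info bits:       ", info)
print("coded bits:      ", coded)
print("interleaved bits:", interleaved)
```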
33

LOW DENSITY PARITY CHECK CODES FOR TELEMETRY APPLICATIONS

Hayes, Bob 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Next-generation satellite communication systems require efficient coding schemes that enable high data rates, impose low overhead, and deliver excellent bit error rate performance. A newly rediscovered class of block codes called Low Density Parity Check (LDPC) codes has the potential to revolutionize forward error correction (FEC) because of their very high coding rates. This paper presents a brief overview of LDPC coding and decoding. An LDPC algorithm developed by Goddard Space Flight Center is discussed, and an overview of an accompanying VHDL development by L-3 Communications Cincinnati Electronics is presented.
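To make the idea of LDPC decoding concrete, here is a minimal sketch of hard-decision bit-flipping decoding on a toy parity-check matrix. The matrix and decoding schedule are illustrative assumptions only; they are not the Goddard Space Flight Center code discussed in the paper, and practical decoders use much larger matrices and soft-decision belief propagation.

```python
import numpy as np

# Toy sparse parity-check matrix H (6 checks x 9 bits); real LDPC codes are far larger.
H = np.array([
    [1, 1, 0, 1, 0, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 0, 1, 0, 0, 1],
    [1, 0, 0, 0, 1, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 0, 1],
    [0, 0, 1, 1, 1, 0, 1, 0, 0],
], dtype=int)

def bit_flip_decode(received, H, max_iters=20):
    """Hard-decision bit flipping: flip the bits involved in the most failed checks."""
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2                     # which parity checks fail
        if not syndrome.any():
            return word, True                       # all checks satisfied
        fail_counts = syndrome @ H                  # failed checks touching each bit
        flip = fail_counts == fail_counts.max()
        word = (word + flip.astype(int)) % 2
    return word, False

codeword = np.zeros(9, dtype=int)                   # the all-zero word is always a valid codeword
received = codeword.copy()
received[4] ^= 1                                    # inject a single bit error
decoded, ok = bit_flip_decode(received, H)
print("syndrome cleared:", ok, "| decoded:", decoded)
```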
34

Vad driver de svenska småhuspriserna? (What is driving Swedish single-family house prices?)

Bergendahl, Robin January 2014 (has links)
The purpose of this study is to investigate which factors affect Swedish single-family house prices and, if so, how and to what extent. Building on earlier studies, which consistently identify the mortgage rate and household disposable income as the factors with the clearest impact on Swedish real estate prices, this study extends the set of explanatory variables with the help of a stock-flow model. Time series data for 1993-2013 are tested for unit roots and cointegration and then estimated in a regression as an error correction model, with the aim of capturing both the long-run and the short-run relationship. The results confirm the repo rate and disposable income as two key factors in explaining the long-run relationship with Swedish single-family house prices, together with additional factors such as GDP, household debt and unemployment. In the short run, the historical development of house prices is a key determinant, but disposable income, the interest rate, GDP and household debt are also important in explaining house prices. One conclusion is that households' capacity for increased consumption when incomes rise is reflected in house prices, while a low interest rate means that more households than ever can afford to borrow in a market with an already very restricted housing supply.
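The estimation strategy summarised above (unit root tests, a cointegration test, then an error correction model) can be sketched with statsmodels. The snippet below runs a two-step Engle-Granger procedure on synthetic data; the variable names (price, income) and the data are placeholders, not the thesis's dataset or final specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(42)
n = 120
income = np.cumsum(rng.normal(0.5, 1.0, n))              # synthetic I(1) "income" series
price = 2.0 * income + rng.normal(0, 1.0, n)             # cointegrated "house price" series
df = pd.DataFrame({"price": price, "income": income})

# Pre-tests: unit roots and cointegration
print("ADF p-value (price):", adfuller(df["price"])[1])
print("Engle-Granger cointegration p-value:", coint(df["price"], df["income"])[1])

# Step 1: long-run (levels) regression; residuals measure the deviation from equilibrium
long_run = sm.OLS(df["price"], sm.add_constant(df["income"])).fit()
ect = long_run.resid

# Step 2: short-run ECM in first differences with the lagged error correction term
d_price = df["price"].diff().dropna()
X = pd.DataFrame({
    "d_income": df["income"].diff(),
    "ect_lag": ect.shift(1),
}).dropna()
ecm = sm.OLS(d_price.loc[X.index], sm.add_constant(X)).fit()
print(ecm.params)   # the ect_lag coefficient is the speed-of-adjustment estimate
```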
35

Exploiting the implicit error correcting ability of networks that use random network coding / by Suné von Solms

Von Solms, Suné January 2009 (has links)
In this dissertation, we developed a method that uses the redundant information implicitly generated inside a random network coding network to apply error correction to the transmitted message. The obtained results show that the developed implicit error correcting method can reduce the effect of errors in a random network coding network without the addition of redundant information at the source node. This method presents numerous advantages compared to the documented concatenated error correction methods. We found that various error correction schemes can be implemented without adding redundancy at the source nodes. The decoding ability of this method is dependent on the network characteristics. We found that large networks with a high level of interconnectivity yield more redundant information allowing more advanced error correction schemes to be implemented. Network coding networks are prone to error propagation. We present the results of the effect of link error probability on our scheme and show that our scheme outperforms concatenated error correction schemes for low link error probability. / Thesis (M.Ing. (Computer Engineering))--North-West University, Potchefstroom Campus, 2010.
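A minimal sketch of the underlying mechanism — random linear network coding over GF(2), where a sink recovers the source packets by Gaussian elimination on the received coefficient vectors — is given below. It illustrates the redundancy that the dissertation exploits (more received combinations than source packets), but not the thesis's specific error correction scheme; the packet sizes and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
k, packet_len, n_received = 4, 8, 6                  # 4 source packets, 6 coded packets arrive

source = rng.integers(0, 2, (k, packet_len))         # original packets (bits)

def gf2_solve(A, B):
    """Recover X with A @ X = B over GF(2) by Gaussian elimination (A may have extra rows)."""
    A, B = A.copy() % 2, B.copy() % 2
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if A[i, c]), None)
        if pivot is None:
            return None                              # not enough independent combinations
        A[[r, pivot]], B[[r, pivot]] = A[[pivot, r]], B[[pivot, r]]
        for i in range(rows):
            if i != r and A[i, c]:
                A[i] ^= A[r]
                B[i] ^= B[r]
        r += 1
    return B[:cols]

decoded = None
while decoded is None:                               # retry until k independent combinations arrive
    coeffs = rng.integers(0, 2, (n_received, k))     # random GF(2) coding coefficients
    coded = coeffs @ source % 2                      # coded packets carried through the network
    decoded = gf2_solve(coeffs, coded)
print("decoded equals source:", np.array_equal(decoded, source))
```

The two extra received combinations (6 arriving for 4 source packets) are exactly the kind of implicit redundancy the dissertation proposes to reuse for error correction instead of discarding.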
37

EFFECT OF ANCILLA LOSSES ON FAULT-TOLERANT QUANTUM ERROR CORRECTION IN THE [[7,1,3]] STEANE CODE

Nawaf, Sameer Obaid 01 December 2013 (has links)
Fault-tolerant quantum error correction is a procedure with the property that if a single gate in the procedure fails, the failure causes at most one error in the output qubits of the encoded block. A quantum computer is based on two-level quantum systems (qubits); however, most physical systems are constructed from subspaces of more than two levels. Imperfect control and environmental interactions in these systems lead to leakage faults: errors that couple states inside the code subspace to states outside it. One example of a leakage fault is a loss error. Because the fault-tolerant procedure is designed to deal with Pauli errors, it may fail to recognize a leakage fault, and a single leakage fault can then disrupt the fault-tolerant technique. In this thesis we investigate the effect of ancilla losses on fault-tolerant quantum error correction in the [[7,1,3]] Steane code. We prove that both the Shor and the Steane methods remain fault tolerant when loss errors occur.
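For background on the code in the title: the [[7,1,3]] Steane code is built from the classical [7,4] Hamming code, whose parity-check matrix defines both its X-type and Z-type stabilizer generators. The sketch below shows only that classical syndrome-decoding step on a toy example; it is illustrative background, not the thesis's fault-tolerant procedure or its treatment of ancilla losses.

```python
import numpy as np

# Parity-check matrix of the classical [7,4] Hamming code; the Steane code uses the
# same matrix for its X-type and Z-type stabilizer generators.
H = np.array([
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
], dtype=int)

def correct_single_error(word):
    """Syndrome-decode a 7-bit word, correcting at most one bit flip."""
    syndrome = H @ word % 2
    position = int("".join(map(str, syndrome)), 2)   # columns of H read as binary 1..7
    corrected = word.copy()
    if position:                                     # a nonzero syndrome points at the flipped bit
        corrected[position - 1] ^= 1
    return corrected, syndrome

codeword = np.zeros(7, dtype=int)                    # the all-zero word is a valid codeword
noisy = codeword.copy()
noisy[5] ^= 1                                        # flip bit 6 (1-indexed)
fixed, syndrome = correct_single_error(noisy)
print("syndrome:", syndrome, "| corrected back to codeword:", np.array_equal(fixed, codeword))
```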
38

What is driving house prices in Stockholm?

Ångman, Josefin January 2016 (has links)
An increased mortgage cap was introduced in 2010, and as of May 1st 2016 an amortization requirement was introduced in an attempt to slow down house price development in Sweden. Fluctuations in house prices can significantly influence macroeconomic stability, and with house prices in Stockholm rising even more rapidly than in Sweden as a whole, understanding Stockholm's dynamics is very important, especially for policy. Stockholm house prices between the first quarter of 1996 and the fourth quarter of 2015 are therefore investigated using a Vector Error Correction framework. This approach allows a separation between the long-run equilibrium price and short-run dynamics. Decreases in the real mortgage rate and increases in real financial wealth seem to be most important in explaining rising house prices; increased real construction costs and increased real disposable income also seem to have an effect. The estimated models suggest that around 40-50 percent, on average, of a short-term deviation from the long-run equilibrium price is closed within a year. As of the last quarter of 2015, real house prices are significantly higher than the modeled long-run equilibrium price, with a deviation of around 6-7 percent.
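A Vector Error Correction model of the kind described here can be fitted with statsmodels. The sketch below uses synthetic quarterly data and illustrative variable names (house_price, mortgage_rate, income) and an assumed lag order; it is not the thesis's dataset or specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(0)
n = 80                                               # 80 quarters, roughly 1996Q1-2015Q4
trend = np.cumsum(rng.normal(0.3, 1.0, n))           # shared stochastic trend
data = pd.DataFrame({
    "house_price": trend + rng.normal(0, 0.5, n),
    "mortgage_rate": -0.5 * trend + rng.normal(0, 0.5, n),
    "income": 0.8 * trend + rng.normal(0, 0.5, n),
})

# Choose the cointegration rank with Johansen's trace test
rank = select_coint_rank(data, det_order=0, k_ar_diff=2)
print("selected cointegration rank:", rank.rank)

# Fit the VECM; alpha holds the speed-of-adjustment coefficients toward the long-run relation
model = VECM(data, k_ar_diff=2, coint_rank=max(rank.rank, 1), deterministic="co")
res = model.fit()
print("adjustment coefficients (alpha):\n", res.alpha)
print("cointegrating vectors (beta):\n", res.beta)
```

The alpha coefficients play the role of the "40-50 percent of a deviation closed within a year" figure reported in the abstract: they measure how quickly prices move back toward the estimated long-run equilibrium.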
39

The Efficacy of Dynamic Written Corrective Feedback on Intermediate-high ESL Learners' Writing Accuracy

Lee, Soon Yeun 28 November 2009 (has links) (PDF)
This study investigated the efficacy of dynamic written corrective feedback (DWCF) on intermediate-high students' writing accuracy when compared to a traditional grammar instruction approach. DWCF is an innovative written corrective feedback method that requires a multifaceted process and interaction between the teacher and the students in order to help the students improve their writing accuracy. The central principle of DWCF is that feedback should be manageable, meaningful, timely, and constant. The research question was raised based on the positive effects of DWCF found in advanced-low and advanced-mid proficiency level students (Evans et al., in press; Evans, Hartshorn, & Strong-Krause, 2009; Hartshorn, 2008; Hartshorn et al., in press). Similar to previous studies, this study attempted to examine the effectiveness of DWCF in terms of proficiency level. It further explored students' perspectives and attitudes towards DWCF. Two groups of ESL students participated in this study: a control group (n=18) that was taught using a traditional grammar instruction method, and a treatment group (n=35) that was taught using a DWCF approach. The findings in this study revealed that both methods improved the intermediate-high students' linguistic accuracy in writing. However, the findings of this study suggest that the instruction utilizing DWCF is preferable to traditional grammar instruction when it comes to improving intermediate-high students' writing accuracy for two reasons: first, DWCF was slightly more effective than the traditional grammar instruction used, and second, students strongly preferred the instruction using DWCF to traditional grammar instruction. The findings of this study further validate other work suggesting the positive effects found in advanced proficiency levels. This study indicates that ESL learners benefit from manageable, meaningful, timely, and constant error feedback in improving their linguistic accuracy in writing. Furthermore, this study suggests the desirability of applying DWCF to other contexts.
40

An analysis of a relationship between Remuneration and Labour Productivity in South Africa / Johannes Tshepiso Tsoku

Tsoku, Johannes Tshepiso January 2014 (has links)
This study analyses the relationship between remuneration (the real wage) and labour productivity in South Africa at the macroeconomic level, using time series and econometric techniques. The results show significant evidence of a structural break in 1990. The break appears to have affected the employment level and subsequently fed through into employees' remuneration (real wages) and productivity. A long-run cointegrating relationship was found between remuneration and labour productivity for the period 1990 to 2011. In the long run, a 1% increase in labour productivity is associated with an approximately 1.98% rise in remuneration. The coefficient of the error correction term in the labour productivity equation is large, indicating a rapid adjustment of labour productivity towards equilibrium. However, remuneration does not Granger-cause labour productivity, nor vice versa. / Thesis (M.Com. (Statistics))--North-West University, Mafikeng Campus, 2014.
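The Granger causality check mentioned in the last sentence can be illustrated with statsmodels on synthetic data, as in the sketch below; the series names, frequency, and lag order are placeholders, not the study's data or chosen specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(3)
n = 88                                               # e.g. quarterly observations, 1990-2011
productivity = np.cumsum(rng.normal(0.2, 1.0, n))
remuneration = np.cumsum(rng.normal(0.2, 1.0, n))

# Work in first differences and ask whether lagged remuneration helps predict productivity.
df = pd.DataFrame({
    "d_productivity": np.diff(productivity),
    "d_remuneration": np.diff(remuneration),
})
# Column order matters: the test asks whether the SECOND column Granger-causes the first.
results = grangercausalitytests(df[["d_productivity", "d_remuneration"]], maxlag=4)
for lag, (tests, _) in results.items():
    print(f"lag {lag}: ssr F-test p-value = {tests['ssr_ftest'][1]:.3f}")
```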
