61 |
On Constructing Low-Density Parity-Check Codes. Ma, Xudong. January 2007 (has links)
This thesis focuses on designing Low-Density Parity-Check (LDPC)
codes for forward-error-correction. The target application is
real-time multimedia communications over packet networks. We
investigate two code design issues that are important in these
application scenarios: designing LDPC codes with low decoding
latency, and constructing capacity-approaching LDPC codes with very
low error probabilities.
On designing LDPC codes with low decoding latency, we present a
framework for optimizing the code parameters so that decoding can
be completed within only a small number of iterations. The
brute-force approach to such optimization is numerically
intractable, because it involves a difficult discrete optimization
problem. In this thesis, we present an asymptotic approximation to
the number of decoding iterations. Based on this approximation, we
propose an approximate optimization framework for finding
near-optimal code parameters that minimize the number of decoding
iterations. The approximate optimization approach is numerically
tractable. Numerical results confirm that the proposed approach
has excellent numerical properties and yields codes that perform
very well in terms of the number of decoding iterations: codes
designed by the proposed approach can require as few as one-fifth
the decoding iterations of some previously well-known codes. The
numerical results also show that the proposed asymptotic
approximation is generally tight, even for cases far from the
asymptotic limit.
On constructing capacity-approaching LDPC codes with very low error
probabilities, we propose a new LDPC code construction scheme based
on $2$-lifts. Based on stopping-set distribution analysis, we
propose design criteria under which the resulting codes have very
low error floors. High error floors are the main problem with
previously constructed capacity-approaching codes and prevent them
from achieving very low error probabilities. Numerical results
confirm that codes with very low error floors can be obtained by
the proposed construction scheme and design criteria. Whereas codes
from previous standard construction schemes have error floors at
levels of $10^{-3}$ to $10^{-4}$, codes from the proposed approach
show no observable error floors at levels above $10^{-7}$. The
error floors of the proposed codes are also significantly lower
than those of codes from previous approaches to constructing
low-error-floor codes.
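To make the $2$-lift construction concrete, here is a minimal Python sketch, under the standard graph-lift convention (the thesis's own criteria for choosing the lift are not reproduced here) that each nonzero entry of a base parity-check matrix is replaced by either a $2 \times 2$ identity block or a $2 \times 2$ swap block:

import numpy as np

# Sketch of a 2-lift of a parity-check matrix (toy example, not a practical
# code). Each 1 in the base matrix H becomes the 2x2 identity (parallel edge
# copies) or the 2x2 swap (crossed edge copies), according to a per-edge sign.
I2 = np.eye(2, dtype=int)
X2 = np.array([[0, 1], [1, 0]])

def two_lift(H, signs):
    m, n = H.shape
    lifted = np.zeros((2 * m, 2 * n), dtype=int)
    for i in range(m):
        for j in range(n):
            if H[i, j]:
                lifted[2*i:2*i+2, 2*j:2*j+2] = X2 if signs[i, j] else I2
    return lifted

H = np.array([[1, 1, 0],
              [0, 1, 1]])
rng = np.random.default_rng(0)
print(two_lift(H, rng.integers(0, 2, size=H.shape)))

The choice of signs determines the cycle and stopping-set structure of the lifted graph, which is where design criteria such as those proposed in the thesis come into play.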
|
62 |
On single-crystal solid-state NMR based quantum information processing. Moussa, Osama. January 2010 (has links)
Quantum information processing devices promise to solve some problems more efficiently than their classical counterparts. The source of the speedup is the structure of quantum theory itself; in that sense, the physical units that form the building blocks of such devices are the source of their power. The quest, then, is to find or manufacture a system that behaves according to quantum theory and yet is controllable in such a way that the desired algorithms can be implemented. Candidate systems are benchmarked against general criteria to evaluate their success. In this thesis, I advance a particular system and present the progress made toward each of these criteria. The system is a three-qubit carbon-13 solid-state nuclear magnetic resonance (NMR) based quantum processor. I report results concerning system characterization and control, pseudopure state preparation, and quantum error correction. I also report on using the system to test a central question in the foundations of quantum mechanics.
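As a loose classical analogue of the quantum error correction reported here, the following Python toy shows majority-vote decoding of a three-bit repetition code; it only illustrates the redundancy idea, not the quantum protocol run on the NMR processor:

# Classical toy analogue of the three-qubit bit-flip code.
def encode(bit):
    return [bit, bit, bit]      # one logical bit -> three physical bits

def decode(bits):
    return int(sum(bits) >= 2)  # majority vote corrects any single flip

word = encode(1)
word[0] ^= 1                    # introduce a single bit-flip error
assert decode(word) == 1        # the logical bit is recovered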
|
63 |
Error correction model estimation of the Canada-US real exchange rate. Ye, Dongmei. 18 January 2008 (has links)
Using the error correction model, we link the long-run behavior of the Canada-US real exchange rate to its short-run dynamics. The equilibrium real exchange rate is determined by energy and non-energy commodity prices over the period 1973Q1-1992Q1. However, such a single long-run relationship does not hold when the sample period is extended to 2004Q4. This breakdown can be explained by the structural break we find at 1993Q3. At the break point, the effect of energy price shocks on Canada's real exchange rate turns from negative to positive, while the effect of non-energy commodity price shocks is consistently positive. We find that after one year, 40.03% of the gap between the actual and equilibrium real exchange rate is closed. The Canada-US interest rate differential affects the real exchange rate only temporarily: Canada's real exchange rate depreciates immediately after a decrease in Canada's interest rate and appreciates in the next quarter, though not by as much as it depreciated.
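For reference, a single-equation error correction model of the kind estimated here has the form (notation assumed for illustration, with $q_t$ the log real exchange rate and $p^{en}_t$, $p^{ne}_t$ the energy and non-energy commodity prices):

$\Delta q_t = \alpha \left( q_{t-1} - \beta_0 - \beta_1 p^{en}_{t-1} - \beta_2 p^{ne}_{t-1} \right) + \sum_i \gamma_i \Delta x_{t-i} + \varepsilon_t$

where $\alpha < 0$ is the speed of adjustment toward equilibrium. With quarterly data, a fraction $1-(1+\alpha)^4$ of the gap closes within a year, so the reported 40.03% is consistent with $\alpha \approx -0.12$ per quarter.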
|
64 |
The Study of G7 Business Cycle Correlation. Chen, Yi-Shin. 22 June 2007 (has links)
Abstract
With the progress of globalization and the large increases in international trade and openness, it is important for governments to capture the business cycle correlation with closely linked countries in order to make better policies and keep the economy steady.
This study investigates how the relationships among the G7 business cycles changed after European integration. Taking the 1993 commencement of the European Union (EU) as the break point, we separate the sample period into 1965:1-1992:4 and 1993:1-2006:4. We apply several unit root tests to examine whether the variables are stationary, and Johansen co-integration analysis to test whether a stationary long-run equilibrium exists. To incorporate the long-run information, the Vector Error Correction Model (VECM) is applied to study the relationships between the business cycles of the United States, the EU, and the other G7 countries.
The Johansen co-integration analysis shows that a stationary long-run relationship does exist between their industrial production (IP) series. In addition, the VECM evidence supports the emergence of two cyclically coherent groups -- the Euro-zone and the English-speaking countries -- after the EU commencement in 1993. In conclusion, given the greater correlation of business cycles, the party in office should account more deliberately for the business cycle movements of closely linked countries in this era of regionalization.
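As a sketch of this test-then-fit pipeline, the following Python fragment runs a Johansen trace test and fits a VECM with statsmodels; the synthetic series are placeholders for the IP data, which are not reproduced here:

import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# Synthetic stand-in for the industrial production (IP) series: three series
# sharing one stochastic trend, so one cointegrating relation is expected.
rng = np.random.default_rng(1)
trend = np.cumsum(rng.normal(size=200))
ip = np.column_stack([trend + rng.normal(size=200) for _ in range(3)])

# Johansen trace test: how many cointegrating relations link the series?
jres = coint_johansen(ip, det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1)   # compare with jres.cvt critical values

# Fit a VECM at the rank suggested by the test.
res = VECM(ip, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print("adjustment coefficients:", res.alpha)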
|
65 |
An Examination of the Relationship between Oil Price and Income in Taiwan by Threshold Vector Error Correction Model. Wang, Yu-wun. 27 June 2007 (has links)
Since petroleum is an exhaustible resource, it cannot be
regenerated once consumed, and it is distributed extremely unevenly
across the world: more than half of all petroleum lies in the
Middle East. In recent years, the oil price has fluctuated sharply
and repeatedly broken records. Taiwan produces very little
petroleum and is a price taker, so how the oil price affects the
economy is an important question. According to economic theory,
high oil prices often cause stagflation. In this study, we examine
the long-run relationship between the oil price and personal income
in Taiwan using cointegration theory, and we find that a negative
long-run relationship does exist. In addition, we consider a
nonlinear model, the Threshold Vector Error Correction Model, to
test for a threshold effect in the long-run relationship between
the variables. We conclude that there is a threshold cointegrating
relationship between the oil price and personal income in Taiwan.
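A minimal sketch of the threshold idea follows, with assumed notation rather than the thesis's estimation code: the adjustment coefficient differs depending on whether the lagged cointegrating residual lies inside or outside a threshold band, and the threshold itself is chosen by grid search.

import numpy as np

def threshold_ecm_step(dy, ect, grid):
    # Grid-search the threshold g minimizing the SSR of
    #   dy_t = a1*ect*1(|ect| <= g) + a2*ect*1(|ect| > g) + e_t
    best = None
    for g in grid:
        inside = np.abs(ect) <= g
        X = np.column_stack([ect * inside, ect * ~inside])
        coef, *_ = np.linalg.lstsq(X, dy, rcond=None)
        ssr = float(np.sum((dy - X @ coef) ** 2))
        if best is None or ssr < best[0]:
            best = (ssr, g, coef)
    return best  # (ssr, threshold, [a1, a2])

# Synthetic example: adjustment operates only outside the band |ect| > 1.
rng = np.random.default_rng(2)
ect = rng.normal(scale=2.0, size=500)
dy = -0.3 * ect * (np.abs(ect) > 1.0) + rng.normal(scale=0.1, size=500)
print(threshold_ecm_step(dy, ect, grid=np.linspace(0.2, 3.0, 15)))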
|
66 |
An approach for improving performance of aggregate voice-over-IP traffic. Al-Najjar, Camelia. 30 October 2006 (has links)
The emerging popularity of and interest in Voice-over-IP (VoIP)
have been accompanied by customer concerns about voice quality over
these networks. The lack of a real-time-capable infrastructure in
packet networks, along with the threat of denial-of-service (DoS)
attacks, can degrade the service that voice calls receive, and
these conditions contribute to the decline in call quality in VoIP
applications. Error-correcting and error-concealing techniques
therefore remain the only alternative for providing reasonable
protection of VoIP calls against packet losses. Traditionally, each
voice call employs its own end-to-end forward-error-correction
(FEC) mechanism. In this thesis, we show that when VoIP calls are
aggregated over a provider's link, a suitable linear-time encoding
of the aggregated voice traffic yields considerable quality
improvement with little redundancy. We show that it is possible to
approach channel capacity as more calls are combined, with very
small output loss rates even in the presence of significant packet
loss rates in the network. The advantages of the proposed scheme
far exceed those of similar coding techniques applied to individual
voice calls.
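The gain from coding across an aggregate can be illustrated with the simplest packet-level code: one XOR parity packet per group of packets recovers any single loss in the group. This toy only conveys the idea of coding across aggregated traffic; the linear-time code analyzed in this work is more sophisticated.

def xor_parity(packets):
    # XOR equal-length packets together, byte by byte.
    out = bytearray(len(packets[0]))
    for p in packets:
        for i, b in enumerate(p):
            out[i] ^= b
    return bytes(out)

# Four equal-length "voice" packets from the aggregate, plus one parity packet.
group = [bytes([i] * 8) for i in range(4)]
parity = xor_parity(group)

# If packet 2 is lost, rebuild it from the survivors and the parity packet.
recovered = xor_parity([group[0], group[1], group[3], parity])
assert recovered == group[2]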
|
67 |
none. Lin, Yu-cheng. 30 June 2009 (has links)
Abstract
The dividend discount model values a stock by discounting its expected future dividends. Dividends are determined by the company's core business and are related to retained earnings. In the Taiwan stock market, dividends are not paid every quarter, so I adopt earnings per share as a proxy variable and use market-value weights to construct a dividend series for the Taiwan stock index. I then investigate the relationship between the price index and dividends using the econometric model of Kapetanios et al. (2006). The relationship turns out to be better described by ESTR cointegration than by linear cointegration.
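For reference, the two building blocks of the analysis, written here in standard textbook form rather than taken from the thesis, are the dividend discount model and the exponential smooth transition (ESTR) adjustment of the cointegrating residual $u_t$:

$P_t = \sum_{i=1}^{\infty} \frac{E_t[D_{t+i}]}{(1+r)^i}, \qquad \Delta u_t = \gamma\, u_{t-1} \left( 1 - e^{-\theta u_{t-1}^2} \right) + \varepsilon_t$

Under ESTR adjustment, the pull toward equilibrium strengthens as $|u_{t-1}|$ grows, which is what allows it to fit the price-dividend relationship better than linear cointegration.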
|
68 |
The efficacy of written corrective feedback and students' perceptions: A survey about the impact of written response on L2 writing. Munther, Pernilla. January 2015 (has links)
The purpose of this study was to investigate to what extent written corrective feedback (WCF) is a good way to treat the errors that L2 (second language) pupils make, and whether pupils attend to the comments in future written assignments. WCF is the most common response to written assignments. Some research takes the perspective that it is fruitful (Chandler 2003, Ferris 2003), while other research argues that it is inefficient and unnecessary (e.g. Truscott 1996, 1999). This study presents the findings of a survey on the topic conducted at a small school in the southeast of Sweden. A comparison between previous research and the findings of the present survey leads to the conclusion that there are limitations to the efficacy of WCF, and the results suggest that the type of feedback and how it is delivered are important. Pupils are also likely to benefit from revising their texts in order to improve their written English.
|
69 |
The cointegrating relationship in Asian markets with applications to stock prices, exchange rates and interest rates. Tanonklin, Tippawan. January 2013 (has links)
The aim of this research is to investigate long-run cointegrating relationships in Asian markets. The research focuses on four areas: pairs trading, out-of-sample forecasting, testing the unbiased forward exchange rate hypothesis, and testing the expectations hypothesis of the term structure of interest rates. The introduction is provided in chapter one. In chapter two, we develop a pairs trading strategy using individual stocks listed on the Stock Exchange of Thailand. The Engle and Granger approach is used to identify potential pairs that are cointegrated, and the results show that the pairs trading strategy is profitable in this market. Chapter three examines the forecasting performance of the error correction model on daily share price series from the Stock Exchange of Thailand. The disequilibrium term is classified as having a “correct” or “mixed” sign based on the criterion of Alexander (2008); the results indicate that the error correction component can help improve predictability in the long run. Chapter four tests the unbiased forward rate hypothesis for 11 Asian exchange rates using a linear conventional regression, an ECM, and a logistic smooth transition regression with the forward premium as the transition variable. Out-of-sample forecasting results suggest that linear models yield inferior forecasting performance. In chapter five, we investigate the expectations hypothesis of the term structure of interest rates for four Asian countries. We employ linear models as well as nonlinear approaches that allow us to capture asymmetric and symmetric adjustment. The results indicate that the term structure is better modeled by means of LSTR models, and the forecasting exercise confirms these findings.
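A minimal sketch of the Engle-Granger pair screen used in chapter two appears below; the synthetic prices, thresholds, and helper name are placeholders, not the thesis's stock universe or trading rules.

import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

def pair_screen(p1, p2, entry_z=2.0):
    # Step 1: Engle-Granger cointegration test on the two price series.
    _, pvalue, _ = coint(p1, p2)
    # Step 2: spread from the cointegrating regression, as a z-score.
    ols = sm.OLS(p1, sm.add_constant(p2)).fit()
    spread = p1 - ols.predict(sm.add_constant(p2))
    z = (spread - spread.mean()) / spread.std()
    # Enter a trade when the spread is stretched far from equilibrium.
    return pvalue, z[-1], abs(z[-1]) > entry_z

# Synthetic cointegrated pair standing in for two listed stocks.
rng = np.random.default_rng(3)
common = np.cumsum(rng.normal(size=500))
p1 = common + rng.normal(scale=0.5, size=500)
p2 = 0.8 * common + rng.normal(scale=0.5, size=500)
print(pair_screen(p1, p2))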
|