
I/O test methods in high-speed wireline communication systems

The advent of serial terabit telecommunication and multi-gigahertz I/O interfaces poses challenges for the semiconductor and ATE industries. There is a gap in signal integrity testing between what is specified in serial link standards and what can be practically tested in production. Thorough characterization and more cost-effective testing of signal integrity metrics, such as BER, jitter, and eye margin, are critical for identifying and isolating the root causes of system degradation and for binning in production. In this dissertation, measurement and testing schemes for signal integrity are explored.

A solution for diagnosing jitter and predicting the range of the consequent BER is proposed. The solution is applicable to the decomposition of correlated and uncorrelated jitter in both clock and data signals. The statistical information of the jitter is estimated using TLC functions. TLC treats jitter in its original form, as a time series, resulting in good decomposition accuracy. Hardware results on a PLL indicate that the approach remains valid where the traditional histogram-based method fails. The approach can be implemented with a single one-shot capture instead of the multiple captures needed to average the uncorrelated jitter out of the correlated jitter; the TLC functions therefore reduce test time for jitter decomposition compared with traditional averaging methods. Hardware measurements on stressed data signals are presented to validate the proposed technique.

Low-cost, high-bandwidth built-in self-test (BIST) techniques for on-chip jitter measurement are also explored. Undersampling provides a low-cost test solution for on-chip jitter measurement, but it suffers from sampling-clock phase error and time-quantization noise. The impact of these timing uncertainties on the test accuracy of the traditional single-channel structure can be alleviated by extracting the correlation between two channels driven by a single reference clock. Simulation results indicate that the proposed approach achieves better measurement accuracy and a higher tolerance to sampling-clock uncertainty and quantization error than the single-channel structure, with little additional test overhead.

Time-interleaved analog-to-digital converters (TIADCs) provide an attractive solution for the analog front ends of high-speed communication systems, such as 10GBASE-T and the 10GBASE fiber variants. However, gain mismatch, offset mismatch, and sampling-time mismatch between the time-interleaved channels limit TIADC performance. A low-cost test scheme is developed to measure timing mismatch using an undersampling clock. The method is applicable to an arbitrary number of channels and achieves picosecond resolution with low power consumption. Simulation results and hardware measurements on a 10 GS/s TIADC are presented to validate the proposed technique.
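To make the correlation-based decomposition concrete, the following Python sketch separates a jitter time series into correlated and uncorrelated components from its sample autocorrelation, assuming the uncorrelated part is white so that it contributes only at lag zero. This is only an illustration of the general idea: the function name `decompose_jitter`, the synthetic test signal, and the lag-search heuristic are assumptions and do not reproduce the dissertation's TLC functions.

```python
import numpy as np

def decompose_jitter(tie, max_lag=200):
    """Split a time-interval-error (TIE) series into correlated and
    uncorrelated jitter power using its sample autocorrelation.

    Assumes the uncorrelated (random) jitter is white, so it contributes
    only at lag 0, while correlated (e.g. periodic) jitter also appears
    at non-zero lags.  Illustrative stand-in, not the TLC algorithm.
    """
    x = np.asarray(tie, dtype=float)
    x = x - x.mean()                       # remove static offset
    n = len(x)
    r0 = np.dot(x, x) / n                  # total jitter power (lag 0)
    # biased autocorrelation at non-zero lags
    r = np.array([np.dot(x[:-k], x[k:]) / n for k in range(1, max_lag)])
    # for a sinusoidal correlated component, max |r(k)| ~ its power
    corr_power = min(np.abs(r).max(), r0)  # keep the split physical
    uncorr_power = r0 - corr_power
    return np.sqrt(corr_power), np.sqrt(uncorr_power)  # RMS values

# Example: 2 ps RMS random jitter plus a 3 ps-amplitude sinusoidal component
rng = np.random.default_rng(0)
n = 20000
pj = 3e-12 * np.sin(2 * np.pi * 0.013 * np.arange(n))
rj = 2e-12 * rng.standard_normal(n)
pj_rms, rj_rms = decompose_jitter(pj + rj)
print(f"correlated ~{pj_rms*1e12:.2f} ps RMS, uncorrelated ~{rj_rms*1e12:.2f} ps RMS")
```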
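The benefit of the two-channel structure can be illustrated with a simple statistical model: if both channels observe the same device jitter but carry independent sampling-phase and quantization errors, the cross-covariance of the two records retains the shared jitter power while the independent errors average out. The sketch below, with the assumed helper `jitter_power_two_channel` and synthetic data, shows only this effect; it is not the BIST circuit described in the dissertation.

```python
import numpy as np

def jitter_power_two_channel(meas_a, meas_b):
    """Estimate the device jitter power shared by two measurement channels
    via their cross-covariance (independent channel errors cancel)."""
    a = np.asarray(meas_a, float) - np.mean(meas_a)
    b = np.asarray(meas_b, float) - np.mean(meas_b)
    return float(np.dot(a, b) / len(a))   # shared (DUT) jitter variance

# Example: 1.8 ps RMS device jitter seen by both channels, each channel
# adding its own 1 ps RMS measurement error.
rng = np.random.default_rng(1)
n = 50000
dut = 1.8e-12 * rng.standard_normal(n)
ch_a = dut + 1e-12 * rng.standard_normal(n)
ch_b = dut + 1e-12 * rng.standard_normal(n)
single = np.var(ch_a)                             # inflated by channel error
shared = jitter_power_two_channel(ch_a, ch_b)     # rejects channel error
print(f"single-channel estimate {np.sqrt(single)*1e12:.2f} ps RMS, "
      f"two-channel estimate {np.sqrt(shared)*1e12:.2f} ps RMS")
```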
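For the TIADC case, one common way to expose per-channel timing skew is to drive the converter with a single sine tone and fit each channel's phase; the phase offsets translate directly into sampling-time skew. The sketch below takes that route with an assumed function `estimate_timing_skew` and a synthetic 4-channel, 10 GS/s example, as a stand-in for the undersampling-clock scheme developed in the dissertation, which is not reproduced here.

```python
import numpy as np

def estimate_timing_skew(samples, n_channels, f_in, f_s):
    """Estimate per-channel sampling-time skew of a time-interleaved ADC
    from its response to a single sine-wave test tone (per-channel
    least-squares sine fit; illustration only)."""
    x = np.asarray(samples, dtype=float)
    w = 2 * np.pi * f_in
    skews = []
    for ch in range(n_channels):
        y = x[ch::n_channels]                             # this channel's samples
        t = (ch + n_channels * np.arange(len(y))) / f_s   # nominal sample times
        # least-squares fit y ~ a*cos(w t) + b*sin(w t) + c
        A = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t)])
        a, b, _ = np.linalg.lstsq(A, y, rcond=None)[0]
        skews.append(np.arctan2(a, b) / w)                # fitted phase -> time skew
    skews = np.array(skews)
    return skews - skews.mean()                           # skew relative to the average

# Example: 4-channel TIADC, 10 GS/s aggregate rate, small skews on three channels
f_s, f_in, n_ch = 10e9, 997e6, 4
true_skew = np.array([0.0, 1.5e-12, -0.8e-12, 0.4e-12])
k = np.arange(4096)
t_actual = k / f_s + true_skew[k % n_ch]
sig = np.sin(2 * np.pi * f_in * t_actual)
print(np.round(estimate_timing_skew(sig, n_ch, f_in, f_s) * 1e12, 2), "ps")
```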

Identifier: oai:union.ndltd.org:UTEXAS/oai:repositories.lib.utexas.edu:2152/18345
Date: 12 October 2012
Creators: Dou, Qingqi
Source Sets: University of Texas
Language: English
Detected Language: English
Format: electronic
Rights: Copyright is held by the author. Presentation of this material on the Libraries' web site by University Libraries, The University of Texas at Austin was made possible under a limited license grant from the author, who has retained all copyrights in the works.
