Oscilloscope interleave distortion tests: evaluating the relationship between oscilloscope sampling rate and sampling fidelity

Publisher: hxcp18 | Last updated: 2021-06-11 | Source: eefocus | Keywords: oscilloscope

While oscilloscope vendors do not publish specifications in their DSO data sheets that directly quantify the oscilloscope's digital processing, there are still several tests that can easily be performed to detect and quantify sampling distortion. Below is a list of tests that can be performed on oscilloscopes to detect and compare interleave (sometimes called "crossover") distortion:


Interleave distortion tests

1. Effective number of bits analysis using sine wave

2. Sine wave comparison test

3. Spectrum Analysis

4. Measurement stability


Effective number of bits (ENOB) analysis

Some oscilloscope vendors offer the most rigorous specification for quantifying sampling fidelity: the effective number of bits (ENOB). ENOB is, however, a composite specification made up of several error components, including input amplifier harmonic distortion and random noise. While an ENOB test provides a good baseline comparison of overall accuracy between oscilloscopes, the concept is not easy to grasp, and it requires importing the digitized data into a PC and performing a series of involved computations. Essentially, the ENOB test extracts a theoretical best-fit sine wave from the digitized sine wave. The curve-fitting algorithm removes the errors introduced by the oscilloscope's amplifier gain and offset. The test then computes the RMS error of the digitized sine wave relative to the ideal/extracted sine wave over a period of time, and compares that RMS error to the theoretical RMS error produced by an ideal "N"-bit ADC. For example, if an oscilloscope's acquisition system has an accuracy of 5.3 effective bits, then an ideal 5.3-bit ADC system would produce the same amount of RMS error.
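The final comparison step above can be sketched in a few lines. The following is a simplified illustration in pure Python (an ideal quantizer with made-up bit depth and signal values), not the full sine-fit ENOB procedure: it recovers N from the measured RMS error using the fact that an ideal N-bit ADC has an RMS quantization error of q/sqrt(12), where q is one LSB of the full-scale range.

```python
import math

def enob_from_rms_error(rms_error, full_scale):
    """ENOB implied by the RMS error of a digitized sine wave versus the
    ideal/fitted sine wave.  An ideal N-bit ADC has an RMS quantization
    error of q/sqrt(12) with q = full_scale / 2**N, so
    N = log2(full_scale / (rms_error * sqrt(12)))."""
    return math.log2(full_scale / (rms_error * math.sqrt(12)))

# Sanity check: quantize a sine wave with an ideal 8-bit quantizer and
# recover roughly 8 effective bits from the residual RMS error.
bits, full_scale = 8, 2.0            # assumed +/-1 V input range
q = full_scale / 2 ** bits           # LSB size
n_samples = 10000
err_sq = 0.0
for n in range(n_samples):
    ideal = math.sin(2 * math.pi * 7.37 * n / n_samples)  # non-coherent frequency
    digitized = round(ideal / q) * q                      # ideal quantizer
    err_sq += (digitized - ideal) ** 2
rms_error = math.sqrt(err_sq / n_samples)
print(round(enob_from_rms_error(rms_error, full_scale), 2))  # close to 8
```

A real oscilloscope will score below its ADC's nominal resolution because amplifier noise and distortion add to the pure quantization error.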


For more information on how to perform effective number of bits (ENOB) testing, download the application note, Understanding the Metrics Used in Evaluating Oscilloscope Quality.


You can also perform a more intuitive and simpler test: input a sine wave from a high-quality signal generator (with a frequency close to the bandwidth of the oscilloscope under test) and judge fidelity directly from the digitized waveform, checking whether the oscilloscope produces ADC interleave distortion.


Additionally, you can measure the distortion caused by miscalibrated interleaved ADCs in the frequency domain using the oscilloscope's fast Fourier transform (FFT) capability. With a pure sine wave input, the ideal/undistorted spectrum should consist of a single component at the input frequency; any other frequency component is distortion. You can also use this technique on digital clock signals, but the spectrum becomes more complicated, so you need to know what to look for.
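The FFT test can be mimicked numerically. The sketch below, a toy model with assumed values (2-way interleave, a 5% gain mismatch between the two ADCs, a 256-point record), simulates an interleaved digitizer and locates the spur in the magnitude spectrum; for 2-way interleaving, a gain mismatch produces an image at fs/2 - f_in:

```python
import cmath
import math

# Simulate a 2-way interleaved digitizer whose two ADCs have a small
# gain mismatch (values are illustrative, not taken from the article).
N = 256
fund_bin = 25                       # input sine lands exactly on bin 25
gains = [1.0, 0.95]                 # even/odd samples hit different ADCs
x = [gains[n % 2] * math.sin(2 * math.pi * fund_bin * n / N) for n in range(N)]

# Naive DFT magnitude spectrum (O(N^2), fine for N = 256)
mag = [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N)))
       for k in range(N // 2)]

# The largest non-fundamental component is the interleave spur; for a
# 2-way interleave it appears at fs/2 - f_in, i.e. bin 128 - 25 = 103.
spur_bin = max((k for k in range(1, N // 2) if k != fund_bin),
               key=lambda k: mag[k])
spur_db = 20 * math.log10(mag[spur_bin] / mag[fund_bin])
print(spur_bin, round(spur_db, 1))   # spur at bin 103, roughly -32 dB
```

On a real oscilloscope, the same logic applies: capture the generator's sine wave, take the FFT, and look for energy at image frequencies related to the interleave clock.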


Another simple test is to compare the stability of parametric measurements, such as the standard deviation of rise time, fall time, or Vp-p, between oscilloscopes of similar bandwidth. If interleave distortion is present, it will produce unstable measurements, just as random noise does.
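The mechanism behind the unstable measurements can be shown with a toy model (all values assumed for illustration): an interleave spur at a frequency incommensurate with the fundamental beats against it differently on every acquisition, so the Vp-p measurement fluctuates from record to record.

```python
import math
import random

random.seed(1)

def vpp(spur_amplitude, phase):
    """Vp-p of one acquired record of a unit-amplitude sine wave plus an
    optional spur at an unrelated frequency (illustrative model only)."""
    samples = [math.sin(2 * math.pi * 5 * t / 1000)
               + spur_amplitude * math.sin(2 * math.pi * 3.7 * t / 1000 + phase)
               for t in range(1000)]
    return max(samples) - min(samples)

def stddev(values):
    mean = sum(values) / len(values)
    return math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))

# 200 "acquisitions", each with a random spur phase relative to the signal
clean = [vpp(0.0, random.uniform(0, 2 * math.pi)) for _ in range(200)]
distorted = [vpp(0.05, random.uniform(0, 2 * math.pi)) for _ in range(200)]
print(stddev(clean) < stddev(distorted))  # prints True: the spur destabilizes Vp-p
```

This is why the standard deviation readout of the scope's measurement statistics is a quick, quantitative proxy for interleave distortion.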


Sine wave comparison test

Figure 1 shows the simplest and most intuitive comparison test—the sine wave test. The waveform shown in Figure 1a is a single-shot capture of a 200-MHz sine wave using a Keysight InfiniiVision 1-GHz bandwidth oscilloscope with a sample rate of 4 GSa/s. This oscilloscope uses non-interleaved ADC technology, which gives a sample-rate-to-bandwidth ratio of 4:1. The waveform shown in Figure 1b is a single-shot capture of the same 200-MHz sine wave using a LeCroy 1-GHz bandwidth oscilloscope with a sample rate of 10 GSa/s. This oscilloscope uses interleaved ADC technology, which gives a maximum sample-rate-to-bandwidth ratio of 10:1.

Figure 1a: A 200-MHz sine wave captured using a Keysight 1-GHz bandwidth oscilloscope at 4 GSa/s sampling rate.

We would intuitively think that for oscilloscopes of the same bandwidth, the one with the higher sampling rate should produce more accurate measurements, but from this comparison we can see that the oscilloscope with the lower sampling rate actually represents the 200 MHz input sine wave more accurately. This is not because lower sampling rates are better, but because a poorly calibrated interleaved real-time ADC will negate the advantage of the higher sampling rate.


Accurate calibration of interleaved ADCs becomes even more important in oscilloscopes with higher bandwidths and higher sample rates. A fixed amount of interleave clock phase error may not matter at lower sample rates, but at higher sample rates (shorter sample periods) the same phase error becomes very significant. Next we will compare a higher-bandwidth oscilloscope that uses real-time interleaving with a higher-bandwidth oscilloscope that does not.
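A first-order estimate makes this concrete. For a 2-way interleave with a fixed timing skew dt between the two ADCs' sampling instants, the image spur relative to the fundamental is approximately pi * f_in * dt (a standard small-error approximation; the 5-ps skew and input frequencies below are assumed values, not measurements of the scopes discussed):

```python
import math

def interleave_spur_db(f_in_hz, skew_s):
    """First-order estimate of the image spur produced by a fixed timing
    skew between the two ADCs of a 2-way interleave:
    spur/fundamental ~= pi * f_in * skew (small-error approximation)."""
    return 20 * math.log10(math.pi * f_in_hz * skew_s)

# The same assumed 5 ps of clock skew is nearly harmless at low input
# frequencies but serious near the top of a multi-GHz scope's band.
print(round(interleave_spur_db(100e6, 5e-12), 1))   # about -56 dB at 100 MHz
print(round(interleave_spur_db(2.5e9, 5e-12), 1))   # about -28 dB at 2.5 GHz
```

Since the spur grows in direct proportion to input frequency, a skew that is invisible on a 1-GHz scope can dominate the error budget of a higher-bandwidth, higher-sample-rate design.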

Figure 1b: A 200-MHz sine wave captured using a LeCroy 1-GHz bandwidth oscilloscope sampling at 10 GSa/s.


Figure 2 shows a screenshot of two sine wave tests comparing a 2.5-GHz sine wave captured by a Keysight 3-GHz bandwidth oscilloscope at a 20 GSa/s sample rate (non-interleaved) and at a 40 GSa/s sample rate (interleaved). This particular DSO uses a single-chip 20 GSa/s ADC behind each of its four channels. However, if only two channels of the oscilloscope are used, the instrument automatically interleaves pairs of ADCs to provide a real-time sample rate of 40 GSa/s.

Figure 2a: 2.5-GHz sine wave captured using a Keysight Infiniium oscilloscope at 20 GSa/s (non-interleaved) sampling rate

On the surface, we cannot observe much difference between the quality of these two waveforms. Both appear to be relatively pure sine waves with only minimal distortion. However, when we perform Vp-p statistical measurements, we find that the higher sample rate yields slightly more stable measurements, consistent with what we would expect.

Figure 2b: 2.5-GHz sine wave captured using a Keysight Infiniium oscilloscope at 40 GSa/s (interleaved) sampling rate

Figure 3 shows a set of sine wave tests comparing a Tektronix 2.5-GHz bandwidth oscilloscope capturing a 2.5-GHz sine wave at a 10 GSa/s sample rate (non-interleaved) versus capturing the same sine wave at a 40 GSa/s sample rate (interleaved). This particular DSO uses a single-chip 10 GSa/s ADC behind each of its four channels. However, if only one channel of the oscilloscope is used, the instrument automatically interleaves all four ADCs to provide a real-time sample rate of 40 GSa/s on that channel.

Figure 3a: A 2.5-GHz sine wave captured using a Tektronix 2.5-GHz bandwidth oscilloscope at 10 GSa/s (non-interleaved)

In this sine wave test, we can see a significant difference in waveform fidelity between the two sample rate settings. When the oscilloscope samples at 10 GSa/s (Figure 3a) without interleaving, its rendering of the input sine wave is quite good, although the Vp-p measurement is only about one-quarter as stable as that of a Keysight oscilloscope of similar bandwidth. When it samples at 40 GSa/s (Figure 3b) with interleaved ADCs, we can clearly see the waveform distortion produced by the Tek DSO, and the Vp-p measurement becomes even less stable. This is counter-intuitive: most engineers would expect more accurate and more stable measurements when sampling at the higher rate on the same oscilloscope. The main reason the measurement defies this expectation is poor vertical and/or timing alignment of the real-time interleaved ADC system.

Figure 3b: A 2.5-GHz sine wave captured using a Tektronix 2.5-GHz bandwidth oscilloscope at 40 GSa/s (interleaved)

Sine wave testing does not truly identify the source of distortion; it only shows the combined effect of the various error components. Spectrum/FFT analysis, however, can separate the components of distortion, including harmonic distortion, random noise, and interleave distortion. When the sine wave comes from a high-quality signal generator, the input signal should contain only one frequency component. Any frequency component other than the fundamental that an FFT analysis of the digitized waveform reveals is a distortion component introduced by the oscilloscope.

Figure 4a: FFT analysis of a 2.5-GHz sine wave captured with a Keysight Infiniium oscilloscope at 40 GSa/s.

Figure 4a shows the results of an FFT analysis of a 2.5-GHz sine wave captured in a single shot at 40 GSa/s using a Keysight Infiniium oscilloscope. The worst distortion spur measures approximately 90 dB below the fundamental. This component is actually second-harmonic distortion, most likely generated by the signal generator. Its magnitude is negligible, falling below even the oscilloscope's in-band noise floor.

Figure 4b: FFT analysis of a 2.5-GHz sine wave captured with a Tektronix oscilloscope at 40 GSa/s.

Figure 4b shows the results of an FFT analysis of the same 2.5-GHz sine wave captured in a single shot using a Tektronix oscilloscope, also sampling at 40 GSa/s. The worst distortion spur in this FFT analysis measures approximately 32 dB below the fundamental. This high level of distortion explains why the sine wave test (Figure 3b) produced a distorted waveform. The spur occurs at 7.5 GHz, which is exactly 10 GHz below the 2.5-GHz input frequency, folded back into the positive frequency domain (|2.5 GHz - 10 GHz| = 7.5 GHz). The next-highest distortion component occurs at 12.5 GHz, exactly 10 GHz above the input frequency. Both of these components are directly related to the 40-GSa/s sampling clock and its 10-GHz interleave clock frequency. They are caused not by random noise or harmonic distortion, but by real-time interleaved-ADC distortion.
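The folding arithmetic generalizes. For an M-way interleave at sample rate fs, spurs appear at images of the input frequency around multiples of the per-ADC rate fs/M, folded into the first Nyquist zone. A small helper (assumed names, GHz units for readability) reproduces the frequencies observed above:

```python
def interleave_image_freqs(f_in_ghz, fs_ghz, n_adcs):
    """Frequencies (within the first Nyquist zone, 0..fs/2) where
    interleave spurs can appear: images of f_in around multiples of the
    per-ADC sample rate fs / n_adcs."""
    sub_rate = fs_ghz / n_adcs
    images = set()
    for k in range(1, n_adcs):
        for f in (k * sub_rate - f_in_ghz, k * sub_rate + f_in_ghz):
            f = abs(f)                    # fold negative frequencies back
            if 0 < f < fs_ghz / 2:
                images.add(f)
    return sorted(images)

# 2.5-GHz input, 40 GSa/s total, 4 interleaved ADCs (10 GSa/s each):
print(interleave_image_freqs(2.5, 40.0, 4))  # [7.5, 12.5, 17.5] (GHz)
```

The first two entries match the 7.5-GHz and 12.5-GHz spurs reported in the FFT analysis, confirming that they are interleave images rather than harmonics of the input.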

Digital clock measurement stability comparison test

As a digital designer, you might say that you don't really care about distortion of analog signals such as sine waves. However, it is important to remember that all digital signals can be decomposed into an infinite number of sine waves. If the fifth harmonic of the digital clock is distorted, then the resulting digital waveform will also be distorted.
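This decomposition is just the Fourier series of a square wave: odd harmonics with amplitudes falling as 1/n. The sketch below (illustrative values) reconstructs a clock edge from its first ten odd harmonics, then attenuates only the 5th harmonic to show that corrupting a single component measurably changes the time-domain waveform:

```python
import math

def clock_wave(t_cycles, n_harmonics, fifth_gain=1.0):
    """Fourier-series model of an ideal digital clock:
    sum over odd n of (4/pi) * sin(2*pi*n*t) / n.
    fifth_gain distorts only the 5th harmonic (1.0 = undistorted)."""
    total = 0.0
    for k in range(n_harmonics):
        n = 2 * k + 1
        gain = fifth_gain if n == 5 else 1.0
        total += gain * (4 / math.pi) * math.sin(2 * math.pi * n * t_cycles) / n
    return total

# Halving the 5th harmonic changes the reconstructed waveform by up to
# 0.5 * (4/pi) / 5 ~= 0.127 of the unit clock amplitude.
diff = max(abs(clock_wave(t / 500, 10) - clock_wave(t / 500, 10, fifth_gain=0.5))
           for t in range(500))
print(round(diff, 3))  # -> 0.127
```

So a scope that distorts a clock's in-band harmonics, whether through interleave spurs or anything else, necessarily distorts the displayed edges and flat tops of the clock itself.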

