System clock optimization can improve system performance, but it is challenging. Designing an encoding circuit for an analog-to-digital converter with 350 femtoseconds (fs) of jitter is relatively easy, but is that sufficient for today's high-speed requirements? For example, when testing the AD9446-100 (a 16-bit, 100 MSPS ADC) with a 100 MHz sampling clock and an analog input near the Nyquist frequency, 350 fs of jitter degrades the signal-to-noise ratio (SNR) by about 3 dB. If the same device is tested in the third Nyquist zone with a 105 MHz analog input, the SNR can degrade by as much as 10 dB. To reduce clock jitter to 100 fs or less, designers need to understand where clock jitter comes from and how much jitter the ADC can tolerate. Discovering only after the circuit design is complete that performance is limited by clock jitter is too late, because the problem could have been avoided easily during the design phase. …
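The degradations quoted above follow from the standard jitter-limited SNR relation, SNR = -20·log10(2π·f_in·t_j), combined with the converter's intrinsic SNR by summing noise powers. The sketch below illustrates this arithmetic; the 82 dB intrinsic SNR used in the usage note is an assumed, illustrative figure, not a datasheet value.

```python
import math

def jitter_limited_snr(f_in_hz: float, jitter_s: float) -> float:
    """SNR (dB) when clock jitter is the only noise source:
    SNR = -20*log10(2*pi*f_in*t_j).

    Note: degradation scales with the ANALOG INPUT frequency, not the
    sampling rate, which is why undersampling in the third Nyquist
    zone is so much more sensitive to jitter.
    """
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * jitter_s)

def combined_snr(*snrs_db: float) -> float:
    """Combine independent noise contributions by summing noise powers."""
    total_noise_power = sum(10.0 ** (-s / 10.0) for s in snrs_db)
    return -10.0 * math.log10(total_noise_power)

if __name__ == "__main__":
    # Jitter-limited SNR for a 105 MHz input (third Nyquist zone)
    # with 350 fs total jitter:
    snr_j = jitter_limited_snr(105e6, 350e-15)
    print(f"jitter-limited SNR: {snr_j:.1f} dB")

    # Assumed intrinsic converter SNR of 82 dB (illustrative only):
    print(f"combined SNR: {combined_snr(82.0, snr_j):.1f} dB")
```

Running the numbers this way shows why a jitter budget must be set against the highest analog input frequency the system will see: at 105 MHz the jitter-limited SNR falls well below a 16-bit converter's intrinsic SNR, so jitter dominates the total.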