
DAQ 6255 time interval accuracy


I am trying to determine the timing accuracy of a measured time interval using the 6255 DAQ with a differential analog signal on one channel.  The sampling rate is 1.25 MS/s.  The specifications state the timing accuracy as 50 ppm of sample rate.  Since the frequency generator base clock accuracy is also 50 ppm, I interpret this to mean the sample period of 0.8 us is also accurate to 50 ppm.  Since my requirement is to measure 80 us, or 100 samples, the error is (measurement * 50 ppm) = 80 us * 0.00005 = 0.004 us.  This is consistent with oscilloscope accuracy specifications.
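
Written out as a quick Python check (the numbers are just the ones from my setup above):

    rate = 1.25e6                    # sample rate, S/s
    period = 1 / rate                # sample period, 0.8 us
    ppm = 50e-6                      # 50 ppm timing accuracy (6255 spec)
    interval = 100 * period          # 80 us, the interval I need to measure

    timebase_error = interval * ppm
    print(f"timebase error: {timebase_error * 1e6:.4f} us")   # 0.0040 us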

 

This is fine, but it does not address any quantization error in a single sample.  Oscilloscopes usually specify an additional term: a factor multiplied by the sample period.  For example, a typical oscilloscope specification is (ppm * measurement) + (factor * sample interval), where sample interval is 1 / sample rate.  The factor usually ranges from 0.06 to 0.2 and depends on trigger errors, jitter, software algorithms, etc.  This factor term is usually the dominant error in the equation.  For example, if I use a factor of 0.1, the error is (0.1 * 0.8 us) = 0.08 us, which is significantly larger than 0.004 us.  Oscilloscopes use much higher sampling rates, so this is usually not a problem for them.
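
The same error budget in Python, with 0.1 as my assumed mid-range factor:

    rate = 1.25e6
    sample_interval = 1 / rate             # 0.8 us
    ppm_term = 50e-6 * 80e-6               # ppm * measurement       = 0.004 us
    factor = 0.1                           # assumed; spec range is 0.06 to 0.2
    quant_term = factor * sample_interval  # factor * sample interval = 0.08 us
    total = ppm_term + quant_term

    print(f"ppm term:    {ppm_term * 1e6:.3f} us")
    print(f"factor term: {quant_term * 1e6:.3f} us")   # dominates the budget
    print(f"total:       {total * 1e6:.3f} us")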

 

The current software algorithm for finding the leading edges is a simple zero crossing from negative to positive, subtracting the corresponding times to get the measurement interval.  Please let me know whether what I have stated is correct, and whether you have any insight into the second (factor) term.  I believe the total accuracy error is understated.  Some of these issues were previously touched on in the 5114 Frequency Accuracy discussion.
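
For reference, here is a minimal Python sketch of the kind of edge detection I mean (the function name and the synthetic test signal are just illustrations, not our production code); it locates negative-to-positive crossings and linearly interpolates between the two bracketing samples:

    import numpy as np

    def rising_zero_crossings(samples, sample_rate):
        # Indices i where the signal goes from below zero to at-or-above zero.
        s = np.asarray(samples, dtype=float)
        idx = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]
        # Linear interpolation between the bracketing samples gives the
        # fractional-sample position of each crossing; convert to seconds.
        frac = -s[idx] / (s[idx + 1] - s[idx])
        return (idx + frac) / sample_rate

    fs = 1.25e6                                  # 1.25 MS/s, as in my setup
    t = np.arange(0, 200e-6, 1 / fs)
    sig = np.sin(2 * np.pi * 12.5e3 * t + 0.3)   # 12.5 kHz -> 80 us period
    edges = rising_zero_crossings(sig, fs)
    print(f"interval: {(edges[1] - edges[0]) * 1e6:.3f} us")   # ~80.000 us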

 

John Anderson

