
Calibrated frequency measurements with CompactRIO?

Hi folks,

Our company is going through the process of ISO 9001 certification and the issue of calibrated Test & Measurement (T&M) equipment has been raised. My manager is now rather gun-shy about everything being in cal. We just bought a CompactRIO system with the NI 9402 high-speed digital I/O module. See http://sine.ni.com/nips/cds/view/p/lang/en/nid/204678

 

The issue is whether we can use this module to take official frequency readings or generate specific pulses to put in a test report or something similarly official. I've asked NI about it, and the several AEs I've talked to found my question rather unusual; their responses were various flavors of "we don't calibrate digital modules". Back in my EMC testing life, if I was creating a pulsed waveform I would use a calibrated oscilloscope as the standard, but I am loath to hook up an oscilloscope and control that extra piece of equipment to verify waveforms if I can just rely on the cRIO system to do the work accurately.

 

Does anybody else have experience with ISO 9001 calibrated T&M equipment requirements in a case like this? Is there a justification for not performing calibration? My gut feeling is that SOMETHING to do with timing needs to be calibrated, whether that be the FPGA backplane clock or whatever else the 9402 module is using for timing.

 

We also have the NI 9269 and NI 9222 analog modules, both of which came with calibration certificates (as one would expect). None of the other digital modules we purchased came with calibration certificates, but that is understandable, since digital I/O generally just works or it doesn't (see what I did there?) and isn't a question of calibrated digits.

 

Thanks!

Ryan Rutledge

____
Ryan R.
R&D
Message 1 of 10

Ryan,

 

I wonder if you are asking the wrong questions?

 

Since you are specifically interested in frequency (and possibly pulse timing) measurements, the digital input and output parts are not particularly relevant. I suspect that the AEs you talked to were thinking "amplitude," which is mostly irrelevant here, and not "timing," which is what matters.

 

What is relevant is the accuracy and stability of the timebase, and the latency or delays involved in getting signals into or out of the DIO modules. The specifications list a maximum propagation delay of 55 ns, which is longer than one clock cycle at the maximum clock frequency, while the typical delay is much shorter than one clock period. In my mind that suggests a possible latency of one or two clock cycles, and that it could vary between modules or over temperature.
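
A rough sanity check on that (my arithmetic, assuming the 40 MHz FPGA timebase; the 55 ns figure is the 9402's specified maximum propagation delay):

    # 55 ns max propagation delay vs. an assumed 25 ns clock period (40 MHz):
    clock_hz = 40e6
    period_ns = 1e9 / clock_hz            # 25.0 ns per cycle
    max_prop_delay_ns = 55.0              # NI 9402 spec: max propagation delay
    print(f"{max_prop_delay_ns / period_ns:.1f} cycles")  # prints 2.2 -> just over two cycles worst case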

 

And who knows what the "pulse width distortion" specification means. I have been working with digital logic devices since shortly after logic ICs came on the market, and I have never heard of such a specification. They could at least show a timing diagram defining it.

 

I think NI does have some people who do think in terms of calibration and certification. The people who work on digital I/O all day are probably not them.

 

Lynn

Message 2 of 10

Lynn,

Thanks for your response. I think I explained to NI very clearly that the issue was with timing, not amplitude, but you have a firm grasp of the situation. I don't care about propagation delay or variation thereof unless it is long enough to matter, or unless I am measuring in such short and frequent bursts that the jitter becomes an issue. The critical parameter is the accuracy of the timebase, and I can find nothing regarding that. Is it implied that since I'm using LabVIEW Real-Time, all the time-related data I gather will be absolutely accurate? If so, I would be astounded, because such accuracy does not exist in my world as of yet.

 

Ryan

____
Ryan R.
R&D
Message 3 of 10

Ryan,

 

Some things NI does very well. Specifying timing is not one of them.  Unfortunately for people who think in terms of metrology, the digital people usually do not.

 

Just out of curiosity I looked up the specs on one of the cRIO chassis (cRIO-9081/82). The specs for the reconfigurable FPGA give timebase accuracy as 100 ppm (max) and jitter as 250 ps on the 40 MHz timebase and 450 ps on the 80 MHz timebase.  In the Internal Real-Time Clock section they specify 140 ppm or 35 ppm at 25 degrees C.
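
In concrete terms, here is what that 100 ppm spec means for a frequency reading (a quick worked example; 5 MHz is just an illustration frequency):

    # Worst-case frequency error implied by a 100 ppm (max) timebase spec.
    accuracy_ppm = 100
    f_nominal_hz = 5e6                   # example signal frequency
    error_hz = f_nominal_hz * accuracy_ppm / 1e6
    print(f"+/- {error_hz:.0f} Hz at 5 MHz")  # prints: +/- 500 Hz at 5 MHz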

 

I am not sure how you would know which clock your DIO module is using, but it does not look like anyone expected to use these devices for frequency measurements!

 

Lynn

Message 4 of 10

I am really baffled that this hasn't come up before, and I don't mind telling you that it's got me stumped. I guess I might have to call an ISO accreditation agency and pick their brains on how to get legal.

____
Ryan R.
R&D
Message 5 of 10

I agree that if you are using the cRIO equipment for production testing, the pertinent measurement or stimulus should be verified. We would do this with a transfer standard (a calibrated piece of equipment) like a scope or function generator, plus a procedure to verify the timing characteristics of the cRIO. Record the results of your verification and file them with the rest of your calibration certs as proof.

I say verify as opposed to calibrate because in this case you will not be compensating anything, just verifying operation. If it does not verify, you will likely send it out for repair rather than calibration.

However, since the FPGA is a software-defined instrument, it is very easy to assume that your programmed intent is actually what is going on, which is another good reason to verify your assumptions.

Stu
Message 6 of 10


That might be what I end up doing around the same time we calibrate the analog modules for the cRIO: break out the oscilloscope, write a little function that generates a 5 MHz waveform, and another that measures the frequency of the cal output on that same oscilloscope.
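
For the bookkeeping side, something like this is what I'd file with the cal records (a sketch only; the scope reading is made up for illustration):

    # Hypothetical verification record: compare the calibrated scope's reading
    # of the cRIO-generated 5 MHz waveform against nominal, logged in ppm.
    nominal_hz = 5_000_000          # frequency programmed into the FPGA
    measured_hz = 4_999_800         # made-up example reading from the scope
    error_ppm = (measured_hz - nominal_hz) / nominal_hz * 1e6
    print(f"timebase error: {error_ppm:+.1f} ppm")  # -40.0 ppm, inside a 100 ppm spec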

____
Ryan R.
R&D
Message 7 of 10

Six years later, I have the same problem: how to check the clock frequency in the FPGA. Why not just bring the clock out to a pin? Well, NI's digital output cards are ridiculously slow. A 9375 card has a 1 kHz max clock output rate.

Message 8 of 10

Hey Khalsans:

 

I used the DAQmx Maximum Rate property node to obtain the maximum update rate of the NI 9375 and got 10 MHz. Here is a screenshot of the code (which is mostly based on an example).

 

[Screenshots: Max_Rate.png, Max_Rate2.png]
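
For anyone who prefers text to a screenshot, here is roughly the same query sketched with NI's nidaqmx Python API (an untested sketch; "cDAQ1Mod1" is a placeholder, so substitute your module's actual name from NI MAX):

    # Query DAQmx for the maximum sample clock rate of a digital output task,
    # the same "Maximum Rate" property the screenshots read.
    import nidaqmx

    with nidaqmx.Task() as task:
        task.do_channels.add_do_chan("cDAQ1Mod1/port0/line0")
        print(f"Max rate: {task.timing.samp_clk_max_rate:.0f} S/s")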

Message 9 of 10

The data sheet lists the update rate as 500 μs (DO), 7 μs (DI). That seems ridiculously slow. 10 MHz seems a bit fast, considering it's probably using a serial link on the backplane, but doable.
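
A bit of arithmetic (mine, not from the datasheet) reconciles the 500 μs figure with the 1 kHz claim in Message 8, since a software-timed square wave needs two output updates per period:

    # A 500 us DO update time allows one edge every 500 us, i.e. one full
    # period (two edges) every 1 ms -> a 1 kHz square wave at best.
    do_update_s = 500e-6
    max_square_wave_hz = 1 / (2 * do_update_s)
    print(f"{max_square_wave_hz:.0f} Hz")  # prints: 1000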

Message 10 of 10