implicit counter operation

Hello,

     I have a question about how the implicit timing on a frequency input works. I am feeding a steady frequency into a counter input on a cDAQ unit, set up to use the 20 MHz timebase and counter 0. If, for example, I input a 10 kHz signal and read data from the counter read VI every 10 ms, I would expect to see 10,000 * 0.010, or ~100 samples per read, right? I am only seeing about 10 samples per read, though. Am I figuring something wrong? I am setting up the sample clock as implicit with the parameters (continuous, 200,000 sample buffer).
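(For reference, the configuration described above maps onto the DAQmx API roughly as follows. This is a sketch in Python nidaqmx rather than LabVIEW, and "Dev1/ctr0" is a placeholder; use the counter name NI MAX shows for your hardware.)

```python
# Sketch of the setup described above: implicit-timed frequency measurement.
# With implicit timing the input signal itself paces the acquisition, so one
# sample is produced per period of the measured signal.
import time

import nidaqmx
from nidaqmx.constants import (AcquisitionType, FrequencyUnits,
                               READ_ALL_AVAILABLE)

with nidaqmx.Task() as task:
    # Frequency measurement on counter 0 ("Dev1/ctr0" is a placeholder name).
    task.ci_channels.add_ci_freq_chan(
        "Dev1/ctr0", min_val=1000.0, max_val=100000.0,
        units=FrequencyUnits.HZ)
    # Implicit timing, continuous, 200,000-sample buffer (as in the post).
    task.timing.cfg_implicit_timing(
        sample_mode=AcquisitionType.CONTINUOUS, samps_per_chan=200000)
    task.start()
    time.sleep(0.010)  # read every 10 ms
    # A 10 kHz input completes ~100 periods in 10 ms -> expect ~100 samples.
    data = task.read(number_of_samples_per_channel=READ_ALL_AVAILABLE)
    print(len(data), "samples")
```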


Are you using a counter on board the cDAQ chassis or on a cDAQ module? Which device model? Can you post the code showing how you are configuring the task?

David H.
National Instruments

Hello,

     I'm using the built-in counters on a cDAQ-9174. However, I've realized what my problem was: when I configured the counter for a two-counter measurement task, I inadvertently left the measurement time parameter at its default of 1 ms. My reads make sense now, because no matter how fast the input frequency is, each measurement is taken over 1 ms. Once I changed this value to match the period of the pulses I was inputting, the amount of data I read matched my calculations. However, if I have a varying frequency, my hands are tied to the single-counter measurement, because this parameter more or less requires a consistent input frequency to calculate the frequency accurately. Is this true? Another issue with the single-counter measurement task is that the measurement error grows as the frequency increases: if I input a 1 MHz signal, the reading can be off by as much as ±100 Hz. Maybe not quite that bad, but close.
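(To put numbers on that: with the measurement time left at 1 ms, every sample spans 1 ms, so the task delivers at most ~1,000 S/s, i.e. ~10 samples per 10 ms read, which matches the original observation. In DAQmx terms the knob is the measurement-time parameter of the frequency channel; sketched again in Python nidaqmx with placeholder names:)

```python
# Sketch: two-counter (high frequency) measurement with an explicit
# measurement time. Each sample spans meas_time seconds regardless of the
# input frequency, so meas_time = 0.001 caps the sample rate near 1 kS/s.
import nidaqmx
from nidaqmx.constants import (AcquisitionType, CounterFrequencyMethod,
                               FrequencyUnits)

with nidaqmx.Task() as task:
    task.ci_channels.add_ci_freq_chan(
        "Dev1/ctr0",  # placeholder counter name
        min_val=1000.0, max_val=1000000.0, units=FrequencyUnits.HZ,
        meas_method=CounterFrequencyMethod.HIGH_FREQUENCY_2_COUNTERS,
        meas_time=0.001)  # the 1 ms default discussed above
    task.timing.cfg_implicit_timing(
        sample_mode=AcquisitionType.CONTINUOUS, samps_per_chan=200000)
```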


You should read the section in the 917x User Manual entitled Choosing a Method for Measuring Frequency.  In summary:

 

- The one-counter method loses accuracy as frequency increases (rough numbers are sketched just after this list).
- The two-counter (large range) method is the same as the one-counter method, except that it takes the measurement over N consecutive periods, which lets you buy accuracy at the expense of measurement time. To get reasonable accuracy at high frequencies, however, your low-frequency measurements would have to take a very long time.
- The two-counter (high frequency) method loses accuracy as frequency decreases.
- The sample-clocked method has good accuracy across all frequency ranges and maintains a consistent measurement time.
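To put rough numbers on the one-counter trade-off (a back-of-envelope sketch assuming the 20 MHz timebase mentioned earlier): a single sample counts timebase ticks across one input period, so an error of one tick shifts the result by roughly f^2 / f_timebase.

```python
# Back-of-envelope quantization error for the one-counter method on a
# 20 MHz timebase: one sample counts timebase ticks over a single input
# period, so a +/-1 tick error dominates at high input frequencies.
F_TIMEBASE = 20e6  # Hz

def one_counter_error(f_input: float) -> float:
    """Approximate single-sample error (Hz) from one missing timebase tick."""
    ticks = F_TIMEBASE / f_input  # ticks counted in one input period
    return F_TIMEBASE / (ticks - 1) - f_input  # ~ f_input**2 / F_TIMEBASE

for f in (1e3, 100e3, 1e6):
    print(f"{f/1e3:8.0f} kHz -> ~{one_counter_error(f):,.1f} Hz per sample")
```

At 1 MHz the single-sample error works out to tens of kilohertz, so a reading that is only ±100 Hz off has likely already been averaged; either way, the trend matches what you observed.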

 

 

 

The sample-clocked method should be the obvious choice for measuring a large range of input signals. It does come with the caveat that your input signal must be faster than the sample clock, or you will receive an error and have to restart the task (which might not even be an issue, depending on your synchronization requirements). You'll probably end up using a second counter to generate the sample clock; a rough sketch of that arrangement follows.
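(In text form, that arrangement looks roughly like the following Python nidaqmx sketch; device, counter, and terminal names are placeholders, so adjust them to what NI MAX reports. One counter generates a continuous pulse train, and the frequency task uses that counter's internal output as its sample clock:)

```python
# Sketch: sample-clocked frequency measurement. A second counter produces
# the sample clock; the input signal must stay faster than this clock.
import nidaqmx
from nidaqmx.constants import AcquisitionType, FrequencyUnits

SAMPLE_CLOCK_RATE = 1000.0  # Hz (placeholder rate)

# Counter 1: continuous pulse train used as the sample clock.
clk_task = nidaqmx.Task()
clk_task.co_channels.add_co_pulse_chan_freq("Dev1/ctr1",
                                            freq=SAMPLE_CLOCK_RATE)
clk_task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)

# Counter 0: frequency measurement clocked by counter 1's internal output.
meas_task = nidaqmx.Task()
meas_task.ci_channels.add_ci_freq_chan(
    "Dev1/ctr0", min_val=2000.0, max_val=1000000.0, units=FrequencyUnits.HZ)
meas_task.timing.cfg_samp_clk_timing(
    rate=SAMPLE_CLOCK_RATE,
    source="/Dev1/Ctr1InternalOutput",  # placeholder terminal name
    sample_mode=AcquisitionType.CONTINUOUS)

meas_task.start()  # arm the measurement before the clock starts
clk_task.start()
data = meas_task.read(number_of_samples_per_channel=100)

clk_task.close()
meas_task.close()
```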

 

 

Best Regards,

John Passiak