NI9205: Setting Minimum Time Between Conversions

Good morning,

I’m looking for a way to set the minimum time between conversions (NI 9205) at run time from the block diagram. It’s important to be able to change this setting without editing the module properties dialog and recompiling the VI.

Thanks for all suggestions.

 

Greetings

Marcel

Message 1 of 6
Can you use an external sample clock?

Mike...

Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

For help with grief and grieving.
Message 2 of 6

Hi Marcel,

 

unfortunately there is no node that would let you set this property programmatically. I was wondering what the purpose behind your request is: why do you need to change this property at run time? Maybe there is another solution. In case you missed it, here is some further information on this property:

 

Conversion Timing for the NI 9205/9206 (FPGA Interface) - CompactRIO Reference and Procedures (FPGA Interface) - National Instruments
http://zone.ni.com/reference/en-XX/help/370984R-01/criodevicehelp/conversion_timing/

 

 

Best Regards,

 

Gregor

Message 3 of 6

Thanks for the reply,

 

the main problem is that I have to modify an existing project, and I would like to change it as little as possible.

 

The current program requests 1 to 32 AD conversions in parallel in separate nodes, and I need to slow down the time between channels because of ghosting caused by high-impedance (MΩ) inputs. Depending on the measurement task, a higher sampling rate may be needed. The sampling rate should be settable without recompiling the module.

 

Is there a way to set or slow down the sampling rate without modifying the parallel structure of the current program?

 

Marcel

Message 4 of 6

Hi Marcel,

 

sorry for the late reply. From my understanding, the system always chooses the fastest sample rate from the two configuration possibilities. That means if you specify a minimum time between conversions of 10 µs and you have 10 channels, the scan time is 100 µs. If you then call the node inside a loop and specify a faster loop rate, for example 50 µs, the minimum time between conversions will be overridden.

 

A possible option for you would be to set the minimum time between conversions to a very slow rate and then control the actual rate with a loop timer.
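The timing arithmetic above can be sketched in a few lines. This is only an illustration of the relationship described in this thread, not NI driver code; the function names are invented for the example.

```python
# Sketch of the NI 9205 multiplexed-scan timing described above.
# Assumption (from this thread, not NI docs): one full scan takes
# n_channels * min_time_between_conversions, and a faster loop
# period overrides the configured minimum.

def configured_scan_us(min_time_between_conversions_us: float,
                       n_channels: int) -> float:
    """Scan period implied by the module's configured minimum."""
    return min_time_between_conversions_us * n_channels

def loop_period_for_settling_us(desired_settling_us: float,
                                n_channels: int) -> float:
    """Loop period needed to guarantee a given time between conversions,
    when the minimum is set very slow and the loop timer sets the pace."""
    return desired_settling_us * n_channels

# Gregor's example: 10 us between conversions, 10 channels -> 100 us scan.
assert configured_scan_us(10, 10) == 100
# To guarantee 25 us of settling across 8 channels, the loop must
# run no faster than once every 200 us.
assert loop_period_for_settling_us(25, 8) == 200
```

This is why the "set the minimum very slow, pace with the loop timer" suggestion works: the loop period becomes the only knob, and it can be changed at run time without recompiling.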

Message 5 of 6
As problematic as it might be, Marcel, you are going to have to change the code. The settling time that the DAQ drivers provide is adequate to handle the settling of NI's input logic. There is no way the driver can know what sensors you have attached or how fast they can be sampled.
If your inputs take (for example) 500 ms to settle, that defines how fast you can sample, period. There is no magic to be had here. If you need data faster than that, you need better sensors, better wiring, or a better interface technique.
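A quick back-of-the-envelope check of this point: the sensor settling time, not the driver setting, caps the usable scan rate. The numbers below are illustrative, not taken from the actual hardware in this thread.

```python
# If each multiplexed channel must settle before its conversion,
# the fastest full-scan rate is bounded by settling time * channel count.

def max_scan_rate_hz(settling_time_s: float, n_channels: int) -> float:
    """Fastest full-scan rate when every channel needs settling_time_s."""
    return 1.0 / (settling_time_s * n_channels)

# 500 ms settling, one channel: at most 2 scans per second.
print(max_scan_rate_hz(0.5, 1))   # 2.0
# 500 ms settling across 10 multiplexed channels: 0.2 scans per second.
print(max_scan_rate_hz(0.5, 10))  # 0.2
```

No driver setting can raise this ceiling; only faster-settling sensors, lower source impedance, or a per-channel (non-multiplexed) front end can.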

Mike...

Message 6 of 6