
Multifunction DAQ


Timed loop to better than OS millisecond level software

Background:

I'm a physicist, so the question I'm asking goes beyond simply solving the technical issue in front of me; I'd like to understand a little more of the bigger picture too.

 

The job I've been given is to generate a LabVIEW + hardware combination which can simultaneously do 3 operations:

a) run a loop at 100 Hz sampling 6 channels of analog voltage

b) run an independent 60 Hz loop which governs

     i) sampling 2 channels of analog voltage

     ii) stepping 3 digital outputs through the bit values that roll from 000 to 111 via simple increments (001, 010, ...)
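The pattern in (b.ii) is just a 3-bit counter that wraps from 111 back to 000. A minimal sketch of the successive states you would write to the digital lines, in Python for illustration (the function name is hypothetical):

```python
def bit_pattern(n_steps):
    """Return n_steps successive 3-bit states as (b2, b1, b0) tuples."""
    states = []
    for step in range(n_steps):
        value = step % 8  # wrap from 111 (7) back to 000 (0)
        states.append(((value >> 2) & 1, (value >> 1) & 1, value & 1))
    return states

# The first eight states run 000 .. 111, then the sequence rolls over:
print(bit_pattern(10))
```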

 

My experience:

I've been writing LabVIEW code actively from 1998 to 2010 (LabVIEW 4 to about 8)

From 2010 to 2012 I mostly maintained other authors' code or inherited their complete systems, so I hadn't done much programming until recently.

 

In 2013 I had to write some real-time LabVIEW code on a CompactRIO system, and that all worked OK.

 

What I've tried to solve the above problems:

1) First I inherited a Measurement Computing USB-1608FS box, which has digital outputs and analog inputs.

I used the MCC ULx libraries and had expected to delegate one task for the 100 Hz input, one task for the 60 Hz input, and one task for the 60 Hz output, then just let the 1608FS get on with running the three tasks - but this did not seem to be supported with timing coming from the box itself.

 

In the end I resorted to a LabVIEW software-timed loop using the OS-based 1 kHz clock, waiting until multiples of 10 ms for the 100 Hz loop and 16 ms for the (roughly) 60 Hz loop.
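For reference, "wait until next ms multiple" timing rounds each deadline to the next whole multiple of the period on the 1 kHz clock, which is why 60 Hz has to become 16 ms (62.5 Hz). A quick sketch of that arithmetic (the function name is illustrative, not a LabVIEW API):

```python
import math

def next_multiple_ms(now_ms, period_ms):
    """Next tick of a 1 kHz software clock that is an exact multiple of period_ms."""
    return period_ms * math.floor(now_ms / period_ms) + period_ms

# A 10 ms period gives exactly 100 Hz on the 1 kHz OS clock,
# but 60 Hz needs 16.67 ms, which must round to a whole 16 ms, i.e. 62.5 Hz.
print(next_multiple_ms(1234, 10))  # next 10 ms boundary after t = 1234 ms
print(1000 / 16)                   # effective rate of the "60 Hz" loop
```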

 

The software loop proved unsatisfactory (acquisition was too slow, and input channels lagged too much from one channel to the next), so I was just abandoning that approach when an NI USB-6218 arrived to take the place of the Measurement Computing box.

 

I'm now trying to write some LabVIEW to run the 6218 via DAQmx functions

I tried copying elements of this example:

http://zone.ni.com/reference/en-XX/help/370466V-01/mxcncpts/hwtimediotimedloop/

 

only I initiated my task as digital output - but the DAQmx Timing (Sample Clock) subVI throws error -200077, which claims that Sample Clock is not an option and that the default "On Demand" timing should be used.

 

Since "On Demand" doesn't show up as a choice for the Timed Loop's timing source, I'm assuming hardware-driven timing of the Timed (While) Loop isn't going to be supported with this hardware.

 

Questions:

A) Is it really not possible to use a USB-6218 to drive hardware-timed loops? Do I have to resort to the feeble OS-driven 1 kHz clock, even though the 6218 is a fairly expensive device with onboard clocks?

 

B) I realise I could do some fancy channel-to-channel triggering, either with physical wires from terminal to terminal or perhaps with software-initiated connections - but all this user intervention seems excessive. Isn't there a simple way of delegating a hardware-timed task to the USB-6218 and letting it just get on with it?

 

C) So far the only platform where I've discovered for sure that hardware timing is supported is a real-time system such as CompactRIO (even though I appreciate that real-time and hardware timing are separate, albeit overlapping, concepts) - but these are very expensive: a CompactRIO chassis and 3 modules cost several thousand pounds.

 

Is it really the case that to get hardware timing one should expect to pay well over 1000 dollars or pounds just to get going?

 

I would have thought I was missing something simple and basic - but I can't tell what it is, nor where to find it, so I've turned to the wisdom of this forum.

 

Thanks

Message 1 of 4

The analog parts should be relatively easy. Run one AI task for all channels at 300 Hz (300 = least common multiple of 60 and 100). Discard (or average) the samples you do not need.
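The single-task idea in numbers (a sketch only; the decimation step assumes you read whole scans at 300 Hz and keep every 3rd scan for the 100 Hz data and every 5th for the 60 Hz data):

```python
from math import gcd

f_fast, f_slow = 100, 60
f_task = (f_fast * f_slow) // gcd(f_fast, f_slow)  # least common multiple -> 300 Hz
print(f_task)

# Simulated scan indices for one second of acquisition at 300 Hz:
scans = list(range(f_task))
data_100hz = scans[::f_task // f_fast]  # every 3rd scan -> 100 samples/s
data_60hz = scans[::f_task // f_slow]   # every 5th scan -> 60 samples/s
print(len(data_100hz), len(data_60hz))
```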

 

The USB-6218 has only software-timed digital outputs, although you can route some counter and frequency-generator lines to digital outputs. You will not get reliable 60 Hz operation with software timing on any non-real-time OS: jitter will be quite noticeable and occasional long delays may occur. If it is feasible for you, I would suggest using a simple counter IC to generate your 3-bit count on its outputs. You could supply its clock from one of the counters on the USB-6218, and possibly use a static digital line for control logic (Reset or Run/Hold). You could also use the two analog output lines, which have hardware timing, but you would still need external circuitry to make good logic-level signals - and that only gets you two of the three lines you need.
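The counter-IC suggestion behaves like this software model (illustrative only - the real part would be a synchronous 3-bit counter, and the Reset / Run-Hold control-line names here are assumptions, not a specific chip's pinout):

```python
class ThreeBitCounter:
    """Software model of an external 3-bit counter IC clocked by the DAQ device."""

    def __init__(self):
        self.value = 0

    def clock(self, run=True, reset=False):
        """Apply one clock edge; Reset forces 000, Run/Hold gates counting."""
        if reset:
            self.value = 0
        elif run:
            self.value = (self.value + 1) % 8  # roll over from 111 to 000
        return self.value

counter = ThreeBitCounter()
outputs = [counter.clock() for _ in range(9)]
print(outputs)  # counts up from 000 and rolls over after 111
```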

 

Unfortunately, it appears that NI does not offer an inexpensive solution for hardware timed digital input or output.  Some X series devices such as the USB-6341 do have that capability but the price is above your $1000 US threshold. This has been a frustration for me as well.

 

I cannot answer your questions about timed loops, but I think the latency in the USB drivers at the OS level precludes their use as timing sources.

 

Lynn

Message 2 of 4

Dear Lynn,

 

thanks for the comprehensive answer.

I had come to the same conclusion regarding 300 Hz, but felt my original message was already long enough so didn't elaborate - although with an OS-based loop, the 300 Hz period would round to 3 ms, i.e. 333.33 Hz, plus significant jitter.

 

It is reassuring to learn that I'm not the only person amazed by this apparent gap in the available technology.

 

I'm going to fall back on making a CompactRIO implementation for now - just to have something ready to hand over - and then worry about more complex hardware implementations along the lines you've mentioned for the USB device, if that proves necessary.

 

I would have thought that this gap in the market would have been filled by some company or other selling something in the region of 800 dollars, with readable internal timing that can drive a LabVIEW loop, flexible digital I/O, and some analog sampling up to +/- 10 V.

 

Perhaps there's something very awkward, deeply buried in the firmware/software interaction between such hardware and the OS, that proves too difficult to implement so cheaply - because it doesn't seem that the hardware itself should cost so much. Perhaps someone from NI might comment.

Message 3 of 4

Hi,

 

Firstly, with regard to the assumption that 300 Hz will actually be 333.33 Hz plus jitter: that is not entirely true, because the internal analog-input sample clock will be used rather than the OS timer. From the specification sheet the device is rated at 250 kS/s, which works out to about 4 microseconds per conversion - in other words, just a tiny fraction of the 300 Hz scan period you are planning to run at.
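The arithmetic behind that claim, as a quick check (assuming the 250 kS/s aggregate rate from the USB-6218 specification and the 6 + 2 = 8 channels described above):

```python
aggregate_rate = 250_000                  # S/s, from the USB-6218 specification
sample_period_us = 1e6 / aggregate_rate   # time per A/D conversion, in microseconds
scan_period_us = 1e6 / 300                # one scan period at the 300 Hz task rate

print(sample_period_us)  # 4 us per multiplexed conversion

# Eight channels converted back-to-back still occupy only a small
# fraction of each 300 Hz scan period:
fraction = 8 * sample_period_us / scan_period_us
print(f"{fraction:.2%}")
```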

 

Have you thought about using a PCI card for a hardware-timed approach? There are products in this category that fall within your suggested budget of 1000 USD/GBP, such as the PCI-6221 (http://sine.ni.com/nips/cds/view/p/lang/en/nid/14132).

 

However, this is only one of possibly many solutions. To get a broader view of all the products National Instruments sells, I strongly suggest calling your local NI branch and having a talk with an ISR (inside sales representative). They will surely have something to offer within your price range.

 

 

______________________________________________________

Mark N
Applications Engineer
National Instruments UK & Ireland
Message 4 of 4