
Streaming DAQ on a [ms] level

Hello,

 

I have a DAQ of 72 PXI channels with a minimal sampling rate of 1000 S/s. Attached is PXI_SyncExtTrig_Start-Stop_VIEW.vi, which starts the DAQ upon a rising-edge trigger and ends it upon a level change on a DI line. The reason for the latter is the lack of STOP-trigger functionality on my PXI DSAs. This design forces me to scan sample-by-sample in order to avoid acquiring data after the level change. Here is the problem - if I set:

 

# of Scans per Chunk = 1 (on the front panel)

 

the VI is unstable because the TDMS write accesses the disk every millisecond; it fails randomly and never achieves minutes of DAQ. It becomes stable if the chunk size is raised to 10:

 

# of Scans per Chunk = 10

 

Now the reason I post this here: I have to stream the DAQ data over Ethernet synchronously with a video. The network-streaming system packs DAQ samples along with channel names etc., but it again EXPECTS SINGLE SAMPLES OF ALL INCLUDED CHANNELS per iteration. This means I again need to either:

 

1) process the scanned chunks sample-by-sample and stream them over Ethernet (10 iterations within a single DAQ iteration, sketched further below), or

 

2) scan sample-by-sample, stream the samples over Ethernet immediately, and then pack them in groups of 10 to ensure stable streaming to disk.

 

I have tried the 2nd approach (attached as PXI_SyncExtTrig_Start-Stop_VIEW_SINGLE.vi), but it is not stable: it fails soon after start with an error message reporting loss of samples (the DAQ cannot keep up with the timing inside the main loop).
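For readers following in text rather than on the block diagram, here is a rough Python/nidaqmx sketch of option 1: read a whole chunk per iteration (so the disk and the acquisition buffer are serviced in blocks), then fan the chunk out sample-by-sample for the network stream. The channel string, chunk size, buffer size, and the two helper callbacks are illustrative assumptions, not parts of the attached VI.

import nidaqmx
from nidaqmx.constants import AcquisitionType

CHUNK = 10          # "# of Scans per Chunk" from the front panel
RATE = 1000         # 1 kS/s minimal sampling rate

def acquire(write_chunk_to_disk, stream_sample_to_network):
    with nidaqmx.Task() as task:
        # 72 PXI channels in the real setup; a short range here for brevity
        task.ai_channels.add_ai_voltage_chan("PXI1Slot2/ai0:7")
        task.timing.cfg_samp_clk_timing(RATE,
                                        sample_mode=AcquisitionType.CONTINUOUS,
                                        samps_per_chan=RATE * 10)  # generous buffer
        task.start()
        while True:  # the real VI stops on the DI level change instead
            # one DAQ read per loop iteration: a whole chunk, so the disk is
            # touched every CHUNK milliseconds instead of every millisecond
            chunk = task.read(number_of_samples_per_channel=CHUNK)
            write_chunk_to_disk(chunk)               # e.g. the TDMS write
            for i in range(CHUNK):                   # fan out sample-by-sample
                sample = [ch[i] for ch in chunk]     # one value per channel
                stream_sample_to_network(sample)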

 

My question is: how do I solve this properly?

 

Thank you in advance,

Message 1 of 7

When you say "The network-streaming system packs DAQ samples along with channels names etc. but again EXPECTS SINGLE SAMPLES OF ALL INCLUDED CHANNELS per iteration," what do you mean by the 'network-streaming system'?  What does this system consist of (software and hardware)?

 

Thanks,

 

Sean

Applications Engineering Specialist - Semiconductor Test
National Instruments
Message 2 of 7

Hi Sean,

 

It is a VI that accepts formatted data and knows how to stream it to the recipient (a Java client). The data is a string consisting of the names of all channels (originating from an array of single channel names, no expanded selection), single samples of all involved DAQ channels, and a time-stamp (UTC format). It is forwarded to a server-daemon VI that passes it on to the remote clients that have established a link with it.
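As a hedged illustration of the message described above (channel names, one sample per channel, and a UTC time-stamp, forwarded over TCP), here is a small Python sketch; the separators, host, and port are assumptions, not the actual protocol of the server-daemon VI.

import socket
from datetime import datetime, timezone

def send_sample(sock, channel_names, sample_values):
    timestamp = datetime.now(timezone.utc).isoformat()
    message = ";".join([
        ",".join(channel_names),                      # names of all channels
        ",".join(f"{v:.6f}" for v in sample_values),  # one sample per channel
        timestamp,                                    # UTC time-stamp
    ]) + "\n"
    sock.sendall(message.encode("utf-8"))

# usage: sock = socket.create_connection(("daemon-host", 5000)); then
# send_sample(sock, ["ch0", "ch1"], [0.12, 3.45])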

 

So basically I may have a threading problem but don't know how to solve it in that sense. I have 3 "threads": (1) the DAQ, which ideally should perform a single-sample scan, (2) the TDMS streaming of those samples to disk, and (3) the TCP streaming of them.

 

Jack Hamilton suggested I should use multithreading with separate loops synchronized by queues, but I am not familiar with that concept. Is that feasible, and how can I do it? PS: the desktop connected to the PXI is a single-core Dell P4, I believe, purchased in 2003, with SCSI etc.
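For reference, a minimal text-language sketch of the queue-synchronized producer/consumer idea, with Python threads standing in for the LabVIEW loops; acquire_chunk and handle_chunk are placeholders for the DAQmx read and the TDMS/TCP work, and all names are illustrative.

import queue
import threading

data_q = queue.Queue()           # plays the role of the LabVIEW queue refnum
STOP = object()                  # sentinel that tells the consumer to finish

def producer(acquire_chunk, n_chunks):
    for _ in range(n_chunks):
        data_q.put(acquire_chunk())   # only acquisition happens in this loop
    data_q.put(STOP)                  # propagate the stop through the queue

def consumer(handle_chunk):
    while True:
        item = data_q.get()
        if item is STOP:
            break
        handle_chunk(item)            # TDMS write and/or network stream go here

threading.Thread(target=producer, args=(lambda: [0.0] * 72, 100)).start()
consumer(print)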

 

Thanks in advance,

Message 3 of 7

Sean,

 

I have implemented a queue for multi-loop synchronization, but there seems to be a phantom problem:

 

See attached figure.

 

I now have 2 problems with this implementation:

 

(1) the producer (upper) loop sometimes stops with the error "samples attempted to be read are no longer available", which is unbelievable considering the low/minimal acquisition rate of just 1 kS/s

 

(2) more importantly, when the producer loop stops, the local variable [Stopped] receives TRUE inside it BUT DOES NOT PASS ITS NEW VALUE to the consumer loop, so the lower loop NEVER stops and the VI hangs and needs to be terminated manually

 

I have used probes to confirm that the local does not pass between the loops. Can this be solved? If this queueing concept worked, I had planned to also put my TDMS write inside the consumer loop.
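One hedged way to look at problem (2): if the consumer dequeues with an infinite timeout, it sits blocked inside the dequeue and never re-reads the [Stopped] local, so the new value can never take effect. Below is a Python sketch of a consumer that polls the flag between dequeues by giving the dequeue a finite timeout; the names are illustrative, not taken from the attached VI.

import queue
import threading

data_q = queue.Queue()
stopped = threading.Event()      # stands in for the [Stopped] local variable

def consumer(handle_item):
    while not stopped.is_set():              # the flag is re-checked every pass
        try:
            item = data_q.get(timeout=0.1)   # finite dequeue timeout
        except queue.Empty:
            continue                         # nothing arrived; check the flag again
        handle_item(item)                    # TDMS write / network stream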

 

Any ideas?

Message 4 of 7

1) Which DAQmx read does this occur at? The one for the AI or for the DI? If it's for the AI, then you are only reading one sample at a time and your buffer is overflowing (so the sample is no longer available).

 

2) It looks like you used a "Nor", so if you set the value to "True", the value the "Nor" passes on is False.
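A tiny plain-Python check of that logic, just to show the truth table involved:

def nor(a, b):
    return not (a or b)

print(nor(True, False))    # False - setting either input True forces the output False
print(nor(False, False))   # True  - only all-False inputs give True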

Applications Engineering Specialist - Semiconductor Test
National Instruments
Message 5 of 7

Sean,

 

I am embarrassed:

 

(1) Scanning 1 S/ch was the cause of this problem. I have increased the chunks to 10 S/ch per scan and stream decimated scans (1 of 10) over Ethernet!

 

(2) The dequeue timeout was infinite. I set it to 100 ms and everything runs smoothly, even with the TDMS streaming inside the consumer loop alongside the network stream (sketched below).
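For completeness, a hedged Python sketch of the consumer side of this final arrangement: dequeue with a 100 ms timeout, write the full 10-sample chunk to disk, and stream one decimated sample of each chunk to the network. write_tdms and send_tcp are placeholders for the actual TDMS and network-stream code, not the VI's real sub-VIs.

import queue
import threading

data_q = queue.Queue()
stopped = threading.Event()

def consumer(write_tdms, send_tcp):
    # keep draining the queue after the producer signals a stop
    while not stopped.is_set() or not data_q.empty():
        try:
            chunk = data_q.get(timeout=0.1)      # the 100 ms dequeue timeout
        except queue.Empty:
            continue
        write_tdms(chunk)                        # full-rate 10-sample chunk to disk
        sample = [ch[0] for ch in chunk]         # 1-of-10 decimation
        send_tcp(sample)                         # decimated sample to the clients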

 

Sorry for the inconvenience and thank you for your support!

 

Sincerely,

Roman

Message 6 of 7

No problem, just glad you got it working!

 

Thanks,

 

Sean

Applications Engineering Specialist - Semiconductor Test
National Instruments
Message 7 of 7