NI Home > Community > NI Discussion Forums
Member
pgraebel
Posts: 72
Accepted Solution

XNET buffer problem

I am trying to sample a CAN signal at 5 kHz. The frame carrying it is sent roughly every 240 us (~4.2 kHz). I chose Signal In Waveform mode with a ResampRate of 5000, so that the received signal is resampled at the desired rate of 5 kHz.


I get a read buffer overflow error (-1074384885) if I wait for 1000 available values on each iteration (timeout -1). The program works if I change the timeout to zero instead (i.e. the read returns immediately, whether data is available or not). Number of Values Pending seems to be 4294967295 most of the time.
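For what it's worth, that pending count looks suspicious on its own:

```python
# 4294967295 is 0xFFFFFFFF, i.e. -1 stored in an unsigned 32-bit integer,
# so it looks like a sentinel/error value rather than a real pending count
# (my guess, not confirmed by the documentation).
print(4294967295 == 0xFFFFFFFF)   # True
print((-1) & 0xFFFFFFFF)          # 4294967295
```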


What's the problem with the read timeout here? Why does a blocking read not work? Is the sampling or the resampling killing my buffer? I successfully use "1000 values / -1 timeout" with DAQmx Read. I want to synchronize DAQmx Read and XNET Read at 5 kHz later on, so that each read call returns 1000 AI/CAN sample pairs.


xnet_screenshot.jpg


Member
pgraebel
Posts: 72

Found the problem, but cannot explain it


I found out that setting the transmit time (in the XNET database editor) to a value greater than zero solves the problem. I can now read 1000 values via blocking calls (timeout -1).


I know this setting is supposed to be the interval for cyclic transmission. The documentation says that for input it acts as a "debounce time", but what does that mean? What is the ideal debounce time to set?


XNET-Editor.jpg

Member
Ceule
Posts: 21

Re: XNET buffer problem

Your observation is basically correct. For buffered input modes, the necessary size of the internal XNET queues is calculated from the assumed frame rate, which is taken from the database (the transmit time entry). If the frame rate is slow, a missing transmit time (i.e. 0) does no harm, since there is a minimum queue size of 64 frames. But if the frame rate is fast (e.g. in the kHz range, as in your example), that default queue size is not sufficient to accommodate enough frames for a blocking read.

The queue size is calculated for 0.4 s worth of data on Windows and 0.1 s worth on RT, based on the frame rate derived from the transmit time property. So if the transmit time is set correctly, you should be able to block for up to that long. But don't push the limits on Windows: timing there is somewhat arbitrary, so a read near the limit might work sometimes and fail at other times.
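As a rough Python sketch of the arithmetic: the 64-frame floor and the 0.4 s / 0.1 s windows are as described above, but the exact rounding and fallback the driver uses internally are my assumptions.

```python
import math

def xnet_frame_queue_size(transmit_time_s: float, on_rt: bool) -> int:
    """Approximate the internal frame queue sizing described above.

    The 64-frame minimum and the 0.4 s (Windows) / 0.1 s (RT) windows are
    from the description; the exact rounding is an assumption.
    """
    window_s = 0.1 if on_rt else 0.4
    if transmit_time_s <= 0:
        return 64  # no frame rate known: fall back to the minimum queue size
    return max(64, math.ceil(window_s / transmit_time_s))

# One frame every 240 us on RT: a few hundred frames, ~0.1 s of data.
print(xnet_frame_queue_size(240e-6, on_rt=True))   # 417

# Transmit time left at 0: only 64 frames, i.e. ~15 ms at 240 us/frame,
# far less than the 200 ms a blocking read of 1000 samples at 5 kHz needs.
print(xnet_frame_queue_size(0, on_rt=True))        # 64
```

So with your 240 us frames, a correct transmit time yields a queue covering the whole blocking window, while a transmit time of 0 leaves the 64-frame floor, which a 200 ms blocking read easily overruns.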

Member
pgraebel
Posts: 72

Re: XNET buffer problem


Thanks for the explanation. I did not know that the transmit time directly influences the internal frame queue size. I am using a PXI system with RT, by the way.


So Signal In Waveform mode has two queues, an internal frame queue and a subsequent resampled-signal buffer? That would explain why setting the queue size via a property node had no effect.

Member
Ceule
Posts: 21

Re: XNET buffer problem

Setting the queue size in waveform input mode suffers from the same problem.... you enter the queue size in samples (of the resampled data), but this has to be recalculated to a frame buffer in the firmware of the 'same size', i.e. the same amount of time. If the frame rate is not given (or given incorrectly), the calculation of the size of that internal buffer gives incorrect results and thus the internal queue size is incorrect.