12-09-2020 02:19 PM
I have a data acquisition loop running at 1000 samples/second. I am using a queue to pass a single command into the loop. Unless I set the dequeue wait time to about 30 ms, the message just disappears. I know it is being put in the queue in the message loop, or at least I know it is being presented to the enqueue.
I want the dequeue wait time as short as possible, so as not to slow down my acquisition. If it is set long enough to catch the message, my DAQ card buffer overflows.
Where is my message going? If it's on the queue why does the dequeue need to wait a long time to get it? Can't it just pick it up the next time around?
Thanks
12-09-2020 03:48 PM
I suggest attaching a VI so we can SEE what you are trying to describe in words. What is the purpose of the message you are sending?
Note, you can set the timeout input of the Dequeue Element function to 0 if you want. Just monitor the "Timed Out?" boolean to see whether it timed out or received an element from the queue. With a value of 0, the Dequeue Element function will not "wait" at all.
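LabVIEW diagrams can't be pasted as text, so here is a rough Python analogue of that zero-timeout pattern (the helper name is made up for illustration): the dequeue returns immediately either way, and you branch on the timed-out flag.

```python
import queue

q = queue.Queue()
q.put("stop command")

# Rough analogue of wiring 0 to the timeout input of Dequeue Element
# and monitoring the "Timed Out?" output: the call returns immediately
# whether or not an element is available.
def dequeue_no_wait(q):
    try:
        return q.get_nowait(), False   # (element, timed_out)
    except queue.Empty:
        return None, True

element, timed_out = dequeue_no_wait(q)    # picks up the command at once
element2, timed_out2 = dequeue_no_wait(q)  # queue now empty: "times out" instantly
```

The acquisition loop pays essentially nothing on iterations where the queue is empty, and still picks up a command on the very next iteration after it is enqueued.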
12-09-2020 04:42 PM
I second the motion. We need the code. I could speculate several things, but the code would most likely quickly eliminate (at least) most of them as suspects.
FWIW, I've had schemes with calculated timeouts that were allowed to go as low as 0 msec. Short timeouts didn't cause a loss of queue elements. That isn't a limitation built into the queue primitives. There's gonna be some other explanation.
-Kevin P
12-09-2020 06:06 PM - edited 12-09-2020 06:07 PM
I don't even know why your DAQ buffer overflows. A queue is designed specifically to gather info as fast as possible so this doesn't happen. Dequeuing has absolutely nothing to do with why your DAQ buffer overflows. (On the other hand, an improperly managed queue could eat up all available memory on your PC if you let it.)
edit: nvm, I misunderstood the post.
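To illustrate the point the post above was making, here is a minimal producer/consumer sketch in Python (a stand-in for the LabVIEW pattern, not real DAQ code): the consumer can lag behind the producer without the producer ever blocking or dropping samples, because the queue absorbs the difference.

```python
import queue
import threading

data_q = queue.Queue()   # unbounded: enqueue never blocks the producer

def acquisition_loop():
    # Stands in for the fast DAQ producer loop: just push samples.
    for sample in range(1000):
        data_q.put(sample)   # enqueue is effectively instantaneous

def logging_loop(out):
    # The slower consumer loop drains the queue at its own pace;
    # samples wait in the queue instead of overflowing a DAQ buffer.
    for _ in range(1000):
        out.append(data_q.get())

results = []
producer = threading.Thread(target=acquisition_loop)
consumer = threading.Thread(target=logging_loop, args=(results,))
producer.start(); consumer.start()
producer.join(); consumer.join()
```

With a single producer and a single consumer, the queue also preserves sample order, so `results` comes out in the same order the samples were acquired.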
12-11-2020 10:10 AM
Guys, I thank you for your interest, but I changed direction. The message was a request for a small bit of data being produced in the main loop. I got fed up and did what I should have done, which is an LV2-style global. Works like a champ. KISS.
I don't want to copy the code from my stand-alone machine, but FWIW the request was sent via TCP. The TCP listener was in a separate message handling loop. The queue was used to send the request into the main loop and data back out to the message handler, and back over the TCP connection. The request would make it out of the dequeue only if the timeout was set to more than 100ms. Then the whole Rube Goldberg system worked. Except my main loop was slowed by the long timeout.
I know you want to see the code, but I'm moving on with my life. Thanks again.
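For readers unfamiliar with the term: an LV2-style global (a.k.a. functional global variable) is a non-reentrant VI whose state lives in an uninitialized shift register. A rough Python analogue (the function and action names are invented for illustration) looks like this:

```python
import threading

_lock = threading.Lock()      # a non-reentrant VI serializes its callers for free
_state = {"value": None}      # stands in for the uninitialized shift register

def lv2_global(action, data=None):
    # One access point, "set"/"get" actions, state that persists
    # between calls: the core of the LV2-style global pattern.
    with _lock:
        if action == "set":
            _state["value"] = data
        return _state["value"]

lv2_global("set", 3.14)       # main loop stores the latest reading
latest = lv2_global("get")    # message handler fetches it on demand
```

Because every caller goes through the same serialized access point, the main loop can "set" on each iteration with negligible cost, and the message handler can "get" whenever a request arrives.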
12-11-2020 10:11 AM
Sorry, I meant 30 ms as in the original post. But whatever.
12-11-2020 10:56 AM
@AndyTailored wrote:
I got fed up and did what I should have done, which is an LV2-style global. Works like a champ. KISS.
If it is a simple Get/Set, you might as well just use an actual Global Variable. A Look At Race Conditions
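The distinction matters because a bare global is only safe for a simple Get or a simple Set; any read-modify-write has to be made atomic, as in this Python sketch (an analogue only, but the race the linked article describes is language-independent):

```python
import threading

counter = 0
lock = threading.Lock()

def increment_many(n):
    # "counter = counter + 1" on a bare global is a read, a modify, and
    # a write; two threads can interleave between those steps and lose
    # updates. Holding a lock makes the whole step atomic.
    global counter
    for _ in range(n):
        with lock:
            counter = counter + 1

threads = [threading.Thread(target=increment_many, args=(10_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# counter is now exactly 40_000; without the lock it could come up short
```

If all you ever do is write a value in one place and read it in another, neither the lock nor the FGV machinery buys you anything, which is the point being made above.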
12-11-2020 11:08 AM
Do I really want to chime in on this????
@AndyTailored wrote:
Sorry, I meant 30ms as in the original post. But whatever.
Yes, I think that I do. For everyone else... this is a good example of a BAD post! The OP gave up on some code and won't share the information that would help you help them!
12-11-2020 09:09 PM
@AndyTailored wrote:
Guys, I thank you for your interest, but I changed direction. The message was a request for a small bit of data being produced in the main loop. I got fed up and did what I should have done, which is an LV2-style global. Works like a champ. KISS.
I don't want to copy the code from my stand-alone machine, but FWIW the request was sent via TCP. The TCP listener was in a separate message handling loop. The queue was used to send the request into the main loop and data back out to the message handler, and back over the TCP connection. The request would make it out of the dequeue only if the timeout was set to more than 100ms. Then the whole Rube Goldberg system worked. Except my main loop was slowed by the long timeout.
I know you want to see the code, but I'm moving on with my life. Thanks again.
I think even an FGV is Rube-ish. Do you know why it is called an "LV2-style global"? Because in LV2 you didn't have real globals. You may as well just use a "regular" global and dispense with all that overhead.