12-31-2017 11:54 AM
Hemant_LV wrote:But do the LabVIEW enqueue and dequeue functions execute at the same time (simultaneously, in parallel), given that they act on a shared resource?
Or, at any given point, will only one of the two functions, enqueue or dequeue, execute?
That's probably a slightly different issue.
Yes, the queue data is a shared resource so accessing it needs to be done in a critical section. I am sure that LabVIEW implements all proper locking procedures to prevent parallel access. Do you have any example code that demonstrates any issue with this?
Are you trying to solve a specific problem?
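The critical-section idea described above can be sketched outside of LabVIEW. Here is a minimal Python analogy (not LabVIEW's actual internals; the class and names are ours for illustration), where a single lock guards the shared buffer so enqueue and dequeue never touch it at the same instant:

```python
import threading
from collections import deque

class LockedQueue:
    """Minimal queue whose enqueue/dequeue are guarded by one lock,
    so the shared buffer is only modified inside a critical section."""
    def __init__(self):
        self._buf = deque()
        self._lock = threading.Lock()

    def enqueue(self, item):
        with self._lock:              # critical section: one thread at a time
            self._buf.append(item)

    def dequeue(self):
        with self._lock:              # same lock protects reads of the buffer
            return self._buf.popleft() if self._buf else None

q = LockedQueue()
q.enqueue(1)
q.enqueue(2)
print(q.dequeue())  # prints 1
```

The lock here plays the role of whatever internal protection LabVIEW uses; callers on different threads can still *call* enqueue and dequeue concurrently, but the buffer update itself is serialized.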
12-31-2017 12:02 PM
@altenbach wrote:
Hemant_LV wrote:But do the LabVIEW enqueue and dequeue functions execute at the same time (simultaneously, in parallel), given that they act on a shared resource?
Or, at any given point, will only one of the two functions, enqueue or dequeue, execute?
That's probably a slightly different issue.
Yes, the queue data is a shared resource so accessing it needs to be done in a critical section. I am sure that LabVIEW implements all proper locking procedures to prevent parallel access. Do you have any example code that demonstrates any issue with this?
Are you trying to solve a specific problem?
Don't have any issue with this, just trying to understand how the queue functions operate and comparing them with the RT FIFO.
01-01-2018 05:23 AM
Hemant_LV wrote:Don't have any issue with this, just trying to understand how the queue functions operate and comparing them with the RT FIFO.
RT FIFOs are optimized with determinism in mind. They are limited in the data types they support, and they preallocate the buffer memory. So you can think of an RT FIFO as a fixed-size queue that will not grow in memory. Not much else is different.
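The "fixed-size queue that will not grow" behavior can be sketched as a preallocated ring buffer. This is a Python illustration of the idea (our own simplified model, not NI's implementation): the storage is allocated once up front, and a write to a full buffer fails rather than reallocating.

```python
class FixedFIFO:
    """RT-FIFO-like buffer: storage preallocated once, never grows."""
    def __init__(self, size):
        self._buf = [None] * size   # preallocated, fixed memory footprint
        self._size = size
        self._head = 0              # index of next slot to read
        self._count = 0             # number of elements currently stored

    def enqueue(self, item):
        if self._count == self._size:
            return False            # full: no reallocation, the write fails
        self._buf[(self._head + self._count) % self._size] = item
        self._count += 1
        return True

    def dequeue(self):
        if self._count == 0:
            return None             # empty: nothing to return
        item = self._buf[self._head]
        self._head = (self._head + 1) % self._size
        self._count -= 1
        return item
```

Because no allocation ever happens after construction, each operation takes a bounded amount of work, which is exactly the property a deterministic real-time system needs.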
01-01-2018 05:30 AM
Thanks. Queues are shared resources; that's why enqueue and dequeue do not execute concurrently.
RT FIFOs are not shared resources; that's why enqueue and dequeue may possibly execute concurrently, in parallel with each other.
Are the above points correct?
01-01-2018 05:56 AM
I am really confused about what you mean by "concurrently." I have loops sitting idle in the Dequeue Element while another loop adds data with the Enqueue Element. The two functions are being called at the same time, so I would consider that running concurrently.
Digging a little, I remember the other main difference: the RT FIFO polls for the data instead of going idle. This is a trade-off between determinism and CPU usage. Yes, the RT FIFO will use up a ton of CPU while waiting for data, whereas the queue uses none (it relies on an interrupt).
A few more details are in the help: Data Communication Methods in LabVIEW
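The two waiting strategies above can be contrasted in a short Python sketch (a rough analogy only; Python's `queue.Queue.get` blocks on an internal condition variable, which stands in for the queue's idle wait, while the second version stands in for the RT FIFO's polling):

```python
import queue
import time

q = queue.Queue()

def blocking_consumer():
    # Queue-style wait: the thread sleeps until data arrives,
    # burning essentially no CPU while idle.
    return q.get()

def polling_consumer(poll_us=50):
    # RT-FIFO-style wait: repeatedly re-check for data,
    # trading CPU time for a predictable, bounded check interval.
    while True:
        try:
            return q.get_nowait()
        except queue.Empty:
            time.sleep(poll_us / 1e6)   # short poll interval

q.put("data")
print(polling_consumer())  # prints data
```

The blocking version is cheaper when waits are long; the polling version avoids depending on wake-up latency, which is why a deterministic system might accept the extra CPU cost.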
01-01-2018 06:12 AM
@crossrulz wrote:
I am really confused about what you mean by "concurrently." I have loops sitting idle in the Dequeue Element while another loop adds data with the Enqueue Element. The two functions are being called at the same time, so I would consider that running concurrently.
Digging a little, I remember the other main difference: the RT FIFO polls for the data instead of going idle. This is a trade-off between determinism and CPU usage. Yes, the RT FIFO will use up a ton of CPU while waiting for data, whereas the queue uses none (it relies on an interrupt).
A few more details are in the help: Data Communication Methods in LabVIEW
Let me explain by an example.
Two loops: the first loop enqueues the data, and the second loop dequeues the data, running slower than the first loop.
Is it possible that both loops will run in parallel, enqueueing and dequeueing data at exactly the same time?
The dequeue loop is not sitting idle; it has enough data to dequeue.
01-01-2018 06:25 AM
@LV_COder wrote:
Let me explain by an example.
Two loops: the first loop enqueues the data, and the second loop dequeues the data, running slower than the first loop.
Is it possible that both loops will run in parallel, enqueueing and dequeueing data at exactly the same time?
The dequeue loop is not sitting idle; it has enough data to dequeue.
They had better. That is the exact setup of a Producer/Consumer and a Queued Message Handler. Inside the queue functions there might be a small block, but those picoseconds are not worth arguing over. The RT FIFO will have the same issue (multiple places trying to access the same memory), if that is what you are looking for.
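The Producer/Consumer setup described above maps directly onto two threads sharing one queue. Here is a minimal Python sketch of that pattern (our own illustration; Python's `queue.Queue` plays the role of the LabVIEW queue, and a sentinel value tells the consumer to stop):

```python
import queue
import threading

q = queue.Queue()
N = 1000
received = []

def producer():
    # Loop 1: enqueue data as fast as it can.
    for i in range(N):
        q.put(i)
    q.put(None)                 # sentinel: tell the consumer to stop

def consumer():
    # Loop 2: dequeue data, running concurrently with the producer.
    while True:
        item = q.get()
        if item is None:
            break
        received.append(item)

p = threading.Thread(target=producer)
c = threading.Thread(target=consumer)
p.start(); c.start()
p.join(); c.join()
print(received == list(range(N)))  # prints True: FIFO order kept, nothing lost
```

Both threads call enqueue and dequeue "at the same time" from the caller's point of view; any brief internal locking inside the queue is invisible at this level, which is the point being made above.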
01-01-2018 06:38 AM
@crossrulz wrote:
@LV_COder wrote:
Let me explain by an example.
2 loops, first loop enqeueing the data and second loop dequeuing the data, running slowly then the first loop.
Is it possible that both loops will run parallelly, enqeueing the data and dequeueing the data at the same time , exactly at the same time.
Dequeue loop not sitting idle, have enough data to dequeue.
They had better. That is the exact setup of a Producer/Consumer and a Queued Message Handler. Inside the queue functions there might be a small block, but those picoseconds are not worth arguing over. The RT FIFO will have the same issue (multiple places trying to access the same memory), if that is what you are looking for.
Thanks for the information.
01-01-2018 09:32 AM
Hemant_LV wrote:Thanks for the information.
The proper ways to give thanks are with kudos and marked solutions.