LabVIEW


Slow VI execution - data loss


The problem still remains. I get 2 good points at 2 samples to read, and then a 2-second gap. I put the DAQ Assistant in a while loop, set for continuous samples, and it flies. Everything I have read advises against that, but it is the only thing that works so far.

Message 11 of 14

Hmmm. Let's see where you are with the timing fixed; that 200 samples @ 100 Hz was definitely a slowdown. Continuous sampling can cause some serious lag between when the values are taken and when you get them out of the device memory, and you may eventually fill the device buffer.
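The gap the original poster saw lines up with the arithmetic: a finite read of N samples at rate f blocks for roughly N/f seconds while the buffer fills, so 200 samples at 100 Hz stalls the loop for about 2 seconds per read. A minimal sketch of that arithmetic in plain Python (the function name is ours, for illustration):

```python
def finite_read_block_time(samples_to_read, rate_hz):
    """Approximate time a finite DAQ read blocks while the buffer fills."""
    return samples_to_read / rate_hz

# 200 samples at 100 Hz: the loop waits ~2 s between sample groups.
print(finite_read_block_time(200, 100))  # 2.0

# Even a tiny read of 2 samples still waits for the clock: 20 ms.
print(finite_read_block_time(2, 100))  # 0.02
```

Dropping the samples-per-read count shortens each stall, but the loop still cannot iterate faster than the sample clock delivers data.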

 

Is there a specific channel that slows down the task?

 

Another thing to try is to use "DAQmx Control Task.vi" to "Commit" the task prior to entering the loop. This isn't the major issue, though, unless you found a really odd bug.
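For readers working from the Python side, the same commit-before-the-loop idea looks roughly like the sketch below using the nidaqmx package. This is illustrative only: the device/channel name "Dev1/ai0" is a placeholder, and it needs NI hardware and drivers to actually run.

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # placeholder channel
    task.timing.cfg_samp_clk_timing(
        rate=100.0, sample_mode=AcquisitionType.CONTINUOUS)

    # Commit before the loop: resources are reserved and the hardware is
    # programmed once, so the task does not get reconfigured on each start.
    task.control(TaskMode.TASK_COMMIT)

    task.start()
    for _ in range(10):
        # Blocks until 100 samples are available, then returns them.
        data = task.read(number_of_samples_per_channel=100)
        # ... hand `data` off to a consumer loop via a queue ...
```

The point of the commit is to move the task through the verify/reserve/commit states up front, rather than paying that cost inside the acquisition loop.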

Message 12 of 14

Ideally, I would like a timed loop, or one sample on demand with a timer. I played with 10, 20, 50, 100, and 200 samples to read, but they all come with a 2-12 second lag between sample groups; at least continuous sampling has no data loss. All channels are sampling at 100 samples/sec with only a few ms of lag between DAQ units. I didn't try it earlier mainly because every sample code I see passes the task into the loop. Maybe an on-demand setup with an inline timer passed into the loop?

For now, I am up and running, but as I add more features, the producer loop structure may become more critical. I think the way I pass the task into the loop is the issue. Going back to what you said earlier, it's almost as if the task is recreated after n samples are taken: more channels, more lag.

For example, using my test setup of one channel, I monitored the items in the queue in both the producer and consumer loops, with the producer set at 100 samples/sec and the consumer on a timed loop. When the consumer timing matched the data gaps (31 ms), the enqueue and dequeue counts matched. That tells me the queue was only receiving data at 31 ms increments, despite the higher rate of 100 samples/sec.

Regardless of the final issue, I learned quite a bit, and using continuous samples gets me running, giving me the time to tweak this, so thank you for the time you are investing in this discussion.
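The enqueue/dequeue bookkeeping described above can be sketched in a language-agnostic way. The following is a hedged Python stand-in for the producer/consumer pattern (a thread and a queue, with a simulated producer instead of real DAQ reads), not the original LabVIEW code: when the consumer keeps pace with the producer, every enqueued sample is dequeued and the counts match.

```python
import queue
import threading
import time

def producer(q, n_chunks, chunk_size, rate_hz):
    # Simulated DAQ producer: pushes chunk_size samples every
    # chunk_size / rate_hz seconds, then a sentinel to signal completion.
    period = chunk_size / rate_hz
    for i in range(n_chunks):
        q.put([i] * chunk_size)  # stand-in for one block of readings
        time.sleep(period)
    q.put(None)

def consumer(q, counts):
    # Drains the queue as chunks arrive; nothing is ever dropped.
    while True:
        chunk = q.get()
        if chunk is None:
            break
        counts["dequeued"] += len(chunk)

q = queue.Queue()
counts = {"dequeued": 0}
t = threading.Thread(target=consumer, args=(q, counts))
t.start()
producer(q, n_chunks=5, chunk_size=10, rate_hz=1000)
t.join()
print(counts["dequeued"])  # 50: every sample produced was consumed
```

Because the queue buffers bursts, the consumer only needs to keep up on average; a slow consumer shows up as queue growth, not as lost samples, which matches the behavior observed in the thread.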

Message 13 of 14

Not a problem at all! Glad to help!

 

I learn new things all the time by investing in the forums. (If I only had my own mistakes and misunderstandings to learn from, I'd be really handicapped since, of course, I'm nearly perfect. :D)


Glad you are up for now. THAT is important! And sometimes the WHY is more self-pedantic than functional. When it is time to dig deeper, we will both know more.

 

Keep posting!

 

Message 14 of 14