Missing data when using large arrays

When saving data, I have this time gap with missing data (i.e. data up to 0.7 s, then a jump in time, starts saving data again at 1 s, jump in time, etc.). I am saving an array of size 25000x3. I know this array is taking a lot of memory, but I need there to be no time gap, or a gap smaller than 0.1 s. I was wondering if anyone has advice on how this can be done?
 
Thanks,
TK
Message 1 of 6

"I was wondering if anyone has advice on how this can be done?"

Yes, we do! 🙂

Unfortunately, I don't know which advice to offer without doing a LOT of guessing. 🙁

If you post the VI (and its subVIs) in a reply to this thread, we will be able to take a look at your code and offer the appropriate advice.

Ben

PS: If you are new to this forum, you can attach your VI using the "attachment" box located below the "Submit Post" button.

Message Edited by Ben on 10-04-2006 02:24 PM

Retired Senior Automation Systems Architect with Data Science Automation · LabVIEW Champion · Knight of NI and Prepper
Message 2 of 6
Just a quick idea: I once noticed that large arrays may cause problems, and queues can be far better than large arrays 😉

Maybe this thought is out of the ballpark, but I wonder if somebody else is using queues instead of large arrays 😛?
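To make the queue idea concrete, here is a hypothetical sketch in Python (a text stand-in for LabVIEW's graphical queue primitives). A producer loop acquires data and enqueues it while an independent consumer loop dequeues and "saves" it, so a slow save never stalls acquisition. In LabVIEW this maps to two parallel while-loops sharing a queue reference (Obtain Queue / Enqueue Element / Dequeue Element); all names below are illustrative, not the poster's actual code.

```python
import queue
import threading

def producer(q, n_chunks):
    # Acquisition loop: never blocks waiting on the consumer.
    for i in range(n_chunks):
        chunk = [i] * 3          # stand-in for one 3-channel sample block
        q.put(chunk)
    q.put(None)                  # sentinel: tells the consumer to stop

def consumer(q, saved):
    # Save loop: drains the queue at its own (possibly slower) pace.
    while True:
        chunk = q.get()
        if chunk is None:
            break
        saved.append(chunk)      # stand-in for the slow file write

q = queue.Queue()
saved = []
t = threading.Thread(target=consumer, args=(q, saved))
t.start()
producer(q, 5)
t.join()
print(len(saved))                # all 5 chunks arrive despite the decoupling
```

The point of the design is that the queue buffers data during the save, so no samples are dropped even when the consumer temporarily falls behind.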

Message edited by TiTou on 10-05-2006 03:14 PM


We have two ears and one mouth so that we can listen twice as much as we speak.

Epictetus

Antoine Chalons

Message 3 of 6
Here is a simplified program. Make sure the Boolean 2 button is off when you start it.

When you run it, watch the clock: you will notice that it gathers data multiple times (about 5 or 6 sets) per second. When you turn on the saving block (Boolean 2 button on), the clock slows down and it gathers only 2-3 sets per second. This is a simplified program; the main program runs at the same speed with saving off, but then drops to saving only 1 set per second when saving is on. I really need continuous data with no time gaps.

One thing I was doing was using simple VIs instead of Express VIs. I'm trying to do that now with the DAQ cards. The DAQ cards I have are simulated through LabVIEW.
 
Let me know if you need more info.
 
Thanks,
Travis
Message 4 of 6

Sorry, I don't have any DAQ drivers installed, so I cannot test. I'm just looking at your code, and a few things stick out.

  1. Don't you want some kind of synchronization between signal generation and acquisition? At the moment, it is undefined which one starts first.
  2. It might help to use low-level DAQ. How often do the inputs change?
  3. For large datasets, and if speed is important, you should work with simple datatypes. All these dynamic data conversions are very expensive.
  4. Btw: much more of the code can go inside the "save data" case. There is no need to convert and merge all that data if you throw it away anyway (not that this would make the save itself faster, but it would probably speed things up when you are not saving). 😉
  5. Keep your representations aligned. For example, why is the array indicator next to the save operation set to EXT???
  6. It seems silly to keep a loop running without any defined pacing. What determines your loop rate?
  7. Write to a plain binary file. All that spreadsheet formatting is very (very!) expensive for large datasets. In any case, you should open the file outside the loop (or whenever the filename, etc. changes), then keep the file open for subsequent write operations. Use the "append?" button to either write at the beginning or end of the file (tweak code as needed). Your high-level file function opens and closes the file with each iteration!
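Point 7 above (open once, write binary inside the loop, close once) can be sketched as follows. This is a hypothetical Python stand-in, not the poster's VI: in LabVIEW the equivalent is Open/Create/Replace File before the while-loop, "Write to Binary File" inside it, and Close File after it, instead of a high-level spreadsheet-file VI that reopens and closes the file on every iteration. The file path and sample values here are made up for illustration.

```python
import os
import struct
import tempfile

# Two iterations' worth of 3-channel samples (illustrative values).
samples_per_iteration = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]

path = os.path.join(tempfile.mkdtemp(), "daq.bin")
with open(path, "ab") as f:                # file opened ONCE, append mode
    for row in samples_per_iteration:      # the acquisition loop
        # Raw little-endian doubles: no text/spreadsheet formatting cost.
        f.write(struct.pack("<3d", *row))

# Each row is 3 x 8-byte float64 = 24 bytes, so two rows = 48 bytes.
print(os.path.getsize(path))
```

The open/close cost is paid once instead of once per loop iteration, and binary packing avoids the expensive number-to-text conversion of spreadsheet-style saves.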

Good luck! I'm sure others have more comments. 🙂

Message 5 of 6

Thanks for the help.  I'm new to LabVIEW, so I'm going to ask some questions about it.

 

1.  Don't you want some kind of synchronization between signal generation and acquisition? At the moment, it is undefined which one starts first.

How can I synchronize which starts first?  Do I have to put in "Wait" loops?  I did notice that my 6143 was starting before my signal was being sent out.

2.  It might help to use low-level DAQ. How often do the inputs change?

I might be confusing low-level DAQ VIs with the top-level ones.  I started with the Express VI for the cards and noticed it was taking up ~2 MB of memory.  I went to the palette, pulled up the AO and AI VIs (start, stop, write, etc.), and ran it.  It still took up ~2 MB, so I ditched them.  Was I using the right VIs?  DAQmx?

3.  For large datasets, and if speed is important, you should work with simple datatypes. All these dynamic data conversions are very expensive.

When I wire it up, it automatically converts it.  How do I go about stopping this?

4.  It seems silly to keep a loop running without any defined pacing. What determines your loop rate?

Not sure what you mean by determining the loop rate.  Right now, we just want this to run continuously and save without time gaps.  Later we're just going to run this for 10 s once saving has begun.

5.  Write to a plain binary file. All that spreadsheet formatting is very (very!) expensive for large datasets. In any case, you should open the file outside the loop (or whenever the filename, etc. changes), then keep the file open for subsequent write operations. Use the "append?" button to either write at the beginning or end of the file (tweak code as needed). Your high-level file function opens and closes the file with each iteration!

I tried this before, but when I read it back, it didn't give me the values I should be getting.  I'll try it again.
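The read-back problem described above usually means the read did not use the same datatype, byte order, and record size as the write. Here is a hypothetical Python sketch of the matched write/read pattern; in LabVIEW the analogous fix is to wire the same representation (e.g. DBL) into "Read from Binary File" that was used for "Write to Binary File", with the same byte-order setting on both. The values are made up for illustration.

```python
import io
import struct

row_written = (1.5, -2.0, 3.25)

buf = io.BytesIO()
# Write: 3 little-endian float64 values (24 bytes total).
buf.write(struct.pack("<3d", *row_written))

buf.seek(0)
# Read back with the SAME format string and record size.
row_read = struct.unpack("<3d", buf.read(24))
print(row_read == row_written)   # identical values round-trip exactly
```

If the format strings differ (say, writing doubles but reading singles, or mixing byte orders), the bytes still parse but the numbers come out wrong, which matches the symptom reported above.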

Thanks,

Travis

Message Edited by guilio_2000 on 10-05-2006 12:27 PM

Message 6 of 6